Quality data collection and management technology of aerospace complex product assembly process
NASA Astrophysics Data System (ADS)
Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo
2017-04-01
Aiming to solve the problems of difficult management and poor traceability of quality data in discrete assembly processes, a data collection and management method is proposed that takes the assembly process and the BOM as its core. The method includes a data collection approach based on workflow technology, a data model based on the BOM, and quality traceability of the assembly process. Finally, an assembly process quality data management system is developed, realizing effective control and management of quality information for the complex product assembly process.
Building quality into medical product software design.
Mallory, S R
1993-01-01
The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.
ERIC Educational Resources Information Center
Radulescu, Iulian Ionut
2006-01-01
Software complexity is the most important software quality attribute and a very useful instrument in the study of software quality. It is one of the factors that affect most software quality characteristics, including maintainability. It is very important to quantify this influence and identify the means to keep it under control; by using…
NASA Astrophysics Data System (ADS)
Huang, Shengzhou; Li, Mujun; Shen, Lianguan; Qiu, Jinfeng; Zhou, Youquan
2018-03-01
A novel fabrication method for high quality aspheric microlens arrays (MLAs) was developed by combining dose-modulated DMD-based lithography with a surface thermal reflow process. In this method, the complex shape of the aspheric microlens is pre-modeled via dose modulation in digital micromirror device (DMD) based maskless projection lithography; the dose modulation depends mainly on the distribution of exposure dose in the photoresist. The pre-shaped aspheric microlens is then polished by a subsequent non-contact thermal reflow (NCTR) process. Unlike the normal process, the reflow step here is designed to improve surface quality while keeping the pre-modeled shape unchanged, thereby avoiding the difficulty of generating the aspheric surface during reflow. Fabrication of a designed aspheric MLA with this method was demonstrated in experiments. Results showed that the obtained aspheric MLA was good in both shape accuracy and surface quality. The presented method may be a promising approach for rapidly fabricating high quality aspheric microlenses with complex surfaces.
CNC Machining Of The Complex Copper Electrodes
NASA Astrophysics Data System (ADS)
Popan, Ioan Alexandru; Balc, Nicolae; Popan, Alina
2015-07-01
This paper presents the machining process for complex copper electrodes. Machining complex shapes in copper is difficult because the material is soft and sticky. This research presents the main steps for producing these copper electrodes with high dimensional accuracy and good surface quality. Special tooling solutions are required for this machining process, and optimal process parameters have been found for accurate CNC equipment using smart CAD/CAM software.
Individual Differences in Study Processes and the Quality of Learning Outcomes.
ERIC Educational Resources Information Center
Biggs, John
1979-01-01
The relationship between students' study processes and the structural complexity of their learning is examined. Study processes are viewed in terms of three dimensions and are assessed by a questionnaire. Learning quality is expressed in levels of a taxonomy. A study that relates taxonomic levels and retention to study processes is reported.…
Dimensional Precision Research of Wax Molding Rapid Prototyping based on Droplet Injection
NASA Astrophysics Data System (ADS)
Mingji, Huang; Geng, Wu; yan, Shan
2017-11-01
The traditional casting process is complex; the mold is essential to the product, and mold quality directly affects product quality. Rapid prototyping by 3D printing is used to produce the mold prototype. The wax model has the advantages of high speed, low cost, and the ability to form complex structures. Using an orthogonal experiment as the main method, each factor affecting dimensional precision is analyzed. The purpose is to obtain the optimal process parameters and to improve the dimensional accuracy of parts produced by droplet injection molding.
[Complex automatic data processing in multi-profile hospitals].
Dovzhenko, Iu M; Panov, G D
1990-01-01
The computerization of data processing in multi-disciplinary hospitals is the key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, and improving the curative and diagnostic process and the use of resources. Even limited experience with complex computerization at the Botkin Hospital indicates that, owing to the automated system, the quality of data processing is being improved, a high level of patient examination is being provided, speedy training of young specialists is being achieved, and conditions are being created for the continuing education of physicians through the analysis of their own activity. At big hospitals, a complex solution of administrative and curative-diagnostic tasks on the basis of a general hospital display network and a general hospital data bank is the most promising form of computerization.
Self-Reacting Friction Stir Welding for Aluminum Complex Curvature Applications
NASA Technical Reports Server (NTRS)
Brown, Randy J.; Martin, W.; Schneider, J.; Hartley, P. J.; Russell, Carolyn; Lawless, Kirby; Jones, Chip
2003-01-01
This viewgraph presentation provides an overview of successful research conducted by Lockheed Martin and NASA to develop an advanced self-reacting friction stir technology for complex curvature aluminum alloys. The research included weld process development for 0.320 inch Al 2219, successful transfer from the 'lab' scale to the production scale tool, and weld quality exceeding strength goals. This process will enable development and implementation of large scale complex geometry hardware fabrication. Topics covered include: weld process development, weld process transfer, and intermediate hardware fabrication.
Systems and processes that ensure high quality care.
Bassett, Sally; Westmore, Kathryn
2012-10-01
This is the second in a series of articles examining the components of good corporate governance. It considers how the structures and processes for quality governance can affect an organisation's ability to be assured about the quality of care. Complex information systems and procedures can lead to poor quality care, but sound structures and processes alone are insufficient to ensure good governance, and behavioural factors play a significant part in making sure that staff are enabled to provide good quality care. The next article in this series looks at how the information reporting of an organisation can affect its governance.
Complexity leadership: a healthcare imperative.
Weberg, Dan
2012-01-01
The healthcare system is plagued with increasing cost and poor quality outcomes. A major contributing factor for these issues is that outdated leadership practices, such as leader-centricity, linear thinking, and poor readiness for innovation, are being used in healthcare organizations. Complexity leadership theory provides a new framework with which healthcare leaders may practice leadership. Complexity leadership theory conceptualizes leadership as a continual process that stems from collaboration, complex systems thinking, and innovation mindsets. Compared to transactional and transformational leadership concepts, complexity leadership practices hold promise to improve cost and quality in health care. © 2012 Wiley Periodicals, Inc.
Using machine-learning methods to analyze economic loss function of quality management processes
NASA Astrophysics Data System (ADS)
Dzedik, V. A.; Lontsikh, P. A.
2018-05-01
During analysis of quality management systems, their economic component is often analyzed insufficiently. To overcome this issue, it is necessary to move the concept of economic loss functions beyond tolerance thinking and address it directly. Input data about economic losses in processes have a complex form, so solving this problem with standard tools is complicated. Machine learning techniques allow one to obtain precise models of the economic loss function even from the most complex input data. Results of such analysis contain data about the true efficiency of a process and can be used to make investment decisions.
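As a hedged illustration of the approach this abstract describes (the authors' actual models and data are not given), the sketch below fits a machine-learning regressor to noisy (deviation, loss) observations, recovering a loss function without assuming a tolerance-based form; all names and data are invented:

```python
# Minimal sketch (not the authors' implementation): fitting an economic loss
# function to noisy process data with an ML regressor instead of assuming a
# fixed tolerance-based (e.g. quadratic Taguchi) form.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical data: deviation of a process characteristic from target,
# and the observed economic loss it caused.
deviation = rng.uniform(-3.0, 3.0, size=(500, 1))
loss = 2.0 * deviation[:, 0] ** 2 + rng.normal(0.0, 1.0, size=500)  # unknown to the model

model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
model.fit(deviation, loss)

# The fitted curve approximates the true loss function without a tolerance
# assumption; its minimum indicates the economically optimal operating point.
grid = np.linspace(-3.0, 3.0, 61).reshape(-1, 1)
predicted = model.predict(grid)
print("estimated optimum deviation:", grid[np.argmin(predicted), 0])
```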
Assessing Quality in Home Visiting Programs
ERIC Educational Resources Information Center
Korfmacher, Jon; Laszewski, Audrey; Sparr, Mariel; Hammel, Jennifer
2013-01-01
Defining quality and designing a quality assessment measure for home visitation programs is a complex and multifaceted undertaking. This article summarizes the process used to create the Home Visitation Program Quality Rating Tool (HVPQRT) and identifies next steps for its development. The HVPQRT measures both structural and dynamic features of…
Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel
NASA Astrophysics Data System (ADS)
Xie, Yanmin
2011-08-01
Nowadays, the design of sheet metal forming processes is not a trivial task due to the complex issues to be taken into account (conflicting design goals, forming of complex shapes, and so on), and optimization methods have been widely applied in sheet metal forming. Proper design methods to reduce time and costs therefore have to be developed, mostly based on computer-aided procedures. At the same time, variations during manufacturing processes may significantly influence final product quality, rendering optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software LS-DYNA is used to simulate the complex sheet metal stamping processes. The Kriging metamodel is adopted to map the relation between input process parameters and part quality. Robust design models for the sheet metal forming process integrate adaptive importance sampling with the Kriging model, in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion is used to indicate where additional training samples should be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method. Final results indicate the application feasibility of the proposed method for multi-response robust design.
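The following sketch illustrates the Kriging-metamodel-with-adaptive-sampling idea on a toy one-dimensional problem; the quality function stands in for an expensive LS-DYNA simulation, and the maximum-variance enrichment criterion is a simplification of the paper's improved criterion:

```python
# Hedged sketch of a Kriging (Gaussian-process) metamodel with adaptive
# sampling; the toy quality function and criterion are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def quality(x):  # stand-in for an expensive FE simulation of drawing quality
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 0] ** 2

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(8, 1))          # small initial design of experiments
y = quality(X)

for _ in range(10):                          # adaptive enrichment of the design
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    gp.fit(X, y)
    grid = np.linspace(-2, 2, 201).reshape(-1, 1)
    mean, std = gp.predict(grid, return_std=True)
    x_new = grid[np.argmax(std)].reshape(1, 1)   # sample where the model is least certain
    X = np.vstack([X, x_new])
    y = np.append(y, quality(x_new))

print("final metamodel trained on", len(X), "samples")
```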
Supplier selection based on complex indicator of finished products quality
NASA Astrophysics Data System (ADS)
Chernikova, Anna; Golovkina, Svetlana; Kuzmina, Svetlana; Demenchenok, Tatiana
2017-10-01
In the article the authors consider possible approaches to solving the problems that arise when selecting a supplier of raw materials for an industrial enterprise; possible difficulties are analyzed and ways of resolving them are suggested. Various methods for improving the efficiency of the supplier selection process are considered, based on an analysis of the selection of a paper bag supplier for the needs of a construction company. The article presents the calculation of generalized indicators and of a complex indicator that incorporates single indicators, formed into groups reflecting different aspects of quality.
Quality Improvement Process in a Large Intensive Care Unit: Structure and Outcomes.
Reddy, Anita J; Guzman, Jorge A
2016-11-01
Quality improvement in the health care setting is a complex process, and even more so in the critical care environment. The development of intensive care unit process measures and quality improvement strategies is associated with improved outcomes, but should be individualized to each medical center, as structure and culture differ from institution to institution. The purpose of this report is to describe the structure of quality improvement processes within a large medical intensive care unit, using examples of the study institution's successes and challenges in the areas of stat antibiotic administration, reduction in blood product waste, central line-associated bloodstream infections, and medication errors. © The Author(s) 2015.
Measuring the quality of therapeutic apheresis care in the pediatric intensive care unit.
Sussmane, Jeffrey B; Torbati, Dan; Gitlow, Howard S
2012-01-01
Our goal was to measure the quality of care provided in the Pediatric Intensive Care Unit (PICU) during Therapeutic Apheresis (TA). We described the care as a step-by-step process. We designed a flow chart to carefully document each step of the process. We then defined each step with a unique clinical indicator (CI) that represented the exact task we felt provided quality care. These CIs were studied and modified for 1 year. We measured our performance in this process by the number of times we accomplished the CI vs. the total number of CIs that were to be performed. The degree of compliance with these clinical indicators was analyzed and used as a metric for quality by calculating how close the process runs to exactly as planned, or "in control." The Apheresis Process was in control (compliance) for 47% of the indicators, as measured in the aggregate for the first observational year. We then applied the theory of Total Quality Management (TQM) through our Design, Measure, Analyze, Improve, and Control (DMAIC) model. We were able to improve the process and bring it into control by increasing compliance to > 99.74%, in the aggregate, for the third and fourth quarters of the second year. We have implemented TQM to increase compliance, and thus control, of a highly complex and multidisciplinary Pediatric Intensive Care therapy. We have shown a reproducible and scalable measure of quality for a complex clinical process in the PICU, without additional capital expenditure. Copyright © 2011 Wiley-Liss, Inc.
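A minimal sketch of the compliance metric described above; the counts are illustrative, not the study's data:

```python
# Compliance = accomplished clinical indicators (CIs) / total CIs due,
# compared against the study's "in control" threshold of 99.74%.
def compliance(accomplished: int, total: int) -> float:
    return accomplished / total

baseline = compliance(470, 1000)    # ~47%: process out of control
post_dmaic = compliance(998, 1000)  # >99.74%: process in control

for label, value in [("baseline year", baseline), ("after DMAIC", post_dmaic)]:
    status = "in control" if value > 0.9974 else "out of control"
    print(f"{label}: {value:.2%} -> {status}")
```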
Lucyk, Kelsey; Tang, Karen; Quan, Hude
2017-11-22
Administrative health data are increasingly used for research and surveillance to inform decision-making because of their large sample sizes, geographic coverage, comprehensiveness, and possibility for longitudinal follow-up. Within Canadian provinces, individuals are assigned unique personal health numbers that allow for linkage of administrative health records in that jurisdiction. It is therefore necessary to ensure that these data are of high quality, and that chart information is accurately coded to meet this end. Our objective is to explore the potential barriers to high-quality data coding through qualitative inquiry into the roles and responsibilities of medical chart coders. We conducted semi-structured interviews with 28 medical chart coders from Alberta, Canada. We used thematic analysis and open-coded each transcript to understand the process of administrative health data generation and identify barriers to its quality. The process of generating administrative health data is highly complex and involves a diverse workforce. As such, there are multiple points in this process that introduce challenges for high quality data. For coders, the main barriers to data quality occurred around chart documentation, variability in the interpretation of chart information, and high quota expectations. This study illustrates the complex nature of barriers to high quality coding, in the context of administrative data generation. The findings from this study may be of use to data users, researchers, and decision-makers who wish to better understand the limitations of their data or pursue interventions to improve data quality.
TU-FG-201-10: Quality Management of Accelerated Partial Breast Irradiation (APBI) Plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ji, H; Lorio, V; Cernica, G
2016-06-15
Purpose: Since 2008, over 700 patients received high dose rate (HDR) APBI treatment at Virginia Hospital Center. The complexity involved in the planning process demonstrated a broad variation in patient geometry across all applicators, in relation to anatomical regions of interest. A quality management program instituting various metrics was implemented in March 2013 with the goal of ensuring an optimal plan is achieved for each patient. Methods: For each plan, an in-house complexity index, geometric conformity index, and plan quality index were defined. These indices were obtained for all patients treated. For patients treated after the implementation, the conformity index and quality index were maximized while other dosimetric limits, such as maximum skin and rib doses, were strictly kept. Subsequently, all evaluation parameters and applicator information were placed in a database for cross-evaluation with plans of similar complexity. Results: Both the conformity and quality indices show good correlation with the complexity index. They decrease as complexity increases for all applicators. Multi-lumen balloon applicators demonstrate a minimal advantage over single-lumen applicators in increasingly complex patient geometries, whereas SAVI applicators showed a considerably greater advantage in these circumstances. After the implementation of the in-house planning protocol, there is a direct improvement in quality for SAVI plans. Conclusion: Due to their interstitial nature, SAVI devices show better conformity than balloon-based devices regardless of the number of lumens, especially in complex cases. The quality management program focuses on optimizing indices by utilizing prior planning knowledge based on complexity levels. The database of indices assists in decision making and has subsequently aided in balancing the experience level among planners. This approach has made APBI planning more robust for patient care, with a measurable improvement in plan quality.
Feng, Liang; Zhang, Ming-Hua; Gu, Jun-Fei; Wang, Gui-You; Zhao, Zi-Yu; Jia, Xiao-Bin
2013-11-01
As traditional Chinese medicine (TCM) preparation products feature complex compounds and multiple preparation processes, the implementation of quality control in line with the characteristics of TCM preparation products provides a firm guarantee for their clinical efficacy and safety. Danshen infusion solution is a preparation commonly used in the clinic, but its quality control is restricted to indexes of the finished product, which cannot guarantee its inherent quality. Our study group has proposed a "multi-dimensional structure and process dynamics quality control system" on the basis of the "component structure theory", for the purpose of controlling the quality of Danshen infusion solution at multiple levels and in multiple links, from the efficacy-related material basis and the safety-related material basis to the characteristics of the dosage form and the preparation process. In this article, we bring forth new ideas and models for the quality control of TCM preparation products.
Diederichs, Sylvia; Korona, Anna; Staaden, Antje; Kroutil, Wolfgang; Honda, Kohsuke; Ohtake, Hisao; Büchs, Jochen
2014-11-07
Media containing yeast extracts and other complex raw materials are widely used for the cultivation of microorganisms. However, variations in the specific nutrient composition can occur, due to differences in the complex raw material ingredients and in the production of these components. These lot-to-lot variations can affect growth rate, product yield and product quality in laboratory investigations and biopharmaceutical production processes. In the FDA's Process Analytical Technology (PAT) initiative, the control and assessment of the quality of critical raw materials is one key aspect to maintain product quality and consistency. In this study, the Respiration Activity Monitoring System (RAMOS) was used to evaluate the impact of different yeast extracts and commercial complex auto-induction medium lots on metabolic activity and product yield of four recombinant Escherichia coli variants encoding different enzymes. Under non-induced conditions, the oxygen transfer rate (OTR) of E. coli was not affected by a variation of the supplemented yeast extract lot. The comparison of E. coli cultivations under induced conditions exhibited tremendous differences in OTR profiles and volumetric activity for all investigated yeast extract lots of different suppliers as well as lots of the same supplier, independent of the E. coli variant. Cultivation in the commercial auto-induction medium lots revealed the same reproducible variations. In cultivations with parallel offline analysis, the highest volumetric activity was found at different cultivation times. Only by online monitoring of the cultures could a distinct cultivation phase (e.g. glycerol depletion) be detected and chosen for comparable and reproducible offline analysis of the yield of functional product. This work proves that cultivations conducted in complex media may be prone to significant variation in final product quality and quantity if the quality of the raw material for medium preparation is not thoroughly checked. In this study, the RAMOS technique enabled a reliable and reproducible screening and phenotyping of complex raw material lots by online measurement of the respiration activity. Consequently, complex raw material lots can efficiently be assessed if the distinct effects on culture behavior and final product quality and quantity are visualized.
[Quality assurance and quality management in intensive care].
Notz, K; Dubb, R; Kaltwasser, A; Hermes, C; Pfeffer, S
2015-11-01
Treatment success in hospitals, particularly in intensive care units, is directly tied to the quality of structure, process, and outcomes. Technological and medical advancements lead to ever more complex treatment situations with highly specialized tasks in intensive care nursing. Quality criteria that can be used to describe and correctly measure these highly complex multiprofessional situations have only recently been developed and put into practice. In this article, it will be shown how quality in multiprofessional teams can be defined and assessed in daily clinical practice. Core aspects are the choice of a nursing theory, quality assurance measures, and quality management. One possible option for quality assurance is the use of standard operating procedures (SOPs). Quality can ultimately only be achieved if professional groups think beyond their boundaries, minimize errors, and establish and live out instructions and SOPs.
Horvat, Ana; Filipovic, Jovan
2018-02-01
This research focuses on Complexity Leadership Theory and the relationship between leadership-examined through the lens of Complexity Leadership Theory-and organizational maturity as an indicator of the performance of health organizations. The research adopts a perspective that conceptualizes organizations as complex adaptive systems and draws upon a survey of opinion of 189 managers working in Serbian health organizations. As the results indicate a dependency between functions of leadership and levels of the maturity of health organizations, we propose a model that connects the two. The study broadens our understanding of the implications of complexity thinking and its reflection on leadership functions and overall organizational performance. The correlations between leadership functions and maturity could have practical applications in policy processing, thus improving the quality of outcomes and the overall level of service quality. © 2017 John Wiley & Sons, Ltd.
Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.
Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald
2017-07-01
The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Telli, Godfrey
2013-01-01
Quality of education is a complex concept. Numerous studies treat quality of education as an inclusive term that covers access and input on the one hand and process, output or outcome on the other. Others regard access and input as separate but equally important aspects of quality of education. For the latter, quality of…
McLean, G; Sutton, M; Guthrie, B
2006-11-01
Objective: To examine whether the quality of primary care measured by the 2004 contract varies with socioeconomic deprivation. Design: Retrospective analysis of publicly available data, comparing quality indicators used for payment that allow exclusion of patients (payment quality) and indicators based on the care delivered to all patients (delivered quality). Setting: 1024 general practices in Scotland. Main outcome measures: Regression coefficients summarising the relationships between deprivation and payment and delivered quality. Results: Little systematic association is found between payment quality and deprivation but, for 17 of the 33 indicators examined, delivered quality falls with increasing deprivation. Absolute differences in delivered quality are small for most simpler process measures, such as recording of smoking status or blood pressure. Greater inequalities are seen for more complex process measures such as diagnostic procedures, some intermediate outcome measures such as glycaemic control in diabetes, and measures of treatment such as influenza immunisation. Conclusions: The exclusions system succeeds in not penalising practices financially for the characteristics of the population they serve, but does not reward the additional work required in deprived areas and contributes to a continuation of the inverse care law. The contract data collected prevent examination of most complex process or treatment measures, and this analysis is likely to underestimate the extent of continuing inequalities in care. Broader lessons cannot be drawn on the effect of this new set of incentives on inequalities until changes are made to the way contract data are collected and analysed.
Waters, Deborah M; Arendt, Elke K; Moroni, Alice V
2017-01-22
Quality of coffee is a complex trait and is influenced by physical and sensory parameters. A complex succession of transformations during the processing of seeds to roasted coffee will inevitably influence the in-cup attributes of coffee. Germination and fermentation of the beans are two bioprocesses that take place during post-harvest treatment, and may lead to significant modifications of coffee attributes. The aim of this review is to address the current knowledge of dynamics of these two processes and their significance for bean modifications and coffee quality. The first part of this review gives an overview of coffee germination and its influence on coffee chemistry and quality. The germination process initiates while these non-orthodox seeds are still inside the cherry. This process is asynchronous and the evolution of germination depends on how the beans are processed. A range of metabolic reactions takes place during germination and can influence the carbohydrate, protein, and lipid composition of the beans. The second part of this review focuses on the microbiota associated with the beans during post-harvesting, exploring its effects on coffee quality and safety. The microbiota associated with the coffee cherries and beans comprise several bacterial, yeast, and fungal species and affects the processing from cherries to coffee beans. Indigenous bacteria and yeasts play a role in the degradation of pulp/mucilage, and their metabolism can affect the sensory attributes of coffee. On the other hand, the fungal population occurring during post-harvest and storage negatively affects coffee quality, especially regarding spoilage, off-tastes, and mycotoxin production.
Simplified process model discovery based on role-oriented genetic mining.
Zhao, Weidong; Liu, Xi; Dai, Weihui
2014-01-01
Process mining is the automated acquisition of process models from event logs. Although many process mining techniques have been developed, most of them are based on control flow. Meanwhile, existing role-oriented process mining methods focus on the correctness and integrity of roles while ignoring the role complexity of the process model, which directly impacts the understandability and quality of the model. To address these problems, we propose a genetic programming approach to mine a simplified process model. Using a new metric of process complexity in terms of roles as the fitness function, we can find simpler process models. The new role complexity metric of process models is built from role cohesion and coupling, and is applied to discover roles in process models. Moreover, the higher fitness derived from the role complexity metric also provides a guideline for redesigning process models. Finally, we conduct a case study and experiments to show that the proposed method is more effective for streamlining the process, in comparison with related studies.
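To make the role-complexity idea concrete, here is a hedged sketch of a fitness term built from role cohesion and coupling over an activity-transition graph; the formula, weights, and example data are assumptions for illustration, not the authors' metric:

```python
# Sketch: a fitness term rewarding cohesive, loosely coupled roles.
from itertools import combinations

def role_cohesion(role: set[str], edges: set[tuple[str, str]]) -> float:
    """Share of activity pairs inside a role that are directly connected."""
    pairs = list(combinations(sorted(role), 2))
    if not pairs:
        return 1.0
    linked = sum((a, b) in edges or (b, a) in edges for a, b in pairs)
    return linked / len(pairs)

def role_coupling(r1: set[str], r2: set[str], edges: set[tuple[str, str]]) -> float:
    """Share of cross-role activity pairs that are connected (lower is better)."""
    pairs = [(a, b) for a in r1 for b in r2]
    linked = sum((a, b) in edges or (b, a) in edges for a, b in pairs)
    return linked / len(pairs) if pairs else 0.0

def fitness(roles: list[set[str]], edges: set[tuple[str, str]]) -> float:
    cohesion = sum(role_cohesion(r, edges) for r in roles) / len(roles)
    couplings = [role_coupling(a, b, edges) for a, b in combinations(roles, 2)]
    coupling = sum(couplings) / len(couplings) if couplings else 0.0
    return cohesion - coupling  # higher fitness = simpler role structure

edges = {("register", "check"), ("check", "approve"), ("approve", "archive")}
print(fitness([{"register", "check"}, {"approve", "archive"}], edges))
```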
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, X; Yi, J; Xie, C
Purpose: To evaluate the impact of complexity indices on the plan quality and deliverability of volumetric modulated arc therapy (VMAT), and to determine the most significant parameters in the generation of an ideal VMAT plan. Methods: A multi-dimensional exploratory statistical method, canonical correlation analysis (CCA), was adopted to study the correlations between VMAT parameters of complexity, quality and deliverability, as well as their contribution weights, with 32 two-arc VMAT nasopharyngeal cancer (NPC) patients and 31 one-arc VMAT prostate cancer patients. Results: The MU per arc (MU/Arc) and MU per control point (MU/CP) of NPC patients were 337.8±25.2 and 3.7±0.3, respectively, which were significantly lower than those of prostate cancer patients (MU/Arc: 506.9±95.4, MU/CP: 5.6±1.1). The plan complexity indices indicated that two-arc VMAT plans were more complex than one-arc VMAT plans. Plan quality comparison confirmed that one-arc VMAT plans had higher quality than two-arc VMAT plans. CCA results implied that plan complexity parameters were highly correlated with plan quality, with the first two canonical correlations of 0.96 and 0.88 (both p<0.001), and significantly correlated with deliverability, with a first canonical correlation of 0.79 (p<0.001); plan quality and deliverability were also correlated, with a first canonical correlation of 0.71 (p=0.02). Complexity parameters of MU/CP, segment area (SA) per CP, percent of MU/CP less than 3, and planning target volume (PTV) were weighted heavily in correlation with plan quality and deliverability. Similar results were obtained from individual NPC and prostate CCA analyses. Conclusion: Relationships between complexity, quality, and deliverability parameters were investigated with CCA. MU and SA related parameters and PTV volume were found to have a strong effect on plan quality and deliverability. The presented correlations among different quantified parameters could be used to improve the plan quality and the efficiency of the radiotherapy process when creating a complex VMAT plan.
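The sketch below shows how such a CCA between complexity and quality parameters can be computed, here with scikit-learn on simulated stand-in data rather than the 63 patient plans:

```python
# Sketch of canonical correlation analysis between plan-complexity and
# plan-quality variable sets; data are simulated stand-ins.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(42)
n = 63
latent = rng.normal(size=(n, 1))   # shared structure linking the two sets
X = np.hstack([latent + 0.3 * rng.normal(size=(n, 1)) for _ in range(4)])  # complexity (e.g. MU/CP, SA/CP, PTV)
Y = np.hstack([latent + 0.5 * rng.normal(size=(n, 1)) for _ in range(3)])  # quality/deliverability scores

cca = CCA(n_components=2).fit(X, Y)
U, V = cca.transform(X, Y)
for k in range(2):
    r = np.corrcoef(U[:, k], V[:, k])[0, 1]
    print(f"canonical correlation {k + 1}: {r:.2f}")
```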
ERIC Educational Resources Information Center
Beerkens, Maarja
2015-01-01
Higher education quality assurance systems develop within a complex political environment where national level goals and priorities interact with European and global developments. Furthermore, quality assurance is influenced by broader processes in the public sector that set expectations with respect to accountability, legitimacy and regulatory…
USDA-ARS?s Scientific Manuscript database
Pasta is a simple food made from water and durum wheat (Triticum turgidum subsp. durum) semolina. As pasta increases in popularity, studies have endeavored to analyze the attributes that contribute to high quality pasta. Despite being a simple food, the laboratory scale analysis of pasta quality is ...
Improving the Quality of E-Learning: Lessons from the eMM
ERIC Educational Resources Information Center
Marshall, S.
2012-01-01
The quality of e-learning can be defined in many different ways, reflecting different stakeholders and the complexity of the systems and processes used in higher education. These different conceptions of quality can be mutually contradictory and, while politically significant, may also be beyond the direct control or influence of institutional…
NASA Astrophysics Data System (ADS)
Salha, A. A.; Stevens, D. K.
2013-12-01
This study presents the numerical application and statistical development of Stream Water Quality Modeling (SWQM) as a tool to investigate, manage, and research the transport and fate of water pollutants in the Lower Bear River, Box Elder County, Utah. The segment under study is the Bear River from Cutler Dam to its confluence with the Malad River (Subbasin HUC 16010204). Water quality problems arise primarily from high phosphorus and total suspended sediment concentrations caused by five permitted point source discharges and a complex network of canals and ducts of varying sizes and carrying capacities that transport water (for farming and agricultural uses) from the Bear River and back to it. The Utah Department of Environmental Quality (DEQ) has designated the entire reach of the Bear River between Cutler Reservoir and the Great Salt Lake as impaired. Stream water quality modeling requires specification of an appropriate model structure and process formulation according to the nature of the study area and the purpose of investigation. The current model is i) one-dimensional (1D), ii) numerical, iii) unsteady, iv) mechanistic, v) dynamic, and vi) spatial (distributed). The basic principle of the study is the use of mass balance equations and numerical methods (a Fickian advection-dispersion approach) for solving the related partial differential equations. Because model error decreases and sensitivity increases as a model becomes more complex, both i) uncertainty (in parameters, data input, and model structure) and ii) model complexity will be under investigation. Watershed data (water quality parameters together with stream flow, seasonal variations, surrounding landscape, stream temperature, and point/nonpoint sources) were obtained mainly using HydroDesktop, a free and open source GIS-enabled desktop application to find, download, visualize, and analyze time series of water and climate data registered with the CUAHSI Hydrologic Information System. Processing, validity assessment, and distribution of time-series data were explored using the GNU R language (a statistical computing and graphics environment). Equations for physical, chemical, and biological processes were written in FORTRAN (High Performance Fortran) in order to compute and solve their hyperbolic and parabolic complexities. Post-analysis of results was conducted using GNU R. High performance computing (HPC) will be introduced to expedite solving complex computational processes using parallel programming. It is expected that the model will assess nonpoint sources and specific point source data to understand the causes, transfer, dispersion, and concentration of pollutants at different locations of the Bear River. The impact of reducing or removing nonpoint nutrient loading on Bear River water quality management could also be addressed. Keywords: computer modeling; numerical solutions; sensitivity analysis; uncertainty analysis; ecosystem processes; high performance computing; water quality.
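As a minimal sketch of the Fickian advection-dispersion core named above, the following explicit finite-difference scheme propagates a conservative pollutant down a 1-D reach; all parameters are illustrative, not Bear River values:

```python
# 1-D advection-dispersion: explicit upwind advection plus central-difference
# dispersion for a conservative pollutant. Illustrative parameters only.
import numpy as np

L, nx = 10_000.0, 200          # reach length [m], grid cells
dx = L / nx
u, D = 0.5, 5.0                # velocity [m/s], dispersion coefficient [m^2/s]
dt = 0.5 * min(dx / u, dx**2 / (2 * D))   # stable explicit time step

c = np.zeros(nx)
c[0] = 100.0                   # constant upstream (point-source) concentration

for _ in range(5000):
    adv = -u * (c[1:-1] - c[:-2]) / dx                 # upwind advection
    disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2  # Fickian dispersion
    c[1:-1] += dt * (adv + disp)
    c[0], c[-1] = 100.0, c[-2]  # inlet and zero-gradient outlet boundaries

print("concentration 5 km downstream:", round(c[nx // 2], 2), "mg/L")
```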
Quality assurance paradigms for artificial intelligence in modelling and simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oren, T.I.
1987-04-01
New classes of quality assurance concepts and techniques are required for advanced knowledge-processing paradigms (such as artificial intelligence, expert systems, or knowledge-based systems) and the complex problems that only simulative systems can cope with. A systematization of quality assurance problems is given, along with examples of traditional and cognizant quality assurance techniques in traditional and cognizant modelling and simulation.
A Process Management System for Networked Manufacturing
NASA Astrophysics Data System (ADS)
Liu, Tingting; Wang, Huifen; Liu, Linyan
With the development of computer, communication, and network technologies, networked manufacturing has become one of the main manufacturing paradigms of the 21st century. In the networked manufacturing environment, there exist a large number of cooperative tasks susceptible to alteration, conflicts caused by shared resources, and problems of cost and quality. This increases the complexity of administration. Process management is a technology used to design, enact, control, and analyze networked manufacturing processes. It supports efficient execution, effective management, conflict resolution, cost containment, and quality control. In this paper we propose an integrated process management system for networked manufacturing. Requirements of process management are analyzed, the architecture of the system is presented, and a process model considering process cost and quality is developed. Finally, a case study is provided to explain how the system runs efficiently.
Analysis of weather patterns associated with air quality degradation and potential health impacts
Emissions from anthropogenic and natural sources into the atmosphere are determined in large measure by prevailing weather conditions through complex physical, dynamical and chemical processes. Air pollution episodes are characterized by degradation in air quality as reflected by...
Quality assurance (QA) of information technology (IT) and information management (IM) systems helps to ensure that the end product is of known quality and integrity. As the complexity of IT & IM processes increases, so does the need for regular QA evaluation.
The areas revi...
USDA-ARS?s Scientific Manuscript database
Wheat quality is defined by culinary end-uses and processing characteristics. Wheat breeders are interested to identify quantitative trait loci for grain, milling, and end-use quality traits because it is imperative to understand the genetic complexity underlying quantitatively inherited traits to ...
A laser-based vision system for weld quality inspection.
Huang, Wei; Kovacevic, Radovan
2011-01-01
Welding is a very complex process in which the final weld quality can be affected by many process parameters. In order to inspect the weld quality and detect the presence of various weld defects, different methods and systems are studied and developed. In this paper, a laser-based vision system is developed for non-destructive weld quality inspection. The vision sensor is designed based on the principle of laser triangulation. By processing the images acquired from the vision sensor, the geometrical features of the weld can be obtained. Through the visual analysis of the acquired 3D profiles of the weld, the presences as well as the positions and sizes of the weld defects can be accurately identified and therefore, the non-destructive weld quality inspection can be achieved.
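A minimal sketch of the laser-triangulation principle the sensor is based on; the geometry (baseline, focal length, laser angle) is assumed for illustration, not the paper's actual sensor design:

```python
# Laser triangulation: depth is recovered from where the projected laser
# spot lands on the camera image. Geometry values are assumptions.
import math

def depth_from_pixel(p_mm: float, baseline_mm: float = 50.0,
                     focal_mm: float = 16.0, laser_angle_deg: float = 30.0) -> float:
    """Depth z of a surface point from the laser spot's image coordinate p.

    Similar triangles: the beam leaves at offset b angled by theta toward the
    camera axis, so the spot hits x(z) = b - z*tan(theta) and images at
    p = f*x/z, giving z = f*b / (p + f*tan(theta)).
    """
    t = math.tan(math.radians(laser_angle_deg))
    return focal_mm * baseline_mm / (p_mm + focal_mm * t)

# A weld bead raises the surface, shifting the imaged spot and the depth:
for p in (0.5, 1.0, 1.5):
    print(f"spot at {p:.1f} mm on sensor -> depth {depth_from_pixel(p):.1f} mm")
```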
Considerations In The Design And Specifications Of An Automatic Inspection System
NASA Astrophysics Data System (ADS)
Lee, David T.
1980-05-01
Considerable activity has been centered on the automation of manufacturing quality control and inspection functions. Several reasons can be cited for this development. The continuous pressure of direct and indirect labor cost increases is only one of the obvious motivations. With the drive for electronics miniaturization come more and more complex processes where control parameters are critical and the yield is highly susceptible to inadequate process monitoring and inspection. With multi-step, multi-layer processes for substrate fabrication, process defects that are not detected and corrected at certain critical points may render the entire subassembly useless. As a process becomes more complex, the time required to test the product increases significantly in the total build cycle. The urgency to reduce test time brings more pressure to improve in-process control and inspection. The advances and improvements in components, assemblies, and systems such as microprocessors, microcomputers, programmable controllers, and other intelligent devices have made the automation of quality control much more cost effective and justifiable.
Korom-Djakovic, Danijela; Canamucio, Anne; Lempa, Michele; Yano, Elizabeth M; Long, Judith A
2016-01-01
This study examined how aspects of quality improvement (QI) culture changed during the introduction of the Veterans Health Administration (VHA) patient-centered medical home initiative and how they were influenced by existing organizational factors, including VHA facility complexity and practice location. A voluntary survey, measuring primary care providers' (PCPs') perspectives on QI culture at their primary care clinics, was administered in 2010 and 2012. Participants were 320 PCPs from hospital- and community-based primary care practices in Pennsylvania, West Virginia, Delaware, New Jersey, New York, and Ohio. PCPs in community-based outpatient clinics reported an improvement in established processes for QI, and communication and cooperation from 2010 to 2012. However, their peers in hospital-based clinics did not report any significant improvements in QI culture. In both years, compared with high-complexity facilities, medium- and low-complexity facilities had better scores on the scales assessing established processes for QI, and communication and cooperation. © The Author(s) 2014.
A Hybrid Interval-Robust Optimization Model for Water Quality Management.
Xu, Jieyu; Li, Yongping; Huang, Guohe
2013-05-01
In water quality management problems, uncertainties may exist in many system components and pollution-related processes (i.e., random nature of hydrodynamic conditions, variability in physicochemical processes, dynamic interactions between pollutant loading and receiving water bodies, and indeterminacy of available water and treated wastewater). These complexities lead to difficulties in formulating and solving the resulting nonlinear optimization problems. In this study, a hybrid interval-robust optimization (HIRO) method was developed through coupling stochastic robust optimization and interval linear programming. HIRO can effectively reflect the complex system features under uncertainty, where implications of water quality/quantity restrictions for achieving regional economic development objectives are studied. By delimiting the uncertain decision space through dimensional enlargement of the original chemical oxygen demand (COD) discharge constraints, HIRO enhances the robustness of the optimization processes and resulting solutions. This method was applied to planning of industry development in association with river-water pollution concern in New Binhai District of Tianjin, China. Results demonstrated that the proposed optimization model can effectively communicate uncertainties into the optimization process and generate a spectrum of potential inexact solutions supporting local decision makers in managing benefit-effective water quality management schemes. HIRO is helpful for analysis of policy scenarios related to different levels of economic penalties, while also providing insight into the tradeoff between system benefits and environmental requirements.
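As a hedged sketch of the interval-programming side of such a model (not the full HIRO method), the following solves the best-case and worst-case linear programs that bound an interval-coefficient benefit problem under a COD discharge limit; the numbers are invented, not the Tianjin case data:

```python
# Interval LP bounds: solve the pessimistic and optimistic deterministic LPs.
from scipy.optimize import linprog

# Maximize net benefit of two industrial activities x1, x2; benefit per unit
# and the COD discharge limit are known only as intervals.
benefit_lo, benefit_hi = [3.0, 5.0], [4.0, 6.0]      # interval objective coefficients
cod_per_unit = [2.0, 1.5]                            # COD load per unit activity
cod_limit_lo, cod_limit_hi = 80.0, 100.0             # interval discharge limit

def solve(benefit, cod_limit):
    # linprog minimizes, so negate the benefit coefficients.
    res = linprog(c=[-b for b in benefit], A_ub=[cod_per_unit],
                  b_ub=[cod_limit], bounds=[(0, None), (0, None)])
    return -res.fun

print("worst-case system benefit:", solve(benefit_lo, cod_limit_lo))
print("best-case system benefit: ", solve(benefit_hi, cod_limit_hi))
```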
Sens, Brigitte
2010-01-01
The concept of general process orientation as an instrument of organisation development is the core principle of quality management philosophy, i.e. the learning organisation. Accordingly, prestigious quality awards and certification systems focus on process configuration and continual improvement. In German health care organisations, particularly in hospitals, this general process orientation has not been widely implemented yet - despite enormous change dynamics and the requirements of both quality and economic efficiency of health care processes. But based on a consistent process architecture that considers key processes as well as management and support processes, the strategy of excellent health service provision including quality, safety and transparency can be realised in daily operative work. The core elements of quality (e.g., evidence-based medicine), patient safety and risk management, environmental management, health and safety at work can be embedded in daily health care processes as an integrated management system (the "all in one system" principle). Sustainable advantages and benefits for patients, staff, and the organisation will result: stable, high-quality, efficient, and indicator-based health care processes. Hospitals with their broad variety of complex health care procedures should now exploit the full potential of total process orientation. Copyright © 2010. Published by Elsevier GmbH.
An adaptive framework to differentiate receiving water quality impacts on a multi-scale level.
Blumensaat, F; Tränckner, J; Helm, B; Kroll, S; Dirckx, G; Krebs, P
2013-01-01
The paradigm shift in recent years towards sustainable and coherent water resources management on a river basin scale has turned the subject of investigation into a multi-scale problem, representing a great challenge for all actors participating in the management process. In this regard, planning engineers often face an inherent conflict: providing reliable decision support for complex questions with a minimum of effort. This trend inevitably increases the risk of basing decisions upon uncertain and unverified conclusions. This paper proposes an adaptive framework for integral planning that combines several concepts (flow balancing, water quality monitoring, process modelling, multi-objective assessment) to systematically evaluate management strategies for water quality improvement. As a key element, an S/P matrix is introduced to structure the differentiation of relevant 'pressures' in affected regions, i.e. 'spatial units', which helps in handling complexity. The framework is applied to a small but typical catchment in Flanders, Belgium. The application to the real-life case shows: (1) the proposed approach is adaptive, covers problems of different spatial and temporal scales, efficiently reduces complexity and finally leads to a transparent solution; and (2) water quality and emission-based performance evaluation must be done jointly, as an emission-based performance improvement does not necessarily lead to an improved water quality status, and an assessment solely focusing on water quality criteria may mask non-compliance with emission-based standards. Recommendations derived from the theoretical analysis have been put into practice.
Geotechnical approaches to coal ash content control in mining of complex structure deposits
NASA Astrophysics Data System (ADS)
Batugin, SA; Gavrilov, VL; Khoyutanov, EA
2017-02-01
Coal deposits with a complex structure and nonuniform-quality coal reserves require improved production quality control processes. The paper proposes a method to represent coal ash content as components of natural and technological dilution. The studies were carried out on the western site of the Elginsk coal deposit, composed of four coal beds of complex structure. The reported estimates of coal ash content in the beds with respect to five components point to the need to account for such data in confirmation exploration, mine planning, and actual mining. Basic means of analysis and control of overall ash content and its components are discussed.
Protecting the proteome: Eukaryotic cotranslational quality control pathways
2014-01-01
The correct decoding of messenger RNAs (mRNAs) into proteins is an essential cellular task. The translational process is monitored by several quality control (QC) mechanisms that recognize defective translation complexes in which ribosomes are stalled on substrate mRNAs. Stalled translation complexes occur when defects in the mRNA template, the translation machinery, or the nascent polypeptide arrest the ribosome during translation elongation or termination. These QC events promote the disassembly of the stalled translation complex and the recycling and/or degradation of the individual mRNA, ribosomal, and/or nascent polypeptide components, thereby clearing the cell of improper translation products and defective components of the translation machinery. PMID:24535822
[Quality by design approaches for pharmaceutical development and manufacturing of Chinese medicine].
Xu, Bing; Shi, Xin-Yuan; Wu, Zhi-Sheng; Zhang, Yan-Ling; Wang, Yun; Qiao, Yan-Jiang
2017-03-01
Pharmaceutical quality is built by design, formed in the manufacturing process, and improved during the product's lifecycle. Based on a comprehensive literature review of pharmaceutical quality by design (QbD), the essential ideas and implementation strategies of pharmaceutical QbD are interpreted. Considering the complex nature of Chinese medicine, the "4H" model is proposed for implementing QbD in the pharmaceutical development and industrial manufacture of Chinese medicine products. "4H" is an acronym for holistic design, holistic information analysis, holistic quality control, and holistic process optimization, which is consistent with the holistic concept of Chinese medicine theory. Holistic design aims at constructing both the quality problem space from patient requirements and the quality solution space from multidisciplinary knowledge. Holistic information analysis emphasizes understanding the quality pattern of Chinese medicine by integrating and mining multisource data and information at a relatively high level. Batch-to-batch quality consistency and manufacturing system reliability can be realized by the comprehensive application of inspective, statistical, predictive, and intelligent quality control strategies. Holistic process optimization improves product quality and process capability during product lifecycle management. The implementation of QbD is useful to eliminate the contradictions lying in the pharmaceutical development and manufacturing process of Chinese medicine products, and helps guarantee cost effectiveness. Copyright© by the Chinese Pharmaceutical Association.
Attention Guidance in Learning from a Complex Animation: Seeing Is Understanding?
ERIC Educational Resources Information Center
de Koning, Bjorn B.; Tabbers, Huib K.; Rikers, Remy M. J. P.; Paas, Fred
2010-01-01
To examine how visual attentional resources are allocated when learning from a complex animation about the cardiovascular system, eye movements were registered in the absence and presence of visual cues. Cognitive processing was assessed using cued retrospective reporting, whereas comprehension and transfer tests measured the quality of the…
Guiding and Modelling Quality Improvement in Higher Education Institutions
ERIC Educational Resources Information Center
Little, Daniel
2015-01-01
The article considers the process of creating quality improvement in higher education institutions from the point of view of current organisational theory and social-science modelling techniques. The author considers the higher education institution as a functioning complex of rules, norms and other organisational features and reviews the social…
Complexity in electronic negotiation support systems.
Griessmair, Michele; Strunk, Guido; Vetschera, Rudolf; Koeszegi, Sabine T
2011-10-01
It is generally acknowledged that the medium influences the way we communicate, and negotiation research directs considerable attention to the impact of different electronic communication modes on the negotiation process and outcomes. Complexity theories offer models and methods that allow the investigation of how patterns and temporal sequences unfold over time in negotiation interactions. By focusing on the dynamic and interactive quality of negotiations as well as the information, choice, and uncertainty contained in the negotiation process, the complexity perspective addresses several issues of central interest in classical negotiation research. In the present study we compare the complexity of the negotiation communication process in synchronous and asynchronous negotiations (IM vs. e-mail) as well as in an electronic negotiation support system including a decision support system (DSS). For this purpose, transcripts of 145 negotiations were coded and analyzed with Shannon entropy and grammar complexity. Our results show that negotiating asynchronously via e-mail, as well as including a DSS, significantly reduces the complexity of the negotiation process. Furthermore, a reduction in complexity increases the probability of reaching an agreement.
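A minimal sketch of the Shannon-entropy measure applied to a coded transcript; the coding scheme below is invented for illustration, and the grammar-complexity measure is not shown:

```python
# Shannon entropy of the category distribution of coded utterances;
# higher entropy = less predictable, more complex interaction.
import math
from collections import Counter

coded_transcript = ["offer", "question", "offer", "argument",
                    "offer", "concession", "question", "offer"]

def shannon_entropy(symbols):
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(f"H = {shannon_entropy(coded_transcript):.2f} bits per utterance")
```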
Trietsch, Jasper; van Steenkiste, Ben; Hobma, Sjoerd; Frericks, Arnoud; Grol, Richard; Metsemakers, Job; van der Weijden, Trudy
2014-12-01
A quality improvement strategy consisting of comparative feedback and peer review embedded in available local quality improvement collaboratives proved to be effective in changing the test-ordering behaviour of general practitioners. However, implementing this strategy was problematic. We aimed for large-scale implementation of an adapted strategy covering both test-ordering and prescribing performance. Because we failed to achieve large-scale implementation, the aim of this study was to describe and analyse the challenges of the transfer process. In a qualitative study, 19 regional health officers, pharmacists, laboratory specialists and general practitioners were interviewed within 6 months after the transfer period. The interviews were audiotaped, transcribed and independently coded by two of the authors. The codes were matched to the dimensions of the normalization process theory. The general idea of the strategy was widely supported, but generating the feedback was more complex than expected, and the need for external support after transfer of the strategy remained high because participants did not assume responsibility for the work and the distribution of resources that came with it. Evidence of effectiveness, a national infrastructure for these collaboratives and a generally positive attitude were not sufficient for normalization. Thinking about managing large databases, responsibility for tasks and distribution of resources should start as early as possible when planning complex quality improvement strategies. Merely exploring the barriers and facilitators experienced in a preceding trial is not sufficient. Although multifaceted implementation strategies to change professional behaviour are attractive, their inherent complexity is also a pitfall for large-scale implementation. © 2014 John Wiley & Sons, Ltd.
STARS Proceedings (3-4 December 1991)
1991-12-04
PROJECT PROCESS OBJECTIVES & ASSOCIATED METRICS: Prioritize ECPs using complexity and error-history measures; error-proneness and past histories of trouble with particular modules are very useful measures. Make vs Buy decisions: does the Effort saved offset the gain in Quality relative to buying? Effort and Quality (or defect rate) histories give helpful indications of how to make this decision.
Modelling for Ship Design and Production
1991-09-01
…the physical production process. The product has to be delivered within the chain of order processing. The process “ship production” is defined by the… environment is of increasing importance. Changing product types, complexity and parallelism of order processing, short throughput times and fixed due… specialized and high-quality products under manufacturing conditions which ensure economic and effective order processing. Mapping these main…
[Service quality in health care: the application of the results of marketing research].
Verheggen, F W; Harteloh, P P
1993-01-01
This paper deals with quality assurance in health care and its relation to quality assurance in trade and industry. We present the service quality model--a model of quality from marketing research--and discuss how it can be applied to health care. Traditional quality assurance appears to have serious flaws: it lacks a general theory of the sources of hazards in the complex process of patient care, and it tends to stagnate, so that no real improvement takes place. Departing from this criticism, modern quality assurance in health care is marked by: defining quality in a preferential sense as "fitness for use"; the use of theories and models from trade and industry (process control); an emphasis on analyzing the process instead of merely inspecting it; use of the Deming problem-solving cycle (plan, do, check, act); and improvement of the process of care by altering the perceptions of the parties involved. We describe the application of this method in the University Hospital Maastricht, The Netherlands. Successful application of this model requires a favorable corporate culture and motivated health care workers. The model provides a useful framework for improving on the traditional approach to quality assurance in health care.
Process perspective on image quality evaluation
NASA Astrophysics Data System (ADS)
Leisti, Tuomas; Halonen, Raisa; Kokkonen, Anna; Weckman, Hanna; Mettänen, Marja; Lensu, Lasse; Ritala, Risto; Oittinen, Pirkko; Nyman, Göte
2008-01-01
The psychological complexity of multivariate image quality evaluation makes it difficult to develop general image quality metrics. Quality evaluation includes several mental processes, and ignoring these processes and using only a few test images can lead to biased results. Using a qualitative/quantitative (Interpretation Based Quality, IBQ) methodology, we examined the process of pair-wise comparison in a setting where the quality of images printed by a laser printer on different paper grades was evaluated. The test image consisted of a picture of a table covered with several objects. Three other images were also used: photographs of a woman, a cityscape and a countryside. In addition to the pair-wise comparisons, observers (N=10) were interviewed about the subjective quality attributes they used in making their quality decisions. An examination of the individual pair-wise comparisons revealed serious inconsistencies in the observers' evaluations for the test image, but not for the other contents. The qualitative analysis showed that this inconsistency was due to the observers' focus of attention. The lack of an easily recognizable context in the test image may have contributed to this inconsistency. To obtain reliable knowledge of the effect of image context or attention on subjective image quality, a qualitative methodology is needed.
NASA Astrophysics Data System (ADS)
Nawani, Jigna; Rixius, Julia; Neuhaus, Birgit J.
2016-08-01
Empirical analysis of secondary biology classrooms revealed that, on average, 68% of teaching time in Germany revolved around processing tasks. Quality of instruction can thus be assessed by analyzing the quality of tasks used in classroom discourse. This quasi-experimental study analyzed how teachers used tasks in 38 videotaped biology lessons on the topic 'blood and circulatory system'. Two fundamental characteristics used to analyze tasks were: (1) the required cognitive level of processing (e.g. low-level information processing: repetition, summary, definition, classification; high-level information processing: interpreting and analyzing data, formulating hypotheses, etc.) and (2) the complexity of the task content (e.g. whether tasks require use of factual, linking or concept-level content). Additionally, students' cognitive knowledge structure about the topic 'blood and circulatory system' was measured using student-drawn concept maps (N = 970 students). Finally, linear multilevel models were created with high-level cognitive processing tasks and higher content-complexity tasks as class-level predictors and students' prior knowledge, students' interest in biology, and students' interest in biology activities as control covariates. Results showed a positive influence of high-level cognitive processing tasks (β = 0.07; p < .01) on students' cognitive knowledge structure. However, there was no observed effect of higher content-complexity tasks on students' cognitive knowledge structure. These findings encourage the use of high-level cognitive processing tasks in biology instruction.
Monitoring system for the quality assessment in additive manufacturing
NASA Astrophysics Data System (ADS)
Carl, Volker
2015-03-01
Additive Manufacturing (AM) refers to a process by which a set of digital data - representing a certain complex 3dim design - is used to grow the respective 3dim real structure equal to the corresponding design. For the powder-based EOS manufacturing process a variety of plastic and metal materials can be used. AM is thereby in many aspects a very powerful tool, as it can help to overcome particular limitations in conventional manufacturing. AM enables more freedom of design; complex, hollow and/or lightweight structures; and product individualisation and functional integration. As such it is a promising approach with respect to the future design and manufacturing of complex 3dim structures. On the other hand, it certainly calls for new methods and standards in view of quality assessment. In particular, when utilizing AM for the design of complex parts used in aviation and aerospace technologies, appropriate monitoring systems are mandatory. In this respect, sustainable progress has recently been accomplished by joining the efforts of a manufacturer of Additive Manufacturing systems and the respective materials (EOS), an operator of such systems (MTU Aero Engines), and experienced application engineers (Carl Metrology) with established know-how in optical and infrared methods for non-destructive examination (NDE). The newly developed technology is best described as a high-resolution layer-by-layer inspection technique, which allows for a 3D tomography-like analysis of the complex part at any time during the manufacturing process. Inspection costs are kept rather low by using smart image-processing methods as well as CMOS sensors instead of infrared detectors. Moreover, results from conventional physical metallurgy may easily be correlated with the predictive results of the monitoring system, which not only allows for improvements of the AM monitoring system but finally leads to an optimisation of the quality and assurance of the material integrity of the complex structure being manufactured. Both our poster and our oral presentation will explain the data flow between the above-mentioned parties. A suitable monitoring system for Additive Manufacturing will be introduced, along with a presentation of the respective high-resolution data acquisition, as well as the image processing and data analysis allowing for precise control of the 3dim growth process.
NASA Astrophysics Data System (ADS)
Anh, N. K.; Phonekeo, V.; My, V. C.; Duong, N. D.; Dat, P. T.
2014-02-01
In recent years, the Vietnamese economy has been growing rapidly, causing serious degradation of environmental quality, especially in industrial and mining areas. This poses an enormous threat to socially sustainable development and to human health. Environmental quality assessment and protection are complex and dynamic processes, since they involve spatial information from multi-sector, multi-region and multi-field sources and need complicated data processing. Therefore, an effective environmental protection information system is needed, in which considerable factors hidden in complex relationships become clear and visible. In this paper, the authors present the methodology used to generate environmental hazard maps by integrating the Analytic Hierarchy Process (AHP) with a Geographical Information System (GIS). We demonstrate the results obtained from the study area in Dong Trieu district. This research has contributed an overall perspective of environmental quality and identified the devastated areas where the administration urgently needs to establish an appropriate policy to improve and protect the environment.
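A minimal sketch of the AHP weighting step such a workflow relies on: criterion weights are the normalized principal eigenvector of a pairwise comparison matrix, with a consistency-ratio check. The criteria and judgments below are assumptions for illustration, not the study's data.

    import numpy as np

    A = np.array([        # pairwise importance of three assumed criteria,
        [1.0, 3.0, 5.0],  # e.g. water quality vs air quality vs soil quality
        [1/3, 1.0, 3.0],
        [1/5, 1/3, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                          # criterion weights, sum to 1

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
    cr = ci / 0.58                        # random index RI = 0.58 for n = 3
    print(w, cr)                          # CR < 0.1 -> judgments acceptably consistent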
Effective Software Engineering Leadership for Development Programs
ERIC Educational Resources Information Center
Cagle West, Marsha
2010-01-01
Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…
Rooijakkers, Michiel; Rabotti, Chiara; Bennebroek, Martijn; van Meerbergen, Jef; Mischi, Massimo
2011-01-01
Non-invasive fetal health monitoring during pregnancy has become increasingly important. Recent advances in signal processing technology have enabled fetal monitoring during pregnancy using abdominal ECG recordings. Ubiquitous ambulatory monitoring for continuous fetal health measurement is, however, still unfeasible due to the computational complexity of noise-robust solutions. In this paper an R-peak detection algorithm for ambulatory use is proposed as part of a fetal ECG detection algorithm. The proposed algorithm is optimized to reduce computational complexity while increasing R-peak detection quality compared to existing R-peak detection schemes. Validation of the algorithm is performed on two manually annotated datasets, the MIT/BIH Arrhythmia database and an in-house abdominal database. Both R-peak detection quality and computational complexity are compared to state-of-the-art algorithms as described in the literature. With detection error rates of 0.22% and 0.12% on the MIT/BIH Arrhythmia and in-house databases, respectively, the quality of the proposed algorithm is comparable to the best state-of-the-art algorithms, at a reduced computational complexity.
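A minimal sketch of this class of low-complexity detector - slope emphasis, squaring, moving-window integration, and thresholding with a refractory period - is given below; it illustrates the approach generically and is not the authors' optimized algorithm.

    import numpy as np

    def detect_r_peaks(ecg, fs, refractory_s=0.25):
        """Return sample indices of candidate R peaks."""
        feat = np.diff(ecg) ** 2                  # emphasize steep QRS slopes
        win = max(1, int(0.15 * fs))              # ~150 ms integration window
        feat = np.convolve(feat, np.ones(win) / win, mode="same")
        thr = 0.5 * feat.max()                    # simple fixed-fraction threshold
        refractory = int(refractory_s * fs)
        peaks, last = [], -refractory
        for i in range(1, len(feat) - 1):         # local maxima above threshold
            if feat[i] > thr and feat[i - 1] <= feat[i] >= feat[i + 1]:
                if i - last >= refractory:
                    peaks.append(i)
                    last = i
        return peaks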
A Hybrid Interval–Robust Optimization Model for Water Quality Management
Xu, Jieyu; Li, Yongping; Huang, Guohe
2013-01-01
In water quality management problems, uncertainties may exist in many system components and pollution-related processes (i.e., random nature of hydrodynamic conditions, variability in physicochemical processes, dynamic interactions between pollutant loading and receiving water bodies, and indeterminacy of available water and treated wastewater). These complexities lead to difficulties in formulating and solving the resulting nonlinear optimization problems. In this study, a hybrid interval–robust optimization (HIRO) method was developed through coupling stochastic robust optimization and interval linear programming. HIRO can effectively reflect the complex system features under uncertainty, where implications of water quality/quantity restrictions for achieving regional economic development objectives are studied. By delimiting the uncertain decision space through dimensional enlargement of the original chemical oxygen demand (COD) discharge constraints, HIRO enhances the robustness of the optimization processes and resulting solutions. This method was applied to planning of industry development in association with river-water pollution concern in New Binhai District of Tianjin, China. Results demonstrated that the proposed optimization model can effectively communicate uncertainties into the optimization process and generate a spectrum of potential inexact solutions supporting local decision makers in managing benefit-effective water quality management schemes. HIRO is helpful for analysis of policy scenarios related to different levels of economic penalties, while also providing insight into the tradeoff between system benefits and environmental requirements. PMID:23922495
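The interval side of such a formulation can be illustrated with a toy problem: solving the optimistic and pessimistic linear programs given interval coefficients yields an interval-valued objective. This sketch uses scipy and invented numbers; it is not the HIRO model itself.

    from scipy.optimize import linprog

    # Maximize benefit c*x subject to a COD-discharge cap a*x <= b, x >= 0,
    # with interval data c in [c_lo, c_hi] and b in [b_lo, b_hi].
    c_lo, c_hi = 3.0, 4.0            # unit benefit interval
    b_lo, b_hi = 10.0, 12.0          # allowable COD load interval
    a = 2.0                          # COD generated per unit of activity

    best = linprog([-c_hi], A_ub=[[a]], b_ub=[b_hi])   # optimistic sub-LP
    worst = linprog([-c_lo], A_ub=[[a]], b_ub=[b_lo])  # pessimistic sub-LP
    print([-worst.fun, -best.fun])   # interval-valued objective: [15.0, 24.0]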
DOT National Transportation Integrated Search
2004-01-01
Accelerated Construction Technology Transfer (ACTT) is a strategic process that uses various innovative techniques, strategies, and technologies to minimize actual construction time, while enhancing quality and safety on today's large, complex multip...
A top-down approach to fabrication of high quality vertical heterostructure nanowire arrays.
Wang, Hua; Sun, Minghua; Ding, Kang; Hill, Martin T; Ning, Cun-Zheng
2011-04-13
We demonstrate a novel top-down approach for fabricating nanowires with unprecedented complexity and optical quality by taking advantage of a nanoscale self-masking effect. We realized vertical arrays of nanowires of 20-40 nm in diameter with 16 segments of complex longitudinal InGaAsP/InP structures. The unprecedented high quality of etched wires is evidenced by the narrowest photoluminescence linewidth ever produced in similar wavelengths, indistinguishable from that of the corresponding wafer. This top-down, mask-free, large scale approach is compatible with the established device fabrication processes and could serve as an important alternative to the bottom-up approach, significantly expanding ranges and varieties of applications of nanowire technology.
Low-Cost Oil Quality Sensor Based on Changes in Complex Permittivity
Pérez, Angel Torres; Hadfield, Mark
2011-01-01
Real time oil quality monitoring techniques help to protect important industry assets, minimize downtime and reduce maintenance costs. The measurement of a lubricant’s complex permittivity is an effective indicator of the oil degradation process and it can be useful in condition based maintenance (CBM) to select the most adequate oil replacement maintenance schedules. A discussion of the working principles of an oil quality sensor based on a marginal oscillator to monitor the losses of the dielectric at high frequencies (>1 MHz) is presented. An electronic design procedure is covered which results in a low cost, effective and ruggedized sensor implementation suitable for use in harsh environments. PMID:22346666
Distance Education for Physicians: Adaptation of a Canadian Experience to Uruguay
ERIC Educational Resources Information Center
Llambi, Laura; Margolis, Alvaro; Toews, John; Dapueto, Juan; Esteves, Elba; Martinez, Elisa; Forster, Thais; Lopez, Antonio; Lockyer, Jocelyn
2008-01-01
Introduction: The production of online high-quality continuing professional development is a complex process that demands familiarity with effective program and content design. Collaboration and sharing across nations would appear to be a reasonable way to improve quality, increase access, and reduce costs. Methods: In this case report, the…
Quality Enhancement and Educational Professional Development
ERIC Educational Resources Information Center
Knight, Peter
2006-01-01
There is a strong international interest in the enhancement of teaching quality. Enhancement is a big job because teaching is an extensive activity. It is a complex job because learning to teach is not, mainly, a formal process: non-formal, practice-based learning is more significant. These two points, extensiveness and practice-based learning,…
Chip Design Process Optimization Based on Design Quality Assessment
NASA Astrophysics Data System (ADS)
Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel
2010-06-01
Nowadays, managing product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity, which makes the planning and execution of projects difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status: are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solving these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.
Progress on high-performance rapid prototype aluminum mirrors
NASA Astrophysics Data System (ADS)
Woodard, Kenneth S.; Myrick, Bruce H.
2017-05-01
Near net shape parts can be produced using some very old processes (investment casting) and the relatively new direct metal laser sintering (DMLS) process. These processes have significant advantages for complex blank lightweighting and costs but are not inherently suited to producing high-performance mirrors. The DMLS process can provide extremely complex lightweight structures, but the high residual stresses left in the material result in unstable mirror figure retention. Although not reaching the extreme intricacy of DMLS, investment casting can also provide complex lightweight structures at considerably lower cost than DMLS and even conventional wrought mirror blanks, but the less-than-100% density of castings (and also DMLS parts) limits finishing quality. This paper will cover the progress that has been made to turn both the DMLS and investment casting processes into viable near net shape blank options for high-performance aluminum mirrors. Finish and figure results will be presented to show performance commensurate with existing conventional processes.
Is the destabilization of the Cournot equilibrium a good business strategy in Cournot-Puu duopoly?
Canovas, Jose S
2011-10-01
NASA Astrophysics Data System (ADS)
Polosin, A. N.; Chistyakova, T. B.
2018-05-01
In this article, the authors describe mathematical modeling of polymer processing in extruders of various types used in extrusion and calender production of film materials. The method consists of the synthesis of a static model for calculating the throughput, energy consumption and extrudate quality indices of the extruder, together with a dynamic model for evaluating the polymer residence time in the extruder, on which the quality indices depend. The models are adjusted according to the extruder type (single-screw, reciprocating, twin-screw), its screw and head configuration, the extruder's working temperature conditions, and the type of polymer processed. The models make it possible to create extruder screw configurations and determine control-action values that yield extrudate of the required quality while satisfying throughput and energy consumption requirements. Model adequacy has been verified using processing data for polyolefins and polyvinyl chloride in different extruders. A program complex based on these mathematical models has been developed to control extruders of various types and to ensure resource and energy savings in multi-assortment production of polymeric films. Using the program complex in the control system for the extrusion stage of polymeric film production improves film quality, reduces spoilage, and lessens the time required for production-line change-over to a different throughput and film type assignment.
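For the static part, a common textbook approximation (assumed here; the authors' models are more detailed and type-specific) computes single-screw metering-zone throughput as drag flow minus pressure flow, and mean residence time as filled channel volume over throughput.

    import math

    def metering_throughput(D, h, W, phi, N, dP, mu, L):
        """Volumetric throughput Q [m^3/s] of a single-screw metering zone:
        drag flow minus pressure flow (isothermal Newtonian approximation).
        D: screw diameter [m]; h: channel depth [m]; W: channel width [m];
        phi: helix angle [rad]; N: screw speed [rev/s]; dP: head pressure
        rise [Pa]; mu: melt viscosity [Pa.s]; L: helical channel length [m]."""
        Vz = math.pi * D * N * math.cos(phi)      # down-channel barrel velocity
        drag = 0.5 * Vz * h * W
        pressure = (W * h ** 3) * dP / (12.0 * mu * L)
        return drag - pressure

    def mean_residence_time(Q, h, W, L):
        """Mean residence time [s] = filled channel volume / throughput."""
        return (h * W * L) / Q

    # Illustrative numbers, not from the article:
    Q = metering_throughput(D=0.06, h=0.003, W=0.05, phi=math.radians(17.7),
                            N=1.0, dP=1.0e7, mu=1000.0, L=1.2)
    print(Q * 3.6e6, "L/h;", mean_residence_time(Q, 0.003, 0.05, 1.2), "s")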
Clinical quality needs complex adaptive systems and machine learning.
Marsland, Stephen; Buchan, Iain
2004-01-01
The vast increase in clinical data has the potential to bring about large improvements in clinical quality and other aspects of healthcare delivery. However, such benefits do not come without cost. The analysis of such large datasets, particularly where the data may have to be merged from several sources and may be noisy and incomplete, is a challenging task. Furthermore, the introduction of clinical changes is a cyclical task, meaning that the processes under examination operate in an environment that is not static. We suggest that traditional methods of analysis are unsuitable for the task, and identify complexity theory and machine learning as areas that have the potential to facilitate the examination of clinical quality. By its nature the field of complex adaptive systems deals with environments that change because of the interactions that have occurred in the past. We draw parallels between health informatics and bioinformatics, which has already started to successfully use machine learning methods.
Paolantonacci, Philippe; Appourchaux, Philippe; Claudel, Béatrice; Ollivier, Monique; Dennett, Richard; Siret, Laurent
2018-01-01
Polyvalent human normal immunoglobulins for intravenous use (IVIg), indicated for rare and often severe diseases, are complex plasma-derived protein preparations. A quality by design approach has been used to develop the Laboratoire Français du Fractionnement et des Biotechnologies new-generation IVIg, targeting a high level of purity to generate an enhanced safety profile while maintaining a high level of efficacy. A modular approach to quality by design was implemented, consisting of five consecutive steps to cover all stages from product design to the final product control strategy. A well-defined target product profile was translated into 27 product quality attributes that formed the basis of the process design. In parallel, a product risk analysis was conducted and identified 19 critical quality attributes among the product quality attributes. Process risk analysis was carried out to establish the links between process parameters and critical quality attributes. Twelve critical steps were identified, and for each of these steps a risk mitigation plan was established. Among the different process risk mitigation exercises, five process robustness studies were conducted at qualified small scale with a design of experiments approach. For each process step, critical process parameters were identified and, for each critical process parameter, proven acceptable ranges were established. The quality risk management and risk mitigation outputs, including verification of proven acceptable ranges, were used to design the process verification exercise at industrial scale. Finally, the control strategy was established using a hybrid of the traditional approach and elements of the enhanced quality by design approach, as illustrated, in order to more robustly assign material and process controls and to securely meet product specifications. The advantages of this quality by design approach were improved process knowledge for industrial design and process validation, and a clear justification of the process and product specifications as a basis for the control strategy and future comparability exercises. © PDA, Inc. 2018.
NASA Earth Observation Systems and Applications for Health and Air Quality
NASA Technical Reports Server (NTRS)
Omar, Ali H.
2015-01-01
There is a growing body of evidence that the environment can affect human health in ways that are both complex and global in scope. To address some of these complexities, NASA maintains a diverse constellation of Earth observing research satellites, and sponsors research in developing satellite data applications across a wide spectrum of areas. These include environmental health; infectious disease; air quality standards, policies, and regulations; and the impact of climate change on health and air quality in a number of interrelated efforts. The Health and Air Quality Applications program fosters the use of observations, modeling systems, forecast development, application integration, and the research-to-operations transition process to address environmental health effects. NASA has been a primary partner of Federal operational agencies over the past nine years in these areas. This talk presents the background of the Health and Air Quality Applications program, recent accomplishments, and a plan for the future.
Make no mistake—errors can be controlled*
Hinckley, C
2003-01-01
Traditional quality control methods identify "variation" as the enemy. However, the control of variation by itself can never achieve the remarkably low non-conformance rates of world class quality leaders. Because the control of variation does not achieve the highest levels of quality, an inordinate focus on these techniques obscures key quality improvement opportunities and results in unnecessary pain and suffering for patients, and embarrassment, litigation, and loss of revenue for healthcare providers. Recent experience has shown that mistakes are the most common cause of problems in health care as well as in other industrial environments. Excessive product and process complexity contributes to both excessive variation and unnecessary mistakes. The best methods for controlling variation, mistakes, and complexity are each a form of mistake proofing. Using these mistake proofing techniques, virtually every mistake and non-conformance can be controlled at a fraction of the cost of traditional quality control methods. PMID:14532368
Defining the best quality-control systems by design and inspection.
Hinckley, C M
1997-05-01
Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.
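To make the distinction concrete: a Shewhart X-bar chart, sketched below, flags subgroup means outside 3-sigma control limits and thus controls variation, but a one-off mistake whose result happens to fall inside the limits passes undetected. The data and limits are illustrative, not from the paper.

    import statistics

    def xbar_out_of_control(subgroup_means, sigma_xbar):
        """Indices of subgroup means outside the 3-sigma control limits
        around the grand mean (classical Shewhart X-bar rule)."""
        grand = statistics.mean(subgroup_means)
        lcl, ucl = grand - 3 * sigma_xbar, grand + 3 * sigma_xbar
        return [i for i, m in enumerate(subgroup_means) if not lcl <= m <= ucl]

    # Excess variation shows up as point 5 beyond the limits and is caught;
    # a mistake producing an in-limit value (e.g. a swapped sample that
    # still reads 10.1) would pass unnoticed, motivating source inspection.
    means = [10.0, 10.1, 9.9, 10.0, 10.2, 11.5, 10.0]
    print(xbar_out_of_control(means, sigma_xbar=0.2))  # -> [5]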
Quality Assessment of Established and Emerging Blood Components for Transfusion
Marks, Denese C.
2016-01-01
Blood is donated either as whole blood, with subsequent component processing, or through the use of apheresis devices that extract one or more components and return the rest of the donation to the donor. Blood component therapy supplanted whole blood transfusion in industrialized countries in the middle of the twentieth century and remains the standard of care for the majority of patients receiving a transfusion. Traditionally, blood has been processed into three main blood products: red blood cell concentrates; platelet concentrates; and transfusable plasma. Ensuring that these products are of high quality and that they deliver their intended benefits to patients throughout their shelf-life is a complex task. Further complexity has been added with the development of products stored under nonstandard conditions or subjected to additional manufacturing steps (e.g., cryopreserved platelets, irradiated red cells, and lyophilized plasma). Here we review established and emerging methodologies for assessing blood product quality and address controversies and uncertainties in this thriving and active field of investigation. PMID:28070448
Measuring the complexity of design in real-time imaging software
NASA Astrophysics Data System (ADS)
Sangwan, Raghvinder S.; Vercellone-Smith, Pamela; Laplante, Phillip A.
2007-02-01
Due to the intricacies in the algorithms involved, the design of imaging software is considered to be more complex than non-image processing software (Sangwan et al., 2005). A recent investigation (Larsson and Laplante, 2006) examined the complexity of several image processing and non-image processing software packages along a wide variety of metrics, including those postulated by McCabe (1976), Chidamber and Kemerer (1994), and Martin (2003). This work found that it was not always possible to quantitatively compare the complexity between imaging applications and non-image processing systems. Newer research and an accompanying tool (Structure 101, 2006), however, provides a greatly simplified approach to measuring software complexity. Therefore it may be possible to definitively quantify the complexity differences between imaging and non-imaging software, between imaging and real-time imaging software, and between software programs of the same application type. In this paper, we review prior results and describe the methodology for measuring complexity in imaging systems. We then apply a new complexity measurement methodology to several sets of imaging and non-imaging code in order to compare the complexity differences between the two types of applications. The benefit of such quantification is far reaching, for example, leading to more easily measured performance improvement and quality in real-time imaging code.
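As one concrete example of the kind of metric compared in such studies, McCabe's cyclomatic complexity can be computed directly from a control-flow graph as V(G) = E - N + 2P. A minimal sketch, assuming the graph is supplied as node and edge lists (it is not tied to the Structure 101 tool mentioned above):

    def cyclomatic_complexity(edges, nodes, connected_components=1):
        """McCabe's V(G) = E - N + 2P for a control-flow graph."""
        return len(edges) - len(nodes) + 2 * connected_components

    # A single if/else: one decision node, so V(G) = 2.
    nodes = ["entry", "cond", "then", "else", "exit"]
    edges = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
             ("then", "exit"), ("else", "exit")]
    print(cyclomatic_complexity(edges, nodes))  # -> 2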
Improving the Quality of Home Health Care for Children With Medical Complexity.
Nageswaran, Savithri; Golden, Shannon L
2017-08-01
The objectives of this study are to describe the quality of home health care services for children with medical complexity, identify barriers to delivering optimal home health care, and discuss potential solutions to improve home health care delivery. In this qualitative study, we conducted 20 semistructured in-depth interviews with primary caregivers of children with medical complexity, and 4 focus groups with 18 home health nurses. During an iterative analysis process, we identified themes related to quality of home health care. There is substantial variability between home health nurses in the delivery of home health care to children. Lack of skills in nurses is common and has serious negative health consequences for children with medical complexity, including hospitalizations, emergency room visits, and need for medical procedures. Inadequate home health care also contributes to caregiver burden. A major barrier to delivering optimal home health care is the lack of training of home health nurses in pediatric care and technology use. Potential solutions for improving care include home health agencies training nurses in the care of children with medical complexity, support for nurses in clinical problem solving, and reimbursement for training nurses in pediatric home care. Caregiver-level interventions include preparing caregivers to provide medical care for their children at home and to address problems with home health care services. There are problems in the quality of home health care delivered to children with medical complexity. Training nurses in the care of children with medical complexity and preparing caregivers for home care could improve home health care quality. Copyright © 2017 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
Posch, Andreas E; Spadiut, Oliver; Herwig, Christoph
2012-06-22
Filamentous fungi are versatile cell factories and widely used for the production of antibiotics, organic acids, enzymes and other industrially relevant compounds at large scale. In fact, industrial production processes employing filamentous fungi are commonly based on complex raw materials. However, considerable lot-to-lot variability of complex media ingredients not only demands exhaustive incoming-component inspection and quality control, but unavoidably affects process stability and performance. Thus, switching bioprocesses from complex to defined media is highly desirable. This study presents a strategy for strain characterization of filamentous fungi on partly complex media using redundant mass balancing techniques. Applying the suggested method, interdependencies between specific biomass and side-product formation rates, production of fructooligosaccharides, specific complex media component uptake rates and fungal strains were revealed. A 2-fold increase in the overall penicillin space-time yield and a 3-fold increase in the maximum specific penicillin formation rate were reached in defined media compared to complex media. The newly developed methodology enabled fast characterization of two industrial Penicillium chrysogenum candidate strains on complex media, based on specific complex media component uptake kinetics, and identification of the most promising strain for switching the process from complex to defined conditions. Characterization at different complex/defined media ratios using only a limited number of analytical methods allowed maximizing the overall industrial objectives of increasing both method throughput and the generation of scientific process understanding.
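A minimal sketch of the redundancy idea behind such balancing: measured carbon rates should sum to zero, and any residual can be reconciled by a least-squares projection onto the balance constraint. The rates below are illustrative, not the authors' data.

    import numpy as np

    # Signed carbon rates [C-mol/h]: substrate uptake (negative); biomass,
    # product and CO2 formation (positive). Illustrative values only.
    r = np.array([-0.100, 0.048, 0.012, 0.038])
    e = np.ones_like(r)            # carbon balance constraint: e @ r = 0

    gap = e @ r                    # balance residual; -0.002 C-mol/h here
    r_hat = r - e * gap / (e @ e)  # least-squares reconciliation of the rates
    print(gap, e @ r_hat)          # residual before (-0.002) and after (0.0)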
ERIC Educational Resources Information Center
Bonometti, Patrizia
2012-01-01
Purpose: The aim of this contribution is to describe a new complexity-science-based approach for improving safety, quality and efficiency and the way it was implemented by TenarisDalmine. Design/methodology/approach: This methodology is called "a safety-building community". It consists of a safety-behaviour social self-construction…
T. Heartsill Scalley; F.N. Scatena; S. Moya; A.E. Lugo
2012-01-01
In heterotrophic streams the retention and export of coarse particulate organic matter and associated elements are fundamental biogeochemical processes that influence water quality, food webs and the structural complexity of forested headwater streams. Nevertheless, few studies have documented the quantity and quality of exported organic matter over multiple years and...
The ability to forecast local and regional air pollution events is challenging since the processes governing the production and sustenance of atmospheric pollutants are complex and often non-linear. Comprehensive atmospheric models, by representing in as much detail as possible t...
ERIC Educational Resources Information Center
Aikman, Sheila; Rao, Nitya
2012-01-01
The article draws on qualitative educational research across a diversity of low-income countries to examine the gendered inequalities in education as complex, multi-faceted and situated rather than a series of barriers to be overcome through linear input-output processes focused on isolated dimensions of quality. It argues that frameworks for…
Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.
2016-01-01
The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed; presents the process maps, FMEAs, fault trees, and QM programs developed; and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient. PMID:27370140
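The FMEA at the core of this framework scores each failure mode for occurrence (O), severity (S) and detectability (D) and ranks by the risk priority number RPN = O x S x D. A minimal sketch follows; the failure modes and scores are hypothetical, not taken from the report.

    # (failure mode, occurrence O, severity S, detectability D), each scored 1-10
    failure_modes = [
        ("wrong isocenter shift entered", 4, 9, 6),
        ("stale CT used for planning",    2, 8, 5),
        ("MLC calibration drift",         3, 6, 4),
    ]

    ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
    for desc, o, s, d in ranked:
        print(f"RPN={o * s * d:4d}  {desc}")  # highest RPN gets QM attention first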
NASA Astrophysics Data System (ADS)
Ghaemi, Z.; Farnaghi, M.; Alimohammadi, A.
2015-12-01
The critical impact of air pollution on human health and the environment on the one hand, and the complexity of pollutant concentration behavior on the other, lead scientists to look for advanced techniques for monitoring and predicting urban air quality. Additionally, recent developments in data measurement techniques have led to the collection of various types of data about air quality. Such data is extremely voluminous and, to be useful, it must be processed at high velocity. Due to the complexity of big data analysis, especially for dynamic applications, online forecasting of pollutant concentration trends within a reasonable processing time is still an open problem. The purpose of this paper is to present an online forecasting approach based on Support Vector Machine (SVM) to predict air quality one day in advance. In order to meet the computational requirements of large-scale data analysis, distributed computing based on the Hadoop platform has been employed to leverage the processing power of multiple processing units. The MapReduce programming model is adopted for massive parallel processing in this study. Based on the online algorithm and Hadoop framework, an online forecasting system is designed to predict the air pollution of Tehran for the next 24 hours. The results have been assessed on the basis of processing time and efficiency. Quite accurate predictions of air pollutant indicator levels within an acceptable processing time prove that the presented approach is very suitable for tackling large-scale air pollution prediction problems.
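A single-node sketch of the forecasting step, using scikit-learn's SVR on lagged daily values as a stand-in for the paper's distributed SVM; the data and parameters are invented for illustration.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    def lagged_features(series, n_lags=7):
        """Rows [x_{t-n_lags}, ..., x_{t-1}] predicting x_t."""
        X = np.array([series[i - n_lags:i] for i in range(n_lags, len(series))])
        return X, series[n_lags:]

    pm10 = np.random.default_rng(0).gamma(2.0, 30.0, size=400)  # stand-in data
    X, y = lagged_features(pm10)
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
    model.fit(X[:-30], y[:-30])           # train on all but the last 30 days
    print(model.score(X[-30:], y[-30:]))  # held-out R^2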
USDA-ARS?s Scientific Manuscript database
Electronic nose sensors are designed to detect differences in complex air sample matrices. For example, they have been used in the food industry to monitor process performance and quality control. However, no information is available on the application of sensor arrays to monitor process performanc...
A Portable Computer System for Auditing Quality of Ambulatory Care
McCoy, J. Michael; Dunn, Earl V.; Borgiel, Alexander E.
1987-01-01
Prior efforts to effectively and efficiently audit quality of ambulatory care based on comprehensive process criteria have been limited largely by the complexity and cost of data abstraction and management. Over the years, several demonstration projects have generated large sets of process criteria and mapping systems for evaluating quality of care, but these paper-based approaches have been impractical to implement on a routine basis. Recognizing that portable microcomputers could solve many of the technical problems in abstracting data from medical records, we built upon previously described criteria and developed a microcomputer-based abstracting system that facilitates reliable and cost-effective data abstraction.
Evaluation of quality improvement programmes
Ovretveit, J; Gustafson, D
2002-01-01
In response to increasing concerns about quality, many countries are carrying out large scale programmes which include national quality strategies, hospital programmes, and quality accreditation, assessment and review processes. Increasing amounts of resources are being devoted to these interventions, but do they ensure or improve quality of care? There is little research evidence as to their effectiveness or the conditions for maximum effectiveness. Reasons for the lack of evaluation research include the methodological challenges of measuring outcomes and attributing causality to these complex, changing, long term social interventions to organisations or health systems, which themselves are complex and changing. However, methods are available which can be used to evaluate these programmes and which can provide decision makers with research based guidance on how to plan and implement them. This paper describes the research challenges, the methods which can be used, and gives examples and guidance for future research. It emphasises the important contribution which such research can make to improving the effectiveness of these programmes and to developing the science of quality improvement. PMID:12486994
Study of mould design and forming process on advanced polymer-matrix composite complex structure
NASA Astrophysics Data System (ADS)
Li, S. J.; Zhan, L. H.; Bai, H. M.; Chen, X. P.; Zhou, Y. Q.
2015-07-01
Advanced carbon fibre-reinforced polymer-matrix composites are widely applied in the aviation manufacturing field due to their outstanding performance. In this paper, the mould design and forming process of a complex composite structure are discussed in detail, using the hat-stiffened structure as an example. The key issues of the mould design are analyzed, and the corresponding solutions are presented. The crucial control points of the forming process, such as the determination of materials and stacking sequence and the temperature and pressure route of the co-curing process, are introduced. In order to guarantee the forming quality of the composite hat-stiffened structure, a mathematical model for the aperture of the rubber mandrel is introduced. The study presented in this paper may provide practical reference for the design and manufacture of important complex composite structures.
Unpacking the black box of improvement
Ramaswamy, Rohit; Reed, Julie; Livesley, Nigel; Boguslavsky, Victor; Garcia-Elorrio, Ezequiel; Sax, Sylvia; Houleymata, Diarra; Kimble, Leighann; Parry, Gareth
2018-01-01
During the Salzburg Global Seminar Session 565, 'Better Health Care: How do we learn about improvement?', participants discussed the need to unpack the 'black box' of improvement. The 'black box' refers to the fact that when quality improvement interventions are described or evaluated, there is a tendency to assume a simple, linear path between the intervention and the outcomes it yields. It is also assumed that it is enough to evaluate the results without understanding the process by which the improvement took place. However, quality improvement interventions are complex and nonlinear, and they evolve in response to local settings. To accurately assess the effectiveness of quality improvement and disseminate the learning, there must be a greater understanding of the complexity of quality improvement work. To remain consistent with the language used in Salzburg, we refer to this as 'unpacking the black box' of improvement. To illustrate the complexity of improvement, this article introduces four quality improvement case studies. In unpacking the black box, we present and demonstrate how the Cynefin framework from complexity theory can be used to categorize and evaluate quality improvement interventions. Many quality improvement projects are implemented in complex contexts, necessitating an approach defined as 'probe-sense-respond'. In this approach, teams experiment, learn and adapt their changes to their local setting. Quality improvement professionals intuitively use the probe-sense-respond approach in their work, but document and evaluate their projects using language for 'simple' or 'complicated' contexts rather than the 'complex' contexts in which they work. As a result, evaluations tend to ask 'How can we attribute outcomes to the intervention?' rather than 'What were the adaptations that took place?'. By unpacking the black box of improvement, improvers can more accurately document and describe their interventions, allowing evaluators to ask the right questions and more adequately evaluate quality improvement interventions. PMID:29462325
Quality control in the development of coagulation factor concentrates.
Snape, T J
1987-01-01
Limitation of process change is a major factor contributing to assurance of quality in pharmaceutical manufacturing. This is particularly true in the manufacture of coagulation factor concentrates, for which presumptive testing for poorly defined product characteristics is an integral feature of finished product quality control. The development of new or modified preparations requires that this comfortable position be abandoned, and that the effect on finished product characteristics of changes to individual process steps (and components) be assessed. The degree of confidence in the safety and efficacy of the new product will be determined by, amongst other things, the complexity of the process alteration and the extent to which the results of finished product tests can be considered predictive. The introduction of a heat-treatment step for inactivation of potential viral contaminants in coagulation factor concentrates presents a significant challenge in both respects, quite independent of any consideration of assessment of the effectiveness of the viral inactivation step. These interactions are illustrated by some of the problems encountered with terminal dry heat-treatment (72 h at 80 °C) of factor VIII and prothrombin complex concentrates manufactured by the Blood Products Laboratory.
Effect assessment in work environment interventions: a methodological reflection.
Neumann, W P; Eklund, J; Hansson, B; Lindbeck, L
2010-01-01
This paper addresses a number of issues for work environment intervention (WEI) researchers in light of the mixed results reported in the literature. If researchers emphasise study quality over intervention quality, reviews that exclude case studies with high quality and multifactorial interventions may be vulnerable to 'quality criteria selection bias'. Learning from 'failed' interventions is inhibited by both publication bias and reporting lengths that limit information on relevant contextual and implementation factors. The authors argue for the need to develop evaluation approaches consistent with the complexity of multifactorial WEIs that: a) are owned by and aimed at the whole organisation; and b) include intervention in early design stages where potential impact is highest. Context variety, complexity and instability in and around organisations suggest that attention might usefully shift from generalisable 'proof of effectiveness' to a more nuanced identification of intervention elements and the situations in which they are more likely to work as intended. STATEMENT OF RELEVANCE: This paper considers ergonomics interventions from the perspectives of what constitutes quality and 'proof'. It points to limitations of traditional experimental intervention designs and argues that the complexity of organisational change, and the need for multifactorial interventions that reach deep into work processes for greater impact, should be recognised.
Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G
2013-01-16
Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
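The automated unit tests mentioned above can be illustrated with a minimal example: a replicate-precision check of the kind such QC software performs. The function and test names here are hypothetical, not those of the published application:

    import unittest

    def coefficient_of_variation(values):
        """Percent CV of replicate QC measurements (illustrative helper)."""
        n = len(values)
        mean = sum(values) / n
        var = sum((v - mean) ** 2 for v in values) / (n - 1)
        return 100.0 * (var ** 0.5) / mean

    class TestQcCalculations(unittest.TestCase):
        def test_cv_of_identical_replicates_is_zero(self):
            self.assertAlmostEqual(coefficient_of_variation([5.0, 5.0, 5.0]), 0.0)

        def test_cv_matches_hand_computed_value(self):
            # mean = 10, sd = 1  ->  CV = 10 %
            self.assertAlmostEqual(coefficient_of_variation([9.0, 10.0, 11.0]), 10.0)

    if __name__ == "__main__":
        unittest.main()

Pinning each QC calculation to tests like these, under version control, is what couples procedural and software changes to the results of each analysis.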
Hearld, Larry R; Alexander, Jeffrey A; Fraser, Irene; Jiang, H Joanna
2008-06-01
Interest in organizational contributions to the delivery of care has risen significantly in recent years. A challenge facing researchers, practitioners, and policy makers is identifying ways to improve care by improving the organizations that provide this care, given the complexity of health care organizations and the role organizations play in influencing systems of care. This article reviews the literature on the relationship between the structural characteristics and organizational processes of hospitals and quality of care. The review uses Donabedian's structure-process-outcome and level of analysis frameworks to organize the literature. The results of this review indicate that a preponderance of studies are conducted at the hospital level of analysis and are predominantly focused on the organizational structure-quality outcome relationship. The article concludes with recommendations of how health services researchers can expand their research to enhance one's understanding of the relationship between organizational characteristics and quality of care.
ERIC Educational Resources Information Center
Simon, Cecilia; Echeita, Gerardo; Sandoval, Marta; Lopez, Mauricio
2010-01-01
Inclusive education is a complex and multidimensional process that, among other aspirations, tries to foster the rights of every student to obtain a high-quality education. This process focuses on the diversity of needs of all students by increasing participation in learning, cultures, and communities and reducing exclusion within and from…
Multiphase porous media modelling: A novel approach to predicting food processing performance.
Khan, Md Imran H; Joardder, M U H; Kumar, Chandan; Karim, M A
2018-03-04
The development of a physics-based model of food processing is essential to improve the quality of processed food and optimize energy consumption. Food materials, particularly plant-based food materials, are complex in nature as they are porous and have hygroscopic properties. A multiphase porous media model for simultaneous heat and mass transfer can provide a realistic understanding of transport processes and thus can help to optimize energy consumption and improve food quality. Although the development of a multiphase porous media model for food processing is a challenging task because of its complexity, many researchers have attempted it. The primary aim of this paper is to present a comprehensive review of the multiphase models available in the literature for different methods of food processing, such as drying, frying, cooking, baking, heating, and roasting. A critical review of the parameters that should be considered for multiphase modelling is presented, covering input parameters, material properties, simulation techniques and modelling hypotheses. A discussion on the general trends in outcomes, such as moisture saturation, temperature profile, pressure variation, and evaporation patterns, is also presented. The paper concludes by considering key issues in the existing multiphase models and future directions for the development of multiphase models.
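The multiphase models surveyed here typically solve coupled conservation equations of the following generic form (a sketch of the common structure only, not the formulation of any particular paper reviewed). For each phase i (liquid water, vapour, air) with concentration c_i:

    \frac{\partial c_i}{\partial t} + \nabla \cdot (\mathbf{v}\, c_i)
        = \nabla \cdot \left( D_{\mathrm{eff},i} \, \nabla c_i \right) + S_i

where v is the convective velocity, D_eff,i an effective diffusivity, and S_i a source/sink term for evaporation or condensation; an analogous energy balance with an evaporative heat sink couples the temperature field to the moisture fields.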
Du, Yongzhao; Fu, Yuqing; Zheng, Lixin
2016-12-20
A real-time complex amplitude reconstruction method for determining the dynamic beam quality M2 factor based on a Mach-Zehnder self-referencing interferometer wavefront sensor is developed. By using the proposed complex amplitude reconstruction method, full characterization of the laser beam, including amplitude (intensity profile) and phase information, can be reconstructed from a single interference pattern with the Fourier fringe pattern analysis method in a one-shot measurement. With the reconstructed complex amplitude, the beam fields at any position z along the propagation direction can be obtained using diffraction integral theory. The beam quality M2 factor of the dynamic beam is then calculated according to the method specified in the ISO 11146 standard. The feasibility of the proposed method is demonstrated by theoretical analysis and experiment, covering both static and dynamic beam processes. The method is simple, fast, operates without movable parts, and makes it possible to investigate laser beams under conditions inaccessible to existing methods.
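For reference, ISO 11146 defines the beam quality factor from second moments of the intensity distribution: with the second-moment waist diameter d_{\sigma 0}, the far-field full divergence angle \theta and the wavelength \lambda,

    M^2 = \frac{\pi \, d_{\sigma 0} \, \theta}{4 \lambda}

The reconstructed complex amplitude yields the intensity profile at every propagation distance z, from which these second moments, and hence M2, follow.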
Square2 - A Web Application for Data Monitoring in Epidemiological and Clinical Studies
Schmidt, Carsten Oliver; Krabbe, Christine; Schössow, Janka; Albers, Martin; Radke, Dörte; Henke, Jörg
2017-01-01
Valid scientific inferences from epidemiological and clinical studies require high data quality. Data generating departments therefore aim to detect data irregularities as early as possible in order to guide quality management processes. In addition, after the completion of data collections the obtained data quality must be evaluated. This can be challenging in complex studies due to a wide scope of examinations, numerous study variables, multiple examiners, devices, and examination centers. This paper describes a Java EE web application used to monitor and evaluate data quality in institutions with complex and multiple studies, named Square2. It uses the Java libraries Apache MyFaces 2, extended by BootsFaces for layout and style. RServe and REngine manage calls to R server processes. All study data and metadata are stored in PostgreSQL. R is the statistics backend and LaTeX is used for the generation of print-ready PDF reports. A GUI manages the entire workflow. Square2 covers all steps in the data monitoring workflow, including the setup of studies and their structure, the handling of metadata for data monitoring purposes, selection of variables, upload of data, statistical analyses, and the generation as well as inspection of quality reports. To take data protection issues into account, Square2 comprises an extensive user rights and roles concept.
Gee, Adrian P.; Richman, Sara; Durett, April; McKenna, David; Traverse, Jay; Henry, Timothy; Fisk, Diann; Pepine, Carl; Bloom, Jeannette; Willerson, James; Prater, Karen; Zhao, David; Koç, Jane Reese; Ellis, Steven; Taylor, Doris; Cogle, Christopher; Moyé, Lemuel; Simari, Robert; Skarlatos, Sonia
2013-01-01
Background and Aims: Multi-center cellular therapy clinical trials require the establishment and implementation of standardized cell processing protocols and associated quality control mechanisms. The aims here were to develop such an infrastructure in support of the Cardiovascular Cell Therapy Research Network (CCTRN) and to report on the results of processing for the first 60 patients. Methods: Standardized cell preparations, consisting of autologous bone marrow mononuclear cells prepared using the Sepax device, were manufactured at each of the five processing facilities that supported the clinical treatment centers. Processing staff underwent centralized training that included proficiency evaluation. Quality was subsequently monitored by a central quality control program that included product evaluation by the CCTRN biorepositories. Results: Data from the first 60 procedures demonstrate that uniform products meeting all release criteria could be manufactured at all five sites within 7 hours of receipt of the bone marrow. Uniformity was facilitated by use of automated systems (the Sepax for processing and the Endosafe device for endotoxin testing), standardized procedures and centralized quality control. Conclusions: Complex multicenter cell therapy and regenerative medicine protocols can, where necessary, successfully utilize local processing facilities once an effective infrastructure is in place to provide training and quality control. PMID:20524773
Problem of quality assurance during metal constructions welding via robotic technological complexes
NASA Astrophysics Data System (ADS)
Fominykh, D. S.; Rezchikov, A. F.; Kushnikov, V. A.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.
2018-05-01
The problem of minimizing the probability of critical combinations of events that lead to a loss of welding quality in robotic process automation is examined. The problem is formulated, and models and algorithms for its solution are developed. The problem is solved by minimizing a criterion characterizing the losses caused by defective products. Solving the problem may enhance the quality and accuracy of the operations performed and reduce the losses caused by defective products.
Enhancing Learning Performance and Adaptability for Complex Tasks
2005-03-30
development of active learning interventions and techniques that influence the focus and quality of learner regulatory activity (Kozlowski Toney et al...what are the effects of these goal representations on learning strategies, performance, and adaptability? Can active learning inductions, that influence...and mindful process - active learning - are generally associated with improved skill acquisition and adaptability for complex tasks (Smith et al
ERIC Educational Resources Information Center
Hilbig, Annemarie; Proske, Antje
2014-01-01
Although academic writing is a complex interplay of comprehending and producing text the aspect of collecting information from source texts is hardly addressed in writing research. This study examined the impact of instructions supporting the collection process on writing quality, as well as the role of prior motivation and computer experience.…
In-situ acoustic signature monitoring in additive manufacturing processes
NASA Astrophysics Data System (ADS)
Koester, Lucas W.; Taheri, Hossein; Bigelow, Timothy A.; Bond, Leonard J.; Faierson, Eric J.
2018-04-01
Additive manufacturing is a rapidly maturing process for the production of complex metallic, ceramic, polymeric, and composite components. The processes used are numerous, and the complex geometries involved can make quality control, standardization of the process, and inspection difficult. Acoustic emission measurements have been used previously to monitor a number of processes including machining and welding. The authors have identified acoustic signature measurement as a potential means of monitoring metal additive manufacturing processes using process noise characteristics and those discrete acoustic emission events characteristic of defect growth, including cracks and delamination. Results of acoustic monitoring for a metal additive manufacturing process (directed energy deposition) are reported. The work investigated correlations between acoustic emissions and process noise with variations in machine state and deposition parameters, and provided proof-of-concept data that such correlations do exist.
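The correlation step can be as simple as relating a scalar acoustic feature to a deposition parameter across builds. A minimal sketch with made-up data standing in for real microphone records (the feature, the parameter values and the proportionality are illustrative assumptions, not the authors' setup):

    import numpy as np

    def rms(signal):
        """Root-mean-square amplitude of one acoustic record."""
        signal = np.asarray(signal, dtype=float)
        return np.sqrt(np.mean(signal ** 2))

    # Illustrative data: one synthetic acoustic record per build, with
    # amplitude assumed proportional to the laser power (W) of that build.
    rng = np.random.default_rng(0)
    powers = np.array([200.0, 250.0, 300.0, 350.0, 400.0])
    records = [p * 0.01 * rng.standard_normal(10_000) for p in powers]

    features = np.array([rms(r) for r in records])
    r = np.corrcoef(powers, features)[0, 1]   # Pearson correlation
    print(f"correlation between laser power and acoustic RMS: {r:.2f}")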
Data Quality- and Master Data Management - A Hospital Case.
Arthofer, Klaus; Girardi, Dominic
2017-01-01
Poor data quality prevents the analysis of data for decisions which are critical for business. It also has a negative impact on business processes. Nevertheless, the maturity level of data quality and master data management is still insufficient in many organizations today. This article discusses the corresponding maturity of companies and a management cycle integrating data quality and master data management, illustrated by a case dealing with benchmarking in hospitals. In conclusion, if data quality and master data are not properly managed, structured data should not be acquired in the first place due to the added expense and complexity.
Second Iteration of Photogrammetric Pipeline to Enhance the Accuracy of Image Pose Estimation
NASA Astrophysics Data System (ADS)
Nguyen, T. G.; Pierrot-Deseilligny, M.; Muller, J.-M.; Thom, C.
2017-05-01
In the classical photogrammetric processing pipeline, automatic tie point extraction plays a key role in the quality of the achieved results. Image tie points are crucial to pose estimation and have a significant influence on the precision of the calculated orientation parameters. Therefore, both the relative and absolute orientations of the 3D model can be affected. By improving the precision of image tie point measurement, one can enhance the quality of image orientation. The quality of image tie points is influenced by several factors such as their multiplicity, measurement precision and distribution in the 2D images as well as in the 3D scene. In complex acquisition scenarios such as indoor applications and oblique aerial images, tie point extraction is limited when only image information can be exploited. Hence, we propose a method which improves the precision of pose estimation in complex scenarios by adding a second iteration to the classical processing pipeline. The result of the first iteration is used as a priori information to guide the extraction of new tie points with better quality. Evaluated on multiple case studies, the proposed method shows its validity and its high potential for precision improvement.
Identifying challenges in project consultants engagement practices
NASA Astrophysics Data System (ADS)
Shariffuddin, Nadia Alina Amir; Abidin, Nazirah Zainul
2017-10-01
Construction projects, green or conventional, involve multi-faceted disciplines engaged with the goal of delivering products, i.e. buildings, infrastructure, etc., at the best quality within stipulated budgets. Green projects add a further focus on environmental quality. Due to the various responsibilities and liabilities involved, as well as the complexity of the construction process itself, the formal engagement of multi-disciplinary professionals, i.e. project consultants, is required in any construction project. Poor selection of project consultants will lead to a multitude of complications resulting in delay, cost escalation, conflicts and poor quality. This paper explores the challenges that occur during the engagement of project consultants in a green project. As the engagement decision involves developers and architects, these two groups of respondents with green project backgrounds were approached qualitatively using an interview technique. The challenges identified are limited experience and knowledge, consultants' fees vs. quality, green complexity, conflicts of interest, clients' extended expectations and low demand for green projects. The construction industry's shift to green projects demands the engagement of project consultants with added skills. It is expected that through the identification of challenges, better management and administration can be created, which would benefit the overall process of engagement in green projects.
Optical in situ monitoring of plasma-enhanced atomic layer deposition process
NASA Astrophysics Data System (ADS)
Zeeshan Arshad, Muhammad; Jo, Kyung Jae; Kim, Hyun Gi; Jeen Hong, Sang
2018-06-01
An optical in situ process monitoring method for the early detection of anomalies in plasma process equipment is presented. The cyclic process steps of precursor treatment and plasma reaction used to deposit angstrom-scale films add complexity to ensuring process quality, and a small deviation in process parameters, for instance gas flow rate, process temperature, or RF power, may jeopardize the deposited film quality. As a test vehicle for process monitoring, we investigated the aluminum oxide (Al2O3) encapsulation process in plasma-enhanced atomic layer deposition (PEALD), used to form a moisture and oxygen diffusion barrier in organic light-emitting diodes (OLEDs). By optical in situ monitoring, we successfully identified reductions in oxygen flow rate in the reaction steps, which resulted in a 2.67-fold increase in the water vapor transmission rate (WVTR) of the deposited Al2O3 films. We are therefore convinced that the suggested in situ monitoring method is useful for detecting process shifts or drifts that adversely affect PEALD film quality.
MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control
NASA Astrophysics Data System (ADS)
Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming
2017-09-01
The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches under dynamic production conditions. A Bayesian network and big data analytics integrated approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, the large volume and variety of quality-related data generated during the manufacturing process can be handled. Artificial intelligence algorithms, including Bayesian network learning, classification and reasoning, are embedded into the Reduce process. Relying on the ability of Bayesian networks to deal with dynamic and uncertain problems and on the parallel computing power of MapReduce, Bayesian networks of the factors affecting quality are built from prior probability distributions and modified with posterior probability distributions. A case study on hull segment manufacturing precision management for ship and offshore platform building shows that computing speed accelerates almost in direct proportion to the increase in computing nodes. It is also shown that the proposed model is feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision-making for precision problem solving. The integration of big data analytics and the Bayesian network method offers a whole new perspective on manufacturing quality control.
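The map/reduce split described above can be shown in miniature: mappers key each production record by its parent-factor configuration, and the reducer estimates a conditional probability table for the quality outcome. A pure-Python sketch of the pattern, with invented factor names and an in-process grouping step standing in for the Hadoop shuffle (none of this is the authors' system):

    from collections import defaultdict

    # Each record: (process factors..., quality outcome).  Illustrative data.
    records = [
        ("high_temp", "worn_tool", "defect"),
        ("high_temp", "new_tool", "ok"),
        ("low_temp", "worn_tool", "ok"),
        ("high_temp", "worn_tool", "defect"),
    ]

    def map_phase(record):
        """Emit (parent-configuration, outcome) pairs, one per record."""
        *parents, outcome = record
        yield tuple(parents), outcome

    def reduce_phase(grouped):
        """Estimate P(outcome | parents) from the counts in each group."""
        cpt = {}
        for parents, outcomes in grouped.items():
            total = len(outcomes)
            cpt[parents] = {o: outcomes.count(o) / total for o in set(outcomes)}
        return cpt

    # Shuffle/sort step: group mapper output by key, as Hadoop would.
    grouped = defaultdict(list)
    for rec in records:
        for key, value in map_phase(rec):
            grouped[key].append(value)

    print(reduce_phase(grouped))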
Model-based quality assessment and base-calling for second-generation sequencing data.
Bravo, Héctor Corrada; Irizarry, Rafael A
2010-09-01
Second-generation sequencing (sec-gen) technology can sequence millions of short fragments of DNA in parallel, making it capable of assembling complex genomes for a small fraction of the price and time of previous technologies. In fact, a recently formed international consortium, the 1000 Genomes Project, plans to fully sequence the genomes of approximately 1200 people. The prospect of comparative analysis at the sequence level of a large number of samples across multiple populations may be achieved within the next five years. These data present unprecedented challenges in statistical analysis. For instance, analysis operates on millions of short nucleotide sequences, or reads (strings of A, C, G, or T between 30 and 100 characters long), which are the result of complex processing of noisy continuous fluorescence intensity measurements known as base-calling. The complexity of the base-calling discretization process results in reads of widely varying quality within and across sequence samples. This variation in processing quality results in infrequent but systematic errors that we have found to mislead downstream analysis of the discretized sequence read data. For instance, a central goal of the 1000 Genomes Project is to quantify across-sample variation at the single nucleotide level. At this resolution, small error rates in sequencing prove significant, especially for rare variants. Sec-gen sequencing is a relatively new technology for which potential biases and sources of obscuring variation are not yet fully understood. Therefore, modeling and quantifying the uncertainty inherent in the generation of sequence reads is of utmost importance. In this article, we present a simple model to capture uncertainty arising in the base-calling procedure of the Illumina/Solexa GA platform. Model parameters have a straightforward interpretation in terms of the chemistry of base-calling, allowing for informative and easily interpretable metrics that capture the variability in sequencing quality. Our model provides these informative estimates readily usable in quality assessment tools while significantly improving base-calling performance. © 2009, The International Biometric Society.
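For context (a general sequencing convention, not part of the authors' model): per-base quality estimates such as those discussed here are conventionally reported on the Phred scale, which encodes the estimated probability p that a called base is wrong as

    Q = -10 \log_{10} p , \qquad p = 10^{-Q/10}

so Q = 20 corresponds to a 1% error probability and Q = 30 to 0.1%.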
ERIC Educational Resources Information Center
Marshall, Stephen
2016-01-01
Sense-making is a process of engaging with complex and dynamic environments that provides organisations and their leaders with a flexible and agile model of the world. The seven key properties of sense-making describe a process that is social and that respects the range of different stakeholders in an organisation. It also addresses the need to…
Exploring Pre-Service Teachers' Perceptions of Lesson Planning in Primary Education
ERIC Educational Resources Information Center
Sahin-Taskin, Cigdem
2017-01-01
Planning a lesson is a complex process. The relationship between the quality of a lesson plan and an effective teaching-learning process is widely acknowledged by researchers and educators. Therefore, developing preservice teachers' planning skills is considered key in raising effective teachers. This research aims to understand pre-service…
Use of Failure Mode and Effects Analysis to Improve Emergency Department Handoff Processes.
Sorrentino, Patricia
2016-01-01
The purpose of this article is to describe a quality improvement process using failure mode and effects analysis (FMEA) to evaluate systems handoff communication processes, improve emergency department (ED) throughput and reduce crowding through development of a standardized handoff, and, ultimately, improve patient safety. Risk of patient harm through ineffective communication during handoff transitions is a major reason for breakdown of systems. The complexity of ED processes puts patient safety at risk. An increased incidence of submitted patient safety event reports for handoff communication failures between the ED and inpatient units solidified a decision to implement the use of FMEA to identify handoff failures and mitigate patient harm through redesign. The clinical nurse specialist implemented an FMEA. Handoff failure themes were created from deidentified retrospective reviews. Weekly meetings were held over a 3-month period to identify failure modes and determine their cause and effect on the process. A functional block diagram process map tool was used to illustrate handoff processes. An FMEA grid was used to list failure modes and assign a risk priority number to quantify results. Multiple areas with actionable failures were identified. A majority of causes for high-priority failure modes were specific to communications. Findings demonstrate the complexity of transition and handoff processes. The FMEA served to identify and evaluate the risk of handoff failures and provide a framework for process improvement. A focus on mentoring nurses in quality handoff processes so that they become habitual practice is crucial to safe patient transitions. Standardizing content and hardwiring it within the system are best practice. The clinical nurse specialist is prepared to provide strong leadership to drive and implement system-wide quality projects.
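By convention, the risk priority number (RPN) in an FMEA grid is the product of severity, occurrence and detection ratings, each typically scored 1 to 10. A minimal sketch of the ranking step; the failure modes and ratings below are hypothetical, not those identified in this study:

    # Hypothetical handoff failure modes; ratings are illustrative.
    failure_modes = [
        # (description, severity, occurrence, detection)
        ("incomplete medication list at handoff", 9, 4, 6),
        ("receiving unit not notified of transfer", 7, 5, 3),
        ("pending lab results not communicated", 8, 3, 7),
    ]

    def rpn(severity, occurrence, detection):
        """Risk priority number: S x O x D."""
        return severity * occurrence * detection

    ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
    for desc, s, o, d in ranked:
        print(f"RPN {rpn(s, o, d):4d}  {desc}")

The highest-RPN modes are the first candidates for redesign.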
Study of weld quality real-time monitoring system for auto-body assembly
NASA Astrophysics Data System (ADS)
Xu, Jun; Li, Yong-Bing; Chen, Guan-Long
2005-12-01
Resistance spot welding (RSW) is widely used for auto-body assembly in the automotive industry. But RSW suffers from inconsistent quality from weld to weld, largely because of the complexity of the basic process, which may involve material coatings, electrode force, electrode wear, fit-up, etc. Weld quality assurance therefore remains a big challenge and goal. Electrode displacement has proved to be a particularly useful signal which correlates well with weld quality. This paper introduces a novel auto-body spot weld quality monitoring system which uses electrode displacement as the quality parameter. The system uses the latest laser displacement sensor with high resolution to measure the real-time electrode displacement. It solves the interference problem of sensor mounting by designing a special fixture, and can be successfully applied on portable welding machines. It is capable of evaluating weld quality and diagnosing process variations such as surface asperities, shunting, worn electrodes and weld expansion from the real-time electrode displacement. As proved by application in the workshop, the monitoring system has good stability and reliability, and is qualified for monitoring weld quality in process.
NASA Astrophysics Data System (ADS)
Liu, Yang; Zhang, Jian; Pang, Zhicong; Wu, Weihui
2018-04-01
Selective laser melting (SLM) provides a feasible way to manufacture complex thin-walled parts directly; however, the energy input during the SLM process, which derives from the laser power, scanning speed, layer thickness, scanning space, etc., has a great influence on the quality of the thin walls. The aim of this work is to relate the thin wall's parameters (responses), namely track width, surface roughness and hardness, to the process parameters considered in this research (laser power, scanning speed and layer thickness) and to find the optimal manufacturing conditions. Design of experiments (DoE) with a central composite design was used to achieve better manufacturing quality. Mathematical models derived from the statistical analysis were used to establish the relationships between the process parameters and the responses, and the effects of the process parameters on each response were determined. A numerical optimization was then performed to find the optimal process settings at which the quality features attain their desired values. Based on this study, the relationship between process parameters and the SLM thin-walled structure was revealed, and the corresponding optimal process parameters can be used to manufacture thin-walled parts with high quality.
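A minimal sketch of the response-surface step behind such a study: fitting a quadratic model of one response (say, track width) to runs of a face-centred central composite design by least squares. The factor levels and response values below are invented for illustration and are not the study's data:

    import numpy as np

    # Face-centred central composite design in coded units for three factors
    # (laser power, scan speed, layer thickness) with an illustrative
    # response (track width, um).
    X = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
                  [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1],
                  [-1, 0, 0], [1, 0, 0], [0, -1, 0], [0, 1, 0],
                  [0, 0, -1], [0, 0, 1], [0, 0, 0]], dtype=float)
    y = np.array([119, 141, 103, 125, 123, 145, 107, 129,
                  113, 135, 132, 116, 122, 126, 124.0])

    def quadratic_design_matrix(X):
        """Columns: intercept, linear, squared and interaction terms."""
        x1, x2, x3 = X.T
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1 * x1, x2 * x2, x3 * x3,
                                x1 * x2, x1 * x3, x2 * x3])

    coef, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
    print("fitted model coefficients:", np.round(coef, 2))

The fitted polynomial can then be handed to a numerical optimizer to locate the process settings at which the predicted responses meet their targets.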
Management of laser welding based on analysis informative signals
NASA Astrophysics Data System (ADS)
Zvezdin, V. V.; Rakhimov, R. R.; Saubanov, Ruz R.; Israfilov, I. H.; Akhtiamov, R. F.
2017-09-01
Features of the formation of precision welds in metal are presented. It has been shown that the quality of the welding process depends not only on the energy characteristics of the laser processing facility and the temperature of the surface layer, but also on the accuracy of positioning the laser focus relative to the seam and the workpiece surface. The laser focus positioning accuracy is therefore an estimate of the quality of the welding process. This approach makes it possible to build an automated control system for the laser technological complex that stabilizes the positioning accuracy of the laser beam relative to the workpiece surface at a set value.
NASA Astrophysics Data System (ADS)
Babakhanova, Kh A.; Varepo, L. G.; Nagornova, I. V.; Babluyk, E. B.; Kondratov, A. P.
2018-04-01
Paper is one of the key components of the printing system that determines the quality of printed output. Providing printing companies with paper of specified printing properties, while simultaneously increasing the range and volume of paper products by applying forecasting methods and evaluation during the production process, is certainly a relevant problem. The paper presents a printing quality control algorithm that takes into consideration the assessment of the paper's printing properties depending on the technological features of manufacture and variations in composition. An information system is proposed that includes data on raw material and paper properties and makes it possible for pulp and paper enterprises to select the optimal paper composition, taking into account the procedural peculiarities of manufacturing paper with specified printing properties.
Echeverria-Beirute, Fabian; Murray, Seth C; Klein, Patricia; Kerth, Chris; Miller, Rhonda; Bertrand, Benoit
2018-05-30
Beverage quality is a complex attribute of coffee (Coffea arabica L.). Genotype (G), environment (E), management (M), postharvest processing, and roasting are all involved. However, little is known about how G × M interactions influence beverage quality. We investigated how yield and coffee leaf rust (CLR) disease (caused by Hemileia vastatrix Berk. et Br.) management affect cup quality and plant performance in two coffee cultivars. Sensory and chemical analyses revealed that 10 of 70 attributes and 18 of 154 chemical volatile compounds were significantly affected by G and M. Remarkably, acetaminophen was found for the first time in roasted coffee, and in higher concentrations under more stressful conditions. A principal component analysis described 87% of the variation in quality and overall plant performance. This study is a first step in understanding the complexity of the physiological, metabolic, and molecular changes in coffee production, which will be useful for the improvement of coffee cultivars.
Advanced Water Quality Modelling in Marine Systems: Application to the Wadden Sea, the Netherlands
NASA Astrophysics Data System (ADS)
Boon, J.; Smits, J. G.
2006-12-01
There is an increasing demand for knowledge and models arising from water management in relation to water quality, sediment quality (ecology) and sediment accumulation (ecomorphology). Models for sediment diagenesis and erosion recently developed or incorporated by Delft Hydraulics integrate the relevant physical, (bio)chemical and biological processes for the sediment-water exchange of substances. The aim of the diagenesis models is the prediction of both sediment quality and the return fluxes of substances such as nutrients and micropollutants to the overlying water. The resulting DELWAQ-G model is a new, generic version of the water and sediment quality model of the DELFT3D framework. One set of generic water quality process formulations is used to calculate process rates in both water and sediment compartments. DELWAQ-G involves the explicit simulation of sediment layers in the water quality model with state-of-the-art process kinetics. The local conditions in a water layer or sediment layer, such as the dissolved oxygen concentration, determine if and how individual processes come to expression. New processes were added for sulphate, sulphide, methane and the distribution of the electron-acceptor demand over dissolved oxygen, nitrate, sulphate and carbon dioxide. DELWAQ-G also includes the dispersive and advective transport processes in the sediment and across the sediment-water interface. DELWAQ-G has been applied to the Wadden Sea, a very dynamic, ecologically active tidal estuary with complex hydrodynamic behaviour in the north of the Netherlands. The predicted profiles in the sediment reflect the typical interactions of diagenesis processes.
Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling
NASA Technical Reports Server (NTRS)
Mog, Robert A.
1997-01-01
Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.
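The Kendall test mentioned above is a rank-based check for monotone (here, positive) dependence between paired scores. A minimal sketch with invented data standing in for model outputs (the scores below are illustrative, not SAIL results):

    from scipy.stats import kendalltau

    # Illustrative paired scores: organizational complexity vs. a product
    # quality metric predicted by the model.
    complexity = [2.0, 3.1, 3.9, 4.7, 5.5, 6.2]
    quality    = [55,  62,  72,  71,  78,  83]

    tau, p_value = kendalltau(complexity, quality)
    print(f"Kendall tau = {tau:.2f}, two-sided p = {p_value:.3f}")

A tau close to 1 with a small p-value supports positive dependence; a one-sided test halves the two-sided p-value when tau > 0.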
Virtual enterprise model for the electronic components business in the Nuclear Weapons Complex
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferguson, T.J.; Long, K.S.; Sayre, J.A.
1994-08-01
The electronic components business within the Nuclear Weapons Complex spans organizational and Department of Energy contractor boundaries. An assessment of the current processes indicates a need for fundamentally changing the way electronic components are developed, procured, and manufactured. A model is provided based on a virtual enterprise that recognizes distinctive competencies within the Nuclear Weapons Complex and at the vendors. The model incorporates changes that reduce component delivery cycle time and improve cost effectiveness while delivering components of the appropriate quality.
ERIC Educational Resources Information Center
California State Legislature, Sacramento. Joint Legislative Audit Committee.
The California legislature's Joint Legislative Audit Committee has issued a report concerning the Belmont Learning Complex (BLC) and the Los Angeles Unified School District's (LAUSD's) propensity for engaging in a series of school construction projects on contaminated land. The analysis suggests that the LAUSD was made aware of the BLC site's…
Gonzalez Bernaldo de Quiros, Fernan; Dawidowski, Adriana R; Figar, Silvana
2017-02-01
In this study, we aimed: 1) to conceptualize the theoretical challenges facing health information systems (HIS) in representing patients' decisions about health and medical treatments in everyday life; and 2) to suggest approaches for modeling these processes. The conceptualization of the theoretical and methodological challenges was discussed in 2015 during a series of interdisciplinary meetings attended by health informatics staff, epidemiologists and health professionals working in quality management and primary and secondary prevention of chronic diseases at the Hospital Italiano de Buenos Aires, together with sociologists, anthropologists and e-health stakeholders. HIS face the need, and the challenge, of representing social human processes based on constructivist and complexity theories, which are the current frameworks in the human sciences for understanding human learning and socio-cultural change. Computer systems based on these theories can model processes of social construction of concrete and subjective entities and the interrelationships between them. These theories could be implemented, among other ways, through the mapping of health assets, analysis of social impact through community trials and modeling of complexity with system simulation tools. This analysis suggests the need to complement the traditional linear causal explanations of disease onset (and treatments), which are the bases of current HIS analysis models, with constructivist and complexity frameworks. Both may illuminate the complex interrelationships among patients, health services and the health system. The aim of this strategy is to clarify people's decision-making processes in order to improve the efficiency, quality and equity of health services and the health system.
An evidence-based framework to measure quality of allied health care.
Grimmer, Karen; Lizarondo, Lucylynn; Kumar, Saravana; Bell, Erica; Buist, Michael; Weinstein, Philip
2014-02-26
There is no standard way of describing the complexities of allied health (AH) care, or its quality. AH is an umbrella term which excludes medicine and nursing, and variably includes disciplines which provide therapy, diagnostic, or scientific services. This paper outlines a framework for a standard approach to evaluate the quality of AH therapy services. A realist synthesis framework describing what AH does, how it does it, and what is achieved, was developed. This was populated by the findings of a systematic review of literature published since 1980 reporting concepts of quality relevant to AH. Articles were included on quality measurement concepts, theories, debates, and/or hypothetical frameworks. Of 139 included articles, 21 reported on descriptions of quality potentially relevant to AH. From these, 24 measures of quality were identified, with 15 potentially relating to what AH does, 17 to how AH delivers care, 8 relating to short term functional outcomes, and 9 relating to longer term functional and health system outcomes. A novel evidence-based quality framework was proposed to address the complexity of AH therapies. This should assist in better evaluation of AH processes and outcomes, costs, and evidence-based engagement of AH providers in healthcare teams.
Reduced order models for prediction of groundwater quality impacts from CO₂ and brine leakage
Zheng, Liange; Carroll, Susan; Bianchi, Marco; ...
2014-12-31
A careful assessment of the risk associated with geologic CO₂ storage is critical to the deployment of large-scale storage projects. A potential risk is the deterioration of groundwater quality caused by the leakage of CO₂ and brine from deep subsurface reservoirs. In probabilistic risk assessment studies, numerical modeling is the primary tool employed to assess risk. However, the application of traditional numerical models to fully evaluate the impact of CO₂ leakage on groundwater can be computationally complex, demanding large processing times and resources, and involving large uncertainties. As an alternative, reduced order models (ROMs) can be used as highly efficient surrogates for the complex process-based numerical models. In this study, we represent the complex hydrogeological and geochemical conditions in a heterogeneous aquifer and the subsequent risk by developing and using two separate ROMs. The first ROM is derived from a model that accounts for the heterogeneous flow and transport conditions in the presence of complex leakage functions for CO₂ and brine. The second ROM is obtained from models that feature similar, but simplified, flow and transport conditions, and allow for a more complex representation of all relevant geochemical reactions. To quantify possible impacts to groundwater aquifers, the basic risk metric is taken as the aquifer volume in which the water quality of the aquifer may be affected by an underlying CO₂ storage project. The integration of the two ROMs provides an estimate of the impacted aquifer volume taking into account uncertainties in flow, transport and chemical conditions. These two ROMs can be linked in a comprehensive system-level model for quantitative risk assessment of the deep storage reservoir, wellbore leakage, and shallow aquifer impacts to assess the collective risk of CO₂ storage projects.
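A toy illustration of the surrogate idea: train a cheap polynomial ROM on a handful of runs of an expensive process model, then evaluate the ROM thousands of times in a Monte Carlo loop. Everything here (the stand-in model, the leak-rate range, the polynomial degree) is an illustrative assumption, far simpler than the ROMs developed in the study:

    import numpy as np

    def expensive_model(leak_rate):
        """Stand-in for a full reactive-transport simulation: returns an
        impacted aquifer volume (m^3) for a given CO2 leak rate (kg/s)."""
        return 1.0e5 * np.log1p(3.0 * leak_rate)

    # Train the ROM on a few sampled runs of the expensive model.
    train_x = np.linspace(0.0, 2.0, 8)
    train_y = expensive_model(train_x)
    rom = np.polynomial.Polynomial.fit(train_x, train_y, deg=3)

    # The cheap ROM can now be evaluated thousands of times for Monte
    # Carlo risk assessment under uncertain leak rates.
    leak_samples = np.random.default_rng(1).uniform(0.0, 2.0, 10_000)
    volumes = rom(leak_samples)
    print(f"mean impacted volume: {volumes.mean():.3g} m^3")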
Removal of Lead Hydroxide Complexes from Solutions Formed in Silver/Gold Cyanidation Process
NASA Astrophysics Data System (ADS)
Parga, José R.; Martinez, Raul Flores; Moreno, Hector; Gomes, Andrew Jewel; Cocke, David L.
2014-04-01
The presence of lead hydroxides in the "pregnant cyanide solution" decreases the quality of the Doré obtained in the recovery of gold and silver, so it is convenient to remove them. The adsorbent capacity of low-cost cow bone powder was investigated for the removal of lead ions from a solution of lead hydroxide complexes at different initial metal ion concentrations (10 to 50 mg/L) and reaction times. Experiments were carried out in batches. The maximum sorption capacity for lead determined by the Langmuir model was found to be 126.58 mg/g, and the separation factor R_L was between 0 and 1, indicating a significant affinity of bone for lead. Experimental data follow pseudo-second-order kinetics, suggesting chemisorption. It is concluded that cow bone powder can be successfully used for the removal of lead ions and improves the quality of the silver-gold cyanide precipitate.
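The reported quantities follow the standard Langmuir isotherm. With equilibrium concentration C_e, maximum capacity q_max and Langmuir constant K_L, the equilibrium uptake q_e and the separation factor R_L (for initial concentration C_0) are

    q_e = \frac{q_{\max} K_L C_e}{1 + K_L C_e} , \qquad
    R_L = \frac{1}{1 + K_L C_0}

Values of R_L between 0 and 1 indicate favourable adsorption, consistent with the reported q_max of 126.58 mg/g.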
Consensus Guidelines into the Management of Epilepsy in Adults with an Intellectual Disability
ERIC Educational Resources Information Center
Kerr, M.; Scheepers, M.; Arvio, M.; Beavis, J.; Brandt, C.; Brown, S.; Huber, B.; Iivanainen, M.; Louisse, A. C.; Martin, P.; Marson, A. G.; Prasher, V.; Singh, B. K.; Veendrick, M.; Wallace, R. A.
2009-01-01
Background: Epilepsy has a pervasive impact on the lives of people with intellectual disability and their carers. The delivery of high-quality care is impacted on by the complexity and diversity of epilepsy in this population. This article presents the results of a consensus clinical guideline process. Results: A Delphi process identified a list…
Quality cell therapy manufacturing by design.
Lipsitz, Yonatan Y; Timmins, Nicholas E; Zandstra, Peter W
2016-04-01
Transplantation of live cells as therapeutic agents is poised to offer new treatment options for a wide range of acute and chronic diseases. However, the biological complexity of cells has hampered the translation of laboratory-scale experiments into industrial processes for reliable, cost-effective manufacturing of cell-based therapies. We argue here that a solution to this challenge is to design cell manufacturing processes according to quality-by-design (QbD) principles. QbD integrates scientific knowledge and risk analysis into manufacturing process development and is already being adopted by the biopharmaceutical industry. Many opportunities to incorporate QbD into cell therapy manufacturing exist, although further technology development is required for full implementation. Linking measurable molecular and cellular characteristics of a cell population to final product quality through QbD is a crucial step in realizing the potential for cell therapies to transform healthcare.
van der Voort, P H J; van der Veer, S N; de Vos, M L G
2012-10-01
In the concept of total quality management that was originally developed in industry, the use of quality indicators is essential. The implementation of quality indicators in the intensive care unit to improve the quality of care is a complex process. This process can be described in seven subsequent steps of an indicator-based quality improvement (IBQI) cycle. With this IBQI cycle, a continuous quality improvement can be achieved with the use of indicator data in a benchmark setting. After the development of evidence-based indicators, a sense of urgency has to be created, registration should start, raw data must be analysed, feedback must be given, and interpretation and conclusions must be made, followed by a quality improvement plan. The last step is the implementation of changes that needs a sense of urgency, and this completes the IBQI cycle. Barriers and facilitators are found in each step. They should be identified and addressed in a multifaceted quality improvement strategy. © 2012 The Authors. Acta Anaesthesiologica Scandinavica © 2012 The Acta Anaesthesiologica Scandinavica Foundation.
Gotlib Conn, Lesley; Zwaiman, Ashley; DasGupta, Tracey; Hales, Brigette; Watamaniuk, Aaron; Nathens, Avery B
2018-01-01
Challenges in delivering quality care are especially salient during hospital discharge and care transitions. Severely injured patients discharged from a trauma centre will go either home, to rehabilitation or to another acute care hospital with complex management needs. The purpose of this study was to explore the experiences of trauma patients and families treated in a regional academic trauma centre to better understand and improve their discharge and care transition experiences. A qualitative study using inductive thematic analysis was conducted between March and October 2016. Telephone interviews were conducted with trauma patients and/or a family member after discharge from the trauma centre. Data collection and analysis were completed inductively and iteratively, consistent with a qualitative approach. Twenty-four interviews included 19 patients and 7 family members. Participants' experiences drew attention to discharge and transfer processes that either (1) fostered quality discharge or (2) impeded quality discharge. Fostering quality discharge were ward staff preparation efforts, effective care continuity, and adequate emotional support. Impeding quality discharge were perceived pressure to leave the hospital, imposed transfer decisions, and sub-optimal communication and coordination around discharge. Patient-provider communication was viewed to be driven by system need rather than patient need. Inter-facility information gaps raised concern about receiving facilities' ability to care for injured patients. The quality of trauma patient discharge and transition experiences is undermined by system- and ward-level processes that compete, rather than align, in producing high-quality patient-centred discharge. Local improvement solutions focused on modifiable factors within the trauma centre include patient-oriented discharge education and patient navigation; however, these approaches alone may be insufficient to enhance patient experiences. Trauma patients encounter complex barriers to quality discharge that likely require a comprehensive, multimodal intervention. Copyright © 2017 Elsevier Ltd. All rights reserved.
Schwabl, Alexandra; Gämperle, Erich
2013-01-01
Tibetan recipes are complex formulas of plant and mineral ingredients. Padma Inc. has been producing selected formulas from Tibetan medicine in Switzerland for more than 40 years. Modern quality standards and Good Manufacturing Practice (GMP) guidelines are followed, ensuring quality from the raw materials through the manufacturing processes to the finished product. The aim is to provide these valuable formulas to people in the West in a consistently high quality 'made in Switzerland'. Production according to modern quality standards is challenging, draws on many resources, and requires specialized expertise, e.g. in the procurement of raw materials and in quality analysis, including pharmacognostic and botanical knowledge.
NASA Astrophysics Data System (ADS)
Glaser, Ulf; Li, Zhichao; Bichmann, Stephan, II; Pfeifer, Tilo
2003-05-01
With China's entry into the WTO, Chinese as well as German companies are facing the question of how to minimize the risk of unfamiliar cooperation partners when developing products. Rising customer demands concerning quality and product diversity, together with pressure to reduce expenses, require flexibility and efficiency with reliable component suppliers. In order to build and strengthen Sino-German cooperations, manufacturing control using homogenized and efficient measures to assure high quality is of vital importance. Without unification, identical measurements conducted at subcontractors or customers may be carried out with different measurement processes, which leads to incomparable results. Rapidly growing company cooperations and a simultaneously decreasing manufacturing scope cause substantial difficulties when coordinating joint quality control activities. "ProSens," a Sino-German project consortium consisting of industrial users, technology producers and research institutes, aims at improving selected production processes by: creating a homogeneous quality awareness in Sino-German cooperations; sensitizing for process-accompanying metrology at an early stage of product development; increasing process performance by the use of integrated metrology; reducing production time and cost; and unifying the quality control of complex products by means of efficient measurement strategies and CAD-based inspection planning.
Fast and accurate detection of spread source in large complex networks.
Paluch, Robert; Lu, Xiaoyan; Suchecki, Krzysztof; Szymański, Bolesław K; Hołyst, Janusz A
2018-02-06
Spread over complex networks is a ubiquitous process with increasingly wide applications. Locating spread sources is often important, e.g. finding patient zero in an epidemic, or the source of a rumor spreading in a social network. Pinto, Thiran and Vetterli introduced an algorithm (PTVA) to solve the important case of this problem in which a limited set of nodes act as observers and report the times at which the spread reached them. PTVA uses all observers to find a solution. Here we propose a new approach in which observers with low-quality information (i.e. with large spread encounter times) are ignored and potential sources are selected based on the likelihood gradient from high-quality observers. The original complexity of PTVA is O(N^α), where α ∈ (3,4) depends on the network topology and the number of observers (N denotes the number of nodes in the network). Our Gradient Maximum Likelihood Algorithm (GMLA) reduces this complexity to O(N^2 log N). Extensive numerical tests performed on synthetic networks and the real Gnutella network, under the limitation that the identities of spreaders are unknown to observers, demonstrate that for scale-free networks with such limitation GMLA yields higher-quality localization results than PTVA does.
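A heavily simplified sketch of observer-based source localization in this spirit: assume unit propagation speed so that expected arrival times follow shortest-path distances, keep only the k earliest (highest-quality) observers, and score each candidate source by the variance of its time residuals (the unknown start time shifts all residuals equally, so it cancels). This is an illustrative toy, not the published PTVA/GMLA implementation:

    import networkx as nx
    import numpy as np

    def locate_source(graph, observer_times, k_best=3):
        """Return the node whose shortest-path distances most consistently
        explain the report times of the k earliest observers (lower
        residual variance = more likely source)."""
        observers = sorted(observer_times, key=observer_times.get)[:k_best]
        best_node, best_score = None, np.inf
        for candidate in graph.nodes:
            dist = nx.single_source_shortest_path_length(graph, candidate)
            residuals = [observer_times[o] - dist[o] for o in observers]
            score = np.var(residuals)
            if score < best_score:
                best_node, best_score = candidate, score
        return best_node

    g = nx.barabasi_albert_graph(200, 3, seed=42)
    true_source = 0
    arrival = nx.single_source_shortest_path_length(g, true_source)
    # A few observers report arrival times (unit speed, start time 5).
    times = {n: 5 + arrival[n] for n in [10, 50, 120, 199]}
    print("estimated source:", locate_source(g, times))

The published algorithms replace this crude variance score with a proper Gaussian likelihood and, in GMLA, a gradient-guided search over candidate sources.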
Quality assurance after process changes of the production of a therapeutic antibody.
Brass, J M; Krummen, K; Moll-Kaufmann, C
1996-12-01
Process development for the production of a therapeutic humanised antibody is a very complex operation. It involves recombinant genetics, verification of a strong expression system, gene amplification, characterisation of a stable host cell expression system, optimisation and design of the mammalian cell culture fermentation system, and development of an efficient recovery process resulting in high yields and product quality. Rapid progress in the field and the wish of some pharmaceutical companies to outsource their production are the driving forces for process changes relatively late in the development phase. This literature survey is aimed at identifying the limits of acceptable process changes in upscaling of the fermentation and downstream processing of biopharmaceuticals, and at defining the demands on production validation to prove product equivalency and the identity of the isolated, purified therapeutic antibody.
DNAproDB: an interactive tool for structural analysis of DNA–protein complexes
Sagendorf, Jared M.
2017-01-01
Many biological processes are mediated by complex interactions between DNA and proteins. Transcription factors, various polymerases, nucleases and histones recognize and bind DNA with different levels of binding specificity. To understand the physical mechanisms that allow proteins to recognize DNA and achieve their biological functions, it is important to analyze structures of DNA–protein complexes in detail. DNAproDB is a web-based interactive tool designed to help researchers study these complexes. DNAproDB provides an automated structure-processing pipeline that extracts structural features from DNA–protein complexes. The extracted features are organized in structured data files, which are easily parsed with any programming language or viewed in a browser. We processed a large number of DNA–protein complexes retrieved from the Protein Data Bank and created the DNAproDB database to store this data. Users can search the database by combining features of the DNA, protein or DNA–protein interactions at the interface. Additionally, users can upload their own structures for processing privately and securely. DNAproDB provides several interactive and customizable tools for creating visualizations of the DNA–protein interface at different levels of abstraction that can be exported as high quality figures. All functionality is documented and freely accessible at http://dnaprodb.usc.edu. PMID:28431131
[Quality Management in Medicine: What the Surgeon Needs to Know].
Holtel, M; Roßmüller, T; Frommhold, K
2016-10-01
Quality management (QM) is a method from the field of economics that was adopted late by the medical sector. The coincidence of quality management and what is referred to as economisation in medicine frequently leads to QM being, incorrectly, perceived as part of the economisation problem rather than as part of its solution. Quality assurance defines and observes key performance indicators for the achievement of quality objectives. QM is a form of active management that intends to systematically exclude the effects of chance. It is supposed to enable those in charge of an institution to deal with complex processes, to influence them and to achieve quality even under unfavourable circumstances. Clearly defined written standards are an important aspect of QM; they allow 80% of patients to be treated faster and less labour-intensively and thus create more capacity for the individual treatment of the 20% of patients requiring other than routine care. Standards provide a framework to rely on for department heads and other staff alike. They reduce complexity, support processes in stress situations and prevent inconsistent decisions in the course of treatment. Document management ensures transparent and up-to-date in-house standards and creates continuity. Good documents are short, easy to use and, at the same time, compliant with requirements. Specifications describe in-house standards; validation documents provide forensically sound documentation. Quality management has a broad impact on an institution. It helps staff reflect on their daily work, and it initiates a reporting and auditing system as well as the systematic management of responses to surveys and complaints. Risk management is another aspect of QM; it provides structures to identify, analyse, assess and modify risks and subject them to risk controlling. Quality management is not necessarily associated with certification. However, if certification is intended, it serves to define requirements, increase motivation for the implementation of measures to be taken, and provide long-term continuity in newly adopted processes. Specialist certificates issued by medical associations frequently emphasise an interdisciplinary treatment approach; however, their certification processes are often of poor quality. The effectiveness and efficiency of individual QM instruments in medicine are evident. It is very likely that quality management improves effectiveness in the whole field of medicine, but this has yet to be proved. Georg Thieme Verlag KG Stuttgart · New York.
Biochemical Reconstitution of the WAVE Regulatory Complex
Chen, Baoyu; Padrick, Shae B.; Henry, Lisa; Rosen, Michael K.
2014-01-01
The WAVE regulatory complex (WRC) is a 400-kDa heteropentameric protein assembly that plays a central role in controlling actin cytoskeletal dynamics in many cellular processes. The WRC acts by integrating diverse cellular cues and stimulating the actin nucleating activity of the Arp2/3 complex at membranes. Biochemical and biophysical studies of the underlying mechanisms of these processes require large amounts of purified WRC. Recent success in recombinant expression, reconstitution, purification and crystallization of the WRC has greatly advanced our understanding of the inhibition, activation and membrane recruitment mechanisms of this complex. But many important questions remain to be answered. Here we summarize and update the methods developed in our laboratory, which allow reliable and flexible production of tens of milligrams of recombinant WRC of crystallographic quality, sufficient for many biochemical and structural studies. PMID:24630101
Testing guide for implementing concrete paving quality control procedures.
DOT National Transportation Integrated Search
2008-03-01
Construction of portland cement concrete pavements is a complex process. A small fraction of the concrete pavements constructed in the United States over the last few decades have either failed prematurely or exhibited moderate to severe distress. ...
Precision manufacturing for clinical-quality regenerative medicines.
Williams, David J; Thomas, Robert J; Hourd, Paul C; Chandra, Amit; Ratcliffe, Elizabeth; Liu, Yang; Rayment, Erin A; Archer, J Richard
2012-08-28
Innovations in engineering applied to healthcare make a significant difference to people's lives. Market growth is guaranteed by demographics. Regulation and requirements for good manufacturing practice (extreme levels of repeatability and reliability) demand high-precision process and measurement solutions. Emerging technologies using living biological materials add complexity. This paper presents some results of work demonstrating the precision automated manufacture of living materials, particularly the expansion of populations of human stem cells for therapeutic use as regenerative medicines. The paper also describes quality engineering techniques for precision process design and improvement, and identifies the requirements for manufacturing technology and measurement systems evolution for such therapies.
Naleskina, L A
1985-01-01
Analysis of the topographic peculiarities and distribution of oxidized melanin and its precursors (DOPA-oxidase activity and catecholamines) in pigmented nevi and malignant melanomas of the skin shows that the studied features form a complex of complementary markers of melanin formation, correlate with the quality and degree of proliferative activity in tumours of this genesis, and may be used for rating their malignancy.
ERIC Educational Resources Information Center
Matz, Amy Kristen
2013-01-01
The development of social competence for children is critical to their ability to navigate social decision making processes; however, children with complex disabilities have many difficulties in developing social competence. In an educational environment, the optimal setting for a child to develop social competence is within the inclusive…
Manipulation complexity in primates coevolved with brain size and terrestriality
Heldstab, Sandra A.; Kosonen, Zaida K.; Koski, Sonja E.; Burkart, Judith M.; van Schaik, Carel P.; Isler, Karin
2016-01-01
Humans occupy by far the most complex foraging niche of all mammals, built around sophisticated technology, and at the same time exhibit unusually large brains. To examine the evolutionary processes underlying these features, we investigated how manipulation complexity is related to brain size, cognitive test performance, terrestriality, and diet quality in a sample of 36 non-human primate species. We categorized manipulation bouts in food-related contexts into unimanual and bimanual actions, and asynchronous or synchronous hand and finger use, and established levels of manipulative complexity using Guttman scaling. Manipulation categories followed a cumulative ranking. They were particularly high in species that use cognitively challenging food acquisition techniques, such as extractive foraging and tool use. Manipulation complexity was also consistently positively correlated with brain size and cognitive test performance. Terrestriality had a positive effect on this relationship, but diet quality did not affect it. Unlike a previous study on carnivores, we found that, among primates, brain size and complex manipulations to acquire food underwent correlated evolution, which may have been influenced by terrestriality. Accordingly, our results support the idea of an evolutionary feedback loop between manipulation complexity and cognition in the human lineage, which may have been enhanced by increasingly terrestrial habits. PMID:27075921
High resolution pollutant measurements in complex urban ...
Measuring air pollution in real-time using an instrumented vehicle platform has been an emerging strategy to resolve air pollution trends at a very fine spatial scale (10s of meters). Achieving second-by-second data representative of urban air quality trends requires advanced instrumentation, such as a quantum cascade laser utilized to resolve carbon monoxide and real-time optical detection of black carbon. An equally challenging area of development is the processing and visualization of complex geospatial air monitoring data to decipher key trends of interest. EPA's Office of Research and Development (ORD) staff have applied air monitoring to evaluate community air quality in a variety of environments, including assessing air quality surrounding rail yards, evaluating noise wall or tree stand effects on roadside and on-road air quality, and surveying traffic-related exposure zones for comparison with land-use regression estimates. ORD has ongoing efforts to improve mobile monitoring data collection and interpretation, including instrumentation testing, evaluating the effect of post-processing algorithms on derived trends, and developing a web-based tool called Real-Time Geospatial Data Viewer (RETIGO) allowing for a simple plug-and-play of mobile monitoring data. Example findings from mobile data sets include an estimated 50% reduction in roadside ultrafine particle levels when immediately downwind of a noise barrier, increases in neighborhood-wide black carbon levels (3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dekker, A.G.; Hoogenboom, H.J.; Rijkeboer, M.
1997-06-01
Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air/water interface correction, and application of water quality algorithms. A prototype software environment has recently been developed that enables the user to perform and control these processing steps. Main parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code for removing atmospheric and air-water interface influences, (ii) a tool for analyzing algorithms for estimating water quality and (iii) a spectral database, containing apparent and inherent optical properties and associated water quality parameters. The use of the software is illustrated by applying implemented algorithms for estimating chlorophyll to data from a spectral library of Dutch inland waters with CHL ranging from 1 to 500 µg/l. The algorithms currently implemented in the Toolkit software are recommended for optically simple waters, but for optically complex waters the development of more advanced retrieval methods is required.
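Chlorophyll retrieval in turbid inland waters is often done with a red/near-infrared band ratio. A minimal sketch of such a generic estimator; the band choice (705/675 nm) and the linear coefficients are illustrative assumptions, not the Toolkit's actual algorithm:

```python
import numpy as np

def chl_band_ratio(r705: np.ndarray, r675: np.ndarray,
                   a: float = 25.0, b: float = -20.0) -> np.ndarray:
    """Estimate chlorophyll-a (ug/l) from a 705/675 nm reflectance ratio.

    The coefficients a and b are placeholders that would in practice be
    fitted against a spectral library, as described in the abstract.
    """
    return a * (r705 / r675) + b

# Reflectance ratios above 1 indicate higher chlorophyll.
print(chl_band_ratio(np.array([0.030, 0.045]), np.array([0.025, 0.025])))
```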
Dewey, Richard B.; O'Suilleabhain, Padraig E.; Sanghera, Manjit; Patel, Neepa; Khemani, Pravin; Lacritz, Laura H.; Chitnis, Shilpa; Whitworth, Louis A.; Dewey, Richard B.
2016-01-01
Objective: To develop a process to improve patient outcomes from deep brain stimulation (DBS) surgery for Parkinson disease (PD), essential tremor (ET), and dystonia. Methods: We employed standard quality improvement methodology using the Plan-Do-Study-Act process to improve patient selection, surgical DBS lead implantation, postoperative programming, and ongoing assessment of patient outcomes. Results: The result of this quality improvement process was the development of a neuromodulation network. The key aspect of this program is rigorous patient assessment of both motor and non-motor outcomes tracked longitudinally using a REDCap database. We describe how this information is used to identify problems and to initiate Plan-Do-Study-Act cycles to address them. Preliminary outcomes data are presented for the cohort of PD and ET patients who have received surgery since the creation of the neuromodulation network. Conclusions: Careful outcomes tracking is essential to ensure quality in a complex therapeutic endeavor like DBS surgery for movement disorders. The REDCap database system is well suited to store outcomes data for the purpose of ongoing quality assurance monitoring. PMID:27711133
Singh, Anuradha; Mantri, Shrikant; Sharma, Monica; Chaudhury, Ashok; Tuli, Rakesh; Roy, Joy
2014-01-16
The cultivated bread wheat (Triticum aestivum L.) possesses unique flour quality, which can be processed into many end-use food products such as bread, pasta, chapatti (unleavened flat bread), biscuit, etc. The present wheat varieties require improvement in processing quality to meet the increasing demand for better quality food products. However, processing quality is very complex and controlled by many genes, which have not been completely explored. To identify the candidate genes whose expression changed due to variation in processing quality and interaction (quality x development), genome-wide transcriptome studies were performed in two sets of diverse Indian wheat varieties differing for chapatti quality. It is also important to understand the temporal and spatial distributions of their expression for designing tissue- and growth-specific functional genomics experiments. Gene-specific two-way ANOVA analysis of the expression of about 55 K transcripts in two diverse sets of Indian wheat varieties for chapatti quality at three seed developmental stages identified 236 differentially expressed probe sets (10-fold). Out of 236, 110 probe sets were identified for chapatti quality. Many processing-quality-related key genes such as glutenins and gliadins, puroindolines, grain softness protein, alpha and beta amylases, and proteases were identified, and many other candidate genes related to cellular and molecular functions were also identified. The ANOVA analysis revealed that the expression of 56 of the 110 probe sets was involved in the interaction (quality x development). The majority of the probe sets showed differential expression at the early stage of seed development, i.e. temporal expression. Meta-analysis revealed that the majority of the genes were expressed in one or a few growth stages, indicating spatial distribution of their expression. The differential expression of a few candidate genes such as pre-alpha/beta-gliadin and gamma-gliadin was validated by RT-PCR. Therefore, this study identified several quality-related key genes, including many other genes, their interactions (quality x development) and temporal and spatial distributions. The candidate genes identified for processing quality and the information on temporal and spatial distributions of their expression would be useful for designing wheat improvement programs for processing quality, either by changing their expression or by developing single nucleotide polymorphism (SNP) markers. PMID:24433256
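The screen described is a gene-wise two-way ANOVA with a quality x development interaction term. A sketch of such a per-gene test using statsmodels; the data and column names ('expr', 'quality', 'stage') are synthetic stand-ins for one transcript:

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# One gene's expression across 2 quality classes x 3 seed stages, 2 reps each.
df = pd.DataFrame({
    "expr":    [8.1, 8.3, 9.9, 10.2, 7.9, 8.0, 12.1, 12.4, 8.2, 8.1, 13.0, 13.3],
    "quality": ["good", "good", "poor", "poor"] * 3,
    "stage":   ["early"] * 4 + ["mid"] * 4 + ["late"] * 4,
})

# Two-way ANOVA with interaction; a significant quality:stage term
# corresponds to the (quality x development) interaction reported above.
model = ols("expr ~ C(quality) * C(stage)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```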
NASA Astrophysics Data System (ADS)
Spoelstra, Paul; Djakow, Eugen; Homberg, Werner
2017-10-01
The production of complex organic shapes in sheet metals is gaining importance in the food industry due to increasing functional and hygienic demands. Hence it is necessary to produce parts with complex geometries promoting cleanability and general sanitation, leading to improved food safety. In this context, and especially when stainless steel has to be formed into highly complex geometries while maintaining the desired surface properties, alternative manufacturing processes that meet these requirements will need to be used. Rubber pad forming offers high potential when it comes to shaping complex parts with excellent surface quality, with virtually no tool marks or scratches. Especially where only small series are to be produced, rubber pad forming offers both technological and economic advantages. Due to the flexible punch, variations in metal thickness can be accommodated with the same forming tool. The investment required to set up rubber pad forming is low in comparison to conventional sheet metal forming processes. The process facilitates the production of shallow sheet metal parts with complex contours and bends. Different bending sequences in a multiple-tool set-up can also be conducted. This contribution therefore gives a brief overview of rubber pad technology. It presents the prototype rubber pad forming machine which can be used to produce complex part geometries made from stainless steel (1.4301). Based on an analysis of already existing systems and new machines for rubber pad forming, together with their process properties, influencing variables and areas of application, some relevant parts for the food industry are presented.
Validity as a social imperative for assessment in health professions education: a concept analysis.
Marceau, Mélanie; Gallagher, Frances; Young, Meredith; St-Onge, Christina
2018-06-01
Assessment can have far-reaching consequences for future health care professionals and for society. Thus, it is essential to establish the quality of assessment. Few modern approaches to validity are well situated to ensure the quality of complex assessment approaches, such as authentic and programmatic assessments. Here, we explore and delineate the concept of validity as a social imperative in the context of assessment in health professions education (HPE) as a potential framework for examining the quality of complex and programmatic assessment approaches. We conducted a concept analysis using Rodgers' evolutionary method to describe the concept of validity as a social imperative in the context of assessment in HPE. Supported by an academic librarian, we developed and executed a search strategy across several databases for literature published between 1995 and 2016. From a total of 321 citations, we identified 67 articles that met our inclusion criteria. Two team members analysed the texts using a specified approach to qualitative data analysis. Consensus was achieved through full team discussions. Attributes that characterise the concept were: (i) demonstration of the use of evidence considered credible by society to document the quality of assessment; (ii) validation embedded through the assessment process and score interpretation; (iii) documented validity evidence supporting the interpretation of the combination of assessment findings, and (iv) demonstration of a justified use of a variety of evidence (quantitative and qualitative) to document the quality of assessment strategies. The emerging concept of validity as a social imperative highlights some areas of focus in traditional validation frameworks, whereas some characteristics appear unique to HPE and move beyond traditional frameworks. The study reflects the importance of embedding consideration for society and societal concerns throughout the assessment and validation process, and may represent a potential lens through which to examine the quality of complex and programmatic assessment approaches. © 2018 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
[Research on evolution and transition of quality evaluation of Shihu].
Zhao, Yu-Jiao; Han, Bang-Xing; Peng, Hua-Sheng; Peng, Dai-Yin
2016-04-01
Shihu is one of the most valuable Chinese medicines in China. The sources of Shihu are complex, and its quality evaluation mainly depends on the distinction of morphological characteristics. In order to understand the quality evaluation concepts applied to Shihu by Chinese herbalists in past dynasties, this paper systematically summarizes the methods of quality evaluation in the ancient bencao. The ancient bencao record that the quality of Shihu is closely related to its germplasm, habitat, processing and character. The concept of germplasm for Shihu includes a generalized and a narrow sense; besides, the clinical applications of Dendrobium huoshanense, D. officinale and D. nobile are focused on different diseases. D. huoshanense has been regarded as a Dao-di herb since the Qing Dynasty. The main products of Shihu comprise fresh goods and dry goods, and their clinical applications are also treated differently. Fengdou is one of the commodity specifications of Shihu. Its processing method probably dates from the Qing Dynasty, but it has now become the mainstream commodity form. It was common sense in the ancient bencao that different kinds of Shihu had different characters and curative effects, and that quality would improve with increasing viscidity. Therefore, "germplasm", "geoherbs (Dao-di)", "processing" and "characters" were integrated into the traditional quality evaluation methods of Shihu. Meanwhile, we should pay attention to the clinical efficacy of Shihu of different species and processing methods. Copyright© by the Chinese Pharmaceutical Association.
Karlberg, Micael; von Stosch, Moritz; Glassey, Jarka
2018-03-07
In today's biopharmaceutical industries, the lead time to develop and produce a new monoclonal antibody takes years before it can be launched commercially. The reasons lie in the complexity of monoclonal antibodies and the need for high product quality to ensure clinical safety, which has a significant impact on process development time. Frameworks such as quality by design (QbD) are becoming widely used by the pharmaceutical industries as they introduce a systematic approach for building quality into the product. However, full implementation of quality by design has still not been achieved, mainly owing to limited risk assessment of product properties as well as the large number of process factors affecting product quality that need to be investigated during process development. This has introduced a need for better methods and tools that can be used for early risk assessment and prediction of critical product properties and process factors to enhance process development and reduce costs. In this review, we investigate how the quantitative structure-activity relationships (QSAR) framework can be applied to an existing process development framework such as quality by design in order to increase product understanding based on the protein structure of monoclonal antibodies. Compared to quality by design, where the effect of process parameters on the drug product is explored, QSAR gives a reversed perspective, investigating how the protein structure can affect performance in different unit operations. This provides valuable information that can be used during the early process development of new drug products, where limited process understanding is available. Thus, the QSAR methodology is explored and explained in detail, and we investigate the means of directly linking the structural properties of monoclonal antibodies to process data. The resulting information, used as a decision tool, can help to enhance risk assessment, better aid process development and thereby overcome some of the limitations and challenges present in QbD implementation today.
[The concept of the development of the state of chemical-analytical environmental monitoring].
Rakhmanin, Iu A; Malysheva, A G
2013-01-01
Chemical-analytical monitoring of environmental quality is based on the measurement of trace amounts of substances. Considering the multicomponent composition of the environment and the ongoing processes of transformation of substances within it, determining the danger that exposure to chemical pollution of the environment poses to population health requires an evaluation based on the simultaneous consideration of the complex of substances actually contained in the environment and arriving from different sources. Therefore, analytical monitoring of the quality and safety of the environment requires a conversion from an orientation based on the investigation of specific target substances to the estimation of the real complex of compounds.
ERIC Educational Resources Information Center
Limpo, Teresa; Alves, Rui A.; Fidalgo, Raquel
2014-01-01
Background: It is well established that the activity of producing a text is a complex one involving three main cognitive processes: Planning, translating, and revising. Although these processes are crucial in skilled writing, beginning and developing writers seem to struggle with them, mainly with planning and revising. Aims: To trace the…
Qualification Procedures for VHSIC/VLSI
1990-12-01
alternative approach for qualification of complex microcircuits. To address the technical issues related to a process-oriented qualification approach, the ... methodology of microcircuit process control to promote the United States to a position of supplying the highest quality and most reliable ... available resources. Coordinate document reviews with weekly and monthly status reviews on progress. Summarize results and collate into four basic
Implementation of Quality Management in Core Service Laboratories
Creavalle, T.; Haque, K.; Raley, C.; Subleski, M.; Smith, M.W.; Hicks, B.
2010-01-01
CF-28 The Genetics and Genomics group of the Advanced Technology Program of SAIC-Frederick exists to bring innovative genomic expertise, tools and analysis to NCI and the scientific community. The Sequencing Facility (SF) provides next generation short read (Illumina) sequencing capacity to investigators using a streamlined production approach. The Laboratory of Molecular Technology (LMT) offers a wide range of genomics core services including microarray expression analysis, miRNA analysis, array comparative genome hybridization, long read (Roche) next generation sequencing, quantitative real time PCR, transgenic genotyping, Sanger sequencing, and clinical mutation detection services to investigators from across the NIH. As the technology supporting this genomic research becomes more complex, the need for basic quality processes within all aspects of the core service groups becomes critical. The Quality Management group works alongside members of these labs to establish or improve processes supporting operations control (equipment, reagent and materials management), process improvement (reengineering/optimization, automation, acceptance criteria for new technologies and tech transfer), and quality assurance and customer support (controlled documentation/SOPs, training, service deficiencies and continual improvement efforts). Implementation and expansion of quality programs within unregulated environments demonstrates SAIC-Frederick's dedication to providing the highest quality products and services to the NIH community.
Objective assessment of MPEG-2 video quality
NASA Astrophysics Data System (ADS)
Gastaldo, Paolo; Zunino, Rodolfo; Rovetta, Stefano
2002-07-01
The increasing use of video compression standards in broadcasting television systems has required, in recent years, the development of video quality measurements that take into account artifacts specifically caused by digital compression techniques. In this paper we present a methodology for the objective quality assessment of MPEG video streams by using circular back-propagation feedforward neural networks. Mapping neural networks can render nonlinear relationships between objective features and subjective judgments, thus avoiding any simplifying assumption on the complexity of the model. The neural network processes an instantaneous set of input values, and yields an associated estimate of perceived quality. Therefore, the neural-network approach turns objective quality assessment into adaptive modeling of subjective perception. The objective features used for the estimate are chosen according to their assessed relevance to perceived quality and are continuously extracted in real time from compressed video streams. The overall system mimics perception but does not require any analytical model of the underlying physical phenomenon. The capability to process compressed video streams represents an important advantage over existing approaches, since avoiding the stream-decoding process greatly enhances real-time performance. Experimental results confirm that the system provides satisfactory, continuous-time approximations for actual scoring curves concerning real test videos.
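The core idea is a feed-forward network that maps objective stream features to a perceived-quality score. A sketch of that mapping using a standard scikit-learn MLP (substituted for the paper's circular back-propagation architecture, which is not available off the shelf); features and scores are synthetic:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins for objective features extracted from a compressed
# stream (e.g. blockiness, bit rate, motion energy) and the subjective
# quality scores they should predict.
X = rng.uniform(0.0, 1.0, size=(500, 3))
y = 5.0 - 3.0 * X[:, 0] + 1.0 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 0.1, 500)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, y)

# Each new feature vector yields an instantaneous quality estimate.
print(model.predict(X[:3]))
```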
Creating an Overall Environmental Quality Index: Assessing Available Data
Background and Objectives: The interaction between environmental insults and human health is a complex process. Environmental exposures tend to cluster and disamenities such as landfills or industrial plants are often located in neighborhoods with a high percentage of minority a...
ERIC Educational Resources Information Center
Lunetta, Vincent N.; And Others
1984-01-01
Advocates including environmental issues balanced with basic science concepts/processes to provide a sound science foundation. Suggests case studies of regional environmental issues to sensitize/motivate students while reflecting the complex nature of science/society issues. Issues considered include: fresh water quality, earthquake prediction,…
Sun, Fei; Xu, Bing; Zhang, Yi; Dai, Shengyun; Yang, Chan; Cui, Xianglong; Shi, Xinyuan; Qiao, Yanjiang
2016-01-01
The quality of Chinese herbal medicine tablets suffers from batch-to-batch variability due to a lack of manufacturing process understanding. In this paper, the Panax notoginseng saponins (PNS) immediate release tablet was taken as the research subject. By defining the dissolution of five active pharmaceutical ingredients and the tablet tensile strength as critical quality attributes (CQAs), influences of both the manipulated process parameters introduced by an orthogonal experiment design and the intermediate granules' properties on the CQAs were fully investigated by different chemometric methods, such as the partial least squares, the orthogonal projection to latent structures, and the multiblock partial least squares (MBPLS). By analyzing the loadings plots and variable importance in the projection indexes, the granule particle sizes and the minimal punch tip separation distance in tableting were identified as critical process parameters. Additionally, the MBPLS model suggested that the lubrication time in the final blending was also important in predicting tablet quality attributes. From the calculated block importance in the projection indexes, the tableting unit was confirmed to be the critical process unit of the manufacturing line. The results demonstrated that the combinatorial use of different multivariate modeling methods could help in understanding the complex process relationships as a whole. The output of this study can then be used to define a control strategy to improve the quality of the PNS immediate release tablet.
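The study flags critical process parameters via variable importance in the projection (VIP) indexes from PLS models. A sketch of PLS with a standard VIP computation on synthetic data (the variables stand in for granule properties and tableting parameters):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def vip(pls: PLSRegression) -> np.ndarray:
    """Variable importance in the projection for a fitted PLS model."""
    t, w, q = pls.x_scores_, pls.x_weights_, pls.y_loadings_
    p, a = w.shape
    # Explained sum of squares of Y per latent component.
    ss = np.array([(t[:, i] ** 2).sum() * (q[:, i] ** 2).sum() for i in range(a)])
    w_norm = w / np.linalg.norm(w, axis=0)
    return np.sqrt(p * (w_norm ** 2 @ ss) / ss.sum())

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 6))           # e.g. granule sizes, punch distance, ...
y = 2 * X[:, 0] - X[:, 2] + rng.normal(scale=0.2, size=40)  # e.g. tensile strength

pls = PLSRegression(n_components=2).fit(X, y)
print(vip(pls))  # VIP > 1 is the usual cut-off for "important" variables
```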
Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A
2014-01-01
Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide 'data warehouses' to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with 'isolation' orders, or to determine patients' eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time 'pick off' tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring.
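The tool auto-generates database queries from user-specified order criteria. A toy stand-in for that idea, in Python rather than the tool's PHP/MySQL; the table and column names are hypothetical:

```python
def build_order_query(filters: dict):
    """Build a parameterized query from simple field -> value filters.

    A minimal sketch of auto-generating an order-related query; the
    'orders' table and its columns are hypothetical placeholders.
    """
    where = " AND ".join(f"{col} = ?" for col in filters)
    sql = f"SELECT patient_id, order_name, entered_at FROM orders WHERE {where}"
    return sql, tuple(filters.values())

sql, params = build_order_query({"order_type": "isolation", "status": "active"})
print(sql, params)  # executed at recurring intervals against the order system
```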
NASA Astrophysics Data System (ADS)
Homberg, Werner; Hornjak, Daniel
2011-05-01
Friction spinning is a new, innovative and promising incremental forming technology with high potential for the manufacturing of complex functionally graded workpieces, extending the forming limits of conventional metal spinning processes. The friction spinning process is based on the integration of thermo-mechanical friction subprocesses into this incremental forming process. By choosing appropriate process parameters, e.g. axial feed rate or relative motion, the contact conditions between tool and workpiece can be influenced in a defined way and, thus, a required temperature profile can be obtained. Friction spinning allows the extension of forming limits compared to conventional metal spinning in order to produce multifunctional components with locally varying properties and to manufacture e.g. complex hollow parts made of tubes, profiles or sheet metals. In this way, it meets the demands regarding efficiency and the manufacturing of functionally graded lightweight components. There is, for example, the possibility of locally increasing the wall thickness in joining zones and, as a consequence, achieving a higher quality of the joint at decreased expense. Such products have so far been difficult or impossible to produce by conventional processes. In order to benefit from the advantages and potentials of this new process, new tooling systems and concepts are indispensable which fulfill the special requirements of this thermo-mechanical process concerning thermal and tribological loads and which allow simultaneous and defined forming and friction operations. An important goal of the corresponding research work at the Chair of Forming and Machining Technology at the University of Paderborn is the development of tool systems that allow the manufacturing of such complex parts by simple uniaxial or sequential biaxial linear tool paths. In the paper, promising tool systems and geometries as well as results of theoretical and experimental research work (e.g. regarding the influence and interaction of process parameters on workpiece quality) are discussed. Furthermore, possibilities regarding the manufacturing of geometries (demonstrator workpieces) which are hardly producible with conventional processes are presented.
Mohamed, Omar Ahmed; Masood, Syed Hasan; Bhowmik, Jahar Lal
2016-01-01
Fused deposition modeling (FDM) additive manufacturing has been intensively used for many industrial applications due to its attractive advantages over traditional manufacturing processes. The process parameters used in FDM have significant influence on the part quality and its properties. This process produces the plastic part through complex mechanisms and it involves complex relationships between the manufacturing conditions and the quality of the processed part. In the present study, the influence of multi-level manufacturing parameters on the temperature-dependent dynamic mechanical properties of FDM processed parts was investigated using IV-optimality response surface methodology (RSM) and multilayer feed-forward neural networks (MFNNs). The process parameters considered for optimization and investigation are slice thickness, raster to raster air gap, deposition angle, part print direction, bead width, and number of perimeters. Storage compliance and loss compliance were considered as response variables. The effect of each process parameter was investigated using developed regression models and multiple regression analysis. The surface characteristics are studied using scanning electron microscope (SEM). Furthermore, performance of optimum conditions was determined and validated by conducting confirmation experiment. The comparison between the experimental values and the predicted values by IV-Optimal RSM and MFNN was conducted for each experimental run and results indicate that the MFNN provides better predictions than IV-Optimal RSM. PMID:28774019
NASA Astrophysics Data System (ADS)
Jackson-Blake, L. A.; Sample, J. E.; Wade, A. J.; Helliwell, R. C.; Skeffington, R. A.
2017-07-01
Catchment-scale water quality models are increasingly popular tools for exploring the potential effects of land management, land use change and climate change on water quality. However, the dynamic, catchment-scale nutrient models in common usage are complex, with many uncertain parameters requiring calibration, limiting their usability and robustness. A key question is whether this complexity is justified. To explore this, we developed a parsimonious phosphorus model, SimplyP, incorporating a rainfall-runoff model and a biogeochemical model able to simulate daily streamflow, suspended sediment, and particulate and dissolved phosphorus dynamics. The model's complexity was compared to one popular nutrient model, INCA-P, and the performance of the two models was compared in a small rural catchment in northeast Scotland. For three land use classes, less than six SimplyP parameters must be determined through calibration, the rest may be based on measurements, while INCA-P has around 40 unmeasurable parameters. Despite substantially simpler process-representation, SimplyP performed comparably to INCA-P in both calibration and validation and produced similar long-term projections in response to changes in land management. Results support the hypothesis that INCA-P is overly complex for the study catchment. We hope our findings will help prompt wider model comparison exercises, as well as debate among the water quality modeling community as to whether today's models are fit for purpose. Simpler models such as SimplyP have the potential to be useful management and research tools, building blocks for future model development (prototype code is freely available), or benchmarks against which more complex models could be evaluated.
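To illustrate the parsimonious model structure being argued for, here is a deliberately minimal daily runoff/particulate-phosphorus toy in the same spirit. These are not SimplyP's published equations, and all parameter values are illustrative assumptions:

```python
import numpy as np

def toy_p_model(rain, f_runoff=0.3, k_sed=0.01, p_enrich=1.2):
    """Toy daily streamflow / particulate-P simulator.

    Runoff as a fixed fraction of rainfall, suspended sediment as a power
    function of flow, and particulate P proportional to sediment. A sketch
    of a parsimonious model structure, not SimplyP itself.
    """
    q = f_runoff * np.asarray(rain, dtype=float)  # streamflow (mm/day)
    ss = k_sed * q ** 1.5                         # suspended sediment proxy
    pp = p_enrich * ss                            # particulate phosphorus proxy
    return q, ss, pp

q, ss, pp = toy_p_model([0.0, 5.0, 12.0, 2.0])
print(q, ss, pp)
```

The appeal of such a structure is that each parameter has a direct physical reading, so few of them need calibration, which is exactly the usability argument the abstract makes against 40-parameter models.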
Quality of care provided in a special needs plan using a nurse care manager model.
Wenger, Neil S; Roth, Carol P; Martin, David; Nickels, Lorraine; Beckman, Robin; Kamberg, Caren; Mach, John; Ganz, David A
2011-10-01
To comprehensively evaluate the quality of care provided in special needs plans (SNPs; Medicare Advantage plans that aim to provide specialized care for complex older adults) and specifically the nurse care management model in the community setting. We adapted 107 process-of-care quality measures across 12 conditions from the Assessing Care of Vulnerable Elders set to obtain a clinically detailed evaluation of the quality of care received by complex older enrollees in a dual eligible Evercare SNP. We abstracted 13 months of primary care medical records to delineate quality of care provided by physicians and whether there was value added from the nurse care manager model. Dual eligible Evercare SNP located in central Florida. Two-hundred thirty-one vulnerable older enrollees in the SNP who had complex disease. Based on physician medical records alone, the 231 high-risk participants (mean age 77, 67% women) received recommended care for 53% of 5,569 evaluated clinical circumstances, ranging from 12% for end-of-life care to 78% for diabetes mellitus. In fewer than 40% of these clinical circumstances was recommended care provided for dementia, falls, and urinary incontinence. In a second analysis accounting for care provided by both the Evercare nurse and the physician, recommended care was provided to patients in 69% of the 5,684 evaluated clinical circumstances. Comprehensive quality measurement applied to vulnerable older adults enrolled in one mature SNP showed that the Evercare nurse model addresses important deficits in physician care for geriatric conditions. Such measurement should be applied to other SNP models and to compare SNP care with that for complex, older, fee-for-service Medicare cohorts. © 2011, Copyright the Authors Journal compilation © 2011, The American Geriatrics Society.
Ott, Caroline Vintergaard; Vinberg, Maj; Kessing, Lars V; Miskowiak, Kamilla W
2016-08-01
This is a secondary data analysis from our erythropoietin (EPO) trials. We examine (I) whether EPO improves speed of complex cognitive processing across bipolar and unipolar disorder, (II) if objective and subjective baseline cognitive impairment increases patients' chances of treatment-efficacy and (III) if cognitive improvement correlates with better subjective cognitive function, quality of life and socio-occupational capacity. Patients with unipolar or bipolar disorder were randomized to eight weekly EPO (N=40) or saline (N=39) infusions. Cognition, mood, quality of life and socio-occupational capacity were assessed at baseline (week 1), after treatment completion (week 9) and at follow-up (week 14). We used repeated measures analysis of covariance to investigate the effect of EPO on speed of complex cognitive processing. With logistic regression, we examined whether baseline cognitive impairment predicted treatment-efficacy. Pearson correlations were used to assess associations between objective and subjective cognition, quality of life and socio-occupational capacity. EPO improved speed of complex cognitive processing across affective disorders at weeks 9 and 14 (p≤0.05). In EPO-treated patients, baseline cognitive impairment increased the odds of treatment-efficacy on cognition at weeks 9 and 14 by a factor 9.7 (95% CI:1.2-81.1) and 9.9 (95% CI:1.1-88.4), respectively (p≤0.04). Subjective cognitive complaints did not affect chances of treatment-efficacy (p≥0.45). EPO-associated cognitive improvement correlated with reduced cognitive complaints but not with quality of life or socio-occupational function. As the analyses were performed post-hoc, findings are only hypothesis-generating. In conclusion, pro-cognitive effects of EPO occurred across affective disorders. Neuropsychological screening for cognitive dysfunction may be warranted in future cognition trials. Copyright © 2016 Elsevier B.V. and ECNP. All rights reserved.
New solutions and applications of 3D computer tomography image processing
NASA Astrophysics Data System (ADS)
Effenberger, Ira; Kroll, Julia; Verl, Alexander
2008-02-01
As industry nowadays aims at fast and high-quality product development and manufacturing processes, modern and efficient quality inspection is essential. Compared to conventional measurement technologies, industrial computer tomography (CT) is a non-destructive technology for 3D image data acquisition which helps to overcome their disadvantages by offering the possibility to scan complex parts with all outer and inner geometric features. In this paper new and optimized methods for 3D image processing, including innovative ways of surface reconstruction and automatic geometric feature detection of complex components, are presented, especially our work on developing smart online data processing and data handling methods with an integrated intelligent online mesh reduction. This guarantees the processing of huge, high-resolution data sets. Besides, new approaches for surface reconstruction and segmentation based on statistical methods are demonstrated. On the extracted 3D point cloud or surface triangulation, automated and precise algorithms for geometric inspection are deployed. All algorithms are applied to different real data sets generated by computer tomography in order to demonstrate the capabilities of the new tools. Since CT is an emerging technology for non-destructive testing and inspection, more and more industrial application fields will use and profit from it.
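A standard building block for extracting a surface triangulation from a CT volume is the marching cubes algorithm. A minimal sketch on a synthetic volume (a solid sphere standing in for a scanned part); the paper's own pipeline is more elaborate:

```python
import numpy as np
from skimage import measure

# Synthetic CT-like volume: a solid sphere of "material" in air.
z, y, x = np.mgrid[-32:32, -32:32, -32:32]
volume = (np.sqrt(x**2 + y**2 + z**2) < 20).astype(float)

# Extract the iso-surface separating material from air. Real CT data
# would use a greyscale threshold instead of 0.5.
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
print(verts.shape, faces.shape)  # triangulated surface ready for inspection
```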
NASA Astrophysics Data System (ADS)
Chitrakar, S.; Miller, S. N.; Liu, T.; Caffrey, P. A.
2015-12-01
Water quality data have been collected from three representative stream reaches in a coalbed methane (CBM) development area for over five years to improve the understanding of salt loading in the system. These streams are located within the Atlantic Rim development area of Muddy Creek in south-central Wyoming, where significant development of CBM wells is ongoing. The sampling reaches included Duck Pond Draw and Cow Creek, which receive co-produced water, and South Fork Creek and upstream Cow Creek, which do not. Water samples were assayed for various parameters, including sodium, calcium, magnesium, fluoride, chloride, nitrate, o-phosphate, sulfate, carbonate and bicarbonate, as well as other water quality parameters such as pH, conductivity and TDS. Based on these water quality parameters we have investigated the hydrochemical and geochemical processes responsible for the high variability in water quality in the region. However, effective interpretation of complex databases to understand the aforementioned processes is challenging due to the system's complexity. In this work we applied multivariate statistical techniques including cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) to analyze the water quality data and identify similarities and differences among our locations. First, the CA technique was applied to group the monitoring sites based on multivariate similarities. Second, the PCA technique was applied to identify the prevalent parameters responsible for the variation of water quality in each group. Third, the DA technique was used to identify the most important factors responsible for the variation of water quality between the low-flow and high-flow seasons. The purpose of this study is to improve the understanding of the factors and sources influencing the spatial and temporal variation of water quality. The ultimate goal of this research is to develop a coupled salt loading and GIS-based hydrological modelling tool that can simulate salt loadings under various user-defined scenarios in regions undergoing CBM development. The findings from this study will therefore be used to formulate the predominant processes responsible for solute loading.
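The CA and PCA steps described can be sketched in a few lines; the data here are synthetic stand-ins for the field measurements (rows are samples, columns are analytes such as Na, Ca, Mg, Cl, SO4, HCO3):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Synthetic water chemistry: the first 20 samples mimic a sodium-enriched
# reach receiving co-produced water.
X = rng.normal(size=(60, 6))
X[:20, 0] += 3.0

Xs = StandardScaler().fit_transform(X)          # analytes on a common scale
pca = PCA(n_components=2).fit(Xs)
print(pca.explained_variance_ratio_)            # prevalent gradients (PCA step)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)
print(labels[:25])                              # site groupings (CA step)
```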
Evaluating supplier quality performance using analytical hierarchy process
NASA Astrophysics Data System (ADS)
Kalimuthu Rajoo, Shanmugam Sundram; Kasim, Maznah Mat; Ahmad, Nazihah
2013-09-01
This paper elaborates the importance of evaluating supplier quality performance to an organization. Supplier quality performance evaluation reflects the actual performance of the supplier as exhibited at the customer's end. It is critical in enabling the organization to determine areas of improvement and thereafter work with the supplier to close the gaps. The success of the customer partly depends on the supplier's quality performance. Key criteria such as quality, cost, delivery, technology support and customer service are categorized as the main factors contributing to a supplier's quality performance. Eighteen suppliers manufacturing automotive application parts were evaluated in 2010 using a weighted-point system. Several suppliers received a common rating, which led to tied rankings. The Analytical Hierarchy Process (AHP), a user-friendly decision-making tool for complex and multi-criteria problems, was used to evaluate the suppliers' quality performance as an alternative to the weighted-point system applied to the 18 suppliers. The consistency ratio was checked for criteria and sub-criteria. The final AHP results contained no overlapping ratings and therefore yielded a better decision-making methodology compared to the weighted-point rating system.
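AHP derives criterion weights from a pairwise comparison matrix and checks them with the consistency ratio mentioned above. A minimal sketch; the comparison values are hypothetical:

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray):
    """Priority vector and consistency ratio for an AHP comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                  # normalized priority vector
    n = pairwise.shape[0]
    ci = (vals[k].real - n) / (n - 1)             # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
    return w, ci / ri                             # weights, consistency ratio

# Hypothetical pairwise comparisons of quality, cost and delivery.
A = np.array([[1, 3, 5],
              [1/3, 1, 2],
              [1/5, 1/2, 1]], dtype=float)
w, cr = ahp_weights(A)
print(w, cr)   # CR < 0.1 is conventionally acceptable
```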
Process Control and Development for Ultrasonic Additive Manufacturing with Embedded Fibers
NASA Astrophysics Data System (ADS)
Hehr, Adam J.
Ultrasonic additive manufacturing (UAM) is a recent additive manufacturing technology which combines ultrasonic metal welding, CNC machining, and mechanized foil layering to create large gapless near net-shape metallic parts. The process has been attracting much attention lately due to its low formation temperature, the capability to join dissimilar metals, and the ability to create complex design features not possible with traditional subtractive processes alone. These process attributes enable light-weighting of structures and components in an unprecedented way. However, UAM is currently limited to niche areas due to the lack of quality tracking and inadequate scientific understanding of the process. As a result, this thesis work is focused on improving both component quality tracking and process understanding through the use of average electrical power input to the welder. Additionally, the understanding and application space of embedding fibers into metals using UAM is investigated, with particular focus on NiTi shape memory alloy fibers.
Agarabi, Cyrus D; Schiel, John E; Lute, Scott C; Chavez, Brittany K; Boyne, Michael T; Brorson, Kurt A; Khan, Mansoora; Read, Erik K
2015-06-01
Consistent high-quality antibody yield is a key goal for cell culture bioprocessing. This endpoint is typically achieved in commercial settings through product and process engineering of bioreactor parameters during development. When the process is complex and not optimized, small changes in composition and control may yield a finished product of less desirable quality. Therefore, changes proposed to currently validated processes usually require justification and are reported to the US FDA for approval. Recently, design-of-experiments-based approaches have been explored to rapidly and efficiently achieve this goal of optimized yield with a better understanding of product and process variables that affect a product's critical quality attributes. Here, we present a laboratory-scale model culture where we apply a Plackett-Burman screening design to parallel cultures to study the main effects of 11 process variables. This exercise allowed us to determine the relative importance of these variables and identify the most important factors to be further optimized in order to control both desirable and undesirable glycan profiles. We found engineering changes relating to culture temperature and nonessential amino acid supplementation significantly impacted glycan profiles associated with fucosylation, β-galactosylation, and sialylation. All of these are important for monoclonal antibody product quality. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
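A Plackett-Burman design screens many factors in few runs by estimating main effects only. A sketch of the classic 12-run design for up to 11 two-level factors, with synthetic responses standing in for measured quality attributes:

```python
import numpy as np

def pb12() -> np.ndarray:
    """12-run Plackett-Burman design for up to 11 two-level factors."""
    first = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
    rows = [np.roll(first, i) for i in range(11)]   # cyclic shifts of generator
    rows.append(-np.ones(11, dtype=int))            # closing row of minus signs
    return np.array(rows)

X = pb12()

# Hypothetical responses, e.g. percent fucosylation per culture run.
rng = np.random.default_rng(3)
y = 50 + 4 * X[:, 0] - 3 * X[:, 4] + rng.normal(scale=0.5, size=12)

# Main effect of each factor: mean at (+1) minus mean at (-1).
effects = (X.T @ y) / 6
print(np.round(effects, 2))   # large |effect| flags variables worth optimizing
```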
Sobottka, Stephan B; Töpfer, Armin; Eberlein-Gonska, Maria; Schackert, Gabriele; Albrecht, D Michael
2010-01-01
Six Sigma is an innovative management approach for reaching practicable zero-defect quality in medical service processes. The Six Sigma principle utilizes strategies which are based on quantitative measurements and which seek to optimize processes and limit deviations or dispersion from the target process. Hence, Six Sigma aims to eliminate errors or quality problems of all kinds. A pilot project to optimize the preparation for neurosurgery showed that the Six Sigma method enhanced patient safety in medical care, while at the same time disturbances in hospital processes and failure costs could be avoided. All six defined safety-relevant quality indicators were significantly improved by changes in the workflow using a standardized process- and patient-oriented approach. Certain defined quality standards, such as 100% complete surgical preparation at the start of surgery and the required initial contact of the surgeon with the patient/surgical record on the eve of surgery, could be fulfilled within the range of practicable zero-defect quality. Likewise, the degree of completion of the surgical records by 4 p.m. on the eve of surgery and their quality could be improved by factors of 170 and 16, respectively, at sigma values of 4.43 and 4.38. The other two safety quality indicators, "non-communicated changes in the OR schedule" and "completeness of the OR schedule by 12:30 a.m. on the day before surgery", also showed an impressive improvement, by factors of 2.8 and 7.7, respectively, corresponding to sigma values of 3.34 and 3.51. The results of this pilot project demonstrate that the Six Sigma method is eminently suitable for improving the quality of medical processes. In our experience this methodology is suitable even for complex clinical processes with a variety of stakeholders. In particular, in processes in which patient safety plays a key role, the objective of achieving zero-defect quality is reasonable and should definitely be aspired to. Copyright © 2010. Published by Elsevier GmbH.
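Sigma values like those reported (4.43, 3.51, etc.) are conventionally derived from the defect rate via the normal quantile plus the customary 1.5-sigma shift. A sketch of that conversion with hypothetical defect counts:

```python
from scipy.stats import norm

def sigma_level(defects: int, opportunities: int) -> float:
    """Process sigma from a defect rate, using the conventional 1.5-sigma shift."""
    dpmo = 1e6 * defects / opportunities          # defects per million opportunities
    return norm.ppf(1 - dpmo / 1e6) + 1.5

# Hypothetical counts: 3 incomplete surgical records out of 1000.
print(round(sigma_level(3, 1000), 2))             # about 4.25
```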
Ellis, Beverley; Herbert, Stuart Ian
2011-01-01
To identify key elements and characteristics of complex adaptive systems (CAS) relevant to implementing clinical governance, drawing on lessons from quality improvement programmes and the use of informatics in primary care. The research strategy includes a literature review to develop theoretical models of clinical governance of quality improvement in primary care organisations (PCOs) and a survey of PCOs. Complex adaptive system theories are a valuable tool to help make sense of natural phenomena, which include human responses to problem solving within the sampled PCOs. The research commenced with a survey; 76% (n = 16) of respondents preferred to support the implementation of clinical governance initiatives guided by outputs from general practice electronic health records. There was considerable variation in the way in which consultation data were captured, recorded and organised. Incentivised information sharing led to consensus on coding policies and models of data recording ahead of national contractual requirements. Informatics was acknowledged as a mechanism to link electronic health record outputs, quality improvement and resources. Investment in informatics was identified as a development priority in order to embed clinical governance principles in practice. Complex adaptive system theory usefully describes evolutionary change processes, providing insight into how the origins of quality assurance were predicated on rational reductionism and linearity. New forms of governance do not neutralise previous models, but add further dimensions to them. Clinical governance models have moved from deterministic and 'objective' factors to incorporate cultural aspects, with feedback about quality enabled by informatics. The socio-technical lessons highlighted should inform healthcare management.
Creating an Overall Environmental Quality Index to Examine Health Outcomes
The interaction between environmental conditions and human health arises from complex processes. Environmental exposures tend to cluster, and disamenities such as landfills or industrial plants are often located in areas with a high percentage of minority and poor residents. Wh...
European Society of Gynaecologic Oncology Quality Indicators for Advanced Ovarian Cancer Surgery.
Querleu, Denis; Planchamp, François; Chiva, Luis; Fotopoulou, Christina; Barton, Desmond; Cibula, David; Aletti, Giovanni; Carinelli, Silvestro; Creutzberg, Carien; Davidson, Ben; Harter, Philip; Lundvall, Lene; Marth, Christian; Morice, Philippe; Rafii, Arash; Ray-Coquard, Isabelle; Rockall, Andrea; Sessa, Cristiana; van der Zee, Ate; Vergote, Ignace; du Bois, Andreas
2016-09-01
The surgical management of advanced ovarian cancer involves complex surgery. Implementation of a quality management program has a major impact on survival. The goal of this work was to develop a list of quality indicators (QIs) for advanced ovarian cancer surgery that can be used to audit and improve clinical practice. This task was carried out under the auspices of the European Society of Gynaecologic Oncology (ESGO). Quality indicators were based on scientific evidence and/or expert consensus. A 4-step evaluation process included a systematic literature search for the identification of potential QIs and the documentation of scientific evidence, physical meetings of an ad hoc multidisciplinary International Development Group, an internal validation of the targets and scoring system, and an external review process involving physicians and patients. Ten structural, process, or outcome indicators were selected. Quality indicators 1 to 3 relate to achievement of complete cytoreduction, caseload in the center, and training and experience of the surgeon. Quality indicators 4 to 6 relate to overall management, including active participation in clinical research, a decision-making process within a structured multidisciplinary team, and preoperative workup. Quality indicator 7 addresses the high value of adequate perioperative management. Quality indicators 8 to 10 highlight the need for recording pertinent information relevant to improvement of quality. An ESGO-approved template for the operative report has been designed. Quality indicators were described using a structured format specifying what the indicator is measuring, measurability specifications, and targets. Each QI was associated with a score, and an assessment form was built. The ESGO quality criteria can be used for self-assessment, for institutional or governmental quality assurance programs, and for the certification of centers. Quality indicators and corresponding targets give practitioners and health administrators a quantitative basis for improving care and organizational processes in the surgical management of advanced ovarian cancer.
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
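As a hedged illustration of the kind of cost/quality model described, here is a Python sketch. These are not the actual NASA models; the base rate, multipliers, and the inverse IV&V relation are all hypothetical placeholders.

def expected_error_rate(ksloc, criticality=1.0, env_factor=1.0, competence=1.0):
    """Expected latent errors per KSLOC, scaled by criticality, development
    environment, and team competence; the base rate is an assumed placeholder."""
    base_rate = 5.0
    return base_rate * criticality * env_factor / competence

def ivv_labor_fraction(error_rate_target, criticality):
    """Stand-in for the stated inverse relation between product error rate and
    the percentage of project labor devoted to IV&V."""
    return min(0.5, 0.02 * criticality / error_rate_target)

print(expected_error_rate(1.0, criticality=1.3, competence=1.2))
print(ivv_labor_fraction(0.1, criticality=1.3))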
A method for developing standardised interactive education for complex clinical guidelines.
Vaughan, Janet I; Jeffery, Heather E; Raynes-Greenow, Camille; Gordon, Adrienne; Hirst, Jane; Hill, David A; Arbuckle, Susan
2012-11-06
Although systematic use of the Perinatal Society of Australia and New Zealand internationally endorsed Clinical Practice Guideline for Perinatal Mortality (PSANZ-CPG) improves health outcomes, implementation is inadequate. Its complexity is a feature known to be associated with non-compliance. Interactive education is effective as a guideline implementation strategy, but lacks an agreed definition. SCORPIO is an educational framework containing interactive and didactic teaching, but has not previously been used to implement guidelines. Our aim was to transform the PSANZ-CPG into an education workshop to develop quality standardised interactive education acceptable to participants for learning skills in collaborative interprofessional care. The workshop was developed using the construct of an educational framework (SCORPIO), the PSANZ-CPG, a transformation process and tutor training. After a pilot workshop with key target and stakeholder groups, modifications were made to this and subsequent workshops based on multisource written observations from interprofessional participants, tutors and an independent educator. This participatory action research process was used to monitor acceptability and educational standards. Standardised interactive education was defined as the attainment of content and teaching standards. Quantitative analysis of positive feedback, expressed as a percentage of total feedback, was used to derive a total quality score. Eight workshops were held with 181 participants and 15 different tutors. Five versions resulted from the action research methodology. Thematic analysis of multisource observations identified eight recurring education themes or quality domains used for standardisation. The two content domains were curriculum and alignment with the guideline; the six teaching domains were overload, timing, didacticism, relevance, reproducibility and participant engagement. Engagement was the most challenging theme to resolve. Tutors identified all themes for revision, whilst participants identified a number of teaching themes but no content themes. From version 1 to 5, a significant increasing trend in total quality score was obtained; participants: 55%, p=0.0001; educator: 42%, p=0.0004; tutor peers: 57%, p=0.0001. Complex clinical guidelines can be developed into a workshop acceptable to interprofessional participants. Eight quality domains provide a framework to standardise interactive teaching for complex clinical guidelines. Tutor peer review is important for content validity. This methodology may be useful for implementing other guidelines.
Kahl, Johannes; Bodroza-Solarov, Marija; Busscher, Nicolaas; Hajslova, Jana; Kneifel, Wolfgang; Kokornaczyk, Maria Olga; van Ruth, Saskia; Schulzova, Vera; Stolz, Peter
2014-10-01
Organic food quality determination needs multi-dimensional evaluation tools. The main focus is on authentication as analytical verification of the certification process. New fingerprinting approaches such as ultra-performance liquid chromatography-mass spectrometry, gas chromatography-mass spectrometry, direct analysis in real time-high-resolution mass spectrometry, as well as crystallization with and without the presence of additives, seem to be promising methods in terms of time of analysis and detection of organic system-related parameters. For further methodological development, a system approach is recommended, which also takes into account food structure aspects. Furthermore, the authentication of processed organic samples deserves more attention, since most organic food is complex and processed. © 2013 Society of Chemical Industry.
NASA Technical Reports Server (NTRS)
Singh, N. B.; Duval, W. M.
1991-01-01
Physical vapor transport processes were studied for the purpose of identifying the magnitude of convective effects on the crystal growth process. The effects of convection on crystal quality were studied by varying the aspect ratio and those thermal conditions which ultimately affect thermal convection during physical vapor transport. An important outcome of the present study was the observation that the growth rate under convection increased up to a certain value and then dropped to a constant value for high aspect ratios. This indicated that a very complex transport had occurred which could not be explained by linear stability theory. Better quality crystals grown at a low Rayleigh number confirmed that improved properties are possible in convectionless environments.
Nurse manager cognitive decision-making amidst stress and work complexity.
Shirey, Maria R; Ebright, Patricia R; McDaniel, Anna M
2013-01-01
The present study provides insight into nurse manager cognitive decision-making amidst stress and work complexity. Little is known about nurse manager decision-making amidst stress and work complexity. Because nurse manager decisions have the potential to impact patient care quality and safety, understanding their decision-making processes is useful for designing supportive interventions. This qualitative descriptive study interviewed 21 nurse managers from three hospitals to answer the research question: What decision-making processes do nurse managers utilize to address stressful situations in their nurse manager role? Face-to-face interviews incorporating components of the Critical Decision Method illuminated expert-novice practice differences. Content analysis identified one major theme and three sub-themes. The present study produced a cognitive model that guides nurse manager decision-making related to stressful situations. Experience in the role, organizational context and situation factors influenced nurse manager cognitive decision-making processes. Study findings suggest that chronic exposure to stress and work complexity negatively affects nurse manager health and their decision-making processes potentially threatening individual, patient and organizational outcomes. Cognitive decision-making varies based on nurse manager experience and these differences have coaching and mentoring implications. This present study contributes a current understanding of nurse manager decision-making amidst stress and work complexity. © 2012 Blackwell Publishing Ltd.
Resin infiltration transfer technique
Miller, David V [Pittsburgh, PA]; Baranwal, Rita [Glenshaw, PA]
2009-12-08
A process has been developed for fabricating composite structures using either reaction forming or polymer infiltration and pyrolysis techniques to densify the composite matrix. The matrix and reinforcement materials of choice can include, but are not limited to, silicon carbide (SiC) and zirconium carbide (ZrC). The novel process can be used to fabricate complex, net-shape or near-net shape, high-quality ceramic composites with a crack-free matrix.
de la Mare, William; Ellis, Nick; Pascual, Ricardo; Tickell, Sharon
2012-04-01
Simulation models have been widely adopted in fisheries for management strategy evaluation (MSE). However, in catchment management of water quality, MSE is hampered by the complexity of both decision space and the hydrological process models. Empirical models based on monitoring data provide a feasible alternative to process models; they run much faster and, by conditioning on data, they can simulate realistic responses to management actions. Using 10 years of water quality indicators from Queensland, Australia, we built an empirical model suitable for rapid MSE that reproduces the water quality variables' mean and covariance structure, adjusts the expected indicators through local management effects, and propagates effects downstream by capturing inter-site regression relationships. Empirical models enable managers to search the space of possible strategies using rapid assessment. They provide not only realistic responses in water quality indicators but also variability in those indicators, allowing managers to assess strategies in an uncertain world. Copyright © 2012 Elsevier Ltd. All rights reserved.
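A minimal Python sketch of such an empirical model follows. These are our assumptions for illustration, not the authors' code: indicators are drawn from their fitted mean/covariance, a local management effect is applied, and the effect is propagated downstream through a fitted inter-site regression. All numbers are invented.

import numpy as np

rng = np.random.default_rng(0)
mu = np.array([3.0, 1.2])              # fitted site means, e.g. TN and TP
cov = np.array([[0.40, 0.10],
                [0.10, 0.20]])         # fitted covariance of the indicators
beta = 0.6                             # upstream -> downstream regression slope

def simulate(n, management_effect=0.0):
    upstream = rng.multivariate_normal(mu, cov, size=n)
    upstream[:, 0] += management_effect        # e.g. reduced nutrient export
    downstream = beta * upstream[:, 0] + rng.normal(0.0, 0.3, n)
    return downstream

print(simulate(10000).mean() - simulate(10000, management_effect=-0.5).mean())
# expected downstream difference ~ beta * 0.5 = 0.3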
Food structure: Its formation and relationships with other properties.
Joardder, Mohammad U H; Kumar, Chandan; Karim, M A
2017-04-13
Food materials are complex in nature, with heterogeneous, amorphous, hygroscopic and porous properties. During processing, the microstructure of food materials changes, which significantly affects other properties of the food. An appropriate understanding of the microstructure of the raw food material and its evolution during processing is critical in order to understand and accurately describe dehydration processes and to anticipate quality. This review critically assesses the factors that influence the modification of microstructure in the course of drying of fruits and vegetables. The effect of simultaneous heat and mass transfer on microstructure in various drying methods is investigated. Effects of changes in microstructure on other functional properties of dried foods are discussed. After an extensive review of the literature, it is found that the development of food structure significantly depends on fresh food properties and process parameters. Also, modification of the microstructure influences the other properties of the final product. An enhanced understanding of the relationships between food microstructure, drying process parameters and final product quality will facilitate the energy-efficient optimum design of food processors in order to achieve high-quality food.
A Review on Advanced Treatment of Pharmaceutical Wastewater
NASA Astrophysics Data System (ADS)
Guo, Y.; Qi, P. S.; Liu, Y. Z.
2017-05-01
The composition of pharmaceutical wastewater is complex, with high concentrations of organic matter, microbial toxicity, high salinity, and poor biodegradability. After secondary treatment, trace amounts of suspended solids and dissolved organic matter still remain. To improve the effluent quality of pharmaceutical wastewater, advanced treatment is essential. In this paper, the classification of pharmaceutical technologies was introduced, and the characteristics of pharmaceutical wastewater effluent quality were summarized. The methods of advanced treatment of pharmaceutical wastewater were reviewed afterwards, which included coagulation and sedimentation, flotation, activated carbon adsorption, membrane separation, advanced oxidation processes, and biological treatment. Meanwhile, the characteristics of each process were described.
Catalysts and process for liquid hydrocarbon fuel production
White, Mark G.; Ranaweera, Samantha A.; Henry, William P.
2016-08-02
The present invention provides a novel process and system in which a mixture of carbon monoxide and hydrogen synthesis gas, or syngas, is converted into hydrocarbon mixtures composed of high quality distillates, gasoline components, and lower molecular weight gaseous olefins in one reactor or step. The invention utilizes a novel supported bimetallic ion complex catalyst for conversion, and provides methods of preparing such novel catalysts and use of the novel catalysts in the process and system of the invention.
ENVIRONMENTAL QUALITY AND LANDSCAPE-RISK ASSESSMENT IN THE YANTRA RIVER BASIN
Landscape characteristics exert their impact on the processes occurring in river basins in many directions and may influence environmental security and some related constraints, such as extreme natural events, in different ways. The complex nature of landscape structure and dynam...
Changes in landscape heterogeneity, historic landcover change, and human disturbance regimes are governed by complex interrelated landscape processes that modify lake water quality through the addition of nutrients, sediment, anthropogenic chemicals, and changes in major ion conc...
Reliable Radiation Hybrid Maps: An Efficient Scalable Clustering-based Approach
USDA-ARS?s Scientific Manuscript database
The process of mapping markers from radiation hybrid mapping (RHM) experiments is equivalent to the traveling salesman problem and, thereby, has combinatorial complexity. As an additional problem, experiments typically result in some unreliable markers that reduce the overall quality of the map. We ...
Specimen preparation for high-resolution cryo-EM
Passmore, Lori A.; Russo, Christopher J.
2016-01-01
Imaging a material with electrons at near-atomic resolution requires a thin specimen that is stable in the vacuum of the transmission electron microscope. For biological samples, this comprises a thin layer of frozen aqueous solution containing the biomolecular complex of interest. The process of preparing a high-quality specimen is often the limiting step in the determination of structures by single-particle electron cryomicroscopy (cryo-EM). Here we describe a systematic approach for going from a purified biomolecular complex in aqueous solution to high-resolution electron micrographs that are suitable for 3D structure determination. This includes a series of protocols for the preparation of vitrified specimens on various specimen supports, including all-gold and graphene. We also describe techniques for troubleshooting when a preparation fails to yield suitable specimens, and common mistakes to avoid during each part of the process. Finally, we include recommendations for obtaining the highest quality micrographs from prepared specimens with current microscope, detector and support technology. PMID:27572723
Cost and quality implications of discrepancies between admitting and discharge diagnoses.
McNutt, Robert; Johnson, Tricia; Kane, Jason; Ackerman, Mariel; Odwazny, Richard; Bardhan, Jaydeep
2012-01-01
Presenting and discharge diagnoses of hospitalized patients may differ as a result of patient complexity, diagnostic dilemmas, or errors in clinical judgment at the time of primary assessment. When diagnoses at admission and discharge are not in agreement, this discrepancy may indicate more complex processes of care and resultant costs. It is unclear whether surrogate measures reflecting quality of care are impacted by discrepant diagnoses. To assess whether an association exists between admitting and discharge International Classification of Diseases, Ninth Revision (ICD-9) diagnosis codes and other quality markers including hospital length of stay, total cost of care, and 30-day readmission rate. This was a retrospective, cross-sectional analysis of general internal medicine patients aged 18 years and older. Diagnosis discrepancy was defined as a difference between the 3-digit ICD-9 diagnosis code at admission and the principal 3-digit ICD-9 diagnosis code at discharge. Sixty-eight percent of patients had a diagnosis discrepancy. Diagnosis discrepancy was associated with a 0.41-day increase in length of stay (P < .001), $663 increase in direct costs (P < .001), and a 1.55 times greater odds of readmission within 30 days (P < .001). Diagnosis discrepancy was associated with hospital quality outcome measures. This finding likely reflects variations in patients' diagnostic complexity.
Fernandez-Ricaud, Luciano; Kourtchenko, Olga; Zackrisson, Martin; Warringer, Jonas; Blomberg, Anders
2016-06-23
Phenomics is a field in functional genomics that records variation in organismal phenotypes in the genetic, epigenetic or environmental context at a massive scale. For microbes, the key phenotype is the growth in population size because it contains information that is directly linked to fitness. Due to technical innovations and extensive automation our capacity to record complex and dynamic microbial growth data is rapidly outpacing our capacity to dissect and visualize this data and extract the fitness components it contains, hampering progress in all fields of microbiology. To automate visualization, analysis and exploration of complex and highly resolved microbial growth data as well as standardized extraction of the fitness components it contains, we developed the software PRECOG (PREsentation and Characterization Of Growth-data). PRECOG allows the user to quality control, interact with and evaluate microbial growth data with ease, speed and accuracy, also in cases of non-standard growth dynamics. Quality indices filter high- from low-quality growth experiments, reducing false positives. The pre-processing filters in PRECOG are computationally inexpensive and yet functionally comparable to more complex neural network procedures. We provide examples where data calibration, project design and feature extraction methodologies have a clear impact on the estimated growth traits, emphasising the need for proper standardization in data analysis. PRECOG is a tool that streamlines growth data pre-processing, phenotypic trait extraction, visualization, distribution and the creation of vast and informative phenomics databases.
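A hedged Python sketch of standard fitness-component extraction from a microbial growth curve (lag, maximal rate, yield) is shown below; PRECOG's actual filters and algorithms are more elaborate, and the curve here is synthetic.

import numpy as np

def growth_traits(t, od):
    log_od = np.log2(np.clip(od, 1e-6, None))
    slopes = np.gradient(log_od, t)              # doublings per hour
    i = int(slopes.argmax())
    rate = slopes[i]                             # maximal growth rate
    lag = t[i] - (log_od[i] - log_od[0]) / rate  # tangent-intercept lag time
    return {"lag_h": lag, "rate_dbl_per_h": rate, "yield": od.max() - od[0]}

t = np.linspace(0, 24, 97)
od = 0.05 + 0.9 / (1.0 + np.exp(-(t - 8.0)))     # logistic stand-in for data
print(growth_traits(t, od))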
Reeve, Joanne; Cooper, Lucy; Harrington, Sean; Rosbottom, Peter; Watkins, Jane
2016-09-06
Health services face the challenges created by complex problems, and so need complex intervention solutions. However they also experience ongoing difficulties in translating findings from research in this area in to quality improvement changes on the ground. BounceBack was a service development innovation project which sought to examine this issue through the implementation and evaluation in a primary care setting of a novel complex intervention. The project was a collaboration between a local mental health charity, an academic unit, and GP practices. The aim was to translate the charity's model of care into practice-based evidence describing delivery and impact. Normalisation Process Theory (NPT) was used to support the implementation of the new model of primary mental health care into six GP practices. An integrated process evaluation evaluated the process and impact of care. Implementation quickly stalled as we identified problems with the described model of care when applied in a changing and variable primary care context. The team therefore switched to using the NPT framework to support the systematic identification and modification of the components of the complex intervention: including the core components that made it distinct (the consultation approach) and the variable components (organisational issues) that made it work in practice. The extra work significantly reduced the time available for outcome evaluation. However findings demonstrated moderately successful implementation of the model and a suggestion of hypothesised changes in outcomes. The BounceBack project demonstrates the development of a complex intervention from practice. It highlights the use of Normalisation Process Theory to support development, and not just implementation, of a complex intervention; and describes the use of the research process in the generation of practice-based evidence. Implications for future translational complex intervention research supporting practice change through scholarship are discussed.
Calvet, Amandine; Ryder, Alan G
2014-08-20
The quality of the cell culture media used in biopharmaceutical manufacturing is a crucial factor affecting bioprocess performance and the quality of the final product. Due to their complex composition these media are inherently unstable, and significant compositional variations can occur, particularly when in the prepared liquid state. For example, photo-degradation of cell culture media can have adverse effects on cell viability and thus process performance. There is therefore, from quality control, quality assurance and process management viewpoints, an urgent demand for the development of rapid and inexpensive tools for the stability monitoring of these complex mixtures. Spectroscopic methods, based on fluorescence or Raman measurements, have now become viable alternatives to more time-consuming and expensive (on a unit analysis cost basis) chromatographic and/or mass spectrometry based methods for routine analysis of media. Here we demonstrate the application of surface enhanced Raman scattering (SERS) spectroscopy for the simple, fast analysis of cell culture media degradation. Once stringent reproducibility controls are implemented, chemometric data analysis methods can then be used to rapidly monitor the compositional changes in chemically defined media. SERS shows clearly that even when media are stored at low temperature (2-8°C) and in the dark, significant chemical changes occur, particularly with regard to cysteine/cystine concentration. Copyright © 2014 Elsevier B.V. All rights reserved.
A quality by design approach to scale-up of high-shear wet granulation process.
Pandey, Preetanshu; Badawy, Sherif
2016-01-01
High-shear wet granulation is a complex process that in turn makes scale-up a challenging task. Scale-up of high-shear wet granulation process has been studied extensively in the past with various different methodologies being proposed in the literature. This review article discusses existing scale-up principles and categorizes the various approaches into two main scale-up strategies - parameter-based and attribute-based. With the advent of quality by design (QbD) principle in drug product development process, an increased emphasis toward the latter approach may be needed to ensure product robustness. In practice, a combination of both scale-up strategies is often utilized. In a QbD paradigm, there is also a need for an increased fundamental and mechanistic understanding of the process. This can be achieved either by increased experimentation that comes at higher costs, or by using modeling techniques, that are also discussed as part of this review.
Lessons from industry: one school's transformation toward "lean" curricular governance.
Stratton, Terry D; Rudy, David W; Sauer, Marlene J; Perman, Jay A; Jennings, C Darrell
2007-04-01
As medical education grapples with organizational calls for centralized curricular oversight, programs may be compelled to respond by establishing highly vertical, stacked governance structures. Although these models offer discrete advantages over the horizontal, compartmentalized structures they are designed to replace, they pose new challenges to ensuring curricular quality and the educational innovations that drive the curricula. The authors describe a hybrid quality-assurance (QA) governance structure introduced in 2003 at the University of Kentucky College of Medicine (UKCOM) that ensures centralized curricular oversight of the educational product while allowing individualized creative control over the educational process. Based on a Lean production model, this approach draws on industry experiences that strategically separate institutional accountability (management) for a quality curriculum from the decision-making processes required to ensure it (production). In so doing, the authors acknowledge general similarities and key differences between overseeing the manufacture of a complex product versus the education of a physician, emphasizing the structured, sequential, and measurable nature of each process. Further, the authors briefly trace the emergence of quality approaches in manufacturing and discuss the philosophical changes that accompany the transition to an institutional governance system that relies on vigorous, robust performance measures to offer continuous feedback on curricular quality.
Early warning risk assessment for drinking water production: decoding subtle evidence
NASA Astrophysics Data System (ADS)
Merz, Christoph; Lischeid, Gunnar; Böttcher, Steven
2016-04-01
Due to increasing demands for high quality water for drinking water supply all over the world, there is an acute need for methods to detect possible threats to groundwater resources early. Drinking water production in complex geologic settings carries a particularly high risk of unexpected degradation of groundwater quality due to the unknown interplay between anthropogenically induced hydraulic changes and geochemical processes. This study investigates the possible benefit of Principal Component Analysis (PCA) for groundwater and drinking water management using common sets of physicochemical monitoring data. The approach was used to identify the prevailing processes driving groundwater quality shifts and related threats, which might be masked in anthropogenically impacted aquifer systems. The approach was applied to a data set from a waterworks located in the state of Brandenburg, NE Germany, which has been operating for nearly four decades. The region faces increasing demands due to growing peri-urban settlements. The PCA subdivided the data set according to different strengths of effects induced by differing geochemical processes at different sites in the capture zone of the waterworks and varying in time. Thus a spatial assessment of these processes could be performed as well as a temporal assessment of long-term groundwater quality shifts in the extracted water. The analysis revealed that over the period of 16 years of water withdrawal the geochemistry of the extracted groundwater had become increasingly dissimilar to the characteristics found at the majority of observation wells. This component could be identified as highly mineralized CaSO4-dominated water from unexamined deeper zones of the aquifer system. Due to the complex geochemical and hydraulic interactions in the system, this process was masked and was not evident in the data set without validation by the applied statistical analysis. The findings give a clear indication of a potential threat to the groundwater resources in this region, with a danger of drinking water contamination in the medium term.
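A minimal Python sketch of the PCA screening step described above. The matrix X stands in for a monitoring data set (rows = samples, columns = physicochemical variables such as Ca, SO4, pH, EC); here it is random, purely for illustration.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = np.random.default_rng(1).normal(size=(120, 6))
Xs = StandardScaler().fit_transform(X)           # PCA on standardized data
pca = PCA(n_components=3).fit(Xs)
scores = pca.transform(Xs)                       # sample scores per component
print(pca.explained_variance_ratio_)
# Loadings (pca.components_) are then inspected to attribute components to
# processes, e.g. a CaSO4-dominated deep groundwater signal growing over time.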
Sun, Fei; Xu, Bing; Zhang, Yi; Dai, Shengyun; Yang, Chan; Cui, Xianglong; Shi, Xinyuan; Qiao, Yanjiang
2016-01-01
The quality of Chinese herbal medicine tablets suffers from batch-to-batch variability due to a lack of manufacturing process understanding. In this paper, the Panax notoginseng saponins (PNS) immediate release tablet was taken as the research subject. By defining the dissolution of five active pharmaceutical ingredients and the tablet tensile strength as critical quality attributes (CQAs), the influences of both the manipulated process parameters, introduced by an orthogonal experimental design, and the intermediate granules' properties on the CQAs were fully investigated by different chemometric methods, such as partial least squares (PLS), orthogonal projection to latent structures (OPLS), and multiblock partial least squares (MBPLS). By analyzing the loadings plots and variable importance in the projection (VIP) indexes, the granule particle sizes and the minimal punch tip separation distance in tableting were identified as critical process parameters. Additionally, the MBPLS model suggested that the lubrication time in the final blending was also important in predicting tablet quality attributes. From the calculated block importance in the projection indexes, the tableting unit was confirmed to be the critical process unit of the manufacturing line. The results demonstrated that the combinatorial use of different multivariate modeling methods could help in understanding the complex process relationships as a whole. The output of this study can then be used to define a control strategy to improve the quality of the PNS immediate release tablet. PMID:27932865
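A hedged Python sketch of the PLS / variable-importance-in-projection (VIP) step: X holds process parameters and granule properties, y a tablet CQA. The data here are synthetic; only the VIP formula itself is standard.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 6))                     # e.g. particle size, punch gap
y = 0.8 * X[:, 0] + 0.5 * X[:, 3] + rng.normal(0.0, 0.2, 40)
pls = PLSRegression(n_components=2).fit(X, y.reshape(-1, 1))

def vip(pls):
    W, T, Q = pls.x_weights_, pls.x_scores_, pls.y_loadings_
    ssy = np.diag(T.T @ T) * Q.ravel() ** 2      # y-variance explained per LV
    p = W.shape[0]
    return np.sqrt(p * ((W ** 2) / (W ** 2).sum(axis=0)) @ ssy / ssy.sum())

print(vip(pls))   # variables with VIP > 1 are flagged as critical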
Worku, Mohammed; de Meulenaer, Bruno; Duchateau, Luc; Boeckx, Pascal
2018-03-01
Although various studies have assessed altitude, shade and postharvest processing effects on the biochemical content and quality of coffee beans, data on their interactions are scarce. The individual and interactive effects of these factors on the caffeine, chlorogenic acids (CGA) and sucrose contents as well as the physical and sensory qualities of green coffee beans from large plantations in southwestern Ethiopia were evaluated. Caffeine and CGA contents decreased with increasing altitude, declining by 0.12 and 1.23 g kg⁻¹ per 100 m, respectively. Sucrose content increased with altitude; however, the altitude effect was significant for wet-processed beans (3.02 g kg⁻¹ per 100 m) but not for dry-processed beans (0.36 g kg⁻¹ per 100 m). Similarly, sucrose content increased with altitude with a much stronger effect for coffee grown without shade (2.11 g kg⁻¹ per 100 m) than for coffee grown under shade (0.93 g kg⁻¹ per 100 m). Acidity increased with altitude when coffee was grown under shade (0.22 points per 100 m), but no significant altitude effect was observed for coffee grown without shade. Beans grown without shade showed a higher physical quality score for dry (37.2) than for wet processing (29.1). These results generally underline the complex interaction effects between altitude and shade or postharvest processing on the biochemical composition and quality of green arabica coffee beans. Copyright © 2017. Published by Elsevier Ltd.
Optical Computers and Space Technology
NASA Technical Reports Server (NTRS)
Abdeldayem, Hossin A.; Frazier, Donald O.; Penn, Benjamin; Paley, Mark S.; Witherow, William K.; Banks, Curtis; Hicks, Rosilen; Shields, Angela
1995-01-01
The rapidly increasing demand for greater speed and efficiency on the information superhighway requires significant improvements over conventional electronic logic circuits. Optical interconnections and optical integrated circuits are strong candidates to overcome the extreme limitations that conventional electronic logic circuits impose on the growth of speed and complexity of today's computations. The new optical technology has increased the demand for high quality optical materials. NASA's recent involvement in processing optical materials in space has demonstrated that a new and unique class of high quality optical materials is processible in a microgravity environment. Microgravity processing can induce improved order in these materials and could have a significant impact on the development of optical computers. We will discuss NASA's role in processing these materials and report on some of the associated nonlinear optical properties which are quite useful for optical computer technology.
Statistical process control: separating signal from noise in emergency department operations.
Pimentel, Laura; Barrueto, Fermin
2015-05-01
Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
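A minimal Python sketch of the XmR (individuals and moving range) chart recommended above. The daily values are illustrative (e.g. door-to-doctor minutes); 2.66 is the standard XmR control-limit constant.

import numpy as np

x = np.array([41, 39, 44, 38, 42, 45, 40, 43, 39, 65, 41, 40], dtype=float)
mr = np.abs(np.diff(x))                   # moving ranges between consecutive days
center, mr_bar = x.mean(), mr.mean()
ucl, lcl = center + 2.66 * mr_bar, center - 2.66 * mr_bar
signals = np.flatnonzero((x > ucl) | (x < lcl))   # special cause variation
print(f"CL={center:.1f} UCL={ucl:.1f} LCL={lcl:.1f} signals at days {signals}")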
BatMass: a Java Software Platform for LC-MS Data Visualization in Proteomics and Metabolomics.
Avtonomov, Dmitry M; Raskind, Alexander; Nesvizhskii, Alexey I
2016-08-05
Mass spectrometry (MS) coupled to liquid chromatography (LC) is a commonly used technique in metabolomic and proteomic research. As the size and complexity of LC-MS-based experiments grow, it becomes increasingly more difficult to perform quality control of both raw data and processing results. In a practical setting, quality control steps for raw LC-MS data are often overlooked, and assessment of an experiment's success is based on some derived metrics such as "the number of identified compounds". The human brain interprets visual data much better than plain text, hence the saying "a picture is worth a thousand words". Here, we present the BatMass software package, which allows for performing quick quality control of raw LC-MS data through its fast visualization capabilities. It also serves as a testbed for developers of LC-MS data processing algorithms by providing a data access library for open mass spectrometry file formats and a means of visually mapping processing results back to the original data. We illustrate the utility of BatMass with several use cases of quality control and data exploration.
Zhong, Yi; Zhu, Jieqiang; Yang, Zhenzhong; Shao, Qing; Fan, Xiaohui; Cheng, Yiyu
2018-01-31
To ensure pharmaceutical quality, chemistry, manufacturing and control (CMC) research is essential. However, due to the inherent complexity of Chinese medicine (CM), CMC study of CM remains a great challenge for academia, industry, and regulatory agencies. Recently, the quality-marker (Q-marker) concept was proposed for establishing quality standards and quality analysis approaches for Chinese medicine, which sheds light on CMC study of Chinese medicine. Here the manufacturing process of Panax notoginseng saponins (PNS) is taken as a case study, and the present work establishes a Q-marker-based research strategy for CMC of Chinese medicine. The Q-markers of PNS are selected and established by integrating the chemical profile with pharmacological activities. Then, the key processes of PNS manufacturing are identified by material flow analysis. Furthermore, modeling algorithms are employed to explore the relationship between Q-markers and critical process parameters (CPPs) of the key processes. Finally, the CPPs of the key processes are optimized in order to improve process efficiency. Among the 97 identified compounds, notoginsenoside R1 and ginsenosides Rg1, Re, Rb1 and Rd are selected as the Q-markers of PNS. Our analysis of PNS manufacturing shows that the extraction process and the column chromatography process are the key processes. With the CPPs of each process as the inputs and the Q-markers' contents as the outputs, two process prediction models are built separately for the extraction and column chromatography processes of Panax notoginseng, both of which possess good prediction ability. Based on the efficiency models we constructed for the extraction and column chromatography processes, the optimal CPPs of both processes are calculated. Our results show that the Q-marker-based CMC research strategy can be applied to analyze the manufacturing processes of Chinese medicine to assure product quality and promote the efficiency of key processes simultaneously. Copyright © 2018 Elsevier GmbH. All rights reserved.
Hagerman, Nancy S; Varughese, Anna M; Kurth, C Dean
2014-06-01
Cognitive aids are tangible or intangible instruments that guide users in decision-making and in the completion of a complex series of tasks. Common examples include mnemonics, checklists, and algorithms. Cognitive aids constitute very effective approaches to achieve well tolerated, high quality healthcare because they promote highly reliable processes that reduce the likelihood of failure. This review describes recent advances in quality improvement for pediatric anesthesiology with emphasis on application of cognitive aids to impact patient safety and outcomes. Quality improvement encourages the examination of systems to create stable processes and ultimately high-value care. Quality improvement initiatives in pediatric anesthesiology have been shown to improve outcomes and the delivery of efficient and effective care at many institutions. The use of checklists, in particular, improves adherence to evidence-based care in crisis situations, decreases catheter-associated bloodstream infections, reduces blood product utilization, and improves communication during the patient handoff process. Use of this simple tool has been associated with decreased morbidity, fewer medical errors, improved provider satisfaction, and decreased mortality in nonanesthesia disciplines as well. Successful quality improvement initiatives utilize cognitive aids such as checklists and have been shown to optimize pediatric patient experience and anesthesia outcomes and reduce perioperative complications.
Influence of rainfall and catchment characteristics on urban stormwater quality.
Liu, An; Egodawatta, Prasanna; Guan, Yuntao; Goonetilleke, Ashantha
2013-02-01
The accuracy and reliability of urban stormwater quality modelling outcomes are important for stormwater management decision making. The commonly adopted approach where only a limited number of factors are used to predict urban stormwater quality may not adequately represent the complexity of the quality response to a rainfall event or site-to-site differences to support efficient treatment design. This paper discusses an investigation into the influence of rainfall and catchment characteristics on urban stormwater quality in order to investigate the potential areas for errors in current stormwater quality modelling practices. It was found that the influence of rainfall characteristics on pollutant wash-off is step-wise based on specific thresholds. This means that a modelling approach where the wash-off process is predicted as a continuous function of rainfall intensity and duration is not appropriate. Additionally, other than conventional catchment characteristics, namely, land use and impervious surface fraction, other catchment characteristics such as impervious area layout, urban form and site specific characteristics have an important influence on both, pollutant build-up and wash-off processes. Finally, the use of solids as a surrogate to estimate other pollutant species was found to be inappropriate. Individually considering build-up and wash-off processes for each pollutant species should be the preferred option. Copyright © 2012 Elsevier B.V. All rights reserved.
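A toy Python sketch of the step-wise (threshold) wash-off behaviour the study reports, as opposed to a continuous function of rainfall intensity. The threshold values and fractions are invented for illustration only.

def washoff_fraction(intensity_mm_per_h):
    if intensity_mm_per_h < 20:
        return 0.10   # low-intensity regime: only loosely attached particles
    if intensity_mm_per_h < 40:
        return 0.40   # intermediate regime
    return 0.80       # high-intensity regime mobilizes most of the load

print([washoff_fraction(i) for i in (5, 25, 60)])   # -> [0.1, 0.4, 0.8]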
Yagahara, Ayako; Yokooka, Yuki; Jiang, Guoqian; Tsuji, Shintarou; Fukuda, Akihisa; Nishimoto, Naoki; Kurowarabi, Kunio; Ogasawara, Katsuhiko
2018-03-01
Describing complex mammography examination processes is important for improving the quality of mammograms. It is often difficult for experienced radiologic technologists to explain the process because their techniques depend on their experience and intuition. In our previous study, we analyzed the process using a new bottom-up hierarchical task analysis and identified key components of the process. Leveraging the results of the previous study, the purpose of this study was to construct a mammographic examination process ontology to formally describe the relationships between the process and image evaluation criteria to improve the quality of mammograms. First, we identified and created root classes: task, plan, and clinical image evaluation (CIE). Second, we described an "is-a" relation referring to the result of the previous study and the structure of the CIE. Third, the procedural steps in the ontology were described using the new properties: "isPerformedBefore," "isPerformedAfter," and "isPerformedAfterIfNecessary." Finally, the relationships between tasks and CIEs were described using the "isAffectedBy" property to represent the influence of the process on image quality. In total, there were 219 classes in the ontology. By introducing new properties related to the process flow, a sophisticated mammography examination process could be visualized. In relationships between tasks and CIEs, it became clear that the tasks affecting the evaluation criteria related to positioning were greater in number than those for image quality. We developed a mammographic examination process ontology that makes knowledge explicit for a comprehensive mammography process. Our research will support education and help promote knowledge sharing about mammography examination expertise.
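A hedged Python (rdflib) sketch of the ontology pattern described. The root class names and the property spellings follow the abstract; the namespace URI and the example task/criterion classes are invented for illustration.

from rdflib import Graph, Namespace, RDF, RDFS

MMG = Namespace("http://example.org/mammography#")
g = Graph()
for cls in ("Task", "Plan", "ClinicalImageEvaluation"):
    g.add((MMG[cls], RDF.type, RDFS.Class))

g.add((MMG.PositionBreast, RDFS.subClassOf, MMG.Task))   # "is-a" relation
g.add((MMG.CompressBreast, RDFS.subClassOf, MMG.Task))
g.add((MMG.PositionBreast, MMG.isPerformedBefore, MMG.CompressBreast))
g.add((MMG.NippleInProfile, RDFS.subClassOf, MMG.ClinicalImageEvaluation))
g.add((MMG.NippleInProfile, MMG.isAffectedBy, MMG.PositionBreast))
print(g.serialize(format="turtle"))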
Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben
2014-01-01
This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design, and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed handling qualities results back to a conceptual design process are proposed for future work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dinwiddie, Ralph Barton; Dehoff, Ryan R; Lloyd, Peter D
2013-01-01
Oak Ridge National Laboratory (ORNL) has been utilizing the ARCAM electron beam melting technology to additively manufacture complex geometric structures directly from powder. Although the technology has demonstrated the ability to decrease costs, decrease manufacturing lead-time and fabricate complex structures that are impossible to fabricate through conventional processing techniques, certification of the component quality can be challenging. Because the process involves the continuous deposition of successive layers of material, each layer can be examined without destructively testing the component. However, in-situ process monitoring is difficult due to metallization on inside surfaces caused by evaporation and condensation of metal from the melt pool. This work describes a solution to one of the challenges of continuously imaging inside the chamber during the EBM process. Here, the utilization of a continuously moving Mylar film canister is described. Results will be presented related to in-situ process monitoring and how this technique results in improved mechanical properties and reliability of the process.
Duncan, Fiona; Haigh, Carol
2013-10-01
To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The study design was process control and quality improvement. Routine prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeons, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service, both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that have led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.
Grant, Aileen; Dreischulte, Tobias; Treweek, Shaun; Guthrie, Bruce
2012-08-28
Trials of complex interventions are criticized for being 'black box', so the UK Medical Research Council recommends carrying out a process evaluation to explain the trial findings. We believe it is good practice to pre-specify and publish process evaluation protocols to set standards and minimize bias. Unlike protocols for trials, little guidance or standards exist for the reporting of process evaluations. This paper presents the mixed-method process evaluation protocol of a cluster randomized trial, drawing on a framework designed by the authors. This mixed-method evaluation is based on four research questions and maps data collection to a logic model of how the data-driven quality improvement in primary care (DQIP) intervention is expected to work. Data collection will be predominately by qualitative case studies in eight to ten of the trial practices, focus groups with patients affected by the intervention, and quantitative analysis of routine practice data, trial outcome and questionnaire data, and data from the DQIP intervention. We believe that pre-specifying the intentions of a process evaluation can help to minimize bias arising from potentially misleading post-hoc analysis. We recognize it is also important to retain flexibility to examine the unexpected and the unintended. From that perspective, a mixed-methods evaluation allows the combination of exploratory and flexible qualitative work, and more pre-specified quantitative analysis, with each method contributing to the design, implementation and interpretation of the other. As well as strengthening the study, the authors hope to stimulate discussion among their academic colleagues about publishing protocols for evaluations of randomized trials of complex interventions. Data-driven quality improvement in primary care (DQIP) trial registration: ClinicalTrials.gov NCT01425502.
Optimization of Collision Detection in Surgical Simulations
NASA Astrophysics Data System (ADS)
Custură-Crăciun, Dan; Cochior, Daniel; Neagu, Corneliu
2014-11-01
Just as flight and spacecraft simulators already represent a standard, we expect that surgical simulators will soon become a standard in medical applications. A simulation's quality is strongly related to image quality as well as to the degree of realism of the simulation. Increased quality requires increased resolution and increased representation speed but, more importantly, a larger number of mathematical computations. Making this possible requires not only more efficient computers but, above all, optimization of the calculation process. A simulator executes one of its most complex sets of calculations each time it detects a contact between virtual objects; optimization of collision detection is therefore critical to a simulator's speed and hence to its quality.
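A representative broad-phase optimization of the kind such simulators use, sketched in Python: cheap axis-aligned bounding box (AABB) overlap tests prune object pairs before any expensive exact contact test runs. This is purely illustrative; the paper's own optimizations may differ.

import itertools

def aabb_overlap(a, b):
    # a, b: ((xmin, ymin, zmin), (xmax, ymax, zmax))
    return all(a[0][i] <= b[1][i] and b[0][i] <= a[1][i] for i in range(3))

def candidate_pairs(boxes):
    return [(i, j) for i, j in itertools.combinations(range(len(boxes)), 2)
            if aabb_overlap(boxes[i], boxes[j])]

boxes = [((0, 0, 0), (1, 1, 1)),
         ((0.5, 0.5, 0.5), (2, 2, 2)),   # overlaps box 0
         ((5, 5, 5), (6, 6, 6))]         # far away, pruned early
print(candidate_pairs(boxes))            # -> [(0, 1)]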
Zhang, Xia; Hu, Changqin
2017-09-08
Penicillins are typical of complex ionic samples, which are likely to contain a large number of degradation-related impurities (DRIs) with different polarities and charge properties. It is often a challenge to develop selective and robust high performance liquid chromatography (HPLC) methods for the efficient separation of all DRIs. In this study, an analytical quality by design (AQbD) approach was proposed for stability-indicating method development for cloxacillin. The structural, retention and UV characteristics of penicillins and their impurities were summarized and served as useful prior knowledge. Through quality risk assessment and a screening design, 3 critical process parameters (CPPs) were defined, including 2 mixture variables (MVs) and 1 process variable (PV). A combined mixture-process variable (MPV) design was conducted to evaluate the 3 CPPs simultaneously, and response surface methodology (RSM) was used to find the optimal experimental parameters. A dual gradient elution was performed to change buffer pH, mobile-phase type and strength simultaneously. The design spaces (DSs) were evaluated using Monte Carlo simulation to give their probability of meeting the specifications of the CQAs. A Plackett-Burman design was performed to test the robustness around the working points and to decide the normal operating ranges (NORs). Finally, validation was performed following International Conference on Harmonisation (ICH) guidelines. To our knowledge, this is the first study to use an MPV design and dual gradient elution to develop HPLC methods and improve separations for complex ionic samples. Copyright © 2017 Elsevier B.V. All rights reserved.
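A hedged Python sketch of the Monte Carlo design-space evaluation step: propagate operating noise in the CPPs through an assumed response-surface model and report the probability that a CQA meets its acceptance limit. The model coefficients, noise levels and limit below are invented for illustration, not taken from the study.

import numpy as np

rng = np.random.default_rng(3)

def predicted_resolution(ph, gradient_time):
    return 1.2 + 0.8 * ph - 0.05 * ph ** 2 + 0.03 * gradient_time

n = 100_000
ph = rng.normal(5.0, 0.1, n)               # buffer pH under operating variation
gt = rng.normal(30.0, 1.0, n)              # gradient time (min)
ok = predicted_resolution(ph, gt) >= 4.8   # illustrative CQA acceptance limit
print(ok.mean())                           # estimated probability of meeting spec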
Polyphenols in foods are more complex than often thought.
Cheynier, Véronique
2005-01-01
Dietary polyphenols show a great diversity of structures, ranging from rather simple molecules (monomers and oligomers) to polymers. Higher-molecular-weight structures (with molecular weights of > 500) are usually designated as tannins, which refers to their ability to interact with proteins. Among them, condensed tannins (proanthocyanidins) are particularly important because of their wide distribution in plants and their contributions to major food qualities. All phenolic compounds are highly unstable and rapidly transformed into various reaction products when the plant cells are damaged (for instance, during food processing), thus adding to the complexity of dietary polyphenol composition. The polyphenol composition of plant-derived foods and beverages depends on that of the raw material used but also on the extraction process and subsequent biochemical and chemical reactions of plant polyphenols. The occurrence of specific tannin-like compounds (i.e., thearubigins and theaflavins) arising from enzymatic oxidation is well documented in black tea. Various chemical reactions involving anthocyanins and/or flavanols have been demonstrated to occur during red wine aging. Current knowledge regarding the reaction mechanisms involved in some of these processes and the structures of the resulting products is reviewed. Their effects on organoleptic and nutritional quality are also discussed.
Propp, Kathleen M; Apker, Julie; Zabava Ford, Wendy S; Wallace, Nancy; Serbenski, Michele; Hofmeister, Nancee
2010-01-01
Nurses occupy a central position in today's increasingly collaborative health care teams that place a premium on quality patient care. In this study we examined critical team processes and identified specific nurse-team communication practices that were perceived by team members to enhance patient outcomes. Fifty patient-care team members were interviewed to uncover forms of nurse communication perceived to improve team performance. Using a grounded theory approach and constant comparative analysis, study findings reveal two critical processes nurses contribute to as the most central and consistent members of the health care team: ensuring quality decisions and promoting a synergistic team. Moreover, the findings reveal 15 specific nurse-team communication practices that comprise these processes, and thereby are theorized to improve patient outcomes.
Template-Based Modeling of Protein-RNA Interactions.
Zheng, Jinfang; Kundrotas, Petras J; Vakser, Ilya A; Liu, Shiyong
2016-09-01
Protein-RNA complexes formed by specific recognition between RNA and RNA-binding proteins play an important role in biological processes. More than a thousand such proteins in humans have been curated, and many novel RNA-binding proteins remain to be discovered. Due to limitations of experimental approaches, computational techniques are needed for characterization of protein-RNA interactions. Although much progress has been made, adequate methodologies reliably providing atomic-resolution structural details are still lacking. Although protein-RNA free docking approaches have proved useful, template-based approaches in general provide higher-quality predictions. Templates are key to building a high-quality model. Sequence/structure relationships were studied based on a representative set of binary protein-RNA complexes from the PDB. Several approaches were tested for pairwise target/template alignment. The analysis revealed a transition point between random and correct binding modes. The results showed that structural alignment is better than sequence alignment in identifying good templates, suitable for generating protein-RNA complexes close to the native structure, and outperforms free docking, successfully predicting complexes where the free docking fails, including cases of significant conformational change upon binding. A template-based protein-RNA interaction modeling protocol, PRIME, was developed and benchmarked on a representative set of complexes.
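As a toy illustration of ranking candidate templates by target/template sequence similarity (the paper found structural alignment superior, but the sequence case is simpler to sketch), a pure-Python Needleman-Wunsch scorer could be used; the template identifiers and sequences below are hypothetical:

    # Global alignment (Needleman-Wunsch) score, usable for ranking
    # candidate templates by target/template sequence similarity.
    def nw_score(s, t, match=1, mismatch=-1, gap=-2):
        prev = [j * gap for j in range(len(t) + 1)]
        for i, a in enumerate(s, 1):
            cur = [i * gap]
            for j, b in enumerate(t, 1):
                diag = prev[j - 1] + (match if a == b else mismatch)
                cur.append(max(diag, prev[j] + gap, cur[j - 1] + gap))
            prev = cur
        return prev[-1]

    templates = {"1ABC_B": "MKVLAT", "2XYZ_A": "MKVIAT"}  # hypothetical entries
    target = "MKVLAS"
    best = max(templates, key=lambda name: nw_score(target, templates[name]))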
Metamodeling and optimization of the THF process with pulsating pressure
NASA Astrophysics Data System (ADS)
Bucconi, Marco; Strano, Matteo
2018-05-01
Tube hydroforming is a process used in various applications to form a tube into a desired complex shape by combining internal pressure, which provides the stress required to yield the material, and axial feeding, which helps the material flow towards the bulging zone. Many studies have demonstrated how wrinkling and bursting defects can be severely reduced by means of a pulsating pressure, and how the so-called hammering hydroforming enhances the formability of the material. The definition of the optimum pressure and axial feeding profiles represents a daunting challenge in the design phase of the hydroforming operation for a new part. The quality of the formed part is highly dependent on the amplitude and the peak value of the pulsating pressure, along with the axial stroke. This paper reports research, conducted by means of explicit finite element simulations of a hammering THF operation and metamodeling techniques, aimed at optimizing the process parameters for the production of a complex part. The improved formability is explored for different factors, and an optimization strategy is used to determine the most convenient pressure and axial feed profile curves for the hammering THF process of the examined part. It is shown how the pulsating pressure allows the minimization of the energy input to the process while still respecting final quality requirements.
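A minimal sketch of the metamodeling step (fit a surrogate to a handful of simulation results, then minimize it within the process window) might look as follows; the training points, bounds, objective, and the choice of a Gaussian-process surrogate are illustrative assumptions, not the study's actual metamodel:

    import numpy as np
    from scipy.optimize import minimize
    from sklearn.gaussian_process import GaussianProcessRegressor

    # Hypothetical FEM results: columns are mean pressure [MPa],
    # pulse amplitude [MPa], axial feed [mm]; y is an objective
    # (e.g., energy input penalized by thinning) to be minimized.
    X = np.array([[30, 5, 10], [40, 5, 12], [30, 10, 14], [50, 8, 10],
                  [45, 12, 16], [35, 7, 13], [55, 10, 15], [40, 12, 11]], float)
    y = np.array([0.82, 0.64, 0.58, 0.71, 0.49, 0.60, 0.66, 0.55])

    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)

    # Minimize the surrogate prediction within the process window.
    res = minimize(lambda p: gp.predict(p.reshape(1, -1))[0],
                   x0=np.array([40.0, 8.0, 12.0]),
                   bounds=[(30, 55), (5, 12), (10, 16)])
    print("suggested pressure/amplitude/feed:", res.x)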
Vidor, Emmanuel; Soubeyrand, Benoit
2016-12-01
The manufacture of DTP-backboned combination vaccines is complex, and vaccine quality is evaluated by both batch composition and conformance of manufacturing history. Since their first availability, both the manufacturing regulations for DTP combination vaccines and the demand for them have evolved significantly. This has resulted in a constant need to modify manufacturing and quality control processes. Areas covered: Regulations that govern the manufacture of complex vaccines can be inconsistent between countries and need to be aligned with the regulatory requirements that apply in all countries of distribution. Changes in product mix and quantities can lead to uncertainty in vaccine supply maintenance. These problems are discussed in the context of the importance of these products as essential public health tools. Expert commentary: Increasing global demand for complex vaccines has led to problems in supply due to intrinsically complex manufacturing and regulatory procedures. Vaccine manufacturers are fully engaged in the resolution of these challenges, but currently changes in demand ideally need to be anticipated approximately 3 years in advance due to long production cycle times.
Complex Networks Analysis of Manual and Machine Translations
NASA Astrophysics Data System (ADS)
Amancio, Diego R.; Antiqueira, Lucas; Pardo, Thiago A. S.; da F. Costa, Luciano; Oliveira, Osvaldo N.; Nunes, Maria G. V.
Complex networks have been increasingly used in text analysis, including in connection with natural language processing tools, as important text features appear to be captured by the topology and dynamics of the networks. Following previous works that apply complex networks concepts to text quality measurement, summary evaluation, and author characterization, we now focus on machine translation (MT). In this paper we assess the possible representation of texts as complex networks to evaluate cross-linguistic issues inherent in manual and machine translation. We show that different quality translations generated by MT tools can be distinguished from their manual counterparts by means of metrics such as in- (ID) and out-degrees (OD), clustering coefficient (CC), and shortest paths (SP). For instance, we demonstrate that the average OD in networks of automatic translations consistently exceeds the values obtained for manual ones, and that the CC values of source texts are not preserved for manual translations, but are for good automatic translations. This probably reflects the text rearrangements humans perform during manual translation. We envisage that such findings could lead to better MT tools and automatic evaluation metrics.
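The metrics named above are straightforward to compute once a text is represented as a word-adjacency network; a minimal sketch with networkx follows (the adjacency construction is one common choice, not necessarily the authors' exact procedure):

    import networkx as nx

    def word_adjacency_network(text):
        """Directed network linking each word to its successor."""
        words = text.lower().split()
        g = nx.DiGraph()
        for a, b in zip(words, words[1:]):
            g.add_edge(a, b)
        return g

    g = word_adjacency_network("the cat sat on the mat and the dog sat too")
    avg_od = sum(d for _, d in g.out_degree()) / g.number_of_nodes()  # OD
    cc = nx.average_clustering(g.to_undirected())                     # CC
    sp = nx.average_shortest_path_length(g.to_undirected())           # SP

Comparing such values between a source text, its manual translation, and an automatic translation is the kind of measurement the study reports.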
Regulation by consensus: The expanded use of regulatory negotiation under the Clean Air Act
DOE Office of Scientific and Technical Information (OSTI.GOV)
Claiborne, M.L.
This article discusses the consensus building approach, which stems from the more formal regulatory negotiation process under the Negotiated Rulemaking Act of 1990, for improving air quality. The article uses as examples the joint plan to improve air quality and visibility in the Grand Canyon and 15 other national parks and wilderness areas in the southwestern USA, and the Southern Appalachian Mountains Initiative, which tackles more complex issues including visibility, ground-level ozone, and acid deposition.
Malloy, Erin; Butt, Shiraz; Sorter, Michael
2010-01-01
Inpatient child and adolescent psychiatry leadership roles are often multifaceted, necessitating strong clinical knowledge and skills, organizational and leadership abilities, and in the academic setting the desire and skill in teaching and research. Early career psychiatrists who do possess these attributes may find themselves unprepared for such challenges as dealing with complex administrative and economic issues, accreditation, legal matters, and multitasking. This article offers a primer addressing these basic issues and in managing change through quality improvement processes.
An Ethnomethodological Perspective on How Middle School Students Addressed a Water Quality Problem
ERIC Educational Resources Information Center
Belland, Brian R.; Gu, Jiangyue; Kim, Nam Ju; Turner, David J.
2016-01-01
Science educators increasingly call for students to address authentic scientific problems in science class. One form of authentic science problem--socioscientific issue--requires that students engage in complex reasoning by considering both scientific and social implications of problems. Computer-based scaffolding can support this process by…
USDA-ARS?s Scientific Manuscript database
In recent years, large-scale watershed modeling has been implemented broadly in the field of water resources planning and management. Complex hydrological, sediment, and nutrient processes can be simulated by sophisticated watershed simulation models for important issues such as water resources all...
Creative Thinking: Processes, Strategies, and Knowledge
ERIC Educational Resources Information Center
Mumford, Michael D.; Medeiros, Kelsey E.; Partlow, Paul J.
2012-01-01
Creative achievements are the basis for progress in our world. Although creative achievement is influenced by many variables, the basis for creativity is held to lie in the generation of high-quality, original, and elegant solutions to complex, novel, ill-defined problems. In the present effort, we examine the cognitive capacities that make…
A tutorial for developing a topical cream formulation based on the Quality by Design approach.
Simões, Ana; Veiga, Francisco; Vitorino, Carla; Figueiras, Ana
2018-06-20
The pharmaceutical industry has entered a new era, as there is growing interest in raising the quality standards of dosage forms through the implementation of more structured development and manufacturing approaches. For many decades, the manufacturing of drug products was controlled by a regulatory framework that guaranteed the quality of the final product through a fixed process and exhaustive testing. Limitations of the Quality by Test (QbT) system have been widely acknowledged. The emergence of Quality by Design (QbD) as a systematic and risk-based approach introduced a new quality concept based on a good understanding of how raw materials and process parameters influence the final quality profile. Although the QbD system has been recognized as a revolutionary approach to product development and manufacturing, its full implementation in the pharmaceutical field is still limited. This is particularly evident in the case of semisolid complex formulation development. The present review aims at establishing a practical QbD framework that describes all stages comprised in the pharmaceutical development of a conventional cream in a comprehensible manner. Copyright © 2018. Published by Elsevier Inc.
NASA Technical Reports Server (NTRS)
Fatig, Michael
1993-01-01
Flight operations and the preparation for them have become increasingly complex as mission complexity increases. Further, the mission model dictates that a significant increase in flight operations activities is upon us. Finally, there is a need for process improvement and economy in the operations arena. It is therefore time that we recognize flight operations as a complex process requiring a defined, structured, life-cycle approach vitally linked to space segment, ground segment, and science operations processes. With this recognition, an FOT Tool Kit consisting of six major components designed to provide tools to guide flight operations activities throughout the mission life cycle was developed. The major components of the FOT Tool Kit and the concepts behind the flight operations life cycle process as developed at NASA's GSFC for GSFC-based missions are addressed. The Tool Kit is therefore intended to improve the productivity, quality, cost performance, and schedule performance of flight operations tasks through the use of documented, structured methodologies; knowledge of past lessons learned and upcoming new technology; and through reuse and sharing of key products and special application programs made possible through the development of standardized key products and special program directories.
Triage of oxidation-prone proteins by Sqstm1/p62 within the mitochondria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Minjung; Shin, Jaekyoon, E-mail: jkshin@med.skku.ac.kr
2011-09-16
Highlights: The mitochondrion contains its own protein quality control system; p62 localizes within the mitochondria and forms mega-dalton-sized complexes; p62 interacts with oxidation-prone proteins and the proteins of quality control; in vitro delivery of p62 improves mitochondrial functions; p62 is implicated as a participant in mitochondrial protein quality control. Abstract: As the mitochondrion is vulnerable to oxidative stress, cells have evolved several strategies to maintain mitochondrial integrity, including mitochondrial protein quality control mechanisms and autophagic removal of damaged mitochondria. Involvement of an autophagy adaptor, Sqstm1/p62, in the latter process has been recently described. In the present study, we provide evidence that a portion of p62 directly localizes within the mitochondria and supports stable electron transport by forming heterogeneous protein complexes. Matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF) of mitochondrial proteins co-purified with p62 revealed that p62 interacts with several oxidation-prone proteins, including a few components of the electron transport chain complexes, as well as multiple chaperone molecules and redox regulatory enzymes. Accordingly, p62-deficient mitochondria exhibited compromised electron transport, and the compromised function was partially restored by in vitro delivery of p62. These results suggest that p62 plays an additional role in maintaining mitochondrial integrity at the vicinity of target machineries through its function in relation to protein quality control.
Causes of cine image quality deterioration in cardiac catheterization laboratories.
Levin, D C; Dunham, L R; Stueve, R
1983-10-01
Deterioration of cineangiographic image quality can result from malfunctions or technical errors at a number of points along the cine imaging chain: generator and automatic brightness control, x-ray tube, x-ray beam geometry, image intensifier, optics, cine camera, cine film, film processing, and cine projector. Such malfunctions or errors can result in loss of image contrast, loss of spatial resolution, improper control of film optical density (brightness), or some combination thereof. While the electronic and photographic technology involved is complex, physicians who perform cardiac catheterization should be conversant with the problems and what can be done to solve them. Catheterization laboratory personnel have control over a number of factors that directly affect image quality, including radiation dose rate per cine frame, kilovoltage or pulse width (depending on type of automatic brightness control), cine run time, selection of small or large focal spot, proper object-intensifier distance and beam collimation, aperture of the cine camera lens, selection of cine film, processing temperature, processing immersion time, and selection of developer.
Ueno, Yoshifumi; Aikawa, Shimpei; Kondo, Akihiko; Akimoto, Seiji
2015-08-01
Photosynthetic organisms change the quantity and/or quality of their pigment-protein complexes and the interactions among these complexes in response to light conditions. In the present study, we analyzed light adaptation of the unicellular red alga Cyanidioschyzon merolae, whose pigment composition is similar to that of cyanobacteria because its phycobilisomes (PBS) lack phycoerythrin. C. merolae were grown under different light qualities, and their responses were measured by steady-state absorption, steady-state fluorescence, and picosecond time-resolved fluorescence spectroscopies. Cells were cultivated under four monochromatic light-emitting diodes (blue, green, yellow, and red), and changes in pigment composition and energy transfer were observed. Cells grown under blue and green light increased their relative phycocyanin levels compared with cells cultured under white light. Energy-transfer processes to photosystem I (PSI) were sensitive to yellow and red light. The contribution of direct energy transfer from PBS to PSI increased only under yellow light, while red light induced a reduction in energy transfer from photosystem II to PSI and an increase in energy transfer from light-harvesting chlorophyll protein complex I to PSI. Differences in pigment composition, growth, and energy transfer under different light qualities are discussed.
Meteorological determinants of air quality
NASA Astrophysics Data System (ADS)
Turoldo, F.; Del Frate, S.; Gallai, I.; Giaiotti, D. B.; Montanari, F.; Stel, F.; Goi, D.
2010-09-01
Air quality is the result of complex phenomena, among which the major role is played by human emissions of pollutants. Atmospheric processes act as determinants, e.g., modulating, damping or amplifying the effects of emissions as an orchestra's conductor does with musical instruments. In this work, a series of small-scale and meso-scale meteorological determinants of air quality are presented as observed in an area characterized by complex orography (Friuli Venezia Giulia, in the north-eastern part of Italy). In particular, attention is devoted to: i) meso-scale flows favouring the persistence of high concentrations of particulate matter; ii) meso-scale periodic flows (breezes) favouring high values of particulate matter; iii) local-scale thermodynamic behaviour favouring high atmospheric values of nitrogen oxides. The effects of these different classes of determinants are shown through comparisons between anthropic emissions (mainly traffic) and ground-based measurements. The relevance of complex orography (relatively steep reliefs near the sea) is shown for the meso-scale flows and, in particular, for local-scale periodic flows, which favour increases in pollutant concentrations mainly in summer, when the breeze regime is particularly relevant. Some of these results were achieved through the ETS - Alpine Space EU project iMONITRAF!
A general framework for a collaborative water quality knowledge and information network.
Dalcanale, Fernanda; Fontane, Darrell; Csapo, Jorge
2011-03-01
Increasing knowledge about the environment has brought about a better understanding of the complexity of the issues, and greater public availability of information has resulted in a steady shift from centralized decision making to increasing levels of participatory processes. The management of that information, in turn, is becoming more complex. One of the ways to deal with the complexity is the development of tools that allow all players, including managers, researchers, educators, stakeholders and civil society, to contribute to the information system at whatever level they are inclined to. In this project, a search for the available technology for collaboration, methods of community filtering, and community-based review was performed, and the possible implementation of these tools to create a general framework for a collaborative "Water Quality Knowledge and Information Network" was evaluated. The main goals of the network are to advance water quality education and knowledge; encourage distribution of and access to data; provide networking opportunities; allow public perceptions and concerns to be collected; promote exchange of ideas; and give general, open, and free access to information. A reference implementation was made available online and received positive feedback from the community, which also suggested some possible improvements.
Intelligent methods for the process parameter determination of plastic injection molding
NASA Astrophysics Data System (ADS)
Gao, Huang; Zhang, Yun; Zhou, Xundao; Li, Dequn
2018-03-01
Injection molding is one of the most widely used material processing methods for producing plastic products with complex geometries and high precision. The determination of process parameters is important in obtaining qualified products and maintaining product quality. This article reviews recent studies and developments of the intelligent methods applied in the process parameter determination of injection molding. These intelligent methods are classified into three categories: case-based reasoning methods, expert system-based methods, and data fitting and optimization methods. A framework for process parameter determination is proposed after comprehensive discussions. Finally, the conclusions and future research topics are discussed.
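Of the three categories, case-based reasoning is the simplest to sketch: retrieve the stored process parameters of the most similar past molding case. The feature set, units, and parameter values below are invented for illustration (in practice the features would also be scaled):

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    # Hypothetical case base: part features (wall thickness [mm],
    # flow length [mm], melt flow index) -> process parameters that worked.
    features = np.array([[2.0, 150, 12], [1.5, 200, 20],
                         [3.0, 100, 8], [2.5, 180, 15]], float)
    params = [  # (melt temp [C], injection pressure [MPa], cooling time [s])
        (230, 90, 18), (245, 110, 14), (220, 80, 25), (238, 100, 20)]

    nn = NearestNeighbors(n_neighbors=1).fit(features)
    _, idx = nn.kneighbors([[2.2, 160, 13]])  # features of a new part
    print("reuse parameters of the most similar case:", params[idx[0][0]])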
Landslide Phenomena in Sevan National Park-Armenia
NASA Astrophysics Data System (ADS)
Lazarov, Dimitrov; Minchev, Dimitar; Aleksanyan, Gurgen; Ilieva, Maya
2010-12-01
Based on master and slave complex images acquired on 30 August 2008 and 4 October 2008 by the ENVISAT satellite's ASAR sensor, the full interferometric processing chain was performed to evaluate landslide phenomena in Sevan National Park, Republic of Armenia. For this purpose, the Identification Deformation Inspection and Observation Tool developed by the Berlin University of Technology was applied. This software package uses the freely available DEM of the Shuttle Radar Topography Mission (SRTM) and performs fully automatic generation of differential SAR interferograms from ENVISAT single look complex SAR data. All interferometric processing steps are implemented with maximum quality and precision. The results indicate an almost calm Earth surface in the area of Sevan Lake.
Vaseem, Mohammad; McKerricher, Garret; Shamim, Atif
2016-01-13
Currently, silver-nanoparticle-based inkjet ink is commercially available. This type of ink has several serious problems such as a complex synthesis protocol, high cost, high sintering temperatures (∼200 °C), particle aggregation, nozzle clogging, poor shelf life, and jetting instability. For the emerging field of printed electronics, these shortcomings in conductive inks are barriers for their widespread use in practical applications. Formulating particle-free silver inks has potential to solve these issues and requires careful design of the silver complexation. The ink complex must meet various requirements, such as in situ reduction, optimum viscosity, storage and jetting stability, smooth uniform sintered films, excellent adhesion, and high conductivity. This study presents a robust formulation of silver-organo-complex (SOC) ink, where complexing molecules act as reducing agents. The 17 wt % silver loaded ink was printed and sintered on a wide range of substrates with uniform surface morphology and excellent adhesion. The jetting stability was monitored for 5 months to confirm that the ink was robust and highly stable with consistent jetting performance. Radio frequency inductors, which are highly sensitive to metal quality, were demonstrated as a proof of concept on flexible PEN substrate. This is a major step toward producing high-quality electronic components with a robust inkjet printing process.
Impact of initial surface parameters on the final quality of laser micro-polished surfaces
NASA Astrophysics Data System (ADS)
Chow, Michael; Bordatchev, Evgueni V.; Knopf, George K.
2012-03-01
Laser micro-polishing (LμP) is a new laser-based microfabrication technology for improving surface quality during a finishing operation and for producing parts and surfaces with near-optical surface quality. The LμP process uses low-power laser energy to melt a thin layer of material on the previously machined surface. The polishing effect is achieved as the molten material in the laser-material interaction zone flows from elevated regions to local minima due to surface tension. This flow of molten material then forms a thin ultra-smooth layer on the top surface. LμP is a complex thermodynamic process in which the melting, flow and redistribution of molten material are significantly influenced by a variety of process parameters related to the laser, the travel motions and the material. The goal of this study is to analyze the impact of initial surface parameters on the final surface quality. Ball-end micromilling was used to prepare the initial surfaces of H13 tool steel samples, which were then polished using a Q-switched Nd:YAG laser. The height and width of micromilled scallops (waviness) were identified as the dominant parameters affecting the quality of the LμPed surface. By adjusting process parameters, the Ra value of a surface having a waviness period of 33 μm and a peak-to-valley value of 5.9 μm was reduced from 499 nm to 301 nm, improving the final surface quality by 39.7%.
Kim, Dongcheol; Rhee, Sehun
2002-01-01
CO(2) welding is a complex process. Weld quality depends on arc stability and on minimizing the effects of disturbances or changes in operating conditions that commonly occur during welding. In order to minimize these effects, a controller can be used. In this study, a fuzzy controller was used to stabilize the arc during CO(2) welding. The input variable of the controller was the Mita index. This index quantitatively estimates the arc stability, which is influenced by many welding process parameters. Because the welding process is complex, a mathematical model of the Mita index was difficult to derive. Therefore, the parameter settings of the fuzzy controller were determined by performing actual control experiments without using a mathematical model of the controlled process. As a solution, the Taguchi method was used to determine the optimal control parameter settings of the fuzzy controller, making the control performance robust and insensitive to changes in the operating conditions.
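A minimal sketch of such a fuzzy correction, mapping an arc-stability index onto a control adjustment through triangular membership functions and weighted-average defuzzification, is shown below; the membership breakpoints and rule outputs are illustrative, not the Taguchi-tuned settings of the study:

    def tri(x, a, b, c):
        """Triangular membership with a <= b <= c; shoulders allowed."""
        if x < a or x > c:
            return 0.0
        if x < b:
            return (x - a) / (b - a)
        if x > b:
            return (c - x) / (c - b)
        return 1.0

    def fuzzy_correction(stability_index):
        """Map a normalized stability index (0 = unstable, 1 = stable)
        to a corrective action via three rules."""
        w = {"unstable": tri(stability_index, 0.0, 0.0, 0.5),
             "marginal": tri(stability_index, 0.2, 0.5, 0.8),
             "stable":   tri(stability_index, 0.5, 1.0, 1.0)}
        out = {"unstable": 1.0, "marginal": 0.4, "stable": 0.0}  # rule outputs
        den = sum(w.values()) or 1.0
        return sum(w[k] * out[k] for k in w) / den  # defuzzified correction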
Taming wild yeast: potential of conventional and nonconventional yeasts in industrial fermentations.
Steensels, Jan; Verstrepen, Kevin J
2014-01-01
Yeasts are the main driving force behind several industrial food fermentation processes, including the production of beer, wine, sake, bread, and chocolate. Historically, these processes developed from uncontrolled, spontaneous fermentation reactions that rely on a complex mixture of microbes present in the environment. Because such spontaneous processes are generally inconsistent and inefficient and often lead to the formation of off-flavors, most of today's industrial production utilizes defined starter cultures, often consisting of a specific domesticated strain of Saccharomyces cerevisiae, S. bayanus, or S. pastorianus. Although this practice greatly improved process consistency, efficiency, and overall quality, it also limited the sensorial complexity of the end product. In this review, we discuss how Saccharomyces yeasts were domesticated to become the main workhorse of food fermentations, and we investigate the potential and selection of nonconventional yeasts that are often found in spontaneous fermentations, such as Brettanomyces, Hanseniaspora, and Pichia spp.
NASA Astrophysics Data System (ADS)
Kumbhar, N. N.; Mulay, A. V.
2016-08-01
The Additive Manufacturing (AM) processes open the possibility to go directly from Computer-Aided Design (CAD) to a physical prototype. These prototypes are used as test models before the design is finalized and sometimes serve as the final product. Additive Manufacturing has many advantages over traditional product development processes, such as allowing early customer involvement in product development, enabling complex shape generation, and saving both time and money. Additive Manufacturing also poses some special challenges that are usually worth overcoming, such as poor surface quality, limited physical properties, and the need for specific raw materials. To improve surface quality, several attempts have been made by controlling various process parameters of Additive Manufacturing and also by applying different post-processing techniques to components manufactured by Additive Manufacturing. The main objective of this work is to document an extensive literature review in the general area of post-processing techniques used in Additive Manufacturing.
Geodynamics branch data base for main magnetic field analysis
NASA Technical Reports Server (NTRS)
Langel, Robert A.; Baldwin, R. T.
1991-01-01
The data sets used in geomagnetic field modeling at GSFC are described. Data are measured and obtained from a variety of sources. For clarity, data sets from different sources are categorized and processed separately. The data base is composed of magnetic observatory data, surface data, high-quality aeromagnetic data, high-quality total-intensity marine data, satellite data, and repeat data. These individual data categories are described in detail in a series of notebooks in the Geodynamics Branch, GSFC. This catalog reviews the original data sets, the processing history, and the final data sets available for each individual category of the data base, and is to be used as a reference manual for the notebooks. Each data type used in geomagnetic field modeling has varying levels of complexity, requiring specialized processing routines for satellite and observatory data and two general routines for processing aeromagnetic, marine, land survey, and repeat data.
Numerical simulation of polishing U-tube based on solid-liquid two-phase
NASA Astrophysics Data System (ADS)
Li, Jun-ye; Meng, Wen-qing; Wu, Gui-ling; Hu, Jing-lei; Wang, Bao-zuo
2018-03-01
Abrasive flow machining is an advanced technology for the ultra-precision machining of small-hole parts and complex cavity parts, offering high efficiency, high quality and low cost, and it therefore plays an important role in many areas of precision machining. Based on the theory of solid-liquid two-phase flow coupling, a solid-liquid two-phase MIXTURE model is used to simulate the abrasive flow polishing process on the inner surface of a U-tube, and the temperature, turbulent viscosity and turbulent dissipation rate during abrasive flow machining of the U-tube are compared and analyzed under different inlet pressures. The influence of different inlet pressures on the surface quality of the workpiece during abrasive flow machining is studied and discussed, providing a theoretical basis for research on the abrasive flow machining process.
An argon ion beam milling process for native AlOx layers enabling coherent superconducting contacts
NASA Astrophysics Data System (ADS)
Grünhaupt, Lukas; von Lüpke, Uwe; Gusenkova, Daria; Skacel, Sebastian T.; Maleeva, Nataliya; Schlör, Steffen; Bilmes, Alexander; Rotzinger, Hannes; Ustinov, Alexey V.; Weides, Martin; Pop, Ioan M.
2017-08-01
We present an argon ion beam milling process to remove the native oxide layer forming on aluminum thin films due to their exposure to atmosphere in between lithographic steps. Our cleaning process is readily integrable with conventional fabrication of Josephson junction quantum circuits. From measurements of the internal quality factors of superconducting microwave resonators with and without contacts, we place an upper bound on the residual resistance of an ion beam milled contact of 50 mΩ μm² at a frequency of 4.5 GHz. Resonators for which only 6% of the total footprint was exposed to the ion beam milling, in areas of low electric and high magnetic fields, showed quality factors above 10⁶ in the single photon regime, and no degradation compared to single layer samples. We believe these results will enable the development of increasingly complex superconducting circuits for quantum information processing.
Ishikura, Satoshi
2008-11-01
The process of radiotherapy (RT) is complex and involves understanding of the principles of medical physics, radiobiology, radiation safety, dosimetry, radiation treatment planning, simulation and interaction of radiation with other treatment modalities. Each step in the integrated process of RT needs quality control and quality assurance (QA) to prevent errors and to give high confidence that patients will receive the prescribed treatment correctly. Recent advances in RT, including intensity-modulated and image-guided RT, focus on the need for a systematic RTQA program that balances patient safety and quality with available resources. It is necessary to develop more formal error mitigation and process analysis methods, such as failure mode and effect analysis, to focus available QA resources optimally on process components. External audit programs are also effective. The International Atomic Energy Agency has operated both on-site and off-site (postal) dosimetry audits to improve practice and to assure the dose from RT equipment. Several countries have adopted a similar approach for national clinical auditing. In addition, clinical trial QA has a significant role in enhancing the quality of care. The Advanced Technology Consortium has pioneered the development of an infrastructure and QA method for advanced technology clinical trials, including credentialing and individual case review. These activities have an impact not only on the treatment received by patients enrolled in clinical trials, but also on the quality of treatment administered to all patients treated in each institution, and have been adopted globally, including in the USA, Europe and Japan.
Atsuta, Yoshiko
2016-01-01
Collection and analysis of information on diseases and post-transplant courses of allogeneic hematopoietic stem cell transplant recipients have played important roles in improving therapeutic outcomes in hematopoietic stem cell transplantation. Efficient, high-quality data collection systems are essential. The introduction of the Second-Generation Transplant Registry Unified Management Program (TRUMP2) is intended to improve data quality and enable more efficient data management. The TRUMP2 system will also expand possible uses of data, as it is capable of building a more complex relational database. An accessible system for adequate data utilization by researchers would promote greater research activity. Study approval and management processes and authorship guidelines also need to be organized within this context. Quality control of processes for data manipulation and analysis will also affect study outcomes. Shared scripts have been introduced to define variables according to standard definitions, improving quality control and the efficiency of registry studies using TRUMP data.
[Real-time detection of quality of Chinese materia medica: strategy of NIR model evaluation].
Wu, Zhi-sheng; Shi, Xin-yuan; Xu, Bing; Dai, Xing-xing; Qiao, Yan-jiang
2015-07-01
The definition of the critical quality attributes of Chinese materia medica (CMM) was put forward based on the top-level design concept. Coupled with the development of rapid analytical science, rapid assessment of the critical quality attributes of CMM has now been carried out for the first time, constituting a secondary discipline branch of CMM. Taking near-infrared (NIR) spectroscopy, a rapid analytical technology applied in pharmaceutical processes over the past decade, as an example, the chemometric parameters used in NIR model evaluation are systematically reviewed. Given the complexity of CMM and the demands of trace-component analysis, a multi-source information fusion strategy for NIR modeling was developed for the assessment of the critical quality attributes of CMM. The strategy provides a guideline for reliable NIR analysis of the critical quality attributes of CMM.
A Systems Engineering Approach to Quality Assurance for Aerospace Testing
NASA Technical Reports Server (NTRS)
Shepherd, Christena C.
2014-01-01
On the surface, it appears that AS9100 has little to say about how to apply a Quality Management System (QMS) to major aerospace test programs (or even smaller ones). It also appears that there is little in the quality engineering Body of Knowledge (BOK) that applies to testing, unless it is nondestructive examination (NDE), or some type of lab or bench testing associated with the manufacturing process. However, if one examines: a) how the systems engineering (SE) processes are implemented throughout a test program; and b) how these SE processes can be mapped to the requirements of AS9100, a number of areas for involvement of the quality professional are revealed. What often happens is that quality assurance during a test program is limited to inspections of the test article; what could be considered a manufacturing al fresco approach. This limits the quality professional and is a disservice to the programs and projects, since there are a number of ways that quality can enhance critical processes, and support efforts to improve risk reduction, efficiency and effectiveness. The Systems Engineering (SE) discipline is widely used in aerospace to ensure the progress from Stakeholder Expectations (the President, Congress, the taxpayers) to a successful, delivered product or service. Although this is well known, what is not well known is that these same SE processes are implemented in varying complexity, to prepare for and implement test projects that support research, development, verification and validation, qualification, and acceptance test projects. Although the test organization's terminology may vary from the SE terminology, and from one test service provider to another, the basic process is followed by successful, reliable testing organizations. For this analysis, NASA Procedural Requirements (NPR) 7123.1, NASA Systems Engineering Processes and Requirements is used to illustrate the SE processes that are used for major aerospace testing. Many of these processes are also implemented for smaller test projects, and this set of processes will also look familiar to those who have participated in launch site activation and flight demonstrations.
Chen, Qing; Xu, Pengfei; Liu, Wenzhong
2016-01-01
Computer vision, as a fast, low-cost, noncontact, and online monitoring technology, has been an important tool for inspecting product quality, particularly on large-scale assembly production lines. However, current industrial vision systems are far from satisfactory in the intelligent perception of complex grain images, which comprise a large number of locally homogeneous fragmentations or patches without a distinct foreground and background. We attempt to solve this problem based on statistical modeling of the spatial structures of grain images. We first present a physical explanation indicating that the spatial structures of complex grain images follow a representative Weibull distribution according to the theory of sequential fragmentation, which is well known in the continued comminution of ore grinding. To delineate the spatial structure of the grain image, we present a method of multiscale and omnidirectional Gaussian derivative filtering. Then, a product quality classifier based on a sparse multikernel least squares support vector machine is proposed to solve the low-confidence classification problem of imbalanced data distributions. The proposed method is applied on the assembly line of a food-processing enterprise to automatically classify the production quality of rice. Experiments on this real application case, compared with commonly used methods, illustrate the validity of our method. PMID:26986726
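The two modeling ingredients named above, multiscale Gaussian derivative filtering and a Weibull description of the filter responses, can be sketched in a few lines with SciPy; the scales, orientations, and the random stand-in image are assumptions for illustration:

    import numpy as np
    from scipy import ndimage, stats

    rng = np.random.default_rng(1)
    img = rng.random((128, 128))  # stand-in for a grain image

    # Multiscale Gaussian derivative responses; order=(0, 1) differentiates
    # along one axis at scale sigma, order=(1, 0) along the other.
    responses = []
    for sigma in (1, 2, 4):
        for order in ((0, 1), (1, 0)):
            responses.append(ndimage.gaussian_filter(img, sigma, order=order))
    mag = np.abs(np.stack(responses)).ravel()
    mag = mag[mag > 1e-6]

    # Fit a Weibull distribution to the response magnitudes (location at 0).
    shape, loc, scale = stats.weibull_min.fit(mag, floc=0)
    print("Weibull shape/scale:", shape, scale)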
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
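As a hedged illustration of the modeling approach (not the paper's actual network structure or probabilities), a small discrete Bayesian network can be built and queried with the pgmpy library; only two of the three hypothesized drivers are shown, to keep the conditional probability table compact:

    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Hypothetical network: team Skill and process Maturity drive Suitability.
    model = BayesianNetwork([("Skill", "Suitability"),
                             ("Maturity", "Suitability")])
    model.add_cpds(
        TabularCPD("Skill", 2, [[0.7], [0.3]]),      # state 0 = high, 1 = low
        TabularCPD("Maturity", 2, [[0.6], [0.4]]),
        TabularCPD("Suitability", 2,
                   [[0.9, 0.6, 0.7, 0.2],            # P(suitable | parents)
                    [0.1, 0.4, 0.3, 0.8]],
                   evidence=["Skill", "Maturity"], evidence_card=[2, 2]))
    assert model.check_model()

    posterior = VariableElimination(model).query(
        ["Suitability"], evidence={"Skill": 1, "Maturity": 0})
    print(posterior)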
Blasting preparation for selective mining of complex structured ore deposition
NASA Astrophysics Data System (ADS)
Marinin, M. A.; Dolzhikov, V. V.
2017-10-01
Technological features of ore mining in open pit development for the processing of a complex structured ore deposit of steeply dipping occurrence are considered. Technological schemes for mining ore bodies under different conditions of occurrence, consistency and thickness are considered and proposed. These technologies reduce losses and dilution while increasing the completeness and quality of the mined ore. A method of subsequent selective excavation of ore bodies is proposed. The method is based on the integrated use of buffer-blasting technology for the muck mass and the principle of trim blasting at ore-rock junctions.
The role of metrics and measurements in a software intensive total quality management environment
NASA Technical Reports Server (NTRS)
Daniels, Charles B.
1992-01-01
Paramax Space Systems began its mission as a member of the Rockwell Space Operations Company (RSOC) team which was the successful bidder on a massive operations consolidation contract for the Mission Operations Directorate (MOD) at JSC. The contract awarded to the team was the Space Transportation System Operations Contract (STSOC). Our initial challenge was to accept responsibility for a very large, highly complex and fragmented collection of software from eleven different contractors and transform it into a coherent, operational baseline. Concurrently, we had to integrate a diverse group of people from eleven different companies into a single, cohesive team. Paramax executives recognized the absolute necessity to develop a business culture based on the concept of employee involvement to execute and improve the complex process of our new environment. Our executives clearly understood that management needed to set the example and lead the way to quality improvement. The total quality management policy and the metrics used in this endeavor are presented.
2012-01-01
Background Quality improvement (QI) programs focused on mastery of content by individual staff members are the current standard to improve resident outcomes in nursing homes. However, complexity science suggests that learning is a social process that occurs within the context of relationships and interactions among individuals. Thus, QI programs will not result in optimal changes in staff behavior unless the context for social learning is present. Accordingly, we developed CONNECT, an intervention to foster systematic use of management practices, which we propose will enhance the effectiveness of a nursing home falls QI program by strengthening the staff-to-staff interactions necessary for clinical problem-solving about complex problems such as falls. The study aims are to compare the impact of the CONNECT intervention plus a falls reduction QI intervention (CONNECT + FALLS) to the falls reduction QI intervention alone (FALLS), on fall-related process measures, fall rates, and staff interaction measures. Methods/design Sixteen nursing homes will be randomized to one of two study arms, CONNECT + FALLS or FALLS alone. Subjects (staff and residents) are clustered within nursing homes because the intervention addresses social processes and thus must be delivered within the social context, rather than to individuals. Nursing homes randomized to CONNECT + FALLS will receive three months of CONNECT first, followed by three months of FALLS. Nursing homes randomized to FALLS alone receive three months of FALLS QI and are offered CONNECT after data collection is completed. Complexity science measures, which reflect staff perceptions of communication, safety climate, and care quality, will be collected from staff at baseline and at three and six months after baseline to evaluate immediate and sustained impacts. FALLS measures, including quality indicators (process measures) and fall rates, will be collected for the six months prior to baseline and the six months after the end of the intervention. Analysis will use a three-level mixed model. Discussion By focusing on improving local interactions, CONNECT is expected to maximize staff's ability to implement content learned in a falls QI program and integrate it into knowledge and action. Our previous pilot work shows that CONNECT is feasible, acceptable and appropriate. Trial Registration ClinicalTrials.gov: NCT00636675 PMID:22376375
NASA Technical Reports Server (NTRS)
Vickers, John H.; Pelham, Larry I.
1993-01-01
Automated fiber placement is a manufacturing process used for producing complex composite structures. It represents a notable leap in the state of the art of automated composite manufacturing. The fiber placement capability was established at the Marshall Space Flight Center's (MSFC) Productivity Enhancement Complex in 1992 in collaboration with Thiokol Corporation to provide materials and processes research and development, and to fabricate components for many of the Center's programs. The Fiber Placement System (FPX) was developed as a distinct solution to problems inherent to other automated composite manufacturing systems. This equipment provides unique capabilities to build composite parts in complex 3-D shapes with concave and other asymmetrical configurations. Components with complex geometries and localized reinforcements usually require labor-intensive efforts resulting in expensive, less reproducible components; the fiber placement system has the features necessary to overcome these conditions. The mechanical systems of the equipment have the motion characteristics of a filament winder and the fiber lay-up attributes of a tape laying machine, with the additional capabilities of differential tow payout speeds, compaction, and cut-restart to selectively place the correct number of fibers where the design dictates. This capability will produce a repeatable process resulting in lower cost and improved quality and reliability.
Choi, D J; Park, H
2001-11-01
For control and automation of biological treatment processes, the lack of reliable on-line sensors to measure water quality parameters is one of the most important problems to overcome. Many parameters cannot be measured directly with on-line sensors. The accuracy of existing hardware sensors is also not sufficient, and maintenance problems such as electrode fouling often cause trouble. This paper deals with the development of software sensor techniques that estimate a target water quality parameter from other parameters, exploiting the correlations between them. We focus our attention on the preprocessing of noisy data and the selection of the model best suited to the situation. Problems of existing approaches are also discussed. We propose a hybrid neural network as a software sensor for inferring wastewater quality parameters. Multivariate regression, artificial neural networks (ANN), and a hybrid technique that incorporates principal component analysis as a preprocessing stage are applied to data from industrial wastewater processes. The hybrid ANN technique shows enhanced prediction capability and reduces the overfitting problem of neural networks. The results show that the hybrid ANN technique can be used to extract information from noisy data and to describe the nonlinearity of complex wastewater treatment processes.
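A minimal sketch of such a hybrid soft sensor (PCA preprocessing feeding a neural-network regressor) can be assembled from scikit-learn components; the synthetic signals, component count, and network size below are illustrative assumptions:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 8))  # stand-in for noisy on-line signals
    y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=200)  # stand-in target

    # PCA filters noisy, correlated inputs before the ANN, mirroring the
    # hybrid structure described above.
    soft_sensor = make_pipeline(
        StandardScaler(),
        PCA(n_components=4),
        MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))
    soft_sensor.fit(X[:150], y[:150])
    print("held-out R^2:", soft_sensor.score(X[150:], y[150:]))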
Martínez-Pardo, María Esther; Mariano-Magaña, David
2007-01-01
Tissue banking is a complex operation concerned with the organisation and coordination of all the steps, that is, from donor selection up to storage and distribution of the final products for therapeutic, diagnostic, instruction and research purposes. An appropriate quality framework should be established in order to cover all the specific methodology as well as the general aspects of quality management, such as research and development, design, instruction and training, specific documentation, traceability, corrective action, client satisfaction, and the like. Such a framework can be obtained by developing a quality management system (QMS) in accordance with a suitable international standard: ISO 9001:2000. This paper presents the implementation process of the tissue bank QMS at the Instituto Nacional de Investigaciones Nucleares in Mexico. The objective of the paper is to share the experience gained by the tissue bank personnel [radiosterilised tissue bank (BTR)] at the Instituto Nacional de Investigaciones Nucleares (ININ, National Institute of Nuclear Research), during implementation of the ISO 9001:2000 certification process. At present, the quality management system (QMS) of ININ also complies with the Mexican standard NMX-CC-9001:2000. The scope of this QMS is Research, Development and Processing of Biological Tissues Sterilised by Gamma Radiation, among others.
NASA Astrophysics Data System (ADS)
Calvo, Juan; Nieto, Juanjo
2016-09-01
The management of human crowds in extreme situations is a complex subject which requires taking into account a variety of factors. To name a few, the understanding of human behaviour, the psychological and behavioural features of individuals, the quality of the venue and the stress levels of pedestrians need to be addressed in order to select the most appropriate action during an evacuation process in a complex venue. In this sense, the mathematical modeling of such complex phenomena can be regarded as a very useful tool to understand and predict these situations. As presented in [4], mathematical models can provide guidance to the personnel in charge of managing evacuation processes by helping to design a set of protocols, from which the most appropriate for a given critical situation is then chosen.
Forward design of a complex enzyme cascade reaction
Hold, Christoph; Billerbeck, Sonja; Panke, Sven
2016-01-01
Enzymatic reaction networks are unique in that one can operate a large number of reactions under the same set of conditions concomitantly in one pot, but the nonlinear kinetics of the enzymes and the resulting system complexity have so far defeated rational design processes for the construction of such complex cascade reactions. Here we demonstrate the forward design of an in vitro 10-membered system using enzymes from highly regulated biological processes such as glycolysis. For this, we adapt the characterization of the biochemical system to the needs of classical engineering systems theory: we combine online mass spectrometry and continuous system operation to apply standard system theory input functions and to use the detailed dynamic system responses to parameterize a model of sufficient quality for forward design. This allows the facile optimization of a 10-enzyme cascade reaction for fine chemical production purposes. PMID:27677244
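Although the paper's 10-enzyme system is far richer, the core of such forward design, a parameterized kinetic model whose dynamic response can be simulated against measured system responses, can be sketched with a toy two-step Michaelis-Menten cascade (all parameters invented):

    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy cascade S -> I -> P with Michaelis-Menten kinetics.
    def cascade(t, y, vmax1=1.0, km1=0.5, vmax2=0.6, km2=0.3):
        s, i, p = y
        r1 = vmax1 * s / (km1 + s)
        r2 = vmax2 * i / (km2 + i)
        return [-r1, r1 - r2, r2]

    sol = solve_ivp(cascade, (0, 50), [1.0, 0.0, 0.0], dense_output=True)
    print("final product concentration:", sol.y[2, -1])

Fitting the vmax/Km parameters of every step to the online mass-spectrometry responses and then optimizing enzyme loadings against the fitted model is, in outline, the forward-design loop the paper describes.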
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, J.C.A.; Hsu, C.Y.; Taiwan SPIN Research Center, National Chung Cheng University, Chiayi, Taiwan
2004-12-13
Properly as well as under- and over-oxidized CoFe-AlOx-CoFe magnetic tunnel junctions (MTJs) have been systematically investigated in a frequency range from 10² to 10⁸ Hz by complex capacitance spectroscopy. The dielectric relaxation behavior of the MTJs remarkably disobeys the typical Cole-Cole arc law, probably due to the existence of an imperfectly blocked Schottky barrier at the metal-insulator interface. The dielectric relaxation response can be successfully modeled on the basis of Debye relaxation by incorporating an interfacial dielectric contribution. In addition, complex capacitance spectroscopy demonstrates significant sensitivity to the oxidation process of metallic Al layers, i.e., almost a fingerprint of under-, proper, and over-oxidation. This technique provides a fast and simple method to inspect the AlOx barrier quality of MTJs.
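The model described, a Debye relaxation term plus an interfacial dielectric contribution, can be written down directly; the sketch below evaluates a complex capacitance of this form over the measured frequency range (all parameter values are invented):

    import numpy as np

    def complex_capacitance(f, c_inf=1e-12, dc=5e-12, tau=1e-5,
                            c_int=2e-12, tau_int=1e-3):
        """Debye term plus an interfacial relaxation term (illustrative)."""
        w = 2 * np.pi * f
        bulk = c_inf + dc / (1 + 1j * w * tau)       # classical Debye relaxation
        interface = c_int / (1 + 1j * w * tau_int)   # interfacial contribution
        return bulk + interface

    f = np.logspace(2, 8, 200)  # 10^2 to 10^8 Hz, as in the measurements
    c = complex_capacitance(f)
    re, im = c.real, -c.imag    # Cole-Cole plot coordinates: Re(C) vs -Im(C)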
Managing quality and compliance.
McNeil, Alice; Koppel, Carl
2015-01-01
Critical care nurses assume vital roles in maintaining patient care quality. There are distinct facets to the process including standard setting, regulatory compliance, and completion of reports associated with these endeavors. Typically, multiple niche software applications are required and user interfaces are varied and complex. Although there are distinct quality indicators that must be tracked as well as a list of serious or sentinel events that must be documented and reported, nurses may not know the precise steps to ensure that information is properly documented and actually reaches the proper authorities for further investigation and follow-up actions. Technology advances have permitted the evolution of a singular software platform, capable of monitoring quality indicators and managing all facets of reporting associated with regulatory compliance.
A modeling analysis program for the JPL Table Mountain Io sodium cloud data
NASA Technical Reports Server (NTRS)
Smyth, W. H.; Goldberg, B. A.
1986-01-01
Progress and achievements in the second year are discussed in three main areas: (1) data quality review of the 1981 Region B/C images; (2) data processing activities; and (3) modeling activities. The data quality review revealed that almost all 1981 Region B/C images are of sufficient quality to be valuable in the analyses of the JPL data set. In the second area, the major milestone reached was the successful development and application of complex image-processing software required to render the original image data suitable for modeling analysis studies. In the third area, the lifetime description of sodium atoms in the planetary magnetosphere was improved in the model to include the offset dipole nature of the magnetic field as well as an east-west electric field. These improvements are important in properly representing the basic morphology as well as the east-west asymmetries of the sodium cloud.
Batalden, Paul; Stevens, David; Ogrinc, Greg; Mooney, Susan
2008-01-01
In 2005 we published draft guidelines for reporting studies of quality improvement interventions as the initial step in a consensus process for development of a more definitive version. The current article contains the revised version, which we refer to as SQUIRE (Standards for QUality Improvement Reporting Excellence). We describe the consensus process, which included informal feedback, formal written commentaries, input from publication guideline developers, review of the literature on the epistemology of improvement and on methods for evaluating complex social programs, and a meeting of stakeholders for critical review of the guidelines’ content and wording, followed by commentary on sequential versions from an expert consultant group. Finally, we examine major differences between SQUIRE and the initial draft, and consider limitations of and unresolved questions about SQUIRE; we also describe ancillary supporting documents and alternative versions under development, and plans for dissemination, testing, and further development of SQUIRE. PMID:18830766
Choices of capture chromatography technology in antibody manufacturing processes.
DiLeo, Michael; Ley, Arthur; Nixon, Andrew E; Chen, Jie
2017-11-15
The capture process employed in monoclonal antibody downstream purification is not only the process most critically impacted by the increased antibody titers resulting from optimized mammalian cell culture expression systems, but also the most important purification step in determining overall process throughput, product quality, and economics. Advances in separation technology for capturing antibodies from complex feedstocks have been one focus of downstream purification process innovation for the past 10 years. In this study, we evaluated new-generation chromatography resins used in the antibody capture process, including Protein A, cation exchange, and mixed-mode chromatography, to address the benefits and unique challenges posed by each chromatography approach. Our results demonstrate the benefit of the improved binding capacity of new-generation Protein A resins, address the concern of aggregation caused by high-concentration surges when using new-generation cation exchange resins with over 100 mg/mL binding capacity, and highlight the potential of multimodal cation exchange resins for capture process design. The new landscape of capture chromatography technologies provides options to achieve the overall downstream purification outcome with high product quality and process efficiency. Copyright © 2017 Elsevier B.V. All rights reserved.
Blecker, Steve W.; Stillings, Lisa L.; Amacher, Michael C.; Ippolito, James A.; DeCrappeo, Nicole M.
2010-01-01
The myriad definitions of soil/ecosystem quality or health are often driven by ecosystem and management concerns, and they typically focus on the ability of the soil to provide functions relating to biological productivity and/or environmental quality. A variety of attempts have been made to create indices that quantify the complexities of soil quality and provide a means of evaluating the impact of various natural and anthropogenic disturbances. Though not without their limitations, indices can improve our understanding of the controls behind ecosystem processes and allow for the distillation of information to help link scientific and management communities. In terrestrial systems, indices were initially developed and modified for agroecosystems; however, the number of studies implementing such indices in nonagricultural systems is growing. Soil quality indices (SQIs) are typically composed of biological (and sometimes physical and chemical) parameters that attempt to reduce the complexity of a system into a metric of a soil's ability to carry out one or more functions. The indicators utilized in SQIs can be as varied as the studies themselves, reflecting the complexity of the soil and ecosystems in which they function. Regardless, effective soil quality indicators should correlate well with soil or ecosystem processes, integrate those properties and processes, and be relevant to management practices. Commonly applied biological indicators include measures associated with soil microbial activity or function (for example, carbon and nitrogen mineralization, respiration, microbial biomass, and enzyme activity). Cost, accessibility, ease of interpretation, and presence of existing data often dictate indicator selection given the number of available measures. We employed a large number of soil biological, chemical, and physical measures, along with measures of vegetation cover, density, and productivity, in order to test the utility and sensitivity of these measures within various mineralized terranes. We were also interested in examining these relations in the context of determining appropriate reference conditions with which to compare reclamation efforts. The purpose of this report is to present the data used to develop indices of soil and ecosystem quality associated with mineralized terranes (areas enriched in metal-bearing minerals), specifically podiform chromite, quartz alunite, and Mo/Cu porphyry systems. Within each of these mineralized terranes, a nearby unmineralized counterpart was chosen for comparison. The data consist of soil biological, chemical, and physical parameters, along with vegetation measurements for each of the sites described below. Synthesis of these data and index development will be the subject of future publications.
Harvesting geographic features from heterogeneous raster maps
NASA Astrophysics Data System (ADS)
Chiang, Yao-Yi
2010-11-01
Raster maps offer a great deal of geospatial information and are easily accessible compared to other geospatial data. However, harvesting the geographic features locked in heterogeneous raster maps to obtain that geospatial information is challenging. This is because of the varying image quality of raster maps (e.g., scanned maps with poor image quality and computer-generated maps with good image quality), the overlapping geographic features in maps, and the typical lack of metadata (e.g., map geocoordinates, map source, and original vector data). Previous work on map processing is typically limited to a specific type of map and often relies on intensive manual work. In contrast, this thesis investigates a general approach that does not rely on any prior knowledge and requires minimal user effort to process heterogeneous raster maps. This approach includes automatic and supervised techniques to process raster maps for separating individual layers of geographic features from the maps and recognizing geographic features in the separated layers (i.e., detecting road intersections, generating and vectorizing road geometry, and recognizing text labels). The automatic technique eliminates user intervention by exploiting common map properties of how road lines and text labels are drawn in raster maps. For example, road lines are elongated linear objects and characters are small connected objects. The supervised technique utilizes labels of road and text areas to handle complex raster maps, or maps with poor image quality, and can process a variety of raster maps with minimal user input. The results show that the general approach can handle raster maps with varying map complexity, color usage, and image quality. By matching extracted road intersections to another geospatial dataset, we can identify the geocoordinates of a raster map and further align the raster map, the feature layers separated from the map, and the features recognized from the layers with the geospatial dataset. The road vectorization and text recognition results outperform state-of-the-art commercial products while requiring considerably less user input. The approach in this thesis allows us to make use of the geospatial information of heterogeneous maps locked in raster format.
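The layer-separation heuristic (small connected objects as text, large elongated objects as road lines) can be illustrated with a simple connected-component pass. The area threshold and the assumption of an already-binarized input below are illustrative, not the thesis's actual parameters.

```python
# Hedged sketch of the separation heuristic described above: classify
# foreground connected components as text (small, compact) or road lines
# (large, elongated). Thresholds are illustrative only.
import numpy as np
from scipy import ndimage

def split_text_and_roads(binary_map, max_text_area=200):
    labels, n = ndimage.label(binary_map)
    text = np.zeros_like(binary_map, dtype=bool)
    roads = np.zeros_like(binary_map, dtype=bool)
    for i in range(1, n + 1):
        component = labels == i
        if component.sum() <= max_text_area:   # small connected object -> text
            text |= component
        else:                                   # large/elongated object -> road
            roads |= component
    return text, roads

# binary_map would come from color quantization of the raster map, e.g.:
# text, roads = split_text_and_roads(binary_map)
```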
Instructional Assessment Strategies for Health and Physical Education
ERIC Educational Resources Information Center
Constantinou, Phoebe
2017-01-01
Assessment is an integral part of the instructional process. It can provide valuable information to both students and teachers. Assessments can be a vector to quality corrective instruction, a second chance for struggling students to demonstrate success, and a means to provide a more complex challenge for advanced students. This article discusses…
Illustrating Business Marketing Concepts through the Value Chain Game
ERIC Educational Resources Information Center
Liao-Troth, Sara; Thomas, Stephanie P.; Webb, G. Scott
2015-01-01
The Value Chain Game is an activity that helps students to develop a holistic understanding of the processes and challenges in managing the value chain so that customer needs are met. Competing value chains work to produce and sell two products. Seasonal demand, quality defects, transportation delays, and audits offer complexities that represent…
Due to complex population dynamics and source-sink metapopulation processes, animal fitness sometimes varies across landscapes in ways that cannot be deduced from simple density patterns. In this study, we examine spatial patterns in fitness using a combination of intensive fiel...
Bilingualism Matters: One Size Does Not Fit All
ERIC Educational Resources Information Center
Gathercole, Virginia C. Mueller
2014-01-01
The articles in this special issue provide a complex picture of acquisition in bilinguals in which the factors that contribute to patterns of performance in bilingual children's two languages are myriad and diverse. The processes and contours of development in bilingual children are influenced, not only by the quantity, quality, and contexts…
NASA Astrophysics Data System (ADS)
Schoitsch, Erwin
1988-07-01
Our society depends more and more on the reliability of embedded (real-time) computer systems, even in everyday life. Considering the complexity of the real world, this might become a severe threat. Real-time programming is a discipline important not only in process control and data acquisition systems, but also in fields like communication, office automation, interactive databases, interactive graphics, and operating systems development. General concepts of concurrent programming and constructs for process synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, and interrupt and timeout handling in systems based on semaphores, signals, conditional critical regions, or real-time languages like Concurrent PASCAL, MODULA, CHILL and ADA are explained and compared with each other and with respect to their potential for quality and safety.
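As a concrete illustration of the semaphore-based synchronization constructs compared here, a minimal bounded-buffer producer-consumer follows, written in Python rather than in the real-time languages the paper discusses; buffer size and item counts are arbitrary.

```python
# Classic semaphore-based producer-consumer synchronization:
# 'empty' counts free buffer slots, 'full' counts available items,
# and a lock gives mutual exclusion on the shared buffer.
import threading, collections

buffer = collections.deque()
empty = threading.Semaphore(8)   # free slots in the bounded buffer
full = threading.Semaphore(0)    # items available to the consumer
lock = threading.Lock()          # mutual exclusion on the buffer itself

def producer():
    for item in range(20):
        empty.acquire()          # wait for a free slot
        with lock:
            buffer.append(item)
        full.release()           # signal one item is ready

def consumer():
    for _ in range(20):
        full.acquire()           # wait for an item
        with lock:
            item = buffer.popleft()
        empty.release()          # free the slot

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads: t.start()
for t in threads: t.join()
```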
An approach to quality and security of supply for single-use bioreactors.
Barbaroux, Magali; Gerighausen, Susanne; Hackel, Heiko
2014-01-01
Single-use systems (also referred to as disposables) have become a huge part of the bioprocessing industry, raising concerns regarding quality and security of supply. Processes must be in place to assure the supply and control of outsourced activities and the quality of purchased materials along the product life cycle. Quality and security of supply for single-use bioreactors (SUBs) are based on a multidisciplinary approach. Developing a state-of-the-art SUB system based on quality by design (QbD) principles requires broad expertise and know-how, including the cell culture application, polymer chemistry, regulatory requirements, and a deep understanding of the biopharmaceutical industry. Using standardized products reduces complexity and strengthens the robustness of the supply chain. Well-established supplier relations, including risk mitigation strategies, are the basis for achieving long-term security of supply. Well-developed quality systems, including change control approaches aligned with the requirements of the biopharmaceutical industry, are a key factor in supporting long-term product availability. This chapter outlines the approach to security of supply for key materials used in single-use production processes for biopharmaceuticals from a supplier perspective.
NASA Astrophysics Data System (ADS)
Dolimont, Adrien; Rivière-Lorphèvre, Edouard; Ducobu, François; Backaert, Stéphane
2018-05-01
Additive manufacturing is growing rapidly, which leads us to study the functionalization of the parts produced by these processes. Electron beam melting (EBM) is one of these technologies: a powder-based additive manufacturing (AM) method. With this process, it is possible to manufacture high-density metal parts with complex topology. One of the big problems with these technologies is the surface finish, so some finishing operations are needed to improve surface quality. In this study, the focus is set on chemical polishing. The goal is to determine how chemical etching impacts the dimensional accuracy and the surface roughness of EBM parts. To this end, an experimental campaign was carried out on the most widely used material in EBM, Ti6Al4V. Different exposure times were tested and their impact on surface quality was evaluated. To help predict the excess thickness to be provided, the dimensional impact of chemical polishing on EBM parts was estimated: 15 parts were measured before and after chemical machining. The improvement of surface quality was also evaluated after each treatment.
An Overview of the 3C-STAR project
NASA Astrophysics Data System (ADS)
Zhang, Y.
2009-04-01
Over the past three decades, city clusters have played a leading role in the economic growth of China, owing to their collective economic capacity and interdependency. However, pollution prevention has lagged behind the economic boom, leading to a general decline in air quality in city clusters. As a result, industrial emissions and traffic exhaust together contribute to high levels of ozone (O3) and fine particulate matter (PM2.5) pollution, ranging from the urban to the regional scale. Such high levels of both primary and secondary airborne pollutants have led to the development of a (perhaps typically Chinese) "air pollution complex" concept. This air pollution complex is particularly significant in the Beijing-Tianjin area, the Pearl River Delta (PRD), and the Yangtze River Delta. The concurrent high concentrations of O3 and PM2.5 in the PRD as well as in other Chinese city clusters have led to rather unique pollution characteristics due to interactions between primary emissions and photochemical processes, between gaseous compounds and aerosol-phase species, and between local- and regional-scale processes. The knowledge and experience needed to find solutions to this unique pollution complex in China are still lacking. Starting in 2007, we launched a major project, "Synthesized Prevention Techniques for Air Pollution Complex and Integrated Demonstration in Key City-Cluster Region" (3C-STAR), to address these problems scientifically and technically. The purpose of the project is to build up the capacity for regional air pollution control and to establish a regional coordination mechanism for the joint implementation of pollution control. The project includes a number of key technical components: a regional air quality monitoring network and super-sites, a regional dynamic emission inventory of multiple pollutants, a regional ensemble air quality forecasting model system, and a regional management system supported by a decision-making platform. The 3C-STAR project selected the PRD as a core area for technical demonstration, thus providing opportunities as well as challenges for the PRD to improve its regional air quality. An integrated field measurement campaign, 3C-STAR2008, was organized during October 15-November 19, 2008, including a 3-D regional air quality monitoring network, two super-sites, and on-site meteorological and air quality forecasting. With the efforts of more than 100 scientists and students from 12 research institutes, 3C-STAR2008 was conducted with great success. A great amount of data with rigorous QA/QC procedures has been obtained and data analysis is underway. In this talk, an overview of the 3C-STAR project will be presented, together with major findings from previous PRD campaigns (PRD2004 and PRD2006).
Learning process mapping heuristics under stochastic sampling overheads
NASA Technical Reports Server (NTRS)
Ieumwananonthachai, Arthur; Wah, Benjamin W.
1991-01-01
A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to obtain the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. The statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.
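A minimal sketch of the selection problem follows, assuming each heuristic evaluation returns both a quality and a random evaluation time charged against a total budget; the naive probing policy shown is a placeholder for the paper's statistical method.

```python
# Hedged sketch: pick the best heuristic under a total time budget when each
# evaluation's cost is itself random (a stochastic sampling overhead). The
# sampling model and budget policy are illustrative assumptions.
import random

def select_heuristic(heuristics, budget):
    """heuristics: list of callables, each returning (quality, eval_time)."""
    stats = {h: [] for h in heuristics}
    spent = 0.0
    while spent < budget:
        h = random.choice(heuristics)      # naive uniform probing policy
        quality, eval_time = h()
        spent += eval_time                 # stochastic evaluation cost
        stats[h].append(quality)
    # return the heuristic with the best mean observed quality
    return max(stats, key=lambda h: sum(stats[h]) / max(len(stats[h]), 1))
```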
Davenport, Paul B; Carter, Kimberly F; Echternach, Jeffrey M; Tuck, Christopher R
2018-02-01
High-reliability organizations (HROs) demonstrate unique and consistent characteristics, including operational sensitivity and control, situational awareness, hyperacute use of technology and data, and actionable process transformation. System complexity and reliance on information-based processes challenge healthcare organizations to replicate HRO processes. This article describes a healthcare organization's 3-year journey to achieve key HRO features to deliver high-quality, patient-centric care via an operations center powered by the principles of high-reliability data and software to impact patient throughput and flow.
Effective Processing of the Iron Ores
NASA Astrophysics Data System (ADS)
Kuskov, Vadim; Kuskova, Yana; Udovitsky, Vladimir
2017-11-01
An effective technology for complex, wasteless processing of iron ores has been designed; it includes three main components (plants): a comminution plant, a briquette plant, and a pigment plant. Comminution is performed using an energy-efficient technology. Using briquetting for ore agglomeration cuts costs and brings a higher level of environmental safety to the process. Briquettes can be formed by regular pressing or by extrusion. The developed technology allows the production of high-quality, competitive products for the metallurgical industry as well as red iron oxide pigments. The whole production line impacts the environment in a minimal manner.
Adherence to outpatient epilepsy quality indicators at a tertiary epilepsy center
Pourdeyhimi, R.; Wolf, B.J.; Simpson, A.N.; Martz, G.U.
2014-01-01
Introduction: Quality indicators for the treatment of people with epilepsy were published in 2010. This is the first report of adherence to all measures in routine care of people with epilepsy at a level 4 comprehensive epilepsy center in the US. Methods: Two hundred patients with epilepsy were randomly selected from the clinics of our comprehensive epilepsy center, and all visits during 2011 were abstracted for documentation of adherence to the eight quality indicators. Alternative measures were constructed to evaluate failure of adherence. Detailed descriptions of all equations are provided. Results: Objective measures (EEG, imaging) showed higher adherence than counseling measures (safety). Initial visits showed higher adherence. Variations in the interpretation of the quality measure result in different adherence values. Advanced practice providers and physicians had different adherence patterns. No patient-specific patterns of adherence were seen. Discussion: This is the first report of adherence to all the epilepsy quality indicators for a sample of patients during routine care in a level 4 epilepsy center in the US. Overall adherence was similar to that previously reported on similar measures. Precise definitions of adherence equations are essential for accurate measurement. Complex measures result in lower adherence. Counseling measures showed low adherence, possibly highlighting a difference between practice and documentation. Adherence to the measures as written does not guarantee high quality care. Conclusion: The current quality indicators have value in the process of improving quality of care. Future approaches may be refined to eliminate complex measures and incorporate features linked to outcomes. PMID:25171260
NASA Astrophysics Data System (ADS)
Brecher, Christian; Baum, Christoph; Bastuck, Thomas
2015-03-01
Economically advantageous microfabrication technologies for lab-on-a-chip diagnostic devices substituting commonly used glass etching or injection molding processes are one of the key enablers for the emerging market of microfluidic devices. On-site detection in fields of life sciences, point of care diagnostics and environmental analysis requires compact, disposable and highly functionalized systems. Roll-to-roll production as a high volume process has become the emerging fabrication technology for integrated, complex high technology products within recent years (e.g. fuel cells). Differently functionalized polymer films enable researchers to create a new generation of lab-on-a-chip devices by combining electronic, microfluidic and optical functions in multilayer architecture. For replication of microfluidic and optical functions via roll-to-roll production process competitive approaches are available. One of them is to imprint fluidic channels and optical structures of micro- or nanometer scale from embossing rollers into ultraviolet (UV) curable lacquers on polymer substrates. Depending on dimension, shape and quantity of those structures there are alternative manufacturing technologies for the embossing roller. Ultra-precise diamond turning, electroforming or casting polymer materials are used either for direct structuring or manufacturing of roller sleeves. Mastering methods are selected for application considering replication quality required and structure complexity. Criteria for the replication quality are surface roughness and contour accuracy. Structure complexity is evaluated by shapes producible (e.g. linear, circular) and aspect ratio. Costs for the mastering process and structure lifetime are major cost factors. The alternative replication approaches are introduced and analyzed corresponding to the criteria presented. Advantages and drawbacks of each technology are discussed and exemplary applications are presented.
Publicly disclosed information about the quality of health care: response of the US public
Schneider, E; Lieberman, T
2001-01-01
Public disclosure of information about the quality of health plans, hospitals, and doctors continues to be controversial. The US experience of the past decade suggests that sophisticated quality measures and reporting systems that disclose information on quality have improved the process and outcomes of care in limited ways in some settings, but these efforts have not led to the "consumer choice" market envisaged. Important reasons for this failure include the limited salience of objective measures to consumers, the complexity of the task of interpretation, and insufficient use of quality results by organised purchasers and insurers to inform contracting and pricing decisions. Nevertheless, public disclosure may motivate quality managers and providers to undertake changes that improve the delivery of care. Efforts to measure and report information about quality should remain public, but may be most effective if they are targeted to the needs of institutional and individual providers of care. PMID:11389318
Process monitoring of additive manufacturing by using optical tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zenzinger, Guenter, E-mail: guenter.zenzinger@mtu.de, E-mail: alexander.ladewig@mtu.de; Bamberg, Joachim, E-mail: guenter.zenzinger@mtu.de, E-mail: alexander.ladewig@mtu.de; Ladewig, Alexander, E-mail: guenter.zenzinger@mtu.de, E-mail: alexander.ladewig@mtu.de
2015-03-31
Parts fabricated by means of additive manufacturing are usually of complex shape, and owing to the fabrication procedure using selective laser melting (SLM), potential defects and inaccuracies are often very small in lateral size. An adequate quality inspection of such parts is therefore rather challenging, and non-destructive techniques (NDT) are difficult to realize; considerable efforts are nevertheless necessary in order to ensure the quality of SLM parts, especially those used for aerospace components. Thus, MTU Aero Engines is currently focusing on the development of an Online Process Control system which monitors and documents the complete welding process during the SLM fabrication procedure. A high-resolution camera system is used to obtain images, from which tomographic data for a 3-D analysis of SLM parts are processed. From the analysis, structural irregularities and structural disorder resulting from any possible erroneous melting process become visible and can be located anywhere within the 3-D structure. Results of our optical tomography (OT) method as obtained on real defects are presented.
Largoni, Martina; Facco, Pierantonio; Bernini, Donatella; Bezzo, Fabrizio; Barolo, Massimiliano
2015-10-10
Monitoring batch bioreactors is a complex task, because several sources of variability can affect a running batch and impact the final product quality. Additionally, the product quality itself may not be measurable on line, but requires sampling and lab analysis taking several days to complete. In this study we show that, by using appropriate process analytical technology tools, the operation of an industrial batch bioreactor used in avian vaccine manufacturing can be effectively monitored as the batch progresses. Multivariate statistical models are built from historical databases of batches already completed, and they are used to enable the real-time identification of the variability sources, to reliably predict the final product quality, and to improve process understanding, paving the way to a reduction of final product rejections as well as of the product cycle time. It is also shown that the product quality "builds up" mainly during the first half of a batch, suggesting on the one side that reducing the variability during this period is crucial, and on the other side that the batch length can possibly be shortened. Overall, the study demonstrates that, by using a Quality-by-Design approach centered on the appropriate use of mathematical modeling, quality can indeed be built "by design" into the final product, whereas end-point product testing can progressively lose importance in product manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
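One common realization of such multivariate batch monitoring is a PCA model with a Hotelling's T² control limit; the sketch below uses synthetic data and is an assumption about the general class of model, not the authors' specific tool.

```python
# Hedged sketch of multivariate statistical process monitoring: fit a PCA
# model on good historical batches, then flag a running batch whose
# Hotelling's T^2 exceeds a control limit. Data are hypothetical.
import numpy as np
from sklearn.decomposition import PCA

historical = np.random.normal(size=(50, 12))   # 50 good batches x 12 variables
pca = PCA(n_components=3).fit(historical)

def hotelling_t2(sample):
    scores = pca.transform(sample.reshape(1, -1)).ravel()
    return float(np.sum(scores**2 / pca.explained_variance_))

# Empirical 99th-percentile limit from the historical (in-control) batches
limit = np.percentile([hotelling_t2(b) for b in historical], 99)
new_batch = np.random.normal(size=12)
if hotelling_t2(new_batch) > limit:
    print("Batch deviates from historical variability -> investigate")
```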
Sabio Paz, Verónica; Panattieri, Néstor D; Cristina Godio, Farmacéutica; Ratto, María E; Arpí, Lucrecia; Dackiewicz, Nora
2015-10-01
Patient safety and quality of care has become a challenge for health systems. Health care is an increasingly complex and risky activity, as it represents a combination of human, technological and organizational processes. It is necessary, therefore, to take effective actions to reduce the adverse events and mitigate its impact. This glossary is a local adaptation of key terms and concepts from the international bibliographic sources. The aim is providing a common language for assessing patient safety processes and compare them.
NASA Astrophysics Data System (ADS)
Choi, C.; Baek, Y.; Lee, B. M.; Kim, K. H.; Rim, Y. S.
2017-12-01
We report solution-processed, amorphous indium-gallium-zinc-oxide-based (a-IGZO-based) thin-film transistors (TFTs). Our proposed solution-processed a-IGZO films, using a simple spin-coating method, were formed through nitrate ligand-based metal complexes, and they were annealed at low temperature (250 °C) to achieve high-quality oxide films and devices. We investigated solution-processed a-IGZO TFTs with various thicknesses, ranging from 4 to 16 nm. The 4 nm-thick TFT films had smooth morphology and high density, and they exhibited excellent performance, i.e., a high saturation mobility of 7.73 ± 0.44 cm2 V-1 s-1, a sub-threshold swing of 0.27 V dec-1, an on/off ratio of ~108, and a low threshold voltage of 3.10 ± 0.30 V. However, the performance of the TFTs degraded as the film thickness was increased. We further performed positive and negative bias stress tests to examine their electrical stability, and it was noted that the operating behavior of the devices was highly stable. Despite a small number of free charges, the high performance of the ultrathin a-IGZO TFTs was attributed to the small effect of the thickness of the channel, low bulk resistance, the quality of the a-IGZO/SiO2 interface, and high film density.
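The saturation mobility quoted above is conventionally extracted from the square-law transfer characteristic; a sketch of that standard extraction follows, where the geometry, oxide capacitance, and data are placeholders rather than the paper's values.

```python
# Hedged sketch: extract saturation mobility and threshold voltage from a
# transfer curve via the standard square-law fit,
#   Id = (W * Cox * mu / 2L) * (Vg - Vt)^2  in saturation,
# so sqrt(Id) is linear in Vg. Units must be kept consistent.
import numpy as np

def saturation_mobility(vg, id_sat, w, l, cox):
    """vg: gate voltages (V); id_sat: drain currents (A);
    w, l: channel width/length (cm); cox: oxide capacitance (F/cm^2)."""
    slope, intercept = np.polyfit(vg, np.sqrt(id_sat), 1)
    mu = 2 * l * slope**2 / (w * cox)   # cm^2 / (V*s)
    vt = -intercept / slope             # threshold voltage (V)
    return mu, vt
```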
H2S-mediated thermal and photochemical methane activation.
Baltrusaitis, Jonas; de Graaf, Coen; Broer, Ria; Patterson, Eric V
2013-12-02
Sustainable, low-temperature methods for natural gas activation are critical in addressing current and foreseeable energy and hydrocarbon feedstock needs. Large portions of natural gas resources are still too expensive to process due to their high content of hydrogen sulfide gas (H2S) mixed with methane, deemed altogether as sub-quality or "sour" gas. We propose a unique method of activation to form a mixture of sulfur-containing hydrocarbon intermediates, CH3SH and CH3SCH3, and an energy carrier such as H2. For this purpose, we investigated the H2S-mediated methane activation to form a reactive CH3SH species by means of direct photolysis of sub-quality natural gas. Photoexcitation of hydrogen sulfide in the CH4 + H2S complex resulted in a barrierless relaxation by a conical intersection to form a ground-state CH3SH + H2 complex. The resulting CH3SH could further be coupled over acidic catalysts to form higher hydrocarbons, and the resulting H2 used as a fuel. This process is very different from conventional thermal or radical-based processes and can be driven photolytically at low temperatures, with enhanced control over the conditions currently used in industrial oxidative natural gas activation. Finally, the proposed process is CO2 neutral, as opposed to the current industrial steam methane reforming (SMR). Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Low-complexity camera digital signal imaging for video document projection system
NASA Astrophysics Data System (ADS)
Hsia, Shih-Chang; Tsai, Po-Shien
2011-04-01
We present high-performance and low-complexity algorithms for real-time camera imaging applications. The main functions of the proposed camera digital signal processing (DSP) involve color interpolation, white balance, adaptive binary processing, auto gain control, and edge and color enhancement for video projection systems. A series of simulations demonstrate that the proposed method can achieve good image quality while keeping computation cost and memory requirements low. On the basis of the proposed algorithms, the cost-effective hardware core is developed using Verilog HDL. The prototype chip has been verified with one low-cost programmable device. The real-time camera system can achieve 1270 × 792 resolution with the combination of extra components and can demonstrate each DSP function.
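Of the DSP stages listed, white balance admits a particularly compact illustration. The gray-world method below is a common low-complexity choice and an assumption on my part; the paper's exact algorithm is not reproduced here.

```python
# Hedged sketch of one camera-DSP stage mentioned above: gray-world automatic
# white balance, a standard low-complexity approach (not necessarily the
# paper's own algorithm). Input is assumed to be a float image in [0, 1].
import numpy as np

def gray_world_white_balance(rgb):
    """rgb: HxWx3 float array. Scale each channel so its mean matches the
    overall gray mean, then clip back into range."""
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means   # per-channel gain factors
    return np.clip(rgb * gains, 0.0, 1.0)
```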
Smart manufacturing of complex shaped pipe components
NASA Astrophysics Data System (ADS)
Salchak, Y. A.; Kotelnikov, A. A.; Sednev, D. A.; Borikov, V. N.
2018-03-01
Manufacturing industry is constantly improving, and nowadays the most relevant trend is widespread automation and optimization of the production process. This paper presents a novel approach for smart manufacturing of steel pipe valves. The system includes two main parts: a mechanical treatment unit and a quality assurance unit. Mechanical treatment is performed by a milling machine under computerized numerical control, while the quality assurance unit contains three testing modules for different tasks: X-ray testing, optical scanning, and ultrasound testing. Together these modules provide reliable results containing information about any failures of the technological process and any deviations of the geometrical parameters of the valves. The system also allows detecting defects on the surface or in the inner structure of the component.
NASA Astrophysics Data System (ADS)
Starikov, A. I.; Nekrasov, R. Yu; Teploukhov, O. J.; Soloviev, I. V.; Narikov, K. A.
2016-10-01
Machines and equipment improve in design as science and technology advance, and requirements for quality and longevity rise accordingly. That is, the requirements for the surface quality and manufacturing precision of oil and gas equipment parts are constantly increasing. Producing oil and gas engineering products on modern machine tools with computer numerical control involves a complex interplay of the mechanical and electrical parts of the machine as well as the machining procedure. The mechanical part of a machine wears during operation, while mathematical errors accumulate in the electrical part. Shortcomings in either of these parts of the metalworking equipment thus affect the manufacturing process as a whole and ultimately lead to defects.
NASA Astrophysics Data System (ADS)
Shahiri, Amirah Mohamed; Husain, Wahidah; Rashid, Nur'Aini Abd
2017-10-01
Huge amounts of data in educational datasets may cause problems in producing quality data. Recently, data mining approaches have been increasingly used by educational data mining researchers for analyzing data patterns. However, many research studies have concentrated on selecting suitable learning algorithms instead of performing a feature selection process. As a result, these data suffer from high computational complexity and require longer computation time for classification. The main objective of this research is to provide an overview of the feature selection techniques that have been used to analyze the most significant features. This research then proposes a framework to improve the quality of students' datasets. The proposed framework uses filter- and wrapper-based techniques to support the prediction process in future study.
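A minimal sketch of such a filter-then-wrapper arrangement follows, using scikit-learn on synthetic data; the specific filter statistic, wrapper, classifier, and sizes are illustrative choices, not those of the proposed framework.

```python
# Hedged sketch of a filter + wrapper feature-selection pipeline: a fast
# univariate filter prunes weak features, then a wrapper (RFE) searches the
# remainder with the target classifier. Dataset and sizes are illustrative.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, RFE
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

X, y = make_classification(n_samples=300, n_features=40, n_informative=8)

selector = Pipeline([
    ("filter", SelectKBest(f_classif, k=20)),             # cheap filter stage
    ("wrapper", RFE(LogisticRegression(max_iter=1000),    # costlier wrapper stage
                    n_features_to_select=8)),
    ("clf", LogisticRegression(max_iter=1000)),
])
selector.fit(X, y)
print("Training accuracy:", selector.score(X, y))
```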
Selenocysteine incorporation: A trump card in the game of mRNA decay
Shetty, Sumangala P.; Copeland, Paul R.
2015-01-01
The incorporation of the 21st amino acid, selenocysteine (Sec), occurs on mRNAs that harbor in-frame stop codons because the Sec-tRNASec recognizes a UGA codon. This sets up an intriguing interplay between translation elongation, translation termination and the complex machinery that marks mRNAs that contain premature termination codons for degradation, leading to nonsense mediated mRNA decay (NMD). In this review we discuss the intricate and complex relationship between this key quality control mechanism and the process of Sec incorporation in mammals. PMID:25622574
A new large-scale manufacturing platform for complex biopharmaceuticals.
Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer
2012-12-01
Complex biopharmaceuticals, such as recombinant blood coagulation factors, are addressing critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such molecules, which are sometimes inherently unstable, it is important to minimize product residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture allows minimization of residence time in the bioreactor, but also brings unique challenges in product recovery, which require innovative solutions. In order to maximize yield, process efficiency, and facility and equipment utilization, we have developed, scaled up, and successfully implemented a new integrated manufacturing platform at commercial scale. This platform consists of a (semi-)continuous cell separation process, based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared to the conventional process technology, while product quality has been shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with a 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential, in particular, for manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.
Care pathways for organ donation after brain death: guidance from available literature?
Hoste, Pieter; Vanhaecht, Kris; Ferdinande, Patrick; Rogiers, Xavier; Eeckloo, Kristof; Blot, Stijn; Hoste, Eric; Vogelaers, Dirk; Vandewoude, Koenraad
2016-10-01
A discussion of the literature concerning the impact of care pathways in the complex and by definition multidisciplinary process of organ donation following brain death. Enhancing the quality and safety of organs for transplantation has become a central concern for governmental and professional organizations. At the local hospital level, a donor coordinator can use a range of interventions to improve the donation and procurement process. Care pathways have been proven to represent an effective intervention in several settings for optimizing processes and outcomes. A discussion paper. A systematic review of the Medline, CINAHL, EMBASE and The Cochrane Library databases was conducted for articles published until June 2015, using the keywords donation after brain death and care pathways. Each paper was reviewed to investigate the effects of existing care pathways for donation after brain death. An additional search for unpublished information was conducted. Although literature supports care pathways as an effective intervention in several settings, few studies have explored its use and effectiveness for complex care processes such as donation after brain death. Nurses should be aware of their role in the donation process. Care pathways have the potential to support them, but their effectiveness has been insufficiently explored. Further research should focus on the development and standardization of the clinical content of a care pathway for donation after brain death and the identification of quality indicators. These should be used in a prospective effectiveness assessment of the proposed pathway. © 2016 John Wiley & Sons Ltd.
H2S mediated thermal and photochemical methane activation
Baltrusaitis, Jonas; de Graaf, Coen; Broer, Ria; Patterson, Eric
2013-01-01
Sustainable, low temperature methods of natural gas activation are critical in addressing current and foreseeable energy and hydrocarbon feedstock needs. Large portions of natural gas resources are still too expensive to process due to their high content of hydrogen sulfide gas (H2S) in mixture with methane, CH4, altogether deemed as sub-quality or “sour” gas. We propose a unique method for activating this “sour” gas to form a mixture of sulfur-containing hydrocarbon intermediates, CH3SH and CH3SCH3, and an energy carrier, such as H2. For this purpose, we computationally investigated H2S mediated methane activation to form a reactive CH3SH species via direct photolysis of sub-quality natural gas. Photoexcitation of hydrogen sulfide in the CH4+H2S complex results in a barrier-less relaxation via a conical intersection to form a ground state CH3SH+H2 complex. The resulting CH3SH can further be heterogeneously coupled over acidic catalysts to form higher hydrocarbons while the H2 can be used as a fuel. This process is very different from a conventional thermal or radical-based processes and can be driven photolytically at low temperatures, with enhanced controllability over the process conditions currently used in industrial oxidative natural gas activation. Finally, the proposed process is CO2 neutral, as opposed to the currently industrially used methane steam reforming (SMR). PMID:24150813
Vecchi, Simona; Agabiti, Nera; Mitrova, Susanna; Cacciani, Laura; Amato, Laura; Davoli, Marina; Bargagli, Anna Maria
2016-01-01
We analysed evidence on effective interventions to improve the quality of care and management in patients with type 2 diabetes. This review focuses particularly on audit and feedback interventions, targeted to healthcare providers, and on continuous quality improvement (CQI), involving health professionals and healthcare systems, respectively. We searched The Cochrane Library, PubMed, and EMBASE (search period: January 2005-December 2015) to identify systematic reviews (SR) and randomized controlled trials (RCTs) considering patients' outcomes and process measures as quality indicators in diabetes care. Selection of studies and data extraction were carried out independently by two reviewers. Methodological quality of individual studies was assessed using the «Assessment of methodological quality of systematic review» (AMSTAR) checklist and the Cochrane tool, respectively. We produced summaries of results for each study design. The search process resulted in 810 citations. One SR and 7 RCTs that compared any intervention in which audit and feedback or CQI was a component vs. other interventions were selected. The SR found that audit and feedback activity was associated with improvements in glycaemic (mean difference: 0.26; 95%CI 0.08;0.44) and cholesterol control (mean difference: 0.03; 95%CI -0.04;0.10). CQI interventions were not associated with an improvement in the quality of diabetes care. The RCTs considered in this review compared a broad range of interventions, including feedback as a unique activity or as part of more complex strategies. The methodological quality was generally poor in all the included trials. The available evidence suggests that audit and feedback and CQI improve quality of care in diabetic patients, although the effect is small and heterogeneous among process and outcome indicators.
NASA Astrophysics Data System (ADS)
Kwon, Seyong; Cho, Chang Hyun; Kwon, Youngmee; Lee, Eun Sook; Park, Je-Kyun
2017-04-01
Immunohistochemistry (IHC) plays an important role in biomarker-driven cancer therapy. Although there has been a high demand for standardized and quality assured IHC, it has rarely been achieved due to the complexity of IHC testing and the subjective validation-based process flow of IHC quality control. We present here a microfluidic immunostaining system for the standardization of IHC by creating a microfluidic linearly graded antibody (Ab)-staining device and a reference cell microarray. Unlike conventional efforts, our system deals primarily with the screening of biomarker staining conditions for quantitative quality assurance testing in IHC. We characterized the microfluidic matching of Ab staining intensity using three HER2 Abs produced by different manufacturers. The quality of HER2 Ab was also validated using tissues of breast cancer patients, demonstrating that our system is an efficient and powerful tool for the standardization and quality assurance of IHC.
Karavitis, G.A.
1984-01-01
The SIMSYS2D two-dimensional water-quality simulation system is large-scale digital modeling software used to simulate flow and transport of solutes in freshwater and estuarine environments. Owing to the size, processing requirements, and complexity of the system, it must be easy to move the system and its associated files between computer sites when required. A series of job control language (JCL) procedures was written to allow transferability between IBM and IBM-compatible computers. (USGS)
Template-Based Modeling of Protein-RNA Interactions
Zheng, Jinfang; Kundrotas, Petras J.; Vakser, Ilya A.
2016-01-01
Protein-RNA complexes formed by specific recognition between RNA and RNA-binding proteins play an important role in biological processes. More than a thousand such proteins in humans have been curated, and many novel RNA-binding proteins are yet to be discovered. Due to limitations of experimental approaches, computational techniques are needed for the characterization of protein-RNA interactions. Although much progress has been made, adequate methodologies reliably providing atomic-resolution structural details are still lacking. Protein-RNA free docking approaches have proved useful, but in general template-based approaches provide higher-quality predictions. Templates are key to building a high-quality model. Sequence/structure relationships were studied based on a representative set of binary protein-RNA complexes from the PDB. Several approaches were tested for pairwise target/template alignment. The analysis revealed a transition point between random and correct binding modes. The results showed that structural alignment is better than sequence alignment in identifying good templates, suitable for generating protein-RNA complexes close to the native structure, and outperforms free docking, successfully predicting complexes where free docking fails, including cases of significant conformational change upon binding. A template-based protein-RNA interaction modeling protocol, PRIME, was developed and benchmarked on a representative set of complexes. PMID:27662342
Li, Boyan; Ryan, Paul W; Shanahan, Michael; Leister, Kirk J; Ryder, Alan G
2011-11-01
The application of fluorescence excitation-emission matrix (EEM) spectroscopy to the quantitative analysis of complex, aqueous solutions of cell culture media components was investigated. These components, yeastolate, phytone, recombinant human insulin, eRDF basal medium, and four different chemically defined (CD) media, are used for the formulation of basal and feed media employed in the production of recombinant proteins using a Chinese Hamster Ovary (CHO) cell based process. The comprehensive analysis (either identification or quality assessment) of these materials using chromatographic methods is time consuming and expensive and is not suitable for high-throughput quality control. The use of EEM in conjunction with multiway chemometric methods provided a rapid, nondestructive analytical method suitable for the screening of large numbers of samples. Here we used multiway robust principal component analysis (MROBPCA) in conjunction with n-way partial least squares discriminant analysis (NPLS-DA) to develop a robust routine for both the identification and quality evaluation of these important cell culture materials. These methods are applicable to a wide range of complex mixtures because they do not rely on any predetermined compositional or property information, thus making them potentially very useful for sample handling, tracking, and quality assessment in biopharmaceutical industries.
CVD2014-A Database for Evaluating No-Reference Video Quality Assessment Algorithms.
Nuutinen, Mikko; Virtanen, Toni; Vaahteranoksa, Mikko; Vuori, Tero; Oittinen, Pirkko; Hakkinen, Jukka
2016-07-01
In this paper, we present a new video database: CVD2014-Camera Video Database. In contrast to previous video databases, this database uses real cameras rather than introducing distortions via post-processing, which results in a complex distortion space in regard to the video acquisition process. CVD2014 contains a total of 234 videos that are recorded using 78 different cameras. Moreover, this database contains the observer-specific quality evaluation scores rather than only providing mean opinion scores. We have also collected open-ended quality descriptions that are provided by the observers. These descriptions were used to define the quality dimensions for the videos in CVD2014. The dimensions included sharpness, graininess, color balance, darkness, and jerkiness. At the end of this paper, a performance study of image and video quality algorithms for predicting the subjective video quality is reported. For this performance study, we proposed a new performance measure that accounts for observer variance. The performance study revealed that there is room for improvement regarding the video quality assessment algorithms. The CVD2014 video database has been made publicly available for the research community. All video sequences and corresponding subjective ratings can be obtained from the CVD2014 project page (http://www.helsinki.fi/psychology/groups/visualcognition/).
Agents for Change: Nonphysician Medical Providers and Health Care Quality
Boucher, Nathan A; McMillen, Marvin A; Gould, James S
2015-01-01
Quality medical care is a clinical and public health imperative, but defining quality and achieving improved, measureable outcomes are extremely complex challenges. Adherence to best practice invariably improves outcomes. Nonphysician medical providers (NPMPs), such as physician assistants and advanced practice nurses (eg, nurse practitioners, advanced practice registered nurses, certified registered nurse anesthetists, and certified nurse midwives), may be the first caregivers to encounter the patient and can act as agents for change for an organization’s quality-improvement mandate. NPMPs are well positioned to both initiate and ensure optimal adherence to best practices and care processes from the moment of initial contact because they have robust clinical training and are integral to trainee/staff education and the timely delivery of care. The health care quality aspects that the practicing NPMP can affect are objective, appreciative, and perceptive. As bedside practitioners and participants in the administrative and team process, NPMPs can fine-tune care delivery, avoiding the problem areas defined by the Institute of Medicine: misuse, overuse, and underuse of care. This commentary explores how NPMPs can affect quality by 1) supporting best practices through the promotion of guidelines and protocols, and 2) playing active, if not leadership, roles in patient engagement and organizational quality-improvement efforts. PMID:25663213
Simon, Christian; Caballero, Carmela
2018-05-24
It is unquestionably in the best interest of our patients if we can identify ways to improve the quality of care we deliver to them. Great progress has been made within the last 25 years in the development and implementation of quality-assurance (QA) platforms and quality-improvement programs for surgery in general and, within this context, for head and neck surgery. We have successfully identified process indicators that impact the outcomes of our patients and the quality of care we deliver as surgeons. We have developed risk calculators to determine the risk of complications for individual surgical patients. We have created perioperative guidelines for complex head and neck procedures. We have, in Europe and North America, created audit registries that can gather and analyze data from institutions across the world to better understand which processes need change to obtain good outcomes and improve quality of care. QA platforms can be tested within the clearly defined environment of prospective clinical trials. If positive, such programs could be rolled out within national healthcare systems, where feasible. Testing quality programs in clinical trials could be a versatile tool to help head and neck cancer patients benefit directly from such initiatives on a global level.
Quality measurement in diabetes care.
Leas, Brian F; Berman, Bettina; Kash, Kathryn M; Crawford, Albert G; Toner, Richard W; Goldfarb, Neil I; Nash, David B
2009-10-01
This study aimed to evaluate diabetes quality measurement efforts, assess their strengths and areas for improvement, and identify gaps not adequately addressed by these measures. We conducted an environmental scan of diabetes quality measures, focusing on metrics included in the National Quality Measures Clearinghouse or promulgated by leading measurement organizations. Key informant interviews were also completed with thought leaders who develop, promote, and use quality measures. The environmental scan identified 146 distinct measures spanning 31 clinical processes or outcomes. This suggests a measurement system that is both redundant and inconsistent, with many different measures assessing the same clinical indicators. Interviewees believe that current diabetes measurement efforts are excessively broad and complex and expressed a need for better harmonization of these measures. Several gaps were also found, including a lack of measures focusing on population health, structural elements of health care, and prevention of diabetes.
A Global Observatory of Lake Water Quality
NASA Astrophysics Data System (ADS)
Tyler, Andrew N.; Hunter, Peter D.; Spyrakos, Evangelos; Neil, Claire; Simis, Stephen; Groom, Steve; Merchant, Chris J.; Miller, Claire A.; O'Donnell, Ruth; Scott, E. Marian
2017-04-01
Our planet's surface waters are a fundamental resource encompassing a broad range of ecosystems that are core to global biogeochemical cycling, biodiversity, and food and energy security. Despite this, these same waters are impacted by multiple natural and anthropogenic pressures and drivers of environmental change. The complex interaction between physical, chemical and biological processes in surface waters poses significant challenges for in situ monitoring and assessment, and this often limits our ability to adequately capture the dynamics of aquatic systems and our understanding of their status, functioning and response to pressures. Recent developments in the availability of satellite platforms for Earth observation (including ESA's Copernicus Programme) offer an unprecedented opportunity to deliver measures of water quality at a global scale. The UK NERC-funded GloboLakes project is a five-year research programme investigating the state of lakes and their response to climatic and other environmental drivers of change through the realization of a near-real-time satellite-based observatory (Sentinel-3) and archive data processing (MERIS, SeaWiFS) to produce a 20-year time series of observed ecological parameters and lake temperature for more than 1000 lakes globally. However, the diverse and complex optical properties of lakes mean that algorithm performance often varies markedly between different water types. The GloboLakes project is overcoming this challenge by developing a processing chain whereby algorithms are dynamically selected according to the optical properties of the lake under observation. The development and validation of the GloboLakes processing chain have been supported by access to extensive in situ data from more than thirty partners around the world, now held in the LIMNADES community-owned data repository developed under the auspices of GloboLakes. This approach has resulted in a step change in our ability to produce regional and global water quality products for optically complex waters, complete with greatly improved uncertainty estimates. The value of these data and the future scientific opportunities they provide will be illustrated with examples of how they can be used to improve our understanding of the impact of global environmental change on inland, transitional and near-shore coastal waters.
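The dynamic, per-water-type algorithm selection can be sketched as nearest-centroid classification followed by dispatch; everything concrete below (centroids, band handling, per-type retrievals) is a placeholder, not the GloboLakes chain itself.

```python
# Hedged sketch of dynamic algorithm selection: assign a reflectance spectrum
# to its nearest optical water type (OWT) by spectral angle, then dispatch the
# retrieval algorithm tuned for that type. All names and data are invented.
import numpy as np

def classify_owt(reflectance, centroids):
    """Nearest-centroid classification by spectral angle (radians)."""
    def angle(a, b):
        return np.arccos(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return min(centroids, key=lambda owt: angle(reflectance, centroids[owt]))

def retrieve_chlorophyll(reflectance, centroids, algorithms):
    owt = classify_owt(reflectance, centroids)
    return algorithms[owt](reflectance)   # per-type retrieval function

# e.g. algorithms = {"clear": blue_green_ratio, "turbid": red_nir_ratio}
# with centroids holding one mean spectrum per optical water type.
```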
NASA Astrophysics Data System (ADS)
Shope, C. L.; Maharjan, G. R.; Tenhunen, J.; Seo, B.; Kim, K.; Riley, J.; Arnhold, S.; Koellner, T.; Ok, Y. S.; Peiffer, S.; Kim, B.; Park, J.-H.; Huwe, B.
2014-02-01
Watershed-scale modeling can be a valuable tool to aid in quantification of water quality and yield; however, several challenges remain. In many watersheds it is difficult to adequately quantify hydrologic partitioning. Data scarcity is prevalent, the accuracy of spatially distributed meteorology is difficult to quantify, forest encroachment and land use issues are common, and surface water and groundwater abstractions substantially modify watershed-based processes. Our objective is to assess the capability of the Soil and Water Assessment Tool (SWAT) model to capture event-based and long-term monsoonal rainfall-runoff processes in complex mountainous terrain. To accomplish this, we developed a unique quality-control, gap-filling algorithm for interpolation of high-frequency meteorological data. We used a novel multi-location, multi-optimization calibration technique to improve estimations of catchment-wide hydrologic partitioning. The interdisciplinary model was calibrated against a unique combination of statistical, hydrologic, and plant growth metrics. Our results indicate scale-dependent sensitivity of hydrologic partitioning and substantial influence of engineered features. The addition of hydrologic and plant growth objective functions identified the importance of culverts in catchment-wide flow distribution. This study shows the challenges of applying the SWAT model to complex terrain and extreme environments; by incorporating anthropogenic features into modeling scenarios, we can enhance our understanding of the hydroecological impact.
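A quality-control and gap-filling step of the kind described might look like the following sketch; the range limits, maximum gap length, error code, and data are invented for illustration, and the study's actual algorithm is more elaborate.

```python
# Hedged sketch of a QC / gap-filling step for high-frequency meteorological
# data: flag out-of-range values as missing, then fill short gaps by
# time-based interpolation. Thresholds and max gap are assumptions.
import pandas as pd
import numpy as np

def qc_gap_fill(series, lower, upper, max_gap=3):
    s = series.copy()
    s[(s < lower) | (s > upper)] = np.nan          # physical-range QC flag
    return s.interpolate(method="time", limit=max_gap)

idx = pd.date_range("2010-07-01", periods=48, freq="h")
precip = pd.Series(np.random.gamma(1.0, 2.0, 48), index=idx)
precip.iloc[10:12] = -999.0                        # hypothetical sensor error code
filled = qc_gap_fill(precip, lower=0.0, upper=150.0)
```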
Acceptance Probability (Pa) Analysis for Process Validation Lifecycle Stages.
Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep
2016-04-01
This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines indicate processes with Cpk > 1.33 as performing well within statistical control and those with Cpk < 1.0 as "incapable" (1). A Cpk > 1.33 is associated with a centered process that will statistically produce fewer than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
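The quoted link between Cpk and acceptance probability follows directly from the normality assumption: for a centered process the two-sided defect rate is 2*Phi(-3*Cpk). A small sketch reproduces the figures above; the numbers are illustrative checks, not the paper's method.

```python
# Relate a capability index to an acceptance probability for a centered
# normal process: P(accept) = 1 - 2*Phi(-3*Cpk).
from math import erf, sqrt

def normal_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def acceptance_probability(cpk):
    """P(within spec) for a centered normal process with capability cpk."""
    return 1.0 - 2.0 * normal_cdf(-3.0 * cpk)

for cpk in (1.0, 4.0 / 3.0):
    p = acceptance_probability(cpk)
    print(f"Cpk={cpk:.2f}: acceptance {p:.6%}, defects/million ~ {(1 - p) * 1e6:.0f}")
```

For Cpk = 4/3 this yields roughly 63 defective units per million, i.e. an acceptance probability above 99.99%, matching the figures quoted above.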
Chaplin, E; Bailey, M; Crosby, R; Gorman, D; Holland, X; Hippe, C; Hoff, T; Nawrocki, D; Pichette, S; Thota, N
1999-06-01
Health care has a number of historical barriers to capturing the voice of the customer and to incorporating customer wants into health care services, whether the customer is a patient, an insurer, or a community. Quality function deployment (QFD) is a set of tools and practices that can help overcome these barriers to form a process for the planning and design or redesign of products and services. The goal of the project was to increase referral volume and to improve a rehabilitation hospital's capacity to provide comprehensive medical and/or legal evaluations for people with complex and catastrophic injuries or illnesses. HIGH-LEVEL VIEW OF QFD AS A PROCESS: The steps in QFD are as follows: capture of the voice of the customer, quality deployment, functions deployment, failure mode deployment, new process deployment, and task deployment. The output of each step becomes the input to a matrix tool or table of the next step of the process. In 3 1/2 months, a nine-person project team at Continental Rehabilitation Hospital (San Diego) used QFD tools to capture the voice of the customer, use these data as the basis for a questionnaire on important qualities of service from the customer's perspective, obtain competitive data on how the organization was perceived to be meeting the demanded qualities, identify measurable dimensions and targets of these qualities, and incorporate into the delivery of service the functions and tasks necessary to meet the demanded qualities. The future of providing health care services will belong to organizations that can adapt to a rapidly changing environment and to demands for new products and services that are produced and delivered in new ways.
Jin, Hao; Huang, Hai; Dong, Wei; Sun, Jian; Liu, Anding; Deng, Meihong; Dirsch, Olaf; Dahmen, Uta
2012-08-01
As repeatedly performing rat liver transplantation (LTx) until animals survive is inefficient with respect to time and the use of living animals, we developed a new training concept. METHODS AND CONCEPTS: Training was divided into four phases: a pretraining phase, a basic-microsurgical-training phase, an advanced-microsurgical-training phase, and an expert-microsurgical-training phase. Two "productivity phases" were introduced right after the basic- and advanced-microsurgical-training phases, respectively, to allow the trainee to accumulate experience and to be scientifically productive before proceeding to a more complex procedure. PDCA cycles and quality criteria were employed to control the learning process and the surgical quality. Predefined quality criteria included survival rate and intraoperative, postoperative, and histologic parameters. Three trainees participated in the LTx training and achieved their first survival record within 4-10 operations. All of them completely mastered the LTx in fewer procedures (31, 60, and 26 procedures) than reported elsewhere, and the more complex arterialized or partial LTx were mastered by trainees A and B in an additional 9 and 13 procedures, respectively. Fast progress was possible due to the high number of training operations in the two productivity phases. The stepwise and PDCA-based training program increased the efficiency of LTx training, whereas the constant application and development of predefined quality criteria guaranteed the quality of microsurgery. Copyright © 2012 Elsevier Inc. All rights reserved.
Kristin, Julia; Glaas, Marcel Fabian; Stenin, Igor; Albrecht, Angelika; Klenzner, Thomas; Schipper, Jörg; Eysel-Gosepath, Katrin
2017-11-01
Monitoring the health-related quality of life (HRQOL) for patients with vestibular schwannoma (VS) has garnered increasing interest. In German-speaking countries, there is no disease-specific questionnaire available similar to the "Penn Acoustic Neuroma Quality-of-life Scale" (PANQOL). We translated the PANQOL for German-speaking patients based on a multistep protocol that included not only a forward-backward translation but also linguistic and sociocultural adaptations. The process consists of translation, synthesis, back translation, review by an expert committee, administration of the prefinal version to our patients, submission and appraisal of all written documents by our research team. The required multidisciplinary team for translation comprised head and neck surgeons, language professionals (German and English), a professional translator, and bilingual participants. A total of 123 patients with VS underwent microsurgical procedures via different approaches at our clinic between January 2007 and January 2017. Among these, 72 patients who underwent the translabyrinthine approach participated in the testing of the German-translated PANQOL. The first German version of the PANQOL questionnaire was created by a multistep translation process. The responses indicate that the questionnaire is simple to administer and applicable to our patients. The use of a multistep process to translate quality-of-life questionnaires is complex and time-consuming. However, this process was performed properly and resulted in a version of the PANQOL for assessing the quality of life of German-speaking patients with VS.
Transfer of knowledge from sound quality measurement to noise impact evaluation
NASA Astrophysics Data System (ADS)
Genuit, Klaus
2004-05-01
It is well known that the measurement and analysis of sound quality requires a complex procedure with consideration of the physical, psychoacoustical and psychological aspects of sound. Sound quality cannot be described only by a simple value based on A-weighted sound pressure level measurements. The A-weighted sound pressure level is sufficient to predict the probability that the human ear could be damaged by sound, but it is not the correct descriptor for the annoyance of a complex sound situation given by several different sound events at different and especially moving positions (soundscape). On the one hand, consideration of the spectral distribution and the temporal pattern (psychoacoustics) is required; on the other hand, the subjective attitude with respect to the sound situation and the expectation and experience of the people (psychology) have to be included in the complete noise impact evaluation. This paper describes applications of the newest methods of sound quality measurement - well established among car manufacturers - based on artificial head recordings and signal processing comparable to human hearing, used in noisy environments such as community/traffic noise.
The importance of improving the quality of emergency surgery for a regional quality collaborative.
Smith, Margaret; Hussain, Adnan; Xiao, Jane; Scheidler, William; Reddy, Haritha; Olugbade, Kola; Cummings, Dustin; Terjimanian, Michael; Krapohl, Greta; Waits, Seth A; Campbell, Darrell; Englesbe, Michael J
2013-04-01
Within a large, statewide collaborative, significant improvement in surgical quality has been observed (a 9.0% reduction in morbidity for elective general and vascular surgery). Our group has not noted such quality improvement in the care of patients who had emergency operations. With this work, we aim to describe the scope of emergency surgical care within the Michigan Surgical Quality Collaborative, variations in outcomes among hospitals, and variations in adherence to evidence-based process measures. Overall, these data will form the basis for a broad-based quality improvement initiative within Michigan. We report morbidity, mortality, and costs of emergency and elective general and vascular surgery cases (N = 190,826) within 34 hospitals participating in the Michigan Surgical Quality Collaborative from 2005 to 2010. Adjusted hospital-specific outcomes were calculated using a stepwise multivariable logistic regression model. Adjustment covariates included patient-specific comorbidities and case complexity. Hospitals were also compared on the basis of their adherence to evidence-based process measures [measured at the patient level for each case - Surgical Care Improvement Project (SCIP)-1 and SCIP-2 compliance]. Emergency procedures account for approximately 11% of total cases, yet they represented 47% of mortalities and 28% of surgical complications. The complication-specific cost to payers was $126 million for emergency cases and $329 million for elective cases. Adjusted patient outcomes varied widely within Michigan Surgical Quality Collaborative hospitals; morbidity and mortality rates ranged from 16.3% to 33.9% and 4.0% to 12.4%, respectively. The variation among hospitals was not correlated with volume of emergency cases or case complexity. Hospital performance in emergency surgery was found not to depend on a hospital's share of emergent cases but rather to correlate directly with its performance in elective surgery. For emergency colectomies, there was wide variation in compliance with SCIP-1 and SCIP-2 measures, and overall compliance (42.0%) was markedly lower than that for elective colon surgery (81.7%). Emergency surgical procedures are an important target for future quality improvement efforts within Michigan. Future work will identify best practices within high-performing hospitals and disseminate these practices within the collaborative.
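The risk-adjustment step described above can be sketched generically. The covariates, the simulated data, and the observed-to-expected summary below are illustrative assumptions, not the MSQC model specification.

```python
# Sketch of risk-adjusted hospital outcomes: fit a logistic model of
# complications on patient covariates, then compare each hospital's observed
# complication rate to its model-expected rate (an O/E ratio).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.integers(0, 2, n),       # emergency case (1) vs elective (0)
    rng.normal(60, 15, n),       # age, a stand-in comorbidity covariate
    rng.integers(1, 5, n),       # case-complexity score
])
logit = -4.0 + 1.2 * X[:, 0] + 0.03 * X[:, 1] + 0.3 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # simulated complications
hospital = rng.integers(0, 34, n)              # 34 hospitals, as in the study

model = LogisticRegression(max_iter=1000).fit(X, y)
expected = model.predict_proba(X)[:, 1]
for h in range(3):                             # O/E ratio for a few hospitals
    mask = hospital == h
    print(h, round(y[mask].mean() / expected[mask].mean(), 2))
```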
Addressing and Presenting Quality of Satellite Data via Web-Based Services
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory; Lynnes, C.; Ahmad, S.; Fox, P.; Zednik, S.; West, P.
2011-01-01
With the recent attention to climate change and the proliferation of remote-sensing data utilization, climate models and various environmental monitoring and protection applications have begun to rely increasingly on satellite measurements. Research application users seek good quality satellite data, with uncertainties and biases provided for each data point. However, different communities address remote sensing quality issues rather inconsistently and differently. We describe our attempt to systematically characterize, capture, and provision quality and uncertainty information as it applies to the NASA MODIS Aerosol Optical Depth data product. In particular, we note the semantic differences in quality/bias/uncertainty at the pixel, granule, product, and record levels. We outline the various factors contributing to the uncertainty or error budget. Web-based science analysis and processing tools allow users to access, analyze, and generate visualizations of data while relieving them of having to directly manage complex data processing operations. These tools provide value by streamlining the data analysis process, but usually shield users from details of the data processing steps, algorithm assumptions, caveats, etc. Correct interpretation of the final analysis requires user understanding of how the data have been generated and processed and what potential biases, anomalies, or errors may have been introduced. By providing services that leverage data lineage provenance and domain expertise, expert systems can be built to aid the user in understanding data sources, processing, and the suitability for use of products generated by the tools. We describe our experiences developing a semantic, provenance-aware, expert-knowledge advisory system applied to the NASA Giovanni web-based Earth science data analysis tool as part of the ESTO AIST-funded Multi-sensor Data Synergy Advisor project.
Microstructural Influence on Mechanical Properties in Plasma Microwelding of Ti6Al4V Alloy
NASA Astrophysics Data System (ADS)
Baruah, M.; Bag, S.
2016-11-01
The complexity of joining Ti6Al4V alloy increases with reduction in sheet thickness. The present work puts emphasis on microplasma arc welding (MPAW) of 500-μm-thick Ti6Al4V alloy in butt joint configuration. Using controlled and regulated arc current, the MPAW process is specifically designed for use in joining thin sheet components over a wide range of process parameters. The weld quality is assessed by carefully controlling the process parameters and by reducing the formation of oxides. The combined effect of welding speed and current on the weld joint properties is evaluated for joining of Ti6Al4V alloy. The macro- and microstructural characterizations of the weldment by optical microscopy as well as the analysis of mechanical properties by microtensile and microhardness tests have been performed. The weld joint quality is affected by a specifically designed fixture that controls the oxidation of the joint and introduces a high cooling rate. Hence, the solidified microstructure of the welded specimen influences the mechanical properties of the joint. The butt joint of titanium alloy by MPAW at optimal process parameters is of very high quality, without any internal defects and with minimum residual distortion.
Ratcliffe, Elizabeth; Hourd, Paul; Guijarro-Leach, Juan; Rayment, Erin; Williams, David J; Thomas, Robert J
2013-01-01
Commercial regenerative medicine will require large quantities of clinical-specification human cells. The cost and quality of manufacture is notoriously difficult to control due to highly complex processes with poorly defined tolerances. As a step to overcome this, we aimed to demonstrate the use of 'quality-by-design' tools to define the operating space for economic passage of a scalable human embryonic stem cell production method with minimal cell loss. Design of experiments response surface methodology was applied to generate empirical models to predict optimal operating conditions for a unit of manufacture of a previously developed automatable and scalable human embryonic stem cell production method. Two models were defined to predict cell yield and cell recovery rate postpassage, in terms of the predictor variables of media volume, cell seeding density, media exchange and length of passage. Predicted operating conditions for maximized productivity were successfully validated. Such 'quality-by-design' type approaches to process design and optimization will be essential to reduce the risk of product failure and patient harm, and to build regulatory confidence in cell therapy manufacturing processes.
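The response-surface step can be illustrated with a toy quadratic fit over two of the cited predictor variables; the data, ranges, and coefficients are fabricated purely to show the mechanics, not the study's empirical models.

```python
# Sketch of DoE response surface methodology: fit a quadratic model of cell
# yield in media volume and seeding density, then search the fitted surface
# for the predicted optimum operating point.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
X = rng.uniform([1.0, 1e4], [4.0, 2e5], size=(30, 2))   # volume (mL), cells/cm^2
yield_true = 5 - (X[:, 0] - 2.5) ** 2 - ((X[:, 1] - 1e5) / 5e4) ** 2
y = yield_true + rng.normal(0, 0.1, 30)                 # noisy responses

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)
grid = np.array([[v, d] for v in np.linspace(1, 4, 31)
                 for d in np.linspace(1e4, 2e5, 31)])
best = grid[np.argmax(rsm.predict(grid))]
print("predicted optimum: volume~%.2f mL, density~%.0f cells/cm^2" % tuple(best))
```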
ERIC Educational Resources Information Center
Patterson, Olga
2012-01-01
Domain adaptation of natural language processing systems is challenging because it requires human expertise. While manual effort is effective in creating a high quality knowledge base, it is expensive and time consuming. Clinical text adds another layer of complexity to the task due to privacy and confidentiality restrictions that hinder the…
The Role of Key Actors in School Governance: An Italian Evidence
ERIC Educational Resources Information Center
Salvioni, Daniela; Gandini, Giuseppina; Franzoni, Simona; Gennari, Francesca
2012-01-01
The greater awareness of the role of key actors in the school governance processes and the need to expect a "new leader" in the increasing school complexity are essential conditions to reform the schools from within, so as to provide them with skills related to globalisation, improvement to the educational quality, strengthening of…
ERIC Educational Resources Information Center
Makopoulou, Kyriaki
2018-01-01
Background: Research evidence on what makes CPD effective is accumulating. Yet, fundamental questions remain about the specific features that lead to programme success. Furthermore, very little research investigates the nature and quality of CPD providers' (tutors) practices. Taking a closer look at how CPD providers support teachers to learn is…
Riis, Allan; Jensen, Cathrine Elgaard; Bro, Flemming; Maindal, Helle Terkildsen; Petersen, Karin Dam; Jensen, Martin Bach
2013-10-20
Evidence-based clinical practice guidelines may improve treatment quality, but the uptake of guideline recommendations is often incomplete and slow. New low back pain guidelines have recently been launched in Denmark and are expected to reduce personal and public costs. The aim of this study is to evaluate whether a complex, multifaceted implementation strategy for the low back pain guidelines will reduce secondary care referral and improve patient outcomes compared to the usual simple implementation strategy. In a two-armed cluster randomised trial, 100 general practices (clusters) and 2,700 patients aged 18 to 65 years from the North Denmark Region will be included. Practices are randomly allocated 1:1 to a simple or a complex implementation strategy. Intervention practices will receive a complex implementation strategy, including guideline facilitator visits, stratification tools, and quality reports on low back pain treatment. The primary outcome is referral to secondary care. Secondary outcomes are pain, physical function, health-related quality of life, patient satisfaction with care and treatment outcome, employment status, and sick leave. Primary and secondary outcomes pertain to the patient level. Assessments of outcomes are blinded and follow the intention-to-treat principle. Additionally, a process assessment will evaluate the degree to which the intervention elements are delivered as planned, as well as measure changes in beliefs and behaviours among general practitioners and patients. This study provides knowledge concerning the process and effect of an intervention to implement low back pain guidelines in general practice, and will provide insight on essential elements to include in future implementation strategies in general practice. Registered as NCT01699256 on ClinicalTrials.gov.
NASA Astrophysics Data System (ADS)
Xu, Gang; Li, Ming; Mourrain, Bernard; Rabczuk, Timon; Xu, Jinlan; Bordas, Stéphane P. A.
2018-01-01
In this paper, we propose a general framework for constructing IGA-suitable planar B-spline parameterizations from given complex CAD boundaries consisting of a set of B-spline curves. Instead of forming the computational domain by a simple boundary, planar domains with high genus and more complex boundary curves are considered. Firstly, some pre-processing operations including Bézier extraction and subdivision are performed on each boundary curve in order to generate a high-quality planar parameterization; then a robust planar domain partition framework is proposed to construct high-quality patch-meshing results with few singularities from the discrete boundary formed by connecting the end points of the resulting boundary segments. After the topology information generation of quadrilateral decomposition, the optimal placement of interior Bézier curves corresponding to the interior edges of the quadrangulation is constructed by a global optimization method to achieve a patch-partition with high quality. Finally, after the imposition of C1/G1-continuity constraints on the interface of neighboring Bézier patches with respect to each quad in the quadrangulation, the high-quality Bézier patch parameterization is obtained by a C1-constrained local optimization method to achieve uniform and orthogonal iso-parametric structures while keeping the continuity conditions between patches. The efficiency and robustness of the proposed method are demonstrated by several examples which are compared to results obtained by the skeleton-based parameterization approach.
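As one concrete ingredient, the Bézier subdivision used in the pre-processing step above can be done with de Casteljau's algorithm; the cubic control points below are arbitrary.

```python
# De Casteljau subdivision: split one Bézier curve at parameter t into two
# curves whose control polygons are read off the triangular scheme.
def de_casteljau_split(ctrl, t=0.5):
    """Return the control polygons of the left and right halves."""
    left, right = [ctrl[0]], [ctrl[-1]]
    pts = ctrl
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
        left.append(pts[0])
        right.append(pts[-1])
    return left, right[::-1]

cubic = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
left, right = de_casteljau_split(cubic)
print(left[-1] == right[0])   # True: the halves join at the split point
```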
Oceanic forcing of coral reefs.
Lowe, Ryan J; Falter, James L
2015-01-01
Although the oceans play a fundamental role in shaping the distribution and function of coral reefs worldwide, a modern understanding of the complex interactions between ocean and reef processes is still only emerging. These dynamics are especially challenging owing to both the broad range of spatial scales (less than a meter to hundreds of kilometers) and the complex physical and biological feedbacks involved. Here, we review recent advances in our understanding of these processes, ranging from the small-scale mechanics of flow around coral communities and their influence on nutrient exchange to larger, reef-scale patterns of wave- and tide-driven circulation and their effects on reef water quality and perceived rates of metabolism. We also examine regional-scale drivers of reefs such as coastal upwelling, internal waves, and extreme disturbances such as cyclones. Our goal is to show how a wide range of ocean-driven processes ultimately shape the growth and metabolism of coral reefs.
Blood Sampling and Preparation Procedures for Proteomic Biomarker Studies of Psychiatric Disorders.
Guest, Paul C; Rahmoune, Hassan
2017-01-01
A major challenge in proteomic biomarker discovery and validation for psychiatric diseases is the inherent biological complexity underlying these conditions. There are also many technical issues which hinder this process such as the lack of standardization in sampling, processing and storage of bio-samples in preclinical and clinical settings. This chapter describes a reproducible procedure for sampling blood serum and plasma that is specifically designed for maximizing data quality output in two-dimensional gel electrophoresis, multiplex immunoassay and mass spectrometry profiling studies.
Jennings, Larissa; Bertrand, Jane; Rech, Dino; Harvey, Steven A.; Hatzold, Karin; Samkange, Christopher A.; Omondi Aduda, Dickens S.; Fimbo, Bennett; Cherutich, Peter; Perry, Linnea; Castor, Delivette; Njeuhmeli, Emmanuel
2014-01-01
Background: The rapid expansion of voluntary medical male circumcision (VMMC) has raised concerns whether health systems can deliver and sustain VMMC according to minimum quality criteria. Methods and Findings: A comparative process evaluation was used to examine data from SYMMACS, the Systematic Monitoring of the Voluntary Medical Male Circumcision Scale-Up, among health facilities providing VMMC across two years of program scale-up. Site-level assessments examined the availability of guidelines, supplies and equipment, infection control, and continuity of care services. Direct observation of VMMC surgeries was used to assess care quality. Two-sample tests of proportions and t-tests were used to examine differences in the percent of facilities meeting requisite preparedness standards and the mean number of directly observed surgical tasks performed correctly. Results showed that safe, high-quality VMMC can be implemented and sustained at scale, although substantial variability was observed over time. In some settings, facility preparedness and VMMC service quality improved as the number of VMMC facilities increased. Yet, lapses in high performance and expansion of considerably deficient services were also observed. Surgical tasks had the highest quality scores, with lower performance levels in infection control, pre-operative examinations, and post-operative patient monitoring and counseling. The range of scale-up models used across countries additionally underscored the complexity of delivering high-quality VMMC. Conclusions: Greater efforts are needed to integrate VMMC scale-up and quality improvement processes in sub-Saharan African settings. Monitoring of service quality, not just adverse events reporting, will be essential in realizing the full health impact of VMMC for HIV prevention. PMID:24801073
Sorge, John P; Harmon, C Reid; Sherman, Susan M; Baillie, E Eugene
2005-07-01
We used data management software to compare pathology report data concerning regional lymph node sampling for colorectal carcinoma from 2 institutions using different dissection methods. Data were retrieved from 2 disparate anatomic pathology information systems for all cases of colorectal carcinoma in 2003 involving the ascending and descending colon. Initial sorting of the data included overall lymph node recovery to assess differences between the dissection methods at the 2 institutions. Additional segregation of the data was used to challenge the application's capability of accurately addressing the complexity of the process. This software approach can be used to evaluate data from disparate computer systems, and we demonstrate how an automated function can enable institutions to compare internal pathologic assessment processes and the results of those comparisons. The use of this process has future implications for pathology quality assurance in other areas.
Inference or Enaction? The Impact of Genre on the Narrative Processing of Other Minds
Carney, James; Wlodarski, Rafael; Dunbar, Robin
2014-01-01
Do narratives shape how humans process other minds or do they presuppose an existing theory of mind? This study experimentally investigated this problem by assessing subject responses to systematic alterations in the genre, levels of intentionality, and linguistic complexity of narratives. It showed that the interaction of genre and intentionality level are crucial in determining how narratives are cognitively processed. Specifically, genres that deployed evolutionarily familiar scenarios (relationship stories) were rated as being higher in quality when levels of intentionality were increased; conversely, stories that lacked evolutionary familiarity (espionage stories) were rated as being lower in quality with increases in intentionality level. Overall, the study showed that narrative is not solely either the origin or the product of our intuitions about other minds; instead, different genres will have different—even opposite—effects on how we understand the mind states of others. PMID:25470279
Natural language processing in an intelligent writing strategy tutoring system.
McNamara, Danielle S; Crossley, Scott A; Roscoe, Rod
2013-06-01
The Writing Pal is an intelligent tutoring system that provides writing strategy training. A large part of its artificial intelligence resides in the natural language processing algorithms to assess essay quality and guide feedback to students. Because writing is often highly nuanced and subjective, the development of these algorithms must consider a broad array of linguistic, rhetorical, and contextual features. This study assesses the potential for computational indices to predict human ratings of essay quality. Past studies have demonstrated that linguistic indices related to lexical diversity, word frequency, and syntactic complexity are significant predictors of human judgments of essay quality but that indices of cohesion are not. The present study extends prior work by including a larger data sample and an expanded set of indices to assess new lexical, syntactic, cohesion, rhetorical, and reading ease indices. Three models were assessed. The model reported by McNamara, Crossley, and McCarthy (Written Communication 27:57-86, 2010) including three indices of lexical diversity, word frequency, and syntactic complexity accounted for only 6% of the variance in the larger data set. A regression model including the full set of indices examined in prior studies of writing predicted 38% of the variance in human scores of essay quality with 91% adjacent accuracy (i.e., within 1 point). A regression model that also included new indices related to rhetoric and cohesion predicted 44% of the variance with 94% adjacent accuracy. The new indices increased accuracy but, more importantly, afford the means to provide more meaningful feedback in the context of a writing tutoring system.
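The modeling set-up (regressing human essay scores on linguistic indices and reporting adjacent accuracy) can be sketched as follows; the index matrix is random placeholder data, not Coh-Metrix output, and the score scale is assumed.

```python
# Sketch: predict human essay scores from linguistic indices, then report
# exact and adjacent (within 1 point) agreement of the rounded predictions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 300
indices = rng.normal(size=(n, 5))   # e.g., lexical diversity, word frequency...
human = np.clip(np.rint(3.5 + indices @ [0.5, -0.3, 0.4, 0.2, 0.1]
                        + rng.normal(0, 0.7, n)), 1, 6)

model = LinearRegression().fit(indices, human)
pred = np.rint(model.predict(indices))
exact = (pred == human).mean()
adjacent = (np.abs(pred - human) <= 1).mean()
print(f"R^2={model.score(indices, human):.2f}, "
      f"exact={exact:.0%}, adjacent={adjacent:.0%}")
```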
The role of the insula in intuitive expert bug detection in computer code: an fMRI study.
Castelhano, Joao; Duarte, Isabel C; Ferreira, Carlos; Duraes, Joao; Madeira, Henrique; Castelo-Branco, Miguel
2018-05-09
Software programming is a complex and relatively recent human activity, involving the integration of mathematical, recursive thinking and language processing. The neural correlates of this recent human activity are still poorly understood. Error monitoring during this type of task, requiring the integration of language, logical symbol manipulation and other mathematical skills, is particularly challenging. We therefore aimed to investigate the neural correlates of decision-making during source code understanding and mental manipulation in professional participants with high expertise. The present fMRI study directly addressed error monitoring during source code comprehension, expert bug detection and decision-making. We used C code, which triggers the same sort of processing irrespective of the native language of the programmer. We discovered a distinct role for the insula in bug monitoring and detection and a novel connectivity pattern that goes beyond the expected activation pattern evoked by source code understanding in semantic language and mathematical processing regions. Importantly, insula activity levels were critically related to the quality of error detection, involving intuition, as signalled by reported initial bug suspicion, prior to final decision and bug detection. Activity in this salience network (SN) region evoked by bug suspicion was predictive of bug detection precision, suggesting that it encodes the quality of the behavioral evidence. Connectivity analysis provided evidence for top-down circuit "reutilization" stemming from anterior cingulate cortex (BA32), a core region in the SN that evolved for complex error monitoring such as required for this type of recent human activity. Cingulate (BA32) and anterolateral (BA10) frontal regions causally modulated decision processes in the insula, which in turn was related to activity of math processing regions in early parietal cortex. In other words, earlier brain regions used during evolution for other functions seem to be reutilized in a top-down manner for a new complex function, in an analogous manner as described for other cultural creations such as reading and literacy.
Welzenbach, Julia; Neuhoff, Christiane; Looft, Christian; Schellander, Karl; Tholen, Ernst; Große-Brinkhaus, Christine
2016-01-01
The aim of this study was to elucidate the underlying biochemical processes to identify potential key molecules of meat quality traits drip loss, pH of meat 1 h post-mortem (pH1), pH in meat 24 h post-mortem (pH24) and meat color. An untargeted metabolomics approach detected the profiles of 393 annotated and 1,600 unknown metabolites in 97 Duroc × Pietrain pigs. Despite obvious differences regarding the statistical approaches, the four applied methods, namely correlation analysis, principal component analysis, weighted network analysis (WNA) and random forest regression (RFR), revealed mainly concordant results. Our findings lead to the conclusion that meat quality traits pH1, pH24 and color are strongly influenced by processes of post-mortem energy metabolism like glycolysis and pentose phosphate pathway, whereas drip loss is significantly associated with metabolites of lipid metabolism. In case of drip loss, RFR was the most suitable method to identify reliable biomarkers and to predict the phenotype based on metabolites. On the other hand, WNA provides the best parameters to investigate the metabolite interactions and to clarify the complex molecular background of meat quality traits. In summary, it was possible to attain findings on the interaction of meat quality traits and their underlying biochemical processes. The detected key metabolites might be better indicators of meat quality especially of drip loss than the measured phenotype itself and potentially might be used as bio indicators. PMID:26919205
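The random forest regression step, which the authors found most suitable for drip loss, can be sketched generically; the metabolite matrix below is simulated and the variable ranking is purely illustrative.

```python
# Sketch: predict drip loss from metabolite intensities with a random forest
# and rank candidate biomarker metabolites by feature importance.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n_pigs, n_metabolites = 97, 50
X = rng.lognormal(size=(n_pigs, n_metabolites))      # metabolite intensities
drip_loss = (2.0 + 0.8 * np.log(X[:, 0]) - 0.5 * np.log(X[:, 1])
             + rng.normal(0, 0.3, n_pigs))           # simulated phenotype

rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X, drip_loss)
top = np.argsort(rf.feature_importances_)[::-1][:3]
print("OOB R^2:", round(rf.oob_score_, 2), "top metabolites:", top)
```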
NASA Astrophysics Data System (ADS)
Lai, L.; Tzortziou, M.; Gilerson, A.; Foster, R.
2013-12-01
Dissolved Organic Matter (DOM) and its colored component (CDOM) are sensitive indicators of environmental pollution, nutrient enrichment and water quality, and play a key role in a broad range of processes and climate-related biogeochemical cycles in estuarine and coastal ecosystems. Because of its strong influence on ocean color, CDOM can provide an invaluable optical tool for coastal zone environmental assessment from space. There is a continuous cycle of sources and sinks of CDOM from terrestrial sources to the wetlands, to the estuaries, and to the ocean waters. Terrestrial inputs from natural processes, anthropogenic activities, exchanges with the atmosphere, rich biodiversity and high primary productivity, and physical, photochemical and microbial processes affect not only the amount but also the quality and optical signature of CDOM in near-shore waters. In this study, new measurements are presented of the optical characteristics of CDOM collected from the Chesapeake Bay estuarine environment. Measured parameters include absorption spectra, estimated spectral slopes, slope ratios, DOC-specific CDOM absorption, as well as 3D CDOM fluorescence emission-excitation matrices. These results provide insight into the measured CDOM in this complex environment and the complex processes that affect CDOM quality and amount during transport to the estuary and coastal ocean. New field campaigns will be conducted in August and September in the Chesapeake Bay estuary and on the coast of the Gulf of Mexico to collect more samples for analysis of CDOM dynamics and to link field observations and measurements to satellite ocean color retrievals of estuarine biogeochemical processes. In addition, advanced satellite CDOM data distribution and usage is discussed, as it has considerable operational value and practical application beyond the scientific community and research. Keywords: CDOM, carbon dynamics, estuaries, coastal ecosystems, optical properties, satellite applications, data distribution
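One of the measured parameters, the CDOM spectral slope S, is conventionally obtained by fitting the exponential model a(λ) = a(λ0)·exp(-S(λ - λ0)) to the absorption spectrum. A sketch with a synthetic spectrum follows; the fitting window and noise model are assumptions.

```python
# Fit the standard exponential CDOM absorption model in log space to recover
# the spectral slope S and the reference absorption a(300).
import numpy as np

wl = np.arange(300, 501, 5, dtype=float)          # wavelengths (nm)
a_true = 1.5 * np.exp(-0.018 * (wl - 300))        # a(300)=1.5 1/m, S=0.018 1/nm
a_meas = a_true * np.exp(np.random.default_rng(4).normal(0, 0.01, wl.size))

# Linear fit of log(a) vs. (wl - 300) over the chosen window (300-500 nm).
slope, intercept = np.polyfit(wl - 300, np.log(a_meas), 1)
print(f"S ~ {-slope:.4f} nm^-1, a(300) ~ {np.exp(intercept):.2f} m^-1")
```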
A service for the application of data quality information to NASA earth science satellite records
NASA Astrophysics Data System (ADS)
Armstrong, E. M.; Xing, Z.; Fry, C.; Khalsa, S. J. S.; Huang, T.; Chen, G.; Chin, T. M.; Alarcon, C.
2016-12-01
A recurring demand in working with satellite-based earth science data records is the need to apply data quality information. Such quality information is often contained within the data files as an array of "flags", but can also be represented by more complex quality descriptions such as combinations of bit flags, or even other ancillary variables that can be applied as thresholds to the geophysical variable of interest. For example, with Level 2 granules from the Group for High Resolution Sea Surface Temperature (GHRSST) project, up to 6 independent variables could be used to screen the sea surface temperature measurements on a pixel-by-pixel basis. Quality screening of Level 3 data from the Soil Moisture Active Passive (SMAP) instrument can become even more complex, involving 161 unique bit states or conditions a user can screen for. The application of quality information is often a laborious process for the user until they understand the implications of all the flags and bit conditions, and it requires iterative approaches using custom software. The Virtual Quality Screening Service, a NASA ACCESS project, is addressing these issues and concerns. The project has developed an infrastructure to expose, apply, and extract quality screening information, building off known and proven NASA components for data extraction and subset-by-value, data discovery, and exposure to the user of granule-based quality information. Further sharing of results through well-defined URLs and web service specifications has also been implemented. The presentation will focus on an overall description of the technologies and informatics principles employed by the project. Examples of implementations of the end-to-end web service for quality screening with GHRSST and SMAP granules will be demonstrated.
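Pixel-level screening with bit flags, of the kind the service automates, reduces to masking on bitwise tests. The bit positions and names below are invented for illustration and do not correspond to the actual GHRSST or SMAP flag definitions.

```python
# Mask geophysical values wherever any "bad" quality bit is set.
import numpy as np

BAD_BITS = (1 << 0) | (1 << 3)        # e.g., bit 0 = land, bit 3 = cloud (assumed)
quality = np.array([[0b0000, 0b0001],
                    [0b1000, 0b0100]], dtype=np.uint8)
sst = np.array([[290.1, 291.0],
                [289.5, 292.3]])

mask = (quality & BAD_BITS) != 0      # True where any screening bit is set
screened = np.where(mask, np.nan, sst)
print(screened)
```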
NASA Astrophysics Data System (ADS)
Taboada, B.; Vega-Alvarado, L.; Córdova-Aguilar, M. S.; Galindo, E.; Corkidi, G.
2006-09-01
Characterization of multiphase systems occurring in fermentation processes is a time-consuming and tedious process when manual methods are used. This work describes a new semi-automatic methodology for the on-line assessment of the diameters of oil drops and air bubbles occurring in a complex simulated fermentation broth. High-quality digital images were obtained from the interior of a mechanically stirred tank. These images were pre-processed to find segments of edges belonging to the objects of interest. The contours of air bubbles and oil drops were then reconstructed using an improved Hough transform algorithm, which was tested in two-, three- and four-phase simulated fermentation model systems. The results were compared against those obtained manually by a trained observer, showing no significant statistical differences. The method was able to reduce the total processing time for the measurements of bubbles and drops in different systems by 21-50% and the manual intervention time for the segmentation procedure by 80-100%.
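A rough flavor of Hough-based circle detection is sketched below using OpenCV's stock HoughCircles on a synthetic image, not the authors' improved edge-segment variant; all parameters are illustrative.

```python
# Detect circular bubble/drop outlines and report their diameters in pixels.
import cv2
import numpy as np

img = np.zeros((200, 200), dtype=np.uint8)
cv2.circle(img, (60, 60), 20, 255, 2)      # an "air bubble" outline
cv2.circle(img, (140, 120), 35, 255, 2)    # an "oil drop" outline

blur = cv2.GaussianBlur(img, (5, 5), 1.5)
circles = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1, minDist=30,
                           param1=100, param2=20, minRadius=10, maxRadius=50)
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"center=({x},{y}) diameter={2 * r} px")
```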
Abraham, Sushil; Bain, David; Bowers, John; Larivee, Victor; Leira, Francisco; Xie, Jasmina
2015-01-01
The technology transfer of biological products is a complex process requiring control of multiple unit operations and parameters to ensure product quality and process performance. To achieve product commercialization, the technology transfer sending unit must successfully transfer knowledge about both the product and the process to the receiving unit. A key strategy for maximizing successful scale-up and transfer efforts is the effective use of engineering and shake-down runs to confirm operational performance and product quality prior to embarking on good manufacturing practice runs such as process performance qualification runs. We discuss key factors to consider in making the decision to perform shake-down or engineering runs. We also present industry benchmarking results on how engineering runs are used in drug substance technology transfers, alongside the main themes and best practices that have emerged. Our goal is to provide companies with a framework for ensuring "right first time" technology transfers with effective deployment of resources within increasingly aggressive timeline constraints. © PDA, Inc. 2015.
Face-to-face handoff: improving transfer to the pediatric intensive care unit after cardiac surgery.
Vergales, Jeffrey; Addison, Nancy; Vendittelli, Analise; Nicholson, Evelyn; Carver, D Jeannean; Stemland, Christopher; Hoke, Tracey; Gangemi, James
2015-01-01
The goal was to develop and implement a comprehensive, primarily face-to-face handoff process that begins in the operating room and concludes at the bedside in the intensive care unit (ICU) for pediatric patients undergoing congenital heart surgery. Involving all stakeholders in the planning phase, the framework of the handoff system encompassed a combination of a formalized handoff tool, focused process steps that occurred prior to patient arrival in the ICU, and an emphasis on face-to-face communication at the conclusion of the handoff. The final process was evaluated by the use of observer checklists to examine quality metrics and timing for all patients admitted to the ICU following cardiac surgery. The process was found to improve how various providers view the efficiency of handoff, the ease of asking questions at each step, and the overall capability to improve patient care regardless of overall surgical complexity. © 2014 by the American College of Medical Quality.
Blackwell, Rebecca Wright Née; Lowton, Karen; Robert, Glenn; Grudzen, Corita; Grocott, Patricia
2017-03-01
Increasing use of emergency departments among older patients with palliative needs has led to the development of several service-level interventions intended to improve care quality. There is little evidence of patient and family involvement in developmental processes, and little is known about the experiences of - and preferences for - palliative care delivery in this setting. Participatory action research seeking to enable collaborative working between patients and staff should enhance the impact of local quality improvement work but has not been widely implemented in such a complex setting. The aim was to critique the feasibility of this methodology as a quality improvement intervention in complex healthcare settings, laying a foundation for future work. The setting was an Emergency Department in a large teaching hospital in the United Kingdom. The method was Experience-based Co-design incorporating: 150 h of nonparticipant observation; semi-structured interviews with 15 staff members about their experiences of palliative care delivery; 5 focus groups with 64 staff members to explore challenges in delivering palliative care; 10 filmed semi-structured interviews with palliative care patients or their family members; and a co-design event involving staff, patients and family members. The study successfully identified quality improvement priorities leading to changes in Emergency Department-palliative care processes. Further outputs were the creation of a patient-family-staff experience training DVD to encourage reflective discussion and the identification and application of generic design principles for improving palliative care in the Emergency Department. There were benefits and challenges associated with using Experience-based Co-design in this setting. Benefits included the flexibility of the approach, the high levels of engagement and responsiveness of patients, families and staff, and the impact of using filmed narrative interviews to enhance the 'voice' of seldom-heard patients and families. Challenges included high levels of staff turnover during the 19-month project, significant time constraints in the Emergency Department and the ability of older patients and their families to fully participate in the co-design process. Experience-based Co-design is a useful approach for encouraging collaborative working between vulnerable patients, family and staff in complex healthcare environments. The flexibility of the approach allows the specific needs of participants to be accounted for, enabling fuller engagement with those who typically may not be invited to contribute to quality improvement work. Recommendations for future studies in this and similar settings include testing the 'accelerated' form of the approach and experimenting with alternative ways of increasing involvement of patients/families in the co-design phase. Copyright © 2017 Elsevier Ltd. All rights reserved.
Exploring the concept of quality care for the person who is dying.
Stefanou, Nichola; Faircloth, Sandra
2010-12-01
The concept of good quality care for the patient who is dying is diverse and complex. Many of the actions being taken to increase the quality of care of the dying patient are based around outcome, uniformity of service and standardization of process. There are two main areas referred to when dealing with care of the dying patient: end-of-life care and palliative care. High-quality end-of-life care is increasingly recognized as an ethical obligation of health-care providers, clinicians and organizations, and yet there appears to be little evidence from the patients' perspective. There are many national and local initiatives taking place to improve the quality of care people receive towards the end of their life. This being said, initiatives alone will not achieve good quality care and deliver good patient experiences. Only clinicians working at the front line can truly influence the way in which quality is improved and good experiences delivered.
Adherence to outpatient epilepsy quality indicators at a tertiary epilepsy center.
Pourdeyhimi, R; Wolf, B J; Simpson, A N; Martz, G U
2014-10-01
Quality indicators for the treatment of people with epilepsy were published in 2010. This is the first report of adherence to all measures in routine care of people with epilepsy at a level 4 comprehensive epilepsy center in the US. Two hundred patients with epilepsy were randomly selected from the clinics of our comprehensive epilepsy center, and all visits during 2011 were abstracted for documentation of adherence to the eight quality indicators. Alternative measures were constructed to evaluate failure of adherence. Detailed descriptions of all equations are provided. Objective measures (EEG, imaging) showed higher adherence than counseling measures (safety). Initial visits showed higher adherence. Variations in the interpretation of the quality measure result in different adherence values. Advanced practice providers and physicians had different adherence patterns. No patient-specific patterns of adherence were seen. This is the first report of adherence to all the epilepsy quality indicators for a sample of patients during routine care in a level 4 epilepsy center in the US. Overall adherence was similar to that previously reported on similar measures. Precise definitions of adherence equations are essential for accurate measurement. Complex measures result in lower adherence. Counseling measures showed low adherence, possibly highlighting a difference between practice and documentation. Adherence to the measures as written does not guarantee high quality care. The current quality indicators have value in the process of improving quality of care. Future approaches may be refined to eliminate complex measures and incorporate features linked to outcomes. Copyright © 2014 Elsevier Inc. All rights reserved.
Data Quality Verification at STScI - Automated Assessment and Your Data
NASA Astrophysics Data System (ADS)
Dempsey, R.; Swade, D.; Scott, J.; Hamilton, F.; Holm, A.
1996-12-01
As satellite-based observatories improve their ability to deliver wider varieties and more complex types of scientific data, so too does the process of analyzing and reducing these data grow in complexity. It becomes correspondingly imperative that Guest Observers or Archival Researchers have access to an accurate, consistent, and easily understandable summary of the quality of their data. Previously, at the STScI, an astronomer would display and examine the quality and scientific usefulness of every single observation obtained with HST. Recently, this process has undergone a major reorganization at the Institute. A major part of the new process is that the majority of data are assessed automatically with little or no human intervention. As part of routine processing in the OSS-PODPS Unified System (OPUS), the Observatory Monitoring System (OMS) observation logs, the science processing trailer file (also known as the TRL file), and the science data headers are inspected by an automated tool, AUTO_DQ. AUTO_DQ then determines whether any anomalous events occurred during the observation or through processing and calibration of the data that affect the procedural quality of the data. The results are placed directly into the Procedural Data Quality (PDQ) file as a string of predefined data quality keywords and comments. These in turn are used by the Contact Scientist (CS) to check the scientific usefulness of the observations. In this manner, the telemetry stream is checked for known problems such as losses of lock, re-centerings, or degraded guiding, for example, while missing data or calibration errors are also easily flagged. If the problem is serious, the data are then queued for manual inspection by an astronomer. The success of every target acquisition is verified manually. If serious failures are confirmed, the PI and the scheduling staff are notified so that options for rescheduling the observations can be explored.
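A toy version of such automated procedural quality assessment might look like the following; the keyword names and rules are invented stand-ins, not the actual OPUS/PDQ vocabulary.

```python
# Scan observation-log/header values for known anomalies and emit
# data-quality keywords; serious findings route the data to a human.
def assess_quality(header):
    findings = []
    if header.get("LOSS_OF_LOCK", False):
        findings.append(("GSFAIL", "guide star lock lost during exposure"))
    if header.get("RECENTERINGS", 0) > 0:
        findings.append(("RECENTER", f"{header['RECENTERINGS']} re-centering event(s)"))
    if header.get("DATA_GAPS", 0) > 0:
        findings.append(("DATALOST", "telemetry dropouts; data missing"))
    status = "OK" if not findings else "CHECK"   # CHECK -> manual inspection
    return status, findings

status, findings = assess_quality(
    {"LOSS_OF_LOCK": False, "RECENTERINGS": 2, "DATA_GAPS": 0})
print(status, findings)
```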
Adaptive identifier for uncertain complex nonlinear systems based on continuous neural networks.
Alfaro-Ponce, Mariel; Cruz, Amadeo Argüelles; Chairez, Isaac
2014-03-01
This paper presents the design of a complex-valued differential neural network identifier for uncertain nonlinear systems defined in the complex domain. This design includes the construction of an adaptive algorithm to adjust the parameters included in the identifier. The algorithm is obtained based on a special class of controlled Lyapunov functions. The quality of the identification process is characterized using the practical stability framework. Indeed, the region where the identification error converges is derived by the same Lyapunov method. This zone is defined by the power of uncertainties and perturbations affecting the complex-valued uncertain dynamics. Moreover, this convergence zone is reduced to its lowest possible value using ideas related to the so-called ellipsoid methodology. Two simple but informative numerical examples are developed to show how the identifier proposed in this paper can be used to approximate uncertain nonlinear systems valued in the complex domain.
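A toy analogue in a single complex dimension conveys the idea: an identifier copies the plant structure and adapts its parameter with a Lyapunov-style gradient law driven by the identification error. The gains, input, and first-order dynamics below are assumptions for illustration, not the paper's differential neural network design.

```python
# Adaptive identification of one unknown complex parameter via Euler steps.
import numpy as np

dt, steps = 1e-3, 20000
a_true = -1.0 + 0.5j                 # unknown complex parameter
x, x_hat, a_hat = 0.1 + 0j, 0j, 0j
k, gamma = 5.0, 50.0                 # observer gain, adaptation gain

for t in range(steps):
    u = np.exp(1j * 10.0 * t * dt)                  # persistently exciting input
    x = x + dt * (a_true * x + u)                   # plant step
    e = x_hat - x                                   # identification error
    x_hat = x_hat + dt * (a_hat * x + u - k * e)    # identifier step
    a_hat = a_hat + dt * (-gamma * e * np.conj(x))  # adaptive law

print("estimated:", np.round(a_hat, 3), " true:", a_true)
```

With the Lyapunov function V = |e|^2 + |a_hat - a_true|^2 / gamma, the cross terms cancel and dV/dt = -2k|e|^2 <= 0, which is the style of argument the abstract alludes to for deriving the adaptive algorithm and its convergence zone.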
Ishii, Lisa; Pronovost, Peter J; Demski, Renee; Wylie, Gill; Zenilman, Michael
2016-06-01
An increasing volume of ambulatory surgeries has led to an increase in the number of ambulatory surgery centers (ASCs). Some academic health systems have aligned with ASCs to create a more integrated care delivery system. Yet, these centers are diverse in many areas, including specialty types, ownership models, management, physician employment, and regulatory oversight. Academic health systems then face challenges in integrating these ASCs into their organizations. Johns Hopkins Medicine created the Ambulatory Surgery Coordinating Council in 2014 to manage, standardize, and promote peer learning among its eight ASCs. The Armstrong Institute for Patient Safety and Quality provided support and a model for this organization through its quality management infrastructure. The physician-led council defined a mission and created goals to identify best practices, uniformly provide the highest-quality patient-centered care, and continuously improve patient outcomes and experience across ASCs. Council members built trust and agreed on a standardized patient safety and quality dashboard to report measures that include regulatory, care process, patient experience, and outcomes data. The council addressed unintentional outcomes and process variation across the system and agreed to standard approaches to optimize quality. Council members also developed a process for identifying future goals, standardizing care practices and electronic medical record documentation, and creating quality and safety policies. The early success of the council supports the continuation of the Armstrong Institute model for physician-led quality management. Other academic health systems can learn from this model as they integrate ASCs into their complex organizations.
Understanding current steam sterilization recommendations and guidelines.
Spry, Cynthia
2008-10-01
Processing surgical instruments in preparation for surgery is a complex multistep practice. It is impractical to culture each and every item to determine sterility; therefore, the best assurance of a sterile product is careful execution of every step in the process coupled with an ongoing quality control program. Perioperative staff nurses and managers responsible for instrument processing, whether for a single instrument or multiple sets, must be knowledgeable with regard to cleaning; packaging; cycle selection; and the use of physical, chemical, and biological monitors. Nurses also should be able to resolve issues related to loaner sets, flash sterilization, and extended cycles.
Groundwater ages and mixing in the Piceance Basin natural gas province, Colorado
McMahon, Peter B.; Thomas, Judith C.; Hunt, Andrew G.
2013-01-01
Reliably identifying the effects of energy development on groundwater quality can be difficult because baseline assessments of water quality completed before the onset of energy development are rare and because interactions between hydrocarbon reservoirs and aquifers can be complex, involving both natural and human processes. Groundwater age and mixing data can strengthen interpretations of monitoring data from those areas by providing a better understanding of the groundwater flow systems. Chemical, isotopic, and age tracers were used to characterize groundwater ages and mixing with deeper saline water in three areas of the Piceance Basin natural gas province. The data revealed a complex array of groundwater ages (up to 50,000 years) and mixing patterns in the basin that helped explain concentrations and sources of methane in groundwater. Age and mixing data can also strengthen the design of monitoring programs by providing information on the time scales at which water quality changes in aquifers might be expected to occur. This information could be used to establish maximum allowable distances of monitoring wells from energy development activity and the appropriate duration of monitoring.
Toward edge minability for role mining in bipartite networks
NASA Astrophysics Data System (ADS)
Dong, Lijun; Wang, Yi; Liu, Ran; Pi, Benjie; Wu, Liuyi
2016-11-01
Bipartite network models have been extensively used in information security to automatically generate role-based access control (RBAC) from datasets. This process is called role mining. However, not all topologies of bipartite networks are suitable for role mining; some edges may even reduce the quality of role mining. This causes unnecessary time consumption, as role mining is NP-hard. Therefore, to promote the quality of role mining results, the capability of an edge to compose roles with other edges, called the minability of the edge, needs to be identified. We tackle the problem from the angle of edge importance in complex networks; that is, an edge easily covered by roles is considered to be more important. Based on this idea, the k-shell decomposition of complex networks is extended to reveal the different minability of edges. In this way, a bipartite network can be quickly purified by excluding the low-minability edges from role mining, and thus the quality of role mining can be effectively improved. Extensive experiments on real-world datasets are conducted to confirm the above claims.
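A simplified sketch of ranking edges by a k-shell-style index in a user-permission bipartite graph is shown below; scoring each edge by the smaller core number of its endpoints and the pruning threshold are assumed simplifications of the paper's extended decomposition.

```python
# Score edges by endpoint k-shell indices and prune low-minability edges
# before role mining.
import networkx as nx

B = nx.Graph()
B.add_edges_from([("u1", "p1"), ("u1", "p2"), ("u2", "p1"),
                  ("u2", "p2"), ("u3", "p3")])    # users u*, permissions p*

core = nx.core_number(B)                          # k-shell index per node
minability = {e: min(core[e[0]], core[e[1]]) for e in B.edges()}
kept = [e for e, m in minability.items() if m >= 2]
print("edges kept for role mining:", kept)        # ('u3','p3') is pruned
```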
Lange, Rogier; Ter Heine, Rob; van der Gronde, Toon; Selles, Suzanne; de Klerk, John; Bloemendal, Haiko; Hendrikse, Harry
2016-07-30
Rhenium-188-HEDP ((188)Re-HEDP) is a therapeutic radiopharmaceutical for the treatment of osteoblastic bone metastases. No standard procedure for the preparation of this radiopharmaceutical is available. Preparation conditions may influence the quality and in vivo behaviour of this product. In this study we investigate the effect of critical process parameters on the product quality and stability of (188)Re-HEDP. A stepwise approach was used, based on the quality by design (QbD) concept of the ICH Q8 (Pharmaceutical Development) guideline. Potential critical process conditions were identified. Variables tested were the elution volume, the freshness of the eluate, the reaction temperature and time, and the stability of the product upon dilution and storage. The impact of each variable on radiochemical purity was investigated. The acceptable ranges were established by boundary testing. With 2 ml eluate, adequate radiochemical purity and stability were found. Nine ml eluate yielded a product that was less stable. Using eluate stored for 24 h resulted in acceptable radiochemical purity. Complexation for 30 min at room temperature, at 60°C and at 100°C generated appropriate and stable products. A complexation time of 10 min at 90°C was too short, whereas heating for 60 min resulted in products that passed quality control and were stable. Diluting the end product and storage at 32.5°C resulted in notable decomposition. Two boundary tests, an elution volume of 9 ml and a heating time of 10 min, yielded products of inadequate quality or stability. The product was found to be unstable after dilution or when stored above room temperature. Our findings show that our previously developed preparation method falls well within the proven acceptable ranges. Applying QbD principles is feasible and worthwhile for the small-scale preparation of radiopharmaceuticals. Copyright © 2016 Elsevier B.V. All rights reserved.
Label propagation algorithm for community detection based on node importance and label influence
NASA Astrophysics Data System (ADS)
Zhang, Xian-Kun; Ren, Jing; Song, Chen; Jia, Jia; Zhang, Qian
2017-09-01
Recently, the detection of high-quality communities has become a hot topic in social network research. The label propagation algorithm (LPA) has attracted wide attention because of its linear time complexity and because it requires neither an objective function nor the number of communities to be defined in advance. However, LPA suffers from uncertainty and randomness in the label propagation process, which affect the accuracy and stability of the detected communities. For large-scale social networks, this paper proposes a novel label propagation algorithm for community detection based on node importance and label influence (LPA_NI). Experiments with comparative algorithms on real-world and synthetic networks show that LPA_NI can significantly improve the quality of community detection and shorten the iteration period, while achieving better accuracy and stability at similar complexity.
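The abstract does not give LPA_NI's exact importance and influence measures, so the sketch below is a minimal, assumption-laden variant of plain LPA: node importance is approximated by degree, and each neighbor's label vote is weighted by that importance. The update order and weighting are illustrative choices, not the authors' formulas:

    import networkx as nx
    from collections import defaultdict

    def lpa_node_importance(G, max_iter=100):
        """Label propagation with votes weighted by neighbor degree.

        Degree stands in for the paper's node-importance measure.
        """
        labels = {v: v for v in G.nodes()}            # each node starts alone
        importance = dict(G.degree())                 # proxy for importance
        order = sorted(G.nodes(), key=importance.get) # low-importance first
        for _ in range(max_iter):
            changed = False
            for v in order:
                votes = defaultdict(float)
                for u in G.neighbors(v):
                    votes[labels[u]] += importance[u]  # label influence ~ importance
                best = max(votes, key=votes.get)
                if best != labels[v]:
                    labels[v] = best
                    changed = True
            if not changed:   # converged: every node keeps its label
                break
        return labels

    G = nx.karate_club_graph()
    communities = set(lpa_node_importance(G).values())
    print(f"{len(communities)} communities found")

Weighting votes by a fixed importance score removes much of the tie-breaking randomness that makes plain LPA unstable, which is the gap the paper targets.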
The Effect of Task Complexity on the Quality of EFL Learners' Argumentative Writing
ERIC Educational Resources Information Center
Sadeghi, Karim; Mosalli, Zahra
2013-01-01
Based on Robinson's (2005) Cognition Hypothesis and Skehan and Foster's (2001) Limited Attentional Capacity Model, the current study attempted to investigate the effect of manipulating task complexity on argumentative writing quality in terms of lexical complexity, fluency, grammatical accuracy, and syntactic complexity. Task complexity was…
Salmikangas, Paula; Menezes-Ferreira, Margarida; Reischl, Ilona; Tsiftsoglou, Asterios; Kyselovic, Jan; Borg, John Joseph; Ruiz, Sol; Flory, Egbert; Trouvin, Jean-Hugues; Celis, Patrick; Ancans, Janis; Timon, Marcos; Pante, Guido; Sladowski, Dariusz; Lipnik-Stangelj, Metoda; Schneider, Christian K
2015-01-01
During the past decade, a large number of cell-based medicinal products have been tested in clinical trials for the treatment of various diseases and tissue defects. However, licensed products and those approaching marketing authorization are still few. One major area of challenge is the manufacturing and quality development of these complex products, for which significant manipulation of cells might be required. While the paradigms of quality, safety and efficacy must apply also to these innovative products, their demonstration may be demanding. Demonstration of comparability between production processes and batches may be difficult for cell-based medicinal products. Thus, the development should be built around a well-controlled manufacturing process and a qualified product to guarantee reproducible data from nonclinical and clinical studies.
Mathaes, Roman; Mahler, Hanns-Christian; Roggo, Yves; Huwyler, Joerg; Eder, Juergen; Fritsch, Kamila; Posset, Tobias; Mohl, Silke; Streubel, Alexander
2016-01-01
Capping equipment used in good manufacturing practice (GMP) manufacturing features different designs and a variety of adjustable process parameters. The overall capping result is a complex interplay of the different capping process parameters and is insufficiently described in the literature. It remains poorly studied how the different capping equipment designs and capping process parameters (e.g., pre-compression force, capping plate height, turntable rotating speed) contribute to the final residual seal force of a sealed container closure system, and how this relates to container closure integrity and other drug product quality parameters. Stopper compression measured by computed tomography correlated with residual seal force measurements. In our studies, we used different container closure system configurations from different GMP drug product fill & finish facilities to investigate the influence of differences in primary packaging, that is, vial size and rubber stopper design, on the capping process and the capped drug product. In addition, we compared two large-scale GMP capping machines and different capping equipment settings and their impact on product quality and integrity, as determined by residual seal force. The capping plate to plunger distance had a major influence on the obtained residual seal force values of a sealed vial, whereas the capping pre-compression force and the turntable rotation speed showed only a minor influence. Capping process parameters could not easily be transferred between capping equipment from different manufacturers; however, the residual seal force tester provided a valuable tool to compare the capping performance of different equipment. No vial showed any leakage greater than 10⁻⁸ mbar L/s as measured by a helium mass spectrometry system, suggesting that container closure integrity was warranted across the residual seal force range tested for the tested container closure systems. © PDA, Inc. 2016.
Nonisothermal glass molding for the cost-efficient production of precision freeform optics
NASA Astrophysics Data System (ADS)
Vu, Anh-Tuan; Kreilkamp, Holger; Dambon, Olaf; Klocke, Fritz
2016-07-01
Glass molding has become a key replication-based technology for satisfying the intensively growing demand for complex precision optics in today's photonics market. However, state-of-the-art replicative technologies are still limited, mainly because they cannot meet the requirements of mass production. This paper introduces a newly developed nonisothermal glass molding process in which a complex-shaped optic is produced in a very short process cycle. The innovative molding technology promises cost-efficient production because of increased mold lifetime, lower energy consumption, and high throughput from a fast process chain. At this early stage of process development, the research focuses on integrating finite element simulation into the process chain to reduce time and labor-intensive costs. By virtue of numerical modeling, defects including chill ripples and glass sticking in the nonisothermal molding process can be predicted and their consequent effects avoided. In addition, the influences of process parameters and glass preforms on surface quality, form accuracy, and residual stress are discussed. A series of experiments was carried out to validate the simulation results. The successful modeling therefore provides a systematic strategy for glass preform design, mold compensation, and optimization of process parameters. In conclusion, the integration of simulation into the entire nonisothermal glass molding process chain will significantly increase manufacturing efficiency and reduce the time-to-market for the mass production of complex precision yet low-cost glass optics.
[Quality assurance in intensive care: the situation in Switzerland].
Frutiger, A
1999-10-30
The movement for quality in medicine is starting to take on the dimensions of a crusade. Quite logically it has also reached the intensive care community. Due to their complex multidisciplinary functioning and because of the high costs involved, ICUs are model services reflecting the overall situation in our hospitals. The situation of Swiss intensive care is particularly interesting, because for over 25 years standards for design and staffing of Swiss ICUs have been in effect and were enforced via onsite visits by the Swiss Society of Intensive Care without government involvement. Swiss intensive care thus defined its structures long before the word "accreditation" had even been used in this context. While intensive care in Switzerland is practised in clearly defined, well equipped and adequately staffed units, much less is known about process quality and outcomes of these services. Statistics on admissions, length of stay and length of mechanical ventilation, as well as severity data based on a simple classification system, are collected nationwide and allow some limited insight into the overall process of care. Results of intensive care are not systematically assessed. In response to the constant threat of cost containment, Swiss ICUs should increasingly focus on process quality and results, while maintaining their existing good structures.
Quality control of mRNP biogenesis: networking at the transcription site.
Eberle, Andrea B; Visa, Neus
2014-08-01
Eukaryotic cells carry out quality control (QC) over the processes of RNA biogenesis to inactivate or eliminate defective transcripts, and to avoid their production. In the case of protein-coding transcripts, the quality controls can sense defects in the assembly of mRNA-protein complexes, in the processing of the precursor mRNAs, and in the sequence of open reading frames. Different types of defect are monitored by different specialized mechanisms. Some of them involve dedicated factors whose function is to identify faulty molecules and target them for degradation. Others are the result of a more subtle balance in the kinetics of opposing activities in the mRNA biogenesis pathway. One way or another, all such mechanisms hinder the expression of the defective mRNAs through processes as diverse as rapid degradation, nuclear retention and transcriptional silencing. Three major degradation systems are responsible for the destruction of the defective transcripts: the exosome, the 5'-3' exoribonucleases, and the nonsense-mediated mRNA decay (NMD) machinery. This review summarizes recent findings on the cotranscriptional quality control of mRNA biogenesis, and speculates that a protein-protein interaction network integrates multiple mRNA degradation systems with the transcription machinery. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
De Dreuzy, J. R.; Marçais, J.; Moatar, F.; Minaudo, C.; Courtois, Q.; Thomas, Z.; Longuevergne, L.; Pinay, G.
2017-12-01
Integration of hydrological and biogeochemical processes leads to emergent patterns at the catchment scale, and monitoring in rivers reflects the aggregation of these effects. While discharge time series have been measured for decades, high-frequency water quality monitoring in rivers now provides prominent measurements for characterizing the interplay between hydrological and biogeochemical processes, especially for inferring the processes that happen in the heterogeneous subsurface. However, we still lack frameworks to relate observed patterns to specific processes, because of the "organized complexity" of hydrological systems. Indeed, it is unclear what controls, for example, patterns in concentration-discharge (C/Q) relationships shaped by non-linear processes and hysteresis effects. Here we develop a computationally light, process-based model to test how the integration of different landforms (i.e., geological heterogeneities and structures, topographical features) with different biogeochemical reactivity assumptions (e.g., reactive zone locations) can shape the overall water quality time series. With numerical experiments, we investigate typical patterns in high-frequency C/Q relationships. In headwater basins, we found that the typical hysteretic patterns in C/Q relationships observed in data time series can be attributed to differences in where water and solutes are stored across the hillslope. At the catchment scale, though, these effects tend to average out through the integration of contrasting hillslope landforms. Together these results suggest that information contained in headwater water quality monitoring can be used to understand how hydrochemical processes determine downstream conditions.
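As a concrete example of the C/Q hysteresis patterns the abstract refers to, the sketch below computes a simple hysteresis index for one storm event by comparing normalized concentrations on the rising and falling limbs at matched discharge levels. This index formulation (one of several in the literature) and the synthetic event data are illustrative assumptions:

    import numpy as np

    def hysteresis_index(q, c, n_levels=10):
        """Mean normalized C difference (rising minus falling limb) at matched Q.

        q, c: discharge and concentration series for a single event.
        Positive values indicate clockwise C/Q hysteresis.
        """
        q = np.asarray(q, float); c = np.asarray(c, float)
        peak = int(np.argmax(q))
        qn = (q - q.min()) / (q.max() - q.min())   # normalize Q to [0, 1]
        cn = (c - c.min()) / (c.max() - c.min())   # normalize C to [0, 1]
        levels = np.linspace(0.1, 0.9, n_levels)
        c_rise = np.interp(levels, qn[:peak + 1], cn[:peak + 1])
        # The falling limb runs high-to-low Q, so reverse it for interpolation
        c_fall = np.interp(levels, qn[peak:][::-1], cn[peak:][::-1])
        return float(np.mean(c_rise - c_fall))

    # Synthetic clockwise event: concentration peaks before discharge
    t = np.linspace(0, 1, 50)
    q = np.exp(-((t - 0.5) / 0.15) ** 2)
    c = np.exp(-((t - 0.4) / 0.15) ** 2)
    print(f"HI = {hysteresis_index(q, c):+.2f}")   # positive -> clockwise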
A fast and efficient segmentation scheme for cell microscopic image.
Lebrun, G; Charrier, C; Lezoray, O; Meurie, C; Cardot, H
2007-04-27
Microscopic cellular image segmentation schemes must be efficient for reliable analysis and fast enough to process huge quantities of images. Recent studies have focused on improving segmentation quality; several schemes achieve good quality, but their processing time is too expensive to deal with a great number of images per day. For segmentation schemes based on pixel classification, the classifier design is crucial, since classification accounts for most of the processing time required to segment an image. The main contribution of this work concerns reducing the complexity of decision functions produced by support vector machines (SVM) while preserving recognition rate. Vector quantization is used to reduce the inherent redundancy present in huge pixel databases (i.e., images with expert pixel segmentation), and hybrid color space design is used to improve both the data set reduction rate and the recognition rate. A new decision function quality criterion is defined to select a good trade-off between the recognition rate and the processing time of the pixel decision function. The first results of this study show that fast and efficient pixel classification with SVM is possible; moreover, posterior class pixel probabilities are easy to estimate with Platt's method. A new segmentation scheme using probabilistic pixel classification was then developed. This scheme has several free parameters whose automatic selection must be dealt with, but existing criteria for evaluating segmentation quality are not well adapted to cell segmentation, especially when comparison with expert pixel segmentation must be achieved. Another important contribution of this paper is therefore the definition of a new quality criterion for the evaluation of cell segmentation. The results presented here show that selecting the free parameters of the segmentation scheme by optimising the new cell segmentation quality criterion produces efficient cell segmentations.
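A minimal sketch of the pixel-classification idea, assuming plain RGB features and scikit-learn's SVC with Platt-style probability calibration (probability=True); the toy training data and kernel parameters are placeholders, not the authors' hybrid color space or vector-quantized pixel database:

    import numpy as np
    from sklearn.svm import SVC

    # Toy expert-labeled pixels: RGB values with labels (0=background, 1=cell)
    rng = np.random.default_rng(0)
    bg = rng.normal([200, 200, 210], 15, (500, 3))    # bright background pixels
    cell = rng.normal([120, 80, 140], 15, (500, 3))   # darker stained cell pixels
    X = np.vstack([bg, cell]) / 255.0
    y = np.repeat([0, 1], 500)

    # probability=True enables Platt scaling for posterior estimates
    clf = SVC(kernel="rbf", C=10.0, probability=True).fit(X, y)

    # Classify every pixel of an image (here random) in one vectorized call
    img = rng.random((64, 64, 3))
    proba = clf.predict_proba(img.reshape(-1, 3))[:, 1].reshape(64, 64)
    mask = proba > 0.5
    print(f"cell pixels: {mask.sum()} / {mask.size}")

The per-image cost is dominated by the number of support vectors retained by the SVM, which is exactly why the paper compresses the training set with vector quantization before training.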
Barclay-Goddard, Ruth; King, Judy; Dubouloz, Claire-Jehanne; Schwartz, Carolyn E
2012-02-01
A major goal of treatment for people living with chronic illness or disability is self-management leading to optimized health-related quality of life. This change process has been described in the adult education literature as transformative learning, while in health-related quality of life research, response shift has emerged as a key concept. Response shift and transformative learning literature were reviewed, and the theoretical frameworks of the 2 concepts were compared and contrasted. Response shift is described as a change in internal standards, values, or definition of a construct (eg, health-related quality of life) over time, commonly seen in individuals with chronic illness. In the context of chronic illness, transformative learning is described as a complex process of personal change including beliefs, feelings, knowledge, and values. Transformative learning is often triggered by the diagnosis of a chronic illness, which results in critical reflection on taken-for-granted assumptions and leads to new ways of thinking, influencing personal changes in daily living. Comparing the models of response shift and transformative learning in chronic illness, the catalyst in response shift appears comparable with the trigger in transformational learning; the mechanisms, with the process of changing; and perceived quality of life, with the outcomes. Both transformative learning and response shift have much to offer health care providers in understanding the learning process through which a person living with chronic illness or disability optimizes their quality of life. Suggestions for future research on response shift and transformative learning in individuals with chronic health conditions and disability are proposed. Copyright © 2012 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Selle, B.; Schwientek, M.
2012-04-01
Water quality of ground and surface waters in catchments is typically driven by many complex and interacting processes. While small-scale processes are often studied in great detail, their relevance and interplay at catchment scales often remain poorly understood. For many catchments, extensive monitoring data on water quality have been collected for different purposes. These heterogeneous data sets contain valuable information on catchment-scale processes but are rarely analysed using integrated methods. Principal component analysis (PCA) has previously been applied to this kind of data set; however, a detailed analysis of the scores, which are an important result of a PCA, is often missing. Mathematically, PCA expresses measured water quality variables, e.g. nitrate concentrations, as linear combinations of independent, not directly observable key processes. These computed key processes are represented by principal components, and their scores are interpretable as process intensities that vary in space and time. Subsequently, scores can be correlated with other key variables and catchment characteristics, such as water travel times and land use, that were not considered in the PCA. This detailed analysis of scores represents an extension of the commonly applied PCA which could considerably improve the understanding of processes governing water quality at catchment scales. In this study, we investigated the 170 km2 Ammer catchment in SW Germany, which is characterised by an above-average proportion of agricultural (71%) and urban (17%) areas. The Ammer River is mainly fed by karstic springs. For PCA, we separately analysed concentrations from (a) surface waters of the Ammer River and its tributaries, (b) spring waters from the main aquifers and (c) deep groundwater from production wells. This analysis was extended by a detailed analysis of scores. We analysed measured concentrations of major ions and selected organic micropollutants; additionally, redox-sensitive variables and environmental tracers indicating groundwater age were analysed for deep groundwater from production wells. For deep groundwater, we found that microbial turnover was more strongly influenced by the local availability of energy sources than by the travel times of groundwater to the wells. Groundwater quality primarily reflected the input of pollutants determined by land use, e.g. agrochemicals. We concluded that, for water quality in the Ammer catchment, conservative mixing of waters of different origin is more important than reactive transport processes along the flow path.
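A minimal sketch of the score-analysis step, assuming a samples-by-variables matrix of standardized concentrations; correlating the component scores with an external variable such as groundwater age mirrors the extension described above. The variable names and synthetic data are illustrative:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    # Rows: water samples; columns: e.g. nitrate, chloride, sulfate, DOC
    concentrations = rng.lognormal(mean=1.0, sigma=0.5, size=(60, 4))
    groundwater_age = rng.uniform(1, 50, size=60)   # external variable (years)

    X = StandardScaler().fit_transform(np.log(concentrations))
    pca = PCA(n_components=2)
    scores = pca.fit_transform(X)   # per-sample "process intensities"

    print("explained variance:", pca.explained_variance_ratio_.round(2))
    for i in range(2):
        r = np.corrcoef(scores[:, i], groundwater_age)[0, 1]
        print(f"PC{i+1} score vs groundwater age: r = {r:+.2f}")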
Singh, Ravendra; Román-Ospino, Andrés D; Romañach, Rodolfo J; Ierapetritou, Marianthi; Ramachandran, Rohit
2015-11-10
The pharmaceutical industry is strictly regulated, and precise, accurate control of end product quality is necessary to ensure the effectiveness of drug products. For such control, process and raw material variability ideally needs to be fed forward in real time into an automatic control system so that proactive action can be taken before it affects end product quality. Variations in raw material properties (e.g., particle size), feeder hopper level, amount of lubrication, milling and blending action, and the shear applied in different processing stages can significantly affect blend density and thereby tablet weight, hardness and dissolution. Therefore, real-time monitoring of powder bulk density variability, and its incorporation into the automatic control system so that its effect can be mitigated proactively and efficiently, is highly desirable. However, real-time monitoring of powder bulk density remains a challenging task because of several levels of complexity. In this work, powder bulk density, which has a significant effect on the critical quality attributes (CQAs), was monitored in real time in a pilot-plant facility using an NIR sensor. The sensitivity of powder bulk density to critical process parameters (CPPs) and CQAs was analyzed, and a feed-forward controller was designed accordingly. The measured signal can be used for feed-forward control so that corrective actions on density variations are taken before they can influence product quality. The coupled feed-forward/feedback control system demonstrates improved control performance and improvements in final product quality in the presence of process and raw material variations. Copyright © 2015 Elsevier B.V. All rights reserved.
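To make the coupled control idea concrete, the sketch below combines a measured-disturbance feed-forward term with a simple PI feedback loop on tablet weight. The gains, the linear density-to-weight disturbance model, and all numbers are illustrative assumptions, not the plant's tuned controller:

    from dataclasses import dataclass

    @dataclass
    class FFPIController:
        kf: float        # feed-forward gain on measured density deviation
        kp: float        # proportional feedback gain
        ki: float        # integral feedback gain
        setpoint: float  # target tablet weight
        integral: float = 0.0

        def update(self, measured, density_dev, dt):
            """Return actuator correction (e.g., fill-depth adjustment)."""
            error = self.setpoint - measured
            self.integral += error * dt
            feedback = self.kp * error + self.ki * self.integral
            feedforward = -self.kf * density_dev  # cancel disturbance proactively
            return feedback + feedforward

    # kf is sized to cancel the toy plant's disturbance path below
    ctrl = FFPIController(kf=26.7, kp=0.5, ki=0.1, setpoint=250.0)
    weight = 250.0
    for step in range(30):
        density_dev = 0.05 if step >= 10 else 0.0   # NIR flags a density shift
        u = ctrl.update(weight, density_dev, dt=1.0)
        weight = 250.0 + 0.6 * u + 16.0 * density_dev   # toy plant response
    print(f"final weight: {weight:.1f} mg")

The feed-forward term acts the moment the NIR sensor reports a density deviation, while the feedback loop only reacts after the deviation has already reached the product, which is the advantage the abstract describes.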
Astigmatism compensation in digital holographic microscopy using complex-amplitude correlation
NASA Astrophysics Data System (ADS)
Tamrin, Khairul Fikri; Rahmatullah, Bahbibi; Samuri, Suzani Mohamad
2015-07-01
Digital holographic microscopy (DHM) is a promising tool for three-dimensional imaging of microscopic particles. It offers the possibility of wavefront processing by manipulating the amplitude and phase of recorded digital holograms. With a view to compensating for aberration in the reconstructed particle images, this paper discusses a new approach to aberration compensation based on complex-amplitude correlation and the use of a priori information. The approach is applied to holograms of microscopic particles flowing inside a cylindrical micro-channel, recorded using an off-axis digital holographic microscope, and results in improvements in both image and signal quality.
Spares Management : Optimizing Hardware Usage for the Space Shuttle Main Engine
NASA Technical Reports Server (NTRS)
Gulbrandsen, K. A.
1999-01-01
The complexity of the Space Shuttle Main Engine (SSME), combined with mounting requirements to reduce operations costs, has increased demands for accurate tracking, maintenance, and projection of SSME assets. The SSME Logistics Team is developing an integrated asset management process. This PC-based tool provides a user-friendly asset database for daily decision making, plus a variable-input hardware usage simulation with complex logic yielding output that addresses essential asset management issues. Cycle times on critical tasks are significantly reduced, and associated costs have decreased as asset data quality and decision-making capability have increased.
ERIC Educational Resources Information Center
Fisher-Yoshida, Beth; Geller, Kathy D.; Wasserman, Ilene C.
2005-01-01
Today's complex global environment calls for leaders to be agile decision makers, engage in critical self-reflection, integrate reflection with action, and partner with those who are different in significant ways. These capabilities and skills are the core qualities of transformative learning. This paper weaves research findings that explore…
Oxidative aging and secondary organic aerosol formation from simulated wildfire emissions
C. J. Hennigan; M. A. Miracolo; G. J. Engelhart; A. A. May; Cyle Wold; WeiMin Hao; T. Lee; A. P. Sullivan; J. B. Gilman; W. C. Kuster; J. A. de Gouw; J. L. Collett; S. M. Kreidenweis; A. L. Robinson
2010-01-01
Wildfires are a significant fraction of global biomass burning and a major source of trace gas and particle emissions in the atmosphere. Understanding the air quality and climate implications of wildfires is difficult since the emissions undergo complex transformations due to aging processes during transport away from the source. As part of the third Fire Lab at...
Review of nitrogen fate models applicable to forest landscapes in the Southern U.S.
D. M. Amatya; C. G. Rossi; A. Saleh; Z. Dai; M. A. Youssef; R. G. Williams; D. D. Bosch; G. M. Chescheir; G. Sun; R. W. Skaggs; C. C. Trettin; E. D. Vance; J. E. Nettles; S. Tian
2013-01-01
Assessing the environmental impacts of fertilizer nitrogen (N) used to increase productivity in managed forests is complex due to a wide range of abiotic and biotic factors affecting its forms and movement. Models developed to predict fertilizer N fate (e.g., cycling processes) and water quality impacts vary widely in their design, scope, and potential application. We...
Use Zircon-Ilmenite Concentrate in Steelmaking
NASA Astrophysics Data System (ADS)
Fedoseev, S. N.; Volkova, T. N.
2016-08-01
Market requirements drive a constant search for new materials and technologies to meet ever-increasing demands for material and energy efficiency as well as for steel quality. Steelmaking practice has recently tended toward more stringent requirements on the chemical composition of steel and on its contamination by nonmetallic inclusions, gases and non-ferrous metals. The main ways of increasing the strength and performance characteristics of fabricated metal products are related to a profound and effective influence on the crystallizing metal structure through furnace processing of the melt with refining and modifying additives. It can be argued that furnace processing of steel and iron with chemically active metals (alkaline-earth metals, rare-earth metals and others) is an integral part of the modern production of high-quality products and of competitive technologies. An important condition for the development of secondary metallurgy methods for steel is the use of relatively inexpensive materials in a variety of complex alloys and blends, allowing targeted control of the physical and chemical state of the molten metal and, therefore, producing steel with improved performance. In this connection, the development of technologies for modifying steel with natural materials, represented here by complex ores containing titanium and zirconium, is a very urgent task.
Pixel-based speckle adjustment for noise reduction in Fourier-domain OCT images.
Zhang, Anqi; Xi, Jiefeng; Sun, Jitao; Li, Xingde
2017-03-01
Speckle is inherent in OCT signals and inevitably affects OCT image quality. In this work, we present a novel method for speckle noise reduction in Fourier-domain OCT images which utilizes the phase information of complex OCT data. In this method, the speckle area is first delineated pixelwise using a phase-domain processing method and then adjusted using the results of wavelet shrinkage of the original image. A coefficient shrinkage method, such as wavelet or contourlet shrinkage, is applied afterwards to further suppress the speckle noise. Compared with conventional methods without speckle adjustment, the proposed method demonstrates significant improvement in image quality.
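The phase-domain delineation step is specific to the paper, but the coefficient-shrinkage stage can be sketched with standard tools. Below is a minimal wavelet soft-thresholding pass over a log-transformed OCT-like intensity image, assuming PyWavelets and a universal threshold; the wavelet choice, decomposition level, and noise estimate are illustrative:

    import numpy as np
    import pywt

    def wavelet_despeckle(img, wavelet="db4", level=3):
        """Soft-threshold detail coefficients of a log-transformed image."""
        log_img = np.log1p(img)                # multiplicative speckle -> additive
        coeffs = pywt.wavedec2(log_img, wavelet, level=level)
        # Estimate noise sigma from the finest diagonal detail band
        sigma = np.median(np.abs(coeffs[-1][2])) / 0.6745
        thresh = sigma * np.sqrt(2 * np.log(log_img.size))  # universal threshold
        new_coeffs = [coeffs[0]] + [
            tuple(pywt.threshold(d, thresh, mode="soft") for d in detail)
            for detail in coeffs[1:]
        ]
        return np.expm1(pywt.waverec2(new_coeffs, wavelet))

    rng = np.random.default_rng(2)
    clean = np.outer(np.hanning(128), np.hanning(128)) * 100
    speckled = clean * rng.exponential(1.0, clean.shape)   # toy speckle model
    denoised = wavelet_despeckle(speckled)
    print(f"std before: {speckled.std():.1f}, after: {denoised.std():.1f}")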
Vijayan, S.; Wong, C.F.; Buckley, L.P.
1994-11-22
In processes of this invention aqueous waste solutions containing a variety of mixed waste contaminants are treated to remove the contaminants by a sequential addition of chemicals and adsorption/ion exchange powdered materials to remove the contaminants including lead, cadmium, uranium, cesium-137, strontium-85/90, trichloroethylene and benzene, and impurities including iron and calcium. Staged conditioning of the waste solution produces a polydisperse system of size enlarged complexes of the contaminants in three distinct configurations: water-soluble metal complexes, insoluble metal precipitation complexes, and contaminant-bearing particles of ion exchange and adsorbent materials. The volume of the waste is reduced by separation of the polydisperse system by cross-flow microfiltration, followed by low-temperature evaporation and/or filter pressing. The water produced as filtrate is discharged if it meets a specified target water quality, or else the filtrate is recycled until the target is achieved.
Vijayan, Sivaraman; Wong, Chi F.; Buckley, Leo P.
1994-01-01
In processes of this invention aqueous waste solutions containing a variety of mixed waste contaminants are treated to remove the contaminants by a sequential addition of chemicals and adsorption/ion exchange powdered materials to remove the contaminants including lead, cadmium, uranium, cesium-137, strontium-85/90, trichloroethylene and benzene, and impurities including iron and calcium. Staged conditioning of the waste solution produces a polydisperse system of size enlarged complexes of the contaminants in three distinct configurations: water-soluble metal complexes, insoluble metal precipitation complexes, and contaminant-bearing particles of ion exchange and adsorbent materials. The volume of the waste is reduced by separation of the polydisperse system by cross-flow microfiltration, followed by low-temperature evaporation and/or filter pressing. The water produced as filtrate is discharged if it meets a specified target water quality, or else the filtrate is recycled until the target is achieved.
Soltes, Garner R; Martin, Nicholas R; Park, Eunhae; Sutterlin, Holly A; Silhavy, Thomas J
2017-10-15
Outer membrane protein (OMP) biogenesis in Escherichia coli is a robust process essential to the life of the organism. It is catalyzed by the β-barrel assembly machine (Bam) complex, and a number of quality control factors, including periplasmic chaperones and proteases, maintain the integrity of this trafficking pathway. Little is known, however, about how periplasmic proteases recognize and degrade OMP substrates when assembly is compromised or whether different proteases recognize the same substrate at distinct points in the assembly pathway. In this work, we use well-defined assembly-defective mutants of LptD, the essential lipopolysaccharide assembly translocon, to show that the periplasmic protease DegP degrades substrates with assembly defects that prevent or impair initial contact with Bam, causing the mutant protein to accumulate in the periplasm. In contrast, another periplasmic protease, BepA, degrades a LptD mutant substrate that has engaged the Bam complex and formed a nearly complete barrel. Furthermore, we describe the role of the outer membrane lipoprotein YcaL, a protease of heretofore unknown function, in the degradation of a LptD substrate that has engaged the Bam complex but is stalled at an earlier step in the assembly process that is not accessible to BepA. Our results demonstrate that multiple periplasmic proteases monitor OMPs at distinct points in the assembly process. IMPORTANCE: OMP assembly is catalyzed by the essential Bam complex and occurs in a cellular environment devoid of energy sources. Assembly intermediates that misfold can compromise this essential molecular machine. Here we demonstrate distinctive roles for three different periplasmic proteases that can clear OMP substrates with folding defects that compromise assembly at three different stages. These quality control factors help ensure the integrity of the permeability barrier that contributes to the intrinsic resistance of Gram-negative organisms to many antibiotics. Copyright © 2017 American Society for Microbiology.
Lalev, A I; Abeyrathne, P D; Nazar, R N
2000-09-08
The interdependency of steps in the processing of pre-rRNA in Schizosaccharomyces pombe suggests that RNA processing, at least in part, acts as a quality control mechanism which helps assure that only functional RNA is incorporated into mature ribosomes. To determine further the role of the transcribed spacer regions in rRNA processing and to detect interactions which underlie the interdependencies, the ITS1 sequence was examined for its ability to form ribonucleoprotein complexes with cellular proteins. When incubated with protein extract, the spacer formed a specific large RNP. This complex was stable to fractionation by agarose or polyacrylamide gel electrophoresis. Modification exclusion analyses indicated that the proteins interact with a helical domain which is conserved in the internal transcribed spacers. Mutagenic analyses confirmed an interaction with this sequence and indicated that this domain is critical to the efficient maturation of the precursor RNA. The protein constituents, purified by affinity chromatography using the ITS1 sequence, retained an ability to form stable RNP. Protein analyses of gel-purified complex, prepared with affinity-purified proteins, indicated at least 20 protein components ranging in size from 20-200 kDa. Peptide mapping by MALDI-TOF mass spectrometry identified eight hypothetical RNA binding proteins which included four different RNA-binding motifs. Another protein was putatively identified as a pseudouridylate synthase. Additional RNA constituents were not detected. The significance of this complex with respect to rRNA maturation and interdependence in rRNA processing is discussed. Copyright 2000 Academic Press.
Van Bogaert, Peter; Van heusden, Danny; Somers, Annemie; Tegenbos, Muriel; Wouters, Kristien; Van der Straeten, Johnny; Van Aken, Paul; Havens, Donna Sullivan
2014-09-01
The objective of this study was to investigate the impact of The Productive Ward-Releasing Time to Care™ program, implemented as part of a hospital transformation process, on nurses' perceptions of the practice environment, burnout, quality of care, and job outcomes. To address the continuously evolving, complex challenges of patient care, high-performance nursing care is necessary. A longitudinal survey design was used to conduct the study in a 600-bed acute care university hospital, with 3 measurement periods: T0, baseline in 2006; T1 in 2011; and T2 in 2013. As part of the hospital transformation process, the productive ward program was introduced between T1 and T2. Relevant impacts on nurse-physician relations, nurse management, hospital management-organizational support, nurse-reported quality of care, and job outcomes were identified. Hospital strategies and policies should be aligned with daily practices so that engaged and committed staff can promote excellent outcomes.
Moraes, Karen CM
2010-01-01
Production of mature mRNAs that encode functional proteins involves highly complex pathways of synthesis, processing and surveillance. At numerous steps during the maturation process, the mRNA transcript undergoes scrutiny by cellular quality control machinery. This extensive RNA surveillance ensures that only correctly processed mature mRNAs are translated and precludes production of aberrant transcripts that could encode mutant or possibly deleterious proteins. Recent advances in elucidating the molecular mechanisms of mRNA processing have demonstrated the existence of an integrated network of events, and have revealed that a variety of human diseases are caused by disturbances in the well-coordinated molecular equilibrium of these events. From a medical perspective, both loss and gain of function are relevant, and a considerable number of different diseases exemplify the importance of the mechanistic function of RNA surveillance in a cell. Here, mechanistic hallmarks of mRNA processing steps are reviewed, highlighting the medical relevance of their deregulation and how the understanding of such mechanisms can contribute to the development of therapeutic strategies.
NASA Astrophysics Data System (ADS)
Mohamed, Omar Ahmed; Masood, Syed Hasan; Bhowmik, Jahar Lal
2017-07-01
Fused Deposition Modeling (FDM) is one of the prominent additive manufacturing technologies for producing polymer products. FDM is a complex additive manufacturing process that can be influenced by many process conditions, and the industrial demands placed on it are increasing toward higher levels of product functionality and properties. The functionality and performance of FDM-manufactured parts are greatly influenced by the combination of many FDM process parameters. Designers and researchers have paid considerable attention to the effects of FDM process parameters on product functionality and properties such as mechanical strength, surface quality, dimensional accuracy, build time and material consumption. However, very limited studies have investigated and optimized the effect of FDM build parameters on wear performance. This study focuses on the effect of different build parameters on the microstructure and wear performance of FDM specimens using a quadratic model based on a definitive screening design. This reduces the cost and effort required of the additive manufacturing engineer by providing a systematic approach to making decisions among the manufacturing parameters to achieve the desired product quality.
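A quadratic (second-order) model of the kind produced by a definitive screening design can be sketched with scikit-learn; the three factors, their coded ranges, and the synthetic wear responses below are placeholders for the study's actual FDM parameters and measurements:

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(3)
    # Coded factors in [-1, 1]: layer thickness, print speed, infill density
    X = rng.uniform(-1, 1, size=(30, 3))
    # Toy wear-rate response with curvature and one interaction term
    y = (0.5 + 0.8 * X[:, 0] - 0.3 * X[:, 1] + 0.6 * X[:, 0] ** 2
         + 0.4 * X[:, 0] * X[:, 2] + rng.normal(0, 0.05, 30))

    quad = PolynomialFeatures(degree=2, include_bias=False)
    Xq = quad.fit_transform(X)                     # linear, interaction, squared terms
    model = LinearRegression().fit(Xq, y)

    names = quad.get_feature_names_out(["thickness", "speed", "infill"])
    for name, coef in zip(names, model.coef_):
        print(f"{name:20s} {coef:+.2f}")

Large squared or interaction coefficients flag curvature and parameter coupling, which is exactly the information a screening design is meant to surface before committing to expensive builds.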
Rabe, Fran; Kadidlo, Diane; Van Orsow, Lisa; McKenna, David
2013-10-01
Qualification of a cord blood bank (CBB) is a complex process that includes evaluation of multiple aspects of donor screening and testing, processing, accreditation and approval by professional cell therapy groups, and the results of received cord blood units. The University of Minnesota Medical Center Cell Therapy Laboratory has established a CBB vendor qualification process to ensure that each CBB meets established regulatory and quality requirements. The deployed qualification of CBBs is based on retrospective and prospective review of the CBB. Forty-one CBBs were evaluated retrospectively: seven CBBs were disqualified based on failed quality control (QC) results, and eight CBBs did not meet the criteria for retrospective qualification because fewer than 3 cord blood units were received and the CBB was not accredited. As of March 2012, three US and one non-US CBBs had been qualified prospectively. One CBB withdrew from the qualification process after successful completion of the comprehensive survey and subsequent failure of the provided QC unit to pass the minimum criteria. One CBB failed the prospective qualification process based on processing methods that were revealed during the paper portion of the evaluation. A CBB qualification process is necessary for a transplant center to manage the qualification of the large number of CBBs needed to support an umbilical cord blood transplantation program. A transplant center that has utilized cord blood for a number of years before implementation of a qualification process should use a retrospective qualification process along with a prospective process. © 2013 American Association of Blood Banks.
Sharma, Davinder; Golla, Naresh; Singh, Dheer; Onteru, Suneel K
2018-03-01
Next-generation sequencing (NGS) based RNA sequencing (RNA-Seq) and transcriptome profiling offer an opportunity to unveil complex biological processes. Successful RNA-Seq and transcriptome profiling require a large amount of high-quality RNA. However, NGS-quality RNA isolation is extremely difficult from recalcitrant adipose tissue (AT), with its high lipid content and low cell numbers. Further, the amount and biochemical composition of AT lipid vary depending on the animal species, which can pose differing degrees of resistance to RNA extraction: currently available approaches may work effectively in one species but be almost unproductive in another. Herein, we report a two-step protocol for the extraction of NGS-quality RNA from AT across a broad range of animal species. © 2017 Wiley Periodicals, Inc.
Quality Management in Astronomical Software and Data Systems
NASA Astrophysics Data System (ADS)
Radziwill, N. M.
2007-10-01
As the demand for more sophisticated facilities increases, the complexity of the technical and organizational challenges faced by operational space- and ground-based telescopes also increases. In many organizations, funding tends not to be proportional to this trend, and steps must be taken to cultivate a lean environment in both development and operations to consistently do more with less. To facilitate this transition, an organization must be aware of how it can meet quality-related goals, such as reducing variation, improving productivity of people and systems, streamlining processes, ensuring compliance with requirements (scientific, organizational, project, or regulatory), and increasing user satisfaction. Several organizations are already on this path. Quality-based techniques for the efficient, effective development of new telescope facilities and maintenance of existing facilities are described.
A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems
NASA Astrophysics Data System (ADS)
Li, Yu; Oberweis, Andreas
Aiming to increase the flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations, and thereby ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development, with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as the basic methodology for business process modeling, along with an XML net-based software toolset providing comprehensive functionality for POIS development.
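XML nets are a high-level Petri net variant, and the chapter abstract gives no implementation details; the sketch below therefore shows only the underlying place/transition mechanics that any such net builds on. The class design and the toy order-handling places are purely illustrative:

    from collections import Counter

    class PetriNet:
        """Minimal place/transition net with an integer token marking."""

        def __init__(self, marking):
            self.marking = Counter(marking)
            self.transitions = {}   # name -> (input places, output places)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking[p] >= n for p, n in inputs.items())

        def fire(self, name):
            if not self.enabled(name):
                raise RuntimeError(f"transition {name!r} not enabled")
            inputs, outputs = self.transitions[name]
            for p, n in inputs.items():
                self.marking[p] -= n   # consume input tokens
            for p, n in outputs.items():
                self.marking[p] += n   # produce output tokens

    # Toy business process: receive order -> check stock -> ship
    net = PetriNet({"order_received": 1, "stock": 2})
    net.add_transition("check", {"order_received": 1, "stock": 1},
                       {"ready_to_ship": 1})
    net.add_transition("ship", {"ready_to_ship": 1}, {"shipped": 1})
    net.fire("check")
    net.fire("ship")
    print(dict(net.marking))   # final token counts; emptied places show 0

XML nets extend this mechanism by letting tokens carry structured XML documents rather than being indistinguishable, which is what makes them suitable for modeling document flows in a POIS.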
Ion mobility spectrometry for food quality and safety.
Vautz, W; Zimmermann, D; Hartmann, M; Baumbach, J I; Nolte, J; Jung, J
2006-11-01
Ion mobility spectrometry is known to be a fast and sensitive technique for the detection of trace substances, and it is increasingly in demand not only for protection against explosives and chemical warfare agents but also for new applications in medical diagnosis and process control. Generally, a gas-phase sample is ionized with the help of ultraviolet light, β-radiation or partial discharges. The ions move in a weak electric field towards a detector. During their drift they collide with a drift gas flowing in the opposite direction and are therefore slowed down depending on their size, shape and charge. As a result, different ions reach the detector at different drift times, which are characteristic of the ions considered, and the number of ions reaching the detector is a measure of the concentration of the analyte. The method enables the identification and quantification of analytes with high sensitivity (ng l⁻¹ range). The selectivity can be increased further - as necessary for the analysis of complex mixtures - using pre-separation techniques such as gas chromatography or multi-capillary columns. No pre-concentration of the sample is necessary, and these characteristics of the method are preserved even in air at up to 100% relative humidity. The suitability of the method for applications in food quality and safety - including storage, process and quality control as well as the characterization of foodstuffs - has been investigated in recent years for a number of representative examples, which are summarized here together with new studies: (1) the detection of metabolites from bacteria for the identification and control of their growth; (2) process control in food production, beer fermentation being an example; (3) the detection of the metabolites of mould for process control during cheese production, for quality control of raw materials and for the control of storage conditions; (4) the quality control of packaging materials during the production of polymeric materials; and (5) the characterization of products, wine being an example. The challenges of such applications are operation in humid air, fast on-line analysis of complex mixtures, high sensitivity - detection limits have to be, for example, in the range of the odour limits - and, in some cases, the need for mobile instrumentation. It can be shown that ion mobility spectrometry is optimally capable of meeting these challenges for many applications.
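The size-and-charge dependence of drift time can be made concrete with the standard low-field relations K = L / (t_d E) and the pressure- and temperature-normalized reduced mobility K0. The sketch below assumes typical drift-tube values; all numbers are illustrative only:

    def reduced_mobility(drift_time_ms, drift_length_cm=12.0,
                         voltage_V=5000.0, temp_K=298.0, pressure_hPa=1013.25):
        """Reduced ion mobility K0 in cm^2 V^-1 s^-1 from a measured drift time.

        K = L / (t_d * E) with E = U / L, normalized to 273.15 K and 1013.25 hPa.
        """
        E = voltage_V / drift_length_cm                   # field strength, V/cm
        K = drift_length_cm / (drift_time_ms * 1e-3 * E)  # mobility, cm^2/(V s)
        return K * (273.15 / temp_K) * (pressure_hPa / 1013.25)

    # Two hypothetical analytes with different drift times in the same tube
    for name, td in [("analyte monomer", 14.2), ("analyte dimer", 17.8)]:
        print(f"{name}: K0 = {reduced_mobility(td):.2f} cm^2/(V s)")

Because K0 is normalized for pressure and temperature, it serves as the substance-characteristic quantity that lets spectra from different instruments and ambient conditions be compared, which matters for mobile food-safety instrumentation.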
Mitchell, Peter D; Ratcliffe, Elizabeth; Hourd, Paul; Williams, David J; Thomas, Robert J
2014-12-01
It is well documented that cryopreservation and resuscitation of human embryonic stem cells (hESCs) are complex and ill-defined processes that often suffer from poor cell recovery and increased levels of undesirable cell differentiation. In this study we applied Quality-by-Design (QbD) concepts to the critical processes of slow-freeze cryopreservation and resuscitation of hESC colony cultures. Optimized subprocesses were linked together to deliver a controlled complete process. We demonstrated a rapid, high-throughput, and stable system for measuring cell adherence and viability as robust markers of in-process and post-recovery cell state, and observed that measurement of adherence and the viability of adhered cells at 1 h post-seeding was predictive of cell proliferative ability up to 96 h in this system. Application of factorial design defined the operating spaces for cryopreservation and resuscitation, critically linking the performance of these two processes. Optimization of both processes resulted in enhanced reattachment and post-thaw viability, and hence substantially greater recovery of cryopreserved, pluripotent cell colonies. This study demonstrates the importance of QbD concepts and tools for rapid, robust, and low-risk process design that can inform manufacturing controls and logistics.
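The factorial definition of an operating space can be sketched in a few lines: generate a two-level full-factorial design over linked cryopreservation/resuscitation variables and estimate main effects from measured recovery. The factors, coded levels, and responses here are invented for illustration, not the study's data:

    from itertools import product
    import numpy as np

    factors = {
        "cooling_rate": (-1, +1),   # e.g. slow vs fast freeze (coded)
        "dmso_pct": (-1, +1),       # e.g. low vs high cryoprotectant (coded)
        "seed_density": (-1, +1),   # e.g. low vs high post-thaw seeding (coded)
    }
    design = np.array(list(product(*factors.values())))   # 2^3 = 8 runs

    # Hypothetical viable-recovery fractions measured for the 8 runs
    recovery = np.array([0.42, 0.55, 0.47, 0.61, 0.38, 0.52, 0.44, 0.58])

    # Main effect of each factor: mean at high level minus mean at low level
    for i, name in enumerate(factors):
        effect = (recovery[design[:, i] == 1].mean()
                  - recovery[design[:, i] == -1].mean())
        print(f"{name:13s} main effect: {effect:+.3f}")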
Efficient quantum computing using coherent photon conversion.
Langford, N K; Ramelow, S; Prevedel, R; Munro, W J; Milburn, G J; Zeilinger, A
2011-10-12
Single photons are excellent quantum information carriers: they were used in the earliest demonstrations of entanglement and in the production of the highest-quality entanglement reported so far. However, current schemes for preparing, processing and measuring them are inefficient. For example, down-conversion provides heralded, but randomly timed, single photons, and linear optics gates are inherently probabilistic. Here we introduce a deterministic process--coherent photon conversion (CPC)--that provides a new way to generate and process complex, multiquanta states for photonic quantum information applications. The technique uses classically pumped nonlinearities to induce coherent oscillations between orthogonal states of multiple quantum excitations. One example of CPC, based on a pumped four-wave-mixing interaction, is shown to yield a single, versatile process that provides a full set of photonic quantum processing tools. This set satisfies the DiVincenzo criteria for a scalable quantum computing architecture, including deterministic multiqubit entanglement gates (based on a novel form of photon-photon interaction), high-quality heralded single- and multiphoton states free from higher-order imperfections, and robust, high-efficiency detection. It can also be used to produce heralded multiphoton entanglement, create optically switchable quantum circuits and implement an improved form of down-conversion with reduced higher-order effects. Such tools are valuable building blocks for many quantum-enabled technologies. Finally, using photonic crystal fibres we experimentally demonstrate quantum correlations arising from a four-colour nonlinear process suitable for CPC and use these measurements to study the feasibility of reaching the deterministic regime with current technology. Our scheme, which is based on interacting bosonic fields, is not restricted to optical systems but could also be implemented in optomechanical, electromechanical and superconducting systems with extremely strong intrinsic nonlinearities. Furthermore, exploiting higher-order nonlinearities with multiple pump fields yields a mechanism for multiparty mediation of the complex, coherent dynamics.
MacDonald, Kath; Greggans, Alison
2008-12-01
The aim of this paper is to share our experiences of dealing with chaos and complexity in interview situations in the home with children and young people. We highlight dilemmas relevant to dealing with multiple interruptions, building a rapport, consent and confidentiality. Furthermore, we discuss issues regarding the locus of power and control and offer some solutions based on our experiences. Creating a safe environment is essential for qualitative research: participants are more likely to open up and communicate if they feel safe, comfortable and relaxed. We conclude that interviewing parents and their children with cystic fibrosis in their own homes is chaotic and appears to threaten the rigour of data collection processes. Limited attention or print space is paid to this issue, with published articles frequently sanitising the messiness of real-world qualitative research. Position paper. In this position paper, we use two case studies to illustrate the ethical and pragmatic challenges of interviewing out in the field. These case studies, typical of the families we encountered, help emphasise the concerns we had in balancing researcher-participant rapport with the quality of the research process. Dealing with perceived chaos is hard in reality, but capturing it is part of the complexity of qualitative enquiry. The context is interdependent with children's perceived reality, because they communicate with others through their environment. This paper gives researchers an insight into the tensions of operating out in the field and raises the importance of environmental 'chaos' in revealing significant issues relevant to people's daily lives. Knowing that unexpected chaos is part and parcel of qualitative research will equip researchers with skills fundamental to balancing the well-being of all those involved with the quality of the research process.
NASA Astrophysics Data System (ADS)
Maroy, E.; Rousseau, A. N.; Hallema, D. W.
2012-12-01
With recent efforts and increasing control over point-source pollution of freshwater, agricultural non-point pollution sources have become responsible for most of the sediment and nutrient loads in North American water systems. Environmental and agricultural agencies have recognised the need to reduce eutrophication and have developed various policies to compel or encourage producers to adopt best management practices (BMPs). Addressing diffuse pollution is challenging considering the complex and cumulative nature of transport processes, high variability in space and time, and the prohibitive costs of distributed water quality monitoring. Many policy options exist to push producers to adopt environmentally desirable behaviour while keeping their activity viable and ensuring equitable costs to consumers and taxpayers. On the one hand, economic instruments (subsidies, taxes, water quality markets) are designed to maximize cost-effectiveness, so that farmers optimize their production for maximum profit while implementing BMPs. On the other hand, emission standards or regulation of inputs are often easier and less costly to implement. To study the economic and environmental impacts of such policies, a distributed modelling approach is needed to deal with the complexity of the system and the large environmental and socio-economic data requirements. Our objective is to integrate agro-hydrological modelling and economic analysis to support decision- and policy-making processes for BMP implementation. The integrated modelling system GIBSI was developed in an earlier study within the Canadian WEBs project (Watershed Evaluation of BMPs) to evaluate the influence of BMPs on water quality. The case study involved 30- and 15-year records of discharge and water quality measurements, respectively, in the Beaurivage River watershed (Quebec, Canada). GIBSI provided a risk-based overview of the impact of BMPs (including vegetated riparian buffer strips, precision slurry application, conversion to grassland and no-till) in terms of sediment, nutrient and pesticide yields and loads. Input data included characteristics of reservoirs, land cover, soil, agricultural management, livestock management and point sources of pollution. The present study continues from there by first assessing the cost-effectiveness of different sets of BMPs, based on farm budgets and environmental criteria selected by the user. We subsequently examine monetary trade-offs between on-farm costs and the social value of water quality improvements using cost-benefit ratios. Because water quality is a non-excludable and non-rivalrous good, its benefits to society are evaluated with non-market evaluation techniques, mostly based on quality-constrained recreational use of water. From a policy perspective, cost-effectiveness analysis is very helpful in assisting the decision maker in the highly complex process of defining priorities with respect to BMP strategies. With a user-friendly interface for economic analysis integrated into GIBSI, watershed organizations and stakeholders can use such a tool to promote sustainable agricultural practices and water use. This submission is part of the Watershed Evaluation of BMPs project (WEBs), funded by Agriculture and Agri-Food Canada and Ducks Unlimited Canada.
Biosimilarity Versus Manufacturing Change: Two Distinct Concepts.
Declerck, Paul; Farouk-Rezk, Mourad; Rudd, Pauline M
2016-02-01
As products of living cells, biologics are far more complicated than small-molecular-weight drugs, not only with respect to size and structural complexity but also with respect to their sensitivity to manufacturing processes and post-translational changes. Most of the information on the manufacturing process of biotherapeutics is proprietary and hence not fully accessible to the public. This information gap represents a key challenge for biosimilar developers and plays a key role in explaining the differences between the regulatory pathways required to demonstrate biosimilarity and those required to ensure that a change in manufacturing process has no implications for safety and efficacy. Manufacturing process changes are frequently needed for a variety of reasons, including response to regulatory requirements, scaling up production, a change in facility, a change in raw materials, improving control of quality (consistency) or optimizing production efficiency. The scope of the change is usually a key indicator of the scale of analysis required to evaluate the quality. In most cases where the scope of the process change is limited, quality and analytical studies alone should be sufficient, while comparative clinical studies may be required in the case of major changes (e.g., cell line changes). Biosimilarity exercises are addressed differently by regulators on the understanding that biosimilar developers start with fundamental differences, namely a new cell line and a knowledge gap regarding the innovator's processes, including culture media, purification processes, and potentially different formulations, and are thus required to ensure that differences from the innovator do not result in differences in efficacy and safety.
Initial Ada components evaluation
NASA Technical Reports Server (NTRS)
Moebes, Travis
1989-01-01
The SAIC has the responsibility for independent test and validation of the SSE. They have been using a mathematical functions library package implemented in Ada to test the SSE IV&V process. The library package consists of elementary mathematical functions and is both machine and accuracy independent. The SSE Ada components evaluation includes code complexity metrics based on Halstead's software science metrics and McCabe's measure of cyclomatic complexity. Halstead's metrics are based on the operators and operands in a logical unit of code and are compiled from the number of distinct operators, the number of distinct operands, and the total numbers of occurrences of operators and operands. These metrics give an indication of the physical size of a program in terms of operators and operands and are used diagnostically to point to potential problems. McCabe's Cyclomatic Complexity Metric (CCM) is compiled from flow charts transformed to equivalent directed graphs; the CCM is a measure of the total number of linearly independent paths through the code's control structure. These metrics were computed for the Ada mathematical functions library using Software Automated Verification and Validation (SAVVAS), the SSE IV&V tool. A table with selected results was shown, indicating that most of these routines are of good quality. Thresholds for the Halstead measures indicate poor quality if the length metric exceeds 260 or the difficulty exceeds 190; the McCabe CCM likewise indicated a high quality of the software products.
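The Halstead measures named above follow directly from four token counts; the sketch below computes them, along with the length-260 and difficulty-190 thresholds quoted in the abstract, for one hypothetical unit of code. Token extraction itself (language-specific parsing) is assumed to be done elsewhere:

    import math

    def halstead(n1, n2, N1, N2):
        """Halstead metrics from distinct/total operator and operand counts.

        n1, n2: distinct operators and operands
        N1, N2: total occurrences of operators and operands
        """
        vocabulary = n1 + n2
        length = N1 + N2
        volume = length * math.log2(vocabulary)
        difficulty = (n1 / 2) * (N2 / n2)
        effort = difficulty * volume
        return {"length": length, "volume": volume,
                "difficulty": difficulty, "effort": effort}

    # Hypothetical counts for one Ada routine
    m = halstead(n1=18, n2=35, N1=140, N2=125)
    print({k: round(v, 1) for k, v in m.items()})

    # Thresholds quoted in the abstract: poor quality if length > 260
    # or difficulty > 190
    flags = [name for name, limit in [("length", 260), ("difficulty", 190)]
             if m[name] > limit]
    print("quality flags:", flags or "none")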
Data mining and statistical inference in selective laser melting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamath, Chandrika
Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations andmore » experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.« less
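As a rough sketch of the workflow described above, the following Python fragment ranks hypothetical process parameters by importance with a random forest and then reuses the fitted forest as a cheap surrogate for the expensive simulations; the parameter names, ranges, and the synthetic density response are invented for illustration and are not the paper's data.

```python
# Feature selection plus a data-driven surrogate on synthetic SLM data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
# Hypothetical process parameters: laser power (W), scan speed (mm/s),
# hatch spacing (mm), layer thickness (mm).
X = np.column_stack([
    rng.uniform(150, 400, n),
    rng.uniform(500, 2000, n),
    rng.uniform(0.08, 0.15, n),
    rng.uniform(0.02, 0.05, n),
])
# Synthetic relative-density response driven mainly by energy density.
energy = X[:, 0] / (X[:, 1] * X[:, 2] * X[:, 3])
y = 99.5 - 0.002 * (energy - 60.0) ** 2 + rng.normal(0, 0.1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_tr, y_tr)

for name, imp in zip(["power", "speed", "hatch", "layer"],
                     surrogate.feature_importances_):
    print(f"{name:6s} importance: {imp:.2f}")
print("surrogate R^2 on held-out runs:", round(surrogate.score(X_te, y_te), 2))
```

Once such a surrogate is trustworthy, the expensive simulation budget can be redirected to the regions of parameter space where the surrogate is least certain, which is the sampling improvement the abstract alludes to.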
Linking Local Scale Ecosystem Science to Regional Scale Management
NASA Astrophysics Data System (ADS)
Shope, C. L.; Tenhunen, J.; Peiffer, S.
2012-04-01
Ecosystem management with respect to sufficient water yield, a quality water supply, habitat and biodiversity conservation, and climate change effects requires substantial observational data at a range of scales. Complex interactions of local physical processes often vary over space and time, particularly in locations with extreme meteorological conditions. Modifications to local conditions (i.e., agricultural land use changes, nutrient additions, landscape management, water usage) can further affect regional ecosystem services. The international, inter-disciplinary TERRECO research group is intensively investigating a variety of local processes, parameters, and conditions to link complex physical, economic, and social interactions at the regional scale. Field-based meteorology, hydrology, soil physics, plant production, solute and sediment transport, economic, and social behavior data were measured in a South Korean catchment. The data are used to parameterize a suite of models describing local- to landscape-level water, sediment, nutrient, and monetary relationships. We focus on using the agricultural and hydrological SWAT model to synthesize the experimental field data and local-scale models throughout the catchment. The approach of our study was to describe local scientific processes, link potential interrelationships between different processes, and predict environmentally efficient management efforts. The Haean catchment case study shows how research can be structured to provide cross-disciplinary scientific linkages describing complex ecosystems and landscapes that can be used for regional management evaluations and predictions.
Stewart, Nathan L.; Konar, Brenda; Tinker, M. Tim
2015-01-01
Sea otters (Enhydra lutris) inhabiting the Aleutian Islands have stabilized at low abundance levels following a decline and currently exhibit restricted habitat-utilization patterns. Possible explanations for restricted habitat use by sea otters can be classified into two fundamentally different processes, bottom-up and top-down forcing. Bottom-up hypotheses argue that changes in the availability or nutritional quality of prey resources have led to the selective use of habitats that support the highest quality prey. In contrast, top-down hypotheses argue that increases in predation pressure from killer whales have led to the selective use of habitats that provide the most effective refuge from killer whale predation. A third hypothesis suggests that current restricted habitat use is based on a need for protection from storms. We tested all three hypotheses for restricted habitat use by comparing currently used and historically used sea otter foraging locations for: (1) prey availability and quality, (2) structural habitat complexity, and (3) exposure to prevailing storms. Our findings suggest that current use is based on physical habitat complexity and not on prey availability, prey quality, or protection from storms, providing further evidence for killer whale predation as a cause for restricted sea otter habitat use in the Aleutian Islands.
Quality and sensory acceptability of a chilled functional apple ready-dessert.
Keenan, D F; Brunton, N P; Gormley, T R; Butler, F
2012-04-01
An apple- and dairy-based ready-dessert with an added prebiotic was stored at chill temperature (4 °C) for 30 days and a number of quality attributes were monitored during storage. All ready-desserts were thermally processed by sous vide (P(90) > 10 min). The stability of the dairy component in the ready-desserts was monitored by measuring volatile free fatty acids. Changes in these components were more evident in prebiotic-enriched samples than in controls; however, no significant differences were observed over storage in either control or prebiotic-enriched ready-desserts. This was supported by sensory analysis, which showed no significant changes over storage in control or prebiotic-enriched samples. Of the other quality parameters, the addition of prebiotic inclusions resulted in lower L and b colour values and dry matter (p < 0.05), while increasing (p < 0.05) the soluble solids content compared to control samples. Fluctuations in some of the quality parameters were also observed over storage. Rheological characteristics, i.e. flow behaviour index (n), consistency index (K), storage (G'), loss (G″) and complex (G*) moduli, were unaffected by prebiotic inclusion. However, storage affected the rheological characteristics of the ready-desserts: a decrease (p < 0.05) in the flow behaviour index (n) led to concomitant increases in the consistency index (K) and complex modulus (G*) values in control samples.
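For reference, the reported rheological quantities follow the standard power-law (Ostwald-de Waele) and complex-modulus definitions; these are textbook relations rather than formulas stated in the paper:

```latex
\tau = K\,\dot{\gamma}^{\,n}, \qquad
G^{*} = \sqrt{(G')^{2} + (G'')^{2}}
```

With n < 1 the dessert is shear-thinning, so the observed decrease in n together with increases in K and G* is internally consistent: a weaker rate dependence accompanying a thicker, stiffer matrix.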
[Interventions to improve quality of life in oncological patients].
Klinkhammer-Schalke, Monika; Steinger, Brunhilde; Koller, Michael; Lindberg, Patricia
2017-05-01
The assessment of quality of life is a central aspect of the current debate in support groups, certified cancer centres, benefit assessment, and also in palliative care. Accordingly, quality of life has been an essential part of clinical trials for more than two decades. However, results are usually presented in a descriptive manner without any concrete therapeutic consequences for the improvement of quality of life, and there are no uniform recommendations for considering quality of life data in the decision-making process. Therefore, a guide with recommendations for the assessment of quality of life in trials has been developed. Its implementation is illustrated by a complex intervention for the targeted diagnosis and therapy of quality of life in patients with breast cancer or colorectal cancer. The basis is a standardised quality of life assessment, the presentation of results in an intelligible fashion, and the close collaboration of all healthcare providers to create regional network structures for the targeted support of patients in both the inpatient and outpatient sectors. Copyright © 2017. Published by Elsevier GmbH.
User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org
Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.
2013-01-01
Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field, where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing, including normalization, have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by a lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating and harmonizing functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will improve data quality and reuse, and support the adoption of standards. PMID:23620278
Using Six Sigma methodology to reduce patient transfer times from floor to critical-care beds.
Silich, Stephan J; Wetz, Robert V; Riebling, Nancy; Coleman, Christine; Khoueiry, Georges; Abi Rafeh, Nidal; Bagon, Emma; Szerszen, Anita
2012-01-01
In response to concerns regarding delays in transferring critically ill patients to intensive care units (ICU), a quality improvement project, using the Six Sigma process, was undertaken to correct issues leading to transfer delay. To test the efficacy of a Six Sigma intervention to reduce transfer time and establish a patient transfer process that would effectively enhance communication between hospital caregivers and improve the continuum of care for patients. The project was conducted at a 714-bed tertiary care hospital in Staten Island, New York. A Six Sigma multidisciplinary team was assembled to assess areas that needed improvement, manage the intervention, and analyze the results. The Six Sigma process identified eight key steps in the transfer of patients from general medical floors to critical care areas. Preintervention data and a root-cause analysis helped to establish the goal transfer-time limits of 3 h for any individual transfer and 90 min for the average of all transfers. The Six Sigma approach is a problem-solving methodology that resulted in almost a 60% reduction in patient transfer time from a general medical floor to a critical care area. The Six Sigma process is a feasible method for implementing healthcare related quality of care projects, especially those that are complex. © 2011 National Association for Healthcare Quality.
Quality of clinical trials: A moving target
Bhatt, Arun
2011-01-01
Quality of clinical trials depends on data integrity and subject protection. Globalization, outsourcing and the increasing complexity of clinical trials have made the target of achieving global quality challenging. The quality, as judged by regulatory inspections of investigator sites, sponsors/contract research organizations and Institutional Review Boards, has been of concern to the US Food and Drug Administration, as there has been hardly any change in the frequency and nature of common deficiencies. To meet regulatory expectations, sponsors need to improve quality by developing systems with specific standards for each clinical trial process. The quality systems include: personnel roles and responsibilities, training, policies and procedures, quality assurance and auditing, document management, record retention, and reporting and corrective and preventive action. With the objective of improving quality, the FDA has planned new inspection approaches such as risk-based inspections, surveillance inspections, real-time oversight, and audits of sponsor quality systems. The FDA has partnered with Duke University for the Clinical Trials Transformation Initiative, which will conduct research projects on design principles, data quality and quantity including monitoring, study start-up, and adverse event reporting. These recent initiatives will go a long way in improving the quality of clinical trials. PMID:22145122
Fuangrod, Todsaporn; Greer, Peter B; Simpson, John; Zwan, Benjamin J; Middleton, Richard H
2017-03-13
Purpose: Due to increasing complexity, modern radiotherapy techniques require comprehensive quality assurance (QA) programmes that, to date, generally focus on the pre-treatment stage. The purpose of this paper is to provide a method for individual patient treatment QA evaluation and identification of a "quality gap" for continuous quality improvement. Design/methodology/approach: Statistical process control (SPC) was applied to evaluate treatment delivery using in vivo electronic portal imaging device (EPID) dosimetry. A moving range control chart was constructed to monitor individual patient treatment performance against a control limit generated from initial data of 90 intensity-modulated radiotherapy (IMRT) and ten volumetric-modulated arc therapy (VMAT) patient deliveries. A process capability index was used to evaluate continuing treatment quality in three quality classes: treatment type-specific, treatment linac-specific, and body site-specific. Findings: The determined control limits were 62.5 and 70.0 per cent of the χ pass-rate for IMRT and VMAT deliveries, respectively. In total, 14 patients were selected for a pilot study, the results of which showed that about 1 per cent of all treatments contained errors relating to unexpected anatomical changes between treatment fractions. Both rectum and pelvis cancer treatments showed process capability indices less than 1, indicating potential for quality improvement, and hence may benefit from further assessment. Research limitations/implications: The study relied on the application of in vivo EPID dosimetry for patients treated at the specific centre, and the sample used to generate the control limits was limited to 100 patients. Whilst the quantitative results are specific to the clinical techniques and equipment used, the described method is generally applicable to IMRT and VMAT treatment QA. Whilst more work is required to determine the level of clinical significance, the authors have demonstrated the capability of the method for both treatment-specific QA and continuing quality improvement. Practical implications: The proposed method is a valuable tool for assessing the accuracy of treatment delivery whilst also improving treatment quality and patient safety. Originality/value: Assessing in vivo EPID dosimetry with SPC can be used to improve the quality of radiation treatment for cancer patients.
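A minimal sketch of the underlying chart arithmetic, assuming an individuals/moving-range (I-MR) chart and a one-sided capability index against the 62.5 per cent IMRT limit reported above; the per-fraction pass-rate series is invented for illustration and the 2.66 and 1.128 factors are the standard I-MR chart constants.

```python
# I-MR control limits and a one-sided capability index for chi pass-rates.
import numpy as np

pass_rates = np.array([91.0, 88.5, 93.2, 90.1, 86.7,
                       92.4, 89.8, 94.0, 87.9, 90.6])  # hypothetical (%)

mr = np.abs(np.diff(pass_rates))        # moving ranges between fractions
mr_bar, x_bar = mr.mean(), pass_rates.mean()
ucl = x_bar + 2.66 * mr_bar             # individuals-chart limits
lcl = x_bar - 2.66 * mr_bar
sigma_hat = mr_bar / 1.128              # d2 constant for subgroups of 2

lsl = 62.5                              # IMRT control limit from the paper
cpl = (x_bar - lsl) / (3 * sigma_hat)   # one-sided process capability

print(f"I-chart limits: [{lcl:.1f}, {ucl:.1f}] %")
print("out-of-control fractions:",
      np.flatnonzero((pass_rates > ucl) | (pass_rates < lcl)))
print(f"CPL = {cpl:.2f}  (values below 1 flag room for improvement)")
```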
Perez, Louis A; Chou, Kang Wei; Love, John A; van der Poll, Thomas S; Smilgies, Detlef-M; Nguyen, Thuc-Quyen; Kramer, Edward J; Amassian, Aram; Bazan, Guillermo C
2013-11-26
Solvent additive processing can lead to drastic improvements in the power conversion efficiency (PCE) in solution processable small molecule (SPSM) bulk heterojunction solar cells. In situ grazing incidence wide-angle X-ray scattering is used to investigate the kinetics of crystallite formation during and shortly after spin casting. The additive is shown to have a complex effect on structural evolution invoking polymorphism and enhanced crystalline quality of the donor SPSM. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Mason, M D; Moore, R; Jones, G; Lewis, G; Donovan, J L; Neal, D E; Hamdy, F C; Lane, J A; Staffurth, J N
2016-09-01
The treatment of prostate cancer has evolved markedly over the last 40 years, including radiotherapy, notably with escalated dose and targeting. However, the optimal treatment for localised disease has not been established in comparative randomised trials. The aim of this article is to describe the history of prostate radiotherapy trials, including their quality assurance processes, and to compare these with the ProtecT trial. The UK ProtecT randomised trial compares external beam conformal radiotherapy, surgery and active monitoring for clinically localised prostate cancer and will report on the primary outcome (disease-specific mortality) in 2016 following recruitment between 1999 and 2009. The embedded quality assurance programme consists of on-site machine dosimetry at the nine trial centres, a retrospective review of outlining and adherence to dose constraints based on the trial protocol in 54 participants (randomly selected, around 10% of the total randomised to radiotherapy, n = 545). These quality assurance processes and results were compared with prostate radiotherapy trials of a comparable era. There has been an increasingly sophisticated quality assurance programme in UK prostate radiotherapy trials over the last 15 years, reflecting dose escalation and treatment complexity. In ProtecT, machine dosimetry results were comparable between trial centres and with the UK RT01 trial. The outlining review showed that most deviations were clinically acceptable, although three (1.4%) may have been of clinical significance and were related to outlining of the prostate. Seminal vesicle outlining varied, possibly due to several prostate trials running concurrently with different protocols. Adherence to dose constraints in ProtecT was considered acceptable, with 80% of randomised participants having two or less deviations and planning target volume coverage was excellent. The ProtecT trial quality assurance results were satisfactory and comparable with trials of its era. Future trials should aim to standardise treatment protocols and quality assurance programmes where possible to reduce complexities for centres involved in multiple trials. Copyright © 2016. Published by Elsevier Ltd.
Variability in Rheumatology day care hospitals in Spain: VALORA study.
Hernández Miguel, María Victoria; Martín Martínez, María Auxiliadora; Corominas, Héctor; Sanchez-Piedra, Carlos; Sanmartí, Raimon; Fernandez Martinez, Carmen; García-Vicuña, Rosario
To describe the variability of Rheumatology day care hospital units (DCHUs) in Spain in terms of structural resources and operating processes. Multicenter descriptive study with data from a self-completed DCHU self-assessment questionnaire based on the DCHU quality standards of the Spanish Society of Rheumatology. Structural resources and operating processes were analyzed and stratified by hospital complexity (regional, general, major and complex). Variability was determined using the coefficient of variation (CV) of the clinically relevant variable that presented statistically significant differences when compared across centers. A total of 89 hospitals (16 autonomous regions and Melilla) were included in the analysis: 11.2% of hospitals were regional, 22.5% general, 27% major and 39.3% complex. A total of 92% of DCHUs were polyvalent. The number of treatments applied, the coordination between DCHUs and hospital pharmacy, and the postgraduate training process were the variables that showed statistically significant differences depending on hospital complexity. The highest rate of rheumatologic treatments was found in complex hospitals (2.97 per 1,000 population), and the lowest in general hospitals (2.01 per 1,000 population). The CV was 0.88 in major hospitals, 0.86 in regional, 0.76 in general, and 0.72 in complex hospitals. There was variability in the number of treatments delivered in DCHUs, greater in major hospitals and then in regional centers. Nonetheless, the variability in terms of structure and function does not seem due to differences in center complexity. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.
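The dispersion statistic used here is the coefficient of variation, which scales the standard deviation by the mean so that strata with different average treatment rates can be compared on a single dimensionless scale:

```latex
CV = \frac{s}{\bar{x}}
```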
NASA Astrophysics Data System (ADS)
Ghasemi Nejhad, M. N.
1993-04-01
The on-line consolidation of thermoplastic composites is a relatively new technology that can be used to manufacture composite parts with complex geometries. The localized melting/solidification technique employed in this process can reduce the residual stresses and allow for improved dimensional stability and performance. An additional advantage of this technique is the elimination of the curing steps which are necessary in the processing of thermoset-matrix composites. This article presents the effects of processing parameters on processability in on-line consolidation of thermoplastic composites for tape-laying and filament-winding processes employing anisotropic thermal analyses. The results show that the heater size, preheating conditions, and tow thickness can significantly affect the processing window which, in turn, affects the production rate and the quality of the parts.
Ngantcha, Marcus; Le-Pogam, Marie-Annick; Calmus, Sophie; Grenier, Catherine; Evrard, Isabelle; Lamarche-Vadel, Agathe; Rey, Grégoire
2017-08-22
Results of associations between process and mortality indicators, both used for the external assessment of hospital care quality or public reporting, differ strongly across studies. However, most of those studies were conducted in North America or the United Kingdom. Providing new evidence based on French data could fuel the international debate on quality of care indicators and help inform French policy-makers. The objective of our study was to explore whether optimal care delivery in French hospitals, as assessed by their Hospital Process Indicators (HPIs), is associated with low Hospital Standardized Mortality Ratios (HSMRs). The French National Authority for Health (HAS) routinely collects, for each hospital located in France, a set of mandatory HPIs. Five HPIs were selected among the process indicators collected by the HAS in 2009. They were measured using random samples of 60 to 80 medical records from inpatients admitted between January 1st, 2009 and December 31st, 2009, in accordance with selection criteria. HSMRs were estimated at 30, 60 and 90 days post-admission (dpa) using administrative health data extracted from the national health insurance information system (SNIIR-AM), which covers 77% of the French population. Associations between HPIs and HSMRs were assessed by Poisson regression models corrected for measurement errors with a simulation-extrapolation (SIMEX) method. Most associations studied were not statistically significant. Only two process indicators were found to be associated with HSMRs. Completeness and quality of anesthetic records was negatively associated with the 30 dpa HSMR (0.72 [0.52-0.99]). Early detection of nutritional disorders was negatively associated with all HSMRs: 30 dpa (0.71 [0.54-0.95]), 60 dpa (0.51 [0.39-0.67]) and 90 dpa (0.52 [0.40-0.68]). In the absence of a gold standard of quality of care measurement, the limited number of associations suggests that in-depth methodological improvements are needed to better determine associations between process and mortality indicators. A smart utilization of both process and outcome indicators is necessary to capture the complexity of hospital quality of care.
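The SIMEX correction mentioned above can be sketched compactly: refit the Poisson regression after adding progressively larger amounts of simulated measurement error, then extrapolate the coefficient back to zero total error. The sketch below uses simulated data, a single noise draw per level for brevity (the full procedure averages over many draws), and is not the study's code.

```python
# Simulation-extrapolation (SIMEX) for a Poisson regression coefficient.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
x_true = rng.normal(0, 1, n)                  # true process indicator
tau2 = 0.3                                    # assumed known error variance
x_obs = x_true + rng.normal(0, np.sqrt(tau2), n)
y = rng.poisson(np.exp(0.5 - 0.7 * x_true))   # mortality-like counts

lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0]) # extra-noise multipliers
betas = []
for lam in lambdas:
    x_sim = x_obs + rng.normal(0, np.sqrt(lam * tau2), n)
    fit = sm.GLM(y, sm.add_constant(x_sim),
                 family=sm.families.Poisson()).fit()
    betas.append(fit.params[1])

# Quadratic extrapolation to lambda = -1, i.e. zero measurement error.
beta_simex = np.polyval(np.polyfit(lambdas, betas, 2), -1.0)
print(f"naive beta: {betas[0]:.3f}   SIMEX-corrected beta: {beta_simex:.3f}")
```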
Atwood, Robert C.; Bodey, Andrew J.; Price, Stephen W. T.; Basham, Mark; Drakopoulos, Michael
2015-01-01
Tomographic datasets collected at synchrotrons are becoming very large and complex, and, therefore, need to be managed efficiently. Raw images may have high pixel counts, and each pixel can be multidimensional and associated with additional data such as those derived from spectroscopy. In time-resolved studies, hundreds of tomographic datasets can be collected in sequence, yielding terabytes of data. Users of tomographic beamlines are drawn from various scientific disciplines, and many are keen to use tomographic reconstruction software that does not require a deep understanding of reconstruction principles. We have developed Savu, a reconstruction pipeline that enables users to rapidly reconstruct data to consistently create high-quality results. Savu is designed to work in an ‘orthogonal’ fashion, meaning that data can be converted between projection and sinogram space throughout the processing workflow as required. The Savu pipeline is modular and allows processing strategies to be optimized for users' purposes. In addition to the reconstruction algorithms themselves, it can include modules for identification of experimental problems, artefact correction, general image processing and data quality assessment. Savu is open source, open licensed and ‘facility-independent’: it can run on standard cluster infrastructure at any institution. PMID:25939626
Role of future scenarios in understanding deep uncertainty in long-term air quality management.
Gamas, Julia; Dodder, Rebecca; Loughlin, Dan; Gage, Cynthia
2015-11-01
The environment and its interactions with human systems, whether economic, social, or political, are complex. Relevant drivers may disrupt system dynamics in unforeseen ways, making it difficult to predict future conditions. This kind of "deep uncertainty" presents a challenge to organizations faced with making decisions about the future, including those involved in air quality management. Scenario Planning is a structured process that involves the development of narratives describing alternative future states of the world, designed to differ with respect to the most critical and uncertain drivers. The resulting scenarios are then used to understand the consequences of those futures and to prepare for them with robust management strategies. We demonstrate a novel air quality management application of Scenario Planning. Through a series of workshops, important air quality drivers were identified. The most critical and uncertain drivers were found to be "technological development" and "change in societal paradigms." These drivers were used as a basis to develop four distinct scenario storylines. The energy and emissions implications of each storyline were then modeled using the MARKAL energy system model. NOx emissions were found to decrease for all scenarios, largely a response to existing air quality regulations, whereas SO2 emissions ranged from 12% greater to 7% lower than 2015 emissions levels. Future-year emissions differed considerably from one scenario to another, however, with key differentiating factors being transition to cleaner fuels and energy demand reductions. Application of scenarios in air quality management provides a structured means of sifting through and understanding the dynamics of the many complex driving forces affecting future air quality. Further, scenarios provide a means to identify opportunities and challenges for future air quality management, as well as a platform for testing the efficacy and robustness of particular management options across wide-ranging conditions.
Investigation on the innovative impact hydroforming technology
NASA Astrophysics Data System (ADS)
Lihui, Lang; Shaohua, Wang; Chunlei, Yang
2013-05-01
Hydroforming has developed rapidly in recent years, offering good forming quality at lower cost. However, it still cannot meet the requirements of forming complex parts with small features, such as convex tables or bars, which are widely employed in the automotive and aircraft industries. In impact hydroforming, most features are formed by hydroforming and the small features are then rapidly reshaped by high-intensity impact energy in a very short time after the traditional hydroforming step. The impact pressure rises to its peak in 10 ms, which corresponds to dynamic loading. In this paper, the impact hydroforming process is proposed. The generation and transmission of impact hydroforming energy and the impact shock wave were studied and simulated. The deformation process of metal disks under the dynamic impact loading condition showed that impact hydroforming is an effective technology for forming complex parts with small features.
NASA Technical Reports Server (NTRS)
Peck, Charles C.; Dhawan, Atam P.; Meyer, Claudia M.
1991-01-01
A genetic algorithm is used to select the inputs to a neural network function approximator. In the application considered, modeling critical parameters of the space shuttle main engine (SSME), the functional relationship between measured parameters is unknown and complex. Furthermore, the number of possible input parameters is quite large. Many approaches have been used for input selection, but they are either subjective or do not consider the complex multivariate relationships between parameters. Due to the optimization and space searching capabilities of genetic algorithms they were employed to systematize the input selection process. The results suggest that the genetic algorithm can generate parameter lists of high quality without the explicit use of problem domain knowledge. Suggestions for improving the performance of the input selection process are also provided.
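The mechanics of GA-driven input selection can be illustrated compactly: chromosomes are binary masks over the candidate inputs and fitness is cross-validated accuracy of a function approximator. In the sketch below the sensor data are synthetic and a ridge regression stands in for the neural network to keep the example fast; none of it reflects the SSME parameter set.

```python
# Toy genetic algorithm selecting inputs for a function approximator.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n, p = 300, 12                                # 12 candidate inputs
X = rng.normal(size=(n, p))
y = 2 * X[:, 0] - 3 * X[:, 3] + X[:, 7] + rng.normal(0, 0.3, n)

def fitness(mask: np.ndarray) -> float:
    """Held-out score of the approximator restricted to masked inputs."""
    if mask.sum() == 0:
        return -np.inf
    return cross_val_score(Ridge(), X[:, mask.astype(bool)], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, p))        # random initial population
for gen in range(30):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)][-10:]   # truncation selection
    cuts = rng.integers(1, p, size=10)
    children = np.array([
        np.concatenate([parents[i][:c], parents[(i + 1) % 10][c:]])
        for i, c in enumerate(cuts)           # one-point crossover
    ])
    flips = rng.random(children.shape) < 0.05 # bit-flip mutation
    pop = np.vstack([parents, np.where(flips, 1 - children, children)])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected inputs:", np.flatnonzero(best))  # ideally recovers 0, 3, 7
```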
Gershengorn, Hayley B; Kocher, Robert; Factor, Phillip
2014-02-01
The business community has developed strategies to ensure the quality of the goods or services they produce and to improve the management of multidisciplinary work teams. With modification, many of these techniques can be imported into intensive care units (ICUs) to improve clinical operations and patient safety. In Part I of a three-part ATS Seminar series, we argue for adopting business management strategies in ICUs and set forth strategies for targeting selected quality improvement initiatives. These tools are relevant to health care today as focus is placed on limiting low-value care and measuring, reporting, and improving quality. In the ICU, the complexity of illness and the need to standardize processes make these tools even more appealing. Herein, we highlight four techniques to help prioritize initiatives. First, the "80/20 rule" mandates focus on the few (20%) interventions likely to drive the majority (80%) of improvement. Second, benchmarking--a process of comparison with peer units or institutions--is essential to identifying areas of strength and weakness. Third, root cause analyses, in which structured retrospective reviews of negative events are performed, can be used to identify and fix systems errors. Finally, failure mode and effects analysis--a process aimed at prospectively identifying potential sources of error--allows for systems fixes to be instituted in advance to prevent negative outcomes. These techniques originated in fields other than health care, yet adoption has and can help ICU managers prioritize issues for quality improvement.
[Review on HSPF model for simulation of hydrology and water quality processes].
Li, Zhao-fu; Liu, Hong-Yu; Li, Yan
2012-07-01
Hydrological Simulation Program-FORTRAN (HSPF), written in FORTRAN, is one of the best semi-distributed hydrology and water quality models, and was first developed from the Stanford Watershed Model. Many studies on HSPF model application have been conducted. It can represent the contributions of sediment, nutrients, pesticides, conservatives and fecal coliforms from agricultural areas, continuously simulate water quantity and quality processes, as well as the effects of climate change and land use change on water quantity and quality. HSPF consists of three basic application components: PERLND (Pervious Land Segment), IMPLND (Impervious Land Segment), and RCHRES (free-flowing reach or mixed reservoirs). In general, HSPF has extensive application in the modeling of hydrology and water quality processes and the analysis of climate change and land use change; however, it has seen limited use in China. The main problems with HSPF include: (1) some algorithms and procedures still need revision; (2) due to the high standard for input data, the accuracy of the model is limited by spatial and attribute data; (3) the model is only applicable to the simulation of well-mixed rivers, reservoirs and one-dimensional water bodies, and must be integrated with other models to solve more complex problems. At present, studies on HSPF model development are ongoing, including revision of the model platform, extension of model functions, method development for model calibration, and analysis of parameter sensitivity. With the accumulation of basic data and improvement of data sharing, the HSPF model will be applied more extensively in China.
NASA Astrophysics Data System (ADS)
Katchasuwanmanee, Kanet; Cheng, Kai; Bateman, Richard
2016-09-01
As energy efficiency is one of the key essentials for sustainability, the development of an energy- and resource-efficient manufacturing system is among the great challenges facing industry today. Meanwhile, the availability of advanced technological innovation has created more complex manufacturing systems that involve a large variety of processes and machines serving different functions. To extend the limited knowledge on energy-efficient scheduling, the research presented in this paper attempts to model the production schedule at an operation process by considering the balance of energy consumption reduction in production, production work flow (productivity) and quality. An innovative systematic approach to manufacturing energy-resource efficiency is proposed with virtual simulation as a predictive modelling enabler, which provides real-time manufacturing monitoring, virtual displays and decision-making, and consequently an analytical and multidimensional correlation analysis of the interdependent relationships among energy consumption, work flow and quality errors. The regression analysis results demonstrate positive relationships between work flow and quality errors and between work flow and energy consumption. When production scheduling is controlled through optimization of work flow, quality errors and overall energy consumption, energy-resource efficiency can be achieved in production. Together, this proposed multidimensional modelling and analysis approach provides optimal conditions for production scheduling in the manufacturing system by taking account of production quality, energy consumption and resource efficiency, which can lead to key competitive advantages and the sustainability of system operations in the industry.
Feedstock powder processing research needs for additive manufacturing development
Anderson, Iver E.; White, Emma M. H.; Dehoff, Ryan
2018-02-01
Additive manufacturing (AM) promises to redesign traditional manufacturing by enabling the ultimate in agility for rapid component design changes in commercial products and for fabricating complex integrated parts. By significantly increasing the quality and yield of metallic alloy powders, the pace of design, development, and deployment of the most promising AM approaches can be greatly accelerated, resulting in rapid commercialization of these advanced manufacturing methods. By successful completion of a critical suite of processing research tasks intended to greatly enhance gas atomized powder quality and the precision and efficiency of powder production, researchers can help promote continued rapid growth of AM. Other powder-based or spray-based advanced manufacturing methods could also benefit from these research outcomes, promoting the next wave of sustainable manufacturing technologies for conventional and advanced materials.
Experiencing health care service quality: through patients' eyes.
Schembri, Sharon
2015-02-01
The primary aim of the present study was to consider health care service quality from the patients' perspective, specifically through the patient's eyes. A narrative analysis was performed on 300 patient stories. This rigorous analysis of patient stories is designed to identify and describe health care service quality through patients' eyes in an authentic and accurate, experiential manner. The findings show that there are variant and complex ways that patients experience health care service quality. Patient stories offer an authentic view of the complex ways that patients experience health care service quality. Narrative analysis is a useful tool to identify and describe how patients experience health care service quality. Patients experience health care service quality in complex and varying ways.
1980-12-05
planning process. Do the positive attributes of high complexity weapons outweigh their negative qualities? What effect does our investment in...account for the future consequences of current decisions. We advocate increased budgets because we perceive a growing threat, yet at the same time we...Defense and therefore should not be construed as reflecting an official position of the Department. This document has been approved for public release.
Assurance Evaluation for OSS Adoption in a Telco Context
NASA Astrophysics Data System (ADS)
Ardagna, Claudio A.; Banzi, Massimo; Damiani, Ernesto; El Ioini, Nabil; Frati, Fulvio
Software Assurance (SwA) is a complex concept that involves different stages of a software development process and may be defined differently depending on its focus, for instance software quality, security, or dependability. In Computer Science, the term assurance refers to all activities necessary to provide enough confidence that a software product will satisfy its users' functional and non-functional requirements.
Pharmaceutical process chemistry: evolution of a contemporary data-rich laboratory environment.
Caron, Stéphane; Thomson, Nicholas M
2015-03-20
Over the past 20 years, the industrial laboratory environment has gone through a major transformation in the industrial process chemistry setting. In order to discover and develop robust and efficient syntheses and processes for a pharmaceutical portfolio with growing synthetic complexity and increased regulatory expectations, the round-bottom flask and other conventional equipment familiar to a traditional organic chemistry laboratory are being replaced. The new process chemistry laboratory fosters multidisciplinary collaborations by providing a suite of tools capable of delivering deeper process understanding through mechanistic insights and detailed kinetics translating to greater predictability at scale. This transformation is essential to the field of organic synthesis in order to promote excellence in quality, safety, speed, and cost efficiency in synthesis.
Vaccine provision: Delivering sustained & widespread use.
Preiss, Scott; Garçon, Nathalie; Cunningham, Anthony L; Strugnell, Richard; Friedland, Leonard R
2016-12-20
The administration of a vaccine to a recipient is the final step in a development and production process that may have begun several decades earlier. Here we describe the scale and complexity of the processes that bring a candidate vaccine through clinical development to the recipient. These challenges include ensuring vaccine quality (between 100 and 500 different quality control tests are performed during production to continually assess safety, potency and purity); making decisions about optimal vaccine presentation (pre-filled syringes versus multi-dose vials) that affect capacity and supply; and the importance of maintaining the vaccine cold chain (most vaccines have stringent storage temperature requirements necessary to maintain activity and potency). The ultimate aim is to make sure that an immunogenic product matching the required specifications reaches the recipient. The process from concept to licensure takes 10-30 years. Vaccine licensure is based on a file submitted to regulatory agencies which contains the comprehensive compilation of chemistry and manufacturing information, assay procedures, preclinical and clinical trial results, and proposals for post-licensure effectiveness and safety data collection. Expedited development and licensure pathways may be sought in emergency settings: e.g., the 2009 H1N1 influenza pandemic, the 2014 West African Ebola outbreak and meningococcal serogroup B meningitis outbreaks in the United States and New Zealand. Vaccines vary in the complexity of their manufacturing process. Influenza vaccines are particularly challenging to produce and delays in manufacturing may occur, leading to vaccine shortages during the influenza season. Shortages can be difficult to resolve due to long manufacturing lead times and stringent, but variable, local regulations. New technologies are driving the development of new vaccines with simplified manufacturing requirements and with quality specifications that can be confirmed with fewer tests. These technologies could have far-reaching effects on supply, cost of goods, and on response timing to a medical need until product availability. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
The impact of the Bologna process on nursing higher education in Europe: a review.
Collins, Shawn; Hewer, Ian
2014-01-01
Changes are occurring in global higher education. Nursing is not exempt from these changes, and must adapt in order to be competitive in a global market. The Bologna process has been integral in the last decade in modernizing European higher education. However, modernization does not occur without challenges. This paper addresses the Bologna process and the challenges it presents to nursing higher education in Europe. To describe the Bologna Process as it relates to European nursing education. Literature review via searches of the following electronic databases: Academic Search Premier, MEDLINE, PubMed, ERIC, and CINAHL. Search criteria included Bologna process, European higher education, nursing education, quality assurance, and ECTS. Twenty-four peer-reviewed articles were included as well as one peer-reviewed presentation, one commission report, and one book. Further investigation is required to address the complexities of the Bologna process and its evolutionary changes as it relates to nursing education in Europe. Change is not always easy, and is often complex, especially as it relates to cross-border education that involves governmental regulation. Bologna-member countries need to adapt to the ever-changing higher education environment or fall behind. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Kassapoglou, Christos; Dinicola, Al J.; Chou, Jack C.
1992-01-01
The autoclave-based THERM-X® process was evaluated by cocuring complex curved panels with frames and stiffeners. The process was shown to result in composite parts of high quality with good compaction at sharp-radius regions and corners of intersecting parts. The structural properties of the postbuckled panels fabricated were found to be equivalent to those of conventionally tooled hand laid-up parts. Significant savings in bagging time over conventional tooling were documented. Structural details such as cocured shear ties and embedded stiffener flanges in the skin were found to suppress failure modes such as failure at corners of intersecting members and skin-stiffener separation.
NASA Technical Reports Server (NTRS)
Schaefer, Jacob; Hanson, Curt; Johnson, Marcus A.; Nguyen, Nhan
2011-01-01
Three model reference adaptive controllers (MRAC) with varying levels of complexity were evaluated on a high performance jet aircraft and compared along with a baseline nonlinear dynamic inversion controller. The handling qualities and performance of the controllers were examined during failure conditions that induce coupling between the pitch and roll axes. Results from flight tests showed that with a roll-to-pitch input coupling failure, the handling qualities went from Level 2 with the baseline controller to Level 1 with the most complex MRAC tested. A failure scenario with the left stabilator frozen also showed improvement with the MRAC. Improvement in performance and handling qualities was generally seen as complexity was incrementally added; however, added complexity usually corresponds to increased verification and validation effort required for certification. The tradeoff between complexity and performance is thus important to a control system designer when implementing an adaptive controller on an aircraft. This paper investigates this relation through flight testing of several controllers of varying complexity.
Velázquez, Rocío; Zamora, Emiliano; Álvarez, Manuel; Álvarez, María L; Ramírez, Manuel
2016-10-01
The quality of traditional sparkling-wine depends on the aging process in the presence of dead yeast cells. These cells undergo a slow autolysis process thereby releasing some compounds, mostly colloidal polymers such as polysaccharides and mannoproteins, which influence the wine's foam properties and mouthfeel. Saccharomyces cerevisiae killer yeasts were tested to increase cell death and autolysis during mixed-yeast-inoculated second fermentation and aging. These yeasts killed sensitive strains in killer plate assays done under conditions of low pH and temperature similar to those used in sparkling-wine making, although some strains showed a different killer behaviour during the second fermentation. The fast killer effect improved the foam quality and mouthfeel of the mixed-inoculated wines, while the slow killer effect gave small improvements over single-inoculated wines. The effect was faster under high-pressure than under low-pressure conditions. Wine quality improvement did not correlate with the polysaccharide, protein, mannan, or aromatic compound concentrations, suggesting that the mouthfeel and foaming quality of sparkling wine are very complex properties influenced by other wine compounds and their interactions, as well as probably by the specific chemical composition of a given wine. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Nagurney, Anna; Besik, Deniz; Yu, Min
2018-04-01
In this paper, we construct a competitive food supply chain network model in which the profit-maximizing producers decide not only as to the volume of fresh produce produced and distributed using various supply chain network pathways, but they also decide, with the associated costs, on the initial quality of the fresh produce. Consumers, in turn, respond to the various producers' product outputs through the prices that they are willing to pay, given also the average quality associated with each producer or brand at the retail outlets. The quality of the fresh produce is captured through explicit formulae that incorporate time, temperature, and other link characteristics with links associated with processing, shipment, storage, etc. Capacities on links are also incorporated as well as upper bounds on the initial product quality of the firms at their production/harvesting sites. The governing concept of the competitive supply chain network model is that of Nash Equilibrium, for which alternative variational inequality formulations are derived, along with existence results. An algorithmic procedure, which can be interpreted as a discrete-time tatonnement process, is then described and applied to compute the equilibrium produce flow patterns and accompanying link Lagrange multipliers in a realistic case study, focusing on peaches, which includes disruptions.
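The Nash equilibrium of such a model is typically stated as a variational inequality; in generic form (the paper's specific feasible set and cost functions are not reproduced here), the problem is:

```latex
\text{find } X^{*} \in \mathcal{K} \ \text{ such that } \
\left\langle F(X^{*}),\, X - X^{*} \right\rangle \;\ge\; 0
\quad \forall\, X \in \mathcal{K},
```

where X stacks the producers' product flows and initial quality levels, \mathcal{K} encodes the link-capacity and quality bounds, and F collects the negative marginal profit expressions. The discrete-time tatonnement procedure mentioned above can then be read as an iterative projection scheme on this inequality.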
Dy, Sydney M; Purnell, Tanjala S
2012-02-01
High-quality provider-patient decision-making is key to quality care for complex conditions. We performed an analysis of key elements relevant to quality and complex, shared medical decision-making. Based on a search of electronic databases, including Medline and the Cochrane Library, as well as relevant articles' reference lists, reviews of tools, and annotated bibliographies, we developed a list of key concepts and applied them to a decision-making example. Key concepts identified included provider competence, trustworthiness, and cultural competence; communication with patients and families; information quality; patient/surrogate competence; and roles and involvement. We applied this concept list to a case example, shared decision-making for live donor kidney transplantation, and identified the likely most important concepts as provider and cultural competence, information quality, and communication with patients and families. This concept list may be useful for conceptualizing the quality of complex shared decision-making and in guiding research in this area. Copyright © 2011 Elsevier Ltd. All rights reserved.
[Contribution to the development of a process control method in a pelletizing plant]
NASA Astrophysics Data System (ADS)
Gosselin, Claude
This thesis, a collaborative effort between Ecole de technologie superieure and the ArcelorMittal Company, presents the development of a methodology for monitoring and quality control of multivariable industrial production processes. This innovation research mandate was developed at the ArcelorMittal Exploitation Miniere (AMEM) pellet plant in Port-Cartier (Quebec, Canada). With this undertaking, ArcelorMittal is striving to maintain its world-class level of excellence and continues to pursue initiatives that can augment its competitive advantage worldwide. The plant's gravimetric classification process was retained as a prototype and development laboratory due to its effect on the company's competitiveness and its impact on subsequent steps leading to the final production of iron oxide pellets. Concretely, the development of this expertise in process control and in situ monitoring will establish a firm base of knowledge in the fields of complex-system physical modeling, data reconciliation, statistical observers, multivariable control, and quality control using real-time monitoring of the desirability function. The hydraulic classifier is mathematically modeled. Using planned disturbances on the production line, an identification procedure was established to provide empirical estimates of the model's structural parameters. A new sampling campaign and a previously unpublished data collection and consolidation policy were implemented plant-wide. Access to these invaluable data sources has enabled the establishment of new thresholds that govern the production process and its control. Finally, as a substitute for the traditional quality control process, we have implemented a new strategy based on the use of the desirability function. Our innovation is not in using this function as an indicator of overall (economic) satisfaction with the production process, but rather in proposing it as an "observer" of the system's state. The first implementation steps have already demonstrated the method's feasibility as well as numerous other industrial impacts on production processes within the company. Namely, the emergence of the economic aspect as a strategic variable assures better governance of production processes where quality variables present strategic issues.
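The desirability function underpinning this strategy is usually taken in the Derringer-Suich form; as a reference point (the thesis's exact transformations and weights are not given here), a larger-is-better response y_i with lower limit L_i, target T_i and shape parameter s_i maps to:

```latex
d_i(y_i) =
\begin{cases}
0, & y_i < L_i, \\
\left( \dfrac{y_i - L_i}{T_i - L_i} \right)^{s_i}, & L_i \le y_i \le T_i, \\
1, & y_i > T_i,
\end{cases}
\qquad
D = \left( \prod_{i=1}^{m} d_i \right)^{1/m}.
```

Tracking the composite D (and its components d_i) in real time is what allows the function to act as an "observer" of the process state rather than a mere end-of-line acceptance score.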
Protocol for a realist review of workplace learning in postgraduate medical education and training.
Wiese, Anel; Kilty, Caroline; Bergin, Colm; Flood, Patrick; Fu, Na; Horgan, Mary; Higgins, Agnes; Maher, Bridget; O'Kane, Grainne; Prihodova, Lucia; Slattery, Dubhfeasa; Bennett, Deirdre
2017-01-19
Postgraduate medical education and training (PGMET) is a complex social process which happens predominantly during the delivery of patient care. The clinical learning environment (CLE), the context for PGMET, shapes the development of the doctors who learn and work within it, ultimately impacting the quality and safety of patient care. Clinical workplaces are complex, dynamic systems in which learning emerges from non-linear interactions within a network of related factors and activities. Those tasked with the design and delivery of postgraduate medical education and training need to understand the relationship between the processes of medical workplace learning and these contextual elements in order to optimise conditions for learning. We propose to conduct a realist synthesis of the literature to address the overarching questions: how, why and in what circumstances do doctors learn in clinical environments? This review is part of a funded project with the overall aim of producing guidelines and recommendations for the design of high quality clinical learning environments for postgraduate medical education and training. We have chosen realist synthesis as a methodology because of its suitability for researching complexity and producing answers useful to policymakers and practitioners. This realist synthesis will follow the steps and procedures outlined by Wong et al. in the RAMESES Publication Standards for Realist Synthesis and the Realist Synthesis RAMESES Training Materials. The core research team is a multi-disciplinary group of researchers, clinicians and health professions educators. The wider research group includes experts in organisational behaviour and human resources management as well as the key stakeholders: doctors in training, patient representatives and providers of PGMET. This study will draw from the published literature and programme, and substantive, theories of workplace learning, to describe context, mechanism and outcome configurations for PGMET. This information will be useful to policymakers and practitioners in PGMET, who will be able to apply our findings within their own contexts. Improving the quality of clinical learning environments can improve the performance, humanism and wellbeing of learners and improve the quality and safety of patient care. The review is not registered with the PROSPERO International Prospective Register of Systematic Reviews as the review objectives relate solely to education outcomes.
Implications of Modeling Uncertainty for Water Quality Decision Making
NASA Astrophysics Data System (ADS)
Shabman, L.
2002-05-01
The National Academy of Sciences report "Assessing the TMDL Approach to Water Quality Management" endorsed the watershed-based, ambient-water-quality-focused approach called for in the TMDL program. The committee felt that available data and models were adequate to move such a program forward, if the EPA and all stakeholders better understood the nature of the scientific enterprise and its application to the TMDL program. Specifically, the report called for greater acknowledgement of model prediction uncertainty in making and implementing TMDL plans. To ensure that such uncertainty is addressed in water quality decision making, the committee called for a commitment to "adaptive implementation" of water quality management plans. The committee found that the number and complexity of the interactions of multiple stressors, combined with model prediction uncertainty, mean that we need to avoid the temptation to make assurances that specific actions will result in attainment of particular water quality standards. Until the work on solving a water quality problem begins, analysts and decision makers cannot be sure what the correct solutions are, or even what water quality goals a community should be seeking. In complex systems we need to act in order to learn; adaptive implementation is a concurrent process of action and learning. Learning requires (1) continued monitoring of the waterbody to determine how it responds to the actions taken and (2) carefully designed experiments in the watershed. If we do not design learning into what we attempt, we are not doing adaptive implementation. Therefore, there needs to be an increased commitment to monitoring and experiments in watersheds that will lead to learning. This presentation will (1) explain the logic for adaptive implementation; (2) discuss the ways that water quality modelers could characterize and explain model uncertainty to decision makers; and (3) speculate on the implications of adaptive implementation for the setting of water quality standards, for the design of watershed monitoring programs, and for the regulatory rules governing TMDL program implementation.
Comparison of patients' assessments of the quality of stroke care with audit findings.
Howell, Esther; Graham, Chris; Hoffman, A; Lowe, D; McKevitt, Christopher; Reeves, Rachel; Rudd, A G
2007-12-01
To determine the extent of correlation between stroke patients' experiences of hospital care and the quality of services assessed in a national audit. Patients' assessments of their care derived from survey data were linked to data obtained in the National Sentinel Stroke Audit 2004 for 670 patients in 51 English NHS trusts. A measure of patients' experience of hospital stroke care was derived by summing responses to 31 survey items and grouping these into three broad concept domains: quality of care; information; and relationships with staff. Audit data were extracted from hospital admissions data and management information to assess the organisation of services, and obtained retrospectively from patient records to evaluate the delivery of care. Patient survey responses were compared with audit measures of organisation of care and compliance with clinical process standards. Patient experience scores were positively correlated with clinicians' assessment of the organisational quality of stroke care, but were largely unrelated to clinical process standards. Responses to individual questions regarding communication about diagnosis revealed a discrepancy between clinicians' and patients' reports. Better organised stroke care is associated with more positive patient experiences. Examining areas of disparity between patients' and clinicians' reports is important for understanding the complex nature of healthcare and for identifying areas for quality improvement. Future evaluations of the quality of stroke services should include a validated patient experience survey in addition to audit of clinical records.
The binary protein-protein interaction landscape of Escherichia coli
Rajagopala, Seesandra V.; Vlasblom, James; Arnold, Roland; Franca-Koh, Jonathan; Pakala, Suman B.; Phanse, Sadhna; Ceol, Arnaud; Häuser, Roman; Siszler, Gabriella; Wuchty, Stefan; Emili, Andrew; Babu, Mohan; Aloy, Patrick; Pieper, Rembert; Uetz, Peter
2014-01-01
Efforts to map the Escherichia coli interactome have identified several hundred macromolecular complexes, but direct binary protein-protein interactions (PPIs) have not been surveyed on a large scale. Here we performed yeast two-hybrid screens of 3,305 baits against 3,606 preys (~70% of the E. coli proteome) in duplicate to generate a map of 2,234 interactions, approximately doubling the number of known binary PPIs in E. coli. Integration of binary PPIs and genetic interactions revealed functional dependencies among components involved in cellular processes, including envelope integrity, flagellum assembly and protein quality control. Many of the binary interactions that could be mapped within multi-protein complexes were informative regarding internal topology and indicated that interactions within complexes are significantly more conserved than those interactions connecting different complexes. This resource will be useful for inferring bacterial gene function and provides a draft reference of the basic physical wiring network of this evolutionarily significant model microbe. PMID:24561554
Regulatory challenges and approaches to characterize nanomedicines and their follow-on similars.
Mühlebach, Stefan; Borchard, Gerrit; Yildiz, Selcan
2015-03-01
Nanomedicines are highly complex products and are the result of difficult-to-control manufacturing processes. Nonbiological complex drugs and their biological counterparts can comprise nanoparticles and therefore show nanomedicine characteristics. They consist of not fully known nonhomomolecular structures and can therefore not be characterized by physicochemical means alone. Also, intended copies of nanomedicines (follow-on similars) may have clinically meaningful differences, creating the regulatory challenge of how to grant a high degree of assurance for patients' benefit and safety. As an example, the current regulatory approach for marketing authorization of intended copies of nonbiological complex drugs appears inappropriate; moreover, a valid strategy incorporating the complexity of such systems remains undefined. To demonstrate sufficient similarity and comparability, a stepwise quality, nonclinical and clinical approach is necessary to obtain market authorization for follow-on products as therapeutic alternatives, substitutes and/or interchangeable products. To fill the regulatory gap, harmonized and science-based standards are needed.
A Mixed-Methods Research Framework for Healthcare Process Improvement.
Bastian, Nathaniel D; Munoz, David; Ventura, Marta
2016-01-01
The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.
Baruah, Ananta Madhab; Mahanta, Pradip Kumar
2003-10-22
Changes in the specific activities of polyphenol oxidase (PPO), peroxidase (POD), and protease and in the relative amounts of flavan-3-ols for eight genetically derived cultivated teas at various stages of leaf maturity and in four successive seasons were examined. A series of investigations was carried out to study the cross-reactivity of complex polyphenols and PPO-generated orange-yellow theaflavins, as well as of POD-oxidized substrates, producing the brown so-called thearubigins during fermented tea processing. From the estimation of five major catechins, PPO activities in young shoots, and theaflavin and thearubigin contents of crushed, torn, and curled (CTC) black teas, the superior variety and flavorful flush characteristics were refined. Notable protein hydrolysis by endogenous protease, as measured from free amino acids and the formation of tannin-protein complexes (browning products), was obtained for cultivar character and product quality. Results showed that process optimization with respect to time, temperature, moisture, and pH maximizes PPO-catalyzed desirable theaflavin pigments, whereas POD-mediated chemical reaction produces dull color.
USDA-ARS?s Scientific Manuscript database
In rice (Oryza sativa L.), end-use/cooking quality is vital for producers and millions of consumers worldwide. Grain quality is a complex trait with interacting genetic and environmental factors. Deciphering the complex genetic architecture associated with grain quality, will provide vital informati...
Design Automation Using Script Languages. High-Level CAD Templates in Non-Parametric Programs
NASA Astrophysics Data System (ADS)
Moreno, R.; Bazán, A. M.
2017-10-01
The main purpose of this work is to study the advantages offered by the application of traditional techniques of technical drawing in processes for automation of the design, with non-parametric CAD programs provided with scripting languages. Given that an example drawing can be solved with traditional step-by-step detailed procedures, it is possible to do the same with CAD applications and to generalize it later, incorporating references. In today's modern CAD applications, there are striking absences of solutions for building engineering: oblique projections (military and cavalier), 3D modelling of complex stairs, roofs, furniture, and so on. The use of geometric references (held as variables in script languages) and their incorporation into high-level CAD templates allows the automation of processes. Instead of repeatedly creating similar designs or modifying their data, users should be able to use these templates to generate future variations of the same design. This paper presents the automation process of several complex drawing examples based on CAD script files aided by parametric geometry calculation tools. The proposed method allows us to solve complex geometry designs not currently supported by CAD applications and to subsequently create new derivatives without user intervention. Automation in the generation of complex designs not only saves time but also increases the quality of the presentations and reduces the possibility of human errors.
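As a hedged illustration of the approach (not the authors' code), the following Python sketch plays the role of a high-level template: a stair section is generated from geometric references held in variables, so derivative designs only require new parameter values. The DXF output via the ezdxf library stands in for whatever scripting interface the target CAD program exposes.

```python
# Minimal script-driven CAD template sketch: regenerate stair variants by
# changing parameters instead of redrawing. Uses ezdxf to emit a DXF file.
import ezdxf

def stair_profile(total_rise, total_run, n_steps):
    """Return 2D polyline vertices of a simple stair section."""
    riser, tread = total_rise / n_steps, total_run / n_steps
    pts, x, y = [(0.0, 0.0)], 0.0, 0.0
    for _ in range(n_steps):
        y += riser
        pts.append((x, y))   # vertical riser edge
        x += tread
        pts.append((x, y))   # horizontal tread edge
    return pts

doc = ezdxf.new("R2010")
msp = doc.modelspace()
msp.add_lwpolyline(stair_profile(total_rise=2.80, total_run=3.60, n_steps=14))
doc.saveas("stair_template.dxf")  # a new variant is one parameter change away
```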
NASA Astrophysics Data System (ADS)
Lerner, R. N.; Lerner, D. N.; Surridge, B.; Paetzold, A.; Harris, B.; Anderson, C. W.
2005-12-01
In Europe, the Water Framework Directive (WFD) is providing a powerful regulatory driver to adopt integrated catchment management, and so pressing researchers to build suitable supporting tools. The WFD requires agencies to drive towards 'good ecological quality' by 2015. After the initial step of characterising water bodies and the pressures on them, the next substantive step is the preparation of river basin management plans and proposed programmes of measures by 2009. Ecological quality is a complex concept and poorly defined, unless it is taken as a simple measure such as the abundance of a particular species of organism. There is clearly substantial work to do to build a practical but sound definition of ecological quality; practical in the sense of being easy to measure and explain to stakeholders, and sound in the sense that it reflects ecological complexity within catchments, the variability between catchments, and the conflicting demands for goods and services that human society places upon the ecological system. However ecological quality is defined, it will be driven by four interacting groups of factors. These represent the physical, chemical, ecological and socio-economic environments within and encompassing the catchment. Some of these groupings are better understood than others; for example, hydrological processes and the transport of solutes are reasonably well understood, even though they remain research areas in their own right. There are much larger gaps in our understanding at the interfaces, i.e. predicting how, for example, hydrological processes such as flow and river morphology influence ecological quality. Overall, it is clear we are not yet in a position to build deterministic models of the overall ecological behaviour of a catchment. But we need predictive tools to support catchment management agencies in preparing robust plans. This poster describes our current exploration of soft modelling options to build a comprehensive macro-ecological model of UK catchments. This is taking place within the Catchment Science Centre, a joint venture between the University of Sheffield and the Environment Agency.
Husebo, Bettina S; Flo, Elisabeth; Aarsland, Dag; Selbaek, Geir; Testad, Ingelin; Gulla, Christine; Aasmul, Irene; Ballard, Clive
2015-09-15
Nursing home patients have complex mental and physical health problems, disabilities and social needs, combined with widespread prescription of psychotropic drugs. Preservation of their quality of life is an important goal. This can only be achieved within nursing homes that offer competent clinical conditions of treatment and care. COmmunication, Systematic assessment and treatment of pain, Medication review, Occupational therapy, Safety (COSMOS) is an effectiveness-implementation hybrid trial that combines and implements evidence-based interventions to improve staff competence and thereby the patients' quality of life, mental health and safety. The aim of this paper is to describe the development, content and implementation process of the COSMOS trial. COSMOS includes a 2-month pilot study with 128 participants distributed among nine Norwegian nursing homes, and a 4-month multicenter, cluster randomized effectiveness-implementation clinical hybrid trial with follow-up at month 9, including 571 patients from 67 nursing home units (one unit defined as one cluster). Clusters are randomized to the COSMOS intervention or current best practice (control group). The intervention group will receive a 2-day education program including written guidelines, repeated theoretical and practical training (credited education of caregivers, physicians and nursing home managers), case discussions and role play. A 1-day midway evaluation, information and interviews of nursing staff, and a telephone hotline all support the implementation process. Outcome measures include quality of life in late-stage dementia, neuropsychiatric symptoms, activities of daily living, pain, depression, sleep, medication, cost-utility analysis, hospital admission and mortality. Despite complex medical and psychosocial challenges, nursing home patients are often treated by staff possessing low-level skills and lacking education, in facilities with high staff turnover. Implementation of a research-based multicomponent intervention may improve staff's knowledge and competence and consequently the quality of life of nursing home patients in general and people with dementia in particular. ClinicalTrials.gov NCT02238652.
Data Processing Aspects of MEDLARS
Austin, Charles J.
1964-01-01
The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files. PMID:14119287
NASA Technical Reports Server (NTRS)
Zhang, Jiming; Gardiner, Robin A.; Kirlin, Peter S.; Boerstler, Robert W.; Steinbeck, John
1992-01-01
High quality YBa2Cu3O(7-x) films were grown in-situ on LaAlO3 (100) by a novel single liquid source plasma-enhanced metalorganic chemical vapor deposition process. The metalorganic complexes M(thd)n (thd = 2,2,6,6-tetramethyl-3,5-heptanedionate; M = Y, Ba, Cu) were dissolved in an organic solution and injected into a vaporizer immediately upstream of the reactor inlet. The single liquid source technique dramatically simplifies current CVD processing and can significantly improve process reproducibility. X-ray diffraction measurements indicated that single phase, highly c-axis oriented YBa2Cu3O(7-x) was formed in-situ at a substrate temperature of 680 C. The as-deposited films exhibited a mirror-like surface, a transition temperature Tc0 of approximately 89 K with ΔTc < 1 K, and Jc(77 K) = 10^6 A/cm^2.
Data Quality Objectives Process for Designation of K Basins Debris
DOE Office of Scientific and Technical Information (OSTI.GOV)
WESTCOTT, J.L.
2000-05-22
The U.S. Department of Energy has developed a schedule and approach for the removal of spent fuels, sludge, and debris from the K East (KE) and K West (KW) Basins, located in the 100 Area at the Hanford Site. The project that is the subject of this data quality objective (DQO) process is focused on the removal of debris from the K Basins and onsite disposal of the debris at the Environmental Restoration Disposal Facility (ERDF). This material previously has been dispositioned at the Hanford Low-Level Burial Grounds (LLBGs) or Central Waste Complex (CWC). The goal of this DQO process and the resulting Sampling and Analysis Plan (SAP) is to provide the strategy for characterizing and designating the K-Basin debris to determine if it meets the Environmental Restoration Disposal Facility Waste Acceptance Criteria (WAC), Revision 3 (BHI 1998). A critical part of the DQO process is to agree on regulatory and WAC interpretation, to support preparation of the DQO workbook and SAP.
Error affect inoculation for a complex decision-making task.
Tabernero, Carmen; Wood, Robert E
2009-05-01
Individuals bring knowledge, implicit theories, and goal orientations to group meetings. Group decisions arise out of the exchange of these orientations. This research explores how a trainee's exploratory and deliberate process (an incremental theory and learning goal orientation) impacts the effectiveness of individual and group decision-making processes. The effectiveness of this training program is compared with another program that included error affect inoculation (EAI). Subjects were 40 Spanish policemen in a training course. They were distributed across two training conditions for an individual and group decision-making task. In one condition, individuals received the Self-Guided Exploration plus Deliberation Process instructions, which emphasised exploring the options and testing hypotheses. In the other condition, individuals also received instructions based on error affect inoculation, which emphasised positive affective reactions to errors and mistakes when making decisions. Results show that the quality of decisions increases when the groups share their reasoning. The EAI intervention promotes sharing information, flexible initial viewpoints, and improving the quality of group decisions. Implications and future directions are discussed.
Historical evolution of medical quality assurance in the Department of Defense.
Granger, Elder; Boyer, John; Weiss, Richard; Linton, Andrea; Williams, Thomas V
2010-08-01
The Department of Defense (DoD) Military Health System (MHS) embodies decades of health care practice that has evolved in scope and complexity to meet the demands for quality care to which its beneficiaries are entitled. War, Base Realignment and Closure (BRAC), and other dynamic forces require the ongoing review and revision of health care policy and practice in military hospitals as well as the expanded network of civilian providers who care for our nation's soldiers, sailors, airmen, and marines and their families. The result has been an incrementally constructed quality assurance (QA) program with emphasis on organizational structures, programs, and systems, and the use of robust data sources and standard measures to analyze and improve processes, manage disease, assess patient perceptions of care, and ensure that a uniform health care benefit and high quality health care is accessible to all MHS beneficiaries.
Unassigned MS/MS Spectra: Who Am I?
Pathan, Mohashin; Samuel, Monisha; Keerthikumar, Shivakumar; Mathivanan, Suresh
2017-01-01
Recent advances in high resolution tandem mass spectrometry (MS) have resulted in the accumulation of high quality data. Paralleling these advances in instrumentation, bioinformatics software has been developed to analyze such quality datasets. In spite of these advances, data analysis in mass spectrometry still remains critical for protein identification. In addition, the complexity of the generated MS/MS spectra, the unpredictable nature of peptide fragmentation, sequence annotation errors, and posttranslational modifications have impeded the protein identification process. In a typical MS data analysis, about 60% of the MS/MS spectra remain unassigned. While some of these could be attributed to the low quality of the MS/MS spectra, a proportion can be classified as high quality. Further analysis may reveal how much of the unassigned MS spectra can be attributed to search space, sequence annotation errors, mutations, and/or posttranslational modifications. In this chapter, the tools used to identify proteins and ways to assign unassigned tandem MS spectra are discussed.
Saturno-Hernández, Pedro J; Gutiérrez-Reyes, Juan Pablo; Vieyra-Romero, Waldo Ivan; Romero-Martínez, Martín; O'Shea-Cuevas, Gabriel Jaime; Lozano-Herrera, Javier; Tavera-Martínez, Sonia; Hernández-Ávila, Mauricio
2016-01-01
To describe the conceptual framework and methods for implementation and analysis of the satisfaction survey of the Mexican System for Social Protection in Health. We analyze the methodological elements of the 2013, 2014 and 2015 surveys, including the instrument, sampling method and study design, conceptual framework, and characteristics and indicators of the analysis. The survey captures information on perceived quality and satisfaction. Sampling has national and state representation. Simple and composite indicators (an index of satisfaction and a rate of reported quality problems) are built and described. The analysis is completed using Pareto diagrams, correlations between indicators, and associations with satisfaction by means of multivariate models. The measurement of satisfaction and perceived quality is a complex but necessary process to comply with regulations and to identify strategies for improvement. The described survey presents a rigorous design and analysis focused on its utility for quality improvement.
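For illustration only, a minimal sketch of how the two composite indicators described above might be computed from raw survey responses; the scoring scales and equal weighting are assumptions, since the abstract does not specify them.

```python
# Composite indicator sketch: an index of satisfaction (percent of maximum
# score) and a rate of reported quality problems, from synthetic responses.
import numpy as np

# rows = respondents; satisfaction items scored 0-10 (assumed scale)
responses = np.random.default_rng(0).integers(0, 11, size=(500, 12))
problems = np.random.default_rng(1).integers(0, 2, size=(500, 8))  # 1 = problem

satisfaction_index = responses.mean(axis=1).mean() / 10 * 100
problem_rate = problems.any(axis=1).mean() * 100  # % reporting >= 1 problem
print(f"satisfaction index: {satisfaction_index:.1f}%, "
      f"problem rate: {problem_rate:.1f}%")
```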
Putting the process of care into practice.
Houck, S; Baum, N
1997-01-01
"Putting the process of care into practice" provides an interactive, visual model of outpatient resources and processes. It illustrates an episode of care from a fee-for-service as well as managed care perspective. The Care Process Matrix can be used for planning and staffing, as well as retrospectively to assess appropriate resource use within a practice. It identifies effective strategies for reducing the cost per episode of care and optimizing quality while moving from managing costs to managing the care process. Because of an overbuilt health care system, including an oversupply of physicians, success in the future will require redesigning the process of care and a coherent customer service strategy. The growing complexities of practice will require physicians to focus on several key competencies while outsourcing other functions such as billing and contracting.
Ameh, Soter; Gómez-Olivé, Francesc Xavier; Kahn, Kathleen; Tollman, Stephen M; Klipstein-Grobusch, Kerstin
2017-03-23
South Africa faces a complex dual burden of chronic communicable and non-communicable diseases (NCDs). In response, the Integrated Chronic Disease Management (ICDM) model was initiated in primary health care (PHC) facilities in 2011 to leverage the HIV/ART programme to scale-up services for NCDs, achieve optimal patient health outcomes and improve the quality of medical care. However, little is known about the quality of care in the ICDM model. The objectives of this study were to: i) assess patients' and operational managers' satisfaction with the dimensions of ICDM services; and ii) evaluate the quality of care in the ICDM model using Avedis Donabedian's theory of relationships between structure (resources), process (clinical activities) and outcome (desired result of healthcare) constructs as a measure of quality of care. A cross-sectional study was conducted in 2013 in seven PHC facilities in the Bushbuckridge municipality of Mpumalanga Province, north-east South Africa - an area underpinned by a robust Health and Demographic Surveillance System (HDSS). The patient satisfaction questionnaire (PSQ-18), with measures reflecting structure/process/outcome (SPO) constructs, was adapted and administered to 435 chronic disease patients and the operational managers of all seven PHC facilities. The adapted questionnaire contained 17 dimensions of care, including eight dimensions identified as priority areas in the ICDM model - critical drugs, equipment, referral, defaulter tracing, prepacking of medicines, clinic appointments, waiting time, and coherence. A structural equation model was fit to operationalise Donabedian's theory, using unidirectional, mediation, and reciprocal pathways. The mediation pathway showed that the relationships between structure, process and outcome represented quality systems in the ICDM model. Structure correlated with process (0.40) and outcome (0.75). Given structure, process correlated with outcome (0.88). Of the 17 dimensions of care in the ICDM model, three structure (equipment, critical drugs, accessibility), three process (professionalism, friendliness and attendance to patients) and three outcome (competence, confidence and coherence) dimensions reflected their intended constructs. Of the priority dimensions, referrals, defaulter tracing, prepacking of medicines, appointments, and patient waiting time did not reflect their intended constructs. Donabedian's theoretical framework can be used to provide evidence of quality systems in the ICDM model.
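The mediation pathway (structure -> process -> outcome) can be approximated as a simple path analysis. A hedged sketch of that idea on synthetic data follows; the published analysis fit a full structural equation model, so variable names and coefficients here are purely illustrative.

```python
# Path-analysis approximation of the mediation pathway with two regressions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 435  # number of surveyed patients in the study
structure = rng.normal(size=n)                              # latent proxy scores
process = 0.4 * structure + rng.normal(scale=0.9, size=n)
outcome = 0.6 * process + 0.3 * structure + rng.normal(scale=0.5, size=n)

m1 = sm.OLS(process, sm.add_constant(structure)).fit()      # a-path
X = sm.add_constant(np.column_stack([structure, process]))
m2 = sm.OLS(outcome, X).fit()                               # b- and c'-paths
indirect = m1.params[1] * m2.params[2]  # effect of structure mediated by process
print(f"a={m1.params[1]:.2f}, b={m2.params[2]:.2f}, indirect={indirect:.2f}")
```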
[Possibilities and perspectives of quality management in radiation oncology].
Seegenschmiedt, M H; Zehe, M; Fehlauer, F; Barzen, G
2012-11-01
The medical discipline radiation oncology and radiation therapy (treatment with ionizing radiation) has developed rapidly in the last decade due to new technologies (imaging, computer technology, software, organization) and is one of the most important pillars of tumor therapy. Structure and process quality play a decisive role in the quality of outcome results (therapy success, tumor response, avoidance of side effects) in this field. Since 2007 all institutions in the health and social system are committed to introduce and continuously develop a quality management (QM) system. The complex terms of reference, the complicated technical instruments, the highly specialized personnel and the time-consuming processes for planning, implementation and assessment of radiation therapy made it logical to introduce a QM system in radiation oncology, independent of the legal requirements. The Radiation Center Hamburg (SZHH) has functioned as a medical care center under medical leadership and management since 2009. The total QM and organization system implemented for the Radiation Center Hamburg was prepared in 2008 and 2009 and certified in June 2010 by the accreditation body (TÜV-Süd) for DIN EN ISO 9001:2008. The main function of the QM system of the SZHH is to make the basic principles understandable for insiders and outsiders, to have clear structures, to integrate management principles into the routine and therefore to organize the learning processes more effectively both for interior and exterior aspects.
Quality Improvement in Critical Care: Selection and Development of Quality Indicators
Martin, Claudio M.; The Quality Improvement in Critical Care Project
2016-01-01
Background. Caring for critically ill patients is complex and resource intensive. An approach to monitor and compare the function of different intensive care units (ICUs) is needed to optimize outcomes for patients and the health system as a whole. Objective. To develop and implement quality indicators for comparing ICU characteristics and performance within and between ICUs and regions over time. Methods. Canadian jurisdictions with established ICU clinical databases were invited to participate in an iterative series of face-to-face meetings, teleconferences, and web conferences. Eighteen adult intensive care units across 14 hospitals and 5 provinces participated in the process. Results. Six domains of ICU function were identified: safe, timely, efficient, effective, patient/family satisfaction, and staff work life. Detailed operational definitions were developed for 22 quality indicators. The feasibility was demonstrated with the collection of 3.5 years of data. Statistical process control charts and graphs of composite measures were used for data display and comparisons. Medical and nursing leaders as well as administrators found the system to be an improvement over prior methods. Conclusions. Our process resulted in the selection and development of 22 indicators representing 6 domains of ICU function. We have demonstrated the feasibility of such a reporting system. This type of reporting system will demonstrate variation between units and jurisdictions to help identify and prioritize improvement efforts. PMID:27493476
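As an illustrative sketch of the reporting approach mentioned above, the following fragment draws a statistical process control p-chart for a monthly proportion-type indicator; the indicator, sample sizes and 3-sigma limits are assumptions, not drawn from the project's data.

```python
# SPC p-chart sketch for a monthly ICU quality indicator (e.g., proportion of
# patients with a timely assessment); data are synthetic.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
n = np.full(36, 120)                      # monthly eligible patients (assumed)
p = rng.binomial(n, 0.85) / n             # observed monthly proportions
p_bar = p.mean()
sigma = np.sqrt(p_bar * (1 - p_bar) / n)  # binomial standard error per month
ucl, lcl = p_bar + 3 * sigma, p_bar - 3 * sigma

plt.plot(p, marker="o", label="monthly proportion")
plt.axhline(p_bar, color="k", label="centre line")
plt.plot(ucl, "r--", label="UCL/LCL (3-sigma)")
plt.plot(lcl, "r--")
plt.xlabel("month"); plt.ylabel("proportion meeting indicator"); plt.legend()
plt.show()
```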
Baumgart, André; Denz, Christof; Bender, Hans-Joachim; Schleppers, Alexander
2009-01-01
The complexity of the operating room (OR) requires that both structural (eg, department layout) and behavioral (eg, staff interactions) patterns of work be considered when developing quality improvement strategies. In our study, we investigated how these contextual factors influence outpatient OR processes and the quality of care delivered. The study setting was a German university-affiliated hospital performing approximately 6000 outpatient surgeries annually. During the 3-year-study period, the hospital significantly changed its outpatient OR facility layout from a decentralized (ie, ORs in adjacent areas of the building) to a centralized (ie, ORs in immediate vicinity of each other) design. To study the impact of the facility change on OR processes, we used a mixed methods approach, including process analysis, process modeling, and social network analysis of staff interactions. The change in facility layout was seen to influence OR processes in ways that could substantially affect patient outcomes. For example, we found a potential for more errors during handovers in the new centralized design due to greater interdependency between tasks and staff. Utilization of the mixed methods approach in our analysis, as compared with that of a single assessment method, enabled a deeper understanding of the OR work context and its influence on outpatient OR processes.
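The social network analysis component of such a study can be sketched as follows; the handover edges and role names are invented for illustration, not taken from the study.

```python
# Staff handover interactions as a graph; betweenness centrality flags roles
# that broker many handovers and may become error-prone bottlenecks.
import networkx as nx

G = nx.Graph()
handovers = [("anaesthetist", "or_nurse"), ("or_nurse", "recovery_nurse"),
             ("surgeon", "or_nurse"), ("surgeon", "anaesthetist"),
             ("recovery_nurse", "ward_nurse"), ("porter", "recovery_nurse")]
G.add_edges_from(handovers)

for role, score in sorted(nx.betweenness_centrality(G).items(),
                          key=lambda kv: -kv[1]):
    print(f"{role:15s} betweenness = {score:.2f}")
```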
Hybrid modeling as a QbD/PAT tool in process development: an industrial E. coli case study.
von Stosch, Moritz; Hamelink, Jan-Martijn; Oliveira, Rui
2016-05-01
Process understanding is emphasized in the process analytical technology initiative and the quality by design paradigm to be essential for manufacturing of biopharmaceutical products with consistent high quality. A typical approach to developing a process understanding is applying a combination of design of experiments with statistical data analysis. Hybrid semi-parametric modeling is investigated as an alternative method to pure statistical data analysis. The hybrid model framework provides flexibility to select model complexity based on available data and knowledge. Here, a parametric dynamic bioreactor model is integrated with a nonparametric artificial neural network that describes biomass and product formation rates as function of varied fed-batch fermentation conditions for high cell density heterologous protein production with E. coli. Our model can accurately describe biomass growth and product formation across variations in induction temperature, pH and feed rates. The model indicates that while product expression rate is a function of early induction phase conditions, it is negatively impacted as productivity increases. This could correspond with physiological changes due to cytoplasmic product accumulation. Due to the dynamic nature of the model, rational process timing decisions can be made and the impact of temporal variations in process parameters on product formation and process performance can be assessed, which is central for process understanding.
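The hybrid structure described above, parametric mass balances with a nonparametric rate model, can be sketched roughly as follows; the network weights are random placeholders rather than fitted values, and the state is simplified to biomass and product.

```python
# Hybrid semi-parametric sketch: ODE mass balances whose rates come from a
# tiny neural network fed with process conditions (temperature, pH, feed).
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
W1, b1 = rng.normal(scale=0.1, size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(scale=0.1, size=(2, 4)), np.zeros(2)

def rates(temp, ph, feed):
    """Nonparametric part: NN mapping conditions -> (mu, qp), kept positive."""
    h = np.tanh(W1 @ np.array([temp, ph, feed]) + b1)
    mu, qp = np.exp(W2 @ h + b2) * 0.1
    return mu, qp

def hybrid_ode(t, y, temp, ph, feed):
    X, P = y
    mu, qp = rates(temp, ph, feed)     # parametric balances, NN kinetics
    return [mu * X, qp * X]

sol = solve_ivp(hybrid_ode, (0.0, 20.0), [0.5, 0.0],
                args=(32.0, 7.0, 0.05), dense_output=True)
print("final biomass, product:", sol.y[:, -1])
```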
Economic-Oriented Stochastic Optimization in Advanced Process Control of Chemical Processes
Dobos, László; Király, András; Abonyi, János
2012-01-01
Finding the optimal operating region of chemical processes is an inevitable step toward improving economic performance. Usually the optimal operating region is situated close to process constraints related to product quality or process safety requirements. Higher profit can be realized only by assuring a relatively low frequency of violation of these constraints. A multilevel stochastic optimization framework is proposed to determine the optimal setpoint values of control loops with respect to predetermined risk levels, uncertainties, and costs of violation of process constraints. The proposed framework is realized as direct search-type optimization of Monte-Carlo simulation of the controlled process. The concept is illustrated throughout by a well-known benchmark problem related to the control of a linear dynamical system and the model predictive control of a more complex nonlinear polymerization process. PMID:23213298
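A toy version of the proposed framework, direct search over a Monte-Carlo simulation with a penalty for constraint violation, might look like the following; the profit model, disturbance and penalty values are illustrative assumptions.

```python
# Chance-constrained setpoint selection: Nelder-Mead direct search over a
# Monte-Carlo estimate of profit minus the expected cost of violations.
import numpy as np
from scipy.optimize import minimize

def expected_cost(x, n_mc=2000, limit=1.0, penalty=50.0):
    rng = np.random.default_rng(3)            # common random numbers keep the
    noise = rng.normal(scale=0.1, size=n_mc)  # search surface smooth
    realized = x[0] + noise                   # controlled variable around setpoint
    profit = 10.0 * realized                  # profit rises toward the constraint
    violations = realized > limit             # quality/safety limit exceeded
    return -(profit.mean() - penalty * violations.mean())

res = minimize(expected_cost, x0=[0.8], method="Nelder-Mead")
print("risk-adjusted optimal setpoint:", res.x[0])  # settles below the limit
```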
NASA Astrophysics Data System (ADS)
Wang, G.; Liu, L.; Chen, G.
2016-12-01
The complex environmental physical and chemical processes and their interplay with the associated biological responses are key to understanding the environmental microbiology underlying environmental remediation, water quality control, food safety, nutrient cycling, etc., yet remain poorly understood. Using experimental micromodels, we study how environmental conditions (e.g., hydration fluctuation, nutrient limitation, pH variation) affect microbial extracellular polymeric substances (EPS) production and their configuration within various hydrated surfaces, and the impacts on microbial motility, surface attachment, aggregation, and other bioremediation activities. To elucidate the potential mechanisms underlying the complex bio-physicochemical processes, we developed an individual-based and spatio-temporally resolved modeling platform that explicitly considers microscale aqueous-phase configuration, nutrient transport/diffusion, and the associated biophysical processes affecting the life history of individual microbial cells. We quantitatively explore the effects of the above microscale environmental processes on bio-physicochemical interactions affecting microbial growth, motility, surface attachment and aggregation, and shaping population interactions and functions. Simulation scenarios of microbially induced pollutant (e.g., roxarsone) biotransformation on various hydrated rough surfaces will also be presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ali T-Raissi
The aim of this work was to assess issues of cost and performance associated with the production and storage of hydrogen via the following three feedstocks: sub-quality natural gas (SQNG), ammonia (NH3), and water. Three technology areas were considered: (1) hydrogen production utilizing SQNG resources, (2) hydrogen storage in ammonia and amine-borane complexes for fuel cell applications, and (3) hydrogen from solar thermochemical cycles for splitting water. This report summarizes our findings with the following objectives: technoeconomic analysis of the feasibility of technology areas 1-3; evaluation of the hydrogen production cost for technology area 1; and feasibility of ammonia and/or amine-borane complexes (technology area 2) as a means of hydrogen storage on board fuel cell powered vehicles. For each technology area, we reviewed the open literature with respect to the following criteria: process efficiency, cost, safety, ease of implementation, and the impact of the latest materials innovations, if any. We employed various process analysis platforms, including FactSage chemical equilibrium software and Aspen Technologies' AspenPlus and HYSYS chemical process simulation programs, for determining the performance of the prospective hydrogen production processes.
Content standards for medical image metadata
NASA Astrophysics Data System (ADS)
d'Ornellas, Marcos C.; da Rocha, Rafael P.
2003-12-01
Medical images are at the heart of healthcare diagnostic procedures. They have provided not only a noninvasive means to view anatomical cross-sections of internal organs but also a means for physicians to evaluate the patient's diagnosis and monitor the effects of the treatment. For a medical center, the emphasis may shift from the generation of images to post-processing and data management, since the medical staff may generate even more processed images and other data from the original image after various analyses and post-processing. A medical image data repository for health care information systems is becoming a critical need. This data repository would contain comprehensive patient records, including information such as clinical data, related diagnostic images, and post-processed images. Due to the large volume and complexity of the data as well as the diversified user access requirements, the implementation of a medical image archive system will be a complex and challenging task. This paper discusses content standards for medical image metadata. In addition, it also focuses on the evaluation of image metadata content and metadata quality management.
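As a purely illustrative, non-normative example, a metadata record covering the content classes discussed above (acquisition context, clinical linkage and post-processing provenance) might be structured as follows; the field names are assumptions, not a published standard.

```python
# Sketch of a medical image metadata record; fields are hypothetical.
import json

image_metadata = {
    "patient_id": "anonymized-0042",
    "study": {"modality": "MR", "body_part": "abdomen", "acquired": "2003-11-05"},
    "clinical": {"diagnosis_code": "K80.2", "referring_unit": "surgery"},
    "post_processing": [
        {"step": "bias-field correction", "tool": "inhouse-v1", "date": "2003-11-06"},
        {"step": "3D reconstruction", "tool": "inhouse-v1", "date": "2003-11-06"},
    ],
    "quality": {"reviewed_by": "radiologist", "artifacts_noted": False},
}
print(json.dumps(image_metadata, indent=2))
```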
Shiver, Stacy A; Schmitt, Karla; Cooksey, Adrian
2009-01-01
The business of sexually transmitted disease (STD) prevention and control demands technology that is capable of supporting a wide array of program activities, from the processing of laboratory test results to the complex and confidential process involved in contact investigation. The need for a tool that enables public health officials to successfully manage the complex operations encountered in an STD prevention and control program, and the need to operate in an increasingly resource-poor environment, led the Florida Bureau of STD to develop the Patient Reporting Investigation Surveillance Manager. Its unique approach, technical architecture, and sociotechnical philosophy have made this business application successful in real-time monitoring of disease burden for local communities, identification of emerging outbreaks, monitoring and assurance of appropriate treatments, improving access to laboratory data, and improving the quality of data for epidemiologic analysis. Additionally, the effort attempted to create and release a product that promoted the Centers for Disease Control and Prevention's ideas for integration of programs and processes.
The genome editing toolbox: a spectrum of approaches for targeted modification.
Cheng, Joseph K; Alper, Hal S
2014-12-01
The increase in quality, quantity, and complexity of recombinant products heavily drives the need to predictably engineer model and complex (mammalian) cell systems. However, until recently, limited tools offered the ability to precisely manipulate their genomes, thus impeding the full potential of rational cell line development processes. Targeted genome editing can combine the advances in synthetic and systems biology with current cellular hosts to further push productivity and expand the product repertoire. This review highlights recent advances in targeted genome editing techniques, discussing some of their capabilities and limitations and their potential to aid advances in pharmaceutical biotechnology. Copyright © 2014 Elsevier Ltd. All rights reserved.
An improvement of vehicle detection under shadow regions in satellite imagery
NASA Astrophysics Data System (ADS)
Karim, Shahid; Zhang, Ye; Ali, Saad; Asif, Muhammad Rizwan
2018-04-01
The processing of satellite imagery depends upon image quality. Due to low resolution, it is difficult to extract accurate information to meet application requirements. For vehicle detection under shadow regions, we use HOG for feature extraction and an SVM for classification; HOG has proven a worthwhile tool for complex environments. Shadow images are particularly difficult for detection, yielding very low detection rates, so we focus on improving the detection rate under shadow regions through appropriate preprocessing. Vehicles in non-shadow regions are detected with a higher rate than those in shadow regions.
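A compact sketch of the named pipeline (HOG features with an SVM classifier) using scikit-image and scikit-learn is given below; the shadow preprocessing is represented by simple contrast stretching, and the training data are random stand-ins.

```python
# HOG + SVM patch classifier sketch; synthetic 64x64 patches stand in for
# labelled vehicle/background samples cut from satellite imagery.
import numpy as np
from skimage.feature import hog
from skimage import exposure
from sklearn.svm import LinearSVC

def features(patch):
    # preprocessing assumed for shadow regions: stretch local contrast first
    patch = exposure.rescale_intensity(patch, out_range=(0.0, 1.0))
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

rng = np.random.default_rng(0)
patches = rng.random((200, 64, 64))          # stand-in image patches
labels = rng.integers(0, 2, 200)             # 1 = vehicle, 0 = background
X = np.array([features(p) for p in patches])

clf = LinearSVC().fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```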
Pixel-based speckle adjustment for noise reduction in Fourier-domain OCT images
Zhang, Anqi; Xi, Jiefeng; Sun, Jitao; Li, Xingde
2017-01-01
Speckle resides in OCT signals and inevitably affects OCT image quality. In this work, we present a novel method for speckle noise reduction in Fourier-domain OCT images which utilizes the phase information of complex OCT data. In this method, the speckle area is pre-delineated pixelwise based on a phase-domain processing method and then adjusted by the results of wavelet shrinkage of the original image. A coefficient shrinkage method, such as wavelet or contourlet shrinkage, is applied afterwards to further suppress the speckle noise. Compared with conventional methods without speckle adjustment, the proposed method demonstrates significant improvement in image quality. PMID:28663860
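The coefficient-shrinkage stage can be sketched with PyWavelets as follows; the universal soft threshold used here is a common choice but an assumption, and the phase-based speckle delineation step is omitted.

```python
# Wavelet soft-thresholding sketch: shrink detail coefficients, keep the
# approximation band, and reconstruct the denoised image.
import numpy as np
import pywt

def wavelet_shrink(image, wavelet="db4", level=3):
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # noise level estimated from the finest diagonal subband (robust median)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    t = sigma * np.sqrt(2 * np.log(image.size))   # universal threshold
    shrunk = [coeffs[0]] + [
        tuple(pywt.threshold(c, t, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(shrunk, wavelet)

noisy = np.random.default_rng(1).normal(size=(256, 256))
print(wavelet_shrink(noisy).shape)
```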
Increase in competitiveness of housing-and-communal services
NASA Astrophysics Data System (ADS)
Skripnik, Oksana
2017-10-01
The problems interfering with the effective activity of the housing-and-communal complex are considered in the article. Several factors in the increase of competitiveness, and the importance of transactional expenses, are identified. The competitiveness of organizations in the sphere of housing-and-communal services is assessed as a set of basic elements (organizational and administrative, marketing, financial, and production elements, together with indicators of quality, development and labor) interconnected with the processes of the organization. The author argues that an increase in competitiveness is possible by carrying out organizational-administrative, innovative, technological and economic transformations, increasing the quality of services, and reducing the costs of producing and delivering services while providing new services.
Least-squares luma-chroma demultiplexing algorithm for Bayer demosaicking.
Leung, Brian; Jeon, Gwanggil; Dubois, Eric
2011-07-01
This paper addresses the problem of interpolating missing color components at the output of a Bayer color filter array (CFA), a process known as demosaicking. A luma-chroma demultiplexing algorithm is presented in detail, using a least-squares design methodology for the required bandpass filters. A systematic study of objective demosaicking performance and system complexity is carried out, and several system configurations are recommended. The method is compared with other benchmark algorithms in terms of CPSNR and S-CIELAB ∆E∗ objective quality measures and demosaicking speed. It was found to provide excellent performance and the best quality-speed tradeoff among the methods studied.
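The least-squares filter-design step can be illustrated in one dimension as follows; the paper's actual design is a 2D formulation for the CFA chroma carriers, so this fragment only conveys the methodology.

```python
# Least-squares design of a linear-phase FIR bandpass filter: fit the
# symmetric filter's amplitude response to a desired response on a grid.
import numpy as np

def ls_bandpass(n_taps, band, n_grid=512):
    """Design odd-length symmetric FIR taps minimizing squared frequency error."""
    M = (n_taps - 1) // 2
    w = np.linspace(0, np.pi, n_grid)                 # frequency grid
    desired = ((w >= band[0]) & (w <= band[1])).astype(float)
    # amplitude of a symmetric filter: A(w) = a0 + 2 * sum_k a_k cos(k w)
    C = np.hstack([np.ones((n_grid, 1)),
                   2 * np.cos(np.outer(w, np.arange(1, M + 1)))])
    a, *_ = np.linalg.lstsq(C, desired, rcond=None)
    return np.concatenate([a[:0:-1], a])              # mirror to full taps

h = ls_bandpass(31, band=(0.5 * np.pi, 0.9 * np.pi))
print(len(h), h.sum())
```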
Optimized suppression of coherent noise from seismic data using the Karhunen-Loève transform
NASA Astrophysics Data System (ADS)
Montagne, Raúl; Vasconcelos, Giovani L.
2006-07-01
Signals obtained in land seismic surveys are usually contaminated with coherent noise, among which the ground roll (Rayleigh surface waves) is of major concern, for it can severely degrade the quality of the information obtained from the seismic record. This paper presents an optimized filter based on the Karhunen-Loève transform for processing seismic images contaminated with ground roll. In this method, the contaminated region of the seismic record, to be processed by the filter, is selected in such a way as to correspond to the maximum of a properly defined coherence index. The main advantages of the method are that the ground roll is suppressed with negligible distortion of the remnant reflection signals and that the filtering procedure can be automated. The image processing technique described in this study should also be relevant for other applications where coherent structures embedded in a complex spatiotemporal pattern need to be identified in a more refined way. In particular, it is argued that the method is appropriate for processing optical coherence tomography images, whose quality is often degraded by coherent noise (speckle).
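A bare-bones illustration of KL-based coherent-noise suppression on synthetic data follows; the KL transform is represented by an SVD of the windowed traces, and the number of removed components is an assumption.

```python
# Within a coherent window, the leading singular components capture the
# trace-aligned ground roll; subtracting them preserves remnant reflections.
import numpy as np

rng = np.random.default_rng(5)
n_traces, n_samples = 48, 512
t = np.arange(n_samples)
ground_roll = np.outer(np.ones(n_traces), np.sin(2 * np.pi * t / 40.0))
reflections = 0.2 * rng.normal(size=(n_traces, n_samples))
window = ground_roll + reflections            # coherent noise dominates

U, s, Vt = np.linalg.svd(window, full_matrices=False)
k = 1                                         # components treated as ground roll
noise_model = (U[:, :k] * s[:k]) @ Vt[:k]
filtered = window - noise_model               # remnant reflections preserved
print("residual coherent energy:",
      float(np.linalg.norm(filtered.mean(axis=0))))
```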
Consciousness: a unique way of processing information.
Marchetti, Giorgio
2018-02-08
In this article, I argue that consciousness is a unique way of processing information, in that: it produces information, rather than purely transmitting it; the information it produces is meaningful for us; the meaning it has is always individuated. This uniqueness allows us to process information on the basis of our personal needs and ever-changing interactions with the environment, and consequently to act autonomously. Three main basic cognitive processes contribute to realize this unique way of information processing: the self, attention and working memory. The self, which is primarily expressed via the central and peripheral nervous systems, maps our body, the environment, and our relations with the environment. It is the primary means by which the complexity inherent to our composite structure is reduced into the "single voice" of a unique individual. It provides a reference system that (albeit evolving) is sufficiently stable to define the variations that will be used as the raw material for the construction of conscious information. Attention allows for the selection of those variations in the state of the self that are most relevant in the given situation. Attention originates and is deployed from a single locus inside our body, which represents the center of the self, around which all our conscious experiences are organized. Whatever is focused by attention appears in our consciousness as possessing a spatial quality defined by this center and the direction toward which attention is focused. In addition, attention determines two other features of conscious experience: periodicity and phenomenal quality. Self and attention are necessary but not sufficient for conscious information to be produced. Complex forms of conscious experiences, such as the various modes of givenness of conscious experience and the stream of consciousness, need a working memory mechanism to assemble the basic pieces of information selected by attention.
Experimental and numerical analysis of interlocking rib formation at sheet metal blanking
NASA Astrophysics Data System (ADS)
Bolka, Špela; Bratuš, Vitoslav; Starman, Bojan; Mole, Nikolaj
2018-05-01
Cores for electrical motors are typically produced by blanking laminations and then stacking them together with, for instance, interlocking ribs or welding. Strict geometrical tolerances, both on the lamination and on the stack, combined with complex part geometry and harder steel strip material, call for the use of predictive methods to optimize the process before actual blanking, to reduce costs and speed up the process. One of the major influences on the final stack geometry is the quality of the interlocking ribs. A rib is formed in one step and joined with the rib of the preceding lamination in the next. The quality of the joint determines the firmness of the stack and also influences its geometry. Geometrical and positional accuracy is thus crucial in the rib formation process. In this study, a complex experimental and numerical analysis of interlocking rib formation has been performed. The aim of the analysis is to numerically predict the shape of the rib in order to perform a numerical simulation of the stack formation in the next step of the process. A detailed experimental investigation has been performed in order to characterize the influential parameters of rib formation and the geometry of the ribs themselves, using classical and 3D laser microscopy. The formation of the interlocking rib is then simulated using Abaqus/Explicit. The Hill 48 constitutive material model is based on an extensive and novel material characterization process, combining data from in-plane and out-of-plane material tests to perform a 3D analysis of both rib formation and rib joining. The study shows good correlation between the experimental and numerical results.
Andrews, William J.; Stark, James R.; Fong, Alison L.; Fallon, James D.
2005-01-01
Although land use had substantial effects on ground-water quality, the distribution of contaminants in the aquifer also is affected by complex combinations of factors and processes that include sources of natural and anthropogenic contaminants, three-dimensional advective flow, physical and hydrologic settings, age and evolution of ground water, and transformation of chemical compounds along the flow system. Compounds such as nitrate and dissolved oxygen were greatest in water samples from the upgradient end of the flow system and near the water table. Specific conductance and dissolved solids increased along the flow system and with depth due to increase in residence time in the flow system and dissolution of aquifer materials.
Revisiting perceptions of quality of hospice care: managing for the ultimate referral.
Churchman, Richard; York, Grady S; Woodard, Beth; Wainright, Charles; Rau-Foster, Mary
2014-08-01
Hospice services provided in the final months of life are delivered through complex interpersonal relationships between caregivers, patients, and families. Often, service value and quality are defined by these interpersonal interactions. This understanding provides hospice leaders with an enormous opportunity to create processes that provide the optimal level of care during the last months of life. The authors argue that the ultimate referral is attained when a family member observes the care of a loved one, and the family member conveys a desire to receive the same quality of services their loved one received at that facility. The point of this article is to provide evidence that supports the methods to ultimately enhance the patient's and family's experience and increase the potential for the ultimate referral. © The Author(s) 2013.
Registration of interferometric SAR images
NASA Technical Reports Server (NTRS)
Lin, Qian; Vesecky, John F.; Zebker, Howard A.
1992-01-01
Interferometric synthetic aperture radar (INSAR) is a new way of performing topography mapping. Among the factors critical to mapping accuracy is the registration of the complex SAR images from repeated orbits. A new algorithm for registering interferometric SAR images is presented. A new figure of merit, the average fluctuation function of the phase difference image, is proposed to evaluate the fringe pattern quality. The process of adjusting the registration parameters according to the fringe pattern quality is optimized through a downhill simplex minimization algorithm. The results of applying the proposed algorithm to register two pairs of Seasat SAR images with a short baseline (75 m) and a long baseline (500 m) are shown. It is found that the average fluctuation function is a very stable measure of fringe pattern quality allowing very accurate registration.
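A schematic of the optimization loop, with a stand-in fluctuation measure and scipy's Nelder-Mead in place of the paper's downhill simplex implementation, might look like this; the data are synthetic and the metric is a simplified proxy for the average fluctuation function.

```python
# Registration by minimizing phase-difference fluctuation: shift one complex
# image, score the interferogram, and let Nelder-Mead refine the shift.
import numpy as np
from scipy.ndimage import gaussian_filter, shift as subpixel_shift
from scipy.optimize import minimize

rng = np.random.default_rng(9)
phase0 = gaussian_filter(rng.normal(size=(128, 128)), sigma=8) * 40
master = np.exp(1j * phase0)
true_shift = (1.3, -0.7)
slave = subpixel_shift(master.real, true_shift) \
        + 1j * subpixel_shift(master.imag, true_shift)

def fluctuation(params):
    """Mean local fluctuation of the phase-difference image (stand-in metric)."""
    dy, dx = params
    s = subpixel_shift(slave.real, (-dy, -dx)) \
        + 1j * subpixel_shift(slave.imag, (-dy, -dx))
    phase = np.angle(master * np.conj(s))       # phase-difference image
    return np.abs(np.diff(phase, axis=0)).mean() \
         + np.abs(np.diff(phase, axis=1)).mean()

res = minimize(fluctuation, x0=[0.0, 0.0], method="Nelder-Mead")
print("estimated misregistration:", res.x)      # should approach (1.3, -0.7)
```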
The Endoplasmic Reticulum-Associated Degradation Pathways of Budding Yeast
Thibault, Guillaume; Ng, Davis T.W.
2012-01-01
Protein misfolding is a common cellular event that can produce intrinsically harmful products. To reduce the risk, quality control mechanisms are deployed to detect and eliminate misfolded, aggregated, and unassembled proteins. In the secretory pathway, it is mainly the endoplasmic reticulum-associated degradation (ERAD) pathways that perform this role. Here, specialized factors are organized to monitor and process the folded states of nascent polypeptides. Despite the complex structures, topologies, and posttranslational modifications of client molecules, the ER mechanisms are the best understood among all protein quality-control systems. This is the result of convergent and sometimes serendipitous discoveries by researchers from diverse fields. Although major advances in ER quality control and ERAD came from all model organisms, this review will focus on the discoveries culminating from the simple budding yeast. PMID:23209158
Self-transcending meditation is good for mental health: why this should be the case.
Hankey, Alex; Shetkar, Rashmi
2016-06-01
A simple theory of health has recently been proposed: while poor quality regulation corresponds to poor quality health, so that improving regulation should improve health, optimal regulation optimizes function and optimizes health. Examining the term 'optimal regulation' in biological systems leads to a straightforward definition in terms of 'criticality' in complexity biology, a concept that seems to apply universally throughout biology. Criticality maximizes information processing and sensitivity of response to external stimuli, and for these reasons may be held to optimize regulation. In this way a definition of health has been given in terms of regulation, a scientific concept, which ties into detailed properties of complex systems, including brain cortices, and mental health. Models of experience and meditation built on complexity also point to criticality: it represents the condition making self-awareness possible, and is strengthened by meditation practices leading to the state of pure consciousness, the content-free state of mind in deep meditation. From this it follows that healthy function of the brain cortex, its sensitivity and consistency of response to external challenges, should improve by practicing techniques leading to content-free awareness, transcending the original focus introduced during practice. Evidence for this is reviewed.
Motion adaptive Kalman filter for super-resolution
NASA Astrophysics Data System (ADS)
Richter, Martin; Nasse, Fabian; Schröder, Hartmut
2011-01-01
Superresolution is a sophisticated strategy to enhance the image quality of both low and high resolution video, performing tasks like artifact reduction, scaling and sharpness enhancement in one algorithm, all of them reconstructing high frequency components (above the Nyquist frequency) in some way. Recursive superresolution algorithms in particular can achieve high output quality because they control the video output using a feedback loop and adapt the result in the next iteration. In addition to excellent output quality, temporal recursive methods are very hardware efficient and therefore attractive even for real-time video processing. A very promising approach is the utilization of Kalman filters as proposed by Farsiu et al. Reliable motion estimation is crucial for the performance of superresolution. Robust global motion models are therefore mainly used, but this also limits the applicability of superresolution algorithms. Handling sequences with complex object motion is thus essential for a wider field of application. Hence, this paper proposes improvements that extend the Kalman filter approach using motion-adaptive variance estimation and segmentation techniques. Experiments confirm the potential of our proposal for ideal and real video sequences with complex motion and further compare its performance to state-of-the-art methods like trainable filters.
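In the spirit of the approach discussed above, a stripped-down per-pixel temporal Kalman filter with motion-adaptive variance is sketched below; the thresholds and variances are illustrative, and the upscaling step of true superresolution is omitted.

```python
# Per-pixel temporal Kalman fusion: where frame-to-frame change suggests
# motion, the prediction variance is inflated so the filter trusts the new
# observation; static regions are averaged over time (noise reduction).
import numpy as np

def temporal_kalman(frames, r=0.01, q_static=1e-4, q_motion=0.5, thresh=0.1):
    est = frames[0].astype(float)
    P = np.full(est.shape, 1.0)                    # per-pixel estimate variance
    for z in frames[1:]:
        motion = np.abs(z - est) > thresh          # crude motion detection
        P = P + np.where(motion, q_motion, q_static)  # predict step
        K = P / (P + r)                            # Kalman gain, pixelwise
        est = est + K * (z - est)                  # update with new frame
        P = (1.0 - K) * P
    return est

rng = np.random.default_rng(2)
clean = np.tile(np.linspace(0, 1, 64), (64, 1))
frames = [clean + rng.normal(scale=0.1, size=clean.shape) for _ in range(10)]
print("noise std before/after:",
      np.std(frames[-1] - clean), np.std(temporal_kalman(frames) - clean))
```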
Laser beam complex amplitude measurement by phase diversity.
Védrenne, Nicolas; Mugnier, Laurent M; Michau, Vincent; Velluet, Marie-Thérèse; Bierent, Rudolph
2014-02-24
The control of the optical quality of a laser beam requires a complex amplitude measurement able to deal with strong modulus variations and potentially highly perturbed wavefronts. The method proposed here consists of an extension of phase diversity to complex amplitude measurements that is effective for highly perturbed beams. Named CAMELOT, for Complex Amplitude MEasurement by a Likelihood Optimization Tool, it relies on the acquisition and processing of a few images of the beam section taken along the optical path. The complex amplitude of the beam is retrieved from the images by the minimization of a maximum a posteriori error metric between the images and a model of the beam propagation. The analytical formalism of the method and its experimental validation are presented. The modulus of the beam is compared to a measurement of the beam profile; the phase of the beam is compared to a conventional phase diversity estimate. The precision of the experimental measurements is investigated by numerical simulations.
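The following toy 1-D reconstruction illustrates the general idea of fitting a propagation model to a few intensity images taken along the path. The angular-spectrum-style transfer function, the smoothness penalty standing in for the prior, and all parameter values are assumptions for illustration, not the CAMELOT formalism.

```python
import numpy as np
from scipy.optimize import minimize

n = 64
x = np.linspace(-1, 1, n)
amp = np.exp(-x**2 / 0.2)                      # known beam modulus
true_phase = 0.8 * np.sin(3 * np.pi * x)       # unknown phase to recover

def propagate(field, dz):
    """Toy angular-spectrum propagation over distance dz."""
    f = np.fft.fftfreq(n, d=x[1] - x[0])
    h = np.exp(-1j * np.pi * dz * f**2)        # illustrative transfer fn
    return np.fft.ifft(np.fft.fft(field) * h)

planes = [0.0, 0.3]                            # measurement positions
data = [np.abs(propagate(amp * np.exp(1j * true_phase), dz))**2
        for dz in planes]                      # "acquired" intensities

def cost(phase):
    field = amp * np.exp(1j * phase)
    err = sum(np.sum((np.abs(propagate(field, dz))**2 - d)**2)
              for dz, d in zip(planes, data))
    return err + 1e-3 * np.sum(np.diff(phase)**2)   # smoothness prior

res = minimize(cost, np.zeros(n), method="L-BFGS-B")
print("residual:", res.fun)
```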
Managing Programmatic Risk for Complex Space System Developments
NASA Technical Reports Server (NTRS)
Panetta, Peter V.; Hastings, Daniel; Brumfield, Mark (Technical Monitor)
2001-01-01
Risk management strategies have recently become an important research topic for many aerospace organizations as they prepare to develop the revolutionary complex space systems of the future. Future multi-disciplinary complex space systems will make it essential for organizations to practice a rigorous, comprehensive risk management process, emphasizing thorough systems engineering principles, in order to succeed. Project managers must possess strong leadership skills to direct high quality, cross-disciplinary teams for successfully developing revolutionary space systems that are ever increasing in complexity. Proactive efforts to reduce or eliminate risk throughout a project's lifecycle must ideally be practiced by all technical members of the organization. This paper discusses risk management perspectives collected from senior managers and project managers of aerospace and aeronautical organizations through interviews and surveys. Some of the programmatic risks that drive the success or failure of projects are revealed. Key findings lead to a number of insights for organizations to consider in proactively approaching the risks facing current and future complex space systems projects.
Large eddy simulation modeling of particle-laden flows in complex terrain
NASA Astrophysics Data System (ADS)
Salesky, S.; Giometto, M. G.; Chamecki, M.; Lehning, M.; Parlange, M. B.
2017-12-01
The transport, deposition, and erosion of heavy particles over complex terrain in the atmospheric boundary layer are important processes for hydrology, air quality forecasting, biology, and geomorphology. However, in situ observations can be challenging in complex terrain due to spatial heterogeneity. Furthermore, there is a need to develop numerical tools that can accurately represent the physics of these multiphase flows over complex surfaces. We present a new numerical approach to accurately model the transport and deposition of heavy particles in complex terrain using large eddy simulation (LES). Particle transport is represented through solution of the advection-diffusion equation including terms that represent gravitational settling and inertia. The particle conservation equation is discretized in a cut-cell finite volume framework in order to accurately enforce mass conservation. Simulation results will be validated with experimental data, and numerical considerations required to enforce boundary conditions at the surface will be discussed. Applications will be presented in the context of snow deposition and transport, as well as urban dispersion.
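As a reference for the transport model described above, a commonly used form of the advection-diffusion equation for heavy-particle concentration with gravitational settling is shown below; the exact terms (in particular the treatment of inertia) are assumptions based on standard LES practice rather than the paper's own equations.

```latex
\frac{\partial C}{\partial t}
  + \nabla \cdot \left( \mathbf{u}\, C \right)
  = \nabla \cdot \left( D_t \nabla C \right)
  + w_s \frac{\partial C}{\partial z}
% C: particle concentration, u: resolved LES velocity field,
% D_t: eddy diffusivity, w_s: gravitational settling velocity
```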
Identifying and Coordinating Care for Complex Patients
Rudin, Robert S.; Gidengil, Courtney A.; Predmore, Zachary; Schneider, Eric C.; Sorace, James; Hornstein, Rachel
2017-01-01
Abstract In the United States, a relatively small proportion of complex patients (defined as having multiple comorbidities, high risk for poor outcomes, and high cost) incur most of the nation's health care costs. Improved care coordination and management of complex patients could reduce costs while increasing quality of care. However, care coordination efforts face multiple challenges, such as segmenting populations of complex patients to better match their needs with the design of specific interventions, understanding how to reduce spending, and integrating care coordination programs into providers' care delivery processes. Innovative uses of analytics and health information technology (HIT) may address these challenges. Rudin and colleagues at RAND completed a literature review and held discussions with subject matter experts, reaching the conclusion that analytics and HIT are being used in innovative ways to coordinate care for complex patients, but that the capabilities are limited, evidence of their effectiveness is lacking, challenges are substantial, and important foundational work is still needed. PMID:28845354
Integrating technology into complex intervention trial processes: a case study.
Drew, Cheney J G; Poile, Vincent; Trubey, Rob; Watson, Gareth; Kelson, Mark; Townson, Julia; Rosser, Anne; Hood, Kerenza; Quinn, Lori; Busse, Monica
2016-11-17
Trials of complex interventions are associated with high costs and burdens in terms of paperwork, management, data collection, validation, and intervention fidelity assessment occurring across multiple sites. Traditional data collection methods rely on paper-based forms, where processing can be time-consuming and error rates high. Electronic source data collection can potentially address many of these inefficiencies, but has not routinely been used in complex intervention trials. Here we present the use of an on-line system for managing all aspects of data handling and for the monitoring of trial processes in a multicentre trial of a complex intervention. We custom built a web-accessible software application for the delivery of ENGAGE-HD, a multicentre trial of a complex physical therapy intervention. The software incorporated functionality for participant randomisation, data collection and assessment of intervention fidelity. It was accessible to multiple users with differing levels of access depending on required usage or to maintain blinding. Each site was supplied with a 4G-enabled iPad for accessing the system. The impact of this system was quantified through review of data quality and collation of feedback from site coordinators and assessors through structured process interviews. The custom-built system was an efficient tool for collecting data and managing trial processes. Although the set-up time required was significant, using the system resulted in an overall data completion rate of 98.5% with a data query rate of 0.1%, the majority of which were resolved in under a week. Feedback from research staff indicated that the system was highly acceptable for use in a research environment. This was a reflection of the portability and accessibility of the system when using the iPad and its usefulness in aiding accurate data collection, intervention fidelity and general administration. A combination of commercially available hardware and a bespoke online database designed to support data collection, intervention fidelity and trial progress provides a viable option for streamlining trial processes in a multicentre complex intervention trial. There is scope to further extend the system to cater for larger trials and add further functionality such as automatic reporting facilities and participant management support. ISRCTN65378754 , registered on 13 March 2014.
Forouhar, Amir; Hasankhani, Mahnoosh
2018-04-01
Urban decay is the process by which a historical city center, or an old part of a city, falls into decrepitude and faces serious problems. Urban management therefore implements renewal mega projects with the goal of physical and functional revitalization, retrieval of socioeconomic capacities, and improvement of residents' quality of life. Ignoring the complexities of these large-scale interventions in old and historical urban fabrics may lead to undesirable consequences, including a further decline in quality of life. Thus, the present paper aims to assess the impact of renewal mega projects on residents' subjective quality of life in the historical religious district of the holy city of Mashhad (Samen District). A combination of quantitative and qualitative methods of impact assessment, including questionnaires, semi-structured personal interviews, and direct observation, is used in this paper. The results show that the Samen Renewal Project has significantly reduced residents' subjective quality of life, due to its undesirable impacts on the physical, socio-cultural, and economic environments.
Quality improving techniques for free-viewpoint DIBR
NASA Astrophysics Data System (ADS)
Do, Luat; Zinger, Sveta; de With, Peter H. N.
2010-02-01
Interactive free-viewpoint selection applied to a 3D multi-view signal is a potentially attractive feature of the rapidly developing 3D TV media. This paper explores a new rendering algorithm that computes a free viewpoint based on depth image warping between two reference views from existing cameras. We have developed three quality enhancing techniques that specifically aim at solving the major artifacts. First, resampling artifacts are filled in by a combination of median filtering and inverse warping. Second, contour artifacts are processed while omitting warping of edges at high discontinuities. Third, we employ a depth signal for more accurate disocclusion inpainting. We obtain an average PSNR gain of 3 dB and 4.5 dB for the 'Breakdancers' and 'Ballet' sequences, respectively, compared to recently published results. While experimenting with synthetic data, we observe that the rendering quality is highly dependent on the complexity of the scene. Moreover, experiments are performed using compressed video from surrounding cameras. The overall system quality is dominated by the rendering quality and not by coding.
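A crude stand-in for the first of these techniques is sketched below: pixels left empty by the forward warp are filled from a median-filtered version of the rendered view. It compresses the paper's combination of median filtering and inverse warping into one illustrative step; function and parameter names are invented.

```python
import numpy as np
from scipy.ndimage import median_filter

def fill_resampling_holes(img, hole_mask, size=3):
    """Fill warp holes with the local median of valid neighbours.

    img:       H x W rendered view (float)
    hole_mask: H x W bool array, True where no source pixel landed
    """
    filtered = median_filter(img, size=size)
    out = img.copy()
    out[hole_mask] = filtered[hole_mask]
    return out

view = np.random.rand(8, 8)
holes = np.zeros((8, 8), dtype=bool)
holes[3, 4] = True                       # a single resampling hole
print(fill_resampling_holes(view, holes)[3, 4])
```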
NASA Astrophysics Data System (ADS)
Snoalv, J.; Groeneveld, M.; Quine, T. A.; Tranvik, L.
2017-12-01
Flocculation of dissolved organic carbon (DOC) in streams and rivers is a process that contributes to the pool of particulate organic carbon (POC) in the aquatic system. In low-energy waters the increased sedimentation rate of this higher-density fraction of organic carbon (OC) makes POC important in allocating organic carbon to limnic storage, which subsequently influences emissions of greenhouse gases from the continental environment to the atmosphere. Allochthonous OC, derived from the terrestrial environment by soil erosion and litterfall, imports both mineral aggregate-bound and free OC into freshwaters; these comprise carbon species of different quality and recalcitrance from autochthonous in-stream produced OC, such as that from biofilms, aquatic plants and algae. Increased soil erosion due to land use change (e.g. agriculture, deforestation) influences the input of allochthonous OC, which can lead to increased POC formation and sedimentation of terrestrial OC at flocculation boundaries in the landscape, i.e. where coagulation and flocculation processes are prone to occur in the water column. This study investigates the seasonal variation in POC content and flocculation capacity with respect to water quality (elemental composition) in eight river systems (four agricultural and four wooded streams) with headwaters in Exmoor, UK, that drain managed and non-managed land into the Bristol Channel. In flocculation experiments, samples were treated with added clay and salt standards to simulate flocculation driven by 1) increased input of sediment into streams and 2) saline mixing at the estuarine boundary, in order to quantify floc production and investigate POC quality for each process respectively. The results show how floc production, carbon quality and incorporation (e.g. complexation) of metals and rare earth elements (REE) in produced POC and in the DOC remaining in solution vary in water samples over the season, and how these are related to different flocculation processes and affected by land use. This study improves our understanding of OC flocculation dynamics at the local catchment scale and of how POC fate is affected by changed water quality in streams perturbed by land use change.
TMT approach to observatory software development process
NASA Astrophysics Data System (ADS)
Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder
2016-07-01
The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate effective communications; adopting an agile-based software development process across the observatory to enable frequent software releases to help mitigate subsystem interdependencies; defining concise scope and work packages for each of the OSW subsystems to facilitate effective outsourcing of software deliverables to the ITCC partner, and to enable performance monitoring and risk management. At this stage, the architecture and high-level design of the software system has been established and reviewed. During construction each subsystem will have a final design phase with reviews, followed by implementation and testing. The results of the TMT approach to the Observatory Software development process will only be preliminary at the time of the submittal of this paper, but it is anticipated that the early results will be a favorable indication of progress.
Koski, Greg; Tobin, Mary F; Whalen, Matthew
2014-10-01
The pharmaceutical industry, once highly respected, productive, and profitable, is in the throes of major change driven by many forces, including economics, science, regulation, and ethics. A variety of initiatives and partnerships have been launched to improve efficiency and productivity but without significant effect because they have failed to consider the process as a system. Addressing the challenges facing this complex endeavor requires more than modifications of individual processes; it requires a fully integrated application of systems thinking and an understanding of the desired goals and complex interactions among essential components and stakeholders of the whole. A multistakeholder collaborative effort, led by the Alliance for Clinical Research Excellence and Safety (ACRES), a global nonprofit organization operating in the public interest, is now under way to build a shared global system for clinical research. Its systems approach focuses on the interconnection of stakeholders at critical points of interaction within 4 operational domains: site development and support, quality management, information technology, and safety. The ACRES initiatives, Site Accreditation and Standards, Product Safety Culture, Global Ethical Review and Regulatory Innovation, and Quality Assurance and Safety, focus on building and implementing systems solutions. Underpinning these initiatives is an open, shared, integrated technology (site and optics and quality informatics initiative). We describe the rationale, challenges, progress, and successes of this effort to date and lessons learned. The complexity and fragmentation of the intensely proprietary ecosystem of drug development, challenging regulatory climate, and magnitude of the endeavor itself pose significant challenges, but the economic, social, and scientific rewards will more than justify the effort. An effective alliance model requires a willingness of multiple stakeholders to work together to build a shared system within a noncompetitive space that will have major benefits for all, including better access to medicines, better health, and more productive lives. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
2009-01-01
In the Southeast, U.S. Geological Survey (USGS) scientists are researching issues through technical studies of water availability and quality, geologic processes (marine, coastal, and terrestrial), geographic complexity, and biological resources. The USGS is prepared to tackle multifaceted questions associated with global climate change and resulting weather patterns such as drought through expert scientific skill, innovative research approaches, and accurate information technology.
Assessment of regional air quality by a concentration-dependent Pollution Permeation Index
Liang, Chun-Sheng; Liu, Huan; He, Ke-Bin; Ma, Yong-Liang
2016-01-01
Although air quality monitoring networks have been greatly improved, interpreting their expanding data in both simple and efficient ways remains challenging; new analytical methods are therefore needed. We developed such a method based on the comparison of pollutant concentrations between target and circum areas (circum comparison for short), and tested its applications by assessing air pollution in Jing-Jin-Ji, the Yangtze River Delta, the Pearl River Delta and Cheng-Yu, China during 2015. We found the circum comparison can instantly judge whether a city is a pollution permeation donor or a pollution permeation receptor by a Pollution Permeation Index (PPI). Furthermore, a PPI-related estimated concentration (original concentration plus halved average concentration difference) can be used to identify some overestimations and underestimations. Besides, it can help explain pollution processes (e.g., Beijing's PM2.5 may be largely promoted by non-local SO2) though not aiming at that. Moreover, it is applicable to any region, easy to handle, and able to inspire further new analytical methods. These advantages, despite the method's limitations in capturing the whole process jointly influenced by complex physical and chemical factors, demonstrate that the PPI-based circum comparison can be efficiently used in assessing air pollution by yielding instructive results, without the absolute need for complex operations. PMID:27731344
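Read literally, the estimated concentration described above is the original value plus half the mean difference relative to the surrounding areas; a minimal sketch is given below. The sign convention and the simple averaging over circum areas are assumptions drawn from that one-sentence description.

```python
def ppi_estimated_concentration(c_target, c_circum):
    """Target concentration plus half the mean circum-target difference.

    c_target: pollutant concentration in the target city
    c_circum: concentrations in the surrounding (circum) areas
    """
    avg_diff = sum(c - c_target for c in c_circum) / len(c_circum)
    return c_target + avg_diff / 2.0

# A city reading 80 surrounded by areas at 60, 70 and 90 is nudged
# toward the circum mean: 80 + ((-20 - 10 + 10) / 3) / 2 = 76.67
print(ppi_estimated_concentration(80.0, [60.0, 70.0, 90.0]))
```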
Prosthetic joint infection development of an evidence-based diagnostic algorithm.
Mühlhofer, Heinrich M L; Pohlig, Florian; Kanz, Karl-Georg; Lenze, Ulrich; Lenze, Florian; Toepfer, Andreas; Kelch, Sarah; Harrasser, Norbert; von Eisenhart-Rothe, Rüdiger; Schauwecker, Johannes
2017-03-09
Increasing rates of prosthetic joint infection (PJI) have presented challenges for general practitioners, orthopedic surgeons and the health care system in recent years. The diagnosis of PJI is complex; multiple diagnostic tools are used in the attempt to correctly diagnose PJI. Evidence-based algorithms can help to identify PJI using standardized diagnostic steps. We reviewed relevant publications between 1990 and 2015 using a systematic literature search in MEDLINE and PUBMED. The selected search results were then classified into levels of evidence. The keywords were prosthetic joint infection, biofilm, diagnosis, sonication, antibiotic treatment, implant-associated infection, Staph. aureus, rifampicin, implant retention, pcr, maldi-tof, serology, synovial fluid, c-reactive protein level, total hip arthroplasty (THA), total knee arthroplasty (TKA) and combinations of these terms. From an initial 768 publications, 156 were stringently reviewed. Publications with class I-III recommendations (EAST) were considered. We developed an algorithm that displays the complex diagnostic approach to PJI as a clear and logically structured process according to ISO 5807. The evidence-based standardized algorithm combines modern clinical requirements and evidence-based treatment principles, and provides a detailed, transparent standard operating procedure (SOP) for diagnosing PJI. Thus, consistently high, examiner-independent process quality is assured to meet the demands of modern quality management in PJI diagnosis.
Cao, Bibo; Li, Chuan; Liu, Yan; Zhao, Yue; Sha, Jian; Wang, Yuqiu
2015-05-01
Because water quality monitoring sections or sites reflect the water quality status of rivers, surface water quality management based on such monitoring sections is effective. For the purpose of improving river water quality, quantifying the contribution ratios of pollutant sources to a specific section is necessary. Because the physical and chemical processes of nutrient pollutants in water bodies are complex, it is difficult to compute these contribution ratios quantitatively. However, water quality models have proved to be effective tools for estimating surface water quality. In this project, an enhanced QUAL2Kw model with an added module was applied to the Xin'anjiang Watershed to obtain water quality information along the river and to assess the contribution ratios of each pollutant source to a certain section (the Jiekou state-controlled section). Model validation indicated that the results were reliable. Then, contribution ratios were analyzed through the added module. Results show that among the pollutant sources, the Lianjiang tributary contributes the largest part of total nitrogen (50.43%), total phosphorus (45.60%), ammonia nitrogen (32.90%), nitrate (nitrite + nitrate) nitrogen (47.73%), and organic nitrogen (37.87%). Furthermore, contribution ratios in different reaches varied along the river. Compared with pollutant load ratios of different sources in the watershed, an analysis of contribution ratios of pollutant sources for each specific section, which takes the localized chemical and physical processes into consideration, was more suitable for local-regional water quality management. In summary, this method of analyzing the contribution ratios of pollutant sources to a specific section based on the QUAL2Kw model was found to support the improvement of the local environment.
Shu, Yisong; Liu, Zhenli; Zhao, Siyu; Song, Zhiqian; He, Dan; Wang, Menglei; Zeng, Honglian; Lu, Cheng; Lu, Aiping; Liu, Yuanyan
2017-08-01
Traditional Chinese medicine (TCM) exerts its therapeutic effect in a holistic fashion through the synergistic function of multiple characteristic constituents. The holistic philosophy of TCM is consistent with the global and systematic theories of metabolomics. The proposed pseudotargeted metabolomics methodologies were employed to establish reliable quality control markers for use in the screening strategy of TCMs. Pseudotargeted metabolomics integrates the advantages of both targeted and untargeted methods. In the present study, targeted metabolomics with the gold-standard RRLC-QqQ-MS method was employed for an in vivo quantitative plasma pharmacochemistry study of characteristic prototypic constituents. Meanwhile, untargeted metabolomics using UHPLC-QE Orbitrap HRMS, with better specificity and selectivity, was employed for identification of untargeted metabolites in the complex plasma matrix. In all, 32 prototypic metabolites were quantitatively determined, and 66 biotransformed metabolites were convincingly identified, after oral administration of standard extracts of four labeled Citrus TCMs. The global absorption and metabolism process of complex TCMs was depicted in a systematic manner.
The US regulatory and pharmacopeia response to the global heparin contamination crisis.
Szajek, Anita Y; Chess, Edward; Johansen, Kristian; Gratzl, Gyöngyi; Gray, Elaine; Keire, David; Linhardt, Robert J; Liu, Jian; Morris, Tina; Mulloy, Barbara; Nasr, Moheb; Shriver, Zachary; Torralba, Pearle; Viskov, Christian; Williams, Roger; Woodcock, Janet; Workman, Wesley; Al-Hakim, Ali
2016-06-09
The contamination of the widely used lifesaving anticoagulant drug heparin in 2007 has drawn renewed attention to the challenges that are associated with the characterization, quality control and standardization of complex biological medicines from natural sources. Heparin is a linear, highly sulfated polysaccharide consisting of alternating glucosamine and uronic acid monosaccharide residues. Heparin has been used successfully as an injectable antithrombotic medicine since the 1930s, and its isolation from animal sources (primarily porcine intestine) as well as its manufacturing processes have not changed substantially since its introduction. The 2007 heparin contamination crisis resulted in several deaths in the United States and hundreds of adverse reactions worldwide, revealing the vulnerability of a complex global supply chain to sophisticated adulteration. This Perspective discusses how the US Food and Drug Administration (FDA), the United States Pharmacopeial Convention (USP) and international stakeholders collaborated to redefine quality expectations for heparin, thus making an important natural product better controlled and less susceptible to economically motivated adulteration.
Network Analysis Reveals Putative Genes Affecting Meat Quality in Angus Cattle.
Mateescu, Raluca G; Garrick, Dorian J; Reecy, James M
2017-01-01
Improvements in eating satisfaction will benefit consumers and should increase beef demand, which is of interest to the beef industry. Tenderness, juiciness, and flavor are major determinants of the palatability of beef and are often used to reflect eating satisfaction. Carcass qualities are used as indicator traits for meat quality, with higher quality grade carcasses expected to relate to more tender and palatable meat. However, meat quality is a complex concept determined by many component traits, making genome-wide association studies (GWAS) on any one component challenging to interpret. Recent approaches combining traditional GWAS with gene network interaction theory could be more efficient in dissecting the genetic architecture of complex traits. Phenotypic measures of 23 traits reflecting carcass characteristics and components of meat quality, along with mineral and peptide concentrations, were used together with Illumina 54k bovine SNP genotypes to derive an annotated gene network associated with meat quality in 2,110 Angus beef cattle. The efficient mixed model association (EMMAX) approach, in combination with a genomic relationship matrix, was used to directly estimate the associations between 54k SNP genotypes and each of the 23 component traits. Genomically correlated regions were identified by partial correlations, which were further used along with an information theory algorithm to derive gene network clusters. Correlated SNP across the 23 component traits were subjected to network scoring and visualization software to identify significant SNP. Significant pathways implicated in the meat quality complex through GO term enrichment analysis included angiogenesis, inflammation, transmembrane transporter activity, and receptor activity. These results suggest that network analysis using partial correlations and annotation of significant SNP can reveal the genetic architecture of complex traits and provide novel information regarding biological mechanisms and genes that lead to complex phenotypes, like meat quality, and the nutritional and healthfulness value of beef. Improvements in genome annotation and knowledge of gene function will contribute to more comprehensive analyses that will advance our ability to dissect the architecture of complex traits.
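The partial-correlation step that seeds such networks is easy to state: correlate two signals after regressing out a third. The sketch below is a generic illustration of that computation, not the study's pipeline; all data and names are synthetic.

```python
import numpy as np

def partial_corr(x, y, z):
    """Pearson correlation of x and y after controlling for z."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)   # residual of x on z
    ry = y - np.polyval(np.polyfit(z, y, 1), z)   # residual of y on z
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(0)
z = rng.normal(size=200)            # shared driver (e.g. a third trait)
x = z + rng.normal(size=200)
y = z + rng.normal(size=200)
print(np.corrcoef(x, y)[0, 1])      # raw correlation, inflated by z
print(partial_corr(x, y, z))        # near 0 once z is controlled for
```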
NASA Astrophysics Data System (ADS)
Manjanaik, N.; Parameshachari, B. D.; Hanumanthappa, S. N.; Banu, Reshma
2017-08-01
The intra prediction process of the H.264 video coding standard is used to code the first frame of a video, i.e. the intra frame, and obtains good coding efficiency compared to previous video coding standards. A further benefit of intra frame coding is that it reduces spatial pixel redundancy within the current frame, reduces computational complexity and provides better rate-distortion performance. Intra frames are conventionally coded with the rate distortion optimization (RDO) method, which increases computational complexity, increases bit rate and reduces picture quality, making it difficult to implement in real-time applications; many researchers have therefore developed fast mode decision algorithms for intra frame coding. Previous work on intra frame coding in H.264 using fast mode decision intra prediction algorithms based on various techniques suffered increased bit rate and degraded picture quality (PSNR) at different quantization parameters; most of these approaches achieved only a reduction of computational complexity or a saving in encoding time, with the limitation of increased bit rate and loss of picture quality. To avoid the increase in bit rate and the loss of picture quality, this paper develops a better approach: a Gaussian pulse applied to intra frame coding with the diagonal down-left intra prediction mode, to achieve higher coding efficiency in terms of PSNR and bit rate. In the proposed method, a Gaussian pulse is multiplied with each 4x4 block of frequency-domain coefficients of the 4x4 sub-macroblocks of the current frame before the quantization process. Multiplying each 4x4 integer-transformed coefficient block by the Gaussian pulse scales the information of the coefficients in a reversible manner: the frequency samples are abstracted in a known and controllable way without intermixing of coefficients, which prevents the picture from taking a severe quality hit at higher quantization parameter values. The proposed work was implemented using MATLAB and the JM 18.6 reference software. Performance was measured in terms of PSNR, bit rate and compression of intra frames of YUV video sequences at QCIF resolution under different quantization parameter values, with the Gaussian value applied to the diagonal down-left intra prediction mode. The simulation results of the proposed algorithm are tabulated and compared with a previous algorithm, that of Tian et al. The proposed algorithm reduced bit rate by 30.98% on average while maintaining consistent picture quality for QCIF sequences, compared to the method of Tian et al.
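The core coefficient-scaling idea lends itself to a short sketch: multiply each 4x4 transform block elementwise by a Gaussian window before quantization and divide by the same window on the decoder side. The window shape (sigma) and the elementwise form are assumptions; the abstract does not specify the pulse parameters.

```python
import numpy as np

def gaussian_window(size=4, sigma=1.5):
    """Separable 2-D Gaussian window for a size x size block."""
    idx = np.arange(size) - (size - 1) / 2.0
    gx, gy = np.meshgrid(idx, idx)
    return np.exp(-(gx**2 + gy**2) / (2 * sigma**2))

def scale_block(coeffs, w):
    return coeffs * w        # reversible: decoder applies coeffs / w

block = np.arange(16, dtype=float).reshape(4, 4)   # stand-in coefficients
w = gaussian_window()
scaled = scale_block(block, w)                     # before quantization
recovered = scaled / w                             # after inverse quant.
assert np.allclose(recovered, block)               # reversibility check
```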
NASA Astrophysics Data System (ADS)
Hufenbach, W.; Gude, M.; Czulak, A.; Kretschmann, Martin
2014-04-01
Increasing economic, political and ecological pressure is leading to a steadily rising share of modern processing and manufacturing processes for fibre reinforced polymers in industrial batch production. Component weights below a level achievable by classic construction materials, which lead to a reduced energy and cost balance during product lifetime, justify the higher fabrication costs. However, complex quality control and failure prediction slow down substitution by composite materials. High-resolution fibre-optic sensors (FOS), due to their small diameter, high measuring point density and simple handling, show high potential for automated sensor integration in manufacturing processes, and therefore for online monitoring of composite products manufactured at industrial scale. Integrated sensors can be used to monitor manufacturing processes and part tests, as well as the component structure during the product life cycle, which allows quality control during production and the optimization of individual manufacturing processes [1;2]. Furthermore, detailed failure analyses lead to an enhanced understanding of the failure processes occurring in composite materials. This leads to a lower reject rate and to products of higher value and longer product life cycle, whereby costs, material and energy are saved. This work presents an automation approach for FOS integration in the braiding process. For that purpose a braiding wheel has been supplemented with an appliance for automatic sensor application, which has been used to manufacture preforms of high-pressure composite vessels with FOS networks integrated between the fibre layers. All subsequent manufacturing processes (vacuum infiltration, curing) and component tests (quasi-static pressure test, programmed delamination) were monitored with the help of the integrated sensor networks. Keywords: SHM, high-pressure composite vessel, braiding, automated sensor integration, pressure test, quality control, optic-fibre sensors, Rayleigh, Luna Technologies
Ricordi, Camillo; Goldstein, Julia S; Balamurugan, A N; Szot, Gregory L; Kin, Tatsuya; Liu, Chengyang; Czarniecki, Christine W; Barbaro, Barbara; Bridges, Nancy D; Cano, Jose; Clarke, William R; Eggerman, Thomas L; Hunsicker, Lawrence G; Kaufman, Dixon B; Khan, Aisha; Lafontant, David-Erick; Linetsky, Elina; Luo, Xunrong; Markmann, James F; Naji, Ali; Korsgren, Olle; Oberholzer, Jose; Turgeon, Nicole A; Brandhorst, Daniel; Chen, Xiaojuan; Friberg, Andrew S; Lei, Ji; Wang, Ling-Jia; Wilhelm, Joshua J; Willits, Jamie; Zhang, Xiaomin; Hering, Bernhard J; Posselt, Andrew M; Stock, Peter G; Shapiro, A M James; Chen, Xiaojuan
2016-11-01
Eight manufacturing facilities participating in the National Institutes of Health-sponsored Clinical Islet Transplantation (CIT) Consortium jointly developed and implemented a harmonized process for the manufacture of allogeneic purified human pancreatic islet (PHPI) product evaluated in a phase 3 trial in subjects with type 1 diabetes. Manufacturing was controlled by a common master production batch record, standard operating procedures that included acceptance criteria for deceased donor organ pancreata and critical raw materials, PHPI product specifications, certificate of analysis, and test methods. The process was compliant with Current Good Manufacturing Practices and Current Good Tissue Practices. This report describes the manufacturing process for 75 PHPI clinical lots and summarizes the results, including lot release. The results demonstrate the feasibility of implementing a harmonized process at multiple facilities for the manufacture of a complex cellular product. The quality systems and regulatory and operational strategies developed by the CIT Consortium yielded product lots that met the prespecified characteristics of safety, purity, potency, and identity and were successfully transplanted into 48 subjects. No adverse events attributable to the product and no cases of primary nonfunction were observed. © 2016 by the American Diabetes Association.
Al-Kasmi, Basheer; Alsirawan, Mhd Bashir; Bashimam, Mais; El-Zein, Hind
2017-08-28
Drug taste masking is a crucial process for the preparation of pediatric and geriatric formulations as well as fast dissolving tablets. Taste masking techniques aim to prevent drug release in saliva while obtaining the desired release profile in the gastrointestinal tract. Several taste masking methods have been reported; this review focuses on a group of promising methods: complexation, encapsulation, and hot melting. The effects of each method on the physicochemical properties of the drug are described in detail. Furthermore, a scoring system was established to evaluate each process using recently published data on selected factors. These comprise input, process, and output factors related to each taste masking method. Input factors include the attributes of the materials used for taste masking. Process factors include equipment type and process parameters. Finally, output factors include taste masking quality and yield. As a result, mechanical microencapsulation obtained the highest score (5/8), along with complexation with cyclodextrin, suggesting that these methods are the most preferable for drug taste masking. Copyright © 2017 Elsevier B.V. All rights reserved.
Graphene growth process modeling: a physical-statistical approach
NASA Astrophysics Data System (ADS)
Wu, Jian; Huang, Qiang
2014-09-01
As a zero-bandgap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among the various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires an understanding of growth mechanisms, and methods of characterization and control of the grain size of graphene flakes, analytical modeling of the graphene growth process is essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from the area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.
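The abstract does not give the functional form of the "confined exponential" model; a plausible reading, consistent with growth that saturates as coverage approaches a limit, is the saturating law below. Treat the specific form and symbols as assumptions.

```latex
A(t) = A_{\infty}\left(1 - e^{-t/\tau}\right),
\qquad
\frac{dA}{dt} = \frac{A_{\infty} - A(t)}{\tau}
% A(t): island area, A_inf: confining (saturation) area,
% tau: characteristic growth time
```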
An Improved Method to Control the Critical Parameters of a Multivariable Control System
NASA Astrophysics Data System (ADS)
Subha Hency Jims, P.; Dharmalingam, S.; Wessley, G. Jims John
2017-10-01
The role of control systems is to cope with process deficiencies and the undesirable effects of external disturbances. Most multivariable processes are highly interactive and complex in nature. Aircraft systems, modern power plants, refineries and robotic systems are a few such complex systems, involving numerous critical parameters that need to be monitored and controlled. Control of these important parameters is not only tedious and cumbersome but also crucial from environmental, safety and quality perspectives. In this paper, one such multivariable system, namely a utility boiler, has been considered. A modern power plant is a complex arrangement of pipework and machinery with numerous interacting control loops and support systems. This paper presents the calculation of controller parameters based on classical tuning concepts. The controller parameters thus obtained have been employed to control the critical parameters of a boiler during fuel-switching disturbances. The proposed method can be applied to control critical parameters such as the elevator, aileron, rudder, elevator trim, rudder and aileron trim, and flap control systems of aircraft.
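As one example of the classical tuning concepts referred to, the closed-loop Ziegler-Nichols rules convert an experimentally found ultimate gain and ultimate period into PID settings. The sketch below is a textbook illustration under assumed values, not the tuning actually performed for the boiler in the paper.

```python
def ziegler_nichols_pid(ku, tu):
    """Classic closed-loop Ziegler-Nichols PID settings.

    ku: ultimate gain (gain at which the loop sustains oscillation)
    tu: ultimate period of that oscillation
    Returns (Kp, Ki, Kd).
    """
    kp = 0.6 * ku
    ti = tu / 2.0          # integral time
    td = tu / 8.0          # derivative time
    return kp, kp / ti, kp * td

# Illustrative values; ku and tu must come from a plant experiment.
print(ziegler_nichols_pid(ku=4.0, tu=10.0))
```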
Dangerous mating systems: signal complexity, signal content and neural capacity in spiders.
Herberstein, M E; Wignall, A E; Hebets, E A; Schneider, J M
2014-10-01
Spiders are highly efficient predators in possession of exquisite sensory capacities for ambushing prey, combined with machinery for launching rapid and determined attacks. As a consequence, any sexually motivated approach carries a risk of ending up as prey rather than as a mate. Sexual selection has shaped courtship to effectively communicate the presence, identity, motivation and/or quality of potential mates, which help ameliorate these risks. Spiders communicate this information via several sensory channels, including mechanical (e.g. vibrational), visual and/or chemical, with examples of multimodal signalling beginning to emerge in the literature. The diverse environments that spiders inhabit have further shaped courtship content and form. While our understanding of spider neurobiology remains in its infancy, recent studies are highlighting the unique and considerable capacities of spiders to process and respond to complex sexual signals. As a result, the dangerous mating systems of spiders are providing important insights into how ecology shapes the evolution of communication systems, with future work offering the potential to link this complex communication with its neural processes. Copyright © 2014 Elsevier Ltd. All rights reserved.
A multistage motion vector processing method for motion-compensated frame interpolation.
Huang, Ai-Mei; Nguyen, Truong Q
2008-05-01
In this paper, a novel, low-complexity motion vector processing algorithm at the decoder is proposed for motion-compensated frame interpolation or frame rate up-conversion. We address the problems of broken edges and deformed structures in an interpolated frame by hierarchically refining motion vectors on different block sizes. Our method explicitly considers the reliability of each received motion vector and has the capability of preserving structure information. This is achieved by analyzing the distribution of residual energies and effectively merging blocks that have unreliable motion vectors. The motion vector reliability information is also used as prior knowledge in motion vector refinement, using a constrained vector median filter to avoid selecting equally unreliable vectors. We also propose using chrominance information in our method. Experimental results show that the proposed scheme gives better visual quality and is also robust, even in video sequences with complex scenes and fast motion.
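A plain vector median filter, the building block behind the constrained variant mentioned above, picks the candidate minimizing the summed distance to all other candidates. The reliability weighting shown is an illustrative way to let unreliable vectors count for less, not the paper's exact constraint.

```python
import numpy as np

def vector_median(vectors, weights=None):
    """Weighted vector median of candidate motion vectors.

    vectors: N x 2 array-like of (dx, dy) candidates
    weights: reliability in [0, 1] per candidate (optional)
    """
    v = np.asarray(vectors, dtype=float)
    w = np.ones(len(v)) if weights is None else np.asarray(weights, float)
    dists = np.linalg.norm(v[:, None, :] - v[None, :, :], axis=2)
    cost = (dists * w[None, :]).sum(axis=1)   # weighted distance to peers
    return v[np.argmin(cost)]

mvs = [(1, 0), (1, 1), (8, -7), (1, 0)]
rel = [1.0, 0.9, 0.2, 1.0]           # low weight for the outlier
print(vector_median(mvs, rel))        # -> [1. 0.]
```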
Brykala, M; Deptula, A; Rogowski, M; Lada, W; Olczak, T; Wawszczak, D; Smolinski, T; Wojtowicz, P; Modolo, G
A new method for the synthesis of uranium oxide microspheres (diameter <100 μm) has been developed. It is a variant of our patented Complex Sol-Gel Process, which has been used to synthesize high-quality powders of a wide variety of complex oxides. Starting uranyl-nitrate-ascorbate sols were prepared by addition of ascorbic acid to uranyl nitrate hexahydrate solution, alkalized with aqueous ammonium hydroxide, and then emulsified in 2-ethylhexanol-1 containing 1 v/o SPAN-80. Drops of the emulsion were first gelled by extraction of water into the solvent. Destruction of the microspheres during thermal treatment, owing to highly reactive components in the gels, required modification of the gelation step by a Double Extraction Process (simultaneous extraction of water and nitrates using Primene JMT), which completely eliminates this problem. The final step was calcination of the obtained gel microspheres in air to triuranium octaoxide.
Lown, Beth A
2015-06-02
Compassion is a complex process that is innate, determined in part by individual traits, and modulated by a myriad of conscious and unconscious factors, immediate context, social structures and expectations, and organizational "culture." Compassion is an ethical foundation of healthcare and a widely shared value; it is not an optional luxury in the healing process. While the interrelations between individual motivation and social structure are complex, we can choose to act individually and collectively to remove barriers to the innate compassion that most healthcare professionals bring to their work. Doing so will reduce professional burnout, improve the well-being of the healthcare workforce, and facilitate our efforts to achieve the triple aim of improving patients' experiences of care and health while lowering costs. © 2015 by Kerman University of Medical Sciences.
Hardwick, Steven W.; Luisi, Ben F.
2013-01-01
RNA helicases are compact, machine-like proteins that can harness the energy of nucleoside triphosphate binding and hydrolysis to dynamically remodel RNA structures and protein-RNA complexes. Through such activities, helicases participate in virtually every process associated with the expression of genetic information. Often found as components of multi-enzyme assemblies, RNA helicases facilitate the processivity of RNA degradation, the remodeling of protein interactions during maturation of structured RNA precursors, and fidelity checks of RNA quality. In turn, the assemblies modulate and guide the activities of the helicases. We describe the roles of RNA helicases with a conserved “DExD/H box” sequence motif in representative examples of such machineries from bacteria, archaea and eukaryotes. The recurrent occurrence of such helicases in complex assemblies throughout the course of evolution suggests a common requirement for their activities to meet cellular demands for the dynamic control of RNA metabolism. PMID:23064154
Suss, Samuel; Bhuiyan, Nadia; Demirli, Kudret; Batist, Gerald
2017-06-01
Outpatient cancer treatment centers can be considered as complex systems in which several types of medical professionals and administrative staff must coordinate their work to achieve the overall goals of providing quality patient care within budgetary constraints. In this article, we use analytical methods that have been successfully employed for other complex systems to show how a clinic can simultaneously reduce patient waiting times and non-value added staff work in a process that has a series of steps, more than one of which involves a scarce resource. The article describes the system model and the key elements in the operation that lead to staff rework and patient queuing. We propose solutions to the problems and provide a framework to evaluate clinic performance. At the time of this report, the proposals are in the process of implementation at a cancer treatment clinic in a major metropolitan hospital in Montreal, Canada.
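The abstract does not name its analytical methods, but the setting (a series of steps, more than one involving a scarce resource) is classic multi-server queueing territory. As a generic illustration only, the Erlang C formula below gives the expected wait at one such step; the parameter values are invented.

```python
import math

def erlang_c_wait(lam, mu, c):
    """Mean queueing delay for an M/M/c system (requires lam < c*mu).

    lam: arrival rate, mu: service rate per server, c: server count
    """
    a = lam / mu                       # offered load
    rho = lam / (c * mu)               # utilization, must be < 1
    p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(c))
                + a**c / (math.factorial(c) * (1 - rho)))
    pw = (a**c / (math.factorial(c) * (1 - rho))) * p0   # P(wait > 0)
    return pw / (c * mu - lam)

# e.g. 8 patients/hour arriving at a step with 4 servers at 3/hour each
print(erlang_c_wait(lam=8.0, mu=3.0, c=4))
```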
Segmentation-free image processing and analysis of precipitate shapes in 2D and 3D
NASA Astrophysics Data System (ADS)
Bales, Ben; Pollock, Tresa; Petzold, Linda
2017-06-01
Segmentation-based image analysis techniques are routinely employed for quantitative analysis of complex microstructures containing two or more phases. The primary advantage of these approaches is that spatial information on the distribution of phases is retained, enabling subjective judgements of the quality of the segmentation and the subsequent analysis process. The downside is that computing micrograph segmentations from data on morphologically complex microstructures gathered with error-prone detectors is challenging and, if no special care is taken, the artifacts of the segmentation will make any subsequent analysis and conclusions uncertain. In this paper we demonstrate, using a two-phase nickel-base superalloy microstructure as a model system, a new methodology for the analysis of precipitate shapes using a segmentation-free approach based on the histogram of oriented gradients feature descriptor, a classic tool in image analysis. The benefits of this methodology for analysis of microstructure in two and three dimensions are demonstrated.
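For readers unfamiliar with the descriptor, the snippet below extracts histogram-of-oriented-gradients features from a synthetic two-phase image with scikit-image; the cell and block parameters are illustrative, not the paper's settings.

```python
import numpy as np
from skimage.feature import hog

img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0                      # a square "precipitate"

features = hog(img, orientations=8, pixels_per_cell=(8, 8),
               cells_per_block=(1, 1), feature_vector=True)
print(features.shape)                        # 8x8 cells x 8 bins = (512,)
```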
Tsoi, Shuk C; Aiya, Utsav V; Wasner, Kobi D; Phan, Mimi L; Pytte, Carolyn L; Vicario, David S
2014-01-01
Many brain regions exhibit lateral differences in structure and function, and also incorporate new neurons in adulthood, thought to function in learning and in the formation of new memories. However, the contribution of new neurons to hemispheric differences in processing is unknown. The present study combines cellular, behavioral, and physiological methods to address whether 1) new neuron incorporation differs between the brain hemispheres, and 2) the degree to which hemispheric lateralization of new neurons correlates with behavioral and physiological measures of learning and memory. The songbird provides a model system for assessing the contribution of new neurons to hemispheric specialization because songbird brain areas for vocal processing are functionally lateralized and receive a continuous influx of new neurons in adulthood. In adult male zebra finches, we quantified new neurons in the caudomedial nidopallium (NCM), a forebrain area involved in discrimination and memory for the complex vocalizations of individual conspecifics. We assessed song learning and recorded neural responses to song in NCM. We found significantly more new neurons labeled in left than in right NCM; moreover, the degree of asymmetry in new neuron numbers was correlated with the quality of song learning and strength of neuronal memory for recently heard songs. In birds with experimentally impaired song quality, the hemispheric difference in new neurons was diminished. These results suggest that new neurons may contribute to an allocation of function between the hemispheres that underlies the learning and processing of complex signals.
Conceptual design of an aircraft automated coating removal system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, J.E.; Draper, J.V.; Pin, F.G.
1996-05-01
Paint stripping of the U.S. Air Force's large transport aircraft is currently a labor-intensive, manual process. Significant reductions in costs, personnel and turnaround time can be accomplished by the judicious use of automation in some process tasks. This paper presents the conceptual design of a coating removal system for the tail surfaces of the C-5 plane. Emphasis is placed on technology selection to optimize human-automation synergy with respect to overall costs, throughput, quality, safety, and reliability. Trade-offs between field-proven and research-requiring technologies, and between expected gain and cost and complexity, have led to a conceptual design which is semi-autonomous (relying on the human for task specification and disturbance handling) yet incorporates sensor-based automation (for sweep path generation and tracking, surface following, stripping quality control and tape/breach handling).
Analysis of Food Contaminants, Residues, and Chemical Constituents of Concern
NASA Astrophysics Data System (ADS)
Ismail, Baraem; Reuhs, Bradley L.; Nielsen, S. Suzanne
The food chain that starts with farmers and ends with consumers can be complex, involving multiple stages of production and distribution (planting, harvesting, breeding, transporting, storing, importing, processing, packaging, distributing to retail markets, and shelf storing) (Fig. 18.1). Various practices can be employed at each stage in the food chain, which may include pesticide treatment, agricultural bioengineering, veterinary drug administration, environmental and storage conditions, processing applications, economic gain practices, use of food additives, choice of packaging material, etc. Each of these practices can play a major role in food quality and safety, due to the possibility of contamination with, or introduction (intentional and unintentional) of, hazardous substances or constituents. Legislation and regulation to ensure food quality and safety are in place and continue to develop to protect the stakeholders, namely farmers, consumers, and industry. [Refer to reference (1) for information on regulations of food contaminants and residues.]
NASA Astrophysics Data System (ADS)
Graves, Mark; Smith, Alexander; Batchelor, Bruce G.; Palmer, Stephen C.
1994-10-01
In the food industry there is an ever increasing need to control and monitor food quality. In recent years fully automated x-ray inspection systems have been used to inspect food on-line for foreign-body contamination. These systems involve a complex integration of x-ray imaging components with state-of-the-art high-speed image processing. The quality of the x-ray image obtained by such systems is very poor compared with images obtained from other inspection processes, which makes reliable detection of very small, low-contrast defects extremely difficult. It is therefore extremely important to optimize the x-ray imaging components to give the very best image possible. In this paper we present a method of analyzing the x-ray imaging system in order to assess the contrast obtained when viewing small defects.
Bai, Xiao-ping; Zhang, Xi-wei
2013-01-01
Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process, in which many indexes must be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index and integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and combines engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
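One common way information entropy enters such index-matrix evaluations is the entropy weight method: indexes whose scores vary more across schemes receive more weight. The sketch below illustrates that step on invented, pre-normalized data; it is a generic reading of the approach, not necessarily the paper's exact formulation.

```python
import numpy as np

scores = np.array([           # rows: candidate schemes; columns: cost,
    [0.8, 0.6, 0.9, 0.7],     # progress, quality, safety indexes,
    [0.6, 0.9, 0.7, 0.8],     # already normalized "bigger is better"
    [0.9, 0.7, 0.6, 0.9],
])

p = scores / scores.sum(axis=0)                     # column proportions
k = 1.0 / np.log(scores.shape[0])
entropy = -k * (p * np.log(p)).sum(axis=0)          # per-index entropy
weights = (1 - entropy) / (1 - entropy).sum()       # entropy weights
synthesis = scores @ weights                        # composite scores
print(weights, synthesis.argmax())                  # best scheme index
```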
Applications of artificial neural networks (ANNs) in food science.
Huang, Yiqun; Kangas, Lars J; Rasco, Barbara A
2007-01-01
Artificial neural networks (ANNs) have been applied in almost every aspect of food science over the past two decades, although most applications are still in the development stage. ANNs are useful tools for food safety and quality analyses, which include modeling microbial growth and, from this, predicting food safety; interpreting spectroscopic data; and predicting physical, chemical, functional and sensory properties of various food products during processing and distribution. ANNs hold a great deal of promise for modeling complex tasks in process control and simulation, and in applications of machine perception, including machine vision and the electronic nose, for food safety and quality control. This review discusses the basic theory of ANN technology and its applications in food science, providing food scientists and the research community an overview of current research and future trends in the application of ANN technology to the field.
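As a minimal illustration of the property-prediction use case described, the snippet below fits a small feed-forward network to synthetic process data (temperature and time mapped to an invented quality score); the data, the mapping, and the network size are all assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform([50, 1], [90, 30], size=(200, 2))   # temp (C), time (min)
y = (10 - 0.01 * (X[:, 0] - 72) ** 2 + 0.05 * X[:, 1]
     + rng.normal(0, 0.2, 200))                     # synthetic "quality"

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                     random_state=0).fit(X, y)
print(model.predict([[72, 15]]))                    # near-optimal point
```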