A negotiation methodology and its application to cogeneration planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, S.M.; Liu, C.C.; Luu, S.
Power system planning has become a complex process in utilities today. This paper presents a methodology for integrated planning with multiple objectives. The methodology uses a graphical representation (Goal-Decision Network) to capture the planning knowledge. The planning process is viewed as a negotiation process that applies three negotiation operators to search for beneficial decisions in a GDN. Also, the negotiation framework is applied to the problem of planning for cogeneration interconnection. The simulation results are presented to illustrate the cogeneration planning process.
Applying Statistical Process Quality Control Methodology to Educational Settings.
ERIC Educational Resources Information Center
Blumberg, Carol Joyce
A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (range), X (individual observations), MR (moving…
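As an illustration of the control charting arithmetic described above (not drawn from the article), the sketch below computes X-bar and R chart limits for subgroups of size five using the standard A2, D3 and D4 table constants; the data are synthetic.

```python
# Minimal X-bar / R control chart limits for rational subgroups of size 5.
# A2, D3, D4 are the standard SPC table constants for n = 5; data are synthetic.
import numpy as np

def xbar_r_limits(subgroups, A2=0.577, D3=0.0, D4=2.114):
    """subgroups: 2-D array, one row per rational subgroup of 5 observations."""
    x = np.asarray(subgroups, dtype=float)
    means = x.mean(axis=1)                      # subgroup means
    ranges = x.max(axis=1) - x.min(axis=1)      # subgroup ranges
    xbarbar, rbar = means.mean(), ranges.mean()
    return {
        "xbar_chart": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),  # (LCL, CL, UCL)
        "r_chart": (D3 * rbar, rbar, D4 * rbar),
    }

data = np.random.default_rng(0).normal(10.0, 0.5, size=(20, 5))  # 20 subgroups of 5
print(xbar_r_limits(data))
```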
Coupling Computer-Aided Process Simulation and ...
A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa™ process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies. The modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable
Intelligent systems/software engineering methodology - A process to manage cost and risk
NASA Technical Reports Server (NTRS)
Friedlander, Carl; Lehrer, Nancy
1991-01-01
A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.
Evaluating Payments for Environmental Services: Methodological Challenges
2016-01-01
Over the last fifteen years, Payments for Environmental Services (PES) schemes have become very popular environmental policy instruments, but the academic literature has begun to question their additionality. The literature attempts to estimate the causal effect of these programs by applying impact evaluation (IE) techniques. However, PES programs are complex instruments and IE methods cannot be directly applied without adjustments. Based on a systematic review of the literature, this article proposes a framework for the methodological process of designing an IE for PES schemes. It reviews and discusses the methodological choices at each step of the process and proposes guidelines for practitioners. PMID:26910850
A Nursing Process Methodology.
ERIC Educational Resources Information Center
Ryan-Wenger, Nancy M.
1990-01-01
A nursing methodology developed by the faculty at The Ohio State University teaches nursing students problem-solving techniques applicable to any nursing situation. It also provides faculty and students with a basis for measuring students' progress and ability in applying the nursing process. (Author)
Meaning and Problems of Planning
ERIC Educational Resources Information Center
Brieve, Fred J.; Johnston, A. P.
1973-01-01
Examines the educational planning process. Discusses what planning is, how methodological planning can work in education, misunderstandings about planning, and difficulties in applying the planning methodology. (DN)
A Data Preparation Methodology in Data Mining Applied to Mortality Population Databases.
Pérez, Joaquín; Iturbide, Emmanuel; Olivares, Víctor; Hidalgo, Miguel; Martínez, Alicia; Almanza, Nelva
2015-11-01
It is known that the data preparation phase is the most time-consuming part of the data mining process, consuming from 50% up to 70% of the total project time. Currently, data mining methodologies are general-purpose, and one of their limitations is that they do not provide guidance about which particular tasks to perform in a specific domain. This paper presents a new data preparation methodology oriented to the epidemiological domain, in which we have identified two sets of tasks: General Data Preparation and Specific Data Preparation. For both sets, the Cross-Industry Standard Process for Data Mining (CRISP-DM) is adopted as a guideline. The main contribution of our methodology is a set of fourteen specialized tasks for this domain. To validate the proposed methodology, we developed a data mining system and the entire process was applied to real mortality databases. The results were encouraging: the use of the methodology reduced some of the time-consuming tasks, and the data mining system revealed unknown and potentially useful patterns for the public health services in Mexico.
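As a hedged illustration of what a general data preparation task might look like in this setting (not code from the paper), the sketch below applies a few typical cleaning steps to a toy mortality table; the column names and rules are hypothetical.

```python
# Illustrative general data-preparation steps for a mortality database,
# in the spirit of the CRISP-DM data preparation phase.
# Column names (AGE, SEX, ICD_CAUSE, DEATH_DATE) are hypothetical.
import pandas as pd

def prepare_mortality_data(df: pd.DataFrame) -> pd.DataFrame:
    df = df.drop_duplicates()                       # remove duplicate records
    df = df.dropna(subset=["ICD_CAUSE"])            # cause of death is mandatory
    df["DEATH_DATE"] = pd.to_datetime(df["DEATH_DATE"], errors="coerce")
    df = df[df["AGE"].between(0, 120)]              # discard implausible ages
    df["ICD_CHAPTER"] = df["ICD_CAUSE"].str[0]      # coarse grouping by ICD-10 letter
    df["SEX"] = df["SEX"].str.upper().map({"M": "male", "F": "female"})
    return df

raw = pd.DataFrame({
    "AGE": [67, 45, 300], "SEX": ["m", "F", "F"],
    "ICD_CAUSE": ["I21", None, "C50"],
    "DEATH_DATE": ["2014-02-03", "2014-05-01", "2014-07-19"],
})
print(prepare_mortality_data(raw))
```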
Suggested criteria for evaluating systems engineering methodologies
NASA Technical Reports Server (NTRS)
Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.
1989-01-01
Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
NASA Astrophysics Data System (ADS)
Srivastava, Y.; Srivastava, S.; Boriwal, L.
2016-09-01
Mechanical alloying is a novel solid-state process that has received considerable attention due to many advantages over other conventional processes. In the present work, Co2FeAl Heusler alloy powder was prepared successfully from premix basic powders of Cobalt (Co), Iron (Fe) and Aluminum (Al) in the stoichiometric ratio 60Co-26Fe-14Al (weight %) by a novel mechano-chemical route. Magnetic properties of the mechanically alloyed powders were characterized by vibrating sample magnetometer (VSM). A two-factor, five-level design matrix was applied to the experimental process. The experimental results were used for response surface methodology. The interaction between the input process parameters and the response has been established with the help of regression analysis. Further, the analysis of variance technique was applied to check the adequacy of the developed model and the significance of the process parameters. A test case study was performed with parameters that were not selected for the main experimentation but lay within the same range. Using response surface methodology, the process parameters were optimized to obtain improved magnetic properties. The optimum process parameters were then identified using numerical and graphical optimization techniques.
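For readers unfamiliar with response surface methodology, the sketch below fits a second-order (quadratic) response surface for two coded factors by ordinary least squares; the factor levels and response data are synthetic placeholders, not values from the study.

```python
# Minimal second-order response-surface fit for two factors (e.g. milling time
# and speed) against one response (e.g. saturation magnetization).
# The data below are synthetic placeholders, not values from the study.
import numpy as np

def fit_quadratic_surface(x1, x2, y):
    # Model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

x1 = np.array([-2, -1, 0, 1, 2, 0, 0, 0, 0, -1, 1, -1, 1], dtype=float)  # coded levels
x2 = np.array([0, 0, 0, 0, 0, -2, -1, 1, 2, -1, -1, 1, 1], dtype=float)
y = 5 + 1.2*x1 - 0.8*x2 + 0.5*x1*x2 - 0.3*x1**2 - 0.2*x2**2 \
    + np.random.default_rng(1).normal(0, 0.05, x1.size)
print(fit_quadratic_surface(x1, x2, y))  # coefficients ~ [5, 1.2, -0.8, 0.5, -0.3, -0.2]
```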
Tracer methodology: an appropriate tool for assessing compliance with accreditation standards?
Bouchard, Chantal; Jean, Olivier
2017-10-01
Tracer methodology has been used by Accreditation Canada since 2008 to collect evidence on the quality and safety of care and services, and to assess compliance with accreditation standards. Given the importance of this methodology in the accreditation program, the objective of this study is to assess the quality of the methodology and identify its strengths and weaknesses. A mixed quantitative and qualitative approach was adopted to evaluate consistency, appropriateness, effectiveness and stakeholder synergy in applying the methodology. An online questionnaire was sent to 468 Accreditation Canada surveyors. According to surveyors' perceptions, tracer methodology is an effective tool for collecting useful, credible and reliable information to assess compliance with Qmentum program standards and priority processes. The results show good coherence between methodology components (appropriateness of the priority processes evaluated, activities to evaluate a tracer, etc.). The main weaknesses are the time constraints faced by surveyors and management's lack of cooperation during the evaluation of tracers. The inadequate amount of time allowed for the methodology to be applied properly raises questions about the quality of the information obtained. This study paves the way for a future, more in-depth exploration of the identified weaknesses to help the accreditation organization make more targeted improvements to the methodology. Copyright © 2016 John Wiley & Sons, Ltd.
Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier
2012-01-01
Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects in human and natural activities. Maintaining an updated spatial database with the occurred changes allows a better monitoring of the Earth’s resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery, aerial photographs, etc., have proven to be suitable and secure data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are processed; then different thresholding algorithms for change/no_change are applied to these indices in order to better estimate the statistical parameters of these categories; finally, the indices are integrated into a change detection multisource fusion process, which allows generating a single CD result from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. The obtained results are then evaluated by means of a quality control analysis, as well as with complementary graphical representations. The suggested methodology has also proved efficient for identifying the change detection index with the highest contribution. PMID:22737023
Incorporating ITS into corridor planning : Seattle case study
DOT National Transportation Integrated Search
1999-08-01
The goals of this study were to develop a methodology for incorporating Intelligent Transportation Systems (ITS) into the transportation planning process and apply the methodology to estimate ITS costs and benefits for one case study. A major result ...
Incorporating ITS into corridor planning : Seattle case study
DOT National Transportation Integrated Search
1999-06-01
The goals of this study were to develop a methodology for incorporating Intelligent Transportation Systems (ITS) into the transportation planning process and apply the methodology to estimate ITS costs and benefits for one case study. A major result ...
Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)
NASA Astrophysics Data System (ADS)
Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.
2014-04-01
A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal framework of analysis to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approaches as a comprehensive and practical management tool.
2009-09-01
... models are explored using the Knowledge Value Added (KVA) methodology, and the most efficient model is developed and validated by applying it to the current IA C&A process flow at the TSO-KC. Finally, ... models requires only one available actor from its respective group, rather than all actors in the group, to ...
Integrated structure/control design - Present methodology and future opportunities
NASA Technical Reports Server (NTRS)
Weisshaar, T. A.; Newsom, J. R.; Zeiler, T. A.; Gilbert, M. G.
1986-01-01
Attention is given to current methodology applied to the integration of the optimal design process for structures and controls. Multilevel linear decomposition techniques proved to be most effective in organizing the computational efforts necessary for ISCD (integrated structures and control design) tasks. With the development of large orbiting space structures and actively controlled, high performance aircraft, there will be more situations in which this concept can be applied.
Sampalli, Tara; Desy, Michel; Dhir, Minakshi; Edwards, Lynn; Dickson, Robert; Blackmore, Gail
2015-04-05
Recognizing the significant impact of wait times for care for individuals with complex chronic conditions, we applied a LEAN methodology, namely an adaptation of Value Stream Mapping (VSM), to meet the needs of people with multiple chronic conditions and to improve wait times without additional resources or funding. Over an 18-month period, staff applied a patient-centric approach that included the LEAN methodology of VSM to improve wait times to care. Our framework of evaluation was grounded in the needs and perspectives of patients and individuals waiting to receive care. Patient-centric views were obtained through surveys such as the Patient Assessment of Chronic Illness Care (PACIC) and process-engineering-based questions. In addition, the LEAN VSM methodology was used to identify non-value-added processes contributing to wait times. The care team successfully reduced wait times to 2 months in 2014, with no wait times for care anticipated in 2015. Increased patient engagement and satisfaction are also outcomes of this innovative initiative. In addition, successful transformations and implementation have resulted in resource efficiencies without an increase in costs. Patients have shown significant improvements in functional health following the Integrated Chronic Care Service (ICCS) intervention. The methodology will be applied to other chronic disease management areas in Capital Health and the province. Wait times to care in the management of multimorbidities and other complex conditions can add a significant burden not only on the affected individuals but also on the healthcare system. In this study, a novel and modified LEAN methodology has been applied to embed the voice of the patient in care delivery processes and to reduce wait times to care in the management of complex chronic conditions. © 2015 by Kerman University of Medical Sciences.
[Methodological problems in the use of information technologies in physical education].
Martirosov, E G; Zaĭtseva, G A
2000-01-01
The paper considers methodological problems in the use of computer technologies in physical education by applying diagnostic and consulting systems, educational and educational-and-training process automation systems, and control and self-control programmes for athletes and others.
Parallel processing in a host plus multiple array processor system for radar
NASA Technical Reports Server (NTRS)
Barkan, B. Z.
1983-01-01
Host plus multiple array processor architecture is demonstrated to yield a modular, fast, and cost-effective system for radar processing. Software methodology for programming such a system is developed. Parallel processing with pipelined data flow among the host, array processors, and discs is implemented. Theoretical analysis of performance is made and experimentally verified. The broad class of problems to which the architecture and methodology can be applied is indicated.
NASA Astrophysics Data System (ADS)
Faiz, J. M.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.
2017-09-01
This study performs a simulation-based optimisation of injection moulding process parameters using Autodesk Moldflow Insight (AMI) software. The process parameters considered are melt temperature, mould temperature, packing pressure, and cooling time, and their effect on the warpage of the part is analysed. The part selected for study is made of polypropylene (PP). The combination of process parameters is analysed using Analysis of Variance (ANOVA), and the optimised values are obtained using Response Surface Methodology (RSM). RSM as well as a Genetic Algorithm (GA) are applied in the Design Expert software in order to minimise the warpage value. The outcome of this study shows that the warpage value is improved by using RSM and GA.
The Statistical point of view of Quality: the Lean Six Sigma methodology.
Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto
2015-04-01
Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures; therefore, its use in the health-care arena has focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After a review of the methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method in the reduction of complications during and after lobectomies. Using Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a measurement, from a statistical point of view, of surgical quality. PMID:25973253
Torre, Michele; Digka, Nikoletta; Anastasopoulou, Aikaterini; Tsangaris, Catherine; Mytilineou, Chryssi
2016-12-15
Research studies on the effects of microlitter on marine biota have become more and more frequent in the last few years. However, there is strong evidence that scientific results based on microlitter analyses can be biased by contamination from airborne fibres. This study demonstrates a low-cost and easy-to-apply methodology to minimize background contamination and thus increase the validity of results. Contamination during the gastrointestinal content analysis of 400 fishes was tested for several sample processing steps at high risk of airborne contamination (e.g. dissection, stereomicroscopic analysis, and chemical digestion treatment for microlitter extraction). It was demonstrated that, using our methodology based on hermetic enclosure devices isolating the working areas during the various processing steps, airborne contamination was reduced by 95.3%. The simplicity and low cost of this methodology provide the benefit that it could be applied not only to laboratory work but also to field or on-board work. Copyright © 2016 Elsevier Ltd. All rights reserved.
A Study about Kalman Filters Applied to Embedded Sensors
Valade, Aurélien; Acco, Pascal; Grabolosa, Pierre; Fourniols, Jean-Yves
2017-01-01
Over the last decade, smart sensors have grown in complexity and can now handle multiple measurement sources. This work establishes a methodology to achieve better estimates of physical values by processing raw measurements within a sensor using multi-physical models and Kalman filters for data fusion. With production cost and power consumption as driving constraints, this methodology focuses on algorithmic complexity while meeting real-time constraints and improving both precision and reliability despite the limitations of low-power processors. Consequently, the processing time available for other tasks is maximized. The known problem of estimating a 2D orientation using an inertial measurement unit with automatic gyroscope bias compensation is used to illustrate the proposed methodology applied to a low-power STM32L053 microcontroller. This application shows promising results with a processing time of 1.18 ms at 32 MHz and a 3.8% CPU usage due to computation at a 26 Hz measurement and estimation rate. PMID:29206187
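A minimal sketch of the kind of filter the abstract describes, assuming a single-axis tilt estimate: a two-state Kalman filter (angle and gyroscope bias) that predicts with the gyro rate and corrects with an accelerometer-derived angle. This is a generic textbook formulation, not the authors' exact filter or its embedded implementation.

```python
# Minimal 1-axis orientation Kalman filter with state [angle, gyro_bias]:
# predict with the rate-gyro measurement, correct with an accelerometer-derived
# tilt angle. Generic textbook sketch, not the filter from the paper.
import numpy as np

class TiltKalman:
    def __init__(self, dt, q_angle=1e-4, q_bias=1e-6, r_angle=1e-2):
        self.x = np.zeros(2)                    # [angle (rad), gyro bias (rad/s)]
        self.P = np.eye(2)
        self.F = np.array([[1.0, -dt], [0.0, 1.0]])
        self.B = np.array([dt, 0.0])
        self.Q = np.diag([q_angle, q_bias])
        self.H = np.array([[1.0, 0.0]])
        self.R = np.array([[r_angle]])

    def step(self, gyro_rate, accel_angle):
        # Predict: integrate the bias-corrected gyro rate.
        self.x = self.F @ self.x + self.B * gyro_rate
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the accelerometer tilt angle.
        y = accel_angle - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]

kf = TiltKalman(dt=1 / 26)                      # 26 Hz rate, as quoted in the abstract
angle = kf.step(gyro_rate=0.01, accel_angle=0.02)
```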
Analysis of a Proposal to Implement the Readiness Based Sparing Process in the Brazilian Navy
2017-06-01
... determine inventory levels. This research investigates whether implementing the U.S. DOD readiness-based sparing (RBS) methodology could provide the Brazilian Navy with greater ... suggested by applying the methodology first for determining reparable spares initial provisioning.
Site selection for MSFC operational tests of solar heating and cooling systems
NASA Technical Reports Server (NTRS)
1978-01-01
The criteria, methodology, and sequence aspects of the site selection process are presented. This report organized the logical thought process that should be applied to the site selection process, but final decisions are highly selective.
Assessment Methodology for Process Validation Lifecycle Stage 3A.
Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Chen, Shu; Ingram, Marzena; Spes, Jana
2017-07-01
The paper introduces evaluation methodologies and associated statistical approaches for process validation lifecycle Stage 3A. The assessment tools proposed can be applied to newly developed and launched small-molecule as well as bio-pharma products, where substantial process and product knowledge has been gathered. The following elements may be included in Stage 3A: determination of the number of Stage 3A batches; evaluation of critical material attributes, critical process parameters, and critical quality attributes; in vivo-in vitro correlation; estimation of inherent process variability (IPV) and the PaCS index; the process capability and quality dashboard (PCQd); and an enhanced control strategy. US FDA guidance on Process Validation: General Principles and Practices, January 2011, encourages applying previous credible experience with suitably similar products and processes. A complete Stage 3A evaluation is a valuable resource for product development and future risk mitigation of similar products and processes. Elements of the 3A assessment were developed to address industry and regulatory guidance requirements. The conclusions made provide sufficient information to make a scientific and risk-based decision on product robustness.
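As one small, hedged example of the statistical building blocks such a Stage 3A assessment could draw on (not taken from the paper), the sketch below computes the basic process capability indices Cp and Cpk from batch assay data; the specification limits and values are hypothetical.

```python
# Basic process capability indices (Cp, Cpk) from Stage 3A batch data.
# Illustrative only; limits and data are hypothetical.
import numpy as np

def capability(values, lsl, usl):
    values = np.asarray(values, dtype=float)
    mu, sigma = values.mean(), values.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)                    # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)       # capability accounting for centering
    return cp, cpk

assay = [99.1, 100.4, 99.8, 100.9, 99.5, 100.2, 99.9, 100.6]   # % label claim
print(capability(assay, lsl=95.0, usl=105.0))
```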
GREENSCOPE Technical User’s Guide
GREENSCOPE’s methodology has been developed and its software tool designed such that it can be applied to an entire process, to a piece of equipment or process unit, or at the investigatory bench scale.
The Scholarship of Teaching and Learning: Transformation and Transgression
ERIC Educational Resources Information Center
Bolf-Beliveau, Laura
2013-01-01
Chapter Five of "The Scholarship of Teaching and Learning Reconsidered" (2011) suggests that traditional research scholarship methodology can inform and reform the ways in which we value and evaluate teaching. The authors discuss applying research methodology as way to complete this process. This article suggests that using theoretical…
The Interpretative Phenomenological Analysis (IPA): A Guide to a Good Qualitative Research Approach
ERIC Educational Resources Information Center
Alase, Abayomi
2017-01-01
As a research methodology, qualitative research method infuses an added advantage to the exploratory capability that researchers need to explore and investigate their research studies. Qualitative methodology allows researchers to advance and apply their interpersonal and subjectivity skills to their research exploratory processes. However, in a…
Life Cycle Assessment Software for Product and Process Sustainability Analysis
ERIC Educational Resources Information Center
Vervaeke, Marina
2012-01-01
In recent years, life cycle assessment (LCA), a methodology for assessment of environmental impacts of products and services, has become increasingly important. This methodology is applied by decision makers in industry and policy, product developers, environmental managers, and other non-LCA specialists working on environmental issues in a wide…
NASA Astrophysics Data System (ADS)
Mateos-Espejel, Enrique
The objective of this thesis is to develop, validate, and apply a unified methodology for the energy efficiency improvement of a Kraft process that addresses globally the interactions of the various process systems that affect its energy performance. An implementation strategy is the final result. An operating Kraft pulping mill situated in Eastern Canada with a production of 700 adt/d of high-grade bleached pulp was the case study. The Pulp and Paper industry is Canada's premier industry. It is characterized by large thermal energy and water consumption. Rising energy costs and more stringent environmental regulations have led the industry to refocus its efforts toward identifying ways to improve energy and water conservation. Energy and water aspects are usually analyzed independently, but in reality they are strongly interconnected. Therefore, there is a need for an integrated methodology, which considers energy and water aspects, as well as the optimal utilization and production of the utilities. The methodology consists of four successive stages. The first stage is the base case definition. The development of a focused, reliable and representative model of an operating process is a prerequisite to the optimization and fine tuning of its energy performance. A four-pronged procedure has been developed: data gathering, master diagram, utilities systems analysis, and simulation. The computer simulation has been focused on the energy and water systems. The second stage corresponds to the benchmarking analysis. The benchmarking of the base case has the objectives of identifying the process inefficiencies and to establish guidelines for the development of effective enhancement measures. The studied process is evaluated by a comparison of its efficiency to the current practice of the industry and by the application of new energy and exergy content indicators. The minimum energy and water requirements of the process are also determined in this step. The third stage is the core of the methodology; it represents the formulation of technically feasible energy enhancing options. Several techniques are applied in an iterative procedure to cast light on their synergies and counter-actions. The objective is to develop a path for improving the process so as to maximize steam savings while minimizing the investment required. The fourth stage is the implementation strategy. As the existing process configuration and operating conditions vary from process to process it is important to develop a strategy for the implementation of energy enhancement programs in the most advantageous way for each case. A three-phase strategy was selected for the specific case study in the context of its management strategic plan: the elimination of fossil fuel, the production of power and the liberation of steam capacity. A post-benchmarking analysis is done to quantify the improvement of the energy efficiency. The performance indicators are computed after all energy enhancing measures have been implemented. The improvement of the process by applying the unified methodology results in substantially more steam savings than by applying individually the typical techniques that it comprises: energy savings of 5.6 GJ/adt (27% of the current requirement), water savings of 32 m3/adt (34% of the current requirement) and an electricity production potential of 44.5MW. 
As a result of applying the unified methodology the process becomes eco-friendly as it does not require fossil fuel for producing steam; its water and steam consumptions are below the Canadian average and it produces large revenues from the production of green electricity.
Tchepel, Oxana; Dias, Daniela
2011-06-01
This study is focused on the assessment of potential health benefits of meeting the air quality limit values (2008/50/CE) for short-term PM₁₀ exposure. For this purpose, the WHO methodology for Health Impact Assessment and the APHEIS guidelines for data collection were applied to the Porto Metropolitan Area, Portugal. Additionally, an improved methodology using population mobility data is proposed in this work to analyse the number of persons exposed. In order to obtain representative background concentrations, an innovative approach to process air quality time series was implemented. The results provide the number of attributable cases prevented annually by reducing the PM₁₀ concentration. An intercomparison of two approaches to process input data for the health risk analysis provides information on the sensitivity of the applied methodology. The findings highlight the importance of taking into account the spatial variability of air pollution levels and population mobility in health impact assessment.
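The attributable-cases arithmetic behind this kind of assessment is standard; a minimal sketch is shown below. The relative risk per 10 µg/m³ and the baseline mortality figure are placeholders, not values from the study.

```python
# Standard health-impact-assessment arithmetic: attributable cases prevented
# when PM10 drops from a current to a target concentration.
# The relative risk per 10 ug/m3 and the baseline figures are placeholders.
import math

def attributable_cases(rr_per_10, delta_conc, baseline_cases):
    rr = math.exp(math.log(rr_per_10) * delta_conc / 10.0)   # RR for the actual reduction
    af = (rr - 1.0) / rr                                     # attributable fraction
    return af * baseline_cases

# e.g. reducing exposure by 8 ug/m3 in a population with 5000 baseline deaths/year,
# assuming RR = 1.006 per 10 ug/m3 (placeholder value)
print(round(attributable_cases(1.006, 8.0, 5000), 1))
```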
Automatic Feature Selection and Improved Classification in SICADA Counterfeit Electronics Detection
2017-03-20
The SICADA methodology was developed to detect such counterfeit microelectronics by collecting power side channel data and applying machine learning...to identify counterfeits. This methodology has been extended to include a two-step automated feature selection process and now uses a one-class SVM...classifier. We describe this methodology and show results for empirical data collected from several types of Microchip dsPIC33F microcontrollers
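A minimal sketch of one-class classification on side-channel feature vectors, assuming scikit-learn's OneClassSVM and synthetic features; it only illustrates the flagging step, not the SICADA feature selection or data collection.

```python
# One-class SVM sketch: train on side-channel features of known-authentic parts,
# then flag outliers as suspect. Feature vectors here are synthetic placeholders.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
authentic = rng.normal(0.0, 1.0, size=(200, 16))   # training features (authentic only)
suspect = rng.normal(3.0, 1.0, size=(5, 16))       # shifted distribution mimics counterfeits

clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(authentic)
print(clf.predict(suspect))   # -1 = flagged as outlier (possible counterfeit), +1 = inlier
```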
Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process
NASA Astrophysics Data System (ADS)
Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.
2015-08-01
An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model, which was developed within the commercial finite element package, ABAQUS™—ABAQUS is a trademark of Dassault Systèmes; (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library, Scipy. To achieve optimal casting quality, a set of constraints have been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to process productivity and product quality improvements.
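A minimal sketch of the optimization loop's shape, assuming SciPy's SLSQP with a cooling-rate objective and a directional-solidification constraint; run_casting_model is a stand-in function, not the ABAQUS model or the results-extraction code from the paper.

```python
# Minimal shape of the optimization loop: scipy.optimize.minimize drives the
# timing of two die cooling channels. run_casting_model() stands in for the
# ABAQUS simulation plus results extraction described in the paper.
import numpy as np
from scipy.optimize import minimize

def run_casting_model(t_on):
    """Placeholder: returns (mean cooling rate, directional-solidification margin)."""
    t1, t2 = t_on
    cooling_rate = 5.0 - 0.02 * t1 - 0.015 * t2          # fake response surface
    solidification_margin = 10.0 - abs(t1 - t2) * 0.1    # >= 0 means directional
    return cooling_rate, solidification_margin

objective = lambda t: -run_casting_model(t)[0]           # maximize cooling rate
constraints = [{"type": "ineq", "fun": lambda t: run_casting_model(t)[1]}]
bounds = [(0.0, 120.0), (0.0, 120.0)]                    # cooling start times (s)

res = minimize(objective, x0=[60.0, 60.0], bounds=bounds,
               constraints=constraints, method="SLSQP")
print(res.x, -res.fun)
```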
Cabrera-Barona, Pablo; Ghorbanzadeh, Omid
2018-01-16
Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas.
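For context, the classic point-valued AHP weight derivation is sketched below (principal eigenvector of a pairwise comparison matrix plus a consistency ratio); the interval AHP used for the alternative index generalizes this, and the comparison matrix shown is purely illustrative.

```python
# Classic (point-valued) AHP weight derivation: principal eigenvector of a
# pairwise comparison matrix plus Saaty's consistency ratio.
# The matrix below is illustrative, not from the study.
import numpy as np

def ahp_weights(A, ri={3: 0.58, 4: 0.90, 5: 1.12}):
    A = np.asarray(A, dtype=float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                   # normalized priority weights
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)              # consistency index
    return w, ci / ri[n]                           # (weights, consistency ratio)

# Three deprivation indicators compared pairwise on Saaty's 1-9 scale
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
print(weights.round(3), round(cr, 3))
```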
Cabrera-Barona, Pablo
2018-01-01
Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas. PMID:29337915
Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P
2014-01-01
Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.
12om Methodology: Process v1.1
2014-03-31
... in support of the Applied Research Project (ARP) 12om entitled "Collaborative Understanding of Complex Situations". The overall purpose of this project is to develop a ... (Acronyms: ARP - Applied Research Project; CF - Canadian Forces; CFOPP - Canadian Forces Operational Planning Process; CIDA - Canadian International ...)
Using Appreciative Inquiry to Create a Sustainable Rural School District and Community
ERIC Educational Resources Information Center
Calabrese, Raymond; Hester, Michael; Friesen, Scott; Burkhalter, Kim
2010-01-01
Purpose: The purpose of this paper is to document how a doctoral research team applied an action research process to improve communication and collaboration strategies among rural Midwestern school district stakeholders. Design/methodology/approach: An appreciative inquiry (AI) action research methodology framed as a qualitative case study using…
Recovery and purification process development for monoclonal antibody production
Ma, Junfen; Winter, Charles; Bayer, Robert
2010-01-01
Hundreds of therapeutic monoclonal antibodies (mAbs) are currently in development, and many companies have multiple antibodies in their pipelines. Current methodologies used in recovery processes for these molecules are reviewed here. Basic unit operations such as harvest, Protein A affinity chromatography and additional polishing steps are surveyed. Alternative processes such as flocculation, precipitation and membrane chromatography are discussed. We also cover platform approaches to purification methods development and the use of high-throughput screening methods, and offer a view on future developments in purification methodology as applied to mAbs. PMID:20647768
NASA Technical Reports Server (NTRS)
Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.
2014-01-01
SHM/FM theory has been successfully applied to the selection of the baseline set of Abort Triggers for the NASA SLS, and quantitative assessment played a useful role in the decision process. M&FM, which is new within NASA MSFC, required the most "new" work, as this quantitative analysis had never been done before: it required development of the methodology and tool to mechanize the process and established new relationships to the other groups. The process is now an accepted part of the SLS design process and will likely be applied to similar programs in the future at NASA MSFC. Future improvements include improving technical accuracy (differentiating crew survivability due to an abort versus survivability even when no immediate abort occurs, e.g. a small explosion with little debris; accounting for the contingent dependence of secondary triggers on primary triggers; and allocating the "Δ LOC Benefit" of each trigger when added to the previously selected triggers) and reducing future costs through the development of a specialized tool. The methodology can be applied to any manned/unmanned vehicle, in space or terrestrial.
Application of Six Sigma towards improving surgical outcomes.
Shukla, P J; Barreto, S G; Nadkarni, M S
2008-01-01
Six Sigma is a 'process excellence' tool targeting continuous improvement, achieved by providing a methodology for improving key steps of a process. It is ripe for application in health care, since almost all health care processes require a near-zero tolerance for mistakes. The aim of this study is to apply the Six Sigma methodology to a clinical surgical process and to assess the improvement (if any) in outcomes and patient care. The guiding principles of Six Sigma, namely DMAIC (Define, Measure, Analyze, Improve, Control), were used to analyze the impact of the double stapling technique (DST) on improving sphincter preservation rates for rectal cancer. The analysis using the Six Sigma methodology revealed a Sigma score of 2.10 in relation to successful sphincter preservation. This score demonstrates an improvement over the previous technique (73% versus the previous 54%). This study represents one of the first clinical applications of Six Sigma in the surgical field. By understanding, accepting, and applying the principles of Six Sigma, we have an opportunity to transfer a very successful management philosophy to facilitate the identification of key steps that can improve outcomes and ultimately patient safety and the quality of surgical care provided.
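The reported score is consistent with a common convention for converting a yield into a sigma level (normal quantile of the yield plus a 1.5-sigma long-term shift); the check below assumes that convention.

```python
# The reported sigma score of 2.10 is consistent with the common convention:
# sigma level = normal quantile of the yield + a 1.5-sigma long-term shift.
from scipy.stats import norm

def sigma_level(yield_fraction, shift=1.5):
    return norm.ppf(yield_fraction) + shift

print(round(sigma_level(0.73), 2))   # ~2.11 for a 73% sphincter-preservation rate
print(round(sigma_level(0.54), 2))   # ~1.60 for the earlier 54% rate
```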
Moore, Bethany; Bone, Eric A
2017-01-01
The concept of triage in healthcare has been around for centuries and continues to be applied today so that scarce resources are allocated according to need. A business impact analysis (BIA) is a form of triage in that it identifies which processes are most critical, which to address first and how to allocate limited resources. On its own, however, the BIA provides only a roadmap of the impacts and interdependencies of an event. When disaster strikes, organisational decision-makers often face difficult decisions with regard to allocating limited resources between multiple 'mission-critical' functions. Applying the concept of triage to business continuity provides those decision-makers navigating a rapidly evolving and unpredictable event with a path that protects the fundamental priorities of the organisation. A business triage methodology aids decision-makers in times of crisis by providing a simplified framework for decision-making based on objective, evidence-based criteria, which is universally accepted and understood. When disaster strikes, the survival of the organisation depends on critical decision-making and quick actions to stabilise the incident. This paper argues that organisations need to supplement BIA processes with a decision-making triage methodology that can be quickly applied during the chaos of an actual event.
Bovea, María D; Ibáñez-Forés, Valeria; Pérez-Belis, Victoria; Quemades-Beltrán, Pilar
2016-07-01
This study proposes a general methodology for assessing and estimating the potential reuse of small waste electrical and electronic equipment (sWEEE), focusing on devices classified as domestic appliances. Specific tests for visual inspection, function and safety have been defined for ten different types of household appliances (vacuum cleaner, iron, microwave, toaster, sandwich maker, hand blender, juicer, boiler, heater and hair dryer). After applying the tests, reuse protocols have been defined in the form of easy-to-apply checklists for each of the ten types of appliance evaluated. This methodology could be useful for reuse enterprises, since there is a lack of specific protocols, adapted to each type of appliance, to test its potential of reuse. After applying the methodology, electrical and electronic appliances (used or waste) can be segregated into three categories: the appliance works properly and can be classified as direct reuse (items can be used by a second consumer without prior repair operations), the appliance requires a later evaluation of its potential refurbishment and repair (restoration of products to working order, although with possible loss of quality) or the appliance needs to be finally discarded from the reuse process and goes directly to a recycling process. Results after applying the methodology to a sample of 87.7kg (96 units) show that 30.2% of the appliances have no potential for reuse and should be diverted for recycling, while 67.7% require a subsequent evaluation of their potential refurbishment and repair, and only 2.1% of them could be directly reused with minor cleaning operations. This study represents a first approach to the "preparation for reuse" strategy that the European Directive related to Waste Electrical and Electronic Equipment encourages to be applied. However, more research needs to be done as an extension of this study, mainly related to the identification of the feasibility of repair or refurbishment operations. Copyright © 2016 Elsevier Ltd. All rights reserved.
Methodological challenges when doing research that includes ethnic minorities: a scoping review.
Morville, Anne-Le; Erlandsson, Lena-Karin
2016-11-01
There are challenging methodological issues in obtaining valid and reliable results on which to base occupational therapy interventions for ethnic minorities. The aim of this scoping review is to describe the methodological problems within occupational therapy research, when ethnic minorities are included. A thorough literature search yielded 21 articles obtained from the scientific databases PubMed, Cinahl, Web of Science and PsychInfo. Analysis followed Arksey and O'Malley's framework for scoping reviews, applying content analysis. The results showed methodological issues concerning the entire research process from defining and recruiting samples, the conceptual understanding, lack of appropriate instruments, data collection using interpreters to analyzing data. In order to avoid excluding the ethnic minorities from adequate occupational therapy research and interventions, development of methods for the entire research process is needed. It is a costly and time-consuming process, but the results will be valid and reliable, and therefore more applicable in clinical practice.
"Conceptual Change" as both Revolutionary and Evolutionary Process
ERIC Educational Resources Information Center
Keiny, Shoshana
2008-01-01
Our argument concerning the debate around the process of "conceptual change" is that it is both an evolutionary learning process and a revolutionary paradigm change. To gain a deeper understanding of the process, the article focuses on the discourse of educational facilitators participating in a community of learners. Applying the methodology of…
ERIC Educational Resources Information Center
Maseda, F. J.; Martija, I.; Martija, I.
2012-01-01
This paper describes a novel Electrical Machine and Power Electronic Training Tool (EM&PE[subscript TT]), a methodology for using it, and associated experimental educational activities. The training tool is implemented by recreating a whole power electronics system, divided into modular blocks. This process is similar to that applied when…
A Matrix Approach to Software Process Definition
NASA Technical Reports Server (NTRS)
Schultz, David; Bachman, Judith; Landis, Linda; Stark, Mike; Godfrey, Sally; Morisio, Maurizio; Powers, Edward I. (Technical Monitor)
2000-01-01
The Software Engineering Laboratory (SEL) is currently engaged in a Methodology and Metrics program for the Information Systems Center (ISC) at Goddard Space Flight Center (GSFC). This paper addresses the Methodology portion of the program. The purpose of the Methodology effort is to assist a software team lead in selecting and tailoring a software development or maintenance process for a specific GSFC project. It is intended that this process will also be compliant with both ISO 9001 and the Software Engineering Institute's Capability Maturity Model (CMM). Under the Methodology program, we have defined four standard ISO-compliant software processes for the ISC, and three tailoring criteria that team leads can use to categorize their projects. The team lead would select a process and appropriate tailoring factors, from which a software process tailored to the specific project could be generated. Our objective in the Methodology program is to present software process information in a structured fashion, to make it easy for a team lead to characterize the type of software engineering to be performed, and to apply tailoring parameters to search for an appropriate software process description. This will enable the team lead to follow a proven, effective software process and also satisfy NASA's requirement for compliance with ISO 9001 and the anticipated requirement for CMM assessment. This work is also intended to support the deployment of sound software processes across the ISC.
An industrial ecology approach to municipal solid waste ...
Municipal solid waste (MSW) can be viewed as a feedstock for industrial ecology inspired conversions of wastes to valuable products and energy. The industrial ecology principle of symbiotic processes using waste streams for creating value-added products is applied to MSW, with examples suggested for various residual streams. A methodology is presented to consider individual waste-to-energy or waste-to-product system synergies, evaluating the economic and environmental issues associated with each system. Steps included in the methodology include identifying waste streams, specific waste components of interest, and conversion technologies, plus steps for determining the economic and environmental effects of using wastes and changes due to transport, administrative handling, and processing. In addition to presenting the methodology, technologies for various MSW input streams are categorized as commercialized or demonstrated to provide organizations that are considering processes for MSW with summarized information. The organization can also follow the methodology to analyze interesting processes. Presents information useful for analyzing the sustainability of alternatives for the management of municipal solid waste.
Applying a contemporary grounded theory methodology.
Licqurish, Sharon; Seibold, Carmel
2011-01-01
The aim of this paper is to discuss the application of a contemporary grounded theory methodology to a research project exploring the experiences of students studying for a degree in midwifery. Grounded theory is a qualitative research approach developed by Glaser and Strauss in the 1960s, but the methodology for this study was modelled on Clarke's (2005) approach and was underpinned by a symbolic interactionist theoretical perspective, the post-structuralist theories of Michel Foucault and a constructionist epistemology. The study participants were 19 midwifery students completing their final placement. Data were collected through individual in-depth interviews and participant observation, and analysed using the grounded theory analysis techniques of coding, constant comparative analysis and theoretical sampling, as well as situational maps. The analysis focused on social action and interaction and the operation of power in the students' environment. The social process in which the students were involved, as well as the actors and discourses that affected the students' competency development, were highlighted. The methodology allowed a thorough exploration of the students' experiences of achieving competency. However, some difficulties were encountered. One of the major issues related to the understanding and application of complex sociological theories that challenged positivist notions of truth and power. Furthermore, the mapping processes were complex. Despite these minor challenges, the authors recommend applying this methodology to other similar research projects.
NASA Technical Reports Server (NTRS)
Dec, John A.; Braun, Robert D.
2011-01-01
A finite element ablation and thermal response program is presented for simulation of three-dimensional transient thermostructural analysis. The three-dimensional governing differential equations and finite element formulation are summarized. A novel probabilistic design methodology for thermal protection systems is presented. The design methodology is an eight-step process beginning with a parameter sensitivity study, followed by a deterministic analysis from which an optimum design can be determined. The design process concludes with a Monte Carlo simulation in which the probabilities of exceeding design specifications are estimated. The methodology is demonstrated by applying it to the carbon phenolic compression pads of the Crew Exploration Vehicle. The maximum allowed values of bondline temperature and tensile stress are used as the design specifications in this study.
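As an illustration of the final Monte Carlo step, the hedged sketch below estimates the probability of exceeding a bondline temperature limit using a hypothetical surrogate model and invented input distributions; none of the parameter values, the surrogate expression, or the limit are taken from the study.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n_samples = 100_000

# Illustrative, assumed input distributions (not CEV values): normalised
# heat load and material conductivity.
heat_load    = rng.normal(1.0, 0.08, n_samples)
conductivity = rng.normal(1.0, 0.05, n_samples)

def bondline_temperature(q, k):
    """Hypothetical surrogate for the thermal response model."""
    return 520.0 * q / k            # degrees, purely illustrative

T_limit = 560.0                     # assumed design specification
T = bondline_temperature(heat_load, conductivity)

p_exceed = np.mean(T > T_limit)
print(f"Estimated probability of exceeding the bondline limit: {p_exceed:.4f}")
```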
Lismont, Jasmien; Janssens, Anne-Sophie; Odnoletkova, Irina; Vanden Broucke, Seppe; Caron, Filip; Vanthienen, Jan
2016-10-01
The aim of this study is to guide healthcare instances in applying process analytics to healthcare processes. Process analytics techniques can offer new insights into patient pathways, workflow processes, adherence to medical guidelines and compliance with clinical pathways, but they also bring specific challenges, which are examined and addressed in this paper. The following methodology is proposed: log preparation, log inspection, abstraction and selection, clustering, process mining, and validation. It was applied to a case study in the type 2 diabetes mellitus domain. Several data pre-processing steps are applied, clarifying the usefulness of process analytics in a healthcare setting. Healthcare utilization, such as diabetes education, is analyzed and compared with diabetes guidelines. Furthermore, we examine the organizational perspective and the central role of the GP. This research addresses four challenges: healthcare processes are often patient- and hospital-specific, which leads to unique traces and unstructured processes; data is not recorded in the right format, with the right level of abstraction and time granularity; an overflow of medical activities may cloud the analysis; and analysts need to deal with data not recorded for this purpose. These challenges complicate the application of process analytics. It is explained how our methodology takes them into account. Process analytics offers new insights into the medical services patients follow, how medical resources relate to each other and whether patients and healthcare processes comply with guidelines and regulations. Copyright © 2016 Elsevier Ltd. All rights reserved.
A combinatorial framework to quantify peak/pit asymmetries in complex dynamics.
Hasson, Uri; Iacovacci, Jacopo; Davis, Ben; Flanagan, Ryan; Tagliazucchi, Enzo; Laufs, Helmut; Lacasa, Lucas
2018-02-23
We explore a combinatorial framework which efficiently quantifies the asymmetries between minima and maxima in local fluctuations of time series. We first showcase its performance by applying it to a battery of synthetic cases. We find rigorous results on some canonical dynamical models (stochastic processes with and without correlations, chaotic processes), complemented by extensive numerical simulations for a range of processes which indicate that the methodology correctly distinguishes different complex dynamics and outperforms state-of-the-art metrics in several cases. Subsequently, we apply this methodology to real-world problems emerging across several disciplines, including cases in neurobiology, finance and climate science. We conclude that differences between the statistics of local maxima and local minima in time series are highly informative of the complex underlying dynamics, and a graph-theoretic extraction procedure allows these features to be used for statistical learning purposes.
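The paper's construction is graph-theoretic, but the basic idea of comparing the statistics of local maxima and local minima can be illustrated with a much cruder, hedged sketch: the asymmetry score below (a total-variation-style distance between the two empirical amplitude distributions) and the test signals are inventions for illustration, not the authors' method.

```python
import numpy as np

def local_extrema(x):
    """Indices of strict local maxima (peaks) and minima (pits)."""
    dx = np.diff(x)
    peaks = np.where((dx[:-1] > 0) & (dx[1:] < 0))[0] + 1
    pits  = np.where((dx[:-1] < 0) & (dx[1:] > 0))[0] + 1
    return peaks, pits

def peak_pit_asymmetry(x, bins=30):
    """Crude asymmetry score: distance between the distributions of
    fluctuation amplitudes around peaks and around pits."""
    peaks, pits = local_extrema(x)
    up   = x[peaks] - np.mean(x)          # peak heights above the mean
    down = np.mean(x) - x[pits]           # pit depths below the mean
    lo, hi = min(up.min(), down.min()), max(up.max(), down.max())
    p, _ = np.histogram(up,   bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(down, bins=bins, range=(lo, hi), density=True)
    return 0.5 * np.sum(np.abs(p - q)) * (hi - lo) / bins

rng = np.random.default_rng(0)
noise  = rng.standard_normal(20_000)      # symmetric fluctuations
skewed = np.exp(0.7 * noise)              # peaks sharper than pits
print(peak_pit_asymmetry(noise), peak_pit_asymmetry(skewed))
```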
A systematic review of grounded theory studies in physiotherapy.
Ali, Nancy; May, Stephen; Grafton, Kate
2018-05-23
This systematic review aimed at appraising the methodological rigor of grounded theory research published in the field of physiotherapy to assess how the methodology is understood and applied. A secondary aim was to provide research implications drawn from the findings to guide future grounded theory methodology (GTM) research. A systematic search was conducted in MEDLINE, CINHAL, SPORT Discus, Science Direct, PubMed, Scopus, and Web of Science to identify studies in the field of physiotherapy that reported using GTM and/or methods in the study title and/or abstract. The descriptive characteristics and methodological quality of eligible studies were examined using grounded theory methodology assessment guidelines. The review included 68 studies conducted between 1998 and 2017. The findings showed that GTM is becoming increasingly used by physiotherapy researchers. Thirty-six studies (53%) demonstrated a good understanding and appropriate application of GTM. Thirty-two studies (47%) presented descriptive findings and were considered to be of poor methodological quality. There are several key tenets of GTM that are integral to the iterative process of qualitative theorizing and need to be applied throughout all research practices including sampling, data collection, and analysis.
NASA Astrophysics Data System (ADS)
Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel
2013-09-01
Mexico is a country where experience in building software for satellite applications is only beginning to accumulate. This is a delicate situation because in the near future we will need to develop software for SATEX-II (the Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for SATEX-II; we also supported them with the ESA PSS-05-0 Standard, in particular with ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used together with the ESA PSS-05-0 Standard. Our outcomes may, in general, be used by teams who need to build small satellites; in particular, they will be used when we build the on-board software applications for SATEX-II.
Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing
NASA Technical Reports Server (NTRS)
Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana
2013-01-01
The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight into the overall design. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.
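The flight code itself is not reproduced here, so the following is only a toy hierarchical state machine in Python illustrating the pattern the abstract names: a top-level machine with nested motion substates and a fault transition handled at the parent level. All state names and events are hypothetical.

```python
from enum import Enum, auto

class Top(Enum):
    IDLE = auto()
    MOTION = auto()
    FAULT = auto()

class Motion(Enum):          # substates nested under Top.MOTION
    ACCEL = auto()
    CRUISE = auto()
    DECEL = auto()

class ActuatorSM:
    """Toy hierarchical state machine; not the MSL flight implementation."""
    def __init__(self):
        self.top, self.sub = Top.IDLE, None

    def dispatch(self, event):
        if event == "fault":                       # handled at the top level,
            self.top, self.sub = Top.FAULT, None   # whatever the substate is
        elif self.top is Top.IDLE and event == "start":
            self.top, self.sub = Top.MOTION, Motion.ACCEL
        elif self.top is Top.MOTION:
            if event == "at_speed" and self.sub is Motion.ACCEL:
                self.sub = Motion.CRUISE
            elif event == "near_target" and self.sub is Motion.CRUISE:
                self.sub = Motion.DECEL
            elif event == "stopped" and self.sub is Motion.DECEL:
                self.top, self.sub = Top.IDLE, None
        return self.top, self.sub

sm = ActuatorSM()
for e in ["start", "at_speed", "near_target", "stopped"]:
    print(e, "->", sm.dispatch(e))
```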
Lagrangian condensation microphysics with Twomey CCN activation
NASA Astrophysics Data System (ADS)
Grabowski, Wojciech W.; Dziekan, Piotr; Pawlowska, Hanna
2018-01-01
We report the development of a novel Lagrangian microphysics methodology for simulations of warm ice-free clouds. The approach applies the traditional Eulerian method for the momentum and continuous thermodynamic fields, such as the temperature and water vapor mixing ratio, and uses Lagrangian super-droplets to represent the condensed phase, such as cloud droplets and drizzle or rain drops. In other applications of Lagrangian warm-rain microphysics, the super-droplets outside clouds represent unactivated cloud condensation nuclei (CCN) that become activated upon entering a cloud and can grow further through diffusional and collisional processes. The original methodology allows a detailed study not only of the effects of CCN on cloud microphysics and dynamics, but also of CCN processing by a cloud. However, when cloud processing is not of interest, a simpler and computationally more efficient approach can be used, with super-droplets formed only when CCN are activated and no super-droplets existing outside a cloud. This is made possible by applying the Twomey activation scheme, where the local supersaturation dictates the concentration of cloud droplets that need to be present inside a cloudy volume, as typically used in Eulerian bin microphysics schemes. Since a cloud volume is a small fraction of the computational domain volume, the Twomey super-droplets provide a significant computational advantage compared to the original super-droplet methodology. An additional advantage comes from the significantly longer time steps that can be used when modeling of CCN deliquescence is avoided. Moreover, other formulations of droplet activation can be applied when the vertical resolution of the host model is low, for instance, linking the concentration of activated cloud droplets to the local updraft speed. This paper discusses the development and testing of the Twomey super-droplet methodology, focusing on activation and diffusional growth. The implementation of activation, the transport of super-droplets in physical space, and the coupling between the super-droplets and the Eulerian temperature and water vapor fields are discussed in detail. Some of these aspects are relevant to the original super-droplet methodology as well, and to ice-phase modeling using the Lagrangian approach. As a computational example, the scheme is applied to an idealized moist thermal rising in a stratified environment, with the original super-droplet methodology providing a benchmark against which the new scheme is compared.
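A hedged sketch of the Twomey activation idea follows: the local supersaturation sets, via the power law N = C S^k, how many droplets should exist in a cloudy grid cell, and new super-droplets are created only for the deficit relative to what is already represented. The parameter values, function names and the deficit rule are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Illustrative Twomey parameters (not the paper's values):
C_CCN = 100.0e6   # droplets per m^3 at 1% supersaturation
K_CCN = 0.5       # Twomey exponent

def activated_concentration(supersat_percent):
    """Twomey power law: droplet concentration implied by supersaturation S."""
    s = np.maximum(supersat_percent, 0.0)
    return C_CCN * s**K_CCN

def new_superdroplet_multiplicity(supersat_percent, n_existing, cell_volume):
    """Create super-droplets only for the deficit between the Twomey target
    and the droplet number already represented in the grid cell."""
    deficit = activated_concentration(supersat_percent) * cell_volume - n_existing
    return max(deficit, 0.0)

print(new_superdroplet_multiplicity(0.4, n_existing=2.0e7, cell_volume=1.0))
```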
Software Engineering Laboratory (SEL) cleanroom process model
NASA Technical Reports Server (NTRS)
Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon
1991-01-01
The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle, from the delivery of requirements to the start of acceptance testing, are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.
Operant Variability: Procedures and Processes
ERIC Educational Resources Information Center
Machado, Armando; Tonneau, Francois
2012-01-01
Barba's (2012) article deftly weaves three main themes in one argument about operant variability. From general theoretical considerations on operant behavior (Catania, 1973), Barba derives methodological guidelines about response differentiation and applies them to the study of operant variability. In the process, he uncovers unnoticed features of…
Optimisation of process parameters on thin shell part using response surface methodology (RSM)
NASA Astrophysics Data System (ADS)
Faiz, J. M.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.
2017-09-01
This study focuses on the optimisation of process parameters by simulation using Autodesk Moldflow Insight (AMI) software. The process parameters are taken as the inputs, and the warpage value is the output analysed in this study. The significant parameters used are melt temperature, mould temperature, packing pressure, and cooling time. A plastic part made of Polypropylene (PP) was selected as the study part. Optimisation of the process parameters is carried out in Design Expert software with the aim of minimising the warpage value. Response Surface Methodology (RSM) is applied in this study together with Analysis of Variance (ANOVA) in order to investigate the interactions between the parameters that are significant to the warpage value. The optimised warpage value can thus be obtained from the model designed using RSM owing to its minimal error, and the study shows that the warpage value is improved by using RSM.
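As a hedged illustration of the RSM step, the sketch below fits a full quadratic response surface to synthetic warpage data for two of the factors and searches the fitted surface for a minimum; the factor ranges, data and model are invented and do not reproduce the AMI/Design Expert analysis.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic design points: melt temperature (deg C) and packing pressure (MPa);
# warpage values are invented purely to illustrate the fitting step.
melt = rng.uniform(200, 240, 20)
pack = rng.uniform(60, 100, 20)
warp = 0.5 + 0.002*(melt - 225)**2 + 0.001*(pack - 85)**2 + rng.normal(0, 0.01, 20)

# Full second-order (quadratic) response surface in two factors.
X = np.column_stack([np.ones_like(melt), melt, pack, melt*pack, melt**2, pack**2])
beta, *_ = np.linalg.lstsq(X, warp, rcond=None)

# Search the fitted surface for the minimum-warpage setting on a grid.
m_grid, p_grid = np.meshgrid(np.linspace(200, 240, 81), np.linspace(60, 100, 81))
G = np.column_stack([np.ones(m_grid.size), m_grid.ravel(), p_grid.ravel(),
                     (m_grid*p_grid).ravel(), (m_grid**2).ravel(), (p_grid**2).ravel()])
pred = G @ beta
best = np.argmin(pred)
print("predicted optimum:", m_grid.ravel()[best], p_grid.ravel()[best], pred[best])
```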
Time-varying bispectral analysis of visually evoked multi-channel EEG
NASA Astrophysics Data System (ADS)
Chandran, Vinod
2012-12-01
Theoretical foundations of higher order spectral analysis are revisited to examine the use of time-varying bicoherence on non-stationary signals using a classical short-time Fourier approach. A methodology is developed to apply this to evoked EEG responses where a stimulus-locked time reference is available. Short-time windowed ensembles of the response at the same offset from the reference are considered as ergodic cyclostationary processes within a non-stationary random process. Bicoherence can be estimated reliably, with known levels at which it is significantly different from zero, and can be tracked as a function of offset from the stimulus. When this methodology is applied to multi-channel EEG, it is possible to obtain information about phase synchronization at different regions of the brain as the neural response develops. The methodology is applied to analyze the evoked EEG response to flash visual stimuli presented to the left and right eye separately. The EEG electrode array is segmented based on bicoherence evolution with time, using the mean absolute difference as a measure of dissimilarity. Segment maps confirm the importance of the occipital region in visual processing and demonstrate a link between the frontal and occipital regions during the response. Maps are constructed using bicoherence at bifrequencies that include the alpha band frequency of 8 Hz as well as 4 and 20 Hz. Differences are observed between responses from the left eye and the right eye, and also between subjects. The methodology shows potential as a neurological functional imaging technique that can be further developed for diagnosis and monitoring using scalp EEG, which is less invasive and less expensive than magnetic resonance imaging.
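A hedged sketch of ensemble bicoherence estimation over stimulus-locked trials follows; it uses one common normalisation of the bispectrum and a synthetic ensemble with quadratic phase coupling at (8 Hz, 8 Hz) coupling into 16 Hz. The windowing, normalisation and test signal are illustrative choices, not the paper's exact estimator.

```python
import numpy as np

def bicoherence(trials, f1, f2):
    """Ensemble bicoherence at a single bifrequency (f1, f2); f1, f2 are FFT
    bin indices of the stimulus-locked windowed trials (one common normalisation)."""
    X = np.fft.rfft(trials * np.hanning(trials.shape[1]), axis=1)
    triple = X[:, f1] * X[:, f2] * np.conj(X[:, f1 + f2])
    num = np.abs(np.mean(triple))
    den = np.sqrt(np.mean(np.abs(X[:, f1] * X[:, f2])**2) *
                  np.mean(np.abs(X[:, f1 + f2])**2))
    return num / den

# Synthetic ensemble: quadratic phase coupling, 8 Hz + 8 Hz -> 16 Hz.
fs, n, n_trials = 256, 256, 200
t = np.arange(n) / fs
rng = np.random.default_rng(7)
phase = rng.uniform(0, 2*np.pi, n_trials)[:, None]
trials = (np.cos(2*np.pi*8*t + phase) + np.cos(2*np.pi*8*t + phase)**2
          + 0.5 * rng.standard_normal((n_trials, n)))

k = 8 * n // fs          # FFT bin corresponding to 8 Hz with these settings
print(f"bicoherence at (8 Hz, 8 Hz): {bicoherence(trials, k, k):.2f}")
```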
Applications of Landsat data and the data base approach
Lauer, D.T.
1986-01-01
A generalized methodology for applying digital Landsat data to resource inventory and assessment tasks is currently being used by several bureaux and agencies within the US Department of the Interior. The methodology includes definition of project objectives and output, identification of source materials, construction of the digital data base, performance of computer-assisted analyses, and generation of output. The USGS, Bureau of Land Management, US Fish and Wildlife Service, Bureau of Indian Affairs, Bureau of Reclamation, and National Park Service have used this generalized methodology to assemble comprehensive digital data bases for resource management. Advanced information processing techniques have been applied to these data bases for making regional environmental surveys on millions of acres of public lands at costs ranging from $0.01 to $0.08 an acre.
Proof test methodology for composites
NASA Technical Reports Server (NTRS)
Wu, Edward M.; Bell, David K.
1992-01-01
The special requirements for proof test of composites are identified based on the underlying failure process of composites. Two proof test methods are developed to eliminate the inevitable weak fiber sites without also causing flaw clustering which weakens the post-proof-test composite. Significant reliability enhancement by these proof test methods has been experimentally demonstrated for composite strength and composite life in tension. This basic proof test methodology is relevant to the certification and acceptance of critical composite structures. It can also be applied to the manufacturing process development to achieve zero-reject for very large composite structures.
Leighton, Angela; Weinborn, Michael; Maybery, Murray
2014-10-01
Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.
Experiencing Collaborative Knowledge Creation Processes
ERIC Educational Resources Information Center
Jakubik, Maria
2008-01-01
Purpose: How people learn and create knowledge together through interactions in communities of practice (CoPs) is not fully understood. The purpose of this paper is to create and apply a model that could increase participants' consciousness about knowledge creation processes. Design/methodology/approach: This four-month qualitative research was…
Riding on the Back of a Giant: Adding Malta to the "5 Cultures" Study by Robin Alexander
ERIC Educational Resources Information Center
Peresso, Randolph
2017-01-01
This paper focuses on the methodology adopted for Malta+5, which builds on Robin Alexander's work by comparing the five pedagogical cultures he studied to the one in Malta. It reflects critically on the research process adopted in this study, and shows how, despite the very limited experience and resources, applying the methodology, frameworks and…
An Interdisciplinary Approach for Designing Kinetic Models of the Ras/MAPK Signaling Pathway.
Reis, Marcelo S; Noël, Vincent; Dias, Matheus H; Albuquerque, Layra L; Guimarães, Amanda S; Wu, Lulu; Barrera, Junior; Armelin, Hugo A
2017-01-01
We present in this article a methodology for designing kinetic models of molecular signaling networks, which was exemplarily applied for modeling one of the Ras/MAPK signaling pathways in the mouse Y1 adrenocortical cell line. The methodology is interdisciplinary, that is, it was developed in a way that both dry and wet lab teams worked together along the whole modeling process.
Bechara, Rami; Gomez, Adrien; Saint-Antonin, Valérie; Schweitzer, Jean-Marc; Maréchal, François
2016-08-01
The application of methodologies for the optimal design of integrated processes has seen increased interest in the literature. This article builds on previous works and applies a systematic methodology to an integrated first and second generation ethanol production plant with power cogeneration. The methodology comprises process simulation, heat integration, thermo-economic evaluation, multi-variable evolutionary optimization of exergy efficiency versus capital costs, and process selection via profitability maximization. Optimization generated Pareto solutions with exergy efficiency ranging between 39.2% and 44.4% and capital costs from 210 M$ to 390 M$. The Net Present Value was positive for only two scenarios, both at low-efficiency, low-hydrolysis points. The minimum cellulosic ethanol selling price was then sought to bring the NPV to zero for the high-efficiency, high-hydrolysis alternatives. The obtained optimal configuration presented maximum exergy efficiency, hydrolyzed bagasse fraction, capital costs and ethanol production rate, and minimum cooling water consumption and power production rate. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Brook, A.; Cristofani, E.; Vandewal, M.; Matheis, C.; Jonuscheit, J.; Beigang, R.
2012-05-01
The present study proposes a fully integrated, semi-automatic image processing methodology, operated in a near-real-time mode, developed for Frequency-Modulated Continuous-Wave (FMCW) THz images with center frequencies around 100 GHz and 300 GHz. The quality control of aeronautic multi-layered composite materials and structures using Non-Destructive Testing is the main focus of this work. Image processing is applied to the 3-D images to extract useful information. The data is processed by extracting areas of interest. The detected areas are then subjected to image analysis for more detailed investigation, managed by a spatial model. Finally, the post-processing stage examines and evaluates the spatial accuracy of the extracted information.
MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes
Williams, B.K.
1988-01-01
Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
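As a hedged sketch of the value-improvement branch of such algorithms, the code below runs value iteration on a toy three-state, two-action discounted MDP loosely evoking a harvest-rate decision; the transition probabilities, rewards and discount factor are invented for illustration.

```python
import numpy as np

# Toy discounted MDP: 3 population states x 2 actions (low/high harvest).
# Transition probabilities P[a, s, s'] and rewards R[a, s] are invented.
P = np.array([[[0.7, 0.3, 0.0],     # action 0: low harvest
               [0.2, 0.6, 0.2],
               [0.0, 0.3, 0.7]],
              [[0.9, 0.1, 0.0],     # action 1: high harvest
               [0.5, 0.4, 0.1],
               [0.2, 0.5, 0.3]]])
R = np.array([[0.0, 1.0, 2.0],      # action 0
              [0.0, 2.0, 4.0]])     # action 1
gamma = 0.95

V = np.zeros(3)
for _ in range(10_000):             # value-improvement (value iteration)
    Q = R + gamma * (P @ V)         # Q[a, s]
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-10:
        break
    V = V_new

policy = Q.argmax(axis=0)
print("optimal values:", np.round(V, 3), "policy:", policy)
```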
Ha, Chrysanthy; McCoy, Donald A; Taylor, Christopher B; Kirk, Kayla D; Fry, Robert S; Modi, Jitendrakumar R
2016-06-01
Lean Six Sigma (LSS) is a process improvement methodology developed in the manufacturing industry to increase process efficiency while maintaining product quality. The efficacy of LSS application to the health care setting has not been adequately studied. This article presents a quality improvement project at the U.S. Naval Academy that uses LSS to improve the mass immunizations process for Midshipmen during in-processing. The process was standardized to give all vaccinations at one station instead of giving a different vaccination at each station. After project implementation, the average immunizations lead time decreased by 79% and staffing decreased by 10%. The process was shown to be in control with a capability index of 1.18 and performance index of 1.10, resulting in a defect rate of 0.04%. This project demonstrates that the LSS methodology can be applied successfully to the health care setting to make sustainable process improvements if used correctly and completely. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
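The capability and performance indices quoted in the abstract can be computed from lead-time data and specification limits; the hedged sketch below shows the standard Cp/Cpk formulas together with a normal-tail defect-rate estimate, applied to invented data rather than the project's actual measurements.

```python
import numpy as np
from math import erf, sqrt

def capability(data, lsl, usl):
    """Process capability (Cp) and Cpk indices, plus the defect rate expected
    if the data were normally distributed within the given spec limits."""
    mu, sigma = np.mean(data), np.std(data, ddof=1)
    cp  = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))       # standard normal CDF
    defect_rate = phi((lsl - mu) / sigma) + 1 - phi((usl - mu) / sigma)
    return cp, cpk, defect_rate

# Invented lead-time sample (minutes) and spec limits, for illustration only.
rng = np.random.default_rng(5)
lead_times = rng.normal(12.0, 1.5, 500)
print(capability(lead_times, lsl=6.0, usl=18.0))
```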
ERIC Educational Resources Information Center
Cliffe, Neil; Stone, Roger; Coutts, Jeff; Reardon-Smith, Kathryn; Mushtaq, Shahbaz
2016-01-01
Purpose: This paper documents and evaluates collaborative learning processes aimed at developing farmer's knowledge, skills and aspirations to use seasonal climate forecasting (SCF). Methodology: Thirteen workshops conducted in 2012 engaged over 200 stakeholders across Australian sugar production regions. Workshop design promoted participant…
Valuing national effects of digital health investments: an applied method.
Hagens, Simon; Zelmer, Jennifer; Frazer, Cassandra; Gheorghiu, Bobby; Leaver, Chad
2015-01-01
This paper describes an approach which has been applied to value national outcomes of investments in digital health by federal, provincial and territorial governments, clinicians and healthcare organizations. Hypotheses are used to develop a model, which is revised and populated based upon the available evidence. Quantitative national estimates and qualitative findings are produced and validated through structured peer review processes. This methodology has been applied in four studies since 2008.
On Constructing, Grouping and Using Topical Ontology for Semantic Matching
NASA Astrophysics Data System (ADS)
Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert
An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). This paper presents a pattern-driven modeling methodology for constructing and grouping topics in an ontology (the PAD-ON methodology), which is used for matching similarities between competences in the human resource management (HRM) domain. The methodology is supported by a tool called PAD-ON. This paper demonstrates our recent achievements in the EC Prolix project. The proposed approach is applied to the training processes at British Telecom as a test bed.
NASA Technical Reports Server (NTRS)
Pototzky, Anthony; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek
1991-01-01
Described here are the development and implementation of an on-line, near-real-time controller performance evaluation (CPE) capability. Briefly discussed are the structure of the data flow, the signal processing methods used to process the data, and the software developed to generate the transfer functions. This methodology is generic in nature and can be used in any type of multi-input/multi-output (MIMO) digital controller application, including digital flight control systems, digitally controlled spacecraft structures, and actively controlled wind tunnel models. Results of applying the CPE methodology to evaluate, in near real time, MIMO digital flutter suppression systems being tested on the Rockwell Active Flexible Wing (AFW) wind tunnel model are presented to demonstrate the CPE capability.
A generic Transcriptomics Reporting Framework (TRF) for 'omics data processing and analysis.
Gant, Timothy W; Sauer, Ursula G; Zhang, Shu-Dong; Chorley, Brian N; Hackermüller, Jörg; Perdichizzi, Stefania; Tollefsen, Knut E; van Ravenzwaay, Ben; Yauk, Carole; Tong, Weida; Poole, Alan
2017-12-01
A generic Transcriptomics Reporting Framework (TRF) is presented that lists parameters that should be reported in 'omics studies used in a regulatory context. The TRF encompasses the transcriptome profiling process from data generation to a processed list of differentially expressed genes (DEGs) ready for interpretation. Included within the TRF is a reference baseline analysis (RBA) that encompasses raw data selection; data normalisation; recognition of outliers; and statistical analysis. The TRF itself does not dictate the methodology for data processing, but deals with what should be reported. Its principles are also applicable to sequencing data and other 'omics. In contrast, the RBA specifies a simple data processing and analysis methodology that is designed to provide a comparison point for other approaches and is exemplified here by a case study. By providing transparency on the steps applied during 'omics data processing and analysis, the TRF will increase confidence in the processing of 'omics data and in their regulatory use. Applicability of the TRF is ensured by its simplicity and generality. The TRF can be applied to all types of regulatory 'omics studies, and it can be executed using different commonly available software tools. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
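Since the RBA is described here only in outline, the sketch below is a hedged, minimal baseline-style analysis of the same general shape: log-transform and normalise, test each gene, control the false discovery rate, and filter on fold change. The normalisation, test and thresholds are illustrative choices, not the RBA's prescribed settings.

```python
import numpy as np
from scipy import stats

def simple_deg_analysis(treated, control, fc_cut=1.0, fdr_cut=0.05):
    """Minimal baseline-style analysis: median normalisation, per-gene Welch
    t-test, Benjamini-Hochberg FDR, and a log2 fold-change filter."""
    def norm(m):                       # crude per-sample median normalisation
        return m - np.median(m, axis=0, keepdims=True)
    t, c = norm(np.log2(treated + 1.0)), norm(np.log2(control + 1.0))
    log2fc = t.mean(axis=1) - c.mean(axis=1)
    p = stats.ttest_ind(t, c, axis=1, equal_var=False).pvalue
    order = np.argsort(p)                               # Benjamini-Hochberg
    q = np.empty_like(p)
    adj = p[order] * len(p) / np.arange(1, len(p) + 1)
    q[order] = np.minimum.accumulate(adj[::-1])[::-1]
    return np.where((np.abs(log2fc) >= fc_cut) & (q <= fdr_cut))[0]

# Synthetic example: 2000 genes, 4 samples per group, ~5% of genes up-regulated.
rng = np.random.default_rng(11)
control = rng.lognormal(5, 1, (2000, 4))
treated = control * np.where(rng.random((2000, 1)) < 0.05, 4.0, 1.0)
print("DEGs found:", len(simple_deg_analysis(treated, control)))
```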
NASA Astrophysics Data System (ADS)
Lian See, Tan; Zulazlan Shah Zulkifli, Ahmad; Mook Tzeng, Lim
2018-04-01
Ozone is a reactant which can be applied for various environmental treatment processes. It can be generated via atmospheric air non-thermal plasmas when sufficient voltages are applied through a combination of electrodes and dielectric materials. In this study, the concentration of ozone generated via two different configurations of multi-cylinder dielectric barrier discharge (DBD) reactor (3 x 40 mm and 10 x 10 mm) was investigated. The influence of the voltage and the duty cycle on the concentration of ozone generated by each configuration was analysed using response surface methodology. Voltage was identified as a significant factor in the ozone production process. However, the regressed model was biased towards one of the configurations, leaving the predicted results for the other configuration out of range.
A Method for Co-Designing Theory-Based Behaviour Change Systems for Health Promotion.
Janols, Rebecka; Lindgren, Helena
2017-01-01
A methodology was defined and developed for designing theory-based behaviour change systems for health promotion that can be tailored to the individual. Theories from two research fields were combined with a participatory action research methodology. Two case studies applying the methodology were conducted. During and between group sessions the participants created material and designs following the behaviour change strategy themes, which were discussed, analysed and transformed into a design of a behaviour change system. Theories in behavioural change and persuasive technology guided the data collection, data analyses, and the design of a behaviour change system. The methodology places strong emphasis on the target group's participation in the design process. The different aspects brought forward relate to the behaviour change strategies defined in the persuasive technology literature, and their dynamics are associated with the needs and motivation defined in the behaviour change literature. It was concluded that the methodology aids the integration of theories into a participatory action research design process, and aids the analysis and motivation of design choices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier
Iberdrola (a Spanish utility) and Iberdrola Ingenieria (its engineering branch) have been developing during the last two years the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. IBERDROLA has available an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses of the Cofrentes Extended Power Up-rate, including: Reactor Heat Balance, Core and Fuel Performance, Thermal Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS) and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case of the TLFW transient. The content of this paper shows the benefits of having an in-house design and licensing methodology, and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of Total Loss of Feedwater (TLFW) with the Cofrentes Retran Model is included as an example of this process.
Kushniruk, Andre W; Borycki, Elizabeth M
2015-01-01
The development of more usable and effective healthcare information systems has become a critical issue. In the software industry, methodologies such as agile and iterative development processes have emerged to produce more effective and usable systems. These approaches highlight focusing on user needs and promoting iterative and flexible development practices. Evaluation and testing of iterative agile development cycles is considered an important part of the agile methodology and of iterative processes for system design and re-design. However, the issue of how to effectively integrate usability testing methods into rapid and flexible agile design cycles has yet to be fully explored. In this paper we describe our application of an approach known as low-cost rapid usability testing as it has been applied within agile system development in healthcare. The advantages of the integrative approach are described, along with current methodological considerations.
Rodríguez-González, Alejandro; Torres-Niño, Javier; Valencia-Garcia, Rafael; Mayer, Miguel A; Alor-Hernandez, Giner
2013-09-01
This paper proposes a new methodology for assessing the efficiency of medical diagnostic systems and clinical decision support systems by using the feedback/opinions of medical experts. The methodology is based on a comparison between the expert feedback that has helped solve different clinical cases and the expert system that has evaluated these same cases. Once the results are returned, an arbitration process is carried out in order to ensure the correctness of the results provided by both methods. Once this process has been completed, the results are analyzed using Precision, Recall, Accuracy, Specificity and Matthews Correlation Coefficient (MCC) (PRAS-M) metrics. When the methodology is applied, the results obtained from a real diagnostic system allow researchers to establish the accuracy of the system based on objective facts. The methodology returns enough information to analyze the system's behavior for each disease in the knowledge base or across the entire knowledge base. It also returns data on the efficiency of the different assessors involved in the evaluation process, analyzing their behavior in the diagnostic process. The proposed work facilitates the evaluation of medical diagnostic systems by providing a reliable process based on objective facts. The methodology presented in this research makes it possible to identify the main characteristics that define a medical diagnostic system and their values, allowing for system improvement. A good example of the results provided by the application of the methodology is shown in this paper. A diagnosis system was evaluated by means of this methodology, yielding statistically significant positive results when comparing the system with the assessors that participated in its evaluation, through metrics such as recall (+27.54%) and MCC (+32.19%). These results demonstrate the real applicability of the methodology. Copyright © 2013 Elsevier Ltd. All rights reserved.
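For reference, the PRAS-M metrics named above all follow from a per-disease confusion matrix; the hedged sketch below computes them, with the counts invented purely for illustration.

```python
import math

def prasm_metrics(tp, fp, tn, fn):
    """Precision, Recall, Accuracy, Specificity and MCC from a per-disease
    confusion matrix (the PRAS-M set named in the abstract)."""
    precision   = tp / (tp + fp) if tp + fp else 0.0
    recall      = tp / (tp + fn) if tp + fn else 0.0
    accuracy    = (tp + tn) / (tp + fp + tn + fn)
    specificity = tn / (tn + fp) if tn + fp else 0.0
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = ((tp * tn - fp * fn) / denom) if denom else 0.0
    return dict(precision=precision, recall=recall, accuracy=accuracy,
                specificity=specificity, mcc=mcc)

# Invented counts for one disease, comparing system output to the arbitrated
# expert reference.
print(prasm_metrics(tp=42, fp=8, tn=130, fn=11))
```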
An enhanced methodology for spacecraft correlation activity using virtual testing tools
NASA Astrophysics Data System (ADS)
Remedia, Marcello; Aglietti, Guglielmo S.; Appolloni, Matteo; Cozzani, Alessandro; Kiley, Andrew
2017-11-01
Test planning and post-test correlation activity have been issues of growing importance in the last few decades and many methodologies have been developed to either quantify or improve the correlation between computational and experimental results. In this article the methodologies established so far are enhanced with the implementation of a recently developed procedure called Virtual Testing. In the context of fixed-base sinusoidal tests (commonly used in the space sector for correlation), there are several factors in the test campaign that affect the behaviour of the satellite and are not normally taken into account when performing analyses: different boundary conditions created by the shaker's own dynamics, non-perfect control system, signal delays etc. All these factors are the core of the Virtual Testing implementation, which will be thoroughly explained in this article and applied to the specific case of Bepi-Colombo spacecraft tested on the ESA QUAD Shaker. Correlation activity will be performed in the various stages of the process, showing important improvements observed after applying the final complete methodology.
Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares
2011-09-15
Life cycle thinking (LCT) is one of the philosophies that has recently appeared in the context of sustainable development. Some of the already existing tools and methods, as well as some of the recently emerged ones, which seek to understand, interpret and design the life of a product, can be included within the scope of the LCT philosophy. That is the case of material and energy flow analysis (MEFA), a tool derived from the definition of industrial metabolism. This paper proposes a methodology combining MEFA with another technique derived from sustainable development which also fits the LCT philosophy, BAT (best available techniques) analysis. This methodology, applied to an industrial process, seeks to identify so-called improvable flows by MEFA, so that appropriate candidate BAT can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to an exemplary roof tile manufacturing plant for validation. Fourteen improvable flows were identified and seven candidate BAT were proposed with the aim of reducing these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and selects the most sustainable options to enhance them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness in improving such flows. Copyright © 2011 Elsevier B.V. All rights reserved.
Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B
2018-05-30
Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
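A hedged sketch of the type of control-chart analysis described, an individuals and moving-range (XmR) chart, follows; the constants 2.66 and 3.267 are the standard XmR factors, while the monthly length-of-stay values are invented to mimic a post-intervention shift and are not the trial's data.

```python
import numpy as np

def xmr_limits(x):
    """Individuals (X) and moving-range (mR) chart limits using the standard
    2.66 and 3.267 constants; flags points beyond the individuals limits."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))
    centre, mr_bar = x.mean(), mr.mean()
    ucl, lcl = centre + 2.66 * mr_bar, centre - 2.66 * mr_bar
    signals = np.where((x > ucl) | (x < lcl))[0]
    return centre, lcl, ucl, mr_bar, 3.267 * mr_bar, signals

# Invented monthly mean ED length-of-stay values (minutes); the last points
# mimic a shift after nurse-initiated radiography is introduced.
los = [142, 138, 145, 140, 139, 144, 121, 118, 116, 119, 117]
print(xmr_limits(los))
```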
Simulation of investment returns of toll projects.
DOT National Transportation Integrated Search
2013-08-01
This research develops a methodological framework to illustrate key stages in applying the simulation of investment returns of toll projects, acting as an example process of helping agencies conduct numerical risk analysis by taking certain uncertain...
Ecological monitoring in a discrete-time prey-predator model.
Gámez, M; López, I; Rodríguez, C; Varga, Z; Garay, J
2017-09-21
The paper is aimed at the methodological development of ecological monitoring in discrete-time dynamic models. In earlier papers, in the framework of continuous-time models, we have shown how a systems-theoretical methodology can be applied to the monitoring of the state process of a system of interacting populations, also estimating certain abiotic environmental changes such as pollution, climatic or seasonal changes. In practice, however, there may be good reasons to use discrete-time models. (For instance, there may be discrete cycles in the development of the populations, or observations can be made only at discrete time steps.) Therefore the present paper is devoted to the development of the monitoring methodology in the framework of discrete-time models of population ecology. By monitoring we mean that, observing only certain component(s) of the system, we reconstruct the whole state process. This may be necessary, e.g., when in a complex ecosystem the observation of the densities of certain species is impossible, or too expensive. For the first presentation of the offered methodology, we have chosen a discrete-time version of the classical Lotka-Volterra prey-predator model. This is a minimal but not trivial system where the methodology can still be presented. We also show how this methodology can be applied to estimate the effect of an abiotic environmental change, using a component of the population system as an environmental indicator. Although this approach is illustrated in a simplest possible case, it can be easily extended to larger ecosystems with several interacting populations and different types of abiotic environmental effects. Copyright © 2017 Elsevier Ltd. All rights reserved.
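As a hedged illustration of the kind of state process the monitoring methodology targets, the sketch below iterates a discrete-time Lotka-Volterra-type prey-predator model and treats one component as the observable indicator; the parameter values and the choice of observed component are assumptions, and the observer/reconstruction step itself is not implemented here.

```python
import numpy as np

def simulate(x0, y0, steps, a=1.1, b=0.02, c=0.95, d=0.01):
    """Discrete-time Lotka-Volterra-type prey (x) / predator (y) iteration;
    parameter values are purely illustrative."""
    x, y = np.empty(steps), np.empty(steps)
    x[0], y[0] = x0, y0
    for t in range(steps - 1):
        x[t + 1] = max(a * x[t] - b * x[t] * y[t], 0.0)
        y[t + 1] = max(c * y[t] + d * x[t] * y[t], 0.0)
    return x, y

prey, predator = simulate(40.0, 9.0, 60)
observed = predator            # the monitored ("indicator") component
print(np.round(observed[:10], 2))
```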
Reliability analysis of composite structures
NASA Technical Reports Server (NTRS)
Kan, Han-Pin
1992-01-01
A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
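The core reliability computation described, combining scatter in strength and applied load, can be sketched as a stress-strength integral evaluated numerically; the distributions and their parameters below are assumed for illustration and are not taken from the paper.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Assumed (illustrative) scatter models: two-parameter Weibull for composite
# strength, normal for the applied load; units are arbitrary stress units.
strength = stats.weibull_min(c=12.0, scale=100.0)   # shape, characteristic strength
load     = stats.norm(loc=60.0, scale=8.0)

# P(failure) = P(strength < load) = integral of f_load(s) * F_strength(s) ds
p_fail, _ = quad(lambda s: load.pdf(s) * strength.cdf(s), 0.0, 200.0)
print(f"failure probability: {p_fail:.2e}   reliability: {1 - p_fail:.6f}")
```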
Metaphorical Salience in Artistic Text Processing: Evidence From Eye Movement.
Novikova, Eleonora G; Janyan, Armina; Tsaregorodtseva, Oksana V
2015-01-01
The study aimed to explore processing differences between a literal phrase and a metaphoric one. Unlike the artificially created stimuli used in most experimental research, an artistic text with an ambiguous binary metaphoric phrase was used. Eye tracking methodology was applied. Results suggested differences between the two types of phrases in both early and late processing measures. © The Author(s) 2015.
Additive Manufacturing of Functional Elements on Sheet Metal
NASA Astrophysics Data System (ADS)
Schaub, Adam; Ahuja, Bhrigu; Butzhammer, Lorenz; Osterziel, Johannes; Schmidt, Michael; Merklein, Marion
The Laser Beam Melting (LBM) process, despite its advantages of high design flexibility and free-form manufacturing, is often applied only to a limited extent because of its low productivity and unsuitability for mass production compared to conventional manufacturing processes. In order to overcome these limitations, a hybrid manufacturing methodology is developed combining the additive manufacturing process of laser beam melting with sheet forming processes. With the aerospace and medical industries in mind, the material in focus is Ti-6Al-4V. Although Ti-6Al-4V is a commercially established material and its application in the LBM process has been extensively investigated, the combination of LBM of Ti-6Al-4V with sheet metal still needs to be researched. Process dynamics such as high temperature gradients and thermally induced stresses lead to complex stress states at the interaction zone between the sheet and the LBM structure. Within the presented paper, mechanical characterization of hybrid parts is performed by shear testing. The association of shear strength with process parameters is further investigated by analyzing the internal structure of the hybrid geometry at varying energy inputs during the LBM process. In order to compare the hybrid manufacturing methodology with conventional fabrication, the conventional approaches of subtractive machining and state-of-the-art Laser Beam Melting are also evaluated within this work. These processes are analyzed for their mechanical characteristics and productivity by determining the build time and raw material consumption for each case. The paper concludes by presenting the characteristics of the hybrid manufacturing methodology compared to alternative manufacturing technologies.
Using lean methodology to improve efficiency of electronic order set maintenance in the hospital.
Idemoto, Lori; Williams, Barbara; Blackmore, Craig
2016-01-01
Order sets, a series of orders focused around a diagnosis, condition, or treatment, can reinforce best practice, help eliminate outdated practice, and provide clinical guidance. However, order sets require regular updates as evidence and care processes change. We undertook a quality improvement intervention applying lean methodology to create a systematic process for order set review and maintenance. Root cause analysis revealed challenges with unclear prioritization of requests, lack of coordination between teams, and lack of communication between producers and requestors of order sets. In March of 2014, we implemented a systematic, cyclical order set review process, with a set schedule, defined responsibilities for various stakeholders, formal meetings and communication between stakeholders, and transparency of the process. We first identified and deactivated 89 order sets which were infrequently used. Between March and August 2014, 142 order sets went through the new review process. Processing time for the build duration of order sets decreased from a mean of 79.6 to 43.2 days (p<.001, CI=22.1, 50.7). Applying Lean production principles to the order set review process resulted in significant improvement in processing time and increased quality of orders. As use of order sets and other forms of clinical decision support increase, regular evidence and process updates become more critical.
Sciutto, Giorgia; Oliveri, Paolo; Catelli, Emilio; Bonacini, Irene
2017-01-01
In the field of applied research in heritage science, the use of multivariate approaches is still quite limited and the chemometric results obtained are often underinterpreted. Within this scenario, the present paper is aimed at disseminating the use of suitable multivariate methodologies and proposes a procedural workflow, applied to a representative group of case studies of considerable importance for conservation purposes, as a guideline for the processing and interpretation of FTIR data. Initially, principal component analysis (PCA) is performed and the score values are converted into chemical maps. Subsequently, the brushing approach is applied, demonstrating its usefulness for a deep understanding of the relationships between the multivariate map and the PC score space, as well as for the identification of the spectral bands mainly involved in the definition of each area localised within the score maps.
Applying industrial engineering practices to radiology.
Rosen, Len
2004-01-01
Seven hospitals in Oregon and Washington have successfully adopted the Toyota Production System (TPS). Developed by Taiichi Ohno, TPS focuses on finding efficiencies and cost savings in manufacturing processes. A similar effort has occurred in Canada, where Toronto's Hospital for Sick Children has developed a database for its diagnostic imaging department built on the principles of TPS applied to patient encounters. Developed over the last 5 years, the database currently manages all interventional patient procedures for quality assurance, inventory, equipment, and labor. By applying industrial engineering methodology to manufacturing processes, it is possible to manage these constraints, eliminate the obstacles to achieving streamlined processes, and keep the cost of delivering products and services under control. Industrial engineering methodology has encouraged all stakeholders in manufacturing plants to become participants in dealing with constraints. It has empowered those on the shop floor as well as management to become partners in the change process. Using a manufacturing process model to organize patient procedures enables imaging department and imaging centers to generate reports that can help them understand utilization of labor, materials, equipment, and rooms. Administrators can determine the cost of individual procedures as well as the total and average cost of specific procedure types. When Toronto's Hospital for Sick Children first implemented industrial engineering methodology to medical imaging interventional radiology patient encounters, it focused on materials management. Early in the process, the return on investment became apparent as the department improved its management of more than 500,000 dollars of inventory. The calculated accumulated savings over 4 years for 10,000 interventional procedures alone amounted to more than 140,000 dollars. The medical imaging department in this hospital is only now beginning to apply what it has learned to other factors contributing to case cost. It has started to analyze its service contracts with equipment vendors. The department also is accumulating data to measure room, equipment, and labor utilization. The hospital now has a true picture of the real cost associated with each patient encounter in medical imaging. It can now begin to manage case costs, perform better capacity planning, create more effective relationships with its material suppliers, and optimize scheduling of patients and staff.
Identifying Mentors for Student Employees on Campus
ERIC Educational Resources Information Center
Frock, David
2015-01-01
Purpose: This exploratory research project aims to seek an effective process for identifying supervisors of part-time student employees who also serve in a mentoring capacity. Design/methodology/approach: This paper is based on a review of literature and an evaluation process focused on established traits and functions of mentoring as applied to…
ERIC Educational Resources Information Center
Haesen, Birgitt; Boets, Bart; Wagemans, Johan
2011-01-01
This literature review aims to interpret behavioural and electrophysiological studies addressing auditory processing in children and adults with autism spectrum disorder (ASD). Data have been organised according to the applied methodology (behavioural versus electrophysiological studies) and according to stimulus complexity (pure versus complex…
Inside the Black Box: Revealing the Process in Applying a Grounded Theory Analysis
ERIC Educational Resources Information Center
Rich, Peter
2012-01-01
Qualitative research methods have long set an example of rich description, in which data and researchers' hermeneutics work together to inform readers of findings in specific contexts. Among published works, insight into the analytical process is most often represented in the form of methodological propositions or research results. This paper…
ERIC Educational Resources Information Center
Tingerthal, John Steven
2013-01-01
Using case study methodology and autoethnographic methods, this study examines a process of curricular development known as "Decoding the Disciplines" (Decoding) by documenting the experience of its application in a construction engineering mechanics course. Motivated by the call to integrate what is known about teaching and learning…
Restorative Justice as Reflective Practice and Applied Pedagogy on College Campuses
ERIC Educational Resources Information Center
Rinker, Jeremy A.; Jonason, Chelsey
2014-01-01
Restorative justice (RJ) is both a methodology for dealing with conflict and a process for modeling more positive human relations after social harm. As both method and process, the benefits of developing restorative practices on college campuses go well beyond just the many positive community-oriented outcomes of facilitated conflict resolution…
NASA Astrophysics Data System (ADS)
Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha
2016-04-01
This paper presents a method to assess vulnerability to coastal risks such as coastal erosion or submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques within a Geographic Information System (GIS). The coast of Mohammedia, Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP-based methodology. The coastal risk vulnerability mapping follows multi-parametric causative factors such as sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to urban areas. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The result shows that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high vulnerability areas are situated in the east at Monika and Sablette beaches. This technical approach relies on the efficiency of GIS tools combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
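The fuzzy AHP step ultimately produces criterion weights from pairwise comparisons; the hedged sketch below shows the crisp AHP simplification using the geometric-mean prioritisation method on an invented judgement matrix for four of the factors (a full FAHP would replace each judgement with a fuzzy number and defuzzify the resulting weights).

```python
import numpy as np

# Pairwise comparison matrix for four of the causative factors (illustrative
# judgements, not those of the study); Saaty-style 1-9 scale, reciprocal.
factors = ["coastal erosion", "elevation", "sea level rise", "distance to urban area"]
A = np.array([[1.0, 3.0, 2.0, 5.0],
              [1/3, 1.0, 1/2, 3.0],
              [1/2, 2.0, 1.0, 4.0],
              [1/5, 1/3, 1/4, 1.0]])

# Crisp AHP prioritisation by the geometric-mean (row) method.
gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()
for f, w in zip(factors, weights):
    print(f"{f:25s} {w:.3f}")
```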
Region-to-area screening methodology for the Crystalline Repository Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1985-04-01
The purpose of this document is to describe the Crystalline Repository Project's (CRP) process for region-to-area screening of exposed and near-surface crystalline rock bodies in the three regions of the conterminous United States where crystalline rock is being evaluated as a potential host for the second nuclear waste repository (i.e., in the North Central, Northeastern, and Southeastern Regions). This document indicates how the US Department of Energy's (DOE) General Guidelines for the Recommendation of Sites for Nuclear Waste Repositories (10 CFR 960) were used to select and apply factors and variables for the region-to-area screening, explains how these factors and variables are to be applied in the region-to-area screening, and indicates how this methodology relates to the decision process leading to the selection of candidate areas. A brief general discussion of the screening process from the national survey through area screening and site recommendation is presented. This discussion sets the scene for the detailed discussions which follow concerning the region-to-area screening process, the guidance provided by the DOE Siting Guidelines for establishing disqualifying factors and variables for screening, and the application of the disqualifying factors and variables in the screening process. This document is complementary to the regional geologic and environmental characterization reports to be issued in the summer of 1985 as final documents. These reports will contain the geologic and environmental data base that will be used in conjunction with the methodology to conduct region-to-area screening.
NASA Astrophysics Data System (ADS)
Acri, Antonio; Offner, Guenter; Nijman, Eugene; Rejlek, Jan
2016-10-01
Noise legislation and increasing customer demands drive the Noise Vibration and Harshness (NVH) development of modern commercial vehicles. In order to meet the stringent legislative requirements for vehicle noise emission, exact knowledge of all vehicle noise sources and their acoustic behavior is required. Transfer path analysis (TPA) is a fairly well established technique for estimating and ranking individual low-frequency noise or vibration contributions via the different transmission paths. Transmission paths from different sources to target points of interest and their contributions can be analyzed by applying TPA. This technique is applied to test measurements, which are available only on prototypes at the end of the design process. In order to overcome the limits of TPA, a numerical transfer path analysis methodology based on the substructuring of a multibody system is proposed in this paper. Being based on numerical simulation, this methodology can be performed from the first steps of the design process. The main target of the proposed methodology is to obtain information on the noise source contributions of a dynamic system, considering the possibility of multiple forces acting on the system simultaneously. The contributions of these forces are investigated with particular focus on distributed or moving forces. In this paper, the mathematical basics of the proposed methodology and its advantages in comparison with TPA are discussed. A dynamic system is then investigated with a combination of two methods. Being based on the dynamic substructuring (DS) of the investigated model, the proposed methodology requires the evaluation of the contact forces at the interfaces, which are computed with a flexible multi-body dynamic (FMBD) simulation. The structure-borne noise paths are then computed with the wave based method (WBM). As an example application, a 4-cylinder engine is investigated and the proposed methodology is applied to the engine block. The aim is to obtain accurate and clear relationships between excitations and responses of the simulated dynamic system, analyzing the noise and vibration sources inside a car engine and showing the main advantages of a numerical methodology.
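Whatever the source of the transfer functions (measured or, as here, simulated), the bookkeeping of a transfer path analysis reduces to summing per-path products of an FRF and an interface force and ranking the paths. The sketch below illustrates only that bookkeeping on two invented paths with synthetic spectra; the FRFs, forces and path names are placeholders, not data from the paper.

```python
import numpy as np

# Frequency axis and two synthetic transfer paths (engine mounts A and B).
# H[k] is the FRF from interface force k to the target response (e.g. sound
# pressure at a target point); F[k] is the interface force spectrum.
# All spectra here are made up for illustration.
f = np.linspace(20, 500, 481)                                  # Hz
H = np.array([1e-3 / (1 + 1j * f / 150),                       # path A FRF (Pa/N)
              5e-4 / (1 + 1j * f / 300)])                      # path B FRF (Pa/N)
F = np.array([10.0 * np.exp(1j * 0.2) * np.ones_like(f),       # force at mount A (N)
              25.0 * np.exp(1j * 1.1) * np.ones_like(f)])      # force at mount B (N)

contributions = H * F              # per-path contribution spectra
total = contributions.sum(axis=0)  # synthesised target response

# Rank paths by their overall energy contribution across the band.
energy = (np.abs(contributions) ** 2).sum(axis=1)
for k in np.argsort(energy)[::-1]:
    print(f"path {k}: {energy[k]:.3e}")
```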
Social customer relationship management: taking advantage of Web 2.0 and Big Data technologies.
Orenga-Roglá, Sergio; Chalmeta, Ricardo
2016-01-01
The emergence of Web 2.0 and Big Data technologies has allowed a new customer relationship strategy based on interactivity and collaboration, called Social Customer Relationship Management (Social CRM), to be created. This enhances customer engagement and satisfaction. The implementation of Social CRM is a complex task that involves different organisational, human and technological aspects. However, there is a lack of methodologies to assist companies in these processes. This paper presents a novel methodology that helps companies to implement Social CRM, taking into account different aspects such as the social customer strategy, the Social CRM performance measurement system, the Social CRM business processes, and the Social CRM computer system. The methodology was applied to one company in order to validate and refine it.
Using NIAM to capture time dependencies in a domain of discourse
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, S.D.
1994-07-01
This paper addresses the issues surrounding the use of NIAM to capture time dependencies in a domain of discourse. The NIAM concepts that support capturing time dependencies are in the event and process portions of the NIAM metamodel, which are the portions most poorly supported by a well-established methodology. This lack of methodological support is a potentially serious handicap in any attempt to apply NIAM to a domain of discourse in which time dependencies are a central issue. However, the capability that NIAM provides for validating and verifying the elementary facts in the domain may reduce the magnitude of the event/process-specification task to a level at which it could be effectively handled even without strong methodological support.
A methodology for cloud masking uncalibrated lidar signals
NASA Astrophysics Data System (ADS)
Binietoglou, Ioannis; D'Amico, Giuseppe; Baars, Holger; Belegante, Livio; Marinou, Eleni
2018-04-01
Most lidar processing algorithms, such as those included in EARLINET's Single Calculus Chain, can be applied only to cloud-free atmospheric scenes. In this paper, we present a methodology for masking clouds in uncalibrated lidar signals. First, we construct a reference dataset based on manual inspection and then train a classifier to separate clouds from cloud-free regions. Here we present the details of this approach together with example cloud masks from an EARLINET station.
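The train-then-classify step can be sketched in a few lines. The following is a minimal, assumption-laden example: the features (three per range bin), the synthetic labels and the choice of a random forest are stand-ins for the paper's actual reference dataset and classifier, which are not specified here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical feature table: one row per (profile, range bin) with features
# derived from the uncalibrated signal, e.g. range-corrected signal, local
# gradient and temporal variability. Labels come from manual inspection.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 3))                    # stand-in for real lidar features
y = (X[:, 0] + 0.5 * X[:, 1] > 1.0).astype(int)   # stand-in for manual cloud labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))

# Applying clf.predict to the feature image of a new profile would yield a
# per-bin cloud mask that can gate subsequent cloud-free processing.
```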
Leveraging design thinking to build sustainable mobile health systems.
Eckman, Molly; Gorski, Irena; Mehta, Khanjan
Mobile health, or mHealth, technology has the potential to improve health care access in the developing world. However, the majority of mHealth projects do not expand beyond the pilot stage. A core reason is that they do not account for the individual needs and wants of those involved. A collaborative approach is needed to integrate the perspectives of all stakeholders into the design and operation of mHealth endeavours. Design thinking is a methodology used to develop and evaluate novel concepts for systems. With roots in participatory processes and self-determined pathways, design thinking provides a compelling framework for understanding and applying the needs of diverse stakeholders to mHealth project development through a highly iterative process. The methodology presented in this article provides a structured approach to applying design thinking principles to assess the feasibility of novel mHealth endeavours during early conceptualisation.
[Improvement in the efficiency of a rehabilitation service using Lean Healthcare methodology].
Pineda Dávila, S; Tinoco González, J
2015-01-01
The aim of this study was to evaluate the reduction in costs and the increase in time devoted to the patient achieved by applying Lean Healthcare methodology. A multidisciplinary team was formed, and a diagnostic process identified three potential areas for improvement, covering the storage and standardization of materials and professional tasks in the therapeutic areas, which were addressed by implementing three Lean tools: kanban, 5S and 2P. Stored material costs decreased by 43%, the cost of consumables per patient treated decreased by 19%, and time dedicated to patient treatment increased by 7%. The processes were standardized and "muda" (waste) was eliminated, thus reducing costs and increasing the value delivered to the patient. All this demonstrates that it is possible to apply tools of industrial origin to the health sector with the aim of improving the quality of care and achieving maximum efficiency. Copyright © 2014 SECA. Published by Elsevier España. All rights reserved.
Farias, Diego Carlos; Araujo, Fernando Oliveira de
2017-06-01
Hospitals are complex organizations which, in addition to the technical assistance expected in the context of treatment and prevention of health hazards, also require good management practices aimed at improving efficiency in their core business. However, in administrative terms, recurrent conflicts arise involving technical and managerial areas. Thus, this article sets out to conduct a review of the scientific literature pertaining to the themes of hospital management and projects that have been applied in the hospital context. In terms of methodology, the study adopts the webibliomining method for the collection and systematic analysis of knowledge in indexed journal databases. The results show a greater interest on the part of researchers in a more vertically and horizontally dialogical administration, better definition of work processes, innovative technological tools to support the management process and, finally, the possibility of applying project management methodologies in collaboration with hospital management.
A Design Methodology for Medical Processes.
Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara
2016-01-01
Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of patient's compliance to treatment. Also, the multiple actors involved in patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.
An Approach for Implementation of Project Management Information Systems
NASA Astrophysics Data System (ADS)
Běrziša, Solvita; Grabis, Jānis
Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and processes and of existing XML-based representations of project management. A demonstration example of a project management information system's configuration is provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmichael, Joshua Daniel; Carr, Christina; Pettit, Erin C.
We apply a fully autonomous icequake detection methodology to a single day of high-sample-rate (200 Hz) seismic network data recorded at the terminus of Taylor Glacier, Antarctica, that temporally coincided with a brine release episode near Blood Falls (May 13, 2014). We demonstrate a statistically validated procedure to assemble waveforms triggered by icequakes into populations of clusters linked by intra-event waveform similarity. Our processing methodology implements a noise-adaptive power detector coupled with a complete-linkage clustering algorithm and a noise-adaptive correlation detector. This detector chain reveals a population of 20 multiplet sequences that includes ~150 icequakes and produces zero false alarms on the concurrent, diurnally variable noise. Our results are very promising for identifying changes in background seismicity associated with the presence or absence of brine release episodes. We thereby suggest that our methodology could be applied to longer time periods to establish a brine-release monitoring program for Blood Falls based on icequake detections.
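A toy version of such a detector chain can convey the structure: a windowed power detector with a threshold tied to the noise level, followed by complete-linkage clustering of waveform similarity. The sketch below uses a fixed multiple of the median power rather than the paper's noise-adaptive statistics, and the synthetic trace, window length and thresholds are invented for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def power_detector(x, fs, win=1.0, k=8.0):
    """Flag windows whose mean power exceeds k times the median window power.
    A crude stand-in for the paper's noise-adaptive power detector."""
    n = int(win * fs)
    nwin = len(x) // n
    p = (x[:nwin * n].reshape(nwin, n) ** 2).mean(axis=1)
    return np.flatnonzero(p > k * np.median(p)) * n   # sample index of each trigger

def cluster_waveforms(waveforms, max_dist=0.3):
    """Complete-linkage clustering on (1 - max normalized cross-correlation)."""
    m = len(waveforms)
    d = np.zeros((m, m))
    for i in range(m):
        for j in range(i + 1, m):
            a, b = waveforms[i], waveforms[j]
            cc = np.correlate(a, b, mode="full") / (np.linalg.norm(a) * np.linalg.norm(b))
            d[i, j] = d[j, i] = 1.0 - np.abs(cc).max()
    z = linkage(squareform(d), method="complete")
    return fcluster(z, t=max_dist, criterion="distance")   # multiplet labels

# Synthetic 200 Hz trace: background noise plus one injected burst.
fs = 200.0
x = np.random.default_rng(1).normal(size=int(3600 * fs)) * 0.1
x[int(1800 * fs):int(1800 * fs) + 100] += 5.0
print("trigger samples:", power_detector(x, fs)[:5])

# Toy multiplet demo: three noisy copies of one wavelet plus one outlier.
w = np.sin(2 * np.pi * 10 * np.arange(0, 0.5, 1 / fs))
waveforms = [w + 0.05 * np.random.default_rng(i).normal(size=w.size) for i in range(3)]
waveforms.append(np.random.default_rng(9).normal(size=w.size))
print("cluster labels:", cluster_waveforms(waveforms))
```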
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selection criteria to determine whether engineering modeling can be applied to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
Evaluation of grid generation technologies from an applied perspective
NASA Technical Reports Server (NTRS)
Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.
1995-01-01
An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation, and the use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turnaround time for specific grid generation and CFD projects. It was concluded that a single grid generation methodology is not universally suited for all CFD applications due to limitations in both grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process with the various grid generation methodologies, including structured, unstructured, and hybrid procedures. The full integration of geometric modeling and grid generation allows the implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.
Preparative Purification of Recombinant Proteins: Current Status and Future Trends
Saraswat, Mayank; Ravidá, Alessandra; Holthofer, Harry
2013-01-01
Advances in fermentation technologies have resulted in the production of increased yields of proteins of economic, biopharmaceutical, and medicinal importance. Consequently, there is an absolute requirement for the development of rapid, cost-effective methodologies which facilitate the purification of such products in the absence of contaminants, such as superfluous proteins and endotoxins. Here, we provide a comprehensive overview of a selection of key purification methodologies currently being applied in both academic and industrial settings and discuss how innovative and effective protocols such as aqueous two-phase partitioning, membrane chromatography, and high-performance tangential flow filtration may be applied independently of or in conjunction with more traditional protocols for downstream processing applications. PMID:24455685
Improving Process Evaluations of Health Behavior Interventions: Learning From the Social Sciences.
Morgan-Trimmer, Sarah
2015-09-01
This article reflects on the current state of process evaluations of health behavior interventions and argues that evaluation practice in this area could be improved by drawing on the social science literature to a greater degree. While process evaluations of health behavior interventions have increasingly engaged with the social world and sociological aspects of interventions, there has been a lag in applying relevant and potentially useful approaches from the social sciences. This has limited the scope for health behavior process evaluations to address pertinent contextual issues and methodological challenges. Three aspects of process evaluations are discussed: the incorporation of contexts of interventions; engagement with the concept of "process" in process evaluation; and working with theory to understand interventions. Following on from this, the article also comments on the need for new methodologies and on the implications for addressing health inequalities. © The Author(s) 2013.
Modeling, Analyzing, and Mitigating Dissonance Between Alerting Systems
NASA Technical Reports Server (NTRS)
Song, Lixia; Kuchar, James K.
2003-01-01
Alerting systems are becoming pervasive in process operations, which may result in the potential for dissonance or conflict in information from different alerting systems that suggests different threat levels and/or actions to resolve hazards. Little is currently available to help in predicting or solving the dissonance problem. This thesis presents a methodology to model and analyze dissonance between alerting systems, providing both a theoretical foundation for understanding dissonance and a practical basis from which specific problems can be addressed. A state-space representation of multiple alerting system operation is generalized that can be tailored across a variety of applications. Based on the representation, two major causes of dissonance are identified: logic differences and sensor error. Additionally, several possible types of dissonance are identified. A mathematical analysis method is developed to identify the conditions for dissonance originating from logic differences. A probabilistic analysis methodology is developed to estimate the probability of dissonance originating from sensor error, and to compare the relative contribution to dissonance of sensor error against the contribution from logic differences. A hybrid model, which describes the dynamic behavior of the process with multiple alerting systems, is developed to identify dangerous dissonance space, from which the process can lead to disaster. Methodologies to avoid or mitigate dissonance are outlined. Two examples are used to demonstrate the application of the methodology. First, a conceptual In-Trail Spacing example is presented. The methodology is applied to identify the conditions for possible dissonance, to identify relative contribution of logic difference and sensor error, and to identify dangerous dissonance space. Several proposed mitigation methods are demonstrated in this example. In the second example, the methodology is applied to address the dissonance problem between two air traffic alert and avoidance systems: the existing Traffic Alert and Collision Avoidance System (TCAS) vs. the proposed Airborne Conflict Management system (ACM). Conditions on ACM resolution maneuvers are identified to avoid dynamic dissonance between TCAS and ACM. Also included in this report is an Appendix written by Lee Winder about recent and continuing work on alerting systems design. The application of Markov Decision Process (MDP) theory to complex alerting problems is discussed and illustrated with an abstract example system.
Additive Manufacturing in Production: A Study Case Applying Technical Requirements
NASA Astrophysics Data System (ADS)
Ituarte, Iñigo Flores; Coatanea, Eric; Salmi, Mika; Tuomi, Jukka; Partanen, Jouni
Additive manufacturing (AM) is expanding manufacturing capabilities. However, the quality of AM-produced parts depends on a number of machine, geometry and process parameters. The variability of these parameters affects manufacturing drastically, and therefore standardized processes and harmonized methodologies need to be developed to characterize the technology for end-use applications and enable it for production. This research proposes a composite methodology integrating Taguchi Design of Experiments, multi-objective optimization and statistical process control to optimize the manufacturing process and fulfil multiple requirements imposed on an arbitrary geometry. The proposed methodology aims to characterize AM technology as a function of manufacturing process variables as well as to perform a comparative assessment of three AM technologies (Selective Laser Sintering, Laser Stereolithography and Polyjet). Results indicate that only one machine, the laser-based Stereolithography system, could simultaneously fulfil macro- and micro-level geometrical requirements, although mechanical properties were not at the required level. Future research will study a single AM system at a time to characterize AM machine technical capabilities and stimulate pre-normative initiatives for the technology for end-use applications.
Grounded theory as a method for research in speech and language therapy.
Skeat, J; Perry, A
2008-01-01
The use of qualitative methodologies in speech and language therapy has grown over the past two decades, and there is now a body of literature, both generally describing qualitative research, and detailing its applicability to health practice(s). However, there has been only limited profession-specific discussion of qualitative methodologies and their potential application to speech and language therapy. To describe the methodology of grounded theory, and to explain how it might usefully be applied to areas of speech and language research where theoretical frameworks or models are lacking. Grounded theory as a methodology for inductive theory-building from qualitative data is explained and discussed. Some differences between 'modes' of grounded theory are clarified and areas of controversy within the literature are highlighted. The past application of grounded theory to speech and language therapy, and its potential for informing research and clinical practice, are examined. This paper provides an in-depth critique of a qualitative research methodology, including an overview of the main difference between two major 'modes'. The article supports the application of a theory-building approach in the profession, which is sometimes complex to learn and apply, but worthwhile in its results. Grounded theory as a methodology has much to offer speech and language therapists and researchers. Although the majority of research and discussion around this methodology has rested within sociology and nursing, grounded theory can be applied by researchers in any field, including speech and language therapists. The benefit of the grounded theory method to researchers and practitioners lies in its application to social processes and human interactions. The resulting theory may support further research in the speech and language therapy profession.
Save money by understanding variance and tolerancing.
Stuart, K
2007-01-01
Manufacturing processes are inherently variable, which results in component and assembly variance. Unless process capability, variance and tolerancing are fully understood, incorrect design tolerances may be applied, which will lead to more expensive tooling, inflated production costs, high reject rates, product recalls and excessive warranty costs. A methodology is described for correctly allocating tolerances and performing appropriate analyses.
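A common way to make the "understand variance before assigning tolerances" point concrete is to compare a worst-case stack with a statistical (root-sum-square) stack. The component tolerances below are illustrative values, not taken from the article.

```python
import math

# Illustrative four-component assembly stack: +/- tolerances in mm.
tolerances = [0.10, 0.05, 0.08, 0.12]

worst_case = sum(tolerances)                          # every part at its limit
rss = math.sqrt(sum(t ** 2 for t in tolerances))      # statistical (RSS) stack

print(f"worst-case assembly tolerance: +/-{worst_case:.3f} mm")
print(f"RSS assembly tolerance:        +/-{rss:.3f} mm")
# The RSS result is markedly tighter, which is why understanding process
# variance lets designers relax component tolerances without hurting yield.
```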
The Use of Eye Movements in the Study of Multimedia Learning
ERIC Educational Resources Information Center
Hyona, Jukka
2010-01-01
This commentary focuses on the use of the eye-tracking methodology to study cognitive processes during multimedia learning. First, some general remarks are made about how the method is applied to investigate visual information processing, followed by a reflection on the eye movement measures employed in the studies published in this special issue.…
ERIC Educational Resources Information Center
Chen, Gwo-Dong; Liu, Chen-Chung; Ou, Kuo-Liang; Liu, Baw-Jhiune
2000-01-01
Discusses the use of Web logs to record student behavior that can assist teachers in assessing performance and making curriculum decisions for distance learning students who are using Web-based learning systems. Adopts decision tree and data cube information processing methodologies for developing more effective pedagogical strategies. (LRW)
Life Design Counseling Group Intervention with Portuguese Adolescents: A Process and Outcome Study
ERIC Educational Resources Information Center
Cardoso, Paulo; Janeiro, Isabel Nunes; Duarte, Maria Eduarda
2018-01-01
This article examines the process and outcome of a life design counseling group intervention with students in Grades 9 and 12. First, we applied a quasi-experimental methodology to analyze the intervention's effectiveness in promoting career certainty, career decision-making, self-efficacy, and career adaptability in a sample of 236 students.…
ERIC Educational Resources Information Center
Akyol, Zehra; Garrison, D. Randy
2011-01-01
This paper focuses on deep and meaningful learning approaches and outcomes associated with online and blended communities of inquiry. Applying mixed methodology for the research design, the study used transcript analysis, learning outcomes, perceived learning, satisfaction, and interviews to assess learning processes and outcomes. The findings for…
NASA Technical Reports Server (NTRS)
Baird, J.
1967-01-01
This supplement to Task 1B, Large Solid Rocket Motor Case Fabrication Methods, supplies additional supporting cost data and discusses in detail the methodology that was applied to the task. For the case elements studied, the cost was found to be directly proportional to the Process Complexity Factor (PCF). The PCF was obtained for each element by identifying unit processes that are common to the elements and their alternative manufacturing routes, by assigning a weight to each unit process, and by summing the weighted counts. In three instances of actual manufacture, the actual cost per pound equaled the cost estimate based on PCF per pound, but this supplement recognizes that the methodology is of limited, rather than general, application.
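The cost relation described above reduces to a weighted count followed by a proportionality. A minimal sketch follows; the unit processes, weights, counts and the calibration constant are all hypothetical, chosen only to show the arithmetic.

```python
# Hypothetical unit-process weights and counts for one motor-case element.
weights = {"forming": 3.0, "welding": 2.5, "machining": 2.0, "inspection": 1.0}
counts  = {"forming": 2,   "welding": 4,   "machining": 3,   "inspection": 5}

# Process Complexity Factor: sum of weighted unit-process counts.
pcf = sum(weights[p] * counts[p] for p in weights)

# Cost assumed directly proportional to PCF; k is a made-up calibration constant.
k = 450.0   # $ per unit of PCF
print(f"PCF = {pcf:.1f}, estimated element cost = ${k * pcf:,.0f}")
```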
New methodology to reconstruct in 2-D the cuspal enamel of modern human lower molars.
Modesto-Mata, Mario; García-Campos, Cecilia; Martín-Francés, Laura; Martínez de Pinillos, Marina; García-González, Rebeca; Quintino, Yuliet; Canals, Antoni; Lozano, Marina; Dean, M Christopher; Martinón-Torres, María; Bermúdez de Castro, José María
2017-08-01
In recent years, different methodologies have been developed to reconstruct worn teeth. In this article, we propose a new 2-D methodology to reconstruct the worn enamel of lower molars. Our main goals are to reconstruct molars with a high level of accuracy when measuring relevant histological variables and to validate the methodology by calculating the errors associated with the measurements. This methodology is based on polynomial regression equations and has been validated using two different dental variables: cuspal enamel thickness and crown height of the protoconid. In order to perform the validation process, simulated worn modern human molars were employed. The errors associated with the measurements were also estimated by applying methodologies previously proposed by other authors. The mean percentage error estimated in reconstructed molars for these two variables, in comparison with their real values, is -2.17% for the cuspal enamel thickness of the protoconid and -3.18% for the crown height of the protoconid. These errors represent a significant improvement over other methodologies, both in interobserver error and in measurement accuracy. The new methodology based on polynomial regressions can be confidently applied to the reconstruction of cuspal enamel of lower molars, as it improves the accuracy of the measurements and reduces the interobserver error. The present study shows that it is important to validate all methodologies in order to know their associated errors. This new methodology can be easily exported to other modern human populations, the human fossil record and forensic sciences. © 2017 Wiley Periodicals, Inc.
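The validation logic (fit a polynomial to the preserved outline, extrapolate it over an artificially "worn" region, and compare against the known profile) can be sketched with numpy.polyfit. The profile, polynomial degree and wear cut-off below are synthetic assumptions, not the regression equations published in the article.

```python
import numpy as np

# Synthetic 2-D outline of a protoconid section (x, y in mm): an idealised
# quadratic cusp profile plus measurement noise. The apical part (x > 4 mm)
# is treated as worn away and must be reconstructed from the preserved part.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 6.0, 61)
y_true = -0.15 * (x - 3.0) ** 2 + 5.0
y_meas = y_true + rng.normal(0.0, 0.02, x.size)
preserved = x <= 4.0

# Fit a polynomial to the preserved outline only, then extrapolate it over
# the "worn" region to recover the full cuspal outline.
coeffs = np.polyfit(x[preserved], y_meas[preserved], deg=2)
y_rec = np.polyval(coeffs, x)

crown_height = y_rec.max() - y_rec.min()
pct_err = 100.0 * (y_rec[~preserved] - y_true[~preserved]) / y_true[~preserved]
print(f"reconstructed crown height: {crown_height:.2f} mm")
print(f"mean % error over the simulated worn region: {pct_err.mean():+.2f}%")
```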
Fuzzy neural network methodology applied to medical diagnosis
NASA Technical Reports Server (NTRS)
Gorzalczany, Marian B.; Deutsch-Mcleish, Mary
1992-01-01
This paper presents a technique for building expert systems that combines the fuzzy-set approach with artificial neural network structures. This technique can effectively deal with two types of medical knowledge: a nonfuzzy one and a fuzzy one which usually contributes to the process of medical diagnosis. Nonfuzzy numerical data is obtained from medical tests. Fuzzy linguistic rules describing the diagnosis process are provided by a human expert. The proposed method has been successfully applied in veterinary medicine as a support system in the diagnosis of canine liver diseases.
Manterola, Carlos; Torres, Rodrigo; Burgos, Luis; Vial, Manuel; Pineda, Viviana
2006-07-01
Surgery is a curative treatment for gastric cancer (GC). As relapse is frequent, adjuvant therapies such as postoperative chemoradiotherapy have been tried. In Chile, some hospitals adopted Macdonald's study as a protocol for the treatment of GC. The aim was to determine the methodological quality and the internal and external validity of the Macdonald study. Three instruments that assess methodological quality were applied. A critical appraisal was carried out, and methodological quality and internal and external validity were analyzed with two scales: MINCIR (Methodology and Research in Surgery), valid for therapy studies, and CONSORT (Consolidated Standards of Reporting Trials), valid for randomized controlled trials (RCT). Guides and scales were applied by 5 researchers with training in clinical epidemiology. The reader's guide verified that the Macdonald study was not directed at answering a clearly defined question. There was random assignment, but the method used is not described and the patients were not followed until the end of the study (36% of the group with surgery plus chemoradiotherapy did not complete treatment). The MINCIR scale confirmed a multicentric RCT, not blinded, with an unclear randomization sequence, erroneous sample size estimation, vague objectives and no exclusion criteria. The CONSORT system confirmed the lack of a working hypothesis and specific objectives, the absence of exclusion criteria and of identification of the primary variable, an imprecise estimation of sample size, ambiguities in the randomization process, no blinding, the absence of statistical adjustment and the omission of a subgroup analysis. The instruments applied demonstrated methodological shortcomings that compromise the internal and external validity of the Macdonald study.
Entropy Filtered Density Function for Large Eddy Simulation of Turbulent Reacting Flows
NASA Astrophysics Data System (ADS)
Safari, Mehdi
Analysis of local entropy generation is an effective means to optimize the performance of energy and combustion systems by minimizing the irreversibilities in transport processes. Large eddy simulation (LES) is employed to describe entropy transport and generation in turbulent reacting flows. The entropy transport equation in LES contains several unclosed terms. These are the subgrid scale (SGS) entropy flux and the entropy generation caused by irreversible processes: heat conduction, mass diffusion, chemical reaction and viscous dissipation. The SGS effects are taken into account using a novel methodology based on the filtered density function (FDF). This methodology, entitled entropy FDF (En-FDF), is developed and utilized in the form of the joint entropy-velocity-scalar-turbulent frequency FDF and the marginal scalar-entropy FDF, both of which contain the chemical reaction effects in closed form. The former constitutes the most comprehensive form of the En-FDF and provides closure for all the unclosed filtered moments. This methodology is applied for LES of a turbulent shear layer involving transport of passive scalars. Predictions show favorable agreement with the data generated by direct numerical simulation (DNS) of the same layer. The marginal En-FDF accounts for entropy generation effects as well as scalar and entropy statistics. This methodology is applied to a turbulent nonpremixed jet flame (Sandia Flame D) and predictions are validated against experimental data. In both flows, sources of irreversibility are predicted and analyzed.
Rejeb, Olfa; Pilet, Claire; Hamana, Sabri; Xie, Xiaolan; Durand, Thierry; Aloui, Saber; Doly, Anne; Biron, Pierre; Perrier, Lionel; Augusto, Vincent
2018-06-01
Innovation and health-care funding reforms have contributed to the deployment of Information and Communication Technology (ICT) to improve patient care. Many health-care organizations consider the application of ICT a crucial key to enhancing health-care management. The purpose of this paper is to provide a methodology to assess the organizational impact of a high-level Health Information System (HIS) on the patient pathway. We propose an integrated HIS performance evaluation approach combining formal modeling using Architecture of Integrated Information Systems (ARIS) models, a micro-costing approach for cost evaluation, and Discrete-Event Simulation (DES). The methodology is applied to the consultation process for cancer treatment. Simulation scenarios are established to draw conclusions about the impact of the HIS on the patient pathway. We demonstrate that although a high-level HIS lengthens the consultation, the occupation rate of oncologists is lower and the quality of service is higher (measured through the amount of information accessible during the consultation to formulate the diagnosis). The method also allows the most cost-effective ICT elements to be identified to improve care process quality while minimizing costs. The methodology is flexible enough to be applied to other health-care systems.
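The discrete-event layer of such an evaluation is typically a queueing model of the consultation. Below is a minimal SimPy sketch comparing two hypothetical scenarios (shorter consultations without the HIS, slightly longer ones with it); the arrival rate, service times, staffing level and scenario labels are invented, not parameters from the study.

```python
import random
import simpy

def clinic(env, oncologist, consult_minutes, served):
    """Generate patient arrivals and hand each patient to a consultation."""
    i = 0
    while True:
        yield env.timeout(random.expovariate(1 / 20))   # ~1 arrival per 20 min
        i += 1
        env.process(consultation(env, oncologist, consult_minutes, served))

def consultation(env, oncologist, consult_minutes, served):
    with oncologist.request() as req:                   # queue for the oncologist
        yield req
        yield env.timeout(random.expovariate(1 / consult_minutes))
        served.append(env.now)

def run(consult_minutes, horizon=8 * 60):
    random.seed(42)
    env = simpy.Environment()
    oncologist = simpy.Resource(env, capacity=1)
    served = []
    env.process(clinic(env, oncologist, consult_minutes, served))
    env.run(until=horizon)                              # one 8-hour clinic day
    return len(served)

# Hypothetical scenario comparison: the HIS lengthens the mean consultation.
print("patients seen without HIS:", run(consult_minutes=15))
print("patients seen with HIS:   ", run(consult_minutes=18))
```

In a fuller study, the same model would also log waiting times and resource occupation so that the cost side (micro-costing) and the quality side can be compared across scenarios.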
The Problem of Multiple Criteria Selection of the Surface Mining Haul Trucks
NASA Astrophysics Data System (ADS)
Bodziony, Przemysław; Kasztelewicz, Zbigniew; Sawicki, Piotr
2016-06-01
Vehicle transport is a dominant type of technological process in rock mines, and its profitability is strictly dependent on the overall cost of operation, especially diesel oil consumption. Thus, a rational design of a haul-truck transportation system should result from a thorough analysis of technical and economic issues, including both the purchase cost and subsequent operating costs, which have a crucial impact on the cost of mineral extraction. Moreover, off-highway trucks should be selected with respect to the specific operating conditions and even the user's preferences and experience. In this paper, a universal family of evaluation criteria is developed and an evaluation method is applied to the haul truck selection process for specific operating conditions in surface mining. The methodology presented in the paper is based on the principles of multiple criteria decision aiding (MCDA), using one of the ranking methods, ELECTRE III. The applied methodology allows the ranking of alternative solutions (variants) over the considered set of haul trucks. The result of the research is a universal methodology, and it may consequently be applied in other surface mines with similar operating parameters.
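ELECTRE III itself works with indifference, preference and veto thresholds and a distillation step; the sketch below computes only a weighted concordance matrix, the first building block of the outranking relation, for three hypothetical haul-truck alternatives. The criteria, scores and weights are invented for illustration.

```python
import numpy as np

# Hypothetical evaluation table: rows = haul-truck alternatives, columns =
# criteria (payload t, fuel l/h, purchase cost k$, availability %).
# Fuel and cost are to be minimised, the other criteria maximised.
scores = np.array([
    [180,  95, 2400, 88],
    [220, 120, 3100, 90],
    [150,  80, 1900, 85],
], dtype=float)
weights = np.array([0.35, 0.25, 0.25, 0.15])
maximise = np.array([True, False, False, True])

def concordance(scores, weights, maximise):
    """C[a, b] = total weight of criteria on which a is at least as good as b."""
    n = len(scores)
    c = np.zeros((n, n))
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            at_least_as_good = np.where(maximise,
                                        scores[a] >= scores[b],
                                        scores[a] <= scores[b])
            c[a, b] = weights[at_least_as_good].sum()
    return c

print(np.round(concordance(scores, weights, maximise), 2))
```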
System architectures for telerobotic research
NASA Technical Reports Server (NTRS)
Harrison, F. Wallace
1989-01-01
Several activities are performed related to the definition and creation of telerobotic systems. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.
Li, Jingrui; Kondov, Ivan; Wang, Haobin; Thoss, Michael
2015-04-10
A recently developed methodology to simulate photoinduced electron transfer processes at dye-semiconductor interfaces is outlined. The methodology employs a first-principles-based model Hamiltonian and accurate quantum dynamics simulations using the multilayer multiconfiguration time-dependent Hartree approach. This method is applied to study electron injection in the dye-semiconductor system coumarin 343-TiO2. Specifically, the influence of electronic-vibrational coupling is analyzed. Extending previous work, we consider the influence of Dushinsky rotation of the normal modes as well as anharmonicities of the potential energy surfaces on the electron transfer dynamics.
Oshchepkov, Sergey; Bril, Andrey; Yokota, Tatsuya; Yoshida, Yukio; Blumenstock, Thomas; Deutscher, Nicholas M; Dohe, Susanne; Macatangay, Ronald; Morino, Isamu; Notholt, Justus; Rettinger, Markus; Petri, Christof; Schneider, Matthias; Sussman, Ralf; Uchino, Osamu; Velazco, Voltaire; Wunch, Debra; Belikov, Dmitry
2013-02-20
This paper presents an improved photon path length probability density function method that permits simultaneous retrievals of column-average greenhouse gas mole fractions and light path modifications through the atmosphere when processing high-resolution radiance spectra acquired from space. We primarily describe the methodology and retrieval setup and then apply them to the processing of spectra measured by the Greenhouse gases Observing SATellite (GOSAT). We have demonstrated substantial improvements of the data processing with simultaneous carbon dioxide and light path retrievals and reasonable agreement of the satellite-based retrievals against ground-based Fourier transform spectrometer measurements provided by the Total Carbon Column Observing Network (TCCON).
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that for 49-70% of the 154 hazardous conditions, software could be a cause or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Making sense of grounded theory in medical education.
Kennedy, Tara J T; Lingard, Lorelei A
2006-02-01
Grounded theory is a research methodology designed to develop, through collection and analysis of data that is primarily (but not exclusively) qualitative, a well-integrated set of concepts that provide a theoretical explanation of a social phenomenon. This paper aims to provide an introduction to key features of grounded theory methodology within the context of medical education research. In this paper we include a discussion of the origins of grounded theory, a description of key methodological processes, a comment on pitfalls encountered commonly in the application of grounded theory research, and a summary of the strengths of grounded theory methodology with illustrations from the medical education domain. The significant strengths of grounded theory that have resulted in its enduring prominence in qualitative research include its clearly articulated analytical process and its emphasis on the generation of pragmatic theory that is grounded in the data of experience. When applied properly and thoughtfully, grounded theory can address research questions of significant relevance to the domain of medical education.
Methodologies for launcher-payload coupled dynamic analysis
NASA Astrophysics Data System (ADS)
Fransen, S. H. J. A.
2012-06-01
An important step in the design and verification process of spacecraft structures is the coupled dynamic analysis with the launch vehicle in the low-frequency domain, also referred to as coupled loads analysis (CLA). The objective of such analyses is the computation of the dynamic environment of the spacecraft (payload) in terms of interface accelerations, interface forces, center of gravity (CoG) accelerations as well as the internal state of stress. In order to perform an efficient, fast and accurate launcher-payload coupled dynamic analysis, various methodologies have been applied and developed. The methods are related to substructuring techniques, data recovery techniques, the effects of prestress and fluids and time integration problems. The aim of this paper was to give an overview of these methodologies and to show why, how and where these techniques can be used in the process of launcher-payload coupled dynamic analysis. In addition, it will be shown how these methodologies fit together in a library of procedures which can be used with the MSC.Nastran™ solution sequences.
A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
1998-01-01
This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis; and (3) optimization. The product and process definitions are part of the input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and the associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here to the evaluation of six different design concepts for a wing spar.
Stoeckel, D.M.; Stelzer, E.A.; Dick, L.K.
2009-01-01
Quantitative PCR (qPCR), applied to complex environmental samples such as water, wastewater, and feces, is susceptible to methodological and sample-related biases. In this study, we evaluated two exogenous DNA spike-and-recovery controls as proxies for recovery efficiency of Bacteroidales 16S rDNA gene sequences (AllBac and qHF183) that are used for microbial source tracking (MST) in river water. Two controls, (1) the plant pathogen Pantoea stewartii, carrying the chromosomal target gene cpsD, and (2) Escherichia coli, carrying the plasmid-borne target gene DsRed2, were added to raw water samples immediately prior to concentration and DNA extraction for qPCR. When applied to samples processed in replicate, recovery of each control was positively correlated with the observed concentration of each MST marker. Adjustment of MST marker concentrations according to recovery efficiency reduced variability in replicate analyses when consistent processing and extraction methodologies were applied. Although the effects of this procedure on accuracy could not be tested due to uncertainties in control DNA concentrations, the observed reduction in variability should improve the strength of statistical comparisons. These findings suggest that either of the tested spike-and-recovery controls can be useful to measure efficiency of extraction and recovery in routine laboratory processing. © 2009 Elsevier Ltd.
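The recovery adjustment itself is a one-line calculation: divide the observed marker concentration by the control's recovery efficiency. The numbers below are invented for illustration and the control is treated generically; they are not values from the study.

```python
# Illustrative spike-and-recovery adjustment for one water sample.
spiked_copies    = 1.0e5    # control gene copies added before processing
recovered_copies = 3.2e4    # control copies measured by qPCR after extraction
observed_marker  = 850.0    # MST marker copies per 100 mL, as measured

recovery_efficiency = recovered_copies / spiked_copies          # 0.32
adjusted_marker     = observed_marker / recovery_efficiency     # ~2656 copies/100 mL

print(f"recovery efficiency: {recovery_efficiency:.0%}")
print(f"recovery-adjusted marker concentration: {adjusted_marker:.0f} copies/100 mL")
```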
Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J
2008-04-01
Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology has been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
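As a concrete illustration of the Taguchi workflow summarized above, the sketch below evaluates a "larger is better" signal-to-noise ratio over an L4 orthogonal array (three two-level factors) and estimates factor main effects. The factors, levels and duplicate yields are invented, loosely styled on a fermentation example, and are not data from any of the reviewed case studies.

```python
import numpy as np

# L4 orthogonal array for three two-level factors (coded 0/1) and the
# measured response of each run (duplicate yields, invented for illustration).
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])
y = np.array([[62, 65],
              [71, 69],
              [58, 60],
              [75, 78]], dtype=float)

# "Larger is better" signal-to-noise ratio for each run.
sn = -10 * np.log10(np.mean(1.0 / y ** 2, axis=1))

# Main effect of each factor = mean S/N at level 1 minus mean S/N at level 0.
for j, name in enumerate(["temperature", "pH", "substrate"]):
    effect = sn[L4[:, j] == 1].mean() - sn[L4[:, j] == 0].mean()
    print(f"{name:11s} main effect on S/N: {effect:+.2f} dB")
```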
Application of Six Sigma methodology to a diagnostic imaging process.
Taner, Mehmet Tolga; Sezen, Bulent; Atwat, Kamal M
2012-01-01
This paper aims to apply the Six Sigma methodology to improve workflow by eliminating the causes of failure in the medical imaging department of a private Turkish hospital. Implementation of the define, measure, analyse, improve and control (DMAIC) improvement cycle, workflow charts, fishbone diagrams and Pareto charts was employed, together with rigorous data collection in the department. The identification of root causes of repeat sessions and delays was followed by failure mode and effect analysis, hazard analysis and decision tree analysis. The most frequent causes of failure were malfunction of the RIS/PACS system and improper positioning of patients. Subsequent to extensive training of professionals, the sigma level was increased from 3.5 to 4.2. The data were collected over only four months. Six Sigma's data measurement and process improvement methodology is the impetus for health care organisations to rethink their workflow and reduce malpractice. It involves measuring, recording and reporting data on a regular basis, which enables the administration to monitor workflow continuously. The improvements in the workflow under study, made by determining the failures and potential risks associated with radiologic care, will have a positive impact on society in terms of patient safety. Having eliminated repeat examinations, the risk of exposure to additional radiation was also minimised. This paper supports the need to apply Six Sigma and presents an evaluation of the process in an imaging department.
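The sigma levels quoted (3.5 rising to 4.2) follow from the defect rate via the standard DPMO conversion with the conventional 1.5-sigma shift. The sketch below shows that conversion; the defect and opportunity counts are made up, chosen only so that the two quoted levels are reproduced.

```python
from scipy.stats import norm

def sigma_level(defects, opportunities):
    """Short-term sigma level from DPMO, using the conventional 1.5-sigma shift."""
    dpmo = 1e6 * defects / opportunities
    return norm.ppf(1 - dpmo / 1e6) + 1.5

# Made-up counts: repeat imaging sessions before and after the improvements.
print(f"before: {sigma_level(defects=968, opportunities=43000):.1f} sigma")
print(f"after:  {sigma_level(defects=145, opportunities=43000):.1f} sigma")
```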
Markov vs. Hurst-Kolmogorov behaviour identification in hydroclimatic processes
NASA Astrophysics Data System (ADS)
Dimitriadis, Panayiotis; Gournari, Naya; Koutsoyiannis, Demetris
2016-04-01
Hydroclimatic processes are usually modelled either by exponential decay of the autocovariance function, i.e., Markovian behaviour, or by power-type decay, i.e., long-term persistence (also known as Hurst-Kolmogorov behaviour). For the identification and quantification of such behaviours, several graphical stochastic tools can be used, such as the climacogram (i.e., the plot of the variance of the averaged process vs. scale), the autocovariance, the variogram and the power spectrum, with the first usually exhibiting smaller statistical uncertainty than the others. However, most methodologies based on these tools rely on the expected value of the process. In this analysis, we explore a methodology that combines the practical use of a graphical representation of the internal structure of the process with the statistical robustness of maximum-likelihood estimation. For validation and illustration purposes, we apply this methodology to fundamental stochastic processes, such as Markov and Hurst-Kolmogorov type processes. Acknowledgement: This research was conducted within the frame of the undergraduate course "Stochastic Methods in Water Resources" of the National Technical University of Athens (NTUA). The School of Civil Engineering of NTUA provided moral support for the participation of the students in the Assembly.
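A short sketch of the graphical side of this identification: compute the climacogram of a synthetic Markovian (AR(1)) series and inspect its large-scale log-log slope, which tends to -1 for Markovian processes and to 2H - 2 for Hurst-Kolmogorov ones. The series length, lag-one coefficient and scale range are arbitrary choices for illustration.

```python
import numpy as np

def climacogram(x, scales):
    """Variance of the scale-k averaged process for each k in `scales`."""
    gamma = []
    for k in scales:
        n = len(x) // k
        means = x[:n * k].reshape(n, k).mean(axis=1)
        gamma.append(means.var(ddof=1))
    return np.array(gamma)

# Synthetic Markovian (AR(1)) series for illustration.
rng = np.random.default_rng(0)
rho, n = 0.7, 200_000
eps = rng.normal(size=n)
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = rho * x[t - 1] + eps[t]

scales = np.unique(np.logspace(0, 3, 20).astype(int))
gamma = climacogram(x, scales)

# Log-log slope at large scales: close to -1 for a Markov process,
# shallower (equal to 2H - 2) for a Hurst-Kolmogorov process.
slope = np.polyfit(np.log(scales[-6:]), np.log(gamma[-6:]), 1)[0]
print(f"large-scale climacogram slope: {slope:.2f}")
```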
A methodology for the evaluation of the human-bioclimatic performance of open spaces
NASA Astrophysics Data System (ADS)
Charalampopoulos, Ioannis; Tsiros, Ioannis; Chronopoulou-Sereli, Aik.; Matzarakis, Andreas
2017-05-01
The purpose of this paper is to present a simple methodology to improve the evaluation of the human-biometeorological benefits of open spaces. It is based on two groups of new indices built on the well-known PET index. This simple methodology, along with the accompanying indices, allows a qualitative and quantitative evaluation of the climatic behavior of the selected sites. The proposed methodology was applied in a human-biometeorological study in the city of Athens, Greece. The results of this study are in line with those of other related studies, indicating the considerable influence of the sky view factor (SVF), the presence of vegetation and the building materials on human-biometeorological conditions. The proposed methodology may provide new insights into the decision-making process related to the best configuration of urban open spaces.
Indirect assessment of bulk strain soliton velocity in opaque solids
NASA Astrophysics Data System (ADS)
Belashov, A. V.; Beltukov, Y. M.; Petrov, N. V.; Samsonov, A. M.; Semenova, I. V.
2018-03-01
This paper presents a methodology for determining strain soliton velocity in opaque solid materials. The methodology is based on the analysis of soliton evolution in a layer of a transparent material adhesively bonded to a layer of the material under study. It is shown that the resulting soliton velocity in the complex waveguide equals the arithmetic mean of the soliton velocities in the two component materials. The suggested methodology is best suited for the analysis of materials with relatively close elastic parameters and can be applied in research on nonlinear wave processes in opaque composites based on transparent matrices.
Roberts, Renée J; Wilson, Ashley E; Quezado, Zenaide
2017-03-01
Six Sigma and Lean methodologies are effective quality improvement tools in many health care settings. We applied the DMAIC methodology (define, measure, analyze, improve, control) to address deficiencies in our pediatric anesthesia supply chain. We defined supply chain problems by mapping existing processes and soliciting comments from those involved. We used daily distance walked by anesthesia technicians and number of callouts for missing supplies as measurements that we analyzed before and after implementing improvements (anesthesia cart redesign). We showed improvement in the metrics after those interventions were implemented, and those improvements were sustained and thus controlled 1 year after implementation.
Pantex Falling Man - Independent Review Panel Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertolini, Louis; Brannon, Nathan; Olson, Jared
2014-11-01
Consolidated Nuclear Security (CNS) Pantex took the initiative to organize a Review Panel of subject matter experts to independently assess the adequacy of the Pantex Tripping Man Analysis methodology. The purpose of this report is to capture the details of the assessment, including the scope, approach, results, and detailed Appendices. Along with the assessment of the analysis methodology, the panel evaluated the adequacy with which the methodology was applied as well as congruence with Department of Energy (DOE) standards 3009 and 3016. The approach included the review of relevant documentation, interactive discussion with Pantex staff, and the iterative process of evaluating critical lines of inquiry.
Methodology for evaluating pattern transfer completeness in inkjet printing with irregular edges
NASA Astrophysics Data System (ADS)
Huang, Bo-Cin; Chan, Hui-Ju; Hong, Jian-Wei; Lo, Cheng-Yao
2016-06-01
A methodology is proposed for quantifying and qualifying pattern transfer completeness in inkjet printing by examining both pattern dimensions and pattern contour deviations from the reference design. It enables scientific identification and evaluation of inkjet-printed lines, corners, circles, ellipses, and spirals with irregular edges exhibiting bulging, necking, and unpredictable distortions resulting from different process conditions. This methodology not only avoids differences in individual perceptions of ambiguous pattern distortions but also reveals the systematic effects of mechanical stresses applied in different directions to a polymer substrate, and it is effective for both optical and electrical microscopy in direct and indirect lithography or lithography-free patterning.
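One simple way to quantify contour deviation from a reference design is to measure, for every point on the printed contour, the distance to the nearest reference point. The sketch below does this for a synthetic straight edge with an artificial bulge; the geometry, units and metric choice are illustrative assumptions, not the metrics defined in the paper.

```python
import numpy as np

def contour_deviation(printed, reference):
    """Mean and max distance (um) from each printed contour point to the
    nearest reference contour point - one simple deviation metric."""
    d = np.linalg.norm(printed[:, None, :] - reference[None, :, :], axis=2)
    nearest = d.min(axis=1)
    return nearest.mean(), nearest.max()

# Reference: an ideal straight edge at y = 50 um; printed edge bulges near s = 250 um.
s = np.linspace(0, 500, 251)                              # um along the line
reference = np.column_stack([s, np.full_like(s, 50.0)])
printed = np.column_stack([s, 50.0 + 6.0 * np.exp(-((s - 250) / 30) ** 2)])

mean_dev, max_dev = contour_deviation(printed, reference)
print(f"mean edge deviation: {mean_dev:.2f} um, max (bulge): {max_dev:.2f} um")
```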
ERIC Educational Resources Information Center
Malmi, Lauri; Adawi, Tom; Curmi, Ronald; de Graaff, Erik; Duffy, Gavin; Kautz, Christian; Kinnunen, Päivi; Williams, Bill
2018-01-01
We investigated research processes applied in recent publications in the "European Journal of Engineering Education" (EJEE), exploring how papers link to theoretical work and how research processes have been designed and reported. We analysed all 155 papers published in EJEE in 2009, 2010 and 2013, classifying the papers using a taxonomy…
ERIC Educational Resources Information Center
Štofková, Katarína; Strícek, Ivan; Štofková, Jana
2014-01-01
The paper aims to evaluate the possibility of applying new methods and tools for more effective educational processes, with an emphasis on increasing their quality, especially in educational processes at secondary schools and universities. There are some contributions from practice for the effective implementation of time management, such…
NASA Technical Reports Server (NTRS)
House, Frederick B.
1986-01-01
The Nimbus 7 Earth Radiation Budget (ERB) data set is reviewed to examine its strong and weak points. In view of the timing of this report relative to the processing schedule of Nimbus 7 ERB observations, emphasis is placed on the methodology of interpreting the scanning radiometer data to develop directional albedo models. These findings enhance the value of the Nimbus 7 ERB data set and can be applied to the interpretation of both the scanning and nonscanning radiometric observations.
NASA Astrophysics Data System (ADS)
Larnier, H.; Sailhac, P.; Chambodut, A.
2018-01-01
Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and the outer Earth. Large signal-to-noise ratio events are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the application of the continuous wavelet transform. We focus specifically on three types of sources, namely atmospherics, slow tails and whistlers, that cover the frequency range 10 Hz to 10 kHz. Each wave has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to their specific signature. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time-series with different signal-to-noise ratios to test for robustness. We then apply it to real data: audio-magnetotelluric data acquired at three stations in Guadeloupe, an overseas French territory. Most of the analysed atmospherics and slow tails display linear polarization, whereas the analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of electromagnetic waves observed in processed time-series, with an emphasis on the difference between morning and afternoon acquisition. Our new methodology based on the time-frequency signature of lightning-induced electromagnetic waves allows automatic detection and characterization of events in audio-magnetotelluric time-series, providing the means to assess the quality of response functions obtained through processing.
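The sketch below illustrates the core detection idea on a synthetic trace, assuming the PyWavelets package for the continuous wavelet transform: compute a time-frequency decomposition over roughly 10 Hz to 10 kHz and flag time windows whose wavelet power stands out above a robust noise level. The thresholding rule and the synthetic burst are our assumptions, not the authors' classifier of atmospherics, slow tails and whistlers.

```python
# Sketch of time-frequency event flagging with a continuous wavelet transform.
# Illustration only: the synthetic signal and threshold are assumptions.
import numpy as np
import pywt

np.random.seed(0)
fs = 20000.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
noise = 0.1 * np.random.randn(t.size)
burst = np.exp(-((t - 0.4) / 0.002) ** 2) * np.sin(2 * np.pi * 2000 * t)
signal = noise + burst                         # stand-in for an ELF/VLF atmospheric event

# scales chosen so the pseudo-frequencies roughly span 10 Hz - 10 kHz
freqs_target = np.logspace(1, 4, 60)
scales = pywt.central_frequency('morl') * fs / freqs_target
coef, freqs = pywt.cwt(signal, scales, 'morl', sampling_period=1.0 / fs)

power = np.abs(coef) ** 2
profile = power.sum(axis=0)                    # broadband wavelet power versus time
mad = np.median(np.abs(profile - np.median(profile)))
event_times = t[profile > np.median(profile) + 5 * mad]
if event_times.size:
    print("candidate event window: %.4f - %.4f s" % (event_times.min(), event_times.max()))
else:
    print("no event above threshold")
```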
USDA-ARS?s Scientific Manuscript database
There is a need to develop scale explicit understanding of erosion to overcome existing conceptual and methodological flaws in our modelling methods currently applied to understand the process of erosion, transport and deposition at the catchment scale. These models need to be based on a sound under...
Inventive Performance Improvement of Integrated Optical Rate Sensor Using TIPS/TRIZ
NASA Technical Reports Server (NTRS)
Blosiu, Julian O.; Youmans, Bruce R.; Kowalick, Jim
1996-01-01
The Theory of Inventive Problem Solving (TIPS or also known as TRIZ) is a new scientific approach to innovative improvements of products and processes. This methodology was applied to inventively improve performance of an Integrated Optic Rate Sensor (IORS).
Seismic Characterization of EGS Reservoirs
NASA Astrophysics Data System (ADS)
Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.
2014-12-01
To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining if a seismic lineation could be real or simply within the anticipated error range. We apply this methodology to the Basel EGS data set and compare it to another EGS dataset. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
A Numerical, Literal, and Converged Perturbation Algorithm
NASA Astrophysics Data System (ADS)
Wiesel, William E.
2017-09-01
The KAM theorem and von Ziepel's method are applied to a perturbed harmonic oscillator, and it is noted that the KAM methodology does not allow for necessary frequency or angle corrections, while von Ziepel does. The KAM methodology can be carried out with purely numerical methods, since its generating function does not contain momentum dependence. The KAM iteration is extended to allow for frequency and angle changes, and in the process apparently can be successfully applied to degenerate systems normally ruled out by the classical KAM theorem. Convergence is observed to be geometric, not exponential, but it does proceed smoothly to machine precision. The algorithm produces a converged perturbation solution by numerical methods, while still retaining literal variable dependence, at least in the vicinity of a given trajectory.
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid
2017-01-01
Background Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. Results We report a successful implementation of the methodology for the design and development of a system for detecting and predicting falls in older adults. We describe in detail what testing and evaluation activities we carried out to effectively test the system and overcome usability and human factors problems. Conclusions We feel this methodology can be applied to a wide variety of connected health devices and systems. We consider this a methodology that can be scaled to different-sized projects accordingly. PMID:28302594
A consistent modelling methodology for secondary settling tanks in wastewater treatment.
Bürger, Raimund; Diehl, Stefan; Nopens, Ingmar
2011-03-01
The aim of this contribution is partly to build consensus on a consistent modelling methodology (CMM) of complex real processes in wastewater treatment by combining classical concepts with results from applied mathematics, and partly to apply it to the clarification-thickening process in the secondary settling tank. In the CMM, the real process should be approximated by a mathematical model (process model; ordinary or partial differential equation (ODE or PDE)), which in turn is approximated by a simulation model (numerical method) implemented on a computer. These steps have often not been carried out in a correct way. The secondary settling tank was chosen as a case since this is one of the most complex processes in a wastewater treatment plant and simulation models developed decades ago have no guarantee of satisfying fundamental mathematical and physical properties. Nevertheless, such methods are still used in commercial tools to date. This particularly becomes of interest as the state-of-the-art practice is moving towards plant-wide modelling. Then all submodels interact and errors propagate through the model and severely hamper any calibration effort and, hence, the predictive purpose of the model. The CMM is described by applying it first to a simple conversion process in the biological reactor yielding an ODE solver, and then to the solid-liquid separation in the secondary settling tank, yielding a PDE solver. Time has come to incorporate established mathematical techniques into environmental engineering, and wastewater treatment modelling in particular, and to use proven reliable and consistent simulation models. Copyright © 2011 Elsevier Ltd. All rights reserved.
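To make the ODE-solver step concrete, here is a minimal sketch assuming a simple first-order conversion S → P in a completely mixed reactor; the rate law, parameter values and units are illustrative assumptions, not the paper's model. The same pattern, process model first and numerical approximation second, is what the CMM then extends to the PDE describing clarification-thickening.

```python
# Minimal sketch of the CMM chain "process model (ODE) -> simulation model (numerical method)"
# for a single first-order conversion in a completely mixed biological reactor.
# The rate law and parameters are illustrative assumptions, not the paper's model.
def simulate_conversion(s0=10.0, k=0.3, t_end=24.0, dt=0.1):
    """Explicit-Euler approximation of dS/dt = -k*S (mg/L, 1/h, h)."""
    times, s = [0.0], [s0]
    while times[-1] < t_end:
        s.append(s[-1] + dt * (-k * s[-1]))      # Euler step of the process model
        times.append(times[-1] + dt)
    return times, s

times, substrate = simulate_conversion()
print("substrate after 24 h: %.3f mg/L" % substrate[-1])
```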
Woods, J
2001-01-01
The third generation cardiac institute will build on the successes of the past in structuring the service line, re-organizing to assimilate specialist interests, and re-positioning to expand cardiac services into cardiovascular services. To meet the challenges of an increasingly competitive marketplace and complex delivery system, the focus for this new model will shift away from improved structures, and toward improved processes. This shift will require a sound methodology for statistically measuring and sustaining process changes related to the optimization of cardiovascular care. In recent years, GE Medical Systems has successfully applied Six Sigma methodologies to enable cardiac centers to control key clinical and market development processes through its DMADV, DMAIC and Change Acceleration processes. Data indicates Six Sigma is having a positive impact within organizations across the United States, and when appropriately implemented, this approach can serve as a solid foundation for building the next generation cardiac institute.
IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bazjanac, Vladimir
2008-07-01
Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuente, Rafael de la; Iglesias, Javier; Sedano, Pablo G.
IBERDROLA (Spanish utility) and IBERDROLA INGENIERIA (engineering branch) have been developing over the last two years the 110% Extended Power Uprate Project for Cofrentes BWR-6. IBERDROLA has available an in-house design and licensing reload methodology that has been approved in advance by the Spanish Nuclear Regulatory Authority. This methodology has been applied to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 and 13 and to develop a significant number of safety analyses of the Cofrentes Extended Power Uprate. Because the scope of the licensing process of the Cofrentes Extended Power Uprate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes RETRAN model to the analysis of new transients. This is the case of the total loss of feedwater (TLFW) transient. The content of this paper shows the benefits of having an in-house design and licensing methodology and describes the process to extend the applicability of the Cofrentes RETRAN model to the analysis of new transients, in particular the TLFW transient.
Comparing Alternatives For Replacing Harmful Chemicals
NASA Technical Reports Server (NTRS)
Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.
1995-01-01
Methodology developed to provide guidance for replacement of industrial chemicals that must be phased out by law because they are toxic and/or affect environment adversely. Chemicals and processes ranked numerically. Applies mostly to chemicals contributing to depletion of ozone in upper atmosphere; some other harmful chemicals included. Quality function deployment matrix format provides convenient way to compare alternative processes and chemicals. Overall rating at bottom of each process-and-chemical column indicates relative advantage.
The Use of Multi-Criteria Evaluation and Network Analysis in the Area Development Planning Process
2013-03-01
The purpose of this research was to develop improvements to the area development planning process. These plans are used to improve operations within an installation sub-section by altering the physical layout of facilities. One methodology was developed based on applying network analysis concepts to ... layouts. The alternative layout scoring process, based on multi-criteria evaluation, returns a quantitative score for each alternative layout and a...
NASA Astrophysics Data System (ADS)
Huang, Xiao
2006-04-01
Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of the research investigation is to provide decision-making bodies and the practicing engineer a design process and tool box for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV for the first time to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required towards a generic (configuration independent) hands-on flight vehicle conceptual design synthesis methodology. This process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology & algorithm for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed and documented with focus on design transparency, physical understanding and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. As illustrated, the HTHL SAV hands-on design methodology offers growth potential in that the same methodology can be continually updated and extended to other SAV configuration concepts, such as the Vertical Takeoff and Vertical Landing (VTVL) SAV class. Having developed, validated and calibrated the methodology for HTHL designs in the 'hands-on' mode, the report provides an outlook how the methodology will be integrated into a prototype computerized design synthesis software AVDS-PrADOSAV in a follow-on step.
NASA Astrophysics Data System (ADS)
Asyirah, B. N.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.
2017-09-01
Plastic injection moulding is widely used in manufacturing a variety of parts. The injection moulding process parameters play an important role in the product's quality and productivity. Many approaches to minimising warpage and shrinkage have been addressed, such as artificial neural networks, genetic algorithms, glowworm swarm optimisation and hybrid approaches. In this paper, a systematic methodology for determining warpage and shrinkage in the injection moulding process, especially for thin-shell plastic parts, is presented. To identify the effects of the machining parameters on the warpage and shrinkage values, response surface methodology is applied. In this study, a part of an electronic night lamp is chosen as the model. Firstly, an experimental design was used to determine the injection parameters on warpage for different thickness values. The software used to analyse the warpage is Autodesk Moldflow Insight (AMI) 2012.
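As an illustration of the response-surface step, the sketch below fits a second-order polynomial to a coded three-factor Box–Behnken design by least squares; the factor names, design points and warpage values are invented placeholders, not data from the AMI simulations.

```python
# Sketch of the response-surface step: fit a second-order model
#   y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)
# to coded factor settings and a measured warpage response.
# Design matrix and responses are placeholders, not data from the study.
import numpy as np
from itertools import combinations

X = np.array([                      # coded levels, e.g. melt temp, pack pressure, cool time
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
], dtype=float)                     # 3-factor Box-Behnken layout with 3 centre points
y = np.array([0.42, 0.35, 0.40, 0.31, 0.44, 0.38, 0.37, 0.30,
              0.41, 0.36, 0.34, 0.29, 0.33, 0.32, 0.33])  # warpage, mm (invented)

def quadratic_terms(x):
    terms = [1.0, *x, *(xi * xi for xi in x)]
    terms += [x[i] * x[j] for i, j in combinations(range(len(x)), 2)]
    return terms

A = np.array([quadratic_terms(row) for row in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))
```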
Timing of translation in cross-language qualitative research.
Santos, Hudson P O; Black, Amanda M; Sandelowski, Margarete
2015-01-01
Although there is increased understanding of language barriers in cross-language studies, the point at which language transformation processes are applied in research is inconsistently reported, or treated as a minor issue. Differences in translation timeframes raise methodological issues related to the material to be translated, as well as for the process of data analysis and interpretation. In this article we address methodological issues related to the timing of translation from Portuguese to English in two international cross-language collaborative research studies involving researchers from Brazil, Canada, and the United States. One study entailed late-phase translation of a research report, whereas the other study involved early phase translation of interview data. The timing of translation in interaction with the object of translation should be considered, in addition to the language, cultural, subject matter, and methodological competencies of research team members. © The Author(s) 2014.
Sethi, Rajiv; Yanamadala, Vijay; Burton, Douglas C; Bess, Robert Shay
2017-11-01
Lean methodology was developed in the manufacturing industry to increase output and decrease costs. These labor organization methods have become the mainstay of major manufacturing companies worldwide. Lean methods involve continuous process improvement through the systematic elimination of waste, prevention of mistakes, and empowerment of workers to make changes. Because of the profit and productivity gains made in the manufacturing arena using lean methods, several healthcare organizations have adopted lean methodologies for patient care. Lean methods have now been implemented in many areas of health care. In orthopaedic surgery, lean methods have been applied to reduce complication rates and create a culture of continuous improvement. A step-by-step guide based on our experience can help surgeons use lean methods in practice. Surgeons and hospital centers well versed in lean methodology will be poised to reduce complications, improve patient outcomes, and optimize cost/benefit ratios for patient care.
NASA Astrophysics Data System (ADS)
Diller, Christian; Karic, Sarah; Oberding, Sarah
2017-06-01
This article addresses the question of the phases of the political planning process in which planners apply their methodological set of tools. To that end, the results of a research project are presented, which were gained from an examination of planning cases reported in learned journals. First, it is discussed which model of the planning process is most suitable to reflect the regarded cases and how it relates to models of the political process. Thereafter, it is analysed which types of planning methods are applied in the several stages of the planning process. The central findings: although complex, many planning processes can be thoroughly pictured by a linear model with predominantly simple feedback loops. Even in times of the communicative turn, planners should take care to apply not only communicative methods but also the classical analytical-rational methods, which are helpful especially for understanding the political process before and after the actual planning phase.
Proceedings of the 19th Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1994-01-01
The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are: (1) to understand the software development process in the GSFC environment; (2) to measure the effects of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that include this document.
Montella, Emma; Di Cicco, Maria Vincenza; Ferraro, Anna; Centobelli, Piera; Raiola, Eliana; Triassi, Maria; Improta, Giovanni
2017-06-01
Nowadays, the monitoring and prevention of healthcare-associated infections (HAIs) is a priority for the healthcare sector. In this article, we report on the application of the Lean Six Sigma (LSS) methodology to reduce the number of patients affected by sentinel bacterial infections who are at risk of HAI. The LSS methodology was applied in the general surgery department by using a multidisciplinary team of both physicians and academics. Data on more than 20 000 patients who underwent a wide range of surgical procedures between January 2011 and December 2014 were collected to conduct the study using the departmental information system. The most prevalent sentinel bacteria were determined among the infected patients. The preintervention (January 2011 to December 2012) and postintervention (January 2013 to December 2014) phases were compared to analyze the effects of the methodology implemented. The methodology allowed the identification of variables that influenced the risk of HAIs and the implementation of corrective actions to improve the care process, thereby reducing the percentage of infected patients. The improved process resulted in a 20% reduction in the average number of hospitalization days between preintervention and control phases, and a decrease in the mean (SD) number of days of hospitalization amounted to 36 (15.68), with a data distribution around 3 σ. The LSS is a helpful strategy that ensures a significant decrease in the number of HAIs in patients undergoing surgical interventions. The implementation of this intervention in the general surgery departments resulted in a significant reduction in both the number of hospitalization days and the number of patients affected by HAIs. This approach, together with other tools for reducing the risk of infection (surveillance, epidemiological guidelines, and training of healthcare personnel), could be applied to redesign and improve a wide range of healthcare processes. © 2016 John Wiley & Sons, Ltd.
Evaluative methodology for prioritizing transportation energy conservation strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pang, L.M.G.
An analytical methodology was developed for the purpose of prioritizing a set of transportation energy conservation (TEC) strategies within an urban environment. Steps involved in applying the methodology consist of 1) defining the goals, objectives and constraints of the given urban community, 2) identifying potential TEC strategies, 3) assessing the impact of the strategies, 4) applying the TEC evaluation model, and 5) utilizing a selection process to determine the optimal set of strategies for implementation. This research provides an overview of 21 TEC strategies, a quick-response technique for estimating energy savings, a multiattribute utility theory approach for assessing subjective impacts, and a computer program for making the strategy evaluations, all of which assist in expediting the execution of the entire methodology procedure. The critical element of the methodology is the strategy evaluation model, which incorporates a number of desirable concepts including 1) a comprehensive accounting of all relevant impacts, 2) the application of multiobjective decision-making techniques, 3) an approach to assure compatibility among quantitative and qualitative impact measures, 4) the inclusion of the decision maker's preferences in the evaluation procedure, and 5) the cost-effectiveness concept. Application of the methodology to Salt Lake City, Utah demonstrated its utility, ease of use, and favorable reception by decision makers.
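A toy sketch of the strategy evaluation step is given below: an additive multiattribute utility score with decision-maker weights. The strategy names, attribute utilities and weights are invented, and the paper's model additionally covers qualitative measures and cost-effectiveness.

```python
# Toy sketch of an additive multiattribute evaluation of TEC strategies.
# Strategy names, attribute utilities (0-1) and weights are invented.
weights = {"energy_savings": 0.5, "cost": 0.3, "public_acceptance": 0.2}

strategies = {
    "ridesharing_program": {"energy_savings": 0.6, "cost": 0.8, "public_acceptance": 0.7},
    "signal_timing":       {"energy_savings": 0.4, "cost": 0.9, "public_acceptance": 0.9},
    "parking_pricing":     {"energy_savings": 0.7, "cost": 0.6, "public_acceptance": 0.3},
}

def utility(scores):
    # weighted additive utility over all attributes
    return sum(weights[a] * scores[a] for a in weights)

ranked = sorted(strategies, key=lambda s: utility(strategies[s]), reverse=True)
for s in ranked:
    print("%-22s %.2f" % (s, utility(strategies[s])))
```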
Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology
NASA Astrophysics Data System (ADS)
Morgan, T. W.; Thurgood, R. L.
1984-05-01
This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA) firmly grounded in the economic model is developed both conceptually and mathematically. The primary objective for developing the VODCA methodology was to improve the process for extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible use of multiple objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.
A new approach to subjectively assess quality of plenoptic content
NASA Astrophysics Data System (ADS)
Viola, Irene; Řeřábek, Martin; Ebrahimi, Touradj
2016-09-01
Plenoptic content is becoming increasingly popular thanks to the availability of acquisition and display devices. Owing to image-based rendering techniques, plenoptic content can be rendered in real time in an interactive manner, allowing virtual navigation through the captured scenes. This way of content consumption enables new experiences, and therefore introduces several challenges in terms of plenoptic data processing, transmission and, consequently, visual quality evaluation. In this paper, we propose a new methodology to subjectively assess the visual quality of plenoptic content. We also introduce a prototype software to perform subjective quality assessment according to the proposed methodology. The proposed methodology is further applied to assess the visual quality of a light field compression algorithm. Results show that this methodology can be successfully used to assess the visual quality of plenoptic content.
Quantifying chemical reactions by using mixing analysis.
Jurado, Anna; Vázquez-Suñé, Enric; Carrera, Jesús; Tubau, Isabel; Pujades, Estanislao
2015-01-01
This work is motivated by a sound understanding of the chemical processes that affect the organic pollutants in an urban aquifer. We propose an approach to quantify such processes using mixing calculations. The methodology consists of the following steps: (1) identification of the recharge sources (end-members) and selection of the species (conservative and non-conservative) to be used, (2) identification of the chemical processes and (3) evaluation of mixing ratios including the chemical processes. This methodology has been applied in the Besòs River Delta (NE Barcelona, Spain), where the River Besòs is the main aquifer recharge source. A total number of 51 groundwater samples were collected from July 2007 to May 2010 during four field campaigns. Three river end-members were necessary to explain the temporal variability of the River Besòs: one river end-member is from the wet periods (W1) and two are from dry periods (D1 and D2). This methodology has proved to be useful not only to compute the mixing ratios but also to quantify processes such as calcite and magnesite dissolution, aerobic respiration and denitrification undergone at each observation point. Copyright © 2014 Elsevier B.V. All rights reserved.
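A minimal sketch of the mixing-ratio step (3) is shown below: end-member fractions are estimated from conservative-species concentrations by non-negative least squares, with a heavily weighted mass-balance row forcing the fractions to sum to one. The tracer choice and all concentrations are placeholders, not Besòs data.

```python
# Sketch of the mixing-ratio step: find end-member fractions f >= 0 with sum(f) = 1
# that best reproduce a sample's conservative-species concentrations.
# End-member and sample concentrations are placeholders, not Besos data.
import numpy as np
from scipy.optimize import nnls

species = ["Cl", "Br", "SO4"]                 # conservative tracers (assumed)
end_members = np.array([                      # columns: W1, D1, D2 river end-members
    [120.0, 310.0, 260.0],                    # Cl  (mg/L)
    [0.30, 0.90, 0.70],                       # Br
    [95.0, 180.0, 150.0],                     # SO4
])
sample = np.array([240.0, 0.65, 140.0])

w = 1e3                                       # weight enforcing sum(f) = 1
A = np.vstack([end_members, w * np.ones(3)])
b = np.concatenate([sample, [w]])
f, _ = nnls(A, b)                             # non-negative least squares
print("mixing ratios (W1, D1, D2):", np.round(f, 2))
```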
Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N
2015-03-01
A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse-grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of a signal minimisation algorithm to selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for detection of defects in 50 mm thick coarse-grained austenitic stainless steel specimens. A signal-to-noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB extra enhancement in SNR is achieved as compared to the sum-of-selected-IMFs approach. The application of the minimisation algorithm with the EEMD-processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal-to-noise ratio. This methodology was further employed for successful imaging of defects in a B-scan. Copyright © 2014. Published by Elsevier B.V.
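The sketch below illustrates only the EEMD front end, decomposing a synthetic A-scan and rebuilding it from a band of IMFs. It assumes the PyEMD (EMD-signal) package is available; the signal, the retained IMF indices and the gate used for the SNR estimate are invented, and the paper's minimisation algorithm on the selected IMFs is not reproduced.

```python
# Sketch of the EEMD front end: decompose a synthetic ultrasonic A-scan and
# rebuild it from a selected band of IMFs. Assumes the PyEMD (EMD-signal) package.
import numpy as np
from PyEMD import EEMD

np.random.seed(0)
fs = 100e6                                    # 100 MS/s sampling (assumed)
t = np.arange(0, 20e-6, 1 / fs)
echo = np.exp(-((t - 8e-6) / 0.3e-6) ** 2) * np.sin(2 * np.pi * 5e6 * t)
signal = echo + 0.5 * np.random.randn(t.size)  # echo buried in grain noise

eemd = EEMD(trials=50)
imfs = eemd.eemd(signal)
selected = imfs[1:4].sum(axis=0)              # retained IMF band (choice is illustrative)

def snr_db(x, gate=(7.5e-6, 8.5e-6)):
    # peak inside the expected echo gate versus noise outside it
    inside = (t >= gate[0]) & (t <= gate[1])
    return 20 * np.log10(np.abs(x[inside]).max() / x[~inside].std())

print("SNR raw: %.1f dB, reconstructed: %.1f dB" % (snr_db(signal), snr_db(selected)))
```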
Perez, Pablo A; Hintelman, Holger; Quiroz, Waldo; Bravo, Manuel A
2017-11-01
In the present work, the efficiency of the distillation process for extracting monomethylmercury (MMHg) from soil samples was studied and optimized using an experimental design methodology. The influence of soil composition on MMHg extraction was evaluated by testing four soil samples with different geochemical characteristics. Optimization suggested that the acid concentration and the duration of the distillation process were most significant, and the most favorable conditions, established as a compromise for the studied soils, were determined to be a 70 min distillation using a 0.2 M acid. The corresponding limits of detection (LOD) and quantification (LOQ) were 0.21 and 0.7 pg absolute, respectively. The optimized methodology was applied with satisfactory results to soil samples and was compared to a reference methodology based on isotopic dilution analysis followed by gas chromatography-inductively coupled plasma mass spectrometry (IDA-GC-ICP-MS). Using the optimized conditions, recoveries ranged from 82 to 98%, which is an increase of 9-34% relative to the previously used standard operating procedure. Finally, the validated methodology was applied to quantify MMHg in soils collected from different sites impacted by coal fired power plants in the north-central zone of Chile, measuring MMHg concentrations ranging from 0.091 to 2.8 ng g⁻¹. These data are, to the best of our knowledge, the first MMHg measurements reported for Chile. Copyright © 2017 Elsevier Ltd. All rights reserved.
The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis.
Hachaj, Tomasz; Ogiela, Marek R
2016-06-01
The main novelty of this paper is the adaptation of the Gesture Description Language (GDL) methodology to the analysis and classification of sport and rehabilitation data. We show that the Lua language can be successfully used to adapt the GDL classifier to these tasks. The newly applied scripting language allows easy extension and integration of the classifier with other software technologies and applications. The achieved execution speed allows the methodology to be used in real-time motion capture data processing, where the capture frequency ranges from 100 Hz to 500 Hz depending on the number of features or classes to be calculated and recognized. The proposed methodology can therefore be used with high-end motion capture systems. We anticipate that using this novel, efficient and effective method will greatly help both sports trainers and physiotherapists in their practice. The proposed approach can be directly applied to kinematic analysis of motion capture data (evaluation of motion without regard to the forces that cause that motion). The ability to apply pattern recognition methods to GDL descriptions can be utilized in virtual reality environments and used for sport training or rehabilitation treatment.
ERIC Educational Resources Information Center
Sangra, Albert; Gonzalez-Sanmamed, Mercedes
2010-01-01
The purpose of this study is to analyse what is happening at schools regarding the integration and use of information and communication technologies (ICT) and to examine teachers' perceptions about what teaching and learning processes can be improved through the use of ICT. A multiple-case-study research methodology was applied. From a previous…
A methodology for stochastic analysis of share prices as Markov chains with finite states.
Mettle, Felix Okoe; Quaye, Enoch Nii Boi; Laryea, Ravenhill Adjetey
2014-01-01
Price volatilities make stock investments risky, leaving investors in a critical position when decisions must be made under uncertainty. To improve investors' confidence in evaluating exchange markets without resorting to time series methodology, we specify equity price change as a stochastic process assumed to possess Markov dependency, with the respective state transition probability matrices following the identified state space (i.e. decrease, stable or increase). We established that the identified states communicate, and that the chains are aperiodic and ergodic, thus possessing limiting distributions. We developed a methodology for determining the expected mean return time for stock price increases and also established criteria for improving investment decisions based on the highest transition probabilities, lowest mean return time and highest limiting distributions. We further developed an R algorithm for running the methodology introduced. The established methodology is applied to selected equities from Ghana Stock Exchange weekly trading data.
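The paper implements the procedure in R; the sketch below is a Python illustration of the same chain construction with invented weekly prices: classify price changes into decrease/stable/increase states, estimate the transition matrix, and derive the limiting distribution and mean return times.

```python
# Sketch of the chain construction: classify weekly price changes into
# decrease / stable / increase states, estimate the transition matrix,
# and derive the limiting distribution and mean return times. Prices are invented.
import numpy as np

prices = np.array([4.10, 4.10, 4.15, 4.12, 4.12, 4.20, 4.25, 4.25, 4.22,
                   4.22, 4.30, 4.28, 4.28, 4.35, 4.35, 4.40])
states = np.sign(np.diff(prices)).astype(int) + 1      # 0=decrease, 1=stable, 2=increase

P = np.zeros((3, 3))
for a, b in zip(states[:-1], states[1:]):
    P[a, b] += 1
P = P / P.sum(axis=1, keepdims=True)                   # row-stochastic transition matrix

# limiting (stationary) distribution: left eigenvector of P for eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()

mean_return_time = 1.0 / pi                            # ergodic chain: m_i = 1 / pi_i
print("transition matrix:\n", np.round(P, 2))
print("limiting distribution:", np.round(pi, 2))
print("mean return times (weeks):", np.round(mean_return_time, 1))
```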
Treves-Kagan, Sarah; Naidoo, Evasen; Gilvydis, Jennifer M; Raphela, Elsie; Barnhart, Scott; Lippman, Sheri A
2017-09-01
Successful HIV prevention programming requires engaging communities in the planning process and responding to the social environmental factors that shape health and behaviour in a specific local context. We conducted two community-based situational analyses to inform a large, comprehensive HIV prevention programme in two rural districts of North West Province South Africa in 2012. The methodology includes: initial partnership building, goal setting and background research; 1 week of field work; in-field and subsequent data analysis; and community dissemination and programmatic incorporation of results. We describe the methodology and a case study of the approach in rural South Africa; assess if the methodology generated data with sufficient saturation, breadth and utility for programming purposes; and evaluate if this process successfully engaged the community. Between the two sites, 87 men and 105 women consented to in-depth interviews; 17 focus groups were conducted; and 13 health facilities and 7 NGOs were assessed. The methodology succeeded in quickly collecting high-quality data relevant to tailoring a comprehensive HIV programme and created a strong foundation for community engagement and integration with local health services. This methodology can be an accessible tool in guiding community engagement and tailoring future combination HIV prevention and care programmes.
Integrated Response Time Evaluation Methodology for the Nuclear Safety Instrumentation System
NASA Astrophysics Data System (ADS)
Lee, Chang Jae; Yun, Jae Hee
2017-06-01
Safety analysis for a nuclear power plant establishes not only an analytical limit (AL) in terms of a measured or calculated variable but also an analytical response time (ART) required to complete protective action after the AL is reached. If the two constraints are met, the safety limit selected to maintain the integrity of physical barriers used for preventing uncontrolled radioactivity release will not be exceeded during anticipated operational occurrences and postulated accidents. Setpoint determination methodologies have actively been developed to ensure that the protective action is initiated before the process conditions reach the AL. However, regarding the ART for a nuclear safety instrumentation system, an integrated evaluation methodology considering the whole design process has not been systematically studied. In order to assure the safety of nuclear power plants, this paper proposes a systematic and integrated response time evaluation methodology that covers safety analyses, system designs, response time analyses, and response time tests. This methodology is applied to safety instrumentation systems for the advanced power reactor 1400 and the optimized power reactor 1000 nuclear power plants in South Korea. The quantitative evaluation results are provided herein. The evaluation results using the proposed methodology demonstrate that the nuclear safety instrumentation systems fully satisfy corresponding requirements of the ART.
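At its simplest, the response-time side of the evaluation reduces to a budget check of the kind sketched below; the channel components, their delays and the ART value are invented placeholders, and the actual methodology ties each number to safety analyses, design documents, response time analyses and plant tests.

```python
# Toy response-time budget check for one protection channel. Component delays
# and the analytical response time (ART) are invented placeholders.
components_ms = {
    "sensor_and_transmitter": 250,
    "signal_processing":      120,
    "trip_logic":              50,
    "breaker_and_actuation":  400,
}
art_ms = 1000                     # analytical response time from safety analysis (assumed)

total = sum(components_ms.values())
margin = art_ms - total
print("channel response time %d ms, ART %d ms, margin %d ms -> %s"
      % (total, art_ms, margin, "OK" if margin >= 0 else "VIOLATION"))
```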
Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J
2015-03-15
This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) Interviews to elicit mental models; (2) Cognitive maps to represent and analyse individual and group mental models; (3) Time-sequence diagrams to chronologically structure the decision making process; (4) All-encompassing conceptual model of decision making, and (5) computational (in this case agent-based) Model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weakness-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an Agent Based Model. Copyright © 2014 Elsevier Ltd. All rights reserved.
Jacobo-Velázquez, D A; Ramos-Parra, P A; Hernández-Brenes, C
2010-08-01
High hydrostatic pressure (HHP) pasteurized and refrigerated avocado and mango pulps contain lower microbial counts and thus are safer and acceptable for human consumption for a longer period of time, when compared to fresh unprocessed pulps. However, during their commercial shelf life, changes in their sensory characteristics take place and eventually produce the rejection of these products by consumers. Therefore, in the present study, the use of sensory evaluation was proposed for the shelf-life determinations of HHP-processed avocado and mango pulps. The study focused on evaluating the feasibility of applying survival analysis methodology to the data generated by consumers in order to determine the sensory shelf lives of both HHP-treated pulps of avocado and mango. Survival analysis proved to be an effective methodology for the estimation of the sensory shelf life of avocado and mango pulps processed with HHP, with potential application for other pressurized products. Practical Application: At present, HHP processing is one of the most effective alternatives for the commercial nonthermal pasteurization of fresh tropical fruits. HHP processing improves the microbial stability of the fruit pulps significantly; however, the products continue to deteriorate during their refrigerated storage mainly due to the action of residual detrimental enzymes. This article proposes the application of survival analysis methodology for the determination of the sensory shelf life of HHP-treated avocado and mango pulps. Results demonstrated that the procedure appears to be simple and practical for the sensory shelf-life determination of HHP-treated foods when their main mode of failure is not caused by increases in microbiological counts that can affect human health.
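As a simplified stand-in for the survival-analysis step, the sketch below treats each consumer's storage time as an event time (rejection observed, or censored if the product was still accepted) and reads off the time by which half of the consumers would reject the product. It assumes the lifelines package and invented data, and a Kaplan–Meier estimator replaces the paper's fuller treatment of censored sensory data.

```python
# Simplified sensory shelf-life estimation: consumers' storage times with a
# rejected / still-accepted flag, summarized by a Kaplan-Meier estimator.
# Data are invented; assumes the lifelines package.
from lifelines import KaplanMeierFitter

storage_days = [7, 14, 14, 21, 21, 21, 28, 28, 35, 35, 42, 42]   # days of refrigerated storage
rejected     = [0,  0,  1,  0,  1,  1,  1,  0,  1,  1,  1,  0]   # 1 = consumer rejected sample

kmf = KaplanMeierFitter()
kmf.fit(storage_days, event_observed=rejected, label="HHP avocado pulp (mock)")
print("estimated sensory shelf life (50% rejection): %s days" % kmf.median_survival_time_)
```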
2012-01-01
Background The World Health Organization (WHO) Department of HIV/AIDS led the development of public health guidelines for delivering an evidence-based, essential package of interventions for the prevention and treatment of HIV and other sexually transmitted infections (STIs) among men who have sex with men (MSM) and transgender people in the health sector in low- and middle-income countries. The objective of this paper is to review the methodological challenges faced and solutions applied during the development of the guidelines. Methods The development of the guidelines followed the WHO guideline development process, which utilizes the GRADE approach. We identified, categorized and labeled the challenges identified in the guidelines development process and described the solutions through an interactive process of in-person and electronic communication. Results We describe how we dealt with the following challenges: (1) heterogeneous and complex interventions; (2) paucity of trial data; (3) selecting outcomes of interest; (4) using indirect evidence; (5) integrating values and preferences; (6) considering resource use; (7) addressing social and legal barriers; (8) wording of recommendations; and (9) developing global guidelines. Conclusion We were able to successfully apply the GRADE approach for developing recommendations for public health interventions. Applying the general principles of the approach while carefully considering specific challenges can enhance both the process and the outcome of guideline development. PMID:22640260
The Opportunities and Pitfalls of Applying Life Cycle Thinking to Nanoproducts and Nanomaterials
Life Cycle Assessment (LCA) is a well-established methodology for evaluating the environmental impact of products, materials, and processes. LCA experts worldwide agree that existing LCA tools are capable of supporting the development of decisions on the use of nanomaterials and ...
An industrial ecology approach to municipal solid waste management: I. Methodology
Municipal solid waste (MSW) can be viewed as a feedstock for industrial ecology inspired conversions of wastes to valuable products and energy. The industrial ecology principle of symbiotic processes using waste streams for creating value-added products is applied to MSW, with e...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harben, P E; Harris, D; Myers, S
Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization and in full 3D finite difference modeling as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications.
Acceptance testing for PACS: from methodology to design to implementation
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Huang, H. K.
2004-04-01
Acceptance Testing (AT) is a crucial step in the implementation process of a PACS within a clinical environment. AT determines whether the PACS is ready for clinical use and marks the official sign-off of the PACS product. Most PACS vendors have AT plans; however, these plans do not provide a complete and robust evaluation of the full system. In addition, different sites will have different special requirements that vendor AT plans do not cover. The purpose of this paper is to introduce a protocol for AT design and present case studies of AT performed on clinical PACS. A methodology is presented that includes identifying testing components within PACS, quality assurance for both functionality and performance, and technical testing focusing on key single points of failure within the PACS product. Tools and resources that provide assistance in performing AT are discussed. In addition, implementation of the AT within the clinical environment and the overall implementation timeline of the PACS process are presented. Finally, case studies of actual AT of clinical PACS performed in the healthcare environment are reviewed. The methodology for designing and implementing a robust AT plan for PACS was documented and has been used in PACS acceptance tests at several sites. This methodology can be applied to any PACS and can serve as a validation of the PACS product being acquired by radiology departments and hospitals. A methodology for AT design and implementation was presented that can be applied to future PACS installations. A robust AT plan for a PACS installation can increase both the utilization and satisfaction of a successful implementation of a PACS product that benefits both vendor and customer.
Arun, Mike W J; Yoganandan, Narayan; Stemper, Brian D; Pintar, Frank A
2014-12-01
While studies have used acoustic sensors to determine fracture initiation time in biomechanical studies, a systematic procedure is not established to process acoustic signals. The objective of the study was to develop a methodology to condition distorted acoustic emission data using signal processing techniques to identify fracture initiation time. The methodology was developed from testing a human cadaver lumbar spine column. Acoustic sensors were glued to all vertebrae, high-rate impact loading was applied, load-time histories were recorded (load cell), and fracture was documented using CT. Compression fracture occurred to L1 while other vertebrae were intact. FFT of raw voltage-time traces were used to determine an optimum frequency range associated with high decibel levels. Signals were bandpass filtered in this range. Bursting pattern was found in the fractured vertebra while signals from other vertebrae were silent. Bursting time was associated with time of fracture initiation. Force at fracture was determined using this time and force-time data. The methodology is independent of selecting parameters a priori such as fixing a voltage level(s), bandpass frequency and/or using force-time signal, and allows determination of force based on time identified during signal processing. The methodology can be used for different body regions in cadaver experiments. Copyright © 2014 Elsevier Ltd. All rights reserved.
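The sketch below mirrors the signal-conditioning chain described above on a synthetic trace: band-pass filtering in the frequency band identified from the FFT, envelope extraction, and marking the first sustained burst as the fracture-initiation time. The sampling rate, band limits and threshold rule are illustrative assumptions, not the study's values.

```python
# Sketch of the acoustic-emission conditioning chain: band-pass the raw trace
# in the band identified from its FFT, take the envelope, and mark the first
# burst as fracture initiation. Synthetic trace and parameters are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

np.random.seed(0)
fs = 1_000_000.0                                  # 1 MHz sampling (assumed)
t = np.arange(0, 0.02, 1 / fs)
burst = (t > 0.012) * np.exp(-(t - 0.012) * 2000) * np.sin(2 * np.pi * 150e3 * t)
trace = 0.05 * np.random.randn(t.size) + burst    # stand-in for a distorted AE signal

lo, hi = 100e3, 200e3                             # band with high decibel levels (assumed)
b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
filtered = filtfilt(b, a, trace)

envelope = np.abs(hilbert(filtered))
onset_idx = np.argmax(envelope > 5 * np.median(envelope))   # first threshold crossing
print("fracture initiation at t = %.4f s" % t[onset_idx])
# force at fracture would then be read from the load-cell trace at this time
```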
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, John E.; English, Christine M.; Gesick, Joshua C.
This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.
The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis
NASA Astrophysics Data System (ADS)
Yu, Huiling; Fan, Delin; Zhang, Yizhuo
This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) and cascade control based on a Smith predictor. The particleboard glue supplying and dosing system case study defines the problem and the solution using the methodology proposed in the paper. Status differences in the glue dosing process of particleboard production usually cause inaccurate glue dosing volumes. To solve this problem, we applied TRIZ technical contradictions and inventive principles to improve this key process of particleboard production. The improvement method mapped the inaccuracy problem to a TRIZ technical contradiction, and the prior-action principle led to proposing a Smith predictor as the control algorithm in the glue dosing system. This research examines the usefulness of a TRIZ-based problem-solving process designed to improve the problem-solving ability of users in addressing difficult or recurring problems, and it also testifies to the practicality and validity of TRIZ. Several suggestions are presented on how to approach this problem.
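A minimal discrete-time sketch of the proposed control idea, a Smith predictor compensating the transport delay between pump and dosing point, is given below; the first-order-plus-dead-time glue-flow model, PI gains and delay are assumptions for illustration, not parameters from the paper. With an accurate internal model, the delay is effectively removed from the feedback loop, which is what lets the dosing volume track the setpoint despite the transport lag.

```python
# Minimal Smith-predictor sketch for a delayed glue-flow loop.
# Plant: y[k+1] = a*y[k] + b*u[k-d] (first order plus dead time; values assumed).
import numpy as np

a, b, d = 0.9, 0.1, 10          # hypothetical glue-flow dynamics and transport delay (samples)
Kp, Ki = 2.0, 0.15              # PI gains (assumed)
n, setpoint = 200, 1.0

y = np.zeros(n)                  # measured glue flow
ym = np.zeros(n)                 # internal model output, without delay
u = np.zeros(n)
integ = 0.0

for k in range(n - 1):
    ym_delayed = ym[k - d] if k >= d else 0.0
    # Smith predictor feedback: undelayed model plus model/plant mismatch correction
    feedback = ym[k] + (y[k] - ym_delayed)
    e = setpoint - feedback
    integ += e
    u[k] = Kp * e + Ki * integ
    # propagate the delay-free internal model and the real (delayed) plant
    ym[k + 1] = a * ym[k] + b * u[k]
    u_delayed = u[k - d] if k >= d else 0.0
    y[k + 1] = a * y[k] + b * u_delayed

print("final glue flow: %.3f (setpoint %.1f)" % (y[-1], setpoint))
```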
Modeling of electrohydrodynamic drying process using response surface methodology
Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin
2014-01-01
The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising alternative drying process is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required by the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model in order to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box–Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build up a model that can map the entire response surface. The interrelationships between parameters were well described by RSM. PMID:24936289
Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations
NASA Technical Reports Server (NTRS)
Chanchio, Kasidit; Sun, Xian-He
1996-01-01
This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. New concepts of migration point as well as migration point analysis and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points; whereas, necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
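A minimal sketch of one such suboptimal sequential rule is given below: the posterior probability that a failure has occurred is updated recursively from noisy residuals and a failure is declared once the posterior crosses a threshold. The residual statistics, failure prior, and threshold are illustrative assumptions, not the rule designed in the paper.

    # Suboptimal sequential decision rule: declare a failure when the posterior
    # probability of the failed mode exceeds a threshold. All numbers are assumed.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    mu0, mu1, sigma = 0.0, 1.0, 0.5        # residual mean under normal / failed operation
    p_fail = 0.01                          # prior probability of failure onset per step
    threshold = 0.99                       # decision threshold on the posterior

    post = 0.0                             # posterior probability that a failure has occurred
    residuals = np.concatenate([rng.normal(mu0, sigma, 30),    # healthy segment
                                rng.normal(mu1, sigma, 30)])   # failure injected at step 30

    for k, r in enumerate(residuals):
        prior = post + (1 - post) * p_fail                     # failure may start this step
        l1, l0 = norm.pdf(r, mu1, sigma), norm.pdf(r, mu0, sigma)
        post = prior * l1 / (prior * l1 + (1 - prior) * l0)    # recursive Bayes update
        if post > threshold:
            print(f"failure declared at step {k} (posterior {post:.3f})")
            break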
A Colorimetric Process to Visualize Erythrocyte Exovesicles Aggregates
ERIC Educational Resources Information Center
Saldanha, Carlota; Santos, Nuno C.; Martins-Silva, J.
2004-01-01
A biochemistry laboratory class protocol is described that creates an opportunity for students to apply, through hands-on work, the theoretical concepts underlying the properties of biomolecules and vesicles, together with the principles of centrifugation and colorimetric methodologies. Through simple procedures the students will i) observe the segregation of the…
Performance Scripts Creation: Processes and Applications
ERIC Educational Resources Information Center
Lyons, Paul
2006-01-01
Purpose: Seeks to explain some of the dynamics of scripts creation as used in training, to offer some theoretical underpinning regarding the influence of script creation on behavior and performance, and to offer some examples of how script creation is applied in training activities. Design/methodology/approach: The paper explains in detail and…
ERIC Educational Resources Information Center
Hung, Wei-Chen; Smith, Thomas J.; Harris, Marian S.; Lockard, James
2010-01-01
This study adopted design and development research methodology (Richey & Klein, "Design and development research: Methods, strategies, and issues," 2007) to systematically investigate the process of applying instructional design principles, human-computer interaction, and software engineering to a performance support system (PSS) for behavior…
Decision making in prioritization of required operational capabilities
NASA Astrophysics Data System (ADS)
Andreeva, P.; Karev, M.; Kovacheva, Ts.
2015-10-01
The paper describes an expert heuristic approach to prioritization of required operational capabilities in the field of defense. Based on expert assessment and application of the Analytic Hierarchy Process (AHP), a methodology for their prioritization has been developed. It has been applied in practical simulation-based decision-making games.
A Grounded Theory Study of Supervision of Preservice Consultation Training
ERIC Educational Resources Information Center
Newman, Daniel S.
2012-01-01
The purpose of this study was to explore a university-based supervision process for consultants-in-training (CITs) engaged in a preservice level consultation course with applied practicum experience. The study was approached from a constructivist worldview using a grounded theory methodology. Data consisted of supervision session transcripts,…
Help Students Become Wise Energy Consumers
ERIC Educational Resources Information Center
Massiha, G. H.; Hebert, Herbert A.; Rawat, Kuldeep S.
2007-01-01
The authors of this article introduce students in their department's construction course to a variety of energy-saving practices and processes. They describe activities that could give students an opportunity to apply design methodology in the creative pursuit of a solution to an open-ended problem. An introductory lecture gives students the…
A New Lean Paradigm in Higher Education: A Case Study
ERIC Educational Resources Information Center
Doman, Mark S.
2011-01-01
Purpose: This case study aims to demonstrate that lean principles and practices utilized in industry can be successfully applied to improve higher education administrative processes through an innovative and engaging learning experience involving undergraduate students. Design/methodology/approach: This is a first-hand account by the instructor of…
Learning to Mock--Challenging the Teaching Establishment
ERIC Educational Resources Information Center
Gabriel, Norman
2016-01-01
There have been very few studies that apply the work of Mikhail Bakhtin and Norbert Elias to understand the underlying learning processes of young children. This article will explore the methodological similarities between Bakhtin's ideas about the carnivalesque and Norbert Elias's theory of established-outsider relations to explain how young…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-29
...: Data collection to understand how NIH programs apply methodologies to improve their research programs... research programs apply methodologies to improve their organizational effectiveness. The degree of an...; 30-Day Comment Request; Data Collection To Understand How NIH Programs Apply Methodologies To Improve...
Determination of orthotropic material properties by modal analysis
NASA Astrophysics Data System (ADS)
Lai, Junpeng
A methodology for determining orthotropic material properties under plane stress conditions is presented. It is applied to orthotropic laminated plates such as printed wiring boards. The first part of the thesis focuses on theory and methodology: a static beam model and a vibratory plate model are presented. The methods are validated by performing a series of tests on aluminum. In the static tests, deflection and strain in two directions are measured, which identifies four of the properties: Ex, Ey, nu_xy, and nu_yx. In the dynamic tests, the resonance frequencies of the first ten modes are obtained using modal analysis; the measured data are processed by FFT and analyzed by curve fitting to extract natural frequencies and mode shapes. To determine the last material property, a finite element method using ANSYS is applied. Starting from the material properties identified in the static tests and a proper initial guess of the unknown shear modulus, an iterative process creates a finite element model and conducts modal analysis with the updated model. When the modal analysis results produced by ANSYS match the natural frequencies acquired in the dynamic test, the process halts, yielding the last material property under plane stress conditions.
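The iterative updating loop can be sketched as a simple root-finding problem on the unknown shear modulus, as below. The function predicted_frequency() is a hypothetical stand-in for the ANSYS modal run, and all numbers are illustrative only.

    # Sketch of the model-updating loop: adjust the in-plane shear modulus G_xy until
    # the predicted fundamental frequency matches the measured one (secant iteration).
    # predicted_frequency() is a hypothetical stand-in for the ANSYS modal analysis.
    def predicted_frequency(G_xy):
        # placeholder model: frequency grows roughly with the square root of stiffness
        return 120.0 * (G_xy / 4.0e9) ** 0.5      # Hz, purely illustrative

    f_measured = 131.0                             # Hz, from the dynamic test (assumed)
    G, G_prev = 4.0e9, 3.5e9                       # initial guesses of shear modulus (Pa)
    f_prev = predicted_frequency(G_prev)

    for it in range(50):
        f = predicted_frequency(G)
        if abs(f - f_measured) < 0.01:
            break
        # secant update on the frequency residual
        G, G_prev, f_prev = G - (f - f_measured) * (G - G_prev) / (f - f_prev), G, f

    print(f"identified G_xy ≈ {G:.3e} Pa, predicted f = {predicted_frequency(G):.2f} Hz")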
Georeferenced LiDAR 3D vine plantation map generation.
Llorens, Jordi; Gil, Emilio; Llop, Jordi; Queraltó, Meritxell
2011-01-01
The use of electronic devices for canopy characterization has recently been widely discussed. Among such devices, LiDAR sensors appear to be the most accurate and precise. Information obtained with a LiDAR sensor while driving a tractor along a crop row can be managed and transformed into canopy density maps by evaluating the frequency of LiDAR returns. This paper describes a proposed methodology to obtain a georeferenced canopy map by combining the information obtained with LiDAR with that generated by a GPS receiver installed on top of the tractor. Data regarding the velocity of LiDAR measurements and the UTM coordinates of each measured point on the canopy were obtained by applying the proposed transformation process. The process allows the generated canopy density map to be overlaid on the image of the measured area using Google Earth®, providing accurate information about the canopy distribution and/or the location of damage along the rows. This methodology was applied and tested on different vine varieties and crop stages in two important vine production areas in Spain. The results indicate that the georeferenced information obtained with LiDAR sensors appears to be an interesting tool with the potential to improve crop management processes.
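The core of the transformation step can be sketched as below: each LiDAR return, given as a range and scan angle, is shifted from the tractor's GPS track into easting/northing/height coordinates using the local heading. The GPS fixes, scan geometry, and numbers are placeholders; the processing chain in the paper is more involved.

    # Georeferencing LiDAR returns with the tractor GPS track. Assumes GPS fixes are
    # already projected to UTM and the scanner sweeps in the vertical plane
    # perpendicular to the direction of travel. All values are placeholders.
    import numpy as np

    gps = np.array([[500000.0, 4600000.0],          # UTM fixes along the row (E, N)
                    [500001.0, 4600000.8],
                    [500002.0, 4600001.6]])
    scans = [np.array([[2.1, 0.30], [2.0, 0.45]]),  # per fix: (range m, scan angle rad)
             np.array([[1.9, 0.35]]),
             np.array([[2.2, 0.40]])]

    points = []
    for k, returns in enumerate(scans):
        j = min(k + 1, len(gps) - 1)
        heading = np.arctan2(gps[j, 1] - gps[j - 1, 1], gps[j, 0] - gps[j - 1, 0])
        normal = heading + np.pi / 2                # direction pointing into the canopy row
        for rng_m, ang in returns:
            lateral = rng_m * np.cos(ang)           # horizontal offset toward the canopy
            height = rng_m * np.sin(ang)            # height above the sensor
            e = gps[k, 0] + lateral * np.cos(normal)
            n = gps[k, 1] + lateral * np.sin(normal)
            points.append((e, n, height))

    for p in points:
        print(f"E={p[0]:.2f}  N={p[1]:.2f}  h={p[2]:.2f}")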
Espiñeira, Montserrat; Vieites, Juan M
2012-12-15
The TaqMan real-time PCR has the highest potential for automation, therefore representing the currently most suitable method for screening, allowing the detection of fraudulent or unintentional mislabeling of species. This work describes the development of a real-time polymerase chain reaction (RT-PCR) system for the detection and identification of common octopus (Octopus vulgaris) and main substitute species (Eledone cirrhosa and Dosidicus gigas). This technique is notable for its combination of simplicity, speed, sensitivity and specificity in a homogeneous assay. The method can be applied to all kinds of products: fresh, frozen and processed, including those undergoing intensive processes of transformation. This methodology was validated to check how the degree of food processing affects the method and the detection of each species. Moreover, it was applied to 34 commercial samples to evaluate the labeling of products made from them. The methodology developed herein is useful for checking the fulfillment of labeling regulations for seafood products and for verifying traceability in commercial trade and for fisheries control. Copyright © 2012 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purcupile, J.C.
The purpose of this study is to apply the methodologies developed in the Energy Conservation in Coal Conversion August, 1977 Progress Report - Contract No. EY77S024196 - to an energy efficient, near-term coal conversion process design, and to develop additional, general techniques for studying energy conservation and utilization in coal conversion processes. The process selected for study was the Ralph M. Parsons Company of Pasadena, California ''Oil/Gas Complex, Conceptual Design/Economic Analysis'' as described in R and D Report No. 114 - Interim Report No. 4, published March, 1977, ERDA Contract No. E(49-18)-1975. Thirteen papers representing possible alternative methods of energy conservation or waste heat utilization have been entered individually into EDB and ERA. (LTN)
The Design Process of Physical Security as Applied to a U.S. Border Port of Entry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, G.G.
1999-02-22
This paper details the application of a standard physical security system design process to a US Border Port of Entry (PoE) for vehicle entry/exit. The physical security design methodology is described, as well as the physical security similarities to facilities currently in place at a US Border PoE for vehicles. The description of the physical security design process includes the various elements that make up the methodology as well as the considerations that must be taken into account when dealing with system integration of those elements. The distinctions between preventing unlawful entry/exit of illegal contraband and of personnel are described. The potential to enhance the functions of drug/contraband detection in the Pre-Primary Inspection area through the application of emerging technologies is also addressed.
Numerical simulation of the SAGD process coupled with geomechanical behavior
NASA Astrophysics Data System (ADS)
Li, Pingke
Canada has vast oil sand resources. While a large portion of this resource can be recovered by surface mining techniques, the majority is located at depths requiring the application of in situ recovery technologies. Although a number of in situ recovery technologies exist, the steam assisted gravity drainage (SAGD) process has emerged as one of the most promising technologies for developing in situ oil sands resources. During SAGD operations, saturated steam is continuously injected into the oil sands reservoir, which induces pore pressure and stress variations. As a result, reservoir parameters and processes may also vary, particularly when tensile and shear failure occur. This geomechanical effect is pronounced for oil sands material because oil sands have an in situ interlocked fabric. Conventional reservoir simulation generally does not take this coupled mechanism into consideration. Therefore, this research aims to improve the reservoir simulation techniques for the SAGD process applied in the development of oil sands and heavy oil reservoirs. The analyses of decoupled reservoir geomechanical simulation results show that the geomechanical behavior in SAGD has an obvious impact on reservoir parameters, such as absolute permeability. The issues with coupled reservoir geomechanical simulations of the SAGD process were clarified, and the permeability variations due to geomechanical behavior in the SAGD process were investigated. A sequentially coupled reservoir geomechanical simulation methodology was developed based on the reservoir simulator, EXOTHERM, and the geomechanical simulator, FLAC. In addition, a representative geomechanical model of oil sands material was summarized in this research. Finally, this reservoir geomechanical simulation methodology was verified with the UTF Phase A SAGD project and applied to a SAGD operation with gas-over-bitumen geometry. Based on this methodology, the geomechanical effect on SAGD production performance can be quantified. This research program involves analyses of laboratory testing results obtained from the literature; no new laboratory testing was conducted in the course of this research.
Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H
2018-07-01
Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.
Optimisation of wire-cut EDM process parameter by Grey-based response surface methodology
NASA Astrophysics Data System (ADS)
Kumar, Amit; Soota, Tarun; Kumar, Jitendra
2018-03-01
Wire electric discharge machining (WEDM) is one of the advanced machining processes. Response surface methodology coupled with Grey relational analysis has been proposed and used to optimise the machining parameters of WEDM. A face-centred cubic design is used for conducting experiments on high speed steel (HSS) M2 grade workpiece material. The regression model of significant factors such as pulse-on time, pulse-off time, peak current, and wire feed is considered for optimising the response variables: material removal rate (MRR), surface roughness, and kerf width. The optimal machining parameter settings were obtained using the Grey relational grade. ANOVA is applied to determine the significance of the input parameters in optimising the Grey relational grade.
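The Grey relational step can be illustrated with a short calculation: normalize each response according to whether it is larger-the-better or smaller-the-better, compute Grey relational coefficients, and average them into a grade per run. The response values and the equal weighting below are placeholders, not the paper's data.

    # Turning three responses into a single Grey relational grade per experimental run.
    # Placeholder data: MRR is larger-the-better; Ra and kerf width are smaller-the-better.
    import numpy as np

    # rows = experimental runs, columns = (MRR, Ra, kerf width)
    resp = np.array([[12.0, 2.1, 0.32],
                     [15.5, 2.6, 0.35],
                     [10.2, 1.8, 0.30],
                     [14.1, 2.3, 0.33]])

    norm = np.empty_like(resp)
    norm[:, 0] = (resp[:, 0] - resp[:, 0].min()) / np.ptp(resp[:, 0])       # larger-the-better
    for c in (1, 2):                                                        # smaller-the-better
        norm[:, c] = (resp[:, c].max() - resp[:, c]) / np.ptp(resp[:, c])

    delta = 1.0 - norm                     # deviation from the ideal normalized response
    zeta = 0.5                             # distinguishing coefficient
    grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())  # Grey relational coefficients
    grade = grc.mean(axis=1)               # equal weights across the three responses
    print("Grey relational grades:", np.round(grade, 3))
    print("best run:", int(grade.argmax()) + 1)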
NASA Technical Reports Server (NTRS)
Waligora, Sharon; Bailey, John; Stark, Mike
1995-01-01
The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are (1) to understand the software development process in the GSFC environment; (2) to measure the effects of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.
A methodology for Manufacturing Execution Systems (MES) implementation
NASA Astrophysics Data System (ADS)
Govindaraju, Rajesri; Putra, Krisna
2016-02-01
A manufacturing execution system (MES) is an information system (IS) application that bridges the gap between IS at the top level, namely enterprise resource planning (ERP), and IS at the lower levels, namely the automation systems. MES provides a medium for optimizing the manufacturing process as a whole on a real-time basis. Through the use of MES in combination with the implementation of ERP and other automation systems, a manufacturing company is expected to achieve high competitiveness. In implementing MES, functional integration - making all the components of the manufacturing system work well together - is the most difficult challenge. For this, there is an industry standard that specifies the sub-systems of a manufacturing execution system and defines the boundaries between ERP systems, MES, and other automation systems. The standard is known as ISA-95. Although the advantages of using MES have been stated in some studies, not much research has been done on how to implement MES effectively. The purpose of this study is to develop a methodology describing how an MES implementation project should be managed, utilising the support of the ISA-95 reference model in the system development process. A proposed methodology was developed based on a general IS development methodology. The developed methodology was then revisited based on an understanding of the specific characteristics of MES implementation projects found in an implementation case at an Indonesian steel manufacturing company. The case study highlighted the importance of applying an effective requirement elicitation method during the initial system assessment process, managing system interfaces and labor division in the design process, and performing a pilot deployment before putting the whole system into operation.
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul Ma; Scharf, Thomas; Quinlan, Leo R; ÓLaighin, Gearóid
2017-03-16
Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. We report a successful implementation of the methodology for the design and development of a system for detecting and predicting falls in older adults. We describe in detail what testing and evaluation activities we carried out to effectively test the system and overcome usability and human factors problems. We feel this methodology can be applied to a wide variety of connected health devices and systems. We consider this a methodology that can be scaled to different-sized projects accordingly. ©Richard Harte, Liam Glynn, Alejandro Rodríguez-Molinero, Paul MA Baker, Thomas Scharf, Leo R Quinlan, Gearóid ÓLaighin. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 16.03.2017.
Cima, Robert R; Brown, Michael J; Hebl, James R; Moore, Robin; Rogers, James C; Kollengode, Anantha; Amstutz, Gwendolyn J; Weisbrod, Cheryl A; Narr, Bradly J; Deschamps, Claude
2011-07-01
Operating rooms (ORs) are resource-intense and costly hospital units. Maximizing OR efficiency is essential to maintaining an economically viable institution. OR efficiency projects often focus on a limited number of ORs or cases. Efforts across an entire OR suite have not been reported. Lean and Six Sigma methodologies were developed in the manufacturing industry to increase efficiency by eliminating non-value-added steps. We applied Lean and Six Sigma methodologies across an entire surgical suite to improve efficiency. A multidisciplinary surgical process improvement team constructed a value stream map of the entire surgical process from the decision for surgery to discharge. Each process step was analyzed in 3 domains, ie, personnel, information processed, and time. Multidisciplinary teams addressed 5 work streams to increase value at each step: minimizing volume variation; streamlining the preoperative process; reducing nonoperative time; eliminating redundant information; and promoting employee engagement. Process improvements were implemented sequentially in surgical specialties. Key performance metrics were collected before and after implementation. Across 3 surgical specialties, process redesign resulted in substantial improvements in on-time starts and reduction in number of cases past 5 pm. Substantial gains were achieved in nonoperative time, staff overtime, and ORs saved. These changes resulted in substantial increases in margin/OR/day. Use of Lean and Six Sigma methodologies increased OR efficiency and financial performance across an entire operating suite. Process mapping, leadership support, staff engagement, and sharing performance metrics are keys to enhancing OR efficiency. The performance gains were substantial, sustainable, positive financially, and transferrable to other specialties. Copyright © 2011 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Soller, Jeffrey A; Eftim, Sorina E; Nappier, Sharon P
2018-01-01
Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension to our published microbial risk assessment methodology to estimate infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein, we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. Sensitivity analyses illustrated changes in cumulative annual risks estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach, capturing the full range of uncertainty, increased risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log credit allocation comparisons will be useful to regulators considering DPR projects and design engineers as they consider which unit treatment processes should be employed for particular projects. Published by Elsevier Ltd.
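The risk calculation underlying this kind of assessment can be sketched as a Monte Carlo loop over daily exposures, as below. The pathogen density distribution, dose-response parameter, log-reduction credit, and ingestion volume are illustrative assumptions rather than the study's inputs.

    # Monte Carlo sketch of an annual infection-risk estimate for a treated water supply.
    # All distributions and parameters below are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(42)
    n_sim, n_days = 10_000, 365
    log_reduction = 12.0                   # assumed treatment-train credit for the reference virus
    volume_l = 2.0                         # drinking water ingested per day (L)
    k = 0.0465                             # exponential dose-response parameter (assumed)

    # raw wastewater virus density, lognormal in genome copies per liter (assumed)
    raw = rng.lognormal(mean=np.log(1e5), sigma=1.0, size=(n_sim, n_days))
    dose = raw * 10.0 ** (-log_reduction) * volume_l
    p_daily = 1.0 - np.exp(-k * dose)                      # exponential dose-response model
    p_annual = 1.0 - np.prod(1.0 - p_daily, axis=1)        # at least one infection per year

    print(f"median annual risk: {np.median(p_annual):.2e}")
    print(f"95th percentile:    {np.percentile(p_annual, 95):.2e}")
    print(f"fraction of simulated years above a 1e-4 benchmark: {(p_annual > 1e-4).mean():.3f}")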
Determining Training Device Requirements in Army Aviation Systems
NASA Technical Reports Server (NTRS)
Poumade, M. L.
1984-01-01
A decision making methodology which applies the systems approach to the training problem is discussed. Training is viewed as a total system instead of a collection of individual devices and unrelated techniques. The core of the methodology is the use of optimization techniques such as the transportation algorithm and multiobjective goal programming with training task and training device specific data. The role of computers, especially automated data bases and computer simulation models, in the development of training programs is also discussed. The approach can provide significant training enhancement and cost savings over the more traditional, intuitive form of training development and device requirements process. While given from an aviation perspective, the methodology is equally applicable to other training development efforts.
Know how to maximize maintenance spending
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carrino, A.J.; Jones, R.B.; Platt, W.E.
Solomon has developed a methodology to determine the optimum point where availability meets maintenance spending for Powder River Basin (PRB) coal-fired units. Using a database of sufficient size and composition across various operating ranges, Solomon generated an algorithm that predicts the relationship between maintenance spending and availability. Coupling this generalized algorithm with a unit-specific market-loss curve determines the optimum spending for a facility. The article presents the results of the analysis, shows how this methodology can be applied to develop optimum operating and financial targets for specific units and markets, and outlines a process to achieve those targets. It also describes how this methodology can be used for other types of fossil-fired technologies and future enhancements to the analysis. 5 figs.
Romanelli, Asunción; Massone, Héctor E; Escalante, Alicia H
2011-09-01
This article gives an account of the implementation of a stakeholder analysis framework at La Brava Wetland Basin, Argentina, in a common-pool resource (CPR) management context. Firstly, the context in which the stakeholder framework was implemented is described. Secondly, a four-step methodology is applied: (1) stakeholder identification, (2) stakeholder differentiation-categorization, (3) investigation of stakeholders' relationships, and (4) analysis of social-biophysical interdependencies. This methodology classifies stakeholders according to their level of influence on the system and their potential in the conservation of natural resources. The main influential stakeholders are La Brava Village residents and tourism-related entrepreneurs, who are empowered to make the most important decisions within the planning process of the ecosystem. While these key players are seen as facilitators of change, there are other groups (residents of the inner basin and fishermen) which are seen mainly as key blockers. The stakeholder analysis methodology and the evaluation of social-biophysical interdependencies carried out in this article can serve as an encouraging example for other experts in the natural sciences to learn and use these methods developed in the social sciences. Major difficulties in applying this method in practice by non-experts, together with some recommendations, are discussed.
Mancosu, Pietro; Nicolini, Giorgia; Goretti, Giulia; De Rose, Fiorenza; Franceschini, Davide; Ferrari, Chiara; Reggiori, Giacomo; Tomatis, Stefano; Scorsetti, Marta
2018-05-01
Lean Six Sigma Methodology (LSSM) was introduced in industry to provide near-perfect services for large processes by reducing the occurrence of defects. LSSM has been applied to redesign the 2D-2D breast repositioning process (Lean) through retrospective analysis of the database (Six Sigma). Breast patients with daily 2D-2D matching before RT were considered. The five DMAIC (define, measure, analyze, improve, and control) LSSM steps were applied. The process was retrospectively measured over 30 months (7/2014-12/2016) by querying the RT Record&Verify database. Two Lean instruments (Poka-Yoke and Visual Management) were used to advance the process. The new procedure was checked over 6 months (1-6/2017). 14,931 consecutive shifts from 1342 patients were analyzed. Only 0.8% of patients presented median shifts >1 cm. The major observed discrepancy was the monthly percentage of fractions with almost zero shifts (AZS = 13.2% ± 6.1%). An Ishikawa fishbone diagram helped in defining the main contributing causes of the discrepancy. A procedure harmonization involving a multidisciplinary team was defined to increase confidence in the matching procedure. AZS was reduced to 4.8% ± 0.6%. Furthermore, improved distribution symmetry (skewness moved from 1.4 to 1.1) and outlier reduction, verified by a decrease in kurtosis, demonstrated a better "normalization" of the procedure after the LSSM application. LSSM was implemented in an RT department, allowing the breast repositioning matching procedure to be redesigned. Copyright © 2018 Elsevier B.V. All rights reserved.
Szaciłowski, Konrad
2007-01-01
Analogies between photoactive nitric oxide generators and various electronic devices: logic gates and operational amplifiers are presented. These analogies have important biological consequences: application of control parameters allows for better targeting and control of nitric oxide drugs. The same methodology may be applied in the future for other therapeutic strategies and at the same time helps to understand natural regulatory and signaling processes in biological systems.
ERIC Educational Resources Information Center
Seay, Jeffrey R.; Eden, Mario R.
2008-01-01
This paper introduces, via case study example, the benefit of including risk assessment methodology and inherently safer design practices into the curriculum for chemical engineering students. This work illustrates how these tools can be applied during the earliest stages of conceptual process design. The impacts of decisions made during…
Power Block Geometry Applied to the Building of Power Electronics Converters
ERIC Educational Resources Information Center
dos Santos, E. C., Jr.; da Silva, E. R. C.
2013-01-01
This paper proposes a new methodology, Power Block Geometry (PBG), for the presentation of power electronics topologies that process ac voltage. PBG's strategy uses formal methods based on a geometrical representation with particular rules and defines a universe with axioms and conjectures to establish a formation law. It allows power…
ERIC Educational Resources Information Center
Bloomquist, Carroll R.
The TRANSCOM (Transportation Command) Regulating Command and Control Evacuation System (TRAC2ES), which applies state-of-the-art technology to manage global medical regulating (matching patients to clinical availability) and medical evacuation processes, will be installed at all Department of Defense medical locations globally. A combination of…
ERIC Educational Resources Information Center
Homsin, Nawattakorn; Chantarasombat, Chalard; Yeamsang, Theerawatta
2015-01-01
This research uses Mixed-Methodology applied research and development together with participatory action research. The model is appropriate for the context environment. The participants were able to complete the learning activities in participatory forms of knowledge management, using the following five-step model: 1) Knowledge Identification, 2)…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... these decisionmaking processes will be applied by FDA to help design effective communication strategies..., beliefs, and behaviors--and use risk communications; (2) more efficiently and effectively design messages... provided about the design and methodology of the pretests and the studies to effectively comment on the...
Developmental Trajectories for Children with Dyslexia and Low IQ Poor Readers
ERIC Educational Resources Information Center
Kuppen, Sarah E. A.; Goswami, Usha
2016-01-01
Reading difficulties are found in children with both high and low IQ and it is now clear that both groups exhibit difficulties in phonological processing. Here, we apply the developmental trajectories approach, a new methodology developed for studying language and cognitive impairments in developmental disorders, to both poor reader groups. The…
The Key Factors of an Active Learning Method in a Microprocessors Course
ERIC Educational Resources Information Center
Carpeno, A.; Arriaga, J.; Corredor, J.; Hernandez, J.
2011-01-01
The creation of the European Higher Education Area (EHEA) is promoting a change toward a new model of education focused on the student. It is impelling methodological innovation processes in many European universities, leading more teachers to apply methods based on active and cooperative learning in their classrooms. However, the successful…
ERIC Educational Resources Information Center
Brecka, Peter; Cervenanská, Marcela
2016-01-01
The study presents the methodology and results of research focused on the use of an interactive whiteboard as a didactic technology mediating information through multimedia worksheets applied in the educational process in pre-primary education. Its aim was to determine whether it can significantly increase the level of children's acquired…
PBL-SEE: An Authentic Assessment Model for PBL-Based Software Engineering Education
ERIC Educational Resources Information Center
dos Santos, Simone C.
2017-01-01
The problem-based learning (PBL) approach has been successfully applied to teaching software engineering thanks to its principles of group work, learning by solving real problems, and learning environments that match the market realities. However, the lack of well-defined methodologies and processes for implementing the PBL approach represents a…
Seeking an Online Social Media Radar
ERIC Educational Resources Information Center
ter Veen, James
2014-01-01
Purpose: The purpose of this paper is to explore how the application of Systems Engineering tools and techniques can be applied to rapidly process and analyze the vast amounts of data present in social media in order to yield practical knowledge for Command and Control (C2) systems. Design/methodology/approach: Based upon comparative analysis of…
Teaching and Social Change: Reflections on a Freirean Approach in a College Classroom.
ERIC Educational Resources Information Center
Solorzano, Daniel G.
1989-01-01
Reflects on the implementation of Paulo Freire's problem-posing method in an East Los Angeles College (California) course on the media portrayal of Chicanos. Examines Freire's pedagogy and its application in the classroom, and critiques the process. Describes recent work applying the Freirean methodology in college classrooms. (Author/LS)
Pendula, Models, Constructivism and Reality
ERIC Educational Resources Information Center
Nola, Robert
2004-01-01
It is argued that Galileo made an important breakthrough in the methodology of science by considering idealized models of phenomena such as free fall, swinging pendula and the like, which can conflict with experience. The idealized models are constructs largely by our reasoning processes applied to the theoretical situation at hand. On this view,…
Using the Agile Development Methodology and Applying Best Practice Project Management Processes
2014-12-01
side of this writing: Like finicky domestic helpers who announce that they ‘don’t do windows,’ I’ve often heard software developers state proudly...positioned or motivated, but rather because they were the least skilled developer (2012, 34). This result turned a team of what should be generalists
An Exploratory Study of Best Lean Sustainability Practices in Higher Education
ERIC Educational Resources Information Center
Comm, Clare L.; Mathaisel, Dennis F. X.
2005-01-01
Purpose: Because of the ever-expanding commercialization and marketing of higher education, a need now exists to apply the concepts of business process improvement to colleges and universities. Aims to explore this issue. Design/methodology/approach: An open-ended qualitative questionnaire was developed, administered to 18 public and private…
A Qualitative Metasynthesis of Consultation Process Research: What We Know and Where to Go
ERIC Educational Resources Information Center
Newman, Daniel S.; McKenney, Elizabeth L. W.; Silva, Arlene E.; Clare, Mary; Salmon, Diane; Jackson, Safiyah
2017-01-01
Qualitative metasynthesis (QM) is a research methodology that permits the meaningful integration and interpretation of qualitative research. This study applies a QM approach combined with constructivist grounded theory methods, bolstered by several features of research credibility, to examine the state of consultee-centered consultation (CCC) and…
Quality indicators for hip fracture care, a systematic review.
Voeten, S C; Krijnen, P; Voeten, D M; Hegeman, J H; Wouters, M W J M; Schipper, I B
2018-05-17
Quality indicators are used to measure quality of care and enable benchmarking. An overview of all existing hip fracture quality indicators is lacking. The primary aim was to identify quality indicators for hip fracture care reported in literature, hip fracture audits, and guidelines. The secondary aim was to compose a set of methodologically sound quality indicators for the evaluation of hip fracture care in clinical practice. A literature search according to the PRISMA guidelines and an internet search were performed to identify hip fracture quality indicators. The indicators were subdivided into process, structure, and outcome indicators. The methodological quality of the indicators was judged using the Appraisal of Indicators through Research and Evaluation (AIRE) instrument. For structure and process indicators, the construct validity was assessed. Sixteen publications, nine audits and five guidelines were included. In total, 97 unique quality indicators were found: 9 structure, 63 process, and 25 outcome indicators. Since detailed methodological information about the indicators was lacking, the AIRE instrument could not be applied. Seven indicators correlated with an outcome measure. A set of nine quality indicators was extracted from the literature, audits, and guidelines. Many quality indicators are described and used. Not all of them correlate with outcomes of care and have been assessed methodologically. As methodological evidence is lacking, we recommend the extracted set of nine indicators to be used as the starting point for further clinical research. Future research should focus on assessing the clinimetric properties of the existing quality indicators.
Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs
DOE Office of Scientific and Technical Information (OSTI.GOV)
O`Kula, K.R.; Paik, I.K.; Chung, D.Y.
1996-12-31
Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.
A methodology for extending domain coverage in SemRep.
Rosemblat, Graciela; Shin, Dongwook; Kilicoglu, Halil; Sneiderman, Charles; Rindflesch, Thomas C
2013-12-01
We describe a domain-independent methodology to extend SemRep coverage beyond the biomedical domain. SemRep, a natural language processing application originally designed for biomedical texts, uses the knowledge sources provided by the Unified Medical Language System (UMLS©). Ontological and terminological extensions to the system are needed in order to support other areas of knowledge. We extended SemRep's application by developing a semantic representation of a previously unsupported domain. This was achieved by adapting well-known ontology engineering phases and integrating them with the UMLS knowledge sources on which SemRep crucially depends. While the process to extend SemRep coverage has been successfully applied in earlier projects, this paper presents in detail the step-wise approach we followed and the mechanisms implemented. A case study in the field of medical informatics illustrates how the ontology engineering phases have been adapted for optimal integration with the UMLS. We provide qualitative and quantitative results, which indicate the validity and usefulness of our methodology. Published by Elsevier Inc.
An eco-balance of a recycling plant for spent lead-acid batteries.
Salomone, Roberta; Mondello, Fabio; Lanuzza, Francesco; Micali, Giuseppe
2005-02-01
This study applies Life Cycle Assessment (LCA) methodology to present an eco-balance of a recycling plant that treats spent lead-acid batteries. The recycling plant uses pyrometallurgical treatment to obtain lead from spent batteries. The application of LCA methodology (ISO 14040 series) enabled us to assess the potential environmental impacts arising from the recycling plant's operations. Thus, net emissions of greenhouse gases as well as other major environmental consequences were examined and hot spots inside the recycling plant were identified. A sensitivity analysis was also performed on certain variables to evaluate their effect on the LCA study. The LCA of a recycling plant for spent lead-acid batteries presented shows that this methodology allows all of the major environmental consequences associated with lead recycling using the pyrometallurgical process to be examined. The study highlights areas in which environmental improvements are easily achievable by a business, providing a basis for suggestions to minimize the environmental impact of its production phases, improving process and company performance in environmental terms.
NASA Astrophysics Data System (ADS)
Sara, Balazs; Antonini, Ernesto; Tarantini, Mario
2001-02-01
The VAMP project (VAlorization of building demolition Materials and Products, LIFE 98/ENV/IT/33) aims to build an effective and innovative information system to support decision making in selective demolition activity and to manage the valorization (recovery-reuse-recycling) of the waste flows produced by the construction and demolition (C&D) sector. The VAMP information system will be tested in Italy in several case studies of selective demolition. In this paper the proposed demolition-valorization system is compared to the traditional one in a life cycle perspective, applying LCA methodology to highlight the advantages of the VAMP system from an eco-sustainability point of view. The system boundaries include the demolition processes, the transport of demolition waste, and its recovery/treatment or disposal in landfill. Processes avoided due to reuse-recycling activities, such as the extraction of natural resources and the manufacture of building materials and components, were considered as well. The data collection procedure applied in the inventory and impact assessment phases and a general overview of data availability for LCA studies in this sector are presented. Results of the application of the VAMP methodology to a case study are discussed and compared with a simulated traditional demolition of the same building. The environmental advantages of the VAMP demolition-valorization system are demonstrated quantitatively, emphasizing the special importance of reusing building components with a high energy demand for manufacture.
Jorge-Botana, Guillermo; Olmos, Ricardo; Luzón, José M
2018-01-01
The aim of this paper is to describe and explain a useful computational methodology for modeling the semantic development of word representations: word maturity. In particular, the methodology is based on the longitudinal word monitoring approach created by Kireyev and Landauer, using latent semantic analysis for the representation of lexical units. The paper is divided into two parts. First, the steps required to model the development of the meaning of words are explained in detail. We describe the technical and theoretical aspects of each step. Second, we provide a simple example of the application of this methodology with some simple tools that can be used by applied researchers. This paper can serve as a user-friendly guide for researchers interested in modeling changes in the semantic representations of words. Some current aspects of the technique and future directions are also discussed. WIREs Cogn Sci 2018, 9:e1457. doi: 10.1002/wcs.1457 This article is categorized under: Computer Science > Natural Language Processing; Linguistics > Language Acquisition; Psychology > Development and Aging. © 2017 Wiley Periodicals, Inc.
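As a rough illustration of the idea (and deliberately simpler than the Kireyev–Landauer procedure, which aligns separate LSA spaces), the sketch below builds a single LSA space, represents a word at each "stage" by the centroid of the documents containing it seen so far, and reports maturity as the cosine similarity to the full-corpus representation. The toy corpus and the shared-space shortcut are assumptions for illustration only.

    # Simplified word-maturity illustration in one shared LSA space (toy corpus).
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD

    docs = ["the cat sat on the mat", "dogs and cats are pets",
            "the stock market fell sharply", "investors watch the market",
            "cats hunt mice at night", "the market rewards patient investors"]

    vec = TfidfVectorizer()
    X = vec.fit_transform(docs)
    lsa = TruncatedSVD(n_components=3, random_state=0)
    D = lsa.fit_transform(X)                              # document vectors in the LSA space

    word = "market"
    col = vec.vocabulary_[word]
    has_word = (X[:, col] > 0).toarray().ravel()

    adult = D[has_word].mean(axis=0)                      # full-corpus ("adult") representation
    for t in range(1, len(docs) + 1):
        seen = has_word[:t]
        if not seen.any():
            continue
        partial = D[:t][seen].mean(axis=0)                # representation after t documents
        cos = float(partial @ adult /
                    (np.linalg.norm(partial) * np.linalg.norm(adult) + 1e-12))
        print(f"after {t} documents, maturity({word!r}) = {cos:.3f}")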
Alexakis, Dimitrios D.; Mexis, Filippos-Dimitrios K.; Vozinaki, Anthi-Eirini K.; Daliakopoulos, Ioannis N.; Tsanis, Ioannis K.
2017-01-01
A methodology for elaborating multi-temporal Sentinel-1 and Landsat 8 satellite images for estimating topsoil Soil Moisture Content (SMC) to support hydrological simulation studies is proposed. After pre-processing the remote sensing data, backscattering coefficient, Normalized Difference Vegetation Index (NDVI), thermal infrared temperature and incidence angle parameters are assessed for their potential to infer ground measurements of SMC, collected at the top 5 cm. A non-linear approach using Artificial Neural Networks (ANNs) is tested. The methodology is applied in Western Crete, Greece, where a SMC gauge network was deployed during 2015. The performance of the proposed algorithm is evaluated using leave-one-out cross validation and sensitivity analysis. ANNs prove to be the most efficient in SMC estimation yielding R2 values between 0.7 and 0.9. The proposed methodology is used to support a hydrological simulation with the HEC-HMS model, applied at the Keramianos basin which is ungauged for SMC. Results and model sensitivity highlight the contribution of combining Sentinel-1 SAR and Landsat 8 images for improving SMC estimates and supporting hydrological studies. PMID:28635625
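A minimal sketch of the ANN regression step, with leave-one-out validation of the kind reported, is given below. The feature and SMC arrays are synthetic placeholders, and the small network configuration is an assumption rather than the architecture used in the study.

    # ANN regression sketch: map (backscatter, NDVI, surface temperature, incidence angle)
    # to topsoil SMC with leave-one-out validation. All data below are synthetic.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(7)
    n = 40
    sigma0 = rng.uniform(-18, -6, n)          # Sentinel-1 backscatter (dB)
    ndvi = rng.uniform(0.1, 0.8, n)           # Landsat 8 NDVI
    lst = rng.uniform(285, 315, n)            # thermal infrared temperature (K)
    inc = rng.uniform(30, 45, n)              # incidence angle (deg)
    X = np.column_stack([sigma0, ndvi, lst, inc])
    smc = 5 + 1.2 * (sigma0 + 18) - 8 * ndvi - 0.1 * (lst - 285) + rng.normal(0, 1.0, n)

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
    pred = cross_val_predict(model, X, smc, cv=LeaveOneOut())
    print("leave-one-out R^2:", round(r2_score(smc, pred), 3))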
Using Sandelowski and Barroso's Meta-Synthesis Method in Advancing Qualitative Evidence.
Ludvigsen, Mette S; Hall, Elisabeth O C; Meyer, Gabriele; Fegran, Liv; Aagaard, Hanne; Uhrenfeldt, Lisbeth
2016-02-01
The purpose of this article was to iteratively account for and discuss the handling of methodological challenges in two qualitative research syntheses concerning patients' experiences of hospital transition. We applied Sandelowski and Barroso's guidelines for synthesizing qualitative research, and to our knowledge, this is the first time researchers discuss their methodological steps. In the process, we identified a need for prolonged discussions to determine mutual understandings of the methodology. We discussed how to identify the appropriate qualitative research literature and how to best conduct exhaustive literature searches on our target phenomena. Another finding concerned our status as third-order interpreters of participants' experiences and what this meant for synthesizing the primary findings. Finally, we discussed whether our studies could be classified as metasummaries or metasyntheses. Although we have some concerns regarding the applicability of the methodology, we conclude that following Sandelowski and Barroso's guidelines contributed to valid syntheses of our studies. © The Author(s) 2015.
A Methodology for Evaluating Artifacts Produced by a Formal Verification Process
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette
2011-01-01
The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.
Optimum surface roughness prediction for titanium alloy by adopting response surface methodology
NASA Astrophysics Data System (ADS)
Yang, Aimin; Han, Yang; Pan, Yuhang; Xing, Hongwei; Li, Jinze
Titanium alloy has been widely applied in industrial engineering products due to its advantages of great corrosion resistance and high specific strength. This paper investigates the processing parameters for finish turning of titanium alloy TC11. Firstly, a three-factor central composite design of experiments, considering the cutting speed, feed rate and depth of cut, is conducted on titanium alloy TC11 and the corresponding surface roughness values are obtained. Then a mathematical model is constructed by response surface methodology to fit the relationship between the process parameters and the surface roughness. The prediction accuracy is verified by one-way ANOVA. Finally, the contour lines of the surface roughness under different combinations of process parameters are obtained and used for optimum surface roughness prediction. Verification experiments demonstrate that the material removal rate (MRR) at the obtained optimum can be significantly improved without sacrificing surface roughness.
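The optimum-prediction step can be sketched as a constrained search over the fitted response surface, as below. The quadratic roughness model coefficients, the coded-to-physical mappings, and the roughness target are illustrative assumptions, not the paper's fitted model.

    # Search the coded (speed, feed, depth) cube for the setting that maximizes the
    # material removal rate while keeping predicted roughness below a target.
    # The Ra model coefficients and unit mappings are illustrative assumptions.
    import numpy as np
    from itertools import product

    def ra_model(v, f, d):
        # assumed fitted second-order model in coded units (-1..1)
        return 1.2 - 0.10 * v + 0.35 * f + 0.15 * d + 0.08 * f * d + 0.05 * f ** 2

    def mrr(v, f, d):
        # coded levels mapped to cutting speed (m/min), feed (mm/rev), depth of cut (mm)
        speed, feed, depth = 60 + 20 * v, 0.10 + 0.05 * f, 0.5 + 0.3 * d
        return 1000 * speed * feed * depth      # mm^3/min

    target_ra = 1.4
    grid = np.linspace(-1, 1, 41)
    best = max((p for p in product(grid, repeat=3) if ra_model(*p) <= target_ra),
               key=lambda p: mrr(*p))
    print("optimum coded setting (v, f, d):", tuple(round(x, 2) for x in best))
    print(f"predicted Ra = {ra_model(*best):.2f} µm, MRR = {mrr(*best):.0f} mm^3/min")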
Clinical governance and operations management methodologies.
Davies, C; Walley, P
2000-01-01
The clinical governance mechanism, introduced since 1998 in the UK National Health Service (NHS), aims to deliver high quality care with efficient, effective and cost-effective patient services. Scally and Donaldson recognised that new approaches are needed, and operations management techniques comprise potentially powerful methodologies in understanding the process of care, which can be applied both within and across professional boundaries. This paper summarises four studies in hospital Trusts which took approaches to improving process that were different from and less structured than business process re-engineering (BPR). The problems were then amenable to change at a relatively low cost and short timescale, producing significant improvement to patient care. This less structured approach to operations management avoided incurring overhead costs of large scale and costly change such as new information technology (IT) systems. The most successful changes were brought about by formal tools to control quantity, content and timing of changes.
NASA Technical Reports Server (NTRS)
Jones, Corey; LaPha, Steven
2013-01-01
This presentation will focus on the modernization of design and engineering practices through the use of Model Based Definition methodology. By gathering important engineering data into one 3D digital data set, applying model annotations, and setting up model view states directly in the 3D CAD model, model-specific information can be published to Windchill and CreoView for use during the Design Review Process. This presentation will describe the methods that have been incorporated into the modeling.
Proceedings of the Fifteenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1990-01-01
The Software Engineering Laboratory (SEL) is an organization sponsored by GSFC and created for the purpose of investigating the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are: (1) to understand the software development process in the GSFC environment; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. Fifteen papers were presented at the Fifteenth Annual Software Engineering Workshop in five sessions: (1) SEL at age fifteen; (2) process improvement; (3) measurement; (4) reuse; and (5) process assessment. The sessions were followed by two panel discussions: (1) experiences in implementing an effective measurement program; and (2) software engineering in the 1980's. A summary of the presentations and panel discussions is given.
A methodology for collecting valid software engineering data
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Weiss, David M.
1983-01-01
An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To insure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with those people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. Feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.
A hierarchical modeling methodology for the definition and selection of requirements
NASA Astrophysics Data System (ADS)
Dufresne, Stephane
This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define it. These requirements are defined early in conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow the requirements down from the stakeholder expectations to the system alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and system functions and measures of effectiveness are integrated into a hierarchical model to allow traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the epistemic uncertainty. The proposed methodology is applied to the design of a hurricane tracker Unmanned Aerial Vehicle to demonstrate the origin and impact of requirements on the concept of operations and system alternatives. This research demonstrates that the hierarchical modeling methodology provides a traceable flow-down of the requirements from the problem definition to the system alternatives phases of conceptual design.
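To make the ANP step above concrete, the following minimal sketch uses an invented five-element supermatrix (none of the labels or numbers come from the dissertation) to show how a weighted supermatrix is made column-stochastic and raised to a high power to obtain the limiting priorities that tie stakeholder expectations, functions and system alternatives together.

```python
import numpy as np

# Hypothetical weighted supermatrix linking a stakeholder expectation,
# two operational functions and two system alternatives (illustrative values only).
labels = ["expectation", "function_A", "function_B", "alt_1", "alt_2"]
W = np.array([
    [0.0, 0.2, 0.2, 0.1, 0.1],   # influence on the expectation node
    [0.5, 0.0, 0.3, 0.4, 0.2],   # influence on function A
    [0.5, 0.3, 0.0, 0.1, 0.3],   # influence on function B
    [0.0, 0.3, 0.2, 0.0, 0.4],   # influence on alternative 1
    [0.0, 0.2, 0.3, 0.4, 0.0],   # influence on alternative 2
])

# Make each column sum to 1 (column-stochastic), as ANP requires.
W = W / W.sum(axis=0, keepdims=True)

# Limit supermatrix: raising W to a large power yields the stable,
# long-run priority of every element in the network.
limit = np.linalg.matrix_power(W, 100)
priorities = limit[:, 0]  # every column converges to the same vector

for name, p in zip(labels, priorities):
    print(f"{name}: {p:.3f}")
```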
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes have been reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the participants and tools involved. It also includes a description of the possible strategies for organizing the modeling process. The proposed methodology is inspired by existing best practices in CIM, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also serve as a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
van Eycke, Yves-Rémi; Allard, Justine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine
2017-02-01
Immunohistochemistry (IHC) is a widely used technique in pathology to evidence protein expression in tissue samples. However, this staining technique is known for presenting inter-batch variations. Whole slide imaging in digital pathology offers a possibility to overcome this problem by means of image normalisation techniques. In the present paper we propose a methodology to objectively evaluate the need for image normalisation and to identify the best way to perform it. This methodology uses tissue microarray (TMA) materials and statistical analyses to evidence the possible variations occurring at colour and intensity levels as well as to evaluate the efficiency of image normalisation methods in correcting them. We applied our methodology to test different methods of image normalisation based on blind colour deconvolution that we adapted for IHC staining. These tests were carried out for different IHC experiments on different tissue types and targeting different proteins with different subcellular localisations. Our methodology enabled us to establish and to validate inter-batch normalisation transforms which correct the non-relevant IHC staining variations. The normalised image series were then processed to extract coherent quantitative features characterising the IHC staining patterns.
Van Eycke, Yves-Rémi; Allard, Justine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine
2017-01-01
Immunohistochemistry (IHC) is a widely used technique in pathology to evidence protein expression in tissue samples. However, this staining technique is known for presenting inter-batch variations. Whole slide imaging in digital pathology offers a possibility to overcome this problem by means of image normalisation techniques. In the present paper we propose a methodology to objectively evaluate the need for image normalisation and to identify the best way to perform it. This methodology uses tissue microarray (TMA) materials and statistical analyses to evidence the possible variations occurring at colour and intensity levels as well as to evaluate the efficiency of image normalisation methods in correcting them. We applied our methodology to test different methods of image normalisation based on blind colour deconvolution that we adapted for IHC staining. These tests were carried out for different IHC experiments on different tissue types and targeting different proteins with different subcellular localisations. Our methodology enabled us to establish and to validate inter-batch normalisation transforms which correct the non-relevant IHC staining variations. The normalised image series were then processed to extract coherent quantitative features characterising the IHC staining patterns. PMID:28220842
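As a rough illustration of the colour-deconvolution step underlying the normalisation methods tested above, the sketch below separates an IHC image into haematoxylin and DAB channels using scikit-image's fixed stain vectors and reports simple per-channel statistics of the kind that could be compared across staining batches. This is not the blind deconvolution the authors adapted, and the file name is a placeholder.

```python
import numpy as np
from skimage import io, img_as_float
from skimage.color import rgb2hed

# Placeholder path to an RGB TMA core or whole-slide tile.
rgb = img_as_float(io.imread("tma_core.png")[:, :, :3])

# Fixed-vector colour deconvolution into Haematoxylin, Eosin and DAB channels.
hed = rgb2hed(rgb)
haematoxylin, dab = hed[:, :, 0], hed[:, :, 2]

# Simple per-channel statistics that could feed inter-batch comparisons.
for name, channel in [("haematoxylin", haematoxylin), ("DAB", dab)]:
    print(f"{name}: mean={channel.mean():.3f}, 95th percentile={np.percentile(channel, 95):.3f}")
```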
Life Prediction/Reliability Data of Glass-Ceramic Material Determined for Radome Applications
NASA Technical Reports Server (NTRS)
Choi, Sung R.; Gyekenyesi, John P.
2002-01-01
Brittle materials such as ceramics are candidates for a variety of structural applications over a wide range of temperatures. However, the process of slow crack growth, occurring in any loading configuration, limits the service life of structural components. Therefore, it is important to accurately determine the slow crack growth parameters required for component life prediction using an appropriate test methodology. This test methodology should also be useful in determining the influence of component processing and composition variables on the slow crack growth behavior of newly developed or existing materials, thereby allowing the component processing and composition to be tailored and optimized to specific needs. Through the American Society for Testing and Materials (ASTM), the authors recently developed two test methods to determine the life prediction parameters of ceramics. The two test standards, ASTM C 1368 for room temperature and ASTM C 1465 for elevated temperatures, were published in the 2001 Annual Book of ASTM Standards, Vol. 15.01. Briefly, the test method employs constant stress-rate (or dynamic fatigue) testing to determine flexural strengths as a function of the applied stress rate. The merit of this test method lies in its simplicity: strengths are measured in a routine manner in flexure at four or more applied stress rates with an appropriate number of test specimens at each applied stress rate. The slow crack growth parameters necessary for life prediction are then determined from a simple relationship between the strength and the applied stress rate. Extensive life prediction testing was conducted at the NASA Glenn Research Center using the developed ASTM C 1368 test method to determine the life prediction parameters of a glass-ceramic material that the Navy will use for radome applications.
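The strength/stress-rate relation exploited by constant stress-rate testing is log-linear, log σf = [1/(n+1)] log σ̇ + log D, so the slow crack growth exponent n follows from the slope of a straight-line fit. The sketch below, using invented strengths at four stress rates, shows that regression step only; it is not the certified ASTM procedure.

```python
import numpy as np

# Hypothetical flexural strengths (MPa) measured at four applied stress rates (MPa/s).
stress_rates = np.array([0.03, 0.3, 3.0, 30.0])
strengths = np.array([138.0, 151.0, 166.0, 182.0])

# Dynamic fatigue relation: log(sf) = slope*log(rate) + intercept,
# with slope = 1/(n+1), so n = 1/slope - 1.
slope, intercept = np.polyfit(np.log10(stress_rates), np.log10(strengths), 1)
n = 1.0 / slope - 1.0
D = 10.0 ** intercept

print(f"slow crack growth exponent n = {n:.1f}")
print(f"dynamic fatigue parameter D = {D:.1f} MPa")
```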
[Problems of world outlook and methodology of science integration in biological studies].
Khododova, Iu D
1981-01-01
Problems of world outlook and the methodology of natural-science knowledge are considered on the basis of an analysis of tendencies in the development of the membrane theory of cell processes and the use of principles of biological membrane functioning in solving scientific and applied problems pertaining to different branches of chemistry and biology. The notion of scientific knowledge integration is defined as the interpenetration of approaches, methods and ideas from different branches of knowledge and the enrichment of their content on this basis, resulting in the augmentation of knowledge in each field taken separately. These processes are accompanied by the appearance of new branches of knowledge (sciences "at the junction") and their subsequent differentiation. The analysis of some gnoseological situations shows that the integration of sciences contributes to the coordination and partial agreement of the thinking styles of different specialists, and places demands on the scientist's personality, in particular requiring high professional mobility. Problems of the organization of scientific activity are also considered, which draw the social sciences into the integration processes. The role of philosophy in the integration processes is emphasized.
GilPavas, Edison; Dobrosz-Gómez, Izabela; Gómez-García, Miguel Ángel
2012-01-01
Response surface methodology (RSM) was applied as a tool for the optimization of the operational conditions of the photo-degradation of highly concentrated PY12 wastewater resulting from a textile plant located in the suburbs of Medellin (Colombia). A Box-Behnken experimental design (BBD) was chosen for the purpose of response optimization. The photo-Fenton process was carried out in a laboratory-scale batch photo-reactor. A multifactorial experimental design was proposed, including the following variables: the initial dyestuff concentration, the H2O2 and Fe2+ concentrations, and the UV radiation wavelength. The photo-Fenton process performed at the optimized conditions resulted in ca. 100% dyestuff decolorization, 92% COD and 82% TOC degradation. A kinetic study was carried out, including the identification of some intermediate compounds generated during the oxidation process. The biodegradability of the water, measured as the BOD5/COD ratio, reached a final value of 0.86.
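The response-surface step described above amounts to fitting a full quadratic model to the designed runs. The sketch below uses invented Box-Behnken-style data for three coded factors (dye, H2O2 and Fe2+ levels) and fits the surface with scikit-learn; the runs and responses are placeholders, not the study's data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Coded factor levels (-1, 0, +1) for three variables in a Box-Behnken-style design
# and an invented decolorization response (%). Real runs would come from the design table.
X = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
])
y = np.array([62, 71, 75, 88, 58, 70, 74, 90, 60, 79, 77, 92, 95, 94, 96])

# Full quadratic model: linear, interaction and squared terms (intercept handled by the regressor).
quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)

# Predict the response at a candidate operating point (coded units).
candidate = np.array([[0.2, 0.5, 0.8]])
print("predicted decolorization (%):", model.predict(quad.transform(candidate))[0])
print("R^2 on design points:", model.score(quad.fit_transform(X), y))
```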
Vázquez, José A; Ramos, Patrícia; Mirón, Jesús; Valcarcel, Jesus; Sotelo, Carmen G; Pérez-Martín, Ricardo I
2017-06-16
The waste generated from shrimp processing contains valuable materials such as protein, carotenoids, and chitin. The present study describes a process at pilot plant scale to recover chitin from the cephalothorax of Penaeus vannamei using mild conditions. The application of a sequential enzymatic-acid-alkaline treatment yields 30% chitin of comparable purity to commercial sources. Effluents from the process are rich in protein and astaxanthin, and represent inputs for further by-product recovery. As a last step, chitin is deacetylated to produce chitosan; the optimal conditions are established by applying a response surface methodology (RSM). Under these conditions, deacetylation reaches 92% as determined by Proton Nuclear Magnetic Resonance (¹H-NMR), and the molecular weight (Mw) of chitosan is estimated at 82 kDa by gel permeation chromatography (GPC). Chitin and chitosan microstructures are characterized by Scanning Electron Microscopy (SEM).
Improving scanner wafer alignment performance by target optimization
NASA Astrophysics Data System (ADS)
Leray, Philippe; Jehoul, Christiane; Socha, Robert; Menchtchikov, Boris; Raghunathan, Sudhar; Kent, Eric; Schoonewelle, Hielke; Tinnemans, Patrick; Tuffy, Paul; Belen, Jun; Wise, Rich
2016-03-01
In the process nodes of 10nm and below, the patterning complexity along with the processing and materials required has resulted in a need to optimize alignment targets in order to achieve the required precision, accuracy and throughput performance. Recent industry publications on the metrology target optimization process have shown a move from expensive and time-consuming empirical methodologies towards a faster computational approach. ASML's Design for Control (D4C) application, which is currently used to optimize YieldStar diffraction based overlay (DBO) metrology targets, has been extended to support the optimization of scanner wafer alignment targets. This allows the necessary process information and design methodology, used for DBO target designs, to be leveraged for the optimization of alignment targets. In this paper, we show how we applied this computational approach to wafer alignment target design. We verify the correlation between predictions and measurements for the key alignment performance metrics and finally show the potential alignment and overlay performance improvements that an optimized alignment target could achieve.
NASA Astrophysics Data System (ADS)
Leu, Jun-Der; Lee, Larry Jung-Hsing
2017-09-01
Enterprise resource planning (ERP) is a software solution that integrates the operational processes of the business functions of an enterprise. However, implementing ERP systems is a complex process. In addition to the technical issues, companies must address problems associated with business process re-engineering, time and budget control, and organisational change. Numerous industrial studies have shown that the failure rate of ERP implementation is high, even for well-designed systems. Thus, ERP projects typically require a clear methodology to support the project execution and effectiveness. In this study, we propose a theoretical model for ERP implementation. The value engineering (VE) method forms the basis of the proposed framework, which integrates Six Sigma tools. The proposed framework encompasses five phases: knowledge generation, analysis, creation, development and execution. In the VE method, potential ERP problems related to software, hardware, consultation and organisation are analysed in a group-decision manner and in relation to value, and Six Sigma tools are applied to avoid any project defects. We validate the feasibility of the proposed model by applying it to an international manufacturing enterprise in Taiwan. The results show improvements in customer response time and operational efficiency in terms of work-in-process and turnover of materials. Based on the evidence from the case study, the theoretical framework is discussed together with the study's limitations and suggestions for future research.
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G
2016-09-01
The Cognitive Work Analysis Design Toolkit (CWA-DT) is a recently developed approach that provides guidance and tools to assist in applying the outputs of CWA to design processes so as to incorporate the values and principles of sociotechnical systems theory. In this paper, the CWA-DT is evaluated based on an application to improve safety at rail level crossings (RLXs). The evaluation considered the extent to which the CWA-DT met pre-defined methodological criteria and aligned with sociotechnical values and principles. Both process and outcome measures were taken based on the ratings of workshop participants and human factors experts. Overall, workshop participants were positive about the process and indicated that it met the methodological criteria and sociotechnical values. However, expert ratings suggested that the CWA-DT achieved only limited success in producing RLX designs that fully aligned with the sociotechnical approach. Discussion about the appropriateness of the sociotechnical approach in a public safety context is provided. Practitioner Summary: Human factors and ergonomics practitioners need evidence of the effectiveness of methods. A design toolkit for cognitive work analysis, incorporating values and principles from sociotechnical systems theory, was applied to create innovative designs for rail level crossings. Evaluation results based on the application are provided and discussed.
Applying total quality management concepts to public health organizations.
Kaluzny, A D; McLaughlin, C P; Simpson, K
1992-01-01
Total quality management (TQM) is a participative, systematic approach to planning and implementing a continuous organizational improvement process. Its approach is focused on satisfying customers' expectations, identifying problems, building commitment, and promoting open decision-making among workers. TQM applies analytical tools, such as flow and statistical charts and check sheets, to gather data about activities within an organization. TQM uses process techniques, such as nominal groups, brainstorming, and consensus forming to facilitate communication and decision making. TQM applications in the public sector and particularly in public health agencies have been limited. The process of integrating TQM into public health agencies complements and enhances the Model Standards Program and assessment methodologies, such as the Assessment Protocol for Excellence in Public Health (APEX-PH), which are mechanisms for establishing strategic directions for public health. The authors examine the potential for using TQM as a method to achieve and exceed standards quickly and efficiently. They discuss the relationship of performance standards and assessment methodologies with TQM and provide guidelines for achieving the full potential of TQM in public health organizations. The guidelines include redefining the role of management, defining a common corporate culture, refining the role of citizen oversight functions, and setting realistic estimates of the time needed to complete a task or project. PMID:1594734
NASA Astrophysics Data System (ADS)
Malmi, Lauri; Adawi, Tom; Curmi, Ronald; de Graaff, Erik; Duffy, Gavin; Kautz, Christian; Kinnunen, Päivi; Williams, Bill
2018-03-01
We investigated research processes applied in recent publications in the European Journal of Engineering Education (EJEE), exploring how papers link to theoretical work and how research processes have been designed and reported. We analysed all 155 papers published in EJEE in 2009, 2010 and 2013, classifying the papers using a taxonomy of research processes in engineering education research (EER) (Malmi et al. 2012). The majority of the papers presented either empirical work (59%) or were case reports (27%). Our main findings are as follows: (1) EJEE papers build moderately on a wide selection of theoretical work; (2) a great majority of papers have a clear research strategy, but data analysis methods are mostly simple descriptive statistics or simple/undocumented qualitative research methods; and (3) there are significant shortcomings in reporting research questions, methodology and limitations of studies. Our findings are consistent with and extend analyses of EER papers in other publishing venues; they help to build a clearer picture of the research currently published in EJEE and allow us to make recommendations for consideration by the editorial team of the journal. Our employed procedure also provides a framework that can be applied to monitor future global evolution of this and other EER journals.
de Paiva, Anderson Paulo
2018-01-01
This research evaluates the influence of the Brazilian accreditation methodology on the sustainability of organizations. Critical factors for implementing accreditation were also examined, including measurement of the relationships established between these factors and organizational sustainability. The study was developed using a survey methodology applied to the organizations accredited by ONA (National Accreditation Organization); 288 responses were received from top-level managers. The quantitative data for the measurement models were analysed using principal component factor analysis. The final model was evaluated using confirmatory factor analysis and structural equation modeling techniques. The results of the research are vital for defining the factors that affect accreditation processes, providing a better understanding for accredited organizations and for Brazilian accreditation. PMID:29599939
Lugão, Suzana S M; Ricart, Simone L S I; Pinheiro, Renata M S; Gonçalves, Waldney M
2012-01-01
This article presents the description and discussion of a pilot project in an ergonomic action developed in a public health institution. The project involves the implementation of an Ergonomics Program (PROERGO) in a department of this institution, guided by a methodology structured in six stages and referenced in the literature by ergonomics authors. The methodology includes the training of workers and the formation of facilitators and multipliers of the ergonomics actions, aiming at the implementation of a cyclical process of actions and the consolidation of an ergonomics culture in the organization. Starting from the results of this experiment, we intend to replicate this program model in other departments of the institution and to propose the applied methodology as an intervention strategy for the Occupational Health area.
Using TELOS for the planning of the information system audit
NASA Astrophysics Data System (ADS)
Drljaca, D. P.; Latinovic, B.
2018-01-01
The intent of this paper is to analyse different aspects of information system audit and to synthesise them into a feasibility study report in order to facilitate decision making and planning of the information system audit process. The TELOS methodology provides a comprehensive and holistic framework for conducting a feasibility study in general. This paper examines the use of TELOS in the identification of possible factors that may influence the decision to implement an information system audit. The research question relates to whether TELOS provides sufficient information to decision makers to plan an information system audit. It was found that the TELOS methodology can be successfully applied in the process of approving and planning an information system audit. The five aspects of the feasibility study, if performed objectively, can provide sufficient information to decision makers to commission an information system audit, and can also contribute to better planning of the audit. Using the TELOS methodology can ensure an evidence-based and cost-effective decision-making process and facilitate planning of the audit. The paper proposes an original approach, not examined until now: TELOS is commonly used for various purposes when a feasibility study is needed, but not in the planning of an information system audit. This gives the paper its originality and opens further research questions about the evaluation of the feasibility study and possible research on comparative and complementary methodologies.
Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean
2017-03-01
In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised, as it is in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting tools inspired by systems thinking and complexity theories, together with methodological lessons learned from their application. These tools were used in a case study; detailed results of that study are being prepared for publication in additional articles. Applying a complexity 'lens', the subject of the case study is the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implications for the subject under study, and the existing tools associated with those theories which inspired the design of the data collection and analysis process. The tools and their application processes are presented in the Results section, followed in the Discussion section by a critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, with each tool bringing a different and complementary perspective on the system.
NASA Astrophysics Data System (ADS)
Määttä, A.; Laine, M.; Tamminen, J.; Veefkind, J. P.
2014-05-01
Satellite instruments are nowadays successfully utilised for measuring atmospheric aerosol in many applications as well as in research. Therefore, there is a growing need for rigorous error characterisation of the measurements. Here, we introduce a methodology for quantifying the uncertainty in the retrieval of aerosol optical thickness (AOT). In particular, we concentrate on two aspects: uncertainty due to aerosol microphysical model selection and uncertainty due to imperfect forward modelling. We apply the introduced methodology for aerosol optical thickness retrieval of the Ozone Monitoring Instrument (OMI) on board NASA's Earth Observing System (EOS) Aura satellite, launched in 2004. We apply statistical methodologies that improve the uncertainty estimates of the aerosol optical thickness retrieval by propagating aerosol microphysical model selection and forward model error more realistically. For the microphysical model selection problem, we utilise Bayesian model selection and model averaging methods. Gaussian processes are utilised to characterise the smooth systematic discrepancies between the measured and modelled reflectances (i.e. residuals). The spectral correlation is composed empirically by exploring a set of residuals. The operational OMI multi-wavelength aerosol retrieval algorithm OMAERO is used for cloud-free, over-land pixels of the OMI instrument with the additional Bayesian model selection and model discrepancy techniques introduced here. The method and improved uncertainty characterisation is demonstrated by several examples with different aerosol properties: weakly absorbing aerosols, forest fires over Greece and Russia, and Sahara desert dust. The statistical methodology presented is general; it is not restricted to this particular satellite retrieval application.
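A rough sketch of the two ingredients described above, with synthetic numbers: posterior weights for a few candidate aerosol models are derived from their (assumed Gaussian) fit residuals and used to average the retrieved AOT, and a Gaussian process is fitted to one model's spectral residuals to represent the smooth forward-model discrepancy. None of the values come from OMAERO; they are placeholders.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# --- Bayesian model averaging over candidate aerosol models (synthetic) ---
ssr = np.array([2.1e-4, 1.6e-4, 4.0e-4])   # sum of squared reflectance residuals, models 1..3
aot = np.array([0.42, 0.47, 0.38])          # AOT retrieved by each model
sigma = 1.5e-2                              # assumed reflectance noise (1-sigma)

log_like = -0.5 * ssr / sigma**2            # Gaussian log-likelihoods, equal priors
weights = np.exp(log_like - log_like.max())
weights /= weights.sum()
aot_bma = np.sum(weights * aot)
aot_spread = np.sqrt(np.sum(weights * (aot - aot_bma) ** 2))
print("model weights:", np.round(weights, 3))
print(f"model-averaged AOT = {aot_bma:.3f} +/- {aot_spread:.3f} (model spread)")

# --- Gaussian-process model of the smooth spectral discrepancy (synthetic) ---
wavelengths = np.linspace(340.0, 500.0, 12)[:, None]            # nm
residuals = 0.01 * np.sin(wavelengths[:, 0] / 40.0) + rng.normal(0, 2e-3, 12)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=50.0) + WhiteKernel(1e-6))
gp.fit(wavelengths, residuals)
mean, std = gp.predict(wavelengths, return_std=True)
print(f"GP discrepancy at {wavelengths[5, 0]:.0f} nm: {mean[5]:.4f} +/- {std[5]:.4f}")
```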
Developing an ethical code for engineers: the discursive approach.
Lozano, J Félix
2006-04-01
From the Hippocratic Oath on, deontological codes and other professional self-regulation mechanisms have been used to legitimize and identify professional groups. New technological challenges and, above all, changes in the socioeconomic environment require adaptable codes which can respond to new demands. We assume that ethical codes for professionals should not simply focus on regulative functions, but must also consider ideological and educative functions. Any adaptations should take into account both contents (values, norms and recommendations) and the drafting process itself. In this article we propose a process for developing a professional ethical code for an official professional association, the Colegio Oficial de Ingenieros Industriales de Valencia (COIIV), starting from the philosophical assumptions of discursive ethics but adapting them to critical hermeneutics. Our proposal is based on the Integrity Approach rather than the Compliance Approach. A process aiming to achieve an effective ethical document that fulfils regulative and ideological functions requires a participative, dialogical and reflexive methodology. This process must respond to moral exigencies and to demands for efficiency and professional effectiveness. In addition to the methodological proposal we present our experience of producing an ethical code for the industrial engineers' association in Valencia (Spain) where this methodology was applied, and we evaluate the problems detected and the future potential.
[Method for the quality assessment of data collection processes in epidemiological studies].
Schöne, G; Damerow, S; Hölling, H; Houben, R; Gabrys, L
2017-10-01
For the quantitative evaluation of primary data collection processes in epidemiological surveys by means of accompanied observations in the field, no test criteria or methodologies are described in the relevant literature, and thus no practical applications are known. Therefore, methods need to be developed and existing procedures adapted. The aim was to identify quality-relevant developments within quality dimensions by means of inspection points (quality indicators) during the data collection process, and thereby to implement and establish a methodology for the assessment of overall survey quality that complements standardized data analyses. Monitors detect deviations from the standard primary data collection procedure during site visits by applying standardized checklists. Quantitative results, overall and for each dimension, are obtained by numerical calculation of quality indicators. Score results are categorized and color coded; this visual prioritization indicates where intervention is necessary. The results obtained give clues regarding the current quality of data collection and allow the identification of those sections where interventions for quality improvement are needed. In addition, the development of process quality can be shown over time on an intercomparable basis. This methodology for the evaluation of data collection quality can identify deviations from norms, focus quality analyses and help trace the causes of significant deviations.
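The scoring step described above can be sketched as a simple aggregation of checklist items into dimension scores that are then color coded against thresholds; the dimensions, items and thresholds below are invented placeholders, not those of the study.

```python
# Hypothetical checklist results per quality dimension: each item is 1 (met) or 0 (deviation).
checklist = {
    "standardisation of measurements": [1, 1, 0, 1, 1],
    "interview conduct": [1, 0, 0, 1],
    "documentation": [1, 1, 1, 1],
}

def traffic_light(score: float) -> str:
    """Map a dimension score (0-100) to a color category indicating need for intervention."""
    if score >= 90:
        return "green"    # no action required
    if score >= 75:
        return "yellow"   # observe / minor corrective action
    return "red"          # intervention needed

dimension_scores = []
for dimension, items in checklist.items():
    score = 100.0 * sum(items) / len(items)
    dimension_scores.append(score)
    print(f"{dimension}: {score:.0f} -> {traffic_light(score)}")

overall = sum(dimension_scores) / len(dimension_scores)
print(f"overall survey quality: {overall:.0f} -> {traffic_light(overall)}")
```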
Strategy Revitalization in Academe: A Balanced Scorecard Approach
ERIC Educational Resources Information Center
McDevitt, Roselie; Giapponi, Catherine; Solomon, Norman
2008-01-01
Purpose: The purpose of this paper is to present a unique version of the balanced scorecard developed and applied by the faculty of a university division. Design/methodology/approach: The paper uses a case study approach and uses the experiences of the faculty of a business school to describe the process and benefits of developing a custom…
ERIC Educational Resources Information Center
McQueen, Robert J.; Janson, Annick
2016-01-01
Purpose: This paper aims to examine factors which influence how tacit knowledge is built and applied by client-facing consultants. Design/methodology/approach: Qualitative methods (interviews, thematic analysis) were used to gather and analyse data from 15 consultants in an agricultural extension context. Findings: Twenty-six factors about how…
Assessment--Enabling Participation in Academic Discourse and the Implications
ERIC Educational Resources Information Center
Bayaga, Anass; Wadesango, Newman
2013-01-01
The current study was an exploration of how to develop assessment resources and processes via in-depth interviews with 30 teachers. The focus was on how teachers use and apply different assessment situations. The methodology, which was a predominately qualitative approach and adopted case study design, sought to use a set of criteria based on…
A System Evaluation Theory Analyzing Value and Results Chain for Institutional Accreditation in Oman
ERIC Educational Resources Information Center
Paquibut, Rene Ymbong
2017-01-01
Purpose: This paper aims to apply the system evaluation theory (SET) to analyze the institutional quality standards of Oman Academic Accreditation Authority using the results chain and value chain tools. Design/methodology/approach: In systems thinking, the institutional standards are connected as input, process, output and feedback and leads to…
ERIC Educational Resources Information Center
Hollomotz, A.
2014-01-01
Background: Over the past two decades, disability activists and scholars have developed research paradigms that aim to place (some of the) control over the research process in the hands of disabled people. This paper discusses the appropriateness of applying such paradigms to sex offenders with intellectual disabilities (ID). It exposes to what…
The Challenges of Adopting the Learning Organisation Philosophy in a Singapore School
ERIC Educational Resources Information Center
Retna, Kala S.; Tee, Ng Pak
2006-01-01
Purpose: To report on a case study that examines how the Learning Organisation (LO) concept can be applied in a Singapore school and the challenges that the school faces in the process. Design/methodology/approach: A qualitative research inquiry was adopted using ethnographic methods. Data includes in-depth face-to-face interviews, observation of…
Negative Results: Conceptual and Methodological Dimensions in Single-Case Intervention Research
ERIC Educational Resources Information Center
Kratochwill, Thomas R.; Levin, Joel R.; Horner, Robert H.
2018-01-01
The central roles of science in the field of remedial and special education are to (a) identify basic laws of nature and (b) apply those laws in the design of practices that achieve socially valued outcomes. The scientific process is designed to allow demonstration of specific (typically positive) outcomes, and to assist in the attribution of…
ERIC Educational Resources Information Center
Wilczek-Vera, Grazyna; Salin, Eric Dunbar
2011-01-01
An experiment on fluorescence spectroscopy suitable for an advanced analytical laboratory is presented. Its conceptual development used a combination of the expository and discovery styles. The "learn-as-you-go" and direct "hands-on" methodology applied ensures an active role for a student in the process of visualization and discovery of concepts.…
ERIC Educational Resources Information Center
Spaiser, Viktoria; Hedström, Peter; Ranganathan, Shyam; Jansson, Kim; Nordvik, Monica K.; Sumpter, David J. T.
2018-01-01
It is widely recognized that segregation processes are often the result of complex nonlinear dynamics. Empirical analyses of complex dynamics are however rare, because there is a lack of appropriate empirical modeling techniques that are capable of capturing complex patterns and nonlinearities. At the same time, we know that many social phenomena…
ADM1-based methodology for the characterisation of the influent sludge in anaerobic reactors.
Huete, E; de Gracia, M; Ayesa, E; Garcia-Heras, J L
2006-01-01
This paper presents a systematic methodology to characterise the influent sludge in terms of the ADM1 components from the experimental measurements traditionally used in wastewater engineering. For this purpose, a complete characterisation of the model components in their elemental mass fractions and charge has been used, making a rigorous mass balance for all the process transformations and enabling the future connection with other unit-process models. It also makes possible the application of mathematical algorithms for the optimal characterisation of several components poorly defined in the ADM1 report. Additionally, decay and disintegration have been necessarily uncoupled so that the decay proceeds directly to hydrolysis instead of producing intermediate composites. The proposed methodology has been applied to the particular experimental work of a pilot-scale CSTR treating real sewage sludge, a mixture of primary and secondary sludge. The results obtained have shown a good characterisation of the influent reflected in good model predictions. However, its limitations for an appropriate prediction of alkalinity and carbon percentages in biogas suggest the convenience of including the elemental characterisation of the process in terms of carbon in the analytical program.
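At its core, the characterisation step maps routine measurements (total COD, organic carbon, organic nitrogen) onto component concentrations through the components' COD and elemental content, which amounts to solving a small linear system. The sketch below uses typical textbook-style coefficients purely for illustration; it is not the paper's full ADM1 characterisation algorithm.

```python
import numpy as np

# Columns: hypothetical influent components (carbohydrates, proteins, lipids).
# Rows: COD, organic carbon and organic nitrogen carried per g of each component.
# Coefficients are illustrative approximations, not the paper's values.
A = np.array([
    [1.19, 1.42, 2.90],   # g COD per g component
    [0.44, 0.53, 0.76],   # g C   per g component
    [0.00, 0.16, 0.00],   # g N   per g component
])

# Measured influent aggregates: total COD, TOC and organic nitrogen (g/L), invented.
b = np.array([34.9, 11.7, 1.3])

# Least-squares solution gives component concentrations (g/L) that close the mass balance.
x, residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
for name, conc in zip(["carbohydrates", "proteins", "lipids"], x):
    print(f"{name}: {conc:.2f} g/L")
```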
Model-Based Thermal System Design Optimization for the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.
2017-01-01
Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.
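A toy version of the automated correlation step described above: the uncertain parameters of a single-node thermal model (a conductive coupling and an effective emissivity, both invented here) are tuned so that predicted steady-state temperatures match synthetic test data using a standard least-squares optimizer. This sketches the idea only; it is not the JWST model or its parameter set.

```python
import numpy as np
from scipy.optimize import least_squares

SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W/m^2/K^4
T_SINK = 40.0           # K, boundary temperature (assumed)
AREA = 1.2              # m^2, radiator area (assumed)

def predicted_temperature(params, q_load):
    """Steady-state node temperature for a given heat load (W):
    conduction to the sink plus radiation to space balance the load."""
    g_cond, emissivity = params
    def balance(t):
        return q_load - g_cond * (t - T_SINK) - emissivity * SIGMA * AREA * t**4
    lo, hi = T_SINK, 400.0          # simple bisection for the root of the heat balance
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if balance(mid) > 0 else (lo, mid)
    return 0.5 * (lo + hi)

# Synthetic "thermal balance test" data: heat loads (W) and measured temperatures (K).
loads = np.array([5.0, 10.0, 20.0, 40.0])
measured = np.array([103.0, 121.0, 143.0, 170.0])

def residuals(params):
    return np.array([predicted_temperature(params, q) for q in loads]) - measured

# Tune the uncertain parameters (initial guess and bounds are assumptions).
fit = least_squares(residuals, x0=[0.05, 0.5], bounds=([1e-4, 0.05], [1.0, 1.0]))
print("tuned conductance (W/K) and emissivity:", np.round(fit.x, 4))
print("max temperature residual (K):", np.abs(fit.fun).max().round(2))
```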
Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.
Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun
2018-01-01
Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) is used to identify and solve problems. The laboratory turnaround time for individual tests, total delay time in the sample reception area, and percentage of steps involving risks of medical errors and biological hazards in the overall process are measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.
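A small sketch of the Six Sigma bookkeeping behind such projects: converting observed defective steps into defects per million opportunities (DPMO) and an approximate sigma level using the conventional 1.5-sigma shift. The counts below are invented, not the laboratory's data.

```python
from scipy.stats import norm

def sigma_level(defects: int, units: int, opportunities_per_unit: int) -> tuple[float, float]:
    """Return (DPMO, approximate sigma level) using the conventional 1.5-sigma shift."""
    dpmo = 1e6 * defects / (units * opportunities_per_unit)
    sigma = norm.ppf(1.0 - dpmo / 1e6) + 1.5
    return dpmo, sigma

# Hypothetical counts: delayed or mishandled stat samples among those processed.
before = sigma_level(defects=300, units=10_000, opportunities_per_unit=1)
after = sigma_level(defects=30, units=10_000, opportunities_per_unit=1)

print(f"before: DPMO={before[0]:.0f}, sigma level approx {before[1]:.2f}")
print(f"after:  DPMO={after[0]:.0f}, sigma level approx {after[1]:.2f}")
```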
Model-based thermal system design optimization for the James Webb Space Telescope
NASA Astrophysics Data System (ADS)
Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.
2017-10-01
Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.
Application of Six Sigma/CAP methodology: controlling blood-product utilization and costs.
Neri, Robert A; Mason, Cindy E; Demko, Lisa A
2008-01-01
Blood-product components are a limited commodity whose cost is rising. Many patients benefit from their use, but patients who receive transfusions face an unnecessarily increased risk of developing infections; fatal, febrile, or allergic reactions; and circulatory overload. To improve patient care, safety, and resource stewardship, transfusion practices must be evaluated for appropriateness (Wilson et al. 2002). A multihospital health system undertook a rigorous study of blood-product utilization patterns and management processes to address cost-control problems in the organization. The system leveraged two process improvement tools widely implemented outside of the healthcare industry: (1) Six Sigma methodology to identify blood-utilization drivers and to standardize transfusion practice, and (2) the change acceleration process (CAP) model to drive effective change. The initiative resulted in a decreased rate of inappropriate transfusions of packed red blood cells from 16 percent to less than 5 percent, improved clinician use of a blood-component order form, establishment of internal benchmarks, enhanced laboratory-to-clinician communication, and better blood-product expense control. The project further demonstrated how out-of-industry tools and methodologies can be adopted, adapted, and systematically applied to generate positive change (Black and Revere 2006).
Case Study: Applying OpenEHR Archetypes to a Clinical Data Repository in a Chinese Hospital.
Min, Lingtong; Wang, Li; Lu, Xudong; Duan, Huilong
2015-01-01
openEHR is a flexible and scalable modeling methodology for clinical information and has been widely adopted in Europe and Australia. Owing to differences in clinical process and management, there are few research projects involving openEHR in China. To investigate the feasibility of the openEHR methodology for clinical information modelling in China, this paper carries out a case study applying openEHR archetypes to a Clinical Data Repository (CDR) in a Chinese hospital. The results show that a set of 26 archetypes covers all the concepts used in the CDR. Of these, 9 (34.6%) are reused without change, 10 are modified and/or extended, and 7 are newly defined. The reasons for modification, extension and new definition are discussed, including the granularity of archetypes, metadata-level versus data-level modelling, and the representation of relationships between archetypes.
Service Modeling Language Applied to Critical Infrastructure
NASA Astrophysics Data System (ADS)
Baldini, Gianmarco; Fovino, Igor Nai
The modeling of dependencies in complex infrastructure systems is still a very difficult task. Many methodologies have been proposed, but a number of challenges remain, including the definition of the right level of abstraction, the presence of different views on the same critical infrastructure, and how to adequately represent the temporal evolution of systems. We propose a modeling methodology where dependencies are described in terms of the service offered by the critical infrastructure and its components. The model provides a clear separation between services and the underlying organizational and technical elements, which may change in time. The model uses the Service Modeling Language proposed by the World Wide Web Consortium (W3C) to describe critical infrastructure in terms of interdependent service nodes, including constraints, behavior, information flows, relations, rules and other features. Each service node is characterized by its technological, organizational and process components. The model is then applied to a real case of an ICT system for user authentication.
A reflective lens: applying critical systems thinking and visual methods to ecohealth research.
Cleland, Deborah; Wyborn, Carina
2010-12-01
Critical systems methodology has been advocated as an effective and ethical way to engage with the uncertainty and conflicting values common to ecohealth problems. We use two contrasting case studies, coral reef management in the Philippines and national park management in Australia, to illustrate the value of critical systems approaches in exploring how people respond to environmental threats to their physical and spiritual well-being. In both cases, we used visual methods (participatory modeling and rich picturing, respectively). The critical systems methodology, with its emphasis on reflection, guided an appraisal of the research process. A discussion of these two case studies suggests that visual methods can be usefully applied within a critical systems framework to offer new insights into ecohealth issues across a diverse range of socio-political contexts. With this article, we hope to open up a conversation with other practitioners to expand the use of visual methods in integrated research.
Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks
NASA Astrophysics Data System (ADS)
Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.
2012-05-01
A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models, and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST), but it has been designed to be generic enough to allow extension to process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk, and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion, and of the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels, and the data are inherited or defaulted seamlessly from higher to lower hierarchy levels. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing, and active reports with drill-in capability.
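A stripped-down sketch of the risk calculation that drives this kind of RBI planning: the probability of failure grows as corrosion consumes the available wall thickness, risk is the product of PoF and a consequence cost, and the next inspection falls due when risk crosses a target. All rates, costs and thresholds below are invented, and the model is far simpler than the DNV/API models embodied in the tool.

```python
import math

# Assumed inputs for one tank-floor part (illustrative values only).
CORROSION_RATE_MM_YR = 0.1       # soil-side corrosion rate
CORROSION_ALLOWANCE_MM = 6.0     # remaining allowance at last inspection
COF_COST = 2.5e6                 # consequence of failure (environmental + financial), USD
RISK_TARGET = 5.0e4              # acceptable risk, USD/year

def probability_of_failure(years: float) -> float:
    """Toy PoF model: failure likelihood rises as the consumed fraction
    of the corrosion allowance approaches 1 (exponential in wall loss)."""
    consumed = min(CORROSION_RATE_MM_YR * years / CORROSION_ALLOWANCE_MM, 1.0)
    return 1.0 - math.exp(-5.0 * consumed**2)

def next_inspection_interval(step: float = 0.25, horizon: float = 30.0) -> float:
    """Earliest time (years) at which risk = PoF * CoF exceeds the target."""
    t = step
    while t <= horizon:
        if probability_of_failure(t) * COF_COST > RISK_TARGET:
            return t
        t += step
    return horizon

interval = next_inspection_interval()
print(f"risk-based inspection due in about {interval:.2f} years "
      f"(PoF then = {probability_of_failure(interval):.3%})")
```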
Evaluating building performance in healthcare facilities: an organizational perspective.
Steinke, Claudia; Webster, Lynn; Fontaine, Marie
2010-01-01
Using the environment as a strategic tool is one of the most cost-effective and enduring approaches for improving public health; however, it is one that requires multiple perspectives. The purpose of this article is to highlight an innovative methodology that has been developed for conducting comprehensive performance evaluations in public sector health facilities in Canada. The building performance evaluation methodology described in this paper is a government initiative. The project team developed a comprehensive building evaluation process for all new capital health projects that would respond to the aforementioned need for stakeholders to be more accountable and to better integrate the larger organizational strategy of facilities. The Balanced Scorecard, which is a multiparadigmatic, performance-based business framework, serves as the underlying theoretical framework for this initiative. It was applied in the development of the conceptual model entitled the Building Performance Evaluation Scorecard, which provides the following benefits: (1) It illustrates a process to link facilities more effectively to the overall mission and goals of an organization; (2) It is both a measurement and a management system that has the ability to link regional facilities to measures of success and larger business goals; (3) It provides a standardized methodology that ensures consistency in assessing building performance; and (4) It is more comprehensive than traditional building evaluations. The methodology presented in this paper is both a measurement and management system that integrates the principles of evidence-based design with the practices of pre- and post-occupancy evaluation. It promotes accountability and continues throughout the life cycle of a project. The advantage of applying this framework is that it engages health organizations in clarifying a vision and strategy for their facilities and helps translate those strategies into action and measurable performance outcomes.
Aeroelastic optimization methodology for viscous and turbulent flows
NASA Astrophysics Data System (ADS)
Barcelos Junior, Manuel Nascimento Dias
2007-12-01
In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows using high-fidelity analysis and sensitivity analysis techniques. Most research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the linearized systems of equations for the aeroelastic and sensitivity analyses. The methodologies developed in this work are tested and verified using realistic aeroelastic systems.
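The fixed-point idea can be sketched with a toy quasi-static aeroelastic problem: aerodynamic lift depends on the elastic twist, the structure twists under the resulting moment, and a nonlinear block Gauss-Seidel loop iterates the two blocks to their fixed point. The stiffness and load coefficients are invented, and the "fluid" block is an algebraic surrogate rather than a Navier-Stokes solver.

```python
# Toy quasi-static aeroelastic coupling solved by nonlinear block Gauss-Seidel.
Q_DYN = 0.8          # dynamic pressure factor (assumed)
CL_ALPHA = 5.5       # lift-curve slope per rad (assumed)
ALPHA_RIGID = 0.05   # rigid angle of attack, rad (assumed)
K_THETA = 12.0       # torsional stiffness, arbitrary units (assumed)
ECCENTRICITY = 1.5   # moment arm of the lift about the elastic axis (assumed)

def aero_load(theta: float) -> float:
    """'Fluid' block: lift grows with the elastic twist added to the rigid incidence."""
    return Q_DYN * CL_ALPHA * (ALPHA_RIGID + theta)

def structural_twist(lift: float) -> float:
    """'Structure' block: linear torsional spring deflecting under the aerodynamic moment."""
    return ECCENTRICITY * lift / K_THETA

theta, lift = 0.0, 0.0
for it in range(1, 101):
    lift_new = aero_load(theta)             # solve fluid block with current structural state
    theta_new = structural_twist(lift_new)  # solve structure block with the updated loads
    converged = abs(theta_new - theta) < 1e-10
    theta, lift = theta_new, lift_new
    if converged:
        print(f"converged in {it} iterations: twist = {theta:.6f} rad, lift = {lift:.4f}")
        break
```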
Van Haute, S; López-Gálvez, F; Gómez-López, V M; Eriksson, Markus; Devlieghere, F; Allende, Ana; Sampers, I
2015-09-02
A methodology to (i) assess the feasibility of water disinfection in fresh-cut leafy greens wash water and (ii) compare the disinfection efficiency of water disinfectants was defined and applied to a combination of peracetic acid (PAA) and lactic acid (LA), with a comparison against free chlorine. Standardized process water, a watery suspension of iceberg lettuce, was used for the experiments. First, the combination of PAA+LA was evaluated for water recycling. In this case, disinfectant was added to standardized process water inoculated with Escherichia coli (E. coli) O157 (6 log CFU/mL). Regression models were constructed based on the batch inactivation data and validated in industrial process water obtained from fresh-cut leafy green processing plants. The UV254(F) was the best indicator for PAA decay and, as such, for E. coli O157 inactivation with PAA+LA. The disinfection efficiency of PAA+LA increased with decreasing pH. Furthermore, PAA+LA efficacy was assessed as a process water disinfectant to be used within the washing tank, using a dynamic washing process with continuous influx of E. coli O157 and organic matter into the washing tank. The process water contamination in the dynamic process was adequately estimated by the developed model, which assumed that knowledge of the disinfectant residual was sufficient to estimate the microbial contamination, regardless of the physicochemical load. Based on the obtained results, PAA+LA seems better suited than chlorine for disinfecting process wash water with a high organic load, but a higher disinfectant residual is necessary due to the slower E. coli O157 inactivation kinetics compared to chlorine. Copyright © 2015 Elsevier B.V. All rights reserved.
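Batch inactivation data of this kind are commonly summarised with a log-linear (Chick-Watson type) model, log10(N/N0) = -k C t. The sketch below fits that rate constant to invented survival data at one disinfectant residual; it only illustrates the modelling step and does not reproduce the paper's validated models.

```python
import numpy as np

# Invented batch data: contact time (min) and surviving E. coli counts (log10 CFU/mL)
# at an assumed constant disinfectant residual of 2 mg/L.
residual_mg_l = 2.0
time_min = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
log_counts = np.array([6.0, 5.3, 4.7, 3.4, 2.2])

# Chick-Watson form: log10(N) = -k * (C*t) + log10(N0), so the slope of log10 N vs C*t gives -k.
ct = residual_mg_l * time_min
slope, intercept = np.polyfit(ct, log_counts, 1)
k = -slope

print(f"inactivation rate constant k = {k:.2f} per (mg/L x min)")
print(f"predicted log reduction after 5 min at 2 mg/L: {k * residual_mg_l * 5.0:.1f} log10")
```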
Risk-based process safety assessment and control measures design for offshore process facilities.
Khan, Faisal I; Sadiq, Rehan; Husain, Tahir
2002-09-02
Process operation is the most hazardous activity after transportation and drilling operations on an offshore oil and gas (OOG) platform. Past experience of onshore and offshore oil and gas activities has revealed that a small mishap in process operation can escalate into a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, limited ventilation, and difficult escape routes. On an OOG platform, each extra control measure implemented not only occupies space on the platform and increases congestion but also adds extra load to the platform. Eventualities in OOG platform process operation can be avoided by incorporating appropriate control measures at an early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressor, separators, flash drum and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. The paper also illustrates that implementation of the designed safety measures reduces the high fatal accident rate (FAR) values to an acceptable level.
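For reference, the fatal accident rate used as the risk measure above is conventionally expressed as fatalities per 10^8 exposed person-hours; a minimal helper, with invented platform numbers, might look like this.

```python
def fatal_accident_rate(expected_fatalities_per_year: float,
                        persons_exposed: int,
                        exposed_hours_per_person_year: float = 8760.0) -> float:
    """FAR: expected fatalities per 10^8 person-hours of exposure
    (8760 h/year assumes round-the-clock presence on the platform)."""
    exposed_hours = persons_exposed * exposed_hours_per_person_year
    return 1e8 * expected_fatalities_per_year / exposed_hours

# Hypothetical offshore process unit: 60 persons on board around the clock.
far_before = fatal_accident_rate(expected_fatalities_per_year=0.08, persons_exposed=60)
far_after = fatal_accident_rate(expected_fatalities_per_year=0.008, persons_exposed=60)
print(f"FAR before safety measures: {far_before:.1f}")
print(f"FAR after safety measures:  {far_after:.1f}")
```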
Shnorhavorian, Margarett; Kroon, Leah; Jeffries, Howard; Johnson, Rebecca
2012-11-01
There is limited literature on strategies to overcome the barriers to sperm banking among adolescent and young adult (AYA) males with cancer. By standardizing our process for offering sperm banking to AYA males before cancer treatment, we aimed to improve rates of sperm banking at our institution. Continuous process improvement is a technique that has recently been applied to improve health care delivery. We used continuous process improvement methodologies to create a standard process for fertility preservation for AYA males with cancer at our institution. We compared rates of sperm banking before and after standardization. In the 12-month period after implementation of a standardized process, 90% of patients were offered sperm banking. We demonstrated an 8-fold increase in the proportion of AYA males' sperm banking, and a 5-fold increase in the rate of sperm banking at our institution. Implementation of a standardized process for sperm banking for AYA males with cancer was associated with increased rates of sperm banking at our institution. This study supports the role of standardized health care in decreasing barriers to sperm banking.
Research and development of an electrochemical biocide reactor
NASA Technical Reports Server (NTRS)
See, G. G.; Bodo, C. A.; Glennon, J. P.
1975-01-01
An alternate disinfecting process to chemical agents, heat, or radiation in aqueous media has been studied. The process is called an electrochemical biocide and employs cyclic, low-level voltages at chemically inert electrodes to pass alternating current through water and, in the process, to destroy microorganisms. The paper describes experimental hardware, methodology, and results with a tracer microorganism (Escherichia coli). The results presented show the effects on microorganism kill of operating parameters, including current density (15 to 55 mA/sq cm (14 to 51 ASF)), waveform of the applied electrical signal (square, triangular, sine), frequency of the applied electrical signal (0.5 to 1.5 Hz), process water flow rate (100 to 600 cc/min (1.6 to 9.5 gph)), and reactor residence time (0 to 4 min). Comparisons are made between the disinfecting properties of the electrochemical biocide and chlorine, bromine, and iodine.
A Study in Sexual Health Applying the Principles of Community-Based Participatory Research
Reece, Michael; Dodge, Brian
2012-01-01
The principles of community-based participatory research were applied to an exploratory sexual health study that examined “cruising for sex” among men on a college campus. In the context of a study seeking a broad interpretation of the health implications of cruising, and when faced with methodological challenges, the researchers found these principles to provide invaluable guidance. A review of the research process is offered and the manner in which the principles of community-based participatory research were operationalized for this study is described. PMID:15129042
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diaz, Luis A.; Clark, Gemma G.; Lister, Tedd E.
2017-06-08
The rapid growth of electronic waste can be viewed both as an environmental threat and as an attractive source of minerals that can reduce the mining of natural resources and stabilize the market of critical materials, such as rare earths. In this article, response surface methodology was used to optimize a previously developed electrochemical recovery process for base metals from electronic waste using a mild oxidant (Fe 3+). Through this process an effective extraction of base metals can be achieved, enriching the concentration of precious metals and significantly reducing the environmental impacts and operational costs associated with waste generation and chemical consumption. The optimization was performed using a bench-scale system specifically designed for this process. Operational parameters such as flow rate, applied current density, and iron concentration were optimized to reduce the specific energy consumption of the electrochemical recovery process to 1.94 kWh per kg of metal recovered at a processing rate of 3.3 g of electronic waste per hour.
Knowledge base methodology: Methodology for first Engineering Script Language (ESL) knowledge base
NASA Technical Reports Server (NTRS)
Peeris, Kumar; Izygon, Michel E.
1992-01-01
The primary goal of reusing software components is that software can be developed faster, cheaper, and with higher quality. However, reuse is not automatic and cannot just happen; it has to be carefully engineered. For example, a component needs to be easily understandable in order to be reused, and it also has to be malleable enough to fit into different applications. In fact, the software development process is deeply affected when reuse is being applied. During component development, a serious effort has to be directed toward making these components as reusable as possible. This implies defining reuse coding style guidelines and applying them to any new component being created as well as to any old component being modified. These guidelines should point out the favorable reuse features and may apply to naming conventions, module size and cohesion, internal documentation, etc. During application development, effort is shifted from writing new code toward finding and eventually modifying existing pieces of code, then assembling them together. We see here that reuse is not free, and therefore has to be carefully managed.
Albach, Carlos Augusto; Wagland, Richard; Hunt, Katherine J
2018-04-01
This systematic review (1) identifies the current generic and cancer-related patient-reported outcome measures (PROMs) that have been cross-culturally adapted to Brazilian Portuguese and applied to cancer patients and (2) critically evaluates their cross-cultural adaptation (CCA) and measurement properties. Seven databases were searched for articles regarding the translation and evaluation of measurement properties of generic and cancer-related PROMs cross-culturally adapted to Brazilian Portuguese that are applied in adult (≥18 years old) cancer patients. The methodological quality of included studies was assessed using the COSMIN checklist. The bibliographic search retrieved 1674 hits, of which seven studies analysing eight instruments were included in this review. Data on the interpretability of scores were poorly reported. Overall, the quality of the CCA process was inconsistent throughout the studies. None of the included studies performed a cross-cultural validation. The evidence concerning the quality of measurement properties is limited by poor or fair methodological quality. Moreover, limited information regarding measurement properties was provided within the included papers. This review aids the selection process of Brazilian Portuguese PROMs for use in cancer patients. After acknowledging the methodological caveats and strengths of each tool, our opinion is that for quality of life and symptoms assessment the adapted FACT-G version and the ESAS could be recommended, respectively. Future research should rely on the already accepted standards of CCA and validation studies.
Lean methodology for performance improvement in the trauma discharge process.
O'Mara, Michael Shaymus; Ramaniuk, Aliaksandr; Graymire, Vickie; Rozzell, Monica; Martin, Stacey
2014-07-01
High-volume, complex services such as trauma and acute care surgery are at risk for inefficiency. Lean process improvement can reduce health care waste. Lean allows a structured look at processes not easily amenable to analysis. We applied lean methodology to the current state of communication and discharge planning on an urban trauma service, citing areas for improvement. A lean process mapping event was held. The process map was used to identify areas for immediate analysis and intervention, defining metrics for the stakeholders. After intervention, new performance was assessed by direct data evaluation. The process was completed with an analysis of effect, and plans were made for addressing future focus areas. The primary area of concern identified was interservice communication. Changes centering on a standardized morning report structure reduced the number of consult questions unanswered from 67% to 34% (p = 0.0021). Physical therapy rework was reduced from 35% to 19% (p = 0.016). Patients admitted to units not designated to the trauma service had 1.6 times longer stays (p < 0.0001). The lean process lasted 8 months, and three areas for new improvement were identified: (1) the management of off-unit patients; (2) the disproportionate contribution of patients with length of stay more than 15 days to overall length of stay; and (3) miscommunication around patient education at discharge. Lean process improvement is a viable means of health care analysis. When applied to a trauma service with 4,000 admissions annually, lean identifies areas ripe for improvement. Our inefficiencies surrounded communication and patient localization. Strategies arising from the input of all stakeholders led to real solutions for communication through a face-to-face morning report and identified areas for ongoing improvement. This focuses resource use and identifies areas for improvement of throughput in care delivery.
A Development Architecture for Serious Games Using BCI (Brain Computer Interface) Sensors
Sung, Yunsick; Cho, Kyungeun; Um, Kyhyun
2012-01-01
Games that use brainwaves via brain–computer interface (BCI) devices, to improve brain functions are known as BCI serious games. Due to the difficulty of developing BCI serious games, various BCI engines and authoring tools are required, and these reduce the development time and cost. However, it is desirable to reduce the amount of technical knowledge of brain functions and BCI devices needed by game developers. Moreover, a systematic BCI serious game development process is required. In this paper, we present a methodology for the development of BCI serious games. We describe an architecture, authoring tools, and development process of the proposed methodology, and apply it to a game development approach for patients with mild cognitive impairment as an example. This application demonstrates that BCI serious games can be developed on the basis of expert-verified theories. PMID:23202227
Methodology to build medical ontology from textual resources.
Baneyx, Audrey; Charlet, Jean; Jaulent, Marie-Christine
2006-01-01
In the medical field, it is now established that maintaining unambiguous thesauri requires ontologies. Our research task is to help pneumologists code acts and diagnoses with software that represents medical knowledge through a domain ontology. In this paper, we describe our general methodology, aimed at knowledge engineers, for building various types of medical ontologies based on terminology extraction from texts. The hypothesis is to apply natural language processing tools to textual patient discharge summaries to develop the resources needed to build an ontology in pneumology. Results indicate that the joint use of distributional analysis and lexico-syntactic patterns performed satisfactorily for building such ontologies.
From intuition to statistics in building subsurface structural models
Brandenburg, J.P.; Alpak, F.O.; Naruk, S.; Solum, J.
2011-01-01
Experts associated with the oil and gas exploration industry suggest that combining forward trishear models with stochastic global optimization algorithms allows a quantitative assessment of the uncertainty associated with a given structural model. The methodology is applied to incompletely imaged structures related to deepwater hydrocarbon reservoirs, and the results are compared to prior manual palinspastic restorations and borehole data. This methodology is also useful for extending structural interpretations into other areas of limited resolution, such as subsalt regions, and for extrapolating existing data into seismic data gaps. The technique can be used for rapid reservoir appraisal and potentially has other applications in seismic processing, well planning, and borehole stability analysis.
Ranasinghe, Nadeesha; Jones, Graham B
2013-03-15
Microwave, flow and combination methodologies have been applied to the synthesis of a number of substituted indoles. Based on the Hemetsberger-Knittel (HK) process, modifications allow formation of products rapidly and in high yield. Adapting the methodology allows formation of 2-unsubstituted indoles and derivatives, and a route to analogs of the antitumor agent PLX-4032 is demonstrated. The utility of the HK substrates is further demonstrated through bioconjugation and subsequent ring closure and via Huisgen type [3+2] cycloaddition chemistry, allowing formation of peptide adducts which can be subsequently labeled with fluorine tags. Copyright © 2013 Elsevier Ltd. All rights reserved.
A Mixed-Methods Research Framework for Healthcare Process Improvement.
Bastian, Nathaniel D; Munoz, David; Ventura, Marta
2016-01-01
The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.
Lahmiri, Salim; Boukadoum, Mounir
2013-01-01
A new methodology for automatic feature extraction from biomedical images and subsequent classification is presented. The approach exploits the spatial orientation of high-frequency textural features of the processed image as determined by a two-step process. First, the two-dimensional discrete wavelet transform (DWT) is applied to obtain the HH high-frequency subband image. Then, a Gabor filter bank is applied to the latter at different frequencies and spatial orientations to obtain a new Gabor-filtered image whose entropy and uniformity are computed. Finally, the obtained statistics are fed to a support vector machine (SVM) binary classifier. The approach was validated on mammograms, retina, and brain magnetic resonance (MR) images. The obtained classification accuracies show better performance in comparison to common approaches that use only the DWT or Gabor filter banks for feature extraction. PMID:27006906
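As a hedged illustration of the two-step feature pipeline described above, the following Python sketch computes the HH subband with a 2-D DWT, applies a small Gabor filter bank, derives entropy and uniformity statistics, and hands them to an SVM; the wavelet, filter-bank parameters, and library choices (PyWavelets, scikit-image, scikit-learn) are assumptions, not those of the original study.

import numpy as np
import pywt
from skimage.filters import gabor
from sklearn.svm import SVC

def entropy_uniformity(img, bins=256):
    # Histogram-based Shannon entropy and uniformity (energy) of an image.
    hist, _ = np.histogram(img, bins=bins)
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]
    return -np.sum(p * np.log2(p)), np.sum(p ** 2)

def extract_features(image):
    # Step 1: 2-D DWT; the diagonal detail coefficients form the HH subband.
    _, (_, _, hh) = pywt.dwt2(image, 'haar')   # wavelet choice is illustrative
    feats = []
    # Step 2: Gabor filter bank over a few frequencies and orientations.
    for freq in (0.1, 0.3):
        for theta in (0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
            real, _ = gabor(hh, frequency=freq, theta=theta)
            feats.extend(entropy_uniformity(real))
    return np.array(feats)

# Step 3: feed the statistics to a binary SVM classifier, e.g.
#   X = np.vstack([extract_features(im) for im in images])
#   clf = SVC(kernel='rbf').fit(X, labels)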
Seismic data compression speeds exploration projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galibert, P.Y.
As part of an ongoing commitment to ensure industry-wide distribution of its revolutionary seismic data compression technology, Chevron Petroleum Technology Co. (CPTC) has entered into licensing agreements with Compagnie Generale de Geophysique (CGG) and other seismic contractors for use of its software in oil and gas exploration programs. CPTC expects use of the technology to be far-reaching to all of its industry partners involved in seismic data collection, processing, analysis and storage. Here, CGG--one of the world's leading seismic acquisition and processing companies--talks about its success in applying the new methodology to replace full on-board seismic processing. Chevron's technology is already being applied on large off-shore 3-D seismic surveys. Worldwide, CGG has acquired more than 80,000 km of seismic data using the data compression technology.
ERIC Educational Resources Information Center
Rabikowska, Marta
2009-01-01
In this paper an ethical approach to educational methodology is discussed in relation to the philosophies of Emanuel Levinas and Robert Cox. Cox's anti-essentialist understanding of historical materialism and Levinas' metaphysical idealism are applied to an analysis of the (self)-reflective methods required today in Higher Education in the UK,…
3D Printing Processes Applied to the Creation of Glass Art
ERIC Educational Resources Information Center
Chivers, Morgan
2015-01-01
The purpose of this article is to focus on a few of the innovative techniques used in the execution of Morgan Chivers' sculptural work, not on the content of the work itself. The author's interest has been in merging the methodologies and precise output control of 3D printing with finished objects in nonprintable materials as required by the…
Towards a Periodical and Monograph Price Index. AIR Forum 1980 Paper.
ERIC Educational Resources Information Center
Belanger, Charles H.; Lavallee, Lise
The steps involved in tailoring a periodical and monograph price index to a university library are examined, as are the difficulties involved in applying a simple methodology such as a price index when the data base has not been organized to play an active role in the decision-making process. The following topics are addressed: the shifting of…
ERIC Educational Resources Information Center
Park, Sanghoon
2004-01-01
For millennia, emotional states have been viewed as avoidable impediments to rational thinking (Ellis & Newton, 2000). Several reasons have been pointed out. The lack of consensus on the definition of emotion, with definitions that tend to conflict with each other, was suggested as a main reason (Price, 1998). Another is the difficulty of research methodology, such as…
ERIC Educational Resources Information Center
Rawlings, Tomas
2010-01-01
The purpose of this article is to explore the development of new methodological approaches that draw on ideas and concepts from natural sciences and apply them within the humanities. The main research example this article looks at is the re-application of a palaeontological process; it looks through the geological layers of sediment for fossilised…
Chapelle, Frank H.; Robertson, John F.; Landmeyer, James E.; Bradley, Paul M.
2000-01-01
These two sites illustrate how the efficiency of natural attenuation processes acting on petroleum hydrocarbons can be systematically evaluated using hydrologic, geochemical, and microbiologic methods. These methods, in turn, can be used to assess the role that the natural attenuation of petroleum hydrocarbons can play in achieving overall site remediation.
EOS Operations Systems: EDOS Implemented Changes to Reduce Operations Costs
NASA Technical Reports Server (NTRS)
Cordier, Guy R.; Gomez-Rosa, Carlos; McLemore, Bruce D.
2007-01-01
The authors describe in this paper the progress achieved to date with the reengineering of the Earth Observing System (EOS) Data and Operations System (EDOS), the experience gained in the process, and the ensuing reduction of ground systems operations costs. The reengineering effort included a major methodology change: applying a data-driven system approach to an existing schedule-driven system.
General Electric composite ring-disk flywheel: Recent and potential developments
NASA Technical Reports Server (NTRS)
Coppa, A. P.
1984-01-01
Recent developments of the General Electric hybrid rotor design are described. The relation of the hybrid rotor design to flywheel designs that are especially suitable for spacecraft applications is discussed. Potential performance gains that can be achieved in such rotor designs by applying latest developments in materials, processing, and design methodology are projected. Indications are that substantial improvements can be obtained.
2008-09-01
SEP) is a comprehensive, iterative and recursive problem-solving process, applied sequentially top-down by integrated teams. It transforms needs...central integrated design repository. It includes a comprehensive behavior modeling notation to understand the dynamics of a design. CORE is a MBSE...
Estimating Soil Hydraulic Parameters using Gradient Based Approach
NASA Astrophysics Data System (ADS)
Rai, P. K.; Tripathi, S.
2017-12-01
The conventional way of estimating parameters of a differential equation is to minimize the error between the observations and their estimates. The estimates are produced from the forward solution (numerical or analytical) of the differential equation assuming a set of parameters. Parameter estimation using the conventional approach requires high computational cost, setting up of initial and boundary conditions, and formation of difference equations in case the forward solution is obtained numerically. Gaussian process based approaches like Gaussian Process Ordinary Differential Equation (GPODE) and Adaptive Gradient Matching (AGM) have been developed to estimate the parameters of ordinary differential equations without explicitly solving them. Claims have been made that these approaches can straightforwardly be extended to partial differential equations; however, this has never been demonstrated. This study extends the AGM approach to PDEs and applies it to estimating parameters of the Richards equation. Unlike the conventional approach, the AGM approach does not require setting up initial and boundary conditions explicitly, which is often difficult in real-world applications of the Richards equation. The developed methodology was applied to synthetic soil moisture data. It was seen that the proposed methodology can estimate the soil hydraulic parameters correctly and can be a potential alternative to the conventional method.
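The gradient-matching idea can be illustrated with a toy sketch: fit a Gaussian process to noisy observations, differentiate its posterior mean analytically, and choose the parameter that best matches that derivative to the right-hand side of the equation. The sketch below uses a simple decay ODE rather than the Richards equation, with hand-fixed RBF kernel hyperparameters; all names and values are illustrative.

import numpy as np
from scipy.optimize import minimize_scalar

def rbf(a, b, ell=1.0, sig=1.0):
    d = a[:, None] - b[None, :]
    return sig ** 2 * np.exp(-0.5 * (d / ell) ** 2)

def rbf_da(a, b, ell=1.0, sig=1.0):
    # Derivative of the RBF kernel with respect to its first argument.
    d = a[:, None] - b[None, :]
    return -(d / ell ** 2) * rbf(a, b, ell, sig)

rng = np.random.default_rng(0)
k_true, noise = 0.7, 0.02
t = np.linspace(0.0, 5.0, 40)
y = np.exp(-k_true * t) + noise * rng.standard_normal(t.size)   # synthetic observations

# GP posterior mean of the state and of its time derivative (hyperparameters fixed by hand).
K = rbf(t, t) + noise ** 2 * np.eye(t.size)
alpha = np.linalg.solve(K, y)
y_hat = rbf(t, t) @ alpha
dy_hat = rbf_da(t, t) @ alpha

# Gradient matching: pick k so the GP derivative matches the ODE right-hand side -k*y.
loss = lambda k: np.sum((dy_hat + k * y_hat) ** 2)
k_est = minimize_scalar(loss, bounds=(0.01, 5.0), method='bounded').x
print(f"estimated decay rate: {k_est:.3f} (true value {k_true})")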
Multi-scaling allometric analysis for urban and regional development
NASA Astrophysics Data System (ADS)
Chen, Yanguang
2017-01-01
The concept of allometric growth is based on scaling relations, and it has been applied to urban and regional analysis for a long time. However, most allometric analyses have been devoted to the single proportional relation between two elements of a geographical system. Few studies focus on the allometric scaling of multiple elements. In this paper, a process of multiscaling allometric analysis is developed for studies on the spatio-temporal evolution of complex systems. By means of linear algebra, general system theory, and by analogy with the analytic hierarchy process, the concepts of allometric growth can be integrated with ideas from fractal dimension. Thus a new methodology of geo-spatial analysis and the related theoretical models emerge. Based on least squares regression and matrix operations, a simple algorithm is proposed to solve the multiscaling allometric equation. Applying the analytical method of multielement allometry to Chinese cities and regions yields satisfactory results. We conclude that multiscaling allometric analysis can be employed to make a comprehensive evaluation of the relative levels of urban and regional development and to explain spatial heterogeneity. The notion of multiscaling allometry may enrich the current theory and methodology of spatial analyses of urban and regional evolution.
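A minimal sketch of the allometric-scaling building block is given below: pairwise scaling exponents among several urban measures are estimated by log-log least squares and collected into a matrix. The synthetic data and measure names are invented for illustration, and the paper's full matrix algorithm for the multiscaling equation is not reproduced.

import numpy as np

rng = np.random.default_rng(1)
n = 50
# Synthetic city measures (names and exponents are invented for illustration).
population = rng.lognormal(mean=12.0, sigma=1.0, size=n)
area = population ** 0.85 * rng.lognormal(0.0, 0.1, n)
output = population ** 1.15 * rng.lognormal(0.0, 0.1, n)
measures = {"population": population, "area": area, "output": output}

def allometric_exponent(x, y):
    # y ~ a * x**b  =>  log y = log a + b * log x, so b is the log-log slope.
    b, _ = np.polyfit(np.log(x), np.log(y), 1)
    return b

names = list(measures)
B = np.array([[allometric_exponent(measures[i], measures[j]) for j in names]
              for i in names])
print(names)
print(np.round(B, 3))   # matrix of pairwise scaling exponents: row i vs column j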
Using lean methodology to improve productivity in a hospital oncology pharmacy.
Sullivan, Peter; Soefje, Scott; Reinhart, David; McGeary, Catherine; Cabie, Eric D
2014-09-01
Quality improvements achieved by a hospital pharmacy through the use of lean methodology to guide i.v. compounding workflow changes are described. The outpatient oncology pharmacy of Yale-New Haven Hospital conducted a quality-improvement initiative to identify and implement workflow changes to support a major expansion of chemotherapy services. Applying concepts of lean methodology (i.e., elimination of non-value-added steps and waste in the production process), the pharmacy team performed a failure mode and effects analysis, workflow mapping, and impact analysis; staff pharmacists and pharmacy technicians identified 38 opportunities to decrease waste and increase efficiency. Three workflow processes (order verification, compounding, and delivery) accounted for 24 of 38 recommendations and were targeted for lean process improvements. The workflow was decreased to 14 steps, eliminating 6 non-value-added steps, and pharmacy staff resources and schedules were realigned with the streamlined workflow. The time required for pharmacist verification of patient-specific oncology orders was decreased by 33%; the time required for product verification was decreased by 52%. The average medication delivery time was decreased by 47%. The results of baseline and postimplementation time trials indicated a decrease in overall turnaround time to about 70 minutes, compared with a baseline time of about 90 minutes. The use of lean methodology to identify non-value-added steps in oncology order processing and the implementation of staff-recommended workflow changes resulted in an overall reduction in the turnaround time per dose. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach
NASA Astrophysics Data System (ADS)
Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh
2017-03-01
Retreat mining is always accompanied by a large number of accidents, and most of them are due to roof fall. Therefore, the development of methodologies to evaluate roof fall susceptibility (RFS) seems essential. Ghasemi et al. (2012) proposed a systematic methodology to assess roof fall risk during retreat mining based on the classic risk assessment approach. The main shortcoming of this method is that it ignores subjective uncertainties arising from linguistic input values for some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remedy this deficiency and improve the method, in this paper a novel methodology is presented to assess RFS using a fuzzy approach. The application of the fuzzy approach provides an effective tool to handle the subjective uncertainties. Furthermore, the fuzzy analytic hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors during development of this method. The methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of Tabas Central Mine (TCM), Iran. The results indicate that this methodology is effective and efficient in assessing RFS.
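A hedged sketch of the fuzzy-rating step is given below: linguistic factor ratings are mapped to triangular fuzzy numbers, aggregated with factor weights, and defuzzified to a crisp susceptibility score. The linguistic scale, weights, and example ratings are illustrative, and the paper's full fuzzy-AHP weighting is not reproduced.

import numpy as np

# Triangular fuzzy numbers (l, m, u) for linguistic ratings on a 0-10 susceptibility scale
# (scale, weights and ratings are invented for illustration).
SCALE = {"low": (0.0, 0.0, 3.0), "moderate": (2.0, 5.0, 8.0), "high": (7.0, 10.0, 10.0)}

def weighted_tfn(ratings, weights):
    # A weighted sum of triangular fuzzy numbers is again triangular.
    tfns = np.array([SCALE[r] for r in ratings], dtype=float)
    w = np.asarray(weights, dtype=float)
    return tuple((w / w.sum()) @ tfns)

def defuzzify(tfn):
    l, m, u = tfn
    return (l + m + u) / 3.0          # centroid of a triangular fuzzy number

ratings = ["high", "moderate", "high", "low"]   # e.g. four risk factors for one panel
weights = [0.4, 0.25, 0.2, 0.15]                # e.g. priorities from a fuzzy AHP step
print("roof fall susceptibility score:", round(defuzzify(weighted_tfn(ratings, weights)), 2))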
NASA Astrophysics Data System (ADS)
Lin, H.; Zhang, X.; Wu, X.; Tarnas, J. D.; Mustard, J. F.
2018-04-01
Quantitative analysis of hydrated minerals from hyperspectral remote sensing data is fundamental for understanding Martian geologic processes. Because of the difficulty of selecting endmembers from hyperspectral images, sparse unmixing algorithms have been proposed for application to CRISM data of Mars. However, sparse unmixing becomes challenging when the endmember library grows dramatically. Here, we propose a new methodology termed Target Transformation Constrained Sparse Unmixing (TTCSU) to accurately detect hydrous minerals on Mars. A new version of the target transformation technique proposed in our recent work was used to obtain potential detections from CRISM data. Sparse unmixing constrained with these detections as prior information was applied to CRISM single-scattering albedo images, which were calculated using a Hapke radiative transfer model. This methodology increases the success rate of automatic endmember selection in sparse unmixing and yields more accurate abundances. Well-analyzed CRISM images of southwest Melas Chasma were used to validate our methodology in this study. The sulfate jarosite was detected in southwest Melas Chasma; the distribution is consistent with previous work and the abundance is comparable. More validations will be done in our future work.
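The constrained sparse-unmixing step can be sketched, under simplifying assumptions, as a non-negative sparse regression of a pixel spectrum against a library restricted to candidates flagged by a prior detection step (standing in for the target-transformation constraint). The library, spectra, and use of scikit-learn's positive Lasso below are illustrative only.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n_bands, n_lib = 240, 30
library = np.abs(rng.standard_normal((n_bands, n_lib)))   # synthetic endmember library
true_x = np.zeros(n_lib)
true_x[[3, 17]] = [0.6, 0.4]
pixel = library @ true_x + 0.01 * rng.standard_normal(n_bands)   # observed spectrum

candidates = [3, 7, 17, 21]        # endmembers flagged by the prior detection step
A = library[:, candidates]

# Sparse, non-negative unmixing restricted to the candidate endmembers.
model = Lasso(alpha=1e-3, positive=True, fit_intercept=False, max_iter=10000)
model.fit(A, pixel)
abundances = np.zeros(n_lib)
abundances[candidates] = model.coef_
print(np.flatnonzero(abundances), np.round(abundances[abundances > 0], 3))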
Extension of Companion Modeling Using Classification Learning
NASA Astrophysics Data System (ADS)
Torii, Daisuke; Bousquet, François; Ishida, Toru
Companion Modeling is a methodology for refining initial models for understanding reality through a role-playing game (RPG) and a multiagent simulation. In this research, we propose a novel agent model construction methodology in which classification learning is applied to the RPG log data in Companion Modeling. This methodology enables systematic model construction that handles multiple parameters, independent of the modeler's ability. There are three problems in applying classification learning to the RPG log data: 1) It is difficult to gather enough data for the number of features because the cost of gathering data is high. 2) Noisy data can affect the learning results because the amount of data may be insufficient. 3) The learning results should be explainable as a human decision-making model and should be recognized by the expert as a result that reflects reality. We realized an agent model construction system using the following two approaches: 1) Using a feature selection method, the feature subset that has the best prediction accuracy is identified. In this process, the important features chosen by the expert are always included. 2) The expert eliminates irrelevant features from the learning results after evaluating the learning model through a visualization of the results. Finally, using the RPG log data from the Companion Modeling of agricultural economics in northeastern Thailand, we confirm the capability of this methodology.
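The two learning steps described above can be sketched as follows: exhaustively score candidate feature subsets by cross-validated accuracy while always keeping the expert-designated features, then fit an interpretable classifier on the winning subset. The RPG log data, feature names, and expert choices below are invented for illustration.

import numpy as np
from itertools import combinations
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(5)
feature_names = ["rainfall", "debt", "land_size", "labour", "price_signal"]
X = rng.standard_normal((120, len(feature_names)))          # stand-in for RPG log features
y = (X[:, 0] + 0.5 * X[:, 3] + 0.3 * rng.standard_normal(120) > 0).astype(int)

expert_features = [0]            # e.g. the expert insists on keeping "rainfall"
others = [i for i in range(X.shape[1]) if i not in expert_features]

best_cols, best_score = expert_features, -np.inf
for k in range(len(others) + 1):
    for extra in combinations(others, k):
        cols = expert_features + list(extra)
        score = cross_val_score(DecisionTreeClassifier(max_depth=3, random_state=0),
                                X[:, cols], y, cv=5).mean()
        if score > best_score:
            best_cols, best_score = cols, score

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X[:, best_cols], y)
print([feature_names[i] for i in best_cols], round(best_score, 3))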
Improta, Giovanni; Russo, Mario Alessandro; Triassi, Maria; Converso, Giuseppe; Murino, Teresa; Santillo, Liberatina Carmela
2018-05-01
Health technology assessments (HTAs) are often difficult to conduct because the decision procedures of the HTA algorithm are often complex and not easy to apply. Thus, their use is not always convenient or possible for the assessment of technical requests requiring a multidisciplinary approach. This paper aims to address this issue through a multi-criteria analysis focusing on the analytic hierarchy process (AHP). This methodology allows the decision maker to analyse and evaluate different alternatives and monitor their impact on different actors during the decision-making process. The multi-criteria analysis is then implemented through a simulation model to overcome the limitations of the AHP methodology. Simulations help decision makers to make an appropriate decision and avoid unnecessary and costly attempts. Finally, a decision problem regarding the evaluation of two health technologies, namely two biological prostheses for infected incisional hernias, is analysed to assess the effectiveness of the model. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
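A minimal sketch of the AHP weighting step is shown below: criterion weights are derived from a pairwise comparison matrix via the principal eigenvector, with a consistency check. The comparison judgements and criteria are invented for illustration and do not come from the study.

import numpy as np

# Saaty-scale pairwise comparisons among three hypothetical criteria
# (e.g. clinical effectiveness, cost, safety); values are invented.
A = np.array([[1.0, 3.0, 5.0],
              [1.0 / 3.0, 1.0, 2.0],
              [1.0 / 5.0, 1.0 / 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                          # priority weights of the criteria

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
print("weights:", np.round(w, 3), "consistency ratio:", round(ci / ri, 3))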
Using State Estimation Residuals to Detect Abnormal SCADA Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Jian; Chen, Yousu; Huang, Zhenyu
2010-06-14
Detection of manipulated supervisory control and data acquisition (SCADA) data is critically important for the safe and secure operation of modern power systems. In this paper, a methodology of detecting manipulated SCADA data based on state estimation residuals is presented. A framework of the proposed methodology is described. Instead of using original SCADA measurements as the bad data sources, the residuals calculated based on the results of the state estimator are used as the input for the outlier detection process. The BACON algorithm is applied to detect outliers in the state estimation residuals. The IEEE 118-bus system is used as a test case to evaluate the effectiveness of the proposed methodology. The accuracy of the BACON method is compared with that of the 3-σ method for the simulated SCADA measurements and residuals.
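A simplified, univariate variant of the BACON idea applied to a residual vector is sketched below: start from a basic subset of residuals closest to the median, then iteratively grow the subset with points whose standardized distance from the subset mean stays under a cutoff, flagging the rest. The cutoff, subset size, and test data are illustrative; the full BACON algorithm of Billor, Hadi and Velleman is multivariate.

import numpy as np
from scipy.stats import norm

def bacon_1d(residuals, m=None, alpha=0.05):
    r = np.asarray(residuals, dtype=float)
    n = r.size
    m = m or max(4, n // 5)
    # Basic subset: the m residuals closest to the median (robust start).
    subset = np.sort(np.argsort(np.abs(r - np.median(r)))[:m])
    cutoff = norm.ppf(1.0 - alpha / (2.0 * n))     # Bonferroni-style threshold
    while True:
        mu, sd = r[subset].mean(), r[subset].std(ddof=1)
        new = np.flatnonzero(np.abs(r - mu) / sd < cutoff)
        if np.array_equal(new, subset):
            break
        subset = new
    return np.setdiff1d(np.arange(n), subset)      # indices flagged as outliers

residuals = np.r_[np.random.default_rng(3).normal(0.0, 1.0, 100), [8.0, -9.5]]
print("flagged residual indices:", bacon_1d(residuals))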
Direct-to-consumer-advertising of prescription medicines: a theoretical approach to understanding.
Harker, Michael; Harker, Debra
2007-01-01
The pharmaceutical industry is a leader in research and development investment. New treatments need to be communicated to the market, and consumers are increasingly interested in learning about new drugs. Direct to consumer advertising of prescription medicines (DTCA) is a controversial practice where many of the arguments for and against are not supported by strong evidence. This paper aims to contribute to a research agenda that is forming in this area. The paper reports on a systematic review that was conducted and applies accepted theoretical models to the DTCA context. The systematic review methodology is widely accepted in the medical sector and is successfully applied here in the marketing field. The hierarchy of effects model is specifically applied to DTCA with a clear emphasis on consumer rights, empowerment, protection and knowledge. This paper provides healthcare practitioners with insight into how consumers process DTCA messages and provides guidance into how to assist in this message processing.
Experimental Design of a UCAV-Based High-Energy Laser Weapon
2016-12-01
propagation. The Design of Experiments (DOE) methodology is then applied to determine the significance of the UCAV-HEL design parameters and their effect on the...
Teaching and Learning Methodologies Supported by ICT Applied in Computer Science
ERIC Educational Resources Information Center
Capacho, Jose
2016-01-01
The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…
Becoming an Expert: Developing Expertise in an Applied Discipline
ERIC Educational Resources Information Center
Kuhlmann, Diane Orlich; Ardichvili, Alexandre
2015-01-01
Purpose: This paper aims to examine the development of expertise in an applied discipline by addressing the research question: How is professional expertise developed in an applied profession? Design/methodology/approach: Using a grounded theory methodology (GTM), nine technical-tax experts, and three experienced, non-expert tax professionals were…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-08
... Coordinator; (2) applies research methodologies to perform evaluation studies of health information technology grant programs; and, (3) applies advanced mathematical or quantitative modeling to the U.S. health care... remaining items in the paragraph accordingly: ``(1) Applying research methodologies to perform evaluation...
Epistemological-Methodological Issues Related to Applied Organizational Research.
ERIC Educational Resources Information Center
van Meel, R. M.
Applied research is supposed to take the perspective with the highest degree of corroboration as a basis for action. The realm of organizational perspectives is characterized, however, with a multitude of competing research programs, seldom tested against each other. Epistemological and methodological issues overwhelm inquiry in applied research.…
Ancient DNA studies: new perspectives on old samples
2012-01-01
In spite of past controversies, the field of ancient DNA is now a reliable research area due to recent methodological improvements. A series of recent large-scale studies have revealed the true potential of ancient DNA samples to study the processes of evolution and to test models and assumptions commonly used to reconstruct patterns of evolution and to analyze population genetics and palaeoecological changes. Recent advances in DNA technologies, such as next-generation sequencing, make it possible to recover DNA information from archaeological and paleontological remains, allowing us to go back in time and study the genetic relationships between extinct organisms and their contemporary relatives. With next-generation sequencing methodologies, DNA sequences can be retrieved even from samples (for example human remains) for which the technical pitfalls of classical methodologies required stringent criteria to guarantee the reliability of the results. In this paper, we review the methodologies applied to ancient DNA analysis and the perspectives that next-generation sequencing applications provide in this field. PMID:22697611
Galfi, Istvan; Virtanen, Jorma; Gasik, Michael M.
2017-01-01
A new, faster and more reliable analytical methodology for S(IV) species analysis in low-pH solutions by bichromatometry is proposed. For decades the state-of-the-art methodology has been iodometry, which is still a well-justified method for neutral solutions; in low-pH media, however, various side reactions increase inaccuracy. In contrast, the new methodology has no side reactions in low-pH media, requires only one titration step and provides a clear color change if S(IV) species are present in the solution. The method is validated using model solutions with known concentrations and applied to analyses of gaseous SO2 purged from solution in low-pH media samples. The results indicate that bichromatometry can accurately analyze SO2 from liquid samples having pH even below 0, which is relevant to metallurgical industrial processes. PMID:29145479
Expert System Development Methodology (ESDM)
NASA Technical Reports Server (NTRS)
Sary, Charisse; Gilstrap, Lewey; Hull, Larry G.
1990-01-01
The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because it has not yet been applied to actual projects. It has been retrospectively evaluated by comparing the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but which provided useful insights into actual expert system development practices and problems.
Measuring e-Commerce service quality from online customer review using sentiment analysis
NASA Astrophysics Data System (ADS)
Kencana Sari, Puspita; Alamsyah, Andry; Wibowo, Sulistyo
2018-03-01
The biggest e-Commerce challenge in understanding the market is to chart the level of service quality according to customer perception. Collecting user perceptions through online user reviews is considered a faster methodology than conducting direct sampling. To understand the service quality level, sentiment analysis is used to classify the reviews into positive and negative sentiment for the five dimensions of electronic service quality (e-Servqual). As a case study in this research, we use Tokopedia, one of the biggest e-Commerce services in Indonesia. We obtained online review comments about Tokopedia's service quality over several months of observation. The Naïve Bayes classification methodology is applied because of its high accuracy and its support for large-scale data processing. The results revealed that the personalization and reliability dimensions require more attention because they have high negative sentiment. Meanwhile, the trust and web design dimensions have high positive sentiment, indicating very good service in those areas.
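A minimal sketch of the review-classification step is given below using a bag-of-words representation and a multinomial Naive Bayes classifier; training one such model per e-Servqual dimension would be a straightforward extension. The example reviews and labels are invented, and the study's Indonesian-language preprocessing is omitted.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Invented example reviews and labels; real input would be labelled review text.
reviews = ["delivery was fast and the seller was reliable",
           "the website keeps crashing and support never replies",
           "easy checkout, items exactly as described",
           "my refund is late and the tracking does not update"]
labels = ["positive", "negative", "positive", "negative"]

# Bag-of-words features feeding a multinomial Naive Bayes classifier.
clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(reviews, labels)
print(clf.predict(["checkout was easy but the courier lost my package"]))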
Intelligent monitoring and control of semiconductor manufacturing equipment
NASA Technical Reports Server (NTRS)
Murdock, Janet L.; Hayes-Roth, Barbara
1991-01-01
The use of AI methods to monitor and control semiconductor fabrication in a state-of-the-art manufacturing environment called the Rapid Thermal Multiprocessor is described. Semiconductor fabrication involves many complex processing steps with limited opportunities to measure process and product properties. By applying additional process and product knowledge to that limited data, AI methods augment classical control methods by detecting abnormalities and trends, predicting failures, diagnosing, planning corrective action sequences, explaining diagnoses or predictions, and reacting to anomalous conditions that classical control systems typically would not correct. Research methodology and issues are discussed, and two diagnosis scenarios are examined.
Estarellas Martin, Carolina; Seira Castan, Constantí; Luque Garriga, F Javier; Bidon-Chanal Badia, Axel
2015-10-01
Residue conformational changes and internal cavity migration processes play a key role in regulating the kinetics of ligand migration and binding events in globins. Molecular dynamics simulations have demonstrated their value in the study of these processes in different haemoglobins, but derivation of kinetic data demands the use of more complex techniques like enhanced sampling molecular dynamics methods. This review discusses the different methodologies that are currently applied to study the ligand migration process in globins and highlights those specially developed to derive kinetic data. Copyright © 2015 Elsevier Ltd. All rights reserved.
Some comments on Hurst exponent and the long memory processes on capital markets
NASA Astrophysics Data System (ADS)
Sánchez Granero, M. A.; Trinidad Segovia, J. E.; García Pérez, J.
2008-09-01
The analysis of long memory processes in capital markets has been one of the main topics in finance, since the existence of market memory would imply the rejection of the efficient market hypothesis. The study of these processes in finance is carried out through the Hurst exponent, and the most classical method applied is R/S analysis. In this paper we discuss the efficiency of this methodology, as well as some of its more important modifications, for detecting long memory. We also propose the application of a classical geometrical method with slight modifications, and we compare both approaches.
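The classical R/S estimate of the Hurst exponent can be sketched compactly: average the rescaled range over blocks of several sizes and take the slope of log(R/S) against log(block size). Block sizes below are illustrative, and the bias corrections discussed in the literature are not included.

import numpy as np

def hurst_rs(x, sizes=(8, 16, 32, 64, 128)):
    x = np.asarray(x, dtype=float)
    rs_means = []
    for n in sizes:
        rs = []
        for start in range(0, x.size - n + 1, n):
            block = x[start:start + n]
            z = np.cumsum(block - block.mean())    # cumulative deviations from the block mean
            r = z.max() - z.min()                  # range of the cumulative deviations
            s = block.std(ddof=1)                  # block standard deviation
            if s > 0:
                rs.append(r / s)
        rs_means.append(np.mean(rs))
    # H is the slope of log(R/S) against log(block size).
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

returns = np.random.default_rng(4).standard_normal(4096)
print("estimated H:", round(hurst_rs(returns), 3))   # near 0.5 (plus small-sample bias) for uncorrelated returns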
Chiu, Ming-Chuan; Hsieh, Min-Chih
2016-05-01
The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process addresses shortcomings in existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
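The ranking logic behind TOPSIS is sketched below in its crisp form (the study applies a fuzzy variant with linguistic ratings); the decision matrix, weights, and factor ordering are invented for illustration.

import numpy as np

# Rows are candidate error factors, columns are the four evaluation criteria
# (scores and weights are invented; all criteria treated as benefit criteria).
X = np.array([[7.0, 6.0, 8.0, 5.0],
              [5.0, 7.0, 6.0, 6.0],
              [8.0, 5.0, 7.0, 7.0]])
w = np.array([0.3, 0.2, 0.3, 0.2])

V = X / np.linalg.norm(X, axis=0) * w          # weighted normalised decision matrix
ideal = V.max(axis=0)                          # positive ideal solution
anti = V.min(axis=0)                           # negative ideal solution
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)       # relative closeness to the ideal
print("ranking of factors (best first):", np.argsort(-closeness))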
Simulation Of Assembly Processes With Technical Of Virtual Reality
NASA Astrophysics Data System (ADS)
García García, Manuel; Arenas Reina, José Manuel; Lite, Alberto Sánchez; Sebastián Pérez, Miguel Ángel
2009-11-01
The use of virtual reality techniques in industrial processes provides a realistic approach to the product life cycle. For the manual assembly of components, the use of virtual environments facilitates simultaneous engineering in which variables such as human factors and productivity play a real role. On the other hand, in the current phase of industrial competition, rapid adjustment to client needs and to the market situation is required. In this work the assembly of the front components of a vehicle is analyzed using virtual reality tools and following a product-process design methodology that includes every service life stage. This study is based on workstation design, taking into account productive and human factors from the ergonomic point of view and implementing a postural study of every assembly operation, leaving the remaining stages for a later study. Design is optimized by applying this methodology together with the use of virtual reality tools. A 15% reduction in assembly time and a 90% reduction in musculoskeletal disorders across assembly operations were also achieved.
NASA Astrophysics Data System (ADS)
Dragos, Kosmas; Smarsly, Kay
2016-04-01
System identification has been employed in numerous structural health monitoring (SHM) applications. Traditional system identification methods usually rely on centralized processing of structural response data to extract information on structural parameters. However, in wireless SHM systems the centralized processing of structural response data introduces a significant communication bottleneck. Exploiting the merits of decentralization and on-board processing power of wireless SHM systems, many system identification methods have been successfully implemented in wireless sensor networks. While several system identification approaches for wireless SHM systems have been proposed, little attention has been paid to obtaining information on the physical parameters (e.g. stiffness, damping) of the monitored structure. This paper presents a hybrid system identification methodology suitable for wireless sensor networks based on the principles of component mode synthesis (dynamic substructuring). A numerical model of the monitored structure is embedded into the wireless sensor nodes in a distributed manner, i.e. the entire model is segmented into sub-models, each embedded into one sensor node corresponding to the substructure the sensor node is assigned to. The parameters of each sub-model are estimated by extracting local mode shapes and by applying the equations of the Craig-Bampton method on dynamic substructuring. The proposed methodology is validated in a laboratory test conducted on a four-story frame structure to demonstrate the ability of the methodology to yield accurate estimates of stiffness parameters. Finally, the test results are discussed and an outlook on future research directions is provided.
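A minimal sketch of a Craig-Bampton reduction for one substructure is given below: constraint modes and fixed-interface normal modes form the reduction basis, and the reduced mass and stiffness matrices follow by projection. The matrices would come from the embedded sub-model; the partition into interior and boundary degrees of freedom is assumed known, and the number of retained modes is illustrative.

import numpy as np
from scipy.linalg import eigh

def reorder(A, interior, boundary):
    idx = np.r_[interior, boundary]
    return A[np.ix_(idx, idx)]

def craig_bampton(M, K, interior, boundary, n_modes):
    # Reorder so interior DOFs come first, boundary DOFs last.
    Mo, Ko = reorder(M, interior, boundary), reorder(K, interior, boundary)
    ni, nb = len(interior), len(boundary)
    Mii, Kii, Kib = Mo[:ni, :ni], Ko[:ni, :ni], Ko[:ni, ni:]
    # Constraint modes: static interior response to unit boundary displacements.
    Psi = -np.linalg.solve(Kii, Kib)
    # Fixed-interface normal modes (boundary clamped); keep the first n_modes.
    _, vecs = eigh(Kii, Mii)
    Phi = vecs[:, :n_modes]
    # Reduction basis T maps (modal, boundary) coordinates to physical DOFs.
    T = np.zeros((ni + nb, n_modes + nb))
    T[:ni, :n_modes] = Phi
    T[:ni, n_modes:] = Psi
    T[ni:, n_modes:] = np.eye(nb)
    return T.T @ Mo @ T, T.T @ Ko @ T, T       # reduced mass, reduced stiffness, basis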
Dissecting delays in trauma care using corporate lean six sigma methodology.
Parks, Jennifer K; Klein, Jorie; Frankel, Heidi L; Friese, Randall S; Shafi, Shahid
2008-11-01
The Institute of Medicine has identified trauma center overcrowding as a crisis. We applied corporate Lean Six Sigma methodology to reduce overcrowding by quantifying patient dwell times in trauma resuscitation units (TRU) and to identify opportunities for reducing them. TRU dwell time of all patients treated at a Level I trauma center was measured prospectively during a 3-month period (n = 1,184). Delays were defined as TRU dwell time >6 hours. Using personnel trained in corporate Lean Six Sigma methodology, we created a detailed process map of patient flow through our TRU and measured time spent at each step prospectively during a 24/7 week-long time study (n = 43). Patients with TRU dwell time below the median (3 hours) were compared with those with longer dwell times to identify opportunities for improvement. TRU delays occurred in 183 of 1,184 trauma patients (15%), and peaked on days with >15 patients or with presence of five simultaneous patients. However, 135 delays (74%) occurred on days when ≤15 patients were treated. Six Sigma mapping identified four processes that were related to TRU delays. Reduction of TRU dwell time by 1 hour per patient using interventions targeting these specific processes has the potential to improve our TRU capacity to care for more patients. Application of corporate Lean Six Sigma methodology identified opportunities for reducing dwell times in our TRU. Such endeavors are vital to maximize operational efficiency and decrease overcrowding in busy trauma centers working at capacity.
Revisiting the PLUMBER Experiments from a Process-Diagnostics Perspective
NASA Astrophysics Data System (ADS)
Nearing, G. S.; Ruddell, B. L.; Clark, M. P.; Nijssen, B.; Peters-Lidard, C. D.
2017-12-01
The PLUMBER benchmarking experiments [1] showed that some of the most sophisticated land models (CABLE, CH-TESSEL, COLA-SSiB, ISBA-SURFEX, JULES, Mosaic, Noah, ORCHIDEE) were outperformed - in simulations of half-hourly surface energy fluxes - by instantaneous, out-of-sample, and globally-stationary regressions with no state memory. One criticism of PLUMBER is that the benchmarking methodology was not derived formally, so that applying a similar methodology with different performance metrics can result in qualitatively different results. Another common criticism of model intercomparison projects in general is that they offer little insight into process-level deficiencies in the models, and therefore are of marginal value for helping to improve the models. We address both of these issues by proposing a formal benchmarking methodology that also yields a formal and quantitative method for process-level diagnostics. We apply this to the PLUMBER experiments to show that (1) the PLUMBER conclusions were generally correct - the models use only a fraction of the information available to them from met forcing data (<50% by our analysis), and (2) all of the land models investigated by PLUMBER have similar process-level error structures, and therefore together do not represent a meaningful sample of structural or epistemic uncertainty. We conclude by suggesting two ways to improve the experimental design of model intercomparison and/or model benchmarking studies like PLUMBER. First, PLUMBER did not report model parameter values, and it is necessary to know these values to separate parameter uncertainty from structural uncertainty. This is a first order requirement if we want to use intercomparison studies to provide feedback to model development. Second, technical documentation of land models is inadequate. Future model intercomparison projects should begin with a collaborative effort by model developers to document specific differences between model structures. This could be done in a reproducible way using a unified, process-flexible system like SUMMA [2]. [1] Best, M.J. et al. (2015) 'The plumbing of land surface models: benchmarking model performance', J. Hydrometeor. [2] Clark, M.P. et al. (2015) 'A unified approach for process-based hydrologic modeling: 1. Modeling concept', Water Resour. Res.
NASA Astrophysics Data System (ADS)
Copping, A. E.; Blake, K.; Zdanski, L.
2011-12-01
As marine and hydrokinetic (MHK) energy development projects progress towards early deployments in the U.S., the process of determining the risks to aquatic animals, habitats, and ecosystem processes from these engineered systems continues to be a significant barrier to efficient siting and permitting. Understanding the risk of MHK installations requires that the two elements of risk - consequence and probability - be evaluated. However, standard risk assessment methodologies are not easily applied to MHK interactions with marine and riverine environment as there are few data that describe the interaction of stressors (MHK devices, anchors, foundations, mooring lines and power cables) and receptors (aquatic animals, habitats and ecosystem processes). The number of possible combinations and permutations of stressors and receptors in MHK systems is large: there are many different technologies designed to harvest energy from the tides, waves and flowing rivers; each device is planned for a specific waterbody that supports an endemic ecosystem of animals and habitats, tied together by specific physical and chemical processes. With few appropriate analogue industries in the oceans and rivers, little information on the effects of these technologies on the living world is available. Similarly, without robust data sets of interactions, mathematical probability models are difficult to apply. Pacific Northwest National Laboratory scientists are working with MHK developers, researchers, engineers, and regulators to rank the consequences of planned MHK projects on living systems, and exploring alternative methodologies to estimate probabilities of these encounters. This paper will present the results of ERES, the Environmental Risk Evaluation System, which has been used to rank consequences for major animal groups and habitats for five MHK projects that are in advanced stages of development and/or early commercial deployment. Probability analyses have been performed for high priority stressor/receptor interactions where data are adaptable from other industries. In addition, a methodology for evaluating the probability of encounter, and therefore risk, to an endangered marine mammal from tidal turbine blades will be presented.
A Validated Methodology for Genetic Identification of Tuna Species (Genus Thunnus)
Viñas, Jordi; Tudela, Sergi
2009-01-01
Background Tuna species of the genus Thunnus, such as the bluefin tunas, are some of the most important and yet most endangered trade fish in the world. Identification of these species in traded forms, however, may be difficult depending on the presentation of the products, which may hamper conservation efforts on trade control. In this paper, we validated a genetic methodology that can fully distinguish between the eight Thunnus species from any kind of processed tissue. Methodology After testing several genetic markers, a complete discrimination of the eight tuna species was achieved using Forensically Informative Nucleotide Sequencing based primarily on the sequence variability of the hypervariable genetic marker mitochondrial DNA control region (mtDNA CR), followed, in some specific cases, by a second validation by a nuclear marker rDNA first internal transcribed spacer (ITS1). This methodology was able to distinguish all tuna species, including those belonging to the subgenus Neothunnus that are very closely related, and in consequence can not be differentiated with other genetic markers of lower variability. This methodology also took into consideration the presence of introgression that has been reported in past studies between T. thynnus, T. orientalis and T. alalunga. Finally, we applied the methodology to cross-check the species identity of 26 processed tuna samples. Conclusions Using the combination of two genetic markers, one mitochondrial and another nuclear, allows a full discrimination between all eight tuna species. Unexpectedly, the genetic marker traditionally used for DNA barcoding, cytochrome oxidase 1, could not differentiate all species, thus its use as a genetic marker for tuna species identification is questioned. PMID:19898615
Applying axiomatic design to a medication distribution system
NASA Astrophysics Data System (ADS)
Raguini, Pepito B.
As the need to minimize medication errors drives many medical facilities to seek robust solutions to the most common errors affecting patient safety, these hospitals would be wise to put concerted effort into finding methodologies that can facilitate an optimized medication distribution system. If a hospital's upper management is looking for an optimization method that is an ideal fit, it is just as important that the right tool be selected for the application at hand. In the present work, we propose the application of Axiomatic Design (AD), a process that focuses on the generation and selection of functional requirements to meet customer needs for product and/or process design. The appeal of the axiomatic approach is that it provides both a formal design process and a set of technical coefficients for meeting customer needs. Thus, AD offers a strategy for the effective integration of people, design methods, design tools and design data. We therefore apply the AD methodology to medical applications with the main objective of allowing nurses to provide cost-effective delivery of medications to inpatients, thereby improving the quality of patient care. The AD methodology is implemented through the use of focused stores, where medications can be readily stored and conveniently located near patients, as well as a mobile apparatus commonly used by hospitals that can also store medications, the medication cart. Moreover, a robust methodology called the focused store methodology is introduced and developed for both the uncapacitated and capacitated case studies, which sets up an appropriate AD framework and design problem for a medication distribution case study.
A proven approach for more effective software development and maintenance
NASA Technical Reports Server (NTRS)
Pajerski, Rose; Hall, Dana; Sinclair, Craig
1994-01-01
Modern space flight mission operations and associated ground data systems are increasingly dependent upon reliable, quality software. Critical functions such as command load preparation, health and status monitoring, communications link scheduling and conflict resolution, and transparent gateway protocol conversion are routinely performed by software. Given budget constraints and the ever increasing capabilities of processor technology, the next generation of control centers and data systems will be even more dependent upon software across all aspects of performance. A key challenge now is to implement improved engineering, management, and assurance processes for the development and maintenance of that software; processes that cost less, yield higher quality products, and self-correct for continual improvement. The NASA Goddard Space Flight Center has a unique experience base that can be readily tapped to help solve the software challenge. Over the past eighteen years, the Software Engineering Laboratory within the Code 500 Flight Dynamics Division has evolved a software development and maintenance methodology that accommodates the unique characteristics of an organization while optimizing and continually improving the organization's software capabilities. This methodology relies upon measurement, analysis, and feedback, analogous to a control loop system. It is an approach with a time-tested track record, proven through repeated applications across a broad range of operational software development and maintenance projects. This paper describes the software improvement methodology employed by the Software Engineering Laboratory and how it has been exploited within the Flight Dynamics Division of GSFC Code 500. Examples of specific improvements in the software itself and its processes are presented to illustrate the effectiveness of the methodology. Finally, the initial findings are given from when this methodology was applied across the mission operations and ground data systems software domains throughout Code 500.
Stanford, Robert E
2004-05-01
This paper uses a non-parametric frontier model and adaptations of the concepts of cross-efficiency and peer-appraisal to develop a formal methodology for benchmarking provider performance in the treatment of Acute Myocardial Infarction (AMI). Parameters used in the benchmarking process are the rates of proper recognition of indications of six standard treatment processes for AMI; the decision making units (DMUs) to be compared are the Medicare eligible hospitals of a particular state; the analysis produces an ordinal ranking of individual hospital performance scores. The cross-efficiency/peer-appraisal calculation process is constructed to accommodate DMUs that experience no patients in some of the treatment categories. While continuing to rate highly the performances of DMUs which are efficient in the Pareto-optimal sense, our model produces individual DMU performance scores that correlate significantly with good overall performance, as determined by a comparison of the sums of the individual DMU recognition rates for the six standard treatment processes. The methodology is applied to data collected from 107 state Medicare hospitals.
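The cross-efficiency/peer-appraisal idea can be illustrated with a minimal sketch. The snippet below is a simplified "benefit-of-the-doubt" DEA formulation with unit inputs solved via linear programming; it is not the paper's exact frontier model (which also handles DMUs with no patients in some treatment categories), and the hospital recognition rates are hypothetical.

```python
# Simplified "benefit-of-the-doubt" DEA with cross-efficiency / peer-appraisal.
# A sketch only: the paper's exact frontier model and its handling of empty
# treatment categories are not reproduced here. All data are hypothetical.
import numpy as np
from scipy.optimize import linprog

# rows = hospitals (DMUs), columns = recognition rates for 6 AMI treatment processes
Y = np.array([
    [0.90, 0.85, 0.70, 0.95, 0.80, 0.75],
    [0.60, 0.90, 0.85, 0.70, 0.65, 0.80],
    [0.75, 0.70, 0.95, 0.80, 0.90, 0.60],
])
n, m = Y.shape

def optimal_weights(k):
    """Weights that maximize DMU k's weighted output, subject to every
    DMU's weighted output being <= 1 (unit-input DEA)."""
    res = linprog(c=-Y[k],                 # maximize Y[k] @ w -> minimize -Y[k] @ w
                  A_ub=Y, b_ub=np.ones(n),
                  bounds=[(0, None)] * m,
                  method="highs")
    return res.x

W = np.vstack([optimal_weights(k) for k in range(n)])   # one weight vector per DMU
cross_eff = Y @ W.T          # cross_eff[i, k] = DMU i scored with DMU k's weights
peer_scores = cross_eff.mean(axis=1)                    # peer-appraisal score per DMU
ranking = np.argsort(-peer_scores)                      # ordinal ranking, best first
print(peer_scores, ranking)
```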
NASA Astrophysics Data System (ADS)
Hodijah, A.; Sundari, S.; Nugraha, A. C.
2018-05-01
As a local government agency performing public services, the General Government Office already uses the Reporting Information System of Local Government Implementation (E-LPPD). However, E-LPPD has upgrade limitations: its integration processes cannot accommodate the General Government Office's needs for achieving Good Government Governance (GGG), even though successful e-government implementation ultimately requires good governance practices. Citizens now demand public services comparable to those of the private sector, which calls for service innovation that leverages the legacy system in a service-based e-government implementation: Service Oriented Architecture (SOA) is used to redefine business processes as a set of IT-enabled services, and Enterprise Architecture based on The Open Group Architecture Framework (TOGAF) provides a comprehensive approach to redefining business processes as service innovation towards GGG. This paper takes as a case study the Performance Evaluation of Local Government Implementation (EKPPD) system of the General Government Office. The results show that TOGAF guides the development of integrated business processes for the EKPPD system that fit good governance practices to attain GGG, with the SOA methodology as the technical approach.
A time-responsive tool for informing policy making: rapid realist review.
Saul, Jessie E; Willis, Cameron D; Bitz, Jennifer; Best, Allan
2013-09-05
A realist synthesis attempts to provide policy makers with a transferable theory that suggests a certain program is more or less likely to work in certain respects, for particular subjects, in specific kinds of situations. Yet realist reviews can require considerable and sustained investment over time, which does not always suit the time-sensitive demands of many policy decisions. 'Rapid Realist Review' methodology (RRR) has been developed as a tool for applying a realist approach to a knowledge synthesis process in order to produce a product that is useful to policy makers in responding to time-sensitive and/or emerging issues, while preserving the core elements of realist methodology. Using examples from completed RRRs, we describe key features of the RRR methodology, the resources required, and the strengths and limitations of the process. All aspects of an RRR are guided by both a local reference group, and a group of content experts. Involvement of knowledge users and external experts ensures both the usability of the review products, as well as their links to current practice. RRRs have proven useful in providing evidence for and making explicit what is known on a given topic, as well as articulating where knowledge gaps may exist. From the RRRs completed to date, findings broadly adhere to four (often overlapping) classifications: guiding rules for policy-making; knowledge quantification (i.e., the amount of literature available that identifies context, mechanisms, and outcomes for a given topic); understanding tensions/paradoxes in the evidence base; and, reinforcing or refuting beliefs and decisions taken. 'Traditional' realist reviews and RRRs have some key differences, which allow policy makers to apply each type of methodology strategically to maximize its utility within a particular local constellation of history, goals, resources, politics and environment. In particular, the RRR methodology is explicitly designed to engage knowledge users and review stakeholders to define the research questions, and to streamline the review process. In addition, results are presented with a focus on context-specific explanations for what works within a particular set of parameters rather than producing explanations that are potentially transferrable across contexts and populations. For policy makers faced with making difficult decisions in short time frames for which there is sufficient (if limited) published/research and practice-based evidence available, RRR provides a practical, outcomes-focused knowledge synthesis method.
Quantitative Methods in Psychology: Inevitable and Useless
Toomela, Aaro
2010-01-01
Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. Research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic that is based on Aristotelian thinking, and associative-quantitative that is based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause–effect relationships between the events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myers, S; Larsen, S; Wagoner, J
Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization of full three-dimensional (3D) finite difference modeling, as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project, in support of LLNL's national-security mission, benefits the U.S. military and intelligence community. Fiscal year (FY) 2003 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A three-seismic-array vehicle tracking testbed was installed on site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications. In FY03 specifically, a large and complex simulation experiment was conducted that tested the full modeling-based approach to geological characterization using E2D, the K-L statistical methodology, and matched field processing applied to tunnel detection with surface seismic sensors. The simulation validated the full methodology and the need for geological heterogeneity to be accounted for in the overall approach. The Lake Lynn site area was geologically modeled using the code Earthvision to produce a 32 million node 3D model grid for E3D. Model linking issues were resolved and a number of full 3D model runs were accomplished using shot locations that matched the data. E3D-generated wavefield movies showed the reflection signal would be too small to be observed in the data due to trapped and attenuated energy in the weathered layer.
An analysis of the few sensors coupled to bedrock did not improve the reflection signal strength sufficiently because the shots, though buried, were within the surface layer and hence attenuated. Ability to model a complex 3D geological structure and calculate synthetic seismograms that are in good agreement with actual data (especially for surface waves and below the complex weathered layer) was demonstrated. We conclude that E3D is a powerful tool for assessing the conditions under which a tunnel could be detected in a specific geological setting. Finally, the Lake Lynn tunnel explosion data were analyzed using standard array processing techniques. The results showed that single detonations could be detected and located but simultaneous detonations would require a strategic placement of arrays.
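To give a flavor of the Karhunen-Loeve (K-L) approach to generating geological heterogeneity mentioned above, the sketch below draws correlated random perturbations on a one-dimensional grid from an assumed exponential covariance model; the kernel, correlation length and grid are illustrative assumptions rather than the project's actual parameters.

```python
# Minimal Karhunen-Loeve (K-L) sketch: generate correlated random velocity
# perturbations on a 1-D grid from an assumed exponential covariance model.
# The correlation length, variance and grid are illustrative assumptions,
# not the values used in the project.
import numpy as np

x = np.linspace(0.0, 1000.0, 200)          # grid positions (m)
corr_len, sigma = 100.0, 0.05              # correlation length (m), std of perturbation

# Covariance matrix and its eigen-decomposition (discrete K-L basis)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
eigvals, eigvecs = np.linalg.eigh(C)
eigvals = np.clip(eigvals, 0.0, None)      # guard against tiny negative round-off

# Keep the leading modes carrying ~99% of the variance
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
k = np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.99) + 1

# One realization of heterogeneity: sum of modes with random N(0,1) coefficients
xi = np.random.default_rng(0).standard_normal(k)
field = eigvecs[:, :k] @ (np.sqrt(eigvals[:k]) * xi)
print(field.shape, field.std())
```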
A case study using the PrOACT-URL and BRAT frameworks for structured benefit risk assessment.
Nixon, Richard; Dierig, Christoph; Mt-Isa, Shahrul; Stöckert, Isabelle; Tong, Thaison; Kuhls, Silvia; Hodgson, Gemma; Pears, John; Waddingham, Ed; Hockley, Kimberley; Thomson, Andrew
2016-01-01
While benefit-risk assessment is a key component of the drug development and maintenance process, it is often described in a narrative. In contrast, structured benefit-risk assessment builds on established ideas from decision analysis and comprises a qualitative framework and quantitative methodology. We compare two such frameworks, applying multi-criteria decision analysis (MCDA) within the PrOACT-URL framework and weighted net clinical benefit (wNCB) within the BRAT framework. These are applied to a case study of natalizumab for the treatment of relapsing remitting multiple sclerosis. We focus on the practical considerations of applying these methods and give recommendations for the visual presentation of results. In the case study, we found structured benefit-risk analysis to be a useful tool for structuring, quantifying, and communicating the relative benefit and safety profiles of drugs in a transparent, rational and consistent way. The two frameworks were similar. MCDA is a generic and flexible methodology that can be used to perform a structured benefit-risk assessment in any common context. wNCB is a special case of MCDA and is shown to be equivalent to an extension of the number needed to treat (NNT) principle. It is simpler to apply and understand than MCDA and can be applied when all outcomes are measured on a binary scale. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
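A minimal weighted-sum MCDA sketch of the kind used in such benefit-risk assessments is shown below; the outcomes, event rates, value-function ranges and weights are hypothetical placeholders, not the natalizumab case-study inputs.

```python
# Minimal weighted-sum MCDA sketch for benefit-risk: score each treatment arm as
# the weighted sum of 0-1 value scores of its outcomes. Outcomes, rates, ranges
# and weights are hypothetical, not those of the natalizumab case study.
outcomes = {             # outcome: (is_benefit, rate per 1000 patients on drug, on placebo)
    "relapse_avoided":   (True,  320.0, 180.0),
    "disability_delay":  (True,  150.0,  90.0),
    "serious_infection": (False,  12.0,   4.0),
    "pml":               (False,   1.0,   0.0),
}
weights = {"relapse_avoided": 0.4, "disability_delay": 0.3,
           "serious_infection": 0.2, "pml": 0.1}
ranges = {"relapse_avoided": (0, 400), "disability_delay": (0, 200),
          "serious_infection": (0, 20), "pml": (0, 5)}

def value(rate, lo, hi, is_benefit):
    """Linear 0-1 partial value function over an assumed plausible range."""
    v = (rate - lo) / (hi - lo)
    return v if is_benefit else 1.0 - v

def mcda_score(arm_index):          # 1 = drug, 2 = placebo
    total = 0.0
    for name, (is_benefit, drug, placebo) in outcomes.items():
        rate = (drug, placebo)[arm_index - 1]
        lo, hi = ranges[name]
        total += weights[name] * value(rate, lo, hi, is_benefit)
    return total

print("drug:", round(mcda_score(1), 3), "placebo:", round(mcda_score(2), 3))
```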
Effects of image processing on the detective quantum efficiency
NASA Astrophysics Data System (ADS)
Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na
2010-04-01
Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing the methodologies for image quality characterization. However, as the methodologies for such characterizations have not been standardized, the results of these studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate how the modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE) are affected by the image processing algorithm. Image performance parameters such as MTF, NPS, and DQE were evaluated using the International Electrotechnical Commission (IEC 62220-1)-defined RQA5 radiographic technique. Computed radiography (CR) images of the hand in the posterior-anterior (PA) projection for measuring the signal-to-noise ratio (SNR), a slit image for measuring the MTF, and a white image for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. The results show that the modifications considerably influenced the evaluation of SNR, MTF, NPS, and DQE. Images modified by the post-processing had higher DQE than the MUSICA=0 image. This suggests that MUSICA values, as a post-processing step, affect the image when image quality is evaluated. In conclusion, the control parameters of image processing should be accounted for when characterizing image quality in a consistent way. The results of this study can serve as a baseline for evaluating imaging systems and their imaging characteristics by measuring MTF, NPS, and DQE.
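For orientation, the DQE is commonly computed from the measured MTF and normalized NPS as DQE(f) = MTF(f)^2 / (q * NNPS(f)), with q the incident photon fluence of the RQA5 beam. The sketch below applies this relation to synthetic curves; the MTF, NNPS and fluence values are placeholders, not measured data from the study.

```python
# DQE from MTF and normalized NPS, using the commonly quoted relation
# DQE(f) = MTF(f)^2 / (q * NNPS(f)), with q the incident photon fluence
# (photons/mm^2) for the RQA5 exposure. The curves below are synthetic
# placeholders, not measured data.
import numpy as np

f = np.linspace(0.05, 3.5, 70)             # spatial frequency (cycles/mm)
mtf = np.sinc(f / 3.5)                     # synthetic MTF curve
nnps = 3.3e-5 * (1.0 + 0.2 * f)            # synthetic normalized NPS (mm^2)
q = 75435.0                                # illustrative photon fluence (1/mm^2)

dqe = mtf**2 / (q * nnps)
print("DQE near zero frequency: %.2f" % dqe[0])
print("DQE at 2 cycles/mm:      %.2f" % np.interp(2.0, f, dqe))
```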
ERIC Educational Resources Information Center
Vamvakoussi, Xenia; Van Dooren, Wim; Verschaffel, Lieven
2013-01-01
This study tested the hypothesis that intuitions about the effect of operations, e.g., "addition makes bigger" and "division makes smaller", are still present in educated adults, even after years of instruction. To establish the intuitive character, we applied a reaction time methodology, grounded in dual process theories of reasoning. Educated…
Automated measurement of human body shape and curvature using computer vision
NASA Astrophysics Data System (ADS)
Pearson, Jeremy D.; Hobson, Clifford A.; Dangerfield, Peter H.
1993-06-01
A system to measure the surface shape of the human body has been constructed. The system uses a fringe pattern generated by projection of multi-stripe structured light. The optical methodology used is fully described and the algorithms used to process acquired digital images are outlined. The system has been applied to the measurement of the shape of the human back in scoliosis.
ERIC Educational Resources Information Center
Ivankova, Nataliya V.
2014-01-01
In spite of recent methodological developments related to quality assurance in mixed methods research, practical examples of how to implement quality criteria in designing and conducting sequential QUAN [right arrow] QUAL mixed methods studies to ensure the process is systematic and rigorous remain scarce. This article discusses a three-step…
ERIC Educational Resources Information Center
Taylor, Lloyd J., III; Poyner, Ilene
2008-01-01
Purpose: This study aims to investigate the problem of trained employee retention in a highly competitive labor market for a manufacturing facility in the oilfields of West Texas. Design/methodology/approach: This article examines how one manufacturing facility should be able to retain their trained employees by using the logic of Eliyahu M.…
NASA Technical Reports Server (NTRS)
Alexander, Tiffaney Miller
2017-01-01
Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Safety within space exploration ground processing operations, the underlying contributors to and causes of human error must be identified and classified in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factors Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, their impact on human error events, and to predict the human error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.
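The HEART side of such an analysis reduces to a simple calculation: a nominal human unreliability for the generic task type is scaled by each applicable error-producing condition (EPC) according to its assessed proportion of affect. The sketch below illustrates this standard HEART formula with illustrative numbers, not values from the NASA mishap data.

```python
# HEART-style human error probability (HEP) sketch: nominal task unreliability
# scaled by each Error Producing Condition (EPC) according to the assessed
# proportion of affect (APOA). Numbers are illustrative, not from the NASA data.
def heart_hep(nominal_hep, epcs):
    """epcs: list of (max_effect_multiplier, assessed_proportion 0..1)."""
    hep = nominal_hep
    for max_effect, apoa in epcs:
        hep *= (max_effect - 1.0) * apoa + 1.0
    return min(hep, 1.0)        # a probability cannot exceed 1

# Example: a fairly simple task performed rapidly (nominal HEP ~0.09 in HEART tables),
# affected by time shortage (x11) at 40% and poor feedback (x4) at 20%.
print(heart_hep(0.09, [(11, 0.4), (4, 0.2)]))   # ~0.72
```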
NASA Technical Reports Server (NTRS)
Alexander, Tiffaney Miller
2017-01-01
Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Quality within space exploration ground processing operations, the underlying contributors to and causes of human error must be identified and classified in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factors Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, their impact on human error events, and to predict the human error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.
Application of atomic force microscopy as a nanotechnology tool in food science.
Yang, Hongshun; Wang, Yifen; Lai, Shaojuan; An, Hongjie; Li, Yunfei; Chen, Fusheng
2007-05-01
Atomic force microscopy (AFM) provides a method for detecting nanoscale structural information. First, this review explains the fundamentals of AFM, including principle, manipulation, and analysis. Applications of AFM in food science and technology research are then reported, including qualitative macromolecule and polymer imaging, complicated or quantitative structure analysis, molecular interaction, molecular manipulation, surface topography, and nanofood characterization. The results suggest that AFM can bring insightful knowledge on food properties, and that AFM analysis can be used to illustrate some mechanisms of property changes during processing and storage. However, the current difficulty in applying AFM to food research is the lack of appropriate methodology for different food systems. A better understanding of AFM technology and the development of corresponding methodologies for complicated food systems would lead to a more in-depth understanding of food properties at the macromolecular level and broaden their applications. The AFM results could greatly improve food processing and storage technologies.
A Model for Oil-Gas Pipelines Cost Prediction Based on a Data Mining Process
NASA Astrophysics Data System (ADS)
Batzias, Fragiskos A.; Spanidis, Phillip-Mark P.
2009-08-01
This paper addresses the problems associated with the cost estimation of oil/gas pipelines during the elaboration of feasibility assessments. Techno-economic parameters, i.e., cost, length and diameter, are critical for such studies at the preliminary design stage. A methodology for the development of a cost prediction model based on Data Mining (DM) process is proposed. The design and implementation of a Knowledge Base (KB), maintaining data collected from various disciplines of the pipeline industry, are presented. The formulation of a cost prediction equation is demonstrated by applying multiple regression analysis using data sets extracted from the KB. Following the methodology proposed, a learning context is inductively developed as background pipeline data are acquired, grouped and stored in the KB, and through a linear regression model provide statistically substantial results, useful for project managers or decision makers.
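A minimal sketch of the regression step is given below: an ordinary-least-squares fit of a log-linear cost model in pipeline length and diameter. The data points stand in for records mined from the Knowledge Base and are entirely hypothetical.

```python
# Multiple-regression sketch for a pipeline cost model of the form
# log(cost) = b0 + b1*log(length) + b2*log(diameter), fitted by ordinary least
# squares. The data are hypothetical stand-ins for records extracted from a KB.
import numpy as np

length_km = np.array([ 50, 120, 200,  80, 300, 150])
diam_in   = np.array([ 16,  24,  36,  20,  42,  30])
cost_musd = np.array([ 45, 140, 330,  80, 560, 210])   # million USD (hypothetical)

X = np.column_stack([np.ones_like(length_km, dtype=float),
                     np.log(length_km), np.log(diam_in)])
beta, *_ = np.linalg.lstsq(X, np.log(cost_musd), rcond=None)

def predict_cost(length, diameter):
    return float(np.exp(beta @ [1.0, np.log(length), np.log(diameter)]))

print(beta)                       # fitted coefficients b0, b1, b2
print(predict_cost(100, 28))      # predicted cost, million USD
```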
NASA Astrophysics Data System (ADS)
Kouloumentas, Christos
2011-09-01
The concept of the all-fiberized multi-wavelength regenerator is analyzed, and the design methodology for operation at 40 Gb/s is presented. The specific methodology has been applied in the past for the experimental proof-of-principle of the technique, but it has never been reported in detail. The regenerator is based on a strong dispersion map that is implemented using alternating dispersion compensating fibers (DCF) and single-mode fibers (SMF), and minimizes the nonlinear interaction between the wavelength-division multiplexing (WDM) channels. The optimized regenerator design with + 0.86 ps/nm/km average dispersion of the nonlinear fiber section is further investigated. The specific design is capable of simultaneously processing five WDM channels with 800 GHz channel spacing and providing Q-factor improvement higher than 1 dB for each channel. The cascadeability of the regenerator is also indicated using a 6-node metropolitan network simulation model.
de Carvalho, Helder Pereira; Huang, Jiguo; Zhao, Meixia; Liu, Gang; Yang, Xinyu; Dong, Lili; Liu, Xingjuan
2016-01-01
In this study, a response surface methodology (RSM) model was applied to optimize Basic Red 2 (BR2) removal using an electrocoagulation/eggshell (ES) coupling process in a batch system. A central composite design was used to evaluate the effects and interactions of process parameters including current density, reaction time, initial pH and ES dosage on the BR2 removal efficiency and energy consumption. The analysis of variance revealed high R² values (≥85%), indicating that the predictions of the RSM models are adequately applicable for both responses. The optimum conditions, at which a dye removal efficiency of 93.18% and an energy consumption of 0.840 kWh/kg were observed, were a current density of 11.40 mA/cm², a reaction time of 5 min and 3 s, an initial pH of 6.5 and an ES dosage of 10.91 g/L.
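The RSM step can be sketched as a least-squares fit of a full second-order polynomial to central-composite-design data. The example below uses only two coded factors for brevity (the study used four) and hypothetical responses.

```python
# Second-order response-surface fit (two factors only, for brevity; the study
# used four). Design points and responses are hypothetical coded data.
import numpy as np

# coded factor levels (central composite design, 2 factors) and measured removal (%)
x1 = np.array([-1, -1,  1,  1, -1.414, 1.414,  0,     0,     0, 0, 0])   # current density (coded)
x2 = np.array([-1,  1, -1,  1,  0,     0,     -1.414, 1.414, 0, 0, 0])   # reaction time (coded)
y  = np.array([62, 70, 78, 88, 58,     86,     65,    84,    90, 91, 89.5])  # dye removal (%)

# full quadratic model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
b, res, rank, _ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ b
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
print("coefficients:", np.round(b, 2))
print("R^2: %.3f" % r2)
```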
Analysis and Implementation of Methodologies for the Monitoring of Changes in Eye Fundus Images
NASA Astrophysics Data System (ADS)
Gelroth, A.; Rodríguez, D.; Salvatelli, A.; Drozdowicz, B.; Bizai, G.
2011-12-01
We present a support system for change detection in fundus images of the same patient taken at different time intervals. This process is useful for monitoring pathologies that last for long periods of time, as ophthalmologic pathologies usually do. We propose a flow of preprocessing, processing and postprocessing applied to a set of images, selected from a public database, that present pathological progression. A test interface was developed to select the images to be compared, apply the different methods developed, and display the results. We measured the system performance in terms of sensitivity, specificity and computation time. We obtained good results, higher than 84% for the first two parameters and processing times lower than 3 seconds for 512x512 pixel images. For the specific case of detecting changes associated with bleeding, the system responds with sensitivity and specificity over 98%.
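Sensitivity and specificity in such a change-detection setting are computed pixel-wise from the detected change mask and a ground-truth mask; a minimal sketch with synthetic masks follows.

```python
# Pixel-wise sensitivity/specificity sketch for a change-detection mask against
# a manually annotated ground-truth mask (both boolean arrays of equal shape).
# The masks below are synthetic placeholders.
import numpy as np

def sens_spec(detected, truth):
    tp = np.sum( detected &  truth)
    tn = np.sum(~detected & ~truth)
    fp = np.sum( detected & ~truth)
    fn = np.sum(~detected &  truth)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

rng = np.random.default_rng(1)
truth    = rng.random((512, 512)) < 0.02             # synthetic "true change" pixels
detected = truth ^ (rng.random((512, 512)) < 0.01)   # detector with some errors
print("sensitivity %.3f, specificity %.3f" % sens_spec(detected, truth))
```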
On processing development for fabrication of fiber reinforced composite, part 2
NASA Technical Reports Server (NTRS)
Hou, Tan-Hung; Hou, Gene J. W.; Sheen, Jeen S.
1989-01-01
Fiber-reinforced composite laminates are used in many aerospace and automobile applications. The magnitudes and durations of the cure temperature and the cure pressure applied during the curing process have significant consequences for the performance of the finished product. The objective of this study is to exploit the potential of applying optimization techniques to the cure cycle design. Using the compression molding of a filled polyester sheet molding compound (SMC) as an example, a unified Computer Aided Design (CAD) methodology, consisting of three uncoupled modules (i.e., optimization, analysis and sensitivity calculations), is developed to systematically generate optimal cure cycle designs. Various optimization formulations for the cure cycle design are investigated. The uniformities in the distributions of the temperature and the degree of cure are compared with those resulting from conventional isothermal processing conditions with pre-warmed platens. Recommendations with regard to further research in the computerization of the cure cycle design are also addressed.
Qiao, Yuanhua; Keren, Nir; Mannan, M Sam
2009-08-15
Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
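A minimal sketch of the negative binomial regression step is shown below, using statsmodels with synthetic route data and an assumed pair of route-dependent covariates (length and traffic volume); the DPS variables and the fuzzy-logic treatment of route-independent factors are not reproduced.

```python
# Negative binomial regression sketch for accident counts as a function of
# route-dependent variables (here, route length and traffic volume). Data are
# synthetic; the actual DPS variables and the fuzzy-logic adjustment for
# route-independent factors are not reproduced.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
length_km = rng.uniform(5, 200, n)              # route length
aadt      = rng.uniform(1e3, 5e4, n)            # annual average daily traffic
mu = np.exp(-9.0 + 1.0 * np.log(length_km) + 0.6 * np.log(aadt))
accidents = rng.poisson(mu)                     # synthetic accident counts

X = sm.add_constant(np.column_stack([np.log(length_km), np.log(aadt)]))
model = sm.GLM(accidents, X, family=sm.families.NegativeBinomial(alpha=0.5))
result = model.fit()
print(result.params)            # intercept and elasticities w.r.t. length and AADT
```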
2018-03-01
We apply our methodology to the criticism text written in the flight-training program student evaluations in order to construct a model that ... factors.
Longenecker, R J; Galazyuk, A V
2012-11-16
Recently, prepulse inhibition of the acoustic startle reflex (ASR) has become a popular technique for tinnitus assessment in laboratory animals. This method confers a significant advantage over the previously used, time-consuming behavioral approaches that utilize basic mechanisms of conditioning. Although this technique has been successfully used to assess tinnitus in different laboratory animals, many of the finer details of this methodology have not been described well enough to be replicated, yet are critical for tinnitus assessment. Here we provide a detailed description of key procedures and methodological issues to guide newcomers through the process of learning to correctly apply gap detection techniques for tinnitus assessment in laboratory animals. The major categories of these issues include: refinement of hardware for best performance, optimization of stimulus parameters, behavioral considerations, and identification of optimal strategies for data analysis. This article is part of a Special Issue entitled: Tinnitus Neuroscience. Copyright © 2012. Published by Elsevier B.V.
Application of the HARDMAN methodology to the single channel ground-airborne radio system (SINCGARS)
NASA Astrophysics Data System (ADS)
Balcom, J.; Park, J.; Toomer, L.; Feng, T.
1984-12-01
The HARDMAN methodology is designed to assess the human resource requirements early in the weapon system acquisition process. In this case, the methodology was applied to the family of radios known as SINCGARS (Single Channel Ground-Airborne Radio System). At the time of the study, SINCGARS was approaching the Full-Scale Development phase, with 2 contractors in competition. Their proposed systems were compared with a composite baseline comparison (reference) system. The systems' manpower, personnel and training requirements were compared. Based on RAM data, the contractors' MPT figures showed a significant reduction from the figures derived for the baseline comparison system. Differences between the two contractors were relatively small. Impact and some tradeoff analyses were hindered by data access problems. Tactical radios, manpower and personnel requirements analysis, impact and tradeoff analysis, human resource sensitivity, training requirements analysis, human resources in LCSMM, and logistics analyses are discussed.
Response-Guided Community Detection: Application to Climate Index Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bello, Gonzalo; Angus, Michael; Pedemane, Navya
Discovering climate indices, i.e., time series that summarize spatiotemporal climate patterns, is a key task in the climate science domain. In this work, we approach this task as a problem of response-guided community detection; that is, identifying communities in a graph associated with a response variable of interest. To this end, we propose a general strategy for response-guided community detection that explicitly incorporates information of the response variable during the community detection process, and introduce a graph representation of spatiotemporal data that leverages information from multiple variables. We apply our proposed methodology to the discovery of climate indices associated with seasonal rainfall variability. Our results suggest that our methodology is able to capture the underlying patterns known to be associated with the response variable of interest and to improve its predictability compared to existing methodologies for data-driven climate index discovery and official forecasts.
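As a rough illustration of the pipeline, the sketch below builds a correlation graph over synthetic grid-cell time series, detects communities with ordinary modularity maximization, and screens the community-mean series against a response. Note that this is not the authors' response-guided algorithm, which incorporates the response during detection; everything here is a simplified, synthetic stand-in.

```python
# Simplified sketch: build a correlation graph over grid-cell time series, find
# communities by modularity (NOT response-guided; the paper's method uses the
# response during detection), then screen community-mean series ("candidate
# indices") by their correlation with a response such as seasonal rainfall.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(0)
n_cells, n_time = 60, 120
data = rng.standard_normal((n_cells, n_time))
data[:20] += rng.standard_normal(n_time)        # a block of correlated cells
response = data[:20].mean(axis=0) + 0.5 * rng.standard_normal(n_time)

corr = np.corrcoef(data)
G = nx.Graph()
G.add_nodes_from(range(n_cells))
for i in range(n_cells):
    for j in range(i + 1, n_cells):
        if abs(corr[i, j]) > 0.3:
            G.add_edge(i, j, weight=abs(corr[i, j]))

for comm in greedy_modularity_communities(G, weight="weight"):
    idx = data[list(comm)].mean(axis=0)                    # candidate climate index
    r = np.corrcoef(idx, response)[0, 1]
    print(f"community size {len(comm):2d}  corr with response {r:+.2f}")
```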
An analysis of IGBP global land-cover characterization process
Loveland, Thomas R.; Zhu, Zhiliang; Ohlen, Donald O.; Brown, Jesslyn F.; Reed, Bradley C.; Yang, Limin
1999-01-01
The International Geosphere-Biosphere Programme (IGBP) has called for the development of improved global land-cover data for use in increasingly sophisticated global environmental models. To meet this need, the staff of the U.S. Geological Survey and the University of Nebraska-Lincoln developed and applied a global land-cover characterization methodology using 1992-1993 1-km resolution Advanced Very High Resolution Radiometer (AVHRR) and other spatial data. The methodology, based on unsupervised classification with extensive postclassification refinement, yielded a multi-layer database consisting of eight land-cover data sets, descriptive attributes, and source data. An independent IGBP accuracy assessment reports a global accuracy of 73.5 percent, with continental results varying from 63 percent to 83 percent. Although data quality, methodology, interpreter performance, and logistics affected the results, significant problems were associated with the relationship between AVHRR data and fine-scale, spectrally similar land-cover patterns in complex natural or disturbed landscapes.
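The unsupervised classification step can be illustrated with a k-means clustering of multitemporal NDVI profiles; the sketch below uses synthetic phenologies as stand-ins for the AVHRR composites and omits the extensive postclassification refinement.

```python
# Unsupervised land-cover clustering sketch: k-means on synthetic monthly NDVI
# profiles of 1-km cells, standing in for the multitemporal AVHRR composites.
# Cluster labels would then be interpreted and relabelled by analysts, which is
# the post-classification refinement step and is not shown here.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
months = np.arange(12)
# three synthetic phenologies: evergreen, cropland (summer peak), barren
profiles = np.vstack([
    0.7 + 0.05 * np.sin(2 * np.pi * months / 12),
    0.2 + 0.5 * np.exp(-0.5 * ((months - 7) / 1.5) ** 2),
    0.1 + 0.02 * rng.standard_normal(12),
])
pixels = np.repeat(profiles, 500, axis=0) + 0.05 * rng.standard_normal((1500, 12))

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
print(np.bincount(labels))     # pixels per spectral-temporal cluster
```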
Cordeiro, Liliana; Valente, Inês M; Santos, João Rodrigo; Rodrigues, José A
2018-05-01
In this work, an analytical methodology for the characterization of volatile carbonyl compounds in green and roasted coffee beans was developed. The methodology relies on a recent and simple sample preparation technique, gas-diffusion microextraction, for extraction of the samples' volatiles, followed by HPLC-DAD-MS/MS analysis. The experimental conditions in terms of extraction temperature and extraction time were studied. A carbonyl compound profile was obtained for both arabica and robusta coffee species (green and roasted samples). Twenty-seven carbonyl compounds were identified and further discussed, in light of the reported literature, in relation to different coffee characteristics: coffee ageing, organoleptic impact, presence of defective beans, authenticity, human health implications, post-harvest coffee processing and roasting. The applied methodology proved to be a powerful analytical tool for coffee characterization, as it measures marker compounds of different coffee characteristics. Copyright © 2018 Elsevier Ltd. All rights reserved.
Bayesian Local Contamination Models for Multivariate Outliers
Page, Garritt L.; Dunson, David B.
2013-01-01
In studies where data are generated from multiple locations or sources it is common for there to exist observations that are quite unlike the majority. Motivated by the application of establishing a reference value in an inter-laboratory setting when outlying labs are present, we propose a local contamination model that is able to accommodate unusual multivariate realizations in a flexible way. The proposed method models the process level of a hierarchical model using a mixture with a parametric component and a possibly nonparametric contamination. Much of the flexibility in the methodology is achieved by allowing varying random subsets of the elements in the lab-specific mean vectors to be allocated to the contamination component. Computational methods are developed and the methodology is compared to three other possible approaches using a simulation study. We apply the proposed method to a NIST/NOAA sponsored inter-laboratory study which motivated the methodological development. PMID:24363465
Applying the SERENITY Methodology to the Domain of Trusted Electronic Archiving
NASA Astrophysics Data System (ADS)
Porekar, Jan; Klobučar, Tomaž; Šaljič, Svetlana; Gabrijelčič, Dušan
We present the application of the SERENITY methodology to the domain of long-term trusted electronic archiving, sometimes also referred to as trusted digital notary services. We address the SERENITY approach from the point of view of a company providing security solutions in the mentioned domain and adopt the role of a solution developer. In this chapter we show a complete vertical slice through the trusted archiving domain, providing: (i) the relevant S&D properties, (ii) the S&D classes and S&D patterns on both the organizational and technical levels, and (iii) a description of how S&D patterns are integrated into a trusted long-term archiving service using the SERENITY Run-Time Framework (SRF). At the end of the chapter we put in perspective what a solution developer can learn from the process of capturing security knowledge according to the SERENITY methodology, and we discuss how existing implementations of archiving services can benefit from the SERENITY approach in the future.
NASA Astrophysics Data System (ADS)
Polatidis, Heracles; Morales, Jan Borràs
2016-11-01
In this paper a methodological framework for increasing the actual implementation potential of wind farms is developed and applied. The framework is based on multi-criteria decision aid techniques that perform an integrated technical and societal evaluation of a number of potential wind power projects that are variations of a pre-existing actual proposal facing implementation difficulties. A number of evaluation criteria are established and assessed via particular related software or are comparatively evaluated among each other on a semi-qualitative basis. The preferences of a diverse audience of pertinent stakeholders can also be incorporated in the overall analysis. The result of the process is the identification of a new project that will exhibit increased actual implementation potential compared with the original proposal. The methodology is tested in a case study of a wind farm in the UK and relevant conclusions are drawn.
NASA Astrophysics Data System (ADS)
Hoang, Hanh H.; Jung, Jason J.; Tran, Chi P.
2014-11-01
Based on an in-depth analysis of the existing approaches to applying semantic technologies to business process management (BPM) research, from the perspective of cross-enterprise collaboration or so-called business-to-business integration, we analyse, discuss and compare the methodologies, applications and best practices of the surveyed approaches against the proposed criteria. This article identifies various relevant research directions in semantic BPM (SBPM). Based on the results of our investigation, we summarise the state of the art of SBPM. We also address areas and directions for further research activities.
Process improvement of pap smear tracking in a women's medicine center clinic in residency training.
Calhoun, Byron C; Goode, Jeff; Simmons, Kathy
2011-11-01
Application of Six-Sigma methodology and Change Acceleration Process (CAP)/Work Out (WO) tools to track pap smear results in an outpatient clinic in a hospital-based residency-training program. Observational study of impact of changes obtained through application of Six-Sigma principles in clinic process with particular attention to prevention of sentinel events. Using cohort analysis and applying Six-Sigma principles to an interactive electronic medical record Soarian workflow engine, we designed a system of timely accession and reporting of pap smear and pathology results. We compared manual processes from January 1, 2007 to February 28, 2008 to automated processes from March 1, 2008 to December 31, 2009. Using the Six-Sigma principles, CAP/WO tools, including "voice of the customer" and team focused approach, no outlier events went untracked. Applying the Soarian workflow engine to track prescribed 7 day turnaround time for completion, we identified 148 pap results in 3,936, 3 non-gynecological results in 15, and 41 surgical results in 246. We applied Six-Sigma principles to an outpatient clinic facilitating an interdisciplinary team approach to improve the clinic's reporting system. Through focused problem assessment, verification of process, and validation of outcomes, we improved patient care for pap smears and critical pathology. © 2011 National Association for Healthcare Quality.
An Introduction to Flight Software Development: FSW Today, FSW 2010
NASA Technical Reports Server (NTRS)
Gouvela, John
2004-01-01
Experience and knowledge gained from ongoing maintenance of Space Shuttle Flight Software and new development projects including Cockpit Avionics Upgrade are applied to projected needs of the National Space Exploration Vision through Spiral 2. Lessons learned from these current activities are applied to create a sustainable, reliable model for development of critical software to support Project Constellation. This presentation introduces the technologies, methodologies, and infrastructure needed to produce and sustain high quality software. It will propose what is needed to support a Vision for Space Exploration that places demands on the innovation and productivity needed to support future space exploration. The technologies in use today within FSW development include tools that provide requirements tracking, integrated change management, modeling and simulation software. Specific challenges that have been met include the introduction and integration of Commercial Off the Shelf (COTS) Real Time Operating System for critical functions. Though technology prediction has proved to be imprecise, Project Constellation requirements will need continued integration of new technology with evolving methodologies and changing project infrastructure. Targets for continued technology investment are integrated health monitoring and management, self healing software, standard payload interfaces, autonomous operation, and improvements in training. Emulation of the target hardware will also allow significant streamlining of development and testing. The methodologies in use today for FSW development are object oriented UML design, iterative development using independent components, as well as rapid prototyping . In addition, Lean Six Sigma and CMMI play a critical role in the quality and efficiency of the workforce processes. Over the next six years, we expect these methodologies to merge with other improvements into a consolidated office culture with all processes being guided by automated office assistants. The infrastructure in use today includes strict software development and configuration management procedures, including strong control of resource management and critical skills coverage. This will evolve to a fully integrated staff organization with efficient and effective communication throughout all levels guided by a Mission-Systems Architecture framework with focus on risk management and attention toward inevitable product obsolescence. This infrastructure of computing equipment, software and processes will itself be subject to technological change and need for management of change and improvement,
NASA Astrophysics Data System (ADS)
El Serafy, Ghada; Gaytan Aguilar, Sandra; Ziemba, Alexander
2016-04-01
There is an increasing use of process-based models in the investigation of ecological systems and scenario predictions. The accuracy and quality of these models are improved when they are run with high spatial and temporal resolution data sets. However, ecological data can often be difficult to collect, which manifests itself through irregularities in the spatial and temporal domains of these data sets. Through the use of the Data INterpolating Empirical Orthogonal Functions (DINEOF) methodology, earth observation products can be improved to have full spatial coverage within the desired domain as well as increased temporal resolution at daily and weekly time steps, as frequently required by process-based models [1]. The DINEOF methodology results in a degree of error being affixed to the refined data product. In order to determine the degree of error introduced through this process, the suspended particulate matter and chlorophyll-a data from MERIS are used with DINEOF to produce high resolution products for the Wadden Sea. These new data sets are then compared with in-situ and other data sources to determine the error. Also, artificial cloud cover scenarios are conducted in order to substantiate the findings from the MERIS data experiments. Secondly, the accuracy of DINEOF is explored to evaluate the variance of the methodology. The degree of accuracy is combined with the overall error produced by the methodology and reported in an assessment of the quality of DINEOF when applied to resolution refinement of chlorophyll-a and suspended particulate matter in the Wadden Sea. References [1] Sirjacobs, D.; Alvera-Azcárate, A.; Barth, A.; Lacroix, G.; Park, Y.; Nechad, B.; Ruddick, K.G.; Beckers, J.-M. (2011). Cloud filling of ocean colour and sea surface temperature remote sensing products over the Southern North Sea by the Data Interpolating Empirical Orthogonal Functions methodology. J. Sea Res. 65(1): 114-130. Dx.doi.org/10.1016/j.seares.2010.08.002
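The core DINEOF idea, iteratively reconstructing gaps from a truncated EOF/SVD decomposition, can be sketched as follows; the cross-validated choice of the number of modes used by the real DINEOF is omitted, and the data are synthetic.

```python
# Minimal DINEOF-style gap filling: iteratively reconstruct missing values of a
# space x time matrix from a truncated SVD until convergence. The real DINEOF
# additionally selects the optimal number of modes by cross-validation, which
# is omitted here. Data are synthetic chlorophyll-like fields with cloud gaps.
import numpy as np

rng = np.random.default_rng(0)
space, time, n_modes = 300, 100, 3
truth = rng.standard_normal((space, n_modes)) @ rng.standard_normal((n_modes, time))
obs = truth.copy()
obs[rng.random(obs.shape) < 0.4] = np.nan        # ~40% "cloud" gaps

mask = np.isnan(obs)
filled = np.where(mask, np.nanmean(obs), obs)    # first guess: overall mean
for _ in range(50):
    U, s, Vt = np.linalg.svd(filled, full_matrices=False)
    recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
    new = np.where(mask, recon, obs)             # keep observed values fixed
    if np.max(np.abs(new - filled)) < 1e-6:
        break
    filled = new

rmse = np.sqrt(np.mean((filled[mask] - truth[mask]) ** 2))
print("reconstruction RMSE on gap pixels: %.3f" % rmse)
```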
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruiz-Padillo, Alejandro, E-mail: aruizp@correo.ugr.es; Civil Engineering Department, University of Granada, Av. Fuentenueva s/n, 18071 Granada; Ruiz, Diego P., E-mail: druiz@ugr.es
Road traffic noise is one of the most significant environmental impacts generated by transport systems. To this regard, the recent implementation of the European Environmental Noise Directive by Public Administrations of the European Union member countries has led to various noise action plans (NAPs) for reducing the noise exposure of EU inhabitants. Every country or administration is responsible for applying criteria based on their own experience or expert knowledge, but there is no regulated process for the prioritization of technical measures within these plans. This paper proposes a multi-criteria decision methodology for the selection of suitable alternatives against traffic noise in each of the road stretches included in the NAPs. The methodology first defines the main criteria and alternatives to be considered. Secondly, it determines the relative weights for the criteria and sub-criteria using the fuzzy extended analytical hierarchy process as applied to the results from an expert panel, thereby allowing expert knowledge to be captured in an automated way. A final step comprises the use of discrete multi-criteria analysis methods such as weighted sum, ELECTRE and TOPSIS, to rank the alternatives by suitability. To illustrate an application of the proposed methodology, this paper describes its implementation in a complex real case study: the selection of optimal technical solutions against traffic noise in the top priority road stretch included in the revision of the NAP of the regional road network in the province of Almeria (Spain).
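The final ranking step can be illustrated with a small TOPSIS sketch; the alternatives, criteria, scores and weights below are hypothetical, and the fuzzy-AHP weight derivation from the expert panel is not shown.

```python
# TOPSIS sketch for ranking noise-mitigation alternatives on a road stretch.
# Alternatives, criteria, scores and weights are hypothetical; the paper derives
# the weights from an expert panel via fuzzy extended AHP, which is not shown.
import numpy as np

alts = ["noise barrier", "low-noise pavement", "speed reduction", "facade insulation"]
#                cost(kEUR)  dB reduction  people benefited  implementation time (months)
scores = np.array([
    [ 900.0, 8.0, 1200, 12],
    [ 600.0, 4.0, 1500,  6],
    [  50.0, 3.0, 1500,  1],
    [1200.0, 7.0,  400, 18],
])
weights = np.array([0.30, 0.35, 0.25, 0.10])
benefit = np.array([False, True, True, False])   # cost and time are "lower is better"

norm = scores / np.linalg.norm(scores, axis=0)   # vector normalization
v = norm * weights
ideal      = np.where(benefit, v.max(axis=0), v.min(axis=0))
anti_ideal = np.where(benefit, v.min(axis=0), v.max(axis=0))
d_plus  = np.linalg.norm(v - ideal, axis=1)
d_minus = np.linalg.norm(v - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)

for name, c in sorted(zip(alts, closeness), key=lambda t: -t[1]):
    print(f"{name:20s} {c:.3f}")
```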
Integrating Laser Scanner and Bim for Conservation and Reuse: "the Lyric Theatre of Milan"
NASA Astrophysics Data System (ADS)
Utica, G.; Pinti, L.; Guzzoni, L.; Bonelli, S.; Brizzolari, A.
2017-12-01
The paper underlines the importance of applying a methodology that integrates Building Information Modeling (BIM), the Work Breakdown Structure (WBS) and laser scanning in conservation and reuse projects. As is known, laser scanner technology provides a survey of the building that is more accurate than one carried out using traditional methodologies. Today most existing buildings present their attributes in a dispersed way, stored and collected in paper documents, in sheets of equipment information, and in file folders of maintenance records. In some cases it is difficult to find updated technical documentation, and the search for reliable data can be a cost- and time-consuming process. Therefore, this new survey technology, embedded with BIM systems, represents a valid tool for obtaining a coherent picture of the building's state. The case presented consists of the conservation and reuse project of the Milan Lyric Theatre, started in 2013 through the collaboration between the Milan Polytechnic and the Municipality. This project is a first attempt to integrate these techniques, which are already professional standards in many other countries such as the US, Norway, Finland and England. Concerning the methodology, the choice has been to use BIM software for the structured analysis of the project, with the aim of defining a single code of communication so that coherent documentation can be developed according to rules, in a consistent manner and on tight schedules. This process provides the definition of an effective and efficient operating method that can be applied to other projects.
PseudoBase: a database with RNA pseudoknots.
van Batenburg, F H; Gultyaev, A P; Pleij, C W; Ng, J; Oliehoek, J
2000-01-01
PseudoBase is a database containing structural, functional and sequence data related to RNA pseudoknots. It can be reached at http://wwwbio.LeidenUniv.nl/~Batenburg/PKB.html. This page will direct the user to a retrieval page from where a particular pseudoknot can be chosen, or to a submission page which enables the user to add pseudoknot information to the database, or to an informative page that elaborates on the various aspects of the database. For each pseudoknot, 12 items are stored, e.g. the nucleotides of the region that contains the pseudoknot, the stem positions of the pseudoknot, the EMBL accession number of the sequence that contains this pseudoknot and the support that can be given regarding the reliability of the pseudoknot. Access is via a small number of steps, using 16 different categories. The development process was done by applying the evolutionary methodology for software development rather than by applying the methodology of the classical waterfall model or the more modern spiral model.
Analysis of pressure distortion testing
NASA Technical Reports Server (NTRS)
Koch, K. E.; Rees, R. L.
1976-01-01
The development of a distortion methodology, method D, was documented, and its application to steady state and unsteady data was demonstrated. Three methodologies based upon DIDENT, a NASA-LeRC distortion methodology based upon the parallel compressor model, were investigated by applying them to a set of steady state data. The best formulation was then applied to an independent data set. The good correlation achieved with this data set showed that method E, one of the above methodologies, is a viable concept. Unsteady data were analyzed by using the method E methodology. This analysis pointed out that the method E sensitivities are functions of pressure defect level as well as corrected speed and pattern.
Cho, Yongrae; Kim, Minsung
2014-01-01
The volatility and uncertainty in the process of technological development are growing faster than ever due to rapid technological innovation. Such phenomena result in integration among disparate technology fields. At this point, it is a critical research issue to understand the different roles and the propensity of each element technology for technological convergence. In particular, the network-based approach provides a holistic view in terms of technological linkage structures. Furthermore, the development of new indicators based on network visualization can reveal the dynamic patterns among disparate technologies in the process of technological convergence and provide insights for future technological developments. This research attempts to analyze and discover the patterns of the international patent classification codes of the United States Patent and Trademark Office's patent data in printed electronics, which is a representative technology in the technological convergence process. To this end, we apply physical concepts as a new methodological approach to interpreting technological convergence. More specifically, the concepts of entropy and gravity are applied to measure the activities among patent citations and the binding forces among heterogeneous technologies during technological convergence. By applying the entropy and gravity indexes, we could distinguish the characteristic role of each technology in printed electronics. At the technological convergence stage, each technology exhibits idiosyncratic dynamics which tend to decrease technological differences and heterogeneity. Furthermore, through nonlinear regression analysis, we found decreasing patterns of disparity over the total period examined in the evolution of technological convergence. This research has discovered the specific role of each element technology field and has consequently identified the co-evolutionary patterns of technological convergence. These new findings on the evolutionary patterns of technological convergence provide implications for engineering and technology foresight research, as well as for corporate strategy and technology policy.
Process-based tolerance assessment of connecting rod machining process
NASA Astrophysics Data System (ADS)
Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.
2016-06-01
Process tolerancing based on process capability studies is an optimistic and pragmatic approach to determining manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index (Cp) and the process performance capability index (Cpk) values of identified process characteristics of the connecting rod machining process are achieved to be greater than the industry benchmark of 1.33, i.e., the four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and the process capability studies of the connecting rod component. Finally, the process tolerancing comparison has been done by adopting tolerance capability expert software.
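For reference, the two capability indexes cited above can be computed as in the following sketch, using hypothetical connecting-rod measurements and specification limits rather than the study's data.

```python
# Minimal sketch of the process capability indexes mentioned above,
# using hypothetical connecting-rod bore measurements and spec limits.
import statistics

def cp_cpk(samples, lsl, usl):
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)            # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # performance capability
    return cp, cpk

bore = [48.002, 47.998, 48.001, 48.000, 47.999, 48.003, 48.001, 47.997]
cp, cpk = cp_cpk(bore, lsl=47.985, usl=48.015)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # the paper's benchmark is 1.33 (four sigma)
```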
VASSAR: Value assessment of system architectures using rules
NASA Astrophysics Data System (ADS)
Selva, D.; Crawley, E. F.
A key step of the mission development process is the selection of a system architecture, i.e., the layout of the major high-level system design decisions. This step typically involves the identification of a set of candidate architectures and a cost-benefit analysis to compare them. Computational tools have been used in the past to bring rigor and consistency into this process. These tools can automatically generate architectures by enumerating different combinations of decisions and options. They can also evaluate these architectures by applying cost models and simplified performance models. Current performance models are purely quantitative tools that are best suited for the evaluation of the technical performance of a mission design. However, assessing the relative merit of a system architecture is a much more holistic task than evaluating performance of a mission design. Indeed, the merit of a system architecture comes from satisfying a variety of stakeholder needs, some of which are easy to quantify, and some of which are harder to quantify (e.g., elegance, scientific value, political robustness, flexibility). Moreover, assessing the merit of a system architecture at these very early stages of design often requires dealing with a mix of (a) quantitative and semi-qualitative data and (b) objective and subjective information. Current computational tools are poorly suited for these purposes. In this paper, we propose a general methodology that can be used to assess the relative merit of several candidate system architectures under the presence of objective, subjective, quantitative, and qualitative stakeholder needs. The methodology is called VASSAR (Value ASsessment of System Architectures using Rules). The major underlying assumption of the VASSAR methodology is that the merit of a system architecture can be assessed by comparing the capabilities of the architecture with the stakeholder requirements. Hence, for example, a candidate architecture that fully satisfies all critical stakeholder requirements is a good architecture. The assessment process is thus fundamentally seen as a pattern matching process where capabilities match requirements, which motivates the use of rule-based expert systems (RBES). This paper describes the VASSAR methodology and shows how it can be applied to a large complex space system, namely an Earth observation satellite system. Companion papers show its applicability to the NASA space communications and navigation program and the joint NOAA-DoD NPOESS program.
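The capability-versus-requirement matching idea behind VASSAR can be illustrated with a toy sketch; the capability names, requirements, and weights below are hypothetical, and the actual methodology uses a rule-based expert system rather than this simple scoring loop.

```python
# Toy sketch of the matching idea described above: merit emerges from comparing
# architecture capabilities against weighted stakeholder requirements.
# All names and weights are hypothetical.
ARCH_CAPABILITIES = {"soil-moisture-map", "sea-surface-temp", "cloud-profile"}

REQUIREMENTS = [  # (stakeholder requirement, needed capability, importance weight)
    ("hydrology-science", "soil-moisture-map", 0.5),
    ("ocean-science", "sea-surface-temp", 0.3),
    ("atmosphere-science", "aerosol-profile", 0.2),
]

def architecture_score(capabilities, requirements):
    """Sum the weights of requirements whose needed capability is present."""
    return sum(w for _, cap, w in requirements if cap in capabilities)

print(architecture_score(ARCH_CAPABILITIES, REQUIREMENTS))  # 0.8 of 1.0 satisfied
```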
Anderson, Melissa L; Wolf Craig, Kelly S; Ziedonis, Douglas M
2017-01-01
Deaf individuals experience significant obstacles to participating in behavioral health research when careful consideration is not given to accessibility during the design of study methodology. To inform such considerations, we conducted an exploratory secondary analysis of a mixed-methods study that originally explored 16 Deaf trauma survivors' help-seeking experiences. Our objective was to identify key findings and qualitative themes from consumers' own words that could be applied to the design of behavioral clinical trials methodology. In many ways, the themes that emerged were not wholly dissimilar from the general preferences of members of other sociolinguistic minority groups-a need for communication access, empathy, respect, strict confidentiality procedures, trust, and transparency of the research process. Yet, how these themes are applied to the inclusion of Deaf research participants is distinct from any other sociolinguistic minority population, given Deaf people's unique sensory and linguistic characteristics. We summarize our findings in a preliminary "Checklist for Designing Deaf Behavioral Clinical Trials" to operationalize the steps researchers can take to apply Deaf-friendly approaches in their empirical work. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Yabalak, Erdal
2018-05-18
This study was performed to investigate the mineralization of ticarcillin in an artificially prepared aqueous solution representing ticarcillin-contaminated waters, which constitute a serious problem for human health. Removal of 81.99% of total organic carbon, 79.65% of chemical oxygen demand, and 94.35% of ticarcillin was achieved by using the eco-friendly, time-saving, powerful and easy-to-apply subcritical water oxidation method in the presence of a safe-to-use oxidizing agent, hydrogen peroxide. Central composite design, which belongs to the response surface methodology, was applied to design the degradation experiments, to optimize the method, and to evaluate the effects of the system variables, namely temperature, hydrogen peroxide concentration, and treatment time, on the responses. In addition, theoretical equations were proposed for each removal process. ANOVA tests were utilized to evaluate the reliability of the developed models. F values of 245.79, 88.74, and 48.22 were found for total organic carbon removal, chemical oxygen demand removal, and ticarcillin removal, respectively. Moreover, artificial neural network modeling was applied to estimate the response in each case, and its prediction and optimization performance was statistically examined and compared to the performance of the central composite design.
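A central composite design of the kind used above can be generated in coded units as in the following sketch; the factor settings are illustrative and do not reproduce the study's actual design.

```python
# Hedged sketch of a central composite design for three factors (e.g. temperature,
# H2O2 concentration, treatment time) in coded units; not the study's design matrix.
import itertools
import numpy as np

def central_composite(n_factors=3, alpha=1.0, n_center=4):
    """Cube (2^k factorial), axial (2k star points at +/- alpha), and centre runs."""
    cube = np.array(list(itertools.product([-1.0, 1.0], repeat=n_factors)))
    axial = alpha * np.vstack((np.eye(n_factors), -np.eye(n_factors)))
    center = np.zeros((n_center, n_factors))
    return np.vstack((cube, axial, center))    # coded levels, one row per run

design = central_composite()
print(design.shape)                            # (2**3 + 2*3 + 4, 3) -> 18 runs
```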
Bias-dependent local structure of water molecules at an electrochemical interface
NASA Astrophysics Data System (ADS)
Pedroza, Luana; Brandimarte, Pedro; Rocha, Alexandre R.; Fernandez-Serra, Marivi
2015-03-01
Following the need for new - and renewable - sources of energy worldwide, fuel cells using electrocatalysts can be thought of as a viable option. Understanding the local structure of water molecules at the interfaces of the metallic electrodes is a key problem. Notably, the system is under an external potential bias, which makes the task of simulating this setup difficult. A first-principles description of all components of the system is the most appropriate methodology in order to advance understanding of electrochemical processes. In such systems, the metal is usually charged. To correctly compute the effect of an external bias potential applied to electrodes, we combine density functional theory (DFT) and non-equilibrium Green's function (NEGF) methods, with and without van der Waals interactions. In this work, we apply this methodology to study the electronic properties and forces of a single water molecule and of a water monolayer at the interface of gold electrodes. We find that the water molecule has a different torque direction depending on the sign of the bias applied. We also show that the bias changes the position of the most stable configuration, indicating that the external bias plays an important role in the structural properties of the interface. We acknowledge financial support from FAPESP.
Lu, Haibin; Chandrasekar, Balakumaran; Oeljeklaus, Julian; Misas-Villamil, Johana C; Wang, Zheming; Shindo, Takayuki; Bogyo, Matthew; Kaiser, Markus; van der Hoorn, Renier A L
2015-08-01
Cysteine proteases are an important class of enzymes implicated in both developmental and defense-related programmed cell death and other biological processes in plants. Because there are dozens of cysteine proteases that are posttranslationally regulated by processing, environmental conditions, and inhibitors, new methodologies are required to study these pivotal enzymes individually. Here, we introduce fluorescence activity-based probes that specifically target three distinct cysteine protease subfamilies: aleurain-like proteases, cathepsin B-like proteases, and vacuolar processing enzymes. We applied protease activity profiling with these new probes on Arabidopsis (Arabidopsis thaliana) protease knockout lines and agroinfiltrated leaves to identify the probe targets and on other plant species to demonstrate their broad applicability. These probes revealed that most commercially available protease inhibitors target unexpected proteases in plants. When applied on germinating seeds, these probes reveal dynamic activities of aleurain-like proteases, cathepsin B-like proteases, and vacuolar processing enzymes, coinciding with the remobilization of seed storage proteins. © 2015 American Society of Plant Biologists. All Rights Reserved.
Optimisation study of a vehicle bumper subsystem with fuzzy parameters
NASA Astrophysics Data System (ADS)
Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.
2012-10-01
This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. The automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and regarding the environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, this way facilitating the automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system level of failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for the uncertainty treatment in crash simulations. This process is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).
Crevillén-García, D
2018-04-01
Time-consuming numerical simulators for solving groundwater flow and dissolution models of physico-chemical processes in deep aquifers normally require some of the model inputs to be defined in high-dimensional spaces in order to return realistic results. Sometimes, the outputs of interest are spatial fields leading to high-dimensional output spaces. Although Gaussian process emulation has been satisfactorily used for computing faithful and inexpensive approximations of complex simulators, these have been mostly applied to problems defined in low-dimensional input spaces. In this paper, we propose a method for simultaneously reducing the dimensionality of very high-dimensional input and output spaces in Gaussian process emulators for stochastic partial differential equation models while retaining the qualitative features of the original models. This allows us to build a surrogate model for the prediction of spatial fields in such time-consuming simulators. We apply the methodology to a model of convection and dissolution processes occurring during carbon capture and storage.
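The core idea, emulating a reduced output space component-by-component with Gaussian processes, can be sketched as below using scikit-learn on synthetic data; this is a minimal illustration, not the paper's implementation.

```python
# Minimal sketch (not the paper's method): reduce a high-dimensional simulator
# output to a few principal components and emulate each component with a
# Gaussian process, then map predictions back to the full spatial field.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.uniform(size=(60, 5))                       # (already reduced) simulator inputs
Y = np.stack([np.sin(X @ rng.normal(size=5)) for _ in range(200)], axis=1)  # synthetic spatial field

pca = PCA(n_components=3).fit(Y)                    # output-space dimension reduction
scores = pca.transform(Y)

gps = [GaussianProcessRegressor(kernel=RBF()).fit(X, scores[:, i]) for i in range(3)]

def emulate(x_new):
    z = np.column_stack([gp.predict(x_new) for gp in gps])
    return pca.inverse_transform(z)                 # back to the full spatial field

print(emulate(rng.uniform(size=(1, 5))).shape)      # (1, 200)
```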
Modified Method of Simplest Equation Applied to the Nonlinear Schrödinger Equation
NASA Astrophysics Data System (ADS)
Vitanov, Nikolay K.; Dimitrova, Zlatinka I.
2018-03-01
We consider an extension of the methodology of the modified method of simplest equation to the case of use of two simplest equations. The extended methodology is applied for obtaining exact solutions of model nonlinear partial differential equations for deep water waves: the nonlinear Schrödinger equation. It is shown that the methodology works also for other equations of the nonlinear Schrödinger kind.
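For reference, a standard form of the focusing nonlinear Schrödinger equation and the traveling-wave ansatz that reduces it to an ordinary differential equation are given below; the normalization is illustrative and may differ from the authors' exact coefficients.

```latex
% A standard focusing nonlinear Schroedinger (NLS) equation for deep-water wave
% envelopes (normalization illustrative; the paper's coefficients may differ):
\[
  i\,\frac{\partial q}{\partial t}
  + \frac{\partial^{2} q}{\partial x^{2}}
  + 2\,|q|^{2} q = 0 .
\]
% Travelling-wave solutions are sought via the ansatz
\[
  q(x,t) = g(\xi)\, e^{i(\kappa x - \omega t)}, \qquad \xi = x - v t ,
\]
% which reduces the PDE to an ODE that the (modified) method of simplest
% equation can solve in terms of the solutions of the chosen simplest equations.
```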
Operator agency in process intervention: tampering versus application of tacit knowledge
NASA Astrophysics Data System (ADS)
Van Gestel, P.; Pons, D. J.; Pulakanam, V.
2015-09-01
Statistical process control (SPC) theory takes a negative view of adjustment of process settings, which is termed tampering. In contrast, quality and lean programmes actively encourage operators to intervene and exercise personal agency in the improvement of production outcomes. This creates a conflict that requires operator judgement: How does one differentiate between unnecessary tampering and needful intervention? A further difficulty is that operators apply tacit knowledge to such judgements. There is a need to determine where in a given production process the operators are applying tacit knowledge, and whether this is hindering or aiding quality outcomes. The work involved the conjoint application of systems engineering, statistics, and knowledge management principles, in the context of a case study. Systems engineering was used to create a functional model of a real plant. Actual plant data were analysed with the statistical methods of ANOVA, feature selection, and link analysis. This identified the variables to which the output quality was most sensitive. These key variables were mapped back to the functional model. Fieldwork was then directed to those areas to prospect for operator judgement activities. A natural conversational approach was used to determine where and how operators were applying judgement. This contrasts with the interrogative approach of conventional knowledge management. Data are presented for a case study of a meat rendering plant. The results identify specific areas where operators' tacit knowledge and mental models contribute to quality outcomes, and untangle the motivations behind their agency. Also evident is how novice and expert operators apply their knowledge differently. Novices were focussed on meeting throughput objectives, and their incomplete understanding of the plant characteristics led them to inadvertently sacrifice quality in the pursuit of productivity in certain situations. Operators' responses to the plant are affected by their individual mental models of the plant, which differ between operators and have variable validity. Their behaviour is also affected by differing interpretations of how their personal agency should be applied to the achievement of production objectives. The methodology developed here is an integration of systems engineering, statistical analysis, and knowledge management. It shows how to determine where in a given production process the operator intervention is occurring, how it affects quality outcomes, and what tacit knowledge operators are using. It thereby assists the continuous quality improvement processes in a different way to SPC. A second contribution is the provision of a novel methodology for knowledge management, one that circumvents the usual codification barriers to knowledge management.
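The statistical screening step described above, ranking plant variables by their association with the output quality, can be sketched as follows with an ANOVA F-test; the variable names and data are hypothetical, not the rendering plant's actual tags.

```python
# Hedged sketch of the sensitivity-screening step: rank plant variables by how
# strongly they associate with a quality measure (here an ANOVA F-test), so
# fieldwork can target the most sensitive settings. Names and data are hypothetical.
import numpy as np
from sklearn.feature_selection import f_regression

rng = np.random.default_rng(1)
plant = {
    "cooker_temp": rng.normal(95, 3, 200),
    "press_speed": rng.normal(40, 5, 200),
    "raw_moisture": rng.normal(55, 8, 200),
}
X = np.column_stack(list(plant.values()))
quality = 0.6 * plant["cooker_temp"] - 0.2 * plant["raw_moisture"] + rng.normal(0, 1, 200)

f_scores, p_values = f_regression(X, quality)
for name, f, p in sorted(zip(plant, f_scores, p_values), key=lambda t: -t[1]):
    print(f"{name:13s} F={f:8.1f} p={p:.3g}")
```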
Ungers, L J; Moskowitz, P D; Owens, T W; Harmon, A D; Briggs, T M
1982-02-01
Determining occupational health and safety risks posed by emerging technologies is difficult because of limited statistics. Nevertheless, estimates of such risks must be constructed to permit comparison of various technologies to identify the most attractive processes. One way to estimate risks is to use statistics on related industries. Based on process labor requirements and associated occupational health data, risks to workers and to society posed by an emerging technology can be calculated. Using data from the California semiconductor industry, this study applies a five-step occupational risk assessment procedure to four processes for the fabrication of photovoltaic cells. The validity of the occupational risk assessment method is discussed.
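The risk-transfer idea, scaling an analogous industry's incidence rate by the process labour requirement, can be sketched as below; all figures are hypothetical and are not the study's California data.

```python
# Illustrative sketch of the "related industries" estimate described above:
# expected injuries = process labour requirement (worker-hours) x an incidence
# rate borrowed from an analogous industry. All numbers are hypothetical.
INCIDENCE_PER_200K_HRS = {"semiconductor": 4.1}   # injuries per 200,000 worker-hours (assumed)

def expected_injuries(labour_hours, analog_industry):
    rate = INCIDENCE_PER_200K_HRS[analog_industry] / 200_000
    return labour_hours * rate

# Hypothetical labour requirement for one photovoltaic-cell fabrication process
print(round(expected_injuries(350_000, "semiconductor"), 2))  # expected injury count
```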
NASA Astrophysics Data System (ADS)
Vieceli, Nathália; Nogueira, Carlos A.; Pereira, Manuel F. C.; Durão, Fernando O.; Guimarães, Carlos; Margarido, Fernanda
2018-01-01
The recovery of lithium from hard rock minerals has received increased attention given the high demand for this element. Therefore, this study optimized an innovative process, which does not require a high-temperature calcination step, for lithium extraction from lepidolite. Mechanical activation and acid digestion were suggested as crucial process parameters, and experimental design and response-surface methodology were applied to model and optimize the proposed lithium extraction process. The promoting effect of amorphization and the formation of lithium sulfate hydrate on lithium extraction yield were assessed. Several factor combinations led to extraction yields that exceeded 90%, indicating that the proposed process is an effective approach for lithium recovery.
New insights into soil temperature time series modeling: linear or nonlinear?
NASA Astrophysics Data System (ADS)
Bonakdari, Hossein; Moeeni, Hamid; Ebtehaj, Isa; Zeynoddin, Mohammad; Mahoammadian, Abdolmajid; Gharabaghi, Bahram
2018-03-01
Soil temperature (ST) is an important dynamic parameter, whose prediction is a major research topic in various fields including agriculture because ST has a critical role in hydrological processes at the soil surface. In this study, a new linear methodology is proposed based on stochastic methods for modeling daily soil temperature (DST). With this approach, the ST series components are determined to carry out modeling and spectral analysis. The results of this process are compared with two linear methods based on seasonal standardization and seasonal differencing in terms of four DST series. The series used in this study were measured at two stations, Champaign and Springfield, at depths of 10 and 20 cm. The results indicate that in all ST series reviewed, the periodic term is the most robust among all components. According to a comparison of the three methods applied to analyze the various series components, it appears that spectral analysis combined with stochastic methods outperformed the seasonal standardization and seasonal differencing methods. In addition to comparing the proposed methodology with linear methods, the ST modeling results were compared with the two nonlinear methods in two forms: considering hydrological variables (HV) as input variables and DST modeling as a time series. In a previous study at the mentioned sites, Kim and Singh (Theor Appl Climatol 118:465-479, 2014) applied the popular Multilayer Perceptron (MLP) neural network and Adaptive Neuro-Fuzzy Inference System (ANFIS) nonlinear methods and considered HV as input variables. The comparison results signify that the relative error projected in estimating DST by the proposed methodology was about 6%, while this value with MLP and ANFIS was over 15%. Moreover, MLP and ANFIS models were employed for DST time series modeling. Due to these models' relatively inferior performance to the proposed methodology, two hybrid models were implemented: the weights and membership function of MLP and ANFIS (respectively) were optimized with the particle swarm optimization (PSO) algorithm in conjunction with the wavelet transform and nonlinear methods (Wavelet-MLP & Wavelet-ANFIS). A comparison of the proposed methodology with individual and hybrid nonlinear models in predicting DST time series shows that the proposed methodology attains the lowest Akaike Information Criterion (AIC) value, which considers model simplicity and accuracy simultaneously, at different depths and stations. The methodology presented in this study can thus serve as an excellent alternative to complex nonlinear methods that are normally employed to examine DST.
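Two of the steps discussed above, seasonal standardization of a daily soil-temperature series and AIC-based comparison of candidate stochastic models, can be sketched as follows on synthetic data; this is not the authors' full spectral/stochastic methodology.

```python
# Minimal sketch: remove the periodic component of a daily soil-temperature
# series by seasonal standardization, then compare candidate AR models by AIC.
# The series is synthetic, not the Champaign/Springfield data.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(2)
days = np.arange(3 * 365)
st = 12 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 1.5, days.size)

doy = days % 365
clim_mean = np.array([st[doy == d].mean() for d in range(365)])
clim_std = np.array([st[doy == d].std() for d in range(365)])
residual = (st - clim_mean[doy]) / clim_std[doy]     # seasonally standardized series

for p in (1, 2, 3):
    fit = AutoReg(residual, lags=p).fit()
    print(f"AR({p}) AIC = {fit.aic:.1f}")            # lower AIC = preferred stochastic model
```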
Porfirif, María C; Milatich, Esteban J; Farruggia, Beatriz M; Romanini, Diana
2016-06-01
A one-step method for alpha-amylase concentration and purification was developed in this work. This methodology requires the use of a very low concentration of a biodegradable polyelectrolyte (Eudragit(®) E-PO) and represents a low-cost, fast, easy-to-scale-up and non-polluting technology. In addition, this methodology allows the polymer to be recycled after precipitation. The formation of reversible soluble/insoluble complexes between alpha-amylase and the polymer Eudragit(®) E-PO was studied, and their precipitation in selected conditions was applied for bioseparation purposes. Turbidimetric assays allowed determination of the pH range where the complexes are insoluble (4.50-7.00); pH 5.50 yielded the highest turbidity of the system. The presence of NaCl (0.05M) in the medium totally dissociates the protein-polymer complexes. When an adequate concentration of polymer was added under these conditions to a liquid culture of Aspergillus oryzae, purification factors of alpha-amylase up to 7.43 and recoveries of 88% were obtained in a simple step without previous clarification. These results demonstrate that this methodology is suitable for the concentration and production of alpha-amylase from this source and could be applied at the beginning of downstream processing. Copyright © 2016 Elsevier B.V. All rights reserved.
Hindrikson, Maris; Remm, Jaanus; Männil, Peep; Ozolins, Janis; Tammeleht, Egle; Saarma, Urmas
2013-01-01
Spatial genetics is a relatively new field in wildlife and conservation biology that is becoming an essential tool for unravelling the complexities of animal population processes, and for designing effective strategies for conservation and management. Conceptual and methodological developments in this field are therefore critical. Here we present two novel methodological approaches that further the analytical possibilities of STRUCTURE and DResD. Using these approaches we analyse structure and migrations in a grey wolf (Canis lupus) population in north-eastern Europe. We genotyped 16 microsatellite loci in 166 individuals sampled from the wolf population in Estonia and Latvia that has been under strong and continuous hunting pressure for decades. Our analysis demonstrated that this relatively small wolf population is represented by four genetic groups. We also used a novel methodological approach that uses linear interpolation to statistically test the spatial separation of genetic groups. The new method, which is capable of using program STRUCTURE output, can be applied widely in population genetics to reveal both core areas and areas of low significance for genetic groups. We also used a recently developed spatially explicit individual-based method DResD, and applied it for the first time to microsatellite data, revealing a migration corridor and barriers, and several contact zones.
An experimental procedure to determine heat transfer properties of turbochargers
NASA Astrophysics Data System (ADS)
Serrano, J. R.; Olmeda, P.; Páez, A.; Vidal, F.
2010-03-01
Heat transfer phenomena in turbochargers have been a subject of investigation due to their importance for the correct determination of compressor real work when modelling. The commonly stated condition of adiabaticity for turbochargers during normal operation of an engine has been re-evaluated because important deviations from adiabatic behaviour have been reported in many studies on this issue, especially when the turbocharger is running at low rotational speeds/loads. The deviations mentioned do not permit proper assessment of the turbine and compressor efficiencies, since the pure aerodynamic effects cannot be separated from the undesired heat transfer due to the presence of both phenomena during turbocharger operation. The correction of the aforesaid facts is necessary to properly feed engine models with reliable information and in this way increase the quality of the results in any modelling process. The present work proposes a thermal characterization methodology, successfully applied to a passenger-car turbocharger, which is based on the physics of the turbocharger. Its application helps to understand the thermal behaviour of the turbocharger, and the results obtained constitute vital information for future modelling efforts which involve the use of the information obtained from the proposed methodology. The conductance values obtained from the proposed methodology have been applied to correct a procedure for measuring the mechanical efficiency of the tested turbocharger.
Jonkman, Nini H; Groenwold, Rolf H H; Trappenburg, Jaap C A; Hoes, Arno W; Schuurmans, Marieke J
2017-03-01
Meta-analyses using individual patient data (IPD) rather than aggregated data are increasingly applied to analyze sources of heterogeneity between trials and have only recently been applied to unravel multicomponent, complex interventions. This study reflects on methodological challenges encountered in two IPD meta-analyses on self-management interventions in patients with heart failure or chronic obstructive pulmonary disease. Critical reflection on prior IPD meta-analyses and discussion of literature. Experience from two IPD meta-analyses illustrates methodological challenges. Despite close collaboration with principal investigators, assessing the effect of characteristics of complex interventions on the outcomes of trials is compromised by lack of sufficient details on intervention characteristics and limited data on fidelity and adherence. Furthermore, trials collected baseline variables in a highly diverse way, limiting the possibilities to study subgroups of patients in a consistent manner. Possible solutions are proposed based on lessons learnt from the methodological challenges. Future researchers of complex interventions should pay considerable attention to the causal mechanism underlying the intervention and conducting process evaluations. Future researchers on IPD meta-analyses of complex interventions should carefully consider their own causal assumptions and availability of required data in eligible trials before undertaking such resource-intensive IPD meta-analysis. Copyright © 2017 Elsevier Inc. All rights reserved.
Seed defective reduction in automotive Electro-Deposition Coating Process of truck cabin
NASA Astrophysics Data System (ADS)
Sonthilug, Aekkalag; Chutima, Parames
2018-02-01
The case-study company is one of the players in Thailand's automotive industry, manufacturing trucks and buses for both the domestic and international markets. This research focuses on a product quality problem concerning seed defects occurring in the electro-deposition coating process of truck cabins. The five-phase Six Sigma methodology, comprising Define, Measure, Analyze, Improve, and Control, is applied in this research to identify the root causes of the problem and to set new parameters for each significant factor. After the improvement, seed defects in this process were reduced from 9,178 defects per unit to 876 defects per unit (a 90% improvement).
Process-oriented Observational Metrics for CMIP6 Climate Model Assessments
NASA Astrophysics Data System (ADS)
Jiang, J. H.; Su, H.
2016-12-01
Observational metrics based on satellite observations have been developed and effectively applied during post-CMIP5 model evaluation and improvement projects. As new physics and parameterizations continue to be included in models for the upcoming CMIP6, it is important to continue objective comparisons between observations and model results. This talk will summarize the process-oriented observational metrics and methodologies for constraining climate models with A-Train satellite observations and support CMIP6 model assessments. We target parameters and processes related to atmospheric clouds and water vapor, which are critically important for Earth's radiative budget, climate feedbacks, and water and energy cycles, and thus reduce uncertainties in climate models.
Najafpoor, Ali Asghar; Jonidi Jafari, Ahmad; Hosseinzadeh, Ahmad; Khani Jazani, Reza; Bargozin, Hasan
2018-01-01
Treatment with a non-thermal plasma (NTP) is a new and effective technology applied recently for conversion of gases for air pollution control. This research was initiated to optimize the efficient application of the NTP process in benzene, toluene, ethyl-benzene, and xylene (BTEX) removal. The effects of four variables, namely temperature, initial BTEX concentration, voltage, and flow rate, on the BTEX elimination efficiency were investigated using response surface methodology (RSM). The constructed model was evaluated by analysis of variance (ANOVA). The model goodness-of-fit and statistical significance were assessed using the determination coefficients (R² and adjusted R²) and the F-test. The results revealed that the R² value was greater than 0.96 for BTEX removal efficiency. The statistical analysis demonstrated that the BTEX removal efficiency was significantly correlated with the temperature, BTEX concentration, voltage, and flow rate. Voltage was the most influential variable affecting the dependent variable, as it exerted a significant effect (p < 0.0001) on the response variable. According to the achieved results, NTP can be applied as a progressive, cost-effective, and practical process for treatment of airstreams polluted with BTEX in conditions of low residence time and high concentrations of pollutants.
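The RSM/ANOVA workflow described above can be sketched with a quadratic response-surface fit on hypothetical data using statsmodels; the factor ranges and coefficients are assumptions, not the study's fitted model.

```python
# Hedged sketch of an RSM fit: a quadratic response surface for removal efficiency
# on hypothetical data, with R2, adjusted R2, and the overall F-test reported.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 40
df = pd.DataFrame({
    "voltage": rng.uniform(8, 20, n),      # kV (assumed range)
    "conc": rng.uniform(50, 400, n),       # ppm inlet BTEX (assumed)
    "flow": rng.uniform(0.5, 3.0, n),      # L/min (assumed)
})
df["removal"] = (40 + 3.2 * df.voltage - 0.05 * df.conc - 6 * df.flow
                 - 0.06 * df.voltage ** 2 + rng.normal(0, 2, n))

model = smf.ols("removal ~ voltage + conc + flow + I(voltage**2)", data=df).fit()
print(model.rsquared, model.rsquared_adj)  # determination coefficients, as in RSM studies
print(model.f_pvalue)                      # overall F-test significance
```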
Kelly, Martina; Ellaway, Rachel H; Reid, Helen; Ganshorn, Heather; Yardley, Sarah; Bennett, Deirdre; Dornan, Tim
2018-05-14
Qualitative evidence synthesis (QES) is a suite of methodologies that combine qualitative techniques with the synthesis of qualitative knowledge. They are particularly suited to medical education as these approaches pool findings from original qualitative studies, whilst paying attention to context and theoretical development. Although increasingly sophisticated use is being made of qualitative primary research methodologies in health professions education (HPE) the use of secondary qualitative reviews in HPE remains underdeveloped. This study examined QES methods applied to clinical humanism in healthcare as a way of advancing thinking around the use of QES in HPE in general. A systematic search strategy identified 49 reviews that fulfilled the inclusion criteria. Meta-study was used to develop an analytic summary of methodological characteristics, the role of theory, and the synthetic processes used in QES reviews. Fifteen reviews used a defined methodology, and 17 clearly explained the processes that led from data extraction to synthesis. Eight reviews adopted a specific theoretical perspective. Authors rarely described their reflexive relationship with their data. Epistemological positions tended to be implied rather than explicit. Twenty-five reviews included some form of quality appraisal, although it was often unclear how authors acted on its results. Reviewers under-reported qualitative approaches in their review methodologies, and tended to focus on elements such as systematicity and checklist quality appraisal that were more germane to quantitative evidence synthesis. A core concern was that the axiological (value) dimensions of the source materials were rarely considered let alone accommodated in the synthesis techniques used. QES can be used in HPE research but only with careful attention to maintaining axiological integrity.
Sampling methods to the statistical control of the production of blood components.
Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo
2017-12-01
The control of blood component specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The control reliability is dependent on the sampling. However, a correct sampling methodology seems not to be systematically applied. Commonly, the sampling is intended to comply only with the 1% specification for the produced blood components. Nevertheless, from a purely statistical viewpoint, this model could be argued not to be related to a consistent sampling technique. This could be a severe limitation in detecting abnormal patterns and in assuring that the production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and of the related statistical process control decisions for the purpose for which they are suggested. Copyright © 2017 Elsevier Ltd. All rights reserved.
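One of the proposed models, sampling based on the proportion of a finite population, can be sketched with the standard sample-size formula and finite-population correction; the lot size and confidence settings below are illustrative, not EDQM requirements.

```python
# Sketch of the finite-population sampling model: sample size for estimating a
# nonconformity proportion in a finite lot. All numbers are illustrative.
import math

def sample_size_finite(N, p=0.01, margin=0.01, z=1.96):
    """n for a proportion p with absolute margin of error at confidence z, lot size N."""
    n0 = z ** 2 * p * (1 - p) / margin ** 2       # infinite-population sample size
    return math.ceil(n0 / (1 + (n0 - 1) / N))     # finite-population correction

print(sample_size_finite(N=2000))   # e.g. a monthly production of 2000 components (assumed)
```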
Patient-centered outcomes research in radiology: trends in funding and methodology.
Lee, Christoph I; Jarvik, Jeffrey G
2014-09-01
The creation of the Patient-Centered Outcomes Research Trust Fund and the Patient-Centered Outcomes Research Institute (PCORI) through the Patient Protection and Affordable Care Act of 2010 presents new opportunities for funding patient-centered comparative effectiveness research (CER) in radiology. We provide an overview of the evolution of federal funding and priorities for CER with a focus on radiology-related priority topics over the last two decades, and discuss the funding processes and methodological standards outlined by PCORI. We introduce key paradigm shifts in research methodology that will be required on the part of radiology health services researchers to obtain competitive federal grant funding in patient-centered outcomes research. These paradigm shifts include direct engagement of patients and other stakeholders at every stage of the research process, from initial conception to dissemination of results. We will also discuss the increasing use of mixed methods and novel trial designs. One of these trial designs, the pragmatic trial, has the potential to be readily applied to evaluating the effectiveness of diagnostic imaging procedures and imaging-based interventions among diverse patient populations in real-world settings. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
Automated control of hierarchical systems using value-driven methods
NASA Technical Reports Server (NTRS)
Pugh, George E.; Burke, Thomas E.
1990-01-01
An introduction is given to the value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely-known optimization techniques.
Casaseca-de-la-Higuera, Pablo; Simmross-Wattenberg, Federico; Martín-Fernández, Marcos; Alberola-López, Carlos
2009-07-01
Discontinuation of mechanical ventilation is a challenging task that involves a number of subtle clinical issues. The gradual removal of the respiratory support (referred to as weaning) should be performed as soon as autonomous respiration can be sustained. However, the prediction rate of successful extubation is still below 25% based on previous studies. Construction of an automatic system that provides information on extubation readiness is thus desirable. Recent works have demonstrated that breathing pattern variability is a useful extubation readiness indicator, with improving performance when multiple respiratory signals are jointly processed. However, the existing methods for predictor extraction present several drawbacks when length-limited time series are to be processed in heterogeneous groups of patients. In this paper, we propose a model-based methodology for automatic readiness prediction. It is intended to deal with multichannel, nonstationary, short records of the breathing pattern. Results on experimental data yield an 87.27% rate of successful readiness prediction, which is in line with the best figures reported in the literature. A comparative analysis shows that our methodology overcomes the shortcomings of previously proposed methods when applied to length-limited records on heterogeneous groups of patients.
Fernández-Santander, Ana
2008-01-01
The informal activities of cooperative learning and short periods of lecturing have been combined and used in the university teaching of biochemistry as part of the first-year course in Optics and Optometry in the academic years 2004-2005 and 2005-2006. The lessons were prepared in advance by the teacher and included all that is necessary to understand the topic (text, figures, graphics, diagrams, pictures, etc.). Additionally, a questionnaire was prepared for every chapter. All lessons contained three parts: objectives, approach and development, and the assessment of the topic. Team work, responsibility, and communication skills were some of the abilities developed with this new methodology. Students worked collaboratively in small groups of two or three following the teacher's instructions, with short periods of lecturing that clarified misunderstood concepts. Homework was minimized. On comparing this combined methodology with the traditional one (lecture only), students were found to exhibit a higher satisfaction with the new method. They were more involved in the learning process and had a better attitude toward the subject. The use of this new methodology showed a significant increase in the mean score of the students' academic results. The rate of students who failed the subject was significantly lower than in the previous years, when only lecturing was applied. This combined methodology helped the teacher to observe the apprenticeship process of students better and to act as a facilitator in the process of building students' knowledge. Copyright © 2008 International Union of Biochemistry and Molecular Biology, Inc.
NASA Astrophysics Data System (ADS)
Shafii, Mahyar; Basu, Nandita; Schiff, Sherry; Van Cappellen, Philippe
2017-04-01
The dramatic increase in nitrogen circulating in the biosphere due to anthropogenic activities has resulted in impairment of water quality in groundwater and surface water, causing eutrophication in coastal regions. Understanding the fate and transport of nitrogen from the landscape to coastal areas requires exploring the drivers of nitrogen processes in both time and space, as well as the identification of appropriate flow pathways. Conceptual models can be used as diagnostic tools to provide insights into such controls. However, diagnostic evaluation of coupled hydrological-biogeochemical models is challenging. This research proposes a top-down methodology utilizing hydrochemical signatures to develop conceptual models for simulating the integrated streamflow and nitrate responses while taking into account dominant controls on nitrate variability (e.g., climate, soil water content, etc.). Our main objective is to seek appropriate model complexity that sufficiently reproduces multiple hydrological and nitrate signatures. Having developed a suitable conceptual model for a given watershed, we employ it in sensitivity studies to demonstrate the dominant process controls that contribute to the nitrate response at scales of interest. We apply the proposed approach to nitrate simulation in a range of small to large sub-watersheds in the Grand River Watershed (GRW) located in Ontario. Such a multi-basin modeling experiment will enable us to address process scaling and investigate the consequences of lumping processes in terms of the models' predictive capability. The proposed methodology can be applied to the development of large-scale models that can support decision-making associated with nutrient management at the regional scale.
Taheri, M; Alavi Moghaddam, M R; Arami, M
2013-10-15
In this research, Response Surface Methodology (RSM) and Adaptive Neuro Fuzzy Inference System (ANFIS) models were applied for optimization of Reactive Blue 19 removal using a combined electrocoagulation/coagulation process through Multi-Objective Particle Swarm Optimization (MOPSO). By applying RSM, the effects of five independent parameters, namely applied current, reaction time, initial dye concentration, initial pH and dosage of Poly Aluminum Chloride, were studied. According to the RSM results, all the independent parameters are equally important in dye removal efficiency. In addition, ANFIS was applied for modeling dye removal efficiency and operating costs. High R² values (≥85%) indicate that the predictions of the RSM and ANFIS models are acceptable for both responses. ANFIS was also used in MOPSO for finding the best techno-economical Reactive Blue 19 elimination conditions according to the RSM design. Through MOPSO and the selected ANFIS model, minimum and maximum dye removal efficiencies of 58.27% and 99.67% were obtained, respectively. Copyright © 2013 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Gutmanis, Ivars; And Others
The primary purpose of the study was to develop and apply a methodology for estimating the need for scientists and engineers by specialty in energy and energy-related industries. The projections methodology was based on the Case 1 estimates by the National Petroleum Council of the results of "maximum efforts" to develop domestic fuel sources by…
Importance of joint efforts for balanced process of designing and education
NASA Astrophysics Data System (ADS)
Mayorova, V. I.; Bannova, O. K.; Kristiansen, T.-H.; Igritsky, V. A.
2015-06-01
This paper discusses the importance of a strategic planning and design process when developing long-term space exploration missions, both robotic and manned. The discussion begins with a review of current and/or traditional international perspectives on space development at the American, Russian and European space agencies. Some analogies and comparisons are drawn from analysis of several international student collaborative programs: summer international workshops at the Bauman Moscow State Technical University, the International European Summer Space School "Future Space Technologies and Experiments in Space", and the summer school at Stuttgart University in Germany. The paper focuses on the optimization of design and planning processes for successful space exploration missions and highlights the importance of the following: understanding connectivity between different levels of human activity and machinery; a simultaneous mission planning approach; reflections and correlations between disciplines involved in planning and executing space exploration missions; and knowledge gained from different disciplines and through cross-applying and re-applying design approaches between various space-related fields of study and research. The conclusions summarize the benefits and complications of applying a balanced design approach at all levels of the design process. Analysis of successes and failures of organizational efforts in space endeavors is used as a methodological approach to identify key questions to be researched, as they often cause many planning and design processing problems.
Humphries, Angela; Peden, Carol; Jordan, Lesley; Crowe, Josephine; Peden, Carol
2016-01-01
A significant incidence of post-procedural deep vein thrombosis (DVT) and pulmonary embolus (PE) was identified in patients undergoing surgery at our hospital. Investigation showed an unreliable peri-operative process leading to patients receiving incorrect or missed venous thromboembolism (VTE) prophylaxis. The Trust had previously participated in a project funded by the Health Foundation using the "Safer Clinical Systems" methodology to assess, diagnose, appraise options, and implement interventions to improve a high risk medication pathway. We applied the methodology from that study to this cohort of patients demonstrating that the same approach could be applied in a different context. Interventions were linked to the greatest hazards and risks identified during the diagnostic phase. This showed that many surgical elective patients had no VTE risk assessment completed pre-operatively, leading to missed or delayed doses of VTE prophylaxis post-operatively. Collaborative work with stakeholders led to the development of a new process to ensure completion of the VTE risk assessment prior to surgery, which was implemented using the Model for Improvement methodology. The process was supported by the inclusion of a VTE check in the Sign Out element of the WHO Surgical Safety Checklist at the end of surgery, which also ensured that appropriate prophylaxis was prescribed. A standardised operation note including the post-operative VTE plan will be implemented in the near future. At the end of the project VTE risk assessments were completed for 100% of elective surgical patients on admission, compared with 40% in the baseline data. Baseline data also revealed that processes for chemical and mechanical prophylaxis were not reliable. Hospital wide interventions included standardisation of mechanical prophylaxis devices and anti-thromboembolic stockings (resulting in a cost saving of £52,000), and a Trust wide awareness and education programme. The education included increased emphasis on use of mechanical prophylaxis when chemical prophylaxis was contraindicated. VTE guidelines were also included in the existing junior Doctor guideline App. and a "CLOTS" anticoagulation webpage was developed and published on the hospital intranet. The improvement in VTE processes resulted in an 80% reduction in hospital associated thrombosis following surgery from 0.2% in January 2014 to 0.04% in December 2015 and a reduction in the number of all hospital associated VTE from a baseline median of 9 per month as of January 2014 to a median of 1 per month by December 2015.
Nixtamalized flour from quality protein maize (Zea mays L). optimization of alkaline processing.
Milán-Carrillo, J; Gutiérrez-Dorado, R; Cuevas-Rodríguez, E O; Garzón-Tiznado, J A; Reyes-Moreno, C
2004-01-01
The quality of maize proteins is poor; they are deficient in the essential amino acids lysine and tryptophan. Recently, 26 new nutritionally improved hybrids and cultivars called quality protein maize (QPM), which contain greater amounts of lysine and tryptophan, were successfully developed in Mexico. Alkaline cooking of maize with lime (nixtamalization) is the first step for producing several maize products (masa, tortillas, flours, snacks). Processors adjust nixtamalization variables based on experience. The objective of this work was to determine the best combination of nixtamalization process variables for producing nixtamalized maize flour (NMF) from the QPM V-537 variety. Nixtamalization conditions were selected from factorial combinations of process variables: nixtamalization time (NT, 20-85 min), lime concentration (LC, 3.3-6.7 g Ca(OH)2/l, in distilled water), and steep time (ST, 8-16 hours). Nixtamalization temperature and ratio of grain to cooking medium were 85 degrees C and 1:3 (w/v), respectively. At the end of each cooking treatment, steeping proceeded for the required time. Steeping was finished by draining the cooking liquor (nejayote). Nixtamal (alkaline-cooked maize kernels) was washed with running tap water. Wet nixtamal was dried (24 hours, 55 degrees C) and milled to pass through an 80-US mesh screen to obtain NMF. Response surface methodology (RSM) was applied as the optimization technique over four response variables: in vitro protein digestibility (PD), total color difference (deltaE), water absorption index (WAI), and pH. Predictive models for the response variables were developed as a function of the process variables. The conventional graphical method was applied to obtain maximum PD and WAI and minimum deltaE and pH. Contour plots of each of the response variables were combined applying the superposition surface methodology, to obtain three contour plots for observation and selection of the best combination of NT (31 min), LC (5.4 g Ca(OH)2/l), and ST (8.1 hours) for producing optimized NMF from QPM.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castillo, H.
1982-01-01
The Government of Costa Rica has stated the need for a formal procedure for the evaluation and categorization of an environmental program. Methodological studies were prepared as the basis for the development of the general methodology by which each government or institution can adapt and implement the procedure. The methodology was established by using different techniques according to their contribution to the evaluation process, such as the Systemic Approach, Delphi, and Saaty methods. The methodology consists of two main parts: 1) evaluation of the environmental aspects by using different techniques; 2) categorization of the environmental aspects by applying the methodology to Costa Rican environmental affairs using questionnaire answers supplied by experts both inside and outside of the country. The second part of the research includes appendixes in which general information is presented concerning institutions related to environmental affairs; a description of the methods used; results of the current status evaluation and its scale; the final scale of categorization; and the questionnaires and a list of experts. The methodology developed in this research will have a beneficial impact on environmental concerns in Costa Rica. As a result of this research, a Commission Office of Environmental Affairs, providing links between consumers, engineers, scientists, and the Government, is recommended. Also, there is significant potential use of this methodology in developed countries for a better balancing of the budgets of major research programs such as cancer, heart, and other research areas.
The origin of life and its methodological challenge.
Wächtershäuser, G
1997-08-21
The problem of the origin of life is discussed from a methodological point of view as an encounter between the teleological thinking of the historian and the mechanistic thinking of the chemist; and as the Kantian task of replacing teleology by mechanism. It is shown how the Popperian situational logic of historic understanding and the Popperian principle of explanatory power of scientific theories, when jointly applied to biochemistry, lead to a methodology of biochemical retrodiction, whereby common precursor functions are constructed for disparate successor functions. This methodology is exemplified by central tenets of the theory of the chemo-autotrophic origin of life: the proposal of a surface metabolism with a two-dimensional order; the basic polarity of life with negatively charged constituents on positively charged mineral surfaces; the surface-metabolic origin of phosphorylated sugar metabolism and nucleic acids; the origin of membrane lipids and of chemi-osmosis on pyrite surfaces; and the principles of the origin of the genetic machinery. The theory presents the early evolution of life as a process that begins with chemical necessity and winds up in genetic chance.
Molinos-Senante, María; Hernández-Sancho, Francesc; Sala-Garrido, Ramón
2012-01-01
The concept of sustainability involves the integration of economic, environmental, and social aspects and this also applies in the field of wastewater treatment. Economic feasibility studies are a key tool for selecting the most appropriate option from a set of technological proposals. Moreover, these studies are needed to assess the viability of transferring new technologies from pilot-scale to full-scale. In traditional economic feasibility studies, the benefits that have no market price, such as environmental benefits, are not considered and are therefore underestimated. To overcome this limitation, we propose a new methodology to assess the economic viability of wastewater treatment technologies that considers internal and external impacts. The estimation of the costs is based on the use of cost functions. To quantify the environmental benefits from wastewater treatment, the distance function methodology is proposed to estimate the shadow price of each pollutant removed in the wastewater treatment. The application of this methodological approach by decision makers enables the calculation of the true costs and benefits associated with each alternative technology. The proposed methodology is presented as a useful tool to support decision making.
A methodology for overall consequence modeling in chemical industry.
Arunraj, N S; Maiti, J
2009-09-30
Risk assessment in the chemical process industry is a very important issue for safeguarding humans and the ecosystem from damage. Consequence assessment is an integral part of risk assessment. However, the commonly used consequence estimation methods involve either time-consuming complex mathematical models or simple aggregation of losses without considering all the consequence factors. This leads to deterioration in the quality of the estimated risk value. Consequence modeling therefore has to be performed in detail, considering all major losses within a reasonable time, to improve the decision value of the risk estimate. The losses can be broadly categorized into production loss, asset loss, human health and safety loss, and environmental loss. In this paper, a conceptual framework is developed to assess the overall consequence considering all the important components of the major losses. Secondly, a methodology is developed for the calculation of all the major losses, which are normalized to yield the overall consequence. Finally, as an illustration, the proposed methodology is applied to a case study plant involving benzene extraction. The case study result using the proposed consequence assessment scheme is compared with that from the existing methodologies.
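The normalization-and-aggregation step can be sketched as below; the loss categories follow the paper, but the weights, worst-case values, and scenario losses are hypothetical.

```python
# Illustrative sketch of the aggregation idea: normalize each loss category to
# [0, 1] against a plant-specific worst case and combine them into an overall
# consequence score. Weights, worst-case values, and losses are hypothetical.
LOSSES = {          # estimated loss for one scenario (monetary-equivalent units)
    "production": 1.2e6,
    "assets": 0.8e6,
    "health_safety": 2.5e6,
    "environment": 0.6e6,
}
WORST_CASE = {"production": 5e6, "assets": 4e6, "health_safety": 10e6, "environment": 3e6}
WEIGHTS = {"production": 0.2, "assets": 0.2, "health_safety": 0.4, "environment": 0.2}

overall = sum(WEIGHTS[k] * min(LOSSES[k] / WORST_CASE[k], 1.0) for k in LOSSES)
print(f"Overall consequence index: {overall:.2f}")   # 0 (negligible) .. 1 (worst case)
```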
Xu, Jason; Guttorp, Peter; Kato-Maeda, Midori; Minin, Vladimir N
2015-12-01
Continuous-time birth-death-shift (BDS) processes are frequently used in stochastic modeling, with many applications in ecology and epidemiology. In particular, such processes can model evolutionary dynamics of transposable elements-important genetic markers in molecular epidemiology. Estimation of the effects of individual covariates on the birth, death, and shift rates of the process can be accomplished by analyzing patient data, but inferring these rates in a discretely and unevenly observed setting presents computational challenges. We propose a multi-type branching process approximation to BDS processes and develop a corresponding expectation maximization algorithm, where we use spectral techniques to reduce calculation of expected sufficient statistics to low-dimensional integration. These techniques yield an efficient and robust optimization routine for inferring the rates of the BDS process, and apply broadly to multi-type branching processes whose rates can depend on many covariates. After rigorously testing our methodology in simulation studies, we apply our method to study intrapatient time evolution of IS6110 transposable element, a genetic marker frequently used during estimation of epidemiological clusters of Mycobacterium tuberculosis infections. © 2015, The International Biometric Society.
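A linear birth-death-shift process of the kind modeled above can be simulated with a simple Gillespie-style routine, as sketched below; the rates are hypothetical and the paper's branching-process EM inference is not reproduced.

```python
# Hedged sketch: Gillespie simulation of a linear birth-death-shift process
# (each element copies, is lost, or shifts location at per-element rates).
import random

def simulate_bds(n0=5, birth=0.02, death=0.01, shift=0.03, t_end=50.0, seed=7):
    """Simulate copy number and shift events up to time t_end."""
    random.seed(seed)
    t, n, shifts = 0.0, n0, 0
    per_element = birth + death + shift
    while n > 0:
        wait = random.expovariate(n * per_element)
        if t + wait > t_end:
            break
        t += wait
        u = random.random() * per_element
        if u < birth:
            n += 1                 # birth: a new copy of the element appears
        elif u < birth + death:
            n -= 1                 # death: an element is lost
        else:
            shifts += 1            # shift: copy number unchanged, element moves
    return n, shifts

print(simulate_bds())              # (final copy number, number of shift events)
```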
NASA Astrophysics Data System (ADS)
Gemitzi, Alexandra; Tsihrintzis, Vassilios A.; Voudrias, Evangelos; Petalas, Christos; Stravodimos, George
2007-01-01
This study presents a methodology for siting municipal solid waste landfills, coupling geographic information systems (GIS), fuzzy logic, and multicriteria evaluation techniques. Both exclusionary and non-exclusionary criteria are used. Factors, i.e., non-exclusionary criteria, are divided into two distinct groups that do not have the same level of trade-off. The first group comprises factors related to the physical environment, which cannot be expressed in terms of monetary cost and, therefore, do not easily trade off. The second group includes those factors related to human activities, i.e., socioeconomic factors, which can be expressed as financial cost, thus showing a high level of trade-off. GIS are used for geographic data acquisition and processing. The analytical hierarchy process (AHP) is the multicriteria evaluation technique used, enhanced with fuzzy factor standardization. Besides assigning weights to factors through the AHP, control over the level of risk and trade-off in the siting process is achieved through a second set of weights, i.e., order weights, applied to factors in each factor group on a pixel-by-pixel basis, thus taking into account the local site characteristics. The method has been applied to Evros prefecture (NE Greece), an area of approximately 4,000 km2. The siting methodology results in two intermediate suitability maps, one related to environmental and the other to socioeconomic criteria. Combination of the two intermediate maps results in the final composite suitability map for landfill siting.
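The AHP step can be illustrated with a small, hedged sketch: priority weights are derived from a pairwise comparison matrix via its principal eigenvector, with a consistency check. The matrix below is an arbitrary three-factor example, not the study's actual judgments.

```python
# Hedged sketch of the AHP step: derive factor weights from a pairwise comparison
# matrix via the principal eigenvector. The matrix is an arbitrary example.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])     # pairwise importance of three factors

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
cr = ci / 0.58                          # Saaty's random index for n=3 is ~0.58
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```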
Nidheesh, T; Suresh, P V
2015-06-01
Chitin is one of the most abundant bioactive biopolymers on earth. It is commercially extracted from crustacean shell byproducts of seafood processing by harsh thermochemical treatments. The extraction conditions, the source, and the pretreatment of the raw material significantly affect its quality and bioactivity. In this investigation, response surface methodology (RSM) was applied to optimize and evaluate the interaction of variables for the extraction of high-quality chitin from raw shrimp processing byproducts. An HCl concentration (%, v/v) of 4.5 (for wet) and 4.9 (for dry) material, a reaction time of 3 h, and HCl solid-liquid ratios (w/v) of 1:5.5 (for wet) and 1:7.9 (for dry), with two treatments, achieved >98% demineralization of the shrimp byproduct. A NaOH concentration of 3.6% (w/v), a reaction time of 2.5 h, a temperature of 69.0 ± 1 °C, and a NaOH solid-liquid ratio of 7.4 (w/v), with two treatments, accomplished >98% deproteinization of the demineralized byproducts. Significant (p ≤ 0.05-0.001) interactive effects were observed between different variables. Chitin obtained under these conditions had residual contents (%, w/w) of ash <0.4 and protein <0.8, a degree of N-acetylation >93%, and a purity >98%. In conclusion, the conditions optimized by RSM can be applied for large-scale preparation of high-quality chitin from raw shrimp byproducts.
A validated methodology for genetic identification of tuna species (genus Thunnus).
Viñas, Jordi; Tudela, Sergi
2009-10-27
Tuna species of the genus Thunnus, such as the bluefin tunas, are some of the most important and yet most endangered trade fish in the world. Identification of these species in traded forms, however, may be difficult depending on the presentation of the products, which may hamper conservation efforts on trade control. In this paper, we validated a genetic methodology that can fully distinguish between the eight Thunnus species from any kind of processed tissue. After testing several genetic markers, a complete discrimination of the eight tuna species was achieved using Forensically Informative Nucleotide Sequencing, based primarily on the sequence variability of the hypervariable genetic marker mitochondrial DNA control region (mtDNA CR), followed, in some specific cases, by a second validation with a nuclear marker, the rDNA first internal transcribed spacer (ITS1). This methodology was able to distinguish all tuna species, including those belonging to the subgenus Neothunnus, which are very closely related and consequently cannot be differentiated with other genetic markers of lower variability. This methodology also took into consideration the introgression that has been reported in past studies between T. thynnus, T. orientalis and T. alalunga. Finally, we applied the methodology to cross-check the species identity of 26 processed tuna samples. The combination of two genetic markers, one mitochondrial and one nuclear, allows full discrimination among all eight tuna species. Unexpectedly, the genetic marker traditionally used for DNA barcoding, cytochrome oxidase 1, could not differentiate all species, thus its use as a genetic marker for tuna species identification is questioned.
NASA Technical Reports Server (NTRS)
Hoppa, Mary Ann; Wilson, Larry W.
1994-01-01
There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold: we describe an experimental methodology using a data structure called the debugging graph, and we apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model will perform well on a different path. Further, we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
Optimizing product life cycle processes in design phase
NASA Astrophysics Data System (ADS)
Faneye, Ola. B.; Anderl, Reiner
2002-02-01
Life cycle concepts do not only serve as a basis for helping product developers understand the dependencies between products and their life cycles; they also help in identifying potential opportunities for improvement in products. Common traditional concepts focus mainly on energy and material flow across life phases, necessitating the availability of metrics derived from a reference product. Knowledge of life cycle processes gained from an existing product is directly reused in its redesign. Nevertheless, depending on sales volume, the environmental impact before product optimization can be substantial. With modern information technologies, computer-aided life cycle methodologies can today be applied well before product use. On the basis of a virtual prototype, life cycle processes are analyzed and optimized using simulation techniques. This preventive approach not only helps in minimizing (or even eliminating) environmental burdens caused by the product; costs incurred due to changes in the real product can also be avoided. The paper highlights the relationship between product and life cycle and presents a computer-based methodology for optimizing the product life cycle during design, as presented by SFB 392: Design for Environment - Methods and Tools at Technical University Darmstadt.
Ratcliffe, Elizabeth; Hourd, Paul; Guijarro-Leach, Juan; Rayment, Erin; Williams, David J; Thomas, Robert J
2013-01-01
Commercial regenerative medicine will require large quantities of clinical-specification human cells. The cost and quality of manufacture is notoriously difficult to control due to highly complex processes with poorly defined tolerances. As a step to overcome this, we aimed to demonstrate the use of 'quality-by-design' tools to define the operating space for economic passage of a scalable human embryonic stem cell production method with minimal cell loss. Design of experiments response surface methodology was applied to generate empirical models to predict optimal operating conditions for a unit of manufacture of a previously developed automatable and scalable human embryonic stem cell production method. Two models were defined to predict cell yield and cell recovery rate postpassage, in terms of the predictor variables of media volume, cell seeding density, media exchange and length of passage. Predicted operating conditions for maximized productivity were successfully validated. Such 'quality-by-design' type approaches to process design and optimization will be essential to reduce the risk of product failure and patient harm, and to build regulatory confidence in cell therapy manufacturing processes.
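A hedged sketch of the response-surface idea follows: a quadratic model of yield is fitted over two of the predictor variables (media volume and seeding density) and the fitted optimum is located on a grid. The data, units and ranges are synthetic stand-ins, not the study's measurements.

```python
# Hedged sketch: fit a quadratic response surface of yield versus two predictor
# variables and locate the fitted optimum on a grid. Data and units are assumed.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform([1.0, 1e4], [3.0, 1e5], size=(30, 2))      # volume (mL), cells/cm^2
true_yield = -((X[:, 0] - 2.0) ** 2) - ((X[:, 1] - 5e4) / 4e4) ** 2
y = true_yield + rng.normal(0, 0.05, 30)

model = LinearRegression().fit(PolynomialFeatures(2).fit_transform(X), y)

grid = np.array([[v, d] for v in np.linspace(1, 3, 21)
                 for d in np.linspace(1e4, 1e5, 21)])
pred = model.predict(PolynomialFeatures(2).fit_transform(grid))
best = grid[np.argmax(pred)]
print(f"predicted optimum: volume={best[0]:.2f} mL, density={best[1]:.0f} cells/cm^2")
```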
López-Bolaños, Lizbeth; Campos-Rivera, Marisol; Villanueva-Borbolla, María Ángeles
2018-01-01
Objective. To reflect on the process of committing to participation in the implementation of a strategic health plan, using Participative Systematization of Social Experiences as a tool. Our study was a qualitative research-intervention study based on the Dialectical Methodological Conception approach. We designed and implemented a two-day workshop, six hours daily, using the Systematization methodology with a Community Work Group (CWG). During the workshop, women systematized their experience, with commitment as the axis of the process. Using Grounded Theory techniques, we applied micro-analysis to the data in order to identify and strengthen categories that emerged during the systematization process. We completed open and axial coding. The CWG identified that commitment and participation are influenced by group dynamics and structural determinants. They also reconsidered the way they understood and exercised commitment and participation, and they generated knowledge, empowering them to improve their future practice. Commitment and participation were determined by group dynamics and structural factors such as socioeconomic conditions and gender roles. These determinants must be made visible and understood in order to generate proposals aimed at strengthening the participation and organization of groups.
Protegiendo Nuestra Comunidad: empowerment participatory education for HIV prevention.
McQuiston, C; Choi-Hevel, S; Clawson, M
2001-10-01
To be effective, HIV/AIDS interventions must be culturally and linguistically appropriate and must occur within the context of the specific community in which they are delivered. In this article, the development of a culture-specific lay health advisor (LHA) program, Protegiendo Nuestra Comunidad, for recently immigrated Mexicans is described. This program is one component of a collaborative inquiry research project involving community participants and researchers working as partners in carrying out and assessing a program for the prevention of HIV/AIDS. The collaborative inquiry process drew on the empowerment philosophy and methodology of Paulo Freire, and an ecological framework was used for the development of Protegiendo Nuestra Comunidad. The use of principles of empowerment for curriculum development, teaching methodology, and program delivery is described.
Automated airplane surface generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, R.E.; Cordero, Y.; Jones, W.
1996-12-31
An efficient methodology and software are presented for defining a class of airplane configurations. A small set of engineering design parameters and grid control parameters governs the process. The general airplane configuration has wing, fuselage, vertical tail, horizontal tail, and canard components. Wing, canard, and tail surface grids are generated by solving a fourth-order partial differential equation subject to Dirichlet and Neumann boundary conditions. The design variables are incorporated into the boundary conditions, and the solution is expressed as a Fourier series. The fuselage is described by an algebraic function with four design parameters. The computed surface grids are suitable for a wide range of Computational Fluid Dynamics simulations and configuration optimizations. Both batch and interactive software are discussed for applying the methodology.
Goggins, Sean; Marsh, Barrie J; Lubben, Anneke T; Frost, Christopher G
2015-08-01
Signal transduction and signal amplification are both important mechanisms used within biological signalling pathways. Inspired by this process, we have developed a signal amplification methodology that utilises the selectivity and high activity of enzymes in combination with the robustness and generality of an organometallic catalyst, achieving a hybrid biological and synthetic catalyst cascade. A proligand enzyme substrate was designed to selectively self-immolate in the presence of the enzyme to release a ligand that can bind to a metal pre-catalyst and accelerate the rate of a transfer hydrogenation reaction. Enzyme-triggered catalytic signal amplification was then applied to a range of catalyst substrates demonstrating that signal amplification and signal transduction can both be achieved through this methodology.
Development of a diaphragmatic motion-based elastography framework for assessment of liver stiffness
NASA Astrophysics Data System (ADS)
Weis, Jared A.; Johnsen, Allison M.; Wile, Geoffrey E.; Yankeelov, Thomas E.; Abramson, Richard G.; Miga, Michael I.
2015-03-01
Evaluation of mechanical stiffness imaging biomarkers, through magnetic resonance elastography (MRE), has shown considerable promise for non-invasive assessment of liver stiffness to monitor hepatic fibrosis. MRE typically requires specialized externally-applied vibratory excitation and scanner-specific motion-sensitive pulse sequences. In this work, we have developed an elasticity imaging approach that utilizes natural diaphragmatic respiratory motion to induce deformation and eliminates the need for external deformation excitation hardware and specialized pulse sequences. Our approach uses clinically-available standard of care volumetric imaging acquisitions, combined with offline model-based post-processing to generate volumetric estimates of stiffness within the liver and surrounding tissue structures. We have previously developed a novel methodology for non-invasive elasticity imaging which utilizes a model-based elasticity reconstruction algorithm and MR image volumes acquired under different states of deformation. In prior work, deformation was externally applied through inflation of an air bladder placed within the MR radiofrequency coil. In this work, we extend the methodology with the goal of determining the feasibility of assessing liver mechanical stiffness using diaphragmatic respiratory motion between end-inspiration and end-expiration breath-holds as a source of deformation. We present initial investigations towards applying this methodology to assess liver stiffness in healthy volunteers and cirrhotic patients. Our preliminary results suggest that this method is capable of non-invasive image-based assessment of liver stiffness using natural diaphragmatic respiratory motion and provides considerable enthusiasm for extension of our approach towards monitoring liver stiffness in cirrhotic patients with limited impact to the standard-of-care clinical imaging acquisition workflow.
Ialongo, Cristiano; Bernardini, Sergio
2018-06-18
There is a compelling need for quality tools that enable effective control of the extra-analytical phase. In this regard, Six Sigma seems to offer a valid methodological and conceptual opportunity, and in recent times the International Federation of Clinical Chemistry and Laboratory Medicine has adopted it for indicating the performance requirements for non-analytical laboratory processes. However, Six Sigma implies a distinction between short-term and long-term quality that is based on the dynamics of the processes. These concepts are still not widespread or applied in the field of laboratory medicine, although they are of fundamental importance for exploiting the full potential of this methodology. This paper reviews the Six Sigma quality concepts and shows how they originated from Shewhart's control charts, of which they are not an alternative but a complement. It also discusses the dynamic nature of processes and how it arises, concerning particularly the long-term variation of the process mean, and explains why this leads to the fundamental distinction of quality mentioned above.
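The short-term versus long-term distinction can be made concrete with a small sketch: long-term defect rates are conventionally modeled by shifting the short-term process mean by 1.5 sigma. The specification levels below are arbitrary examples, not values from the paper.

```python
# Hedged sketch of the short-term vs long-term Six Sigma relationship: long-term
# defect rates are conventionally modeled with a 1.5-sigma drift of the process mean.
from statistics import NormalDist

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a one-sided spec at `sigma_level`,
    with the process mean drifting by `shift` sigma over the long term."""
    z = sigma_level - shift
    return (1.0 - NormalDist().cdf(z)) * 1e6

for s in (3, 4, 5, 6):
    print(f"{s}-sigma process: ~{dpmo(s):,.0f} DPMO long term, "
          f"~{dpmo(s, shift=0):.3g} DPMO short term")
```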
Carbon Capture and Utilization in the Industrial Sector.
Psarras, Peter C; Comello, Stephen; Bains, Praveen; Charoensawadpong, Panunya; Reichelstein, Stefan; Wilcox, Jennifer
2017-10-03
The fabrication and manufacturing processes of industrial commodities such as iron, glass, and cement are carbon-intensive, accounting for 23% of global CO2 emissions. As a climate mitigation strategy, CO2 capture from the flue gases of industrial processes, much like that of the power sector, has not experienced wide adoption given its high associated costs. However, some industrial processes with relatively high CO2 flue concentrations may be viable candidates to cost-competitively supply CO2 for utilization purposes (e.g., polymer manufacturing). This work develops a methodology that determines the levelized cost ($/tCO2) of separating, compressing, and transporting carbon dioxide. A top-down model determines the cost of separating and compressing CO2 across 18 industrial processes. Further, the study calculates the cost of transporting CO2 via pipeline and tanker truck to appropriately paired sinks using a bottom-up cost model and geo-referencing approach. The results show that truck transportation is generally the low-cost alternative given the relatively small volumes (ca. 100 ktCO2/a). We apply our methodology to a regional case study in Pennsylvania, which shows steel and cement manufacturing paired to suitable sinks as having the lowest levelized cost of capture, compression, and transportation.
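A hedged sketch of a levelized-cost calculation of this kind is shown below; the capital, operating and transport figures are illustrative placeholders rather than the paper's model inputs.

```python
# Hedged sketch of a levelized-cost calculation ($/tCO2): annualized capital plus
# operating cost for capture and compression, plus a per-tonne transport term.
# All inputs are illustrative placeholders, not the paper's values.

def crf(rate, years):
    """Capital recovery factor used to annualize capital expenditure."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

capex = 60e6          # $ for capture + compression equipment (assumed)
opex = 4e6            # $/a fixed + variable operating cost (assumed)
co2 = 100e3           # t/a captured (roughly the volume scale cited in the abstract)
transport = 12.0      # $/tCO2 by truck (assumed)

levelized = (capex * crf(0.08, 20) + opex) / co2 + transport
print(f"levelized cost: {levelized:.1f} $/tCO2")
```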
Comprehensive analysis of line-edge and line-width roughness for EUV lithography
NASA Astrophysics Data System (ADS)
Bonam, Ravi; Liu, Chi-Chun; Breton, Mary; Sieg, Stuart; Seshadri, Indira; Saulnier, Nicole; Shearer, Jeffrey; Muthinti, Raja; Patlolla, Raghuveer; Huang, Huai
2017-03-01
Pattern transfer fidelity is always a major challenge for any lithography process and needs continuous improvement. Lithographic processes in the semiconductor industry are primarily driven by optical imaging on photosensitive polymeric materials (resists). The quality of pattern transfer can be assessed by quantifying multiple parameters such as feature size uniformity (CD), placement, roughness, and sidewall angles. Roughness in features primarily corresponds to variation of line edge or line width and has gained considerable significance, particularly because shrinking feature sizes and feature variations are now of the same order. This has caused downstream processes (Etch (RIE), Chemical Mechanical Polish (CMP), etc.) to reconsider their respective tolerance levels. A very important aspect of this work is the relevance of roughness metrology from pattern formation at the resist through subsequent processes, particularly electrical validity. A major drawback of the current LER/LWR metric (sigma) is its lack of relevance across multiple downstream processes, which affects material selection at various unit processes. In this work we present a comprehensive assessment of line-edge and line-width roughness at multiple lithographic transfer processes. To simulate the effect of roughness, a pattern was designed with periodic jogs on the edges of lines with varying amplitudes and frequencies. Numerous methodologies have been proposed to analyze roughness, and in this work we apply them to programmed roughness structures to assess each technique's sensitivity. This work also aims to identify a relevant methodology to quantify roughness with relevance across downstream processes.
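As a hedged illustration of the baseline metric being critiqued, the sketch below computes conventional 3-sigma LER and LWR from synthetic edge profiles containing a sinusoidal programmed jog plus noise; it does not reproduce the cross-process analysis methods compared in the paper.

```python
# Hedged sketch: conventional 3-sigma line-edge and line-width roughness computed
# from edge positions along a line. The "programmed roughness" is a synthetic
# sinusoidal jog plus noise, echoing the test structures described above.
import numpy as np

rng = np.random.default_rng(2)
y = np.arange(0, 2000, 2.0)                          # scan positions along the line (nm)
left = 0.0 + 1.5 * np.sin(2 * np.pi * y / 100) + rng.normal(0, 0.8, y.size)
right = 20.0 + 1.5 * np.sin(2 * np.pi * y / 100) + rng.normal(0, 0.8, y.size)

ler_left = 3 * np.std(left - left.mean())            # 3-sigma LER of one edge
lwr = 3 * np.std((right - left) - (right - left).mean())   # 3-sigma LWR
print(f"LER (left edge): {ler_left:.2f} nm, LWR: {lwr:.2f} nm")
```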
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine
2008-01-01
NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by the thermal noise level at a given temperature in the analog domain, we must find new ways of further compensating for the noise in the digital signal domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to sensors. There are a few proposed methodologies for processing in the digital domain the information carried by reference pixels, as employed by the Hubble Space Telescope and the James Webb Space Telescope Projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To move a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) for reference pixel information processing and its utilization in reconfigurable hardware on-board a spaceflight instrument or post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to that in processing of low-voltage differential signals and subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the digital domain (such as statistical averaging of the reference pixels themselves) zero out the high-variance components, and the counterpart components in the active pixels remain uncorrected. This paper describes how the new methodology was demonstrated through analysis of fast-varying noise components using the Hilbert-Huang Transform Data Processing System tool (HHT-DPS) developed at NASA and the high-level programming language MATLAB (Trademark of MathWorks Inc.), as well as alternative methods for correcting for the high-variance noise component, using HgCdTe sensor data. The NASA Hubble Space Telescope data post-processing, as well as future deep-space cosmology projects' on-board instrument data processing from all the sensor channels, would benefit from this effort.
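A minimal, hedged sketch of the basic reference-pixel correction the paper builds on (a per-row common-mode estimate from non-illuminated boundary columns subtracted from active pixels) is given below; the frame is synthetic and the HHT-DPS processing itself is not reproduced.

```python
# Hedged sketch of the basic reference-pixel idea: subtract a per-row estimate
# derived from non-illuminated boundary pixels from the active pixels.
import numpy as np

rng = np.random.default_rng(3)
rows, cols, ref = 64, 64, 4                       # 4 reference columns on each side
frame = rng.normal(1000, 5, (rows, cols))         # active signal + white noise
row_drift = np.cumsum(rng.normal(0, 2, rows))     # slow common-mode drift
frame += row_drift[:, None]

ref_pixels = np.hstack([frame[:, :ref], frame[:, -ref:]])
row_correction = ref_pixels.mean(axis=1, keepdims=True)
corrected = frame - row_correction + row_correction.mean()

print("row-to-row std before:", frame.mean(axis=1).std().round(2),
      "after:", corrected.mean(axis=1).std().round(2))
```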
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strons, Philip; Bailey, James L.; Davis, John
2016-03-01
In this work, we apply CFD to model airflow and particulate transport. The modeling is then compared with field validation studies to both inform and validate the modeling assumptions. Based on the results of the field tests, modeling assumptions and boundary conditions are refined and the process is repeated until the results are found to be reliable with a high level of confidence.
Applying Systems Engineering Methodologies to the Creative Process
2014-09-01
Carrasco, and Ranee A. Flores. 2013. “The Structure of Creative Cognition in the Human Brain.” Frontiers in Human Neuroscience 7: 1–13. doi: 10.3389/fnhum… The place aspect of creativity will be addressed in a limited fashion to raise awareness of possible prescriptive methods to enhance creativity… The threshold theory states, “…creativity and intelligence are correlated up to a certain threshold [around an intelligence quotient (IQ) of 120] after which…
Blaise George Grden
1979-01-01
This paper is an investigation of the Visual Management System (VMS) and the Visual Resource Inventory and Evaluation Process (VRIEP). Questionnaires were developed and sent to persons who were experienced with VMS and/or VRIEP. VMS has been found easier to understand and apply than VRIEP. The methodology of VRIEP has been found to be a more complete approach than...
Applying thematic analysis theory to practice: a researcher's experience.
Tuckett, Anthony G
2005-01-01
This article describes an experience of thematic analysis. In order to answer the question 'What does analysis look like in practice?' it describes in brief how the methodology of grounded theory, the epistemology of social constructionism, and the theoretical stance of symbolic interactionism inform analysis. Additionally, analysis is examined by evidencing the systematic processes--here termed organising, coding, writing, theorising, and reading--that led the researcher to develop a final thematic schema.
The Development of NASA's Fault Management Handbook
NASA Technical Reports Server (NTRS)
Dennehy, Cornelius J.; Fesq, Lorraine M.; Barth, Timothy; Clark, Micah; Day, John; Fretz, Kristen; Friberg, Kenneth; Johnson, Stephen; Hattis, Philip; McComas, David;
2011-01-01
NASA is developing an FM Handbook to establish guidelines and to provide recommendations for defining, developing, analyzing, evaluating, testing, and operating FM systems. It establishes a process for developing FM throughout the lifecycle of a mission and provides a basis for moving the field toward a formal and consistent FM methodology to be applied on future programs. This paper describes the motivation for, the development of, and the future plans for the NASA FM Handbook.
Olivier, Jérémy; Conrardy, Jean-Baptiste; Mahmoud, Akrama; Vaxelaire, Jean
2015-10-01
Compared to conventional dewatering techniques, electrically assisted mechanical dewatering, also called electro-dewatering (EDW), is an alternative and effective technology for the dewatering of sewage sludge with low energy consumption. The objectives of this study were to evaluate the dewatering performance and to determine the influence of the process parameters (e.g. applied electric current, applied voltage, and the initial amount of dry solids) on the kinetics of the EDW process for activated urban sludge. Significant effort has also been devoted to providing comprehensive information about the EDW mechanisms and to understanding the relationships among these operating conditions, with the aim of developing a qualitative and quantitative model of the electro-dewatering process and, ultimately, a robust design methodology. The results showed a very strong correlation between the applied electric current and the filtrate flow rate and, consequently, the electro-dewatering kinetics. A higher applied electric current leads to faster EDW kinetics and a higher final dry solids content. In contrast, the results of this work showed a significant enhancement of the dewatering kinetics when the mass of dry solids introduced into the cell (commonly known as the sludge loading) was decreased. Copyright © 2015 Elsevier Ltd. All rights reserved.
Setting research priorities by applying the combined approach matrix.
Ghaffar, Abdul
2009-04-01
Priority setting in health research is a dynamic process. Different organizations and institutes have been working in the field of research priority setting for many years. In 1999 the Global Forum for Health Research presented a research priority setting tool called the Combined Approach Matrix, or CAM. Since its development, the CAM has been successfully applied to set research priorities for diseases, conditions and programmes at global, regional and national levels. This paper briefly explains the CAM methodology and how it can be applied in different settings, giving examples, describing challenges encountered in the process of setting research priorities, and providing recommendations for further work in this field. The construct and design of the CAM are explained, along with the different steps needed, including the planning and organization of a priority-setting exercise and how it can be applied in different settings. The application of the CAM is described using three examples: the first concerns setting research priorities for a global programme, the second describes application at the country level, and the third concerns setting research priorities for diseases. Effective application of the CAM in different and diverse environments proves its utility as a tool for setting research priorities. Potential challenges encountered in the process of research priority setting are discussed and some recommendations for further work in this field are provided.
Error Generation in CATS-Based Agents
NASA Technical Reports Server (NTRS)
Callantine, Todd
2003-01-01
This research presents a methodology for generating errors from a model of nominally preferred correct operator activities, given a particular operational context, and maintaining an explicit link to the erroneous contextual information to support analyses. It uses the Crew Activity Tracking System (CATS) model as the basis for error generation. This report describes how the process works, and how it may be useful for supporting agent-based system safety analyses. The report presents results obtained by applying the error-generation process and discusses implementation issues. The research is supported by the System-Wide Accident Prevention Element of the NASA Aviation Safety Program.
Optimum Design of Forging Process Parameters and Preform Shape under Uncertainties
NASA Astrophysics Data System (ADS)
Repalle, Jalaja; Grandhi, Ramana V.
2004-06-01
Forging is a highly complex non-linear process that is vulnerable to various uncertainties, such as variations in billet geometry, die temperature, material properties, workpiece and forging equipment positional errors and process parameters. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion and production risk. Identifying the sources of uncertainties, quantifying and controlling them will reduce risk in the manufacturing environment, which will minimize the overall cost of production. In this paper, various uncertainties that affect forging tool life and preform design are identified, and their cumulative effect on the forging process is evaluated. Since the forging process simulation is computationally intensive, the response surface approach is used to reduce time by establishing a relationship between the system performance and the critical process design parameters. Variability in system performance due to randomness in the parameters is computed by applying Monte Carlo Simulations (MCS) on generated Response Surface Models (RSM). Finally, a Robust Methodology is developed to optimize forging process parameters and preform shape. The developed method is demonstrated by applying it to an axisymmetric H-cross section disk forging to improve the product quality and robustness.
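The Monte Carlo-on-response-surface step can be sketched as follows; the quadratic surrogate coefficients and parameter distributions are assumptions for illustration, not the paper's fitted models.

```python
# Hedged sketch: propagate randomness in two forging parameters through an assumed
# quadratic response-surface surrogate via Monte Carlo sampling.
import numpy as np

def rsm_die_stress(temp, friction):
    """Assumed quadratic response surface for a performance measure (e.g. die stress)."""
    return 500 + 0.8 * temp + 3000 * friction - 0.002 * temp**2 + 40 * temp * friction

rng = np.random.default_rng(4)
n = 100_000
temp = rng.normal(350.0, 10.0, n)        # billet/die temperature, deg C (assumed)
friction = rng.normal(0.30, 0.02, n)     # friction factor (assumed)

response = rsm_die_stress(temp, friction)
print(f"mean={response.mean():.1f}, std={response.std():.1f}, "
      f"P99={np.percentile(response, 99):.1f}")
```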
Mathieu, Amélie; Vidal, Tiphaine; Jullien, Alexandra; Wu, QiongLi; Chambon, Camille; Bayol, Benoit; Cournède, Paul-Henry
2018-06-19
Functional-structural plant models (FSPMs) describe explicitly the interactions between plants and their environment at organ to plant scale. However, the high level of description of the structure or model mechanisms makes this type of model very complex and hard to calibrate. A two-step methodology to facilitate the calibration process is proposed here. First, a global sensitivity analysis method was applied to the calibration loss function. It provided first-order and total-order sensitivity indexes that allow parameters to be ranked by importance in order to select the most influential ones. Second, the Akaike information criterion (AIC) was used to quantify the model's quality of fit after calibration with different combinations of selected parameters. The model with the lowest AIC gives the best combination of parameters to select. This methodology was validated by calibrating the model on an independent data set (same cultivar, another year) with the parameters selected in the second step. All the parameters were set to their nominal value; only the most influential ones were re-estimated. Sensitivity analysis applied to the calibration loss function is a relevant method to underline the most significant parameters in the estimation process. For the studied winter oilseed rape model, 11 out of 26 estimated parameters were selected. Then, the model could be recalibrated for a different data set by re-estimating only three parameters selected with the model selection method. Fitting only a small number of parameters dramatically increases the efficiency of recalibration, increases the robustness of the model and helps identify the principal sources of variation in varying environmental conditions. This innovative method still needs to be more widely validated but already gives interesting avenues to improve the calibration of FSPMs.
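A hedged sketch of the AIC-based comparison in the second step is shown below, using Gaussian-error AIC computed from residual sums of squares; the candidate subsets and RSS values are illustrative stand-ins for calibrations of the FSPM.

```python
# Hedged sketch: compare calibrations that re-estimate different parameter subsets
# using AIC computed from the residual sum of squares (Gaussian errors assumed).
# The RSS values below are illustrative, not results from the paper.
import numpy as np

def aic(rss, n_obs, k_params):
    return n_obs * np.log(rss / n_obs) + 2 * k_params

n_obs = 50
# Pretend residual sums of squares obtained after re-estimating 1, 3, 6 or 11 parameters:
candidates = {1: 24.0, 3: 9.5, 6: 9.1, 11: 8.9}

scores = {k: aic(rss, n_obs, k) for k, rss in candidates.items()}
best = min(scores, key=scores.get)
for k, s in scores.items():
    print(f"{k:2d} parameters re-estimated: AIC = {s:.1f}")
print("selected subset size:", best)
```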
Applying Chomsky's Linguistic Methodology to the Clinical Interpretation of Symbolic Play.
ERIC Educational Resources Information Center
Ariel, Shlomo
This paper summarizes how Chomsky's methodological principles of linguistics may be applied to the clinical interpretation of children's play. Based on Chomsky's derivation of a "universal grammar" (the set of essential, formal, and substantive traits of any human language), a number of hypothesized formal universals of…
Udod, Sonia A; Racine, Louise
2017-12-01
Drawing on the findings of a grounded theory study aimed at exploring how power is exercised in nurse-manager relationships in the hospital setting, this paper examines the empirical and pragmatic adequacy of grounded theory as a methodology to advance the concept of empowerment in the area of nursing leadership and management. The evidence on staff nurse empowerment has highlighted the magnitude of individual and organisational outcomes, but has not fully explicated the micro-level processes underlying how power is exercised, shared or created within the nurse-manager relationship. Although grounded theory is a widely adopted nursing research methodology, it remains less used in nursing leadership because of the dominance of quantitative approaches to research. Grounded theory methodology provides the empirical and pragmatic relevance to inform nursing practice and policy. Grounded theory is a relevant qualitative approach to use in leadership research as it provides a fine and detailed analysis of the process underlying complexity and bureaucracy. Discursive paper. A critical examination of the empirical and pragmatic relevance of grounded theory by Corbin and Strauss as a method for analysing and solving problems in nurses' practice is provided. This paper provides evidence to support the empirical and pragmatic adequacy of grounded theory methodology. Although the application of the ontological, epistemological and methodological assumptions of grounded theory is challenging, this methodology is useful to address real-life problems in nursing practice by developing theoretical explanations of nurse empowerment, or lack thereof, in the workplace. Grounded theory represents a relevant methodology to inform nursing leadership research. Grounded theory is anchored in the reality of practice. The strength of grounded theory is to provide results that can be readily applied to clinical practice and policy as they arise from problems that affect practice and that are meaningful to nurses. © 2017 John Wiley & Sons Ltd.
Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D
To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The implementation of the Lean Healthcare methodology in a urology department was conducted in 3 phases: 1) team training and improvement of feedback among the practitioners, 2) management by process and superspecialisation, and 3) improvement of indicators (continuous improvement). The indicators were obtained from the Hospital's information systems. The main source of information was the Balanced Scorecard for health systems management (CUIDISS). The comparison with other regional and national urology departments was performed through the same platform with the help of the Hospital's records department (IASIST). A baseline was established with the indicators obtained in 2011 for the comparative analysis of the results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction and improved quality indicators, reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in 4 years. A value of 0.61 was reached with the efficiency indicator (risk-adjusted length of stay [RALOS] index), with a savings of 2869 stays compared with national benchmarking (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, but with progressive annual improvement. The Lean methodology can be effectively applied to a urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvements in all its indicators, as well as practitioner satisfaction. Team training, management by process, continuous improvement and delegation of responsibilities have been shown to be the fundamental pillars of this methodology. Copyright © 2017 AEU. Published by Elsevier España, S.L.U. All rights reserved.
Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen
2014-07-01
Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware is introduced that enables the applicability of this methodology for a selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
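The underlying chemometric step can be sketched as follows: a partial least squares model maps absorption spectra to individual protein concentrations so that co-eluting peaks can be deconvoluted. The spectra below are synthetic mixtures, and the specific hardware and software integration of the paper is not reproduced.

```python
# Hedged sketch: a partial least squares model maps (synthetic) UV absorption spectra
# to individual protein concentrations, the basis for deconvoluting co-eluting peaks.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(6)
wavelengths = 100                       # e.g. channels of a diode array detector
pure = rng.random((3, wavelengths))     # surrogate pure-component spectra
C_train = rng.random((40, 3))           # concentrations of 3 proteins (g/L)
X_train = C_train @ pure + rng.normal(0, 0.01, (40, wavelengths))

pls = PLSRegression(n_components=3).fit(X_train, C_train)

C_new = np.array([[0.5, 0.2, 0.8]])
x_new = C_new @ pure + rng.normal(0, 0.01, (1, wavelengths))
print("predicted concentrations:", np.round(pls.predict(x_new), 2))
```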
Machine learning in sentiment reconstruction of the simulated stock market
NASA Astrophysics Data System (ADS)
Goykhman, Mikhail; Teimouri, Ali
2018-02-01
In this paper we continue the study of the simulated stock market framework defined by the driving sentiment processes. We focus on the market environment driven by the buy/sell trading sentiment process of the Markov chain type. We apply the methodology of the Hidden Markov Models and the Recurrent Neural Networks to reconstruct the transition probabilities matrix of the Markov sentiment process and recover the underlying sentiment states from the observed stock price behavior. We demonstrate that the Hidden Markov Model can successfully recover the transition probabilities matrix for the hidden sentiment process of the Markov Chain type. We also demonstrate that the Recurrent Neural Network can successfully recover the hidden sentiment states from the observed simulated stock price time series.
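A hedged sketch of the recovery task follows, assuming the third-party hmmlearn package is available: returns are generated from a two-state Markov sentiment chain and a Gaussian HMM is fitted to recover the transition matrix and hidden states.

```python
# Hedged sketch (assumes the third-party `hmmlearn` package is installed): simulate
# returns driven by a two-state Markov sentiment chain, then recover the transition
# matrix and hidden states with a Gaussian HMM, echoing the paper's setup.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(7)
P = np.array([[0.95, 0.05],             # true sentiment transition matrix
              [0.10, 0.90]])
means, stds = [0.002, -0.003], [0.01, 0.02]

states = [0]
for _ in range(2999):
    states.append(rng.choice(2, p=P[states[-1]]))
states = np.array(states)
returns = rng.normal(np.take(means, states), np.take(stds, states)).reshape(-1, 1)

model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=200).fit(returns)
print("estimated transition matrix:\n", np.round(model.transmat_, 3))
print("state agreement (up to label swap):",
      max(np.mean(model.predict(returns) == states),
          np.mean(model.predict(returns) != states)))
```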
Interdisciplinary team processes within an in-home service delivery organization.
Gantert, Thomas W; McWilliam, Carol L
2004-01-01
Interdisciplinary teamwork is particularly difficult to achieve in the community context where geographical separateness and solo practices impede face to face contact and collaborative practice. Understanding the processes that occur within interdisciplinary teams is imperative, since client outcomes are influenced by interdisciplinary teamwork. The purpose of this exploratory study was to describe the processes that occur within interdisciplinary teams that deliver in-home care. Applying grounded theory methodology, the researcher conducted unstructured in-depth interviews with a purposeful sample of healthcare providers and used constant comparative analysis to elicit the findings. Findings revealed three key team processes: networking, navigating, and aligning. The descriptions afford several insights that are applicable to in-home healthcare agencies attempting to achieve effective interdisciplinary team functioning.
NASA Technical Reports Server (NTRS)
Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.
2006-01-01
Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.
NASA Astrophysics Data System (ADS)
Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Sazli, M.; Yahya, Z. R.
2017-09-01
This study presents the application of an optimisation method to reduce the warpage of a side arm part. Autodesk Moldflow Insight software was integrated into this study to analyse the warpage. A design of experiments (DOE) for response surface methodology (RSM) was constructed, and using the equation from RSM, particle swarm optimisation (PSO) was applied. The optimisation method results in optimised processing parameters with minimum warpage. Mould temperature, melt temperature, packing pressure, packing time and cooling time were selected as the variable parameters. Parameter selection was based on the most significant factors affecting warpage reported by previous researchers. The results show that warpage was improved by 28.16% for RSM and 28.17% for PSO. The improvement from PSO over RSM is only 0.01%. Thus, the optimisation using RSM is already sufficient to give the best combination of parameters and the optimum warpage value for the side arm part. The most significant parameter affecting warpage is packing pressure.
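A hedged sketch of the PSO step is given below: a small hand-rolled swarm minimises a quadratic response-surface surrogate of warpage over two of the five parameters. The surface coefficients and bounds are illustrative, not the study's fitted model.

```python
# Hedged sketch: particle swarm optimisation of an assumed quadratic response-surface
# surrogate of warpage over packing pressure and melt temperature.
import numpy as np

def warpage(x):                                   # assumed RSM surrogate (arbitrary units)
    p, t = x[..., 0], x[..., 1]
    return 0.9 - 0.004 * p + 0.001 * t + 2e-5 * (p - 80) ** 2 + 4e-5 * (t - 230) ** 2

rng = np.random.default_rng(8)
lo, hi = np.array([60.0, 200.0]), np.array([100.0, 260.0])
x = rng.uniform(lo, hi, (30, 2))                  # particle positions
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), warpage(x)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(100):
    r1, r2 = rng.random((30, 1)), rng.random((30, 1))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    val = warpage(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("optimum packing pressure, melt temperature:", np.round(gbest, 1),
      "warpage:", round(float(warpage(gbest)), 4))
```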
Wang, Ya-Qi; Wu, Zhen-Feng; Ke, Gang; Yang, Ming
2014-12-31
An effective vacuum assisted extraction (VAE) technique was proposed for the first time and applied to extract bioactive components from Andrographis paniculata. The process was carefully optimized by response surface methodology (RSM). Under the optimized experimental conditions, the best results were obtained using a boiling temperature of 65 °C, 50% ethanol concentration, 16 min of extraction time, one extraction cycle and a 12:1 liquid-solid ratio. Compared with conventional ultrasonic assisted extraction and heat reflux extraction, the VAE technique gave shorter extraction times and remarkably higher extraction efficiency, which indicated that a certain degree of vacuum gave the solvent better penetration into the pores and between the matrix particles, and enhanced the mass transfer process. The present results demonstrated that VAE is an efficient, simple and fast method for extracting bioactive components from A. paniculata, which shows great potential for becoming an alternative technique for industrial scale-up applications.
From bricolage to BioBricks™: Synthetic biology and rational design.
Lewens, Tim
2013-12-01
Synthetic biology is often described as a project that applies rational design methods to the organic world. Although humans have influenced organic lineages in many ways, it is nonetheless reasonable to place synthetic biology towards one end of a continuum between purely 'blind' processes of organic modification at one extreme, and wholly rational, design-led processes at the other. An example from evolutionary electronics illustrates some of the constraints imposed by the rational design methodology itself. These constraints reinforce the limitations of the synthetic biology ideal, limitations that are often freely acknowledged by synthetic biology's own practitioners. The synthetic biology methodology reflects a series of constraints imposed on finite human designers who wish, as far as is practicable, to communicate with each other and to intervene in nature in reasonably targeted and well-understood ways. This is better understood as indicative of an underlying awareness of human limitations, rather than as expressive of an objectionable impulse to mastery over nature. Copyright © 2013 Elsevier Ltd. All rights reserved.
Ko, Wen-Ching; Chang, Chao-Kai; Wang, Hsiu-Ju; Wang, Shian-Jen; Hsieh, Chang-Wei
2015-04-01
The aim of this study was to develop an optimal microencapsulation method for an oil-soluble component (curcumin) using γ-PGA. The results show that Span80 significantly enhances the encapsulation efficiency (EE) of γ-Na(+)-PGA microcapsules. Therefore, the effects of γ-Na(+)-PGA, curcumin and Span80 concentration on EE of γ-Na(+)-PGA microcapsules were studied by means of response surface methodology (RSM). It was found that the optimal microencapsulation process is achieved by using γ-Na(+)-PGA 6.05%, curcumin 15.97% and Span80 0.61% with a high EE% (74.47 ± 0.20%). Furthermore, the models explain 98% of the variability in the responses. γ-Na(+)-PGA seems to be a good carrier for the encapsulation of curcumin. In conclusion, this simple and versatile approach can potentially be applied to the microencapsulation of various oil-soluble components for food applications. Copyright © 2014 Elsevier Ltd. All rights reserved.
Boiling process modelling peculiarities analysis of the vacuum boiler
NASA Astrophysics Data System (ADS)
Slobodina, E. N.; Mikhailov, A. G.
2017-06-01
An analysis of the development of low- and medium-power boiler equipment was carried out, and possible development directions for boiler units aimed at improving energy efficiency were identified. Engineering studies on the application of vacuum boilers are presented. Heat-exchange processes in a vacuum boiler, where boiling water is the working body, are considered. A method of heat-exchange intensification under boiling at the maximum heat-transfer coefficient is examined. As a result of the calculation studies, curves of the heat-transfer coefficient as a function of pressure, obtained by analytical and numerical methodologies, were produced. It is concluded that a numerical computing method using the RPI model in ANSYS CFX can be applied to describe the boiling process in the boiler vacuum volume.
Navarrete-Bolaños, J L; Téllez-Martínez, M G; Miranda-López, R; Jiménez-Islas, H
2017-07-03
For any fermentation process, the production cost depends on several factors, such as the genetics of the microorganism, the process conditions, and the culture medium composition. In this work, a guideline for the design of cost-efficient culture media using a sequential approach based on response surface methodology is described. The procedure was applied to analyze and optimize a registered-trademark culture medium and a base culture medium obtained from a screening analysis of different culture media used to grow the same strain according to the literature. During the experiments, the procedure quantitatively identified an appropriate array of micronutrients to obtain a significant yield and found a minimum number of culture medium ingredients without limiting the process efficiency. The resultant culture medium showed an efficiency that compares favorably with the registered-trademark medium at a 95% lower cost, and it reduced the number of ingredients in the base culture medium by 60% without limiting the process efficiency. These results demonstrated that, aside from satisfying the qualitative requirements, an optimum quantity of each constituent is needed to obtain a cost-effective culture medium. Studying the process variables for the optimized culture medium and scaling up production at the optimal values are desirable next steps.
Martinez-Haya, R; Gomis, J; Arques, A; Amat, A M; Miranda, M A; Marin, M L
2017-09-09
Advanced oxidation processes are useful methodologies to accomplish abatement of contaminants; however, elucidation of the reaction mechanisms is hampered by the difficult detection of the short-lived primary key species involved in the photocatalytic processes. Nevertheless, herein the combined use of an organic photocatalyst such as triphenylpyrylium (TPP + ) and photophysical techniques based on emission and absorption spectroscopy allowed monitoring the photocatalyst-derived short-lived intermediates. This methodology has been applied to the photocatalyzed degradation of different pollutants, such as acetaminophen, acetamiprid, caffeine and carbamazepine. First, photocatalytic degradation of a mixture of the pollutants showed that acetaminophen was the most easily photodegraded, followed by carbamazepine and caffeine, being the abatement of acetamiprid almost negligible. This process was accompanied by mineralization, as demonstrated by trapping of carbon dioxide using barium hydroxide. Then, emission spectroscopy measurements (steady-state and time-resolved fluorescence) allowed demonstrating quenching of the singlet excited state of TPP + . Laser flash photolysis experiments with absorption detection showed that oxidation of contaminants is accompanied by TPP + reduction, with formation of a pyranyl radical (TPP), that constituted a fingerprint of the redox nature of the occurring process. The relative amounts of TPP detected was also correlated with the efficiency of the photodegradation process. Copyright © 2017 Elsevier B.V. All rights reserved.
Tsunami hazard assessments with consideration of uncertain earthquakes characteristics
NASA Astrophysics Data System (ADS)
Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.
2017-12-01
The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, it must adopt an uncertainty propagation method to determine tsunami uncertainties with a feasible computational cost. In this study we propose a new methodology, which improves the existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics, the slip distribution and location. First, the methodology considers the generation of consistent earthquake slip samples by means of a Karhunen Loeve (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by Le Veque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution, by avoiding post sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied on a real case. We study tsunamis generated at the site of the 2014 Chilean earthquake. We generate earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates consistent earthquake samples with respect to the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for the 2014 Chilean earthquake. Results show that leading wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of the studied earthquake characteristics.
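A hedged sketch of the Karhunen-Loeve sampling step follows: correlated slip samples are drawn on a one-dimensional fault discretization from an exponential covariance and passed through a simple lognormal translation. Correlation lengths and marginal parameters are illustrative only.

```python
# Hedged sketch of the K-L step: draw correlated slip samples on a 1-D fault
# discretization from an exponential covariance, then rescale through a lognormal
# marginal as a simple stand-in for the translation process. Parameters are assumed.
import numpy as np

rng = np.random.default_rng(9)
x = np.linspace(0, 200e3, 80)                        # along-strike positions (m)
corr_len = 40e3
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

eigval, eigvec = np.linalg.eigh(C)
idx = np.argsort(eigval)[::-1][:15]                  # keep 15 dominant K-L modes
lam, phi = eigval[idx], eigvec[:, idx]

xi = rng.standard_normal(15)
g = phi @ (np.sqrt(lam) * xi)                        # zero-mean Gaussian field
slip = np.exp(np.log(3.0) + 0.5 * g)                 # translate to lognormal slip (m)
print("slip sample: min %.2f m, max %.2f m, mean %.2f m"
      % (slip.min(), slip.max(), slip.mean()))
```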
Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A
2015-01-08
Identification of psychometrically strong instruments for the field of implementation science is a high priority underscored in a recent National Institutes of Health working meeting (October 2013). Existing instrument reviews are limited in scope, methods, and findings. The Society for Implementation Research Collaboration Instrument Review Project's objectives address these limitations by identifying and applying a unique methodology to conduct a systematic and comprehensive review of quantitative instruments assessing constructs delineated in two of the field's most widely used frameworks, adopt a systematic search process (using standard search strings), and engage an international team of experts to assess the full range of psychometric criteria (reliability, construct and criterion validity). Although this work focuses on implementation of psychosocial interventions in mental health and health-care settings, the methodology and results will likely be useful across a broad spectrum of settings. This effort has culminated in a centralized online open-access repository of instruments depicting graphical head-to-head comparisons of their psychometric properties. This article describes the methodology and preliminary outcomes. The seven stages of the review, synthesis, and evaluation methodology include (1) setting the scope for the review, (2) identifying frameworks to organize and complete the review, (3) generating a search protocol for the literature review of constructs, (4) literature review of specific instruments, (5) development of an evidence-based assessment rating criteria, (6) data extraction and rating instrument quality by a task force of implementation experts to inform knowledge synthesis, and (7) the creation of a website repository. To date, this multi-faceted and collaborative search and synthesis methodology has identified over 420 instruments related to 34 constructs (total 48 including subconstructs) that are relevant to implementation science. Despite numerous constructs having greater than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.
Analytical group decision making in natural resources: Methodology and application
Schmoldt, D.L.; Peterson, D.L.
2000-01-01
Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and on techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
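The analytic hierarchy process component of this methodology can be illustrated with a short sketch. The pairwise comparison values and the three criteria are hypothetical, not taken from the fire research workshop; the sketch only shows how priority weights and a consistency index are derived from a comparison matrix.

```python
# Illustrative sketch of the analytic hierarchy process (AHP) step: deriving priority
# weights from a pairwise comparison matrix via its principal eigenvector.
# The comparison values below are hypothetical, not from the workshop described.
import numpy as np

# Pairwise comparisons of three hypothetical criteria on Saaty's 1-9 scale
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
priorities = w / w.sum()                    # normalized priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
print("priorities:", priorities.round(3), "CI:", round(ci, 3))
```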
Effect of Temperature, Time, and Material Thickness on the Dehydration Process of Tomato
Correia, A. F. K.; Loro, A. C.; Zanatta, S.; Spoto, M. H. F.; Vieira, T. M. F. S.
2015-01-01
This study evaluated the effects of temperature, time, and thickness of tomato fruits during an adiabatic drying process. Dehydration, a simple and inexpensive process compared to other conservation methods, is widely used in the food industry in order to ensure a long shelf life for the product due to the low water activity. The study aimed to obtain the best processing conditions to avoid losses and keep product quality. Factorial design and response surface methodology were applied to fit predictive mathematical models. In the dehydration of tomatoes through the adiabatic process, temperature, time, and sample thickness, which greatly contribute to the physicochemical and sensory characteristics of the final product, were evaluated. The optimum drying conditions were 60°C with the lowest thickness level and the shortest time. PMID:26904666
Graphics Processing Unit Assisted Thermographic Compositing
NASA Technical Reports Server (NTRS)
Ragasa, Scott; McDougal, Matthew; Russell, Sam
2012-01-01
Objective: To develop a software application utilizing general purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general purpose fashion is allowing for supercomputer-level results at individual workstations. As data sets grow, the methods to work with them must grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism, that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques. Technical Methodology/Approach: Apply massively parallel algorithms and data structures to the specific analysis requirements presented when working with thermographic data sets.
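The kind of data-parallel computation described here can be sketched with CuPy's NumPy-like GPU arrays as a stand-in for the GPU programming the authors describe. The synthetic frame stack and the per-pixel peak-contrast reduction are illustrative only, not the authors' compositing algorithm, and the sketch assumes a CUDA-capable GPU with the CuPy package installed.

```python
# Sketch of a data-parallel reduction over a stack of thermographic frames, using CuPy's
# NumPy-like GPU arrays. The synthetic data and the per-pixel peak-contrast operation are
# illustrative only.
import numpy as np
import cupy as cp   # assumes a CUDA-capable GPU and the CuPy package

frames_cpu = np.random.rand(200, 512, 512).astype(np.float32)  # (time, y, x)

frames = cp.asarray(frames_cpu)            # transfer to GPU memory
baseline = frames[:10].mean(axis=0)        # pre-flash baseline per pixel
contrast = frames - baseline               # same element-wise op applied to every pixel
peak = contrast.max(axis=0)                # per-pixel peak contrast (parallel reduction)
peak_frame = contrast.argmax(axis=0)       # frame index at which the peak occurs

composite = cp.asnumpy(peak)               # copy the composite back to the host
print(composite.shape, float(composite.mean()), int(cp.asnumpy(peak_frame).max()))
```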
Modelling and analysis of solar cell efficiency distributions
NASA Astrophysics Data System (ADS)
Wasmer, Sven; Greulich, Johannes
2017-08-01
We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrially feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiency of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by requiring only two common measurements of finished cells. The presented approaches can be especially helpful for ramping up production, but can also be applied to enhance established manufacturing.
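A minimal sketch of the propagation-and-sensitivity step is given below. The quadratic metamodel, its coefficients, and the input distributions are invented placeholders for the simulation-based metamodel described in the abstract; first-order sensitivity indices are estimated with a simple binning estimator rather than the authors' exact procedure.

```python
# Sketch: propagate input variations through a (hypothetical, quadratic) metamodel of
# cell efficiency by Monte Carlo and estimate first-order sensitivity indices by binning.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical process inputs: emitter sheet resistance, bulk lifetime, finger width
inputs = {
    "r_sheet":  rng.normal(90.0, 5.0, n),      # ohm/sq
    "tau_bulk": rng.normal(200.0, 30.0, n),    # microseconds
    "w_finger": rng.normal(45.0, 3.0, n),      # micrometers
}

def metamodel(r, tau, w):
    """Toy quadratic response surface standing in for the simulation-based metamodel."""
    return (18.0 - 0.004 * (r - 90) ** 2 + 0.003 * (tau - 200)
            - 0.01 * (w - 45) - 0.0005 * (w - 45) ** 2)

eff = metamodel(inputs["r_sheet"], inputs["tau_bulk"], inputs["w_finger"])
print(f"efficiency distribution: {eff.mean():.2f}% +/- {eff.std():.2f}%")

def first_order_index(x, y, bins=50):
    """Estimate S_i = Var(E[Y|X_i]) / Var(Y) by binning X_i."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    return np.sum(counts * (cond_means - y.mean()) ** 2) / counts.sum() / y.var()

for name, x in inputs.items():
    print(name, round(first_order_index(x, eff), 3))
```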
An application of Six Sigma methodology to turnover intentions in health care.
Taner, Mehmet
2009-01-01
The purpose of this study is to show how the principles of Six Sigma can be applied to the high turnover problem of doctors in medical emergency services and paramedic backup. Six Sigma's define-measure-analyse-improve-control (DMAIC) cycle is applied to reduce the turnover rate of doctors in an organisation operating in emergency services. Variables of the model are determined. Exploratory factor analysis, multiple regression, analysis of variance (ANOVA) and Gage R&R are employed for the analysis. Personal burnout/stress and dissatisfaction with salary were found to be the "vital few" variables. The organisation took a new approach by improving initiatives related to doctors' working conditions. The sigma level of the process increased. New policy and process changes have been found to effectively decrease the incidence of turnover intentions. The improved process has been standardised and institutionalised. This study is one of the few papers in the literature that elaborates on the turnover problem of doctors working in the emergency and paramedic backup services.
Jokić, Stela; Gagić, Tanja; Knez, Željko; Šubarić, Drago; Škerget, Mojca
2018-06-11
Large amounts of residues are produced in the food industries. The waste shells from cocoa processing are usually burnt for fuel or used as a mulch in gardens to add nutrients to soil and to suppress weeds. The objectives of this work were: (a) to separate valuable compounds from cocoa shell by applying a sustainable green separation process, subcritical water extraction (SWE); (b) to identify and quantify active compounds, sugars and sugar degradation products in the obtained extracts using HPLC; (c) to characterize the antioxidant activity of the extracts; (d) to optimize the separation process using response surface methodology (RSM). Depending on the applied extraction conditions, different concentrations of theobromine, caffeine, theophylline, epicatechin, catechin, chlorogenic acid and gallic acid were determined in the extracts obtained by subcritical water. Furthermore, mannose, glucose, xylose, arabinose, rhamnose and fucose were detected, as well as their important degradation products such as 5-hydroxymethylfurfural (5-HMF), furfural, levulinic acid, lactic acid and formic acid.
Greenwood, Taylor J; Lopez-Costa, Rodrigo I; Rhoades, Patrick D; Ramírez-Giraldo, Juan C; Starr, Matthew; Street, Mandie; Duncan, James; McKinstry, Robert C
2015-01-01
The marked increase in radiation exposure from medical imaging, especially in children, has caused considerable alarm and spurred efforts to preserve the benefits but reduce the risks of imaging. Applying the principles of the Image Gently campaign, data-driven process and quality improvement techniques such as process mapping and flowcharting, cause-and-effect diagrams, Pareto analysis, statistical process control (control charts), failure mode and effects analysis, "lean" or Six Sigma methodology, and closed feedback loops led to a multiyear program that has reduced overall computed tomographic (CT) examination volume by more than fourfold and concurrently decreased radiation exposure per CT study without compromising diagnostic utility. This systematic approach involving education, streamlining access to magnetic resonance imaging and ultrasonography, auditing with comparison with benchmarks, applying modern CT technology, and revising CT protocols has led to a more than twofold reduction in CT radiation exposure between 2005 and 2012 for patients at the authors' institution while maintaining diagnostic utility. (©)RSNA, 2015.
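Among the tools listed, the statistical process control element can be illustrated with an individuals/moving-range (I-MR) control chart. The monthly dose values below are synthetic, not data from the authors' program; 2.66 is the standard I-chart constant (3/d2 for moving ranges of two).

```python
# Sketch of the control-chart element mentioned above: an individuals / moving-range
# (I-MR) chart for a monthly dose metric. The data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
ctdi_vol = rng.normal(3.0, 0.3, 24)           # hypothetical monthly mean CTDIvol (mGy)

center = ctdi_vol.mean()
moving_range = np.abs(np.diff(ctdi_vol))
mr_bar = moving_range.mean()

ucl = center + 2.66 * mr_bar                  # upper control limit
lcl = max(center - 2.66 * mr_bar, 0.0)        # lower control limit (dose cannot be < 0)

out_of_control = np.where((ctdi_vol > ucl) | (ctdi_vol < lcl))[0]
print(f"CL={center:.2f} mGy, UCL={ucl:.2f}, LCL={lcl:.2f}, signals at months {out_of_control}")
```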
Pareja, Lucía; Colazzo, Marcos; Pérez-Parada, Andrés; Besil, Natalia; Heinzen, Horacio; Böcking, Bernardo; Cesio, Verónica; Fernández-Alba, Amadeo R
2012-05-09
The results of an experiment to study the occurrence and distribution of pesticide residues during rice cropping and processing are reported. Four herbicides, nine fungicides, and two insecticides (azoxystrobin, bispyribac-sodium, carbendazim, clomazone, difenoconazole, epoxiconazole, isoprothiolane, kresoxim-methyl, propanil, quinclorac, tebuconazole, thiamethoxam, tricyclazole, trifloxystrobin, λ-cyhalothrin) were applied to an isolated rice-crop plot under controlled conditions, during the 2009-2010 cropping season in Uruguay. Paddy rice was harvested and industrially processed to brown rice, white rice, and rice bran, which were analyzed for pesticide residues using the original QuEChERS methodology and its citrate variation by LC-MS/MS and GC-MS. The distribution of pesticide residues was uneven among the different matrices. Ten different pesticide residues were found in paddy rice, seven in brown rice, and eight in rice bran. The highest concentrations were detected in paddy rice. These results provide information regarding the fate of pesticides in the rice food chain and its safety for consumers.
Investigating patients' experiences: methodological usefulness of interpretive interactionism.
Tower, Marion; Rowe, Jennifer; Wallis, Marianne
2012-01-01
To demonstrate the methodological usefulness of interpretive interactionism by applying it to the example of a study investigating the healthcare experiences of women affected by domestic violence. Understanding patients' experiences of health, illness and health care is important to nurses. For many years, biomedical discourse has prevailed in healthcare language and research, and has influenced healthcare responses. Contemporary nursing scholarship can be developed by engaging with new ways of understanding therapeutic interactions with patients. Research that uses qualitative methods of inquiry is an important paradigm for nurses who seek to explain, understand or describe experiences rather than predict outcomes. Interpretive interactionism is an interpretive form of inquiry for conducting studies of social or personal problems that have healthcare policy implications. It puts the patient at the centre of the research process and makes visible the experiences of patients as they interact with the healthcare and social systems that surround them. Interpretive interactionism draws on concepts of symbolic interactionism, phenomenology and hermeneutics. It is a patient-centred methodology that provides an alternative way of understanding patients' experiences, and it has methodological utility because it can contribute to policy and practice development by drawing on the perspectives and experiences of patients, who are central to the research process. It also allows research findings to be situated in and linked to healthcare policy, professional ethics and organisational approaches to care.
de Sousa Costa, Robherson Wector; da Silva, Giovanni Lucca França; de Carvalho Filho, Antonio Oseas; Silva, Aristófanes Corrêa; de Paiva, Anselmo Cardoso; Gattass, Marcelo
2018-05-23
Lung cancer is the leading cause of cancer death among patients around the world and has one of the lowest survival rates after diagnosis. Therefore, this study proposes a methodology for the diagnosis of lung nodules as benign or malignant tumors based on image processing and pattern recognition techniques. Mean phylogenetic distance (MPD) and the taxonomic diversity index (Δ) were used as texture descriptors. Finally, a genetic algorithm in conjunction with a support vector machine was applied to select the best training model. The proposed methodology was tested on computed tomography (CT) images from the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI), with a best sensitivity of 93.42%, specificity of 91.21%, accuracy of 91.81%, and area under the ROC curve of 0.94. The results demonstrate the promising performance of texture extraction techniques using mean phylogenetic distance and the taxonomic diversity index combined with phylogenetic trees. Graphical Abstract: Stages of the proposed methodology.
A hybrid approach to select features and classify diseases based on medical data
NASA Astrophysics Data System (ADS)
AbdelLatif, Hisham; Luo, Jiawei
2018-03-01
Feature selection is a popular problem in the classification of diseases in clinical medicine. Here, we develop a hybrid methodology to classify diseases based on three medical datasets: the Arrhythmia, Breast Cancer, and Hepatitis datasets. This methodology, called k-means ANOVA Support Vector Machine (K-ANOVA-SVM), uses k-means clustering with the ANOVA statistic to preprocess the data and select the significant features, and Support Vector Machines in the classification process. To compare and evaluate performance, we chose three classification algorithms, decision tree, Naïve Bayes, and Support Vector Machines, and applied the medical datasets directly to these algorithms. Our methodology gave much better classification accuracy, 98% on the Arrhythmia dataset, 92% on the Breast Cancer dataset and 88% on the Hepatitis dataset, compared to using the medical data directly with decision tree, Naïve Bayes, and Support Vector Machines. The ROC curve and precision obtained with K-ANOVA-SVM also achieved better results than the other algorithms.
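A sketch of the ANOVA-based feature selection and SVM classification stages is shown below, using scikit-learn. The bundled breast-cancer dataset stands in for the UCI datasets used in the paper, the k-means preprocessing step of the original pipeline is not reproduced, and accuracy figures from this sketch are not comparable to those reported.

```python
# Sketch of the feature-selection-plus-SVM stages described above, using scikit-learn's
# ANOVA F-test (f_classif) for feature selection and an SVM classifier.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("anova", SelectKBest(score_func=f_classif, k=10)),   # keep 10 most significant features
    ("svm", SVC(kernel="rbf", C=1.0, gamma="scale")),
])

scores = cross_val_score(pipeline, X, y, cv=5, scoring="accuracy")
print(f"5-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```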
A call to improve sampling methodology and reporting in young novice driver research.
Scott-Parker, B; Senserrick, T
2017-02-01
Young drivers continue to be over-represented in road crash fatalities despite a multitude of research, communication and intervention. Evidence-based improvement depends to a great extent upon research methodology quality and its reporting, with known limitations in the peer-review process. The aim of the current research was to review the scope of research methodologies applied in 'young driver' and 'teen driver' research and their reporting in four peer-review journals in the field between January 2006 and December 2013. In total, 806 articles were identified and assessed. Reporting omissions included participant gender (11% of papers), response rates (49%), retention rates (39%) and information regarding incentives (44%). Greater breadth and specific improvements in study designs and reporting are thereby identified as a means to further advance the field. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Mortar radiocarbon dating: preliminary accuracy evaluation of a novel methodology.
Marzaioli, Fabio; Lubritto, Carmine; Nonni, Sara; Passariello, Isabella; Capano, Manuela; Terrasi, Filippo
2011-03-15
Mortars represent a class of building and art materials that have been widespread at archeological sites from the Neolithic period on. After about 50 years of experimentation, the possibility of evaluating their absolute chronology by means of radiocarbon ((14)C) still remains uncertain. Using a simplified mortar production process in the laboratory environment, this study shows the overall feasibility of a novel physical pretreatment for the isolation of the atmospheric (14)CO(2) (i.e., binder) signal absorbed by mortars during their setting. This methodology is based on the assumption that an ultrasonic attack in the liquid phase isolates a suspension of binder carbonates from bulk mortars. Isotopic ((13)C and (14)C), %C, X-ray diffractometry (XRD), and scanning electron microscopy (SEM) analyses were performed to characterize the proposed methodology. The applied protocol allows suppression of the fossil carbon (C) contamination originating from the incomplete burning of the limestone during quicklime production, providing unbiased dating for "laboratory" mortars produced at historically adopted burning temperatures.
NASA Astrophysics Data System (ADS)
Skouloudis, Antonis; Evangelinos, Konstantinos; Kourmousis, Fotis
2009-08-01
The purpose of this article is twofold. First, evaluation scoring systems for triple bottom line (TBL) reports to date are examined and potential methodological weaknesses and problems are highlighted. In this context, a new assessment methodology is presented based explicitly on the most widely acknowledged standard on non-financial reporting worldwide, the Global Reporting Initiative (GRI) guidelines. The set of GRI topics and performance indicators was converted into scoring criteria, while the generic scoring device was set from 0 to 4 points. Second, the proposed benchmark tool was applied to the TBL reports published by Greek companies. Results reveal major gaps in reporting practices, stressing the need for the further development of internal systems and processes in order to collect essential non-financial performance data. A critical overview of the structure and rationale of the evaluation tool in conjunction with the Greek case study is discussed, while recommendations for future research in the field of this relatively new form of reporting are suggested.
Characterization of proteomic and metabolomic responses to dietary factors and supplements.
Astle, John; Ferguson, Jonathan T; German, J Bruce; Harrigan, George G; Kelleher, Neil L; Kodadek, Thomas; Parks, Bryan A; Roth, Michael J; Singletary, Keith W; Wenger, Craig D; Mahady, Gail B
2007-12-01
Over the past decade there has been a renewed interest in research and development of both dietary and nutritional supplements. Significant advancements have been made in the scientific assessment of the quality, safety, and efficacy of these products because of the strong interest in and financial support of these projects. As research in both fields continues to advance, opportunities to use new and innovative research technologies and methodologies, such as proteomics and metabolomics, are critical for the future progress of the science. The purpose of the symposium was to begin the process of communicating new innovative proteomic and metabolomic methodologies that may be applied by researchers in both the nutrition and the natural product communities. This symposium highlighted 2 proteomic approaches, protein fingerprinting in complex mixtures with peptoid microarrays and top-down mass spectrometry for annotation of gene products. Likewise, an overview of the methodologies used in metabolomic profiling of natural products was presented, and an illustration of an integrated metabolomics approach in nutrition research was highlighted.
Applications of physiological bases of ageing to forensic sciences. Estimation of age-at-death.
C Zapico, Sara; Ubelaker, Douglas H
2013-03-01
Age-at-death estimation is one of the main challenges in forensic sciences since it contributes to the identification of individuals. There are many anthropological techniques to estimate the age at death in children and adults. However, in adults this methodology is less accurate and requires population specific references. For that reason, new methodologies have been developed. Biochemical methods are based on the natural process of ageing, which induces different biochemical changes that lead to alterations in cells and tissues. In this review, we describe different attempts to estimate the age in adults based on these changes. Chemical approaches imply modifications in molecules or accumulation of some products. Molecular biology approaches analyze the modifications in DNA and chromosomes. Although the most accurate technique appears to be aspartic acid racemization, it is important to take into account the other techniques because the forensic context and the human remains available will determine the possibility to apply one or another methodology. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Dash, S. M.; York, B. J.; Sinha, N.; Dvorak, F. A.
1987-01-01
An overview of parabolic and PNS (Parabolized Navier-Stokes) methodology developed to treat highly curved subsonic and supersonic wall jets is presented. The fundamental database to which these models were applied is discussed in detail. The analysis of strong curvature effects was found to require a semi-elliptic extension of the parabolic modeling to account for turbulent contributions to the normal pressure variations, as well as an extension of the turbulence models utilized to account for the highly enhanced mixing rates observed in situations with large convex curvature. A noniterative, pressure-split procedure is shown to extend parabolic models to account for such normal pressure variations in an efficient manner, requiring minimal additional run time over a standard parabolic approach. A new PNS methodology is presented to solve this problem, which extends parabolic methodology via the addition of a characteristic-based wave solver. Applications of this approach to analyze the interaction of wave and turbulence processes in wall jets are presented.
Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M
2013-09-01
Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.
EMD-Based Methodology for the Identification of a High-Speed Train Running in a Gear Operating State
Bustos, Alejandro; Rubio, Higinio; Castejón, Cristina; García-Prada, Juan Carlos
2018-03-06
Efficient maintenance is a key consideration in railway transport systems, especially in high-speed trains, in order to avoid accidents with catastrophic consequences. In this sense, having a method that allows for the early detection of defects in critical elements, such as the bogie mechanical components, is crucial for increasing the availability of rolling stock and reducing maintenance costs. The main contribution of this work is the proposal of a methodology that, based on classical signal processing techniques, provides a set of parameters for the fast identification of the operating state of a critical mechanical system. With this methodology, the vibratory behaviour of a very complex mechanical system is characterised through variable inputs, which allows for the detection of possible changes in the mechanical elements. This methodology is applied to a real high-speed train in commercial service, with the aim of studying the vibratory behaviour of the train (specifically, the bogie) before and after a maintenance operation. The results obtained with this methodology demonstrated the usefulness of the new procedure and revealed reductions of between 15% and 45% in the spectral power of selected Intrinsic Mode Functions (IMFs) after the maintenance operation.
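The decomposition-and-comparison idea can be sketched as follows. The sketch assumes the PyEMD package (distributed on PyPI as EMD-signal), a synthetic vibration signal in place of the on-board bogie measurements, and a generic per-IMF spectral power measure rather than the authors' exact parameter set.

```python
# Sketch of the EMD-plus-spectral-power idea described above: decompose a vibration
# signal into Intrinsic Mode Functions and compare the spectral power of selected IMFs.
# Assumes the PyEMD package; the signals are synthetic stand-ins for bogie measurements.
import numpy as np
from PyEMD import EMD

fs = 5000.0                                   # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1.0 / fs)

def vibration(defect_level):
    """Toy bogie-like vibration: shaft + gear-mesh tones plus defect-dependent noise."""
    return (np.sin(2 * np.pi * 30 * t)
            + 0.5 * np.sin(2 * np.pi * 700 * t)
            + defect_level * np.random.default_rng(3).normal(0, 1, t.size))

def imf_band_power(signal, n_imfs=5):
    """Empirical mode decomposition followed by per-IMF spectral power."""
    imfs = EMD()(signal)[:n_imfs]
    return [float(np.mean(np.abs(np.fft.rfft(imf)) ** 2)) for imf in imfs]

before = imf_band_power(vibration(defect_level=0.8))   # before maintenance
after = imf_band_power(vibration(defect_level=0.4))    # after maintenance

for i, (b, a) in enumerate(zip(before, after), start=1):
    print(f"IMF{i}: spectral power change {100 * (a - b) / b:+.1f}%")
```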
Juck, D. F.; Whissell, G.; Steven, B.; Pollard, W.; McKay, C. P.; Greer, C. W.; Whyte, L. G.
2005-01-01
Fluorescent microspheres were applied in a novel fashion during subsurface drilling of permafrost and ground ice in the Canadian High Arctic to monitor the exogenous microbiological contamination of core samples obtained during the drilling process. Prior to each drill run, a concentrated fluorescent microsphere (0.5-μm diameter) solution was applied to the interior surfaces of the drill bit, core catcher, and core tube and allowed to dry. Macroscopic examination in the field demonstrated reliable transfer of the microspheres to core samples, while detailed microscopic examination revealed penetration levels of less than 1 cm from the core exterior. To monitor for microbial contamination during downstream processing of the permafrost and ground ice cores, a Pseudomonas strain expressing the green fluorescent protein (GFP) was painted on the core exterior prior to processing. Contamination of the processed core interiors with the GFP-expressing strain was not detected by culturing the samples or by PCR to detect the gfp marker gene. These methodologies were quick, were easy to apply, and should help to monitor the exogenous microbiological contamination of pristine permafrost and ground ice samples for downstream culture-dependent and culture-independent microbial analyses. PMID:15691963
Masè, Michela; Cristoforetti, Alessandro; Avogaro, Laura; Tessarolo, Francesco; Piccoli, Federico; Caola, Iole; Pederzolli, Carlo; Graffigna, Angelo; Ravelli, Flavia
2015-01-01
The assessment of collagen structure in cardiac pathology, such as atrial fibrillation (AF), is essential for a complete understanding of the disease. This paper introduces a novel methodology for the quantitative description of collagen network properties, based on the combination of nonlinear optical microscopy with a spectral approach of image processing and analysis. Second-harmonic generation (SHG) microscopy was applied to atrial tissue samples from cardiac surgery patients, providing label-free, selective visualization of the collagen structure. The spectral analysis framework, based on 2D-FFT, was applied to the SHG images, yielding a multiparametric description of collagen fiber orientation (angle and anisotropy indexes) and texture scale (dominant wavelength and peak dispersion indexes). The proof-of-concept application of the methodology showed the capability of our approach to detect and quantify differences in the structural properties of the collagen network in AF versus sinus rhythm patients. These results suggest the potential of our approach in the assessment of collagen properties in cardiac pathologies related to a fibrotic structural component.
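The spectral analysis framework can be sketched in a few lines: take the 2-D power spectrum of a fiber image and summarize its angular energy distribution. The striped test image stands in for an SHG acquisition, and the orientation and anisotropy definitions below are generic illustrations, not the authors' exact indexes.

```python
# Sketch of the 2D-FFT image analysis idea: power spectrum of a fiber-like image and a
# simple dominant-orientation / anisotropy summary. The test image is synthetic; note the
# dominant spectral angle is perpendicular to the stripe (fiber) direction.
import numpy as np

n = 256
yy, xx = np.mgrid[0:n, 0:n]
theta = np.deg2rad(30.0)                      # stripe wave-vector direction
img = np.sin(2 * np.pi * (xx * np.cos(theta) + yy * np.sin(theta)) / 16.0)
img += 0.3 * np.random.default_rng(4).normal(size=img.shape)

# Centered 2-D power spectrum
power = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean()))) ** 2

# Polar coordinates of each spectral bin
ky, kx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
angle = np.degrees(np.arctan2(ky, kx)) % 180.0
radius = np.hypot(kx, ky)
mask = (radius > 2) & (radius < n // 4)       # ignore DC and very high frequencies

# Angular distribution of spectral energy
bins = np.arange(0, 181, 5)
hist, _ = np.histogram(angle[mask], bins=bins, weights=power[mask])
dominant_angle = bins[np.argmax(hist)] + 2.5
anisotropy = (hist.max() - hist.mean()) / (hist.max() + hist.mean())

print(f"dominant spectral angle ~{dominant_angle:.1f} deg, anisotropy index {anisotropy:.2f}")
```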
Morales-Bayuelo, Alejandro
2016-07-01
Though QSAR was originally developed in the context of physical organic chemistry, it has been applied very extensively to chemicals (drugs) that act on biological systems, and one of the most important QSAR methods is the 3D-QSAR model. However, because its results are difficult to interpret, new methodologies are needed to highlight their physicochemical meaning. In this sense, this work postulates new insights to understand CoMFA results using molecular quantum similarity and chemical reactivity descriptors within the framework of density functional theory. To obtain these insights, a simple theoretical scheme involving quantum similarity (overlap and Coulomb operators and their Euclidean distances) and chemical reactivity descriptors such as the chemical potential (μ), hardness (η), softness (S), electrophilicity (ω), and the Fukui functions was used to understand the substitution effect. This methodology can be applied to analyze the biological activity and the stabilization process in the non-covalent interactions of a particular molecular set taking a reference compound.
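For reference, the reactivity descriptors named above are commonly evaluated from frontier orbital energies in a Koopmans-type approximation; one common convention (with the factor 1/2 in the hardness) is sketched below with placeholder orbital energies, not values from the paper.

```python
# Sketch of the conceptual-DFT descriptors named above, computed in the finite-difference
# / Koopmans-type approximation from frontier orbital energies. Conventions without the
# factor 1/2 in the hardness also exist; the orbital energies are placeholders.
e_homo = -6.5   # eV, hypothetical
e_lumo = -1.2   # eV, hypothetical

ionization = -e_homo                 # I ~ -E_HOMO
affinity = -e_lumo                   # A ~ -E_LUMO

mu = -(ionization + affinity) / 2.0  # chemical potential (eV)
eta = (ionization - affinity) / 2.0  # chemical hardness (eV)
softness = 1.0 / (2.0 * eta)         # global softness (1/eV)
omega = mu ** 2 / (2.0 * eta)        # electrophilicity index (eV)

print(f"mu={mu:.2f} eV, eta={eta:.2f} eV, S={softness:.3f} eV^-1, omega={omega:.2f} eV")
```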
NASA Astrophysics Data System (ADS)
Nazir, Mohd Yusuf Mohd; Al-Shorgani, Najeeb Kaid Nasser; Kalil, Mohd Sahaid; Hamid, Aidil Abdul
2015-09-01
In this study, three factors (fructose concentration, agitation speed and monosodium glutamate (MSG) concentration) were optimized to enhance DHA production by Schizochytrium SW1 using response surface methodology (RSM). A central composite design was applied as the experimental design and analysis of variance (ANOVA) was used to analyze the data. The experiments were conducted in 500 mL flasks with a 100 mL working volume at 30°C for 96 hours. The ANOVA revealed that the process was adequately represented by the quadratic model (p<0.0001) and that two of the factors, namely agitation speed and MSG concentration, significantly affected DHA production (p<0.005). The level of influence of each variable and a quadratic polynomial equation for DHA production were obtained by multiple regression analysis. The estimated optimum conditions for maximizing DHA production by SW1 were 70 g/L fructose, 250 rpm agitation speed and 12 g/L MSG. Consequently, the quadratic model was validated by applying the estimated optimum conditions, which confirmed the model's validity and yielded a DHA content of 52.86%.
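The response-surface step can be sketched as a least-squares fit of a full quadratic model in coded factors followed by locating its stationary point. The design below is a simple three-level factorial and the response values are randomly generated around an assumed optimum, not the study's measurements.

```python
# Sketch of the response-surface step: fit a full quadratic model in three coded factors
# and locate its stationary point. The data are generated around an assumed optimum.
import numpy as np

rng = np.random.default_rng(5)

# Coded factor levels (-1, 0, +1) for fructose, agitation and MSG
levels = np.array([-1.0, 0.0, 1.0])
grid = np.array(np.meshgrid(levels, levels, levels)).T.reshape(-1, 3)
x1, x2, x3 = grid.T

def true_surface(x1, x2, x3):
    """Assumed underlying response used only to generate example data."""
    return 40 + 3*x1 + 5*x2 + 4*x3 - 4*x1**2 - 6*x2**2 - 5*x3**2 + 1.5*x1*x2

y = true_surface(x1, x2, x3) + rng.normal(0, 0.5, len(grid))

# Design matrix of the full quadratic model
X = np.column_stack([np.ones_like(x1), x1, x2, x3,
                     x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point: solve grad = 0 for the fitted quadratic
b = beta[1:4]
B = np.array([[2*beta[4], beta[7],   beta[8]],
              [beta[7],   2*beta[5], beta[9]],
              [beta[8],   beta[9],   2*beta[6]]])
x_opt = np.linalg.solve(B, -b)
print("coded optimum (fructose, agitation, MSG):", x_opt.round(2))
```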
DOE Office of Scientific and Technical Information (OSTI.GOV)
García-Sánchez, Tania; Gómez-Lázaro, Emilio; Muljadi, E.
An alternative approach to characterise real voltage dips is proposed and evaluated in this study. The proposed methodology is based on voltage-space vector solutions, identifying parameters for ellipse trajectories by using the least-squares algorithm applied on a sliding window along the disturbance. The most likely patterns are then estimated through a clustering process based on the k-means algorithm. The objective is to offer an efficient and easily implemented alternative to characterise faults and visualise the most likely instantaneous phase-voltage evolution during events through their corresponding voltage-space vector trajectories. This novel solution minimises the data to be stored but maintains extensive information about the dips, including starting and ending transients. The proposed methodology has been applied satisfactorily to real voltage dips obtained from intensive field-measurement campaigns carried out in a Spanish wind power plant over a time period of several years. A comparison to traditional minimum root-mean-square-voltage and time-duration classifications is also included in this study.
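The ellipse-identification step can be sketched as an algebraic least-squares fit of a conic to the space-vector (alpha-beta) trajectory within a window. The synthetic dip trajectory and the unconstrained least-squares formulation below are illustrative; practical fitters add constraints that guarantee an ellipse and run the fit on a sliding window.

```python
# Sketch of the ellipse-fitting idea: algebraic least-squares fit of a conic to a
# synthetic alpha-beta voltage trajectory during an unbalanced dip.
import numpy as np

rng = np.random.default_rng(6)

# Synthetic space-vector samples: an ellipse (unbalanced voltages) plus noise
phi = np.linspace(0, 2 * np.pi, 400)
a_true, b_true, tilt = 1.0, 0.6, np.deg2rad(20)
alpha = a_true * np.cos(phi) * np.cos(tilt) - b_true * np.sin(phi) * np.sin(tilt)
beta = a_true * np.cos(phi) * np.sin(tilt) + b_true * np.sin(phi) * np.cos(tilt)
alpha += rng.normal(0, 0.01, phi.size)
beta += rng.normal(0, 0.01, phi.size)

# Fit A*x^2 + B*x*y + C*y^2 + D*x + E*y = 1 by ordinary least squares
M = np.column_stack([alpha**2, alpha * beta, beta**2, alpha, beta])
coeffs, *_ = np.linalg.lstsq(M, np.ones_like(alpha), rcond=None)
A, B, C, D, E = coeffs

# Semi-axes from the quadratic-form eigenvalues (centered ellipse, D ~ E ~ 0 here)
eigvals = np.linalg.eigvalsh(np.array([[A, B / 2], [B / 2, C]]))
semi_axes = 1.0 / np.sqrt(eigvals)
print("fitted semi-axes:", np.sort(semi_axes)[::-1].round(3))
```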
Toward methodological emancipation in applied health research.
Thorne, Sally
2011-04-01
In this article, I trace the historical groundings of what have become methodological conventions in the use of qualitative approaches to answer questions arising from the applied health disciplines and advocate an alternative logic more strategically grounded in the epistemological orientations of the professional health disciplines. I argue for an increasing emphasis on the modification of conventional qualitative approaches to the particular knowledge demands of the applied practice domain, challenging the merits of what may have become unwarranted attachment to theorizing. Reorienting our methodological toolkits toward the questions arising within an evidence-dominated policy agenda, I encourage my applied health disciplinary colleagues to make themselves useful to that larger project by illuminating that which quantitative research renders invisible, problematizing the assumptions on which it generates conclusions, and filling in the gaps in knowledge needed to make decisions on behalf of people and populations.
Steuten, Lotte; van de Wetering, Gijs; Groothuis-Oudshoorn, Karin; Retèl, Valesca
2013-01-01
This article provides a systematic and critical review of the evolving methods and applications of value of information (VOI) in academia and practice and discusses where future research needs to be directed. Published VOI studies were identified by conducting a computerized search on Scopus and ISI Web of Science from 1980 until December 2011 using pre-specified search terms. Only full-text papers that outlined and discussed VOI methods for medical decision making, and studies that applied VOI and explicitly discussed the results with a view to informing healthcare decision makers, were included. The included papers were divided into methodological and applied papers, based on the aim of the study. A total of 118 papers were included, of which 50% (n = 59) were methodological. A rapidly accumulating literature base on VOI is observed from 1999 onwards for methodological papers and from 2005 onwards for applied papers. Expected value of sample information (EVSI) is the preferred method of VOI to inform decision making regarding specific future studies, but real-life applications of EVSI remain scarce. Methodological challenges to VOI are numerous and include the high computational demands, dealing with non-linear models and interdependency between parameters, estimation of effective time horizons and patient populations, and structural uncertainties. VOI analysis receives increasing attention in both the methodological and the applied literature, but challenges to applying VOI in real-life decision making remain. For many technical and methodological challenges to VOI, analytic solutions have been proposed in the literature, including leaner methods for VOI. Further research should also focus on the needs of decision makers regarding VOI.
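As background to the methods reviewed, the basic value-of-information quantity (here the expected value of perfect information, EVPI) can be computed by Monte Carlo in a few lines; the two strategies, the parameter distributions, and the willingness-to-pay threshold below are invented for illustration.

```python
# Sketch of the basic value-of-information calculation: expected value of perfect
# information (EVPI) from simulated net monetary benefits of two hypothetical strategies.
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
wtp = 20_000.0                              # willingness to pay per unit of effect (assumed)

# Uncertain model parameters (hypothetical distributions)
effect_new = rng.beta(8, 12, n)             # probability of response, new treatment
effect_old = rng.beta(6, 14, n)             # probability of response, comparator
cost_new = rng.gamma(40, 100, n)            # treatment costs
cost_old = rng.gamma(20, 100, n)

# Net monetary benefit of each strategy per simulated parameter set
nb = np.column_stack([
    wtp * effect_old - cost_old,            # strategy 0: comparator
    wtp * effect_new - cost_new,            # strategy 1: new treatment
])

# EVPI = E[max over strategies] - max over strategies of E[NB]
evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
print(f"EVPI per patient: {evpi:,.0f} monetary units")
```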