Methodology for Physics and Engineering of Reliable Products
NASA Technical Reports Server (NTRS)
Cornford, Steven L.; Gibbel, Mark
1996-01-01
Physics of failure approaches have gained widespread acceptance within the electronic reliability community. These methodologies involve identifying root-cause failure mechanisms, developing associated models, and utilizing these models to improve time to market, lower development and build costs, and increase reliability. The methodology outlined herein sets forth a process, based on integration of both physics and engineering principles, for achieving the same goals.
Problem Solving in Biology: A Methodology
ERIC Educational Resources Information Center
Wisehart, Gary; Mandell, Mark
2008-01-01
A methodology is described that teaches science process by combining informal logic and a heuristic for rating factual reliability. This system facilitates student hypothesis formation, testing, and evaluation of results. After problem solving with this scheme, students are asked to examine and evaluate arguments for the underlying principles of…
Decision-theoretic methodology for reliability and risk allocation in nuclear power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.
1985-01-01
This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided and several outstanding issues such as generic allocation and preference assessment are discussed.
Principle of maximum entropy for reliability analysis in the design of machine components
NASA Astrophysics Data System (ADS)
Zhang, Yimin
2018-03-01
We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
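To illustrate the PME step in concrete terms, the following Python sketch fits a maximum-entropy density to a set of prescribed raw moments of a state function and integrates it over the failure region. The moment values, integration support, overflow guard, and solver choice are assumptions made for this illustration and are not taken from the paper.

```python
# Hedged sketch: maximum-entropy PDF from the first four raw moments of a
# limit-state function g, then failure probability P(g <= 0).
import numpy as np
from scipy.optimize import least_squares
from scipy.integrate import quad

def max_entropy_pdf(moments, support=(-6.0, 6.0)):
    """Density p(x) = exp(-(lam1*x + ... + lamn*x^n)) / Z whose first
    len(moments) raw moments match `moments` (maximum-entropy form)."""
    n = len(moments)

    def unnorm(lam, x):
        expo = -sum(l * x ** (i + 1) for i, l in enumerate(lam))
        return np.exp(np.clip(expo, -700.0, 700.0))   # guard against overflow

    def raw_moment(lam, k):
        return quad(lambda x: x**k * unnorm(lam, x), *support, limit=200)[0]

    def residuals(lam):
        z = raw_moment(lam, 0)
        return [raw_moment(lam, k + 1) / z - moments[k] for k in range(n)]

    lam = least_squares(residuals, np.zeros(n)).x
    z = raw_moment(lam, 0)
    return lambda x: unnorm(lam, x) / z

# Hypothetical first four raw moments of a state function g (chosen to be
# consistent with a distribution close to N(1.0, 0.5^2)); reliability = P(g > 0).
moments = [1.0, 1.25, 1.75, 2.7]
pdf = max_entropy_pdf(moments)
p_fail = quad(pdf, -6.0, 0.0)[0]
print(f"failure probability ~ {p_fail:.4f}, reliability ~ {1 - p_fail:.4f}")
```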
1988-09-01
applies to a one Air Transport Rack (ATR) volume LRU in an airborne, uninhabited, fighter environment.) The goal is to have a 2000 hour mean time between...benefits of applying reliability and maintainability improvements to these weapon systems or components. Examples will be given in this research of...where the Pareto Principle applies. The Pareto analysis applies to field failure types as well as to shop defect types. In the following automotive
Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.
2005-01-01
An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
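To make the fast-fracture part of such a transient analysis concrete, here is a minimal Python sketch that evaluates a component's survival probability after each step of a load history, using a two-parameter Weibull volume-flaw model and the peak stress seen by each element up to that step. The element volumes, stresses, and Weibull parameters are invented, and slow crack growth, multiaxiality, and proof testing are deliberately left out.

```python
import numpy as np

def transient_fast_fracture_reliability(stress_history, volumes, m, sigma_0):
    """Survival probability after each load step for a component whose volume
    elements see the stresses in `stress_history` (steps x elements).
    Weibull modulus m; sigma_0 absorbs the unit-volume normalization."""
    stress_history = np.asarray(stress_history, dtype=float)
    peak = np.maximum.accumulate(stress_history, axis=0)   # worst stress so far
    # Weibull volume-flaw risk of rupture per element, summed over the body
    risk = np.sum(volumes * (np.clip(peak, 0.0, None) / sigma_0) ** m, axis=1)
    return np.exp(-risk)                                    # P_s after each step

# Hypothetical 3-element component and a startup/steady/shutdown history (MPa)
volumes = np.array([2.0, 1.5, 3.0])                         # mm^3
history = [[120.0, 90.0, 60.0],                             # startup peak
           [100.0, 80.0, 55.0],                             # steady state
           [140.0, 95.0, 70.0]]                             # shutdown transient
print(transient_fast_fracture_reliability(history, volumes, m=10.0, sigma_0=200.0))
```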
Traumatic brain injury: methodological approaches to estimate health and economic outcomes.
Lu, Juan; Roe, Cecilie; Aas, Eline; Lapane, Kate L; Niemeier, Janet; Arango-Lasprilla, Juan Carlos; Andelic, Nada
2013-12-01
The effort to standardize the methodology and adherence to recommended principles for all economic evaluations has been emphasized in the medical literature. The objective of this review is to examine whether economic evaluations in traumatic brain injury (TBI) research have been compliant with existing guidelines. A Medline search was performed covering January 1, 1995 to August 11, 2012. All original TBI-related full economic evaluations were included in the study. Two authors independently rated each study's methodology and data presentation to determine compliance with the 10 methodological principles recommended by Blackmore et al. Descriptive analysis was used to summarize the data. Inter-rater reliability was assessed with Kappa statistics. A total of 28 studies met the inclusion criteria. Eighteen of these studies described cost-effectiveness, seven cost-benefit, and three cost-utility analyses. The results showed a rapid growth in the number of published articles on the economic impact of TBI since 2000 and an improvement in their methodological quality. However, overall compliance with recommended methodological principles of TBI-related economic evaluation has been deficient. On average, about six of the 10 criteria were followed in these publications, and only two articles met all 10 criteria. These findings call for an increased awareness of the methodological standards that should be followed by investigators both in performance of economic evaluation and in reviews of evaluation reports prior to publication. The results also suggest that all economic evaluations should be conducted by following the guidelines within a conceptual framework, in order to facilitate evidence-based practices in the field of TBI.
Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerner, Boris S.
It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that generally accepted fundamentals and methodologies of traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. To these fundamentals and methodologies of traffic and transportation theory belong (i) the Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, the Herman, Gazis et al. GM model, Gipps’s model, Payne’s model, Newell’s optimal velocity (OV) model, Wiedemann’s model, the Bando et al. OV model, Treiber’s IDM, and Krauß’s model), (iii) the understanding of highway capacity as a particular stochastic value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop’s user equilibrium (UE) and system optimum (SO) principles). As an alternative to these generally accepted fundamentals and methodologies of traffic and transportation theory, we discuss three-phase traffic theory as the basis for traffic flow modeling, and briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.
Design Science Methodology Applied to a Chemical Surveillance Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.
Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency, a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.
Risk assessment for construction projects of transport infrastructure objects
NASA Astrophysics Data System (ADS)
Titarenko, Boris
2017-10-01
The paper analyzes and compares different methods of risk assessment for construction projects of transport objects. The management of such projects demands the application of special probabilistic methods because of the large level of uncertainty in their implementation. Risk management in these projects requires the use of probabilistic and statistical methods. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that allow reliable risk assessments to be obtained. The robust approach is based on the principle of maximum likelihood and, in assessing risk, allows the researcher to obtain reliable results in situations of great uncertainty. The application of robust procedures makes it possible to carry out a quantitative assessment of the main risk indicators of projects when solving the tasks of managing innovation-investment projects. Calculation of the damage from the onset of a risky event can be performed by any competent specialist, whereas an assessment of the probability of occurrence of a risky event requires special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of the results. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.
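The abstract does not name the specific robust estimator used. As one illustration of the general idea, the sketch below compares an ordinary mean with a Huber-type M-estimate (a maximum-likelihood-type robust estimate) of a risk indicator computed from cost-overrun data that contain outliers; the data and tuning constant are assumed.

```python
import numpy as np

def huber_m_estimate(x, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted averaging."""
    x = np.asarray(x, dtype=float)
    mu = np.median(x)
    scale = 1.4826 * np.median(np.abs(x - mu)) or 1.0   # robust scale (MAD)
    for _ in range(max_iter):
        r = (x - mu) / scale
        w = np.where(np.abs(r) <= k, 1.0, k / np.abs(r))  # Huber weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

# Hypothetical cost overruns (% of budget) for similar projects, with outliers
overruns = [4.1, 5.0, 3.8, 6.2, 4.7, 5.5, 38.0, 4.3, 5.1, 41.5]
print("plain mean     :", np.mean(overruns))
print("robust estimate:", huber_m_estimate(overruns))
```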
49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 4 2010-10-01 2010-10-01 false General Principles of Reliability-Based... STANDARDS Pt. 238, App. E Appendix E to Part 238—General Principles of Reliability-Based Maintenance... maintenance programs are based on the following general principles. A failure is an unsatisfactory condition...
NASA Astrophysics Data System (ADS)
Matos, José P.; Schaefli, Bettina; Schleiss, Anton J.
2017-04-01
Uncertainty affects hydrological modelling efforts from the very measurements (or forecasts) that serve as inputs to the more or less inaccurate predictions that are produced. Uncertainty is truly inescapable in hydrology and yet, due to the theoretical and technical hurdles associated with its quantification, it is at times still neglected or estimated only qualitatively. In recent years the scientific community has made a significant effort towards quantifying this hydrologic prediction uncertainty. Despite this, most of the developed methodologies can be computationally demanding, are complex from a theoretical point of view, require substantial expertise to be employed, and are constrained by a number of assumptions about the model error distribution. These assumptions limit the reliability of many methods in case of errors that show particular cases of non-normality, heteroscedasticity, or autocorrelation. The present contribution builds on a non-parametric data-driven approach that was developed for uncertainty quantification in operational (real-time) forecasting settings. The approach is based on the concept of Pareto optimality and can be used as a standalone forecasting tool or as a postprocessor. By virtue of its non-parametric nature and a general operating principle, it can be applied directly and with ease to predictions of streamflow, water stage, or even accumulated runoff. Also, it is a methodology capable of coping with high heteroscedasticity and seasonal hydrological regimes (e.g. snowmelt and rainfall driven events in the same catchment). Finally, the training and operation of the model are very fast, making it a tool particularly adapted to operational use. To illustrate its practical use, the uncertainty quantification method is coupled with a process-based hydrological model to produce statistically reliable forecasts for an Alpine catchment located in Switzerland. Results are presented and discussed in terms of their reliability and resolution.
Rothenberger, Lillian G; Henschel, Andreas Dirk; Schrey, Dominik; Becker, Andreas; Boos, Joachim
2011-10-01
Due to the new European regulations for pediatric medications, future clinical trials will include an increasing number of minors. It is therefore important to reconsider and evaluate recent methodological and ethical aspects of clinical trials in minors. The following questions were investigated: How are randomized controlled clinical trials (RCTs) performed in practice? Do investigators take into consideration biomedical ethical principles, explicated for example by Beauchamp and Childress, when planning and conducting a trial? The study was conducted in a descriptive manner. A systematic, algorithm-guided search focusing on RCTs in minors with malignant diseases was carried out in PubMed. A total of 1,962 publications from 2001 to 2005 were placed in random sequence. The first 1,000 publications were screened according to a priori defined inclusion criteria. One hundred seventy-five publications met the criteria and were reviewed using the SIGN methodological checklist (2004), the CONSORT Statement (2001, section Methods, items 3-12) and indicators for ethical aspects. Seventeen publications were checked by two raters. Information on randomization and blinding was often equivocal. The publications were mainly rated positive for the criteria of the SIGN checklist, and mostly rated negative for the additional items of the CONSORT Statement. Regarding the ethical principles, only a few contributions were found in the publications. Inter-rater reliability was good. In the publications analyzed, we found only limited information concerning methods and reflections on ethical principles of the trials. Improvements are thus necessary and possible. We suggest how such trials and their respective publications can be optimized for these aspects. Copyright © 2011 Wiley-Liss, Inc.
[Modern principles of the geriatric analysis in medicine].
Volobuev, A N; Zaharova, N O; Romanchuk, N P; Romanov, D V; Romanchuk, P I; Adyshirin-Zade, K A
2016-01-01
The proposed methodological principles of geriatric analysis in medicine make it possible to plan the economic parameters of social protection of the population, the necessary amount of financing for medical care, and the structure of qualified medical personnel training. It is shown that personal health and cognitive longevity depend on an adequate systematic geriatric analysis and on the monitoring of biological parameters over time, which allows the efficiency of combined individual treatment to be estimated. The geriatric analysis, and in particular its genetic-mathematical component, is aimed at the reliability and objectivity of the estimation of life expectancy in the country and in the region, by accounting for the influence of mutagenic factors both on a person's genes during their lifetime and on the population as a whole.
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
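As a toy illustration of the nested (double-loop) formulation that the unilevel and decoupled methods aim to accelerate, the sketch below minimizes a cost function subject to a reliability constraint, with the inner reliability analysis reduced to a closed-form first-order reliability index for a linear Gaussian limit state; the limit state, distributions, and target index are assumptions for the example.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Toy limit state: g = R - S, resistance R ~ N(mu_R(d), sig_R(d)),
# load S ~ N(mu_S, sig_S). The design variable d scales the mean resistance.
mu_S, sig_S = 100.0, 15.0

def reliability_index(d):
    """Inner 'reliability analysis' loop. For this linear Gaussian limit state
    the first-order reliability index has a closed form; in general this step
    would be a FORM/SORM or sampling analysis."""
    mu_R = 40.0 * d[0]
    sig_R = 0.08 * mu_R                     # 8% coefficient of variation (assumed)
    return (mu_R - mu_S) / np.sqrt(sig_R**2 + sig_S**2)

def cost(d):
    return 10.0 * d[0]                      # cost grows with the design size

beta_target = 3.0                           # roughly Pf = 1.3e-3
res = minimize(cost, x0=[5.0], bounds=[(1.0, 10.0)], method="SLSQP",
               constraints=[{"type": "ineq",
                             "fun": lambda d: reliability_index(d) - beta_target}])
d_opt = res.x
print("optimal d:", d_opt, " beta:", reliability_index(d_opt),
      " Pf approx:", norm.cdf(-reliability_index(d_opt)))
```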
Going DEEP: guidelines for building simulation-based team assessments.
Grand, James A; Pearce, Marina; Rench, Tara A; Chao, Georgia T; Fernandez, Rosemarie; Kozlowski, Steve W J
2013-05-01
Whether for team training, research or evaluation, making effective use of simulation-based technologies requires robust, reliable and accurate assessment tools. Extant literature on simulation-based assessment practices has primarily focused on scenario and instructional design; however, relatively little direct guidance has been provided regarding the challenging decisions and fundamental principles related to assessment development and implementation. The objective of this manuscript is to introduce a generalisable assessment framework supplemented by specific guidance on how to construct and ensure valid and reliable simulation-based team assessment tools. The recommendations reflect best practices in assessment and are designed to empower healthcare educators, professionals and researchers with the knowledge to design and employ valid and reliable simulation-based team assessments. Information and actionable recommendations associated with creating assessments of team processes (non-technical 'teamwork' activities) and performance (demonstration of technical proficiency) are presented which provide direct guidance on how to Distinguish the underlying competencies one aims to assess, Elaborate the measures used to capture team member behaviours during simulation activities, Establish the content validity of these measures and Proceduralise the measurement tools in a way that is systematically aligned with the goals of the simulation activity while maintaining methodological rigour (DEEP). The DEEP framework targets fundamental principles and critical activities that are important for effective assessment, and should benefit healthcare educators, professionals and researchers seeking to design or enhance any simulation-based assessment effort.
Machine tools error characterization and compensation by on-line measurement of artifact
NASA Astrophysics Data System (ADS)
Wahid Khan, Abdul; Chen, Wuyi; Wu, Lili
2009-11-01
Most manufacturing machine tools are utilized for mass production or batch production with high accuracy under a deterministic manufacturing principle. Volumetric accuracy of machine tools depends on the positional accuracy of the cutting tool, probe or end effector relative to the workpiece in the workspace volume. In this research paper, a methodology is presented for volumetric calibration of machine tools by on-line measurement of an artifact or an object of a similar type. The machine tool geometric error characterization was carried out using a standard or an artifact having geometry similar to the mass production or batch production product. The artifact was measured at an arbitrary position in the volumetric workspace with a calibrated Renishaw touch trigger probe system. Positional errors were stored in a computer for compensation purposes, so that the manufacturing batch could then be run through compensated codes. This methodology was found to be quite effective for manufacturing high-precision components with greater dimensional accuracy and reliability. Calibration by on-line measurement offers the advantage of improving the manufacturing process through the deterministic manufacturing principle; it was found efficient and economical, but limited to the workspace or envelope surface of the measured artifact's geometry or profile.
Seeking high reliability in primary care: Leadership, tools, and organization.
Weaver, Robert R
2015-01-01
Leaders in health care increasingly recognize that improving health care quality and safety requires developing an organizational culture that fosters high reliability and continuous process improvement. For various reasons, a reliability-seeking culture is lacking in most health care settings. Developing a reliability-seeking culture requires leaders' sustained commitment to reliability principles using key mechanisms to embed those principles widely in the organization. The aim of this study was to examine how key mechanisms used by a primary care practice (PCP) might foster a reliability-seeking, system-oriented organizational culture. A case study approach was used to investigate the PCP's reliability culture. The study examined four cultural artifacts used to embed reliability-seeking principles across the organization: leadership statements, decision support tools, and two organizational processes. To decipher their effects on reliability, the study relied on observations of work patterns and the tools' use, interactions during morning huddles and process improvement meetings, interviews with clinical and office staff, and a "collective mindfulness" questionnaire. The five reliability principles framed the data analysis. Leadership statements articulated principles that oriented the PCP toward a reliability-seeking culture of care. Reliability principles became embedded in the everyday discourse and actions through the use of "problem knowledge coupler" decision support tools and daily "huddles." Practitioners and staff were encouraged to report unexpected events or close calls that arose and which often initiated a formal "process change" used to adjust routines and prevent adverse events from recurring. Activities that foster reliable patient care became part of the taken-for-granted routine at the PCP. The analysis illustrates the role leadership, tools, and organizational processes play in developing and embedding a reliability-seeking culture across an organization. Progress toward a reliability-seeking, system-oriented approach to care remains ongoing, and movement in that direction requires deliberate and sustained effort by committed leaders in health care.
Multi-viewpoint clustering analysis
NASA Technical Reports Server (NTRS)
Mehrotra, Mala; Wild, Chris
1993-01-01
In this paper, we address the feasibility of partitioning rule-based systems into a number of meaningful units to enhance the comprehensibility, maintainability and reliability of expert systems software. Preliminary results have shown that no single structuring principle or abstraction hierarchy is sufficient to understand complex knowledge bases. We therefore propose the Multi View Point - Clustering Analysis (MVP-CA) methodology to provide multiple views of the same expert system. We present the results of using this approach to partition a deployed knowledge-based system that navigates the Space Shuttle's entry. We also discuss the impact of this approach on verification and validation of knowledge-based systems.
NASA Technical Reports Server (NTRS)
Quintana, Rolando
2003-01-01
The goal of this research was to integrate a previously validated and reliable safety model, called the Continuous Hazard Tracking and Failure Prediction Methodology (CHTFPM), into a software application. This led to the development of a safety management information system (PSMIS). This means that the theory or principles of the CHTFPM were incorporated in a software package; hence, the PSMIS is referred to as the CHTFPM management information system (CHTFPM MIS). The purpose of the PSMIS is to reduce the time and manpower required to perform predictive studies as well as to facilitate the handling of the enormous quantities of information in this type of study. The CHTFPM theory embodies the philosophy of looking at safety engineering from a new perspective: a proactive, rather than reactive, viewpoint. That is, corrective measures are taken before a problem occurs instead of after it has happened. This is why the CHTFPM is a predictive safety methodology: it foresees or anticipates accidents, system failures and unacceptable risks, so that corrective action can be taken in order to prevent all these unwanted issues. Consequently, the safety and reliability of systems or processes can be further improved by taking proactive and timely corrective actions.
How to Compute Electron Ionization Mass Spectra from First Principles.
Bauer, Christoph Alexander; Grimme, Stefan
2016-06-02
The prediction of electron ionization (EI) mass spectra (MS) from first principles has been a major challenge for quantum chemistry (QC). The unimolecular reaction space grows rapidly with increasing molecular size. On the one hand, statistical models like Eyring's quasi-equilibrium theory and Rice-Ramsperger-Kassel-Marcus theory have provided valuable insight, and some predictions and quantitative results can be obtained from such calculations. On the other hand, molecular dynamics-based methods are able to explore automatically the energetically available regions of phase space and thus yield reaction paths in an unbiased way. We describe in this feature article the status of both methodologies in relation to mass spectrometry for small to medium sized molecules. We further present results obtained with the QCEIMS program developed in our laboratory. Our method, which incorporates stochastic and dynamic elements, has been a significant step toward the reliable routine calculation of EI mass spectra.
CMOS Active Pixel Sensor Technology and Reliability Characterization Methodology
NASA Technical Reports Server (NTRS)
Chen, Yuan; Guertin, Steven M.; Pain, Bedabrata; Kayali, Sammy
2006-01-01
This paper describes the technology, design features and reliability characterization methodology of a CMOS Active Pixel Sensor. Both overall chip reliability and pixel reliability are projected for the imagers.
Durability evaluation of ceramic components using CARES/LIFE
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
1994-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Application of this design methodology is demonstrated using experimental data from alumina bar and disk flexure specimens which exhibit SCG when exposed to water.
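As a small numerical illustration of one of the multiaxial options listed above, the sketch below applies the principle of independent action (PIA) with a two-parameter Weibull strength model, summing the risk-of-rupture contributions of the tensile principal stresses over finite elements; the element data and Weibull parameters are invented, and SCG, NSA, and the Batdorf option are not shown.

```python
import numpy as np

def pia_reliability(principal_stresses, volumes, m, sigma_0v):
    """Fast-fracture survival probability with the principle of independent
    action (PIA): each tensile principal stress contributes independently.
    principal_stresses: (n_elements, 3) array; volumes: (n_elements,)."""
    s = np.clip(np.asarray(principal_stresses, float), 0.0, None)  # tensile only
    risk = np.sum(volumes[:, None] * (s / sigma_0v) ** m)
    return np.exp(-risk)

# Hypothetical 2-element model: principal stresses in MPa, volumes in mm^3
stresses = np.array([[250.0, 80.0, -30.0],
                     [180.0, 120.0, 10.0]])
volumes = np.array([4.0, 6.0])
print("P_s =", pia_reliability(stresses, volumes, m=12.0, sigma_0v=400.0))
```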
Durability evaluation of ceramic components using CARES/LIFE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nemeth, N.N.; Janosik, L.A.; Gyekenyesi, J.P.
1996-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Application of this design methodology is demonstrated using experimental data from alumina bar and disk flexure specimens, which exhibit SCG when exposed to water.
NASA Astrophysics Data System (ADS)
Bag, S.; de, A.
2010-09-01
The transport phenomena based heat transfer and fluid flow calculations in a weld pool require a number of input parameters. Arc efficiency, effective thermal conductivity, and viscosity in the weld pool are some of these parameters, whose values are rarely known and are difficult to assign a priori based on scientific principles alone. The present work reports a bi-directional three-dimensional (3-D) heat transfer and fluid flow model, which is integrated with a real-number-based genetic algorithm. The bi-directional feature of the integrated model allows the identification of the values of a required set of uncertain model input parameters and, next, the design of process parameters to achieve a target weld pool dimension. The computed values are validated against measured results in linear gas-tungsten-arc (GTA) weld samples. Furthermore, a novel methodology to estimate the overall reliability of the computed solutions is also presented.
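To convey the flavour of coupling a real-coded genetic algorithm to a process model in order to identify uncertain inputs such as arc efficiency and effective conductivity, here is a toy sketch; the "weld pool model" is a stand-in analytic function rather than the authors' 3-D heat transfer and fluid flow model, and the parameter ranges and measured targets are assumed.

```python
import numpy as np

rng = np.random.default_rng(1)

def weld_pool_model(eta, k_eff):
    """Stand-in for the 3-D heat transfer / fluid flow model: returns a
    (penetration, width) pair in mm as a smooth function of arc efficiency
    eta and effective conductivity k_eff (W/m K). Purely illustrative."""
    return 2.0 + 4.0 * eta - 0.01 * k_eff, 4.0 + 6.0 * eta + 0.02 * k_eff

measured = np.array([4.1, 9.3])             # measured penetration, width (mm)
bounds = np.array([[0.5, 0.95],             # arc efficiency range (assumed)
                   [30.0, 120.0]])          # effective conductivity range (assumed)

def fitness(pop):
    pred = np.array([weld_pool_model(*ind) for ind in pop])
    return -np.sum((pred - measured) ** 2, axis=1)   # higher is better

pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 2))
for gen in range(100):
    f = fitness(pop)
    # binary tournament selection
    idx = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(f[idx[:, 0]] > f[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # blend crossover plus Gaussian mutation, kept inside the bounds
    alpha = rng.uniform(size=(len(pop), 1))
    children = alpha * parents + (1 - alpha) * parents[::-1]
    children += rng.normal(0.0, 0.02, children.shape) * (bounds[:, 1] - bounds[:, 0])
    pop = np.clip(children, bounds[:, 0], bounds[:, 1])

best = pop[np.argmax(fitness(pop))]
print("identified (eta, k_eff):", best, "predicted:", weld_pool_model(*best))
```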
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beenstock, M.; Goldin, E.; Haitovsky, Y.
1997-05-01
The economic cost of power outages is a central parameter in the cost-benefit analysis of electric power reliability and the design of electric power systems. The authors present a new methodology for estimating the cost of power outages in the business and public sectors and illustrate it with data for Israel. The methodology is based on the principle of revealed preference: the cost of an outage may be inferred from the actions taken by consumers to mitigate losses induced by unsupplied electricity. If outages impose costs on businesses, managers are likely to invest in back-up power to mitigate the losses that are incurred when electricity is not supplied. Investment in back-up generators may then be used to impute the mitigated and unmitigated damage from outages. 12 refs., 3 figs., 7 tabs.
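A stylized worked example of the revealed-preference logic: if a firm willingly pays for back-up generation, the annualized cost of that back-up puts a lower bound on the damage it expects from unserved electricity. All figures below are invented.

```python
# Stylized revealed-preference calculation (all numbers are assumptions).
capital_cost = 60000.0        # back-up generator purchase ($)
lifetime_years = 10.0
annual_o_and_m = 2500.0       # fuel, maintenance ($/yr)
annualized_cost = capital_cost / lifetime_years + annual_o_and_m   # $/yr

expected_outage_hours = 12.0  # expected hours of outage per year
load_during_outage = 400.0    # kW the generator actually covers
unserved_kwh_avoided = expected_outage_hours * load_during_outage  # kWh/yr

# If the investment is rational, outage damage >= annualized back-up cost,
# so this ratio is a lower bound on the cost of unsupplied electricity.
implied_cost_per_kwh = annualized_cost / unserved_kwh_avoided
print(f"implied outage cost >= {implied_cost_per_kwh:.2f} $/kWh")
```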
International classification of reliability for implanted cochlear implant receiver stimulators.
Battmer, Rolf-Dieter; Backous, Douglas D; Balkany, Thomas J; Briggs, Robert J S; Gantz, Bruce J; van Hasselt, Andrew; Kim, Chong Sun; Kubo, Takeshi; Lenarz, Thomas; Pillsbury, Harold C; O'Donoghue, Gerard M
2010-10-01
To design an international standard to be used when reporting reliability of the implanted components of cochlear implant systems to appropriate governmental authorities, cochlear implant (CI) centers, and for journal editors in evaluating manuscripts involving cochlear implant reliability. The International Consensus Group for Cochlear Implant Reliability Reporting was assembled to unify ongoing efforts in the United States, Europe, Asia, and Australia to create a consistent and comprehensive classification system for the implanted components of CI systems across manufacturers. All members of the consensus group are from tertiary referral cochlear implant centers. None. A clinically relevant classification scheme adapted from principles of ISO standard 5841-2:2000 originally designed for reporting reliability of cardiac pacemakers, pulse generators, or leads. Standard definitions for device failure, survival time, clinical benefit, reduced clinical benefit, and specification were generated. Time intervals for reporting back to implant centers for devices tested to be "out of specification," categorization of explanted devices, the method of cumulative survival reporting, and content of reliability reports to be issued by manufacturers was agreed upon by all members. The methodology for calculating Cumulative survival was adapted from ISO standard 5841-2:2000. The International Consensus Group on Cochlear Implant Device Reliability Reporting recommends compliance to this new standard in reporting reliability of implanted CI components by all manufacturers of CIs and the adoption of this standard as a minimal reporting guideline for editors of journals publishing cochlear implant research results.
van den Noort, Josien C; Verhagen, Rens; van Dijk, Kees J; Veltink, Peter H; Vos, Michelle C P M; de Bie, Rob M A; Bour, Lo J; Heida, Ciska T
2017-10-01
This proof-of-principle study describes the methodology and explores and demonstrates the applicability of a system, existing of miniature inertial sensors on the hand and a separate force sensor, to objectively quantify hand motor symptoms in patients with Parkinson's disease (PD) in a clinical setting (off- and on-medication condition). Four PD patients were measured in off- and on- dopaminergic medication condition. Finger tapping, rapid hand opening/closing, hand pro/supination, tremor during rest, mental task and kinetic task, and wrist rigidity movements were measured with the system (called the PowerGlove). To demonstrate applicability, various outcome parameters of measured hand motor symptoms of the patients in off- vs. on-medication condition are presented. The methodology described and results presented show applicability of the PowerGlove in a clinical research setting, to objectively quantify hand bradykinesia, tremor and rigidity in PD patients, using a single system. The PowerGlove measured a difference in off- vs. on-medication condition in all tasks in the presented patients with most of its outcome parameters. Further study into the validity and reliability of the outcome parameters is required in a larger cohort of patients, to arrive at an optimal set of parameters that can assist in clinical evaluation and decision-making.
Reliability Centered Maintenance - Methodologies
NASA Technical Reports Server (NTRS)
Kammerer, Catherine C.
2009-01-01
Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.
NASA Technical Reports Server (NTRS)
Muller, Dagmar; Krasemann, Hajo; Brewin, Robert J. W.; Deschamps, Pierre-Yves; Doerffer, Roland; Fomferra, Norman; Franz, Bryan A.; Grant, Mike G.; Groom, Steve B.; Melin, Frederic;
2015-01-01
The Ocean Colour Climate Change Initiative intends to provide a long-term time series of ocean colour data and investigate the detectable climate impact. A reliable and stable atmospheric correction procedure is the basis for ocean colour products of the necessary high quality. In order to guarantee an objective selection from a set of four atmospheric correction processors, the common validation strategy of comparisons between in-situ and satellite-derived water leaving reflectance spectra is extended by a ranking system. In principle, the statistical parameters such as root mean square error, bias, etc. and measures of goodness of fit are transformed into relative scores, which evaluate the relationship of quality dependent on the algorithms under study. The sensitivity of these scores to the selected database has been assessed by a bootstrapping exercise, which allows identification of the uncertainty in the scoring results. Although the presented methodology is intended to be used in an algorithm selection process, this paper focusses on the scope of the methodology rather than the properties of the individual processors.
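A compact sketch of the scoring idea: per-processor statistics such as RMSE and bias are converted into relative scores (the best processor receives the highest score), and a bootstrap over the matchup database shows how sensitive the scores are to the particular in-situ/satellite pairs available. The data and the specific score transform are assumptions for illustration, not the Initiative's actual scheme.

```python
import numpy as np

rng = np.random.default_rng(2)

def metrics(in_situ, satellite):
    err = satellite - in_situ
    return {"rmse": np.sqrt(np.mean(err**2)), "abs_bias": abs(np.mean(err))}

def scores(matchups):
    """Turn each metric into a relative score in (0, 1]: best processor = 1."""
    per_proc = {name: metrics(*m) for name, m in matchups.items()}
    out = {name: 0.0 for name in matchups}
    for metric in ("rmse", "abs_bias"):
        best = min(v[metric] for v in per_proc.values())
        for name, v in per_proc.items():
            out[name] += best / v[metric]        # 1 for the best, <1 otherwise
    return out

# Hypothetical matchup database: (in-situ, satellite) reflectance pairs
n = 200
truth = rng.uniform(0.001, 0.02, n)
matchups = {"proc_A": (truth, truth + rng.normal(0.0, 0.002, n)),
            "proc_B": (truth, truth * 1.1 + rng.normal(0.0, 0.001, n))}

# Bootstrap the matchups to see how stable the total scores are
boot = {name: [] for name in matchups}
for _ in range(500):
    idx = rng.integers(0, n, n)
    resampled = {k: (v[0][idx], v[1][idx]) for k, v in matchups.items()}
    for name, s in scores(resampled).items():
        boot[name].append(s)
for name, s in boot.items():
    print(name, "score:", np.mean(s).round(3), "+/-", np.std(s).round(3))
```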
Critical appraisal of published economic evaluations of home care for the elderly.
Ramos, Maria Lucia Teixeira; Ferraz, Marcos Bosi; Sesso, Ricardo
2004-01-01
The goal of the study was to appraise the economic evaluations published between 1980 and 2004 of "home care" for the elderly, focusing on the methodological aspects. MEDLINE was searched to identify and assess economic evaluations (defined as an analysis comparing two or more strategies, involving the assessment of both costs and consequences) related to "home care" exclusively for the elderly (65 years or more) and to critically appraise the methodology using five accepted principles used worldwide for conducting economic evaluations. Twenty-four economic evaluations of "home care" for the elderly were identified and the articles were assessed. All five principles were satisfactorily addressed in two studies (8.3%), four principles in four studies (16.7%), three principles in five studies (20.8%), two principles in eight studies (33.3%) and only one principle in five studies (20.8%). A disparity in the methodology of economic evaluations compromises comparisons among outcomes and ultimately jeopardizes decisions on the choice of the most appropriate healthcare interventions. The methodological principles represent important guidelines, but the discussion of the context of the economic evaluation and the special characteristics of some services and populations should be considered for the appropriate use of economic evaluations.
The development of a quality appraisal tool for studies of diagnostic reliability (QAREL).
Lucas, Nicholas P; Macaskill, Petra; Irwig, Les; Bogduk, Nikolai
2010-08-01
In systematic reviews of the reliability of diagnostic tests, no quality assessment tool has been used consistently. The aim of this study was to develop a specific quality appraisal tool for studies of diagnostic reliability. Key principles for the quality of studies of diagnostic reliability were identified with reference to epidemiologic principles, existing quality appraisal checklists, and the Standards for Reporting of Diagnostic Accuracy (STARD) and Quality Assessment of Diagnostic Accuracy Studies (QUADAS) resources. Specific items that encompassed each of the principles were developed. Experts in diagnostic research provided feedback on the items that were to form the appraisal tool. This process was iterative and continued until consensus among experts was reached. The Quality Appraisal of Reliability Studies (QAREL) checklist includes 11 items that explore seven principles. Items cover the spectrum of subjects, spectrum of examiners, examiner blinding, order effects of examination, suitability of the time interval among repeated measurements, appropriate test application and interpretation, and appropriate statistical analysis. QAREL has been developed as a specific quality appraisal tool for studies of diagnostic reliability. The reliability of this tool in different contexts needs to be evaluated. Copyright (c) 2010 Elsevier Inc. All rights reserved.
Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria
NASA Astrophysics Data System (ADS)
Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong
2017-08-01
In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparing with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis based finite element modeling in engineering practice.
NASA Astrophysics Data System (ADS)
Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.
2018-04-01
Reliable first-principles calculations of electrochemical processes require accurate prediction of the interfacial capacitance, a challenge for current computationally efficient continuum solvation methodologies. We develop a model for the double layer of a metallic electrode that reproduces the features of the experimental capacitance of Ag(100) in a non-adsorbing, aqueous electrolyte, including a broad hump in the capacitance near the potential of zero charge and a dip in the capacitance under conditions of low ionic strength. Using this model, we identify the necessary characteristics of a solvation model suitable for first-principles electrochemistry of metal surfaces in non-adsorbing, aqueous electrolytes: dielectric and ionic nonlinearity, and a dielectric-only region at the interface. The dielectric nonlinearity, caused by the saturation of dipole rotational response in water, creates the capacitance hump, while ionic nonlinearity, caused by the compactness of the diffuse layer, generates the capacitance dip seen at low ionic strength. We show that none of the previously developed solvation models simultaneously meet all these criteria. We design the nonlinear electrochemical soft-sphere solvation model which both captures the capacitance features observed experimentally and serves as a general-purpose continuum solvation model.
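The qualitative behaviour described above, a capacitance dip at low ionic strength near the potential of zero charge combined with a compact dielectric-only region, already appears in a classical Gouy-Chapman-Stern estimate. The sketch below is not the authors' soft-sphere model; it merely illustrates the series combination of a compact (Helmholtz) capacitance with a nonlinear diffuse-layer capacitance, using rough assumed numbers.

```python
import numpy as np

# Physical constants (SI)
e, kB, T, eps0 = 1.602e-19, 1.381e-23, 298.0, 8.854e-12
eps_r = 78.0                 # bulk water relative permittivity
C_H = 0.20                   # compact (dielectric-only) layer capacitance, F/m^2 (assumed)

def gouy_chapman_stern(phi_d, c_molar, z=1):
    """Series combination of a Helmholtz capacitance and the nonlinear
    Gouy-Chapman diffuse-layer capacitance at diffuse-layer potential phi_d (V)
    for a z:z electrolyte of molar concentration c_molar."""
    n0 = c_molar * 1000.0 * 6.022e23                  # ions per m^3
    kappa = np.sqrt(2.0 * n0 * (z * e) ** 2 / (eps_r * eps0 * kB * T))
    C_D = eps_r * eps0 * kappa * np.cosh(z * e * phi_d / (2.0 * kB * T))
    return 1.0 / (1.0 / C_H + 1.0 / C_D)              # F/m^2

for c in (0.005, 0.1):                                # dilute vs. concentrated
    phi = np.linspace(-0.2, 0.2, 5)                   # potential relative to PZC (V)
    cap = gouy_chapman_stern(phi, c) * 100.0          # convert to microF/cm^2
    print(f"c = {c} M :", np.round(cap, 1))           # dip at PZC only when dilute
```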
Reliability and Probabilistic Risk Assessment - How They Play Together
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang
2015-01-01
PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of LOM, LOV and LOC for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict the overall launch vehicles risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA conducted a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, for meeting product reliability requirements or goals. It is a very broad design-support discipline. It has important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e. the metric) is the probability that an item will perform its intended function(s) for a specified mission profile. In general, the reliability metric can be calculated through the analyses using reliability demonstration and reliability prediction methodologies. Reliability analysis is very critical for understanding component failure mechanisms and in identifying reliability critical design and process drivers. The following sections discuss the PRA process and reliability engineering in detail and provide an application where reliability analysis and PRA were jointly used in a complementary manner to support a Space Shuttle flight risk assessment.
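As a minimal illustration of one PRA building block mentioned above, the snippet below evaluates the top-event probability of a small fault tree with independent basic events combined through AND/OR gates; the tree structure and probabilities are invented, and real PRA models combine such trees with event trees and uncertainty distributions.

```python
# Tiny fault-tree evaluation with independent basic events (illustrative only).
def or_gate(*p):        # failure if any input fails
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):       # failure only if all inputs fail
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Hypothetical basic-event probabilities per mission
p_valve_fails = 1e-3
p_pump_a_fails = 5e-3
p_pump_b_fails = 5e-3
p_control_fault = 2e-4

# Redundant pumps feed one valve; the control fault defeats both trains.
p_pumps = and_gate(p_pump_a_fails, p_pump_b_fails)      # both pumps fail
p_top = or_gate(p_valve_fails, p_pumps, p_control_fault)
print(f"top-event probability per mission ~ {p_top:.2e}")
```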
Rossell, David
2016-01-01
Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies. PMID:27722040
Reliability modelling and analysis of thermal MEMS
NASA Astrophysics Data System (ADS)
Muratet, Sylvaine; Lavu, Srikanth; Fourniols, Jean-Yves; Bell, George; Desmulliez, Marc P. Y.
2006-04-01
This paper presents a MEMS reliability study methodology based on the novel concept of 'virtual prototyping'. This methodology can be used for the development of reliable sensors or actuators and also to characterize their behaviour under specific use conditions and applications. The methodology is demonstrated on the U-shaped micro electro thermal actuator used as a test vehicle. To demonstrate this approach, a 'virtual prototype' has been developed with the modeling tools MatLab and VHDL-AMS. A best-practice FMEA (Failure Mode and Effect Analysis) is applied to the thermal MEMS to investigate and assess the failure mechanisms. The reliability study is performed by injecting the identified defects into the 'virtual prototype'. The reliability characterization methodology predicts the evolution of the behavior of these MEMS as a function of the number of cycles of operation and of specific operational conditions.
ERIC Educational Resources Information Center
Taylor, Bryan; Kroth, Michael
2009-01-01
This article creates the Teaching Methodology Instrument (TMI) to help determine the level of adult learning principles being used by a particular teaching methodology in a classroom. The instrument incorporates the principles and assumptions set forth by Malcolm Knowles of what makes a good adult learning environment. The Socratic method as used…
Basic principles, methodology, and applications of remote sensing in agriculture
NASA Technical Reports Server (NTRS)
Moreira, M. A. (Principal Investigator); Deassuncao, G. V.
1984-01-01
The basic principles of remote sensing applied to agriculture and the methods used in data analysis are described. Emphasis is placed on the importance of developing a methodology that may help crop forecasting, on basic concepts of spectral signatures of vegetation, on the methodology of LANDSAT data utilization in agriculture, and on the application of the remote sensing program of INPE (Institute for Space Research) in agriculture.
Suggested criteria for evaluating systems engineering methodologies
NASA Technical Reports Server (NTRS)
Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.
1989-01-01
Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.
Assessing the reliability of ecotoxicological studies: An overview of current needs and approaches.
Moermond, Caroline; Beasley, Amy; Breton, Roger; Junghans, Marion; Laskowski, Ryszard; Solomon, Keith; Zahner, Holly
2017-07-01
In general, reliable studies are well designed and well performed, and enough details on study design and performance are reported to assess the study. For hazard and risk assessment in various legal frameworks, many different types of ecotoxicity studies need to be evaluated for reliability. These studies vary in study design, methodology, quality, and level of detail reported (e.g., reviews, peer-reviewed research papers, or industry-sponsored studies documented under Good Laboratory Practice [GLP] guidelines). Regulators have the responsibility to make sound and verifiable decisions and should evaluate each study for reliability in accordance with scientific principles regardless of whether they were conducted in accordance with GLP and/or standardized methods. Thus, a systematic and transparent approach is needed to evaluate studies for reliability. In this paper, 8 different methods for reliability assessment were compared using a number of attributes: categorical versus numerical scoring methods, use of exclusion and critical criteria, weighting of criteria, whether methods are tested with case studies, domain of applicability, bias toward GLP studies, incorporation of standard guidelines in the evaluation method, number of criteria used, type of criteria considered, and availability of guidance material. Finally, some considerations are given on how to choose a suitable method for assessing reliability of ecotoxicity studies. Integr Environ Assess Manag 2017;13:640-651. © 2016 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
NASA Astrophysics Data System (ADS)
Brezgin, V. I.; Brodov, Yu M.; Kultishev, A. Yu
2017-11-01
The report reviews methods for improving the design and operation of steam turbine units based on the application of modern information technologies. In accordance with the life-cycle support methodology, a conceptual model of the information support system for the main life-cycle (LC) stages of a steam turbine unit is suggested. A classification system that ensures the creation of sustainable information links between the engineering team (manufacturer's plant) and customer organizations (power plants) is proposed. Within the report, the principle of extending parameterization beyond geometric constructions in the design and improvement of steam turbine unit equipment is proposed, studied, and justified. The report presents a design methodology for steam turbine unit equipment based on a new oil-cooler design system that has been developed and implemented by the authors. This design system combines a construction subsystem, characterized by extensive use of family tables and templates, with a computation subsystem that includes a methodology for zone-by-zone thermal-hydraulic design calculations of oil coolers. The report also presents the software developed for operational monitoring and assessment of equipment parameters, as well as its implementation at five power plants.
Typewriting Methodology 1977: Eight Basic Principles for Good Results
ERIC Educational Resources Information Center
Winger, Fred E.
1977-01-01
The eight basic principles of teaching methodology discussed are as follows: Stress position and technique, stress skill building, stress the pretest/practice/posttest method, stress action research, stress true production skills, stress good proofreading skills, stress performance goals, and stress individualized instruction. (TA)
NASA Astrophysics Data System (ADS)
Masuwai, Azwani; Tajudin, Nor'ain Mohd; Saad, Noor Shah
2017-05-01
The purpose of this study is to develop and establish the validity and reliability of the Teaching and Learning Guiding Principles Instrument (TLGPI), an instrument for generating teaching and learning guiding principles. Participants consisted of 171 Malaysian teacher educators. The instrument is intended to support the generation of teaching and learning guiding principles at the higher education level in Malaysia. Confirmatory factor analysis validated all 19 items of the TLGPI, with all items showing high reliability and internal consistency, and supported a single-factor model for generating teaching and learning guiding principles.
NASA Technical Reports Server (NTRS)
Chen, Y.; Nguyen, D.; Guertin, S.; Berstein, J.; White, M.; Menke, R.; Kayali, S.
2003-01-01
This paper presents a reliability evaluation methodology to obtain statistical reliability information for memory chips intended for space applications when the test sample size must be kept small because of the high cost of radiation-hardened memories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
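To make the three-property relevancy test concrete, the following is a minimal Python sketch of how such a screen might be coded; the component profiles, the equal weighting of the three properties, and the three-level grading are illustrative assumptions, not the RDB method's actual scoring rules.

```python
# Minimal sketch of a three-property relevancy screen (names and grading are
# illustrative assumptions, not the RDB method's actual rules).
from dataclasses import dataclass

@dataclass
class ComponentProfile:
    function: str        # e.g. "heat removal"
    failure_modes: set   # e.g. {"tube rupture", "fouling"}
    environment: str     # e.g. "sodium, low pressure"

def relevancy(candidate: ComponentProfile, target: ComponentProfile) -> str:
    """Grade a candidate data source against the PRA component of interest."""
    matches = 0
    matches += candidate.function == target.function
    # Does the candidate record cover all failure modes of the target component?
    covered = len(candidate.failure_modes & target.failure_modes)
    matches += covered == len(target.failure_modes)
    matches += candidate.environment == target.environment
    return {3: "fully relevant", 2: "partially relevant"}.get(matches, "not relevant")

# Example: screening a conventional-plant heat exchanger record for a sodium IHX.
target = ComponentProfile("heat removal", {"tube rupture", "fouling"}, "sodium")
candidate = ComponentProfile("heat removal", {"tube rupture"}, "water")
print(relevancy(candidate, target))   # -> "partially relevant"
```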
A Comparison of Two Methods of Determining Interrater Reliability
ERIC Educational Resources Information Center
Fleming, Judith A.; Taylor, Janeen McCracken; Carran, Deborah
2004-01-01
This article offers an alternative methodology for practitioners and researchers to use in establishing interrater reliability for testing purposes. The majority of studies on interrater reliability use a traditional methodology whereby two raters are compared using a Pearson product-moment correlation. This traditional method of estimating…
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
Industrial inspection of specular surfaces using a new calibration procedure
NASA Astrophysics Data System (ADS)
Aswendt, Petra; Hofling, Roland; Gartner, Soren
2005-06-01
The methodology of phase encoded reflection measurements has become a valuable tool for the industrial inspection of components with glossy surfaces. The measuring principle provides outstanding sensitivity for tiny variations of surface curvature so that sub-micron waviness and flaws are reliably detected. Quantitative curvature measurements can be obtained from a simple approach if the object is almost flat. 3D-objects with a high aspect ratio require more effort to determine both coordinates and normal direction of a surface point unambiguously. Stereoscopic solutions have been reported using more than one camera for a certain surface area. This paper describes the combined double camera steady surface (DCSS) approach, which is well suited for implementation in industrial testing stations.
Selmi, Giuliana da Fontoura Rodrigues; Trapé, Angelo Zanaga
2014-05-01
Quantification of dermal exposure to pesticides in rural workers, used in risk assessment, can be performed with different techniques such as patches or whole body evaluation. However, the wide variety of methods can jeopardize the process by producing disparate results, depending on the principles in sample collection. A critical review was thus performed on the main techniques for quantifying dermal exposure, calling attention to this issue and the need to establish a single methodology for quantification of dermal exposure in rural workers. Such harmonization of different techniques should help achieve safer and healthier working conditions. Techniques that can provide reliable exposure data are an essential first step towards avoiding harm to workers' health.
The conflict between randomized clinical trials and the therapeutic obligation.
Gifford, F
1986-11-01
The central dilemma concerning randomized clinical trials (RCTs) arises out of some simple facts about causal methodology (RCTs are the best way to generate the reliable causal knowledge necessary for optimally-informed action) and a prima facie plausible principle concerning how physicians should treat their patients (always do what it is most reasonable to believe will be best for the patient). A number of arguments related to this in the literature are considered. Attempts to avoid the dilemma fail. Appeals to informed consent and mechanisms for minimizing the resulting harm are important for policy, but informed consent is problematic and mechanisms for minimization of harm do not address the dilemma. Appeals to some sort of contract model of justification are promising and illuminating.
Flores, Walter
2010-01-01
Governance refers to decision-making processes in which power relationships and actors and institutions' particular interests converge. Situations of consensus and conflict are inherent to such processes. Furthermore, decision-making happens within a framework of ethical principles, motivations and incentives which could be explicit or implicit. Health systems in most Latin-American and Caribbean countries take the principles of equity, solidarity, social participation and the right to health as their guiding principles; such principles must thus rule governance processes. However, this is not always the case and this is where the importance of investigating governance in health systems lies. Making advances in investigating governance involves conceptual and methodological implications. Clarifying and integrating normative and analytical approaches is relevant at conceptual level as both are necessary for an approach seeking to investigate and understand social phenomena's complexity. In relation to methodological level, there is a need to expand the range of variables, sources of information and indicators for studying decision-making aimed to greater equity, health citizenship and public policy efficiency.
PAI-OFF: A new proposal for online flood forecasting in flash flood prone catchments
NASA Astrophysics Data System (ADS)
Schmitz, G. H.; Cullmann, J.
2008-10-01
The Process Modelling and Artificial Intelligence for Online Flood Forecasting (PAI-OFF) methodology combines the reliability of physically based, hydrologic/hydraulic modelling with the operational advantages of artificial intelligence. These operational advantages are extremely low computation times and straightforward operation. The basic principle of the methodology is to portray process models by means of artificial neural networks (ANNs). We propose to train ANN flood forecasting models with synthetic data that reflects the possible range of storm events. To this end, establishing PAI-OFF requires first setting up a physically based hydrologic model of the considered catchment and - optionally, if backwater effects have a significant impact on the flow regime - a hydrodynamic flood routing model of the river reach in question. Both models are subsequently used for simulating all meaningful and flood relevant storm scenarios which are obtained from a catchment specific meteorological data analysis. This provides a database of corresponding input/output vectors which is then completed by generally available hydrological and meteorological data for characterizing the catchment state prior to each storm event. This database subsequently serves for training both a polynomial neural network (PoNN) - portraying the rainfall-runoff process - and a multilayer neural network (MLFN), which mirrors the hydrodynamic flood wave propagation in the river. These two ANN models replace the hydrological and hydrodynamic model in the operational mode. After presenting the theory, we apply PAI-OFF - essentially consisting of the coupled "hydrologic" PoNN and "hydrodynamic" MLFN - to the Freiberger Mulde catchment in the Erzgebirge (Ore Mountains) in East Germany (3000 km²). Both the demonstrated computational efficiency and the prediction reliability underline the potential of the new PAI-OFF methodology for online flood forecasting.
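The core PAI-OFF idea of replacing a slow process model with a neural network trained on a synthetic scenario database can be sketched with a generic regressor. In the sketch below, the toy `process_model`, the feature ranges, and the scikit-learn MLP are stand-ins for the physically based model and the PoNN/MLFN pair actually used; all numbers are illustrative.

```python
# Toy illustration of the surrogate principle: train an ANN on synthetic storm
# scenarios generated by a (here, stand-in) process model, then use the fast
# ANN in place of the model for online forecasting.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def process_model(rain_mm, duration_h, soil_moisture):
    # Placeholder for the physically based rainfall-runoff simulation.
    return 0.8 * rain_mm * soil_moisture / np.sqrt(duration_h)

# Database of input/output vectors covering the relevant storm space.
X = np.column_stack([
    rng.uniform(5, 150, 5000),     # event rainfall [mm]
    rng.uniform(1, 48, 5000),      # duration [h]
    rng.uniform(0.1, 1.0, 5000),   # antecedent soil moisture [-]
])
y = process_model(X[:, 0], X[:, 1], X[:, 2])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                         random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out scenarios:", surrogate.score(X_te, y_te))
```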
Mission Reliability Estimation for Repairable Robot Teams
NASA Technical Reports Server (NTRS)
Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen
2010-01-01
A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the current design paradigm of building a minimal number of highly robust robots may not be the best way to design robots for extended missions.
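A rough sense of the redundancy-versus-repairability trade-off discussed above can be conveyed with a simple k-out-of-n calculation; the reliability numbers, the required team size, and the assumed repair coverage in this sketch are invented, not taken from the article's model.

```python
# Back-of-the-envelope comparison of spare robots vs spare components
# (illustrative numbers only; not the article's actual model or data).
from math import comb

def k_of_n_reliability(p, n, k):
    """Probability that at least k of n independent units with reliability p survive."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_robot = 0.90   # single-robot mission reliability (assumed)
need = 3         # robots required to complete the mission

# Option A: redundancy - carry one spare robot (4 robots, need 3).
r_redundancy = k_of_n_reliability(p_robot, 4, need)

# Option B: repairability - 3 robots, but a failed critical module can be
# swapped from spares, recovering (say) 70% of robot failures.
p_repairable = p_robot + (1 - p_robot) * 0.70
r_repair = k_of_n_reliability(p_repairable, 3, need)

print(f"spare robot:      {r_redundancy:.3f}")
print(f"spare components: {r_repair:.3f}")
```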
Verhaeghe, Nick; Lievens, Delfine; Annemans, Lieven; Vander Laenen, Freya; Putman, Koen
2016-01-01
Alcohol, tobacco, illicit drugs, and psychoactive pharmaceuticals' use is associated with a higher likelihood of developing several diseases and injuries and, as a consequence, considerable health-care expenditures. There is yet a lack of consistent methodologies to estimate the economic impact of addictive substances to society. The aim was to assess the methodological approaches applied in social cost studies estimating the economic impact of alcohol, tobacco, illicit drugs, and psychoactive pharmaceuticals. A systematic literature review through the electronic databases, Medline (PubMed) and Web of Science, was performed. Studies in English published from 1997 examining the social costs of the addictive substances alcohol, tobacco, illicit drugs, and psychoactive pharmaceuticals were eligible for inclusion. Twelve social cost studies met the inclusion criteria. In all studies, the direct and indirect costs were measured, but the intangible costs were seldom taken into account. A wide variety in cost items included across studies was observed. Sensitivity analyses to address the uncertainty around certain cost estimates were conducted in eight studies considered in the review. Differences in cost items included in cost-of-illness studies limit the comparison across studies. It is clear that it is difficult to deal with all consequences of substance use in cost-of-illness studies. Future social cost studies should be based on sound methodological principles in order to result in more reliable cost estimates of the economic burden of substance use.
Using a Principle-Based Method to Support a Disability Aesthetic
ERIC Educational Resources Information Center
Anderson, Bailey
2015-01-01
This article calls choreographers and educators alike to continue building an awareness of methodologies that support a disability aesthetic. A disability aesthetic supports the embodiment of dancers with disabilities by allowing for their bodies to set guidelines of beauty and value. Principle-based work is a methodology that supports a…
Vemić, Ana; Rakić, Tijana; Malenović, Anđelija; Medenica, Mirjana
2015-01-01
The aim of this paper is to present the development of a liquid chromatographic method that uses chaotropic salts as mobile phase additives, following QbD principles. The effect of critical process parameters (column chemistry, salt nature and concentration, acetonitrile content, and column temperature) on the critical quality attributes (retention of the first and last eluting peaks and separation of the critical peak pairs) was studied using the design of experiments-design space (DoE-DS) methodology. A D-optimal design was chosen to examine both categorical and numerical factors simultaneously in a minimal number of experiments. Two approaches to quality assurance were performed and compared: the uncertainty originating from the models was assessed by Monte Carlo simulations, either propagating an error equal to the variance of the model residuals or propagating the error originating from the calculation of the model coefficients. Baseline separation of pramipexole and its five impurities was achieved, fulfilling all the required criteria, and method validation proved its reliability. Copyright © 2014 Elsevier B.V. All rights reserved.
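The Monte Carlo route to quality assurance described above can be illustrated with a toy response model: add noise equal to the fitted model's residual variance around a predicted critical quality attribute and estimate the probability that the acceptance criterion is met at a candidate operating point. The quadratic model, residual standard deviation, and resolution limit below are invented for illustration.

```python
# Illustrative Monte Carlo check of the probability that a separation criterion
# holds at a candidate operating point, propagating an error equal to the
# model's residual variance (coefficients and limits are made up).
import numpy as np

rng = np.random.default_rng(1)

def predicted_resolution(acn_percent, temperature_c):
    # Stand-in quadratic DoE model for the critical peak-pair resolution Rs.
    return (4.0 - 0.05 * acn_percent - 0.02 * temperature_c
            + 0.0004 * acn_percent * temperature_c)

residual_sd = 0.12   # sqrt of the residual variance of the fitted model
criterion = 1.5      # required baseline resolution

samples = predicted_resolution(30, 35) + rng.normal(0, residual_sd, 100_000)
print("P(Rs >= 1.5):", np.mean(samples >= criterion))
```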
A Novel Application for the Cavalieri Principle: A Stereological and Methodological Study
Altunkaynak, Berrin Zuhal; Altunkaynak, Eyup; Unal, Deniz; Unal, Bunyamin
2009-01-01
Objective: The Cavalieri principle was applied to consecutive pathology sections that were photographed at the same magnification and used to estimate tissue volumes via superimposing a point counting grid on these images. The goal of this study was to perform the Cavalieri method quickly and practically. Materials and Methods: In this study, 10 adult female Sprague Dawley rats were used. Brain tissue was removed and sampled both systematically and randomly. Brain volumes were estimated using two different methods. First, all brain slices were scanned with an HP ScanJet 3400C scanner, and their images were shown on a PC monitor. Brain volume was then calculated based on these images. Second, all brain slices were photographed in 10× magnification with a microscope camera, and brain volumes were estimated based on these micrographs. Results: There was no statistically significant difference between the volume measurements of the two techniques (P>0.05; Paired Samples t Test). Conclusion: This study demonstrates that personal computer scanning of serial tissue sections allows for easy and reliable volume determination based on the Cavalieri method. PMID:25610077
A novel application for the cavalieri principle: a stereological and methodological study.
Altunkaynak, Berrin Zuhal; Altunkaynak, Eyup; Unal, Deniz; Unal, Bunyamin
2009-08-01
The Cavalieri principle was applied to consecutive pathology sections that were photographed at the same magnification and used to estimate tissue volumes via superimposing a point counting grid on these images. The goal of this study was to perform the Cavalieri method quickly and practically. In this study, 10 adult female Sprague Dawley rats were used. Brain tissue was removed and sampled both systematically and randomly. Brain volumes were estimated using two different methods. First, all brain slices were scanned with an HP ScanJet 3400C scanner, and their images were shown on a PC monitor. Brain volume was then calculated based on these images. Second, all brain slices were photographed in 10× magnification with a microscope camera, and brain volumes were estimated based on these micrographs. There was no statistically significant difference between the volume measurements of the two techniques (P>0.05; Paired Samples t Test). This study demonstrates that personal computer scanning of serial tissue sections allows for easy and reliable volume determination based on the Cavalieri method.
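For readers unfamiliar with the Cavalieri estimator used in these two studies, the volume follows directly from the point counts: V = t × (a/p) × ΣP, where t is the distance between sampled sections, a/p is the area associated with one grid point, and ΣP is the total number of points hitting the structure. A minimal sketch with assumed section spacing and grid spacing (when counting on scanned or photographed images, the grid spacing is additionally corrected for image scale, which is what the squared scaling term in the enchondroma study below does):

```python
# Minimal Cavalieri volume estimate from point counts on serial sections
# (section spacing, grid spacing, and counts are illustrative assumptions).
t = 1.0          # distance between sampled sections [mm]
d = 0.5          # grid spacing [mm], so the area per point is d*d
point_counts = [12, 18, 25, 27, 22, 15, 9]   # points hitting tissue per section

volume = t * (d * d) * sum(point_counts)     # V = t * (a/p) * sum(P)  [mm^3]
print(f"Estimated volume: {volume:.1f} mm^3")
```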
A Chip and Pixel Qualification Methodology on Imaging Sensors
NASA Technical Reports Server (NTRS)
Chen, Yuan; Guertin, Steven M.; Petkov, Mihail; Nguyen, Duc N.; Novak, Frank
2004-01-01
This paper presents a qualification methodology on imaging sensors. In addition to overall chip reliability characterization based on the sensor's overall figure of merit, such as Dark Rate, Linearity, Dark Current Non-Uniformity, Fixed Pattern Noise and Photon Response Non-Uniformity, a simulation technique is proposed and used to project pixel reliability. The projected pixel reliability is directly related to imaging quality and provides additional sensor reliability information and performance control.
Acar, Nihat; Karakasli, Ahmet; Karaarslan, Ahmet; Mas, Nermin Ng; Hapa, Onur
2017-01-01
Volumetric measurements of benign tumors enable surgeons to trace volume changes during follow-up periods. For a volumetric measurement technique to be applicable, it should be easy, rapid, and inexpensive and should carry a high interobserver reliability. We aimed to assess the interobserver reliability of a volumetric measurement technique using the Cavalieri principle of stereological methods. Computed tomography (CT) scans of 15 patients with a histopathologically confirmed diagnosis of enchondroma, with varying tumor sizes and localizations, were retrospectively reviewed to evaluate the interobserver reliability of the volumetric stereological measurement based on the Cavalieri principle, V = t × [(SU × d)/SL]² × ΣP. The volumes of the 15 tumors collected by the observers are demonstrated in Table 1. There was no statistically significant difference between the first and second observers (p = 0.000 and intraclass correlation coefficient = 0.970) or between the first and third observers (p = 0.000 and intraclass correlation coefficient = 0.981). No statistically significant difference was detected between the second and third observers (p = 0.000 and intraclass correlation coefficient = 0.976). The Cavalieri principle with the stereological technique using CT scans is an easy, rapid, and inexpensive technique for the volumetric evaluation of enchondromas with trustworthy interobserver reliability.
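The interobserver agreement statistic used here, the intraclass correlation coefficient (ICC), can be computed directly from the two-way ANOVA mean squares. The sketch below uses a synthetic ratings matrix (the study's tumor volumes are not reproduced) and the single-measure, two-way random-effects form ICC(2,1), which is an assumption about the exact ICC variant used.

```python
# Two-way random, single-measure ICC(2,1) from the ANOVA mean squares
# (toy ratings matrix; rows are tumors, columns are observers).
import numpy as np

ratings = np.array([
    [10.2, 10.5, 10.1],
    [ 4.8,  5.0,  4.7],
    [22.4, 21.9, 22.8],
    [ 7.1,  7.4,  7.0],
])
n, k = ratings.shape
grand = ratings.mean()
msr = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
msc = n * ((ratings.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
sse = ((ratings - grand) ** 2).sum() - (n - 1) * msr - (k - 1) * msc
mse = sse / ((n - 1) * (k - 1))
icc_2_1 = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
print(f"ICC(2,1) = {icc_2_1:.3f}")
```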
Harris, Joshua D; Erickson, Brandon J; Cvetanovich, Gregory L; Abrams, Geoffrey D; McCormick, Frank M; Gupta, Anil K; Verma, Nikhil N; Bach, Bernard R; Cole, Brian J
2014-02-01
Condition-specific questionnaires are important components in evaluation of outcomes of surgical interventions. No condition-specific study methodological quality questionnaire exists for evaluation of outcomes of articular cartilage surgery in the knee. To develop a reliable and valid knee articular cartilage-specific study methodological quality questionnaire. Cross-sectional study. A stepwise, a priori-designed framework was created for development of a novel questionnaire. Relevant items to the topic were identified and extracted from a recent systematic review of 194 investigations of knee articular cartilage surgery. In addition, relevant items from existing generic study methodological quality questionnaires were identified. Items for a preliminary questionnaire were generated. Redundant and irrelevant items were eliminated, and acceptable items modified. The instrument was pretested and items weighed. The instrument, the MARK score (Methodological quality of ARticular cartilage studies of the Knee), was tested for validity (criterion validity) and reliability (inter- and intraobserver). A 19-item, 3-domain MARK score was developed. The 100-point scale score demonstrated face validity (focus group of 8 orthopaedic surgeons) and criterion validity (strong correlation to Cochrane Quality Assessment score and Modified Coleman Methodology Score). Interobserver reliability for the overall score was good (intraclass correlation coefficient [ICC], 0.842), and for all individual items of the MARK score, acceptable to perfect (ICC, 0.70-1.000). Intraobserver reliability ICC assessed over a 3-week interval was strong for 2 reviewers (≥0.90). The MARK score is a valid and reliable knee articular cartilage condition-specific study methodological quality instrument. This condition-specific questionnaire may be used to evaluate the quality of studies reporting outcomes of articular cartilage surgery in the knee.
Lange, Toni; Freiberg, Alice; Dröge, Patrik; Lützner, Jörg; Schmitt, Jochen; Kopkow, Christian
2015-06-01
Systematic literature review. Despite their frequent application in routine care, a systematic review on the reliability of clinical examination tests to evaluate the integrity of the ACL is missing. To summarize and evaluate intra- and interrater reliability research on physical examination tests used for the diagnosis of ACL tears. A comprehensive systematic literature search was conducted in MEDLINE, EMBASE and AMED until May 30th 2013. Studies were included if they assessed the intra- and/or interrater reliability of physical examination tests for the integrity of the ACL. Methodological quality was evaluated with the Quality Appraisal of Reliability Studies (QAREL) tool by two independent reviewers. The search yielded 110 hits, of which seven articles met the inclusion criteria. These studies examined the reliability of four physical examination tests. Intrarater reliability was assessed in three studies and ranged from fair to almost perfect (Cohen's κ = 0.22-1.00). Interrater reliability was assessed in all included studies and ranged from slight to almost perfect (Cohen's κ = 0.02-0.81). The Lachman test is the physical examination test with the highest intrarater reliability (Cohen's κ = 1.00), and the Lachman test performed in the prone position is the test with the highest interrater reliability (Cohen's κ = 0.81). Included studies were partly of low methodological quality. A meta-analysis could not be performed due to the heterogeneity in study populations, reliability measures and methodological quality of included studies. Systematic investigations on the reliability of physical examination tests to assess the integrity of the ACL are scarce and of varying methodological quality. Copyright © 2014 Elsevier Ltd. All rights reserved.
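Cohen's κ, the agreement statistic reported throughout this review, corrects raw agreement for chance and is available in common statistics libraries; a minimal sketch with synthetic dichotomous ratings:

```python
# Interrater agreement on a dichotomous Lachman test outcome using Cohen's kappa
# (synthetic ratings for illustration only).
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]   # 1 = ACL tear suspected
rater_b = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
print("Cohen's kappa:", round(cohen_kappa_score(rater_a, rater_b), 2))
```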
A study of adaptation mechanisms based on ABR recorded at high stimulation rate.
Valderrama, Joaquin T; de la Torre, Angel; Alvarez, Isaac; Segura, Jose Carlos; Thornton, A Roger D; Sainz, Manuel; Vargas, Jose Luis
2014-04-01
This paper analyzes the fast and slow mechanisms of adaptation through a study of latencies and amplitudes of auditory brainstem responses (ABRs) recorded at high stimulation rates using the randomized stimulation and averaging (RSA) technique. The RSA technique allows separate processing of auditory responses, and is used, in this study, to categorize responses according to the interstimulus interval (ISI) of their preceding stimulus. The fast and slow mechanisms of adaptation are analyzed by the separated responses methodology, whose underlying principles and mathematical basis are described in detail. The morphology of the ABR is influenced by both fast and slow mechanisms of adaptation. These results are consistent with previous animal studies based on spike rate. Both fast and slow mechanisms of adaptation are present in all subjects. In addition, the distribution of the jitter and the sequencing of the stimuli may be critical parameters when obtaining reliable ABRs. The separated responses methodology enables for the first time the analysis of the fast and slow mechanisms of adaptation in ABRs obtained at stimulation rates greater than 100 Hz. The non-invasive nature of this methodology is appropriate for its use in humans. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Rater methodology for stroboscopy: a systematic review.
Bonilha, Heather Shaw; Focht, Kendrea L; Martin-Harris, Bonnie
2015-01-01
Laryngeal endoscopy with stroboscopy (LES) remains the clinical gold standard for assessing vocal fold function. LES is used to evaluate the efficacy of voice treatments in research studies and clinical practice. LES as a voice treatment outcome tool is only as good as the clinician interpreting the recordings. Research using LES as a treatment outcome measure should be evaluated based on rater methodology and reliability. The purpose of this literature review was to evaluate the rater-related methodology from studies that use stroboscopic findings as voice treatment outcome measures. Systematic literature review. Computerized journal databases were searched for relevant articles using terms: stroboscopy and treatment. Eligible articles were categorized and evaluated for the use of rater-related methodology, reporting of number of raters, types of raters, blinding, and rater reliability. Of the 738 articles reviewed, 80 articles met inclusion criteria. More than one-third of the studies included in the review did not report the number of raters who participated in the study. Eleven studies reported results of rater reliability analysis with only two studies reporting good inter- and intrarater reliability. The comparability and use of results from treatment studies that use LES are limited by a lack of rigor in rater methodology and variable, mostly poor, inter- and intrarater reliability. To improve our ability to evaluate and use the findings from voice treatment studies that use LES features as outcome measures, greater consistency of reporting rater methodology characteristics across studies and improved rater reliability is needed. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
First-principles study of metallic iron interfaces
NASA Astrophysics Data System (ADS)
Hung, A.; Yarovsky, I.; Muscat, J.; Russo, S.; Snook, I.; Watts, R. O.
2002-04-01
Adhesion between clean, bulk-terminated bcc Fe(1 0 0) and Fe(1 1 0) matched and mismatched surfaces was simulated within the theoretical framework of the density functional theory. The generalized-gradient spin approximation exchange-correlation functional was used in conjunction with a plane wave-ultrasoft pseudopotential representation. The structure and properties of bulk bcc Fe were calculated in order to establish the reliability of the methodology employed, as well as to determine suitably converged values of computational parameters to be used in subsequent surface calculations. Interfaces were modelled using a single supercell approach, with the interfacial separation distance manipulated by the size of vacuum separation between vertically adjacent surface cells. The adhesive energies at discrete interfacial separations were calculated for each interface and the resulting data fitted to the universal binding energy relation (UBER) of Rose et al. [Phys. Rev. Lett. 47 (1981) 675]. An interpretation of the values of the fitted UBER parameters for the four Fe interfaces studied is given. In addition, a discussion on the validity of the employed computational methodology is presented.
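The UBER fit mentioned above amounts to a three-parameter nonlinear regression of adhesive energy against separation. Below is a sketch using the standard scaled form attributed to Rose et al., E(a) = -E0 (1 + a*) exp(-a*) with a* = (a - a0)/l, fitted with SciPy; the energy-separation data are invented stand-ins for the paper's DFT results.

```python
# Fitting adhesion energies to the universal binding energy relation (UBER):
# E(a) = -E0 * (1 + a*) * exp(-a*),  a* = (a - a0) / l
# The data below are made up for illustration; they are not the paper's values.
import numpy as np
from scipy.optimize import curve_fit

def uber(a, e0, a0, l):
    a_star = (a - a0) / l
    return -e0 * (1 + a_star) * np.exp(-a_star)

sep = np.array([1.6, 1.8, 2.0, 2.2, 2.6, 3.0, 3.6, 4.5])              # Angstrom
energy = np.array([-3.1, -3.6, -3.8, -3.7, -3.2, -2.5, -1.6, -0.8])   # J/m^2

popt, _ = curve_fit(uber, sep, energy, p0=[4.0, 2.0, 0.5])
e0, a0, l = popt
print(f"adhesion energy {e0:.2f} J/m^2, equilibrium separation {a0:.2f} A, "
      f"scaling length {l:.2f} A")
```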
[Ethical considerations about research with women in situations of violence].
Rafael, Ricardo de Mattos Russo; Soares de Moura, Anna Tereza Miranda
2013-01-01
This essay aims at reflecting on the ethical and methodological principles involved in research with women in situations of violence. The text raises the discussion of the application of the principles of beneficence and non-maleficence during research involving this issue, pointing to recommendations concerning privacy, autonomy and immediate contributions for volunteers. Then, taking the principles of justice and equity as theoretical references, the authors propose a debate on methodological aspects involved in the protection of respondents, with a view to improving the quality of the data obtained and possible social contributions.
Characterizing the reliability of a bioMEMS-based cantilever sensor
NASA Astrophysics Data System (ADS)
Bhalerao, Kaustubh D.
2004-12-01
The cantilever-based BioMEMS sensor represents one instance from many competing ideas of biosensor technology based on Micro Electro Mechanical Systems. The advancement of BioMEMS from laboratory-scale experiments to applications in the field will require standardization of their components and manufacturing procedures as well as frameworks to evaluate their performance. Reliability, the likelihood with which a system performs its intended task, is a compact mathematical description of its performance. The mathematical and statistical foundation of systems reliability has been applied to the cantilever-based BioMEMS sensor. The sensor is designed to detect one aspect of human ovarian cancer, namely the over-expression of the folate receptor surface protein (FR-alpha). Even as the application chosen is clinically motivated, the objective of this study was to demonstrate the underlying systems-based methodology used to design, develop and evaluate the sensor. The framework development can be readily extended to other BioMEMS-based devices for disease detection and will have an impact in the rapidly growing $30 bn industry. The Unified Modeling Language (UML) is a systems-based framework for design and development of object-oriented information systems which has potential application for use in systems designed to interact with biological environments. The UML has been used to abstract and describe the application of the biosensor, to identify key components of the biosensor, and the technology needed to link them together in a coherent manner. The use of the framework is also demonstrated in computation of system reliability from first principles as a function of the structure and materials of the biosensor. The outcomes of applying the systems-based framework to the study are the following: (1) Characterizing the cantilever-based MEMS device for disease (cell) detection. (2) Development of a novel chemical interface between the analyte and the sensor that provides a degree of selectivity towards the disease. (3) Demonstrating the performance and measuring the reliability of the biosensor prototype, and (4) Identification of opportunities in technological development in order to further refine the proposed biosensor. Application of the methodology to design, develop, and evaluate the reliability of BioMEMS devices will be beneficial in streamlining the growth of the BioMEMS industry, while providing a decision-support tool for comparing and adopting suitable technologies from available competing options.
System principles, mathematical models and methods to ensure high reliability of safety systems
NASA Astrophysics Data System (ADS)
Zaslavskyi, V.
2017-04-01
Modern safety and security systems are composed of a large number of components designed for detecting, localizing, tracking, collecting, and processing information from monitoring, telemetry, and control systems. They are required to be highly reliable so that data aggregation, processing, and analysis can be performed correctly to support subsequent decision making. During the design and construction phases of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure highly reliable signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the available component types and various resource constraints, must be considered. Different component types perform identical functions but are implemented using diverse principles and approaches and have distinct technical and economic indicators, such as cost or power consumption. The systematic use of different component types increases the probability of task completion and eliminates common-cause failures. We consider the type-variety principle as an engineering principle of system analysis, present mathematical models based on this principle, and describe algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of large-dimension, two-level discrete optimization problems. The proposed approach, models, and algorithms can be applied to optimal redundancy problems involving a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.
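At a much smaller scale than the two-level problems described, the flavor of type-variety redundancy optimization can be shown with a brute-force search over component-type combinations under a cost budget; the component data, the parallel-redundancy failure model, and the tie-breaking by type diversity in this sketch are illustrative assumptions.

```python
# Tiny exhaustive-search illustration of choosing redundant detector types under
# a cost budget (type data are invented; the paper's models are far larger).
from itertools import combinations_with_replacement

# (name, failure probability, cost) for candidate component types
types = [("A", 0.05, 3), ("B", 0.08, 2), ("C", 0.02, 6)]
budget = 10

best = None
for redundancy in range(1, 5):
    for combo in combinations_with_replacement(types, redundancy):
        cost = sum(c for _, _, c in combo)
        if cost > budget:
            continue
        # Parallel redundancy: the function fails only if every unit fails.
        p_fail = 1.0
        for _, q, _ in combo:
            p_fail *= q
        # Mixing types also reduces exposure to common-cause failures.
        diversity = len({name for name, _, _ in combo})
        candidate = (1 - p_fail, diversity, -cost, combo)
        if best is None or candidate > best:
            best = candidate

reliability, diversity, neg_cost, combo = best
print([n for n, _, _ in combo], f"R={reliability:.5f}", f"cost={-neg_cost}")
```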
Thomson, Hilary
2013-08-01
Systematic reviews have the potential to promote knowledge exchange between researchers and decision-makers. Review planning requires engagement with evidence users to ensure preparation of relevant reviews, and well-conducted reviews should provide accessible and reliable synthesis to support decision-making. Yet, systematic reviews are not routinely referred to by decision-makers, and innovative approaches to improve the utility of reviews are needed. Evidence synthesis for healthy public policy is typically complex and methodologically challenging. Although not lessening the value of reviews, these challenges can be overwhelming and threaten their utility. Using the interrelated principles of relevance, rigor, and readability, and in light of available resources, this article considers how utility of evidence synthesis for healthy public policy might be improved.
2013-01-01
Systematic reviews have the potential to promote knowledge exchange between researchers and decision-makers. Review planning requires engagement with evidence users to ensure preparation of relevant reviews, and well-conducted reviews should provide accessible and reliable synthesis to support decision-making. Yet, systematic reviews are not routinely referred to by decision-makers, and innovative approaches to improve the utility of reviews are needed. Evidence synthesis for healthy public policy is typically complex and methodologically challenging. Although not lessening the value of reviews, these challenges can be overwhelming and threaten their utility. Using the interrelated principles of relevance, rigor, and readability, and in light of available resources, this article considers how utility of evidence synthesis for healthy public policy might be improved. PMID:23763400
NASA-Ames workload research program
NASA Technical Reports Server (NTRS)
Hart, Sandra
1988-01-01
Research has been underway for several years to develop valid and reliable measures and predictors of workload as a function of operator state, task requirements, and system resources. Although the initial focus of this research was on aeronautics, the underlying principles and methodologies are equally applicable to space, and provide a set of tools that NASA and its contractors can use to evaluate design alternatives from the perspective of the astronauts. Objectives and approach of the research program are described, as well as the resources used in conducting research and the conceptual framework around which the program evolved. Next, standardized tasks are described, in addition to predictive models and assessment techniques and their application to the space program. Finally, some of the operational applications of these tasks and measures are reviewed.
High reliability organizing implementation at Sequoia and Kings Canyon National Parks
David A. Christenson; Mike DeGrosky; Anne E. Black; Brett Fay
2008-01-01
It is said that action often precedes cognition. For example, wildland fire management personnel already do things in the course of their work that they will later recognize as consistent with the principles of high reliability organizing (HRO), once they know about those principles. In the case of Sequoia and Kings Canyon National Parks (SEKI), the fire management...
ERIC Educational Resources Information Center
Renard, Colette; And Others
Principles of the "St. Cloud" audiovisual language instruction methodology based on "Le Francais fondamental" are presented in this guide for teachers. The material concentrates on course content, methodology, and application--including criteria for selection and gradation of course content, a description of the audiovisual and written language…
Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang
2017-01-01
The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, both in initial as well as further training and specialization in their fields, this particular aspect of their work receives scanty attention only. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been in existence and available as a tool to serve precisely the purpose of offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was based on the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in the context of investigations into a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important also in clinical practice and, today, represents the most used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper intends to offer an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion will in turn focus on the System's underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored.
Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang
2017-01-01
The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, both in initial as well as further training and specialization in their fields, this particular aspect of their work receives scanty attention only. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been in existence and available as a tool to serve precisely the purpose of offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was based on the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in the context of investigations into a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important also in clinical practice and, today, represents the most used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper intends to offer an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion will in turn focus on the System’s underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored. PMID:28439242
Davenport, Paul B; Carter, Kimberly F; Echternach, Jeffrey M; Tuck, Christopher R
2018-02-01
High-reliability organizations (HROs) demonstrate unique and consistent characteristics, including operational sensitivity and control, situational awareness, hyperacute use of technology and data, and actionable process transformation. System complexity and reliance on information-based processes challenge healthcare organizations to replicate HRO processes. This article describes a healthcare organization's 3-year journey to achieve key HRO features to deliver high-quality, patient-centric care via an operations center powered by the principles of high-reliability data and software to impact patient throughput and flow.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia
2015-04-26
Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.
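The RISMC-style load-versus-capacity idea behind this kind of assessment can be sketched by sampling uncertain boundary conditions and checking a margin against a limit. In the sketch below, the response function is a crude stand-in for the RELAP5-3D plant model, and all distributions, parameters, and limits are invented for illustration.

```python
# Conceptual margin calculation: sample uncertain boundary conditions, evaluate
# a (toy) plant response, and count how often the peak temperature exceeds the
# failure limit. A real assessment would use the system code, not this stand-in.
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

decay_heat = rng.normal(1.00, 0.05, n)   # normalized decay-heat multiplier
rccs_flow = rng.normal(1.00, 0.10, n)    # natural-circulation flow factor
ambient_c = rng.normal(30.0, 5.0, n)     # ambient air temperature [C]

def peak_cladding_temp(q, flow, t_amb):
    # Placeholder response surface for the station-blackout transient.
    return 550 + 180 * q / np.clip(flow, 0.5, None) + 0.8 * t_amb

limit_c = 800.0
peak = peak_cladding_temp(decay_heat, rccs_flow, ambient_c)
print("P(failure) ~", np.mean(peak > limit_c))
print("5th/95th percentile margin [C]:",
      np.percentile(limit_c - peak, [5, 95]).round(1))
```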
Developing a Questionnaire for Iranian Women's Attitude on Medical Ethics in Vaginal Childbirth.
Mirzaee Rabor, Firoozeh; Taghipour, Ali; Mirzaee, Moghaddameh; Mirzaii Najmabadi, Khadigeh; Fazilat Pour, Masoud; Fattahi Masoum, Seyed Hosein
2015-12-01
Vaginal delivery is one of the challenging issues in medical ethics. An appropriate instrument is needed to assess attitudes toward medical ethics in normal delivery, but no such tool exists. The aim of this study was to develop and validate a questionnaire for assessing women's attitudes toward the application of medical ethics in normal vaginal delivery. This methodological study was carried out in Iran in 2013-2014. In the first phase, the Medical Ethics Attitude in Vaginal Delivery Questionnaire (MEAVDQ) was developed using qualitative data from a grounded theory study of 20 women who had experienced vaginal childbirth. In the second phase, the tool was tested for content and face validity. In the third phase, exploratory factor analysis was used to assess construct validity, and reliability was tested with Cronbach's alpha coefficient; SPSS version 13 was used. The sample size for construct validity was 250 women who had normal vaginal childbirth. In the first phase (tool development), the four categories and nine subcategories obtained from the grounded theory study and a literature review yielded the three parts of the tool (A, B, and J), comprising 98 items. Part A addressed the first principle of medical ethics, part B the second and third principles, and part J the fourth principle. After evaluating and confirming face and content validity, 75 items remained in the questionnaire. In the construct validity analysis, exploratory factor analysis produced 3, 7, and 3 factors in parts A, B, and J, explaining 62.8%, 64%, and 51% of the total variance, respectively. The factors in the three parts were named by considering the factor loadings and the medical ethics principles. The subscales of the MEAVDQ showed satisfactory reliability: Cronbach's alpha coefficients were 0.76, 0.72, and 0.68 for parts A, B, and J, respectively, and 0.72 for the total questionnaire. Test-retest results were satisfactory for all items (ICC = 0.60-0.95). The present study showed that the 59-item MEAVDQ is a valid and reliable questionnaire for assessing women's attitudes toward the application of medical ethics in vaginal childbirth. This tool may assist specialists in making judgments and planning appropriate care for women in vaginal delivery management.
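Cronbach's alpha, used here for subscale reliability, is computed from the item variances and the variance of the total score; a minimal sketch with a synthetic response matrix (the MEAVDQ data are not reproduced):

```python
# Cronbach's alpha for one subscale, computed directly from its definition:
# alpha = k/(k-1) * (1 - sum of item variances / variance of the total score).
# Rows are respondents, columns are items (synthetic Likert-type responses).
import numpy as np

responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
    [3, 2, 3, 3],
])
k = responses.shape[1]
item_var = responses.var(axis=0, ddof=1).sum()
total_var = responses.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```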
Code of Federal Regulations, 2010 CFR
2010-01-01
... methods and principles of accounting prescribed by the state regulatory body having jurisdiction over the... telecommunications companies (47 CFR part 32), as those methods and principles of accounting are supplemented from... instruments by prescribing accounting principles, methodologies, and procedures applicable to all...
Design for reliability: NASA reliability preferred practices for design and test
NASA Technical Reports Server (NTRS)
Lalli, Vincent R.
1994-01-01
This tutorial summarizes reliability experience from both NASA and industry and reflects engineering practices that support current and future civil space programs. These practices were collected from various NASA field centers and were reviewed by a committee of senior technical representatives from the participating centers (members are listed at the end). The material for this tutorial was taken from the publication issued by the NASA Reliability and Maintainability Steering Committee (NASA Reliability Preferred Practices for Design and Test. NASA TM-4322, 1991). Reliability must be an integral part of the systems engineering process. Although both disciplines must be weighed equally with other technical and programmatic demands, the application of sound reliability principles will be the key to the effectiveness and affordability of America's space program. Our space programs have shown that reliability efforts must focus on the design characteristics that affect the frequency of failure. Herein, we emphasize that these identified design characteristics must be controlled by applying conservative engineering principles.
Applying Lean principles and Kaizen rapid improvement events in public health practice.
Smith, Gene; Poteat-Godwin, Annah; Harrison, Lisa Macon; Randolph, Greg D
2012-01-01
This case study describes a local home health and hospice agency's effort to implement Lean principles and Kaizen methodology as a rapid improvement approach to quality improvement. The agency created a cross-functional team, followed Lean Kaizen methodology, and made significant improvements in scheduling time for home health nurses that resulted in reduced operational costs, improved working conditions, and multiple organizational efficiencies.
Probabilistic sizing of laminates with uncertainties
NASA Technical Reports Server (NTRS)
Shah, A. R.; Liaw, D. G.; Chamis, C. C.
1993-01-01
A reliability-based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. Uncertainties in the constituent materials (fiber and matrix) are simulated using probability theory to predict macroscopic behavior. Uncertainties in the degradation of composite material properties are also included in the design methodology. A multi-factor interaction equation is used to evaluate load- and environment-dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). The versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size laminates for the design structural reliability of random-type structures. The results show that laminate configurations can be selected to improve structural reliability from three failures in 1000 to no failures in one million. The results also show that the laminates with the highest reliability are the least sensitive to loading conditions.
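The headline result, moving from roughly three failures in 1000 to essentially none in a million by re-sizing the laminate, is in essence a stress-strength reliability calculation. The sketch below is a simplified Monte Carlo version of that idea, not the IPACS micromechanics model, and all strength and load statistics are invented.

```python
# Simplified stress-strength Monte Carlo for a laminate ply: sample scatter in
# fiber/matrix-driven strength and in applied ply stress, then estimate failure
# probability for two candidate layups (all numbers are illustrative).
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

def failure_probability(mean_strength_mpa, cov_strength, mean_load_mpa, cov_load):
    strength = rng.normal(mean_strength_mpa, cov_strength * mean_strength_mpa, n)
    load = rng.normal(mean_load_mpa, cov_load * mean_load_mpa, n)
    return np.mean(load > strength)

# Candidate A: thinner laminate, higher stress per ply.
print("layup A:", failure_probability(600, 0.08, 450, 0.10))
# Candidate B: extra plies lower the ply stress for the same applied load.
print("layup B:", failure_probability(600, 0.08, 330, 0.10))
```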
On a methodology for robust segmentation of nonideal iris images.
Schmid, Natalia A; Zuo, Jinyu
2010-06-01
Iris biometric is one of the most reliable biometrics with respect to performance. However, this reliability is a function of the ideality of the data. One of the most important steps in processing nonideal data is reliable and precise segmentation of the iris pattern from remaining background. In this paper, a segmentation methodology that aims at compensating various nonidealities contained in iris images during segmentation is proposed. The virtue of this methodology lies in its capability to reliably segment nonideal imagery that is simultaneously affected with such factors as specular reflection, blur, lighting variation, occlusion, and off-angle images. We demonstrate the robustness of our segmentation methodology by evaluating ideal and nonideal data sets, namely, the Chinese Academy of Sciences iris data version 3 interval subdirectory, the iris challenge evaluation data, the West Virginia University (WVU) data, and the WVU off-angle data. Furthermore, we compare our performance to that of our implementation of Camus and Wildes's algorithm and Masek's algorithm. We demonstrate considerable improvement in segmentation performance over the formerly mentioned algorithms.
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: It is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and can assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.
Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique.
Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever
2015-01-01
In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.
Yusuf, Afiqah; Elsabbagh, Mayada
2015-12-15
Identifying biomarkers for autism can improve outcomes for those affected by autism. Engaging the diverse stakeholders in the research process using community-based participatory research (CBPR) can accelerate biomarker discovery into clinical applications. However, there are limited examples of stakeholder involvement in autism research, possibly due to conceptual and practical concerns. We evaluate the applicability of CBPR principles to biomarker discovery in autism and critically review empirical studies adopting these principles. Using a scoping review methodology, we identified and evaluated seven studies using CBPR principles in biomarker discovery. The limited number of studies in biomarker discovery adopting CBPR principles coupled with their methodological limitations suggests that such applications are feasible but challenging. These studies illustrate three CBPR themes: community assessment, setting global priorities, and collaboration in research design. We propose that further research using participatory principles would be useful in accelerating the pace of discovery and the development of clinically meaningful biomarkers. For this goal to be successful we advocate for increased attention to previously identified conceptual and methodological challenges to participatory approaches in health research, including improving scientific rigor and developing long-term partnerships among stakeholders.
ADM guidance-Ceramics: Fracture toughness testing and method selection.
Cesar, Paulo Francisco; Della Bona, Alvaro; Scherrer, Susanne S; Tholey, Michael; van Noort, Richard; Vichi, Alessandro; Kelly, Robert; Lohbauer, Ulrich
2017-06-01
The objective is within the scope of the Academy of Dental Materials Guidance Project, which is to provide dental materials researchers with a critical analysis of fracture toughness (FT) tests such that the assessment of the FT of dental ceramics is conducted in a reliable, repeatable and reproducible way. Fracture mechanics theory and FT methodologies were critically reviewed to introduce basic fracture principles and determine the main advantages and disadvantages of existing FT methods from the standpoint of the dental researcher. The recommended methods for FT determination of dental ceramics were the Single Edge "V" Notch Beam (SEVNB), Single Edge Precracked Beam (SEPB), Chevron Notch Beam (CNB), and Surface Crack in Flexure (SCF). SEVNB's main advantage is the ease of producing the notch via a cutting disk; SEPB allows for production of an atomically sharp crack generated by a specific precracking device; CNB is technically difficult but based on solid fracture mechanics solutions; and SCF involves fracture from a clinically sized precrack. The indentation fracture (IF) test should be avoided due to heavy criticism that has arisen in the engineering field regarding the empirical nature of the calculations used for FT determination. Dental researchers interested in FT measurement of dental ceramics should start with a broad review of fracture mechanics theory to understand the underlying principles involved in fast fracture of ceramics. The choice of FT methodology should be based on the pros and cons of each test, as described in this literature review. Copyright © 2017 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Adogwa, Owoicho; Elsamadicy, Aladine A; Cheng, Joseph; Bagley, Carlos
2016-03-01
The prospective acquisition of reliable patient-reported outcomes (PROs) measures demonstrating the effectiveness of spine surgery, or lack thereof, remains a challenge. The aims of this study are to compare the reliability of functional outcomes metrics obtained using full-time employee (FTE)-dependent vs. non-FTE-dependent methodologies and to determine the independent predictors of response reliability using non-FTE-dependent methodologies. One hundred and nineteen adult patients (male: 65, female: 54) undergoing one- and two-level lumbar fusions at Duke University Medical Center were enrolled in this prospective study. Enrollment criteria included available demographic, clinical and baseline functional outcomes data. All patients were administered two similar sets of baseline questionnaires: (I) phone interviews (FTE-dependent) and (II) hardcopy in clinic (patient self-survey, non-FTE-dependent). All patients had at least a two-week washout period between phone interviews and in-clinic self-surveys to minimize the effect of recall. Questionnaires included the Oswestry Disability Index (ODI) and the Visual Analog Back and Leg Pain Scales (VAS-BP/LP). Reliability was assessed by the degree to which patient responses to baseline questionnaires differed between the two time points. About 26.89% of patients had a history of an anxiety disorder and 28.57% reported a history of depression. At least 97.47% of patients had a high school diploma or GED, with 49.57% attaining a four-year college degree or post-graduate degree; 29.94% reported full-time employment and 14.28% were on disability. There was a very high correlation between baseline PRO data captured by FTE-dependent and non-FTE-dependent methodologies (r=0.89). In a multivariate logistic regression model, the absence of anxiety and depression, higher levels of education (college or greater) and full-time employment were independently associated with high response reliability using non-FTE-dependent methodologies. Our study suggests that capturing health-related quality of life data using non-FTE-dependent methodologies is highly reliable and may be a more cost-effective alternative. Well-educated patients who are employed full-time appear to be the most reliable.
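As a rough illustration of the analysis reported above, the sketch below computes the correlation between the two questionnaire administrations and fits a logistic regression for response reliability; all data and variable names are synthetic stand-ins, not the study's records.

```python
# Sketch: comparing phone-interview vs. self-survey ODI scores and modelling response
# reliability. Data below are synthetic; variable names are hypothetical, not the study's.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 119
odi_phone = rng.uniform(10, 70, n)             # FTE-dependent administration
odi_clinic = odi_phone + rng.normal(0, 5, n)   # non-FTE-dependent administration

r, p = pearsonr(odi_phone, odi_clinic)
print(f"Correlation between administrations: r = {r:.2f} (p = {p:.3g})")

# "Reliable" responder = small absolute difference between the two administrations.
reliable = (np.abs(odi_phone - odi_clinic) < 8).astype(int)

# Hypothetical predictors: anxiety/depression history, college education, full-time work.
X = np.column_stack([
    rng.integers(0, 2, n),   # anxiety or depression (1 = yes)
    rng.integers(0, 2, n),   # college degree or higher
    rng.integers(0, 2, n),   # employed full-time
])
model = LogisticRegression().fit(X, reliable)
print("Logistic-regression coefficients:", model.coef_.round(2))
```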
Deliberate Imagery Practice: The Reliability of Using a Retrospective Recall Methodology
ERIC Educational Resources Information Center
Cumming, Jennifer; Hall, Craig; Starkes, Janet L.
2005-01-01
This study examined the reliability of a retrospective recall methodology for providing evidence of deliberate imagery practice. A secondary purpose was to determine which imagery activities constituted the sport-specific definition of deliberate practice (Starkes, Deakin, Allard, Hodges, & Hayes, 1996). Ninety-three Canadian athletes from one…
Duong, Thien C.; Hackenberg, Robert E.; Landa, Alex; ...
2016-09-20
In this paper, thermodynamic and kinetic diffusivities of uranium–niobium (U–Nb) are re-assessed by means of the CALPHAD (CALculation of PHAse Diagram) methodology. In order to improve the consistency and reliability of the assessments, first-principles calculations are coupled with CALPHAD. In particular, heats of formation of γ-U–Nb are estimated and verified using various density-functional theory (DFT) approaches. These thermochemistry data are then used as constraints to guide the thermodynamic optimization process in such a way that the mutual consistency between first-principles calculations and the CALPHAD assessment is satisfactory. In addition, long-term aging experiments are conducted in order to generate new phase equilibria data at the γ2/(α+γ2) boundary. These data are meant to verify the thermodynamic model. Assessment results are generally in good agreement with experiments and previous calculations, without showing the artifacts that were observed in previous modeling. The mutually consistent thermodynamic description is then used to evaluate the atomic mobility and diffusivity of γ-U–Nb. Finally, Bayesian analysis is conducted to evaluate the uncertainty of the thermodynamic model and its impact on the system's phase stability.
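The coupling idea, first-principles data constraining a CALPHAD-style optimization, can be sketched in toy form: fit Redlich-Kister interaction parameters to phase-boundary data while penalizing departure from a DFT-derived enthalpy of mixing. All numbers below are placeholders, and a real assessment (e.g., with PARROT-type optimizers) is far more involved.

```python
# Toy illustration of coupling first-principles data with a CALPHAD-style fit:
# Redlich-Kister excess Gibbs energy for a binary solution, with a DFT enthalpy of
# mixing used as a soft constraint on the optimization. All values are placeholders.
import numpy as np
from scipy.optimize import least_squares

def g_excess(x, L0, L1):
    """Redlich-Kister excess Gibbs energy (J/mol) for mole fraction x of component B."""
    return x * (1 - x) * (L0 + L1 * (2 * x - 1))

# Hypothetical "experimental" excess Gibbs energies along a phase boundary.
x_exp = np.array([0.2, 0.4, 0.6, 0.8])
g_exp = np.array([3200.0, 4800.0, 4500.0, 2600.0])

# Hypothetical DFT enthalpy of mixing at x = 0.5, used as a weighted constraint.
h_dft, weight = 5000.0, 3.0

def residuals(params):
    L0, L1 = params
    fit = g_excess(x_exp, L0, L1) - g_exp
    constraint = weight * (g_excess(0.5, L0, L1) - h_dft)
    return np.append(fit, constraint)

sol = least_squares(residuals, x0=[10000.0, 0.0])
print("Optimized interaction parameters L0, L1:", sol.x.round(1))
```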
The reliability of the Glasgow Coma Scale: a systematic review.
Reith, Florence C M; Van den Brande, Ruben; Synnot, Anneliese; Gruen, Russell; Maas, Andrew I R
2016-01-01
The Glasgow Coma Scale (GCS) provides a structured method for assessment of the level of consciousness. Its derived sum score is applied in research and adopted in intensive care unit scoring systems. Controversy exists on the reliability of the GCS. The aim of this systematic review was to summarize evidence on the reliability of the GCS. A literature search was undertaken in MEDLINE, EMBASE and CINAHL. Observational studies that assessed the reliability of the GCS, expressed by a statistical measure, were included. Methodological quality was evaluated with the consensus-based standards for the selection of health measurement instruments checklist and its influence on results considered. Reliability estimates were synthesized narratively. We identified 52 relevant studies that showed significant heterogeneity in the type of reliability estimates used, patients studied, setting and characteristics of observers. Methodological quality was good (n = 7), fair (n = 18) or poor (n = 27). In good quality studies, kappa values were ≥0.6 in 85%, and all intraclass correlation coefficients indicated excellent reliability. Poor quality studies showed lower reliability estimates. Reliability for the GCS components was higher than for the sum score. Factors that may influence reliability include education and training, the level of consciousness and type of stimuli used. Only 13% of studies were of good quality and inconsistency in reported reliability estimates was found. Although the reliability was adequate in good quality studies, further improvement is desirable. From a methodological perspective, the quality of reliability studies needs to be improved. From a clinical perspective, a renewed focus on training/education and standardization of assessment is required.
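Where a reliability study reports rater agreement, the statistic of interest is typically Cohen's kappa for the GCS components or an intraclass correlation for the sum score. A minimal sketch, assuming synthetic ratings from two observers:

```python
# Sketch: inter-rater agreement for GCS motor-component ratings from two observers.
# Ratings are synthetic; kappa >= 0.6 is the adequacy threshold cited in the review.
from sklearn.metrics import cohen_kappa_score

rater_a = [6, 5, 6, 4, 3, 6, 5, 2, 6, 4, 5, 6, 1, 3, 6]
rater_b = [6, 5, 5, 4, 3, 6, 5, 2, 6, 4, 4, 6, 1, 3, 6]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")
# A weighted kappa (weights='quadratic') is often preferred for ordinal scales like the GCS.
```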
Integrated Design Methodology for Highly Reliable Liquid Rocket Engine
NASA Astrophysics Data System (ADS)
Kuratani, Naoshi; Aoki, Hiroshi; Yasui, Masaaki; Kure, Hirotaka; Masuya, Goro
An integrated design methodology is strongly required at the conceptual design phase to achieve highly reliable space transportation systems, especially propulsion systems, not only in Japan but worldwide. In the past, catastrophic failures have caused losses of mission and vehicle (LOM/LOV) during the operational phase and have severely affected schedules and costs in later development phases. In this study, a design methodology for a highly reliable liquid rocket engine is preliminarily established and investigated. A sensitivity analysis is systematically performed to demonstrate the effectiveness of this methodology and to clarify, in particular, the correlations among the combustion chamber, turbopump and main valve as main components. This study describes the essential issues for understanding these correlations, the need to apply the methodology to the remaining critical failure modes in the whole engine system, and the perspective on engine development in the future.
18 CFR 301.6 - Appendix 1 instructions.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., DEPARTMENT OF ENERGY REGULATIONS FOR FEDERAL POWER MARKETING ADMINISTRATIONS AVERAGE SYSTEM COST METHODOLOGY... must be in accord with Generally Accepted Accounting Principles and Practices as these principles and...
What really separates casuistry from principlism in biomedical ethics.
Cudney, Paul
2014-06-01
Since the publication of the first edition of Tom Beauchamp and James Childress's Principles of Biomedical Ethics there has been much debate about what a proper method in medical ethics should look like. The main rival for Beauchamp and Childress's account, principlism, has consistently been casuistry, an account that recommends argument by analogy from paradigm cases. Admirably, Beauchamp and Childress have modified their own view in successive editions of Principles of Biomedical Ethics in order to address the concerns proponents of casuistry and others have had about principlism. Given these adjustments to their view, some have claimed that principlism and casuistry no longer count as distinct methods. Even so, many still consider these two conceptions of bioethical methodologies as rivals. Both accounts of the relationship between casuistry and principlism are wrong. These two conceptions of methodology in biomedical ethics are significantly different, but the differences are not the ones pointed out by those who still claim that they are distinct positions. In this article, I explain where the real similarities and differences lie between these two views.
Beyond Objectivity and Subjectivity: The Intersubjective Foundations of Psychological Science.
Mascolo, Michael F
2016-12-01
The question of whether psychology can properly be regarded as a science has long been debated (Smedslund in Integrative Psychological & Behavioral Science, 50, 185-195, 2016). Science is typically understood as a method for producing reliable knowledge by testing falsifiable claims against objective evidence. Psychological phenomena, however, are traditionally taken to be "subjective" and hidden from view. To the extent that science relies upon objective observation, is a scientific psychology possible? In this paper, I argue that scientific psychology does not so much fail to meet the requirements of objectivity as the concept of objectivity fails as a methodological principle for psychological science. The traditional notion of objectivity relies upon the distinction between a public, observable exterior and a private, subjective interior. There are good reasons, however, to reject this dichotomy. Scholarship suggests that psychological knowledge arises neither from the "inside out" (subjectively) nor from the "outside in" (objectively), but instead from intersubjective processes that occur between people. If this is so, then objectivist methodology may do more to obscure than illuminate our understanding of psychological functioning. From this view, we face a dilemma: Do we, in the name of science, cling to an objective epistemology that cuts us off from the richness of psychological activity? Or do we seek to develop a rigorous intersubjective psychology that exploits the processes through which we gain psychological knowledge in the first place? If such a psychology can produce systematic, reliable and useful knowledge, then the question of whether its practices are "scientific" in the traditional sense would become irrelevant.
Predicting the Reliability of Ceramics Under Transient Loads and Temperatures With CARES/Life
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.
2003-01-01
A methodology is shown for predicting the time-dependent reliability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The methodology takes into account the changes in material response that can occur with temperature or time (i.e., changing fatigue and Weibull parameters with temperature or time). This capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. The code has been modified to have the ability to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
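A heavily simplified sketch of the kind of calculation involved: a transient stress history is reduced to a power-law (slow-crack-growth) equivalent stress and fed into a two-parameter Weibull failure probability. The material parameters are illustrative, and this is not the CARES/Life implementation.

```python
# Simplified time-dependent reliability sketch: a transient stress history is reduced to a
# power-law (slow-crack-growth) equivalent stress, then fed into a two-parameter Weibull
# failure probability. Parameters are illustrative, not CARES/Life material data.
import numpy as np

m, sigma_0 = 10.0, 400.0        # Weibull modulus and characteristic strength (MPa)
N_scg, t_ref = 20.0, 1.0        # slow-crack-growth exponent and reference time (h)

t = np.linspace(0.0, 10.0, 1001)                 # transient load history over 10 h
sigma = 250.0 + 50.0 * np.sin(2 * np.pi * t)     # cyclic thermomechanical stress (MPa)

# Equivalent static stress acting for t_ref that produces the same crack growth.
sigma_eq = (np.trapz(sigma**N_scg, t) / t_ref) ** (1.0 / N_scg)

p_fail = 1.0 - np.exp(-(sigma_eq / sigma_0) ** m)
print(f"Equivalent stress = {sigma_eq:.1f} MPa, failure probability = {p_fail:.3e}")
```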
The design and methodology of premature ejaculation interventional studies
2016-01-01
Large well-designed clinical efficacy and safety randomized clinical trials (RCTs) are required to achieve regulatory approval of new drug treatments. The objective of this article is to make recommendations for the criteria for defining and selecting the clinical trial study population, design and efficacy outcomes measures which comprise ideal premature ejaculation (PE) interventional trial methodology. Data on clinical trial design, epidemiology, definitions, dimensions and the psychological impact of PE were reviewed, critiqued and incorporated into a series of recommendations for standardisation of PE clinical trial design, outcome measures and reporting using the principles of evidence based medicine. Data from PE interventional studies are only reliable, interpretable and capable of being generalised to patients with PE when study populations are defined by the International Society for Sexual Medicine (ISSM) multivariate definition of PE. PE intervention trials should employ a double-blind RCT methodology and include placebo control, active standard drug control, and/or dose comparison trials. Ejaculatory latency time (ELT) and subject/partner outcome measures of control, personal/partner/relationship distress and other study-specific outcome measures should be used as outcome measures. There is currently no published literature which identifies a clinically significant threshold response to intervention. The ISSM definition of PE reflects the contemporary understanding of PE and represents the state-of-the-art multi-dimensional definition of PE and is recommended as the basis of diagnosis of PE for all PE clinical trials. PMID:27652224
Evaluation methodologies for an advanced information processing system
NASA Technical Reports Server (NTRS)
Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.
1984-01-01
The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
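As an illustration of the Markov modeling approach mentioned above, the sketch below evaluates the reliability of a duplex unit with repair from its generator matrix; the rates are illustrative, and the actual AIPS models are considerably more detailed.

```python
# Sketch: Markov reliability model of a duplex processor with failure rate lam and
# repair rate mu. States: index 0 = both healthy, 1 = one failed, 2 = system failure
# (absorbing). Rates are illustrative only.
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-4, 1e-1            # failures/hour, repairs/hour

Q = np.array([                  # generator matrix; each row sums to zero
    [-2 * lam,   2 * lam,     0.0],
    [      mu, -(mu + lam),   lam],
    [     0.0,       0.0,     0.0],
])

p0 = np.array([1.0, 0.0, 0.0])  # start with both units healthy
for hours in (10.0, 100.0, 1000.0):
    p = p0 @ expm(Q * hours)    # state probabilities at time t
    print(f"t = {hours:6.0f} h  reliability = {1.0 - p[2]:.6f}")
```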
Assessing the Infusion of Sustainability Principles into University Curricula
ERIC Educational Resources Information Center
Biasutti, Michele; De Baz, Theodora; Alshawa, Hala
2016-01-01
The current paper presents the assessment of the infusion of sustainability principles into university curricula at two Jordanian universities. The peer review process of revising the curricula infusing sustainability principles is also discussed. The research methodology involved quantitative methods to assess the revised courses. The results…
17 CFR 38.1050 - Core Principle 20.
Code of Federal Regulations, 2014 CFR
2014-04-01
... automated systems, that are reliable, secure, and have adequate scalable capacity; (b) Establish and... CONTRACT MARKETS System Safeguards § 38.1050 Core Principle 20. Each designated contract market shall: (a...
17 CFR 38.1050 - Core Principle 20.
Code of Federal Regulations, 2013 CFR
2013-04-01
... automated systems, that are reliable, secure, and have adequate scalable capacity; (b) Establish and... CONTRACT MARKETS System Safeguards § 38.1050 Core Principle 20. Each designated contract market shall: (a...
17 CFR 37.1400 - Core Principle 14-System safeguards.
Code of Federal Regulations, 2014 CFR
2014-04-01
... procedures, and automated systems, that: (1) Are reliable and secure; and (2) Have adequate scalable capacity... 17 Commodity and Securities Exchanges 1 2014-04-01 2014-04-01 false Core Principle 14-System... SWAP EXECUTION FACILITIES System Safeguards § 37.1400 Core Principle 14—System safeguards. The swap...
7 CFR 1767.13 - Departures from the prescribed RUS Uniform System of Accounts.
Code of Federal Regulations, 2010 CFR
2010-01-01
... accounting methodologies and principles that depart from the provisions herein; or (2) File with such... borrower's rates, based upon accounting methods and principles inconsistent with the provisions of this... accounting methods or principles for the borrower that are inconsistent with the provisions of this part, the...
NASA Astrophysics Data System (ADS)
Lee, Chang-Chun; Huang, Pei-Chen
2018-05-01
The long-term reliability of multi-stacked coatings subjected to bending or rolling loads is a severe challenge to extending the lifespan of such structures. In addition, the adhesive strength between dissimilar materials is regarded as the major mechanical reliability concern among multi-stacked films. However, the significant scale mismatch, from several nanometers to micrometers, among the multi-stacked coatings causes numerical accuracy and convergence issues in fracture-based simulation approaches. For those reasons, this study proposes FEA-based multi-level submodeling and multi-point constraint (MPC) techniques to overcome the foregoing scale-mismatch issue. The results indicated that a suitably chosen region for first- and second-order submodeling can achieve a small error of 1.27% compared with the experimental result while significantly reducing the mesh density and computing time. Moreover, the MPC method adopted in the FEA simulation showed only a 0.54% error when the boundary of the selected local region was located away from the critical region of concern, following the Saint-Venant principle. In this investigation, two FEA-based approaches were used to overcome the evident scale-mismatch issue when the adhesive strengths of micro- and nano-scale multi-stacked coatings were taken into account.
Development of an interprofessional lean facilitator assessment scale.
Bravo-Sanchez, Cindy; Dorazio, Vincent; Denmark, Robert; Heuer, Albert J; Parrott, J Scott
2018-05-01
High reliability is important for optimising quality and safety in healthcare organisations. Reliability efforts include interprofessional collaborative practice (IPCP) and Lean quality/process improvement strategies, which require skilful facilitation. Currently, no validated Lean facilitator assessment tool for interprofessional collaboration exists. This article describes the development and pilot evaluation of such a tool; the Interprofessional Lean Facilitator Assessment Scale (ILFAS), which measures both technical and 'soft' skills, which have not been measured in other instruments. The ILFAS was developed using methodologies and principles from Lean/Shingo, IPCP, metacognition research and Bloom's Taxonomy of Learning Domains. A panel of experts confirmed the initial face validity of the instrument. Researchers independently assessed five facilitators, during six Lean sessions. Analysis included quantitative evaluation of rater agreement. Overall inter-rater agreement of the assessment of facilitator performance was high (92%), and discrepancies in the agreement statistics were analysed. Face and content validity were further established, and usability was evaluated, through primary stakeholder post-pilot feedback, uncovering minor concerns, leading to tool revision. The ILFAS appears comprehensive in the assessment of facilitator knowledge, skills, abilities, and may be useful in the discrimination between facilitators of different skill levels. Further study is needed to explore instrument performance and validity.
NASA Astrophysics Data System (ADS)
Han, Seung Zeon; Kang, Joonhee; Kim, Sung-Dae; Choi, Si-Young; Kim, Hyung Giun; Lee, Jehyun; Kim, Kwangho; Lim, Sung Hwan; Han, Byungchan
2015-10-01
We report that a single-crystal Ni2Si nanowire (NW) of intermetallic compound can be reliably designed using a simple three-step process: casting a ternary Cu-Ni-Si alloy; nucleation and growth of Ni2Si NWs embedded in the alloy matrix via designed discontinuous precipitation (DP) of Ni2Si nanoparticles and thermal aging; and finally chemical etching to decouple the Ni2Si NWs from the alloy matrix. By direct application of uniaxial tensile tests to the Ni2Si NW we characterize its mechanical properties, which have rarely been reported in the literature. Using integrated studies of first-principles density functional theory (DFT) calculations, high-resolution transmission electron microscopy (HRTEM), and energy-dispersive X-ray spectroscopy (EDX) we accurately validate the experimental measurements. Our results indicate that this simple three-step method enables the design of brittle Ni2Si NWs with a high tensile strength of 3.0 GPa and an elastic modulus of 60.6 GPa. We propose that the systematic methodology pursued in this paper contributes significantly to opening innovative processes for designing various kinds of low-dimensional nanomaterials, advancing the frontiers of nanotechnology and related industry sectors.
On-line diagnosis of inter-turn short circuit fault for DC brushed motor.
Zhang, Jiayuan; Zhan, Wei; Ehsani, Mehrdad
2018-06-01
Extensive research effort has been made in fault diagnosis of motors and related components such as winding and ball bearing. In this paper, a new concept of inter-turn short circuit fault for DC brushed motors is proposed to include the short circuit ratio and short circuit resistance. A first-principle model is derived for motors with inter-turn short circuit fault. A statistical model based on Hidden Markov Model is developed for fault diagnosis purpose. This new method not only allows detection of motor winding short circuit fault, it can also provide estimation of the fault severity, as indicated by estimation of the short circuit ratio and the short circuit resistance. The estimated fault severity can be used for making appropriate decisions in response to the fault condition. The feasibility of the proposed methodology is studied for inter-turn short circuit of DC brushed motors using simulation in MATLAB/Simulink environment. In addition, it is shown that the proposed methodology is reliable with the presence of small random noise in the system parameters and measurement. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
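A minimal sketch of HMM-based classification, assuming two pre-trained models ("healthy" vs. "inter-turn fault") and a discretized measurement sequence; the matrices are illustrative placeholders, not the models identified in the paper.

```python
# Sketch: classify a discretized current-signature sequence by comparing its likelihood
# under two hidden Markov models ("healthy" vs. "inter-turn fault"). All matrices are
# illustrative placeholders.
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of an observation sequence via the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    log_like = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        log_like += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_like

pi = np.array([0.6, 0.4])                               # initial state probabilities
A_healthy = np.array([[0.9, 0.1], [0.2, 0.8]])          # state-transition matrices
B_healthy = np.array([[0.7, 0.2, 0.1], [0.3, 0.4, 0.3]])  # emission probabilities
A_fault   = np.array([[0.5, 0.5], [0.4, 0.6]])
B_fault   = np.array([[0.1, 0.3, 0.6], [0.2, 0.3, 0.5]])

obs = [0, 0, 1, 2, 2, 2, 1, 2]                          # quantized measurement symbols
ll_h = forward_log_likelihood(obs, pi, A_healthy, B_healthy)
ll_f = forward_log_likelihood(obs, pi, A_fault, B_fault)
print("diagnosis:", "inter-turn fault" if ll_f > ll_h else "healthy")
```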
Detection and quantification of MS lesions using fuzzy topological principles
NASA Astrophysics Data System (ADS)
Udupa, Jayaram K.; Wei, Luogang; Samarasekera, Supun; Miki, Yukio; van Buchem, M. A.; Grossman, Robert I.
1996-04-01
Quantification of the severity of the multiple sclerosis (MS) disease through estimation of lesion volume via MR imaging is vital for understanding and monitoring the disease and its treatment. This paper presents a novel methodology and a system that can be routinely used for segmenting and estimating the volume of MS lesions via dual-echo spin-echo MR imagery. An operator indicates a few points in the images by pointing to the white matter, the gray matter, and the CSF. Each of these objects is then detected as a fuzzy connected set. The holes in the union of these objects correspond to potential lesion sites which are utilized to detect each potential lesion as a fuzzy connected object. These 3D objects are presented to the operator who indicates acceptance/rejection through the click of a mouse button. The volume of accepted lesions is then computed and output. Based on several evaluation studies and over 300 3D data sets that were processed, we conclude that the methodology is highly reliable and consistent, with a coefficient of variation (due to subjective operator actions) of less than 1.0% for volume.
Tate, Robyn L; McDonald, Skye; Perdices, Michael; Togher, Leanne; Schultz, Regina; Savage, Sharon
2008-08-01
Rating scales that assess methodological quality of clinical trials provide a means to critically appraise the literature. Scales are currently available to rate randomised and non-randomised controlled trials, but there are none that assess single-subject designs. The Single-Case Experimental Design (SCED) Scale was developed for this purpose and evaluated for reliability. Six clinical researchers who were trained and experienced in rating methodological quality of clinical trials developed the scale and participated in reliability studies. The SCED Scale is an 11-item rating scale for single-subject designs, of which 10 items are used to assess methodological quality and use of statistical analysis. The scale was developed and refined over a 3-year period. Content validity was addressed by identifying items to reduce the main sources of bias in single-case methodology as stipulated by authorities in the field, which were empirically tested against 85 published reports. Inter-rater reliability was assessed using a random sample of 20/312 single-subject reports archived in the Psychological Database of Brain Impairment Treatment Efficacy (PsycBITE). Inter-rater reliability for the total score was excellent, both for individual raters (overall ICC = 0.84; 95% confidence interval 0.73-0.92) and for consensus ratings between pairs of raters (overall ICC = 0.88; 95% confidence interval 0.78-0.95). Item reliability was fair to excellent for consensus ratings between pairs of raters (range k = 0.48 to 1.00). The results were replicated with two independent novice raters who were trained in the use of the scale (ICC = 0.88, 95% confidence interval 0.73-0.95). The SCED Scale thus provides a brief and valid evaluation of methodological quality of single-subject designs, with the total score demonstrating excellent inter-rater reliability using both individual and consensus ratings. Items from the scale can also be used as a checklist in the design, reporting and critical appraisal of single-subject designs, thereby assisting to improve standards of single-case methodology.
Characterizing Aeroallergens by Infrared Spectroscopy of Fungal Spores and Pollen
Zimmermann, Boris; Tkalčec, Zdenko; Mešić, Armin; Kohler, Achim
2015-01-01
Background Fungal spores and plant pollen cause respiratory diseases in susceptible individuals, such as asthma, allergic rhinitis and hypersensitivity pneumonitis. Aeroallergen monitoring networks are an important part of treatment strategies, but unfortunately traditional analysis is time consuming and expensive. We have explored the use of infrared spectroscopy of pollen and spores for an inexpensive and rapid characterization of aeroallergens. Methodology The study is based on measurement of spore and pollen samples by single reflectance attenuated total reflectance Fourier transform infrared spectroscopy (SR-ATR FTIR). The experimental set includes 71 spore (Basidiomycota) and 121 pollen (Pinales, Fagales and Poales) samples. Along with fresh basidiospores, the study has been conducted on the archived samples collected within the last 50 years. Results The spectroscopic-based methodology enables clear spectral differentiation between pollen and spores, as well as the separation of confamiliar and congeneric species. In addition, the analysis of the scattering signals inherent in the infrared spectra indicates that the FTIR methodology offers indirect estimation of morphology of pollen and spores. The analysis of fresh and archived spores shows that chemical composition of spores is well preserved even after decades of storage, including the characteristic taxonomy-related signals. Therefore, biochemical analysis of fungal spores by FTIR could provide economical, reliable and timely methodologies for improving fungal taxonomy, as well as for fungal identification and monitoring. This proof of principle study shows the potential for using FTIR as a rapid tool in aeroallergen studies. In addition, the presented method is ready to be immediately implemented in biological and ecological studies for direct measurement of pollen and spores from flowers and sporocarps. PMID:25867755
Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.
Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander
2018-04-10
Many models have been developed for predicting software reliability, but existing reliability models are restricted to particular methodologies and a limited number of parameters. A number of techniques and methodologies may be used for reliability prediction, and careful consideration of the parameters is needed when estimating reliability. The reliability of a system may increase or decrease depending on the parameters selected; thus there is a need to identify the factors that most heavily affect system reliability. At present, reusability is widely used across many areas of research and is the basis of Component-Based Systems (CBS). Cost, time and human effort can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are most suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities exist for applying soft computing techniques to medicine-related problems: clinical medicine makes significant use of fuzzy-logic and neural-network methodologies, while basic medical science most frequently uses combined neural-network and genetic-algorithm approaches. Medical scientists show considerable interest in applying soft computing methodologies in genetics, physiology, radiology, cardiology and neurology. CBSE encourages users to reuse past and existing software when making new products, providing quality while saving time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). The paper presents the working of these soft computing techniques and assesses their use for predicting reliability; the parameters considered while estimating and predicting reliability are also discussed. This study can be used in the estimation and prediction of the reliability of various instruments used in medical systems, software engineering, computer engineering and mechanical engineering. These concepts can be applied to both software and hardware to predict reliability using CBSE.
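As one concrete example of a soft computing technique applied to reliability prediction, the sketch below uses a tiny genetic algorithm to fit the Goel-Okumoto reliability growth model to cumulative failure counts; the data and GA settings are illustrative, and PSO, ABC or the other listed techniques could be substituted.

```python
# Sketch: a tiny genetic algorithm fitting the Goel-Okumoto software reliability growth
# model mu(t) = a * (1 - exp(-b t)) to cumulative failure data. Data and GA settings are
# illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1, 11, dtype=float)                     # test weeks
failures = np.array([8, 15, 21, 26, 30, 33, 35, 37, 38, 39], dtype=float)

def fitness(pop):
    a, b = pop[:, 0:1], pop[:, 1:2]
    pred = a * (1.0 - np.exp(-b * t))                 # predicted cumulative failures
    return -np.mean((pred - failures) ** 2, axis=1)   # negative MSE (higher is better)

pop = np.column_stack([rng.uniform(10, 100, 60), rng.uniform(0.01, 1.0, 60)])
for _ in range(200):
    f = fitness(pop)
    parents = pop[np.argsort(f)[-30:]]                         # selection: keep fitter half
    children = parents[rng.integers(0, 30, 60)].copy()         # reproduction
    children += rng.normal(0.0, [1.0, 0.01], children.shape)   # mutation
    children[:, 1] = np.clip(children[:, 1], 1e-3, None)       # keep rate positive
    pop = children

a_best, b_best = pop[np.argmax(fitness(pop))]
print(f"fitted Goel-Okumoto parameters: a = {a_best:.1f}, b = {b_best:.3f}")
```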
The redoubtable ecological periodic table
Ecological periodic tables are repositories of reliable information on quantitative, predictably recurring (periodic) habitat–community patterns and their uncertainty, scaling and transferability. Their reliability derives from their grounding in sound ecological principle...
Multidisciplinary System Reliability Analysis
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)
2001-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
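A crude stand-in for the multidisciplinary idea: Monte Carlo estimation of system failure probability when structural, thermal, and fluid-flow limit states share common random inputs. The limit states and numbers are invented for illustration; NESSUS itself relies on fast probability integration rather than plain Monte Carlo.

```python
# Sketch: Monte Carlo estimate of system failure probability for a heat-exchanger-like
# system with three limit states (structural stress, wall temperature, pressure drop)
# driven by shared random inputs. Limit states and numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

flow = rng.normal(2.0, 0.2, n)          # coolant mass flow (kg/s)
heat = rng.normal(50.0, 5.0, n)         # heat load (kW)
thickness = rng.normal(3.0, 0.15, n)    # tube wall thickness (mm)

stress = 120.0 * heat / thickness       # simplified structural response (MPa)
wall_temp = 300.0 + 4.0 * heat / flow   # simplified thermal response (K)
dp = 8.0 * flow**2                      # simplified pressure drop (kPa)

# Series system: the system fails if any single failure mode is triggered.
fail = (stress > 2500.0) | (wall_temp > 430.0) | (dp > 45.0)
print(f"estimated system failure probability = {fail.mean():.4f}")
```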
Multi-Disciplinary System Reliability Analysis
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Han, Song
1997-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
Reliability and maintainability assessment factors for reliable fault-tolerant systems
NASA Technical Reports Server (NTRS)
Bavuso, S. J.
1984-01-01
A long term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. This paper describes the numerous factors that potentially have a degrading effect on system reliability and the ways in which these factors, peculiar to highly reliable fault tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.
NASA Technical Reports Server (NTRS)
Miller, James; Leggett, Jay; Kramer-White, Julie
2008-01-01
A team directed by the NASA Engineering and Safety Center (NESC) collected methodologies for how best to develop safe and reliable human rated systems and how to identify the drivers that provide the basis for assessing safety and reliability. The team also identified techniques, methodologies, and best practices to assure that NASA can develop safe and reliable human rated systems. The results are drawn from a wide variety of resources, from experts involved with the space program since its inception to the best-practices espoused in contemporary engineering doctrine. This report focuses on safety and reliability considerations and does not duplicate or update any existing references. Neither does it intend to replace existing standards and policy.
ERIC Educational Resources Information Center
Rahman, Nurulhuda Abd; Masuwai, Azwani; Tajudin, Nor'ain Mohd; Tek, Ong Eng; Adnan, Mazlini
2016-01-01
Purpose: This study was aimed at establishing, through the validation of the "Teaching and Learning Guiding Principles Instrument" (TLGPI), the validity and reliability of the underlying factor structure of the Teaching and Learning Guiding Principles (TLGP) generated by a previous study. Method: A survey method was used to collect data…
Morrison, Geoffrey Stewart
2014-05-01
In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four-and-a-half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle. © 2013.
Jiang, Chenghui; Whitehill, Tara L
2014-04-01
Speech errors associated with cleft palate are well established for English and several other Indo-European languages. Few articles describing the speech of Putonghua (standard Mandarin Chinese) speakers with cleft palate have been published in English language journals. Although methodological guidelines have been published for the perceptual speech evaluation of individuals with cleft palate, there has been no critical review of methodological issues in studies of Putonghua speakers with cleft palate. A literature search was conducted to identify relevant studies published over the past 30 years in Chinese language journals. Only studies incorporating perceptual analysis of speech were included. Thirty-seven articles which met inclusion criteria were analyzed and coded on a number of methodological variables. Reliability was established by having all variables recoded for all studies. This critical review identified many methodological issues. These design flaws make it difficult to draw reliable conclusions about characteristic speech errors in this group of speakers. Specific recommendations are made to improve the reliability and validity of future studies, as well to facilitate cross-center comparisons.
Fractography: determining the sites of fracture initiation.
Mecholsky, J J
1995-03-01
Fractography is the analysis of fracture surfaces. Here, it refers to quantitative fracture surface analysis (FSA) in the context of applying the principles of fracture mechanics to the topography observed on the fracture surface of brittle materials. The application of FSA is based on the principle that encoded on the fracture surface of brittle materials is the entire history of the fracture process. It is our task to develop the skills and knowledge to decode this information. There are several motivating factors for applying our knowledge of FSA. The first and foremost is that there is specific, quantitative information to be obtained from the fracture surface. This information includes the identification of the size and location of the fracture initiating crack or defect, the stress state at failure, the existence, or not, of local or global residual stress, the existence, or not, of stress corrosion and a knowledge of local processing anomalies which affect the fracture process. The second motivating factor is that the information is free. Once a material is tested to failure, the encoded information becomes available. If we decide to observe the features produced during fracture then we are rewarded with much information. If we decide to ignore the fracture surface, then we are left to guess and/or reason as to the cause of the failure without the benefit of all of the possible information available. This paper addresses the application of quantitative fracture surface analysis to basic research, material and product development, and "trouble-shooting" of in-service failures. First, the basic principles involved will be presented. Next, the methodology necessary to apply the principles will be presented. Finally, a summary of the presentation will be made showing the applicability to design and reliability.
Malygin, Ya V; Tsygankov, B D
The authors discuss a methodology for studying satisfaction with medical services and its factors: the timing of assessment, the methodology of data collection, the data format, benchmarking, the principles for including questions in a questionnaire, and the organization and frequency of such studies.
Kepler: Analogies in the search for the law of refraction.
Cardona, Carlos Alberto
2016-10-01
This paper examines the methodology used by Kepler to discover a quantitative law of refraction. The aim is to argue that this methodology follows a heuristic method based on the following two Pythagorean principles: (1) sameness is made known by sameness, and (2) harmony arises from establishing a limit to what is unlimited. We will analyse some of the author's proposed analogies to find the aforementioned law and argue that the investigation's heuristic pursues such principles. Copyright © 2016 Elsevier Ltd. All rights reserved.
Remote sensing applied to agriculture: Basic principles, methodology, and applications
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Mendonca, F. J.
1981-01-01
The general principles of remote sensing techniques as applied to agriculture and the methods of data analysis are described. The theoretical spectral responses of crops; the reflectance, transmittance, and absorptance of plants; the interactions of plants and soils with reflected energy; leaf morphology; and the factors which affect the reflectance of vegetation cover are discussed. The methodologies of visual and computer-aided analyses of LANDSAT data are presented. Finally, a case study wherein infrared film was used to detect crop anomalies and other data applications are described.
2011-01-01
Background Although principles based in motor learning, rehabilitation, and human-computer interfaces can guide the design of effective interactive systems for rehabilitation, a unified approach that connects these key principles into an integrated design, and can form a methodology that can be generalized to interactive stroke rehabilitation, is presently unavailable. Results This paper integrates phenomenological approaches to interaction and embodied knowledge with rehabilitation practices and theories to achieve the basis for a methodology that can support effective adaptive, interactive rehabilitation. Our resulting methodology provides guidelines for the development of an action representation, quantification of action, and the design of interactive feedback. As Part I of a two-part series, this paper presents key principles of the unified approach. Part II then describes the application of this approach within the implementation of the Adaptive Mixed Reality Rehabilitation (AMRR) system for stroke rehabilitation. Conclusions The accompanying principles for composing novel mixed reality environments for stroke rehabilitation can advance the design and implementation of effective mixed reality systems for the clinical setting, and ultimately be adapted for home-based application. They furthermore can be applied to other rehabilitation needs beyond stroke. PMID:21875441
76 FR 65504 - Proposed Agency Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-21
..., including the validity of the methodology and assumptions used; (c) ways to enhance the quality, utility... Reliability Standard, FAC- 008-3--Facility Ratings, developed by the North American Electric Reliability... Reliability Standard FAC- 008-3 is pending before the Commission. The proposed Reliability Standard modifies...
Basic Principles of Electrical Network Reliability Optimization in Liberalised Electricity Market
NASA Astrophysics Data System (ADS)
Oleinikova, I.; Krishans, Z.; Mutule, A.
2008-01-01
The authors propose to select long-term solutions to the reliability problems of electrical networks at the stage of development planning. The guidelines or basic principles of such optimization are: 1) its dynamic nature; 2) development sustainability; 3) integrated solution of the problems of network development and electricity supply reliability; 4) consideration of information uncertainty; 5) concurrent consideration of the network and generation development problems; 6) application of specialized information technologies; 7) definition of requirements for independent electricity producers. In the article, the major aspects of a liberalized electricity market, its functions and tasks are reviewed, with emphasis placed on the optimization of electrical network development as a significant component of sustainable management of power systems.
Advanced approach to the analysis of a series of in-situ nuclear forward scattering experiments
NASA Astrophysics Data System (ADS)
Vrba, Vlastimil; Procházka, Vít; Smrčka, David; Miglierini, Marcel
2017-03-01
This study introduces a sequential fitting procedure as a specific approach to nuclear forward scattering (NFS) data evaluation. The principles and usage of this advanced evaluation method are described in detail, and its utilization is demonstrated on NFS in-situ investigations of fast processes. Such experiments frequently consist of hundreds of time spectra which need to be evaluated. The introduced procedure allows the analysis of these experiments and significantly decreases the time needed for data evaluation. The key contributions of the study are the sequential use of the output fitting parameters of a previous data set as the input parameters for the next data set, and the option to cross-check model suitability by applying the procedure in both ascending and descending directions through the data sets. The described fitting methodology is beneficial for checking model validity and the reliability of the obtained results.
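The sequential procedure itself is simple to express; the sketch below fits each data set in turn, seeding every fit with the previous fit's parameters, and repeats the sweep in the descending direction as a model cross-check. A generic decaying-exponential model stands in for the NFS time spectra, which in practice require a dedicated nuclear-resonance code.

```python
# Sketch of sequential fitting: the optimized parameters of data set i seed the fit of
# data set i+1. A simple decaying-exponential model and synthetic data stand in for the
# NFS time spectra.
import numpy as np
from scipy.optimize import curve_fit

def model(t, amplitude, rate):
    return amplitude * np.exp(-rate * t)

rng = np.random.default_rng(3)
t = np.linspace(0.0, 10.0, 200)

# Synthetic "in-situ series": the decay rate drifts slowly from one spectrum to the next.
series = [model(t, 100.0, 0.30 + 0.01 * i) + rng.normal(0, 2, t.size) for i in range(20)]

params = (80.0, 0.2)                      # starting guess for the first data set only
ascending_results = []
for spectrum in series:
    params, _ = curve_fit(model, t, spectrum, p0=params)
    ascending_results.append(params)

# Cross-check model suitability by repeating the sweep in the descending direction.
params = ascending_results[-1]
descending_results = []
for spectrum in reversed(series):
    params, _ = curve_fit(model, t, spectrum, p0=params)
    descending_results.append(params)

print("data set 0, ascending :", np.round(ascending_results[0], 3))
print("data set 0, descending:", np.round(descending_results[-1], 3))
```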
HPOTP low-speed flexible rotor balancing, phase 1
NASA Technical Reports Server (NTRS)
Giordano, J.; Zorzi, E.
1985-01-01
A method was developed that shows promise in overcoming many balancing limitations. This method establishes one or more windows for low speed, out-of-housing balancing of flexible rotors. These windows are regions of speed and support flexibility where two conditions are simultaneously fulfilled. First, the rotor system behaves flexibly; therefore, there is separation among balance planes. Second, the response due to balance weights is large enough to reliably measure. The analytic formulation of the low-speed flexible rotor balancing method is described. The results of proof-of-principle tests conducted under the program are presented. Based on this effort, it is concluded that low speed flexible rotor balancing is a viable technology. In particular, the method can be used to balance a rotor bearing system at low speed which results in smooth operation above more than one bending critical speed. Furthermore, this balancing methodology is applicable to SSME turbopump rotors.
Parts and Components Reliability Assessment: A Cost Effective Approach
NASA Technical Reports Server (NTRS)
Lee, Lydia
2009-01-01
System reliability assessment is a methodology which incorporates reliability analyses performed at the parts and components level, such as Reliability Prediction, Failure Modes and Effects Analysis (FMEA) and Fault Tree Analysis (FTA), to assess risks, perform design tradeoffs, and therefore ensure effective productivity and/or mission success. The system reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standard-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems, based on failure rate estimates published by United States (U.S.) military or commercial standards and handbooks. Many of these standards are globally accepted and recognized. The reliability assessment is especially useful during the initial stages, when the system design is still in development and hard failure data are not yet available, or when manufacturers are not contractually obliged by their customers to publish reliability estimates/predictions for their parts and components. This paper presents a methodology to assess system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success in an efficient manner, at low cost, and on a tight schedule.
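A minimal parts-count style sketch of standards-based prediction: part failure rates, adjusted by quality and environment factors, are summed for a series system to give MTBF and mission reliability. The base rates and pi-factors are placeholders, not values from MIL-HDBK-217 or any specific handbook.

```python
# Sketch of a parts-count style prediction: sum part failure rates adjusted by quality and
# environment factors, then compute the series-system MTBF and mission reliability.
# Base rates and pi-factors are placeholders, not handbook values.
import math

parts = [
    # (name, base failure rate per 1e6 h, quantity, pi_quality, pi_environment)
    ("microcontroller",   0.050,  1, 1.0, 4.0),
    ("ceramic capacitor", 0.002, 40, 0.7, 4.0),
    ("film resistor",     0.001, 60, 0.7, 4.0),
    ("connector",         0.010,  4, 1.5, 4.0),
]

# Series system: the total failure rate is the sum of the adjusted part failure rates.
lambda_system = sum(rate * qty * pq * pe for _, rate, qty, pq, pe in parts)  # per 1e6 h
mtbf_hours = 1e6 / lambda_system

mission_hours = 2000.0
reliability = math.exp(-lambda_system * mission_hours / 1e6)

print(f"system failure rate = {lambda_system:.3f} per 1e6 h")
print(f"MTBF = {mtbf_hours:,.0f} h, R({mission_hours:.0f} h) = {reliability:.4f}")
```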
Development and psychometric testing of the Cancer Knowledge Scale for Elders.
Su, Ching-Ching; Chen, Yuh-Min; Kuo, Bo-Jein
2009-03-01
To develop the Cancer Knowledge Scale for Elders and test its validity and reliability. The number of elders suffering from cancer is increasing. To facilitate cancer prevention behaviours among elders, they should be provided with cancer-related knowledge. Prior to designing a programme that would respond to the special needs of elders, understanding the cancer-related knowledge within this population was necessary. However, extensive review of the literature revealed a lack of appropriate instruments for measuring cancer-related knowledge. A valid and reliable cancer knowledge scale for elders is necessary. A non-experimental methodological design was used to test the psychometric properties of the Cancer Knowledge Scale for Elders. Item analysis was first performed to screen out items that had low corrected item-total correlation coefficients. Construct validity was examined with a principal component method of exploratory factor analysis. Cancer-related health behaviour was used as the criterion variable to evaluate criterion-related validity. Internal consistency reliability was assessed by the KR-20. Stability was determined by two-week test-retest reliability. The factor analysis yielded a four-factor solution accounting for 49.5% of the variance. For criterion-related validity, cancer knowledge was positively correlated with cancer-related health behaviour (r = 0.78, p < 0.001). The KR-20 coefficients of each factor were 0.85, 0.76, 0.79 and 0.67 and 0.87 for the total scale. Test-retest reliability over a two-week period was 0.83 (p < 0.001). This study provides evidence for content validity, construct validity, criterion-related validity, internal consistency and stability of the Cancer Knowledge Scale for Elders. The results show that this scale is an easy-to-use instrument for elders and has adequate validity and reliability. The scale can be used as an assessment instrument when implementing cancer education programmes for elders. It can also be used to evaluate the effects of education programmes.
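The KR-20 internal-consistency coefficient reported above can be computed directly from a dichotomous item-response matrix; a short sketch with synthetic responses:

```python
# Sketch: Kuder-Richardson formula 20 (KR-20) for dichotomously scored knowledge items.
# The response matrix is synthetic (rows = respondents, columns = items).
import numpy as np

responses = np.array([
    [1, 1, 0, 1, 1, 0, 1, 1],
    [1, 0, 0, 1, 1, 0, 1, 0],
    [1, 1, 1, 1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0, 0, 1, 0],
    [1, 1, 0, 1, 1, 1, 1, 1],
    [0, 1, 0, 0, 1, 0, 0, 0],
])

k = responses.shape[1]
p = responses.mean(axis=0)                  # proportion answering each item correctly
q = 1.0 - p
total_var = responses.sum(axis=1).var(ddof=1)

kr20 = (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)
print(f"KR-20 = {kr20:.2f}")
```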
Reliability approach to rotating-component design. [fatigue life and stress concentration
NASA Technical Reports Server (NTRS)
Kececioglu, D. B.; Lalli, V. R.
1975-01-01
A probabilistic methodology for designing rotating mechanical components using reliability to relate stress to strength is explained. The experimental test machines and data obtained for steel to verify this methodology are described. A sample mechanical rotating component design problem is solved by comparing a deterministic design method with the new design-by-reliability approach. The new method shows that a smaller size and weight can be obtained for a specified rotating shaft life and reliability, and uses the statistical distortion-energy theory with statistical fatigue diagrams for optimum shaft design. Statistical methods are presented for (1) determining strength distributions for steel experimentally, (2) determining a failure theory for stress variations in a rotating shaft subjected to reversed bending and steady torque, and (3) relating strength to stress by reliability.
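A minimal sketch of the stress-strength (interference) idea underlying such a design-by-reliability approach is shown below, assuming independent, normally distributed stress and strength; the shaft values are hypothetical and do not come from the cited experiments.

```python
from statistics import NormalDist

def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """R = P(strength > stress) for independent normal strength and stress."""
    beta = (mu_strength - mu_stress) / (sd_strength**2 + sd_stress**2) ** 0.5
    return NormalDist().cdf(beta)

# Hypothetical rotating-shaft values in MPa (not from the cited experiments).
print(interference_reliability(mu_strength=620, sd_strength=40,
                               mu_stress=480, sd_stress=55))
```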
Fabrication Techniques and Principles for Flat Plate Antennas
DOT National Transportation Integrated Search
1973-09-01
The report documents the fabrication techniques and principles selected to produce one and ten million flat plate antennas per year. An engineering analysis of the reliability, electrical integrity, and repeatability is made, and a cost analysis summ...
Nondestructive methods of integrating energy harvesting systems with structures
NASA Astrophysics Data System (ADS)
Inamdar, Sumedh; Zimowski, Krystian; Crawford, Richard; Wood, Kristin; Jensen, Dan
2012-04-01
Designing an attachment structure that is both novel and meets the system requirements can be a difficult task especially for inexperienced designers. This paper presents a design methodology for concept generation of a "parent/child" attachment system. The "child" is broadly defined as any device, part, or subsystem that will attach to any existing system, part, or device called the "parent." An inductive research process was used to study a variety of products, patents, and biological examples that exemplified the parent/child system. Common traits among these products were found and categorized as attachment principles in three different domains: mechanical, material, and field. The attachment principles within the mechanical domain and accompanying examples are the focus of this paper. As an example of the method, a case study of generating concepts for a bridge mounted wind energy harvester using the mechanical attachment principles derived from the methodology and TRIZ principles derived from Altshuller's matrix of contradictions is presented.
NASA Technical Reports Server (NTRS)
Cohen, Gerald C. (Inventor); McMann, Catherine M. (Inventor)
1991-01-01
An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
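One plausible reading of this aggregation step, sketched below under the assumption of independent failures, is a recursive evaluation of a reliability block diagram built from low-level model outputs and a series/parallel architecture description; the structure and numbers are illustrative and are not the patented generator itself.

```python
def block_reliability(block):
    """Recursively evaluate a reliability block diagram.

    A block is either a float (a low-level model's reliability) or a dict
    {"type": "series"|"parallel", "blocks": [...]} describing the architecture.
    """
    if isinstance(block, (int, float)):
        return float(block)
    child = [block_reliability(b) for b in block["blocks"]]
    if block["type"] == "series":
        r = 1.0
        for c in child:
            r *= c
        return r
    if block["type"] == "parallel":   # active redundancy, independent failures
        q = 1.0
        for c in child:
            q *= (1.0 - c)
        return 1.0 - q
    raise ValueError(f"unknown block type: {block['type']}")

# Hypothetical architecture: two redundant processors in series with one bus.
architecture = {"type": "series",
                "blocks": [{"type": "parallel", "blocks": [0.95, 0.95]}, 0.999]}
print(f"System reliability: {block_reliability(architecture):.5f}")
```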
ERIC Educational Resources Information Center
Hoyt, William T.; Bhati, Kuldhir S.
2007-01-01
This article examines the 50 qualitative studies published in the Journal of Counseling Psychology (JCP) over a 15-year period in light of methodological principles advocated by qualitative theorists. The match between practices and principles is not high. In the modal investigation, coders (most of whom did not interact with or observe…
Gordon, G M; Steyn, M
2016-05-01
A recent review paper on cranio-facial superimposition (CFS) stated that "there have been specific conceptual variances" from the original methods used in the practice of skull-photo superimposition, leading to poor results as far as accuracy is concerned. It was argued that the deviations in the practice of the technique have resulted in the reduced accuracies (for both failure to include and failure to exclude) that are noted in several recent studies. This paper aims to present the results from recent research to highlight the advancement of skull-photo/cranio-facial superimposition, and to discuss some of the issues raised regarding deviations from original techniques. The evolving methodology of CFS is clarified in context with the advancement of technology, forensic science and specifically within the field of forensic anthropology. Developments in the skull-photo/cranio-facial superimposition techniques have largely focused on testing reliability and accuracy objectively. Techniques now being employed by forensic anthropologists must conform to rigorous scientific testing and methodologies. Skull-photo/cranio-facial superimposition is constantly undergoing accuracy and repeatability testing which is in line with the principles of the scientific method and additionally allows for advancement in the field. Much of the research has indicated that CFS is useful in exclusion which is consistent with the concept of Popperian falsifiability - a hypothesis and experimental design which is falsifiable. As the hypothesis is disproved or falsified, another evolves to replace it and explain the new observations. Current and future studies employing different methods to test the accuracy and reliability of skull-photo/cranio-facial superimposition will enable researchers to establish the contribution the technique can have for identification purposes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's
NASA Technical Reports Server (NTRS)
Jadaan, Osama
2003-01-01
This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMC's this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and if feasible, run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
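To illustrate the Weibull size effect mentioned above, the sketch below evaluates a two-parameter Weibull probability of failure with a simple volume-scaling term; the thin-film parameter values are hypothetical, and the actual CARES/Life formulation is considerably more elaborate.

```python
import math

def weibull_failure_probability(stress, volume, m, sigma0, v0=1.0):
    """Two-parameter Weibull probability of failure with simple volume scaling:
    P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m)."""
    return 1.0 - math.exp(-(volume / v0) * (stress / sigma0) ** m)

# Hypothetical parameters: the size effect predicts that a larger stressed
# volume fails at a lower stress for the same probability of failure.
for volume in (1.0, 10.0, 100.0):
    pf = weibull_failure_probability(stress=1.2e9, volume=volume, m=10, sigma0=2.0e9)
    print(f"V = {volume:6.1f}  P_f = {pf:.4f}")
```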
Test-Retest Reliability of Pediatric Heart Rate Variability: A Meta-Analysis.
Weiner, Oren M; McGrath, Jennifer J
2017-01-01
Heart rate variability (HRV), an established index of autonomic cardiovascular modulation, is associated with health outcomes (e.g., obesity, diabetes) and mortality risk. Time- and frequency-domain HRV measures are commonly reported in longitudinal adult and pediatric studies of health. While test-retest reliability has been established among adults, less is known about the psychometric properties of HRV among infants, children, and adolescents. The objective was to conduct a meta-analysis of the test-retest reliability of time- and frequency-domain HRV measures from infancy to adolescence. Electronic searches (PubMed, PsycINFO; January 1970-December 2014) identified studies with nonclinical samples aged ≤ 18 years; ≥ 2 baseline HRV recordings separated by ≥ 1 day; and sufficient data for effect size computation. Forty-nine studies ( N = 5,170) met inclusion criteria. Methodological variables coded included factors relevant to study protocol, sample characteristics, electrocardiogram (ECG) signal acquisition and preprocessing, and HRV analytical decisions. Fisher's Z was derived as the common effect size. Analyses were age-stratified (infant/toddler < 5 years, n = 3,329; child/adolescent 5-18 years, n = 1,841) due to marked methodological differences across the pediatric literature. Meta-analytic results revealed HRV demonstrated moderate reliability; child/adolescent studies ( Z = 0.62, r = 0.55) had significantly higher reliability than infant/toddler studies ( Z = 0.42, r = 0.40). Relative to other reported measures, HF exhibited the highest reliability among infant/toddler studies ( Z = 0.42, r = 0.40), while rMSSD exhibited the highest reliability among child/adolescent studies ( Z = 1.00, r = 0.76). Moderator analyses indicated greater reliability with shorter test-retest interval length, reported exclusion criteria based on medical illness/condition, lower proportion of males, prerecording acclimatization period, and longer recording duration; differences were noted across age groups. HRV is reliable among pediatric samples. Reliability is sensitive to pertinent methodological decisions that require careful consideration by the researcher. Limited methodological reporting precluded several a priori moderator analyses. Suggestions for future research, including standards specified by Task Force Guidelines, are discussed.
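As a small illustration of the Fisher Z pooling used in such meta-analyses, the sketch below transforms test-retest correlations, averages them with the usual n - 3 weights, and back-transforms the result; the study values are invented for illustration and are not the studies pooled in this meta-analysis.

```python
import math

def fisher_z(r):
    """Fisher's Z transformation of a correlation coefficient."""
    return 0.5 * math.log((1 + r) / (1 - r))

def inverse_fisher_z(z):
    """Back-transform a pooled Z to a correlation."""
    return math.tanh(z)

# Illustrative test-retest correlations and sample sizes (hypothetical studies).
studies = [(0.55, 40), (0.48, 65), (0.70, 28)]
weights = [n - 3 for _, n in studies]
z_bar = sum(w * fisher_z(r) for (r, n), w in zip(studies, weights)) / sum(weights)
print(f"Pooled Z = {z_bar:.2f}, pooled r = {inverse_fisher_z(z_bar):.2f}")
```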
Integrated biocircuits: engineering functional multicellular circuits and devices.
Prox, Jordan; Smith, Tory; Holl, Chad; Chehade, Nick; Guo, Liang
2018-04-01
Implantable neurotechnologies have revolutionized neuromodulatory medicine for treating the dysfunction of diseased neural circuitry. However, challenges with biocompatibility and lack of full control over neural network communication and function limit the potential to create more stable and robust neuromodulation devices. Thus, we propose a platform technology of implantable and programmable cellular systems, namely Integrated Biocircuits, which use only cells as the functional components of the device. We envision that the foundational principles for this concept begin with novel in vitro platforms used for the study and reconstruction of cellular circuitry. Additionally, recent advancements in organoid and 3D culture systems account for microenvironment factors of cytoarchitecture to construct multicellular circuits as they are normally formed in the brain. We explore the current state of the art of these platforms to provide knowledge of their advancements in circuit fabrication and identify the current biological principles that could be applied in designing integrated biocircuit devices. We have highlighted the exemplary methodologies and techniques of in vitro circuit fabrication and propose the integration of selected controllable parameters, which would be required in creating suitable biodevices. We provide our perspective and propose new insights into the future of neuromodulation devices within the scope of living cellular systems that can be applied in designing more reliable and biocompatible stimulation-based neuroprosthetics.
Forecasting of Radiation Belts: Results From the PROGRESS Project.
NASA Astrophysics Data System (ADS)
Balikhin, M. A.; Arber, T. D.; Ganushkina, N. Y.; Walker, S. N.
2017-12-01
The overall goal of the PROGRESS project, funded within the EU Horizon 2020 programme, is to combine first-principles-based models with systems science methodologies to achieve reliable forecasts of the geo-space particle radiation environment. PROGRESS incorporates three themes: the propagation of the solar wind to L1, the forecast of geomagnetic indices, and the forecast of fluxes of energetic electrons within the magnetosphere. One of the important aspects of the PROGRESS project is the development of statistical wave models for magnetospheric waves that affect the dynamics of energetic electrons, such as lower-band chorus, hiss and equatorial noise. The error reduction ratio (ERR) concept has been used to optimise the set of solar wind and geomagnetic parameters for organisation of the statistical wave models for these emissions. The resulting sets of parameters and statistical wave models will be presented and discussed. However, the ERR analysis also indicates that the combination of solar wind and geomagnetic parameters accounts for only part of the variance of the emissions under investigation (lower-band chorus, hiss and equatorial noise). In addition, advances in the forecast of fluxes of energetic electrons achieved by the PROGRESS project, exploiting empirical models and the first-principles IMPTAM model, are presented.
Chew, Gina; Walczyk, Thomas
2013-04-02
Subtle variations in the isotopic composition of elements carry unique information about physical and chemical processes in nature and are now exploited widely in diverse areas of research. Reliable measurement of natural isotope abundance variations is among the biggest challenges in inorganic mass spectrometry as they are highly sensitive to methodological bias. For decades, double spiking of the sample with a mix of two stable isotopes has been considered the reference technique for measuring such variations both by multicollector-inductively coupled plasma mass spectrometry (MC-ICPMS) and multicollector-thermal ionization mass spectrometry (MC-TIMS). However, this technique can only be applied to elements having at least four stable isotopes. Here we present a novel approach that requires measurement of three isotope signals only and which is more robust than the conventional double spiking technique. This became possible by gravimetric mixing of the sample with an isotopic spike in different proportions and by applying principles of isotope dilution for data analysis (GS-IDA). The potential and principal use of the technique is demonstrated for Mg in human urine using MC-TIMS for isotopic analysis. Mg is an element inaccessible to double spiking methods as it consists of three stable isotopes only and shows great potential for metabolically induced isotope effects waiting to be explored.
Transient Reliability of Ceramic Structures For Heat Engine Applications
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama M.
2002-01-01
The objective of this report was to develop a methodology to predict the time-dependent reliability (probability of failure) of brittle material components subjected to transient thermomechanical loading, taking into account the change in material response with time. This methodology for computing the transient reliability of ceramic components subjected to fluctuating thermomechanical loading was developed assuming slow crack growth (SCG) as the delayed mode of failure. It takes into account the effect of the Weibull modulus and material response varying with time. The methodology was also coded into a beta version of NASA's CARES/Life code, and an example demonstrating its viability was presented.
ERIC Educational Resources Information Center
Jabar, Syaril Izwann; Albion, Peter R.
2016-01-01
Based on Chickering and Gamson's (1987) Seven Principles for Good Practice, this research project attempted to revitalize the principles by merging them with Merrill's (2006) Different Levels of Instructional Strategy. The aim was to develop, validate, and standardize a measurement instrument (DLISt7) using a pretest-posttest Internet…
NASA Astrophysics Data System (ADS)
Samaroo, Ryan
2015-11-01
This essay examines Friedman's recent approach to the analysis of physical theories. Friedman argues against Quine that the identification of certain principles as 'constitutive' is essential to a satisfactory methodological analysis of physics. I explicate Friedman's characterization of a constitutive principle, and I evaluate his account of the constitutive principles that Newtonian and Einsteinian gravitation presuppose for their formulation. I argue that something close to Friedman's thesis is defensible.
Mechanical system reliability for long life space systems
NASA Technical Reports Server (NTRS)
Kowal, Michael T.
1994-01-01
The creation of a compendium of mechanical limit states was undertaken in order to provide a reference base for the application of first-order reliability methods to mechanical systems in the context of the development of a system level design methodology. The compendium was conceived as a reference source specific to the problem of developing the noted design methodology, and not an exhaustive or exclusive compilation of mechanical limit states. The compendium is not intended to be a handbook of mechanical limit states for general use. The compendium provides a diverse set of limit-state relationships for use in demonstrating the application of probabilistic reliability methods to mechanical systems. The compendium is to be used in the reliability analysis of moderately complex mechanical systems.
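The sketch below shows how one limit-state relationship of the kind collected in such a compendium might be exercised, here by crude Monte Carlo sampling rather than a first-order method; the bending limit state, input distributions, and parameter values are all hypothetical.

```python
import random

def limit_state_failure_probability(n_samples=200_000, seed=1):
    """Monte Carlo estimate of P(g < 0) for a simple bending limit state
    g = yield_strength - M / Z, with hypothetical input distributions."""
    rng = random.Random(seed)
    section_modulus = 5.0e-6                                       # m^3 (illustrative)
    failures = 0
    for _ in range(n_samples):
        yield_strength = rng.lognormvariate(mu=19.6, sigma=0.08)   # Pa, median ~3.3e8
        moment = rng.gauss(mu=1100.0, sigma=300.0)                 # N*m
        g = yield_strength - moment / section_modulus
        failures += g < 0
    return failures / n_samples

print(f"Estimated probability of failure: {limit_state_failure_probability():.4f}")
```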
Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics
NASA Technical Reports Server (NTRS)
Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
Numerical simulation has now become an integral part of engineering design process. Critical design decisions are routinely made based on the simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design processes. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating numerical approximation error, computational model induced errors and the uncertainties contained in the mathematical models so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.
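One standard ingredient of such error estimation, sketched below, is Richardson extrapolation from solutions on three systematically refined grids, giving an observed order of convergence and a grid-convergence-index-style uncertainty estimate; the drag-coefficient values and safety factor are illustrative assumptions, not results from this work.

```python
import math

def discretization_error_estimate(f_fine, f_medium, f_coarse, refinement_ratio=2.0,
                                  safety_factor=1.25):
    """Richardson-extrapolation-based error estimate from three grid solutions
    (finest first), in the usual grid-convergence-index style of analysis."""
    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(refinement_ratio)
    f_extrapolated = f_fine + (f_fine - f_medium) / (refinement_ratio**p - 1.0)
    gci = safety_factor * abs((f_fine - f_medium) / f_fine) / (refinement_ratio**p - 1.0)
    return p, f_extrapolated, gci

# Illustrative drag-coefficient values from three systematically refined grids.
p, f_exact, gci = discretization_error_estimate(0.3145, 0.3180, 0.3320)
print(f"Observed order ~ {p:.2f}, extrapolated value ~ {f_exact:.4f}, GCI ~ {gci:.3%}")
```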
The Scaling of Performance and Losses in Miniature Internal Combustion Engines
2010-01-01
Methodologies are developed for making reliable measurements of engine performance and losses in these small engines, including volumetric, heat transfer, and combustion losses. Combustion losses are the most important challenge, as they account for 60-70% of total energy losses, followed in order of importance by heat transfer losses.
Fusion And Inference From Multiple And Massive Disparate Distributed Dynamic Data Sets
2017-07-01
The project developed a principled methodology for two-sample graph testing; designed a provably almost-surely perfect vertex clustering algorithm for block model graphs; and developed methodology for semi-supervised clustering and robust hypothesis testing. Embedding graphs in Euclidean space allows the full arsenal of statistical and machine learning methodology for multivariate Euclidean data to be deployed.
Han, Seung Zeon; Kang, Joonhee; Kim, Sung-Dae; Choi, Si-Young; Kim, Hyung Giun; Lee, Jehyun; Kim, Kwangho; Lim, Sung Hwan; Han, Byungchan
2015-01-01
We report that a single crystal Ni2Si nanowire (NW) of intermetallic compound can be reliably designed using simple three-step processes: casting a ternary Cu-Ni-Si alloy, nucleate and growth of Ni2Si NWs as embedded in the alloy matrix via designing discontinuous precipitation (DP) of Ni2Si nanoparticles and thermal aging, and finally chemical etching to decouple the Ni2Si NWs from the alloy matrix. By direct application of uniaxial tensile tests to the Ni2Si NW we characterize its mechanical properties, which were rarely reported in previous literatures. Using integrated studies of first principles density functional theory (DFT) calculations, high-resolution transmission electron microscopy (HRTEM), and energy-dispersive X-ray spectroscopy (EDX) we accurately validate the experimental measurements. Our results indicate that our simple three-step method enables to design brittle Ni2Si NW with high tensile strength of 3.0 GPa and elastic modulus of 60.6 GPa. We propose that the systematic methodology pursued in this paper significantly contributes to opening innovative processes to design various kinds of low dimensional nanomaterials leading to advancement of frontiers in nanotechnology and related industry sectors. PMID:26456769
Monakhova, Yulia B; Kohl-Himmelseher, Matthias; Kuballa, Thomas; Lachenmeier, Dirk W
2014-11-01
A fast and reliable nuclear magnetic resonance spectroscopic method for quantitative determination (qNMR) of targeted molecules in reference materials has been established using the ERETIC2 methodology (electronic reference to access in vivo concentrations) based on the PULCON principle (pulse length based concentration determination). The developed approach was validated for the analysis of pharmaceutical samples in the context of official medicines control, including ibandronic acid, amantadine, ambroxol and lercanidipine. The PULCON recoveries were above 94.3% and coefficients of variation (CVs) obtained by quantification of different targeted resonances ranged between 0.7% and 2.8%, demonstrating that the qNMR method is a precise tool for rapid quantification (approximately 15min) of reference materials and medicinal products. Generally, the values were within specification (certified values) provided by the manufactures. The results were in agreement with NMR quantification using an internal standard and validated reference HPLC analysis. The PULCON method was found to be a practical alternative with competitive precision and accuracy to the classical internal reference method and it proved to be applicable to different solvent conditions. The method can be recommended for routine use in medicines control laboratories, especially when the availability and costs of reference compounds are problematic. Copyright © 2014 Elsevier B.V. All rights reserved.
Guidelines for the Evaluation of Bilingual Education Programs.
ERIC Educational Resources Information Center
Cardoza, Desdemona
Principles of program evaluation research are outlined so that bilingual education program coordinators can conduct methodologically acceptable program evaluations. The three basic principles of evaluation research are: identification of the program participants, definition of the program intervention, and assessment of program effectiveness.…
Next level of board accountability in health care quality.
Pronovost, Peter J; Armstrong, C Michael; Demski, Renee; Peterson, Ronald R; Rothman, Paul B
2018-03-19
Purpose The purpose of this paper is to offer six principles that health system leaders can apply to establish a governance and management system for the quality of care and patient safety. Design/methodology/approach Leaders of a large academic health system set a goal of high reliability and formed a quality board committee in 2011 to oversee quality and patient safety everywhere care was delivered. Leaders of the health system and every entity, including inpatient hospitals, home care companies, and ambulatory services staff the committee. The committee works with the management for each entity to set and achieve quality goals. Through this work, the six principles emerged to address management structures and processes. Findings The principles are: ensure there is oversight for quality everywhere care is delivered under the health system; create a framework to organize and report the work; identify care areas where quality is ambiguous or underdeveloped (i.e. islands of quality) and work to ensure there is reporting and accountability for quality measures; create a consolidated quality statement similar to a financial statement; ensure the integrity of the data used to measure and report quality and safety performance; and transparently report performance and create an explicit accountability model. Originality/value This governance and management system for quality and safety functions similar to a finance system, with quality performance documented and reported, data integrity monitored, and accountability for performance from board to bedside. To the authors' knowledge, this is the first description of how a board has taken this type of systematic approach to oversee the quality of care.
Steenkamer, Betty; Baan, Caroline; Putters, Kim; van Oers, Hans; Drewes, Hanneke
2018-04-09
Purpose A range of strategies to improve pharmaceutical care has been implemented by population health management (PHM) initiatives. However, which strategies generate the desired outcomes is largely unknown. The purpose of this paper is to identify guiding principles underlying collaborative strategies to improve pharmaceutical care and the contextual factors and mechanisms through which these principles operate. Design/methodology/approach The evaluation was informed by a realist methodology examining the links between PHM strategies, their outcomes and the contexts and mechanisms by which these strategies operate. Guiding principles were identified by grouping context-specific strategies with specific outcomes. Findings In total, ten guiding principles were identified: create agreement and commitment based on a long-term vision; foster cooperation and representation at the board level; use layered governance structures; create awareness at all levels; enable interpersonal links at all levels; create learning environments; organize shared responsibility; adjust financial strategies to market contexts; organize mutual gains; and align regional agreements with national policies and regulations. Contextual factors such as shared savings influenced the effectiveness of the guiding principles. Mechanisms by which these guiding principles operate were, for instance, fostering trust and creating a shared sense of the problem. Practical implications The guiding principles highlight how collaboration can be stimulated to improve pharmaceutical care while taking into account local constraints and possibilities. The interdependency of these principles necessitates effectuating them together in order to realize the best possible improvements and outcomes. Originality/value This is the first study using a realist approach to understand the guiding principles underlying collaboration to improve pharmaceutical care.
Score Reliability: A Retrospective Look Back at 12 Years of Reliability Generalization Studies
ERIC Educational Resources Information Center
Vacha-Haase, Tammi; Thompson, Bruce
2011-01-01
The present study was conducted to characterize (a) the features of the thousands of primary reports synthesized in 47 reliability generalization (RG) measurement meta-analysis studies and (b) typical methodological practice within the RG literature to date. With respect to the treatment of score reliability in the literature, in an astounding…
Minimum Control Requirements for Advanced Life Support Systems
NASA Technical Reports Server (NTRS)
Boulange, Richard; Jones, Harry
2002-01-01
Advanced control technologies are not necessary for the safe, reliable and continuous operation of Advanced Life Support (ALS) systems. ALS systems can be, and are, adequately controlled by simple, reliable, low-level methodologies and algorithms. The automation provided by advanced control technologies is claimed to decrease system mass and necessary crew time by reducing buffer size and minimizing crew involvement. In truth, these approaches increase control system complexity without clearly demonstrating an increase in reliability across the ALS system. Unless these systems are as reliable as the hardware they control, there are no savings to be had. A baseline ALS system is presented with the minimal control system required for its continuous safe reliable operation. This baseline control system uses simple algorithms and scheduling methodologies and relies on human intervention only in the event of failure of the redundant backup equipment. This ALS system architecture is designed for reliable operation, with minimal components and minimal control system complexity. The fundamental design precept followed is "If it isn't there, it can't fail".
NASA Technical Reports Server (NTRS)
Schlegel, R. G.
1982-01-01
It is important for industry and NASA to assess the status of acoustic design technology for predicting and controlling helicopter external noise in order for a meaningful research program to be formulated which will address this problem. The prediction methodologies available to the designer and the acoustic engineer are three-fold. First is what has been described as a first principle analysis. This analysis approach attempts to remove any empiricism from the analysis process and deals with a theoretical mechanism approach to predicting the noise. The second approach attempts to combine first principle methodology (when available) with empirical data to formulate source predictors which can be combined to predict vehicle levels. The third is an empirical analysis, which attempts to generalize measured trends into a vehicle noise prediction method. This paper will briefly address each.
ERIC Educational Resources Information Center
Echeverria, Alejandro; Barrios, Enrique; Nussbaum, Miguel; Amestica, Matias; Leclerc, Sandra
2012-01-01
Computer simulations combined with games have been successfully used to teach conceptual physics. However, there is no clear methodology for guiding the design of these types of games. To remedy this, we propose a structured methodology for the design of conceptual physics games that explicitly integrates the principles of the intrinsic…
Seductive Details in Multimedia Messages
ERIC Educational Resources Information Center
Rey, Gunter Daniel
2011-01-01
The seductive detail principle asserts that people learn more deeply from a multimedia presentation when interesting but irrelevant adjuncts are excluded rather than included. However, critics could argue that studies about this principle contain methodological problems. The recent experiment attempts to overcome these problems. Students (N = 108)…
Reflexive Principlism as an Effective Approach for Developing Ethical Reasoning in Engineering.
Beever, Jonathan; Brightman, Andrew O
2016-02-01
An important goal of teaching ethics to engineering students is to enhance their ability to make well-reasoned ethical decisions in their engineering practice: a goal in line with the stated ethical codes of professional engineering organizations. While engineering educators have explored a wide range of methodologies for teaching ethics, a satisfying model for developing ethical reasoning skills has not been adopted broadly. In this paper we argue that a principlist-based approach to ethical reasoning is uniquely suited to engineering ethics education. Reflexive Principlism is an approach to ethical decision-making that focuses on internalizing a reflective and iterative process of specification, balancing, and justification of four core ethical principles in the context of specific cases. In engineering, that approach provides structure to ethical reasoning while allowing the flexibility for adaptation to varying contexts through specification. Reflexive Principlism integrates well with the prevalent and familiar methodologies of reasoning within the engineering disciplines as well as with the goals of engineering ethics education.
Reliability analysis of composite structures
NASA Technical Reports Server (NTRS)
Kan, Han-Pin
1992-01-01
A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
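A minimal sketch of the numerical-integration step described above is given below for a generic stress-strength formulation, R = P(strength > stress); the distributions and parameter values are illustrative only, and SciPy is assumed to be available.

```python
from scipy import integrate, stats

def structural_reliability(stress_dist, strength_dist):
    """R = P(strength > stress) = integral of f_stress(x) * [1 - F_strength(x)] dx,
    evaluated numerically for arbitrary continuous distributions."""
    integrand = lambda x: stress_dist.pdf(x) * strength_dist.sf(x)
    lo = min(stress_dist.ppf(1e-9), strength_dist.ppf(1e-9))
    hi = max(stress_dist.ppf(1 - 1e-9), strength_dist.ppf(1 - 1e-9))
    r, _ = integrate.quad(integrand, lo, hi)
    return r

# Illustrative laminate example: normal applied stress, lognormal strength (MPa).
stress = stats.norm(loc=300, scale=45)
strength = stats.lognorm(s=0.10, scale=480)
print(f"Reliability = {structural_reliability(stress, strength):.5f}")
```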
On the methodology of Engineering Geodesy
NASA Astrophysics Data System (ADS)
Brunner, Fritz K.
2007-09-01
Textbooks on geodetic surveying usually describe a very small number of principles which should provide the foundation of geodetic surveying. Here, the author argues that an applied field, such as engineering geodesy, has a methodology as foundation rather than a few principles. Ten methodological elements (ME) are identified: (1) Point discretisation of natural surfaces and objects, (2) distinction between coordinate and observation domain, (3) definition of reference systems, (4) specification of unknown parameters and desired precisions, (5) geodetic network and observation design, (6) quality control of equipment, (7) quality control of measurements, (8) establishment of measurement models, (9) establishment of parameter estimation models, (10) quality control of results. Each ME consists of a suite of theoretical developments, geodetic techniques and calculation procedures, which will be discussed. This paper is to be considered a first attempt at identifying the specific elements of the methodology of engineering geodesy. A better understanding of this methodology could lead to an increased objectivity, to a transformation of subjective practical experiences into objective working methods, and consequently to a new structure for teaching this rather diverse subject.
ERIC Educational Resources Information Center
Nordstrum, Lee E.; LeMahieu, Paul G.; Dodd, Karen
2017-01-01
Purpose: This paper is one of seven in this volume elaborating different approaches to quality improvement in education. This paper aims to delineate a methodology called Deliverology. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study showing an application of Deliverology in the…
Fracture mechanics approach to estimate rail wear limits
DOT National Transportation Integrated Search
2009-10-01
This paper describes a systematic methodology to estimate allowable limits for rail head wear in terms of vertical head-height loss, gage-face side wear, and/or the combination of the two. This methodology is based on the principles of engineering fr...
The Pursuit of Chronically Reliable Neural Interfaces: A Materials Perspective.
Guo, Liang
2016-01-01
Brain-computer interfaces represent one of the most astonishing technologies in our era. However, the grand challenge of chronic instability and limited throughput of the electrode-tissue interface has significantly hindered the further development and ultimate deployment of such exciting technologies. A multidisciplinary research workforce has been called upon to respond to this engineering need. In this paper, I briefly review this multidisciplinary pursuit of chronically reliable neural interfaces from a materials perspective by analyzing the problem, abstracting the engineering principles, and summarizing the corresponding engineering strategies. I further draw my future perspectives by extending the proposed engineering principles.
Digital Learning Characteristics and Principles of Information Resources Knowledge Structuring
ERIC Educational Resources Information Center
Belichenko, Margarita; Davidovitch, Nitza; Kravchenko, Yuri
2017-01-01
Analysis of the principles of knowledge representation in information systems has shown the need to improve the structuring of knowledge. This need is driven by the development of software components and the new possibilities of information technologies. The article combines methodological aspects of structuring knowledge and effective usage of information…
Influence: A Key to Successful Leadership
ERIC Educational Resources Information Center
Hoy, Wayne K.; Smith, Page A.
2007-01-01
Purpose: The purpose of this article is to examine and condense the literature on influence and persuasion. Design/methodology/approach: The article identifies basic principles of influence in the theoretical and research literature, which are supported by empirical study. Findings: Ten principles of influence were identified, empirical support…
Evaluating Course Design Principles for Multimedia Learning Materials
ERIC Educational Resources Information Center
Scott, Bernard; Cong, Chunyu
2010-01-01
Purpose: This paper aims to report on evaluation studies of principles of course design for interactive multimedia learning materials. Design/methodology/approach: At the Defence Academy of the UK, Cranfield University has worked with military colleagues to produce multimedia learning materials for courses on "Military Knowledge". The…
NASA Technical Reports Server (NTRS)
Vesely, William E.; Colon, Alfredo E.
2010-01-01
Design Safety/Reliability is associated with the probability of no failure-causing faults existing in a design. Confidence in the non-existence of failure-causing faults is increased by performing tests with no failure. Reliability-Growth testing requirements are based on initial assurance and fault detection probability. Using binomial tables generally gives too many required tests compared to reliability-growth requirements. Reliability-Growth testing requirements are based on reliability principles and factors and should be used.
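For context, the classical binomial (success-run) calculation that the abstract contrasts with reliability-growth-based requirements is sketched below; the reliability goal and confidence level are illustrative.

```python
import math

def zero_failure_tests(reliability_goal, confidence):
    """Number of consecutive failure-free tests needed to demonstrate a
    reliability goal at a given confidence (classical success-run formula)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability_goal))

# Example: demonstrating R = 0.99 at 90% confidence by testing alone.
print(zero_failure_tests(reliability_goal=0.99, confidence=0.90))   # -> 230 tests
```

The large test counts this formula produces for high reliability goals illustrate why purely binomial demonstration is often impractical compared with growth-based requirements.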
Human error identification for laparoscopic surgery: Development of a motion economy perspective.
Al-Hakim, Latif; Sevdalis, Nick; Maiping, Tanaphon; Watanachote, Damrongpan; Sengupta, Shomik; Dissaranan, Charuspong
2015-09-01
This study postulates that traditional human error identification techniques fail to consider motion economy principles and, accordingly, their applicability in operating theatres may be limited. This study addresses this gap in the literature with a dual aim. First, it identifies the principles of motion economy that suit the operative environment and second, it develops a new error mode taxonomy for human error identification techniques which recognises motion economy deficiencies affecting the performance of surgeons and predisposing them to errors. A total of 30 principles of motion economy were developed and categorised into five areas. A hierarchical task analysis was used to break down main tasks of a urological laparoscopic surgery (hand-assisted laparoscopic nephrectomy) to their elements and the new taxonomy was used to identify errors and their root causes resulting from violation of motion economy principles. The approach was prospectively tested in 12 observed laparoscopic surgeries performed by 5 experienced surgeons. A total of 86 errors were identified and linked to the motion economy deficiencies. Results indicate the developed methodology is promising. Our methodology allows error prevention in surgery and the developed set of motion economy principles could be useful for training surgeons on motion economy principles. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Bulk Fuel Pricing: DOD Needs to Take Additional Actions to Establish a More Reliable Methodology
2015-11-19
Each fiscal year, the Office of the Under Secretary of Defense (Comptroller), in coordination with the Defense Logistics Agency, sets a standard price per barrel...
Allocating SMART Reliability and Maintainability Goals to NASA Ground Systems
NASA Technical Reports Server (NTRS)
Gillespie, Amanda; Monaghan, Mark
2013-01-01
This paper will describe the methodology used to allocate Reliability and Maintainability (R&M) goals to Ground Systems Development and Operations (GSDO) subsystems currently being designed or upgraded.
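As a simple point of reference for such an allocation (not the GSDO method itself), the sketch below applies equal apportionment of a series-system reliability goal across subsystems; the goal and subsystem count are illustrative assumptions.

```python
def equal_apportionment(system_reliability_goal, n_subsystems):
    """Allocate a series-system reliability goal equally across subsystems:
    each subsystem must meet R_i = R_sys ** (1 / n)."""
    return system_reliability_goal ** (1.0 / n_subsystems)

# Example: a 0.98 system-level goal split across 4 series ground subsystems.
r_sub = equal_apportionment(0.98, 4)
print(f"Each subsystem goal: {r_sub:.5f}")          # ~0.99496
print(f"Check: {r_sub ** 4:.5f}")                   # recovers ~0.98
```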
Djulbegovic, Benjamin; Cantor, Alan; Clarke, Mike
2003-01-01
Previous research has identified methodological problems in the design and conduct of randomized trials that could, if left unaddressed, lead to biased results. In this report we discuss one such problem, inadequate control intervention, and argue that it can be by far the most important design characteristic of randomized trials in overestimating the effect of new treatments. Current guidelines for the design and reporting of randomized trials, such as the Consolidated Standards of Reporting Trials (CONSORT) statement, do not address the choice of the comparator intervention. We argue that an adequate control intervention can be selected if people designing a trial explicitly take into consideration the ethical principle of equipoise, also known as "the uncertainty principle."
Methodologies for Crawler Based Web Surveys.
ERIC Educational Resources Information Center
Thelwall, Mike
2002-01-01
Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62…
Multiple Fingers - One Gestalt.
Lezkan, Alexandra; Manuel, Steven G; Colgate, J Edward; Klatzky, Roberta L; Peshkin, Michael A; Drewing, Knut
2016-01-01
The Gestalt theory of perception offered principles by which distributed visual sensations are combined into a structured experience ("Gestalt"). We demonstrate conditions whereby haptic sensations at two fingertips are integrated in the perception of a single object. When virtual bumps were presented simultaneously to the right hand's thumb and index finger during lateral arm movements, participants reported perceiving a single bump. A discrimination task measured the bump's perceived location and perceptual reliability (assessed by differential thresholds) for four finger configurations, which varied in their adherence to the Gestalt principles of proximity (small versus large finger separation) and synchrony (virtual spring to link movements of the two fingers versus no spring). According to models of integration, reliability should increase with the degree to which multi-finger cues integrate into a unified percept. Differential thresholds were smaller in the virtual-spring condition (synchrony) than when fingers were unlinked. Additionally, in the condition with reduced synchrony, greater proximity led to lower differential thresholds. Thus, with greater adherence to Gestalt principles, thresholds approached values predicted for optimal integration. We conclude that the Gestalt principles of synchrony and proximity apply to haptic perception of surface properties and that these principles can interact to promote multi-finger integration.
26 CFR 1.482-1 - Allocation of income and deductions among taxpayers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... section sets forth general principles and guidelines to be followed under section 482. Section 1.482-2... practices, economic principles, or statistical analyses. The extent and reliability of any adjustments will..., extraction, and assembly; (E) Purchasing and materials management; (F) Marketing and distribution functions...
Principles of Successful Implementation of Lecture Recordings in Higher Education
ERIC Educational Resources Information Center
Ollermann, Frank; Rolf, Rüdiger; Greweling, Christian; Klaßen, André
2017-01-01
Purpose: This paper aims to describe the principles underlying the successful implementation of a lecture recording service in higher education. Design/methodology/approach: The paper qualitatively reviews the practices and experiences of several years of automated lecture recording at a medium-sized university in Germany. Findings: The paper…
ERIC Educational Resources Information Center
Fetterman, David M.
The most important distinction between evaluation (in the psychometric tradition), ethnography, and auditing is that they are guided by three distinctively separate principles. The underlying principle guiding evaluation is assessment. Ethnography is guided by description. Auditing uses description and assessment to establish an opinion on…
ERIC Educational Resources Information Center
Roessger, Kevin M.
2012-01-01
The philosophy of radical behaviourism remains misunderstood within the field of adult education. Contributing to this trend is the field's homogeneous behaviourist interpretation, which attributes methodological behaviourism's principles to radical behaviourism. The guiding principles and assumptions of radical behaviourism are examined to…
The Macro and Micro Structure of the Foreign Language Curriculum.
ERIC Educational Resources Information Center
Politzer, Robert L.
A representative six-year foreign language curriculum (grades 7-12) is analyzed and noted to be based on several underlying principles. Difficulties which arise from rigid adherence to certain methodological principles are discussed, particularly those concerning the relationship between audiolingual and writing skills. Suggestions on ways to…
Assessing Financial Education Methods: Principles vs. Rules-of-Thumb Approaches
ERIC Educational Resources Information Center
Skimmyhorn, William L.; Davies, Evan R.; Mun, David; Mitchell, Brian
2016-01-01
Despite thousands of programs and tremendous public and private interest in improving financial decision-making, little is known about how best to teach financial education. Using an experimental approach, the authors estimated the effects of two different education methodologies (principles-based and rules-of-thumb) on the knowledge,…
Implementing the Sustainable Development Goals at University Level
ERIC Educational Resources Information Center
Albareda-Tiana, Silvia; Vidal-Raméntol, Salvador; Fernández-Morilla, Mónica
2018-01-01
Purpose: The purpose of this case study is to explore the principles and practices of sustainable development (SD) in the university curriculum. Design/methodology/approach: To explore the principles linked with the sustainable development goals (SDGs) and the learning and teaching practices in sustainability at the International University of…
Sign Language and the Brain: A Review
ERIC Educational Resources Information Center
Campbell, Ruth; MacSweeney, Mairead; Waters, Dafydd
2008-01-01
How are signed languages processed by the brain? This review briefly outlines some basic principles of brain structure and function and the methodological principles and techniques that have been used to investigate this question. We then summarize a number of different studies exploring brain activity associated with sign language processing…
Improving patient care in cardiac surgery using Toyota production system based methodology.
Culig, Michael H; Kunkle, Richard F; Frndak, Diane C; Grunden, Naida; Maher, Thomas D; Magovern, George J
2011-02-01
A new cardiac surgery program was developed in a community hospital setting using the operational excellence (OE) method, which is based on the principles of the Toyota production system. The initial results of the first 409 heart operations, performed over the 28 months between March 1, 2008, and June 30, 2010, are presented. Operational excellence methodology was taught to the cardiac surgery team. Coaching started 2 months before the opening of the program and continued for 24 months. Of the 409 cases presented, 253 were isolated coronary artery bypass graft operations. One operative death occurred. According to the database maintained by The Society of Thoracic Surgeons, the risk-adjusted operative mortality rate was 61% lower than the regional rate. Likewise, the risk-adjusted rate of major complications was 57% lower than The Society of Thoracic Surgeons regional rate. Daily solution to determine cause was attempted on 923 distinct perioperative problems by all team members. Using the cost of complications as described by Speir and coworkers, avoiding predicted complications resulted in a savings of at least $884,900 as compared with the regional average. By the systematic use of a real time, highly formatted problem-solving methodology, processes of care improved daily. Using carefully disciplined teamwork, reliable implementation of evidence-based protocols was realized by empowering the front line to make improvements. Low rates of complications were observed, and a cost savings of $3,497 per each case of isolated coronary artery bypass graft was realized. Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Psychometric Principles in Measurement for Geoscience Education Research: A Climate Change Example
NASA Astrophysics Data System (ADS)
Libarkin, J. C.; Gold, A. U.; Harris, S. E.; McNeal, K.; Bowles, R.
2015-12-01
Understanding learning in geoscience classrooms requires that we use valid and reliable instruments aligned with intended learning outcomes. Nearly one hundred instruments assessing conceptual understanding in undergraduate science and engineering classrooms (often called concept inventories) have been published and are actively being used to investigate learning. The techniques used to develop these instruments vary widely, often with little attention to psychometric principles of measurement. This paper will discuss the importance of using psychometric principles to design, evaluate, and revise research instruments, with particular attention to the validity and reliability steps that must be undertaken to ensure that research instruments are providing meaningful measurement. An example from a climate change inventory developed by the authors will be used to exemplify the importance of validity and reliability, including the value of item response theory for instrument development. A 24-item instrument was developed based on published items, conceptions research, and instructor experience. Rasch analysis of over 1000 responses provided evidence for the removal of 5 items for misfit and one item for potential bias as measured via differential item functioning. The resulting 18-item instrument can be considered a valid and reliable measure based on pre- and post-implementation metrics. Consideration of the relationship between respondent demographics and concept inventory scores provides unique insight into the relationship between gender, religiosity, values and climate change understanding.
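For readers unfamiliar with the Rasch analysis mentioned above, the sketch below evaluates the one-parameter logistic (Rasch) response probability across a small set of illustrative item difficulties; the values are hypothetical and unrelated to the inventory's calibration.

```python
import math

def rasch_probability(ability, difficulty):
    """Rasch (one-parameter logistic) probability of a correct response:
    P = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# Illustrative item difficulties (logits) for a small item set.
difficulties = [-1.2, -0.4, 0.0, 0.8, 1.5]
for theta in (-1.0, 0.0, 1.0):
    probs = [rasch_probability(theta, b) for b in difficulties]
    print(f"theta = {theta:+.1f}: {[round(p, 2) for p in probs]}")
```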
Development of reliable pavement models.
DOT National Transportation Integrated Search
2011-05-01
The current report proposes a framework for estimating the reliability of a given pavement structure as analyzed by : the Mechanistic-Empirical Pavement Design Guide (MEPDG). The methodology proposes using a previously fit : response surface, in plac...
Sabour, Siamak
2018-03-08
The purpose of this letter, in response to Hall, Mehta, and Fackrell (2017), is to provide important knowledge about methodological and statistical issues in assessing the reliability and validity of an audiologist-administered tinnitus loudness matching test and a patient-reported tinnitus loudness rating. The author draws on reference textbooks and published articles on the scientific assessment of test validity and reliability to discuss the statistical tests and methodological approach appropriate for assessing validity and reliability in clinical research. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess reliability and validity. For qualitative variables, sensitivity, specificity, positive and negative predictive values, false-positive and false-negative rates, positive and negative likelihood ratios, and the odds ratio (i.e., the ratio of true to false results) are the most appropriate estimates for evaluating the validity of a test against a gold standard. For quantitative variables, depending on the distribution of the variable, the Pearson r or Spearman rho can be applied. Diagnostic accuracy (validity) and diagnostic precision (reliability or agreement) are two completely different methodological issues.
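As a brief worked illustration of the qualitative-variable estimates listed in this abstract, the following sketch computes them from a 2x2 table of test results against a gold standard; the counts are invented for the example.

```python
# Validity estimates from a 2x2 table: true/false positives and negatives.

def diagnostic_validity(tp, fp, fn, tn):
    sens = tp / (tp + fn)                      # sensitivity
    spec = tn / (tn + fp)                      # specificity
    return {
        "sensitivity": sens,
        "specificity": spec,
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "false_negative_rate": 1 - sens,
        "false_positive_rate": 1 - spec,
        "LR+": sens / (1 - spec),
        "LR-": (1 - sens) / spec,
        "diagnostic_odds_ratio": (tp * tn) / (fp * fn),
    }

if __name__ == "__main__":
    for name, value in diagnostic_validity(tp=80, fp=10, fn=20, tn=90).items():
        print(f"{name:>22}: {value:.2f}")
```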
A Study in Sexual Health Applying the Principles of Community-Based Participatory Research
Reece, Michael; Dodge, Brian
2012-01-01
The principles of community-based participatory research were applied to an exploratory sexual health study that examined “cruising for sex” among men on a college campus. In the context of a study seeking a broad interpretation of the health implications of cruising, and when faced with methodological challenges, the researchers found these principles to provide invaluable guidance. A review of the research process is offered and the manner in which the principles of community-based participatory research were operationalized for this study is described. PMID:15129042
HTGR plant availability and reliability evaluations. Volume I. Summary of evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cadwallader, G.J.; Hannaman, G.W.; Jacobsen, F.K.
1976-12-01
The report (1) describes a reliability assessment methodology for systematically locating and correcting areas which may contribute to unavailability of new and uniquely designed components and systems, (2) illustrates the methodology by applying it to such components in a high-temperature gas-cooled reactor (Public Service Company of Colorado's Fort St. Vrain 330-MW(e) HTGR), and (3) compares the results of the assessment with actual experience. The methodology can be applied to any component or system; however, it is particularly valuable for assessments of components or systems which provide essential functions, or the failure or mishandling of which could result in relatively large economic losses.
Uher, Jana
2015-12-01
Taxonomic "personality" models are widely used in research and applied fields. This article applies the Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) to scrutinise the three methodological steps that are required for developing comprehensive "personality" taxonomies: 1) the approaches used to select the phenomena and events to be studied, 2) the methods used to generate data about the selected phenomena and events and 3) the reduction principles used to extract the "most important" individual-specific variations for constructing "personality" taxonomies. Analyses of some currently popular taxonomies reveal frequent mismatches between the researchers' explicit and implicit metatheories about "personality" and the abilities of previous methodologies to capture the particular kinds of phenomena toward which they are targeted. Serious deficiencies that preclude scientific quantifications are identified in standardised questionnaires, psychology's established standard method of investigation. These mismatches and deficiencies derive from the lack of an explicit formulation and critical reflection on the philosophical and metatheoretical assumptions being made by scientists and from the established practice of radically matching the methodological tools to researchers' preconceived ideas and to pre-existing statistical theories rather than to the particular phenomena and individuals under study. These findings raise serious doubts about the ability of previous taxonomies to appropriately and comprehensively reflect the phenomena towards which they are targeted and the structures of individual-specificity occurring in them. The article elaborates and illustrates with empirical examples methodological principles that allow researchers to appropriately meet the metatheoretical requirements and that are suitable for comprehensively exploring individuals' "personality".
Fatigue criterion to system design, life and reliability
NASA Technical Reports Server (NTRS)
Zaretsky, E. V.
1985-01-01
A generalized methodology for structural life prediction, design, and reliability based upon a fatigue criterion is advanced. The life prediction methodology is based in part on the work of W. Weibull and of G. Lundberg and A. Palmgren. The approach incorporates the computed lives of elemental stress volumes of a complex machine element to predict system life. The results of coupon fatigue testing can be incorporated into the analysis, allowing for life prediction and component or structural renewal rates with reasonable statistical certainty.
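A minimal sketch of the series-system idea behind this approach, combining Weibull-distributed lives of elemental stress volumes into a system life, is given below. It is an illustration in the spirit of Lundberg-Palmgren, not the NASA methodology itself, and all parameter values are assumed.

```python
import numpy as np

# Combining elemental Weibull lives into a series-system life.
# L10 = time at 90% survival; a common Weibull slope is assumed for all elements.

def system_survival(time, element_l10, weibull_slope):
    """Survival probability of a series system of independent elements."""
    element_l10 = np.asarray(element_l10, dtype=float)
    # Two-parameter Weibull survival per element, anchored at the L10 life:
    # S(t) = 0.9 ** ((t / L10) ** e)
    log_s = np.log(0.9) * (time / element_l10) ** weibull_slope
    return np.exp(np.sum(log_s))            # series system: product of element survivals

def system_l10(element_l10, weibull_slope):
    """Closed-form L10 of the series system for a common Weibull slope."""
    element_l10 = np.asarray(element_l10, dtype=float)
    return np.sum(element_l10 ** (-weibull_slope)) ** (-1.0 / weibull_slope)

if __name__ == "__main__":
    l10s = [12000.0, 18000.0, 30000.0]       # assumed elemental L10 lives, hours
    e = 1.5                                  # assumed Weibull slope
    print("system L10 ~", round(system_l10(l10s, e), 1), "hours")
    print("system survival at 5000 h ~", round(system_survival(5000.0, l10s, e), 4))
```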
Methodology for Software Reliability Prediction. Volume 1.
1987-11-01
Application categories considered include manned and unmanned spacecraft, batch systems, airborne avionics, event control, and real-time closed-loop operations. A Software Reliability Measurement Framework was established which spans the life cycle of a software system and includes the specification, prediction, estimation, and assessment of software reliability. Data from 59 systems, representing over 5 million lines of code, were...
Herrera-May, Agustín Leobardo; Soler-Balcazar, Juan Carlos; Vázquez-Leal, Héctor; Martínez-Castillo, Jaime; Vigueras-Zuñiga, Marco Osvaldo; Aguilera-Cortés, Luz Antonio
2016-08-24
Microelectromechanical systems (MEMS) resonators have allowed the development of magnetic field sensors with potential applications in biomedicine, the automotive industry, navigation systems, space satellites, telecommunications and non-destructive testing. We present a review of recent magnetic field sensors based on MEMS resonators, which operate with Lorentz force. These sensors have a compact structure, wide measurement range, low energy consumption, high sensitivity and suitable performance. The design methodology, simulation tools, damping sources, sensing techniques and future applications of magnetic field sensors are discussed. The design process is fundamental in achieving correct selection of the operation principle, sensing technique, materials, fabrication process and readout systems of the sensors. In addition, the main sensing systems and challenges of the MEMS sensors are discussed. To develop the best devices, research on their mechanical reliability, vacuum packaging, design optimization and temperature compensation circuits is needed. Future applications will require multifunctional sensors for monitoring several physical parameters (e.g., magnetic field, acceleration, angular rate, humidity, temperature and gases).
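To make the Lorentz-force operating principle concrete, here is a minimal, hypothetical sketch of the displacement produced by a current-carrying microbeam resonating in a magnetic field; the numerical parameters are assumptions for illustration, not values from the review.

```python
# Lorentz-force sensing principle: F = B * I * L on a current-carrying beam;
# at resonance the displacement is amplified by the quality factor Q.

def lorentz_displacement(B, current, length, stiffness, quality_factor):
    """Resonant displacement amplitude (m) of a Lorentz-force MEMS resonator."""
    force = B * current * length               # Lorentz force on the beam (N)
    return quality_factor * force / stiffness  # Q-amplified static deflection

if __name__ == "__main__":
    # assumed parameters of a notional silicon microbeam
    B = 100e-6        # magnetic field, T (Earth-field scale)
    I = 1e-3          # drive current, A
    L = 500e-6        # beam length, m
    k = 10.0          # effective stiffness, N/m
    Q = 1000.0        # quality factor under vacuum packaging
    x = lorentz_displacement(B, I, L, k, Q)
    print(f"displacement amplitude ~ {x * 1e12:.0f} pm at 100 uT")
```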
Cybersecurity for distributed energy resources and smart inverters
Qi, Junjian; Hahn, Adam; Lu, Xiaonan; ...
2016-12-01
The increased penetration of distributed energy resources (DER) will significantly increase the number of devices that are owned and controlled by consumers and third parties. These devices have a significant dependency on digital communication and control, which presents a growing risk from cyber attacks. This paper proposes a holistic attack-resilient framework to protect the integrated DER and the critical power grid infrastructure from malicious cyber attacks, helping ensure the secure integration of DER without harming the grid reliability and stability. Specifically, we discuss the architecture of the cyber-physical power system with a high penetration of DER and analyze the unique cybersecurity challenges introduced by DER integration. Next, we summarize important attack scenarios against DER, propose a systematic DER resilience analysis methodology, and develop effective and quantifiable resilience metrics and design principles. Lastly, we introduce attack prevention, detection, and response measures specifically designed for DER integration across cyber, physical device, and utility layers of the future smart grid.
A practical guide to surveys and questionnaires.
Slattery, Eric L; Voelker, Courtney C J; Nussenbaum, Brian; Rich, Jason T; Paniello, Randal C; Neely, J Gail
2011-06-01
Surveys with questionnaires play a vital role in decision and policy making in society. Within medicine, including otolaryngology, surveys with questionnaires may be the only method for gathering data on rare or unusual events. In addition, questionnaires can be developed and validated to be used as outcome measures in clinical trials and other clinical research architecture. Consequently, it is fundamentally important that such tools be properly developed and validated. Just asking questions that have not gone through rigorous design and development may be misleading and unfair at best; at worst, they can result in under- or overtreatment and unnecessary expense. Furthermore, it is important that consumers of the data produced by these instruments understand the principles of questionnaire design to interpret results in an optimal and meaningful way. This article presents a practical guide for understanding the methodologies of survey and questionnaire design, including the concepts of validity and reliability, how surveys are administered and implemented, and, finally, biases and pitfalls of surveys.
Cerveri, Pietro; Manzotti, Alfonso; Confalonieri, Norberto; Baroni, Guido
2014-12-01
Personalized resection guides (PRG) have been recently proposed in the domain of knee replacement, demonstrating clinical outcome similar or even superior to both manual and navigated interventions. Among the mandatory pre-surgical steps for PRG prototyping, the measurement of clinical landmarks (CL) on the bony surfaces is recognized as a key issue due to lack of standardized methodologies, operator-dependent variability and time expenditure. In this paper, we focus on the reliability and repeatability of an anterior-posterior axis, also known as Whiteside line (WL), of the distal femur proposing automatic surface processing and modeling methods aimed at overcoming some of the major concerns related to the manual identification of such CL on 2D images and 3D models. We show that the measurement of WL, exploiting the principle of mean-shifting surface curvature, is highly repeatable and coherent with clinical knowledge. Copyright © 2014 Elsevier Ltd. All rights reserved.
Ultra compact spectrometer using linear variable filters
NASA Astrophysics Data System (ADS)
Dami, M.; De Vidi, R.; Aroldi, G.; Belli, F.; Chicarella, L.; Piegari, A.; Sytchkova, A.; Bulir, J.; Lemarquis, F.; Lequime, M.; Abel Tibérini, L.; Harnisch, B.
2017-11-01
The Linearly Variable Filters (LVF) are complex optical devices that, integrated in a CCD, can realize a "single chip spectrometer". In the framework of an ESA study, a team of industries and institutes led by SELEX-Galileo explored the design principles and manufacturing techniques, realizing and characterizing LVF samples based both on All-Dielectric (AD) and Metal-Dielectric (MD) coating structures in the VNIR and SWIR spectral ranges. In particular, the achieved performances on spectral gradient, transmission bandwidth and Spectral Attenuation (SA) are presented and critically discussed, and potential improvements are highlighted. In addition, the results of a feasibility study of a SWIR Linear Variable Filter are presented, with a comparison of design predictions and measured performances. Finally, criticalities related to the filter-CCD packaging are discussed. The main achievements reached during these activities have been: to evaluate, by design, manufacturing and test of LVF samples, the achievable performances compared with target requirements; to evaluate the reliability of the projects by analyzing their repeatability; and to define suitable measurement methodologies.
Scott, P J; Rigby, M; Ammenwerth, E; McNair, J Brender; Georgiou, A; Hyppönen, H; de Keizer, N; Magrabi, F; Nykänen, P; Gude, W T; Hackl, W
2017-08-01
Objectives: To set the scientific context and then suggest principles for an evidence-based approach to secondary uses of clinical data, covering both evaluation of the secondary uses of data and evaluation of health systems and services based upon secondary uses of data. Method: Working Group review of selected literature and policy approaches. Results: We present important considerations in the evaluation of secondary uses of clinical data from the angles of governance and trust, theory, semantics, and policy. We make the case for a multi-level and multi-factorial approach to the evaluation of secondary uses of clinical data and describe a methodological framework for best practice. We emphasise the importance of evaluating the governance of secondary uses of health data in maintaining trust, which is essential for such uses. We also offer examples of the re-use of routine health data to demonstrate how it can support evaluation of clinical performance and optimize health IT system design. Conclusions: Great expectations are resting upon "Big Data" and innovative analytics. However, to build and maintain public trust, improve data reliability, and assure the validity of analytic inferences, there must be independent and transparent evaluation. A mature and evidence-based approach needs not merely data science, but must be guided by the broader concerns of applied health informatics. Georg Thieme Verlag KG Stuttgart.
A Computational Methodology to Support Reimbursement Requests Analysis Concerning Electrical Damages
NASA Astrophysics Data System (ADS)
Almeida Junior, Afonso Bernardino; Gondim, Isaque Nogueira; Rezende, Paulo Henrique Oliveira; Oliveira, José Carlos
2015-12-01
In light of the growing number of reimbursement requests from consumers for electrical damage to equipment, supposedly caused by anomalies on the power grid, reliable means are needed for reaching a decision on such claims. Because current procedures rely on reviews, information, and records of occurrences in the field, the conclusive advice or opinions issued have often proved inadequate and fragile. The search for mechanisms grounded in classical, accepted electrical engineering principles therefore presents an important challenge for placing the decision-making process on a sound scientific and technical footing. To meet these requirements, this study presents the principles that guided the software analysis, which aims above all to correlate cause and effect. The elaborated strategy involves modelling stages as well as studies aimed at: reproducing the distribution supply; characterizing the distribution network serving the complainant consumer; representing the diverse electro-electronic appliances; and, lastly, correlating the disturbances impacting the equipment with their dielectric and thermal supportability requirements. To illustrate the software process, an actual case study coupled with a loss and claim scenario is presented.
Reliability and precision of pellet-group counts for estimating landscape-level deer density
David S. deCalesta
2013-01-01
This study provides hitherto unavailable methodology for reliably and precisely estimating deer density within forested landscapes, enabling quantitative rather than qualitative deer management. Reliability and precision of the deer pellet-group technique were evaluated in 1 small and 2 large forested landscapes. Density estimates, adjusted to reflect deer harvest and...
Complexity, Representation and Practice: Case Study as Method and Methodology
ERIC Educational Resources Information Center
Miles, Rebecca
2015-01-01
While case study is considered a common approach to examining specific and particular examples in research disciplines such as law, medicine and psychology, in the social sciences case study is often treated as a lesser, flawed or undemanding methodology which is less valid, reliable or theoretically rigorous than other methodologies. Building on…
Casuistry and principlism: the convergence of method in biomedical ethics.
Kuczewski, M
1998-12-01
Casuistry and principlism are two of the leading contenders to be considered the methodology of bioethics. These methods may be incommensurable since the former emphasizes the examination of cases while the latter focuses on moral principles. Conversely, since both analyze cases in terms of mid-level principles, there is hope that these methods may be reconcilable or complementary. I analyze the role of principles in each and thereby show that these theories are virtually identical when interpreted in a certain light. That is, if the gaps in each method are filled by a concept of judgment or Aristotelian practical wisdom, these methods converge.
Methodology for nonwork travel analysis in suburban communities.
DOT National Transportation Integrated Search
1994-01-01
The increase in the number of nonwork trips during the past decade has contributed substantially to congestion and to environmental problems. Data collection methodologies, descriptive information, and reliable models of nonwork travel behavior are n...
Hierarchical specification of the SIFT fault tolerant flight control system
NASA Technical Reports Server (NTRS)
Melliar-Smith, P. M.; Schwartz, R. L.
1981-01-01
The specification and mechanical verification of the Software Implemented Fault Tolerance (SIFT) flight control system is described. The methodology employed in the verification effort is discussed, and a description of the hierarchical models of the SIFT system is given. To meet the objective of NASA for the reliability of safety critical flight control systems, the SIFT computer must achieve a reliability well beyond the levels at which reliability can actually be measured. The methodology employed to demonstrate rigorously that the SIFT computer meets its reliability requirements is described. The hierarchy of design specifications from very abstract descriptions of system function down to the actual implementation is explained. The most abstract design specifications can be used to verify that the system functions correctly and with the desired reliability, since almost all details of the realization were abstracted out. A succession of lower level models refine these specifications to the level of the actual implementation, and can be used to demonstrate that the implementation has the properties claimed of the abstract design specifications.
Psychometric evaluation of commonly used game-specific skills tests in rugby: A systematic review
Oorschot, Sander; Chiwaridzo, Matthew; CM Smits-Engelsman, Bouwien
2017-01-01
Objectives To (1) give an overview of commonly used game-specific skills tests in rugby and (2) evaluate the available psychometric information on these tests. Methods The databases PubMed, MEDLINE, CINAHL and Africa Wide Information were systematically searched for articles published between January 1995 and March 2017. First, commonly used game-specific skills tests were identified. Second, the available psychometrics of these tests were evaluated and the methodological quality of the studies assessed using the Consensus-based Standards for the selection of health Measurement Instruments checklist. Studies included in the first step had to report detailed information on the construct and testing procedure of at least one game-specific skill, and studies included in the second step additionally had to report at least one psychometric property evaluating reliability, validity or responsiveness. Results 287 articles were identified in the first step, of which 30 articles met the inclusion criteria; 64 articles were identified in the second step, of which 10 articles were included. Reactive agility, tackling and simulated rugby games were the most commonly used tests. All 10 studies reporting psychometrics reported reliability outcomes, revealing mainly strong evidence. However, all studies scored poor or fair on methodological quality. Four studies reported validity outcomes, in which mainly moderate evidence was indicated, but all articles had fair methodological quality. Conclusion Game-specific skills tests indicated mainly high reliability and validity evidence, but the studies lacked methodological quality. Reactive agility seems to be a promising domain, but the specific tests need further development. Future high methodological quality studies are required in order to develop valid and reliable test batteries for rugby talent identification. Trial registration number PROSPERO CRD42015029747. PMID:29259812
Efficient Network Coding-Based Loss Recovery for Reliable Multicast in Wireless Networks
NASA Astrophysics Data System (ADS)
Chi, Kaikai; Jiang, Xiaohong; Ye, Baoliu; Horiguchi, Susumu
Recently, network coding has been applied to the loss recovery of reliable multicast in wireless networks [19], where multiple lost packets are XOR-ed together as one packet and forwarded via single retransmission, resulting in a significant reduction of bandwidth consumption. In this paper, we first prove that maximizing the number of lost packets for XOR-ing, which is the key part of the available network coding-based reliable multicast schemes, is actually a complex NP-complete problem. To address this limitation, we then propose an efficient heuristic algorithm for finding an approximately optimal solution of this optimization problem. Furthermore, we show that the packet coding principle of maximizing the number of lost packets for XOR-ing sometimes cannot fully exploit the potential coding opportunities, and we then further propose new heuristic-based schemes with a new coding principle. Simulation results demonstrate that the heuristic-based schemes have very low computational complexity and can achieve almost the same transmission efficiency as the current coding-based high-complexity schemes. Furthermore, the heuristic-based schemes with the new coding principle not only have very low complexity, but also slightly outperform the current high-complexity ones.
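The coding principle described here, XOR-ing as many lost packets as possible while still letting every receiver decode, can be illustrated with a hedged greedy sketch (not the authors' exact heuristic): a lost packet joins the coded set only if no receiver would then be missing two packets of the set. The loss pattern in the example is invented.

```python
# Greedy selection of lost packets to XOR into one retransmission, such that
# every receiver is missing at most one packet of the coded set and can decode.

def greedy_xor_set(loss_table):
    """loss_table: dict packet_id -> set of receivers that lost that packet."""
    # consider the most widely lost packets first
    order = sorted(loss_table, key=lambda p: len(loss_table[p]), reverse=True)
    coded, covered = [], set()            # covered = receivers already missing one coded packet
    for pkt in order:
        losers = loss_table[pkt]
        if losers.isdisjoint(covered):    # no receiver would miss two coded packets
            coded.append(pkt)
            covered |= losers
    return coded                          # packets to XOR into a single retransmission

if __name__ == "__main__":
    losses = {                            # hypothetical loss pattern
        "p1": {"r1", "r2"},
        "p2": {"r3"},
        "p3": {"r1"},                     # conflicts with p1 at r1; left for a later round
        "p4": {"r4", "r5"},
    }
    print("XOR together:", greedy_xor_set(losses))
```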
Study on fast discrimination of varieties of yogurt using Vis/NIR-spectroscopy
NASA Astrophysics Data System (ADS)
He, Yong; Feng, Shuijuan; Deng, Xunfei; Li, Xiaoli
2006-09-01
A new approach for discriminating varieties of yogurt by means of Vis/NIR spectroscopy is presented in this paper. Firstly, through principal component analysis (PCA) of the spectral curves of 5 typical kinds of yogurt, the clustering of yogurt varieties was examined. The analysis results showed that the cumulative reliability of PC1 and PC2 (the first two principal components) was more than 98.956%, and the cumulative reliability from PC1 to PC7 (the first seven principal components) was 99.97%. Secondly, a discrimination model based on an artificial neural network (ANN-BP) was set up. The first seven principal components of the samples were applied as ANN-BP inputs, the yogurt variety was applied as output, and a three-layer ANN-BP model was built. In this model, every variety of yogurt included 27 samples, for a total of 135 samples, and the remaining 25 samples were used as the prediction set. The results showed that the discrimination rate for the five yogurt varieties was 100%, indicating that this model is reliable and practicable. Thus a new approach for the rapid and nondestructive discrimination of yogurt varieties is put forward.
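A compact sketch (not the authors' code) of the PCA plus back-propagation network pipeline described here, written with scikit-learn, is shown below; the spectra and labels are synthetic placeholders standing in for measured Vis/NIR data.

```python
# PCA to seven components followed by a small back-propagation (MLP) classifier.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
spectra = rng.normal(size=(135, 256))        # 135 samples x 256 wavelengths (synthetic stand-in)
variety = rng.integers(0, 5, size=135)       # 5 yogurt varieties, encoded 0..4

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=7),                     # keep the first seven principal components
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
model.fit(spectra[:110], variety[:110])      # training set
print("prediction-set accuracy:", model.score(spectra[110:], variety[110:]))
```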
A first principles based methodology for design of axial compressor configurations
NASA Astrophysics Data System (ADS)
Iyengar, Vishwas
Axial compressors are widely used in many aerodynamic applications. The design of an axial compressor configuration presents many challenges. Until recently, compressor design was done using 2-D viscous flow analyses that solve the flow field around cascades or in meridional planes, or 3-D inviscid analyses. With the advent of modern computational methods it is now possible to analyze the 3-D viscous flow and accurately predict the performance of 3-D multistage compressors. It is necessary to retool the design methodologies to take advantage of the improved accuracy and physical fidelity of these advanced methods. In this study, a first-principles based multi-objective technique for designing single stage compressors is described. The study accounts for stage aerodynamic characteristics, rotor-stator interactions and blade elastic deformations. A parametric representation of compressor blades that includes leading and trailing edge camber line angles, thickness and camber distributions was used in this study. A design of experiments approach is used to reduce the large combinations of design variables into a smaller subset. A response surface method is used to approximately map the output variables as a function of the design variables. An optimized configuration is determined as the extremum of all extrema. This method has been applied to a rotor-stator stage similar to NASA Stage 35. The study has two parts: a preliminary study where a limited number of design variables was used to give an understanding of the important design variables for subsequent use, and a comprehensive application of the methodology where a larger, more complete set of design variables is used. The extended methodology also attempts to minimize the acoustic fluctuations at the rotor-stator interface by considering a rotor-wake influence coefficient (RWIC). Results presented include performance map calculations at design and off-design speed along with a detailed visualization of the flow field at design and off-design conditions. The present methodology provides a way to systematically screen through the plethora of design variables. By selecting the most influential design parameters and by optimizing the blade leading edge and trailing edge mean camber line angles, phenomena such as tip blockages, blade-to-blade shock structures and other loss mechanisms can be weakened or alleviated. It is found that these changes to the configuration can have a beneficial effect on total pressure ratio and stage adiabatic efficiency, thereby improving the performance of the axial compression system. Aeroacoustic benefits were found by minimizing the noise generating mechanisms associated with rotor wake-stator interactions. The new method presented is reliable, has low time cost, and is easily applicable to the daily industrial design optimization of turbomachinery blades.
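The design-of-experiments and response-surface step described above can be illustrated with a short hedged sketch (not the thesis code): sample a few normalized design variables, fit a quadratic surrogate to an objective such as adiabatic efficiency, and locate its optimum. The objective function used here is a made-up stand-in for a CFD evaluation.

```python
# DOE sampling, quadratic response-surface fit, and surrogate optimization.

import numpy as np
from itertools import combinations_with_replacement
from scipy.optimize import minimize

def quadratic_features(x):
    """[1, x_i ..., x_i*x_j ...] feature vector for a quadratic surface."""
    x = np.atleast_2d(x)
    cols = [np.ones(len(x))] + [x[:, i] for i in range(x.shape[1])]
    cols += [x[:, i] * x[:, j]
             for i, j in combinations_with_replacement(range(x.shape[1]), 2)]
    return np.column_stack(cols)

def true_objective(x):                       # placeholder for an expensive CFD evaluation
    return -((x[..., 0] - 0.3) ** 2 + 2.0 * (x[..., 1] + 0.1) ** 2)

rng = np.random.default_rng(1)
doe = rng.uniform(-1.0, 1.0, size=(25, 2))   # DOE sample of 2 normalized design variables
y = np.array([true_objective(p) for p in doe])

coef, *_ = np.linalg.lstsq(quadratic_features(doe), y, rcond=None)   # fit surface
surrogate = lambda x: -(quadratic_features(x) @ coef)[0]             # negate to minimize

best = minimize(surrogate, x0=np.zeros(2), bounds=[(-1, 1), (-1, 1)])
print("surrogate optimum near:", np.round(best.x, 3))                # ~ [0.3, -0.1]
```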
Applying Chomsky's Linguistic Methodology to the Clinical Interpretation of Symbolic Play.
ERIC Educational Resources Information Center
Ariel, Shlomo
This paper summarizes how Chomsky's methodological principles of linguistics may be applied to the clinical interpretation of children's play. Based on Chomsky's derivation of a "universal grammar" (the set of essential, formal, and substantive traits of any human language), a number of hypothesized formal universals of…
Arts-Informed Inquiry in Teacher Education: Contesting the Myths
ERIC Educational Resources Information Center
Ewing, Robyn; Hughes, John
2008-01-01
Arts-informed inquiry has attracted a great deal of controversy in recent times as it has gained popularity as an educational research methodology in teacher education. As with other innovative approaches and methodologies, there have been lively debates about its rigour, authenticity and appropriateness. This article suggests principles for its…
Constraint-Driven Software Design: An Escape from the Waterfall Model.
ERIC Educational Resources Information Center
de Hoog, Robert; And Others
1994-01-01
Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Cudney, Elizabeth A.
2017-01-01
Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Six Sigma. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an application of Six Sigma in a…
Positive Deviance: Learning from Positive Anomalies
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Gale, Dick
2017-01-01
Purpose: This paper is one of seven in this volume, each elaborating different approaches to quality improvement in education. The purpose of this paper is to delineate a methodology called positive deviance. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an…
History Teaching in Albania Following Educational Reform in 2008
ERIC Educational Resources Information Center
Vuka, Denis
2015-01-01
This article explores history teaching in Albania, with particular emphasis on educational and methodological aspects of new history textbooks published after the liberalization of the school textbook market in 2008. National history textbooks serve as a basis for the assessment of changing educational principles and methodologies in history…
NASA Technical Reports Server (NTRS)
Aguilera, Frank J.
2015-01-01
A guiding principle for conducting research in technology, science, and engineering, leading to innovation is based on our use of research methodology (both qualitative and quantitative). A brief review of research methodology will be presented with an overview of NASA process in developing aeronautics technologies and other things to consider in research including what is innovation.
NASA Astrophysics Data System (ADS)
Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.
Brittle materials today are being used, or considered, for a wide variety of high tech applications that operate in harsh environments, including static and rotating turbine parts, thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and MEMS. Designing brittle material components to sustain repeated load without fracturing while using the minimum amount of material requires the use of a probabilistic design methodology. The NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structure/Life) code provides a general-purpose analysis tool that predicts the probability of failure of a ceramic component as a function of its time in service. This capability includes predicting the time-dependent failure probability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The developed methodology allows for changes in material response that can occur with temperature or time (i.e. changing fatigue and Weibull parameters with temperature or time). For this article, an overview of the transient reliability methodology and how this methodology is extended to account for proof testing is described. The CARES/Life code has been modified to have the ability to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
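As a simplified, hypothetical illustration of the underlying probabilistic idea (not the CARES/Life algorithm), the sketch below evaluates a two-parameter Weibull failure probability for a component discretized into stressed volume elements; all numerical values are assumed.

```python
# Weakest-link Weibull failure probability for a discretized brittle component.

import numpy as np

def weibull_failure_probability(element_stress, element_volume, m, sigma_0):
    """P_f for a component split into elements (uniaxial, volume-flaw form).

    element_stress : peak principal stress in each element (MPa)
    element_volume : volume of each element (mm^3)
    m              : Weibull modulus (scatter of strength)
    sigma_0        : Weibull scale parameter (MPa * mm^(3/m))
    """
    stress = np.asarray(element_stress, float)
    volume = np.asarray(element_volume, float)
    risk_of_rupture = np.sum(volume * (stress / sigma_0) ** m)
    return 1.0 - np.exp(-risk_of_rupture)

if __name__ == "__main__":
    stresses = [180.0, 220.0, 150.0]   # assumed element stresses, MPa
    volumes = [2.0, 1.5, 3.0]          # assumed element volumes, mm^3
    p_f = weibull_failure_probability(stresses, volumes, m=10.0, sigma_0=400.0)
    print("P_f =", round(p_f, 6))
```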
Ethical Issues in the Use of Humans for Research.
ERIC Educational Resources Information Center
Bashaw, W. L.
The APA Ethical Principles, the University of Georgia policy, standard research texts, and research literature on specific methodologies, all in relation to ethical issues in human research, are discussed. The 10 APA principles state, in essence, that the investigator is responsible for what happens, that confidentiality and the protection of the…
Principles of Technology. Units 1-10 Pilot Test Findings.
ERIC Educational Resources Information Center
Center for Occupational Research and Development, Inc., Waco, TX.
This document provides the findings of pilot tests of 10 units for an applied science course for high school vocational students. Each of the reports on the pilot tests of the Principles of Technology units contains information on procedures, methodology limitations, sample, the pretest/posttest instrument and results, student attitude results,…
ERIC Educational Resources Information Center
Maloni, Michael J.; Smith, Shane D.; Napshin, Stuart
2012-01-01
Evidence from extant literature indicates that faculty support is a critical driver for implementing the United Nations Principles for Responsible Management Education (PRME), particularly for schools pursuing an advanced, cross-disciplinary level of sustainability integration. However, there is limited existing research offering insight into how…
ERIC Educational Resources Information Center
Renshaw, Tyler L.; Kuriakose, Sarah
2011-01-01
During the past 2 decades, pivotal response treatment (PRT) has emerged as an evidence-based methodology for intervening with the behavioral, communicative, social, and academic impairments of children with autism. Unlike other highly structured behavioral interventions for autism, PRT emphasizes principles over procedures and focuses on enhancing…
Integrating Formal Methods and Testing 2002
NASA Technical Reports Server (NTRS)
Cukic, Bojan
2002-01-01
Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither of them alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or higher). The coming years shall address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are: A) Combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications. B) Quantify the impact of these methods on software reliability. C) Demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level. D) Quantify and justify the reliability estimate for systems developed using various methods.
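The claim that qualitative evidence can reduce the required amount of testing can be grounded with the classical zero-failure demonstration argument, sketched below; this is standard background math, not the project's specific statistical framework.

```python
# Number of consecutive failure-free tests needed to demonstrate a reliability
# level R with confidence C, from the zero-failure binomial argument:
# (R ** n) <= 1 - C  =>  n >= ln(1 - C) / ln(R).

import math

def zero_failure_tests(reliability, confidence):
    """Failure-free tests needed so that P(R >= reliability) >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

if __name__ == "__main__":
    for r in (0.999, 0.9999):                 # target per-demand reliability
        n = zero_failure_tests(r, 0.99)
        print(f"R = {r}: {n} failure-free tests at 99% confidence")
```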
Proposed Reliability/Cost Model
NASA Technical Reports Server (NTRS)
Delionback, L. M.
1982-01-01
New technique estimates cost of improvement in reliability for complex system. Model format/approach is dependent upon use of subsystem cost-estimating relationships (CER's) in devising cost-effective policy. Proposed methodology should have application in broad range of engineering management decisions.
A Simple and Reliable Method of Design for Standalone Photovoltaic Systems
NASA Astrophysics Data System (ADS)
Srinivasarao, Mantri; Sudha, K. Rama; Bhanu, C. V. K.
2017-06-01
Standalone photovoltaic (SAPV) systems are seen as a promising method of electrifying areas of the developing world that lack power grid infrastructure. Proliferation of these systems requires a design procedure that is simple, reliable and exhibits good performance over the system's lifetime. The proposed methodology uses simple empirical formulae and easily available parameters to design SAPV systems, that is, array size with energy storage. After arriving at different array sizes (areas), performance curves are obtained for the optimal design of the SAPV system with a high degree of reliability, in terms of autonomy, at a specified value of loss of load probability (LOLP). Based on the array to load ratio (ALR) and levelized energy cost (LEC) through life cycle cost (LCC) analysis, it is shown that the proposed methodology gives better performance, requires simple data and is more reliable when compared with a conventional design using monthly average daily load and insolation.
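In the same spirit of simple empirical formulae (though not the paper's exact ones), the hedged first-cut sizing sketch below relates array area and battery capacity to the daily load, peak sun hours and autonomy days; every efficiency and load figure is an assumption for illustration.

```python
# First-cut sizing of a standalone PV system: array area and battery capacity.

def array_area_m2(daily_load_wh, peak_sun_hours, module_eff=0.17, system_eff=0.75):
    """PV array area needed to supply the average daily load."""
    usable_wh_per_m2 = peak_sun_hours * 1000.0 * module_eff * system_eff
    return daily_load_wh / usable_wh_per_m2

def battery_capacity_wh(daily_load_wh, autonomy_days, depth_of_discharge=0.6):
    """Storage needed to ride through the chosen number of sunless days."""
    return daily_load_wh * autonomy_days / depth_of_discharge

if __name__ == "__main__":
    load = 2500.0          # assumed daily load, Wh/day
    psh = 5.0              # assumed peak sun hours at the site
    print("array area ~", round(array_area_m2(load, psh), 2), "m^2")
    print("battery    ~", round(battery_capacity_wh(load, autonomy_days=3)), "Wh")
```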
Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks.
Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo
2017-11-05
Power consumption is a primary concern in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack while also considering their reliabilities. To solve this problem, we introduce a fully automatic solution for designing power-consumption-aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate the power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations, and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way.
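A generic illustration of the kind of state-based power evaluation involved (not the paper's formal models or the EDEN toolbox) is sketched below: energy is accumulated as time spent in each radio and CPU state weighted by that state's power draw, with all figures assumed.

```python
# Energy of one duty cycle as a weighted sum of time spent in each node state.

STATE_POWER_MW = {"tx": 52.0, "rx": 59.0, "cpu_active": 8.0, "sleep": 0.02}  # assumed figures

def energy_mj(durations_s):
    """Energy in millijoules for a dict of state -> seconds spent in that state."""
    return sum(STATE_POWER_MW[state] * seconds for state, seconds in durations_s.items())

if __name__ == "__main__":
    one_cycle = {"tx": 0.012, "rx": 0.080, "cpu_active": 0.150, "sleep": 59.758}
    e = energy_mj(one_cycle)
    print(f"energy per 60 s duty cycle ~ {e:.2f} mJ, average power ~ {e / 60:.3f} mW")
```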
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
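A toy sketch of the probabilistic-branching idea (not the OBEST tool itself) is given below: each event in a short runway-incursion-style sequence branches with an assumed probability, and scenario likelihoods are estimated by Monte Carlo sampling. The branch names and probabilities are invented for the example.

```python
# Monte Carlo estimation of scenario likelihoods in a small branching event sequence.

import random
from collections import Counter

BRANCHES = [                       # (event, probability that it goes "wrong")
    ("clearance_misheard", 0.02),
    ("crossing_not_detected", 0.10),
    ("go_around_fails", 0.05),
]

def one_scenario(rng):
    path = []
    for event, p in BRANCHES:
        outcome = rng.random() < p
        path.append(f"{event}={outcome}")
        if not outcome:            # the sequence stops once an event is handled safely
            break
    return " -> ".join(path)

if __name__ == "__main__":
    rng = random.Random(42)
    counts = Counter(one_scenario(rng) for _ in range(100_000))
    for scenario, n in counts.most_common(4):
        print(f"{n / 100_000:8.5f}  {scenario}")
```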
Methodological challenges when doing research that includes ethnic minorities: a scoping review.
Morville, Anne-Le; Erlandsson, Lena-Karin
2016-11-01
There are challenging methodological issues in obtaining valid and reliable results on which to base occupational therapy interventions for ethnic minorities. The aim of this scoping review is to describe the methodological problems within occupational therapy research when ethnic minorities are included. A thorough literature search yielded 21 articles obtained from the scientific databases PubMed, Cinahl, Web of Science and PsychInfo. Analysis followed Arksey and O'Malley's framework for scoping reviews, applying content analysis. The results showed methodological issues concerning the entire research process: defining and recruiting samples, conceptual understanding, the lack of appropriate instruments, data collection using interpreters, and data analysis. In order to avoid excluding ethnic minorities from adequate occupational therapy research and interventions, development of methods for the entire research process is needed. It is a costly and time-consuming process, but the results will be valid and reliable, and therefore more applicable in clinical practice.
Ensuring reliability in expansion schemes.
Kamal-Uddin, Abu Sayed; Williams, Donald Leigh
2005-01-01
Existing electricity power supplies must serve, or be adapted to serve, the expansion of hospital buildings. With the existing power supply assets of many hospitals being up to 20 years old, assessing the security and reliability of the power system must be given appropriate priority to avoid unplanned outages due to overloads and equipment failures. It is imperative that adequate contingency is planned for essential and non-essential electricity circuits. This article describes the methodology undertaken, and the subsequent recommendations that were made, when evaluating the security and reliability of electricity power supplies to a number of major London hospitals. The methodology described aligns with the latest issue of NHS Estates HTM 2011 'Primary Electrical Infrastructure Emergency Electrical Services Design Guidance' (to which ERA Technology has contributed).
NASA Astrophysics Data System (ADS)
McPhee, J.; William, Y. W.
2005-12-01
This work presents a methodology for pumping test design based on the reliability requirements of a groundwater model. Reliability requirements take into consideration the application of the model results in groundwater management, expressed in this case as a multiobjective management model. The pumping test design is formulated as a mixed-integer nonlinear programming (MINLP) problem and solved using a combination of genetic algorithm (GA) and gradient-based optimization. Bayesian decision theory provides a formal framework for assessing the influence of parameter uncertainty on the reliability of the proposed pumping test. The proposed methodology is useful for selecting a robust design that will outperform all other candidate designs under most potential 'true' states of the system.
Response to formal comment on Myhrvold (2016) submitted by Griebeler and Werner (2017).
Myhrvold, Nathan P
2018-01-01
Griebeler and Werner offer a formal comment on Myhrvold, 2016 defending the conclusions of Werner and Griebeler, 2014. Although the comment criticizes several aspects of methodology in Myhrvold, 2016, all three papers concur on a key conclusion: the metabolism of extant endotherms and ectotherms cannot be reliably classified using growth-rate allometry, because the growth rates of extant endotherms and ectotherms overlap. A key point of disagreement is that the 2014 paper concluded that despite this general case, one can nevertheless classify dinosaurs as ectotherms from their growth rate allometry. The 2014 conclusion is based on two factors: the assertion (made without any supporting arguments) that the comparison with dinosaurs must be restricted only to extant sauropsids, ignoring other vertebrate groups, and that extant sauropsid endotherm and ectotherm growth rates in a data set studied in the 2014 work do not overlap. The Griebeler and Werner formal comment presents their first arguments in support of the restriction proposition. In this response I show that this restriction is unsupported by established principles of phylogenetic comparison. In addition, I show that the data set studied in their 2014 work does show overlap, and that this is visible in one of its figures. I explain how either point effectively invalidates the conclusion of their 2014 paper. I also address the other methodological criticisms of Myhrvold 2016, and find them unsupported.
ERIC Educational Resources Information Center
Dmitrenko, ?amara ?.; Lavryk, Tatjana V.; Yaresko, Ekaterina V.
2015-01-01
Changes in various fields of knowledge have influenced pedagogical science. The article explains the structure of the foundations of modern pedagogy through paradigmatic and methodological aspects. The foundations of modern pedagogy include a complex of paradigms, the object and subject of the science, general and specific principles, and methods and technologies.…
Code of Federal Regulations, 2011 CFR
2011-04-01
... actuarial mathematics and methodology by one of the following: (1) Joint Board basic examination. Successful... basic actuarial mathematics and methodology including compound interest, principles of life... major area of concentration was actuarial mathematics, or (ii) Which included at least as many semester...
Training Emotional and Social Competences in Higher Education: The Seminar Methodology
ERIC Educational Resources Information Center
Oberst, Ursula; Gallifa, Josep; Farriols, Nuria; Vilaregut, Anna
2009-01-01
This article discusses the importance of emotional and social competences in higher education and presents a training model. In 1991, Ramon Llull University of Barcelona (Spain) created the Seminar methodology to tackle these challenges. A general model derived from the Emotional Intelligence concept and the general principles of this methodology…
Viability, Advantages and Design Methodologies of M-Learning Delivery
ERIC Educational Resources Information Center
Zabel, Todd W.
2010-01-01
The purpose of this study was to examine the viability and principle design methodologies of Mobile Learning models in developing regions. Demographic and market studies were utilized to determine the viability of M-Learning delivery as well as best uses for such technologies and methods given socioeconomic and political conditions within the…
The Research and Evaluation of Serious Games: Toward a Comprehensive Methodology
ERIC Educational Resources Information Center
Mayer, Igor; Bekebrede, Geertje; Harteveld, Casper; Warmelink, Harald; Zhou, Qiqi; van Ruijven, Theo; Lo, Julia; Kortmann, Rens; Wenzler, Ivo
2014-01-01
The authors present the methodological background to and underlying research design of an ongoing research project on the scientific evaluation of serious games and/or computer-based simulation games (SGs) for advanced learning. The main research questions are: (1) what are the requirements and design principles for a comprehensive social…
The origin of life and its methodological challenge.
Wächtershäuser, G
1997-08-21
The problem of the origin of life is discussed from a methodological point of view as an encounter between the teleological thinking of the historian and the mechanistic thinking of the chemist; and as the Kantian task of replacing teleology by mechanism. It is shown how the Popperian situational logic of historic understanding and the Popperian principle of explanatory power of scientific theories, when jointly applied to biochemistry, lead to a methodology of biochemical retrodiction, whereby common precursor functions are constructed for disparate successor functions. This methodology is exemplified by central tenets of the theory of the chemo-autotrophic origin of life: the proposal of a surface metabolism with a two-dimensional order; the basic polarity of life with negatively charged constituents on positively charged mineral surfaces; the surface-metabolic origin of phosphorylated sugar metabolism and nucleic acids; the origin of membrane lipids and of chemi-osmosis on pyrite surfaces; and the principles of the origin of the genetic machinery. The theory presents the early evolution of life as a process that begins with chemical necessity and winds up in genetic chance.
Multi-Sensor Optimal Data Fusion Based on the Adaptive Fading Unscented Kalman Filter.
Gao, Bingbing; Hu, Gaoge; Gao, Shesheng; Zhong, Yongmin; Gu, Chengfan
2018-02-06
This paper presents a new optimal data fusion methodology based on the adaptive fading unscented Kalman filter for multi-sensor nonlinear stochastic systems. This methodology has a two-level fusion structure: at the bottom level, an adaptive fading unscented Kalman filter based on the Mahalanobis distance is developed and serves as local filters to improve the adaptability and robustness of local state estimations against process-modeling error; at the top level, an unscented transformation-based multi-sensor optimal data fusion for the case of N local filters is established according to the principle of linear minimum variance to calculate globally optimal state estimation by fusion of local estimations. The proposed methodology effectively refrains from the influence of process-modeling error on the fusion solution, leading to improved adaptability and robustness of data fusion for multi-sensor nonlinear stochastic systems. It also achieves globally optimal fusion results based on the principle of linear minimum variance. Simulation and experimental results demonstrate the efficacy of the proposed methodology for INS/GNSS/CNS (inertial navigation system/global navigation satellite system/celestial navigation system) integrated navigation.
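The top-level fusion step based on the principle of linear minimum variance can be illustrated compactly as inverse-covariance weighting of independent local estimates, as in the hedged sketch below; it omits the adaptive fading UKF local filters and any cross-covariances, and the numbers are invented.

```python
# Linear-minimum-variance (inverse-covariance) fusion of N local state estimates.

import numpy as np

def fuse(estimates, covariances):
    """Fuse independent local estimates x_i with covariances P_i."""
    info = [np.linalg.inv(P) for P in covariances]            # information matrices
    P_fused = np.linalg.inv(sum(info))                        # fused covariance
    x_fused = P_fused @ sum(Pi @ x for Pi, x in zip(info, estimates))
    return x_fused, P_fused

if __name__ == "__main__":
    x1, P1 = np.array([1.02, 0.48]), np.diag([0.04, 0.09])    # e.g. an INS/GNSS local filter
    x2, P2 = np.array([0.95, 0.55]), np.diag([0.16, 0.01])    # e.g. an INS/CNS local filter
    x, P = fuse([x1, x2], [P1, P2])
    print("fused state:", np.round(x, 3))
    print("fused covariance diag:", np.round(np.diag(P), 4))
```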
Launch vehicle systems design analysis
NASA Technical Reports Server (NTRS)
Ryan, Robert; Verderaime, V.
1993-01-01
Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least-cost can be realized through competent concurrent engineering teams and brilliance of their technical leadership.
[Principles of fast track surgery. Multimodal perioperative therapy programme].
Kehlet, H
2009-08-01
Recent evidence has documented that a combination of single-modality evidence-based care principles into a multimodal effort to enhance postoperative recovery (the fast track methodology) has led to enhanced recovery with reduced medical morbidity, need for hospitalisation and convalescence. Nevertheless, general implementation of fast track surgery has been relatively slow despite concomitant economic benefits. Further improvement in postoperative outcome may be obtained by developments within each care principle with a specific focus on minimally invasive surgery, effective multimodal, non-opioid analgesia and pharmacological stress reduction.
Tailoring a Human Reliability Analysis to Your Industry Needs
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2016-01-01
Accidents caused by human error that result in catastrophic consequences include airline industry mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies is used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element to developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" a Human Reliability Assessment (HRA) for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry-specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how the human errors could occur such as tasks, tools, environment, workplace, support, training and procedure, 4) type and availability of data, 5) how the industry views risk & reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. If the principal concern is determination of the primary risk factors contributing to the potential human error, a more detailed analysis method may be employed versus a requirement to provide a numerical value as part of a probabilistic risk assessment. Industries involved with humans operating large equipment or transport systems (e.g., railroads or airlines) would have more need to address the man-machine interface than medical workers administering medications. Human error occurs in every industry; in most cases the consequences are relatively benign and occasionally beneficial. In cases where the results can have disastrous consequences, the use of Human Reliability techniques to identify and classify the risk of human errors allows a company more opportunities to mitigate or eliminate these types of risks and prevent costly tragedies.
[MLPA technique--principles and use in practice].
Rusu, Cristina; Sireteanu, Adriana; Puiu, Maria; Skrypnyk, Cristina; Tomescu, E; Csep, Katalin; Creţ, Victoria; Barbarii, Ligia
2007-01-01
MLPA (Multiplex Ligation-dependent Probe Amplification) is a recently introduced method, based on the PCR principle, useful for the detection of different genetic abnormalities (aneuploidies, gene deletions/duplications, subtelomeric rearrangements, methylation status, etc.). The technique is simple, reliable and cheap. We present this method to discuss its importance for a modern genetic service and to underline its multiple advantages.
ERIC Educational Resources Information Center
Yano, Masaharu; Tomita, Junichi
2006-01-01
Purpose: The purpose of this paper is to demonstrate the actual conditions of Japanese professors' mobility and to carry out an analysis of the principle on which university researcher mobility is based and of the relationship between mobility and research performance. Design/methodology/approach: Using the Japanese university researcher database…
Rasheed, Nadia; Amin, Shamsudin H M
2016-01-01
Grounded language acquisition is an important issue, particularly to facilitate human-robot interactions in an intelligent and effective way. Evolutionary and developmental language acquisition are two innovative and important methodologies for the grounding of language in cognitive agents or robots, the aim of which is to address current limitations in robot design. This paper concentrates on these two main modelling methods with the grounding principle for the acquisition of linguistic ability in cognitive agents or robots. This review not only presents a survey of the methodologies and relevant computational cognitive agents or robotic models, but also highlights the advantages and progress of these approaches for the language grounding issue.
The Principle-Based Method of Practical Ethics.
Spielthenner, Georg
2017-09-01
This paper is about the methodology of doing practical ethics. There is a variety of methods employed in ethics. One of them is the principle-based approach, which has an established place in ethical reasoning. In everyday life, we often judge the rightness and wrongness of actions by their conformity to principles, and the appeal to principles plays a significant role in practical ethics, too. In this paper, I try to provide a better understanding of the nature of principle-based reasoning. To accomplish this, I show in the first section that these principles can be applied to cases in a meaningful and sufficiently precise way. The second section discusses how relevant the application of principles is to the resolution of ethical issues. This depends on their nature. I argue that the principles under consideration in this paper should be interpreted as presumptive principles and I conclude that although they cannot be expected to bear the weight of definitely resolving ethical problems, these principles can nevertheless play a considerable role in ethical research.
Analysis of travel-time reliability for freight corridors connecting the Pacific Northwest.
DOT National Transportation Integrated Search
2012-11-01
A new methodology and algorithms were developed to combine diverse data sources and to estimate the impacts of recurrent and non-recurrent congestion on freight movement reliability, delays, costs, and emissions. The results suggest that tra...
Regulation of transmission line capacity and reliability in electric networks
NASA Astrophysics Data System (ADS)
Celebi, Metin
This thesis is composed of two essays that analyze the incentives and optimal regulation of a monopolist line owner in providing capacity and reliability. Similar analyses in the economic literature resulted in under-investment by an unregulated line owner when line reliability was treated as an exogenous variable. However, reliability should be chosen on the basis of economic principles as well, taking into account not only engineering principles but also the preferences of electricity users. When reliability is treated as a choice variable, both over- and under-investment by the line owner become possible. The result depends on the cross-cost elasticity of line construction and on the interval in which the optimal choices of capacity take place. We present some sufficient conditions that lead to definite results about the incentives of the line owner. We also characterize the optimal regulation of the line owner under incomplete information. Our analysis shows that the existence of a line is justified for the social planner when the reliability of other lines on the network is not too high, or when the marginal cost of generation at the expensive generating plant is high. The expectation of higher demand in the future makes the regulator less likely to build the line if it will be congested and reliability of other lines is high enough. It is always optimal to have a congested line under complete information, but not necessarily under incomplete information.
Assuring Electronics Reliability: What Could and Should Be Done Differently
NASA Astrophysics Data System (ADS)
Suhir, E.
The following “ten commandments” for the predicted and quantified reliability of aerospace electronic and photonic products are addressed and discussed: 1) The best product is the best compromise between the needs for reliability, cost effectiveness and time-to-market; 2) Reliability cannot be low, need not be higher than necessary, but has to be adequate for a particular product; 3) When reliability is imperative, ability to quantify it is a must, especially if optimization is considered; 4) One cannot design a product with quantified, optimized and assured reliability by limiting the effort to the highly accelerated life testing (HALT) that does not quantify reliability; 5) Reliability is conceived at the design stage and should be taken care of, first of all, at this stage, when a “genetically healthy” product should be created; reliability evaluations and assurances cannot be delayed until the product is fabricated and shipped to the customer, i.e., cannot be left to the prognostics-and-health-monitoring/managing (PHM) stage; it is too late at this stage to change the design or the materials for improved reliability; that is why, when reliability is imperative, users re-qualify parts to assess their lifetime and use redundancy to build a highly reliable system out of insufficiently reliable components; 6) Design, fabrication, qualification and PHM efforts should consider and be specific for particular products and their most likely actual or at least anticipated application(s); 7) Probabilistic design for reliability (PDfR) is an effective means for improving the state-of-the-art in the field: nothing is perfect, and the difference between an unreliable product and a robust one is “merely” the probability of failure (PoF); 8) Highly cost-effective and highly focused failure-oriented accelerated testing (FOAT) geared to a particular pre-determined reliability model and aimed at understanding the physics of failure anticipated by this model is an important constituent part of the PDfR effort; 9) Predictive modeling (PM) is another important constituent of the PDfR approach; in combination with FOAT, it is a powerful means to carry out sensitivity analyses (SA), to quantify and nearly eliminate failures (“principle of practical confidence”); 10) Consistent, comprehensive and physically meaningful PDfR can effectively contribute to the most feasible and the most effective qualification test (QT) methodologies, practices and specifications. The general concepts addressed in the paper are illustrated by numerical examples. It is concluded that although the suggested concept is promising and fruitful, further research, refinement, and validations are needed before this concept becomes widely accepted by the engineering community and implemented into practice. It is important that this novel approach is introduced gradually, whenever feasible and appropriate, in addition to, and in some situations even instead of, the currently employed various types and modifications of the forty-year-old HALT.
Improving patient safety: patient-focused, high-reliability team training.
McKeon, Leslie M; Cunningham, Patricia D; Oswaks, Jill S Detty
2009-01-01
Healthcare systems are recognizing "human factor" flaws that result in adverse outcomes. Nurses work around system failures, although increasing healthcare complexity makes this harder to do without risk of error. Aviation and military organizations achieve ultrasafe outcomes through high-reliability practice. We describe how reliability principles were used to teach nurses to improve patient safety at the front line of care. Outcomes include safety-oriented, teamwork communication competency; reflections on safety culture and clinical leadership are discussed.
A study on reliability of power customer in distribution network
NASA Astrophysics Data System (ADS)
Liu, Liyuan; Ouyang, Sen; Chen, Danling; Ma, Shaohua; Wang, Xin
2017-05-01
The existing power supply reliability index system is oriented to the power system without considering the actual electricity availability on the customer side. In addition, it is unable to reflect outages or customer equipment shutdowns caused by instantaneous interruptions and power quality problems. This paper thus makes a systematic study of the reliability of power customers. By comparison with power supply reliability, the reliability of the power customer is defined and its evaluation requirements are extracted. An index system, consisting of seven customer indexes and two contrast indexes, is designed to describe the reliability of power customers in terms of continuity and availability. In order to comprehensively and quantitatively evaluate the reliability of power customers in distribution networks, a reliability evaluation method is proposed based on an improved entropy method and the punishment weighting principle. Practical application has shown that the reliability index system and evaluation method for power customers are reasonable and effective.
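The abstract does not give the exact form of the "improved entropy method" or the punishment weighting principle, so the sketch below only shows the classic entropy-weight calculation that such index systems typically start from; the decision matrix is hypothetical.

```python
import numpy as np

def entropy_weights(X):
    """Classic entropy-weight calculation for an (m alternatives x n indexes)
    decision matrix X with larger-is-better indexes. The paper's 'improved'
    variant and its punishment weighting are not specified in the abstract."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                        # normalise each index column
    m = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logP).sum(axis=0) / np.log(m)      # entropy of each index
    d = 1.0 - e                                  # degree of diversification
    return d / d.sum()                           # weights summing to 1

scores = [[0.98, 120, 0.92],                     # hypothetical customer indexes
          [0.95, 300, 0.88],
          [0.99,  60, 0.95]]
print(entropy_weights(scores))
```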
Hulteen, Ryan M; Lander, Natalie J; Morgan, Philip J; Barnett, Lisa M; Robertson, Samuel J; Lubans, David R
2015-10-01
It has been suggested that young people should develop competence in a variety of 'lifelong physical activities' to ensure that they can be active across the lifespan. The primary aim of this systematic review is to report the methodological properties, validity, reliability, and test duration of field-based measures that assess movement skill competency in lifelong physical activities. A secondary aim was to clearly define those characteristics unique to lifelong physical activities. A search of four electronic databases (Scopus, SPORTDiscus, ProQuest, and PubMed) was conducted between June 2014 and April 2015 with no date restrictions. Studies addressing the validity and/or reliability of lifelong physical activity tests were reviewed. Included articles were required to assess lifelong physical activities using process-oriented measures, as well as report either one type of validity or reliability. Assessment criteria for methodological quality were adapted from a checklist used in a previous review of sport skill outcome assessments. Movement skill assessments for eight different lifelong physical activities (badminton, cycling, dance, golf, racquetball, resistance training, swimming, and tennis) in 17 studies were identified for inclusion. Methodological quality, validity, reliability, and test duration (time to assess a single participant), for each article were assessed. Moderate to excellent reliability results were found in 16 of 17 studies, with 71% reporting inter-rater reliability and 41% reporting intra-rater reliability. Only four studies in this review reported test-retest reliability. Ten studies reported validity results; content validity was cited in 41% of these studies. Construct validity was reported in 24% of studies, while criterion validity was only reported in 12% of studies. Numerous assessments for lifelong physical activities may exist, yet only assessments for eight lifelong physical activities were included in this review. Generalizability of results may be more applicable if more heterogeneous samples are used in future research. Moderate to excellent levels of inter- and intra-rater reliability were reported in the majority of studies. However, future work should look to establish test-retest reliability. Validity was less commonly reported than reliability, and further types of validity other than content validity need to be established in future research. Specifically, predictive validity of 'lifelong physical activity' movement skill competency is needed to support the assertion that such activities provide the foundation for a lifetime of activity.
Barnes, Brian B.; Wilson, Michael B.; Carr, Peter W.; Vitha, Mark F.; Broeckling, Corey D.; Heuberger, Adam L.; Prenni, Jessica; Janis, Gregory C.; Corcoran, Henry; Snow, Nicholas H.; Chopra, Shilpi; Dhandapani, Ramkumar; Tawfall, Amanda; Sumner, Lloyd W.; Boswell, Paul G.
2014-01-01
Gas chromatography-mass spectrometry (GC-MS) is a primary tool used to identify compounds in complex samples. Both mass spectra and GC retention times are matched to those of standards, but it is often impractical to have standards on hand for every compound of interest, so we must rely on shared databases of MS data and GC retention information. Unfortunately, retention databases (e.g. linear retention index libraries) are experimentally restrictive, notoriously unreliable, and strongly instrument dependent, relegating GC retention information to a minor, often negligible role in compound identification despite its potential power. A new methodology called “retention projection” has great potential to overcome the limitations of shared chromatographic databases. In this work, we tested the reliability of the methodology in five independent laboratories. We found that even when each lab ran nominally the same method, the methodology was 3-fold more accurate than retention indexing because it properly accounted for unintentional differences between the GC-MS systems. When the labs used different methods of their own choosing, retention projections were 4- to 165-fold more accurate. More importantly, the distribution of error in the retention projections was predictable across different methods and labs, thus enabling automatic calculation of retention time tolerance windows. Tolerance windows at 99% confidence were generally narrower than those widely used even when physical standards are on hand to measure their retention. With its high accuracy and reliability, the new retention projection methodology makes GC retention a reliable, precise tool for compound identification, even when standards are not available to the user. PMID:24205931
Reliability Prediction Analysis: Airborne System Results and Best Practices
NASA Astrophysics Data System (ADS)
Silva, Nuno; Lopes, Rui
2013-09-01
This article presents the results of several reliability prediction analyses for aerospace components, performed with both methodologies, 217F and 217Plus. Supporting and complementary activities are described, as well as the differences concerning the results and the applications of both methodologies; these are summarized in a set of lessons learned that are very useful for RAMS and Safety Prediction practitioners. The effort that is required for these activities is also an important point that is discussed, as are the end results and their interpretation/impact on the system design. The article concludes by positioning these activities and methodologies in an overall process for space and aeronautics equipment/components certification, and by highlighting their advantages. Some good practices have also been summarized and some reuse rules have been laid down.
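Neither handbook's data tables are reproduced in the abstract, so the following is only a minimal parts-count-style sketch of how such predictions are assembled (equipment failure rate as a sum of part failure rates scaled by pi factors); all rates and factors below are illustrative placeholders, not 217F or 217Plus values.

```python
# Minimal parts-count-style prediction: equipment failure rate is the sum of
# part failure rates, each scaled by quality/environment factors. The base
# rates and pi factors below are illustrative placeholders, not handbook values.
parts = [
    # (name, quantity, base failure rate [failures / 1e6 h], pi_quality, pi_environment)
    ("microcircuit", 12, 0.025, 1.0, 4.0),
    ("resistor",     40, 0.002, 1.5, 3.0),
    ("capacitor",    25, 0.004, 1.5, 3.0),
    ("connector",     4, 0.010, 2.0, 5.0),
]

lambda_equipment = sum(n * lam * pi_q * pi_e for _, n, lam, pi_q, pi_e in parts)
mtbf_hours = 1e6 / lambda_equipment     # convert failures per 1e6 h to MTBF in hours
print(f"lambda = {lambda_equipment:.3f} f/1e6 h, MTBF ~ {mtbf_hours:,.0f} h")
```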
Prioritization Methodology for Chemical Replacement
NASA Technical Reports Server (NTRS)
Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.
1993-01-01
This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration for current NASA propulsion systems.
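A hedged sketch of the semiquantitative QFD-style scoring the abstract describes: candidates are rated against weighted environmental, cost, safety, reliability, and programmatic criteria and ranked by total score. The weights, candidates, and ratings below are hypothetical, not values from the NASA study.

```python
# Semiquantitative QFD-style prioritisation: each replacement candidate is
# scored against weighted criteria; higher totals rank higher. All numbers
# below are hypothetical illustrations only.
criteria_weights = {"environmental": 5, "cost": 3, "safety": 5,
                    "reliability": 4, "programmatic": 2}

candidates = {
    "aqueous cleaner":    {"environmental": 9, "cost": 5, "safety": 8, "reliability": 6, "programmatic": 7},
    "solvent substitute": {"environmental": 6, "cost": 8, "safety": 5, "reliability": 7, "programmatic": 8},
    "no-clean process":   {"environmental": 8, "cost": 9, "safety": 7, "reliability": 4, "programmatic": 5},
}

ranking = sorted(
    ((sum(criteria_weights[c] * s for c, s in scores.items()), name)
     for name, scores in candidates.items()),
    reverse=True)
for total, name in ranking:
    print(f"{name:20s} {total}")
```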
Probabilistic fatigue methodology for six nines reliability
NASA Technical Reports Server (NTRS)
Everett, R. A., Jr.; Bartlett, F. D., Jr.; Elber, Wolf
1990-01-01
Fleet readiness and flight safety strongly depend on the degree of reliability that can be designed into rotorcraft flight-critical components. The current U.S. Army fatigue life specification for new rotorcraft is the so-called six nines reliability, or a probability of failure of one in a million. The progress of a round robin which was established by the American Helicopter Society (AHS) Subcommittee for Fatigue and Damage Tolerance is reviewed to investigate reliability-based fatigue methodology. The participants in this cooperative effort are from the U.S. Army Aviation Systems Command (AVSCOM) and the rotorcraft industry. One phase of the joint activity examined fatigue reliability under uniquely defined conditions for which only one answer was correct. The other phases were set up to learn how the different industry methods in defining fatigue strength affected the mean fatigue life and reliability calculations. Hence, constant amplitude and spectrum fatigue test data were provided so that each participant could perform their standard fatigue life analysis. As a result of this round robin, the probabilistic logic which includes both fatigue strength and spectrum loading variability in developing a consistent reliability analysis was established. In this first study, the reliability analysis was limited to the linear cumulative damage approach. However, it is expected that superior fatigue life prediction methods will ultimately be developed through this open AHS forum. To that end, these preliminary results were useful in identifying some topics for additional study.
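The round robin is described only at a high level, so the following is a generic Monte Carlo sketch of how fatigue-strength and spectrum-loading variability can be combined into a probability-of-failure estimate under a simple linear cumulative damage view. The lognormal scatter parameters are placeholders, and a real six-nines demonstration would need far larger sample sizes or variance-reduction techniques.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000                         # samples; six-nines targets need many more

# Placeholder scatter models: lognormal fatigue strength and lognormal
# spectrum severity; failure when the accumulated damage ratio reaches 1.
strength = rng.lognormal(mean=np.log(1.0), sigma=0.10, size=n)   # strength factor
loading  = rng.lognormal(mean=np.log(0.55), sigma=0.15, size=n)  # loading factor
damage = loading / strength            # simplistic linear (Miner-type) damage ratio

pof = np.mean(damage >= 1.0)
print(f"estimated probability of failure ~ {pof:.2e}")
```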
Predicting Cost/Reliability/Maintainability of Advanced General Aviation Avionics Equipment
NASA Technical Reports Server (NTRS)
Davis, M. R.; Kamins, M.; Mooz, W. E.
1978-01-01
A methodology is provided for assisting NASA in estimating the cost, reliability, and maintenance (CRM) requirements for general avionics equipment operating in the 1980's. Practical problems of predicting these factors are examined. The usefulness and shortcomings of different approaches for modeling cost and reliability estimates are discussed together with special problems caused by the lack of historical data on the cost of maintaining general aviation avionics. Suggestions are offered on how NASA might proceed in assessing CRM implications in the absence of reliable generalized predictive models.
PERFORMANCE, RELIABILITY, AND IMPROVEMENT OF A TISSUE-SPECIFIC METABOLIC SIMULATOR
A methodology is described that has been used to build and enhance a simulator for rat liver metabolism providing reliable predictions within a large chemical domain. The tissue metabolism simulator (TIMES) utilizes a heuristic algorithm to generate plausible metabolic maps using...
Evaluation of Explosive Strength for Young and Adult Athletes
ERIC Educational Resources Information Center
Viitasalo, Jukka T.
1988-01-01
The reliability of new electrical measurements of vertical jumping height and of throwing velocity was tested. These results were compared to traditional measurement techniques. The new method was found to give reliable results from children to adults. Methodology is discussed. (Author/JL)
49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs
Code of Federal Regulations, 2011 CFR
2011-10-01
... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...
49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs
Code of Federal Regulations, 2013 CFR
2013-10-01
... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...
49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs
Code of Federal Regulations, 2012 CFR
2012-10-01
... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...
49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs
Code of Federal Regulations, 2014 CFR
2014-10-01
... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...
Large-scale systems: Complexity, stability, reliability
NASA Technical Reports Server (NTRS)
Siljak, D. D.
1975-01-01
After showing that a complex dynamic system with a competitive structure has highly reliable stability, a class of noncompetitive dynamic systems for which competitive models can be constructed is defined. It is shown that such a construction is possible in the context of the hierarchic stability analysis. The scheme is based on the comparison principle and vector Liapunov functions.
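For reference, a compact statement of the standard comparison-principle setup with vector Liapunov functions that the abstract alludes to (generic notation, not necessarily the paper's):

```latex
\[
  v(x) = \bigl(v_1(x_1),\dots,v_N(x_N)\bigr)^{\mathsf T}, \qquad
  \dot{v}(t) \le W\, v(t) \ \ \text{along trajectories of the interconnected system},
\]
\[
  \dot{r} = W r \ \text{asymptotically stable (e.g. } -W \text{ an M-matrix)}
  \ \Longrightarrow\ \text{the interconnected system is stable.}
\]
```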
The Ranking of Higher Education Institutions in Russia: Some Methodological Problems.
ERIC Educational Resources Information Center
Filinov, Nikolay B.; Ruchkina, Svetlana
2002-01-01
The ranking of higher education institutions in Russia is examined from two points of view: as a social phenomenon and as a multi-criteria decision-making problem. The first point of view introduces the idea of interested and involved parties; the second introduces certain principles on which a rational ranking methodology should be based.…
Unmanned Tactical Autonomous Control and Collaboration Situation Awareness
2017-06-01
methodology framework using interdependence analysis (IA) tables for informing design requirements based on SA requirements. Future research should seek...requirements of UTACC. The authors then apply SA principles to Coactive Design in order to inform robotic design. The result is a methodology framework using...28 2. Non -intrusive Methods ................................................................29 3. Post-Mission Reviews
Assessment Problems and Ensuring of Decent Work in the Russian Regions
ERIC Educational Resources Information Center
Simonova, Marina V.; Sankova, Larisa V.; Mirzabalaeva, Farida I.; Shchipanova, Dina Ye.; Dorozhkin, Vladimir E.
2016-01-01
The relevance of the research problem is inspired by the need to ensure decent work principles in Russia. The purpose of this article is to develop evaluation methodologies and identify areas to implement key principles of decent work at the regional level in modern Russia. A leading approach to study this problem is the development of a new…
An Analysis of Factors that Inhibit Business Use of User-Centered Design Principles: A Delphi Study
ERIC Educational Resources Information Center
Hilton, Tod M.
2010-01-01
The use of user-centered design (UCD) principles has a positive impact on the use of web-based interactive systems in customer-centric organizations. User-centered design methodologies are not widely adopted in organizations due to intraorganizational factors. A qualitative study using a modified Delphi technique was used to identify the factors…
ERIC Educational Resources Information Center
Hanke, Craig J.; Bauer-Dantoin, Angela C.
2006-01-01
Classroom discussion of scientific articles can be an effective means of teaching scientific principles and methodology to both undergraduate and graduate science students. The availability of classic papers from the American Physiological Society Legacy Project has made it possible to access articles dating back to the early portions of the 20th…
ERIC Educational Resources Information Center
Sahney, Sangeeta
2016-01-01
Purpose: Educational institutes must embrace the principles of total quality management (TQM) if they seek to remain competitive, and survive and succeed in the long run. An educational institution must embrace the principles of quality management and incorporate them into all of their activities. Starting with a theoretical background, the paper…
ERIC Educational Resources Information Center
Isaias, Pedro; Reis, Francisco; Coutinho, Clara; Lencastre, Jose Alberto
2017-01-01
Purpose: This paper examines the acceptance, of a group of 79 students, of an educational forum, used for mobile and distance learning, that has been modified to include empathic characteristics and affective principles. Design/Methodology/Approach: With this study is proposed that the introduction of empathic and affective principles in…
ERIC Educational Resources Information Center
Newbery, Natasha; McCambridge, Jim; Strang, John
2007-01-01
Purpose: The feasibility of a community-level drug prevention intervention based upon the principles of motivational interviewing within a further education college was investigated in a pilot study. Design/methodology/approach: The implementation over the course of a single term of "Let's Talk about Drugs" was studied with both action…
NASA Technical Reports Server (NTRS)
Putcha, Chandra S.; Mikula, D. F. Kip; Dueease, Robert A.; Dang, Lan; Peercy, Robert L.
1997-01-01
This paper deals with the development of a reliability methodology to assess the consequences of using hardware, without failure analysis or corrective action, that has previously demonstrated that it did not perform per specification. The subject of this paper arose from the need to provide a detailed probabilistic analysis to calculate the change in probability of failures with respect to the base or non-failed hardware. The methodology used for the analysis is primarily based on principles of Monte Carlo simulation. The random variables in the analysis are the Maximum Time of Operation (MTO) and the Operation Time of each Unit (OTU). The failure of a unit is considered to happen if OTU is less than MTO for the Normal Operational Period (NOP) in which this unit is used. The NOP as a whole uses a total of 4 units. Two cases are considered. In the first specialized scenario, a system failure is considered to happen if any of the units used during the NOP fails. In the second specialized scenario, a system failure is considered to happen only if any two of the units used during the NOP fail together. The probability of failure of the units and the system as a whole is determined for three kinds of systems: Perfect System, Imperfect System 1, and Imperfect System 2. In a Perfect System, the operation time of the failed unit is the same as that of the MTO. In an Imperfect System 1, the operation time of the failed unit is assumed as 1 percent of the MTO. In an Imperfect System 2, the operation time of the failed unit is assumed as zero. In addition, the simulated operation time of failed units is assumed as 10 percent of the corresponding units before zero value. Monte Carlo simulation analysis is used for this study. Necessary software has been developed as part of this study to perform the reliability calculations. The results of the analysis showed that the predicted change in failure probability (P(sub F)) for the previously failed units is as high as 49 percent above the baseline (perfect system) for the worst case. The predicted change in system P(sub F) for the previously failed units is as high as 36% for single unit failure without any redundancy. For redundant systems, with dual unit failure, the predicted change in P(sub F) for the previously failed units is as high as 16%. These results will help management to make decisions regarding the consequences of using previously failed units without adequate failure analysis or corrective action.
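A minimal sketch of the simulation logic described above: sample MTO and the four unit operating times, declare a unit failed when its OTU is less than the MTO, and score the two system-failure scenarios. The distributions and their parameters below are placeholders, since the abstract does not specify them.

```python
import numpy as np

rng = np.random.default_rng(0)
N_TRIALS, N_UNITS = 100_000, 4      # the abstract's NOP uses four units

# Placeholder distributions: the abstract does not give the distributions of
# the maximum time of operation (MTO) or the unit operating times (OTU).
mto = rng.normal(100.0, 10.0, size=N_TRIALS)                   # hours, hypothetical
otu = rng.normal(120.0, 25.0, size=(N_TRIALS, N_UNITS))        # hours, hypothetical

unit_fails = otu < mto[:, None]        # a unit fails if its OTU is less than the MTO
p_unit = unit_fails.mean()

any_fail = unit_fails.any(axis=1)                  # scenario 1: any of the 4 units fails
two_fail = unit_fails.sum(axis=1) >= 2             # scenario 2: at least two units fail

print(f"P(unit failure)        ~ {p_unit:.3f}")
print(f"P(system fail, case 1) ~ {any_fail.mean():.3f}")
print(f"P(system fail, case 2) ~ {two_fail.mean():.3f}")
```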
Design of high reliability organizations in health care.
Carroll, J S; Rudolph, J W
2006-12-01
To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self-understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self-design for safety and reliability.
Increasing accuracy in the assessment of motion sickness: A construct methodology
NASA Technical Reports Server (NTRS)
Stout, Cynthia S.; Cowings, Patricia S.
1993-01-01
The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods used in the framework of a construct methodology are inadequate. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that when applied will probably resolve some of these problems are described in detail.
Avoiding reification. Heuristic effectiveness of mathematics and the prediction of the Ω⁻ particle
NASA Astrophysics Data System (ADS)
Ginammi, Michele
2016-02-01
According to Steiner (1998), in contemporary physics new important discoveries are often obtained by means of strategies which rely on purely formal mathematical considerations. In such discoveries, mathematics seems to have a peculiar and controversial role, which apparently cannot be accounted for by means of standard methodological criteria. M. Gell-Mann and Y. Ne'eman's prediction of the Ω⁻ particle is usually considered a typical example of application of this kind of strategy. According to Bangu (2008), this prediction is apparently based on the employment of a highly controversial principle, what he calls the "reification principle". Bangu himself takes this principle to be methodologically unjustifiable, but still indispensable to make the prediction logically sound. In the present paper I will offer a new reconstruction of the reasoning that led to this prediction. By means of this reconstruction, I will show that we do not need to postulate any "reificatory" role of mathematics in contemporary physics and I will contextually clarify the representative and heuristic role of mathematics in science.
Storage reliability analysis summary report. Volume 2: Electro mechanical devices
NASA Astrophysics Data System (ADS)
Smith, H. B., Jr.; Krulac, I. L.
1982-09-01
This document summarizes storage reliability data collected by the US Army Missile Command on electro-mechanical devices over a period of several years. Sources of data are detailed; major failure modes and mechanisms are listed and discussed. Non-operational failure rate prediction methodology is given, and conclusions and recommendations for enhancing the storage reliability of devices are drawn from the analysis of collected data.
Design of an integrated airframe/propulsion control system architecture
NASA Technical Reports Server (NTRS)
Cohen, Gerald C.; Lee, C. William; Strickland, Michael J.
1990-01-01
The design of an integrated airframe/propulsion control system architecture is described. The design is based on a prevalidation methodology that used both reliability and performance tools. An account is given of the motivation for the final design and problems associated with both reliability and performance modeling. The appendices contain a listing of the code for both the reliability and performance model used in the design.
1985-11-26
Major decisions involving reliability studies, based on competing risk methodology, have been made in the past and will continue to be made...censoring mechanism. In such instances, the methodology for estimating relevant reliability probabilities has received considerable attention (cf. David...proposal for a discussion of the general methodology.
A Radial Basis Function Approach to Financial Time Series Analysis
1993-12-01
including efficient methods for parameter estimation and pruning, a pointwise prediction error estimator, and a methodology for controlling the "data...collection of practical techniques to address these issues for a modeling methodology: Radial Basis Function networks. These techniques include efficient... methodology often then amounts to a careful consideration of the interplay between model complexity and reliability. These will be recurrent themes
It is Time the United States Air Force Changes the way it Feeds its Airmen
2008-03-01
narrative, phenomenology, ethnography, case study, and grounded theory. In purpose, these strategies are...methodology) the research will be analyzed. Methodology: A qualitative research methodology and specifically a case study strategy for the...well as theory building in chapter five. Finally, in regards to reliability, Yin's (2003) case study protocol guidance was used as a means to
Kiernan, Michaela; Schoffman, Danielle E.; Lee, Katherine; Brown, Susan D.; Fair, Joan M.; Perri, Michael G.; Haskell, William L.
2015-01-01
Background: Physical activity is essential for chronic disease prevention, yet <40% of overweight/obese adults meet national activity recommendations. For time-efficient counseling, clinicians need a brief easy-to-use tool that reliably and validly assesses a full range of activity levels, and most importantly, is sensitive to clinically meaningful changes in activity. The Stanford Leisure-Time Activity Categorical Item (L-Cat) is a single item comprised of six descriptive categories ranging from inactive to very active. This novel methodological approach assesses national activity recommendations as well as multiple clinically relevant categories below and above recommendations, and incorporates critical methodological principles that enhance psychometrics (reliability, validity, sensitivity to change). Methods: We evaluated the L-Cat's psychometrics among 267 overweight/obese women asked to meet national activity recommendations in a randomized behavioral weight-loss trial. Results: The L-Cat had excellent test-retest reliability (κ=0.64, P<.001) and adequate concurrent criterion validity; each L-Cat category at 6 months was associated with 1059 more daily pedometer steps (95% CI 712–1407, β=0.38, P<.001) and 1.9% greater initial weight loss at 6 months (95% CI −2.4 to −1.3, β=−0.38, P<.001). Of interest, L-Cat categories differentiated from each other in a dose-response gradient for steps and weight loss (Ps<.05) with excellent face validity. The L-Cat was sensitive to change in response to the trial's activity component. Women increased one L-Cat category at 6 months (M=1.0±1.4, P<.001); 55.8% met recommendations at 6 months whereas 20.6% did at baseline (P<.001). Even among women not meeting recommendations at both baseline and 6 months (n=106), women who moved ≥1 L-Cat categories at 6 months lost more weight than those who did not (M=−4.6%, 95% CI −6.7 to −2.5, P<.001). Conclusions: Given strong psychometrics, the L-Cat has timely potential for clinical use such as tracking activity changes via electronic medical records especially among overweight/obese populations unable or unlikely to reach national recommendations. PMID:23588625
Kiernan, M; Schoffman, D E; Lee, K; Brown, S D; Fair, J M; Perri, M G; Haskell, W L
2013-12-01
Physical activity is essential for chronic disease prevention, yet <40% of overweight/obese adults meet the national activity recommendations. For time-efficient counseling, clinicians need a brief, easy-to-use tool that reliably and validly assesses a full range of activity levels, and, most importantly, is sensitive to clinically meaningful changes in activity. The Stanford Leisure-Time Activity Categorical Item (L-Cat) is a single item comprising six descriptive categories ranging from inactive to very active. This novel methodological approach assesses national activity recommendations as well as multiple clinically relevant categories below and above the recommendations, and incorporates critical methodological principles that enhance psychometrics (reliability, validity and sensitivity to change). We evaluated the L-Cat's psychometrics among 267 overweight/obese women who were asked to meet the national activity recommendations in a randomized behavioral weight-loss trial. The L-Cat had excellent test-retest reliability (κ=0.64, P<0.001) and adequate concurrent criterion validity; each L-Cat category at 6 months was associated with 1059 more daily pedometer steps (95% CI 712-1407, β=0.38, P<0.001) and 1.9% greater initial weight loss at 6 months (95% CI -2.4 to -1.3, β=-0.38, P<0.001). Of interest, L-Cat categories differentiated from each other in a dose-response gradient for steps and weight loss (Ps<0.05) with excellent face validity. The L-Cat was sensitive to change in response to the trial's activity component. Women increased one L-Cat category at 6 months (M=1.0±1.4, P<0.001); 55.8% met the recommendations at 6 months whereas 20.6% did at baseline (P<0.001). Even among women not meeting the recommendations at both baseline and 6 months (n=106), women who moved ≥1 L-Cat categories at 6 months lost more weight than those who did not (M=-4.6%, 95% CI -6.7 to -2.5, P<0.001). Given strong psychometrics, the L-Cat has timely potential for clinical use such as tracking activity changes via electronic medical records, especially among overweight/obese populations who are unable or unlikely to reach national recommendations.
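For readers unfamiliar with the reliability statistic quoted above, the sketch below computes an unweighted Cohen's kappa for two administrations of a categorical item; whether the paper used a weighted variant is not stated in the abstract, and the responses here are hypothetical.

```python
import numpy as np

def cohens_kappa(r1, r2, n_categories):
    """Unweighted Cohen's kappa for two administrations of a categorical item.
    (The L-Cat abstract reports kappa = 0.64; this is a generic illustration,
    not the authors' exact computation.)"""
    conf = np.zeros((n_categories, n_categories))
    for a, b in zip(r1, r2):
        conf[a, b] += 1
    conf /= conf.sum()
    p_obs = np.trace(conf)                         # observed agreement
    p_exp = conf.sum(axis=1) @ conf.sum(axis=0)    # chance agreement from marginals
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical test-retest responses on the six L-Cat categories (coded 0..5)
test   = [1, 2, 2, 3, 5, 0, 4, 3, 2, 1]
retest = [1, 2, 3, 3, 5, 0, 4, 2, 2, 1]
print(round(cohens_kappa(test, retest, 6), 2))
```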
On-time reliability impacts of ATIS. Volume III, Implications for ATIS investment strategies
DOT National Transportation Integrated Search
2003-05-01
The effect of ATIS accuracy and extent of ATIS roadway instrumentation on the on-time reliability benefits to routine users of ATIS are evaluated through the application of Heuristic On-line Web-linked Arrival Time Estimation (HOWLATE) methodology. T...
34 CFR 462.11 - What must an application contain?
Code of Federal Regulations, 2010 CFR
2010-07-01
... the methodology and procedures used to measure the reliability of the test. (h) Construct validity... previous test, and results from validity, reliability, and equating or standard-setting studies undertaken... NRS educational functioning levels (content validity). Documentation of the extent to which the items...
75 FR 5779 - Proposed Emergency Agency Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-04
... proposed collection of information, including the validity of the methodology and assumptions used; (c... Collection Request Title: Electricity Delivery and Energy Reliability Recovery Act Smart Grid Grant Program..., Chief Operating Officer, Electricity Delivery and Energy Reliability. [FR Doc. 2010-2422 Filed 2-3-10; 8...
Meta-Analysis of Coefficient Alpha
ERIC Educational Resources Information Center
Rodriguez, Michael C.; Maeda, Yukiko
2006-01-01
The meta-analysis of coefficient alpha across many studies is becoming more common in psychology by a methodology labeled reliability generalization. Existing reliability generalization studies have not used the sampling distribution of coefficient alpha for precision weighting and other common meta-analytic procedures. A framework is provided for…
Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks
Dâmaso, Antônio; Maciel, Paulo
2017-01-01
Power consumption is a primary interest in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually neither consider reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack also considering their reliabilities. To solve this problem, we introduce a fully automatic solution to design power consumption aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate the power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way. PMID:29113078
Ancient DNA studies: new perspectives on old samples
2012-01-01
In spite of past controversies, the field of ancient DNA is now a reliable research area due to recent methodological improvements. A series of recent large-scale studies have revealed the true potential of ancient DNA samples to study the processes of evolution and to test models and assumptions commonly used to reconstruct patterns of evolution and to analyze population genetics and palaeoecological changes. Recent advances in DNA technologies, such as next-generation sequencing, make it possible to recover DNA information from archaeological and paleontological remains allowing us to go back in time and study the genetic relationships between extinct organisms and their contemporary relatives. With the next-generation sequencing methodologies, DNA sequences can be retrieved even from samples (for example human remains) for which the technical pitfalls of classical methodologies required stringent criteria to guarantee the reliability of the results. In this paper, we review the methodologies applied to ancient DNA analysis and the perspectives that next-generation sequencing applications provide in this field. PMID:22697611
Aerospace reliability applied to biomedicine.
NASA Technical Reports Server (NTRS)
Lalli, V. R.; Vargo, D. J.
1972-01-01
An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.
Prioritization methodology for chemical replacement
NASA Technical Reports Server (NTRS)
Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott
1995-01-01
This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
Fifth Annual Workshop on the Application of Probabilistic Methods for Gas Turbine Engines
NASA Technical Reports Server (NTRS)
Briscoe, Victoria (Compiler)
2002-01-01
These are the proceedings of the 5th Annual FAA/Air Force/NASA/Navy Workshop on the Probabilistic Methods for Gas Turbine Engines hosted by NASA Glenn Research Center and held at the Holiday Inn Cleveland West. The history of this series of workshops stems from the recognition that both military and commercial aircraft engines are inevitably subjected to similar design and manufacturing principles. As such, it was eminently logical to combine knowledge bases on how some of these overlapping principles and methodologies are being applied. We have started the process by creating synergy and cooperation between the FAA, Air Force, Navy, and NASA in these workshops. The recent 3-day workshop was specifically designed to benefit the development of probabilistic methods for gas turbine engines by addressing recent technical accomplishments and forging new ideas. We accomplished our goals of minimizing duplication, maximizing the dissemination of information, and improving program planning to all concerned. This proceeding includes the final agenda, abstracts, presentations, and panel notes, plus the valuable contact information from our presenters and attendees. We hope that this proceeding will be a tool to enhance understanding of the developers and users of probabilistic methods. The fifth workshop doubled its attendance and had the success of collaboration with the many diverse groups represented including government, industry, academia, and our international partners. So, "Start your engines!" and utilize these proceedings towards creating safer and more reliable gas turbine engines for our commercial and military partners.
The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.
Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin
2007-11-01
This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process that is philosophically similar to agile software methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining a users and developers' mailing list, providing documentation (application programming interface reference document and book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences.
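IGSTK's actual state-machine classes are C++ and are not reproduced here; the sketch below only illustrates the design idea the abstract describes, namely that a component honours a request only when it is a valid transition from its current state. The component, state names, and transitions are hypothetical.

```python
# Sketch of the state-machine-governed component idea described above
# (illustrative only; IGSTK's real C++ classes and state names differ).
class TrackerComponent:
    # allowed transitions: (current state, request) -> next state
    TRANSITIONS = {
        ("Idle",     "Initialize"):    "Ready",
        ("Ready",    "StartTracking"): "Tracking",
        ("Tracking", "StopTracking"):  "Ready",
        ("Ready",    "Shutdown"):      "Idle",
    }

    def __init__(self):
        self.state = "Idle"

    def request(self, action):
        """Honour a request only if it is a valid transition from the
        current state; invalid requests are rejected, never half-applied."""
        nxt = self.TRANSITIONS.get((self.state, action))
        if nxt is None:
            print(f"rejected: '{action}' not valid in state '{self.state}'")
            return False
        self.state = nxt
        return True

t = TrackerComponent()
t.request("StartTracking")   # rejected: component is not initialised yet
t.request("Initialize")
t.request("StartTracking")   # now accepted
print(t.state)               # -> Tracking
```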
Durán, Claudio; Daminelli, Simone; Thomas, Josephine M; Haupt, V Joachim; Schroeder, Michael; Cannistraci, Carlo Vittorio
2017-04-26
The bipartite network representation of the drug-target interactions (DTIs) in a biosystem enhances understanding of the drugs' multifaceted action modes, suggests therapeutic switching for approved drugs and unveils possible side effects. As experimental testing of DTIs is costly and time-consuming, computational predictors are of great aid. Here, for the first time, state-of-the-art supervised DTI predictors custom-made in network biology were compared, using standard and innovative validation frameworks, with unsupervised purely topological models designed for general-purpose link prediction in bipartite networks. Surprisingly, our results show that the bipartite topology alone, if adequately exploited by means of the recently proposed local-community-paradigm (LCP) theory (initially detected in brain-network topological self-organization and afterwards generalized to any complex network), is able to suggest highly reliable predictions, with performance comparable to the state-of-the-art supervised methods that exploit additional (non-topological, for instance biochemical) DTI knowledge. Furthermore, a detailed analysis of the novel predictions revealed that each class of methods prioritizes distinct true interactions; hence, combining methodologies based on diverse principles represents a promising strategy to improve drug-target discovery. To conclude, this study promotes the power of bio-inspired computing, demonstrating that simple unsupervised rules inspired by principles of topological self-organization and adaptiveness arising during learning in living intelligent systems (like the brain) can efficiently equal the performance of complicated algorithms based on advanced, supervised and knowledge-based engineering. © The Author 2017. Published by Oxford University Press.
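As a rough monopartite illustration of the local-community idea behind LCP-based link prediction (reward candidate links whose common neighbours are themselves densely interlinked), the sketch below computes a CAR-style score; the paper works with adapted formulations for bipartite drug-target networks, which are not shown here, and the toy graph is hypothetical.

```python
import itertools

def car_score(adj, x, y):
    """CAR-style local-community score for a candidate link (x, y):
    number of common neighbours times the number of links among them.
    Monopartite illustration only; the paper uses adapted LCP measures
    for bipartite drug-target networks."""
    cn = adj[x] & adj[y]                                # common neighbours
    lcl = sum(1 for a, b in itertools.combinations(cn, 2) if b in adj[a])
    return len(cn) * lcl

# Tiny hypothetical undirected graph as an adjacency-set dictionary
adj = {
    0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {0, 1, 2, 4},
    4: {3, 5}, 5: {4},
}
print(car_score(adj, 0, 4))   # 0 and 4 share only neighbour 3 -> score 0
print(car_score(adj, 1, 0))   # common neighbours {2, 3} are linked -> 2 * 1 = 2
```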
Factors Influencing the Reliability of the Glasgow Coma Scale: A Systematic Review.
Reith, Florence Cm; Synnot, Anneliese; van den Brande, Ruben; Gruen, Russell L; Maas, Andrew Ir
2017-06-01
The Glasgow Coma Scale (GCS) characterizes patients with diminished consciousness. In a recent systematic review, we found overall adequate reliability across different clinical settings, but reliability estimates varied considerably between studies, and methodological quality of studies was overall poor. Identifying and understanding factors that can affect its reliability is important, in order to promote high standards for clinical use of the GCS. The aim of this systematic review was to identify factors that influence reliability and to provide an evidence base for promoting consistent and reliable application of the GCS. A comprehensive literature search was undertaken in MEDLINE, EMBASE, and CINAHL from 1974 to July 2016. Studies assessing the reliability of the GCS in adults or describing any factor that influences reliability were included. Two reviewers independently screened citations, selected full texts, and undertook data extraction and critical appraisal. Methodological quality of studies was evaluated with the consensus-based standards for the selection of health measurement instruments checklist. Data were synthesized narratively and presented in tables. Forty-one studies were included for analysis. Factors identified that may influence reliability are education and training, the level of consciousness, and type of stimuli used. Conflicting results were found for experience of the observer, the pathology causing the reduced consciousness, and intubation/sedation. No clear influence was found for the professional background of observers. Reliability of the GCS is influenced by multiple factors and as such is context dependent. This review points to the potential for improvement from training and education and standardization of assessment methods, for which recommendations are presented. Copyright © 2017 by the Congress of Neurological Surgeons.
NASA Astrophysics Data System (ADS)
Chakraborty, A.; Goto, H.
2017-12-01
The 2011 off the Pacific coast of Tohoku earthquake caused severe damage in many areas far inland because of site amplification. Furukawa district in Miyagi Prefecture, Japan recorded significant spatial differences in ground motion even at sub-kilometer scales. The site responses in the damage zone far exceeded the levels in the hazard maps. One reason for the mismatch is that such maps follow only the mean value at the measurement locations, with no regard to the data uncertainties, and thus are not always reliable. Our research objective is to develop a methodology that incorporates data uncertainties into mapping and yields a reliable map. The methodology is based on hierarchical Bayesian modeling of normally distributed site responses in space, where the mean (μ), site-specific variance (σ2) and between-sites variance (s2) parameters are treated as unknowns with a prior distribution. The observation data are artificially created site responses with varying means and variances for 150 seismic events across 50 locations in one-dimensional space. Spatially auto-correlated random effects were added to the mean (μ) using a conditionally autoregressive (CAR) prior. Inferences on the unknown parameters are obtained from the posterior distribution using Markov chain Monte Carlo methods. The goal is to find reliable estimates of μ that are sensitive to uncertainties. During initial trials, we observed that the tau (=1/s2) parameter of the CAR prior controls the μ estimation. Using a constraint, s = 1/(k×σ), five spatial models with varying k-values were created. We define reliability to be measured by the model likelihood and propose the maximum-likelihood model as the most reliable. The model with maximum likelihood was selected using a 5-fold cross-validation technique. The results show that the maximum-likelihood model (μ*) follows the site-specific mean at low uncertainties and converges to the model mean at higher uncertainties (Fig. 1). This result is significant because it successfully incorporates the effect of data uncertainties in mapping. The approach can be applied to any research field that uses mapping techniques. The methodology is now being applied to real records from a very dense seismic network in Furukawa district, Miyagi Prefecture, Japan to generate a reliable map of the site responses.
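The behavior reported for μ* (tracking the site-specific mean when its uncertainty is small, shrinking toward the model mean when it is large) can be illustrated with a simple precision-weighted estimate; the sketch below is a stand-in for the full hierarchical CAR model and uses invented numbers.

```python
import numpy as np

# Simplified illustration of hierarchical shrinkage: a reliable site estimate
# is a precision-weighted blend of the site-specific sample mean and the
# regional (model) mean. All numbers are invented.
model_mean = 1.0                                   # regional mean response
s2 = 0.3 ** 2                                      # between-sites variance
site_means = np.array([1.6, 0.7, 1.2])             # observed site means
site_sigma2 = np.array([0.05, 0.5, 2.0]) ** 2      # per-site data variance
n_events = 150

var_of_mean = site_sigma2 / n_events               # uncertainty of each site mean
w_site = (1.0 / var_of_mean) / (1.0 / var_of_mean + 1.0 / s2)
mu_star = w_site * site_means + (1.0 - w_site) * model_mean
print(mu_star)   # low-uncertainty sites keep their own mean; noisy sites
                 # are pulled toward model_mean
```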
van Exel, Job; Baker, Rachel; Mason, Helen; Donaldson, Cam; Brouwer, Werner
2015-02-01
Resources available to the health care sector are finite and typically insufficient to fulfil all the demands for health care in the population. Decisions must be made about which treatments to provide. Relatively little is known about the views of the general public regarding the principles that should guide such decisions. We present the findings of a Q methodology study designed to elicit the shared views of the general public across ten countries regarding the appropriate principles for prioritising health care resources. In 2010, 294 respondents rank-ordered a set of cards, and the results were subjected to by-person factor analysis to identify common patterns in sorting. Five distinct viewpoints were identified: (I) "Egalitarianism, entitlement and equality of access"; (II) "Severity and the magnitude of health gains"; (III) "Fair innings, young people and maximising health benefits"; (IV) "The intrinsic value of life and healthy living"; (V) "Quality of life is more important than simply staying alive". Given the plurality of views on the principles for health care priority setting, no single equity principle can be used to underpin health care priority setting. Hence, the process of decision making becomes more important, and these multiple perspectives in society should arguably be reflected in it in some way. Copyright © 2014 Elsevier Ltd. All rights reserved.
Schueller, Stephen M; Riley, William T; Brown, C Hendricks; Cuijpers, Pim; Duan, Naihua; Kwasny, Mary J; Stiles-Shields, Colleen; Cheung, Ken
2015-01-01
In recent years, there has been increasing discussion of the limitations of traditional randomized controlled trial (RCT) methodologies for the evaluation of eHealth and mHealth interventions, and in particular, the requirement that these interventions be locked down during evaluation. Locking down these interventions locks in defects and eliminates the opportunities for quality improvement and adaptation to the changing technological environment, often leading to validation of tools that are outdated by the time that trial results are published. Furthermore, because behavioral intervention technologies change frequently during real-world deployment, even if a tested intervention were deployed in the real world, its shelf life would be limited. We argue that RCTs will have greater scientific and public health value if they focus on the evaluation of intervention principles (rather than a specific locked-down version of the intervention), allowing for ongoing quality improvement modifications to the behavioral intervention technology based on the core intervention principles, while continuously improving the functionality and maintaining technological currency. This paper is an initial proposal of a framework and methodology for the conduct of trials of intervention principles (TIPs) aimed at minimizing the risks of in-trial changes to intervention technologies and maximizing the potential for knowledge acquisition. The focus on evaluation of intervention principles using clinical and usage outcomes has the potential to provide more generalizable and durable information than trials focused on a single intervention technology. PMID:26155878
Mohr, David C; Schueller, Stephen M; Riley, William T; Brown, C Hendricks; Cuijpers, Pim; Duan, Naihua; Kwasny, Mary J; Stiles-Shields, Colleen; Cheung, Ken
2015-07-08
In recent years, there has been increasing discussion of the limitations of traditional randomized controlled trial (RCT) methodologies for the evaluation of eHealth and mHealth interventions, and in particular, the requirement that these interventions be locked down during evaluation. Locking down these interventions locks in defects and eliminates the opportunities for quality improvement and adaptation to the changing technological environment, often leading to validation of tools that are outdated by the time that trial results are published. Furthermore, because behavioral intervention technologies change frequently during real-world deployment, even if a tested intervention were deployed in the real world, its shelf life would be limited. We argue that RCTs will have greater scientific and public health value if they focus on the evaluation of intervention principles (rather than a specific locked-down version of the intervention), allowing for ongoing quality improvement modifications to the behavioral intervention technology based on the core intervention principles, while continuously improving the functionality and maintaining technological currency. This paper is an initial proposal of a framework and methodology for the conduct of trials of intervention principles (TIPs) aimed at minimizing the risks of in-trial changes to intervention technologies and maximizing the potential for knowledge acquisition. The focus on evaluation of intervention principles using clinical and usage outcomes has the potential to provide more generalizable and durable information than trials focused on a single intervention technology.
Man-Machine Communication in Remote Manipulation: Task-Oriented Supervisory Command Language (TOSC).
1980-03-01
Excerpt (table of contents fragments, page numbers omitted): ...Oriented Supervisory Control System Methodology; 3.1 Overview; 3.2 Background; 3.2.1 General; 3.2.2 Preliminary Principles of Command Language ...Design; 3.2.3 Preliminary Principles of Feedback Display Design; 3.3 Man-Machine Communication Models; 3.3.1 Background; 3.3.2 Adapted... Abstract fragment: ...and feedback mode. The work ends with the presentation of a performance prediction model and a set of principles and guidelines, applicable to the
Learning on the Move: A Reassessment of Mobility through the Lens of Bateson's Learning Theory
ERIC Educational Resources Information Center
Janand, Anne; Notais, Amélie
2018-01-01
Purpose: This paper aims to explore the types of learning engendered by internal mobility (IM) by referring to the principles elaborated by Bateson (1972). Design/methodology/approach: A qualitative methodology is followed with interviews among 50 professionals working at four large French firms. Findings: A system of classification for IM is…
Correcting Fallacies in Validity, Reliability, and Classification
ERIC Educational Resources Information Center
Sijtsma, Klaas
2009-01-01
This article reviews three topics from test theory that continue to raise discussion and controversy and capture test theorists' and constructors' interest. The first topic concerns the discussion of the methodology of investigating and establishing construct validity; the second topic concerns reliability and its misuse, alternative definitions…
76 FR 3604 - Information Collection; Qualified Products List for Engine Driven Pumps
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-20
... levels. 2. Reliability and endurance requirements. These requirements include a 100-hour endurance test... evaluated to meet specific requirements related to safety, effectiveness, efficiency, and reliability of the... of the collection of information, including the validity of the methodology and assumptions used; (3...
Alppay, Cem; Bayazit, Nigan
2015-11-01
In this paper, we study the arrangement of displays in flight instrument panels of multi-purpose civil helicopters following a user-centered design method based on ergonomics principles. Our methodology can also be described as a user-interface arrangement methodology based on user opinions and preferences. This study can be outlined as gathering user-centered data using two different research methods and then analyzing and integrating the collected data to come up with an optimal instrument panel design. An interview with helicopter pilots formed the first step of our research. In that interview, pilots were asked to provide a quantitative evaluation of basic interface arrangement principles. In the second phase of the research, a paper prototyping study was conducted with the same pilots. The final phase of the study entailed synthesizing the findings from the interviews and observational studies to formulate an optimal flight instrument arrangement methodology. The primary results that we present in our paper are the methodology that we developed and three new interface arrangement concepts, namely relationship of inseparability, integrated value and locational value. An optimum instrument panel arrangement is also proposed by the researchers. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Methodology for building confidence measures
NASA Astrophysics Data System (ADS)
Bramson, Aaron L.
2004-04-01
This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
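A minimal sketch of this kind of propagation, assuming independent sources combined with a noisy-OR and importance-weighted elements (reliabilities and weights are invented, not the Laboratory's actual algorithm):

```python
# Hedged sketch: propagate per-source truth reliabilities to a combined
# confidence. Independent corroborating sources are combined with a noisy-OR;
# element importances then weight the overall output confidence.
# Reliabilities and importances below are invented for illustration.

def combine_sources(reliabilities):
    """Probability that at least one independent source is correct."""
    p_all_wrong = 1.0
    for p in reliabilities:
        p_all_wrong *= (1.0 - p)
    return 1.0 - p_all_wrong

elements = {
    # element: ([source reliabilities], importance weight)
    "aircraft_type": ([0.7, 0.6], 0.5),
    "location":      ([0.9],      0.3),
    "intent":        ([0.5, 0.4], 0.2),
}

confidence = sum(w * combine_sources(ps) for ps, w in elements.values())
print(round(confidence, 3))
```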
NASA Technical Reports Server (NTRS)
Hoppa, Mary Ann; Wilson, Larry W.
1994-01-01
There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Brunett, Acacia J.; Passerini, Stefano
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory (Argonne) participated in a two year collaboration to modernize and update the probabilistic risk assessment (PRA) for the PRISM sodium fast reactor. At a high level, the primary outcome of the project was the development of a next-generation PRA that is intended to enable risk-informed prioritization of safety- and reliability-focused research and development. A central Argonne task during this project was a reliability assessment of passive safety systems, which included the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedbacks of the metal fuel core. Both systems were examined utilizing a methodology derived from the Reliability Method for Passive Safety Functions (RMPS), with an emphasis on developing success criteria based on mechanistic system modeling while also maintaining consistency with the Fuel Damage Categories (FDCs) of the mechanistic source term assessment. This paper provides an overview of the reliability analyses of both systems, including highlights of the FMEAs, the construction of best-estimate models, uncertain parameter screening and propagation, and the quantification of system failure probability. In particular, special focus is given to the methodologies to perform the analysis of uncertainty propagation and the determination of the likelihood of violating FDC limits. Additionally, important lessons learned are also reviewed, such as optimal sampling methodologies for the discovery of low likelihood failure events and strategies for the combined treatment of aleatory and epistemic uncertainties.
Reliability analysis of repairable systems using Petri nets and vague Lambda-Tau methodology.
Garg, Harish
2013-01-01
The main objective of the paper is to develop a methodology, named vague Lambda-Tau, for reliability analysis of repairable systems. A Petri net tool is applied to represent the asynchronous and concurrent processing of the system instead of fault tree analysis. To enhance the relevance of the reliability study, vague set theory is used for representing the failure rate and repair times instead of classical (crisp) or fuzzy set theory, because vague sets are characterized by a truth membership function and a false (non-)membership function whose sum is less than 1. The proposed methodology involves qualitative modeling using PN and quantitative analysis using the Lambda-Tau method of solution, with the basic events represented by intuitionistic fuzzy numbers with triangular membership functions. Sensitivity analysis has also been performed and the effects on system MTBF are addressed. The methodology improves on the shortcomings of the existing probabilistic approaches and gives a better understanding of the system behavior through its graphical representation. The washing unit of a paper mill situated in the northern part of India, producing approximately 200 tons of paper per day, has been considered to demonstrate the proposed approach. The results may be helpful for plant personnel for analyzing the systems' behavior and improving their performance by adopting suitable maintenance strategies. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
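To give a flavor of the interval arithmetic that underlies fuzzy and vague Lambda-Tau computations, the sketch below propagates triangular failure-rate numbers through a simple series system using alpha-cuts. It is a simplified illustration with invented rates; the paper's vague-set formulation additionally tracks a separate non-membership function, which is omitted here.

```python
import numpy as np

# Alpha-cut interval arithmetic on triangular fuzzy failure rates
# (low, mode, high). For a series system the component failure rates add,
# so each alpha-cut interval of the system rate is the sum of the
# component intervals. Rates are invented, in failures per hour.

def alpha_cut(tri, alpha):
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

lambdas = [(1e-4, 2e-4, 4e-4), (5e-5, 1e-4, 3e-4)]

for alpha in np.linspace(0.0, 1.0, 5):
    lo = sum(alpha_cut(l, alpha)[0] for l in lambdas)
    hi = sum(alpha_cut(l, alpha)[1] for l in lambdas)
    print(f"alpha={alpha:.2f}  system failure rate in [{lo:.2e}, {hi:.2e}]")
```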
Larsen, Camilla Marie; Juul-Kristensen, Birgit; Lund, Hans; Søgaard, Karen
2014-10-01
The aims were to compile a schematic overview of clinical scapular assessment methods and to critically appraise the methodological quality of the involved studies. A systematic, computer-assisted literature search using Medline, CINAHL, SportDiscus and EMBASE was performed from inception to October 2013. Reference lists in articles were also screened for publications. From 50 articles, 54 method names were identified and categorized into three groups: (1) Static positioning assessment (n = 19); (2) Semi-dynamic (n = 13); and (3) Dynamic functional assessment (n = 22). Fifteen studies were excluded from evaluation because they reported no or few clinimetric results, leaving 35 studies for evaluation. Graded according to the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist, the methodological quality in the reliability and validity domains was "fair" (57%) to "poor" (43%), with only one study rated as "good". The reliability domain was most often investigated. Few of the assessment methods in the included studies that had "fair" or "good" measurement property ratings demonstrated acceptable results for both reliability and validity. We found a substantially larger number of clinical scapular assessment methods than previously reported. Using the COSMIN checklist, the methodological quality of the included measurement properties in the reliability and validity domains was in general "fair" to "poor". None were examined for all three domains: (1) reliability; (2) validity; and (3) responsiveness. Observational evaluation systems and assessment of scapular upward rotation seem suitably evidence-based for clinical use. Future studies should test and improve the clinimetric properties, and especially diagnostic accuracy and responsiveness, to increase utility for clinical practice.
Assurance of Learning, "Closing the Loop": Utilizing a Pre and Post Test for Principles of Finance
ERIC Educational Resources Information Center
Flanegin, Frank; Letterman, Denise; Racic, Stanko; Schimmel, Kurt
2010-01-01
Since there is no standard national Pre and Post Test for Principles of Finance, akin to the one for Economics, the authors created one by selecting questions from previously administered examinations. The Cronbach's alpha of 0.851, exceeding the minimum of 0.70 for a reliable pen-and-paper test, indicates that our Test can detect differences in…
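For reference, Cronbach's alpha is the ratio-of-variances statistic sketched below; the item-response matrix is made up, but the formula is the standard one behind a figure such as 0.851.

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_students, n_items) matrix of item scores (e.g., 0/1)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)          # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)      # variance of total score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Made-up 0/1 responses: 6 students x 4 items.
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
]
print(round(cronbach_alpha(responses), 3))
```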
Mani, Suresh; Sharma, Shobha; Omar, Baharudin; Paungmali, Aatit; Joseph, Leonard
2017-04-01
Purpose: The purpose of this review is to systematically explore and summarise the validity and reliability of telerehabilitation (TR)-based physiotherapy assessment for musculoskeletal disorders. Method: A comprehensive systematic literature review was conducted using a number of electronic databases: PubMed, EMBASE, PsycINFO, Cochrane Library and CINAHL, covering publications between January 2000 and May 2015. Studies that examined the validity and the inter- and intra-rater reliability of TR-based physiotherapy assessment for musculoskeletal conditions were included. Two independent reviewers used the Quality Appraisal Tool for studies of diagnostic Reliability (QAREL) and the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool to assess the methodological quality of reliability and validity studies, respectively. Results: A total of 898 hits were retrieved, of which 11 articles meeting the inclusion criteria were reviewed. Nine studies explored the concurrent validity and the inter- and intra-rater reliabilities, while two studies examined only the concurrent validity. Reviewed studies were moderate to good in methodological quality. Physiotherapy assessments such as pain, swelling, range of motion, muscle strength, balance, gait and functional assessment demonstrated good concurrent validity. However, the reported concurrent validity of lumbar spine posture, special orthopaedic tests, neurodynamic tests and scar assessments ranged from low to moderate. Conclusion: TR-based physiotherapy assessment was technically feasible with overall good concurrent validity and excellent reliability, except for lumbar spine posture, orthopaedic special tests, neurodynamic tests and scar assessment.
Reliability Issues and Solutions in Flexible Electronics Under Mechanical Fatigue
NASA Astrophysics Data System (ADS)
Yi, Seol-Min; Choi, In-Suk; Kim, Byoung-Joon; Joo, Young-Chang
2018-07-01
Flexible devices are of significant interest due to their potential to expand the application of smart devices into various fields, such as energy harvesting, biological applications and consumer electronics. Due to the mechanically dynamic operation of flexible electronics, their mechanical reliability must be thoroughly investigated to understand their failure mechanisms and lifetimes. The reliability issues caused by bending fatigue, one of the typical operational limitations of flexible electronics, have been studied using various test methodologies; however, the electromechanical evaluations that are essential to assess the reliability of electronic devices for flexible applications had not been investigated because no testing method was established. By employing the in situ bending fatigue test, we have studied the failure mechanism for various conditions and parameters, such as bending strain, fatigue area, film thickness, and lateral dimensions. Moreover, various methods for improving the bending reliability have been developed based on the failure mechanism. Nanostructures such as holes, pores, wires and composites of nanoparticles and nanotubes have been suggested for better reliability. Flexible devices were also investigated to find the potential failures initiated by complex structures under bending fatigue strain. In this review, the recent advances in test methodology, mechanism studies, and practical applications are introduced. Additionally, perspectives including the future advance toward stretchable electronics are discussed based on the current achievements in research.
Igras, Susan; Diakité, Mariam; Lundgren, Rebecka
2017-07-01
In West Africa, social factors influence whether couples with unmet need for family planning act on birth-spacing desires. Tékponon Jikuagou is testing a social network-based intervention to reduce social barriers by diffusing new ideas. Individuals and groups judged socially influential by their communities provide entrée to networks. A participatory social network mapping methodology was designed to identify these diffusion actors. Analysis of monitoring data, in-depth interviews, and evaluation reports assessed the methodology's acceptability to communities and staff and whether it produced valid, reliable data to identify influential individuals and groups who diffuse new ideas through their networks. Results indicated the methodology's acceptability. Communities were actively and equitably engaged. Staff appreciated its ability to yield timely, actionable information. The mapping methodology also provided valid and reliable information by enabling communities to identify highly connected and influential network actors. Consistent with social network theory, this methodology resulted in the selection of informal groups and individuals in both informal and formal positions. In-depth interview data suggest these actors were diffusing new ideas, further confirming their influence/connectivity. The participatory methodology generated insider knowledge of who has social influence, challenging commonly held assumptions. Collecting and displaying information fostered staff and community learning, laying groundwork for social change.
Are seismic hazard assessment errors and earthquake surprises unavoidable?
NASA Astrophysics Data System (ADS)
Kossobokov, Vladimir
2013-04-01
Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time span of physically reliable seismic history is still only a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes any kind of reliable probabilistic statement about narrowly localized seismic hazard premature. Moreover, the seismic evidence accumulated to date clearly demonstrates that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used in analytically tractable models or computer simulations, and it complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can MISLEAD TO SCIENTIFICALLY GROUNDLESS APPLICATION, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes, unfortunately, discloses the gross inadequacy of this "probabilistic" product, which appears UNACCEPTABLE FOR ANY KIND OF RESPONSIBLE SEISMIC RISK EVALUATION AND KNOWLEDGEABLE DISASTER PREVENTION. The self-evident shortcomings and failures of GSHAP call on all earthquake scientists and engineers to urgently revise the global seismic hazard maps from first principles, including the background methodologies involved, so that there is: (a) a demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete learning of the actual range of earthquake hazards to local communities and populations; and (c) a more ethically responsible control over how seismic hazard and seismic risk are implemented to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically thanks to the available geological, geomorphologic, seismic, and tectonic evidence and data, combined with deterministic pattern recognition methodologies, specifically when intending to PREDICT THE PREDICTABLE, but not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks and faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades. It proves that contemporary science can do a better job in disclosing natural hazards, assessing risks, and delivering such information in advance of extreme catastrophes, which are LOW-PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must begin shifting the minds of the community from pessimistic disbelief to the optimistic challenge of neo-deterministic hazard predictability.
ERIC Educational Resources Information Center
Brown, M. E.; Phillpotts, C. A. R.
1978-01-01
Discusses the principle of nonisothermal kinetics and some of the factors involved in such reactions, especially when considering the reliability of the kinetic parameters, compared to those of isothermal conditions. (GA)
Scientific white paper on concentration-QTc modeling.
Garnett, Christine; Bonate, Peter L; Dang, Qianyu; Ferber, Georg; Huang, Dalong; Liu, Jiang; Mehrotra, Devan; Riley, Steve; Sager, Philip; Tornoe, Christoffer; Wang, Yaning
2018-06-01
The International Council for Harmonisation revised the E14 guideline through the questions and answers process to allow concentration-QTc (C-QTc) modeling to be used as the primary analysis for assessing the QTc interval prolongation risk of new drugs. A well-designed and conducted QTc assessment based on C-QTc modeling in early phase 1 studies can be an alternative approach to a thorough QT study for some drugs to reliably exclude clinically relevant QTc effects. This white paper provides recommendations on how to plan and conduct a definitive QTc assessment of a drug using C-QTc modeling in early phase clinical pharmacology and thorough QT studies. Topics covered include: important study design features in a phase 1 study; modeling objectives and approach; exploratory plots; the pre-specified linear mixed effects model; general principles for model development and evaluation; and expectations for modeling analysis plans and reports. The recommendations are based on current best modeling practices, scientific literature and personal experiences of the authors. These recommendations are expected to evolve as their implementation during drug development provides additional data and with advances in analytical methodology.
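A minimal sketch of such a pre-specified linear mixed-effects C-QTc model, using statsmodels on simulated data (all names, units and numbers are illustrative, not the white paper's exact specification): baseline-corrected ΔQTc is regressed on concentration with a random intercept per subject, and the slope with a two-sided confidence interval is what drives the assessment.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative concentration-QTc analysis on simulated data (all values
# invented): baseline-corrected QTc change (dqtc, ms) is regressed on drug
# concentration with a random intercept per subject.
rng = np.random.default_rng(1)
n_subj, n_obs = 24, 6
subj = np.repeat(np.arange(n_subj), n_obs)
conc = rng.uniform(0, 1000, size=n_subj * n_obs)            # ng/mL
between_subj = rng.normal(0, 3, size=n_subj)[subj]          # subject-level shifts
dqtc = 1.0 + 0.004 * conc + between_subj + rng.normal(0, 5, size=n_subj * n_obs)

df = pd.DataFrame({"subject": subj, "conc": conc, "dqtc": dqtc})
fit = smf.mixedlm("dqtc ~ conc", data=df, groups=df["subject"]).fit()
slope = fit.params["conc"]
lo, hi = fit.conf_int(alpha=0.10).loc["conc"]                # two-sided 90% CI
cmax = 800.0                                                 # hypothetical Cmax
print(f"slope: {slope:.4f} ms per ng/mL (90% CI {lo:.4f} to {hi:.4f})")
print(f"predicted effect at Cmax: {fit.params['Intercept'] + slope * cmax:.1f} ms")
```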
The role of emotions in moral case deliberation: theory, practice, and methodology.
Molewijk, Bert; Kleinlugtenbelt, Dick; Widdershoven, Guy
2011-09-01
In clinical moral decision making, emotions often play an important role. However, many clinical ethicists are ignorant, suspicious or even critical of the role of emotions in making moral decisions and in reflecting on them. This raises practical and theoretical questions about the understanding and use of emotions in clinical ethics support services. This paper presents an Aristotelian view on emotions and describes its application in the practice of moral case deliberation. According to Aristotle, emotions are an original and integral part of (virtue) ethics. Emotions are an inherent part of our moral reasoning and being, and therefore they should be an inherent part of any moral deliberation. Based on Aristotle's view, we examine five specific aspects of emotions: the description of emotions, the attitude towards emotions, the thoughts present in emotions, the reliability of emotions, and the reasonable principle that guides an emotion. We then discuss three ways of dealing with emotions in the process of moral case deliberation. Finally, we present an Aristotelian conversation method, and present practical experiences using this method. © 2011 Blackwell Publishing Ltd.
NASA Astrophysics Data System (ADS)
Bogusz, Michael
1993-01-01
The need for a systematic methodology for the analysis of aircraft electromagnetic compatibility (EMC) problems is examined. The available computer aids used in aircraft EMC analysis are assessed, and a theoretical basis is established for the complex algorithms which identify and quantify electromagnetic interactions. An overview is presented of one particularly well-established aircraft antenna-to-antenna EMC analysis code, the Aircraft Inter-Antenna Propagation with Graphics (AAPG) Version 07 software. The specific new algorithms created to compute cone geodesics and their associated path losses and to graph the physical coupling path are discussed. These algorithms are validated against basic principles. Loss computations apply the uniform geometrical theory of diffraction and are subsequently compared to measurement data. The increased modelling and analysis capabilities of the newly developed AAPG Version 09 are compared to those of Version 07. Several models of real aircraft, namely the Electronic Systems Trainer Challenger, are generated and provided as a basis for this preliminary comparative assessment. Issues such as software reliability, algorithm stability, and quality of hardcopy output are also discussed.
NASA Astrophysics Data System (ADS)
Singaravelu, J.; Sundaresan, S.; Nageswara Rao, B.
2013-04-01
This article presents a methodology for evaluation of the proof load factor (PLF) for a clamp band system (CBS) made of M250 maraging steel, following fracture mechanics principles. The CBS is most widely used as a structural element and as a separation system. Using Taguchi's design of experiments and the response surface method (RSM), compact tension specimens were tested to establish an empirical relation for the failure load (Pmax) in terms of the ultimate strength, width, thickness, and initial crack length. The test results for Pmax closely matched the developed RSM empirical relation. Crack growth rates of the maraging steel in different environments were examined. The fracture strength (σf) of center surface crack and through-crack tension specimens is evaluated utilizing the fracture toughness (KIC). The stress induced in the Marman band under flight loading conditions is evaluated to estimate the higher load factor and the PLF. Statistical safety factor and reliability assessments were made for the specified flaw sizes, useful in the development of a fracture control plan for the CBS of launch vehicles.
Depth and thermal sensor fusion to enhance 3D thermographic reconstruction.
Cao, Yanpeng; Xu, Baobei; Ye, Zhangyu; Yang, Jiangxin; Cao, Yanlong; Tisse, Christel-Loic; Li, Xin
2018-04-02
Three-dimensional geometrical models with incorporated surface temperature data provide important information for various applications such as medical imaging, energy auditing, and intelligent robots. In this paper we present a robust method for mobile and real-time 3D thermographic reconstruction through depth and thermal sensor fusion. A multimodal imaging device consisting of a thermal camera and an RGB-D sensor is calibrated geometrically and used for data capturing. Based on the underlying principle that temperature information remains robust against illumination and viewpoint changes, we present a Thermal-guided Iterative Closest Point (T-ICP) methodology to facilitate reliable 3D thermal scanning applications. The pose of the sensing device is initially estimated using correspondences found by maximizing the thermal consistency between consecutive infrared images. The coarse pose estimate is further refined by finding the motion parameters that minimize a combined geometric and thermographic loss function. Experimental results demonstrate that complementary information captured by multimodal sensors can be utilized to improve the performance of 3D thermographic reconstruction. Through effective fusion of thermal and depth data, the proposed approach generates more accurate 3D thermal models using significantly less scanning data.
Semiclassical Path Integral Calculation of Nonlinear Optical Spectroscopy.
Provazza, Justin; Segatta, Francesco; Garavelli, Marco; Coker, David F
2018-02-13
Computation of nonlinear optical response functions allows for an in-depth connection between theory and experiment. Experimentally recorded spectra provide a high density of information, but to objectively disentangle overlapping signals and to reach a detailed and reliable understanding of the system dynamics, measurements must be integrated with theoretical approaches. Here, we present a new, highly accurate and efficient trajectory-based semiclassical path integral method for computing higher order nonlinear optical response functions for non-Markovian open quantum systems. The approach is, in principle, applicable to general Hamiltonians and does not require any restrictions on the form of the intrasystem or system-bath couplings. This method is systematically improvable and is shown to be valid in parameter regimes where perturbation theory-based methods qualitatively break down. As a test of the methodology presented here, we study a system-bath model for a coupled dimer, for which we compare against numerically exact results and standard approximate perturbation theory-based calculations. Additionally, we study a monomer with discrete vibronic states that serves as the starting point for future investigation of vibronic signatures in nonlinear electronic spectroscopy.
Attya, Mohamed; Benabdelkamel, Hicham; Perri, Enzo; Russo, Anna; Sindona, Giovanni
2010-12-01
The quality of olive oils is sensorially tested by accurate and well-established methods, which enable the classification of the pressed oils into the classes of extra virgin oil, virgin oil and lampante oil. Nonetheless, it would be convenient to have analytical methods for screening oils or supporting sensorial analysis using a reliable independent approach based on the exploitation of mass spectrometric methodologies. A number of methods have been proposed to evaluate deficiencies of extra virgin olive oils resulting from inappropriate technological treatments, such as high- or low-temperature deodoration, and home cooking processes. The quality and nutraceutical value of extra virgin olive oil (EVOO) can be related to the antioxidant property of its phenolic compounds. Olive oil is a source of at least 30 phenolic compounds, such as oleuropein, oleocanthal, hydroxytyrosol, and tyrosol, all acting as strong antioxidants, radical scavengers and NSAID-like drugs. We now report the efficacy of MRM tandem mass spectrometry, assisted by the isotope dilution assay, in the evaluation of the thermal stability of selected active principles of extra virgin olive oil.
NASA Astrophysics Data System (ADS)
Tadano, Terumasa; Tsuneyuki, Shinji
2015-08-01
We present an ab initio framework to calculate anharmonic phonon frequency and phonon lifetime that is applicable to severely anharmonic systems. We employ self-consistent phonon (SCPH) theory with microscopic anharmonic force constants, which are extracted from density functional calculations using the least absolute shrinkage and selection operator technique. We apply the method to the high-temperature phase of SrTiO3 and obtain well-defined phonon quasiparticles that are free from imaginary frequencies. Here we show that the anharmonic phonon frequency of the antiferrodistortive mode depends significantly on the system size near the critical temperature of the cubic-to-tetragonal phase transition. By applying perturbation theory to the SCPH result, phonon lifetimes are calculated for cubic SrTiO3, which are then employed to predict lattice thermal conductivity using the Boltzmann transport equation within the relaxation-time approximation. The presented methodology is efficient and accurate, paving the way toward a reliable description of thermodynamic, dynamic, and transport properties of systems with severe anharmonicity, including thermoelectric, ferroelectric, and superconducting materials.
DOT National Transportation Integrated Search
2004-03-01
The ability of Advanced Traveler Information Systems (ATIS) to improve the on-time reliability of urban truck movements is evaluated through the application of the Heuristic On-Line Web- : Linked Arrival Time Estimation (HOWLATE) methodology. In HOWL...
ERIC Educational Resources Information Center
Stenner, A. Jackson; Rohlf, Richard J.
The merits of generalizability theory in the formulation of construct definitions and in the determination of reliability estimates are discussed. The broadened conceptualization of reliability brought about by Cronbach's generalizability theory is reviewed. Career Maturity Inventory data from a sample of 60 ninth grade students is used to…
34 CFR 668.144 - Application for test approval.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the comparability of scores on the current test to scores on the previous test, and data from validity... explanation of the methodology and procedures for measuring the reliability of the test; (ii) Evidence that different forms of the test, including, if applicable, short forms, are comparable in reliability; (iii...
Methodological Issues in Measuring the Development of Character
ERIC Educational Resources Information Center
Card, Noel A.
2017-01-01
In this article I provide an overview of the methodological issues involved in measuring constructs relevant to character development and education. I begin with a nontechnical overview of the 3 fundamental psychometric properties of measurement: reliability, validity, and equivalence. Developing and evaluating measures to ensure evidence of all 3…
Allometric scaling theory applied to FIA biomass estimation
David C. Chojnacky
2002-01-01
Tree biomass estimates in the Forest Inventory and Analysis (FIA) database are derived from numerous methodologies whose abundance and complexity raise questions about consistent results throughout the U.S. A new model based on allometric scaling theory ("WBE") offers simplified methodology and a theoretically sound basis for improving the reliability and...
Kohlberg's Moral Judgment Scale: Some Methodological Considerations
ERIC Educational Resources Information Center
Rubin, Kenneth H.; Trotter, Kristin T.
1977-01-01
Examined 3 methodological issues in the use of Kohlberg's Moral Judgment Scale: (1) test-retest reliability, (2) consistency of moral judgment stages from one dilemma to the next, and (3) influence of subject's verbal facility on the projective test scores. Forty children in grades 3 and 5 participated. (JMB)
Zachariah, Marianne; Seidling, Hanna M; Neri, Pamela M; Cresswell, Kathrin M; Duke, Jon; Bloomrosen, Meryl; Volk, Lynn A; Bates, David W
2011-01-01
Background: Medication-related decision support can reduce the frequency of preventable adverse drug events. However, the design of current medication alerts often results in alert fatigue and high over-ride rates, thus reducing any potential benefits. Methods: The authors previously reviewed human-factors principles for relevance to medication-related decision support alerts. In this study, instrument items were developed for assessing the appropriate implementation of these human-factors principles in drug–drug interaction (DDI) alerts. User feedback regarding nine electronic medical records was considered during the development process. Content validity, construct validity through correlation analysis, and inter-rater reliability were assessed. Results: The final version of the instrument included 26 items associated with nine human-factors principles. Content validation on three systems resulted in the addition of one principle (Corrective Actions) to the instrument and the elimination of eight items. Additionally, the wording of eight items was altered. Correlation analysis suggests a direct relationship between system age and performance of DDI alerts (p=0.0016). Inter-rater reliability indicated substantial agreement between raters (κ=0.764). Conclusion: The authors developed and gathered preliminary evidence for the validity of an instrument that measures the appropriate use of human-factors principles in the design and display of DDI alerts. Designers of DDI alerts may use the instrument to improve usability and increase user acceptance of medication alerts, and organizations selecting an electronic medical record may find the instrument helpful in meeting their clinicians' usability needs. PMID:21946241
Baschung Pfister, Pierrette; de Bruin, Eling D; Tobler-Ammann, Bernadette C; Maurer, Britta; Knols, Ruud H
2015-10-01
Physical exercise seems to be a safe and effective intervention in patients with inflammatory myopathy (IM). However, the optimal training intervention is not clear. To achieve an optimum training effect, physical exercise training principles must be considered and to replicate research findings, FITT components (frequency, intensity, time, and type) of exercise training should be reported. This review aims to evaluate exercise interventions in studies with IM patients in relation to (1) the application of principles of exercise training, (2) the reporting of FITT components, (3) the adherence of participants to the intervention, and (4) to assess the methodological quality of the included studies. The literature was searched for exercise studies in IM patients. Data were extracted to evaluate the application of the training principles, the reporting of and the adherence to the exercise prescription. The Downs and Black checklist was used to assess methodological quality of the included studies. From the 14 included studies, four focused on resistance, two on endurance, and eight on combined training. In terms of principles of exercise training, 93 % reported specificity, 50 % progression and overload, and 79 % initial values. Reversibility and diminishing returns were never reported. Six articles reported all FITT components in the prescription of the training though no study described adherence to all of these components. Incomplete application of the exercise training principles and insufficient reporting of the exercise intervention prescribed and completed hamper the reproducibility of the intervention and the ability to determine the optimal dose of exercise.
Connors, Brenda L.; Rende, Richard; Colton, Timothy J.
2014-01-01
The unique yield of collecting observational data on human movement has received increasing attention in a number of domains, including the study of decision-making style. As such, interest has grown in the nuances of core methodological issues, including the best ways of assessing inter-rater reliability. In this paper we focus on one key topic – the distinction between establishing reliability for the patterning of behaviors as opposed to the computation of raw counts – and suggest that reliability for each be compared empirically rather than determined a priori. We illustrate by assessing inter-rater reliability for key outcome measures derived from movement pattern analysis (MPA), an observational methodology that records body movements as indicators of decision-making style with demonstrated predictive validity. While reliability ranged from moderate to good for raw counts of behaviors reflecting each of two Overall Factors generated within MPA (Assertion and Perspective), inter-rater reliability for patterning (proportional indicators of each factor) was significantly higher and excellent (ICC = 0.89). Furthermore, patterning, as compared to raw counts, provided better prediction of observable decision-making process assessed in the laboratory. These analyses support the utility of using an empirical approach to inform the consideration of measuring patterning versus discrete behavioral counts of behaviors when determining inter-rater reliability of observable behavior. They also speak to the substantial reliability that may be achieved via application of theoretically grounded observational systems such as MPA that reveal thinking and action motivations via visible movement patterns. PMID:24999336
Connors, Brenda L; Rende, Richard; Colton, Timothy J
2014-01-01
The unique yield of collecting observational data on human movement has received increasing attention in a number of domains, including the study of decision-making style. As such, interest has grown in the nuances of core methodological issues, including the best ways of assessing inter-rater reliability. In this paper we focus on one key topic - the distinction between establishing reliability for the patterning of behaviors as opposed to the computation of raw counts - and suggest that reliability for each be compared empirically rather than determined a priori. We illustrate by assessing inter-rater reliability for key outcome measures derived from movement pattern analysis (MPA), an observational methodology that records body movements as indicators of decision-making style with demonstrated predictive validity. While reliability ranged from moderate to good for raw counts of behaviors reflecting each of two Overall Factors generated within MPA (Assertion and Perspective), inter-rater reliability for patterning (proportional indicators of each factor) was significantly higher and excellent (ICC = 0.89). Furthermore, patterning, as compared to raw counts, provided better prediction of observable decision-making process assessed in the laboratory. These analyses support the utility of using an empirical approach to inform the consideration of measuring patterning versus discrete behavioral counts of behaviors when determining inter-rater reliability of observable behavior. They also speak to the substantial reliability that may be achieved via application of theoretically grounded observational systems such as MPA that reveal thinking and action motivations via visible movement patterns.
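As a reference for the ICC figures quoted in these two records, the sketch below computes a Shrout and Fleiss ICC(2,1) (two-way random effects, absolute agreement, single rater) from a small made-up rater-by-subject matrix; it is not the authors' data.

```python
import numpy as np

def icc_2_1(ratings):
    """Shrout & Fleiss ICC(2,1): two-way random effects, absolute agreement,
    single rater. `ratings` is an (n_subjects, k_raters) matrix."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
    ss_err = ((x - x.mean(axis=1, keepdims=True)
                 - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Made-up proportional "patterning" scores from 2 raters across 6 subjects.
ratings = [[0.61, 0.65], [0.42, 0.40], [0.75, 0.73],
           [0.30, 0.35], [0.55, 0.58], [0.68, 0.66]]
print(round(icc_2_1(ratings), 2))
```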
Real time and in vivo monitoring of nitric oxide by electrochemical sensors--from dream to reality.
Zhang, Xueji
2004-09-01
Nitric oxide is a key intercellular messenger in the human and animal body. The identification of nitric oxide (NO) as the endothelium-derived relaxing factor (EDRF) has driven an enormous effort to further elucidate the chemistry, biology and therapeutic actions of this important molecule. It has been found that nitric oxide is involved in many disease states, such as chronic heart failure, stroke, and impotence (erectile dysfunction). The bioactivity of nitric oxide is intrinsically linked to its diffusion from its site of production to its sites of action. Accurate, reliable, real-time detection of NO in various biological systems is therefore crucial to understanding its biological role. However, the instability of NO in aqueous solution and its high reactivity with other molecules can cause difficulties for its measurement, depending on the detection method employed. Although a variety of methods have been described to measure NO in aqueous environments, it is now generally accepted that electrochemical (amperometric) detection using NO-specific electrodes is the most reliable and sensitive technique available for real-time in situ detection of NO. In 1992 the first commercial NO electrode-based amperometric detection system was developed by WPI. The system has been used successfully for a number of years in a wide range of research applications, both in vitro and in vivo. Recently, many new electrochemical nitric oxide sensors have been invented and commercialized. Here we describe some of the background principles of NO sensor design, methodology and their applications.
Dai, Sheng-Yun; Xu, Bing; Zhang, Yi; Li, Jian-Yu; Sun, Fei; Shi, Xin-Yuan; Qiao, Yan-Jiang
2016-09-01
Coptis chinensis (Huanglian) is a commonly used traditional Chinese medicine (TCM) herb, and alkaloids are its most important chemical constituents. In the present study, an isocratic reverse-phase high performance liquid chromatography (RP-HPLC) method allowing the separation of six alkaloids in Huanglian was developed for the first time under quality by design (QbD) principles. First, five chromatographic parameters were identified to construct a Plackett-Burman experimental design. The critical resolution, analysis time, and peak width were the responses, modeled by multivariate linear regression. The results showed that the percentage of acetonitrile, the concentration of sodium dodecyl sulfate, and the concentration of potassium phosphate monobasic were statistically significant parameters (P < 0.05). Then, the Box-Behnken experimental design was applied to further evaluate the interactions between the three parameters on the selected responses. Full quadratic models were built and used to establish the analytical design space. Moreover, the reliability of the design space was estimated by the Bayesian posterior predictive distribution. The optimal separation was predicted at 40% acetonitrile, 1.7 g·mL(-1) of sodium dodecyl sulfate and 0.03 mol·mL(-1) of potassium phosphate monobasic. Finally, the accuracy profile methodology was used to validate the established HPLC method. The results demonstrated that the QbD concept could be efficiently used to develop a robust RP-HPLC analytical method for Huanglian. Copyright © 2016 China Pharmaceutical University. Published by Elsevier B.V. All rights reserved.
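To illustrate the response-surface step, the sketch below fits a full quadratic model of one response (say, critical resolution) in three coded factors using scikit-learn; the design points and responses are invented, and the Bayesian predictive check used in the study is not shown.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Illustrative full quadratic response-surface fit for one response (critical
# resolution) in three coded factors: acetonitrile %, SDS concentration,
# KH2PO4 concentration. Design points and responses are invented.
X = np.array([
    [-1, -1,  0], [ 1, -1,  0], [-1,  1,  0], [ 1,  1,  0],
    [-1,  0, -1], [ 1,  0, -1], [-1,  0,  1], [ 1,  0,  1],
    [ 0, -1, -1], [ 0,  1, -1], [ 0, -1,  1], [ 0,  1,  1],
    [ 0,  0,  0], [ 0,  0,  0], [ 0,  0,  0],
], dtype=float)                                  # Box-Behnken-style coded levels
y = np.array([1.2, 1.8, 1.5, 2.4, 1.1, 1.9, 1.4, 2.2,
              1.3, 1.6, 1.5, 1.9, 2.0, 2.1, 1.9])  # invented resolutions

quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)

# Predict the response at a candidate operating point (coded units).
candidate = np.array([[0.5, -0.2, 0.3]])
print(model.predict(quad.transform(candidate)))
```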
Allam, Ayman; Tawfik, Ahmed; Yoshimura, Chihiro; Fleifle, Amr
2016-06-01
The present study proposes a waste load allocation (WLA) framework for sustainable quality management of agricultural drainage water (ADW). Two multi-objective models, namely abatement-performance and abatement-equity-performance, were developed through the integration of a water quality model (QUAL2Kw) and a genetic algorithm, by considering (1) the total waste load abatement and (2) the inequity among waste dischargers. To accomplish the modeling tasks, we developed a comprehensive overall performance measure (Ewla) reflecting possible violations of the Egyptian standards for ADW reuse in irrigation. This methodology was applied to the Gharbia drain in the Nile Delta, Egypt, during both the summer and winter seasons of 2012. Abatement-performance modeling results for a target of Ewla = 100 % corresponded to discharger abatement ratios ranging from 20.7 to 75.6 % in summer and from 29.5 to 78.5 % in winter, alongside highly shifting inequity values. Abatement-equity-performance modeling results for a target of Ewla = 90 % revealed the necessity of increasing treatment efforts in three out of five dischargers during summer, and four out of five in winter. The trade-off curves obtained from the WLA models proved their reliability in selecting appropriate WLA procedures as a function of budget constraints, principles of social equity, and the desired overall performance level. Hence, the proposed framework of methodologies is of great importance to decision makers working toward a sustainable reuse of ADW in irrigation.
Bozorgmehr, Kayvan; Gabrysch, Sabine; Müller, Olaf; Neuhann, Florian; Jordan, Irmgard; Knipper, Michael; Razum, Oliver
2013-10-16
There is an unresolved debate about the potential effects of financial speculation on food prices and price volatility. Germany's largest financial institution and leading global investment bank recently decided to continue investing in agricultural commodities, stating that there is little empirical evidence to support the notion that the growth of agricultural-based financial products has caused price increases or volatility. The statement is supported by a recently published literature review, which concludes that financial speculation does not have an adverse effect on the functioning of the agricultural commodities market. As public health professionals concerned with global food insecurity, we have appraised the methodological quality of the review using a validated and reliable appraisal tool. The appraisal revealed major shortcomings in the methodological quality of the review. These were particularly related to a lack of transparency in the search strategy and in the selection and presentation of studies and findings; the neglect of the possibility of publication bias; and a lack of objective or rigorous criteria for assessing the scientific quality of included studies and for the formulation of conclusions. Based on the results of our appraisal, we conclude that it is not justified to reject the hypothesis that financial speculation might have adverse effects on food prices and price volatility. We hope to initiate reflection about scientific standards beyond the boundaries of disciplines and call for high quality, rigorous systematic reviews on the effects of financial speculation on food prices or price volatility.
Semi Automated Land Cover Layer Updating Process Utilizing Spectral Analysis and GIS Data Fusion
NASA Astrophysics Data System (ADS)
Cohen, L.; Keinan, E.; Yaniv, M.; Tal, Y.; Felus, A.; Regev, R.
2018-04-01
Technological improvements made in recent years in mass data gathering and analysis have influenced the traditional methods of updating and forming the national topographic database. They have brought a significant increase in the number of use cases and in the demand for detailed geo-information. Processes intended to replace traditional data collection methods have been developed in many National Mapping and Cadastre Agencies. There has been significant progress in semi-automated methodologies aiming to facilitate updating of a national topographic geodatabase. Implementing them is expected to allow a considerable reduction in updating costs and operation times. Our previous activity focused on automatic building extraction (Keinan, Zilberstein et al., 2015). Before semi-automatic updating methods, it was common for interpreter identification to be as detailed as possible so that the resulting database would be as reliable as possible. When using semi-automatic updating methodologies, the ability to insert knowledge based on human insight is limited. Therefore, our motivation was to reduce the resulting gap by allowing end users to add their data inputs to the basic geometric database. In this article, we present a simple land cover database updating method which combines insights extracted from the analyzed image with given spatial data from vector layers. The main stages of the advanced practice are multispectral image segmentation and supervised classification, together with geometric fusion of the given vector data, while maintaining the principle of minimal manual shape editing. All coding was done utilizing open source software components.
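As a rough illustration of the supervised-classification stage, the sketch below trains a per-pixel classifier on synthetic band values, with labels assumed to come from a rasterized vector layer; it is not the authors' open-source workflow.

```python
# Hedged sketch: supervised per-pixel land-cover classification in which an
# existing vector layer (rasterized elsewhere) supplies the training labels.
# Band values and labels are synthetic; this is not the authors' pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_pixels, n_bands = 500, 4
bands = rng.random((n_pixels, n_bands))        # multispectral reflectance values
labels = rng.integers(0, 3, size=n_pixels)     # classes taken from the rasterized vector layer

train = rng.random(n_pixels) < 0.7             # simple split: labelled vs. to-be-updated pixels
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(bands[train], labels[train])

updated = clf.predict(bands[~train])           # candidate land-cover updates
print(updated[:10])
```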
Validation of highly reliable, real-time knowledge-based systems
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1988-01-01
Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.
Design of high reliability organizations in health care
Carroll, J S; Rudolph, J W
2006-01-01
To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self‐understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self‐design for safety and reliability. PMID:17142607
[SciELO: method for electronic publishing].
Laerte Packer, A; Rocha Biojone, M; Antonio, I; Mayumi Takemaka, R; Pedroso García, A; Costa da Silva, A; Toshiyuki Murasaki, R; Mylek, C; Carvalho Reisl, O; Rocha F Delbucio, H C
2001-01-01
This article describes the SciELO (Scientific Electronic Library Online) Methodology for the electronic publishing of scientific periodicals, examining issues such as the transition from traditional printed publication to electronic publishing, the scientific communication process, the principles on which the methodology was developed, its application in building the SciELO site, its modules and components, the tools used for its construction, etc. The article also discusses the potential and trends for the area in Brazil and Latin America, pointing out questions and proposals which should be investigated and solved by the methodology. It concludes that the SciELO Methodology is an efficient, flexible, and broad solution for scientific electronic publishing.
DOT National Transportation Integrated Search
2007-03-01
This course provides INDOT staff with foundational knowledge and skills in project management principles and methodologies. INDOT's project management processes provide the tools for interdisciplinary teams to efficiently and effectively deliver pr...
NASA Astrophysics Data System (ADS)
Nakatsuji, Hiroshi
Chemistry is a science of the complex subjects that occupy this universe and the biological world and that are composed of atoms and molecules. Its essence is diversity. Surprisingly, however, the whole of this science is governed by simple quantum principles like the Schrödinger and Dirac equations. Therefore, if we can find a generally useful method of solving these quantum equations under fermionic and/or bosonic constraints, accurately and at reasonable speed, we can replace the somewhat empirical methodologies of this science with purely quantum theoretical and computational logic. This is the purpose of our series of studies, called "exact theory" in our laboratory. Some of our documents are cited below. The key idea was expressed as the free complement (FC) theory (originally called ICI theory), which was introduced to solve the Schrödinger and Dirac equations analytically. For extending this methodology to larger systems, order-N methodologies are essential, but the antisymmetry constraints for electronic wave functions become a major obstacle. Recently, we have shown that the antisymmetry rule, or "dogma", can be relaxed considerably when our subjects are large molecular systems. In this talk, I present our recent progress in the FC methodology. The purpose is to construct "predictive quantum chemistry" that is useful in chemical and physical research and development in institutes and industries.
Improved FTA Methodology and Application to Subsea Pipeline Reliability Design
Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan
2014-01-01
An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form. PMID:24667681
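Although the FET construction itself is not reproduced here, the quantitative step of any such tree reduces to combining basic-event probabilities through AND/OR gates. The sketch below evaluates an invented two-branch tree under an independence assumption; it is not the pipeline FET from the paper.

```python
# Hedged sketch: quantitative evaluation of a small fault tree with independent
# basic events. The tree and probabilities below are invented for illustration.
from functools import reduce

def p_and(*probs):   # AND gate: all inputs must fail
    return reduce(lambda a, b: a * b, probs, 1.0)

def p_or(*probs):    # OR gate: at least one input fails (independence assumed)
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

corrosion, coating_loss = 1e-3, 5e-2
third_party, detection_miss = 2e-3, 1e-1

# Top event: leak due to (corrosion AND coating loss) OR (third-party damage AND missed detection)
p_top = p_or(p_and(corrosion, coating_loss), p_and(third_party, detection_miss))
print(f"top-event probability ~ {p_top:.2e}")
```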
Reliability model derivation of a fault-tolerant, dual, spare-switching, digital computer system
NASA Technical Reports Server (NTRS)
1974-01-01
A computer based reliability projection aid, tailored specifically for application in the design of fault-tolerant computer systems, is described. Its more pronounced characteristics include the facility for modeling systems with two distinct operational modes, measuring the effect of both permanent and transient faults, and calculating conditional system coverage factors. The underlying conceptual principles, mathematical models, and computer program implementation are presented.
ERIC Educational Resources Information Center
Wesolowski, Brian C.; Amend, Ross M.; Barnstead, Thomas S.; Edwards, Andrew S.; Everhart, Matthew; Goins, Quentin R.; Grogan, Robert J., III; Herceg, Amanda M.; Jenkins, S. Ira; Johns, Paul M.; McCarver, Christopher J.; Schaps, Robin E.; Sorrell, Gary W.; Williams, Jonathan D.
2017-01-01
The purpose of this study was to describe the development of a valid and reliable rubric to assess secondary-level solo instrumental music performance based on principles of invariant measurement. The research questions that guided this study included (1) What is the psychometric quality (i.e., validity, reliability, and precision) of a scale…
ERIC Educational Resources Information Center
Doshybekov, Aidyn Bagdatovich; Abildabekov, Sabit Akimbaevich; Kasymbaev, Medet Imanbekovich; Berekbusynova, Gulzhan Maulsharifkyzy; Niyazakynov, Erdos Bagdatovich
2016-01-01
The aim of this study is to examine the state of marketing in the sphere of physical culture and sport and develop methodological foundations of sports and health services marketing on its basis. In the study we adhere to the following philosophical and pedagogical strategies--methodological principles: axiological, humanistic and synergistic…
Software Requirements Specification for an Ammunition Management System
1986-09-01
This thesis takes the form of a software requirements specification. Such a specification, according to Pressman [Ref. 7], establishes a complete... The generalized software life cycle, as defined by Pressman, is depicted in Figure 1.1. The common thread which binds the various phases together... The application of software engineering principles requires an established methodology. This methodology, according to Pressman [Ref. 8: p. 151], is an...
Applications of Mass Spectrometry for Cellular Lipid Analysis
Wang, Chunyan; Wang, Miao; Han, Xianlin
2015-01-01
Mass spectrometric analysis of cellular lipids is an enabling technology for lipidomics, which is a rapidly-developing research field. In this review, we briefly discuss the principles, advantages, and possible limitations of electrospray ionization (ESI) and matrix assisted laser desorption/ionization (MALDI) mass spectrometry-based methodologies for the analysis of lipid species. The applications of these methodologies to lipidomic research are also summarized. PMID:25598407
ERIC Educational Resources Information Center
Hallgren, Kenneth Glenn
A study investigated the relationship of students' cognitive level of development and teaching methodology with student achievement. The sample was composed of 79 students in two sections of the introductory marketing course at the University of Northern Colorado. The control group was taught by a lecture strategy, and the experimental group by a…
Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?
ERIC Educational Resources Information Center
Pool, Jessica; Laubscher, Dorothy
2016-01-01
This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…
NASA Technical Reports Server (NTRS)
Motyka, P.
1983-01-01
A methodology for quantitatively analyzing the reliability of redundant avionics systems, in general, and the dual, separated Redundant Strapdown Inertial Measurement Unit (RSDIMU), in particular, is presented. The RSDIMU is described and a candidate failure detection and isolation system presented. A Markov reliability model is employed. The operational states of the system are defined and the single-step state transition diagrams discussed. Graphical results, showing the impact of major system parameters on the reliability of the RSDIMU system, are presented and discussed.
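The RSDIMU model itself is not reproduced in the abstract, but the general mechanics of a Markov reliability model can be sketched on a toy dual-channel system with imperfect failure coverage, whose state probabilities are propagated step by step to yield mission reliability. All numbers below are hypothetical.

```python
# Hedged sketch: a toy discrete-time Markov reliability model (not the RSDIMU model
# from the report). States: 0 = both channels good, 1 = one failed (spare switched in),
# 2 = system failed (absorbing). lam is the per-step failure probability of a channel
# and c is the coverage (probability a failure is detected and the spare switched in).
import numpy as np

lam, c, steps = 1e-4, 0.98, 10_000

P = np.array([
    [1 - 2 * lam,  2 * lam * c,  2 * lam * (1 - c)],  # from state 0
    [0.0,          1 - lam,      lam              ],  # from state 1
    [0.0,          0.0,          1.0              ],  # absorbing failure state
])

state = np.array([1.0, 0.0, 0.0])        # start with both channels healthy
for _ in range(steps):
    state = state @ P

reliability = state[0] + state[1]        # probability the system is still operational
print(f"mission reliability ~ {reliability:.4f}")
```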
Advanced reliability modeling of fault-tolerant computer-based systems
NASA Technical Reports Server (NTRS)
Bavuso, S. J.
1982-01-01
Two methodologies for the reliability assessment of fault tolerant digital computer based systems are discussed. The computer-aided reliability estimation 3 (CARE 3) and gate logic software simulation (GLOSS) are assessment technologies that were developed to mitigate a serious weakness in the design and evaluation process of ultrareliable digital systems. The weak link is based on the unavailability of a sufficiently powerful modeling technique for comparing the stochastic attributes of one system against others. Some of the more interesting attributes are reliability, system survival, safety, and mission success.
Development of a probabilistic analysis methodology for structural reliability estimation
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.
1991-01-01
The novel probabilistic analysis method presented for the assessment of structural reliability combines fast convolution with an efficient structural reliability analysis. After identifying the most important point of a limit state, it establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.
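As a minimal illustration of the convolution idea (not the paper's fast-convolution algorithm), the sketch below computes the failure probability of a linearized performance function g = R - S by numerically integrating the load density against the resistance CDF, and checks the result against the closed-form normal-case answer. All distribution parameters are invented.

```python
# Hedged sketch: for a linear(ized) performance function g = R - S with independent
# resistance R and load S, the failure probability is
#   P(g < 0) = P(R < S) = integral of F_R(s) * f_S(s) ds.
import numpy as np
from scipy import stats

s = np.linspace(0.0, 600.0, 12001)
ds = s[1] - s[0]
f_S = stats.norm.pdf(s, loc=200.0, scale=40.0)   # load density (invented)
F_R = stats.norm.cdf(s, loc=300.0, scale=30.0)   # resistance CDF (invented)

p_fail = np.sum(F_R * f_S) * ds                  # numerical integration (Riemann sum)

beta = (300.0 - 200.0) / np.hypot(30.0, 40.0)    # closed-form check for the all-normal case
print(p_fail, stats.norm.cdf(-beta))
```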
An Analytical Methodology for Predicting Repair Time Distributions of Advanced Technology Aircraft.
1985-12-01
1984. 3. Barlow, Richard E. "Mathematical Theory of Reliability: A Historical Perspective." IEEE Transactions on Reliability, 33, 16-19 (April 1984)... Technology (AU), Wright-Patterson AFB OH, March 1971. 11. Coppola, Anthony. "Reliability Engineering of Electronic Equipment," IEEE Transactions on... 1982. 64. Woodruff, Brian W. et al. "Modified Goodness-of-Fit Tests for Gamma Distributions with Unknown Location and Scale Parameters," IEEE
Do we need methodological theory to do qualitative research?
Avis, Mark
2003-09-01
Positivism is frequently used to stand for the epistemological assumption that empirical science based on principles of verificationism, objectivity, and reproducibility is the foundation of all genuine knowledge. Qualitative researchers sometimes feel obliged to provide methodological alternatives to positivism that recognize their different ethical, ontological, and epistemological commitments and have provided three theories: phenomenology, grounded theory, and ethnography. The author argues that positivism was a doomed attempt to define empirical foundations for knowledge through a rigorous separation of theory and evidence; offers a pragmatic, coherent view of knowledge; and suggests that rigorous, rational empirical investigation does not need methodological theory. Therefore, qualitative methodological theory is unnecessary and counterproductive because it hinders critical reflection on the relation between methodological theory and empirical evidence.
Physical and reliability issues in MEMS microrelays with gold contacts
NASA Astrophysics Data System (ADS)
Lafontan, Xavier; Pressecq, Francis; Perez, Guy; Dufaza, Christian; Karam, Jean Michel
2001-10-01
This paper presents the work we have done on micro-relays with gold micro-contacts in MUMPs. First, the theoretical physical principles of MEMS micro-relays are described. This study is divided into two parts: the micro-contact and the micro-actuator. The micro-contact part deals with constriction resistance, contact area, adhesion, arcing, and wear, whereas the micro-actuator part describes general principles, contact force, restoring force, and actuator reliability. In the second part, an innovative electrostatic relay design in MUMPs is presented; the concept, the implementation, and the final realization are discussed. In the third part, characterization results are reported, focusing in particular on the micro-contact study. Conduction mode, contact area, mechanical and thermal deformation, and adhesion energies are presented.
Chen, Bi-Cang; Wu, Qiu-Ying; Xiang, Cheng-Bin; Zhou, Yi; Guo, Ling-Xiang; Zhao, Neng-Jiang; Yang, Shu-Yu
2006-01-01
To evaluate the quality of reports published in the last 10 years in China on the quantitative analysis of syndrome differentiation for diabetes mellitus (DM), in order to explore the methodological problems in these reports and find possible solutions. The main medical literature databases in China were searched. Thirty-one articles were included and evaluated by the principles of clinical epidemiology. There were many mistakes and deficiencies in these articles, such as in the clinical trial designs, diagnostic criteria for DM, standards of syndrome differentiation of DM, case inclusion and exclusion criteria, sample size estimation, data comparability, and statistical methods. It is necessary and important to improve the quality of reports concerning quantitative analysis of syndrome differentiation of DM in light of the principles of clinical epidemiology.
Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D
2015-06-01
The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.
A Methodological Critique of the ProPublica Surgeon Scorecard
Friedberg, Mark W.; Pronovost, Peter J.; Shahian, David M.; Safran, Dana Gelb; Bilimoria, Karl Y.; Elliott, Marc N.; Damberg, Cheryl L.; Dimick, Justin B.; Zaslavsky, Alan M.
2016-01-01
Abstract On July 14, 2015, ProPublica published its Surgeon Scorecard, which displays “Adjusted Complication Rates” for individual, named surgeons for eight surgical procedures performed in hospitals. Public reports of provider performance have the potential to improve the quality of health care that patients receive. A valid performance report can drive quality improvement and usefully inform patients' choices of providers. However, performance reports with poor validity and reliability are potentially damaging to all involved. This article critiques the methods underlying the Scorecard and identifies opportunities for improvement. Until these opportunities are addressed, the authors advise users of the Scorecard—most notably, patients who might be choosing their surgeons—not to consider the Scorecard a valid or reliable predictor of the health outcomes any individual surgeon is likely to provide. The authors hope that this methodological critique will contribute to the development of more-valid and more-reliable performance reports in the future. PMID:28083411
Estimating the Reliability of Electronic Parts in High Radiation Fields
NASA Technical Reports Server (NTRS)
Everline, Chester; Clark, Karla; Man, Guy; Rasmussen, Robert; Johnston, Allan; Kohlhase, Charles; Paulos, Todd
2008-01-01
Radiation effects on materials and electronic parts constrain the lifetime of flight systems visiting Europa. Understanding mission lifetime limits is critical to the design and planning of such a mission. Therefore, the operational aspects of radiation dose are a mission success issue. To predict and manage mission lifetime in a high radiation environment, system engineers need capable tools to trade radiation design choices against system design and reliability, and science achievements. Conventional tools and approaches provided past missions with conservative designs without the ability to predict their lifetime beyond the baseline mission. This paper describes a more systematic approach to understanding spacecraft design margin, allowing better prediction of spacecraft lifetime. This is possible because of newly available electronic parts radiation effects statistics and an enhanced spacecraft system reliability methodology. This new approach can be used in conjunction with traditional approaches for mission design. This paper describes the fundamentals of the new methodology.
NASA Astrophysics Data System (ADS)
Lee, Chang-Chun; Shih, Yan-Shin; Wu, Chih-Sheng; Tsai, Chia-Hao; Yeh, Shu-Tang; Peng, Yi-Hao; Chen, Kuang-Jung
2012-07-01
This work analyses the overall stress/strain characteristic of flexible encapsulations with organic light-emitting diode (OLED) devices. A robust methodology composed of a mechanical model of multi-thin film under bending loads and related stress simulations based on nonlinear finite element analysis (FEA) is proposed, and validated to be more reliable compared with related experimental data. With various geometrical combinations of cover plate, stacked thin films and plastic substrate, the position of the neutral axis (NA) plate, which is regarded as a key design parameter to minimize stress impact for the concerned OLED devices, is acquired using the present methodology. The results point out that both the thickness and mechanical properties of the cover plate help in determining the NA location. In addition, several concave and convex radii are applied to examine the reliable mechanical tolerance and to provide an insight into the estimated reliability of foldable OLED encapsulations.
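A common first-order estimate of the neutral-axis location in such a laminated stack is the modulus-weighted centroid of the layers. The sketch below uses that textbook relation with hypothetical layer properties; it is not the authors' nonlinear FEA model.

```python
# Hedged sketch (not the authors' FEA): first-order estimate of the neutral-axis
# position of a laminated stack under pure bending, using the modulus-weighted
# (transformed-section) centroid:  z_NA = sum(E_i * t_i * z_i) / sum(E_i * t_i),
# where z_i is the mid-plane height of layer i. Layer values are hypothetical.
layers = [  # (name, Young's modulus [GPa], thickness [um]) from substrate upward
    ("plastic substrate", 4.0, 100.0),
    ("OLED thin-film stack", 80.0, 1.0),
    ("cover plate", 70.0, 50.0),
]

z_bottom = 0.0
weighted_moment = 0.0
stiffness = 0.0
for name, E, t in layers:
    z_mid = z_bottom + t / 2.0        # mid-plane height of this layer
    weighted_moment += E * t * z_mid
    stiffness += E * t
    z_bottom += t

z_na = weighted_moment / stiffness
print(f"neutral axis at {z_na:.1f} um above the substrate bottom")
```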
A Step Made Toward Designing Microelectromechanical System (MEMS) Structures With High Reliability
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.
2003-01-01
The mechanical design of microelectromechanical systems, particularly for micropower generation applications, requires the ability to predict the strength capacity of load-carrying components over the service life of the device. These microdevices, which typically are made of brittle materials such as polysilicon, show wide scatter (stochastic behavior) in strength as well as a different average strength for different sized structures (size effect). These behaviors necessitate either costly and time-consuming trial-and-error designs or, more efficiently, the development of a probabilistic design methodology for MEMS. Over the years, the NASA Glenn Research Center's Life Prediction Branch has developed the CARES/Life probabilistic design methodology to predict the reliability of advanced ceramic components. In this study, done in collaboration with Johns Hopkins University, the ability of the CARES/Life code to predict the reliability of polysilicon microsized structures with stress concentrations is successfully demonstrated.
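CARES/Life itself is not reproduced here, but the underlying two-parameter Weibull relation conveys both the strength scatter and the size effect mentioned above. The sketch below uses invented Weibull parameters for a uniformly stressed volume.

```python
# Hedged sketch (not CARES/Life): two-parameter Weibull failure probability for a
# brittle part under uniform stress, illustrating strength scatter and size effect.
# Parameter values are invented for illustration.
import math

def weibull_pof(stress_mpa, volume_mm3, m=10.0, sigma0=400.0, v0=1.0):
    """P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m) for a uniformly stressed volume."""
    return 1.0 - math.exp(-(volume_mm3 / v0) * (stress_mpa / sigma0) ** m)

for volume in (0.001, 0.01, 0.1):    # smaller (MEMS-scale) volumes fail less often
    print(volume, round(weibull_pof(300.0, volume), 6))
```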
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Xiangqi; Zhang, Yingchen
This paper presents an optimal voltage control methodology with coordination among different voltage-regulating resources, including controllable loads, distributed energy resources such as energy storage and photovoltaics (PV), and utility voltage-regulating devices such as voltage regulators and capacitors. The proposed methodology could effectively tackle the overvoltage and voltage regulation device distortion problems brought by high penetrations of PV to improve grid operation reliability. A voltage-load sensitivity matrix and voltage-regulator sensitivity matrix are used to deploy the resources along the feeder to achieve the control objectives. Mixed-integer nonlinear programming is used to solve the formulated optimization control problem. The methodology has been tested on the IEEE 123-feeder test system, and the results demonstrate that the proposed approach could actively tackle the voltage problem brought about by high penetrations of PV and improve the reliability of distribution system operation.
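The full mixed-integer nonlinear program is not reproduced here, but the role of the voltage-load sensitivity matrix can be sketched on a toy three-node example: a linearized model dV ≈ S·dP is inverted in a least-squares sense to find real-power adjustments that pull overvoltages back toward nominal. All numbers are invented.

```python
# Hedged sketch: a linearized voltage correction using a voltage-load sensitivity
# matrix S (dV ~ S @ dP). The paper solves an MINLP on the IEEE 123-bus feeder;
# this toy 3-node example only shows how sensitivities map control actions to voltages.
import numpy as np

S = np.array([[0.010, 0.004, 0.002],    # per-unit volt change per kW change at each node
              [0.004, 0.012, 0.005],
              [0.002, 0.005, 0.015]])

v_now = np.array([1.06, 1.05, 1.07])    # overvoltage from high PV output
v_ref = np.array([1.00, 1.00, 1.00])

# kW adjustments (negative values suggest curtailment or charging storage)
dP, *_ = np.linalg.lstsq(S, v_ref - v_now, rcond=None)
print(np.round(dP, 1), np.round(v_now + S @ dP, 3))
```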
Reliability study on high power 638-nm triple emitter broad area laser diode
NASA Astrophysics Data System (ADS)
Yagi, T.; Kuramoto, K.; Kadoiwa, K.; Wakamatsu, R.; Miyashita, M.
2016-03-01
Reliabilities of the 638-nm triple-emitter broad area laser diode (BA-LD) with the window-mirror structure were studied. A methodology to estimate the mean time to failure (MTTF) due to catastrophic optical mirror degradation (COMD) within a reasonable aging duration was newly proposed. The power at which the LD failed due to COMD (PCOMD) was measured for LDs aged under several aging conditions. It was revealed that PCOMD is proportional to the logarithm of aging duration, and the MTTF due to COMD (MTTF(COMD)) could be estimated by using this relation. The MTTF(COMD) estimated by this methodology with an aging duration of approximately 2,000 hours was consistent with that estimated by long-term aging. Using this methodology, the MTTF of the BA-LD was estimated to exceed 100,000 hours at an output of 2.5 W and a duty cycle of 30%.
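The extrapolation idea, fitting the COMD failure power against the logarithm of aging time and solving for the time at which it degrades to the operating power, can be sketched as follows with invented aging data.

```python
# Hedged sketch of the log-time extrapolation described above: fit
# P_COMD = a + b*ln(t) to short-term aging data (values invented), then estimate
# the time at which P_COMD degrades down to the operating power (here 2.5 W).
import numpy as np

t_hours = np.array([100.0, 300.0, 1000.0, 2000.0])
p_comd = np.array([6.2, 5.6, 5.0, 4.6])            # hypothetical measured COMD powers [W]

b, a = np.polyfit(np.log(t_hours), p_comd, 1)      # slope b, intercept a
p_operating = 2.5
mttf_comd = np.exp((p_operating - a) / b)          # time at which fitted P_COMD reaches 2.5 W
print(f"estimated MTTF(COMD) ~ {mttf_comd:.0f} hours")
```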
Ahern, Tracey; Gardner, Anne; Gardner, Glenn; Middleton, Sandy; Della, Phillip
2013-05-01
The final phase of a three phase study analysing the implementation and impact of the nurse practitioner role in Australia (the Australian Nurse Practitioner Project or AUSPRAC) was undertaken in 2009, requiring nurse telephone interviewers to gather information about health outcomes directly from patients and their treating nurse practitioners. A team of several registered nurses was recruited and trained as telephone interviewers. The aim of this paper is to report on development and evaluation of the training process for telephone interviewers. The training process involved planning the content and methods to be used in the training session; delivering the session; testing skills and understanding of interviewers post-training; collecting and analysing data to determine the degree to which the training process was successful in meeting objectives and post-training follow-up. All aspects of the training process were informed by established educational principles. Interrater reliability between interviewers was high for well-validated sections of the survey instrument resulting in 100% agreement between interviewers. Other sections with unvalidated questions showed lower agreement (between 75% and 90%). Overall the agreement between interviewers was 92%. Each interviewer was also measured against a specifically developed master script or gold standard and for this each interviewer achieved a percentage of correct answers of 94.7% or better. This equated to a Kappa value of 0.92 or better. The telephone interviewer training process was very effective and achieved high interrater reliability. We argue that the high reliability was due to the use of well validated instruments and the carefully planned programme based on established educational principles. There is limited published literature on how to successfully operationalise educational principles and tailor them for specific research studies; this report addresses this knowledge gap. Copyright © 2012 Elsevier Ltd. All rights reserved.
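The agreement statistics reported above can be illustrated with a minimal Cohen's kappa calculation for one interviewer scored against a gold-standard master script; the answer codes below are invented.

```python
# Hedged sketch: Cohen's kappa for agreement between an interviewer's coded answers
# and a gold-standard ("master script") coding. The answer codes below are invented.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_observed - p_expected) / (1.0 - p_expected)

gold        = ["yes", "yes", "no", "unsure", "yes", "no", "yes", "no", "yes", "yes"]
interviewer = ["yes", "yes", "no", "yes",    "yes", "no", "yes", "no", "yes", "unsure"]
print(round(cohens_kappa(gold, interviewer), 2))
```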
Elastohydrodynamic principles applied to the design of helicopter components.
NASA Technical Reports Server (NTRS)
Townsend, D. P.
1973-01-01
Elastohydrodynamic principles affecting the lubrication of transmission components are presented and discussed. Surface temperatures of the transmission bearings and gears affect elastohydrodynamic film thickness. Traction forces and sliding as well as the inlet temperature determine surface temperatures. High contact ratio gears cause increased sliding and may run at higher surface temperatures. Component life is a function of the ratio of elastohydrodynamic film thickness to composite surface roughness. Lubricant starvation reduces elastohydrodynamic film thickness and increases surface temperatures. Methods are presented which allow for the application of elastohydrodynamic principles to transmission design in order to increase system life and reliability.
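The life-governing ratio mentioned above, film thickness over composite surface roughness, is often written as the lambda (film) parameter. A minimal sketch with hypothetical values:

```python
# Hedged sketch: the film-parameter (lambda) ratio, i.e. the EHD film thickness
# divided by the composite roughness of the two contacting surfaces.
# Numerical values are hypothetical.
import math

h_min_um = 0.45                       # minimum EHD film thickness [micrometres]
rms_gear_um, rms_pinion_um = 0.25, 0.20

sigma_composite = math.hypot(rms_gear_um, rms_pinion_um)   # sqrt(Rq1^2 + Rq2^2)
lam = h_min_um / sigma_composite
print(f"lambda ~ {lam:.2f}  (values above about 3 generally indicate full-film lubrication)")
```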
MEMS reliability: coming of age
NASA Astrophysics Data System (ADS)
Douglass, Michael R.
2008-02-01
In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.
USDA-ARS?s Scientific Manuscript database
Background: The utility of glycemic index (GI) values for chronic disease risk management remains controversial. While absolute GI value determinations for individual foods have been shown to vary significantly in individuals with diabetes, there is a dearth of data on the reliability of GI value de...
Evaluation of Scale Reliability with Binary Measures Using Latent Variable Modeling
ERIC Educational Resources Information Center
Raykov, Tenko; Dimitrov, Dimiter M.; Asparouhov, Tihomir
2010-01-01
A method for interval estimation of scale reliability with discrete data is outlined. The approach is applicable with multi-item instruments consisting of binary measures, and is developed within the latent variable modeling methodology. The procedure is useful for evaluation of consistency of single measures and of sum scores from item sets…
Evaluation of Weighted Scale Reliability and Criterion Validity: A Latent Variable Modeling Approach
ERIC Educational Resources Information Center
Raykov, Tenko
2007-01-01
A method is outlined for evaluating the reliability and criterion validity of weighted scales based on sets of unidimensional measures. The approach is developed within the framework of latent variable modeling methodology and is useful for point and interval estimation of these measurement quality coefficients in counseling and education…
Design of an integrated airframe/propulsion control system architecture
NASA Technical Reports Server (NTRS)
Cohen, Gerald C.; Lee, C. William; Strickland, Michael J.; Torkelson, Thomas C.
1990-01-01
The design of an integrated airframe/propulsion control system architecture is described. The design is based on a prevalidation methodology that uses both reliability and performance. A detailed account is given for the testing associated with a subset of the architecture and concludes with general observations of applying the methodology to the architecture.
NASA Astrophysics Data System (ADS)
Wang, Lei; Xiong, Chuang; Wang, Xiaojun; Li, Yunlong; Xu, Menghui
2018-04-01
Considering that multi-source uncertainties, from the inherent nature of the system as well as the external environment, are unavoidable and severely affect controller performance, dynamic safety assessment with high confidence is of great significance for scientists and engineers. In view of this, uncertainty quantification analysis and time-variant reliability estimation for closed-loop control problems are conducted in this study under a mixture of random, interval, and convex uncertainties. By combining the state-space transformation and the natural set expansion, the boundary laws of controlled response histories are first confirmed with a specific implementation of the random items. For nonlinear cases, the collocation set methodology and the fourth-order Runge-Kutta algorithm are introduced as well. Drawing on the first-passage model in random process theory as well as on static probabilistic reliability ideas, a new definition of the hybrid time-variant reliability measurement is provided for vibration control systems, and the related solution details are further expounded. Two engineering examples are finally presented to demonstrate the validity and applicability of the developed methodology.
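The hybrid random/interval/convex measure itself is not reproduced here, but the first-passage idea behind time-variant reliability can be sketched with a plain Monte Carlo estimate on a toy closed-loop step response whose parameters are random; all values are invented.

```python
# Hedged sketch: Monte Carlo first-passage estimate for a toy controlled response,
# i.e. the probability that the response never exceeds a safety threshold over the
# whole time window. The paper's hybrid uncertainty treatment is not reproduced.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 501)
threshold = 1.4
n_samples = 5_000

survived = 0
for _ in range(n_samples):
    zeta = rng.normal(0.3, 0.05)                 # random damping ratio
    wn = rng.uniform(3.8, 4.2)                   # natural frequency treated as uncertain
    wd = wn * np.sqrt(1.0 - zeta ** 2)
    response = 1.0 - np.exp(-zeta * wn * t) * np.cos(wd * t)   # toy closed-loop step response
    survived += np.all(response < threshold)     # no exceedance anywhere in the window

print(f"time-variant reliability ~ {survived / n_samples:.3f}")
```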
NASA Astrophysics Data System (ADS)
Morse, Llewellyn; Sharif Khodaei, Zahra; Aliabadi, M. H.
2018-01-01
In this work, a reliability based impact detection strategy for a sensorized composite structure is proposed. Impacts are localized using Artificial Neural Networks (ANNs) with recorded guided waves due to impacts used as inputs. To account for variability in the recorded data under operational conditions, Bayesian updating and Kalman filter techniques are applied to improve the reliability of the detection algorithm. The possibility of having one or more faulty sensors is considered, and a decision fusion algorithm based on sub-networks of sensors is proposed to improve the application of the methodology to real structures. A strategy for reliably categorizing impacts into high energy impacts, which are probable to cause damage in the structure (true impacts), and low energy non-damaging impacts (false impacts), has also been proposed to reduce the false alarm rate. The proposed strategy involves employing classification ANNs with different features extracted from captured signals used as inputs. The proposed methodologies are validated by experimental results on a quasi-isotropic composite coupon impacted with a range of impact energies.
Afsar-Manesh, Nasim; Lonowski, Sarah; Namavar, Aram A
2017-12-01
UCLA Health embarked on an effort to transform care by integrating lean methodology into a key clinical project, the Readmission Reduction Initiative (RRI). The first step focused on assembling a leadership team to articulate system-wide priorities for quality improvement. The lean principle of creating a culture of change and accountability was established by: 1) engaging stakeholders, 2) managing the process with performance accountability, and 3) delivering patient-centered care. The RRI utilized three major lean principles: 1) A3, 2) root cause analyses, and 3) value stream mapping. The baseline readmission rate at UCLA from 9/2010 to 12/2011 was a mean of 12.1%. After the start of the RRI program, for the period 1/2012 to 6/2013, the readmission rate decreased to 11.3% (p<0.05). To impact readmissions, solutions must evolve from smaller service- and location-based interventions into strategies with a broader approach. As elucidated, a systematic clinical approach grounded in lean methodologies is a viable solution to this complex problem. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Dontsov, Dmitry; Yushkova, Natalia
2017-01-01
The paper aims to detect conceptual conflicts within architectural and urban construction activity (AUCA), to define their causes, and to substantiate ways to decrease the adverse effects they produce. Cause-and-effect analysis is used, as well as evolutionary and comparative analyses; these allow the laws governing the formation of an activity model in the modern environment to be defined and its elements to be ranked. The relevance of the paper lies in defining the scientific and theoretical grounds for the necessity of improving the methodology of AUCA through its adaptation to the imperatives of state management. System analysis made it possible to prove the practicability of considering institutional-environment factors in reorganizing the model of AUCA, which provides the fullest implementation of sustainable development principles. It is shown that territorial planning is not only the leading type of AUCA but also an integrator for the functioning structures of state management within the planning of social and economic development. The main result of the paper consists in identifying promising ways for the evolution of modern methodology, owing to its growing interdisciplinary character, leading to a qualitative renewal of territorial management principles.
Reliability of physical functioning tests in patients with low back pain: a systematic review.
Denteneer, Lenie; Van Daele, Ulrike; Truijen, Steven; De Hertogh, Willem; Meirte, Jill; Stassijns, Gaetane
2018-01-01
The aim of this study was to provide a comprehensive overview of physical functioning tests in patients with low back pain (LBP) and to investigate their reliability. A systematic computerized search was finalized in four different databases on June 24, 2017: PubMed, Web of Science, Embase, and MEDLINE. Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) guidelines were followed during all stages of this review. Clinical studies that investigate the reliability of physical functioning tests in patients with LBP were eligible. The methodological quality of the included studies was assessed with the use of the Consensus-based Standards for the selection of health Measurement Instruments (COSMIN) checklist. To come to final conclusions on the reliability of the identified clinical tests, the current review assessed three factors, namely, outcome assessment, methodological quality, and consistency of description. A total of 20 studies were found eligible and 38 clinical tests were identified. Good overall test-retest reliability was concluded for the extensor endurance test (intraclass correlation coefficient [ICC]=0.93-0.97), the flexor endurance test (ICC=0.90-0.97), the 5-minute walking test (ICC=0.89-0.99), the 50-ft walking test (ICC=0.76-0.96), the shuttle walk test (ICC=0.92-0.99), the sit-to-stand test (ICC=0.91-0.99), and the loaded forward reach test (ICC=0.74-0.98). For inter-rater reliability, only one test, namely, the Biering-Sörensen test (ICC=0.88-0.99), could be concluded to have an overall good inter-rater reliability. None of the identified clinical tests could be concluded to have a good intrarater reliability. Further investigation should focus on a better overall study methodology and the use of identical protocols for the description of clinical tests. The assessment of reliability is only a first step in the recommendation process for the use of clinical tests. In future research, the identified clinical tests in the current review should be further investigated for validity. Only when these clinimetric properties of a clinical test have been thoroughly investigated can a final conclusion regarding the clinical and scientific use of the identified tests be made. Copyright © 2017 Elsevier Inc. All rights reserved.
The Baby Care Questionnaire: A measure of parenting principles and practices during infancy
Winstanley, Alice; Gattis, Merideth
2013-01-01
The current report provides a new framework to explore the role of parenting practices and principles during infancy. We identify structure and attunement as key parenting principles during infancy. Structure represents reliance on regularity and routines in daily life. Attunement represents reliance on infant cues and close physical contact. We suggest parents’ relative endorsement of these parenting principles is related to their choices about practices such as feeding, holding and night-time sleeping. We designed the Baby Care Questionnaire to measure parents’ endorsement of structure and attunement, as well as their daily parenting practices. We report data demonstrating the factor structure, reliability and validity of the BCQ. The BCQ, to our knowledge, is the first comprehensive measure of parenting practices and principles during infancy. We conclude with a discussion of future directions for the measure. PMID:24050932
A Taxonomy of Ethical Ideologies.
ERIC Educational Resources Information Center
Forsyth, Donelson R.
1980-01-01
Assesses the reliability and validity of the Ethics Position Questionnaire: an instrument with two scales, one measuring idealism and another measuring the rejection of universal moral principles in favor of relativism. (Author/SS)
Advanced project management : training manual.
DOT National Transportation Integrated Search
2006-07-14
This course identifies the principles and methodologies adopted by the Indiana Department of Transportation (INDOT) to support successful project management and delivery. Project management requires the application of knowledge, skills, tools, and te...
NASA Astrophysics Data System (ADS)
Aditya, B. R.; Permadi, A.
2018-03-01
This paper describes the implementation of the Unified Theory of Acceptance and Use of Technology (UTAUT) model to assess the use of a virtual classroom in support of teaching and learning in higher education. The purpose of this research is to determine how a virtual classroom that fulfills the basic principles can be accepted and used positively by students. The research methodology uses a quantitative and descriptive approach, with a questionnaire as the tool for measuring the degree of acceptance of the virtual classroom principles. The research uses a sample of 105 students in D3 Informatics Management at Telkom University. The result of this research is that the students' use of the virtual classroom principles is positive and relevant in higher education.
Hardware and software reliability estimation using simulations
NASA Technical Reports Server (NTRS)
Swern, Frederic L.
1994-01-01
The simulation technique is used to explore the validation of both hardware and software. It was concluded that simulation is a viable means for validating both hardware and software and associating a reliability number with each. This is useful in determining the overall probability of system failure of an embedded processor unit, and improving both the code and the hardware where necessary to meet reliability requirements. The methodologies were proved using some simple programs, and simple hardware models.
Lange, Toni; Matthijs, Omer; Jain, Nitin B; Schmitt, Jochen; Lützner, Jörg; Kopkow, Christian
2017-03-01
Shoulder pain in the general population is common and to identify the aetiology of shoulder pain, history, motion and muscle testing, and physical examination tests are usually performed. The aim of this systematic review was to summarise and evaluate intrarater and inter-rater reliability of physical examination tests in the diagnosis of shoulder pathologies. A comprehensive systematic literature search was conducted using MEDLINE, EMBASE, Allied and Complementary Medicine Database (AMED) and Physiotherapy Evidence Database (PEDro) through 20 March 2015. Methodological quality was assessed using the Quality Appraisal of Reliability Studies (QAREL) tool by 2 independent reviewers. The search strategy revealed 3259 articles, of which 18 finally met the inclusion criteria. These studies evaluated the reliability of 62 test and test variations used for the specific physical examination tests for the diagnosis of shoulder pathologies. Methodological quality ranged from 2 to 7 positive criteria of the 11 items of the QAREL tool. This review identified a lack of high-quality studies evaluating inter-rater as well as intrarater reliability of specific physical examination tests for the diagnosis of shoulder pathologies. In addition, reliability measures differed between included studies hindering proper cross-study comparisons. PROSPERO CRD42014009018. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Taghipour, Morteza; Mohseni-Bandpei, Mohammad Ali; Behtash, Hamid; Abdollahi, Iraj; Rajabzadeh, Fatemeh; Pourahmadi, Mohammad Reza; Emami, Mahnaz
2018-04-24
Rehabilitative ultrasound (US) imaging is one of the popular methods for investigating muscle morphologic characteristics and dimensions in recent years. The reliability of this method has been investigated in different studies. As studies have been performed with different designs and quality, reported values of rehabilitative US have a wide range. The objective of this study was to systematically review the literature conducted on the reliability of rehabilitative US imaging for the assessment of deep abdominal and lumbar trunk muscle dimensions. The PubMed/MEDLINE, Scopus, Google Scholar, Science Direct, Embase, Physiotherapy Evidence, Ovid, and CINAHL databases were searched to identify original research articles conducted on the reliability of rehabilitative US imaging published from June 2007 to August 2017. The articles were qualitatively assessed; reliability data were extracted; and the methodological quality was evaluated by 2 independent reviewers. Of the 26 included studies, 16 were considered of high methodological quality. Except for 2 studies, all high-quality studies reported intraclass correlation coefficients (ICCs) for intra-rater reliability of 0.70 or greater. Also, ICCs reported for inter-rater reliability in high-quality studies were generally greater than 0.70. Among low-quality studies, reported ICCs ranged from 0.26 to 0.99 and 0.68 to 0.97 for intra- and inter-rater reliability, respectively. Also, the reported standard error of measurement and minimal detectable change for rehabilitative US were generally in an acceptable range. Generally, the results of the reviewed studies indicate that rehabilitative US imaging has good levels of both inter- and intra-rater reliability. © 2018 by the American Institute of Ultrasound in Medicine.
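The reliability metrics cited above are linked by standard relations: SEM = SD * sqrt(1 - ICC) and MDC95 = 1.96 * sqrt(2) * SEM. A minimal sketch with hypothetical muscle-thickness values:

```python
# Hedged sketch of the standard relations behind the reliability metrics mentioned
# above: SEM = SD * sqrt(1 - ICC) and MDC95 = 1.96 * sqrt(2) * SEM.
# The muscle-thickness SD and ICC below are hypothetical.
import math

sd_between_subjects = 1.2    # e.g. muscle thickness SD across subjects [mm]
icc = 0.90                   # test-retest intraclass correlation coefficient

sem = sd_between_subjects * math.sqrt(1.0 - icc)
mdc95 = 1.96 * math.sqrt(2.0) * sem
print(f"SEM ~ {sem:.2f} mm, MDC95 ~ {mdc95:.2f} mm")
```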
ERIC Educational Resources Information Center
Pfefferbaum, Betty; Weems, Carl F.; Scott, Brandon G.; Nitiéma, Pascal; Noffsinger, Mary A.; Pfefferbaum, Rose L.; Varma, Vandana; Chakraburtty, Amarsha
2013-01-01
Background: A comprehensive review of the design principles and methodological approaches that have been used to make inferences from the research on disasters in children is needed. Objective: To identify the methodological approaches used to study children's reactions to three recent major disasters--the September 11, 2001, attacks; the…
JPRS Report, Soviet Union, World Economy & International Relations, No. 6, June 1989.
1989-10-05
analysis of the discussions, held in France, is presented in the article "On Single European Market" written by I. EGOROV, our Paris correspondent... analysis that it studies current problems of international politics. But what are the basic theoretical principles of the new political thinking... loyalty to Marxism-Leninism is tested by the ability of its followers to use its theoretical and methodological principles for the creative analysis of
Adaptable Constrained Genetic Programming: Extensions and Applications
NASA Technical Reports Server (NTRS)
Janikow, Cezary Z.
2005-01-01
An evolutionary algorithm applies evolution-based principles to problem solving. To solve a problem, the user defines the space of potential solutions, the representation space. Sample solutions are encoded in a chromosome-like structure. The algorithm maintains a population of such samples, which undergo simulated evolution by means of mutation, crossover, and survival of the fittest principles. Genetic Programming (GP) uses tree-like chromosomes, providing very rich representation suitable for many problems of interest. GP has been successfully applied to a number of practical problems such as learning Boolean functions and designing hardware circuits. To apply GP to a problem, the user needs to define the actual representation space, by defining the atomic functions and terminals labeling the actual trees. The sufficiency principle requires that the label set be sufficient to build the desired solution trees. The closure principle allows the labels to mix in any arity-consistent manner. To satisfy both principles, the user is often forced to provide a large label set, with ad hoc interpretations or penalties to deal with undesired local contexts. This unfortunately enlarges the actual representation space, and thus usually slows down the search. In the past few years, three different methodologies have been proposed to allow the user to alleviate the closure principle by providing means to define, and to process, constraints on mixing the labels in the trees. Last summer we proposed a new methodology to further alleviate the problem by discovering local heuristics for building quality solution trees. A pilot system was implemented last summer and tested throughout the year. This summer we have implemented a new revision, and produced a User's Manual so that the pilot system can be made available to other practitioners and researchers. We have also designed, and partly implemented, a larger system capable of dealing with much more powerful heuristics.
Accountability and Internal Control--Do We Really Need It?
ERIC Educational Resources Information Center
Clarke, Allan B.
1987-01-01
Briefly looks at some of the basic principles of accountability and internal control as a review of present accounting system procedures to aid administrators to ensure reliable financial records. (MLF)
Computational methods for efficient structural reliability and reliability sensitivity analysis
NASA Technical Reports Server (NTRS)
Wu, Y.-T.
1993-01-01
This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
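The adaptive scheme itself is not reproduced here, but the core importance-sampling estimator it builds on can be sketched as follows: samples are drawn from densities shifted toward the failure domain and reweighted by the likelihood ratio, with the normal-case closed form as a check. All parameters are invented.

```python
# Hedged sketch: plain (non-adaptive) importance sampling of a small failure
# probability P(g(X) < 0) with g = R - S, illustrating why sampling densities
# concentrated near the failure domain reduce variance. Not the paper's AIS algorithm.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 20_000
mu_R, sd_R, mu_S, sd_S = 300.0, 30.0, 200.0, 40.0

def g(R, S):                       # performance function: failure when g < 0
    return R - S

# Sample R and S from densities shifted toward the failure region (R low, S high)
R = rng.normal(mu_R - 60.0, sd_R, n)
S = rng.normal(mu_S + 60.0, sd_S, n)

# Likelihood-ratio weights correct for the shifted sampling densities
w = (stats.norm.pdf(R, mu_R, sd_R) * stats.norm.pdf(S, mu_S, sd_S)) / (
    stats.norm.pdf(R, mu_R - 60.0, sd_R) * stats.norm.pdf(S, mu_S + 60.0, sd_S))

p_fail = np.mean(w * (g(R, S) < 0.0))
beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)
print(p_fail, stats.norm.cdf(-beta))   # estimate vs. exact normal-case answer
```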
Observations of fallibility in applications of modern programming methodologies
NASA Technical Reports Server (NTRS)
Gerhart, S. L.; Yelowitz, L.
1976-01-01
Errors, inconsistencies, or confusing points are noted in a variety of published algorithms, many of which are being used as examples in formulating or teaching principles of such modern programming methodologies as formal specification, systematic construction, and correctness proving. Common properties of these points of contention are abstracted. These properties are then used to pinpoint possible causes of the errors and to formulate general guidelines which might help to avoid further errors. The common characteristic of mathematical rigor and reasoning in these examples is noted, leading to some discussion about fallibility in mathematics, and its relationship to fallibility in these programming methodologies. The overriding goal is to cast a more realistic perspective on the methodologies, particularly with respect to older methodologies, such as testing, and to provide constructive recommendations for their improvement.
Yamato, Tie Parma; Maher, Chris; Koes, Bart; Moseley, Anne
2017-06-01
The Physiotherapy Evidence Database (PEDro) scale has been widely used to investigate methodological quality in physiotherapy randomized controlled trials; however, its validity has not been tested for pharmaceutical trials. The aim of this study was to investigate the validity and interrater reliability of the PEDro scale for pharmaceutical trials. The reliability was also examined for the Cochrane Back and Neck (CBN) Group risk of bias tool. This is a secondary analysis of data from a previous study. We considered randomized placebo controlled trials evaluating any pain medication for chronic spinal pain or osteoarthritis. Convergent validity was evaluated by correlating the PEDro score with the summary score of the CBN risk of bias tool. The construct validity was tested using a linear regression analysis to determine the degree to which the total PEDro score is associated with treatment effect sizes, journal impact factor, and the summary score for the CBN risk of bias tool. The interrater reliability was estimated using the Prevalence and Bias Adjusted Kappa coefficient and 95% confidence interval (CI) for the PEDro scale and CBN risk of bias tool. Fifty-three trials were included, with 91 treatment effect sizes included in the analyses. The correlation between PEDro scale and CBN risk of bias tool was 0.83 (95% CI 0.76-0.88) after adjusting for reliability, indicating strong convergence. The PEDro score was inversely associated with effect sizes, significantly associated with the summary score for the CBN risk of bias tool, and not associated with the journal impact factor. The interrater reliability for each item of the PEDro scale and CBN risk of bias tool was at least substantial for most items (>0.60). The intraclass correlation coefficient for the PEDro score was 0.80 (95% CI 0.68-0.88), and for the CBN, risk of bias tool was 0.81 (95% CI 0.69-0.88). There was evidence for the convergent and construct validity for the PEDro scale when used to evaluate methodological quality of pharmacological trials. Both risk of bias tools have acceptably high interrater reliability. Copyright © 2017 Elsevier Inc. All rights reserved.
Comprehensive Design Reliability Activities for Aerospace Propulsion Systems
NASA Technical Reports Server (NTRS)
Christenson, R. L.; Whitley, M. R.; Knight, K. C.
2000-01-01
This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on the mechanical design reliability of propulsion systems. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.
Van der Elst, Wim; Molenberghs, Geert; Hilgers, Ralf-Dieter; Verbeke, Geert; Heussen, Nicole
2016-11-01
There are various settings in which researchers are interested in the assessment of the correlation between repeated measurements that are taken within the same subject (i.e., reliability). For example, the same rating scale may be used to assess the symptom severity of the same patients by multiple physicians, or the same outcome may be measured repeatedly over time in the same patients. Reliability can be estimated in various ways, for example, using the classical Pearson correlation or the intra-class correlation in clustered data. However, contemporary data often have a complex structure that goes well beyond the restrictive assumptions that are needed with the more conventional methods to estimate reliability. In the current paper, we propose a general and flexible modeling approach that allows for the derivation of reliability estimates, standard errors, and confidence intervals - appropriately taking hierarchies and covariates in the data into account. Our methodology is developed for continuous outcomes together with covariates of an arbitrary type. The methodology is illustrated in a case study, and a Web Appendix is provided which details the computations using the R package CorrMixed and the SAS software. Copyright © 2016 John Wiley & Sons, Ltd.
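As a rough illustration of the general idea (reliability as the between-subject share of total variance, estimated from a hierarchical model), the sketch below fits a random-intercept mixed model to simulated repeated measurements in Python's statsmodels. It is not the CorrMixed or SAS implementation described in the paper, and the data are synthetic.

```python
# Minimal sketch of the general idea (not the CorrMixed implementation):
# estimate reliability as the between-subject share of variance from a
# random-intercept mixed model fitted to repeated measurements.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects, n_repeats = 50, 4
subject = np.repeat(np.arange(n_subjects), n_repeats)
true_level = rng.normal(0, 2.0, n_subjects)                  # between-subject variability
y = true_level[subject] + rng.normal(0, 1.0, subject.size)   # within-subject noise
data = pd.DataFrame({"y": y, "subject": subject})

model = smf.mixedlm("y ~ 1", data, groups=data["subject"]).fit()
var_between = float(model.cov_re.iloc[0, 0])   # random-intercept variance
var_within = float(model.scale)                # residual variance
reliability = var_between / (var_between + var_within)
print(f"estimated reliability (ICC) = {reliability:.2f}")
```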
Arheart, Kristopher L; Sly, David F; Trapido, Edward J; Rodriguez, Richard D; Ellestad, Amy J
2004-11-01
To identify multi-item attitude/belief scales associated with the theoretical foundations of an anti-tobacco counter-marketing campaign and assess their reliability and validity. The data analyzed are from two state-wide, random, cross-sectional telephone surveys [n(S1)=1,079, n(S2)=1,150]. Items forming attitude/belief scales are identified using factor analysis. Reliability is assessed with Cronbach's alpha. Relationships among scales are explored using Pearson correlation. Validity is assessed by testing associations derived from the Centers for Disease Control and Prevention's (CDC) logic model for tobacco control program development and evaluation linking media exposure to attitudes/beliefs, and attitudes/beliefs to smoking-related behaviors. Adjusted odds ratios are employed for these analyses. Three factors emerged: traditional attitudes/beliefs about tobacco and tobacco use, tobacco industry manipulation and anti-tobacco empowerment. Reliability coefficients are in the range of 0.70 and vary little between age groups. The factors are correlated with one another as hypothesized. Associations between media exposure and the attitude/belief scales and between these scales and behaviors are consistent with the CDC logic model. Using reliable, valid multi-item scales is theoretically and methodologically more sound than employing single-item measures of attitudes/beliefs. Methodological, theoretical and practical implications are discussed.
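Cronbach's alpha, used above to assess scale reliability, is straightforward to compute from an items-by-respondents score matrix. The sketch below uses simulated item scores, not the survey data.

```python
# Hedged illustration (not the study's data): Cronbach's alpha for a
# multi-item attitude/belief scale, computed from a respondents-by-items matrix.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))                      # shared attitude factor
items = latent + rng.normal(scale=1.0, size=(200, 5))   # five correlated items
print(f"alpha = {cronbach_alpha(items):.2f}")
```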
Validation in the cross-cultural adaptation of the Korean version of the Oswestry Disability Index.
Jeon, Chang-Hoon; Kim, Dong-Jae; Kim, Se-Kang; Kim, Dong-Jun; Lee, Hwan-Mo; Park, Heui-Jeon
2006-12-01
Disability questionnaires are used for clinical assessment, outcome measurement, and research methodology. Any disability measurement must be adapted culturally for comparability of data when the patients being measured use different languages. This study aimed to conduct cross-cultural adaptation in translating the original (English) version of the Oswestry Disability Index (ODI) into Korean, and then to assess the reliability of the Korean version of the Oswestry Disability Index (KODI). We used methodology to obtain semantic, idiomatic, experimental, and conceptual equivalences for the process of cross-cultural adaptation. The KODI was tested in 116 patients with chronic low back pain. The internal consistency and reliability for the KODI reached 0.9168 (Cronbach's alpha). The test-retest reliability was assessed with 32 patients (who were not included in the assessment of Cronbach's alpha) over a time interval of 4 days. Test-retest correlation reliability was 0.9332. The entire process and the results of this study were reported to the developer (Dr. Fairbank JC), who appraised the KODI. There is little evidence of differential item functioning in the KODI. The results suggest that the KODI is internally consistent and reliable. Therefore, the KODI can be recommended as a low back pain assessment tool in Korea.
Validation in the Cross-Cultural Adaptation of the Korean Version of the Oswestry Disability Index
Kim, Dong-Jae; Kim, Se-Kang; Kim, Dong-Jun; Lee, Hwan-Mo; Park, Heui-Jeon
2006-01-01
Disability questionnaires are used for clinical assessment, outcome measurement, and research methodology. Any disability measurement must be adapted culturally for comparability of data when the patients being measured use different languages. This study aimed to conduct cross-cultural adaptation in translating the original (English) version of the Oswestry Disability Index (ODI) into Korean, and then to assess the reliability of the Korean version of the Oswestry Disability Index (KODI). We used methodology to obtain semantic, idiomatic, experimental, and conceptual equivalences for the process of cross-cultural adaptation. The KODI was tested in 116 patients with chronic low back pain. The internal consistency and reliability for the KODI reached 0.9168 (Cronbach's alpha). The test-retest reliability was assessed with 32 patients (who were not included in the assessment of Cronbach's alpha) over a time interval of 4 days. Test-retest correlation reliability was 0.9332. The entire process and the results of this study were reported to the developer (Dr. Fairbank JC), who appraised the KODI. There is little evidence of differential item functioning in the KODI. The results suggest that the KODI is internally consistent and reliable. Therefore, the KODI can be recommended as a low back pain assessment tool in Korea. PMID:17179693
Developing an oropharyngeal cancer (OPC) knowledge and behaviors survey.
Dodd, Virginia J; Riley Iii, Joseph L; Logan, Henrietta L
2012-09-01
To use the community participation research model to (1) develop a survey assessing knowledge about mouth and throat cancer and (2) field test and establish test-retest reliability with the newly developed instrument. Cognitive interviews were conducted with primarily rural African American adults to assess their perception and interpretation of survey items. Test-retest reliability was established with a racially diverse rural population. Test-retest reliabilities ranged from .79 to .40 for screening awareness and .74 to .19 for knowledge. Coefficients increased for composite scores. Community participation methodology provided a culturally appropriate survey instrument that demonstrated acceptable levels of reliability.
Publishing FAIR Data: An Exemplar Methodology Utilizing PHI-Base.
Rodríguez-Iglesias, Alejandro; Rodríguez-González, Alejandro; Irvine, Alistair G; Sesma, Ane; Urban, Martin; Hammond-Kosack, Kim E; Wilkinson, Mark D
2016-01-01
Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species vs. the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be "FAIR"-Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences-the Pathogen-Host Interaction Database (PHI-base)-to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings.
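As a generic illustration of the machine-readable side of FAIR publishing (not PHI-base's actual schema; the namespace and record below are hypothetical), a single interaction record can be expressed as RDF triples with a provenance link.

```python
# Generic Linked Data sketch (hypothetical URIs, not PHI-base's actual schema):
# publish one pathogen-host interaction record as RDF with basic provenance,
# illustrating the machine-readable side of FAIR publishing.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

EX = Namespace("http://example.org/phi/")        # hypothetical namespace
g = Graph()
interaction = URIRef(EX["interaction/0001"])

g.add((interaction, RDF.type, EX.PathogenHostInteraction))
g.add((interaction, EX.pathogen, Literal("Fusarium graminearum")))
g.add((interaction, EX.host, Literal("Triticum aestivum")))
g.add((interaction, EX.phenotype, Literal("reduced virulence")))
g.add((interaction, DCTERMS.source, URIRef("http://example.org/paper/123")))  # provenance link

print(g.serialize(format="turtle"))
```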
Publishing FAIR Data: An Exemplar Methodology Utilizing PHI-Base
Rodríguez-Iglesias, Alejandro; Rodríguez-González, Alejandro; Irvine, Alistair G.; Sesma, Ane; Urban, Martin; Hammond-Kosack, Kim E.; Wilkinson, Mark D.
2016-01-01
Pathogen-Host interaction data is core to our understanding of disease processes and their molecular/genetic bases. Facile access to such core data is particularly important for the plant sciences, where individual genetic and phenotypic observations have the added complexity of being dispersed over a wide diversity of plant species vs. the relatively fewer host species of interest to biomedical researchers. Recently, an international initiative interested in scholarly data publishing proposed that all scientific data should be “FAIR”—Findable, Accessible, Interoperable, and Reusable. In this work, we describe the process of migrating a database of notable relevance to the plant sciences—the Pathogen-Host Interaction Database (PHI-base)—to a form that conforms to each of the FAIR Principles. We discuss the technical and architectural decisions, and the migration pathway, including observations of the difficulty and/or fidelity of each step. We examine how multiple FAIR principles can be addressed simultaneously through careful design decisions, including making data FAIR for both humans and machines with minimal duplication of effort. We note how FAIR data publishing involves more than data reformatting, requiring features beyond those exhibited by most life science Semantic Web or Linked Data resources. We explore the value-added by completing this FAIR data transformation, and then test the result through integrative questions that could not easily be asked over traditional Web-based data resources. Finally, we demonstrate the utility of providing explicit and reliable access to provenance information, which we argue enhances citation rates by encouraging and facilitating transparent scholarly reuse of these valuable data holdings. PMID:27433158
Fleet management performance monitoring.
DOT National Transportation Integrated Search
2013-05-01
The principal goal of this project was to enhance and expand the analytical modeling methodology previously developed as part of the Fleet Management Criteria: Disposal Points and Utilization Rates project completed in 2010. The enhanced and ex...
Rogers, Stephen C.; Gibbons, Lindsey B.; Griffin, Sherraine; Doctor, Allan
2012-01-01
This chapter summarizes the principles of RSNO measurement in the gas phase, utilizing ozone-based chemiluminescence and the copper cysteine (2C) ± carbon monoxide (3C) reagent. Although an indirect method for quantifying RSNOs, this assay represents one of the most robust methodologies available. It exploits the NO• detection sensitivity of ozone-based chemiluminescence, which is within the range required to detect physiological concentrations of RSNO metabolites. Additionally, the specificity of the copper cysteine (2C and 3C) reagent for RSNOs negates the need for sample pretreatment, thereby minimizing the likelihood of sample contamination (false positive results), NO species inter-conversion, or the loss of certain highly labile RSNO species. Herein, we outline the principles of this methodology, summarizing key issues, potential pitfalls and corresponding solutions. PMID:23116707
Reliability and Validity of the International Physical Activity Questionnaire for Assessing Walking
ERIC Educational Resources Information Center
van der Ploeg, Hidde P.; Tudor-Locke, Catrine; Marshall, Alison L.; Craig, Cora; Hagstromer, Maria; Sjostrom, Michael; Bauman, Adrian
2010-01-01
The single most commonly reported physical activity in public health surveys is walking. As evidence accumulates that walking is important for preventing weight gain and reducing the risk of diabetes, there is increased need to capture this behavior in a valid and reliable manner. Although the disadvantages of a self-report methodology are well…
ERIC Educational Resources Information Center
Wang, Yan Z.; Wiley, Angela R.; Zhou, Xiaobin
2007-01-01
This study used a mixed methodology to investigate reliability, validity, and analysis level with Chinese immigrant observational data. European-American and Chinese coders quantitatively rated 755 minutes of Chinese immigrant parent-toddler dinner interactions on parental sensitivity, intrusiveness, detachment, negative affect, positive affect,…
The Reliability of Methodological Ratings for speechBITE Using the PEDro-P Scale
ERIC Educational Resources Information Center
Murray, Elizabeth; Power, Emma; Togher, Leanne; McCabe, Patricia; Munro, Natalie; Smith, Katherine
2013-01-01
Background: speechBITE (http://www.speechbite.com) is an online database established in order to help speech and language therapists gain faster access to relevant research that can be used in clinical decision-making. In addition to containing more than 3000 journal references, the database also provides methodological ratings on the PEDro-P (an…
ERIC Educational Resources Information Center
Lee, Jihyun; Jang, Seonyoung
2014-01-01
Instructional design (ID) models have been developed to promote understandings of ID reality and guide ID performance. As the number and diversity of ID practices grows, implicit doubts regarding the reliability, validity, and usefulness of ID models suggest the need for methodological guidance that would help to generate ID models that are…
Scaled CMOS Technology Reliability Users Guide
NASA Technical Reports Server (NTRS)
White, Mark
2010-01-01
The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (β) = 1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is presented, revealing a power relationship. General models describing the soft error rates across scaled product generations are presented. The analysis methodology may be applied to other scaled microelectronic products and their key parameters.
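The product-level quantification described above rests on standard accelerated-test arithmetic. The sketch below shows an Arrhenius temperature acceleration factor and a failure rate expressed in FIT (failures per 10^9 device-hours); the activation energy, test conditions, and failure count are hypothetical, not the paper's data.

```python
# Hedged illustration of standard accelerated-test arithmetic (hypothetical
# numbers, not the paper's data): Arrhenius temperature acceleration and a
# failure rate expressed in FIT (failures per 1e9 device-hours).
import math

K_BOLTZMANN_EV = 8.617e-5   # eV/K

def arrhenius_af(ea_ev: float, t_use_c: float, t_stress_c: float) -> float:
    """Acceleration factor of a stress test at t_stress_c relative to use at t_use_c."""
    t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# Hypothetical stress test: 500 devices, 1000 h at 125 C, 3 failures; use at 55 C.
af = arrhenius_af(ea_ev=0.7, t_use_c=55.0, t_stress_c=125.0)
equivalent_use_hours = 500 * 1000 * af
fit = 3 / equivalent_use_hours * 1e9
print(f"acceleration factor = {af:.1f}, failure rate = {fit:.1f} FIT")
```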
Preliminary study of the reliability of imaging charge coupled devices
NASA Technical Reports Server (NTRS)
Beall, J. R.; Borenstein, M. D.; Homan, R. A.; Johnson, D. L.; Wilson, D. D.; Young, V. F.
1978-01-01
Imaging CCDs are capable of low light level response and high signal-to-noise ratios. In space applications they offer the user the ability to achieve extremely high resolution imaging with minimum circuitry in the photo sensor array. This work relates the CCD121H Fairchild device to the fundamentals of CCDs and the representative technologies. Several failure modes are described, construction is analyzed and test results are reported. In addition, the relationship of the device reliability to packaging principles is analyzed and test data presented. Finally, a test program is defined for more general reliability evaluation of CCDs.
Metrological Reliability of Medical Devices
NASA Astrophysics Data System (ADS)
Costa Monteiro, E.; Leon, L. F.
2015-02-01
The prominent development of health technologies of the 20th century triggered demands for metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of the biomeasurement results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with an analysis of their contributions to guaranteeing that innovative health technologies comply with the main ethical pillars of Bioethics.
Jensen, Eric Allen
2017-01-01
With the rapid global proliferation of social media, there has been growing interest in using this existing source of easily accessible 'big data' to develop social science knowledge. However, amidst the big data gold rush, it is important that long-established principles of good social research are not ignored. This article critically evaluates Mitchell et al.'s (2013) study, 'The Geography of Happiness: Connecting Twitter Sentiment and Expression, Demographics, and Objective Characteristics of Place', demonstrating the importance of attending to key methodological issues associated with secondary data analysis.
Seals Research at AlliedSignal
NASA Technical Reports Server (NTRS)
Ullah, M. Rifat
1996-01-01
A consortium has been formed to address seal problems in the Aerospace sector of Allied Signal, Inc. The consortium is represented by makers of Propulsion Engines, Auxiliary Power Units, Gas Turbine Starters, etc. The goal is to improve Face Seal reliability, since Face Seals have become reliability drivers in many of our product lines. Several research programs are being implemented simultaneously this year. They include: Face Seal Modeling and Analysis Methodology; Oil Cooling of Seals; Seal Tracking Dynamics; Coking Formation & Prevention; and Seal Reliability Methods.
NASA Technical Reports Server (NTRS)
Anderson, B. H.
1983-01-01
A broad program to develop advanced, reliable, and user oriented three-dimensional viscous design techniques for supersonic inlet systems, and encourage their transfer into the general user community is discussed. Features of the program include: (1) develop effective methods of computing three-dimensional flows within a zonal modeling methodology; (2) ensure reasonable agreement between said analysis and selective sets of benchmark validation data; (3) develop user orientation into said analysis; and (4) explore and develop advanced numerical methodology.
Emerging technologies for the changing global market
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
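A QFD-style prioritization of this kind reduces to a weighted scoring matrix: each candidate technology is rated against each criterion and the ratings are combined with importance weights. The sketch below is a minimal illustration with hypothetical weights and candidates, not the project's actual matrix.

```python
# Minimal sketch of a QFD-style weighted prioritization
# (hypothetical weights and candidate technologies, not the project's data).
import numpy as np

criteria = ["environmental", "cost", "safety", "reliability", "programmatic"]
weights = np.array([0.30, 0.20, 0.25, 0.15, 0.10])        # importance weights, sum to 1

candidates = ["replacement A", "replacement B", "replacement C"]
ratings = np.array([                                       # 1 (poor) .. 9 (strong) per criterion
    [9, 3, 7, 5, 5],
    [5, 7, 5, 7, 3],
    [3, 9, 3, 9, 7],
])

scores = ratings @ weights                                 # weighted score per candidate
for name, score in sorted(zip(candidates, scores), key=lambda p: -p[1]):
    print(f"{name}: {score:.2f}")
```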
Kirchner, M K; Schulze Westerath, H; Knierim, U; Tessitore, E; Cozzi, G; Winckler, C
2014-03-01
Consistency over time of (on-farm) animal welfare assessment systems forms part of reliability, meaning that results of the assessment should be representative of the longer-term welfare state of the farm as long as the housing and management conditions have not changed considerably. This is especially important if assessments are to be used for certification purposes. It was the aim of the present study to investigate consistency over time of the Welfare Quality(®) (WQ(®)) assessment system for fattening cattle at single measure level, aggregated criterion and principle scores, and overall classification across short-term (1 month) and longer-term periods (6 months). We hypothesized that consistency over time of aggregated criterion and principle scores is higher than that of single measures. Consistency was also expected to be lower with longer intervals between assessments. Data were obtained using the WQ(®) protocol for fattening cattle during three visits (months 0, 1 and 7) on 63 beef farms in Austria, Germany and Italy. Only data from farms where no major changes in housing and management had taken place were considered for analysis. At the single measure level, Spearman rank correlations between visits were >0.7 and variance was lower within farms than between farms for six and two of 19 measures after 1 month and 6 months, respectively. After aggregation of single measures into criterion and principle scores, five and two of 10 criteria and three and one of four principles were found reliable after 1 and 6 months, respectively. Seventy-nine per cent and 75% of the farms were allocated to the same overall welfare category after 1 month and 6 months. Possible reasons for a lack of consistency are seasonal effects or short-term fluctuations that occur under normal farm conditions, low prevalence of clinical measures and probably insufficient sample size, whereas poor inter-observer agreement leading to inflation of correlation can be ruled out. At the criterion and principle level, aggregation of information into scores appears to partly smoothen undirected variation at the single measure level without losing sensitivity in terms of welfare evaluation. Reliable on-farm animal welfare assessments should therefore be based on repeated assessments. Further long-term studies are recommended to better understand the factors influencing consistency over time.
Inauen, A; Jenny, G J; Bauer, G F
2012-06-01
This article focuses on organizational analysis in workplace health promotion (WHP) projects. It shows how this analysis can be designed such that it provides rational data relevant to the further context-specific and goal-oriented planning of WHP and equally supports individual and organizational change processes implied by WHP. Design principles for organizational analysis were developed on the basis of a narrative review of the guiding principles of WHP interventions and organizational change as well as the scientific principles of data collection. Further, the practical experience of WHP consultants who routinely conduct organizational analysis was considered. This resulted in a framework with data-oriented and change-oriented design principles, addressing the following elements of organizational analysis in WHP: planning the overall procedure, data content, data-collection methods and information processing. Overall, the data-oriented design principles aim to produce valid, reliable and representative data, whereas the change-oriented design principles aim to promote motivation, coherence and a capacity for self-analysis. We expect that the simultaneous consideration of data- and change-oriented design principles for organizational analysis will strongly support the WHP process. We finally illustrate the applicability of the design principles to health promotion within a WHP case study.
Life on the arc: principle-centered comprehensive care.
Fohey, T; Cassidy, J L
1998-01-01
Today's dental practice is experiencing an evolution in the manner through which new materials and techniques are marketed and introduced. An increasing concern among the patient population regarding aesthetics contributes to the acceptance of a commodity dental philosophy, without questioning the reliability of the technique or new material. A principle-centered practice differentiates the product marketing from the viability of a restorative material in vivo. This article discusses the concept of a principle-centered practice and describes how to place quality products in a balanced system in which harmony exists between all components of the masticatory system: the teeth, the muscles, and the temporomandibular joints.
NASA Astrophysics Data System (ADS)
Cardona Quintero, Y.; Ramanath, Ganpati; Ramprasad, R.
2013-10-01
A parameter-free, quantitative, first-principles methodology to determine the environment-dependent interfacial strength of metal-metal oxide interfaces is presented. This approach uses the notion of the weakest link to identify the most likely cleavage plane, and first principles thermodynamics to calculate the average work of separation as a function of the environment (in this case, temperature and oxygen pressure). The method is applied to the case of the Pt-HfO2 interface, and it is shown that the computed environment-dependent work of separation is in quantitative agreement with available experimental data.
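The quantities involved are the standard ones of ab initio interface thermodynamics. In illustrative notation (not necessarily the paper's own), the work of separation and the oxygen chemical potential that carries the temperature and pressure dependence are:

```latex
% Illustrative notation (not necessarily the paper's own): ideal work of
% separation of a metal/oxide interface and the oxygen chemical potential
% that carries the temperature and pressure dependence.
\begin{align}
  W_{\mathrm{sep}} &= \frac{E_{\mathrm{slab}}^{\mathrm{metal}}
      + E_{\mathrm{slab}}^{\mathrm{oxide}} - E_{\mathrm{interface}}}{A} \\
  \mu_{\mathrm{O}}(T, p_{\mathrm{O_2}}) &= \mu_{\mathrm{O}}(T, p^{0})
      + \tfrac{1}{2}\, k_{\mathrm{B}} T\,
        \ln\!\left(\frac{p_{\mathrm{O_2}}}{p^{0}}\right)
\end{align}
```

The oxygen chemical potential enters the free energies of any oxygen-containing cleavage surfaces, which is what turns the work of separation into a function of temperature and oxygen pressure.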
Charpentier, R.R.; Klett, T.R.
2005-01-01
During the last 30 years, the methodology for assessment of undiscovered conventional oil and gas resources used by the U.S. Geological Survey has undergone considerable change. This evolution has been based on five major principles. First, the U.S. Geological Survey has responsibility for a wide range of U.S. and world assessments and requires a robust methodology suitable for immaturely explored as well as maturely explored areas. Second, the assessments should be based on as comprehensive a set of geological and exploration history data as possible. Third, the perils of methods that solely use statistical methods without geological analysis are recognized. Fourth, the methodology and course of the assessment should be documented as transparently as possible, within the limits imposed by the inevitable use of subjective judgement. Fifth, the multiple uses of the assessments require a continuing effort to provide the documentation in such ways as to increase utility to the many types of users. Undiscovered conventional oil and gas resources are those recoverable volumes in undiscovered, discrete, conventional structural or stratigraphic traps. The USGS 2000 methodology for these resources is based on a framework of assessing numbers and sizes of undiscovered oil and gas accumulations and the associated risks. The input is standardized on a form termed the Seventh Approximation Data Form for Conventional Assessment Units. Volumes of resource are then calculated using a Monte Carlo program named Emc2, but an alternative analytic (non-Monte Carlo) program named ASSESS also can be used. The resource assessment methodology continues to change. Accumulation-size distributions are being examined to determine how sensitive the results are to size-distribution assumptions. The resource assessment output is changing to provide better applicability for economic analysis. The separate methodology for assessing continuous (unconventional) resources also has been evolving. Further studies of the relationship between geologic models of conventional and continuous resources will likely impact the respective resource assessment methodologies. © 2005 International Association for Mathematical Geology.
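The aggregation step of such an assessment can be illustrated with a toy Monte Carlo in the spirit of the approach described (this is not the USGS Emc2 program; the distributions and parameters are invented).

```python
# Toy Monte Carlo aggregation of undiscovered resources (not the USGS Emc2
# program; distributions and parameters are hypothetical).
import numpy as np

rng = np.random.default_rng(42)
n_trials = 20_000

# Hypothetical assessment-unit inputs: number of undiscovered accumulations
# (triangular) and accumulation size in million barrels (lognormal).
n_accumulations = rng.triangular(left=1, mode=5, right=20, size=n_trials).astype(int)
totals = np.array([
    rng.lognormal(mean=np.log(10.0), sigma=1.0, size=n).sum()
    for n in n_accumulations
])

f95, f50, f5 = np.percentile(totals, [5, 50, 95])   # F95 = 95% chance of at least this much
print(f"undiscovered resource (MMBO): F95={f95:.0f}, F50={f50:.0f}, F5={f5:.0f}")
```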
Feynman’s clock, a new variational principle, and parallel-in-time quantum dynamics
McClean, Jarrod R.; Parkhill, John A.; Aspuru-Guzik, Alán
2013-01-01
We introduce a discrete-time variational principle inspired by the quantum clock originally proposed by Feynman and use it to write down quantum evolution as a ground-state eigenvalue problem. The construction allows one to apply ground-state quantum many-body theory to quantum dynamics, extending the reach of many highly developed tools from this fertile research area. Moreover, this formalism naturally leads to an algorithm to parallelize quantum simulation over time. We draw an explicit connection between previously known time-dependent variational principles and the time-embedded variational principle presented. Sample calculations are presented, applying the idea to a hydrogen molecule and the spin degrees of freedom of a model inorganic compound, demonstrating the parallel speedup of our method as well as its flexibility in applying ground-state methodologies. Finally, we take advantage of the unique perspective of this variational principle to examine the error of basis approximations in quantum dynamics. PMID:24062428
Islam and the four principles of medical ethics.
Mustafa, Yassar
2014-07-01
The principles underpinning Islam's ethical framework applied to routine clinical scenarios remain insufficiently understood by many clinicians, thereby unfortunately permitting the delivery of culturally insensitive healthcare. This paper summarises the foundations of the Islamic ethical theory, elucidating the principles and methodology employed by the Muslim jurist in deriving rulings in the field of medical ethics. The four-principles approach, as espoused by Beauchamp and Childress, is also interpreted through the prism of Islamic ethical theory. Each of the four principles (beneficence, nonmaleficence, justice and autonomy) is investigated in turn, looking in particular at the extent to which each is rooted in the Islamic paradigm. This will provide an important insight into Islamic medical ethics, enabling the clinician to have a better-informed discussion with the Muslim patient. It will also allow for a higher degree of concordance in consultations and consequently optimise culturally sensitive healthcare delivery.
NASA Astrophysics Data System (ADS)
Artrith, Nongnuch; Urban, Alexander; Ceder, Gerbrand
2018-06-01
The atomistic modeling of amorphous materials requires structure sizes and sampling statistics that are challenging to achieve with first-principles methods. Here, we propose a methodology to speed up the sampling of amorphous and disordered materials using a combination of a genetic algorithm and a specialized machine-learning potential based on artificial neural networks (ANNs). We show for the example of the amorphous LiSi alloy that around 1000 first-principles calculations are sufficient for the ANN-potential assisted sampling of low-energy atomic configurations in the entire amorphous LixSi phase space. The obtained phase diagram is validated by comparison with the results from an extensive sampling of LixSi configurations using molecular dynamics simulations and a general ANN potential trained to ~45 000 first-principles calculations. This demonstrates the utility of the approach for the first-principles modeling of amorphous materials.
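A toy sketch may help convey the structure of such a sampling loop: a genetic algorithm proposes configurations and a cheap surrogate stands in for the ANN potential. This is a schematic illustration only, not the paper's implementation; the energy function and parameters are invented.

```python
# Toy sketch of genetic-algorithm sampling guided by a surrogate energy
# function (a stand-in for an ANN potential; not the paper's implementation).
import numpy as np

rng = np.random.default_rng(7)

def surrogate_energy(x: np.ndarray) -> float:
    """Cheap stand-in for an ANN potential: a rugged synthetic landscape."""
    return float(np.sum(x**2 + 0.5 * np.sin(5 * x)))

def evolve(pop_size=30, n_coords=12, n_generations=100, mutation_scale=0.1):
    population = rng.normal(0, 1.0, size=(pop_size, n_coords))   # random starting configurations
    for _ in range(n_generations):
        energies = np.array([surrogate_energy(x) for x in population])
        parents = population[np.argsort(energies)[: pop_size // 2]]   # keep lowest-energy half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(n_coords) < 0.5                          # uniform crossover
            child = np.where(mask, a, b) + rng.normal(0, mutation_scale, n_coords)
            children.append(child)
        population = np.vstack([parents, children])
    best = min(population, key=surrogate_energy)
    return best, surrogate_energy(best)

best_config, best_energy = evolve()
print(f"lowest surrogate energy found: {best_energy:.3f}")
```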
Mechanical System Reliability and Cost Integration Using a Sequential Linear Approximation Method
NASA Technical Reports Server (NTRS)
Kowal, Michael T.
1997-01-01
The development of new products is dependent on product designs that incorporate high levels of reliability along with a design that meets predetermined levels of system cost. Additional constraints on the product include explicit and implicit performance requirements. Existing reliability and cost prediction methods result in no direct linkage between variables affecting these two dominant product attributes. A methodology to integrate reliability and cost estimates using a sequential linear approximation method is proposed. The sequential linear approximation method utilizes probability of failure sensitivities determined from probabilistic reliability methods as well as manufacturing cost sensitivities. The application of the sequential linear approximation method to a mechanical system is demonstrated.
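One step of such a scheme can be sketched generically: linearize cost and failure probability around the current design using the sensitivities, then solve the linearized subproblem subject to move limits. The numbers below are hypothetical and the formulation is a simplified stand-in for the paper's method.

```python
# Generic sketch of one sequential-linear-approximation step (not the paper's
# exact formulation): linearize cost and failure probability around the current
# design using sensitivities, then solve the linearized subproblem as an LP.
import numpy as np
from scipy.optimize import linprog

# Hypothetical current design with two design variables.
pf0 = 1e-3                                # current probability of failure
dpf_dx = np.array([-4e-4, -1e-4])         # failure-probability sensitivities
dcost_dx = np.array([8.0, 2.0])           # cost sensitivities

# Minimize linearized cost change subject to a linearized reliability
# requirement pf0 + dpf_dx . dx <= 7e-4, with small move limits on dx.
res = linprog(
    c=dcost_dx,
    A_ub=dpf_dx.reshape(1, -1),
    b_ub=np.array([7e-4 - pf0]),
    bounds=[(-1.0, 1.0), (-1.0, 1.0)],
)
dx = res.x
print(f"design step: {dx}, predicted cost change: {dcost_dx @ dx:+.2f}, "
      f"predicted pf: {pf0 + dpf_dx @ dx:.1e}")
```

Repeating this step, re-evaluating the sensitivities at each new design point, gives the method its sequential character.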
Tautin, J.; Lebreton, J.-D.; North, P.M.
1993-01-01
Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge which in turn betters the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize benefits of the methodology, work on practical applications should be increased.
A clash of paradigms? Western and indigenous views on health research involving Aboriginal peoples.
Campbell, Theresa Diane
2014-07-01
To explore the issues of data management and data ownership with regard to health research conducted in aboriginal or indigenous populations in Canada. Research with aboriginal communities in Canada has often been conducted by researchers who had little or no understanding of the community in which the research was taking place. This led to 'helicopter' research, which benefitted the researcher but not the community. National aboriginal leadership developed the ownership, control, access, and possession (OCAP) principles, which outline how to manage research data regarding aboriginal people and to counteract disrespectful methodologies. However, these principles present their own set of challenges to those who would conduct research with aboriginal populations. Documents from the Assembly of First Nations, the Government of Canada, Aboriginal writers and researchers, and Nursing theorists and researchers. This is a methodology paper that reviews the issues of data ownership when conducting research with Aboriginal populations. The authors explore indigenous and Western views of knowledge development, outline and discuss the OCAP principles, and present the Canadian Institute of Health Research's guidelines for health research involving aboriginal people as a guide for those who want to carry out ethical and culturally competent research, do no harm and produce research that can benefit aboriginal peoples. There are special considerations associated with conducting research with Aboriginal populations. The Assembly of First Nations wants researchers to use the Ownership, Control, Access and Possession (OCAP) principles with First Nations data. These principles are restrictive and need to be discussed with stakeholders before research is undertaken. In Canada, it is imperative that researchers use the Canadian Institute of Health Research Guidelines for Health Research Involving Aboriginal People to ensure culturally sensitive and ethical conduct during the course of the research with Aboriginal populations. However, some communities may also want to use the OCAP principles and these principles will need to be taken into consideration when designing the study.
Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M
2013-09-01
Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
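The rollback of such a decision tree is a small piece of arithmetic: chance nodes take probability-weighted averages and decision nodes take the best child. The sketch below uses invented probabilities and values, not the Jupiter Orbiter analysis.

```python
# Toy expected-value rollback over a small decision tree, in the spirit of the
# sequential-decision formulation described above (all probabilities and values
# are hypothetical, not the Jupiter Orbiter analysis).

def rollback(node):
    """Return the best value of a node: decisions maximize, chance nodes average."""
    kind = node["type"]
    if kind == "outcome":
        return node["value"]
    child_values = [rollback(child) for child in node["children"]]
    if kind == "decision":
        return max(child_values)
    if kind == "chance":
        return sum(p * v for p, v in zip(node["probs"], child_values))
    raise ValueError(f"unknown node type: {kind}")

tree = {
    "type": "decision",
    "children": [
        {   # alternative 1: extra sterilization, lower contamination risk, higher cost
            "type": "chance",
            "probs": [0.999, 0.001],
            "children": [{"type": "outcome", "value": 80.0},
                         {"type": "outcome", "value": -1000.0}],
        },
        {   # alternative 2: baseline plan, higher contamination risk, lower cost
            "type": "chance",
            "probs": [0.99, 0.01],
            "children": [{"type": "outcome", "value": 100.0},
                         {"type": "outcome", "value": -1000.0}],
        },
    ],
}

print(f"best expected value: {rollback(tree):.2f}")
```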
2008-01-01
...the voting protocols for good performance while meeting the reliability requirements of data delivery in a high-assurance setting. Two metrics quantify the effectiveness of voting protocols: Data Transfer Efficiency (DTE) and Time-to-Complete (TTC) data delivery. DTE captures the network bandwidth
From Not-So-Great to Worse: The Myth of Best Practice Methodologies
2016-09-13
Collins’ arguments and suggested principled commonalities about great firms were unsupported. Resnick and Smunt conducted a financial analysis over... market performance according to Collins’ measure, and that none do so when measured according to a metric based on modern portfolio theory. We...applying the five principles to other firms or time periods will lead to anything other than average results.” By the way, Collins’ list of 11 great
Constructing the principles: Method and metaphysics in the progress of theoretical physics
NASA Astrophysics Data System (ADS)
Glass, Lawrence C.
This thesis presents a new framework for the philosophy of physics focused on methodological differences found in the practice of modern theoretical physics. The starting point for this investigation is the longstanding debate over scientific realism. Some philosophers have argued that it is the aim of science to produce an accurate description of the world including explanations for observable phenomena. These scientific realists hold that our best confirmed theories are approximately true and that the entities they propose actually populate the world, whether or not they have been observed. Others have argued that science achieves only frameworks for the prediction and manipulation of observable phenomena. These anti-realists argue that truth is a misleading concept when applied to empirical knowledge. Instead, focus should be on the empirical adequacy of scientific theories. This thesis argues that the fundamental distinction at issue, a division between true scientific theories and ones which are empirically adequate, is best explored in terms of methodological differences. In analogy with the realism debate, there are at least two methodological strategies. Rather than focusing on scientific theories as wholes, this thesis takes as units of analysis physical principles which are systematic empirical generalizations. The first possible strategy, the conservative, takes the assumption that the empirical adequacy of a theory in one domain serves as good evidence for such adequacy in other domains. This then motivates the application of the principle to new domains. The second strategy, the innovative, assumes that empirical adequacy in one domain does not justify the expectation of adequacy in other domains. New principles are offered as explanations in the new domain. The final part of the thesis is the application of this framework to two examples. In the first, Lorentz's use of the aether is reconstructed in terms of the conservative strategy with respect to the principles of Galilean relativity. A comparison between an application of the conservative strategy and TeVeS as an application of the innovative strategy constitutes the second example.
Synthetic chemists have always aimed to achieve reliable, high-yielding routes to the syntheses of targeted molecules. The importance of minimizing waste generation has emphasized the use of green chemistry principles and sustainable development. These directions lead ...
Measurement of Radon in Indoor Air.
ERIC Educational Resources Information Center
Downey, Daniel M.; Simolunas, Glenn
1988-01-01
Describes a laboratory experiment to teach the principles of air sampling, gamma ray spectroscopy, nuclear decay, and radioactive equilibrium. Analyzes radon by carbon adsorption and gamma ray counting. Provides methodology and rate of decay equations. (MVL)
Lauricella, Leticia L; Costa, Priscila B; Salati, Michele; Pego-Fernandes, Paulo M; Terra, Ricardo M
2018-06-01
Database quality measurement should be considered a mandatory step to ensure an adequate level of confidence in data used for research and quality improvement. Several metrics have been described in the literature, but no standardized approach has been established. We aimed to describe a methodological approach applied to measure the quality and inter-rater reliability of a regional multicentric thoracic surgical database (Paulista Lung Cancer Registry). Data from the first 3 years of the Paulista Lung Cancer Registry underwent an audit process with 3 metrics: completeness, consistency, and inter-rater reliability. The first 2 methods were applied to the whole data set, and the last method was calculated using 100 cases randomized for direct auditing. Inter-rater reliability was evaluated using percentage of agreement between the data collector and auditor and through calculation of Cohen's κ and intraclass correlation. The overall completeness per section ranged from 0.88 to 1.00, and the overall consistency was 0.96. Inter-rater reliability showed many variables with high disagreement (>10%). For numerical variables, intraclass correlation was a better metric than inter-rater reliability. Cohen's κ showed that most variables had moderate to substantial agreement. The methodological approach applied to the Paulista Lung Cancer Registry showed that completeness and consistency metrics did not sufficiently reflect the real quality status of a database. The inter-rater reliability associated with κ and intraclass correlation was a better quality metric than completeness and consistency metrics because it could determine the reliability of specific variables used in research or benchmark reports. This report can be a paradigm for future studies of data quality measurement. Copyright © 2018 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
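The three metrics can be illustrated on toy records (the fields, rule, and ratings below are hypothetical, not Paulista Lung Cancer Registry data): completeness as the share of non-missing values, consistency as the share of records passing a logical rule, and inter-rater reliability as Cohen's kappa between collector and auditor.

```python
# Hedged illustration of registry data-quality metrics (hypothetical records
# and rules, not the Paulista Lung Cancer Registry data).
import pandas as pd
from sklearn.metrics import cohen_kappa_score

records = pd.DataFrame({
    "age":        [63, 71, None, 58],
    "stage":      ["IIA", "IIIB", "IA", None],
    "op_date":    ["2016-02-01", "2016-03-10", "2016-05-02", "2016-06-20"],
    "disch_date": ["2016-02-07", "2016-03-08", "2016-05-09", "2016-06-25"],
})

# Completeness: share of non-missing values per field.
completeness = records.notna().mean()

# Consistency: share of records passing a logical rule (discharge after surgery).
consistent = (pd.to_datetime(records["disch_date"]) >= pd.to_datetime(records["op_date"])).mean()

# Inter-rater reliability for an audited categorical variable (collector vs. auditor).
collector = ["IIA", "IIIB", "IA", "IB"]
auditor   = ["IIA", "IIIA", "IA", "IB"]
kappa = cohen_kappa_score(collector, auditor)

print(completeness, f"consistency={consistent:.2f}", f"kappa={kappa:.2f}", sep="\n")
```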
Stolarova, Margarita; Wolf, Corinna; Rinker, Tanja; Brielmann, Aenne
2014-01-01
This report has two main purposes. First, we combine well-known analytical approaches to conduct a comprehensive assessment of agreement and correlation of rating-pairs and to disentangle these often confused concepts, providing a best-practice example on concrete data and a tutorial for future reference. Second, we explore whether a screening questionnaire developed for use with parents can be reliably employed with daycare teachers when assessing early expressive vocabulary. A total of 53 vocabulary rating pairs (34 parent–teacher and 19 mother–father pairs) collected for two-year-old children (12 bilingual) are evaluated. First, inter-rater reliability both within and across subgroups is assessed using the intra-class correlation coefficient (ICC). Next, based on this analysis of reliability and on the test-retest reliability of the employed tool, inter-rater agreement is analyzed, and the magnitude and direction of rating differences are considered. Finally, Pearson correlation coefficients of standardized vocabulary scores are calculated and compared across subgroups. The results underline the necessity to distinguish between reliability measures, agreement and correlation. They also demonstrate the impact of the employed reliability on agreement evaluations. This study provides evidence that parent–teacher ratings of children's early vocabulary can achieve agreement and correlation comparable to those of mother–father ratings on the assessed vocabulary scale. Bilingualism of the evaluated child decreased the likelihood of raters' agreement. We conclude that future reports of agreement, correlation and reliability of ratings will benefit from better definition of terms and stricter methodological approaches. The methodological tutorial provided here holds the potential to increase comparability across empirical reports and can help improve research practices and knowledge transfer to educational and therapeutic settings. PMID:24994985
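The distinction the report draws between agreement and correlation can be made concrete with a small example: a two-way random-effects ICC for absolute agreement (Shrout and Fleiss ICC(2,1)) penalizes a systematic offset between raters, while the Pearson correlation does not. The ratings below are invented, not the study's data.

```python
# Hedged sketch (hypothetical ratings): two-way random-effects ICC(2,1) for
# absolute agreement between two raters, alongside the Pearson correlation,
# to illustrate why agreement and correlation must be kept apart.
import numpy as np
from scipy.stats import pearsonr

ratings = np.array([        # rows = children, columns = raters (e.g., parent, teacher)
    [24, 30], [15, 18], [40, 45], [10, 16], [33, 37],
    [22, 28], [18, 20], [27, 35], [12, 15], [30, 34],
], dtype=float)

n, k = ratings.shape
grand_mean = ratings.mean()
row_means = ratings.mean(axis=1)
col_means = ratings.mean(axis=0)

ms_rows = k * np.sum((row_means - grand_mean) ** 2) / (n - 1)
ms_cols = n * np.sum((col_means - grand_mean) ** 2) / (k - 1)
ss_err = np.sum((ratings - row_means[:, None] - col_means[None, :] + grand_mean) ** 2)
ms_err = ss_err / ((n - 1) * (k - 1))

icc_2_1 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
r, _ = pearsonr(ratings[:, 0], ratings[:, 1])
print(f"ICC(2,1) = {icc_2_1:.2f}, Pearson r = {r:.2f}")
```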
NASA Technical Reports Server (NTRS)
Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)
2001-01-01
The problem addressed in this paper is to explore how Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance by development of a qualitative and reliable engine control system (QRECS). Specifically, this will be addressed by enhancing rocket engine control using SCT, innovative data mining tools, and sound software engineering practices used in Marshall's Flight Software Group (FSG). The principal goals for addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks); some of which are briefed in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC) which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for design, development, implementation, and testing a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations. The overall intent is to demonstrate that by employing soft computing technologies, the quality and reliability of the overall scheme for engine controller development is further improved and vehicle safety is further ensured. The final product that this paper proposes is an approach to development of an alternative low cost engine controller that would be capable of performing in unique vision spacecraft vehicles requiring low cost advanced avionics architectures for autonomous operations from engine pre-start to engine shutdown.
Bartolazzi, Armando; Bellotti, Carlo; Sciacchitano, Salvatore
2012-01-01
In the last decade, the β-galactosyl binding protein galectin-3 has been the object of extensive molecular, structural, and functional studies aimed to clarify its biological role in cancer. Multicenter studies also contributed to discover the potential clinical value of galectin-3 expression analysis in distinguishing, preoperatively, benign from malignant thyroid nodules. As a consequence galectin-3 is receiving significant attention as tumor marker for thyroid cancer diagnosis, but some conflicting results mostly owing to methodological problems have been published. The possibility to apply preoperatively a reliable galectin-3 test method on fine needle aspiration biopsy (FNA)-derived thyroid cells represents an important achievement. When correctly applied, the method reduces consistently the gray area of thyroid FNA cytology, contributing to avoid unnecessary thyroid surgery. Although the efficacy and reliability of the galectin-3 test method have been extensively proved in several studies, its translation in the clinical setting requires well-standardized reagents and procedures. After a decade of experimental work on galectin-3-related basic and translational research projects, the major methodological problems that may potentially impair the diagnostic performance of galectin-3 immunotargeting are highlighted and discussed in detail. A standardized protocol for a reliable galectin-3 expression analysis is finally provided. The aim of this contribution is to improve the clinical management of patients with thyroid nodules, promoting the preoperative use of a reliable galectin-3 test method as ancillary technique to conventional thyroid FNA cytology. The final goal is to decrease unnecessary thyroid surgery and its related social costs.
Got (the Right) Milk? How a Blended Quality Improvement Approach Catalyzed Change.
Luton, Alexandra; Bondurant, Patricia G; Campbell, Amy; Conkin, Claudia; Hernandez, Jae; Hurst, Nancy
2015-10-01
The expression, storage, preparation, fortification, and feeding of breast milk are common ongoing activities in many neonatal intensive care units (NICUs) today. Errors in breast milk administration are a serious issue that should be prevented to preserve the health and well-being of NICU babies and their families. This paper describes how a program to improve processes surrounding infant feeding was developed, implemented, and evaluated. The project team used a blended quality improvement approach that included the Model for Improvement, Lean and Six Sigma methodologies, and principles of High Reliability Organizations to identify and drive short-term, medium-term, and long-term improvement strategies. Through its blended quality improvement approach, the team strengthened the entire dispensation system for both human milk and formula and outlined a clear vision and plan for further improvements as well. The NICU reduced feeding errors by 83%. Be systematic in the quality improvement approach, and apply proven methods to improving processes surrounding infant feeding. Involve expert project managers with nonclinical perspective to guide work in a systematic way and provide unbiased feedback. Create multidisciplinary, cross-departmental teams that include a vast array of stakeholders in NICU feeding processes to ensure comprehensive examination of current state, identification of potential risks, and "outside the box" potential solutions. As in the realm of pharmacy, the processes involved in preparing feedings for critically ill infants should be carried out via predictable, reliable means including robust automated verification that integrates seamlessly into existing processes. The use of systems employed in pharmacy for medication preparation should be considered in the human milk and formula preparation setting.
Weems, Carl F.; Scott, Brandon G.; Nitiéma, Pascal; Noffsinger, Mary A.; Pfefferbaum, Rose L.; Varma, Vandana; Chakraburtty, Amarsha
2013-01-01
Background A comprehensive review of the design principles and methodological approaches that have been used to make inferences from the research on disasters in children is needed. Objective To identify the methodological approaches used to study children’s reactions to three recent major disasters—the September 11, 2001, attacks; the 2004 Indian Ocean Tsunami; and Hurricane Katrina. Methods This review was guided by a systematic literature search. Results A total of 165 unduplicated empirical reports were generated by the search and examined for this review. This included 83 references on September 11, 29 on the 2004 Tsunami, and 53 on Hurricane Katrina. Conclusions A diversity of methods has been brought to bear in understanding children’s reactions to disasters. While cross-sectional studies predominate, pre-event data for some investigations emerged from archival data and data from studies examining non-disaster topics. The nature and extent of the influence of risk and protective variables beyond disaster exposure are not fully understood due, in part, to limitations in the study designs used in the extant research. Advancing an understanding of the roles of exposure and various individual, family, and social factors depends upon the extent to which measures and assessment techniques are valid and reliable, as well as on data sources and data collection designs. Comprehensive assessments that extend beyond questionnaires and checklists to include interviews and cognitive and biological measures to elucidate the negative and positive effects of disasters on children also may improve the knowledge base. PMID:24443635
Abookasis, David; Volkov, Boris; Shochat, Ariel; Kofman, Itamar
2016-04-01
Optical techniques have gained substantial interest over the past four decades for biomedical imaging due to their unique advantages, which may suggest their use as alternatives to conventional methodologies. Several optical techniques have been successfully adapted to clinical practice and biomedical research to monitor tissue structure and function in both humans and animal models. This paper reviews the analysis of the optical properties of brain tissue in the wavelength range between 500 and 1000 nm by three different diffuse optical reflectance methods: spatially modulated illumination, orthogonal diffuse light spectroscopy, and dual-wavelength laser speckle imaging, to monitor changes in brain tissue morphology, chromophore content, and metabolism following head injury. After induction of closed head injury upon anesthetized mice by weight-drop method, significant changes in hemoglobin oxygen saturation, blood flow, and metabolism were readily detectible by all three optical setups, up to 1 h post-trauma. Furthermore, the experimental results clearly demonstrate the feasibility and reliability of the three methodologies, and the differences between the system performances and capabilities are also discussed. The long-term goal of this line of study is to combine these optical systems to study brain pathophysiology in high spatiotemporal resolution using additional models of brain trauma. Such combined use of complementary algorithms should fill the gaps in each system's capabilities, toward the development of a noninvasive, quantitative tool to expand our knowledge of the principles underlying brain function following trauma, and to monitor the efficacy of therapeutic interventions in the clinic.
Pfefferbaum, Betty; Weems, Carl F; Scott, Brandon G; Nitiéma, Pascal; Noffsinger, Mary A; Pfefferbaum, Rose L; Varma, Vandana; Chakraburtty, Amarsha
2013-08-01
A comprehensive review of the design principles and methodological approaches that have been used to make inferences from the research on disasters in children is needed. To identify the methodological approaches used to study children's reactions to three recent major disasters-the September 11, 2001, attacks; the 2004 Indian Ocean Tsunami; and Hurricane Katrina. This review was guided by a systematic literature search. A total of 165 unduplicated empirical reports were generated by the search and examined for this review. This included 83 references on September 11, 29 on the 2004 Tsunami, and 53 on Hurricane Katrina. A diversity of methods has been brought to bear in understanding children's reactions to disasters. While cross-sectional studies predominate, pre-event data for some investigations emerged from archival data and data from studies examining non-disaster topics. The nature and extent of the influence of risk and protective variables beyond disaster exposure are not fully understood due, in part, to limitations in the study designs used in the extant research. Advancing an understanding of the roles of exposure and various individual, family, and social factors depends upon the extent to which measures and assessment techniques are valid and reliable, as well as on data sources and data collection designs. Comprehensive assessments that extend beyond questionnaires and checklists to include interviews and cognitive and biological measures to elucidate the negative and positive effects of disasters on children also may improve the knowledge base.
The Handicap Principle, Strategic Information Warfare and the Paradox of Asymmetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Zhanshan; Sheldon, Frederick T; Krings, Axel
2010-01-01
The term asymmetric threat (or warfare) often refers to tactics utilized by countries, terrorist groups, or individuals to carry out attacks on a superior opponent while trying to avoid direct confrontation. Information warfare is sometimes also referred to as a type of asymmetric warfare, perhaps due to its asymmetry in terms of cost and efficacy. Obviously, there are differences and commonalities between the two types of asymmetric warfare: one major difference lies in the goal of avoiding confrontation, and one commonality is the asymmetry itself. Regardless, the unique properties surrounding asymmetric warfare warrant a strategic-level study. Despite the enormous number of studies conducted in the last decade, a consensus on the strategy a nation state should take to deal with asymmetric threats remains elusive. In this article, we try to shed some light on the issue from the perspective of the handicap principle in the context of information warfare. The handicap principle was first proposed by Zahavi (1975) to explain the honesty or reliability of animal communication signals. He argued that in a signaling system such as one used in mate selection, a superior male is able to signal with a highly developed "handicap" to demonstrate its quality, and the handicap serves "as a kind of (quality) test imposed on the individual" (Zahavi 1975, Searcy and Nowicki 2005). The underlying thread that inspires our attempt to establish a connection between the two apparently unrelated areas is the observation that competition, communication and cooperation (3C), which are three fundamental processes in nature against which natural selection optimizes living things, may also make sense in human society. Furthermore, any communication network, whether biological (such as an animal communication network) or computational (such as the Internet), must be reasonably reliable (honest in the case of animal signaling) to fulfill its mission of transmitting and receiving messages. The strategic goal of information warfare is then to destroy or defend the reliability (honesty) of communication networks. The handicap principle that governs the reliability (honesty) of animal communication networks can be considered nature's version of an information warfare strategy because it is a product of natural selection. What is particularly interesting is to transfer the evolutionary game theory models [e.g., the Sir Philip Sydney (SPS) game] for the handicap principle to the study of information warfare. In a broad perspective, we realize that the handicap principle may actually contradict the principle of asymmetry in asymmetric warfare. After all, not every animal species has evolved expensive signaling equipment like the male peacock (whose exaggerated train is an example of a handicap). Furthermore, the handicap principle is not only about communication; it also embodies the spirits of cooperation and competition. In human societies, communication modulates cooperation and competition, as it does in animal communication networks. Therefore, to evolve or maintain a sustainable communication network, the proper strategy should be to balance (modulate) cooperation and competition with communication tools (information warfare tools), which is perhaps in contradiction with the asymmetric strategy. There might be a paradox in the strategy of asymmetric warfare, and whether or not information warfare can be used as an asymmetric tool is still an open question.
Health economic evaluation: important principles and methodology.
Rudmik, Luke; Drummond, Michael
2013-06-01
To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
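As a minimal sketch of two of the calculations named above, the snippet below discounts cost and QALY streams and computes an incremental cost-effectiveness ratio (ICER). All costs, utilities, and the discount rate are hypothetical illustrative values, not figures from the article.

```python
# Minimal illustration of discounting and ICER arithmetic as used in
# cost-utility analyses. All numbers are hypothetical.

def discounted_total(annual_values, rate=0.03):
    """Sum a stream of annual values discounted at the given rate (year 0 undiscounted)."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(annual_values))

# Hypothetical 5-year cost and QALY streams for two interventions.
cost_new, qaly_new = [12000, 800, 800, 800, 800], [0.85] * 5
cost_std, qaly_std = [3000, 500, 500, 500, 500], [0.78] * 5

delta_cost = discounted_total(cost_new) - discounted_total(cost_std)
delta_qaly = discounted_total(qaly_new) - discounted_total(qaly_std)
icer = delta_cost / delta_qaly  # incremental cost per QALY gained

print(f"Incremental cost: {delta_cost:.0f}, incremental QALYs: {delta_qaly:.3f}")
print(f"ICER: {icer:.0f} per QALY gained")
```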
Time-Varying, Multi-Scale Adaptive System Reliability Analysis of Lifeline Infrastructure Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gearhart, Jared Lee; Kurtz, Nolan Scot
2014-09-01
The majority of current societal and economic needs world-wide are met by the existing networked, civil infrastructure. Because the cost of managing such infrastructure is high and increases with time, risk-informed decision making is essential for those with management responsibilities for these systems. To address such concerns, a methodology that accounts for new information, deterioration, component models, component importance, group importance, network reliability, hierarchical structure organization, and efficiency concerns has been developed. This methodology analyzes the use of new information through the lens of adaptive Importance Sampling for structural reliability problems. Deterioration, multi-scale bridge models, and time-variant component importance are investigated for a specific network. Furthermore, both bridge and pipeline networks are studied for group and component importance, as well as for hierarchical structures in the context of specific networks. Efficiency is the primary driver throughout this study. With this risk-informed approach, those responsible for management can address deteriorating infrastructure networks in an organized manner.
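As a generic illustration of the importance-sampling idea the abstract refers to (not the authors' network model), the sketch below estimates a small failure probability for a simple limit state g = R - S with normal variables, comparing crude Monte Carlo with a biased sampling density; all distribution parameters are assumptions.

```python
# Generic illustration of importance sampling for a structural reliability
# problem (not the paper's network model). Limit state: g = R - S, failure when g < 0.
import random, math

random.seed(0)
muR, sdR, muS, sdS = 10.0, 1.0, 6.0, 1.0   # hypothetical resistance/load statistics
N = 20000

# Crude Monte Carlo
fails = sum(1 for _ in range(N)
            if random.gauss(muR, sdR) - random.gauss(muS, sdS) < 0)
pf_mc = fails / N

# Importance sampling: draw S from a density shifted toward the failure region
shift = 3.0
acc = 0.0
for _ in range(N):
    r = random.gauss(muR, sdR)
    s = random.gauss(muS + shift, sdS)            # biased sampling density
    if r - s < 0:
        # likelihood ratio: true density of s divided by sampling density of s
        w = math.exp((-(s - muS) ** 2 + (s - muS - shift) ** 2) / (2 * sdS ** 2))
        acc += w
pf_is = acc / N

print(f"crude MC pf ~ {pf_mc:.5f}, importance sampling pf ~ {pf_is:.5f}")
# Exact answer for comparison: Phi(-(muR-muS)/sqrt(sdR^2+sdS^2)) ~ 0.0023
```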
Space Station man-machine automation trade-off analysis
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.; Bard, J.; Feinberg, A.
1985-01-01
The man-machine automation trade-off methodology presented here is one of four research tasks comprising the Autonomous Spacecraft System Technology (ASST) project. ASST was established to identify and study system-level design problems for autonomous spacecraft. Using the Space Station as an example spacecraft system requiring a certain level of autonomous control, a system-level, man-machine automation trade-off methodology is presented that: (1) optimizes man-machine mixes for different ground and on-orbit crew functions subject to cost, safety, weight, power, and reliability constraints, and (2) plots the best incorporation plan for new, emerging technologies by weighing cost, relative availability, reliability, safety, importance to out-year missions, and ease of retrofit. Although the methodology takes a fairly straightforward approach to valuing human productivity, it is still sensitive to the important subtleties associated with designing a well-integrated man-machine system. These subtleties include considerations such as crew preference to retain certain spacecraft control functions, or valuing human integration/decision capabilities over equivalent hardware/software where appropriate.
MEMS Reliability Assurance Guidelines for Space Applications
NASA Technical Reports Server (NTRS)
Stark, Brian (Editor)
1999-01-01
This guide is a reference for understanding the various aspects of microelectromechanical systems, or MEMS, with an emphasis on device reliability. Material properties, failure mechanisms, processing techniques, device structures, and packaging techniques common to MEMS are addressed in detail. Design and qualification methodologies provide the reader with the means to develop suitable qualification plans for the insertion of MEMS into the space environment.
ERIC Educational Resources Information Center
Henson, Robin K.; Thompson, Bruce
Given the potential value of reliability generalization (RG) studies in the development of cumulative psychometric knowledge, the purpose of this paper is to provide a tutorial on how to conduct such studies and to serve as a guide for researchers wishing to use this methodology. After some brief comments on classical test theory, the paper…
Schiffman, Eric L; Truelove, Edmond L; Ohrbach, Richard; Anderson, Gary C; John, Mike T; List, Thomas; Look, John O
2010-01-01
The purpose of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Validation Project was to assess the diagnostic validity of this examination protocol. The aim of this article is to provide an overview of the project's methodology, descriptive statistics, and data for the study participant sample. This article also details the development of reliable methods to establish the reference standards for assessing criterion validity of the Axis I RDC/TMD diagnoses. The Axis I reference standards were based on the consensus of two criterion examiners independently performing a comprehensive history, clinical examination, and evaluation of imaging. Intersite reliability was assessed annually for criterion examiners and radiologists. Criterion examination reliability was also assessed within study sites. Study participant demographics were comparable to those of participants in previous studies using the RDC/TMD. Diagnostic agreement of the criterion examiners with each other and with the consensus-based reference standards was excellent, with all kappas ≥ 0.81, except for osteoarthrosis (moderate agreement, k = 0.53). Intrasite criterion examiner agreement with reference standards was excellent (k ≥ 0.95). Intersite reliability of the radiologists for detecting computed tomography-disclosed osteoarthrosis and magnetic resonance imaging-disclosed disc displacement was good to excellent (k = 0.71 and 0.84, respectively). The Validation Project study population was appropriate for assessing the reliability and validity of the RDC/TMD Axis I and II. The reference standards used to assess the validity of Axis I TMD were based on reliable and clinically credible methods.
Rubio-Ochoa, J; Benítez-Martínez, J; Lluch, E; Santacruz-Zaragozá, S; Gómez-Contreras, P; Cook, C E
2016-02-01
It has been suggested that differential diagnosis of headaches should consist of a robust subjective examination and a detailed physical examination of the cervical spine. Cervicogenic headache (CGH) is a form of headache that involves referred pain from the neck. To our knowledge, no studies have summarized the reliability and diagnostic accuracy of physical examination tests for CGH. The aim of this study was to summarize the reliability and diagnostic accuracy of physical examination tests used to diagnose CGH. A systematic review following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines was performed in four electronic databases (MEDLINE, Web of Science, Embase and Scopus). Full text reports concerning physical tests for the diagnosis of CGH which reported the clinometric properties for assessment of CGH, were included and screened for methodological quality. Quality Appraisal for Reliability Studies (QAREL) and Quality Assessment of Studies of Diagnostic Accuracy (QUADAS-2) scores were completed to assess article quality. Eight articles were retrieved for quality assessment and data extraction. Studies investigating diagnostic reliability of physical examination tests for CGH scored poorer on methodological quality (higher risk of bias) than those of diagnostic accuracy. There is sufficient evidence showing high levels of reliability and diagnostic accuracy of the selected physical examination tests for the diagnosis of CGH. The cervical flexion-rotation test (CFRT) exhibited both the highest reliability and the strongest diagnostic accuracy for the diagnosis of CGH. Copyright © 2015 Elsevier Ltd. All rights reserved.
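For readers unfamiliar with how the diagnostic-accuracy figures quoted in such reviews are derived, the sketch below computes sensitivity, specificity, and likelihood ratios from a 2x2 table; the counts are hypothetical, not data from the review.

```python
# Sensitivity and specificity from a hypothetical 2x2 diagnostic table
# (illustrative counts only; not data from the review).
tp, fn = 42, 8     # CGH present: test positive / test negative
fp, tn = 12, 38    # CGH absent:  test positive / test negative

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio

print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
print(f"LR+={lr_pos:.2f}, LR-={lr_neg:.2f}")
```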
Lagarde, Marloes L J; Kamalski, Digna M A; van den Engel-Hoek, Lenie
2016-02-01
To systematically review the available evidence for the reliability and validity of cervical auscultation in diagnosing the several aspects of dysphagia in adults and children suffering from dysphagia. Medline (PubMed), Embase and the Cochrane Library databases. The systematic review was carried out applying the steps of the PRISMA statement. The methodological quality of the included studies was evaluated using the Dutch 'Cochrane checklist for diagnostic accuracy studies'. A total of 90 articles were identified through the search strategy, and after applying the inclusion and exclusion criteria, six articles were included in this review. In the six studies, 197 patients were assessed with cervical auscultation. Two of the six articles were considered to be of 'good' quality and three studies were of 'moderate' quality. One article was excluded because of 'poor' methodological quality. Sensitivity ranges from 23%-94% and specificity ranges from 50%-74%. Inter-rater reliability was 'poor' or 'fair' in all studies. The intra-rater reliability showed a wide variance among speech language therapists. In this systematic review, conflicting evidence is found for the validity of cervical auscultation. The reliability of cervical auscultation is insufficient when used as a stand-alone tool in the diagnosis of dysphagia in adults. There is no available evidence for the validity and reliability of cervical auscultation in children. Cervical auscultation should not be used as a stand-alone instrument to diagnose dysphagia. © The Author(s) 2015.
Polnaszek, Brock; Gilmore-Bykovskyi, Andrea; Hovanes, Melissa; Roiland, Rachel; Ferguson, Patrick; Brown, Roger; Kind, Amy J H
2016-10-01
Unstructured data encountered during retrospective electronic medical record (EMR) abstraction has routinely been identified as challenging to reliably abstract, as these data are often recorded as free text, without limitations to format or structure. There is increased interest in reliably abstracting this type of data given its prominent role in care coordination and communication, yet limited methodological guidance exists. As standard abstraction approaches resulted in substandard data reliability for unstructured data elements collected as part of a multisite, retrospective EMR study of hospital discharge communication quality, our goal was to develop, apply and examine the utility of a phase-based approach to reliably abstract unstructured data. This approach is examined using the specific example of discharge communication for warfarin management. We adopted a "fit-for-use" framework to guide the development and evaluation of abstraction methods using a 4-step, phase-based approach including (1) team building; (2) identification of challenges; (3) adaptation of abstraction methods; and (4) systematic data quality monitoring. Unstructured data elements were the focus of this study, including elements communicating steps in warfarin management (eg, warfarin initiation) and medical follow-up (eg, timeframe for follow-up). After implementation of the phase-based approach, interrater reliability for all unstructured data elements demonstrated κ's of ≥0.89-an average increase of +0.25 for each unstructured data element. As compared with standard abstraction methodologies, this phase-based approach was more time intensive, but did markedly increase abstraction reliability for unstructured data elements within multisite EMR documentation.
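The interrater-reliability statistic reported above (Cohen's kappa) can be computed from paired abstractor judgements as in the following sketch; the ratings are hypothetical, purely to illustrate the calculation.

```python
# Cohen's kappa for two abstractors' binary judgements on the same charts
# (hypothetical ratings, purely to illustrate the statistic reported above).
from collections import Counter

rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1]
rater_b = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1]

n = len(rater_a)
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
pa, pb = Counter(rater_a), Counter(rater_b)
expected = sum((pa[c] / n) * (pb[c] / n) for c in set(rater_a) | set(rater_b))
kappa = (observed - expected) / (1 - expected)
print(f"observed agreement={observed:.2f}, chance agreement={expected:.2f}, kappa={kappa:.2f}")
```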
Human Reliability and the Cost of Doing Business
NASA Technical Reports Server (NTRS)
DeMott, Diana
2014-01-01
Most businesses recognize that people will make mistakes and assume errors are just part of the cost of doing business, but do they need to be? Companies with high risk, or major consequences, should consider the effect of human error. In a variety of industries, human errors have caused costly failures and workplace injuries: airline mishaps, medical malpractice, medication administration errors, and major oil spills have all been blamed on human error. A technique to mitigate or even eliminate some of these costly human errors is Human Reliability Analysis (HRA). Various methodologies are available to perform Human Reliability Assessments, ranging from identifying the most likely areas for concern to detailed assessments in which human error failure probabilities are calculated. Which methodology to use depends on a variety of factors, including: (1) how people react and act in different industries, and differing expectations based on industry standards; (2) factors that influence how the human errors could occur, such as tasks, tools, environment, workplace, support, training and procedures; (3) the type and availability of data; and (4) how the industry views risk and reliability influences (types of emergencies, contingencies and routine tasks versus cost-based concerns). A Human Reliability Assessment should be the first step to reduce, mitigate or eliminate costly mistakes or catastrophic failures. Using human reliability techniques to identify and classify human error risks gives a company more opportunities to mitigate or eliminate these risks and prevent costly failures.
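As a rough sketch of one common style of HRA quantification (a nominal human error probability adjusted by performance shaping factors), the snippet below shows the arithmetic; the base probability and multipliers are invented for illustration and are not taken from this paper or from any specific HRA method.

```python
# Rough sketch of one common style of HRA quantification: a nominal human
# error probability (HEP) adjusted by performance shaping factors (PSFs).
# All values are invented for illustration only.
nominal_hep = 0.003                      # assumed base error probability for the task

psf_multipliers = {
    "time pressure": 2.0,
    "poor procedures": 1.5,
    "good training": 0.5,
}

adjusted_hep = nominal_hep
for factor, multiplier in psf_multipliers.items():
    adjusted_hep *= multiplier

adjusted_hep = min(adjusted_hep, 1.0)    # a probability cannot exceed 1
print(f"adjusted HEP = {adjusted_hep:.4f}")   # 0.003 * 2.0 * 1.5 * 0.5 = 0.0045
```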
Reliability and availability evaluation of Wireless Sensor Networks for industrial applications.
Silva, Ivanovitch; Guedes, Luiz Affonso; Portugal, Paulo; Vasques, Francisco
2012-01-01
Wireless Sensor Networks (WSN) currently represent the best candidate to be adopted as the communication solution for the last-mile connection in process control and monitoring applications in industrial environments. Most of these applications have stringent dependability (reliability and availability) requirements, as a system failure may result in economic losses, put people in danger or lead to environmental damage. Among the different types of faults that can lead to a system failure, permanent faults on network devices have a major impact. They can hamper communications over long periods of time and consequently disturb, or even disable, control algorithms. The lack of a structured approach enabling the evaluation of permanent faults prevents system designers from optimizing decisions that minimize these occurrences. In this work we propose a methodology based on the automatic generation of a fault tree to evaluate the reliability and availability of Wireless Sensor Networks when permanent faults occur on network devices. The proposal supports any topology, different levels of redundancy, network reconfigurations, criticality of devices and arbitrary failure conditions. The proposed methodology is particularly suitable for the design and validation of Wireless Sensor Networks when trying to optimize their reliability and availability requirements.
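The sketch below evaluates a tiny fault tree for a WSN segment by hand: an OR gate over a sensor node, a redundant router pair (AND gate), and a gateway, assuming independent permanent failures. It is a generic illustration of fault-tree arithmetic, not the authors' automatic generator, and the failure probabilities are assumed values.

```python
# Tiny fault-tree evaluation for a WSN segment (not the paper's generator).
# Assumed structure: the segment fails if the sensor node fails OR the gateway
# fails OR both redundant routers fail. Failure probabilities over the mission
# time are hypothetical and assumed independent.
p_sensor, p_router, p_gateway = 0.02, 0.05, 0.01

p_router_pair = p_router * p_router            # AND gate: both routers fail
# OR gate over independent events: 1 - product of survival probabilities
p_system_fail = 1 - (1 - p_sensor) * (1 - p_router_pair) * (1 - p_gateway)
reliability = 1 - p_system_fail
print(f"segment failure probability = {p_system_fail:.4f}, reliability = {reliability:.4f}")
```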
NASA Astrophysics Data System (ADS)
Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.
2017-08-01
While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function to truly represent the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.
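As a minimal sketch of the FORM-versus-Monte-Carlo comparison described above, the snippet below uses a simple linear limit state with independent normal variables, for which the FORM reliability index is exact. It is a generic illustration with assumed parameters, not the Sumela wedge model.

```python
# FORM vs Monte Carlo for a simple linear limit state g = R - S with
# independent normal R (resistance) and S (load). Generic illustration only.
import math, random

muR, sdR = 2.0, 0.3    # hypothetical resisting-force statistics
muS, sdS = 1.2, 0.4    # hypothetical driving-force statistics

# FORM: for a linear g with normal variables, the reliability index is exact.
beta = (muR - muS) / math.sqrt(sdR**2 + sdS**2)
pf_form = 0.5 * math.erfc(beta / math.sqrt(2))   # standard normal CDF at -beta

# Monte Carlo simulation of the same limit state.
random.seed(1)
N = 200_000
fails = sum(1 for _ in range(N)
            if random.gauss(muR, sdR) - random.gauss(muS, sdS) < 0)
pf_mcs = fails / N

print(f"FORM: beta={beta:.2f}, pf={pf_form:.4f};  MCS: pf={pf_mcs:.4f}")
```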
Evolving Reliability and Maintainability Allocations for NASA Ground Systems
NASA Technical Reports Server (NTRS)
Munoz, Gisela; Toon, T.; Toon, J.; Conner, A.; Adams, T.; Miranda, D.
2016-01-01
This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.
The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles
NASA Technical Reports Server (NTRS)
Latimer, John A.
2009-01-01
This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodology of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
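The abstract does not publish the formula behind the reliability impact indicator, so the following is a purely hypothetical arithmetic sketch of one way such an indicator could be expressed (the relative drop from the baseline reliability once a residual-risk reliability is folded in); it is not the KSC S&MA definition, and all values are assumed.

```python
# Hypothetical illustration only: the presentation does not give RRET's formula.
# Here the impact indicator is assumed to be the relative reduction in mission
# reliability when a residual-risk reliability is folded into the baseline.
baseline_reliability = 0.98        # assumed system baseline reliability
residual_risk_reliability = 0.995  # assumed reliability associated with residual risks

combined = baseline_reliability * residual_risk_reliability
impact_indicator = (baseline_reliability - combined) / baseline_reliability

print(f"combined reliability = {combined:.4f}")
print(f"reliability impact indicator = {impact_indicator:.4%}")
```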
Evolving Reliability and Maintainability Allocations for NASA Ground Systems
NASA Technical Reports Server (NTRS)
Munoz, Gisela; Toon, Troy; Toon, Jamie; Conner, Angelo C.; Adams, Timothy C.; Miranda, David J.
2016-01-01
This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program’s subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.
Evolving Reliability and Maintainability Allocations for NASA Ground Systems
NASA Technical Reports Server (NTRS)
Munoz, Gisela; Toon, Jamie; Toon, Troy; Adams, Timothy C.; Miranda, David J.
2016-01-01
This paper describes the methodology that was developed to allocate reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. Allocating is an iterative process; as systems moved beyond their conceptual and preliminary design phases this provided an opportunity for the reliability engineering team to reevaluate allocations based on updated designs and maintainability characteristics of the components. Trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper will discuss the value of modifying reliability and maintainability allocations made for the GSDO subsystems as the program nears the end of its design phase.
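The sketch below shows one textbook way to apportion a system reliability requirement across series subsystems in proportion to their estimated failure rates (ARINC-style weighting). It is a generic illustration with hypothetical numbers and subsystem names, not the GSDO team's actual allocation model.

```python
# ARINC-style apportionment of a system reliability requirement across series
# subsystems, weighted by each subsystem's estimated failure rate. Generic
# textbook illustration with hypothetical values, not the GSDO allocation model.
import math

R_system_target = 0.95                      # required mission reliability
est_failure_rates = {"power": 2e-4, "comm": 1e-4, "fluids": 3e-4}   # per hour (assumed)

total = sum(est_failure_rates.values())
lam_system = -math.log(R_system_target)     # allowed total "failure exposure"

allocations = {}
for name, lam in est_failure_rates.items():
    weight = lam / total                     # share of the allowed exposure
    allocations[name] = math.exp(-weight * lam_system)   # allocated reliability

for name, r in allocations.items():
    print(f"{name}: allocated reliability {r:.4f}")
print(f"product = {math.prod(allocations.values()):.4f}  (equals the system target)")
```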
Golsteijn, Laura; Lessard, Lindsay; Campion, Jean-Florent; Capelli, Alexandre; D'Enfert, Virginie; King, Henry; Kremer, Joachim; Krugman, Michael; Orliac, Hélène; Furnemont, Severine Roullet; Schuh, Werner; Stalmans, Mark; O'Hanlon, Natasha Williams; Coroama, Manuela
2018-06-05
In 2013, the European Commission launched the Environmental Footprint Rules pilot phase. This initiative aims at setting specific rules for life cycle assessment (LCA: raw material sourcing, production, logistics, use and disposal phases) studies within one product category, so-called product environmental footprint category rules (PEFCR), as well as for organisations, so-called organisational environmental footprint sector rules (OEFSR). Such specific rules for measuring environmental performance throughout the life cycle should facilitate comparability between LCA studies and provide principles for communicating environmental performance, such as transparency, reliability, completeness, and clarity. Cosmetics Europe, the association representing the cosmetics industry in the EU, completed a voluntary study into the development of PEFCR for shampoo, generally following the guidelines and methodology developed by the European Commission for its own pilot projects. The study assessed the feasibility and relevance of establishing PEFCR for shampoo. Specifically, the study defines a large number of modelling assumptions and default values relevant for shampoo (e.g. for the functional unit, the system boundaries, default transport distances, rinsing water volumes, temperature differences, life cycle inventory data sources, etc.) that can be modified as appropriate, according to specificities of individual products, manufacturing companies and countries. The results of the study may be used to support internal decision-making (e.g. to identify 'hotspots' with high environmental impact and opportunities for improvement) or to meet information requests from commercial partners, consumers, media or authorities on product environmental characteristics. In addition, the shampoo study also highlighted many of the challenges and limitations of the current PEF methodology, namely its complexity and resource intensiveness. It highlighted two areas where improvements are much needed: (1) data quality and availability, and (2) impact assessment methodologies and robustness. Many of the learnings are applicable to other rinse-off cosmetic products such as shower gels, liquid soaps, bath products and hair conditioners. This article is protected by copyright. All rights reserved.
Core principles of evolutionary medicine
Grunspan, Daniel Z; Nesse, Randolph M; Barnes, M Elizabeth; Brownell, Sara E
2018-01-01
Abstract Background and objectives Evolutionary medicine is a rapidly growing field that uses the principles of evolutionary biology to better understand, prevent and treat disease, and that uses studies of disease to advance basic knowledge in evolutionary biology. Over-arching principles of evolutionary medicine have been described in publications, but our study is the first to systematically elicit core principles from a diverse panel of experts in evolutionary medicine. These principles should be useful to advance recent recommendations made by The Association of American Medical Colleges and the Howard Hughes Medical Institute to make evolutionary thinking a core competency for pre-medical education. Methodology The Delphi method was used to elicit and validate a list of core principles for evolutionary medicine. The study included four surveys administered in sequence to 56 expert panelists. The initial open-ended survey created a list of possible core principles; the three subsequent surveys winnowed the list and assessed the accuracy and importance of each principle. Results Fourteen core principles led at least 80% of the panelists to agree or strongly agree that they were important core principles for evolutionary medicine. These principles overlapped with concepts discussed in other articles on key concepts in evolutionary medicine. Conclusions and implications This set of core principles will be helpful for researchers and instructors in evolutionary medicine. We recommend that evolutionary medicine instructors use the list of core principles to construct learning goals. Evolutionary medicine is a young field, so this list of core principles will likely change as the field develops further. PMID:29493660
Machine learning approach for automatic quality criteria detection of health web pages.
Gaudinat, Arnaud; Grabar, Natalia; Boyer, Célia
2007-01-01
The number of medical websites is constantly growing [1]. Owing to the open nature of the Web, the reliability of information available on the Web is uneven. Internet users are overwhelmed by the quantity of information available on the Web. The situation is even more critical in the medical area, as the content proposed by health websites can have a direct impact on users' well-being. One way to control the reliability of health websites is to assess their quality and to make this assessment available to users. The HON Foundation has defined a set of eight ethical principles. HON's experts work to manually determine whether a given website complies with the required principles. As the number of medical websites is constantly growing, manual expertise becomes insufficient and automatic systems should be used in order to help medical experts. In this paper we present the design and the evaluation of an automatic system conceived for the categorisation of medical and health documents according to the HONcode ethical principles. A first evaluation shows promising results. Currently the system shows 0.78 micro precision and 0.73 F-measure, with 0.06 errors.
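The micro-averaged precision and F-measure quoted above are computed by pooling per-category counts before forming the ratios, as in the sketch below; the category names and counts are hypothetical, not the system's actual evaluation data.

```python
# Micro-averaged precision, recall and F-measure over several HONcode-style
# categories, computed from hypothetical per-category counts.
counts = {   # category: (true positives, false positives, false negatives)
    "authority":       (40, 8, 10),
    "complementarity": (35, 12, 9),
    "privacy":         (50, 5, 6),
}

tp = sum(c[0] for c in counts.values())
fp = sum(c[1] for c in counts.values())
fn = sum(c[2] for c in counts.values())

micro_precision = tp / (tp + fp)
micro_recall = tp / (tp + fn)
f_measure = 2 * micro_precision * micro_recall / (micro_precision + micro_recall)
print(f"micro precision={micro_precision:.2f}, recall={micro_recall:.2f}, F={f_measure:.2f}")
```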
Reliability of ceramics for heat engine applications
NASA Technical Reports Server (NTRS)
1980-01-01
The advantages and disadvantages associated with the use of monolithic ceramics in heat engines are discussed. The principal gaps in the state of understanding of ceramic materials, failure origins, nondestructive tests, as well as life prediction, are included.
Welding in airplane construction
NASA Technical Reports Server (NTRS)
Rechtlich, A; Schrenk, M
1928-01-01
The present article attempts to explain the principles for the production of a perfect weld and to throw light on the unexplained problems. Moreover, it is intended to elucidate the possibilities of testing the strength and reliability of welded parts.
Balancing Technology with Established Methodology in the Accounting Classroom.
ERIC Educational Resources Information Center
Hoyt, William B.
1996-01-01
Discusses the role of technology in secondary accounting courses. Indicates that students must master the principles and concepts in accounting and must experience the manual preparation of documents before automated procedures are integrated. (Author/JOW)
42 CFR 440.340 - Actuarial report for benchmark-equivalent coverage.
Code of Federal Regulations, 2010 CFR
2010-10-01
... individual who is a member of the American Academy of Actuaries (AAA). (2) Using generally accepted actuarial principles and methodologies of the AAA. (3) Using a standardized set of utilization and price factors. (4...
Arthropod surveillance programs: Basic components, strategies, and analysis
USDA-ARS?s Scientific Manuscript database
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthro...
45 CFR 156.470 - Allocation of rates for advance payments of the premium tax credit.
Code of Federal Regulations, 2014 CFR
2014-10-01
... of Actuaries in accordance with generally accepted actuarial principles and methodologies; (2...) of this section is performed by a member of the American Academy of Actuaries in accordance with...
Quantum-secure covert communication on bosonic channels.
Bash, Boulat A; Gheorghe, Andrei H; Patel, Monika; Habif, Jonathan L; Goeckel, Dennis; Towsley, Don; Guha, Saikat
2015-10-19
Computational encryption, information-theoretic secrecy and quantum cryptography offer progressively stronger security against unauthorized decoding of messages contained in communication transmissions. However, these approaches do not ensure stealth--that the mere presence of message-bearing transmissions be undetectable. We characterize the ultimate limit of how much data can be reliably and covertly communicated over the lossy thermal-noise bosonic channel (which models various practical communication channels). We show that whenever there is some channel noise that cannot in principle be controlled by an otherwise arbitrarily powerful adversary--for example, thermal noise from blackbody radiation--the number of reliably transmissible covert bits is at most proportional to the square root of the number of orthogonal modes (the time-bandwidth product) available in the transmission interval. We demonstrate this in a proof-of-principle experiment. Our result paves the way to realizing communications that are kept covert from an all-powerful quantum adversary.
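The square-root scaling stated above can be made concrete numerically: if covert throughput grows at most proportionally to the square root of the number of channel uses, the achievable rate per use vanishes as the transmission interval grows. The proportionality constant below is arbitrary for illustration; the paper derives its value from the channel noise.

```python
# Square-root law illustration: covert bits <= c * sqrt(n) for n channel uses,
# so the achievable rate per use tends to zero. The constant c is arbitrary
# here; the paper ties it to the channel's noise parameters.
import math

c = 1.0
for n in (1_000, 100_000, 10_000_000):
    covert_bits = c * math.sqrt(n)
    print(f"n={n:>10,}  covert bits <= {covert_bits:10.1f}  bits/use <= {covert_bits / n:.6f}")
```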
NASA Astrophysics Data System (ADS)
Xu, Li; Liu, Lanlan; Niu, Jie; Tang, Li; Li, Jinliang; Zhou, Zhanfan; Long, Chenhai; Yang, Qi; Yi, Ziqi; Guo, Hao; Long, Yang; Fu, Yanyi
2017-05-01
As the social requirement for power supply reliability keeps rising, distribution network working with power uninterrupted has been widely carried out, but the high-temperature operating environment in summer can easily lead to physical discomfort for the operators and, in turn, to safety incidents. To address this problem, an air-conditioning suit for distribution network working with power uninterrupted is put forward in this paper; its structure, composition, and cooling principle are explained, and it has ultimately been put to on-site application. The results showed that the cooling effect of the air-conditioning suit was remarkable and effectively improved the working environment for the operators, which is of great significance for improving the Chinese level of working with power uninterrupted, reducing the probability of accidents, and enhancing the reliability of the power supply.
Human reliability in petrochemical industry: an action research.
Silva, João Alexandre Pinheiro; Camarotto, João Alberto
2012-01-01
This paper aims to identify conflicts and gaps between the operators' strategies and actions and the organizational managerial approach to human reliability. In order to achieve these goals, the research approach adopted encompasses a literature review, combining action research methodology and Ergonomic Workplace Analysis in field research. The results suggest that the studied company has a classical and mechanistic point of view, focusing on error identification and on building barriers through procedures, checklists and other prescriptive alternatives to improve performance in the reliability area. However, the fundamental role of the worker as an agent of maintenance and construction of system reliability became evident during the action research cycle.
Marshak Lectureship: Vibrational properties of isolated color centers in diamond
NASA Astrophysics Data System (ADS)
Alkauskas, Audrius
In this talk we review our recent work on first-principles calculations of vibrational properties of isolated defect spin qubits and single photon emitters in diamond. These properties include local vibrational spectra, luminescence lineshapes, and electron-phonon coupling. They are key in understanding physical mechanisms behind spin-selective optical initialization and read-out, quantum efficiency of single-photon emitters, as well as in the experimental identification of as yet unknown centers. We first present the methodology to calculate and analyze vibrational properties of effectively isolated defect centers. We then apply the methodology to the nitrogen-vacancy and the silicon-vacancy centers in diamond. First-principles calculations yield important new insights about these important defects. Work performed in collaboration with M. W. Doherty, A. Gali, E. Londero, L. Razinkovas, and C. G. Van de Walle. Supported by the Research Council of Lithuania (Grant M-ERA.NET-1/2015).
Optimal Management of Redundant Control Authority for Fault Tolerance
NASA Technical Reports Server (NTRS)
Wu, N. Eva; Ju, Jianhong
2000-01-01
This paper is intended to demonstrate the feasibility of a solution to a fault tolerant control problem. It explains, through a numerical example, the design and the operation of a novel scheme for fault tolerant control. The fundamental principle of the scheme was formalized in [5] based on the notion of normalized nonspecificity. The novelty lies with the use of a reliability criterion for redundancy management, and therefore leads to a high overall system reliability.
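As a generic illustration of why redundant control authority raises overall system reliability, the sketch below evaluates k-out-of-n reliability for identical, independent actuator channels. It is not the scheme formalized in [5]; the channel reliability and configurations are assumed values.

```python
# Reliability of a k-out-of-n actuator arrangement with independent, identical
# channels (generic illustration of redundancy benefits, not the scheme in [5]).
from math import comb

def k_out_of_n(k, n, p):
    """Probability that at least k of n channels (each working with probability p) work."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p = 0.95   # hypothetical single-channel reliability
print(f"single channel : {p:.4f}")
print(f"2-out-of-3     : {k_out_of_n(2, 3, p):.4f}")
print(f"3-out-of-4     : {k_out_of_n(3, 4, p):.4f}")
```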
Methodology for Designing Operational Banking Risks Monitoring System
NASA Astrophysics Data System (ADS)
Kostjunina, T. N.
2018-05-01
The research looks at principles of designing an information system for monitoring operational banking risks. A proposed design methodology enables one to automate processes of collecting data on information security incidents in the banking network, serving as the basis for an integrated approach to the creation of an operational risk management system. The system can operate remotely ensuring tracking and forecasting of various operational events in the bank network. A structure of a content management system is described.
Towards A Topological Framework for Integrating Semantic Information Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joslyn, Cliff A.; Hogan, Emilie A.; Robinson, Michael
2014-09-07
In this position paper we argue for the role that topological modeling principles can play in providing a framework for sensor integration. While used successfully in standard (quantitative) sensors, we are developing this methodology in new directions to make it appropriate specifically for semantic information sources, including keyterms, ontology terms, and other general Boolean, categorical, ordinal, and partially-ordered data types. We illustrate the basics of the methodology in an extended use case/example, and discuss path forward.
Research Applications of Magnetic Resonance Spectroscopy (MRS) to Investigate Psychiatric Disorders
Dager, SR; Oskin, NM; Richards, TL; Posse, S
2009-01-01
Advances in magnetic resonance spectroscopy (MRS) methodology and related analytic strategies allow sophisticated testing of neurobiological models of disease pathology in psychiatric disorders. An overview of principles underlying MRS, methodological considerations and investigative approaches is presented. A review of recent research is presented that highlights innovative approaches applying MRS, in particular 1H MRS, to systematically investigate specific psychiatric disorders, including autism spectrum disorders, schizophrenia, panic disorder, major depression and bipolar disorder. PMID:19363431
Frankel, Allan S; Leonard, Michael W; Denham, Charles R
2006-01-01
Background Disparate health care provider attitudes about autonomy, teamwork, and administrative operations have added to the complexity of health care delivery and are a central factor in medicine's unacceptably high rate of errors. Other industries have improved their reliability by applying innovative concepts to interpersonal relationships and administrative hierarchical structures (Chandler 1962). In the last 10 years the science of patient safety has become more sophisticated, with practical concepts identified and tested to improve the safety and reliability of care. Objective Three initiatives stand out as worthy regarding interpersonal relationships and the application of provider concerns to shape operational change: The development and implementation of Fair and Just Culture principles, the broad use of Teamwork Training and Communication, and tools like WalkRounds that promote the alignment of leadership and frontline provider perspectives through effective use of adverse event data and provider comments. Methods Fair and Just Culture, Teamwork Training, and WalkRounds are described, and implementation examples provided. The argument is made that they must be systematically and consistently implemented in an integrated fashion. Conclusions There are excellent examples of institutions applying Just Culture principles, Teamwork Training, and Leadership WalkRounds—but to date, they have not been comprehensively instituted in health care organizations in a cohesive and interdependent manner. To achieve reliability, organizations need to begin thinking about the relationship between these efforts and linking them conceptually. PMID:16898986
Single service point: it's all in the design.
Bradigan, Pamela S; Rodman, Ruey L
2008-01-01
"Design thinking" principles from a leading design firm, IDEO, were key elements in the planning process for a one-desk service model, the ASK Desk, at the John A. Prior Health Sciences Library. The library administration and staff employed the methodology to enhance customer experiences, meet technology challenges, and compete in a changing education environment. The most recent renovations demonstrate how the principles were applied. The concept of "continuous design thinking" is important in the library's daily operations to serve customers most effectively.
The Role of Metaphysical Naturalism in Science
NASA Astrophysics Data System (ADS)
Mahner, Martin
2012-10-01
This paper defends the view that metaphysical naturalism is a constitutive ontological principle of science in that the general empirical methods of science, such as observation, measurement and experiment, and thus the very production of empirical evidence, presuppose a no-supernature principle. It examines the consequences of metaphysical naturalism for the testability of supernatural claims, and it argues that explanations involving supernatural entities are pseudo-explanatory due to the many semantic and ontological problems of supernatural concepts. The paper also addresses the controversy about metaphysical versus methodological naturalism.
EHV systems technology - A look at the principles and current status. [Electric and Hybrid Vehicle
NASA Technical Reports Server (NTRS)
Kurtz, D. W.; Levin, R. R.
1983-01-01
An examination of the basic principles and practices of systems engineering is undertaken in the context of their application to the component and subsystem technologies involved in electric and hybrid vehicle (EHV) development. The limitations of purely electric vehicles are contrasted with hybrid, heat engine-incorporating vehicle technology, which is inherently more versatile. A hybrid vehicle concept assessment methodology is presented which employs current technology and yet fully satisfies U.S. Department of Energy petroleum displacement goals.
Strategy for continuous improvement in IC manufacturability, yield, and reliability
NASA Astrophysics Data System (ADS)
Dreier, Dean J.; Berry, Mark; Schani, Phil; Phillips, Michael; Steinberg, Joe; DePinto, Gary
1993-01-01
Continual improvements in yield, reliability and manufacturability measure a fab and ultimately result in Total Customer Satisfaction. A new organizational and technical methodology for continuous defect reduction has been established in a formal feedback loop, which relies on yield and reliability, failed bit map analysis, analytical tools, inline monitoring, cross functional teams and a defect engineering group. The strategy requires the fastest detection, identification and implementation of possible corrective actions. Feedback cycle time is minimized at all points to improve yield and reliability and reduce costs, essential for competitiveness in the memory business. Payoff was a 9.4X reduction in defectivity and a 6.2X improvement in reliability of 256 K fast SRAMs over 20 months.
Applying reliability analysis to design electric power systems for More-electric aircraft
NASA Astrophysics Data System (ADS)
Zhang, Baozhu
The More-Electric Aircraft (MEA) is a type of aircraft that replaces conventional hydraulic and pneumatic systems with electrically powered components. These changes have significantly challenged the aircraft electric power system design. This thesis investigates how reliability analysis can be applied to automatically generate system topologies for the MEA electric power system. We first use a traditional method of reliability block diagrams to analyze the reliability level on different system topologies. We next propose a new methodology in which system topologies, constrained by a set reliability level, are automatically generated. The path-set method is used for analysis. Finally, we interface these sets of system topologies with control synthesis tools to automatically create correct-by-construction control logic for the electric power system.
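The path-set idea used in the analysis can be sketched as follows: the system is up whenever every component in at least one minimal path set is working, and system reliability can be estimated by sampling component states. The topology, component names, and reliabilities below are hypothetical, not an actual MEA electric power system architecture.

```python
# Path-set reliability sketch: the system works if every component in at least
# one minimal path set works. Topology and probabilities are hypothetical.
import random

component_reliability = {"gen1": 0.98, "gen2": 0.98, "busA": 0.995,
                         "busB": 0.995, "load_sw": 0.99}
minimal_path_sets = [{"gen1", "busA", "load_sw"},
                     {"gen2", "busB", "load_sw"}]

random.seed(2)
N = 100_000
up = 0
for _ in range(N):
    working = {c for c, p in component_reliability.items() if random.random() < p}
    if any(ps <= working for ps in minimal_path_sets):   # some path fully working
        up += 1
print(f"estimated system reliability ~ {up / N:.4f}")
```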
Optimization of controlled processes in combined-cycle plant (new developments and researches)
NASA Astrophysics Data System (ADS)
Tverskoy, Yu S.; Muravev, I. K.
2017-11-01
All modern complex technical systems, including power units of TPPs and nuclear power plants, work within the system-forming structure of a multifunctional APCS. The development of modern APCS mathematical support makes it possible to bring the degree of automation up to the solution of complex optimization problems of equipment heat-and-mass-exchange processes in real time. The difficulty of efficient management of a binary (combined-cycle) power unit is related to the need to solve at least three problems jointly. The first problem is related to the physical issues of combined-cycle technologies. The second problem is determined by the sensitivity of CCGT operation to changes in regime and climatic factors. The third problem is related to a precise description of the vector of controlled coordinates of a complex technological object. To obtain a joint solution of this complex of interconnected problems, the methodology of generalized thermodynamic analysis, methods of the theory of automatic control, and mathematical modeling are used. In the present report, results of new developments and studies are shown. These results make it possible to improve the principles of process control and the structural synthesis of automatic control systems for power units with combined-cycle plants, providing attainable technical and economic efficiency and operational reliability of equipment.
A Review of Rock Bolt Monitoring Using Smart Sensors.
Song, Gangbing; Li, Weijie; Wang, Bo; Ho, Siu Chun Michael
2017-04-05
Rock bolts have been widely used as rock reinforcing members in underground coal mine roadways and tunnels. Failures of rock bolts occur as a result of overloading, corrosion, seismic burst and bad grouting, leading to catastrophic economic and personnel losses. Monitoring the health condition of the rock bolts plays an important role in ensuring the safe operation of underground mines. This work presents a brief introduction on the types of rock bolts followed by a comprehensive review of rock bolt monitoring using smart sensors. Smart sensors that are used to assess rock bolt integrity are reviewed to provide a firm perception of the application of smart sensors for enhanced performance and reliability of rock bolts. The most widely used smart sensors for rock bolt monitoring are the piezoelectric sensors and the fiber optic sensors. The methodologies and principles of these smart sensors are reviewed from the point of view of rock bolt integrity monitoring. The applications of smart sensors in monitoring the critical status of rock bolts, such as the axial force, corrosion occurrence, grout quality and resin delamination, are highlighted. In addition, several prototypes or commercially available smart rock bolt devices are also introduced.
Davarani, Saied Saeed Hosseiny; Najarian, Amin Morteza; Nojavan, Saeed; Tabatabaei, Mohammad-Ali
2012-05-06
Recent advances in electromembrane extraction (EME) methodology call for effective and accessible detection methods. Using imipramine and clomipramine as model therapeutics, this proof-of-principle work combines EME with gas chromatography analysis employing a flame ionization detector (FID). The drugs were extracted from acidic aqueous sample solutions, through a supported liquid membrane (SLM) consisting of 2-nitrophenyl octyl ether (NPOE) impregnated on the walls of the hollow fiber. EME parameters, such as SLM composition, type of ion carrier, pH and composition of the donor and acceptor solutions, agitation speed, extraction voltage, and extraction time were studied in detail. Under optimized conditions, the therapeutics were effectively extracted from different matrices with recoveries ranging from 90 to 95%. The samples were preconcentrated 270-280 times prior to GC analysis. Reliable linearity was also achieved for calibration curves, with a regression coefficient of at least 0.995. Detection limits and intra-day precision (n=3) were less than 0.7 ng mL(-1) and 8.5%, respectively. Finally, the method was applied to the determination and quantification of the drugs in human plasma and urine samples, and satisfactory results were achieved. Copyright © 2012 Elsevier B.V. All rights reserved.
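The calibration figures quoted above (regression coefficient, detection limit) come from fitting a straight line to spiked standards; a minimal sketch is shown below, using a common detection-limit estimate of 3.3 times the residual standard deviation divided by the slope. The concentrations and responses are invented, not the paper's data.

```python
# Least-squares calibration line and a common LOD estimate (3.3 * s / slope)
# for a hypothetical set of spiked standards (numbers are invented).
xs = [5, 10, 25, 50, 100, 200]            # spiked concentrations, ng/mL
ys = [410, 820, 2010, 4050, 8120, 16100]  # detector responses (arbitrary units)

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
sxx = sum((x - mx) ** 2 for x in xs)
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
slope = sxy / sxx
intercept = my - slope * mx

residuals = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
s_res = (sum(r * r for r in residuals) / (n - 2)) ** 0.5
ss_tot = sum((y - my) ** 2 for y in ys)
r_squared = 1 - sum(r * r for r in residuals) / ss_tot
lod = 3.3 * s_res / slope

print(f"slope={slope:.2f}, intercept={intercept:.1f}, R^2={r_squared:.4f}, LOD~{lod:.2f} ng/mL")
```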
Development and exemplification of a model for Teacher Assessment in Primary Science
NASA Astrophysics Data System (ADS)
Davies, D. J.; Earle, S.; McMahon, K.; Howe, A.; Collier, C.
2017-09-01
The Teacher Assessment in Primary Science project is funded by the Primary Science Teaching Trust and based at Bath Spa University. The study aims to develop a whole-school model of valid, reliable and manageable teacher assessment to inform practice and make a positive impact on primary-aged children's learning in science. The model is based on a data-flow 'pyramid' (analogous to the flow of energy through an ecosystem), whereby the rich formative assessment evidence gathered in the classroom is summarised for monitoring, reporting and evaluation purposes [Nuffield Foundation. (2012). Developing policy, principles and practice in primary school science assessment. London: Nuffield Foundation]. Using a design-based research (DBR) methodology, the authors worked in collaboration with teachers from project schools and other expert groups to refine, elaborate, validate and operationalise the data-flow 'pyramid' model, resulting in the development of a whole-school self-evaluation tool. In this paper, we argue that a DBR approach to theory-building and school improvement drawing upon teacher expertise has led to the identification, adaptation and successful scaling up of a promising approach to school self-evaluation in relation to assessment in science.
Policymaking in European healthy cities.
de Leeuw, Evelyne; Green, Geoff; Spanswick, Lucy; Palmer, Nicola
2015-06-01
This paper assesses policy development in, with and for Healthy Cities in the European Region of the World Health Organization. Materials for the assessment were sourced through case studies, a questionnaire and statistical databases, and were compiled using a realist synthesis methodology applying theory-based evaluation principles. Non-response analyses were applied to ascertain how representative the responding cities, despite the high response rates, were of the entire network of Healthy Cities in Europe. Further measures of reliability and validity were applied, and the material was found to be indicative of the entire network. European Healthy Cities are successful in developing local health policy across many sectors within and outside government. They were also successful in addressing 'wicked' problems around equity, governance and participation in themes such as Healthy Urban Planning. It appears that strong local leadership for policy change is driven by international collaboration and the stewardship of the World Health Organization. The processes enacted by WHO, namely structuring membership of the Healthy City Network (designation) and providing guidance on particular themes, are identified as important for the success of local policy development.
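The non-response analysis mentioned above can be sketched, very loosely, as a comparison of responding and non-responding cities on a known background characteristic. The Python fragment below runs a chi-square test on an invented 2x2 table; the counts, the "larger cities" characteristic and the choice of test are assumptions for illustration, not the study's actual procedure or data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: city size vs. questionnaire response (counts are invented).
#                 responded   did not respond
# larger cities        40              8
# smaller cities       55             17
table = np.array([[40, 8],
                  [55, 17]])

chi2, p, dof, expected = chi2_contingency(table)
response_rate = table[:, 0].sum() / table.sum()

print(f"Overall response rate: {response_rate:.0%}")
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# A clearly non-significant p-value gives (weak) support to the claim that responders do not
# differ systematically from non-responders on this characteristic, i.e. that the responding
# cities are broadly indicative of the whole network.
```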
A Review of Rock Bolt Monitoring Using Smart Sensors
Song, Gangbing; Li, Weijie; Wang, Bo; Ho, Siu Chun Michael
2017-01-01
Rock bolts have been widely used as rock-reinforcing members in underground coal mine roadways and tunnels. Rock bolts fail as a result of overloading, corrosion, seismic bursts and poor grouting, leading to catastrophic economic and personnel losses. Monitoring the health condition of rock bolts therefore plays an important role in ensuring the safe operation of underground mines. This work presents a brief introduction to the types of rock bolts, followed by a comprehensive review of rock bolt monitoring using smart sensors. Smart sensors used to assess rock bolt integrity are reviewed to give a clear picture of how they can enhance the performance and reliability of rock bolts. The most widely used smart sensors for rock bolt monitoring are piezoelectric sensors and fiber optic sensors. The methodologies and principles of these smart sensors are reviewed from the point of view of rock bolt integrity monitoring. Applications of smart sensors in monitoring critical aspects of rock bolt status, such as axial force, corrosion occurrence, grout quality and resin delamination, are highlighted. In addition, several prototype and commercially available smart rock bolt devices are introduced. PMID:28379167
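As an illustration of how a fiber-optic smart sensor reading might be turned into an axial-force estimate for a rock bolt, the sketch below applies the standard fiber Bragg grating (FBG) strain relation together with simple elastic mechanics; the photo-elastic coefficient, bolt diameter, steel modulus and wavelength shift are assumed typical values, not parameters taken from the papers reviewed.

```python
import math

def fbg_strain(d_lambda_nm, lambda0_nm, p_e=0.22):
    # Axial strain from the Bragg wavelength shift: d_lambda / lambda = (1 - p_e) * strain,
    # with temperature effects neglected; p_e ~ 0.22 is a typical photo-elastic coefficient.
    return (d_lambda_nm / lambda0_nm) / (1.0 - p_e)

def bolt_axial_force(strain, diameter_m=0.022, youngs_modulus_pa=200e9):
    # Axial force in a solid steel bolt from strain, assuming purely elastic behaviour.
    area = math.pi * (diameter_m / 2.0) ** 2
    return youngs_modulus_pa * area * strain

# Hypothetical reading: a 1550 nm grating bonded to the bolt shifts by 0.60 nm under load.
eps = fbg_strain(0.60, 1550.0)            # ~5.0e-4 (about 500 microstrain)
force_kn = bolt_axial_force(eps) / 1e3    # ~38 kN for an assumed 22 mm bolt
print(f"strain = {eps * 1e6:.0f} microstrain, axial force ~ {force_kn:.1f} kN")
```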
Forecasting magma-chamber rupture at Santorini volcano, Greece.
Browning, John; Drymoni, Kyriaki; Gudmundsson, Agust
2015-10-28
How much magma needs to be added to a shallow magma chamber to cause rupture, dyke injection, and a potential eruption? Models that yield reliable answers to this question are needed to facilitate eruption forecasting. Development of a long-lived shallow magma chamber requires periodic influx of magma from a parental body at depth. This redistribution process does not necessarily cause an eruption, but it produces a net volume change that can be measured geodetically by inversion techniques. Using continuum-mechanics and fracture-mechanics principles, we calculate the amount of magma contained at shallow depth beneath Santorini volcano, Greece. We demonstrate, through structural analysis of dykes exposed within the Santorini caldera, previously published data on the volume of recent eruptions, and geodetic measurements of the 2011-2012 unrest period, that the measured 0.02% increase in the volume of Santorini's shallow magma chamber was associated with a magmatic excess-pressure increase of around 1.1 MPa. This excess pressure was high enough to bring the chamber roof close to rupture and dyke injection. For volcanoes with known typical extrusion and intrusion (dyke) volumes, the new methodology presented here makes it possible to forecast the conditions for magma-chamber failure and dyke injection at any geodetically well-monitored volcano.
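The link reported above between a 0.02% chamber-volume increase and roughly 1 MPa of excess pressure can be illustrated with a commonly used elastic relation, p_e ≈ (ΔV/V) / (β_m + β_c), where β_m and β_c are the magma and chamber (host-rock) compressibilities. The compressibility values in the Python sketch below are typical literature-style figures assumed for illustration; they are not the parameters used in the study.

```python
def excess_pressure_pa(dV_over_V, beta_magma=1.25e-10, beta_chamber=0.60e-10):
    # Excess magmatic pressure (Pa) from a fractional chamber-volume change, assuming
    # p_e ~ (dV/V) / (beta_magma + beta_chamber); compressibilities in 1/Pa (assumed values).
    return dV_over_V / (beta_magma + beta_chamber)

dV_over_V = 0.0002  # the 0.02% volume increase inferred for the 2011-2012 unrest
p_e = excess_pressure_pa(dV_over_V)
print(f"Excess pressure ~ {p_e / 1e6:.1f} MPa")  # ~1.1 MPa with these assumed compressibilities
```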