Symbolic algebra approach to the calculation of intraocular lens power following cataract surgery
NASA Astrophysics Data System (ADS)
Hjelmstad, David P.; Sayegh, Samir I.
2013-03-01
We present a symbolic approach based on matrix methods that allows for the analysis and computation of intraocular lens power following cataract surgery. We extend the basic matrix approach corresponding to paraxial optics to include astigmatism and other aberrations. The symbolic approach allows for a refined analysis of the potential sources of errors ("refractive surprises"). We demonstrate the computation of lens powers including toric lenses that correct for both defocus (myopia, hyperopia) and astigmatism. A specific implementation in Mathematica allows an elegant and powerful method for the design and analysis of these intraocular lenses.
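To make the matrix formalism concrete, here is a minimal symbolic sketch in Python/sympy (rather than the authors' Mathematica) of the paraxial 2x2 ray-matrix calculation of IOL power; the corneal power, axial length, lens position and refractive index used at the end are illustrative values, not data from the paper.

```python
# Hedged sketch: paraxial 2x2 matrix optics for IOL power, in the spirit of the
# abstract's symbolic approach but written with sympy. All numerical values
# (corneal power, axial length, lens position, index) are illustrative assumptions.
import sympy as sp

P_iol, P_cornea, n, d_elp, L_ax = sp.symbols('P_iol P_cornea n d_elp L_ax', positive=True)

def refraction(P):
    # Thin-element refraction matrix acting on the ray vector (height y, reduced angle)
    return sp.Matrix([[1, 0], [-P, 1]])

def translation(t, n_medium):
    # Propagation over axial distance t in a medium of index n_medium
    return sp.Matrix([[1, t / n_medium], [0, 1]])

# Cornea -> gap to IOL -> IOL -> gap to retina (rightmost element acts first)
system = (translation(L_ax - d_elp, n) * refraction(P_iol)
          * translation(d_elp, n) * refraction(P_cornea))

# Emmetropia: parallel input rays focus on the retina, i.e. the ray height at
# the retina is independent of input height, so the (1,1) element must vanish.
power = sp.solve(sp.Eq(system[0, 0], 0), P_iol)[0]
print(sp.simplify(power))

# Illustrative values: 43 D cornea, 23.5 mm axial length, 5 mm lens position
print(power.subs({P_cornea: 43.0, L_ax: 0.0235, d_elp: 0.005, n: 1.336}))
```

Solving system[0,0] = 0 reproduces the familiar thin-lens vergence form P_iol = n/(L_ax - d_elp) - P_cornea/(1 - d_elp*P_cornea/n), about 21 D for the values above.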
ERIC Educational Resources Information Center
Gilmore, Alex
2015-01-01
Discourse studies is a vast, multidisciplinary, and rapidly expanding area of research, embracing a range of approaches including discourse analysis, corpus analysis, conversation analysis, interactional sociolinguistics, critical discourse analysis, genre analysis and multimodal discourse analysis. Each approach offers its own unique perspective…
Loría-Castellanos, Jorge; Rivera-lbarra, Doris Beatriz; Márquez-Avila, Guadalupe
2009-01-01
To compare the outreach of a promotional educational strategy that focuses on active participation with that of a more traditional approach to medical training. A quasi-experimental design was approved by the research committee. We compared the outreach of two different approaches to medical training. We administered a validated instrument that included 72 items analyzing statements used to measure educational tasks in the form of duplets through 3 indicators. A group of seven physicians who were actively participating in teaching activities was stratified according to teaching approach: one approach was traditional and the other was a promotional strategy aimed at increasing participation. All participants signed informed consent before answering the research instruments. Statistical analysis was done using non-parametric tests. Mann-Whitney results did not show differences among the groups in the preliminary analysis. A second analysis with the same test after the interventions found significant differences (p ≤ 0.018) in favor of the subjects who had participated in the promotional approach, mainly in the indicator measuring "consequence". The Wilcoxon test showed that all participants in the promotional approach increased significantly (p ≤ 0.018) in the 3 main indicators as compared with the control group. A promotional strategy aimed at increasing physician participation constitutes a more profitable approach when compared with traditional teaching methods.
Nonlinear Stochastic PDEs: Analysis and Approximations
2016-05-23
Main theoretical and experimental advances include the introduction of a number of effective approaches to the numerical analysis of nonlinear stochastic PDEs, including the Stokes and Euler SPDEs, the quasi-geostrophic SPDE, the Ginzburg-Landau SPDE and the Duffing oscillator, together with comparisons of their numerical performance.
ERIC Educational Resources Information Center
Kim, Kyung Hi
2014-01-01
This research, based on a case study of vulnerable children in Korea, used a mixed methods transformative approach to explore strategies to support and help disadvantaged children. The methodological approach includes three phases: a mixed methods contextual analysis, a qualitative dominant analysis based on Sen's capability approach and critical…
Note on Professor Sizer's Paper.
ERIC Educational Resources Information Center
Balderston, Frederick E.
1979-01-01
Issues suggested by John Sizer's paper, an overview of the assessment of institutional performance, include: the efficient-frontier approach, multiple-criterion decision-making models, performance analysis approached as path analysis, and assessment of academic quality. (JMD)
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
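As a concrete illustration of the sampling-based (Monte Carlo) end of this spectrum, the sketch below propagates parameter uncertainty through a generic screening-level ingestion-risk model; the model form and every distribution are illustrative assumptions, not values from the review.

```python
# Hedged sketch: Monte Carlo uncertainty characterization for a generic
# lifetime cancer-risk model, Risk = (C * IR * EF * ED) / (BW * AT) * SF.
# All distributions and constants below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

C  = rng.lognormal(mean=np.log(50.0), sigma=0.5, size=n)   # DBP concentration (ug/L)
IR = rng.normal(2.0, 0.3, size=n).clip(min=0.5)            # water intake (L/day)
SF = rng.lognormal(np.log(6.2e-3), 0.4, size=n)            # slope factor (per mg/kg-day)
BW = rng.normal(70.0, 10.0, size=n).clip(min=40.0)         # body weight (kg)
EF, ED, AT = 350.0, 30.0, 70.0 * 365.0                     # days/yr, yrs, days

dose = (C * 1e-3 * IR * EF * ED) / (BW * AT)               # mg/kg-day
risk = dose * SF                                           # lifetime excess risk

print(f"median risk:        {np.median(risk):.2e}")
print(f"95th percentile:    {np.percentile(risk, 95):.2e}")
print(f"P(risk > 1e-6):     {np.mean(risk > 1e-6):.3f}")
```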
Harrigan, George G; Harrison, Jay M
2012-01-01
New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
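The Bayesian advantage the authors describe, direct probability statements about quantities of interest, can be illustrated with a minimal sketch: a normal-approximation posterior for the GM-versus-conventional difference in one analyte, with synthetic data and an assumed 10% equivalence margin.

```python
# Hedged sketch: a Bayesian alternative to significance testing for one
# compositional analyte, reporting P(|difference| > margin) directly.
# Data, the vague-prior approximation and the margin are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
gm   = rng.normal(40.0, 2.0, size=20)   # e.g. protein (% dw) in GM entries
conv = rng.normal(39.2, 2.0, size=20)   # conventional comparator

def posterior_mean_draws(x, n_draws=50_000):
    # Posterior of the mean under a vague prior: approx Normal(mean, s^2/n)
    return rng.normal(x.mean(), x.std(ddof=1) / np.sqrt(len(x)), size=n_draws)

diff = posterior_mean_draws(gm) - posterior_mean_draws(conv)
margin = 0.10 * conv.mean()             # assumed 10% equivalence margin

print(f"posterior mean difference: {diff.mean():.2f}")
print(f"95% credible interval: {np.percentile(diff, [2.5, 97.5])}")
print(f"P(|diff| > margin): {np.mean(np.abs(diff) > margin):.3f}")
```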
ERIC Educational Resources Information Center
Bhathal, Ragbir; Sharma, Manjula D.; Mendez, Alberto
2010-01-01
This paper describes an educational analysis of a first year physics experiment on standing waves for engineering students. The educational analysis is based on the ACELL (Advancing Chemistry by Enhancing Learning in the Laboratory) approach which includes a statement of educational objectives and an analysis of student learning experiences. The…
Qualitative research methods in renal medicine: an introduction.
Bristowe, Katherine; Selman, Lucy; Murtagh, Fliss E M
2015-09-01
Qualitative methodologies are becoming increasingly widely used in health research. However, within some specialties, including renal medicine, qualitative approaches remain under-represented in the high-impact factor journals. Qualitative research can be undertaken: (i) as a stand-alone research method, addressing specific research questions; (ii) as part of a mixed methods approach alongside quantitative approaches or (iii) embedded in clinical trials, or during the development of complex interventions. The aim of this paper is to introduce qualitative research, including the rationale for choosing qualitative approaches, and guidance for ensuring quality when undertaking and reporting qualitative research. In addition, we introduce types of qualitative data (observation, interviews and focus groups) as well as some of the most commonly encountered methodological approaches (case studies, ethnography, phenomenology, grounded theory, thematic analysis, framework analysis and content analysis). © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
NASA Astrophysics Data System (ADS)
Shaltout, Abdallah A.; Moharram, Mohammed A.; Mostafa, Nasser Y.
2012-01-01
This work is the first attempt to quantify trace elements in the Catha edulis plant (Khat) with a fundamental parameter approach. C. edulis is a famous drug plant in east Africa and the Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint and green tea were included in the present investigation, as well as trifolium leaves as a non-related plant. The elemental analyses of the plants were done by Wavelength Dispersive X-Ray Fluorescence (WDXRF) spectroscopy. Standard-less quantitative WDXRF analysis was carried out based on the fundamental parameter approach. According to the standard-less analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on differential thermal analysis, was successfully used for the organic material determination. The results obtained with this approach were in good agreement with the commonly used methods. Using the developed method, quantitative analysis results for eighteen elements (Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn) were obtained for each plant. The results for the certified reference material of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method.
1991-12-01
Many approaches to context analysis were discussed by the group, including Causal Trees, SWOT, …
Cross-Sectional Time Series Designs: A General Transformation Approach.
ERIC Educational Resources Information Center
Velicer, Wayne F.; McDonald, Roderick P.
1991-01-01
The general transformation approach to time series analysis is extended to the analysis of multiple unit data by the development of a patterned transformation matrix. The procedure includes alternatives for special cases and requires only minor revisions in existing computer software. (SLD)
A Spatial Analysis and Game Theoretical Approach Over the Disputed Islands in the Aegean Sea
2016-06-01
Naval Postgraduate School, Monterey, California; Master's thesis, approved for public release, distribution unlimited. …including perimeter, area, population, distance to Greece, distance to Turkey, and territorial water area. After applying spatial analysis to two…
Approaches to answering critical CER questions.
Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y
2015-01-01
While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.
Benefit-Risk Analysis for Decision-Making: An Approach.
Raju, G K; Gurumurthi, K; Domike, R
2016-12-01
The analysis of benefit and risk is an important aspect of decision-making throughout the drug lifecycle. In this work, the use of a benefit-risk analysis approach to support decision-making was explored. The proposed approach builds on the qualitative US Food and Drug Administration (FDA) approach to include a more explicit analysis based on international standards and guidance that enables aggregation and comparison of benefit and risk on a common basis and a lifecycle focus. The approach is demonstrated on six decisions over the lifecycle (e.g., accelerated approval, withdrawal, and traditional approval) using two case studies: natalizumab for multiple sclerosis (MS) and bedaquiline for multidrug-resistant tuberculosis (MDR-TB). © 2016 American Society for Clinical Pharmacology and Therapeutics.
Multiattribute risk analysis in nuclear emergency management.
Hämäläinen, R P; Lindstedt, M R; Sinkko, K
2000-08-01
Radiation protection authorities have seen a potential for applying multiattribute risk analysis in nuclear emergency management and planning to deal with conflicting objectives, different parties involved, and uncertainties. This type of approach is expected to help in the following areas: to ensure that all relevant attributes are considered in decision making; to enhance communication between the concerned parties, including the public; and to provide a method for explicitly including risk analysis in the process. A multiattribute utility theory analysis was used to select a strategy for protecting the population after a simulated nuclear accident. The value-focused approach and the use of a neutral facilitator were identified as being useful.
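A minimal sketch of the additive multiattribute utility calculation that underlies such an analysis appears below; the attributes, weights and single-attribute utilities are invented for illustration, whereas the study elicited them from the concerned parties with a facilitator.

```python
# Hedged sketch: additive multiattribute utility over protection strategies.
# Attributes, weights and scores are invented for illustration only.
import numpy as np

attributes = ["averted dose", "cost", "social disruption"]
weights = np.array([0.5, 0.3, 0.2])        # importance weights, sum to 1

# Rows: strategies; columns: single-attribute utilities scaled to [0, 1]
strategies = {
    "do nothing": np.array([0.1, 1.0, 1.0]),
    "sheltering": np.array([0.6, 0.8, 0.6]),
    "relocation": np.array([0.9, 0.2, 0.1]),
}

for name, u in strategies.items():
    # Overall utility is the weighted sum of single-attribute utilities
    print(f"{name:12s} U = {weights @ u:.3f}")
```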
Djomba, Janet Klara; Zaletel-Kragelj, Lijana
2016-12-01
Research on social networks in public health focuses on how social structures and relationships influence health and health-related behaviour. While the sociocentric approach is used to study complete social networks, the egocentric approach is gaining popularity because of its focus on individuals, groups and communities. One participant of the healthy lifestyle health education workshop 'I'm moving', included in the study of social support for exercise, was randomly selected. The participant was denoted as the ego and the members of her/his social network as the alteri. Data were collected by personal interviews using a self-made questionnaire. Numerical methods and computer programmes for the analysis of social networks were used to demonstrate the analysis. The size, composition and structure of the egocentric social network were obtained by numerical analysis. The analysis of composition included homophily and homogeneity, and the analysis of structure included the degree of the egocentric network, the strength of the ego-alter ties and the average strength of ties. Visualisation of the network was performed with three freely available computer programmes, namely Egonet.QF, E-net and Pajek; the programmes are described and compared by their usefulness. Both numerical analysis and visualisation have their benefits, and the decision about which approach to use depends on the purpose of the social network analysis. While numerical analysis can be used in large-scale population-based studies, visualisation of personal networks can help health professionals in creating, implementing and evaluating preventive programmes, especially those focused on behaviour change.
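A toy version of the numerical part of such an egocentric analysis is sketched below (degree, average tie strength, homophily); the four-alter network is invented, and no specific package (Egonet.QF, E-net or Pajek) is reproduced.

```python
# Hedged sketch: numerical summaries of one egocentric network. The tiny
# network is invented; the study used interview data and dedicated software.
ego = {"sex": "F"}
alteri = {
    "A": {"sex": "F", "tie": 3},   # tie strength on an assumed 1-3 scale
    "B": {"sex": "M", "tie": 2},
    "C": {"sex": "F", "tie": 1},
    "D": {"sex": "F", "tie": 3},
}

degree = len(alteri)                                     # network size
avg_tie = sum(a["tie"] for a in alteri.values()) / degree
homophily = sum(a["sex"] == ego["sex"] for a in alteri.values()) / degree

print(f"network size (degree): {degree}")
print(f"average tie strength:  {avg_tie:.2f}")
print(f"homophily (same sex):  {homophily:.0%}")
```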
Parsimonious nonstationary flood frequency analysis
NASA Astrophysics Data System (ADS)
Serago, Jake M.; Vogel, Richard M.
2018-02-01
There is now widespread awareness of the impact of anthropogenic influences on extreme floods (and droughts) and thus an increasing need for methods to account for such influences when estimating a frequency distribution. We introduce a parsimonious approach to nonstationary flood frequency analysis (NFFA) based on a bivariate regression equation which describes the relationship between annual maximum floods, x, and an exogenous variable which may explain the nonstationary behavior of x. The conditional mean, variance and skewness of both x and y = ln (x) are derived, and combined with numerous common probability distributions including the lognormal, generalized extreme value and log Pearson type III models, resulting in a very simple and general approach to NFFA. Our approach offers several advantages over existing approaches including: parsimony, ease of use, graphical display, prediction intervals, and opportunities for uncertainty analysis. We introduce nonstationary probability plots and document how such plots can be used to assess the improved goodness of fit associated with a NFFA.
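A minimal sketch of the lognormal variant of this idea follows: regress y = ln(x) on the exogenous variable z, then read conditional (nonstationary) quantiles off the fitted conditional distribution. The data are synthetic and the covariate is an assumed urbanization index; the paper's GEV and LP3 variants follow the same pattern.

```python
# Hedged sketch: bivariate-regression NFFA under a lognormal assumption.
# Synthetic data; coefficients and the covariate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 60
z = np.linspace(0.0, 1.0, n) + rng.normal(0.0, 0.05, n)   # assumed urbanization index
y = 5.0 + 0.8 * z + rng.normal(0.0, 0.3, n)               # y = ln(annual max flood)

slope, intercept = np.polyfit(z, y, 1)                    # fit y = a + b*z
resid = y - (intercept + slope * z)
resid_sd = np.sqrt(resid @ resid / (n - 2))               # residual std. error

z99 = 2.3263   # standard normal 99th percentile
for z0 in (0.2, 0.9):
    # Conditional 100-year flood at covariate level z0 (lognormal quantile)
    q100 = np.exp(intercept + slope * z0 + z99 * resid_sd)
    print(f"covariate z = {z0:.1f}: 100-yr flood ~ {q100:.0f} (flow units)")
```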
Cost analysis in support of minimum energy standards for clothes washers and dryers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-02-02
The results of the cost analysis of energy conservation design options for laundry products are presented. The analysis was conducted using two approaches. The first is directed toward the development of industrial engineering cost estimates for each energy conservation option; this approach results in estimates of manufacturers' costs. The second approach is directed toward determining the market price differential of energy conservation features, and its results are shown. The market cost represents the cost to the consumer; it is the final cost, and therefore includes distribution costs as well as manufacturing costs.
Mathematics reflecting sensorimotor organization.
McCollum, Gin
2003-02-01
This review combines short presentations of several mathematical approaches that conceptualize issues in sensorimotor neuroscience from different perspectives and levels of analysis. The intricate organization of neural structures and sensorimotor performance calls for characterization using a variety of mathematical approaches. This review points out the prospects for mathematical neuroscience: in addition to computational approaches, there is a wide variety of mathematical approaches that provide insight into the organization of neural systems. By starting from the perspective that provides the greatest clarity, a mathematical approach avoids specificity that is inaccurate in characterizing the inherent biological organization. Approaches presented include the mathematics of ordered structures, motion-phase space, subject-coincident coordinates, equivalence classes, topological biodynamics, rhythm space metric, and conditional dynamics. Issues considered in this paper include unification of levels of analysis, response equivalence, convergence, relationship of physics to motor control, support of rhythms, state transitions, and focussing on low-dimensional subspaces of a high-dimensional sensorimotor space.
Comparison of Two Analysis Approaches for Measuring Externalized Mental Models
ERIC Educational Resources Information Center
Al-Diban, Sabine; Ifenthaler, Dirk
2011-01-01
Mental models are basic cognitive constructs that are central for understanding phenomena of the world and predicting future events. Our comparison of two analysis approaches, SMD and QFCA, for measuring externalized mental models reveals different levels of abstraction and different perspectives. The advantages of the SMD include possibilities…
Williams, Claire; Lewsey, James D; Briggs, Andrew H; Mackay, Daniel F
2017-05-01
This tutorial provides a step-by-step guide to performing cost-effectiveness analysis using a multi-state modeling approach. Alongside the tutorial, we provide easy-to-use functions in the statistics package R. We argue that this multi-state modeling approach using a package such as R has advantages over approaches where models are built in a spreadsheet package. In particular, using a syntax-based approach means there is a written record of what was done and the calculations are transparent. Reproducing the analysis is straightforward as the syntax just needs to be run again. The approach can be thought of as an alternative way to build a Markov decision-analytic model, which also has the option to use a state-arrival extended approach. In the state-arrival extended multi-state model, a covariate that represents patients' history is included, allowing the Markov property to be tested. We illustrate the building of multi-state survival models, making predictions from the models and assessing fits. We then proceed to perform a cost-effectiveness analysis, including deterministic and probabilistic sensitivity analyses. Finally, we show how to create 2 common methods of visualizing the results-namely, cost-effectiveness planes and cost-effectiveness acceptability curves. The analysis is implemented entirely within R. It is based on adaptions to functions in the existing R package mstate to accommodate parametric multi-state modeling that facilitates extrapolation of survival curves.
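The tutorial itself works in R with the mstate package; as a language-neutral illustration of the underlying idea, here is a minimal scripted cohort (Markov) model computing discounted costs, QALYs and an ICER. All transition probabilities, costs and utilities are invented, and the state-arrival extension is not reproduced.

```python
# Hedged sketch: a three-state cohort model (healthy -> sick -> dead) run in
# code so every calculation is scripted and reproducible. All numbers invented.
import numpy as np

def run_model(p_sick, annual_drug_cost=0.0):
    # Annual transition matrix over states [healthy, sick, dead]; rows sum to 1
    P = np.array([[1.0 - p_sick - 0.02, p_sick, 0.02],
                  [0.0,                 0.85,   0.15],
                  [0.0,                 0.0,    1.00]])
    cohort = np.array([1.0, 0.0, 0.0])
    costs = np.array([100.0 + annual_drug_cost, 5000.0 + annual_drug_cost, 0.0])
    utils = np.array([0.95, 0.60, 0.0])             # QALY weights per state
    total_cost = total_qaly = 0.0
    for year in range(40):                          # 40 annual cycles
        disc = 1.035 ** -year                       # 3.5% discounting
        total_cost += disc * (cohort @ costs)
        total_qaly += disc * (cohort @ utils)
        cohort = cohort @ P
    return total_cost, total_qaly

c0, q0 = run_model(p_sick=0.10)                          # standard care
c1, q1 = run_model(p_sick=0.06, annual_drug_cost=800.0)  # treatment slows progression
print(f"incremental cost:  {c1 - c0:,.0f}")
print(f"incremental QALYs: {q1 - q0:.3f}")
print(f"ICER: {(c1 - c0) / (q1 - q0):,.0f} per QALY gained")
```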
NASA Technical Reports Server (NTRS)
Ryan, R. S.; Bullock, T.; Holland, W. B.; Kross, D. A.; Kiefling, L. A.
1981-01-01
The achievement of an optimized design from the system standpoint under the low-cost, high-risk constraints of the present-day environment was analyzed. The Space Shuttle illustrates the requirement for an analysis approach that considers all major disciplines (coupling between structures, control, propulsion, thermal, aeroelastic, and performance) simultaneously. The Space Shuttle and certain payloads, Space Telescope and Spacelab, are examined. The requirements for system analysis approaches and criteria, including dynamic modeling requirements, test requirements, control requirements, and the resulting design verification approaches, are illustrated. A survey of the problem, potential approaches available as solutions, implications for future systems, and projected technology development areas are addressed.
Performer-centric Interface Design.
ERIC Educational Resources Information Center
McGraw, Karen L.
1995-01-01
Describes performer-centric interface design and explains a model-based approach for conducting performer-centric analysis and design. Highlights include design methodology, including cognitive task analysis; creating task scenarios; creating the presentation model; creating storyboards; proof of concept screens; object models and icons;…
Enhancing Critical Thinking by Teaching Two Distinct Approaches to Management
ERIC Educational Resources Information Center
Dyck, Bruno; Walker, Kent; Starke, Frederick A.; Uggerslev, Krista
2012-01-01
The authors explore the effect on students' critical thinking of teaching only one approach to management versus teaching two approaches to management. Results from a quasiexperiment--which included a survey, interviews, and case analysis--suggest that compared with students who are taught only a conventional approach to management (which…
Forestry sector analysis for developing countries: issues and methods.
R.W. Haynes
1993-01-01
A satellite meeting of the 10th Forestry World Congress focused on the methods used for forest sector analysis and their applications in both developed and developing countries. The results of that meeting are summarized, and a general approach for forest sector modeling is proposed. The approach includes models derived from the existing...
Semiotic Approach to the Analysis of Children's Drawings
ERIC Educational Resources Information Center
Turkcan, Burcin
2013-01-01
Semiotics, which is used for the analysis of a number of communication languages, helps describe the specific operational rules by determining the sub-systems included in the field it examines. Considering that art is a communication language, this approach could be used in analyzing children's products in art education. The present study aiming…
Considering a Cost Analysis Project? A Planned Approach
ERIC Educational Resources Information Center
Parish, Mina; Teetor, Travis
2006-01-01
As resources become more constrained in the library community, many organizations are finding that they need to have a better understanding of their costs. To this end, this article will present one approach to conducting a cost analysis (including questions to ask yourself, project team makeup, organizational support, and data organization). We…
ERIC Educational Resources Information Center
Grannas, Amanda M.; Lagalante, Anthony F.
2010-01-01
A new curricular approach in our undergraduate second-year instrumental analysis laboratory was implemented. Students work collaboratively on scenarios in diverse fields including pharmaceuticals, forensics, gemology, art conservation, and environmental chemistry. Each laboratory section (approximately 12 students) is divided into three groups…
NASA Technical Reports Server (NTRS)
Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.
1977-01-01
The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.
Optimization of composite box-beam structures including effects of subcomponent interactions
NASA Technical Reports Server (NTRS)
Ragon, Scott A.; Guerdal, Zafer; Starnes, James H., Jr.
1995-01-01
Minimum mass designs are obtained for a simple box beam structure subject to bending, torque and combined bending/torque load cases. These designs are obtained subject to point strain and linear buckling constraints. The present work differs from previous efforts in that special attention is paid to including the effects of subcomponent panel interaction in the optimal design process. Two different approaches are used to impose the buckling constraints. When the global approach is used, buckling constraints are imposed on the global structure via a linear eigenvalue analysis. This approach allows the subcomponent panels to interact in a realistic manner. The results obtained using this approach are compared to results obtained using a traditional, less expensive approach, called the local approach. When the local approach is used, in-plane loads are extracted from the global model and used to impose buckling constraints on each subcomponent panel individually. In the global cases, it is found that there can be significant interaction between skin, spar, and rib design variables. This coupling is weak or nonexistent in the local designs. It is determined that weight savings of up to 7% may be obtained by using the global approach instead of the local approach to design these structures. Several of the designs obtained using the linear buckling analysis are subjected to a geometrically nonlinear analysis. For the designs which were subjected to bending loads, the innermost rib panel begins to collapse at less than half the intended design load and in a mode different from that predicted by linear analysis. The discrepancy between the predicted linear and nonlinear responses is attributed to the effects of the nonlinear rib crushing load, and the parameter which controls this rib collapse failure mode is shown to be the rib thickness. The rib collapse failure mode may be avoided by increasing the rib thickness above the value obtained from the (linear analysis based) optimizer. It is concluded that it would be necessary to include geometric nonlinearities in the design optimization process if the true optimum in this case were to be found.
The Effect of Laminar Flow on Rotor Hover Performance
NASA Technical Reports Server (NTRS)
Overmeyer, Austin D.; Martin, Preston B.
2017-01-01
The topic of laminar flow effects on hover performance is introduced with respect to some historical efforts where laminar flow was either measured or attempted. An analysis method is outlined using a combined blade-element/momentum method coupled to an airfoil analysis method that includes the full e^N transition model. The analysis results compared well with the measured hover performance, including the measured location of transition on both the upper and lower blade surfaces. The analysis method is then used to understand the upper limits of hover efficiency as a function of disk loading. The impact of laminar flow is higher at low disk loading, but significant improvement in terms of power loading appears possible even at high disk loading approaching 20 psf. An optimum planform design equation is derived for cases of zero profile drag and finite drag levels. These results are intended to be a guide for design studies and as a benchmark against which to compare higher fidelity analysis results. The details of the analysis method are given to enable other researchers to use the same approach for comparison to other approaches.
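The momentum-theory reference that such an analysis works against can be sketched in a few lines: ideal hover power loading scales with sqrt(2*rho/DL), degraded by the figure of merit. The FM values below are assumptions; estimating FM itself, including transition effects, is what the paper's blade-element/momentum analysis does.

```python
# Hedged sketch: momentum-theory hover power loading versus disk loading.
# Figure-of-merit values are assumed; air density is sea-level standard.
import numpy as np

rho = 0.002377                        # slug/ft^3, sea-level air
DL = np.linspace(2.0, 20.0, 10)       # disk loading, lbf/ft^2
v_i = np.sqrt(DL / (2.0 * rho))       # ideal induced velocity, ft/s

for FM in (0.70, 0.80):               # assumed figures of merit
    PL = 550.0 * FM / v_i             # power loading, lbf per hp
    print(f"FM={FM:.2f}:", np.round(PL, 1))
```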
Multiple shooting shadowing for sensitivity analysis of chaotic dynamical systems
NASA Astrophysics Data System (ADS)
Blonigan, Patrick J.; Wang, Qiqi
2018-02-01
Sensitivity analysis methods are important tools for research and design with simulations. Many important simulations exhibit chaotic dynamics, including scale-resolving turbulent fluid flow simulations. Unfortunately, conventional sensitivity analysis methods are unable to compute useful gradient information for long-time-averaged quantities in chaotic dynamical systems. Sensitivity analysis with least squares shadowing (LSS) can compute useful gradient information for a number of chaotic systems, including simulations of chaotic vortex shedding and homogeneous isotropic turbulence. However, this gradient information comes at a very high computational cost. This paper presents multiple shooting shadowing (MSS), a more computationally efficient shadowing approach than the original LSS approach. Through an analysis of the convergence rate of MSS, it is shown that MSS can have lower memory usage and run time than LSS.
Liu, Dungang; Liu, Regina; Xie, Minge
2014-01-01
Meta-analysis has been widely used to synthesize evidence from multiple studies for common hypotheses or parameters of interest. However, it has not yet been fully developed for incorporating heterogeneous studies, which arise often in applications due to different study designs, populations or outcomes. For heterogeneous studies, the parameter of interest may not be estimable for certain studies, and in such a case, these studies are typically excluded from conventional meta-analysis. The exclusion of part of the studies can lead to a non-negligible loss of information. This paper introduces a meta-analysis for heterogeneous studies by combining the confidence density functions derived from the summary statistics of individual studies, hence referred to as the CD approach. It includes all the studies in the analysis and makes use of all information, direct as well as indirect. Under a general likelihood inference framework, this new approach is shown to have several desirable properties, including: i) it is asymptotically as efficient as the maximum likelihood approach using individual participant data (IPD) from all studies; ii) unlike the IPD analysis, it suffices to use summary statistics to carry out the CD approach, and individual-level data are not required; and iii) it is robust against misspecification of the working covariance structure of the parameter estimates. Besides its own theoretical significance, the last property also substantially broadens the applicability of the CD approach. All the properties of the CD approach are further confirmed by data simulated from a randomized clinical trial setting as well as by real data on aircraft landing performance. Overall, one obtains a unifying approach for combining summary statistics, subsuming many of the existing meta-analysis methods as special cases. PMID:26190875
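For the simplest case, normal confidence densities, the CD combination reduces to classical inverse-variance weighting, as the sketch below shows with invented study estimates; the paper's framework generalizes this to heterogeneous and indirect evidence.

```python
# Hedged sketch: combining studies through normal confidence densities.
# When study i yields estimate m_i with standard error s_i, the combined CD
# is again normal with inverse-variance weights. Numbers are invented.
import numpy as np

m = np.array([0.42, 0.55, 0.31])     # study estimates
s = np.array([0.10, 0.20, 0.15])     # standard errors

w = 1.0 / s**2                       # inverse-variance weights
m_comb = np.sum(w * m) / np.sum(w)
s_comb = 1.0 / np.sqrt(np.sum(w))

print(f"combined estimate: {m_comb:.3f} (s.e. {s_comb:.3f})")
print(f"95% CI: [{m_comb - 1.96*s_comb:.3f}, {m_comb + 1.96*s_comb:.3f}]")
```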
An Extended Spectral-Spatial Classification Approach for Hyperspectral Data
NASA Astrophysics Data System (ADS)
Akbari, D.
2017-11-01
In this paper an extended classification approach for hyperspectral imagery based on both spectral and spatial information is proposed. The spatial information is obtained by an enhanced marker-based minimum spanning forest (MSF) algorithm. Three different methods of dimension reduction are first used to obtain the subspace of hyperspectral data: (1) unsupervised feature extraction methods, including principal component analysis (PCA), independent component analysis (ICA), and minimum noise fraction (MNF); (2) supervised feature extraction methods, including decision boundary feature extraction (DBFE), discriminant analysis feature extraction (DAFE), and nonparametric weighted feature extraction (NWFE); (3) a genetic algorithm (GA). The spectral features obtained are then fed into the enhanced marker-based MSF classification algorithm, in which the markers are extracted from the classification maps obtained by both the SVM and watershed segmentation algorithms. To evaluate the proposed approach, the Pavia University hyperspectral data set is tested. Experimental results show that the proposed approach using the GA achieves approximately 8% higher overall accuracy than the original MSF-based algorithm.
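A minimal sketch of the spectral half of such a pipeline (dimension reduction followed by an SVM) on synthetic pixels is given below; the enhanced marker-based MSF spatial step and the GA feature selection are not reproduced, and all shapes and classes are invented.

```python
# Hedged sketch: PCA + SVM on synthetic "pixels" as the spectral stage only.
# The spatial (marker-based MSF) stage of the paper is not implemented here.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n, bands = 600, 103                    # e.g. Pavia University has 103 bands
X = rng.normal(size=(n, bands))
y = rng.integers(0, 3, size=n)         # 3 synthetic classes
X += y[:, None] * 0.5                  # give classes a spectral offset

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(PCA(n_components=10), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(Xtr, ytr)
print(f"overall accuracy (spectral only): {clf.score(Xte, yte):.2%}")
```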
Meta-analysis of gene-level tests for rare variant association.
Liu, Dajiang J; Peloso, Gina M; Zhan, Xiaowei; Holmen, Oddgeir L; Zawistowski, Matthew; Feng, Shuang; Nikpay, Majid; Auer, Paul L; Goel, Anuj; Zhang, He; Peters, Ulrike; Farrall, Martin; Orho-Melander, Marju; Kooperberg, Charles; McPherson, Ruth; Watkins, Hugh; Willer, Cristen J; Hveem, Kristian; Melander, Olle; Kathiresan, Sekar; Abecasis, Gonçalo R
2014-02-01
The majority of reported complex disease associations for common genetic variants have been identified through meta-analysis, a powerful approach that enables the use of large sample sizes while protecting against common artifacts due to population structure and repeated small-sample analyses sharing individual-level data. As the focus of genetic association studies shifts to rare variants, genes and other functional units are becoming the focus of analysis. Here we propose and evaluate new approaches for performing meta-analysis of rare variant association tests, including burden tests, weighted burden tests, variable-threshold tests and tests that allow variants with opposite effects to be grouped together. We show that our approach retains useful features from single-variant meta-analysis approaches and demonstrate its use in a study of blood lipid levels in ∼18,500 individuals genotyped with exome arrays.
A joint probability approach for coincidental flood frequency analysis at ungauged basin confluences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Cheng
2016-03-12
A reliable and accurate flood frequency analysis at the confluence of streams is of importance. Given that long-term peak flow observations are often unavailable at tributary confluences, at a practical level, this paper presents a joint probability approach (JPA) to address coincidental flood frequency analysis at the ungauged confluence of two streams based on the flow rate data from the upstream tributaries. One case study is performed for comparison against several traditional approaches, including the plotting-position formula, univariate flood frequency analysis, and the National Flood Frequency Program developed by the US Geological Survey. It shows that the results generated by the JPA approach agree well with the floods estimated by the plotting position and univariate flood frequency analysis based on the observation data.
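A Monte Carlo caricature of the joint-probability idea is sketched below: sample the two tributary annual-maximum marginals and combine them to approximate coincidental peaks at the confluence. The Gumbel marginals, the independence assumption and the simple additive combination are illustrative simplifications, not the paper's exact JPA.

```python
# Hedged sketch: Monte Carlo joint-probability estimate of coincidental floods
# at an ungauged confluence. Marginals, independence and the additive
# combination are simplifying assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Illustrative fitted marginals for the two upstream tributaries (m^3/s)
q1 = rng.gumbel(loc=300.0, scale=80.0, size=n)
q2 = rng.gumbel(loc=150.0, scale=50.0, size=n)

q_confluence = q1 + q2                 # coincidental peak (crude combination)
for T in (10, 50, 100):
    print(f"{T:4d}-yr flood ~ {np.quantile(q_confluence, 1 - 1/T):,.0f} m^3/s")
```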
Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...
Assessing the validity of discourse analysis: transdisciplinary convergence
NASA Astrophysics Data System (ADS)
Jaipal-Jamani, Kamini
2014-12-01
Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory, offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.
VoroTop: Voronoi cell topology visualization and analysis toolkit
NASA Astrophysics Data System (ADS)
Lazar, Emanuel A.
2018-01-01
This paper introduces a new open-source software program called VoroTop, which uses Voronoi topology to analyze local structure in atomic systems. Strengths of this approach include its abilities to analyze high-temperature systems and to characterize complex structure such as grain boundaries. This approach enables the automated analysis of systems and mechanisms previously not possible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolker, Eugene
Our project focused primarily on the analysis of different types of data produced by global high-throughput technologies, the integration of gene annotation with gene and protein expression information, and obtaining a better functional annotation of Shewanella genes. Specifically, four of our major activities and achievements include the development of: statistical models for identification and expression proteomics, superior to currently available approaches (including our own earlier ones); approaches to improve gene annotations on the whole-organism scale; standards for annotation, transcriptomics and proteomics approaches; and generalized approaches for the integration of gene annotation with gene and protein expression information.
Ren, Tong; Liu, Yan; Zhao, Xiaowen; Ni, Shaobin; Zhang, Cheng; Guo, Changgang; Ren, Minghua
2014-01-01
To compare the efficiency and safety of the transperitoneal approach with the retroperitoneal approach in laparoscopic partial nephrectomy for renal cell carcinoma and provide evidence-based medicine support for clinical treatment. A systematic computer search of PUBMED, EMBASE, and the Cochrane Library was executed to identify retrospective observational and prospective randomized controlled studies that compared the outcomes of the two approaches in laparoscopic partial nephrectomy. Two reviewers independently screened, extracted, and evaluated the included studies and executed statistical analysis using the software STATA 12.0. Outcomes of interest included perioperative and postoperative variables, surgical complications and oncological variables. Eight studies assessing transperitoneal laparoscopic partial nephrectomy (TLPN) versus retroperitoneal laparoscopic partial nephrectomy (RLPN) were included. RLPN had a shorter operating time (SMD = 1.001, 95% confidence interval [CI] 0.609-1.393, P<0.001), a lower estimated blood loss (SMD = 0.403, 95% CI 0.015-0.791, P = 0.042) and a shorter length of hospital stay (WMD = 0.936 days, 95% CI 0.609-1.263, P<0.001) than TLPN. There were no significant differences between the transperitoneal and retroperitoneal approaches in the other outcomes of interest. This meta-analysis indicates that, in appropriately selected patients, especially patients with a history of intraperitoneal procedures or posteriorly located renal tumors, RLPN can shorten the operation time, reduce the estimated blood loss and shorten the length of hospital stay. RLPN may be equally safe and faster compared with TLPN.
Leyrat, Clémence; Seaman, Shaun R; White, Ian R; Douglas, Ian; Smeeth, Liam; Kim, Joseph; Resche-Rigon, Matthieu; Carpenter, James R; Williamson, Elizabeth J
2017-01-01
Inverse probability of treatment weighting is a popular propensity score-based approach to estimate marginal treatment effects in observational studies at risk of confounding bias. A major issue when estimating the propensity score is the presence of partially observed covariates. Multiple imputation is a natural approach to handle missing data on covariates: covariates are imputed and a propensity score analysis is performed in each imputed dataset to estimate the treatment effect. The treatment effect estimates from each imputed dataset are then combined to obtain an overall estimate. We call this method MIte. However, an alternative approach has been proposed, in which the propensity scores are combined across the imputed datasets (MIps). Therefore, there are remaining uncertainties about how to implement multiple imputation for propensity score analysis: (a) should we apply Rubin's rules to the inverse probability of treatment weighting treatment effect estimates or to the propensity score estimates themselves? (b) does the outcome have to be included in the imputation model? (c) how should we estimate the variance of the inverse probability of treatment weighting estimator after multiple imputation? We studied the consistency and balancing properties of the MIte and MIps estimators and performed a simulation study to empirically assess their performance for the analysis of a binary outcome. We also compared the performance of these methods to complete case analysis and the missingness pattern approach, which uses a different propensity score model for each pattern of missingness, and a third multiple imputation approach in which the propensity score parameters are combined rather than the propensity scores themselves (MIpar). Under a missing at random mechanism, complete case and missingness pattern analyses were biased in most cases for estimating the marginal treatment effect, whereas multiple imputation approaches were approximately unbiased as long as the outcome was included in the imputation model. Only MIte was unbiased in all the studied scenarios and Rubin's rules provided good variance estimates for MIte. The propensity score estimated in the MIte approach showed good balancing properties. In conclusion, when using multiple imputation in the inverse probability of treatment weighting context, MIte with the outcome included in the imputation model is the preferred approach.
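A compact sketch of the MIte recipe follows: impute the covariate, run the IPTW analysis within each completed dataset, and pool the treatment-effect estimates. The data generation and the regression-based imputation draw below are illustrative stand-ins for proper multiple imputation such as chained equations; note the outcome enters the imputation model, as the abstract recommends.

```python
# Hedged sketch of the MIte idea: impute, estimate the IPTW effect per
# completed dataset, pool the effect estimates. Simplified MI for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n = 2_000
x = rng.normal(size=n)                                   # confounder
t = rng.binomial(1, 1.0 / (1.0 + np.exp(-x)))            # treatment
yv = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.5*t + x)))) # binary outcome
x_obs = np.where(rng.random(n) < 0.3, np.nan, x)         # 30% missing at random

estimates = []
for m in range(20):                                      # 20 imputations
    miss = np.isnan(x_obs)
    design = np.column_stack([np.ones(n), t, yv])        # outcome in the model
    beta, *_ = np.linalg.lstsq(design[~miss], x_obs[~miss], rcond=None)
    resid_sd = (x_obs[~miss] - design[~miss] @ beta).std()
    x_imp = x_obs.copy()
    x_imp[miss] = design[miss] @ beta + rng.normal(0.0, resid_sd, miss.sum())

    ps = LogisticRegression().fit(x_imp[:, None], t).predict_proba(x_imp[:, None])[:, 1]
    w = np.where(t == 1, 1.0 / ps, 1.0 / (1.0 - ps))     # IPT weights
    ate = (np.average(yv[t == 1], weights=w[t == 1])
           - np.average(yv[t == 0], weights=w[t == 0]))
    estimates.append(ate)                                # effect per imputation

print(f"MIte pooled ATE (risk difference): {np.mean(estimates):.3f}")
print(f"between-imputation s.d.: {np.std(estimates):.4f}")
```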
Cullis, B R; Smith, A B; Beeck, C P; Cowling, W A
2010-11-01
Exploring and exploiting variety by environment (V × E) interaction is one of the major challenges facing plant breeders. In paper I of this series, we presented an approach to modelling V × E interaction in the analysis of complex multi-environment trials using factor analytic models. In this paper, we develop a range of statistical tools which explore V × E interaction in this context. These tools include graphical displays such as heat-maps of genetic correlation matrices as well as so-called E-scaled uniplots that are a more informative alternative to the classical biplot for large plant breeding multi-environment trials. We also present a new approach to prediction for multi-environment trials that include pedigree information. This approach allows meaningful selection indices to be formed either for potential new varieties or potential parents.
An Informally Annotated Bibliography of Sociolinguistics.
ERIC Educational Resources Information Center
Tannen, Deborah
This annotated bibliography of sociolinguistics is divided into the following sections: speech events, ethnography of speaking and anthropological approaches to analysis of conversation; discourse analysis (including analysis of conversation and narrative), ethnomethodology and nonverbal communication; sociolinguistics; pragmatics (including…
NASA Technical Reports Server (NTRS)
Egolf, T. A.; Landgrebe, A. J.
1982-01-01
A user's manual is provided which includes the technical approach for the Prescribed Wake Rotor Inflow and Flow Field Prediction Analysis. The analysis is used to provide the rotor wake induced velocities at the rotor blades for use in blade airloads and response analyses and to provide induced velocities at arbitrary field points such as at a tail surface. This analysis calculates the distribution of rotor wake induced velocities based on a prescribed wake model. Section operating conditions are prescribed from blade motion and controls determined by a separate blade response analysis. The analysis represents each blade by a segmented lifting line, and the rotor wake by discrete segmented trailing vortex filaments. Blade loading and circulation distributions are calculated based on blade element strip theory including the local induced velocity predicted by the numerical integration of the Biot-Savart Law applied to the vortex wake model.
An interactive modular design for computerized photometry in spectrochemical analysis
NASA Technical Reports Server (NTRS)
Bair, V. L.
1980-01-01
A general functional description of totally automatic photometry of emission spectra is not available for an operating environment in which the sample compositions and analysis procedures are low-volume and non-routine. The advantages of using an interactive approach to computer control in such an operating environment are demonstrated. This approach includes modular subroutines selected at multiple-option, menu-style decision points. This style of programming is used to trace elemental determinations, including the automated reading of spectrographic plates produced by a 3.4 m Ebert mount spectrograph using a dc-arc in an argon atmosphere. The simplified control logic and modular subroutine approach facilitates innovative research and program development, yet is easily adapted to routine tasks. Operator confidence and control are increased by the built-in options including degree of automation, amount of intermediate data printed out, amount of user prompting, and multidirectional decision points.
Adversarial risk analysis with incomplete information: a level-k approach.
Rothschild, Casey; McLay, Laura; Guikema, Seth
2012-07-01
This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed with a probability less than one to the attacker before he decides on how or whether to attack. © 2011 Society for Risk Analysis.
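The level-k iteration is easy to make concrete: start from a non-strategic level-0 opponent and alternate best responses. The toy defend-attack loss matrix below is invented, and the probabilistic revelation of the defender's countermeasures described in the abstract is omitted.

```python
# Hedged sketch: level-k reasoning in a toy defend-attack matrix game.
# Rows: defender countermeasures; columns: attack modes; entries are losses
# to the defender. All payoffs and labels are invented for illustration.
import numpy as np

defences = ["harden site", "add sensors", "patrol"]
attacks  = ["mode 1", "mode 2", "mode 3"]
loss = np.array([[1.0, 8.0, 5.0],
                 [6.0, 2.0, 5.0],
                 [4.0, 4.0, 3.0]])

# Level-0 attacker: non-strategic, attacks uniformly at random
attack_l0 = np.ones(3) / 3

# Level-1 defender: best response (minimum expected loss) to level-0 attacker
defence_l1 = int(np.argmin(loss @ attack_l0))

# Level-2 attacker: best response to the level-1 defender's choice
attack_l2 = int(np.argmax(loss[defence_l1]))

print(f"level-1 defence: {defences[defence_l1]}")
print(f"level-2 attack:  {attacks[attack_l2]}")
```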
MUSiC—An Automated Scan for Deviations between Data and Monte Carlo Simulation
NASA Astrophysics Data System (ADS)
Meyer, Arnd
2010-02-01
A model independent analysis approach is presented, systematically scanning the data for deviations from the standard model Monte Carlo expectation. Such an analysis can contribute to the understanding of the CMS detector and the tuning of event generators. The approach is sensitive to a variety of models of new physics, including those not yet thought of.
NASA Technical Reports Server (NTRS)
Cothran, E. K.
1982-01-01
The computer program written in support of one dimensional analytical approach to thermal modeling of Bridgman type crystal growth is presented. The program listing and flow charts are included, along with the complete thermal model. Sample problems include detailed comments on input and output to aid the first time user.
Delgado, Alejandra; Posada-Ureta, Oscar; Olivares, Maitane; Vallejo, Asier; Etxebarria, Nestor
2013-12-15
In this study, priority organic pollutants usually found in environmental water samples were considered in two extraction and analysis approaches. The compounds included organochlorine compounds, pesticides, phthalates, phenols and residues of pharmaceuticals and personal care products. The extraction and analysis steps were based on silicone rod (SR) extraction followed by liquid desorption in combination with large volume injection-programmable temperature vaporiser (LVI-PTV) and gas chromatography-mass spectrometry (GC-MS). Variables affecting the analytical response as a function of the programmable temperature vaporiser (PTV) parameters were first optimised following an experimental design approach. The SR extraction and desorption conditions were assessed afterwards, including matrix modification, extraction time, and stripping solvent composition. Subsequently, the possibility of performing membrane enclosed sorptive coating (MESCO) extraction as a modified extraction approach was also evaluated. The optimised method showed low method detection limits (3-35 ng L(-1)), acceptable accuracy (78-114%) and precision values (<13%) for most of the studied analytes regardless of the aqueous matrix. Finally, the developed approach was successfully applied to the determination of the target analytes in aqueous environmental matrices including estuarine and wastewater samples. © 2013 Elsevier B.V. All rights reserved.
Krawitz, Peter M; Schiska, Daniela; Krüger, Ulrike; Appelt, Sandra; Heinrich, Verena; Parkhomchuk, Dmitri; Timmermann, Bernd; Millan, Jose M; Robinson, Peter N; Mundlos, Stefan; Hecht, Jochen; Gross, Manfred
2014-01-01
Usher syndrome is an autosomal recessive disorder characterized both by deafness and blindness. For the three clinical subtypes of Usher syndrome causal mutations in altogether 12 genes and a modifier gene have been identified. Due to the genetic heterogeneity of Usher syndrome, the molecular analysis is predestined for a comprehensive and parallelized analysis of all known genes by next-generation sequencing (NGS) approaches. We describe here the targeted enrichment and deep sequencing for exons of Usher genes and compare the costs and workload of this approach compared to Sanger sequencing. We also present a bioinformatics analysis pipeline that allows us to detect single-nucleotide variants, short insertions and deletions, as well as copy number variations of one or more exons on the same sequence data. Additionally, we present a flexible in silico gene panel for the analysis of sequence variants, in which newly identified genes can easily be included. We applied this approach to a cohort of 44 Usher patients and detected biallelic pathogenic mutations in 35 individuals and monoallelic mutations in eight individuals of our cohort. Thirty-nine of the sequence variants, including two heterozygous deletions comprising several exons of USH2A, have not been reported so far. Our NGS-based approach allowed us to assess single-nucleotide variants, small indels, and whole exon deletions in a single test. The described diagnostic approach is fast and cost-effective with a high molecular diagnostic yield. PMID:25333064
Computer aided analysis and optimization of mechanical system dynamics
NASA Technical Reports Server (NTRS)
Haug, E. J.
1984-01-01
The purpose is to outline a computational approach to spatial dynamics of mechanical systems that substantially enlarges the scope of consideration to include flexible bodies, feedback control, hydraulics, and related interdisciplinary effects. Design sensitivity analysis and optimization is the ultimate goal. The approach to computer generation and solution of the system dynamic equations and graphical methods for creating animations as output is outlined.
ERIC Educational Resources Information Center
Bashaw, W. L., Ed.; Findley, Warren G., Ed.
This volume contains the five major addresses and subsequent discussion from the Symposium on the General Linear Models Approach to the Analysis of Experimental Data in Educational Research, which was held in 1967 in Athens, Georgia. The symposium was designed to produce systematic information, including new methodology, for dissemination to the…
Fluxomics - connecting 'omics analysis and phenotypes.
Winter, Gal; Krömer, Jens O
2013-07-01
In our modern 'omics era, metabolic flux analysis (fluxomics) represents the physiological counterpart of its siblings transcriptomics, proteomics and metabolomics. Fluxomics integrates in vivo measurements of metabolic fluxes with stoichiometric network models to allow the determination of absolute flux through large networks of the central carbon metabolism. There are many approaches to implement fluxomics, including flux balance analysis (FBA), 13C fluxomics and 13C-constrained FBA, as well as many experimental settings for flux measurement, including dynamic, stationary and semi-stationary. Here we outline the principles of the different approaches and their relative advantages. We demonstrate the unique contribution of flux analysis for phenotype elucidation using a thoroughly studied metabolic reaction as a case study, the microbial aerobic/anaerobic shift, highlighting the importance of flux analysis as a single layer of data as well as interlaced in multi-omics studies. © 2012 John Wiley & Sons Ltd and Society for Applied Microbiology.
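Of these approaches, plain FBA is the easiest to show in code: a linear programme that maximizes a biomass flux subject to steady state (S v = 0) and capacity bounds. The two-metabolite network below is invented; genome-scale models apply the same machinery to thousands of reactions.

```python
# Hedged sketch: flux balance analysis as a linear programme on a toy network.
# Columns: uptake (->A), conversion (A->B), biomass (B->), byproduct (A->).
import numpy as np
from scipy.optimize import linprog

S = np.array([[ 1, -1,  0, -1],    # steady-state balance for metabolite A
              [ 0,  1, -1,  0]])   # steady-state balance for metabolite B
c = np.array([0, 0, -1, 0])        # maximize biomass = minimize -v_biomass
bounds = [(0, 10), (0, 100), (0, 100), (0, 100)]   # uptake capped at 10

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", np.round(res.x, 2))
print("max biomass flux:", -res.fun)
```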
Clarifying Public Controversy: An Approach to Teaching Social Studies.
ERIC Educational Resources Information Center
Newmann, Fred M.; Oliver, Donald W.
This book presents a rational discussion process approach to the teaching of specific social controversies in the social studies in secondary schools. The authors provide an in-depth analysis of this approach which includes both theory and application. The introductory chapters place the discussion process framework within terms of American values…
Implementation of Potential of the Transdisciplinary Approaches in Economic Studies
ERIC Educational Resources Information Center
Stepanova, Tatiana E.; Manokhina, Nadeghda V.; Konovalova, Maria E.; Kuzmina, Olga Y.; Andryukhina, Lyudmila M.
2016-01-01
The relevance of the researched problem is caused by the increasing interest in using potential of transdisciplinary approaches, and mathematical methods, which include the game theory in analysis of public and economic processes. The aim of the article is studying a possibility of implementation of the transdisciplinary approaches in economic…
The Value of Information: Approaches in Economics, Accounting, and Management Science.
ERIC Educational Resources Information Center
Repo, Aatto J.
1989-01-01
This review and analysis of research on the economics of information performed by economists, accounting researchers, and management scientists focuses on their approaches to describing and measuring the value of information. The discussion includes comparisons of research approaches based on cost effectiveness and on the value of information. (77…
Incorporating organisational safety culture within ergonomics practice.
Bentley, Tim; Tappin, David
2010-10-01
This paper conceptualises organisational safety culture and considers its relevance to ergonomics practice. Issues discussed in the paper include the modest contribution that ergonomists and ergonomics as a discipline have made to this burgeoning field of study and the significance of safety culture to a systems approach. The relevance of safety culture to ergonomics work with regard to the analysis, design, implementation and evaluation process, and implications for participatory ergonomics approaches, are also discussed. A potential user-friendly, qualitative approach to assessing safety culture as part of ergonomics work is presented, based on a recently published conceptual framework that recognises the dynamic and multi-dimensional nature of safety culture. The paper concludes by considering the use of such an approach, where an understanding of different aspects of safety culture within an organisation is seen as important to the success of ergonomics projects. STATEMENT OF RELEVANCE: The relevance of safety culture to ergonomics practice is a key focus of this paper, including its relationship with the systems approach, participatory ergonomics and the ergonomics analysis, design, implementation and evaluation process. An approach to assessing safety culture as part of ergonomics work is presented.
Biardeau, X; Zanaty, M; Aoun, F; Benbouzid, S; Peyronnet, B
2016-03-01
We aimed to assess the complications associated with the different approaches used in female suburethral sling surgery. We searched Medline using the following keywords: "suburethral slings", "complications", "safety" and "randomized". Only randomized clinical trials including women and reporting intra- and postoperative complications associated with the retropubic (RP) approach, TOT, and/or TVT-O were included. The meta-analysis was conducted using the Review Manager (RevMan 5.3) software provided by the Cochrane Library. Out of 176 articles, 23 were included in the synthesis. Risks of bladder perforation during surgery (60/1482 vs 5/1479; OR=6.44; 95% CI [3.32-12.50]) and postoperative urinary retention (48/1160 vs 24/1159; OR=1.93; 95% CI [1.26-3.12]) were significantly higher with the RP approach than with the transobturator (TO) approach (TOT or TVT-O). Conversely, the risk of prolonged postoperative pain was significantly lower after the RP approach than after the TO approach (24/1156 vs 69/1149; OR=0.36; 95% CI [0.23-0.56]). Risks of intraoperative urethral injury, postoperative erosion and de novo overactive bladder were comparable between the two approaches. Data comparing TOT with TVT-O were scarce and did not allow us to draw conclusions about their associated complications. The RP approach was associated with significantly higher risks of bladder perforation and postoperative urinary retention; the TO approach was associated with a higher risk of prolonged postoperative pain. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
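The odds ratios and confidence intervals quoted above follow standard 2x2-table arithmetic (OR = ad/bc, with a Wald interval on the log scale). A short illustration follows; note that pooled meta-analytic estimates additionally weight each trial, so this single-table calculation will not reproduce the pooled OR of 6.44 reported above.

```python
# Odds ratio and Wald 95% CI from a single 2x2 table (illustrative only).
import math

def odds_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    a, b = events_a, n_a - events_a   # arm A: events / non-events
    c, d = events_b, n_b - events_b   # arm B: events / non-events
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Raw bladder-perforation counts from the review, pooled naively:
print(odds_ratio_ci(60, 1482, 5, 1479))
```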
Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W
2014-06-01
Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
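The subjective, multi-criteria part of such a concept can be conveyed with a minimal weighted-sum scoring sketch; the setups, criteria, weights, and scores below are all hypothetical and serve only to show the mechanics.

```python
# Weighted-sum MCDA sketch: rank surveillance setups on several criteria.
import numpy as np

setups = ["low-intensity sampling", "risk-based sampling", "full census"]
criteria = ["detection speed", "trade impact", "public health impact"]
weights = np.array([0.5, 0.2, 0.3])   # would be elicited from decision makers

# scores[i, j]: performance of setup i on criterion j, scaled to 0..1
scores = np.array([
    [0.4, 0.9, 0.5],
    [0.7, 0.7, 0.7],
    [0.9, 0.3, 0.9],
])

overall = scores @ weights
for name, s in sorted(zip(setups, overall), key=lambda t: -t[1]):
    print(f"{name}: {s:.2f}")
```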
Crash Certification by Analysis - Are We There Yet?
NASA Technical Reports Server (NTRS)
Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.
2006-01-01
This paper addresses the issue of crash certification by analysis. This broad topic encompasses many ancillary issues including model validation procedures, uncertainty in test data and analysis models, probabilistic techniques for test-analysis correlation, verification of the mathematical formulation, and establishment of appropriate qualification requirements. This paper will focus on certification requirements for crashworthiness of military helicopters; capabilities of the current analysis codes used for crash modeling and simulation, including some examples of simulations from the literature to illustrate the current approach to model validation; and future directions needed to achieve "crash certification by analysis."
Optofluidic Cell Selection from Complex Microbial Communities for Single-Genome Analysis
Landry, Zachary C.; Giovanonni, Stephen J.; Quake, Stephen R.; Blainey, Paul C.
2013-01-01
Genetic analysis of single cells is emerging as a powerful approach for studies of heterogeneous cell populations. Indeed, the notion of homogeneous cell populations is receding as approaches to resolve genetic and phenotypic variation between single cells are applied throughout the life sciences. A key step in single-cell genomic analysis today is the physical isolation of individual cells from heterogeneous populations, particularly microbial populations, which often exhibit high diversity. Here, we detail the construction and use of instrumentation for optical trapping inside microfluidic devices to select individual cells for analysis by methods including nucleic acid sequencing. This approach has unique advantages for analyses of rare community members, cells with irregular morphologies, small quantity samples, and studies that employ advanced optical microscopy. PMID:24060116
Fair market value: taking a proactive approach.
Romero, Richard A
2008-04-01
A valuation report assessing the fair market value of a contractual arrangement should include: a description of the company, entity, or circumstance being valued; analysis of general economic conditions that are expected to affect the enterprise; evaluation of economic conditions in the medical services industry; explanation of the various valuation approaches that were considered; and documentation of key underlying assumptions, including revenue and expense projections, projected profit, and ROI.
Planning for Cost Effectiveness.
ERIC Educational Resources Information Center
Schlaebitz, William D.
1984-01-01
A heat pump life-cycle cost analysis is used to explain the technique. Items suggested for the life-cycle analysis approach include lighting, longer-life batteries, site maintenance, and retaining experts to inspect specific building components. (MLF)
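The life-cycle cost technique itself reduces to discounting each year's operating cost to present value and adding the first cost. A minimal sketch follows; all prices, rates, and lifetimes are invented for illustration.

```python
# Life-cycle cost comparison: first cost plus discounted operating costs.
def life_cycle_cost(first_cost, annual_cost, years, discount_rate):
    pv_operating = sum(annual_cost / (1 + discount_rate) ** t
                       for t in range(1, years + 1))
    return first_cost + pv_operating

# Hypothetical heat pump vs. electric resistance heating, 20-year study period
print("heat pump: ", round(life_cycle_cost(12000, 900, 20, 0.05)))
print("resistance:", round(life_cycle_cost(4000, 2200, 20, 0.05)))
# The higher first cost wins on life-cycle cost via lower operating cost.
```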
On the Application of Syntactic Methodologies in Automatic Text Analysis.
ERIC Educational Resources Information Center
Salton, Gerard; And Others
1990-01-01
Summarizes various linguistic approaches proposed for document analysis in information retrieval environments. Topics discussed include syntactic analysis; use of machine-readable dictionary information; knowledge base construction; the PLNLP English Grammar (PEG) system; phrase normalization; and statistical and syntactic phrase evaluation used…
Large perturbation flow field analysis and simulation for supersonic inlets
NASA Technical Reports Server (NTRS)
Varner, M. O.; Martindale, W. R.; Phares, W. J.; Kneile, K. R.; Adams, J. C., Jr.
1984-01-01
An analysis technique for simulation of supersonic mixed compression inlets with large flow field perturbations is presented. The approach is based upon a quasi-one-dimensional inviscid unsteady formulation which includes engineering models of unstart/restart, bleed, bypass, and geometry effects. Numerical solution of the governing time dependent equations of motion is accomplished through a shock capturing finite difference algorithm, of which five separate approaches are evaluated. Comparison with experimental supersonic wind tunnel data is presented to verify the present approach for a wide range of transient inlet flow conditions.
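The paper's specific algorithms are not reproduced here, but the flavor of a shock-capturing finite-difference scheme can be shown on the standard one-dimensional model problem, the inviscid Burgers equation, using the classical Lax-Friedrichs scheme. This is a generic stand-in, not the quasi-one-dimensional inlet formulation.

```python
# Lax-Friedrichs shock capturing for the inviscid Burgers equation u_t + (u^2/2)_x = 0.
import numpy as np

nx, nt = 200, 150
dx = 2.0 / nx
dt = 0.4 * dx                      # CFL-limited time step for |u| <= 1
x = np.linspace(-1, 1, nx)
u = np.where(x < 0, 1.0, 0.0)      # Riemann initial data: right-moving shock

def flux(u):
    return 0.5 * u**2

for _ in range(nt):
    f = flux(u)
    # Built-in numerical dissipation captures the shock without oscillations.
    u[1:-1] = 0.5 * (u[2:] + u[:-2]) - dt / (2 * dx) * (f[2:] - f[:-2])
    u[0], u[-1] = 1.0, 0.0         # inflow/outflow boundary values

# Shock speed is (uL + uR)/2 = 0.5, so at t = 0.6 expect the shock near x = 0.3.
print("shock position ~", x[np.argmin(np.abs(u - 0.5))])
```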
Chigerwe, Munashe; Ilkiw, Jan E; Boudreaux, Karen A
2011-01-01
The objectives of the present study were to evaluate first-, second-, third-, and fourth-year veterinary medical students' approaches to studying and learning, as well as the factors within the curriculum that may influence these approaches. A questionnaire consisting of the short version of the Approaches and Study Skills Inventory for Students (ASSIST) was completed by 405 students; it included questions relating to conceptions of learning, approaches to studying, and preferences for different types of courses and teaching. Descriptive statistics, factor analysis, Cronbach's alpha analysis, and log-linear analysis were performed on the data. Deep, strategic, and surface learning approaches emerged. There were a few differences between our findings and those of previous studies: the subscale monitoring effectiveness loaded on both the deep and strategic learning approaches, and the subscale alertness to assessment demands correlated with the surface learning approach. The perception of high workloads, the use of previous test files as a method for studying, and examinations based only on material provided in lecture notes were positively associated with the surface learning approach. Focusing on improving specific teaching and assessment methods that enhance deep learning is anticipated to enhance students' positive learning experience. These teaching methods include instructors who encourage students to be critical thinkers, the integration of course material with other disciplines, courses that encourage thinking and reading about the learning material, and books and articles that challenge students while providing explanations beyond the lecture material.
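Cronbach's alpha, used above for scale reliability, is straightforward to compute: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A short sketch on simulated responses (not ASSIST data):

```python
# Cronbach's alpha for a k-item scale, on simulated Likert-type responses.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(405, 1))                      # one underlying trait
items = latent + rng.normal(scale=0.8, size=(405, 6))   # six noisy items
print(f"alpha = {cronbach_alpha(items):.2f}")
```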
Performance Learning Roadmap A Network-Centric Approach for Engaged Learners
2005-01-01
Client organizations include Insurance Corporation, Target Corporation, Unilever Corporation, United Nations Development Programme, University of Wisconsin (UWSA)-Madison, and the U.S. Coast Guard. Performance support services include consulting, coaching, mentoring, rapid deployment training, targeted training, analysis, facilitation, and team collaboration support.
ERIC Educational Resources Information Center
Weinberg, Jessica P., Ed.; O'Bryan, Erin L., Ed.; Moll, Laura A., Ed.; Haugan, Jason D., Ed.
The five papers included in this volume approach the study of American Indian languages from a diverse array of methodological and theoretical approaches to linguistics. Two papers focus on approaches that come from the applied linguistics tradition, emphasizing ethnolinguistics and discourse analysis: Sonya Bird's paper "A Cross Cultural…
Development of an Aeroelastic Analysis Including a Viscous Flow Model
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Bakhle, Milind A.
2001-01-01
Under this grant, Version 4 of the three-dimensional Navier-Stokes aeroelastic code (TURBO-AE) has been developed and verified. The TURBO-AE Version 4 aeroelastic code allows flutter calculations for a fan, compressor, or turbine blade row. This code models a vibrating three-dimensional bladed disk configuration and the associated unsteady flow (including shocks and viscous effects) to calculate the aeroelastic instability using a work-per-cycle approach. Phase-lagged (time-shift) periodic boundary conditions are used to model the phase lag between adjacent vibrating blades. The direct-store approach is used for this purpose to reduce the computational domain to a single interblade passage. A disk storage option, implemented using direct-access files, is available to reduce the large memory requirements of the direct-store approach. Other researchers have implemented 3D inlet/exit boundary conditions based on eigen-analysis. Appendix A: Aeroelastic calculations based on three-dimensional Euler analysis. Appendix B: Unsteady aerodynamic modeling of blade vibration using the TURBO-V3.1 code.
Object-oriented approach to fast display of electrophysiological data under MS-windows.
Marion-Poll, F
1995-12-01
Microcomputers provide neuroscientists with an alternative to a host of laboratory equipment for recording and analyzing electrophysiological data. Object-oriented programming tools provide an essential link between custom data acquisition and analysis needs and general software packages. In this paper, we outline the layout of basic objects that display and manipulate electrophysiological data files. Visual inspection of the recordings is a basic requirement of any data analysis software. We present an approach that allows flexible and fast display of large data sets by constructing an intermediate representation of the data, which lowers the number of points actually displayed while preserving the appearance of the data. The second group of objects is related to the management of lists of data files: typical experiments designed to test the biological activity of pharmacological products include scores of files, and data manipulation and analysis are facilitated by creating multi-document objects that hold the names of all experiment files. Implementation steps for both groups of objects are described for an MS-Windows hosted application.
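The paper does not spell out its decimation algorithm; one common way to realize such an intermediate representation is per-pixel min/max decimation, sketched below under that assumption. It cuts millions of samples to a few thousand plotted points while preserving the visual envelope of the trace.

```python
# Min/max decimation sketch for fast display of long recordings (generic,
# not the paper's code).
import numpy as np

def minmax_decimate(samples, n_pixels):
    """Return ~2*n_pixels points (per-bucket min and max) for fast plotting."""
    usable = (len(samples) // n_pixels) * n_pixels
    buckets = samples[:usable].reshape(n_pixels, -1)
    lo, hi = buckets.min(axis=1), buckets.max(axis=1)
    out = np.empty(2 * n_pixels)
    out[0::2], out[1::2] = lo, hi   # interleave so spikes are never dropped
    return out

trace = np.random.default_rng(1).normal(size=2_000_000)  # a long recording
display = minmax_decimate(trace, 1024)   # one min/max pair per pixel column
print(len(trace), "->", len(display), "points")
```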
A note on the depreciation of the societal perspective in economic evaluation of health care.
Johannesson, M
1995-07-01
It is common in cost-effectiveness analyses of health care to include only health care costs, on the argument that some fictive 'health care budget' should be used to maximize health effects. This paper provides a criticism of this 'health care budget' approach to cost-effectiveness analysis of health care. It is argued that the approach is ad hoc, lacks theoretical foundation, and is inconsistent with using a fixed budget as the decision rule for cost-effectiveness analysis: consistency would require that only costs falling on a single annual actual budget be included, which would mean excluding any costs paid by patients, any future cost changes, and all costs that fall on other budgets; furthermore, the prices facing the budget holder should then be used rather than opportunity costs. It is concluded that the 'health care budget' perspective should be abandoned and the societal perspective reinstated in economic evaluation of health care.
The CICT Earth Science Systems Analysis Model
NASA Technical Reports Server (NTRS)
Pell, Barney; Coughlan, Joe; Biegel, Bryan; Stevens, Ken; Hansson, Othar; Hayes, Jordan
2004-01-01
Contents include the following: Computing Information and Communications Technology (CICT) Systems Analysis. Our modeling approach: a 3-part schematic investment model of technology change, impact assessment and prioritization. A whirlwind tour of our model. Lessons learned.
Evolving Concepts and Teaching Approaches In Tectonics and Sedimentation.
ERIC Educational Resources Information Center
Graham, Stephan Alan
1983-01-01
Discusses five recent advances in sedimentary tectonics, noting how they are incorporated into college curricula. Advances discussed include basin type, tectonic setting, facies analysis (in conjunction with basin type/setting), stratigraphic analysis of reflection seismic data, and quantitative analysis of subsidence histories of sedimentary…
1992 NASA Life Support Systems Analysis workshop
NASA Technical Reports Server (NTRS)
Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.
1992-01-01
The 1992 Life Support Systems Analysis Workshop was sponsored by NASA's Office of Aeronautics and Space Technology (OAST) to integrate the inputs from, disseminate information to, and foster communication among NASA, industry, and academic specialists. The workshop continued discussion and definition of key issues identified in the 1991 workshop, including: (1) modeling and experimental validation; (2) definition of systems analysis evaluation criteria; (3) integration of modeling at multiple levels; and (4) assessment of process control modeling approaches. Through both the 1991 and 1992 workshops, NASA has continued to seek input from industry and university chemical process modeling and analysis experts, and to introduce and apply new systems analysis approaches to life support systems. The workshop included technical presentations, discussions, and interactive planning, with sufficient time allocated for discussion of both technology status and technology development recommendations. Key personnel currently involved with life support technology developments from NASA, industry, and academia provided input to the status and priorities of current and future systems analysis methods and requirements.
Kapiriri, Lydia; Razavi, Donya
2017-09-01
There is a growing body of literature on systematic approaches to healthcare priority setting from various countries and different levels of decision making. This paper synthesizes the current literature in order to assess the extent to which program budgeting and marginal analysis (PBMA), burden of disease & cost-effectiveness analysis (BOD/CEA), multi-criteria decision analysis (MCDA), and accountability for reasonableness (A4R), are reported to have been institutionalized and influenced policy making and practice. We searched for English language publications on health care priority setting approaches (2000-2017). Our sources of literature included PubMed and Ovid databases (including Embase, Global Health, Medline, PsycINFO, EconLit). Of the four approaches PBMA and A4R were commonly applied in high income countries while BOD/CEA was exclusively applied in low income countries. PBMA and BOD/CEA were most commonly reported to have influenced policy making. The explanations for limited adoption of an approach were related to its complexity, poor policy maker understanding and resource requirements. While systematic approaches have the potential to improve healthcare priority setting; most have not been adopted in routine policy making. The identified barriers call for sustained knowledge exchange between researchers and policy-makers and development of practical guidelines to ensure that these frameworks are more accessible, applicable and sustainable in informing policy making. Copyright © 2017 Elsevier B.V. All rights reserved.
Zhang, Hong; Abhyankar, Shrirang; Constantinescu, Emil; ...
2017-01-24
Sensitivity analysis is an important tool for describing power system dynamic behavior in response to parameter variations. It is a central component in preventive and corrective control applications. The existing approaches for sensitivity calculations, namely, finite-difference and forward sensitivity analysis, require a computational effort that increases linearly with the number of sensitivity parameters. In this paper, we investigate, implement, and test a discrete adjoint sensitivity approach whose computational effort is effectively independent of the number of sensitivity parameters. The proposed approach is highly efficient for calculating sensitivities of larger systems and is consistent, within machine precision, with the function whose sensitivity we are seeking. This is an essential feature for use in optimization applications. Moreover, our approach includes a consistent treatment of systems with switching, such as dc exciters, by deriving and implementing the adjoint jump conditions that arise from state-dependent and time-dependent switchings. The accuracy and the computational efficiency of the proposed approach are demonstrated in comparison with the forward sensitivity analysis approach. This paper focuses primarily on power system dynamics, but the approach is general and can be applied to hybrid dynamical systems in a broader range of fields.
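The computational argument can be made concrete on a toy discrete system: forward sensitivity needs one tangent sweep per parameter, while a single backward adjoint sweep yields all parameter sensitivities at once. The sketch below uses an invented linear recursion, not a power system model.

```python
# Forward vs. discrete adjoint sensitivities for x_{k+1} = A(p) x_k, J = c^T x_N.
import numpy as np

N = 50
c = np.array([1.0, 0.0])

def A(p):                      # two parameters enter the dynamics
    return np.array([[0.95, p[0]],
                     [-p[1], 0.90]])

def dA(i):                     # dA/dp_i (constant matrices here)
    M = np.zeros((2, 2))
    if i == 0:
        M[0, 1] = 1.0
    else:
        M[1, 0] = -1.0
    return M

p = np.array([0.02, 0.03])
xs = [np.array([1.0, 0.0])]
for _ in range(N):             # forward simulation; trajectory stored for the adjoint
    xs.append(A(p) @ xs[-1])

# Forward sensitivity: one tangent recursion per parameter.
grad_fwd = np.zeros(2)
for i in range(2):
    s = np.zeros(2)
    for k in range(N):
        s = A(p) @ s + dA(i) @ xs[k]
    grad_fwd[i] = c @ s

# Adjoint: a single backward costate sweep covers all parameters.
lam = c.copy()                 # lambda_N = c
grad_adj = np.zeros(2)
for k in range(N - 1, -1, -1):
    grad_adj += np.array([lam @ (dA(i) @ xs[k]) for i in range(2)])
    lam = A(p).T @ lam         # lambda_k = A^T lambda_{k+1}

print(grad_fwd, grad_adj)      # the two gradients agree to machine precision
```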
Contemporary management of frontal sinus mucoceles: a meta-analysis.
Courson, Andy M; Stankiewicz, James A; Lal, Devyani
2014-02-01
To analyze trends in the surgical management of frontal and fronto-ethmoid mucoceles through meta-analysis. Meta-analysis and case series. A systematic literature review on the surgical management of frontal and fronto-ethmoid mucoceles was conducted. Studies were divided into historical (1975-2001) and contemporary (2002-2012) groups, and a meta-analysis of these studies was performed. The historical and contemporary cohorts were compared on surgical approach, recurrence, and complications. To study the evolution in surgical management, a senior surgeon's experience over 28 years was analyzed separately. Thirty-one studies were included for meta-analysis. The historical cohort included 425 mucoceles from 11 studies; the contemporary cohort included 542 mucoceles from 20 studies. More endoscopic techniques were used in the contemporary than in the historical cohort (53.9% vs. 24.7%; P < 0.001). In the authors' series, a higher percentage was treated endoscopically (82.8% of 122 mucoceles). Recurrence (P = 0.20) and major complication (P = 0.23) rates were similar between cohorts. Minor complication rates were superior for endoscopic techniques in both cohorts (P = 0.02 historical; P < 0.001 contemporary). In the historical cohort, higher recurrence was noted in the external group (P = 0.03). Results from endoscopic and open approaches are comparable. Although endoscopic techniques are being increasingly adopted, comparison with our series shows that more cases could potentially be treated endoscopically. Frequent use of open approaches may reflect efficacy, or perhaps lack of the expertise and equipment required for endoscopic management. Most contemporary authors favor endoscopic management, limiting open approaches to specific indications (unfavorable anatomy, lateral disease, and scarring). Copyright © 2013 The American Laryngological, Rhinological and Otological Society, Inc.
Pantex Falling Man - Independent Review Panel Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertolini, Louis; Brannon, Nathan; Olson, Jared
2014-11-01
Consolidated Nuclear Security (CNS) Pantex took the initiative to organize a Review Panel of subject matter experts to independently assess the adequacy of the Pantex Tripping Man Analysis methodology. The purpose of this report is to capture the details of the assessment including the scope, approach, results, and detailed Appendices. Along with the assessment of the analysis methodology, the panel evaluated the adequacy with which the methodology was applied as well as congruence with Department of Energy (DOE) standards 3009 and 3016. The approach included the review of relevant documentation, interactive discussion with Pantex staff, and the iterative process of evaluating critical lines of inquiry.
Du, Yiyang; He, Bosai; Li, Qing; He, Jiao; Wang, Di; Bi, Kaishun
2017-07-01
Suan-Zao-Ren granule is widely used to treat insomnia in China. However, because of the complexity and diversity of the chemical constituents in a traditional Chinese medicine formula, comprehensive analysis of the constituents in vitro and in vivo is difficult. In our study, a method based on ultra-high-performance liquid chromatography with quadrupole time-of-flight mass spectrometry and the PeakView® software, which uses multiple data processing approaches including the product ion filter, neutral loss filter, and mass defect filter, was developed to characterize the ingredients of Suan-Zao-Ren granule and its metabolites in rat serum. A total of 101 constituents were detected in vitro. Under the same analysis conditions, 68 constituents were characterized in rat serum, including 35 prototype components and 33 metabolites. The metabolic pathways of the main components were also illustrated; among them, the metabolic pathways of timosaponin AI are revealed here for the first time. The bioactive compounds mainly underwent phase I metabolic pathways including hydroxylation, oxidation, and hydrolysis, and phase II metabolic pathways including sulfate conjugation, glucuronide conjugation, cysteine conjugation, acetylcysteine conjugation, and glutathione conjugation. In conclusion, our results show that this analysis approach is extremely useful for in-depth pharmacological research on Suan-Zao-Ren granule and provides a chemical basis for its rational use. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Willke, Richard J; Zheng, Zhiyuan; Subedi, Prasun; Althin, Rikard; Mullins, C Daniel
2012-12-13
Implicit in the growing interest in patient-centered outcomes research is a growing need for better evidence regarding how responses to a given intervention or treatment may vary across patients, referred to as heterogeneity of treatment effect (HTE). A variety of methods are available for exploring HTE, each associated with unique strengths and limitations. This paper reviews a selected set of methodological approaches to understanding HTE, focusing largely but not exclusively on their uses with randomized trial data. It is oriented toward the "intermediate" outcomes researcher, who may already be familiar with some methods but would value a systematic overview of both more and less familiar methods, with attention to when and why they may be used. Drawing from the biomedical, statistical, epidemiological and econometrics literature, we describe the steps involved in choosing an HTE approach, focusing on whether the intent of the analysis is exploratory, initial testing, or confirmatory testing. We also map HTE methodological approaches to data considerations as well as the strengths and limitations of each approach. Methods reviewed include formal subgroup analysis, meta-analysis and meta-regression, various types of predictive risk modeling including classification and regression tree analysis, series of n-of-1 trials, latent growth and growth mixture models, quantile regression, and selected non-parametric methods. In addition to an overview of each HTE method, examples and references are provided for further reading. By guiding the selection of methods and analysis, this review is meant to better enable outcomes researchers to understand and explore aspects of HTE in the context of patient-centered outcomes research.
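As one concrete instance of the reviewed methods, formal subgroup analysis is often implemented as a treatment-by-covariate interaction test. A minimal sketch on simulated trial data follows; all effect sizes and variable names are invented, and the statsmodels formula API is assumed to be available.

```python
# Formal subgroup analysis sketch: test a treatment-by-subgroup interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
treat = rng.integers(0, 2, n)
subgrp = rng.integers(0, 2, n)     # e.g., a biomarker above/below a cutoff
# Simulated HTE: treatment effect 1.0 in one subgroup, 2.5 in the other.
y = 1.0 * treat + 1.5 * treat * subgrp + rng.normal(size=n)

df = pd.DataFrame({"y": y, "treat": treat, "subgrp": subgrp})
fit = smf.ols("y ~ treat * subgrp", data=df).fit()
print(fit.params)                          # treat:subgrp estimates the HTE
print(fit.pvalues["treat:subgrp"])         # interaction test
```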
Rosenow, Felix; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Bauer, Sebastian
2017-11-01
Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. Part I includes the clinical phenotyping and diagnostic methods, EEG network-analysis, biomarkers, and personalized treatment approaches. In Part II, experimental and translational approaches will be discussed (Bauer et al., 2017) [1]. Copyright © 2017 Elsevier Inc. All rights reserved.
Gower, Amy L; Rider, G Nicole; Coleman, Eli; Brown, Camille; McMorris, Barbara J; Eisenberg, Marla E
2018-06-19
As measures of birth-assigned sex, gender identity, and perceived gender presentation are increasingly included in large-scale research studies, data analysis approaches incorporating such measures are needed. Large samples capable of demonstrating variation within the transgender and gender diverse (TGD) community can inform intervention efforts to improve health equity. A population-based sample of TGD youth was used to examine associations between perceived gender presentation, bullying victimization, and emotional distress using two data analysis approaches. Secondary data analysis of the Minnesota Student Survey included 2168 9th and 11th graders who identified as "transgender, genderqueer, genderfluid, or unsure about their gender identity." Youth reported their biological sex, how others perceived their gender presentation, experiences of four forms of bullying victimization, and four measures of emotional distress. Logistic regression and multifactor analysis of variance (ANOVA) were used to compare and contrast two analysis approaches. Logistic regressions indicated that TGD youth perceived as more gender incongruent had higher odds of bullying victimization and emotional distress relative to those perceived as very congruent with their biological sex. Multifactor ANOVAs demonstrated more variable patterns and allowed for comparisons of each perceived presentation group with all other groups, reflecting nuances that exist within TGD youth. Researchers should adopt data analysis strategies that allow for comparisons of all perceived gender presentation categories rather than assigning a reference group. Those working with TGD youth should be particularly attuned to youth perceived as gender incongruent as they may be more likely to experience bullying victimization and emotional distress.
Planning for population viability on Northern Great Plains national grasslands
Samson, F.B.; Knopf, F.L.; McCarthy, C.W.; Noon, B.R.; Ostlie, W.R.; Rinehart, S.M.; Larson, S.; Plumb, G.E.; Schenbeck, G.L.; Svingen, D.N.; Byer, T.W.
2003-01-01
Broad-scale information in concert with conservation of individual species must be used to develop conservation priorities and a more integrated ecosystem protection strategy. In 1999 the United States Forest Service initiated an approach for the 1.2 × 10^6 ha of national grasslands in the Northern Great Plains to fulfill the requirement to maintain viable populations of all native and desirable introduced vertebrate and plant species. The challenge was threefold: 1) develop basic building blocks in the conservation planning approach, 2) apply the approach to national grasslands, and 3) overcome differences that may exist in agency-specific legal and policy requirements. Key assessment components in the approach included a bioregional assessment, coarse-filter analysis, and fine-filter analysis aimed at species considered at-risk. A science team of agency, conservation organization, and university personnel was established to develop the guidelines and standards and other formal procedures for implementation of conservation strategies. Conservation strategies included coarse-filter recommendations to restore the tallgrass, mixed, and shortgrass prairies to conditions that approximate historical ecological processes and landscape patterns, and fine-filter recommendations to address viability needs of individual and multiple species of native animals and plants. Results include a cost-effective approach to conservation planning and recommendations for addressing population viability and biodiversity concerns on national grasslands in the Northern Great Plains.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groen, E.A., E-mail: Evelyne.Groen@gmail.com; Heijungs, R.; Leiden University, Einsteinweg 2, Leiden 2333 CC
Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict whether including correlations among input parameters in uncertainty propagation will increase or decrease output variance; and (2) we can quantify the risk that ignoring correlations poses to the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. Highlights: Ignoring correlation leads to under- or overestimation of the output variance. We demonstrated that the risk of ignoring correlation can be quantified. The procedure proposed is generally applicable in life cycle assessment. In some cases, ignoring correlation has a minimal effect on decision-making tools.
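The two propagation strategies can be sketched for a toy model y = a*b (say, an activity level times an emission factor) with correlated inputs; the numbers below are invented and the model is far simpler than a real LCA system.

```python
# Analytical (first-order Taylor) vs. sampling propagation of correlated inputs.
import numpy as np

mu = np.array([10.0, 2.0])            # means of a and b
sd = np.array([1.0, 0.4])
rho = 0.7                             # correlation between a and b
cov = np.array([[sd[0]**2, rho*sd[0]*sd[1]],
                [rho*sd[0]*sd[1], sd[1]**2]])

# Analytical: Var(y) ~ g^T Sigma g with gradient g = (b, a) at the means.
g = np.array([mu[1], mu[0]])
var_analytical = g @ cov @ g

# Sampling: Monte Carlo draws from the correlated multivariate normal.
rng = np.random.default_rng(0)
ab = rng.multivariate_normal(mu, cov, size=200_000)
var_mc = (ab[:, 0] * ab[:, 1]).var()

print(var_analytical, var_mc)   # close; the gap is the higher-order term
# Re-running with rho = 0 shows how ignoring correlation shifts the variance.
```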
Analyzing Human-Landscape Interactions: Tools That Integrate
NASA Astrophysics Data System (ADS)
Zvoleff, Alex; An, Li
2014-01-01
Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature—in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches of data fusion and integration across sites or disciplines pose an important challenge for future work in integrating human and landscape components.
Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A
2014-07-01
A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed through the use of comparative statistics or multivariate statistical analysis-based approaches; however, each falls short in representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows for the unsupervised clustering, and subsequent prioritization of, correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate the application of this method to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat sera profiles.
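The underlying algorithm is a standard self-organizing map. The minimal sketch below trains a small SOM on random data so that the trained weight planes could be rendered as heat maps; it is a didactic stand-in, not the authors' implementation.

```python
# Minimal self-organizing map: cluster feature profiles onto a 2-D grid.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(500, 6))          # e.g., 500 features x 6 conditions
grid_h, grid_w = 10, 10
W = rng.normal(size=(grid_h, grid_w, data.shape[1]))   # codebook vectors

def train_som(W, data, epochs=20, lr0=0.5, sigma0=3.0):
    gy, gx = np.mgrid[0:grid_h, 0:grid_w]
    n_steps = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * (1 - t / n_steps)                 # decaying learning rate
            sigma = max(sigma0 * (1 - t / n_steps), 0.5) # shrinking neighborhood
            d = np.linalg.norm(W - x, axis=2)            # best-matching unit
            by, bx = np.unravel_index(d.argmin(), d.shape)
            # Gaussian neighborhood pulls nearby units toward the sample.
            h = np.exp(-((gy - by)**2 + (gx - bx)**2) / (2 * sigma**2))
            W += lr * h[..., None] * (x - W)
            t += 1
    return W

W = train_som(W, data)
# Each weight plane W[:, :, j] can be rendered as a heat map for condition j.
print(W.shape)
```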
Comparison of Human Exploration Architecture and Campaign Approaches
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Mattfeld, Bryan; Stromgren, Chel; Shyface, Hilary
2015-01-01
As part of an overall focus on space exploration, National Aeronautics and Space Administration (NASA) continues to evaluate potential approaches for sending humans beyond low Earth orbit (LEO). In addition, various external organizations are studying options for beyond LEO exploration. Recent studies include NASA's Evolvable Mars Campaign and Design Reference Architecture (DRA) 5.0, JPL's Minimal Mars Architecture; the Inspiration Mars mission; the Mars One campaign; and the Global Exploration Roadmap (GER). Each of these potential exploration constructs applies unique methods, architectures, and philosophies for human exploration. It is beneficial to compare potential approaches in order to better understand the range of options available for exploration. Since most of these studies were conducted independently, the approaches, ground rules, and assumptions used to conduct the analysis differ. In addition, the outputs and metrics presented for each construct differ substantially. This paper will describe the results of an effort to compare and contrast the results of these different studies under a common set of metrics. The paper will first present a summary of each of the proposed constructs, including a description of the overall approach and philosophy for exploration. Utilizing a common set of metrics for comparison, the paper will present the results of an evaluation of the potential benefits, critical challenges, and uncertainties associated with each construct. The analysis framework will include a detailed evaluation of key characteristics of each construct. These will include but are not limited to: a description of the technology and capability developments required to enable the construct and the uncertainties associated with these developments; an analysis of significant operational and programmatic risks associated with that construct; and an evaluation of the extent to which exploration is enabled by the construct, including the destinations visited and the exploration capabilities provided at those destinations. Based upon the comparison of constructs, the paper will identify trends and lessons learned across all of the candidate studies.
A New Methodology for Systematic Exploitation of Technology Databases.
ERIC Educational Resources Information Center
Bedecarrax, Chantal; Huot, Charles
1994-01-01
Presents the theoretical aspects of a data analysis methodology that can help transform sequential raw data from a database into useful information, using the statistical analysis of patents as an example. Topics discussed include relational analysis and a technology watch approach. (Contains 17 references.) (LRW)
Bai, L; Hou, Y-L; Lin, G-H; Zhang, X; Liu, G-Q; Yu, B
2018-04-01
Our aim was to compare the effect of sinus tarsi approach (STA) vs extensile lateral approach (ELA) for treatment of closed displaced intra-articular calcaneal fractures (DIACF) is still being debated. A thorough research was carried out in the MEDLINE, EMBASE and Cochrane library databases from inception to December 2016. Only prospective or retrospective comparative studies were selected in this meta-analysis. Two independent reviewers conducted literature search, data extraction and quality assessment. The primary outcomes were anatomical restoration and prevalence of complications. Secondary outcomes included operation time and functional recovery. Four randomized controlled trials involving 326 patients and three cohort studies involving 206 patients were included. STA technique for DIACFs led to a decline in both operation time and incidence of complications. There were no significant differences between the groups in American Orthopedic Foot and Ankle Society scores, nor changes in Böhler angle. This meta-analysis suggests that STA technique may reduce the operation time and incidence of complications. In conclusion, STA technique is reasonably an optimal choice for DIACF. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
2014-01-01
Background: Recent innovations in sequencing technologies have provided researchers with the ability to rapidly characterize the microbial content of an environmental or clinical sample with unprecedented resolution. These approaches are producing a wealth of information that is providing novel insights into the microbial ecology of the environment and human health. However, these sequencing-based approaches produce large and complex datasets that require efficient and sensitive computational analysis workflows. Many tools for analyzing metagenomic sequencing data have emerged recently; however, these approaches often suffer from issues of specificity and efficiency, and typically do not include a complete metagenomic analysis framework. Results: We present PathoScope 2.0, a complete bioinformatics framework for rapidly and accurately quantifying the proportions of reads from individual microbial strains present in metagenomic sequencing data from environmental or clinical samples. The pipeline performs all necessary computational analysis steps, including reference genome library extraction and indexing, read quality control and alignment, strain identification, and summarization and annotation of results. We rigorously evaluated PathoScope 2.0 using simulated data and data from the 2011 outbreak of Shiga-toxigenic Escherichia coli O104:H4. Conclusions: The results show that PathoScope 2.0 is a complete, highly sensitive, and efficient approach for metagenomic analysis that outperforms alternative approaches in scope, speed, and accuracy. The PathoScope 2.0 pipeline software is freely available for download at: http://sourceforge.net/projects/pathoscope/. PMID:25225611
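At the heart of such strain quantification is typically an EM-style reassignment of ambiguously mapped reads. The sketch below illustrates that general idea on an invented likelihood matrix; it should not be read as PathoScope's actual model or code.

```python
# Didactic EM: reassign ambiguous reads to genomes while re-estimating
# genome proportions (generic mixture model, not PathoScope itself).
import numpy as np

# like[r, g]: alignment likelihood of read r against genome g (0 = no hit).
like = np.array([
    [0.9, 0.8, 0.0],
    [0.9, 0.0, 0.0],
    [0.0, 0.7, 0.6],
    [0.8, 0.7, 0.0],
    [0.0, 0.0, 0.9],
])

pi = np.full(like.shape[1], 1.0 / like.shape[1])   # initial proportions
for _ in range(100):
    resp = like * pi                               # E-step: read posteriors
    resp /= resp.sum(axis=1, keepdims=True)
    pi_new = resp.mean(axis=0)                     # M-step: update proportions
    if np.abs(pi_new - pi).max() < 1e-9:
        break
    pi = pi_new

print("estimated proportions:", pi.round(3))
```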
Good Laboratory Practices of Materials Testing at NASA White Sands Test Facility
NASA Technical Reports Server (NTRS)
Hirsch, David; Williams, James H.
2005-01-01
An approach to good laboratory practices of materials testing at NASA White Sands Test Facility is presented. The contents include: 1) Current approach; 2) Data analysis; and 3) Improvements sought by WSTF to enhance the diagnostic capability of existing methods.
Local algebraic analysis of differential systems
NASA Astrophysics Data System (ADS)
Kaptsov, O. V.
2015-06-01
We propose a new approach for studying the compatibility of partial differential equations. This approach is a synthesis of the Riquier method, Gröbner basis theory, and elements of algebraic geometry. As applications, we consider systems including the wave equation and the sine-Gordon equation.
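The Gröbner-basis ingredient is readily demonstrated with SymPy; the polynomial system below is an arbitrary example, not one from the paper.

```python
# Groebner basis computation with SymPy: the lex-order basis triangularizes
# the system, much as completion procedures bring PDE systems to passive form.
from sympy import symbols, groebner

x, y = symbols("x y")
G = groebner([x**2 + y**2 - 1, x*y - 1], x, y, order="lex")
for g in G.exprs:
    print(g)
```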
ERIC Educational Resources Information Center
Menashy, Francine
2013-01-01
This study provides a discursive analysis of World Bank policy documents in order to reveal the stark omission of a rights-based approach to education, while highlighting instead the support of an economic-instrumentalist approach. Plausible explanations are provided to shed light on this exclusion, including the feasibility critique of education…
Two-tiered design analysis of a radiator for a solar dynamic powered Stirling engine
NASA Technical Reports Server (NTRS)
Hainley, Donald C.
1989-01-01
Two separate design approaches for a pumped loop radiator used to transfer heat from the cold end of a solar dynamic powered Stirling engine are described. The first approach uses a standard method to determine radiator requirements to meet specified end of mission conditions. Trade-off studies conducted for the analysis are included. Justification of this concept within the specified parameters of the analysis is provided. The second design approach determines the life performance of the radiator/Stirling system. In this approach, the system performance was altered by reducing the radiator heat transfer area. Performance effects and equilibrium points were determined as radiator segments were removed. This simulates the effect of loss of radiator sections due to micro-meteoroid and space debris penetration. The two designs were compared on the basis of overall system requirements and goals.
Improving Public Perception of Behavior Analysis.
Freedman, David H
2016-05-01
The potential impact of behavior analysis is limited by the public's dim awareness of the field. The mass media rarely cover behavior analysis, other than to echo inaccurate negative stereotypes about control and punishment. The media instead play up appealing but less-evidence-based approaches to problems, a key example being the touting of dubious diets over behavioral approaches to losing excess weight. These sorts of claims distort or skirt scientific evidence, undercutting the fidelity of behavior analysis to scientific rigor. Strategies for better connecting behavior analysis with the public might include reframing the field's techniques and principles in friendlier, more resonant form; pushing direct outcome comparisons between behavior analysis and its rivals in simple terms; and playing up the "warm and fuzzy" side of behavior analysis.
Methods for Mediation Analysis with Missing Data
ERIC Educational Resources Information Center
Zhang, Zhiyong; Wang, Lijuan
2013-01-01
Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis, including listwise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…
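Of the four approaches, listwise deletion is the simplest to illustrate: drop incomplete rows, then estimate the indirect effect a*b from the two regressions M ~ X and Y ~ X + M. A simulated sketch follows (MI or the two-stage approach would replace the deletion step); all coefficients are invented.

```python
# Mediation (indirect effect a*b) under listwise deletion, on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(size=n)             # true a = 0.5
Y = 0.4 * M + 0.2 * X + rng.normal(size=n)   # true b = 0.4, indirect = 0.2

M[rng.random(n) < 0.2] = np.nan              # 20% of mediator values missing

keep = ~np.isnan(M)                          # listwise deletion
Xc, Mc, Yc = X[keep], M[keep], Y[keep]
ones = np.ones(keep.sum())

a = np.linalg.lstsq(np.column_stack([ones, Xc]), Mc, rcond=None)[0][1]
b = np.linalg.lstsq(np.column_stack([ones, Xc, Mc]), Yc, rcond=None)[0][2]
print("indirect effect a*b =", round(a * b, 3))  # near 0.2 when MCAR
```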
NASA Astrophysics Data System (ADS)
Tweed, Fiona S.
2017-08-01
This special edition of Zeitschrift für Geomorphologie (ZfG) is based on presentations given at a conference entitled 'Hydrological Extreme Events in Historic and Prehistoric Times' which took place in Bonn in June 2014. The volume consists of an editorial introduction and nine research papers reflecting a range of approaches to understanding past events, including modelling, analysis of historical data and studies that focus on a consistent approach to collection and analysis of data from different areas. The HEX project, which generated the conference in Bonn, adopted a multidisciplinary approach and this is reflected in the collection of papers, which emphasise the importance of combining a range of approaches and analyses as tools for decoding both landscapes and processes.
A non-iterative extension of the multivariate random effects meta-analysis.
Makambi, Kepher H; Seung, Hyunuk
2015-01-01
Multivariate methods in meta-analysis are becoming popular and more accepted in biomedical research despite computational issues in some of the techniques. A number of approaches, both iterative and non-iterative, have been proposed, including the multivariate DerSimonian and Laird method by Jackson et al. (2010), which is non-iterative. In this study, we propose an extension of the method by Hartung and Makambi (2002) and Makambi (2001) to multivariate situations. A comparison of the bias and mean square error from a simulation study indicates that, in some circumstances, the proposed approach performs better than the multivariate DerSimonian-Laird approach. An example is presented to demonstrate the application of the proposed approach.
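For orientation, the univariate DerSimonian-Laird estimator that these multivariate methods generalize is itself non-iterative; a compact sketch with invented study effects and variances:

```python
# Univariate DerSimonian-Laird random-effects meta-analysis (non-iterative).
import numpy as np

y = np.array([0.30, 0.10, 0.45, 0.25, 0.05])   # study effect estimates
v = np.array([0.02, 0.03, 0.05, 0.01, 0.04])   # within-study variances

w = 1.0 / v
y_fixed = (w * y).sum() / w.sum()
Q = (w * (y - y_fixed) ** 2).sum()             # Cochran's Q
k = len(y)
tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - (w**2).sum() / w.sum()))

w_star = 1.0 / (v + tau2)                      # random-effects weights
y_re = (w_star * y).sum() / w_star.sum()
se_re = np.sqrt(1.0 / w_star.sum())
print(f"tau^2 = {tau2:.4f}, pooled effect = {y_re:.3f} (SE {se_re:.3f})")
```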
Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana
2016-01-01
Objectives: There is no consensus on whether studies with no observed events in the treatment and control arms, the so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handle them differently depending on the choice of effect measures and authors' discretion. Our objective is to evaluate the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analysis of RCTs with rare outcome events through a simulation study. Method: We simulated 2500 data sets for different scenarios, varying the parameters of baseline event rate, treatment effect, number of patients in each trial, and between-study variance. We evaluated the performance of pooling methods commonly used in classical meta-analysis, namely Peto, Mantel-Haenszel with fixed-effects and random-effects models, and the inverse variance method with fixed-effects and random-effects models, using bias, root mean square error, length of 95% CI and coverage. Results: The overall performance of the approaches of including or excluding BA0E studies in meta-analysis varied according to the magnitude of the true treatment effect. Including BA0E studies introduced very little bias, decreased mean square error, narrowed the 95% CI and increased the coverage when no true treatment effect existed. However, when a true treatment effect existed, the estimates from the approach of excluding BA0E studies showed smaller bias than those from including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions: We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect. Providing results both including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment effects are unclear. PMID:27531725
Building a Practical Natural Laminar Flow Design Capability
NASA Technical Reports Server (NTRS)
Campbell, Richard L.; Lynde, Michelle N.
2017-01-01
A preliminary natural laminar flow (NLF) design method that has been developed and applied to supersonic and transonic wings with moderate-to-high leading-edge sweeps at flight Reynolds numbers is further extended and evaluated in this paper. The modular design approach uses a knowledge-based design module linked with different flow solvers and boundary layer stability analysis methods to provide a multifidelity capability for NLF analysis and design. An assessment of the effects of different options for stability analysis is included using pressures and geometry from an NLF wing designed for the Common Research Model (CRM). Several extensions to the design module are described, including multiple new approaches to design for controlling attachment line contamination and transition. Finally, a modification to the NLF design algorithm that allows independent control of Tollmien-Schlichting (TS) and cross flow (CF) modes is proposed. A preliminary evaluation of the TS-only option applied to the design of an NLF nacelle for the CRM is performed that includes the use of a low-fidelity stability analysis directly in the design module.
Dziak, John J.; Bray, Bethany C.; Zhang, Jieting; Zhang, Minqiang; Lanza, Stephanie T.
2016-01-01
Several approaches are available for estimating the relationship of latent class membership to distal outcomes in latent profile analysis (LPA). A three-step approach is commonly used, but has problems with estimation bias and confidence interval coverage. Proposed improvements include the correction method of Bolck, Croon, and Hagenaars (BCH; 2004), Vermunt’s (2010) maximum likelihood (ML) approach, and the inclusive three-step approach of Bray, Lanza, & Tan (2015). These methods have been studied in the related case of latent class analysis (LCA) with categorical indicators, but not as well studied for LPA with continuous indicators. We investigated the performance of these approaches in LPA with normally distributed indicators, under different conditions of distal outcome distribution, class measurement quality, relative latent class size, and strength of association between latent class and the distal outcome. The modified BCH implemented in Latent GOLD had excellent performance. The maximum likelihood and inclusive approaches were not robust to violations of distributional assumptions. These findings broadly agree with and extend the results presented by Bakk and Vermunt (2016) in the context of LCA with categorical indicators. PMID:28630602
Criteria for Developing a Successful Privatization Project
1989-05-01
conceptualization and planning are required when pursuing privatization projects. In fact, privatization project proponents need to know how to...selection of projects for analysis, methods of acquiring information about these projects, and the analysis framework. Chapter IV includes the analysis. A...performed an analysis to determine common conceptual and creative approaches and lessons learned. This analysis was then used to develop criteria for
Profiling a Periodicals Collection
ERIC Educational Resources Information Center
Bolgiano, Christina E.; King, Mary Kathryn
1978-01-01
Libraries need solid information upon which to base collection development decisions. Specific evaluative methods for determining scope, access, and usefulness are described. Approaches used for data collection include analysis of interlibrary loan requests, comparison with major bibliographies, and analysis of accessibility through available…
Hyperspectral data analysis procedures with reduced sensitivity to noise
NASA Technical Reports Server (NTRS)
Landgrebe, David A.
1993-01-01
Multispectral sensor systems have steadily improved over the years in their ability to deliver increased spectral detail. With the advent of hyperspectral sensors, including imaging spectrometers, this technology is in the process of taking a large leap forward, offering the possibility of delivering much more detailed information. However, this direction of development has drawn even more attention to the matter of noise and other deleterious effects in the data, because reducing the fundamental limitations that spectral detail places on information collection raises the limitations presented by noise to even greater importance. Much current effort in remote sensing research is thus devoted to adjusting the data to mitigate the effects of noise and other deleterious effects. A parallel approach to the problem is to look for analysis approaches and procedures that have reduced sensitivity to such effects. We discuss some of the fundamental principles which define analysis algorithm characteristics providing such reduced sensitivity. One such analysis procedure is described, including an example analysis of a data set that illustrates this effect.
Integrative prescreening in analysis of multiple cancer genomic studies
2012-01-01
Background In high throughput cancer genomic studies, results from the analysis of single datasets often suffer from a lack of reproducibility because of small sample sizes. Integrative analysis can effectively pool and analyze multiple datasets and provides a cost-effective way to improve reproducibility. In integrative analysis, simultaneously analyzing all genes profiled may incur high computational cost. A computationally affordable remedy is prescreening, which fits marginal models, can be conducted in a parallel manner, and has low computational cost. Results An integrative prescreening approach is developed for the analysis of multiple cancer genomic datasets. Simulation shows that the proposed integrative prescreening has better performance than alternatives, including prescreening with individual datasets, an intensity approach, and meta-analysis. We also analyze multiple microarray gene profiling studies on liver and pancreatic cancers using the proposed approach. Conclusions The proposed integrative prescreening provides an effective way to reduce the dimensionality in cancer genomic studies. It can be coupled with existing analysis methods to identify cancer markers. PMID:22799431
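As a concrete illustration of the prescreening idea, here is a toy Python sketch that pools a marginal association statistic across datasets. The marginal models used in the paper are not specified here, so squared marginal correlations stand in for them; all data and names are hypothetical.

```python
import numpy as np

def integrative_prescreen(datasets, top_k=100):
    """Rank genes by pooled marginal association across datasets.

    datasets : list of (X, y) pairs; X is a samples x genes expression
    matrix, y an outcome vector. Genes are assumed matched across
    datasets. A gene's score pools its squared marginal correlation
    from every dataset, so consistent signal is rewarded.
    """
    n_genes = datasets[0][0].shape[1]
    score = np.zeros(n_genes)
    for X, y in datasets:
        Xs = (X - X.mean(0)) / X.std(0)
        ys = (y - y.mean()) / y.std()
        r = Xs.T @ ys / len(ys)      # marginal correlation per gene
        score += r ** 2              # pool evidence across datasets
    return np.argsort(score)[::-1][:top_k]

rng = np.random.default_rng(0)
data = [(rng.normal(size=(40, 500)), rng.normal(size=40)) for _ in range(3)]
print(integrative_prescreen(data, top_k=5))
```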
Treatment of Selective Mutism: A Best-Evidence Synthesis.
ERIC Educational Resources Information Center
Stone, Beth Pionek; Kratochwill, Thomas R.; Sladezcek, Ingrid; Serlin, Ronald C.
2002-01-01
Presents systematic analysis of the major treatment approaches used for selective mutism. Based on nonparametric statistical tests of effect sizes, major findings include the following: treatment of selective mutism is more effective than no treatment; behaviorally oriented treatment approaches are more effective than no treatment; and no…
A Systems Analysis Role Play Case: We Sell Stuff, Inc.
ERIC Educational Resources Information Center
Mitri, Michel; Cole, Carey
2007-01-01
Most systems development projects incorporate some sort of life cycle approach in their development. Whether the development methodology involves a traditional life cycle, prototyping, rapid application development, or some other approach, the first step usually involves a system investigation, which includes problem identification, feasibility…
Lee, Jihoon; Fredriksson, David W.; DeCew, Judson; Drach, Andrew; Yim, Solomon C.
2018-01-01
This study provides an engineering approach for designing an aquaculture cage system for use in constructed channel flow environments. As sustainable aquaculture has grown globally, many novel techniques have been introduced, such as those implemented in the global Atlantic salmon industry. The advent of several highly sophisticated analysis software systems enables the development of such novel engineering techniques. These software systems commonly include three-dimensional (3D) drafting, computational fluid dynamics, and finite element analysis. In this study, a combination of these analysis tools is applied to evaluate a conceptual aquaculture system for potential deployment in a power plant effluent channel. The channel is presumed clean; however, it features elevated water temperatures and strong currents. The first portion of the analysis includes the design of a fish cage system with specific net solidities using 3D drafting techniques. Computational fluid dynamics is then applied to evaluate the flow reduction through the system from the previously generated solid models. Implementing the same solid models, a finite element analysis is performed on the critical components to assess the material stresses produced by the drag force loads that are calculated from the fluid velocities. PMID:29897954
NASA Technical Reports Server (NTRS)
Leonard, J. I.; White, R. J.; Rummel, J. A.
1980-01-01
An approach was developed to aid in the integration of many of the biomedical findings of space flight, using systems analysis. The mathematical tools used in accomplishing this task include an automated data base, a biostatistical and data analysis system, and a wide variety of mathematical simulation models of physiological systems. A keystone of this effort was the evaluation of physiological hypotheses using the simulation models and the prediction of the consequences of these hypotheses on many physiological quantities, some of which were not amenable to direct measurement. This approach led to improvements in the model, refinements of the hypotheses, a tentative integrated hypothesis for adaptation to weightlessness, and specific recommendations for new flight experiments.
Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D.
2009-01-01
Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings. PMID:19834575
A Passive System Reliability Analysis for a Station Blackout
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, Acacia; Bucknor, Matthew; Grabaskas, David
2015-05-03
The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and availability of power, the phenomenological foundations on which these systems are built require a novel approach to a reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties and analysis of results.
Hardware Acceleration for Cyber Security
2010-11-01
perform different approaches. It includes behavioral analysis, by means of NetFlow monitoring, as well as packet content analysis, so-called Deep...Interface (API). An example of such an application is the NetFlow exporter described in [5]. • We provide a modified libpcap library using the libsze2 API. This...cards. The software applications using NIFIC include the FlowMon NetFlow/IPFIX generator, the Wireshark packet analyzer, iptables - the Linux kernel firewall, deep
Engineering design, stress and thermal analysis, and documentation for SATS program
NASA Technical Reports Server (NTRS)
1973-01-01
An in-depth analysis and mechanical design of the solar array stowage and deployment arrangements for use in Small Applications Technology Satellite spacecraft is presented. Alternate approaches for the major elements of work are developed and evaluated. Elements include array stowage and deployment arrangements, the spacecraft and array behavior in the spacecraft despin mode, and the design of the main hinge and segment hinge assemblies. Feasibility calculations are performed and the preferred approach is identified.
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann
1988-01-01
Several Laboratory software development projects that followed nonstandard development processes, which were hybrids of incremental development and prototyping, are being studied. Factors in the project environment leading to the decision to use a nonstandard development process and affecting its success are analyzed. A simple characterization of project environment based on this analysis is proposed, together with software development approaches which have been found effective for each category. These approaches include both documentation and review requirements.
Howes, Andrew; Lewis, Richard L; Vera, Alonso
2009-10-01
The authors assume that individuals adapt rationally to a utility function given constraints imposed by their cognitive architecture and the local task environment. This assumption underlies a new approach to modeling and understanding cognition, cognitively bounded rational analysis, that sharpens the predictive acuity of general, integrated theories of cognition and action. Such theories provide the necessary computational means to explain the flexible nature of human behavior but in doing so introduce extreme degrees of freedom in accounting for data. The new approach narrows the space of predicted behaviors through analysis of the payoff achieved by alternative strategies, rather than through fitting strategies and theoretical parameters to data. It extends and complements established approaches, including computational cognitive architectures, rational analysis, optimal motor control, bounded rationality, and signal detection theory. The authors illustrate the approach with a reanalysis of an existing account of psychological refractory period (PRP) dual-task performance and the development and analysis of a new theory of ordered dual-task responses. These analyses yield several novel results, including a new understanding of the role of strategic variation in existing accounts of PRP and the first predictive, quantitative account showing how the details of ordered dual-task phenomena emerge from the rational control of a cognitive system subject to the combined constraints of internal variance, motor interference, and a response selection bottleneck.
Utilization of the Building-Block Approach in Structural Mechanics Research
NASA Technical Reports Server (NTRS)
Rouse, Marshall; Jegley, Dawn C.; McGowan, David M.; Bush, Harold G.; Waters, W. Allen
2005-01-01
In the last 20 years NASA has worked in collaboration with industry to develop enabling technologies needed to make aircraft safer and more affordable, extend their lifetime, improve their reliability, better understand their behavior, and reduce their weight. To support these efforts, research programs starting with ideas and culminating in full-scale structural testing were conducted at the NASA Langley Research Center. Each program contained development efforts that (a) started with selecting the material system and manufacturing approach; (b) moved on to experimentation and analysis of small samples to characterize the system and quantify behavior in the presence of defects like damage and imperfections; (c) progressed on to examining larger structures to examine buckling behavior, combined loadings, and built-up structures; and (d) finally moved to complicated subcomponents and full-scale components. Each step along the way was supported by detailed analysis, including tool development, to prove that the behavior of these structures was well-understood and predictable. This approach for developing technology became known as the "building-block" approach. In the Advanced Composites Technology Program and the High Speed Research Program the building-block approach was used to develop a true understanding of the response of the structures involved through experimentation and analysis. The philosophy that if the structural response couldn't be accurately predicted, it wasn't really understood, was critical to the progression of these programs. To this end, analytical techniques including closed-form and finite elements were employed and experimentation used to verify assumptions at each step along the way. This paper presents a discussion of the utilization of the building-block approach described previously in structural mechanics research and development programs at NASA Langley Research Center. Specific examples that illustrate the use of this approach are included from recent research and development programs for both subsonic and supersonic transports.
Multivariate analysis: A statistical approach for computations
NASA Astrophysics Data System (ADS)
Michu, Sachin; Kaushik, Vandana
2014-10-01
Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, and, more recently, the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.
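To make the correlation-matrix idea concrete, here is a toy sketch (not the authors' implementation) that scores windows of multivariate traffic by how far their correlation coefficient matrix drifts from a baseline; the features and data are hypothetical.

```python
import numpy as np

def correlation_anomaly_scores(traffic, window=50):
    """Score each window of multivariate traffic by the drift of its
    correlation coefficient matrix from the baseline matrix.

    traffic : time x features array (e.g., packet counts, bytes, flags)
    """
    baseline = np.corrcoef(traffic[:window].T)
    scores = []
    for start in range(0, len(traffic) - window + 1, window):
        C = np.corrcoef(traffic[start:start + window].T)
        scores.append(np.linalg.norm(C - baseline))  # Frobenius distance
    return np.array(scores)

rng = np.random.default_rng(1)
normal = rng.normal(size=(500, 4))
attack = rng.normal(size=(100, 4))
attack[:, 0] = attack[:, 1] * 3          # an attack changes feature correlations
scores = correlation_anomaly_scores(np.vstack([normal, attack]))
print(scores.round(2))                   # later windows should score higher
```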
Technology Assessment and Policy Analysis
ERIC Educational Resources Information Center
Majone, Giandomenico
1977-01-01
Argues that the application of policy analysis to technology assessment requires the abandonment of stereotyped approaches and a reformulation of analytical paradigms to include consideration of institutional constraints. Available from: Elsevier Scientific Publishing Company, Box 211, Amsterdam, the Netherlands, single copies available.…
Developments in Sampling and Analysis Instrumentation for Stationary Sources
ERIC Educational Resources Information Center
Nader, John S.
1973-01-01
Instrumentation for the measurement of pollutant emissions is considered including sample-site selection, sample transport, sample treatment, sample analysis, and data reduction, display, and interpretation. Measurement approaches discussed involve sample extraction from within the stack and electro-optical methods. (BL)
Bayesian Computation for Log-Gaussian Cox Processes: A Comparative Analysis of Methods
Teng, Ming; Nathoo, Farouk S.; Johnson, Timothy D.
2017-01-01
The Log-Gaussian Cox Process is a commonly used model for the analysis of spatial point pattern data. Fitting this model is difficult because of its doubly-stochastic property, i.e., it is an hierarchical combination of a Poisson process at the first level and a Gaussian Process at the second level. Various methods have been proposed to estimate such a process, including traditional likelihood-based approaches as well as Bayesian methods. We focus here on Bayesian methods and several approaches that have been considered for model fitting within this framework, including Hamiltonian Monte Carlo, the Integrated nested Laplace approximation, and Variational Bayes. We consider these approaches and make comparisons with respect to statistical and computational efficiency. These comparisons are made through several simulation studies as well as through two applications, the first examining ecological data and the second involving neuroimaging data. PMID:29200537
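The doubly-stochastic structure is easiest to see in simulation. A minimal 1-D sketch, assuming a squared-exponential covariance on a regular grid (illustrative only, not the paper's examples):

```python
import numpy as np

def simulate_lgcp_1d(n_cells=200, length=10.0, mu=1.0, sigma2=1.0, ell=0.5, seed=0):
    """Simulate a 1-D log-Gaussian Cox process on a grid.

    The hierarchy is explicit: a Gaussian process is drawn first,
    then cell counts are Poisson given its exponential.
    """
    rng = np.random.default_rng(seed)
    x = np.linspace(0, length, n_cells)
    d = np.abs(x[:, None] - x[None, :])
    K = sigma2 * np.exp(-0.5 * (d / ell) ** 2)      # squared-exponential covariance
    g = np.linalg.cholesky(K + 1e-8 * np.eye(n_cells)) @ rng.normal(size=n_cells)
    intensity = np.exp(mu + g)                       # latent intensity field
    counts = rng.poisson(intensity * (length / n_cells))
    return x, intensity, counts

x, lam, y = simulate_lgcp_1d()
print(y.sum(), lam.mean())
```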
2012-01-01
Implicit in the growing interest in patient-centered outcomes research is a growing need for better evidence regarding how responses to a given intervention or treatment may vary across patients, referred to as heterogeneity of treatment effect (HTE). A variety of methods are available for exploring HTE, each associated with unique strengths and limitations. This paper reviews a selected set of methodological approaches to understanding HTE, focusing largely but not exclusively on their uses with randomized trial data. It is oriented for the “intermediate” outcomes researcher, who may already be familiar with some methods, but would value a systematic overview of both more and less familiar methods with attention to when and why they may be used. Drawing from the biomedical, statistical, epidemiological and econometrics literature, we describe the steps involved in choosing an HTE approach, focusing on whether the intent of the analysis is for exploratory, initial testing, or confirmatory testing purposes. We also map HTE methodological approaches to data considerations as well as the strengths and limitations of each approach. Methods reviewed include formal subgroup analysis, meta-analysis and meta-regression, various types of predictive risk modeling including classification and regression tree analysis, series of n-of-1 trials, latent growth and growth mixture models, quantile regression, and selected non-parametric methods. In addition to an overview of each HTE method, examples and references are provided for further reading. By guiding the selection of the methods and analysis, this review is meant to better enable outcomes researchers to understand and explore aspects of HTE in the context of patient-centered outcomes research. PMID:23234603
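Formal subgroup analysis, the first method listed, reduces to testing a treatment-by-subgroup interaction. A minimal sketch on simulated trial data (statsmodels assumed; variable names hypothetical):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "treat": rng.integers(0, 2, n),
    "risk": rng.integers(0, 2, n),   # hypothetical baseline risk stratum
})
# Outcome with a treatment effect that is larger in the high-risk stratum
df["y"] = 1.0 * df.treat + 0.8 * df.treat * df.risk + rng.normal(size=n)

fit = smf.ols("y ~ treat * risk", data=df).fit()
print(fit.summary().tables[1])   # the treat:risk term tests effect modification
```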
Petrou, Stavros; Kwon, Joseph; Madan, Jason
2018-05-10
Economic analysts are increasingly likely to rely on systematic reviews and meta-analyses of health state utility values to inform the parameter inputs of decision-analytic modelling-based economic evaluations. Beyond the context of economic evaluation, evidence from systematic reviews and meta-analyses of health state utility values can be used to inform broader health policy decisions. This paper provides practical guidance on how to conduct a systematic review and meta-analysis of health state utility values. The paper outlines a number of stages in conducting a systematic review, including identifying the appropriate evidence, study selection, data extraction and presentation, and quality and relevance assessment. The paper outlines three broad approaches that can be used to synthesise multiple estimates of health utilities for a given health state or condition, namely fixed-effect meta-analysis, random-effects meta-analysis and mixed-effects meta-regression. Each approach is illustrated by a synthesis of utility values for a hypothetical decision problem, and software code is provided. The paper highlights a number of methodological issues pertinent to the conduct of meta-analysis or meta-regression. These include the importance of limiting synthesis to 'comparable' utility estimates, for example those derived using common utility measurement approaches and sources of valuation; the effects of reliance on limited or poorly reported published data from primary utility assessment studies; the use of aggregate outcomes within analyses; approaches to generating measures of uncertainty; handling of median utility values; challenges surrounding the disentanglement of utility estimates collected serially within the context of prospective observational studies or prospective randomised trials; challenges surrounding the disentanglement of intervention effects; and approaches to measuring model validity. Areas of methodological debate and avenues for future research are highlighted.
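To illustrate the third broad approach, here is a minimal fixed-effect meta-regression sketch over hypothetical utility estimates, with the valuation method as a moderator; a mixed-effects version would add an estimated between-study variance to each weight. This is not the paper's supplied code.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical utility estimates for one condition, with their variances
# and a moderator coding the valuation method (0 = EQ-5D, 1 = SF-6D).
u = np.array([0.72, 0.68, 0.75, 0.61, 0.64])
v = np.array([0.0004, 0.0009, 0.0006, 0.0012, 0.0008])
method = np.array([0, 0, 0, 1, 1])

X = sm.add_constant(method)
fit = sm.WLS(u, X, weights=1.0 / v).fit()   # inverse-variance weighted regression
print(fit.params)   # intercept = pooled EQ-5D utility; slope = SF-6D shift
```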
Andrews, Ross N.; Narayanan, Suresh; Zhang, Fan; Kuzmenko, Ivan; Ilavsky, Jan
2018-01-01
X-ray photon correlation spectroscopy (XPCS), an extension of dynamic light scattering (DLS) in the X-ray regime, detects temporal intensity fluctuations of coherent speckles and provides scattering vector-dependent sample dynamics at length scales smaller than DLS. The penetrating power of X-rays enables probing dynamics in a broad array of materials with XPCS, including polymers, glasses and metal alloys, where attempts to describe the dynamics with a simple exponential fit usually fails. In these cases, the prevailing XPCS data analysis approach employs stretched or compressed exponential decay functions (Kohlrausch functions), which implicitly assume homogeneous dynamics. In this paper, we propose an alternative analysis scheme based upon inverse Laplace or Gaussian transformation for elucidating heterogeneous distributions of dynamic time scales in XPCS, an approach analogous to the CONTIN algorithm widely accepted in the analysis of DLS from polydisperse and multimodal systems. Using XPCS data measured from colloidal gels, we demonstrate the inverse transform approach reveals hidden multimodal dynamics in materials, unleashing the full potential of XPCS. PMID:29875506
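A minimal sketch of the inverse-transform idea: recover a distribution of decay times from a two-timescale correlation decay using non-negative least squares on a grid of candidate times. CONTIN additionally imposes a smoothness regularizer, omitted here for brevity; the data are synthetic.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic intermediate scattering function with two decay time scales
t = np.logspace(-3, 2, 120)
g1 = 0.6 * np.exp(-t / 0.05) + 0.4 * np.exp(-t / 5.0)

# Grid of candidate decay times; kernel K[i, j] = exp(-t_i / tau_j)
tau = np.logspace(-4, 3, 80)
K = np.exp(-t[:, None] / tau[None, :])

w, _ = nnls(K, g1)           # non-negative inverse Laplace inversion
print(tau[w > 0.01])         # recovered time scales cluster near 0.05 and 5.0
```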
A comparison of regional flood frequency analysis approaches in a simulation framework
NASA Astrophysics Data System (ADS)
Ganora, D.; Laio, F.
2016-07-01
Regional frequency analysis (RFA) is a well-established methodology to provide an estimate of the flood frequency curve at ungauged (or scarcely gauged) sites. Different RFA approaches exist, depending on the way the information is transferred to the site of interest, but it is not clear in the literature whether a specific method systematically outperforms the others. The aim of this study is to provide a framework for carrying out this intercomparison by building a virtual environment based on synthetically generated data. The considered regional approaches include: (i) a unique regional curve for the whole region; (ii) a multiple-region model where homogeneous subregions are determined through cluster analysis; (iii) a Region-of-Influence model which defines a homogeneous subregion for each site; (iv) a spatially smooth estimation procedure where the parameters of the regional model vary continuously across space. Virtual environments are generated considering different patterns of heterogeneity, including step changes and smooth variations. If the region is heterogeneous, with the parent distribution changing continuously within the region, the spatially smooth regional approach outperforms the others, with overall errors 10-50% lower than the other methods. In the case of a step change, the spatially smooth and clustering procedures perform similarly if the heterogeneity is moderate, while clustering procedures work better when the step change is severe. To extend our findings, an extensive sensitivity analysis has been performed to investigate the effect of sample length, number of virtual stations, return period of the predicted quantile, variability of the scale parameter of the parent distribution, number of predictor variables and different parent distributions. Overall, the spatially smooth approach appears to be the most robust, as its performance is more stable across different patterns of heterogeneity, especially when short records are considered.
Defense Small Business Innovation Research Program (SBIR) Abstracts of Phase 1 Awards 1983.
1984-04-06
STRATEGY, THE INTERPLAY BETWEEN ELECTROMAGNETIC EMISSION CONTROL AND FLEET OPERATION. THE TECHNICAL APPROACH IS BASED ON AN ANALYSIS OF EMCON...THEM AND A BOTTOM UP APPROACH. THE REQUIREMENTS AND ARCHITECTURAL ASPECTS WILL BE EXPLORED FROM THE MORE ENCOMPASSING PERSPECTIVE OF THE TOTAL...AN AI APPROACH TO INFORMATION FUSION INCLUDING KNOWLEDGE ORGANIZATION, HYPOTHESIS REPRESENTATIVES, DOMAIN KNOWLEDGE REPRESENTATION, HYPOTHESIS
ERIC Educational Resources Information Center
Haber-Curran, Paige; Tillapaugh, Daniel
2013-01-01
This qualitative study examines student learning about leadership across three sections of a capstone course in an undergraduate leadership minor. Qualitative methods were informed by exploratory case study analysis and phenomenology. Student-centered and inquiry-focused pedagogical approaches, including case-in-point, action inquiry, and…
A Participatory Design Approach for a Mobile App-Based Personal Response System
ERIC Educational Resources Information Center
Song, Donggil; Oh, Eun Young
2016-01-01
This study reports on a participatory design approach including the design, development, implementation, and evaluation of a mobile app-based personal response system (PRS). The first cycle formulated initial design principles through context and needs analysis; the second utilized the collaboration with instructors and experts embodying specific…
An Instructional Approach to Modeling in Microevolution.
ERIC Educational Resources Information Center
Thompson, Steven R.
1988-01-01
Describes an approach to teaching population genetics and evolution and some of the ways models can be used to enhance understanding of the processes being studied. Discusses the instructional plan, and the use of models including utility programs and analysis with models. Provided are a basic program and sample program outputs. (CW)
Field Theory in Organizational Psychology: An Analysis of Theoretical Approaches in Leadership.
ERIC Educational Resources Information Center
Garcia, Joseph E.
This literature review examines Kurt Lewin's influence in leadership psychology. Characteristics of field theory are described in detail and utilized in analyzing leadership research, including the trait approach, leader behavior studies, contingency theory, path-goal theory, and leader decision theory. Important trends in leadership research are…
Barzyk, Timothy M.; Wilson, Sacoby; Wilson, Anthony
2015-01-01
Community, state, and federal approaches to conventional and cumulative risk assessment (CRA) were described and compared to assess similarities and differences, and develop recommendations for a consistent CRA approach, acceptable across each level as a rigorous scientific methodology, including partnership formation and solution development as necessary practices. Community, state, and federal examples were described and then summarized based on their adherence to CRA principles of: (1) planning, scoping, and problem formulation; (2) risk analysis and ranking; and (3) risk characterization, interpretation, and management. While each application shared the common goal of protecting human health and the environment, they adopted different approaches to achieve this. For a specific project-level analysis of a particular place or instance, this may be acceptable, but to ensure long-term applicability and transferability to other projects, recommendations for developing a consistent approach to CRA are provided. This approach would draw from best practices, risk assessment and decision analysis sciences, and historical lessons learned to provide results in an understandable and accepted manner by all entities. This approach is intended to provide a common ground around which to develop CRA methods and approaches that can be followed at all levels. PMID:25918910
Dehesh, Tania; Zare, Najaf; Ayatollahi, Seyyed Mohammad Taghi
2015-01-01
The univariate meta-analysis (UM) procedure, as a technique that provides a single overall result, has become increasingly popular. Neglecting the existence of other concomitant covariates in the models leads to loss of treatment efficiency. Our aim was to propose four new approximation approaches for the covariance matrix of the coefficients, which is not readily available for the multivariate generalized least squares (MGLS) method as a multivariate meta-analysis approach. We evaluated the efficiency of the four new approaches, namely zero correlation (ZC), common correlation (CC), estimated correlation (EC), and multivariate multilevel correlation (MMC), on the estimation bias, mean square error (MSE), and 95% probability coverage of the confidence interval (CI) in the synthesis of Cox proportional hazard model coefficients in a simulation study. Comparing the results of the simulation study on the MSE, bias, and CI of the estimated coefficients indicated that the MMC approach was the most accurate procedure compared to the EC, CC, and ZC procedures. The precision ranking of the four approaches according to all of the above settings was MMC ≥ EC ≥ CC ≥ ZC. This study highlights the advantages of MGLS meta-analysis over the UM approach. The results suggested the use of the MMC procedure to overcome the lack of information for having a complete covariance matrix of the coefficients.
Managed Development Environment Successes for MSFC's VIPA Team
NASA Technical Reports Server (NTRS)
Finckenor, Jeff; Corder, Gary; Owens, James; Meehan, Jim; Tidwell, Paul H.
2005-01-01
This paper outlines the best practices of the Vehicle Design Team for VIPA. The functions of the VIPA Vehicle Design (VVD) discipline team are to maintain the controlled reference geometry and provide linked, simplified geometry for each of the other discipline analyses. The core of the VVD work, and the approach for VVD's first task of controlling the reference geometry, involves systems engineering, top-down, layout-based CAD modeling within a Product Data Manager (PDM) development environment. The top-down approach allows for simple control of very large, integrated assemblies and greatly enhances the ability to generate trade configurations and reuse data. The second VVD task, model simplification for analysis, is handled within the managed environment through application of the master model concept. In this approach, there is a single controlling, or master, product definition dataset. Connected to this master model are reference datasets with live geometric and expression links. The referenced models can be for drawings, manufacturing, visualization, embedded analysis, or analysis simplification. A discussion of web-based interaction, including visualization, between the design and other disciplines is included. Demonstrated examples are cited, including the Space Launch Initiative development cycle, the Saturn V systems integration and verification cycle, an Orbital Space Plane study, and NASA Exploration Office studies of Shuttle-derived and clean-sheet launch vehicles. The VIPA Team has brought an immense amount of detailed data to bear on program issues. A central piece of that success has been the Managed Development Environment and the VVD Team approach to modeling.
Schwingshackl, Lukas; Chaimani, Anna; Schwedhelm, Carolina; Toledo, Estefania; Pünsch, Marina; Hoffmann, Georg; Boeing, Heiner
2018-05-02
Pairwise meta-analyses have shown beneficial effects of individual dietary approaches on blood pressure but their comparative effects have not been established. Therefore we performed a systematic review of different dietary intervention trials and estimated the aggregate blood pressure effects through network meta-analysis including hypertensive and pre-hypertensive patients. PubMed, Cochrane CENTRAL, and Google Scholar were searched until June 2017. The inclusion criteria were defined as follows: i) Randomized trial with a dietary approach; ii) hypertensive and pre-hypertensive adult patients; and iii) minimum intervention period of 12 weeks. In order to determine the pooled effect of each intervention relative to each of the other intervention for both diastolic and systolic blood pressure (SBP and DBP), random effects network meta-analysis was performed. A total of 67 trials comparing 13 dietary approaches (DASH, low-fat, moderate-carbohydrate, high-protein, low-carbohydrate, Mediterranean, Palaeolithic, vegetarian, low-GI/GL, low-sodium, Nordic, Tibetan, and control) enrolling 17,230 participants were included. In the network meta-analysis, the DASH, Mediterranean, low-carbohydrate, Palaeolithic, high-protein, low-glycaemic index, low-sodium, and low-fat dietary approaches were significantly more effective in reducing SBP (-8.73 to -2.32 mmHg) and DBP (-4.85 to -1.27 mmHg) compared to a control diet. According to the SUCRAs, the DASH diet was ranked the most effective dietary approach in reducing SBP (90%) and DBP (91%), followed by the Palaeolithic, and the low-carbohydrate diet (ranked 3rd for SBP) or the Mediterranean diet (ranked 3rd for DBP). For most comparisons, the credibility of evidence was rated very low to moderate, with the exception for the DASH vs. the low-fat dietary approach for which the quality of evidence was rated high. The present network meta-analysis suggests that the DASH dietary approach might be the most effective dietary measure to reduce blood pressure among hypertensive and pre-hypertensive patients based on high quality evidence.
A chemical proteomics approach for global analysis of lysine monomethylome profiling.
Wu, Zhixiang; Cheng, Zhongyi; Sun, Mingwei; Wan, Xuelian; Liu, Ping; He, Tieming; Tan, Minjia; Zhao, Yingming
2015-02-01
Methylation of lysine residues on histone proteins is known to play an important role in chromatin structure and function. However, non-histone protein substrates of this modification remain largely unknown. An effective approach for system-wide analysis of protein lysine methylation, particularly lysine monomethylation, is lacking. Here we describe a chemical proteomics approach for global screening for monomethyllysine substrates, involving chemical propionylation of monomethylated lysine, affinity enrichment of the modified monomethylated peptides, and HPLC/MS/MS analysis. Using this approach, we identified with high confidence 446 lysine monomethylation sites in 398 proteins, including three previously unknown histone monomethylation marks, representing the largest data set of protein lysine monomethylation described to date. Our data not only confirms previously discovered lysine methylation substrates in the nucleus and spliceosome, but also reveals new substrates associated with diverse biological processes. This method hence offers a powerful approach for dynamic study of protein lysine monomethylation under diverse cellular conditions and in human diseases. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.
Incremental Upgrade of Legacy Systems (IULS)
2001-04-01
analysis task employed SEI's Feature-Oriented Domain Analysis methodology (see FODA reference) and included several phases: • Context Analysis • Establish...Legacy, new Host and upgrade system and software. The Feature-Oriented Domain Analysis approach (FODA, see SUM References) was used for this step...Feature-Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-21, ESD-90-TR-222); Software Engineering Institute, Carnegie Mellon University
Green supplier selection: a new genetic/immune strategy with industrial application
NASA Astrophysics Data System (ADS)
Kumar, Amit; Jain, Vipul; Kumar, Sameer; Chandra, Charu
2016-10-01
With the onset of the 'climate change movement', organisations are striving to incorporate environmental criteria into the supplier selection process. This article hybridises a Green Data Envelopment Analysis (GDEA)-based approach with a new Genetic/Immune Strategy for Data Envelopment Analysis (GIS-DEA). The GIS-DEA approach provides a different view of solving multi-criteria decision making problems using data envelopment analysis (DEA) by considering DEA as a multi-objective optimisation problem, with efficiency as one objective and proximity of the solution to decision makers' preferences as the other. The hybrid approach, called GIS-GDEA, is applied here to a well-known automobile spare parts manufacturer in India and the results presented. User validation based on a specific set of criteria suggests that the supplier selection process with GIS-GDEA is more practical than other approaches in a current industrial scenario with multiple decision makers.
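For orientation, a basic input-oriented CCR envelopment LP is sketched below with scipy; the GIS-GDEA method above layers a genetic/immune search and green criteria on top of this building block, neither of which is shown. The supplier data are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of unit k via the envelopment LP.

    X : inputs (units x m), Y : outputs (units x s). Efficiency is the
    smallest theta such that a non-negative combination of all units
    uses at most theta * inputs_k while producing at least outputs_k.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                 # minimise theta
    A_in = np.c_[-X[k][:, None], X.T]           # sum lam*x_j <= theta*x_k
    A_out = np.c_[np.zeros((s, 1)), -Y.T]       # sum lam*y_j >= y_k
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.x[0]

X = np.array([[4.0, 140], [5, 90], [6, 120], [10, 160]])   # two inputs per supplier
Y = np.array([[2.0], [1], [3], [1]])                        # one output
print([round(dea_ccr_efficiency(X, Y, k), 3) for k in range(4)])
```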
Faggion, Clovis M; Huda, Fahd; Wasiak, Jason
2014-06-01
To evaluate the methodological approaches used to assess the quality of studies included in systematic reviews (SRs) in periodontology and implant dentistry. Two electronic databases (PubMed and Cochrane Database of Systematic Reviews) were searched independently to identify SRs examining interventions published through 2 September 2013. The reference lists of included SRs and records of 10 specialty dental journals were searched manually. Methodological approaches were assessed using seven criteria based on the Cochrane Handbook for Systematic Reviews of Interventions. Temporal trends in methodological quality were also explored. Of the 159 SRs with meta-analyses included in the analysis, 44 (28%) reported the use of domain-based tools, 15 (9%) reported the use of checklists and 7 (4%) reported the use of scales. Forty-two (26%) SRs reported use of more than one tool. Criteria were met heterogeneously; authors of 15 (9%) publications incorporated the quality of evidence of primary studies into SRs, whereas 69% of SRs reported methodological approaches in the Materials/Methods section. Reporting of four criteria was significantly better in recent (2010-2013) than in previous publications. The analysis identified several methodological limitations of approaches used to assess evidence in studies included in SRs in periodontology and implant dentistry. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Econutrition and utilization of food-based approaches for nutritional health.
Blasbalg, Tanya L; Wispelwey, Bram; Deckelbaum, Richard J
2011-03-01
Macronutrient and micronutrient deficiencies continue to have a detrimental impact in lower-income countries, with significant costs in morbidity, mortality, and productivity. Food is the primary source of the nutrients needed to sustain life, and it is the essential component that links nutrition, agriculture, and ecology in the econutrition framework. To present evidence and analysis of food-based approaches for improving nutritional and health outcomes in lower-income countries. Review of existing literature. The benefits of food-based approaches may include nutritional improvement, food security, cost-effectiveness, sustainability, and human productivity. Food-based approaches require additional inputs, including nutrition education, gender considerations, and agricultural planning. Although some forms of malnutrition can be addressed via supplements, food-based approaches are optimal to achieve sustainable solutions to multiple nutrient deficiencies.
Recent advances in applying mass spectrometry and systems biology to determine brain dynamics.
Scifo, Enzo; Calza, Giulio; Fuhrmann, Martin; Soliymani, Rabah; Baumann, Marc; Lalowski, Maciej
2017-06-01
Neurological disorders encompass various pathologies which disrupt normal brain physiology and function. Poor understanding of their underlying molecular mechanisms and their societal burden argues for the necessity of novel prevention strategies, early diagnostic techniques and alternative treatment options to reduce the scale of their expected increase. Areas covered: This review scrutinizes mass spectrometry based approaches used to investigate brain dynamics in various conditions, including neurodegenerative and neuropsychiatric disorders. Different proteomics workflows for isolation/enrichment of specific cell populations or brain regions, sample processing; mass spectrometry technologies, for differential proteome quantitation, analysis of post-translational modifications and imaging approaches in the brain are critically deliberated. Future directions, including analysis of cellular sub-compartments, targeted MS platforms (selected/parallel reaction monitoring) and use of mass cytometry are also discussed. Expert commentary: Here, we summarize and evaluate current mass spectrometry based approaches for determining brain dynamics in health and diseases states, with a focus on neurological disorders. Furthermore, we provide insight on current trends and new MS technologies with potential to improve this analysis.
A program to form a multidisciplinary data base and analysis for dynamic systems
NASA Technical Reports Server (NTRS)
Taylor, L. W.; Suit, W. T.; Mayo, M. H.
1984-01-01
Diverse sets of experimental data and analysis programs have been assembled for the purpose of facilitating research in systems identification, parameter estimation and state estimation techniques. The data base and analysis programs are organized to make it easy to compare alternative approaches. Additional data and alternative forms of analysis will be included as they become available.
Bao, Jie; Hou, Zhangshuan; Huang, Maoyi; ...
2015-12-04
Here, effective sensitivity analysis approaches are needed to identify important parameters or factors and their uncertainties in complex Earth system models composed of multi-phase multi-component phenomena and multiple biogeophysical-biogeochemical processes. In this study, the impacts of 10 hydrologic parameters in the Community Land Model on simulations of runoff and latent heat flux are evaluated using data from a watershed. Different metrics, including residual statistics, the Nash-Sutcliffe coefficient, and log mean square error, are used as alternative measures of the deviations between the simulated and field observed values. Four sensitivity analysis (SA) approaches, including analysis of variance based on the generalized linear model, generalized cross validation based on the multivariate adaptive regression splines model, standardized regression coefficients based on a linear regression model, and analysis of variance based on support vector machine, are investigated. Results suggest that these approaches show consistent measurement of the impacts of major hydrologic parameters on response variables, but with differences in the relative contributions, particularly for the secondary parameters. The convergence behaviors of the SA with respect to the number of sampling points are also examined with different combinations of input parameter sets and output response variables and their alternative metrics. This study helps identify the optimal SA approach, provides guidance for the calibration of the Community Land Model parameters to improve the model simulations of land surface fluxes, and approximates the magnitudes to be adjusted in the parameter values during parametric model optimization.
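Of the four SA approaches, standardized regression coefficients are the simplest to sketch: regress the standardized response on the standardized parameters and rank by coefficient magnitude. A toy stand-in response replaces the actual Community Land Model runs below.

```python
import numpy as np

def standardized_regression_coefficients(params, response):
    """One of the four SA approaches named above: fit a linear model to
    standardized inputs/outputs; coefficient magnitude ranks importance."""
    Z = (params - params.mean(0)) / params.std(0)
    z = (response - response.mean()) / response.std()
    beta, *_ = np.linalg.lstsq(Z, z, rcond=None)
    return beta

rng = np.random.default_rng(3)
theta = rng.uniform(0, 1, size=(2000, 10))     # samples of 10 hydrologic parameters
# Toy response: parameters 0 and 3 dominate, plus noise
runoff = 3 * theta[:, 0] + 1 * theta[:, 3] + 0.2 * rng.normal(size=2000)
print(np.round(standardized_regression_coefficients(theta, runoff), 2))
```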
Increasing Effectiveness in Teaching Ethics to Undergraduate Business Students.
ERIC Educational Resources Information Center
Lampe, Marc
1997-01-01
Traditional approaches to teaching business ethics (philosophical analysis, moral quandaries, executive cases) may not be effective in persuading undergraduates of the importance of ethical behavior. Better techniques include values education, ethical decision-making models, analysis of ethical conflicts, and role modeling. (SK)
2014-01-01
Background To improve quality of care and patient outcomes, health system decision-makers need to identify and implement effective interventions. An increasing number of systematic reviews document the effects of quality improvement programs to assist decision-makers in developing new initiatives. However, limitations in the reporting of primary studies and current meta-analysis methods (including approaches for exploring heterogeneity) reduce the utility of existing syntheses for health system decision-makers. This study will explore the role of innovative meta-analysis approaches and the added value of enriched and updated data for increasing the utility of systematic reviews of complex interventions. Methods/Design We will use the dataset from our recent systematic review of 142 randomized trials of diabetes quality improvement programs to evaluate novel approaches for exploring heterogeneity. These will include exploratory methods, such as multivariate meta-regression analyses and all-subsets combinatorial meta-analysis. We will then update our systematic review to include new trials and enrich the dataset by surveying authors of all included trials. In doing so, we will explore the impact of variables not reported in previous publications, such as details of study context, on the effectiveness of the intervention. We will use innovative analytical methods on the enriched and updated dataset to identify key success factors in the implementation of quality improvement interventions for diabetes. Decision-makers will be involved throughout to help identify and prioritize variables to be explored and to aid in the interpretation and dissemination of results. Discussion This study will inform future systematic reviews of complex interventions and describe the value of enriching and updating data for exploring heterogeneity in meta-analysis. It will also result in an updated comprehensive systematic review of diabetes quality improvement interventions that will be useful to health system decision-makers in developing interventions to improve outcomes for people with diabetes. Systematic review registration PROSPERO registration no. CRD42013005165 PMID:25115289
Demonstration of a Safety Analysis on a Complex System
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey;
1997-01-01
For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.
A projection-free method for representing plane-wave DFT results in an atom-centered basis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunnington, Benjamin D.; Schmidt, J. R., E-mail: schmidt@chem.wisc.edu
2015-09-14
Plane wave density functional theory (DFT) is a powerful tool for gaining accurate, atomic level insight into bulk and surface structures. Yet, the delocalized nature of the plane wave basis set hinders the application of many powerful post-computation analysis approaches, many of which rely on localized atom-centered basis sets. Traditionally, this gap has been bridged via projection-based techniques from a plane wave to atom-centered basis. We instead propose an alternative projection-free approach utilizing direct calculation of matrix elements of the converged plane wave DFT Hamiltonian in an atom-centered basis. This projection-free approach yields a number of compelling advantages, including strict orthonormality of the resulting bands without artificial band mixing and access to the Hamiltonian matrix elements, while faithfully preserving the underlying DFT band structure. The resulting atomic orbital representation of the Kohn-Sham wavefunction and Hamiltonian provides a gateway to a wide variety of analysis approaches. We demonstrate the utility of the approach for a diverse set of chemical systems and example analysis approaches.
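Once the Hamiltonian and overlap matrix elements are available in the atom-centered basis, band energies follow from a generalized eigenproblem, and strict orthonormality holds in the overlap metric. A toy numeric sketch (random stand-in matrices, not actual DFT output):

```python
import numpy as np
from scipy.linalg import eigh

# Toy stand-ins for <phi_i|H_KS|phi_j> and the overlap <phi_i|phi_j>,
# which the projection-free approach computes directly from the
# converged plane-wave Hamiltonian and the atom-centered functions.
rng = np.random.default_rng(4)
A = rng.normal(size=(6, 6))
S = A @ A.T + 6 * np.eye(6)          # symmetric positive-definite overlap
H = (A + A.T) / 2 - 2 * np.eye(6)    # symmetric Hamiltonian matrix

# Bands in the atom-centered representation: solve H c = e S c.
energies, coeffs = eigh(H, S)
print(energies.round(3))

# Strict orthonormality holds in the S metric: C^T S C = I.
print(np.allclose(coeffs.T @ S @ coeffs, np.eye(6)))
```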
Landau, Sabine; Emsley, Richard; Dunn, Graham
2018-06-01
Random allocation avoids confounding bias when estimating the average treatment effect. For continuous outcomes measured at post-treatment as well as prior to randomisation (baseline), analyses based on (A) post-treatment outcome alone, (B) change scores over the treatment phase or (C) conditioning on baseline values (analysis of covariance) provide unbiased estimators of the average treatment effect. The decision to include baseline values of the clinical outcome in the analysis is based on precision arguments, with analysis of covariance known to be most precise. Investigators increasingly carry out explanatory analyses to decompose total treatment effects into components that are mediated by an intermediate continuous outcome and a non-mediated part. Traditional mediation analysis might be performed based on (A) post-treatment values of the intermediate and clinical outcomes alone, (B) respective change scores or (C) conditioning on baseline measures of both intermediate and clinical outcomes. Using causal diagrams and Monte Carlo simulation, we investigated the performance of the three competing mediation approaches. We considered a data generating model that included three possible confounding processes involving baseline variables: The first two processes modelled baseline measures of the clinical variable or the intermediate variable as common causes of post-treatment measures of these two variables. The third process allowed the two baseline variables themselves to be correlated due to past common causes. We compared the analysis models implied by the competing mediation approaches with this data generating model to hypothesise likely biases in estimators, and tested these in a simulation study. We applied the methods to a randomised trial of pragmatic rehabilitation in patients with chronic fatigue syndrome, which examined the role of limiting activities as a mediator. Estimates of causal mediation effects derived by approach (A) will be biased if one of the three processes involving baseline measures of intermediate or clinical outcomes is operating. Necessary assumptions for the change score approach (B) to provide unbiased estimates under either process include the independence of baseline measures and change scores of the intermediate variable. Finally, estimates provided by the analysis of covariance approach (C) were found to be unbiased under all the three processes considered here. When applied to the example, there was evidence of mediation under all methods but the estimate of the indirect effect depended on the approach used with the proportion mediated varying from 57% to 86%. Trialists planning mediation analyses should measure baseline values of putative mediators as well as of continuous clinical outcomes. An analysis of covariance approach is recommended to avoid potential biases due to confounding processes involving baseline measures of intermediate or clinical outcomes, and not simply for increased precision.
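A minimal sketch of the recommended approach (C): product-of-coefficients mediation with both baseline measures as covariates, on simulated trial data (statsmodels assumed; the usual no-unmeasured-confounding assumptions for mediation still apply):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 300
treat = rng.integers(0, 2, n)
m0 = rng.normal(size=n)                     # baseline mediator
y0 = 0.5 * m0 + rng.normal(size=n)          # baseline outcome, correlated with m0
m1 = 0.6 * m0 + 0.8 * treat + rng.normal(size=n)              # post mediator
y1 = 0.6 * y0 + 0.5 * m1 + 0.3 * treat + rng.normal(size=n)   # post outcome
df = pd.DataFrame(dict(treat=treat, m0=m0, y0=y0, m1=m1, y1=y1))

# Approach (C): condition on baseline mediator and outcome (ANCOVA).
a = smf.ols("m1 ~ treat + m0 + y0", df).fit().params["treat"]
fit_y = smf.ols("y1 ~ treat + m1 + m0 + y0", df).fit()
b = fit_y.params["m1"]
print("indirect =", round(a * b, 3), "direct =", round(fit_y.params["treat"], 3))
```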
Zhang, T; Yang, M; Xiao, X; Feng, Z; Li, C; Zhou, Z; Ren, Q; Li, X
2014-03-01
Many infectious diseases exhibit repetitive or regular behaviour over time. Time-domain approaches, such as the seasonal autoregressive integrated moving average model, are often utilized to examine the cyclical behaviour of such diseases. The limitations of time-domain approaches include over-differencing and over-fitting; furthermore, the use of these approaches is inappropriate when the assumption of linearity may not hold. In this study, we implemented a simple and efficient procedure based on the fast Fourier transformation (FFT) approach to evaluate the epidemic dynamics of scarlet fever incidence (2004-2010) in China. This method demonstrated good internal and external validity and overcame some shortcomings of time-domain approaches. The procedure also elucidated the cycling behaviour in terms of environmental factors. We concluded that, under appropriate circumstances of data structure, spectral analysis based on the FFT approach may be applicable for the study of oscillating diseases.
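As a hedged illustration of the FFT step (synthetic counts, not the surveillance data), the following locates the dominant seasonal period in a monthly incidence series from its periodogram.

```python
# Synthetic stand-in for monthly case counts, 2004-2010 (84 months).
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(84)
series = (100 + 30 * np.sin(2 * np.pi * months / 12)     # annual cycle
          + 10 * np.sin(2 * np.pi * months / 6)          # semi-annual cycle
          + rng.normal(0, 5, months.size))

detrended = series - series.mean()
power = np.abs(np.fft.rfft(detrended)) ** 2              # periodogram
freqs = np.fft.rfftfreq(months.size, d=1.0)              # cycles per month

peak = np.argmax(power[1:]) + 1                          # skip zero frequency
print(f"dominant period: {1 / freqs[peak]:.1f} months")  # expect ~12
```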
Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Yannan; Hou, Zhangshuan; Meng, Da
2016-07-17
In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas. These approaches are applied to cross-correlated load time series as well as their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
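One ingredient of such a workflow, sketched under assumed numbers: preserving cross-area correlation when simulating forecast-error realizations, here via an eigendecomposition (PCA) of a hypothetical error covariance.

```python
# All covariances are invented placeholders for three neighbouring areas.
import numpy as np

rng = np.random.default_rng(2)
cov = np.array([[4.0, 3.0, 1.0],
                [3.0, 4.0, 1.5],      # hypothetical forecast-error covariance
                [1.0, 1.5, 2.0]])     # between three load areas (MW^2)

# PCA of the covariance: keep components explaining ~95% of the variance
vals, vecs = np.linalg.eigh(cov)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]
k = int(np.searchsorted(np.cumsum(vals) / vals.sum(), 0.95)) + 1

# Correlated error realizations drawn in the reduced space
z = rng.normal(size=(1000, k))
errors = (z * np.sqrt(vals[:k])) @ vecs[:, :k].T

print("target covariance:\n", cov)
print(f"simulated covariance (rank-{k} PCA):\n", np.cov(errors.T).round(2))
```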
A CAD approach to magnetic bearing design
NASA Technical Reports Server (NTRS)
Jeyaseelan, M.; Anand, D. K.; Kirk, J. A.
1988-01-01
A design methodology has been developed at the Magnetic Bearing Research Laboratory for designing magnetic bearings using a CAD approach. This is used in the algorithm of an interactive design software package. The package is a design tool developed to enable the designer to simulate the entire process of design and analysis of the system. Its capabilities include interactive input/modification of geometry, finding any possible saturation at critical sections of the system, and the design and analysis of a control system that stabilizes and maintains magnetic suspension.
Schaefbauer, Chris L; Campbell, Terrance R; Senteio, Charles; Siek, Katie A; Bakken, Suzanne; Veinot, Tiffany C
2016-01-01
Objective We compare 5 health informatics research projects that applied community-based participatory research (CBPR) approaches with the goal of extending existing CBPR principles to address issues specific to health informatics research. Materials and methods We conducted a cross-case analysis of 5 diverse case studies with 1 common element: integration of CBPR approaches into health informatics research. After reviewing publications and other case-related materials, all coauthors engaged in collaborative discussions focused on CBPR. Researchers mapped each case to an existing CBPR framework, examined each case individually for success factors and barriers, and identified common patterns across cases. Results Benefits of applying CBPR approaches to health informatics research across the cases included the following: developing more relevant research with wider impact, greater engagement with diverse populations, improved internal validity, more rapid translation of research into action, and the development of people. Challenges of applying CBPR to health informatics research included requirements to develop strong, sustainable academic-community partnerships and mismatches related to cultural and temporal factors. Several technology-related challenges, including needs to define ownership of technology outputs and to build technical capacity with community partners, also emerged from our analysis. Finally, we created several principles that extended an existing CBPR framework to specifically address health informatics research requirements. Conclusions Our cross-case analysis yielded valuable insights regarding CBPR implementation in health informatics research and identified valuable lessons useful for future CBPR-based research. The benefits of applying CBPR approaches can be significant, particularly in engaging populations that are typically underserved by health care and in designing patient-facing technology. PMID:26228766
Abraha, Iosief; Cherubini, Antonio; Cozzolino, Francesco; De Florio, Rita; Luchetta, Maria Laura; Rimland, Joseph M; Folletti, Ilenia; Marchesi, Mauro; Germani, Antonella; Orso, Massimiliano; Eusebi, Paolo; Montedori, Alessandro
2015-05-27
To examine whether deviation from the standard intention to treat analysis has an influence on treatment effect estimates of randomised trials. Meta-epidemiological study. Medline, via PubMed, searched between 2006 and 2010; 43 systematic reviews of interventions and 310 randomised trials were included. From each year searched, a random 5% sample was selected of intervention reviews with a meta-analysis that included at least one trial deviating from the standard intention to treat approach. Basic characteristics of the systematic reviews and randomised trials were extracted. Information on the reporting of intention to treat analysis, outcome data, risk of bias items, post-randomisation exclusions, and funding was extracted from each trial. Trials were classified as: ITT (reporting the standard intention to treat approach), mITT (reporting a deviation from the standard approach), and no ITT (reporting no approach). Within each meta-analysis, treatment effects were compared between mITT and ITT trials, and between mITT and no ITT trials. The ratio of odds ratios was calculated (a value <1 indicated larger treatment effects in mITT trials than in the other trial categories). 50 meta-analyses and 322 comparisons of randomised trials (from 84 ITT trials, 118 mITT trials, and 108 no ITT trials; 12 trials contributed twice to the analysis) were examined. Compared with ITT trials, mITT trials showed a larger intervention effect (pooled ratio of odds ratios 0.83 (95% confidence interval 0.71 to 0.96), P=0.01; between meta-analyses variance τ²=0.13). Adjustments for sample size, type of centre, funding, items of risk of bias, post-randomisation exclusions, and variance of log odds ratio yielded consistent results (0.80 (0.69 to 0.94), P=0.005; τ²=0.08). After exclusion of five influential studies, results remained consistent (0.85 (0.75 to 0.98); τ²=0.08). The comparison between mITT trials and no ITT trials showed no statistical difference between the two groups (adjusted ratio of odds ratios 0.92 (0.70 to 1.23); τ²=0.57). Trials that deviated from the intention to treat analysis showed larger intervention effects than trials that reported the standard approach. Where an intention to treat analysis is impossible to perform, authors should clearly report who is included in the analysis and attempt to perform multiple imputations. © Abraha et al 2015.
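The core quantity is easy to reproduce in miniature. The sketch below pools log odds ratios within each trial category by inverse-variance weighting and forms their ratio; all 2x2 counts are hypothetical.

```python
# All 2x2 counts are hypothetical (events, non-events per arm).
import numpy as np

def log_or(a, b, c, d):
    """Log odds ratio and its variance from a 2x2 table."""
    return np.log((a * d) / (b * c)), 1/a + 1/b + 1/c + 1/d

def pool(tables):
    """Fixed-effect inverse-variance pooling of log odds ratios."""
    ys, vs = zip(*(log_or(*t) for t in tables))
    w = 1.0 / np.asarray(vs)
    return float(np.average(ys, weights=w)), float(1.0 / w.sum())

itt = [(30, 70, 45, 55), (25, 75, 38, 62)]     # hypothetical ITT trials
mitt = [(20, 80, 42, 58), (18, 82, 36, 64)]    # hypothetical mITT trials

y_itt, v_itt = pool(itt)
y_mitt, v_mitt = pool(mitt)
diff = y_mitt - y_itt
se = np.sqrt(v_itt + v_mitt)
print(f"ratio of odds ratios: {np.exp(diff):.2f} "
      f"(95% CI {np.exp(diff - 1.96 * se):.2f} to {np.exp(diff + 1.96 * se):.2f})")
```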
Tejera, Eduardo; Cruz-Monteagudo, Maykel; Burgos, Germán; Sánchez, María-Eugenia; Sánchez-Rodríguez, Aminael; Pérez-Castillo, Yunierkis; Borges, Fernanda; Cordeiro, Maria Natália Dias Soeiro; Paz-Y-Miño, César; Rebelo, Irene
2017-08-08
Preeclampsia is a multifactorial disease with unknown pathogenesis. Although recent studies have explored this disease using several bioinformatics tools, their main objective was not directed at pathogenesis. Consensus prioritization has also been shown to be highly efficient in recognizing gene-disease associations; however, no information is available on the ability of consensus approaches to recognize, at an early stage, genes directly involved in pathogenesis. Our aim in this study was therefore to apply several theoretical approaches to explore preeclampsia, specifically those genes directly involved in the pathogenesis. We first evaluated the consensus among 12 prioritization strategies for the early recognition of pathogenic genes related to preeclampsia. A communality analysis in the protein-protein interaction network of the previously selected genes was then performed, together with further enrichment analysis covering metabolic pathways as well as gene ontology. Microarray data were also collected and used to confirm our results or to weight the previously enriched pathways. The consensus prioritized gene list was rationally filtered to 476 genes using several criteria. The communality analysis showed an enrichment of communities connected with the VEGF-signaling pathway; this pathway was also enriched according to the microarray data. Our results point to VEGF, FLT1 and KDR as relevant pathogenic genes, as well as those connected with NO metabolism. They also revealed that the consensus strategy improves the detection and initial enrichment of pathogenic genes, at least in preeclampsia. Moreover, combining the first percent of the prioritized genes with the protein-protein interaction network, followed by communality analysis, reduces the gene space. This approach identifies well-known genes related to pathogenesis; however, genes such as HSP90, PAK2 and CD247, also included in the first 1% of the prioritized list, need to be further explored in preeclampsia pathogenesis through experimental approaches.
A computer-aided approach to nonlinear control synthesis
NASA Technical Reports Server (NTRS)
Wie, Bong; Anthony, Tobin
1988-01-01
The major objective of this project is to develop a computer-aided approach to nonlinear stability analysis and nonlinear control system design. This goal is to be attained by refining the describing function method as a synthesis tool for nonlinear control design. The interim report outlines the approach taken in this study to meet these goals, including an introduction to the INteractive Controls Analysis (INCA) program, which was instrumental in meeting the study objectives. A single-input describing function (SIDF) design methodology was developed in this study; coupled with the software constructed here, the results of this project provide a comprehensive tool for the design and integration of nonlinear control systems.
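A compact numerical sketch of the SIDF idea (not the INCA implementation): the fundamental-harmonic gain N(A) of a unit-slope saturation, computed by quadrature and checked against the classical closed form.

```python
# Illustrative only; delta and the test amplitudes are arbitrary choices.
import numpy as np

def sidf(nonlinearity, amplitude, n=4096):
    """N(A) = (1/(pi*A)) * integral over one period of f(A sin t) sin t dt,
    evaluated with the rectangle rule (exact enough for a periodic integrand)."""
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    y = nonlinearity(amplitude * np.sin(t))
    return 2.0 * np.mean(y * np.sin(t)) / amplitude

delta = 1.0                                # saturation limit
sat = lambda x: np.clip(x, -delta, delta)  # unit-slope saturation

for amp in [0.5, 1.0, 2.0, 5.0]:
    if amp <= delta:
        ref = 1.0                          # linear region: gain is unity
    else:
        r = delta / amp                    # classical closed-form SIDF
        ref = (2.0 / np.pi) * (np.arcsin(r) + r * np.sqrt(1.0 - r * r))
    print(f"A={amp}: numeric N={sidf(sat, amp):.4f}, analytic N={ref:.4f}")
```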
Scaling up to address data science challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendelberger, Joanne R.
2017-04-27
Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article explores various challenges in Data Science and highlights statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
Marketing approaches for OTC analgesics in Bulgaria
Petkova, Valentina; Valchanova, Velislava; Ibrahim, Adel; Nikolova, Irina; Benbasat, Niko; Dimitrov, Milen
2014-01-01
Marketing management includes analysis of market opportunities, selection of target markets, planning, development and implementation of marketing strategies, and monitoring and control of results. The object of the present study was to analyse the marketing approaches applied for non-steroidal anti-inflammatory drugs (NSAIDs) in Bulgaria. A SWOT (strengths, weaknesses, opportunities, and threats) analysis performed for one of the leading Bulgarian manufacturers highlighted the complex corporate strategy for stimulating NSAID sales. The study results show that the legislative framework in the country provides an opportunity to regulate the NSAID market so that incorrect marketing approaches such as unfair competition are avoided. PMID:26019521
Chetty, Mersha; Kenworthy, James J; Langham, Sue; Walker, Andrew; Dunlop, William C N
2017-02-24
Opioid dependence is a chronic condition with substantial health, economic and social costs. The study objective was to conduct a systematic review of published health-economic models of opioid agonist therapy for non-prescription opioid dependence, to review the different modelling approaches identified, and to inform future modelling studies. Literature searches were conducted in March 2015 in eight electronic databases, supplemented by hand-searching reference lists and searches on six National Health Technology Assessment Agency websites. Studies were included if they: investigated populations that were dependent on non-prescription opioids and were receiving opioid agonist or maintenance therapy; compared any pharmacological maintenance intervention with any other maintenance regimen (including placebo or no treatment); and were health-economic models of any type. A total of 18 unique models were included. These used a range of modelling approaches, including Markov models (n = 4), decision tree with Monte Carlo simulations (n = 3), decision analysis (n = 3), dynamic transmission models (n = 3), decision tree (n = 1), cohort simulation (n = 1), Bayesian (n = 1), and Monte Carlo simulations (n = 2). Time horizons ranged from 6 months to lifetime. The most common evaluation was cost-utility analysis reporting cost per quality-adjusted life-year (n = 11), followed by cost-effectiveness analysis (n = 4), budget-impact analysis/cost comparison (n = 2) and cost-benefit analysis (n = 1). Most studies took the healthcare provider's perspective. Only a few models included some wider societal costs, such as productivity loss or costs of drug-related crime, disorder and antisocial behaviour. Costs to individuals and impacts on family and social networks were not included in any model. A relatively small number of studies of varying quality were found. Strengths and weaknesses relating to model structure, inputs and approach were identified across all the studies. There was no indication of a single standard emerging as a preferred approach. Most studies omitted societal costs, an important issue since the implications of drug abuse extend widely beyond healthcare services. Nevertheless, elements from previous models could together form a framework for future economic evaluations in opioid agonist therapy including all relevant costs and outcomes. This could more adequately support decision-making and policy development for treatment of non-prescription opioid dependence.
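For orientation, a minimal three-state Markov cohort sketch in the spirit of the models reviewed; every transition probability, cost and utility below is an invented placeholder, not a value from the included studies.

```python
# States: [in treatment, relapsed/out of treatment, dead]; monthly cycles.
import numpy as np

def run(P, cost, utility, cycles=120, disc=0.035 / 12):
    """Cohort simulation returning discounted total cost and QALYs."""
    state = np.array([1.0, 0.0, 0.0])       # whole cohort starts in treatment
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + disc) ** t
        total_cost += d * (state @ cost)
        total_qaly += d * (state @ utility) / 12.0   # utilities are annual
        state = state @ P
    return total_cost, total_qaly

P_tx = np.array([[0.95, 0.04, 0.01],     # placeholder monthly transitions
                 [0.10, 0.88, 0.02],     # with opioid agonist therapy
                 [0.00, 0.00, 1.00]])
P_none = np.array([[0.00, 0.98, 0.02],   # no-treatment comparator
                   [0.00, 0.98, 0.02],
                   [0.00, 0.00, 1.00]])
cost = np.array([300.0, 150.0, 0.0])     # placeholder monthly costs
utility = np.array([0.80, 0.60, 0.0])    # placeholder annual utilities

c1, q1 = run(P_tx, cost, utility)
c0, q0 = run(P_none, cost, utility)
print(f"ICER: {(c1 - c0) / (q1 - q0):.0f} per QALY (placeholder inputs)")
```

Extending such a skeleton with crime and productivity costs would address the societal-perspective gap the review identifies.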
Schmidt; Marx; de Graaf AA; Wiechert; Sahm; Nielsen; Villadsen
1998-04-05
Conventional metabolic flux analysis uses the information gained from determination of measurable fluxes and a steady-state assumption for intracellular metabolites to calculate the metabolic fluxes in a given metabolic network. The determination of intracellular fluxes depends heavily on the correctness of the assumed stoichiometry including the presence of all reactions with a noticeable impact on the model metabolite balances. Determination of fluxes in complex metabolic networks often requires the inclusion of NADH and NADPH balances, which are subject to controversial debate. Transhydrogenation reactions that transfer reduction equivalents from NADH to NADPH or vice versa can usually not be included in the stoichiometric model, because they result in singularities in the stoichiometric matrix. However, it is the NADPH balance that, to a large extent, determines the calculated flux through the pentose phosphate pathway. Hence, wrong assumptions on the presence or activity of transhydrogenation reactions will result in wrong estimations of the intracellular flux distribution. Using 13C tracer experiments and NMR analysis, flux analysis can be performed on the basis of only well established stoichiometric equations and measurements of the labeling state of intracellular metabolites. Neither NADH/NADPH balancing nor assumptions on energy yields need to be included to determine the intracellular fluxes. Because metabolite balancing methods and the use of 13C labeling measurements are two different approaches to the determination of intracellular fluxes, both methods can be used to verify each other or to discuss the origin and significance of deviations in the results. Flux analysis based entirely on metabolite balancing and flux analysis, including labeling information, have been performed independently for a wild-type strain of Aspergillus oryzae producing alpha-amylase. Two different nitrogen sources, NH4+ and NO3-, have been used to investigate the influence of the NADPH requirements on the intracellular flux distribution. The two different approaches to the calculation of fluxes are compared and deviations in the results are discussed. Copyright 1998 John Wiley & Sons, Inc.
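A toy metabolite-balancing example (not the A. oryzae network) showing the mechanics: steady-state balances S v = 0, stacked with measured exchange fluxes, determine the unknown intracellular branch split by least squares.

```python
# Toy network: v1 ->A (uptake), v2 A->B, v3 A->C, v4 B-> (out), v5 C-> (out)
import numpy as np

S = np.array([[1, -1, -1,  0,  0],    # balance on A: v1 - v2 - v3 = 0
              [0,  1,  0, -1,  0],    # balance on B: v2 - v4 = 0
              [0,  0,  1,  0, -1]],   # balance on C: v3 - v5 = 0
             dtype=float)

measured = {0: 10.0, 3: 6.0, 4: 4.0}  # measured v1, v4, v5 (consistent)

# Stack steady-state balances with measurement equations and solve
rows, rhs = [S], [np.zeros(3)]
for j, value in measured.items():
    e = np.zeros(5)
    e[j] = 1.0
    rows.append(e[None, :])
    rhs.append(np.array([value]))

A = np.vstack(rows)
b = np.concatenate(rhs)
v, *_ = np.linalg.lstsq(A, b, rcond=None)
print("fluxes v1..v5:", v.round(3))   # expect [10, 6, 4, 6, 4]
```

The abstract's point is that cofactor balances (NADH/NADPH) added to such a system can dominate the solution, which is why 13C labeling data provide a valuable independent check.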
Sun, Xiang-Yao; Zhang, Xi-Nuo; Hai, Yong
2017-05-01
This study evaluated differences in outcome variables between percutaneous, traditional, and paraspinal posterior open approaches for traumatic thoracolumbar fractures without neurologic deficit. A systematic review of PubMed, Cochrane, and Embase was performed. In this meta-analysis, we conducted online searches of PubMed, Cochrane, and Embase using the search terms "thoracolumbar fractures", "lumbar fractures", "percutaneous", "minimally invasive", "open", "traditional", "posterior", "conventional", "pedicle screw", "sextant", and "clinical trial". The analysis was performed on individual patient data from all the studies that met the selection criteria. Clinical outcomes were expressed as risk difference for dichotomous outcomes and mean difference for continuous outcomes, with 95% confidence intervals. Heterogeneity was assessed using the χ² test and I² statistic. There were 4 randomized controlled trials and 14 observational articles included in this analysis. The percutaneous approach was associated with better ODI score, less Cobb angle correction, less Cobb angle correction loss, less postoperative VBA correction, and lower infection rate compared with the open approach. The percutaneous approach was also associated with shorter operative duration, longer intraoperative fluoroscopy, less postoperative VAS, and postoperative VBH% in comparison with the traditional open approach. No significant difference was found in Cobb angle correction, postoperative VBA, VBA correction loss, postoperative VBH%, VBH correction loss, or pedicle screw misplacement between the percutaneous and open approaches. There was no significant difference in operative duration, intraoperative fluoroscopy, postoperative VAS, or postoperative VBH% between the percutaneous and paraspinal approaches. The functional and radiological outcomes of the percutaneous approach would be better than those of the open approach in the long term. Although the trans-muscular spatium approach belongs to open fixation methods, it is strictly defined as a less invasive approach, providing less injury to the paraspinal muscles and a better reposition effect.
Multiscale modeling of brain dynamics: from single neurons and networks to mathematical tools.
Siettos, Constantinos; Starke, Jens
2016-09-01
The extreme complexity of the brain naturally requires mathematical modeling approaches on a large variety of scales; the spectrum ranges from single neuron dynamics over the behavior of groups of neurons to neuronal network activity. Thus, the connection between the microscopic scale (single neuron activity) to macroscopic behavior (emergent behavior of the collective dynamics) and vice versa is a key to understand the brain in its complexity. In this work, we attempt a review of a wide range of approaches, ranging from the modeling of single neuron dynamics to machine learning. The models include biophysical as well as data-driven phenomenological models. The discussed models include Hodgkin-Huxley, FitzHugh-Nagumo, coupled oscillators (Kuramoto oscillators, Rössler oscillators, and the Hindmarsh-Rose neuron), Integrate and Fire, networks of neurons, and neural field equations. In addition to the mathematical models, important mathematical methods in multiscale modeling and reconstruction of the causal connectivity are sketched. The methods include linear and nonlinear tools from statistics, data analysis, and time series analysis up to differential equations, dynamical systems, and bifurcation theory, including Granger causal connectivity analysis, phase synchronization connectivity analysis, principal component analysis (PCA), independent component analysis (ICA), and manifold learning algorithms such as ISOMAP, and diffusion maps and equation-free techniques. WIREs Syst Biol Med 2016, 8:438-458. doi: 10.1002/wsbm.1348 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
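As a small concrete sample from this model zoo, here is an Euler integration of the FitzHugh-Nagumo neuron with standard textbook parameters.

```python
# Standard textbook parameters; Euler stepping is crude but adequate here.
import numpy as np

a, b, eps, I = 0.7, 0.8, 0.08, 0.5   # classic FitzHugh-Nagumo constants
dt, steps = 0.05, 4000
v, w = -1.0, 1.0                      # membrane potential, recovery variable

trace = np.empty(steps)
for i in range(steps):
    dv = v - v ** 3 / 3.0 - w + I     # fast voltage dynamics
    dw = eps * (v + a - b * w)        # slow recovery dynamics
    v, w = v + dt * dv, w + dt * dw
    trace[i] = v

# Count spikes as upward crossings of v = 1 to confirm tonic firing
spikes = int(np.sum((trace[1:] > 1.0) & (trace[:-1] <= 1.0)))
print(f"spikes over {steps * dt:.0f} time units: {spikes}")
```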
A concept analysis of abductive reasoning.
Mirza, Noeman A; Akhtar-Danesh, Noori; Noesgaard, Charlotte; Martin, Lynn; Staples, Eric
2014-09-01
To describe an analysis of the concept of abductive reasoning. In the discipline of nursing, abductive reasoning has received only philosophical attention and remains a vague concept. In addition to deductive and inductive reasoning, abductive reasoning is not recognized even in prominent nursing knowledge development literature. Therefore, what abductive reasoning is and how it can inform nursing practice and education was explored. Concept analysis. Combinations of specific keywords were searched in Web of Science, CINAHL, PsychINFO, PubMed, Medline and EMBASE. The analysis was conducted in June 2012 and only literature published before this period was included. No time limits were set. Rodgers' evolutionary method for conducting concept analysis was used. Twelve records were included in the analysis. The most common surrogate term was retroduction, whereas related terms included intuition and pattern and similarity recognition. Antecedents consisted of a complex, puzzling situation and a clinician with creativity, experience and knowledge. Consequences included the formation of broad hypotheses that enhance understanding of care situations. Overall, abductive reasoning was described as the process of hypothesis or theory generation and evaluation. It was also viewed as inference to the best explanation. As a new approach, abductive reasoning could enhance the reasoning abilities of novice clinicians. Not only can it incorporate various ways of knowing, but its holistic approach to learning also appears promising in problem-based learning. As nursing literature on abductive reasoning is predominantly philosophical, practical consequences of abductive reasoning warrant further research. © 2014 John Wiley & Sons Ltd.
Kinsella, Elizabeth Anne; Bidinosti, Susan
2016-05-01
This paper reports on a study of an arts-informed approach to ethics education in a health professions education context. The purpose of this study was to investigate students' reported learning experiences as a result of engagement with an arts-informed project in a health professions' ethics course. A hermeneutic phenomenological methodological approach was adopted for the study. The data were collected over 5 years and involved analysis of 234 occupational therapy students' written reflections on learning. Phenomenological methods were used. Five key themes were identified with respect to students' reported learning, including: becoming aware of values, (re)discovering creativity, coming to value reflection in professional life, deepening self-awareness, and developing capacities to imagine future practices. There appear to be a number of unique ways in which arts-informed approaches can contribute to health professions education, including: activating imaginative engagement, fostering interpretive capacity, inspiring transformative understandings, offering new ways of knowing, deepening reflection, and heightening consciousness, while also enriching the inner life of practitioners. Innovative approaches are being used to introduce arts-informed practices in health professions curricula. The findings point to the promise of arts-informed approaches for advancing health sciences education.
NASA Technical Reports Server (NTRS)
Oden, J. Tinsley; Fly, Gerald W.; Mahadevan, L.
1987-01-01
A hybrid stress finite element method is developed for accurate stress and vibration analysis of problems in linear anisotropic elasticity. A modified form of the Hellinger-Reissner principle is formulated for dynamic analysis, and an algorithm for the determination of the anisotropic elastic and compliance constants from experimental data is developed. These schemes were implemented in a finite element program for static and dynamic analysis of linear anisotropic two dimensional elasticity problems. Specific numerical examples are considered to verify the accuracy of the hybrid stress approach and compare it with that of the standard displacement method, especially for highly anisotropic materials. It is shown that the hybrid stress approach gives much better results than the displacement method. Preliminary work on extensions of this method to three dimensional elasticity is discussed, and the stress shape functions necessary for this extension are included.
Learning representative features for facial images based on a modified principal component analysis
NASA Astrophysics Data System (ADS)
Averkin, Anton; Potapov, Alexey
2013-05-01
The paper is devoted to facial image analysis and particularly deals with the problem of automatic evaluation of the attractiveness of human faces. We propose a new approach for automatic construction of a feature space based on a modified principal component analysis. Input data for the algorithm are learning sets of facial images rated by a single person. The proposed approach allows one to extract features of an individual's subjective perception of facial beauty and to predict attractiveness values for new facial images that were not included in the learning data set. The Pearson correlation coefficient between the values predicted by our method for new facial images and the personal attractiveness ratings equals 0.89. This means that the new approach is promising and can be used for predicting subjective facial attractiveness values in real facial image analysis systems.
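A schematic stand-in for that pipeline (synthetic data and ordinary PCA rather than the authors' modified variant): project face vectors onto principal components, fit a linear predictor of a single rater's scores, and compute the Pearson correlation on held-out faces.

```python
# Synthetic "faces" with low-rank structure; the rating tracks one factor.
import numpy as np

rng = np.random.default_rng(3)
n, d, k = 200, 400, 10                          # faces, pixels, components
latent = rng.normal(size=(n, k))                # hidden face factors
X = latent @ rng.normal(size=(k, d)) + 0.5 * rng.normal(size=(n, d))
y = latent[:, 0] + 0.3 * rng.normal(size=n)     # one rater's scores

train, test = slice(0, 150), slice(150, None)

# PCA fitted on the training faces only
mu = X[train].mean(axis=0)
_, _, Vt = np.linalg.svd(X[train] - mu, full_matrices=False)
project = lambda A: (A - mu) @ Vt[:k].T

# Least-squares predictor of the ratings in PCA space
Z = np.column_stack([np.ones(150), project(X[train])])
beta, *_ = np.linalg.lstsq(Z, y[train], rcond=None)

Zt = np.column_stack([np.ones(n - 150), project(X[test])])
r = np.corrcoef(Zt @ beta, y[test])[0, 1]
print(f"Pearson r on held-out faces: {r:.2f}")
```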
The balanced scorecard: an incremental approach model to health care management.
Pineno, Charles J
2002-01-01
The balanced scorecard represents a technique used in strategic management to translate an organization's mission and strategy into a comprehensive set of performance measures that provide the framework for implementation of strategic management. This article develops an incremental approach for decision making by formulating a specific balanced scorecard model with an index of nonfinancial as well as financial measures. The incremental approach to costs, including profit contribution analysis and probabilities, allows decision makers to assess, for example, how their desire to meet different health care needs will cause changes in service design. This incremental approach to the balanced scorecard may prove useful in evaluating causal relationships between the different objective and subjective measures to be included within the balanced scorecard.
Coco, Laura; Colina, Sonia; Atcherson, Samuel R.
2017-01-01
Purpose The purpose of this study was to examine the readability level of the Spanish versions of several audiology- and otolaryngology-related patient-reported outcome measures (PROMs) and include a readability analysis of 2 translation approaches when available—the published version and a “functionalist” version—using a team-based collaborative approach including community members. Method Readability levels were calculated using the Fry Graph adapted for Spanish, as well as the Fernandez-Huerta and the Spaulding formulae for several commonly used audiology- and otolaryngology-related PROMs. Results Readability calculations agreed with previous studies analyzing audiology-related PROMs in English and demonstrated many Spanish-language PROMs were beyond the 5th grade reading level suggested for health-related materials written for the average population. In addition, the functionalist versions of the PROMs yielded lower grade-level (improved) readability levels than the published versions. Conclusion Our results suggest many of the Spanish-language PROMs evaluated here are beyond the recommended readability levels and may be influenced by the approach to translation. Moreover, improved readability may be possible using a functionalist approach to translation. Future analysis of the suitability of outcome measures and the quality of their translations should move beyond readability and include an evaluation of the individual's comprehension of the written text. PMID:28892821
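As a rough sketch of one formula named in the study, the Fernandez-Huerta index for Spanish is usually cited as 206.84 - 0.60 P - 1.02 F, with P the syllables and F the sentences per 100 words; the syllable counter below is a crude vowel-group heuristic, not a full Spanish syllabifier, and the sample sentence is invented.

```python
# Crude heuristics; real assessments should use a proper Spanish syllabifier.
import re

def count_syllables(word):
    """Approximate syllables as runs of (accented) Spanish vowels."""
    return max(1, len(re.findall(r"[aeiouáéíóúü]+", word.lower())))

def fernandez_huerta(text):
    words = re.findall(r"[a-záéíóúüñ]+", text.lower())
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    p = 100.0 * sum(count_syllables(w) for w in words) / len(words)
    f = 100.0 * sentences / len(words)
    return 206.84 - 0.60 * p - 1.02 * f     # higher score = easier text

sample = "¿Tiene usted dificultad para oír? Conteste cada pregunta."
print(f"Fernandez-Huerta score: {fernandez_huerta(sample):.1f}")
```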
Zoffmann, Vibeke; Hörnsten, Åsa; Storbækken, Solveig; Graue, Marit; Rasmussen, Bodil; Wahl, Astrid; Kirkevold, Marit
2016-03-01
Person-centred care [PCC] can engage people in living well with a chronic condition. However, translating PCC into practice is challenging. We aimed to compare the translational potentials of three approaches: motivational interviewing [MI], illness integration support [IIS] and guided self-determination [GSD]. Comparative analysis included eight components: (1) philosophical origin; (2) development in original clinical setting; (3) theoretical underpinnings; (4) overarching goal and supportive processes; (5) general principles, strategies or tools for engaging people; (6) health care professionals' background and training; (7) fidelity assessment; (8) reported effects. Although all approaches promoted autonomous motivation, they differed in other ways. Their original settings explain why IIS and GSD strive for life-illness integration, whereas MI focuses on managing ambivalence. IIS and GSD were based on grounded theories, and MI was intuitively developed. All apply processes and strategies to advance professionals' communication skills and engagement; GSD includes context-specific reflection sheets. All offer training programs; MI and GSD include fidelity tools. Each approach has a primary application: MI, when ambivalence threatens positive change; IIS, when integrating newly diagnosed chronic conditions; and GSD, when problem solving is difficult or deadlocked. Professionals must critically consider the context in their choice of approach. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
The Dispositions for Culturally Responsive Pedagogy Scale
ERIC Educational Resources Information Center
Whitaker, Manya C.; Valtierra, Kristina Marie
2018-01-01
Purpose: The purpose of this study is to develop and validate the dispositions for culturally responsive pedagogy scale (DCRPS). Design/methodology/approach: Scale development consisted of a six-step process including item development, expert review, exploratory factor analysis, factor interpretation, confirmatory factor analysis and convergent…
Structuring Effective Student Teams.
ERIC Educational Resources Information Center
Dickson, Ellen L.
1997-01-01
Experience with student teams working on policy analysis projects indicates the need for faculty supervision of teams in the process of addressing complex issues. The problem-solving approach adopted in one policy analysis course is described, including assignments and tasks, issues and sponsors, team dynamics, conflict management, and the…
Management of reliability and maintainability; a disciplined approach to fleet readiness
NASA Technical Reports Server (NTRS)
Willoughby, W. J., Jr.
1981-01-01
Material acquisition fundamentals were reviewed and include: mission profile definition, stress analysis, derating criteria, circuit reliability, failure modes, and worst case analysis. Military system reliability was examined with emphasis on the sparing of equipment. The Navy's organizational strategy for 1980 is presented.
Stress Analysis of Composite Cylindrical Shells with an Elliptical Cutout
NASA Technical Reports Server (NTRS)
Oterkus, E.; Madenci, E.; Nemeth, M. P.
2007-01-01
A special-purpose, semi-analytical solution method for determining the stress and deformation fields in a thin laminated-composite cylindrical shell with an elliptical cutout is presented. The analysis includes the effects of cutout size, shape, and orientation; non-uniform wall thickness; oval-cross-section eccentricity; and loading conditions. The loading conditions include uniform tension, uniform torsion, and pure bending. The analysis approach is based on the principle of stationary potential energy and uses Lagrange multipliers to relax the kinematic admissibility requirements on the displacement representations through the use of idealized elastic edge restraints. Specifying appropriate stiffness values for the elastic extensional and rotational edge restraints (springs) allows the imposition of the kinematic boundary conditions in an indirect manner, which enables the use of a broader set of functions for representing the displacement fields. Selected results of parametric studies are presented for several geometric parameters that demonstrate that the analysis approach is a powerful means for developing design criteria for laminated-composite shells.
Array magnetics modal analysis for the DIII-D tokamak based on localized time-series modelling
Olofsson, K. Erik J.; Hanson, Jeremy M.; Shiraki, Daisuke; ...
2014-07-14
Here, time-series analysis of magnetics data in tokamaks is typically done using block-based fast Fourier transform methods. This work presents the development and deployment of a new set of algorithms for magnetic probe array analysis. The method is based on an estimation technique known as stochastic subspace identification (SSI). Compared with the standard coherence approach or the direct singular value decomposition approach, the new technique exhibits several beneficial properties. For example, the SSI method does not require that frequencies be orthogonal with respect to the timeframe used in the analysis. Frequencies are obtained directly as parameters of localized time-series models. The parameters are extracted by solving small-scale eigenvalue problems. Applications include maximum-likelihood regularized eigenmode pattern estimation; detection of neoclassical tearing modes, including locked mode precursors; automatic clustering of modes; and magnetics-pattern characterization of sawtooth pre- and postcursors, edge harmonic oscillations and fishbones.
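A compact, ERA-flavoured illustration of the subspace family of techniques (not the DIII-D code): build a Hankel matrix from a short synthetic probe signal, truncate its SVD, and read modal frequencies from the eigenvalues of the shift matrix.

```python
# Synthetic probe signal with two "modes"; all numbers are illustrative.
import numpy as np

dt = 1e-4                                  # sample time, s
t = np.arange(2000) * dt
rng = np.random.default_rng(4)
y = (np.sin(2 * np.pi * 1500 * t) + 0.5 * np.sin(2 * np.pi * 3200 * t)
     + 0.1 * rng.normal(size=t.size))

rows, order = 40, 4                        # Hankel depth; order = 2 x modes
H = np.lib.stride_tricks.sliding_window_view(y, rows)[: y.size - rows].T

U, s, _ = np.linalg.svd(H, full_matrices=False)
gamma = U[:, :order] * np.sqrt(s[:order])  # truncated observability-like factor

# Shift invariance: gamma[1:] ~ gamma[:-1] @ A; eigenvalues of A give modes
A, *_ = np.linalg.lstsq(gamma[:-1], gamma[1:], rcond=None)
freqs = np.abs(np.angle(np.linalg.eigvals(A))) / (2 * np.pi * dt)
print(sorted(set(np.round(freqs, -1))))    # expect ~1500 and ~3200 Hz
```

Unlike a block FFT, the frequencies appear directly as model parameters, which is the property the abstract emphasizes.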
Bioinformatics/biostatistics: microarray analysis.
Eichler, Gabriel S
2012-01-01
The quantity and complexity of the molecular-level data generated in both research and clinical settings require the use of sophisticated, powerful computational interpretation techniques. It is for this reason that bioinformatic analysis of complex molecular profiling data has become a fundamental technology in the development of personalized medicine. This chapter provides a high-level overview of the field of bioinformatics and outlines several classic bioinformatic approaches. The highlighted approaches can be aptly applied to nearly any sort of high-dimensional genomic, proteomic, or metabolomic experiment. Reviewed technologies in this chapter include traditional clustering analysis, the Gene Expression Dynamics Inspector (GEDI), GoMiner, Gene Set Enrichment Analysis (GSEA), and the Learner of Functional Enrichment (LeFE).
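A brief sketch of the first of those steps, hierarchical clustering of a synthetic expression matrix with a correlation distance (the gene modules below are simulated, not real data):

```python
# 60 synthetic genes x 10 samples: two co-expressed modules plus noise.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(5)
module1, module2 = rng.normal(size=10), rng.normal(size=10)
genes = np.vstack([module1 + 0.3 * rng.normal(size=(30, 10)),
                   module2 + 0.3 * rng.normal(size=(30, 10))])

Z = linkage(genes, method="average", metric="correlation")
labels = fcluster(Z, t=2, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])   # expect ~[30, 30]
```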
Space transfer vehicle concepts and requirements study. Volume 3, book 1: Program cost estimates
NASA Technical Reports Server (NTRS)
Peffley, Al F.
1991-01-01
The Space Transfer Vehicle (STV) Concepts and Requirements Study cost estimate and program planning analysis is presented. The cost estimating technique used to support STV system, subsystem, and component cost analysis is a mixture of parametric cost estimating and selective cost analogy approaches. The parametric cost analysis is aimed at developing cost-effective aerobrake, crew module, tank module, and lander designs with the parametric cost estimates data. This is accomplished using cost as a design parameter in an iterative process with conceptual design input information. The parametric estimating approach segregates costs by major program life cycle phase (development, production, integration, and launch support). These phases are further broken out into major hardware subsystems, software functions, and tasks according to the STV preliminary program work breakdown structure (WBS). The WBS is defined to a low enough level of detail by the study team to highlight STV system cost drivers. This level of cost visibility provided the basis for cost sensitivity analysis against various design approaches aimed at achieving a cost-effective design. The cost approach, methodology, and rationale are described. A chronological record of the interim review material relating to cost analysis is included along with a brief summary of the study contract tasks accomplished during that period of review and the key conclusions or observations identified that relate to STV program cost estimates. The STV life cycle costs are estimated on the proprietary parametric cost model (PCM) with inputs organized by a project WBS. Preliminary life cycle schedules are also included.
Different perspectives on economic base.
Lisa K. Crone; Richard W. Haynes; Nicholas E. Reyna
1999-01-01
Two general approaches for measuring the economic base are discussed. Each method is used to define the economic base for each of the counties included in the Interior Columbia Basin Ecosystem Management Project area. A more detailed look at four selected counties results in similar findings from different approaches. Limitations of economic base analysis also are...
An Empirical Comparison of Heterogeneity Variance Estimators in 12,894 Meta-Analyses
ERIC Educational Resources Information Center
Langan, Dean; Higgins, Julian P. T.; Simmonds, Mark
2015-01-01
Heterogeneity in meta-analysis is most commonly estimated using a moment-based approach described by DerSimonian and Laird. However, this method has been shown to produce biased estimates. Alternative methods to estimate heterogeneity include the restricted maximum likelihood approach and those proposed by Paule and Mandel, Sidik and Jonkman, and…
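Two of those estimators are small enough to sketch directly; the effect sizes and variances below are invented.

```python
# Invented study effects and within-study variances.
import numpy as np

y = np.array([0.10, 0.30, 0.35, 0.65, 0.45])   # study effect estimates
v = np.array([0.04, 0.03, 0.05, 0.02, 0.03])   # within-study variances
k = y.size

def q_stat(tau2):
    """Generalised Q at a candidate between-study variance tau^2."""
    w = 1.0 / (v + tau2)
    mu = np.sum(w * y) / np.sum(w)
    return np.sum(w * (y - mu) ** 2)

# DerSimonian-Laird: closed-form moment estimator
w = 1.0 / v
c = w.sum() - (w ** 2).sum() / w.sum()
tau2_dl = max(0.0, (q_stat(0.0) - (k - 1)) / c)

# Paule-Mandel: bisect for the tau^2 that makes Q equal its expectation k-1
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if q_stat(mid) > k - 1 else (lo, mid)

print(f"DL tau^2 = {tau2_dl:.4f}, PM tau^2 = {0.5 * (lo + hi):.4f}")
```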
SILVAH: managers and scientists working together to improve research and management
Susan L Stout; Patrick H. Brose
2014-01-01
SILVAH is a systematic approach to silvicultural prescription development based on inventory and analysis of stand data for Allegheny hardwood, northern hardwood, and mixed oak forests. SILVAH includes annual training sessions and decision support software, and it ensures a consistent, complete, and objective approach to prescriptions. SILVAH has created a community of...
An integrated sampling and analysis approach for improved biodiversity monitoring
DeWan, Amielle A.; Zipkin, Elise
2010-01-01
Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.
Forghani, Masoomeh; Ghanbari Hashem Abadi, Bahram Ali
2016-06-01
The aim of the present study was to evaluate the effect of group psychotherapy with a transactional analysis (TA) approach on emotional intelligence (EI), executive functions and substance dependency among drug addicts at rehabilitation centers in Mashhad city, Iran, in 2013. In this quasi-experimental study with a pretest-posttest, case-control design, 30 patients were selected from a rehabilitation center and randomly divided into two groups. The case group received 12 sessions of group psychotherapy with the transactional analysis approach. The effects of the independent variable (group psychotherapy with the TA approach) on EI, executive function and drug dependency were then assessed. The Bar-On test was used for EI, the Stroop test for measuring executive function, and morphine, meth-amphetamine and B2 tests for evaluating drug dependency. Data were analyzed using multifactorial covariance analysis, Levene's test, MANCOVA, Student's t-test and the Pearson correlation coefficient with SPSS software. Our results showed that group psychotherapy with the TA approach was effective in improving EI and executive functions and in decreasing drug dependency (P < 0.05). The results of this study showed that group psychotherapy with the TA approach has significant effects on addicts and prevents addiction recurrence by improving the coping capabilities and some mental functions of the subjects. However, there are some limitations regarding this study, including follow-up duration and sample size.
State space approach to mixed boundary value problems.
NASA Technical Reports Server (NTRS)
Chen, C. F.; Chen, M. M.
1973-01-01
A state-space procedure for the formulation and solution of mixed boundary value problems is established. This procedure is a natural extension of the method used in initial value problems; however, certain special theorems and rules must be developed. The scope of the applications of the approach includes beam, arch, and axisymmetric shell problems in structural analysis, boundary layer problems in fluid mechanics, and eigenvalue problems for deformable bodies. Many classical methods in these fields developed by Holzer, Prohl, Myklestad, Thomson, Love-Meissner, and others can be either simplified or unified under new light shed by the state-variable approach. A beam problem is included as an illustration.
Lorenzon, Laura; La Torre, Marco; Ziparo, Vincenzo; Montebelli, Francesco; Mercantini, Paolo; Balducci, Genoveffa; Ferri, Mario
2014-04-07
To report a meta-analysis of the studies that compared the laparoscopic with the open approach for colon cancer resection. Forty-seven manuscripts were reviewed, 33 of which were employed for meta-analysis according to the PRISMA guidelines. The results were differentiated according to the study design (prospective randomized trials vs case-control series) and according to the tumor's location. Outcome measures included: (1) short-term results (operating times, blood losses, bowel function recovery, post-operative pain, return to oral intake, complications and hospital stay); (2) oncological adequateness (number of nodes harvested in the surgical specimens); (3) long-term results (including survival rates and incidence of incisional hernias); and (4) costs. Meta-analysis of trials provided evidence in support of the laparoscopic procedures for several short-term outcomes, including: lower blood loss, earlier recovery of bowel function, earlier return to oral intake, shorter hospital stay and lower morbidity rate. Conversely, the operating time was confirmed to be shorter in open surgery. The same trend was reported when investigating case-control series and cancers by site, even though there are some concerns regarding the power of the studies in this latter field due to the small number of trials and the small samples of patients enrolled. The two approaches were comparable regarding the mean number of nodes harvested and long-term results, even though these variables were documented in the literature review but were not computable for meta-analysis. The analysis of costs documented lower costs for open surgery; however, just a few studies investigated the incidence of post-operative hernias. Laparoscopy is superior for the majority of short-term results. Future studies should better differentiate these approaches on the basis of tumor location and post-operative hernias.
Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions
2018-03-20
USAARL Report No. 2018-08: Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions. By Kathryn A... (Contents: Statistical Analysis Approach; Results.) Introduction: The success of unmanned aerial systems (UAS) operations relies upon a variety of factors, including, but not limited to
Space shuttle navigation analysis. Volume 2: Baseline system navigation
NASA Technical Reports Server (NTRS)
Jones, H. L.; Luders, G.; Matchett, G. A.; Rains, R. G.
1980-01-01
Studies related to the baseline navigation system for the orbiter are presented. The baseline navigation system studies include a covariance analysis of the Inertial Measurement Unit calibration and alignment procedures, postflight IMU error recovery for the approach and landing phases, on-orbit calibration of IMU instrument biases, and a covariance analysis of entry and prelaunch navigation system performance.
Marital dissolution: an economic analysis.
Hunter, K A
1984-01-01
A longitudinal analysis of factors affecting marital dissolution in the United States is presented using data from the Coleman-Rossi Retrospective Life History. Factors considered include labor force participation of both spouses, wage growth, size of family unit, age at marriage, and educational status. The study is based on the economic analysis approach developed by Gary S. Becker and others.
A specialized rehabilitation approach improves mobility in children with osteogenesis imperfecta.
Hoyer-Kuhn, H; Semler, O; Stark, C; Struebing, N; Goebel, O; Schoenau, E
2014-12-01
Osteogenesis imperfecta (OI) is a rare disease leading to recurrent fractures, ligamentous hyperlaxity, short stature and muscular weakness. Physiotherapy is one important treatment approach. The objective of our analysis was to evaluate the effect of a new physiotherapy approach, including side-alternating whole body vibration, on motor function in children with OI. In a retrospective analysis, data from 53 children were analyzed. The 12-month approach included 6 months of side-alternating whole body vibration training, concomitant physiotherapy, resistance training and treadmill training, and 6 months of follow-up. The primary outcome parameter was the Gross Motor Function Measure after 12 months (M12). 53 children (male: 32; age (mean±SEM): 9.1±0.61, range 2.54-24.81 years) participated in the treatment approach. A significant increase in motor function (GMFM-66 score 55.47±2.45 to 58.67±2.83; p=0.001) and walking distance (47.04±6.52 m to 63.36±8.25 m; p<0.01) between M0 and M12 was seen. Total body less head bone mineral density increased significantly at M12 (p=0.0189). In the cohort of OI children that participated in the specialized treatment approach, improvements in motor function were observed. This program should therefore be considered as an additional therapeutic approach for children with severe OI.
Bauer, Sebastian; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Rosenow, Felix
2017-11-01
Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics, and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. This Part II includes the experimental and translational approaches and a discussion of the future perspectives, while the diagnostic methods, EEG network analysis, biomarkers, and personalized treatment approaches were addressed in Part I [1]. Copyright © 2017. Published by Elsevier Inc.
A study of structural concepts for ultralightweight spacecraft
NASA Technical Reports Server (NTRS)
Miller, R. K.; Knapp, K.; Hedgepeth, J. M.
1984-01-01
Structural concepts for ultralightweight spacecraft were studied. Concepts for ultralightweight space structures were identified and the validity of their potential application in advanced spacecraft was assessed. The following topics were investigated: (1) membrane wrinkling under pretensioning; (2) load-carrying capability of pressurized tubes; (3) equilibrium of a precompressed rim; (4) design of an inflated reflector spacecraft; (5) general instability of a rim; and (6) structural analysis of a pressurized isotensoid column. The design approaches for a paraboloidal reflector spacecraft included a spin-stiffened design, both inflated and truss central columns, and both deep-truss and rim-stiffened geodesic designs. The spinning spacecraft analysis is included, and the two truss designs are covered. The performances of four different approaches to the structural design of a paraboloidal reflector spacecraft are compared. The spinning and inflated configurations result in very low total masses but raise some concerns about their performance due to unresolved questions about dynamic stability and lifetime, respectively.
Van Dessel, E; Fierens, K; Pattyn, P; Van Nieuwenhove, Y; Berrevoet, F; Troisi, R; Ceelen, W
2009-01-01
Approximately 5%-20% of colorectal cancer (CRC) patients present with synchronous potentially resectable liver metastatic disease. Preclinical and clinical studies suggest a benefit of the 'liver first' approach, i.e. resection of the liver metastasis followed by resection of the primary tumour. A formal decision analysis may support a rational choice between several therapy options. Survival and morbidity data were retrieved from relevant clinical studies identified by a Web of Science search. Data were entered into decision analysis software (TreeAge Pro 2009, Williamstown, MA, USA). Transition probabilities, including the risk of death from complications or disease progression associated with individual therapy options, were entered into the model. Sensitivity analysis was performed to evaluate the model's validity under a variety of assumptions. The result of the decision analysis confirms the superiority of the 'liver first' approach. Sensitivity analysis demonstrated that this assumption is valid on condition that the mortality associated with performing hepatectomy first is < 4.5%, and that the mortality of colectomy performed after hepatectomy is < 3.2%. The results of this decision analysis suggest that, in patients with synchronous resectable colorectal liver metastases, the 'liver first' approach is to be preferred. Randomized trials will be needed to confirm the results of this simulation based outcome.
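An illustrative re-creation of the decision-analytic comparison with invented inputs (the paper's own thresholds were 4.5% and 3.2%): expected survival under each strategy, scanned over the hepatectomy-first mortality in a one-way sensitivity analysis.

```python
# All probabilities are invented placeholders, not the paper's model inputs.
import numpy as np

def liver_first(p_hep_death, p_progress=0.20, p_col_death=0.02):
    """Survive hepatectomy, avoid progression, then survive colectomy."""
    return (1 - p_hep_death) * (1 - p_progress) * (1 - p_col_death)

def primary_first(p_col_death=0.02, p_progress=0.25, p_hep_death=0.03):
    """Survive colectomy, disease stays resectable, then survive hepatectomy."""
    return (1 - p_col_death) * (1 - p_progress) * (1 - p_hep_death)

baseline = primary_first()
for p in np.arange(0.0, 0.15, 0.01):       # one-way sensitivity analysis
    verdict = "liver first preferred" if liver_first(p) > baseline else "-"
    print(f"hepatectomy-first mortality {p:.2f}: {verdict}")
```

Scanning the scalar parameter until the preferred strategy flips is exactly how the mortality thresholds reported in the abstract arise.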
Ocean wavenumber estimation from wave-resolving time series imagery
Plant, N.G.; Holland, K.T.; Haller, M.C.
2008-01-01
We review several approaches that have been used to estimate ocean surface gravity wavenumbers from wave-resolving remotely sensed image sequences. Two fundamentally different approaches that utilize these data exist. A power spectral density approach identifies wavenumbers where image intensity variance is maximized. Alternatively, a cross-spectral correlation approach identifies wavenumbers where intensity coherence is maximized. We develop a solution to the latter approach based on a tomographic analysis that utilizes a nonlinear inverse method. The solution is tolerant to noise and other forms of sampling deficiency and can be applied to arbitrary sampling patterns, as well as to full-frame imagery. The solution includes error predictions that can be used for data retrieval quality control and for evaluating sample designs. A quantitative analysis of the intrinsic resolution of the method indicates that cross-spectral correlation fitting improves resolution by a factor of about ten compared with the power spectral density fitting approach. The resolution analysis also provides a rule of thumb for nearshore bathymetry retrievals: short-scale cross-shore patterns may be resolved if they are about ten times longer than the average water depth over the pattern. This guidance can be applied to sample design to constrain both the sensor array (image resolution) and the analysis array (tomographic resolution). © 2008 IEEE.
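As a toy illustration of the cross-spectral idea (not the paper's tomographic inverse method), the phase of the cross-spectrum between two pixel time series separated by a known distance encodes the wavenumber at each frequency. A minimal Python sketch with synthetic monochromatic data; the sampling rate, wave frequency, wavenumber, and pixel separation are all assumed values:

```python
import numpy as np

# Synthetic intensity time series at two cross-shore pixels (assumed values).
fs, f0, k_true, dx = 2.0, 0.1, 0.05, 20.0    # Hz, Hz, rad/m, m
t = np.arange(512) / fs
s1 = np.cos(2 * np.pi * f0 * t)
s2 = np.cos(2 * np.pi * f0 * t - k_true * dx)  # same wave, phase-lagged

# Cross-spectrum: its phase at the wave frequency equals k * dx (mod 2*pi).
S = np.fft.rfft(s1) * np.conj(np.fft.rfft(s2))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
i0 = np.argmin(np.abs(freqs - f0))
k_est = np.angle(S[i0]) / dx                   # recovers ~0.05 rad/m
```

Real imagery requires averaging cross-spectra over many noisy pixel pairs and solving a regularized inverse problem for a spatially varying wavenumber field, which is what the tomographic nonlinear inverse method contributes.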
Winsor, Geoffrey L; Griffiths, Emma J; Lo, Raymond; Dhillon, Bhavjinder K; Shay, Julie A; Brinkman, Fiona S L
2016-01-04
The Pseudomonas Genome Database (http://www.pseudomonas.com) is well known for the application of community-based annotation approaches for producing a high-quality Pseudomonas aeruginosa PAO1 genome annotation, and facilitating whole-genome comparative analyses with other Pseudomonas strains. To aid analysis of potentially thousands of complete and draft genome assemblies, this database and analysis platform was upgraded to integrate curated genome annotations and isolate metadata with enhanced tools for larger scale comparative analysis and visualization. Manually curated gene annotations are supplemented with improved computational analyses that help identify putative drug targets and vaccine candidates or assist with evolutionary studies by identifying orthologs, pathogen-associated genes and genomic islands. The database schema has been updated to integrate isolate metadata that will facilitate more powerful analysis of genomes across datasets in the future. We continue to place an emphasis on providing high-quality updates to gene annotations through regular review of the scientific literature and using community-based approaches including a major new Pseudomonas community initiative for the assignment of high-quality gene ontology terms to genes. As we further expand from thousands of genomes, we plan to provide enhancements that will aid data visualization and analysis arising from whole-genome comparative studies including more pan-genome and population-based approaches. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Compulsory Birth Control and Fertility Measures in India.
ERIC Educational Resources Information Center
Halli, S. S.
1983-01-01
Discussion of possible applications of the microsimulation approach to analysis of population policy proposes compulsory sterilization policy for all of India. Topics covered include India's population problem, methods for generating a distribution of couples to be sterilized, model validation, data utilized, data analysis, program limitations,…
Streamline Your Project: A Lifecycle Model.
ERIC Educational Resources Information Center
Viren, John
2000-01-01
Discusses one approach to project organization providing a baseline lifecycle model for multimedia/CBT development. This variation of the standard four-phase model of Analysis, Design, Development, and Implementation includes a Pre-Analysis phase, called Definition, and a Post-Implementation phase, known as Maintenance. Each phase is described.…
ERIC Educational Resources Information Center
Mysliwiec, Tami H.
2003-01-01
Incorporates history and genetics to explain how genetic traits are passed on to the next generation by focusing on methemoglobinemia, a rare genetic disease, and discusses how oxygen is carried by hemoglobin. Includes individual pedigree analysis and class pedigree analysis. (YDS)
NASA Technical Reports Server (NTRS)
Heffley, R. K.; Stapleford, R. L.; Rumold, R. C.; Lehman, J. M.; Scott, B. C.; Hynes, C. S.
1974-01-01
A simulator study of STOL airworthiness was conducted using a model of an augmentor wing transport. The approach, flare and landing, go-around, and takeoff phases of flight were investigated. The simulation and the data obtained are described. These data include performance measures, pilot commentary, and pilot ratings. A pilot/vehicle analysis of glide slope tracking and of the flare maneuver is included.
A 3S Risk (3SR) Assessment Approach for Nuclear Power: Safety, Security, and Safeguards.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forrest, Robert; Reinhardt, Jason Christian; Wheeler, Timothy A.
Safety-focused risk analysis and assessment approaches struggle to adequately include malicious, deliberate acts against the nuclear power industry's fissile and waste material, infrastructure, and facilities. Further, existing methods do not adequately address non-proliferation issues. Treating safety, security, and safeguards concerns independently is inefficient because, at best, it may not take explicit advantage of measures that provide benefits against multiple risk domains, and, at worst, it may lead to implementations that increase overall risk due to incompatibilities. What is needed is an integrated safety, security, and safeguards risk (or "3SR") framework for describing and assessing nuclear power risks that can enable direct trade-offs and interactions in order to inform risk management processes, a potential paradigm shift in risk analysis and management. These proceedings of the Sandia ePRA Workshop (held August 22-23, 2017) are an attempt to begin the discussions and deliberations to extend and augment safety-focused risk assessment approaches to include security concerns and begin moving towards a 3S Risk approach. Safeguards concerns were not included in this initial workshop and are left to future efforts. This workshop focused on four themes in order to begin building out the safety and security portions of the 3S Risk toolkit: (1) historical approaches and tools; (2) current challenges; (3) modern approaches; and (4) paths forward and next steps. This report is organized along the four areas described above and concludes with a summary of key points.
Kostick, Kristin M; Schensul, Stephen L; Singh, Rajendra; Pelto, Pertti; Saggurti, Niranjan
2011-05-01
This paper responds to the call for culturally-relevant intervention research by introducing a methodology for identifying community norms and resources in order to more effectively implement sustainable intervention strategies. Results of an analysis of community norms, specifically attitudes toward gender equity, are presented from an HIV/STI research and intervention project in a low-income community in Mumbai, India (2008-2012). Community gender norms were explored because of their relevance to sexual risk in settings characterized by high levels of gender inequity. This paper recommends approaches that interventionists and social scientists can take to incorporate cultural insights into formative assessments and project implementation. These approaches include how to (1) examine modal beliefs and norms and any patterned variation within the community; (2) identify and assess variation in cultural beliefs and norms among community members (including leaders, social workers, members of civil society and the religious sector); and (3) identify differential needs among sectors of the community and key types of individuals best suited to help formulate and disseminate culturally-relevant intervention messages. Using a multi-method approach that includes the progressive translation of qualitative interviews into a quantitative survey of cultural norms, along with an analysis of community consensus, we outline a means for measuring variation in cultural expectations and beliefs about gender relations in an urban community in Mumbai. Results illustrate how intervention strategies and implementation can benefit from an organic (versus a priori and/or stereotypical) approach to cultural characteristics and analysis of community resources and vulnerabilities. Copyright © 2011 Elsevier Ltd. All rights reserved.
Pariser, Joseph J; Pearce, Shane M; Patel, Sanjay G; Bales, Gregory T
2015-10-01
To examine the national trends of simple prostatectomy (SP) for benign prostatic hyperplasia (BPH), focusing on perioperative outcomes and risk factors for complications. The National Inpatient Sample (2002-2012) was utilized to identify patients with BPH undergoing SP. Analysis included demographics, hospital details, associated procedures, and operative approach (open, robotic, or laparoscopic). Outcomes included complications, length of stay, charges, and mortality. Multivariate logistic regression was used to determine the risk factors for perioperative complications. Linear regression was used to assess the trends in the national annual utilization of SP. The study population included 35,171 patients. Median length of stay was 4 days (interquartile range 3-6). Cystolithotomy was performed concurrently in 6041 patients (17%). The overall complication rate was 28%, with bleeding occurring most commonly. In total, 148 (0.4%) patients experienced in-hospital mortality. On multivariate analysis, older age, black race, and overall comorbidity were associated with greater risk of complications, while the use of a minimally invasive approach and concurrent cystolithotomy were associated with decreased risk. Over the study period, the national use of simple prostatectomy decreased, on average, by 145 cases per year (P = .002). By 2012, 135/2580 procedures (5%) were performed using a minimally invasive approach. The nationwide utilization of SP for BPH has decreased. Bleeding complications are common, but perioperative mortality is low. Patients who are older, of black race, or who have multiple comorbidities are at higher risk of complications. Minimally invasive approaches, which are becoming increasingly utilized, may reduce perioperative morbidity. Copyright © 2015 Elsevier Inc. All rights reserved.
Learning in First-Year Biology: Approaches of Distance and On-Campus Students
NASA Astrophysics Data System (ADS)
Quinn, Frances Catherine
2011-01-01
This paper aims to extend previous research into learning of tertiary biology, by exploring the learning approaches adopted by two groups of students studying the same first-year biology topic in either on-campus or off-campus "distance" modes. The research involved 302 participants, who responded to a topic-specific version of the Study Process Questionnaire, and in-depth interviews with 16 of these students. Several quantitative analytic techniques, including cluster analysis and Rasch differential item functioning analysis, showed that the younger, on-campus cohort made less use of deep approaches, and more use of surface approaches than the older, off-campus group. At a finer scale, clusters of students within these categories demonstrated different patterns of learning approach. Students' descriptions of their learning approaches at interview provided richer complementary descriptions of the approach they took to their study in the topic, showing how deep and surface approaches were manifested in the study context. These findings are critically analysed in terms of recent literature questioning the applicability of learning approaches theory in mass education, and their implications for teaching and research in undergraduate biology.
NASA Technical Reports Server (NTRS)
Pliutau, Denis; Prasad, Narasimha S.
2012-01-01
In this paper a modeling method based on data reductions is investigated, which incorporates pre-analyzed MERRA atmospheric fields for quantitative estimates of the uncertainties introduced in integrated path differential absorption methods for the sensing of various molecules including CO2. This approach extends our previously developed lidar modeling framework and allows effective on- and offline wavelength optimizations and weighting function analysis to minimize interference effects such as those due to temperature sensitivity and water vapor absorption. The new simulation methodology differs from the previous implementation in that it allows analysis of atmospheric effects over annual spans and the entire Earth coverage, which was achieved due to the data reduction methods employed. The effectiveness of the proposed simulation approach is demonstrated with application to the mixing ratio retrievals for the future ASCENDS mission. Independent analysis of multiple accuracy-limiting factors, including temperature and water vapor interferences and selected system parameters, is further used to identify favorable spectral regions as well as wavelength combinations facilitating the reduction in total errors in the retrieved XCO2 values.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, Ross N.; Narayanan, Suresh; Zhang, Fan
X-ray photon correlation spectroscopy (XPCS), an extension of dynamic light scattering (DLS) in the X-ray regime, detects temporal intensity fluctuations of coherent speckles and provides scattering-vector-dependent sample dynamics at length scales smaller than DLS. The penetrating power of X-rays enables XPCS to probe the dynamics in a broad array of materials, including polymers, glasses and metal alloys, where attempts to describe the dynamics with a simple exponential fit usually fail. In these cases, the prevailing XPCS data analysis approach employs stretched or compressed exponential decay functions (Kohlrausch functions), which implicitly assume homogeneous dynamics. This paper proposes an alternative analysis scheme based upon inverse Laplace or Gaussian transformation for elucidating heterogeneous distributions of dynamic time scales in XPCS, an approach analogous to the CONTIN algorithm widely accepted in the analysis of DLS from polydisperse and multimodal systems. In conclusion, using XPCS data measured from colloidal gels, it is demonstrated that the inverse transform approach reveals hidden multimodal dynamics in materials, unleashing the full potential of XPCS.
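A crude stand-in for the proposed inversion (not the authors' implementation) can be built from a discretized exponential kernel and regularized non-negative least squares, much as CONTIN does for DLS. A Python sketch; the synthetic bimodal decay and regularization weight are assumptions:

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic bimodal field correlation function g1(t) (assumed test data).
t = np.logspace(-2, 2, 60)
g1 = 0.6 * np.exp(-t / 0.5) + 0.4 * np.exp(-t / 20.0)

# Discretize the Laplace kernel over candidate relaxation times tau:
# g1(t) = sum_j w_j * exp(-t / tau_j).
tau = np.logspace(-2, 2, 40)
K = np.exp(-t[:, None] / tau[None, :])

# Tikhonov-regularized non-negative inversion; lam trades resolution
# against noise amplification (the ill-posedness CONTIN also fights).
lam = 1e-2
A = np.vstack([K, lam * np.eye(tau.size)])
b = np.concatenate([g1, np.zeros(tau.size)])
weights, _ = nnls(A, b)     # recovered distribution peaks near tau=0.5, 20
```

In an actual XPCS measurement one records g2, which the Siegert relation g2 = 1 + beta*|g1|^2 connects to the field correlation before any such inversion.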
Nolin, Frédérique; Ploton, Dominique; Wortham, Laurence; Tchelidze, Pavel; Balossier, Gérard; Banchet, Vincent; Bobichon, Hélène; Lalun, Nathalie; Terryn, Christine; Michel, Jean
2012-11-01
Cryo fluorescence imaging coupled with the cryo-EM technique (cryo-CLEM) avoids chemical fixation and embedding in plastic, and is the gold standard for correlated imaging in a close-to-native state. This multi-modal approach has not previously included elemental nanoanalysis or evaluation of water content. We developed a new approach allowing in situ analysis of targeted intracellular ions and water measurements at the nanoscale (EDXS and STEM dark field imaging) within domains identified by examination of specific GFP-tagged proteins. This method allows both water and ions, fundamental to cell biology, to be located and quantified at the subcellular level. We illustrate the potential of this approach by investigating changes in water and ion content in nuclear domains identified by GFP-tagged proteins in cells stressed by Actinomycin D treatment and controls. The resolution of our approach was sufficient to distinguish clumps of condensed chromatin from the surrounding nucleoplasm by fluorescence imaging and to perform nanoanalysis in this targeted compartment. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Cretcher, C. K.
1980-11-01
The various types of solar domestic hot water systems are discussed, including their advantages and disadvantages. The problems that occur in hydronic solar heating systems are reviewed with emphasis on domestic hot water applications. System problems in retrofitting of residential buildings are also discussed, including structural and space constraints for various components and subsystems. System design parameters include various collector sizing methods, collector orientation, storage capacity, and heat loss from pipes and tanks. The installation costs are broken down by components and subsystems. The approach used for the utility economic impact analysis is reviewed. The simulation is described, and the results of the economic impact analysis are given. A summary assessment is included.
Evaluating and Reporting Statistical Power in Counseling Research
ERIC Educational Resources Information Center
Balkin, Richard S.; Sheperis, Carl J.
2011-01-01
Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…
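Where the abstract breaks off at G*Power, the same a priori calculation can also be scripted; a minimal statsmodels sketch (the effect size, alpha, and target power below are illustrative choices, not values from the article):

```python
from statsmodels.stats.power import TTestIndPower

# Required group size for a two-sample t test detecting a medium effect
# (Cohen's d = 0.5) at alpha = .05 with 80% power.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                          power=0.80)
print(round(n_per_group))    # ~64 per group

# Achieved power for a study that actually enrolled 40 per group.
achieved = TTestIndPower().power(effect_size=0.5, nobs1=40, alpha=0.05)
```

Reporting either number alongside the test statistics addresses the APA recommendation the article discusses.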
Managing the "Performance" in Performance Management.
ERIC Educational Resources Information Center
Repinski, Marilyn; Bartsch, Maryjo
1996-01-01
Describes a five-step approach to performance management which includes (1) redefining tasks; (2) identifying skills; (3) determining what development tools are necessary; (4) prioritizing skills development; and (5) developing an action plan. Presents a hiring model that includes job analysis, job description, selection, goal setting, evaluation,…
Stoney, David A; Stoney, Paul L
2015-08-01
An effective trace evidence capability is defined as one that exploits all useful particle types, chooses appropriate technologies to do so, and directly integrates the findings with case-specific problems. Limitations of current approaches inhibit the attainment of an effective capability and it has been strongly argued that a new approach to trace evidence analysis is essential. A hypothetical case example is presented to illustrate and analyze how forensic particle analysis can be used as a powerful practical tool in forensic investigations. The specifics in this example, including the casework investigation, laboratory analyses, and close professional interactions, provide focal points for subsequent analysis of how this outcome can be achieved. This leads to the specification of five key elements that are deemed necessary and sufficient for effective forensic particle analysis: (1) a dynamic forensic analytical approach, (2) concise and efficient protocols addressing particle combinations, (3) multidisciplinary capabilities of analysis and interpretation, (4) readily accessible external specialist resources, and (5) information integration and communication. A coordinating role, absent in current approaches to trace evidence analysis, is essential to achieving these elements. However, the level of expertise required for the coordinating role is readily attainable. Some additional laboratory protocols are also essential. However, none of these has greater staffing requirements than those routinely met by existing forensic trace evidence practitioners. The major challenges that remain are organizational acceptance, planning and implementation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Muravyov, Alexander A.
2002-01-01
Two new equivalent linearization implementations for geometrically nonlinear random vibrations are presented. Both implementations are based upon a novel approach for evaluating the nonlinear stiffness within commercial finite element codes and are suitable for use with any finite element code having geometrically nonlinear static analysis capabilities. The formulation includes a traditional force-error minimization approach and a relatively new version of a potential energy-error minimization approach, which has been generalized for multiple degree-of-freedom systems. Results for a simply supported plate under random acoustic excitation are presented and comparisons of the displacement root-mean-square values and power spectral densities are made with results from a nonlinear time domain numerical simulation.
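The force-error minimization step has a compact Monte Carlo interpretation: choose the equivalent stiffness minimizing E[(f(x) - k_eq*x)^2], which gives k_eq = E[f(x)*x]/E[x^2]. A hedged single-degree-of-freedom sketch (the Duffing-type stiffness and RMS level are assumptions; the paper's implementation operates on full finite element models):

```python
import numpy as np

rng = np.random.default_rng(0)
k, k3, sigma = 1.0, 0.5, 0.8   # assumed linear/cubic stiffness, RMS response

# Force-error minimization over a Gaussian response ensemble:
# k_eq = E[f(x) x] / E[x^2]; analytically k + 3*k3*sigma**2 here.
x = rng.normal(0.0, sigma, 100_000)
f = k * x + k3 * x**3
k_eq = np.mean(f * x) / np.mean(x * x)
print(k_eq)                    # ~1.96 for these values
```

In a full equivalent linearization scheme, k_eq and the response statistics are iterated to convergence, since the RMS response itself depends on the linearized system.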
A Multifaceted Approach to Investigating Pre-Task Planning Effects on Paired Oral Test Performance
ERIC Educational Resources Information Center
Nitta, Ryo; Nakatsuhara, Fumiyo
2014-01-01
Despite the growing popularity of paired format speaking assessments, the effects of pre-task planning time on performance in these formats are not yet well understood. For example, some studies have revealed the benefits of planning but others have not. Using a multifaceted approach including analysis of the process of speaking performance, the…
Studying Distance Students: Methods, Findings, Actions
ERIC Educational Resources Information Center
Wahl, Diane; Avery, Beth; Henry, Lisa
2013-01-01
University of North Texas (UNT) Libraries began studying the library needs of distance learners in 2009 using a variety of approaches to explore and confirm these needs as well as obtain input into how to meet them. Approaches used to date include analysis of both quantitative and qualitative responses by online students to the LibQUAL+[R] surveys…
ERIC Educational Resources Information Center
López-García, Jeanett; Jiménez Zamudio, Jorge Javier
2017-01-01
It is very common to find in contemporary literature of Differential Equations, the need to incorporate holistically in teaching and learning the three different approaches: analytical, qualitative, and numerical, for continuous dynamical systems. However, nowadays, in some Bachelor of Science that includes only one course in differential…
Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM
ERIC Educational Resources Information Center
Warner, Rebecca M.
2007-01-01
This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…
ERIC Educational Resources Information Center
Prevost, A. Toby; Mason, Dan; Griffin, Simon; Kinmonth, Ann-Louise; Sutton, Stephen; Spiegelhalter, David
2007-01-01
Practical meta-analysis of correlation matrices generally ignores covariances (and hence correlations) between correlation estimates. The authors consider various methods for allowing for covariances, including generalized least squares, maximum marginal likelihood, and Bayesian approaches, illustrated using a 6-dimensional response in a series of…
A Guide to Job Analysis for the Preparation of Job Training Programmes.
ERIC Educational Resources Information Center
Ceramics, Glass, and Mineral Products Industry Training Board, Harrow (England).
The paper deals with job analysis for the preparation of job training programs. The analytical approach involves five steps: enlisting support, examining the job, describing the job, analyzing training requirements, and planning the programs. Appendixes include methods of producing training schemes--the simple job breakdown, straightforward…
Teaching Case: A Systems Analysis Role-Play Exercise and Assignment
ERIC Educational Resources Information Center
Mitri, Michel; Cole, Carey; Atkins, Laura
2017-01-01
This paper presents a role-play exercise and assignment that provides an active learning experience related to the system investigation phase of an SDLC. Whether using waterfall or agile approaches, the first SDLC step usually involves system investigation activities, including problem identification, feasibility study, cost-benefit analysis, and…
A Framework for the Selection of Electronic Marketplaces: A Content Analysis Approach.
ERIC Educational Resources Information Center
Stockdale, Rosemary; Standing, Craig
2002-01-01
Discussion of electronic marketplaces focuses on a content analysis of research and practitioner articles that evaluated issues that prospective participants, seeking to purchase goods and services online, need to address in their selection process. Proposes a framework to support electronic marketplace decision making that includes internal…
Petticrew, Mark; Rehfuess, Eva; Noyes, Jane; Higgins, Julian P T; Mayhew, Alain; Pantoja, Tomas; Shemilt, Ian; Sowden, Amanda
2013-11-01
Although there is increasing interest in the evaluation of complex interventions, there is little guidance on how evidence from complex interventions may be reviewed and synthesized, and the relevance of the plethora of evidence synthesis methods to complexity is unclear. This article aims to explore how different meta-analytical approaches can be used to examine aspects of complexity; describe the contribution of various narrative, tabular, and graphical approaches to synthesis; and give an overview of the potential choice of selected qualitative and mixed-method evidence synthesis approaches. The methodological discussions presented here build on a 2-day workshop held in Montebello, Canada, in January 2012, involving methodological experts from the Campbell and Cochrane Collaborations and from other international review centers (Anderson L, Petticrew M, Chandler J, et al. Systematic reviews of complex interventions. In press). These systematic review methodologists discussed the broad range of existing methods and considered the relevance of these methods to reviews of complex interventions. The evidence from primary studies of complex interventions may be qualitative or quantitative. There is a wide range of methodological options for reviewing and presenting this evidence. Specific contributions of statistical approaches include the use of meta-analysis, meta-regression, and Bayesian methods, whereas narrative summary approaches provide valuable precursors or alternatives to these. Qualitative and mixed-method approaches include thematic synthesis, framework synthesis, and realist synthesis. A suitable combination of these approaches allows synthesis of evidence for understanding complex interventions. Reviewers need to consider which aspects of complex interventions should be a focus of their review and what types of quantitative and/or qualitative studies they will be including, and this will inform their choice of review methods. These may range from standard meta-analysis through to more complex mixed-method synthesis and synthesis approaches that incorporate theory and/or users' perspectives. Copyright © 2013 Elsevier Inc. All rights reserved.
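Of the statistical approaches the review catalogues, random-effects meta-analysis is the workhorse. A minimal DerSimonian-Laird pooling sketch in Python (the four effect sizes and their variances are made-up placeholders):

```python
import numpy as np

# Hypothetical study effects y and within-study variances v.
y = np.array([0.30, 0.10, 0.45, 0.22])
v = np.array([0.02, 0.03, 0.05, 0.01])

# DerSimonian-Laird estimate of between-study variance tau^2.
w = 1.0 / v
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (y.size - 1)) / c)

# Random-effects pooled estimate and its standard error.
w_star = 1.0 / (v + tau2)
pooled = np.sum(w_star * y) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
```

For complex interventions, tau^2 is itself informative: large between-study heterogeneity is often the signal that meta-regression or the qualitative syntheses discussed above are needed.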
Integrated Safety Risk Reduction Approach to Enhancing Human-Rated Spaceflight Safety
NASA Astrophysics Data System (ADS)
Mikula, J. F. Kip
2005-12-01
This paper explores and defines the currently accepted concept and philosophy of safety improvement based on reliability enhancement (called here Reliability Enhancement Based Safety Theory [REBST]). In this theory, a reliability calculation is used as a measure of the safety achieved on the program. This calculation may be based on a math model or a Fault Tree Analysis (FTA) of the system, or on an Event Tree Analysis (ETA) of the system's operational mission sequence. In each case, the numbers used in this calculation are hardware failure rates gleaned from past similar programs. As part of this paper, a fictional but representative case study is provided that helps to illustrate the problems and inaccuracies of this approach to safety determination. Then a safety determination and enhancement approach based on hazard, worst-case analysis, and safety risk determination (called here Worst Case Based Safety Theory [WCBST]) is included. This approach is defined and detailed using the same example case study as in the REBST analysis. In the end it is concluded that an approach combining the two theories works best to reduce safety risk.
Kushniruk, A. W.; Patel, V. L.; Cimino, J. J.
1997-01-01
This paper describes an approach to the evaluation of health care information technologies based on usability engineering and a methodological framework from the study of medical cognition. The approach involves collection of a rich set of data, including video recordings of health care workers as they interact with systems such as computerized patient records and decision support tools. The methodology can be applied in the laboratory setting, typically involving subjects "thinking aloud" as they interact with a system. A similar approach to data collection and analysis can also be extended to the study of computer systems in the "live" environment of hospital clinics. Our approach is also influenced by work in the area of cognitive task analysis, which aims to characterize the decision making and reasoning of subjects of varied levels of expertise as they interact with information technology in carrying out representative tasks. The stages involved in conducting cognitively-based usability analyses are detailed, and the application of such analysis in the iterative process of system and interface development is discussed. PMID:9357620
Proposed reliability cost model
NASA Technical Reports Server (NTRS)
Delionback, L. M.
1973-01-01
The research investigations involved in the study include cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.
Remote Sensing Technologies and Geospatial Modelling Hierarchy for Smart City Support
NASA Astrophysics Data System (ADS)
Popov, M.; Fedorovsky, O.; Stankevich, S.; Filipovich, V.; Khyzhniak, A.; Piestova, I.; Lubskyi, M.; Svideniuk, M.
2017-12-01
The approach to implementing remote sensing technologies and geospatial modelling for smart city support is presented. The hierarchical structure and basic components of the smart city information support subsystem are considered. Some of the already available practical developments are described. These include city land use planning, urban vegetation analysis, thermal condition forecasting, geohazard detection, and flooding risk assessment. A remote sensing data fusion approach for comprehensive geospatial analysis is discussed. Long-term city development forecasting with the Forrester-Graham system dynamics model is provided over the Kiev urban area.
Thermal Design, Analysis, and Testing of the Quench Module Insert Bread Board
NASA Technical Reports Server (NTRS)
Breeding, Shawn; Khodabandeh, Julia
2002-01-01
Contents include the following: Quench Module Insert (QMI) science requirements. QMI interfaces. QMI design layout. QMI thermal analysis and design methodology. QMI bread board testing and instrumentation approach. QMI thermal probe design parameters. Design features for gradient measurement. Design features for heated zone measurements. Thermal gradient analysis results. Heated zone analysis results. Bread board thermal probe layout. QMI bread board correlation and performance. Summary and conclusions.
Laboratory Spectrometer for Wear Metal Analysis of Engine Lubricants.
1986-04-01
analysis, the acid digestion technique for sample pretreatment is the best approach available to date because of its relatively large sample size (1000...microliters or more). However, this technique has two major shortcomings limiting its application: (1) it requires the use of hydrofluoric acid (a...accuracy. Sample preparation including filtration or acid digestion may increase analysis times by 20 minutes or more. b. Repeatability In the analysis
Advancing Alternative Analysis: Integration of Decision Science.
Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina M; Blake, Ann; Carroll, William F; Corbett, Charles J; Hansen, Steffen Foss; Lempert, Robert J; Linkov, Igor; McFadden, Roger; Moran, Kelly D; Olivetti, Elsa; Ostrom, Nancy K; Romero, Michelle; Schoenung, Julie M; Seager, Thomas P; Sinsheimer, Peter; Thayer, Kristina A
2017-06-13
Decision analysis, a systematic approach to solving complex problems, offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. A workshop was convened that included representatives from government, academia, business, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on other groups' findings. We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. We advance four recommendations: (a) engaging in the systematic development and evaluation of decision approaches and tools; (b) using case studies to advance the integration of decision analysis into alternatives analysis; (c) supporting transdisciplinary research; and (d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483.
Alternate Methods in Refining the SLS Nozzle Plug Loads
NASA Technical Reports Server (NTRS)
Burbank, Scott; Allen, Andrew
2013-01-01
Numerical analysis has shown that the SLS nozzle environmental barrier (nozzle plug) design is inadequate for the prelaunch condition, which consists of two dominant loads: (1) the main engines' startup pressure and (2) an environmentally induced pressure. Efforts to reduce load conservatisms included a dynamic analysis, which showed a 31% higher safety factor compared with the standard static analysis. The environmental load is typically approached with a deterministic method using the worst possible combinations of pressures and temperatures. An alternate probabilistic approach, utilizing the distributions of pressures and temperatures, resulted in a 54% reduction in the environmental pressure load. A Monte Carlo simulation of the environmental load that used five years of historical pressure and temperature data supported the results of the probabilistic analysis, indicating the probabilistic load is reflective of a 3-sigma condition (1 in 370 probability). Utilizing the probabilistic load analysis eliminated excessive conservatisms and will prevent a future overdesign of the nozzle plug. Employing a similar probabilistic approach to other design and analysis activities can result in realistic yet adequately conservative solutions.
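A Monte Carlo treatment of an environmental load in the spirit described reduces to sampling the joint pressure/temperature drivers and reading off the 1-in-370 quantile. In the sketch below, all distributions and the load model are invented placeholders, not the SLS data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Placeholder environmental drivers; the real analysis sampled five years
# of historical pad pressure and temperature records.
pressure = rng.normal(101.3, 0.7, n)       # kPa
temperature = rng.normal(293.0, 6.0, n)    # K

# Placeholder load model combining the two drivers.
load = pressure * (1.0 + 0.002 * (temperature - 293.0))

# "3-sigma condition (1 in 370)" ~ the 1 - 1/370 quantile of the load.
design_load = np.quantile(load, 1.0 - 1.0 / 370.0)
```

Because the quantile comes from the joint distribution rather than from stacking worst-case extremes of each driver, it is typically far below the deterministic worst-on-worst load, which is the conservatism the abstract describes eliminating.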
Simulation Assisted Risk Assessment: Blast Overpressure Modeling
NASA Technical Reports Server (NTRS)
Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael
2006-01-01
A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwinds effects.
Interdisciplinary research on patient-provider communication: a cross-method comparison.
Chou, Wen-Ying Sylvia; Han, Paul; Pilsner, Alison; Coa, Kisha; Greenberg, Larrie; Blatt, Benjamin
2011-01-01
Patient-provider communication, a key aspect of healthcare delivery, has been assessed through multiple methods for purposes of research, education, and quality control. Common techniques include satisfaction ratings and quantitatively- and qualitatively-oriented direct observations. Identifying the strengths and weaknesses of different approaches is critically important in determining the appropriate assessment method for a specific research or practical goal. Analyzing ten videotaped simulated encounters between medical students and Standardized Patients (SPs), this study compared three existing assessment methods through the same data set. Methods included: (1) dichotomized SP ratings on students' communication skills; (2) Roter Interaction Analysis System (RIAS) analysis; and (3) inductive discourse analysis informed by sociolinguistic theories. The large dichotomous contrast between good and poor ratings in (1) was not evidenced in any of the other methods. Following a discussion of strengths and weaknesses of each approach, we pilot-tested a combined assessment done by coders blinded to results of (1)-(3). This type of integrative approach has the potential of adding a quantifiable dimension to qualitative, discourse-based observations. Subjecting the same data set to separate analytic methods provides an excellent opportunity for methodological comparisons with the goal of informing future assessment of clinical encounters.
Error analysis and correction of discrete solutions from finite element codes
NASA Technical Reports Server (NTRS)
Thurston, G. A.; Stein, P. A.; Knight, N. F., Jr.; Reissner, J. E.
1984-01-01
Many structures are an assembly of individual shell components. Therefore, results for stresses and deflections from finite element solutions for each shell component should agree with the equations of shell theory. This paper examines the problem of applying shell theory to the error analysis and the correction of finite element results. The general approach to error analysis and correction is discussed first. Relaxation methods are suggested as one approach to correcting finite element results for all or parts of shell structures. Next, the problem of error analysis of plate structures is examined in more detail. The method of successive approximations is adapted to take discrete finite element solutions and to generate continuous approximate solutions for postbuckled plates. Preliminary numerical results are included.
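Stripped of the shell-theory specifics, both suggestions reduce to fixed-point iteration: repeatedly blend the current discrete solution with a corrected one until the residual stops changing. A generic under-relaxation sketch (the toy operator below is illustrative, not the paper's plate equations):

```python
import numpy as np

def relax(G, u0, omega=0.5, tol=1e-10, max_iter=1000):
    """Under-relaxed successive approximation: u <- (1-w)*u + w*G(u)."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        u_new = (1.0 - omega) * u + omega * G(u)
        if np.max(np.abs(u_new - u)) < tol:
            return u_new
        u = u_new
    return u

# Toy example: the contraction u = cos(u), solved componentwise.
u_star = relax(np.cos, np.zeros(3))   # ~0.739 in every component
```

In the paper's setting, G would embody the shell-theory correction applied to the discrete finite element solution, and convergence of the iterates measures how far the original results were from satisfying the governing equations.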
Shifting from Stewardship to Analytics of Massive Science Data
NASA Astrophysics Data System (ADS)
Crichton, D. J.; Doyle, R.; Law, E.; Hughes, S.; Huang, T.; Mahabal, A.
2015-12-01
Currently, the analysis of large data collections is executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Data collection, archiving, and analysis from future remote sensing missions, whether from Earth science satellites, planetary robotic missions, or massive radio observatories, may not scale as more capable instruments stress existing architectural approaches and systems with more continuous data streams, data from multiple observational platforms, and measurements and models from different agencies. A new paradigm is needed in order to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural choices, data processing, management, analysis, etc. are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections. Future observational systems, including satellite and airborne experiments, and research in climate modeling will significantly increase the size of the data, requiring new methodological approaches towards data analytics in which users can more effectively interact with the data and apply automated mechanisms for data reduction and fusion across these massive data repositories. This presentation will discuss architecture, use cases, and approaches for developing a big data analytics strategy across multiple science disciplines.
Carvalho, Carolina Abreu de; Fonsêca, Poliana Cristina de Almeida; Nobre, Luciana Neri; Priore, Silvia Eloiza; Franceschini, Sylvia do Carmo Castro
2016-01-01
The objective of this study is to provide guidance for identifying dietary patterns using the a posteriori approach, and to analyze the methodological aspects of studies conducted in Brazil that identified dietary patterns of children. Articles were selected from the Latin American and Caribbean Literature on Health Sciences, Scientific Electronic Library Online, and PubMed databases. The key words were: dietary pattern; food pattern; principal components analysis; factor analysis; cluster analysis; reduced rank regression. We included studies that identified dietary patterns of children using the a posteriori approach. Seven studies published between 2007 and 2014 were selected, six of which were cross-sectional and one a cohort. Five studies used a food frequency questionnaire for dietary assessment; one used a 24-hour dietary recall and another a food list. The exploratory method used in most publications was principal components factor analysis, followed by cluster analysis. The sample size of the studies ranged from 232 to 4231, the values of the Kaiser-Meyer-Olkin test from 0.524 to 0.873, and Cronbach's alpha from 0.51 to 0.69. Few Brazilian studies have identified dietary patterns of children using the a posteriori approach, and principal components factor analysis was the technique most used.
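The principal components step the reviewed studies rely on is straightforward to reproduce. A hedged sketch on a fake food-frequency matrix (the dimensions and data are invented; KMO and Cronbach's alpha would be computed separately, e.g. with the factor_analyzer package):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Hypothetical FFQ matrix: 500 children x 12 food-group frequencies.
X = rng.poisson(3.0, size=(500, 12)).astype(float)

# Standardize items, then extract components as candidate dietary patterns.
Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=3).fit(Z)

# Factor-style loadings: correlation of each food group with each pattern.
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
scores = pca.transform(Z)      # per-child adherence to each pattern
```

The loadings table is what the reviewed papers inspect (and typically varimax-rotate) to name patterns such as "traditional" or "snacking", while the scores feed downstream regressions against health outcomes.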
Griffing, Lawrence R
2018-01-01
In this chapter, approaches to the image analysis of the choreography of the plant endoplasmic reticulum (ER) labeled with fluorescent fusion proteins ("stars," if you wish) are presented. The approaches include the analyses of those parts of the ER that are attached through membrane contact sites to moving or nonmoving partners (other "stars"). Image analysis is also used to understand the nature of the tubular polygonal network, the hallmark of this organelle, and how the polygons change over time due to tubule sliding or motion. Furthermore, the remodeling polygons of the ER interact with regions of fundamentally different topology, the ER cisternae, and image analysis can be used to separate the tubules from the cisternae. ER cisternae, like polygons and tubules, can be motile or stationary. To study which parts are attached to nonmoving partners, such as domains of the ER that form membrane contact sites with the plasma membrane/cell wall, an image analysis approach called persistency mapping has been used. To study the domains of the ER that are moving rapidly and streaming through the cell, the image analysis of optic flow has been used. However, optic flow approaches confuse the movement of the ER itself with the movement of proteins within the ER. As an overall measure of ER dynamics, optic flow approaches are of value, but their limitation as to what exactly is "flowing" needs to be specified. Finally, there are important imaging approaches that directly address the movement of fluorescent proteins within the ER lumen or in the membrane of the ER. Of these, fluorescence recovery after photobleaching (FRAP), inverse FRAP (iFRAP), and single particle tracking approaches are described.
NASA Astrophysics Data System (ADS)
Qi, Wei
2017-11-01
Cost-benefit analysis is commonly used for engineering planning and design problems in practice. However, previous cost-benefit based design flood estimation has relied on a stationarity assumption. This study develops a non-stationary cost-benefit based design flood estimation approach. The approach integrates a non-stationary probability distribution function into cost-benefit analysis, so that the influence of non-stationarity on expected total cost (including flood damage and construction costs) and on design flood estimation can be quantified. To facilitate design flood selection, a 'Risk-Cost' analysis approach is developed, which reveals the nexus of extreme flood risk, expected total cost, and design life period. Two basins, with 54-year and 104-year flood records respectively, are utilized to illustrate the application. It is found that the developed approach can effectively reveal changes in expected total cost and extreme floods over different design life periods. In addition, trade-offs are found between extreme flood risk and expected total cost, which reflect increases in cost to mitigate risk. Compared with stationary approaches, which generate only one expected total cost curve and therefore a single design flood estimate, the proposed approach generates design flood estimation intervals, and the 'Risk-Cost' approach selects a design flood value from these intervals based on the trade-offs between extreme flood risk and expected total cost. This study provides a new approach towards a better understanding of the influence of non-stationarity on expected total cost and design floods, and could be beneficial to cost-benefit based non-stationary design flood estimation across the world.
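The non-stationary ingredient can be illustrated by letting a GEV location parameter drift over the design life and asking for the lifetime exceedance probability of a candidate design flood. A sketch with invented parameters (note that scipy's GEV shape sign convention is the negative of the one common in hydrology):

```python
import numpy as np
from scipy.stats import genextreme

shape, scale = -0.1, 40.0    # assumed GEV shape (scipy convention), scale
loc0, trend = 200.0, 1.5     # m^3/s, and assumed drift in m^3/s per year

def lifetime_exceedance(q_design, design_life_years):
    """P(design flood exceeded at least once during the design life)."""
    p_safe = 1.0
    for year in range(design_life_years):
        loc = loc0 + trend * year            # non-stationary location
        p_safe *= genextreme.cdf(q_design, shape, loc=loc, scale=scale)
    return 1.0 - p_safe

for q in (300.0, 350.0, 400.0):
    print(q, lifetime_exceedance(q, design_life_years=50))
```

Coupling this exceedance probability with damage and construction cost curves produces the expected-total-cost surface that the 'Risk-Cost' analysis trades off against extreme flood risk.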
ERIC Educational Resources Information Center
Danielsen, Dina; Bruselius-Jensen, Maria; Laitsch, Daniel
2017-01-01
Health promotion and education researchers and practitioners advocate for more democratic approaches to school-based health education, including participatory teaching methods and the promotion of a broad and positive concept of health and health knowledge, including aspects of the German educational concept of "bildung." Although…
NASA Technical Reports Server (NTRS)
Perangelo, H. J.; Milordi, F. W.
1976-01-01
Analysis techniques used in the automated telemetry station (ATS) for on-line data reduction are encompassed in a broad range of software programs. Concepts that form the basis for the algorithms used are mathematically described. The control the user has in interfacing with the various on-line programs is discussed. The various programs are applied to an analysis of flight data which includes unimodal and bimodal response signals excited via a swept-frequency shaker and/or random aerodynamic forces. A nonlinear response error modeling analysis approach is described. Preliminary results from the analysis of a hard-spring nonlinear resonant system are also included.
Robust Mediation Analysis Based on Median Regression
Yuan, Ying; MacKinnon, David P.
2014-01-01
Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
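The backbone of the proposal, estimating the a and b paths by median (0.5-quantile) regression and forming the product a*b, is easy to sketch with statsmodels. A hedged illustration on simulated heavy-tailed data (this shows the general idea only, not the authors' exact estimator; inference on a*b would additionally need bootstrapping, omitted here):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.standard_t(df=3, size=n)            # heavy-tailed errors
y = 0.4 * m + 0.2 * x + rng.standard_t(df=3, size=n)
data = pd.DataFrame({"x": x, "m": m, "y": y})

# Median (q = 0.5) regressions for the a path (x -> m) and b path (m -> y).
a = smf.quantreg("m ~ x", data).fit(q=0.5).params["x"]
b = smf.quantreg("y ~ m + x", data).fit(q=0.5).params["m"]
indirect = a * b      # point estimate of the mediated effect (~0.2 here)
```

Replacing ordinary least squares with the median criterion is what buys the robustness to the skewed, contaminated, and heteroscedastic error distributions the abstract lists.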
The Use of Mobile Devices in Aiding Dietary Assessment and Evaluation
Zhu, Fengqing; Bosch, Marc; Woo, Insoo; Kim, SungYe; Boushey, Carol J.; Ebert, David S.; Delp, Edward J.
2010-01-01
There is a growing concern about chronic diseases and other health problems related to diet, including obesity and cancer. The need to accurately measure diet (what foods a person consumes) thus becomes imperative. Dietary intake provides valuable insights for mounting intervention programs for prevention of chronic diseases. Measuring accurate dietary intake is considered to be an open research problem in the nutrition and health fields. In this paper, we describe a novel mobile telephone food record that will provide an accurate account of daily food and nutrient intake. Our approach includes the use of image analysis tools for identification and quantification of food that is consumed at a meal. Images obtained before and after foods are eaten are used to estimate the amount and type of food consumed. The mobile device provides a unique vehicle for collecting dietary information that reduces the burden on respondents relative to more classical approaches for dietary assessment. We describe our approach to image analysis, which includes the segmentation of food items, features used to identify foods, a method for automatic portion estimation, and our overall system architecture for collecting the food intake information. PMID:20862266
2011-12-01
therefore a more general approach uses the pseudo-inverse shown in Equation (12) to obtain the commanded gimbal rate, δ̇ = [A]^T([A][A]^T)^(-1) ḣ_(B/N)^(CMG). ...gimbal motor. Approaching the problem from this perspective increases the complexity significantly and the relationship between motor current and...included in this document confirms the equations that Schaub and Junkins developed. The approaches used in the two derivations are sufficiently
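Assuming Equation (12) is the familiar minimum-norm Moore-Penrose steering law of the form δ̇ = A^T(AA^T)^(-1)ḣ, as in Schaub and Junkins, it is one line of linear algebra. A sketch with an invented 3x4 CMG Jacobian:

```python
import numpy as np

# Hypothetical Jacobian mapping four gimbal rates to body torque (3x4).
A = np.array([[1.0, 0.0, -1.0, 0.0],
              [0.0, 1.0, 0.0, -1.0],
              [0.5, 0.5, 0.5, 0.5]])
h_dot_cmd = np.array([0.10, -0.05, 0.02])   # commanded momentum rate

# Minimum-norm pseudo-inverse steering: delta_dot = A^T (A A^T)^{-1} h_dot.
delta_dot = A.T @ np.linalg.solve(A @ A.T, h_dot_cmd)

# Equivalent one-liner: np.linalg.pinv(A) @ h_dot_cmd. Near singular gimbal
# geometries, A @ A.T loses rank and a singularity-robust variant is needed.
```

The minimum-norm property is why the pseudo-inverse is preferred over any particular right inverse: among all gimbal-rate vectors producing the commanded torque, it selects the one with the smallest magnitude.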
Uncertainty modelling of real-time observation of a moving object: photogrammetric measurements
NASA Astrophysics Data System (ADS)
Ulrich, Thomas
2015-04-01
Photogrammetric systems are widely used in the field of industrial metrology to measure kinematic tasks such as tracking robot movements. In order to assess spatiotemporal deviations of a kinematic movement, it is crucial to have a reliable uncertainty for the kinematic measurements. Common methods to evaluate the uncertainty in kinematic measurements include approximations specified by the manufacturers, various analytical adjustment methods, and Kalman filters. Here a hybrid system estimator in conjunction with a kinematic measurement model is applied. This method can be applied to processes that include various types of kinematic behaviour: constant velocity, variable acceleration, or variable turn rates. Additionally, it has been shown that the approach is in accordance with GUM (Guide to the Expression of Uncertainty in Measurement). The approach is compared to the Kalman filter using simulated data to achieve an overall error calculation. Furthermore, the new approach is used for the analysis of a rotating system, as this system has both a constant and a variable turn rate. As the new approach reduces overshoots, it is more appropriate for analysing kinematic processes than the Kalman filter. In comparison with the manufacturers' approximations, the new approach takes account of kinematic behaviour, with an improved description of the real measurement process. Therefore, this approach is well suited to the analysis of kinematic processes with unknown changes in kinematic behaviour.
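The Kalman filter baseline against which the hybrid estimator is compared is a constant-velocity model whose covariance supplies the reported measurement uncertainty. A minimal 1D sketch (all noise levels and the fake measurements are assumed values):

```python
import numpy as np

dt = 0.1                                    # assumed sampling interval (s)
F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity transition
H = np.array([[1.0, 0.0]])                  # position-only measurement
Q = 0.05 * np.array([[dt**3 / 3, dt**2 / 2],
                     [dt**2 / 2, dt]])      # process noise covariance
R = np.array([[0.01]])                      # measurement noise covariance

def kf_step(x, P, z):
    """One predict/update cycle; P carries the reported uncertainty."""
    x, P = F @ x, F @ P @ F.T + Q                       # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                      # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P                         # update
    return x, P

x, P = np.zeros(2), np.eye(2)
for z in (0.02, 0.11, 0.19, 0.32):                      # fake positions (m)
    x, P = kf_step(x, P, np.array([z]))
```

The overshoots the paper criticizes arise exactly when the tracked object switches behaviour (say, from constant velocity to a turn) while the filter keeps the single fixed model above, which is the motivation for the hybrid multi-model estimator.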
Culture-Independent Analysis of Probiotic Products by Denaturing Gradient Gel Electrophoresis
Temmerman, R.; Scheirlinck, I.; Huys, G.; Swings, J.
2003-01-01
In order to obtain functional and safe probiotic products for human consumption, fast and reliable quality control of these products is crucial. Currently, analysis of most probiotics is still based on culture-dependent methods involving the use of specific isolation media and identification of a limited number of isolates, which makes this approach relatively insensitive, laborious, and time-consuming. In this study, a collection of 10 probiotic products, including four dairy products, one fruit drink, and five freeze-dried products, were subjected to microbial analysis by using a culture-independent approach, and the results were compared with the results of a conventional culture-dependent analysis. The culture-independent approach involved extraction of total bacterial DNA directly from the product, PCR amplification of the V3 region of the 16S ribosomal DNA, and separation of the amplicons on a denaturing gradient gel. Digital capturing and processing of denaturing gradient gel electrophoresis (DGGE) band patterns allowed direct identification of the amplicons at the species level. This whole culture-independent approach can be performed in less than 30 h. Compared with culture-dependent analysis, the DGGE approach was found to have a much higher sensitivity for detection of microbial strains in probiotic products in a fast, reliable, and reproducible manner. Unfortunately, as reported in previous studies in which the culture-dependent approach was used, a rather high percentage of probiotic products suffered from incorrect labeling and yielded low bacterial counts, which may decrease their probiotic potential. PMID:12513998
Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review
Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie
2015-01-01
Background Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods Searches were conducted for articles published from 2000–2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results. PMID:29546115
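As an example of the spatial diagnostics the review finds lacking, global Moran's I for residual spatial autocorrelation is straightforward to compute; this sketch uses a hypothetical binary contiguity matrix, whereas real analyses would derive weights from GIS adjacency or distances:

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x and spatial weight matrix w (zero diagonal)."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()                  # deviations from the mean
    num = (w * np.outer(z, z)).sum()  # sum_ij w_ij * z_i * z_j
    den = (z ** 2).sum()
    n, s0 = len(x), w.sum()
    return (n / s0) * (num / den)

# Hypothetical example: residuals from an outlet-count regression over 4 areas,
# with a simple binary contiguity weight matrix.
resid = [0.8, 0.5, -0.6, -0.7]
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(morans_i(resid, w))  # values near +1 suggest spatially clustered residuals
```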
NASA Astrophysics Data System (ADS)
Prakoso, W. G.; Murtilaksono, K.; Tarigan, S. D.; Purwanto, Y. J.
2018-05-01
An approach to flow duration and design flood estimation for ungauged catchments with no rainfall or discharge data availability was developed using hydrological modelling, including a rainfall-runoff model driven by a watershed characteristics dataset. Near-real-time rainfall data from multi-satellite platforms, e.g. TRMM, can be utilized for a regionalization approach in ungauged catchments. A watershed hydrological similarity analysis was conducted covering all the major watersheds in Borneo predicted to be similar to the Nanga Raun watershed. It was found that a satisfactory hydrological model calibration could be achieved using catchment-weighted time series of TRMM daily rainfall data, performed on a nearby catchment deemed to be sufficiently similar to the Nanga Raun catchment in hydrological terms. Based on this calibration, rainfall-runoff parameters were then transferred to a model of the ungauged catchment. A relatively reliable flow duration curve and extreme discharge estimates were produced, subject to several limitations. Further work may address the primary limitations inherent in the hydrological and statistical analysis, especially extending the availability of rainfall and climate data with novel approaches such as downscaling of global climate models.
Probabilistic models for reactive behaviour in heterogeneous condensed phase media
NASA Astrophysics Data System (ADS)
Baer, M. R.; Gartling, D. K.; DesJardin, P. E.
2012-02-01
This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.
Evaluation of UK Integrated Care Pilots: research protocol
Ling, Tom; Bardsley, Martin; Adams, John; Lewis, Richard; Roland, Martin
2010-01-01
Background In response to concerns that the needs of the aging population for well-integrated care were increasing, the English National Health Service (NHS) appointed 16 Integrated Care Pilots following a national competition. The pilots have a range of aims including development of new organisational structures to support integration, changes in staff roles, reducing unscheduled emergency hospital admissions, reduced length of hospital stay, increasing patient satisfaction, and reducing cost. This paper describes the evaluation of the initiative which has been commissioned. Study design and data collection methods A mixed methods approach has been adopted including interviews with staff and patients, non-participant observation of meetings, structured written feedback from sites, questionnaires to patients and staff, and analysis of routinely collected hospital utilisation data for patients/service users. The qualitative analysis aims to identify the approaches taken to integration by the sites, the benefits which result, the context in which benefits have resulted, and the mechanisms by which they occur. Methods of analysis The quantitative analysis adopts a ‘difference in differences’ approach comparing health care utilisation before and after the intervention with risk-matched controls. The qualitative data analysis adopts a ‘theory of change’ approach in which we triangulate data from the quantitative analysis with qualitative data in order to describe causal effects (what happens when an independent variable changes) and causal mechanisms (what connects causes to their effects). An economic analysis will identify what incremental resources are required to make integration succeed and how they can be combined efficiently to produce better outcomes for patients. Conclusion This evaluation will produce a portfolio of evidence aimed at strengthening the evidence base for integrated care, and in particular identifying the context in which interventions are likely to be effective. These data will support a series of evaluation judgements aimed at reducing uncertainties about the role of integrated care in improving the efficient and effective delivery of healthcare. PMID:20922068
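In its simplest two-period form, the 'difference in differences' estimator described above is just the pre-to-post change in the pilot group minus the change in the matched controls; a minimal sketch with hypothetical utilisation rates:

```python
# Simplest two-period difference-in-differences: the effect estimate is the
# pre-to-post change in the pilot group minus the change in matched controls.
pilot_pre, pilot_post = 0.42, 0.35      # e.g. emergency admissions per person-year
control_pre, control_post = 0.40, 0.39  # hypothetical risk-matched controls

did = (pilot_post - pilot_pre) - (control_post - control_pre)
print(did)  # -0.06: admissions fell by 0.06 more in pilots than in controls
```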
A study for hypergolic vapor sensor development
NASA Technical Reports Server (NTRS)
Stetter, J. R.
1977-01-01
The use of an electrochemical technique for MMH and NO2 measurement was investigated. Specific MMH and NO2 electrochemical sensors were developed. Experimental techniques for preparation, handling, and analysis of hydrazine vapor mixtures at ppb and ppm levels were developed. Two approaches to NO2 instrument design were evaluated, including specific adsorption and specific electrochemical reduction. Two approaches to hydrazine monitoring were evaluated, including catalytic conversion to NO with subsequent NO detection, and direct specific electrochemical oxidation. Two engineering prototype MMH/NO2 monitors were designed and constructed.
Module Degradation Mechanisms Studied by a Multi-Scale Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, Steve; Al-Jassim, Mowafak; Hacke, Peter
2016-11-21
A key pathway to meeting the Department of Energy SunShot 2020 goals is to reduce financing costs by improving investor confidence through improved photovoltaic (PV) module reliability. A comprehensive approach to further understand and improve PV reliability includes characterization techniques and modeling from module to atomic scale. Imaging techniques, which include photoluminescence, electroluminescence, and lock-in thermography, are used to locate localized defects responsible for module degradation. Small area samples containing such defects are prepared using coring techniques and are then suitable and available for microscopic study and specific defect modeling and analysis.
ERIC Educational Resources Information Center
Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.
2000-01-01
These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)
2000-04-10
interest. These include Statistical Energy Analysis (SEA), fuzzy structure theory, and approaches combining modal analysis and SEA. Non-determinism... arising with increasing frequency. This has led to Statistical Energy Analysis, in which a system is modelled as a collection of coupled subsystems... IUTAM Symposium on Statistical Energy Analysis, 1999, Ed. F.J. Fahy and W.G. Price, Kluwer Academic Publishing.
Robitschek, Jon; Dresner, Harley; Hilger, Peter
2017-12-01
Photographic nasal analysis constitutes a critical step along the path toward accurate diagnosis and precise surgical planning in rhinoplasty. The learned process by which one assesses photographs, analyzes relevant anatomical landmarks, and generates a global view of the nasal aesthetic is less widely described. To discern the common pitfalls in performing photographic nasal analysis and to quantify the utility of a systematic approach model in teaching photographic nasal analysis to otolaryngology residents. This prospective observational study included 20 participants from a university-based otolaryngology residency program. The control and intervention groups underwent baseline graded assessment of 3 patients. The intervention group received instruction on a systematic approach model for nasal analysis, and both groups underwent postintervention testing at 10 weeks. Data were collected from October 1, 2015, through June 1, 2016. A 10-minute, 11-slide presentation provided instruction on a systematic approach to nasal analysis to the intervention group. Graded photographic nasal analysis using a binary 18-point system. The 20 otolaryngology residents (15 men and 5 women; age range, 24-34 years) were adept at mentioning dorsal deviation and dorsal profile with focused descriptions of tip angle and contour. Areas commonly omitted by residents included verification of the Frankfort plane, position of the lower lateral crura, radix position, and ratio of the ala to tip lobule. The intervention group demonstrated immediate improvement after instruction on the teaching model, with the mean (SD) postintervention test score doubling compared with their baseline performance (7.5 [2.7] vs 10.3 [2.5]; P < .001). At 10 weeks after the intervention, the mean comparative improvement in overall graded nasal analysis was 17% (95% CI, 10%-23%; P < .001). Otolaryngology residents demonstrated proficiency at incorporating nasal deviation, tip angle, and dorsal profile contour into their nasal analysis. They often omitted verification of the Frankfort plane, position of lower lateral crura, radix depth, and ala-to-tip lobule ratio. Findings with this novel 10-minute teaching model should be validated at other teaching institutions, and the instruction model should be further enhanced to teach more sophisticated analysis to residents as they proceed through training. NA.
Passive Fully Polarimetric W-Band Millimeter-Wave Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernacki, Bruce E.; Kelly, James F.; Sheen, David M.
2012-04-01
We present the theory, design, and experimental results obtained from a scanning passive W-band fully polarimetric imager. Passive millimeter-wave imaging offers persistent day/nighttime imaging and the ability to penetrate dust, clouds and other obscurants, including clothing and dry soil. The single-pixel scanning imager includes both far-field and near-field fore-optics for investigation of polarization phenomena. Using both fore-optics, a variety of scenes including natural and man-made objects was imaged and these results are presented showing the utility of polarimetric imaging for anomaly detection. Analysis includes conventional Stokes-parameter based approaches as well as multivariate image analysis methods.
NASA Astrophysics Data System (ADS)
Gardner, Robin P.; Xu, Libai
2009-10-01
The Center for Engineering Applications of Radioisotopes (CEAR) has been working for over a decade on the Monte Carlo library least-squares (MCLLS) approach for treating non-linear radiation analyzer problems including: (1) prompt gamma-ray neutron activation analysis (PGNAA) for bulk analysis, (2) energy-dispersive X-ray fluorescence (EDXRF) analyzers, and (3) carbon/oxygen tool analysis in oil well logging. This approach essentially consists of using Monte Carlo simulation to generate the libraries of all the elements to be analyzed plus any other required background libraries. These libraries are then used in the linear library least-squares (LLS) approach with unknown sample spectra to analyze for all elements in the sample. Iterations of this are used until the LLS values agree with the composition used to generate the libraries. The current status of the methods (and topics) necessary to implement the MCLLS approach is reported. This includes: (1) the Monte Carlo codes such as CEARXRF, CEARCPG, and CEARCO for forward generation of the necessary elemental library spectra for the LLS calculation for X-ray fluorescence, neutron capture prompt gamma-ray analyzers, and carbon/oxygen tools; (2) the correction of spectral pulse pile-up (PPU) distortion by Monte Carlo simulation with the code CEARIPPU; (3) generation of detector response functions (DRF) for detectors with linear and non-linear responses for Monte Carlo simulation of pulse-height spectra; and (4) the use of the differential operator (DO) technique to make the necessary iterations for non-linear responses practical. In addition to commonly analyzed single spectra, coincidence spectra or even two-dimensional (2-D) coincidence spectra can also be used in the MCLLS approach and may provide more accurate results.
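The linear library least-squares (LLS) step at the core of MCLLS amounts to fitting the unknown spectrum as a weighted sum of elemental library spectra; a minimal sketch with hypothetical library and spectrum values (production codes add channel weighting and the iteration described above):

```python
import numpy as np

# Library least-squares (LLS) step of the MCLLS approach: express an unknown
# spectrum y as a weighted sum of elemental library spectra (columns of L),
# then regenerate the Monte Carlo libraries with the fitted composition and
# iterate. The numbers below are hypothetical stand-ins for real spectra.
L = np.array([[120.0, 10.0,  5.0],
              [ 40.0, 90.0, 12.0],
              [ 10.0, 60.0, 70.0],
              [  2.0,  8.0, 95.0]])    # channels x elements
y = np.array([65.0, 71.0, 69.0, 51.5])  # measured spectrum (counts per channel)

# Unweighted least squares; real analyzers weight channels by 1/variance.
w, residuals, rank, sv = np.linalg.lstsq(L, y, rcond=None)
print(w)  # fitted library amounts, proportional to elemental composition
```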
How effects on health equity are assessed in systematic reviews of interventions.
Welch, Vivian; Tugwell, Peter; Petticrew, Mark; de Montigny, Joanne; Ueffing, Erin; Kristjansson, Betsy; McGowan, Jessie; Benkhalti Jandu, Maria; Wells, George A; Brand, Kevin; Smylie, Janet
2010-12-08
Enhancing health equity has now achieved international political importance with endorsement from the World Health Assembly in 2009. The failure of systematic reviews to consider effects on health equity is cited by decision-makers as a limitation to their ability to inform policy and program decisions. To systematically review methods to assess effects on health equity in systematic reviews of effectiveness. We searched the following databases up to July 2 2010: MEDLINE, PsycINFO, the Cochrane Methodology Register, CINAHL, Education Resources Information Center, Education Abstracts, Criminal Justice Abstracts, Index to Legal Periodicals, PAIS International, Social Services Abstracts, Sociological Abstracts, Digital Dissertations and the Health Technology Assessment Database. We searched SCOPUS to identify articles that cited any of the included studies on October 7 2010. We included empirical studies of cohorts of systematic reviews that assessed methods for measuring effects on health inequalities. Data were extracted using a pre-tested form by two independent reviewers. Risk of bias was appraised for included studies according to the potential for bias in selection and detection of systematic reviews. Thirty-four methodological studies were included. The methods used by these included studies were: 1) targeted approaches (n=22); 2) gap approaches (n=12); and 3) the gradient approach (n=1). Gender or sex was assessed in eight out of 34 studies, socioeconomic status in ten studies, race/ethnicity in seven studies, age in seven studies, low- and middle-income countries in 14 studies, and two studies assessed multiple factors across which health inequity may exist. Only three studies provided a definition of health equity. Four methodological approaches to assessing effects on health equity were identified: 1) descriptive assessment of reporting and analysis in systematic reviews (all 34 studies used a type of descriptive method); 2) descriptive assessment of reporting and analysis in original trials (12/34 studies); 3) analytic approaches (10/34 studies); and 4) applicability assessment (11/34 studies). Neither analytic nor applicability approaches were reported transparently or in sufficient detail to judge their credibility. There is a need for improvement in conceptual clarity about the definition of health equity, describing sufficient detail about analytic approaches (including subgroup analyses) and transparent reporting of judgments required for applicability assessments in order to assess and report effects on health equity in systematic reviews.
Saccone, Gabriele; Caissutti, Claudia; Khalifeh, Adeeb; Meltzer, Sara; Scifres, Christina; Simhan, Hyagriv N; Kelekci, Sefa; Sevket, Osman; Berghella, Vincenzo
2017-12-03
To compare both the prevalence of gestational diabetes mellitus (GDM) as well as maternal and neonatal outcomes by either the one-step or the two-step approaches. Electronic databases were searched from their inception until June 2017. We included all randomized controlled trials (RCTs) comparing the one-step with the two-step approaches for the screening and diagnosis of GDM. The primary outcome was the incidence of GDM. Three RCTs (n = 2333 participants) were included in the meta-analysis. 910 were randomized to the one step approach (75 g, 2 hrs), and 1423 to the two step approach. No significant difference in the incidence of GDM was found comparing the one step versus the two step approaches (8.4 versus 4.3%; relative risk (RR) 1.64, 95%CI 0.77-3.48). Women screened with the one step approach had a significantly lower risk of preterm birth (PTB) (3.7 versus 7.6%; RR 0.49, 95%CI 0.27-0.88), cesarean delivery (16.3 versus 22.0%; RR 0.74, 95%CI 0.56-0.99), macrosomia (2.9 versus 6.9%; RR 0.43, 95%CI 0.22-0.82), neonatal hypoglycemia (1.7 versus 4.5%; RR 0.38, 95%CI 0.16-0.90), and admission to neonatal intensive care unit (NICU) (4.4 versus 9.0%; RR 0.49, 95%CI 0.29-0.84), compared to those randomized to screening with the two step approach. The one and the two step approaches were not associated with a significant difference in the incidence of GDM. However, the one step approach was associated with better maternal and perinatal outcomes.
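For reference, pooled relative risks of the kind reported here are typically obtained by inverse-variance weighting of log relative risks; a minimal fixed-effect sketch, with hypothetical per-trial counts rather than the three included RCTs' data:

```python
import numpy as np

def pooled_rr(studies):
    """Fixed-effect inverse-variance pooling of relative risks on the log scale.
    Each study is (events_1, n_1, events_2, n_2)."""
    log_rr, weights = [], []
    for e1, n1, e2, n2 in studies:
        rr = (e1 / n1) / (e2 / n2)
        se = np.sqrt(1/e1 - 1/n1 + 1/e2 - 1/n2)  # SE of log RR
        log_rr.append(np.log(rr))
        weights.append(1 / se**2)
    log_rr, weights = np.array(log_rr), np.array(weights)
    pooled = (weights * log_rr).sum() / weights.sum()
    se_pooled = 1 / np.sqrt(weights.sum())
    ci = np.exp([pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled])
    return np.exp(pooled), ci

# Hypothetical (GDM cases, n) pairs for three trials, one-step vs two-step arms
print(pooled_rr([(30, 300, 20, 450), (25, 310, 22, 480), (21, 300, 19, 493)]))
```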
Wichlas, Florian; Tsitsilonis, Serafim; Kopf, Sebastian; Krapohl, Björn Dirk; Manegold, Sebastian
2017-01-01
Introduction: The aim of the present study is to develop a heuristic that could replace the surgeon's analysis for the decision on the operative approach of distal radius fractures based on simple fracture characteristics. Patients and methods: Five hundred distal radius fractures operated between 2011 and 2014 were analyzed for the surgeon's decision on the approach used. The 500 distal radius fractures were treated with open reduction and internal fixation through palmar, dorsal, and dorsopalmar approaches with 2.4 mm locking plates or underwent percutaneous fixation. The parameters that should replace the surgeon's analysis were the fractured palmar cortex, and the frontal and the sagittal split of the articular surface of the distal radius. Results: The palmar approach was used for 422 (84.4%) fractures, the dorsal approach for 39 (7.8%), and the combined dorsopalmar approach for 30 (6.0%). Nine (1.8%) fractures were treated percutaneously. The correlation between the fractured palmar cortex and the used palmar approach was moderate (r=0.464; p<0.0001). The correlation between the frontal split and the dorsal approach, including the dorsopalmar approach, was strong (r=0.715; p<0.0001). The sagittal split had only a weak correlation for the dorsal and dorsopalmar approach (r=0.300; p<0.0001). Discussion: The study shows that the surgical decision on the preferred approach is dictated through two simple factors, even in the case of complex fractures. Conclusion: When the palmar cortex is displaced in distal radius fractures, a palmar approach should be used. When there is a displaced frontal split of the articular surface, a dorsal approach should be used. When both are present, a dorsopalmar approach should be used. These two simple parameters could replace the surgeon's analysis for the surgical approach.
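The conclusion's two-parameter heuristic maps directly onto a small decision function; a sketch with a hypothetical encoding, not the authors' software:

```python
def radius_approach(palmar_cortex_displaced: bool, frontal_split_displaced: bool) -> str:
    """Heuristic from the study's conclusion for distal radius fractures."""
    if palmar_cortex_displaced and frontal_split_displaced:
        return "dorsopalmar approach"
    if frontal_split_displaced:
        return "dorsal approach"
    if palmar_cortex_displaced:
        return "palmar approach"
    return "no approach indicated by these two parameters alone"

print(radius_approach(True, False))  # -> palmar approach
print(radius_approach(True, True))   # -> dorsopalmar approach
```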
Advancing Alternative Analysis: Integration of Decision Science
Zaunbrecher, Virginia M.; Batteate, Christina M.; Blake, Ann; Carroll, William F.; Corbett, Charles J.; Hansen, Steffen Foss; Lempert, Robert J.; Linkov, Igor; McFadden, Roger; Moran, Kelly D.; Olivetti, Elsa; Ostrom, Nancy K.; Romero, Michelle; Schoenung, Julie M.; Seager, Thomas P.; Sinsheimer, Peter; Thayer, Kristina A.
2017-01-01
Background: Decision analysis—a systematic approach to solving complex problems—offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. Objectives: We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. Methods: A workshop was convened that included representatives from government, academia, business, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on other groups’ findings. Results: We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. Conclusions: We advance four recommendations: a) engaging the systematic development and evaluation of decision approaches and tools; b) using case studies to advance the integration of decision analysis into alternatives analysis; c) supporting transdisciplinary research; and d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483 PMID:28669940
NASA Technical Reports Server (NTRS)
Towner, Robert L.; Band, Jonathan L.
2012-01-01
An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
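The Modal Assurance Criterion mentioned above is a normalized correlation between mode-shape vectors; a minimal sketch (the mode shapes are hypothetical):

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion matrix between two sets of real mode shapes.
    phi_a, phi_b: (dof x modes) arrays; MAC[i, j] near 1 => shapes correlate."""
    num = np.abs(phi_a.T @ phi_b) ** 2
    den = np.outer(np.sum(phi_a * phi_a, axis=0), np.sum(phi_b * phi_b, axis=0))
    return num / den

# Hypothetical 4-DOF models with two modes each
phi_1 = np.array([[1.0, 1.0], [2.0, 0.5], [3.0, -0.5], [4.0, -1.0]])
phi_2 = np.array([[1.1, 1.0], [1.9, 0.6], [3.1, -0.4], [3.9, -1.1]])
print(mac(phi_1, phi_2))  # large diagonal terms indicate tracked mode pairs
```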
NASA Astrophysics Data System (ADS)
Lenticchia, E.; Coïsson, E.
2017-05-01
The present paper proposes the use of GIS for the application of the so-called phenomenological approach to the analysis of the seismic behaviour of historical buildings. This approach is based on the awareness that the different masonry building typologies are characterized by different, recurring vulnerabilities. Thus, the observation and classification of the real damage is seen as the first step for recognizing and classifying these vulnerabilities, in order to plan focused preventive interventions. For these purposes, the GIS has proven to be a powerful instrument to collect and manage this type of information on a large number of cases. This paper specifically focuses on the application of the phenomenological approach to the analysis of the seismic behaviour of fortified buildings, including castles, fortresses, citadels, and all the typical historical constructions characterized by the presence of massive towers and defensive walls. The main earthquakes which struck Italy in the last 40 years (up to the recent Central Italy seismic swarm) were taken into consideration and described by means of shake maps. A previously published work has been continued with the addition of new data and some improvements, including a specific symbology for the description of building typologies and conservation status on the maps, the indications of damage levels and the comparison between shake maps in terms of pga and in terms of pseudo-acceleration. The increase in knowledge obtained and the broader frame given by the analysis of the data are here directed to the primary aim of cultural heritage preservation.
ERIC Educational Resources Information Center
Emerson, Eric
2006-01-01
Sturmey (2005) argues that the evidence base underlying approaches to intervention based on applied behavioural analysis (ABA) is significantly stronger than that underlying approaches to intervention based on cognitive therapy. He concludes that "the ethical imperative of beneficence requires that people, including people with ID, receive known…
The analysis of the pilot's cognitive and decision processes
NASA Technical Reports Server (NTRS)
Curry, R. E.
1975-01-01
Articles are presented on pilot performance in zero-visibility precision approach, failure detection by pilots during automatic landing, experiments in pilot decision-making during simulated low visibility approaches, a multinomial maximum likelihood program, and a random search algorithm for laboratory computers. Other topics discussed include detection of system failures in multi-axis tasks and changes in pilot workload during an instrument landing.
ERIC Educational Resources Information Center
Mundia, Lawrence
2012-01-01
This mixed-methods study incorporated elements of survey, case study and action research approaches in investigating an at-risk child. Using an in-take interview, a diagnostic test, an error analysis, and a think-aloud clinical interview, the study identified the child's major presenting difficulties. These included: inability to use the four…
ERIC Educational Resources Information Center
Derfoufi, Sanae; Benmoussa, Adnane; El Harti, Jaouad; Ramli, Youssef; Taoufik, Jamal; Chaouir, Souad
2015-01-01
This study investigates the positive impact of the Case Method implemented during a 4-hour tutorial in the "therapeutic chemistry module." We view the Case Method as one particular approach within the broader spectrum of problem-based or inquiry-based learning approaches. Sixty students were included in data analysis. A pre-test and…
Estimating individual benefits of medical or behavioral treatments in severely ill patients.
Diaz, Francisco J
2017-01-01
There is a need for statistical methods appropriate for the analysis of clinical trials from a personalized-medicine viewpoint as opposed to the common statistical practice that simply examines average treatment effects. This article proposes an approach to quantifying, reporting and analyzing individual benefits of medical or behavioral treatments to severely ill patients with chronic conditions, using data from clinical trials. The approach is a new development of a published framework for measuring the severity of a chronic disease and the benefits treatments provide to individuals, which utilizes regression models with random coefficients. Here, a patient is considered to be severely ill if the patient's basal severity is close to one. This allows the derivation of a very flexible family of probability distributions of individual benefits that depend on treatment duration and the covariates included in the regression model. Our approach may enrich the statistical analysis of clinical trials of severely ill patients because it allows investigating the probability distribution of individual benefits in the patient population and the variables that influence it, and we can also measure the benefits achieved in specific patients including new patients. We illustrate our approach using data from a clinical trial of the anti-depressant imipramine.
Thermal Design Overview of the Mars Exploration Rover Project
NASA Technical Reports Server (NTRS)
Tsuyuki, Glenn
2002-01-01
Contents include the following: Mission Overview. Thermal Environments. Driving Thermal Requirements. Thermal Design Approach. Thermal Control Block Diagram. Thermal Design Description. Thermal Analysis Results Summary. Testing Plans. Issues & Concerns.
Inelastic and Dynamic Fracture and Stress Analyses
NASA Technical Reports Server (NTRS)
Atluri, S. N.
1984-01-01
Large deformation inelastic stress analysis and inelastic and dynamic crack propagation research work is summarized. The salient topics of interest in engine structure analysis that are discussed herein include: (1) a path-independent integral (T) in inelastic fracture mechanics, (2) analysis of dynamic crack propagation, (3) generalization of constitutive relations of inelasticity for finite deformations, (4) complementary energy approaches in inelastic analyses, and (5) objectivity of time integration schemes in inelastic stress analysis.
The NASA Monographs on Shell Stability Design Recommendations: A Review and Suggested Improvements
NASA Technical Reports Server (NTRS)
Nemeth, Michael P.; Starnes, James H., Jr.
1998-01-01
A summary of existing NASA design criteria monographs for the design of buckling-resistant thin-shell structures is presented. Subsequent improvements in the analysis for nonlinear shell response are reviewed, and current issues in shell stability analysis are discussed. Examples of nonlinear shell responses that are not included in the existing shell design monographs are presented, and an approach for including reliability based analysis procedures in the shell design process is discussed. Suggestions for conducting future shell experiments are presented, and proposed improvements to the NASA shell design criteria monographs are discussed.
Optimization Techniques for Clustering,Connectivity, and Flow Problems in Complex Networks
2012-10-01
discrete optimization and for analysis of the performance of algorithm portfolios; introducing a metaheuristic framework of variable objective search that... The results of empirical evaluation of the proposed algorithm are also included. Theoretical analysis of heuristics and designing new metaheuristics... analysis of heuristics for inapproximable problems and designing new metaheuristic approaches for the problems of interest; (IV) developing new models
Three-dimensional Stress Analysis Using the Boundary Element Method
NASA Technical Reports Server (NTRS)
Wilson, R. B.; Banerjee, P. K.
1984-01-01
The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response indicated.
A Comparative Analysis of Method Books for Class Jazz Instruction
ERIC Educational Resources Information Center
Watson, Kevin E.
2017-01-01
The purpose of this study was to analyze and compare instructional topics and teaching approaches included in selected class method books for jazz pedagogy through content analysis methodology. Frequency counts for the number of pages devoted to each defined instructional content category were compiled and percentages of pages allotted to each…
USDA-ARS?s Scientific Manuscript database
Using five centimeter resolution images acquired with an unmanned aircraft system (UAS), we developed and evaluated an image processing workflow that included the integration of resolution-appropriate field sampling, feature selection, object-based image analysis, and processing approaches for UAS i...
2017-05-25
Research Design... The research employed a mixed research methodology – quantitative with descriptive statistical analysis and qualitative with a thematic analysis approach, using interviews to collect the data. The interviews included demographic and open-ended
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-13
... emissions impact of the clean fuels exclusion, MDNR relied on a technical analysis of emissions from units... The correspondence referenced above also included a technical analysis demonstrating that the averaging approach...
NASA Technical Reports Server (NTRS)
1976-01-01
The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.
Energy Analysis of Offshore Systems | Wind | NREL
successful research to understand and improve the cost of wind generation technology. Research approaches are used to estimate direct and indirect economic impacts of offshore wind. (Chart: cost data for report on cost trends.) Recent studies include: Analysis of capital cost trends for planned and installed
Organizational Economics: Notes on the Use of Transaction-Cost Theory in the Study of Organizations.
ERIC Educational Resources Information Center
Robins, James A.
1987-01-01
Reviews transaction-cost approaches to organizational analysis, examines their use in microeconomic theory, and identifies some important flaws in the study. Advocates transaction-cost theory as a powerful tool for organizational and strategic analysis when set within the framework of more general organizational theory. Includes 61 references. (MLH)
Oligonucleotide microarrays and other ‘omics’ approaches are powerful tools for unsupervised analysis of chemical impacts on biological systems. However, the lack of well annotated biological pathways for many aquatic organisms, including fish, and the poor power of microarray-b...
Effectiveness of Occupational Health and Safety Training: A Systematic Review with Meta-Analysis
ERIC Educational Resources Information Center
Ricci, Federico; Chiesi, Andrea; Bisio, Carlo; Panari, Chiara; Pelosi, Annalisa
2016-01-01
Purpose: This meta-analysis aims to verify the efficacy of occupational health and safety (OHS) training in terms of knowledge, attitude and beliefs, behavior and health. Design/methodology/approach: The authors included studies published in English (2007-2014) selected from ten databases. Eligibility criteria were studies concerned with the…
A Market-oriented Approach To Maximizing Product Benefits: Cases in U.S. Forest Products Industries
Vijay S. Reddy; Robert J. Bush; Ronen Roudik
1996-01-01
Conjoint analysis, a decompositional customer preference modelling technique, has seen little application to forest products. However, the technique provides useful information for marketing decisions by quantifying consumer preference functions for multiattribute product alternatives. The results of a conjoint analysis include the contribution of each attribute and...
Tuan Pham; Julia Jones; Ronald Metoyer; Frederick Colwell
2014-01-01
The study of the diversity of multivariate objects shares common characteristics and goals across disciplines, including ecology and organizational management. Nevertheless, subject-matter experts have adopted somewhat separate diversity concepts and analysis techniques, limiting the potential for sharing and comparing across disciplines. Moreover, while large and...
Educational Leadership Effectiveness: A Rasch Analysis
ERIC Educational Resources Information Center
Sinnema, Claire; Ludlow, Larry; Robinson, Viviane
2016-01-01
Purpose: The purposes of this paper are, first, to establish the psychometric properties of the ELP tool, and, second, to test, using a Rasch item response theory analysis, the hypothesized progression of challenge presented by the items included in the tool. Design/ Methodology/ Approach: Data were collected at two time points through a survey of…
This analysis updates EPA's standard VSL estimate by using a more comprehensive collection of VSL studies that include studies published between 1992 and 2000, as well as applying a more appropriate statistical method. We provide a pooled effect VSL estimate by applying the empi...
Local Orthogonal Cutting Method for Computing Medial Curves and Its Biomedical Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiao, Xiangmin; Einstein, Daniel R.; Dyedov, Volodymyr
2010-03-24
Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques including eigenvalue analysis, weighted least squares approximations, and numerical minimization, resulting in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.
CONFIG: Integrated engineering of systems and their operation
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Ryan, Dan; Fleming, Land
1994-01-01
This article discusses CONFIG 3, a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operations of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. CONFIG supports integration among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. CONFIG is designed to support integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems.
The Faster, Better, Cheaper Approach to Space Missions: An Engineering Management Assessment
NASA Technical Reports Server (NTRS)
Hamaker, Joe
2000-01-01
This paper describes, in viewgraph form, the faster, better, cheaper approach to space missions. The topics include: 1) What drives "Faster, Better, Cheaper"? 2) Why Space Programs are Costly; 3) Background; 4) Aerospace Project Management (Old Culture); 5) Aerospace Project Management (New Culture); 6) Scope of Analysis Limited to Engineering Management Culture; 7) Qualitative Analysis; 8) Some Basic Principles of the New Culture; 9) Cause and Effect; 10) "New Ways of Doing Business" Survey Results; 11) Quantitative Analysis; 12) Recent Space System Cost Trends; 13) Spacecraft Dry Weight Trend; 14) Complexity Factor Trends; 15) Cost Normalization; 16) Cost Normalization Algorithm; 17) Unnormalized Cost vs. Normalized Cost; and 18) Concluding Observations.
Use of Multiscale Entropy to Facilitate Artifact Detection in Electroencephalographic Signals
Mariani, Sara; Borges, Ana F. T.; Henriques, Teresa; Goldberger, Ary L.; Costa, Madalena D.
2016-01-01
Electroencephalographic (EEG) signals present a myriad of challenges to analysis, beginning with the detection of artifacts. Prior approaches to noise detection have utilized multiple techniques, including visual methods, independent component analysis and wavelets. However, no single method is broadly accepted, inviting alternative ways to address this problem. Here, we introduce a novel approach based on a statistical physics method, multiscale entropy (MSE) analysis, which quantifies the complexity of a signal. We postulate that noise corrupted EEG signals have lower information content, and, therefore, reduced complexity compared with their noise free counterparts. We test the new method on an open-access database of EEG signals with and without added artifacts due to electrode motion. PMID:26738116
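MSE analysis coarse-grains the signal at increasing scales and computes sample entropy at each scale; the sketch below follows that recipe with the common parameter choices m=2 and r=0.15 times the signal SD, which are assumptions here rather than the paper's settings:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        # count pairs within tolerance r, excluding self-matches on the diagonal
        return (d <= r).sum() - len(templates)
    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5):
    """Coarse-grain by non-overlapping averaging, then compute SampEn per scale."""
    x = np.asarray(x, dtype=float)
    r = 0.15 * x.std()  # tolerance fixed from the original series
    out = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = x[:n * tau].reshape(n, tau).mean(axis=1)
        out.append(sample_entropy(coarse, m=2, r=r))
    return out

rng = np.random.default_rng(0)
print(multiscale_entropy(rng.standard_normal(600)))  # white noise: entropy falls with scale
```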
Wang, Chen-guang; Li, Yao-min; Zhang, Hua-feng; Li, Hui; Li, Zhi-jun
2016-03-01
We performed a meta-analysis, pooling the results from controlled clinical trials to compare the efficacy of anterior and posterior surgical approaches to Pipkin I and II fractures of the femoral head. Potential academic articles were identified from the Cochrane Library, Medline (1966-2015.5), PubMed (1966-2015.5), Embase (1980-2015.5) and ScienceDirect (1966-2015.5) databases. Grey-literature studies were identified from the references of the included literature. Pooling of the data was performed and analyzed by RevMan software, version 5.1. Five case-control trials (CCTs) met the inclusion criteria. There were significant differences in the incidence of heterotopic ossification (HO) between the approaches, but no significant differences were found between the two groups regarding functional outcomes of the hip, general postoperative complications, osteonecrosis of the femoral head or post-traumatic arthritis. The present meta-analysis indicated that the posterior approach decreased the risk of heterotopic ossification compared with the anterior approach for the treatment of Pipkin I and II femoral head fractures. No other complications were related to anterior and posterior approaches. Future high-quality randomized, controlled trials (RCTs) are needed to determine the optimal surgical approach and to predict other postoperative complications. Level of evidence: III.
2008-12-01
...Return on Investment (ROI) of the Zephyr system. This is achieved by (1) developing a model to carry out Business Case Analysis (BCA) of JCTDs, including
Analysis of Crystallization Kinetics
NASA Technical Reports Server (NTRS)
Kelton, Kenneth F.
1997-01-01
A realistic computer model for polymorphic crystallization (i.e., initial and final phases with identical compositions), which includes time-dependent nucleation and cluster-size-dependent growth rates, is developed and tested by fits to experimental data. Model calculations are used to assess the validity of two of the more common approaches for the analysis of crystallization data. The effects of particle size on transformation kinetics, important for the crystallization of many systems of limited dimension including thin films, fine powders, and nanoparticles, are examined.
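For orientation, the classical limiting case that a time-dependent-nucleation model generalizes is the Johnson-Mehl-Avrami-Kolmogorov (JMAK) description with constant nucleation and growth rates; identifying JMAK as one of the 'common approaches' assessed here is an assumption:

```latex
% JMAK transformed fraction for 3-D interface-limited growth with constant
% nucleation rate I and growth rate u (polymorphic case):
X(t) = 1 - \exp\!\left(-\tfrac{\pi}{3}\, I\, u^{3}\, t^{4}\right)
     \approx 1 - \exp\!\left(-K t^{n}\right), \qquad n = 4 .
% Time-dependent nucleation and cluster-size-dependent growth replace the
% constants I and u with functions of time and cluster size.
```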
NASA Technical Reports Server (NTRS)
Hopkins, D. A.
1984-01-01
A unique upward-integrated top-down-structured approach is presented for nonlinear analysis of high-temperature multilayered fiber composite structures. Based on this approach, a special purpose computer code was developed (nonlinear COBSTRAN) which is specifically tailored for the nonlinear analysis of tungsten-fiber-reinforced superalloy (TFRS) composite turbine blade/vane components of gas turbine engines. Special features of this computational capability include accounting of micro- and macro-heterogeneity, nonlinear (stress-temperature-time dependent) and anisotropic material behavior, and fiber degradation. A demonstration problem is presented to manifest the utility of the upward-integrated top-down-structured approach, in general, and to illustrate the present capability represented by the nonlinear COBSTRAN code. Preliminary results indicate that nonlinear COBSTRAN provides the means for relating the local nonlinear and anisotropic material behavior of the composite constituents to the global response of the turbine blade/vane structure.
DIELECTROPHORESIS-BASED MICROFLUIDIC SEPARATION AND DETECTION SYSTEMS
Yang, Jun; Vykoukal, Jody; Noshari, Jamileh; Becker, Frederick; Gascoyne, Peter; Krulevitch, Peter; Fuller, Chris; Ackler, Harold; Hamilton, Julie; Boser, Bernhard; Eldredge, Adam; Hitchens, Duncan; Andrews, Craig
2009-01-01
Diagnosis and treatment of human diseases frequently requires isolation and detection of certain cell types from a complex mixture. Compared with traditional separation and detection techniques, microfluidic approaches promise to yield easy-to-use diagnostic instruments tolerant of a wide range of operating environments and capable of accomplishing automated analyses. These approaches will enable diagnostic advances to be disseminated from sophisticated clinical laboratories to the point-of-care. Applications will include the separation and differential analysis of blood cell subpopulations for host-based detection of blood cell changes caused by disease, infection, or exposure to toxins, and the separation and analysis of surface-sensitized, custom dielectric beads for chemical, biological, and biomolecular targets. Here we report a new particle separation and analysis microsystem that uses dielectrophoretic field-flow fractionation (DEP-FFF). The system consists of a microfluidic chip with integrated sample injector, a DEP-FFF separator, and an AC impedance sensor. We show the design of a miniaturized impedance sensor integrated circuit (IC) with improved sensitivity, a new packaging approach for micro-flumes that features a slide-together compression package and novel microfluidic interconnects, and the design, control, integration and packaging of a fieldable prototype. Illustrative applications will be shown, including the separation of different sized beads and different cell types, blood cell differential analysis, and impedance sensing results for beads, spores and cells. PMID:22025905
Estimating hazard ratios in cohort data with missing disease information due to death.
Binder, Nadine; Herrnböck, Anne-Sophie; Schumacher, Martin
2017-03-01
In clinical and epidemiological studies information on the primary outcome of interest, that is, the disease status, is usually collected at a limited number of follow-up visits. The disease status can often only be retrieved retrospectively in individuals who are alive at follow-up, but will be missing for those who died before. Right-censoring the death cases at the last visit (ad-hoc analysis) yields biased hazard ratio estimates of a potential risk factor, and the bias can be substantial and occur in either direction. In this work, we investigate three different approaches that use the same likelihood contributions derived from an illness-death multistate model in order to more adequately estimate the hazard ratio by including the death cases into the analysis: a parametric approach, a penalized likelihood approach, and an imputation-based approach. We investigate to which extent these approaches allow for an unbiased regression analysis by evaluating their performance in simulation studies and on a real data example. In doing so, we use the full cohort with complete illness-death data as reference and artificially induce missing information due to death by setting discrete follow-up visits. Compared to an ad-hoc analysis, all considered approaches provide less biased or even unbiased results, depending on the situation studied. In the real data example, the parametric approach is seen to be too restrictive, whereas the imputation-based approach could almost reconstruct the original event history information. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Whole-genome CNV analysis: advances in computational approaches.
Pirooznia, Mehdi; Goes, Fernando S; Zandi, Peter P
2015-01-01
Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development.
Shpynov, S; Pozdnichenko, N; Gumenuk, A
2015-01-01
Genome sequences of 36 Rickettsia and Orientia were analyzed using Formal Order Analysis (FOA). This approach takes into account the arrangement of nucleotides in each sequence. A numerical characteristic, the average distance (remoteness) "g", was used to compare genomes. Our results corroborated the previous separation of three groups within the genus Rickettsia, including the typhus group, the classic spotted fever group, and the ancestral group, and of Orientia as a separate genus. Rickettsia felis URRWXCal2 and R. akari Hartford were not in the same group based on FOA; therefore, designation of a so-called transitional Rickettsia group could not be confirmed with this approach.
2014-09-01
simulation time frame from 30 days to one year. This was enabled by porting the simulation to the Pleiades supercomputer at NASA Ames Research Center, a...including the motivation for changes to our past approach. We then present the software implementation on the NASA Ames Pleiades supercomputer...significantly updated since last year's paper [25]. The main incentive for that was the shift to a highly parallel approach in order to utilize the Pleiades
Integrated orbital servicing study follow-on. Volume 2: Technical analysis and system design
NASA Technical Reports Server (NTRS)
1978-01-01
In-orbit service functional and physical requirements to support both low and high Earth orbit servicing/maintenance operations were defined, an optimum servicing system configuration was developed and mockups and early prototype hardware were fabricated to demonstrate and validate the concepts selected. Significant issues addressed include criteria for concept selection; representative mission equipment and approaches to their design for serviceability; significant serviceable spacecraft design aspects; servicer mechanism operation in one-g; approaches for the demonstration/simulation; and service mechanism structure design approach.
Understanding Chemical Changes through Sugar Heating.
ERIC Educational Resources Information Center
Papageorgiou, George
1998-01-01
Presents an approach that can help students to understand what happens in an experiment. Uses overlapping transparencies of both the experiment and the analysis. Includes details of the experiment and transparency preparation. (DDR)
Next generation system modeling of NTR systems
NASA Technical Reports Server (NTRS)
Buksa, John J.; Rider, William J.
1993-01-01
The topics are presented in viewgraph form and include the following: nuclear thermal rocket (NTR) modeling challenges; current approaches; shortcomings of current analysis method; future needs; and present steps to these goals.
Experimental analysis of computer system dependability
NASA Technical Reports Server (NTRS)
Iyer, Ravishankar, K.; Tang, Dong
1993-01-01
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
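As a toy illustration of the simulated (software-implemented) fault injection idea surveyed above, a single bit can be flipped in a value's IEEE-754 representation and the computation re-run to see whether the fault propagates; environments such as FIAT or DEPEND are of course far richer than this sketch:

```python
import struct

def flip_bit(value: float, bit: int) -> float:
    """Inject a single-bit fault into the IEEE-754 representation of a float."""
    (as_int,) = struct.unpack("<Q", struct.pack("<d", value))
    (faulty,) = struct.unpack("<d", struct.pack("<Q", as_int ^ (1 << bit)))
    return faulty

def compute(x):
    return 3.0 * x + 1.0  # stand-in for the workload under test

# Run with and without an injected bit flip; check whether the fault propagates.
x = 2.5
for bit in (0, 30, 52, 62):  # positions in the mantissa (0, 30) and exponent (52, 62)
    faulty = compute(flip_bit(x, bit))
    print(bit, faulty, "propagated" if abs(faulty - compute(x)) > 1e-6 else "masked")
```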
Rotor design optimization using a free wake analysis
NASA Technical Reports Server (NTRS)
Quackenbush, Todd R.; Boschitsch, Alexander H.; Wachspress, Daniel A.; Chua, Kiat
1993-01-01
The aim of this effort was to develop a comprehensive performance optimization capability for tiltrotor and helicopter blades. The analysis incorporates the validated EHPIC (Evaluation of Hover Performance using Influence Coefficients) model of helicopter rotor aerodynamics within a general linear/quadratic programming algorithm that allows optimization using a variety of objective functions involving the performance. The resulting computer code, EHPIC/HERO (HElicopter Rotor Optimization), improves upon several features of the previous EHPIC performance model and allows optimization utilizing a wide spectrum of design variables, including twist, chord, anhedral, and sweep. The new analysis supports optimization of a variety of objective functions, including weighted measures of rotor thrust, power, and propulsive efficiency. The fundamental strength of the approach is that an efficient search for improved versions of the baseline design can be carried out while retaining the demonstrated accuracy inherent in the EHPIC free wake/vortex lattice performance analysis. Sample problems are described that demonstrate the success of this approach for several representative rotor configurations in hover and axial flight. Features that were introduced to convert earlier demonstration versions of this analysis into a generally applicable tool for researchers and designers are also discussed.
NASA Technical Reports Server (NTRS)
Stoll, Frederick
1993-01-01
The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.
NASA Technical Reports Server (NTRS)
Duffy, S. F.; Hu, J.; Hopkins, D. A.
1995-01-01
The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next the fundamental aspects of a probabilistic failure analysis are explored and it is shown that deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials) the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
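As a concrete illustration of the closed-form Weibull result and the Monte Carlo alternative discussed above, the following Python sketch compares the two for a single component under uniaxial stress. The shape and scale parameters are illustrative assumptions, not values from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-parameter Weibull model for brittle failure (illustrative values,
# not from the article): shape m and scale sigma_0 in MPa.
m, sigma_0 = 10.0, 300.0
applied_stress = 250.0  # MPa

# Closed-form failure probability, the "rare instance" where the
# probabilistic analysis admits an analytical solution.
p_fail_exact = 1.0 - np.exp(-(applied_stress / sigma_0) ** m)

# Monte Carlo estimate: sample component strengths from the same Weibull
# distribution and count how often strength falls below the applied stress.
n = 100_000
strengths = sigma_0 * rng.weibull(m, size=n)
p_fail_mc = np.mean(strengths < applied_stress)

print(f"exact: {p_fail_exact:.4f}  monte carlo: {p_fail_mc:.4f}")
```

When the limit state involves several correlated random variables, no closed form exists, which is the situation the article's Monte Carlo and FPI discussions target.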
Hooks, J; Wolfberg, A J; Wang, E T; Struble, C A; Zahn, J; Juneau, K; Mohseni, M; Huang, S; Bogard, P; Song, K; Oliphant, A; Musci, T J
2014-05-01
To assess the performance of a directed chromosomal analysis approach in the prenatal evaluation of fetal sex chromosome aneuploidy. We analyzed 432 frozen maternal plasma samples obtained from patients prior to undergoing fetal diagnostic testing. The cohort included women greater than 18 years of age with a singleton pregnancy of greater than 10 weeks gestation. Samples were analyzed using a chromosome-selective approach (DANSR™) and a risk algorithm that incorporates fetal fraction (FORTE™). The cohort included 34 cases of sex chromosome aneuploidy. The assay correctly identified 26 of 27 (92.6%) cases of Monosomy X, one case of XXX, and all six cases of XXY. There were four false positive cases of sex chromosome aneuploidy among 380 euploid cases for an overall false positive rate of less than 1%. Analysis of the risk for sex chromosome aneuploidies can be accomplished with a targeted assay with high sensitivity. © 2014 John Wiley & Sons, Ltd.
Inference and Prediction of Metabolic Network Fluxes
Nikoloski, Zoran; Perez-Storey, Richard; Sweetlove, Lee J.
2015-01-01
In this Update, we cover the basic principles of the estimation and prediction of the rates of the many interconnected biochemical reactions that constitute plant metabolic networks. This includes metabolic flux analysis approaches that utilize the rates or patterns of redistribution of stable isotopes of carbon and other atoms to estimate fluxes, as well as constraints-based optimization approaches such as flux balance analysis. Some of the major insights that have been gained from analysis of fluxes in plants are discussed, including the functioning of metabolic pathways in a network context, the robustness of the metabolic phenotype, the importance of cell maintenance costs, and the mechanisms that enable energy and redox balancing at steady state. We also discuss methodologies to exploit 'omic data sets for the construction of tissue-specific metabolic network models and to constrain the range of permissible fluxes in such models. Finally, we consider the future directions and challenges faced by the field of metabolic network flux phenotyping. PMID:26392262
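Constraints-based approaches such as flux balance analysis reduce, at their core, to a linear program: maximize an objective flux subject to steady-state mass balance and flux bounds. The toy network below (reaction and metabolite names are hypothetical) sketches this with scipy; real plant models involve thousands of reactions and the tissue-specific constraints discussed above.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions).
# Hypothetical reactions: R1 uptake -> A, R2: A -> B, R3: B -> biomass.
S = np.array([
    [1, -1,  0],   # metabolite A balance
    [0,  1, -1],   # metabolite B balance
])

# Flux bounds: uptake limited to 10 units, internal fluxes up to 1000.
bounds = [(0, 10), (0, 1000), (0, 1000)]

# Maximize the biomass flux v3 (linprog minimizes, so negate).
c = np.array([0.0, 0.0, -1.0])

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)   # steady state forces v1 = v2 = v3 = 10
```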
An introduction to autonomous control systems
NASA Technical Reports Server (NTRS)
Antsaklis, Panos J.; Passino, Kevin M.; Wang, S. J.
1991-01-01
The functions, characteristics, and benefits of autonomous control are outlined. An autonomous control functional architecture for future space vehicles that incorporates the concepts and characteristics described is presented. The controller is hierarchical, with an execution level (the lowest level), coordination level (middle level), and management and organization level (highest level). The general characteristics of the overall architecture, including those of the three levels, are explained, and an example to illustrate their functions is given. Mathematical models for autonomous systems, including 'logical' discrete event system models, are discussed. An approach to the quantitative, systematic modeling, analysis, and design of autonomous controllers is also discussed. It is a hybrid approach since it uses conventional analysis techniques based on difference and differential equations and new techniques for the analysis of the systems described with a symbolic formalism such as finite automata. Some recent results from the areas of planning and expert systems, machine learning, artificial neural networks, and the area of restructurable controls are briefly outlined.
Multispectral analysis tools can increase utility of RGB color images in histology
NASA Astrophysics Data System (ADS)
Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard
2018-04-01
Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools is demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV-surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
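To make the spectral phasor idea concrete for the three-channel case, the sketch below maps each RGB pixel to a point in a 2-D phasor plane using the first Fourier harmonic of its channel intensities. This is a minimal reading of the phasor approach, not the authors' exact implementation.

```python
import numpy as np

def rgb_phasor(image):
    """Map an H x W x 3 RGB image to spectral phasor coordinates (g, s).

    Treats the three color channels as a coarse 3-point spectrum and takes
    the first Fourier harmonic, normalized by total intensity, as a minimal
    sketch of the phasor idea.
    """
    img = image.astype(float)
    total = img.sum(axis=-1) + 1e-12          # avoid division by zero
    k = np.arange(3)                          # channel index: R, G, B
    g = (img * np.cos(2 * np.pi * k / 3)).sum(axis=-1) / total
    s = (img * np.sin(2 * np.pi * k / 3)).sum(axis=-1) / total
    return g, s

# Pixels with similar spectral shape cluster together in the (g, s) plane,
# which is what makes phasor plots useful for sample exploration.
demo = np.random.default_rng(1).uniform(0, 255, size=(4, 4, 3))
g, s = rgb_phasor(demo)
```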
Cuevas, Soledad
Agriculture is a major contributor to greenhouse gas emissions, an important part of which is associated with deforestation and indirect land use change. Appropriate and coherent food policies can play an important role in aligning health, economic and environmental goals. From the point of view of policy analysis, however, this requires multi-sectoral, interdisciplinary approaches which can be highly complex. Important methodological advances in the area are not exempt from limitations and criticism. We argue that there is scope for further developments in integrated quantitative and qualitative policy analysis combining existing methods, including mathematical modelling and stakeholder analysis. We outline methodological trends in the field, briefly characterise integrated mixed methods policy analysis and identify contributions, challenges and opportunities for future research. In particular, this type of approach can help address issues of uncertainty and context-specific validity, incorporate multiple perspectives and help advance meaningful interdisciplinary collaboration in the field. Substantial challenges remain, however, such as the integration of key issues related to non-communicable disease, or the incorporation of a broader range of qualitative approaches that can address important cultural and ethical dimensions of food.
Wiggins, Paul A
2015-07-21
This article describes the application of a change-point algorithm to the analysis of stochastic signals in biological systems whose underlying state dynamics consist of transitions between discrete states. Applications of this analysis include molecular-motor stepping, fluorophore bleaching, electrophysiology, particle and cell tracking, detection of copy number variation by sequencing, tethered-particle motion, etc. We present a unified approach to the analysis of processes whose noise can be modeled by Gaussian, Wiener, or Ornstein-Uhlenbeck processes. To fit the model, we exploit explicit, closed-form algebraic expressions for maximum-likelihood estimators of model parameters and estimated information loss of the generalized noise model, which can be computed extremely efficiently. We implement change-point detection using the frequentist information criterion (which, to our knowledge, is a new information criterion). The frequentist information criterion specifies a single, information-based statistical test that is free from ad hoc parameters and requires no prior probability distribution. We demonstrate this information-based approach in the analysis of simulated and experimental tethered-particle-motion data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
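A minimal version of the idea, for a single change in the mean of a Gaussian signal, can be written with the closed-form maximum-likelihood expressions the authors exploit. Note that the fixed penalty below merely stands in for the paper's frequentist information criterion, which sets the threshold in a principled, parameter-free way.

```python
import numpy as np

def best_change_point(x, penalty=10.0):
    """Locate a single change in mean of a Gaussian signal by maximum
    likelihood. The fixed penalty is a stand-in for an information
    criterion; the paper's frequentist information criterion is more
    principled than the arbitrary value used here.
    """
    n = len(x)

    def nll(seg):  # Gaussian negative log-likelihood up to constants
        return 0.5 * len(seg) * np.log(np.var(seg) + 1e-12)

    full = nll(x)
    best_k, best_gain = None, 0.0
    for k in range(2, n - 2):
        gain = full - (nll(x[:k]) + nll(x[k:]))
        if gain > best_gain:
            best_k, best_gain = k, gain
    return best_k if best_gain > penalty else None

rng = np.random.default_rng(2)
signal = np.concatenate([rng.normal(0, 1, 200), rng.normal(1.5, 1, 200)])
print(best_change_point(signal))   # close to the true change point at 200
```

Extending this scan recursively over segments yields multiple change points, which is how stepping traces and bleaching records are typically segmented.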
Quantification of regional fat volume in rat MRI
NASA Astrophysics Data System (ADS)
Sacha, Jaroslaw P.; Cockman, Michael D.; Dufresne, Thomas E.; Trokhan, Darren
2003-05-01
Multiple initiatives in the pharmaceutical and beauty care industries are directed at identifying therapies for weight management. Body composition measurements are critical for such initiatives. Imaging technologies that can be used to measure body composition noninvasively include DXA (dual energy x-ray absorptiometry) and MRI (magnetic resonance imaging). Unlike other approaches, MRI provides the ability to perform localized measurements of fat distribution. Several factors complicate the automatic delineation of fat regions and quantification of fat volumes. These include motion artifacts, field non-uniformity, brightness and contrast variations, chemical shift misregistration, and ambiguity in delineating anatomical structures. We have developed an approach to deal practically with those challenges. The approach is implemented in a package, the Fat Volume Tool, for automatic detection of fat tissue in MR images of the rat abdomen, including automatic discrimination between abdominal and subcutaneous regions. We suppress motion artifacts using masking based on detection of implicit landmarks in the images. Adaptive object extraction is used to compensate for intensity variations. This approach enables us to perform fat tissue detection and quantification in a fully automated manner. The package can also operate in manual mode, which can be used for verification of the automatic analysis or for performing supervised segmentation. In supervised segmentation, the operator has the ability to interact with the automatic segmentation procedures to touch-up or completely overwrite intermediate segmentation steps. The operator's interventions steer the automatic segmentation steps that follow. This improves the efficiency and quality of the final segmentation. Semi-automatic segmentation tools (interactive region growing, live-wire, etc.) improve both the accuracy and throughput of the operator when working in manual mode. The quality of automatic segmentation has been evaluated by comparing the results of fully automated analysis to manual analysis of the same images. The comparison shows a high degree of correlation that validates the quality of the automatic segmentation approach.
USDA-ARS?s Scientific Manuscript database
The objective of this analysis is to estimate and compare the cost-effectiveness of on- and off-field approaches to reducing nitrogen loadings. On-field practices include improving the timing, rate, and method of nitrogen application. Off-field practices include restoring wetlands and establishing v...
Training Teachers to Conduct Trial-Based Functional Analyses
ERIC Educational Resources Information Center
Kunnavatana, S. Shanun; Bloom, Sarah E.; Samaha, Andrew L.; Dayton, Elizabeth
2013-01-01
The trial-based functional analysis (FA) is a promising approach to identification of behavioral function and is especially suited for use in educational settings. Not all studies on trial-based FA have included teachers as therapists, and those that did reported only minimal information on teacher training. The purpose of this study was to…
A realistic evaluation: the case of protocol-based care
2010-01-01
Background 'Protocol based care' was envisioned by policy makers as a mechanism for delivering on the service improvement agenda in England. Realistic evaluation is an increasingly popular approach, but few published examples exist, particularly in implementation research. To fill this gap, within this paper we describe the application of a realistic evaluation approach to the study of protocol-based care, whilst sharing findings of relevance about standardising care through the use of protocols, guidelines, and pathways. Methods Situated between positivism and relativism, realistic evaluation is concerned with the identification of underlying causal mechanisms, how they work, and under what conditions. Fundamentally it focuses attention on finding out what works, for whom, how, and in what circumstances. Results In this research, we were interested in understanding the relationships between the type and nature of particular approaches to protocol-based care (mechanisms), within different clinical settings (context), and what impacts this resulted in (outcomes). An evidence review using the principles of realist synthesis resulted in a number of propositions, i.e., context, mechanism, and outcome threads (CMOs). These propositions were then 'tested' through multiple case studies, using multiple methods including non-participant observation, interviews, and document analysis through an iterative analysis process. The initial propositions (conjectured CMOs) only partially corresponded to the findings that emerged during analysis. From the iterative analysis process of scrutinising mechanisms, context, and outcomes we were able to draw out some theoretically generalisable features about what works, for whom, how, and in what circumstances in relation to the use of standardised care approaches (refined CMOs). Conclusions As one of the first studies to apply realistic evaluation in implementation research, it was a good fit, particularly given the growing emphasis on understanding how context influences evidence-based practice. The strengths and limitations of the approach are considered, including how to operationalise it and some of the challenges. This approach provided a useful interpretive framework with which to make sense of the multiple factors that were simultaneously at play and being observed through various data sources, and for developing explanatory theory about using standardised care approaches in practice. PMID:20504293
Improving Site-Specific Radiological Performance Assessments - 13431
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tauxe, John; Black, Paul; Catlett, Kate
2013-07-01
An improved approach is presented for conducting complete and defensible radiological site-specific performance assessments (PAs) to support radioactive waste disposal decisions. The basic tenets of PA were initiated some thirty years ago, focusing on geologic disposals and evaluating compliance with regulations. Some of these regulations were inherently probabilistic (i.e., addressing uncertainty in a quantitative fashion), such as the containment requirements of the U.S. Environmental Protection Agency's (EPA's) 40 CFR 191, Environmental Radiation Protection Standards for Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes, Chap. 191.13 [1]. Methods of analysis were developed to meet those requirements, but at their core early PAs used 'conservative' parameter values and modeling approaches. This limited the utility of such PAs to compliance evaluation, and did little to inform decisions about optimizing disposal, closure and long-term monitoring and maintenance, or, in general, maintaining doses 'as low as reasonably achievable' (ALARA). This basic approach to PA development in the United States was employed essentially unchanged through the end of the 20th century, principally by the U.S. Department of Energy (DOE). Performance assessments developed in support of private radioactive waste disposal operations, regulated by the U.S. Nuclear Regulatory Commission (NRC) and its agreement states, were typically not as sophisticated. Discussion of new approaches to PA is timely, since at the time of this writing, the DOE is in the midst of revising its Order 435.1, Radioactive Waste Management [2], and the NRC is revising 10 CFR 61, Licensing Requirements for Land Disposal of Radioactive Waste [3]. Over the previous decade, theoretical developments and improved computational technology have provided the foundation for integrating decision analysis (DA) concepts and objective-focused thinking, plus a Bayesian approach to probabilistic modeling and risk analysis, to guide improvements in PA. This decision-making approach [4, 5, 6] provides a transparent formal framework for using a value- or objective-focused approach to decision-making. DA, as an analytical means to implement structured decision making, provides a context for both understanding how uncertainty affects decisions and for targeting uncertainty reduction. The proposed DA approach improves defensibility and transparency of decision-making. The DA approach is fully consistent with the need to perform realistic modeling (rather than conservative modeling), including evaluation of site-specific factors. Instead of using generic stylized scenarios for radionuclide fate and transport and for human exposures to radionuclides, site-specific scenarios better represent the advantages and disadvantages of alternative disposal sites or engineered designs, thus clarifying their differences as well as providing a sound basis for evaluation of site performance. The full DA approach to PA is described, from explicitly incorporating societal values through stakeholder involvement to model building. Model building involves scoping by considering features, events, processes, and exposure scenarios (FEPSs), development of a conceptual site model (CSM), translation into numerical models and subsequent computation, and model evaluation.
These are implemented in a cycle of uncertainty analysis, sensitivity analysis and value of information analysis so that uncertainty can be reduced until sufficient confidence is gained in the decisions to be made. This includes the traditional focus on hydrogeological processes, but also places emphasis on other FEPSs such as biotically-induced transport and human exposure phenomena. The significance of human exposure scenarios is emphasized by modifying the traditional acronym 'FEPs' to include them, hence 'FEPSs'. The radioactive waste community is also recognizing that disposal sites are to be considered a national (or even global) resource. As such, there is a pressing need to optimize their utility within the constraints of protecting human health and the environment. Failing to do so will result in the need for additional sites or options for storing radioactive waste temporarily, assuming a continued need for radioactive waste disposal. Optimization should be performed using DA, including economic analysis, invoked if necessary through the ALARA process. The economic analysis must recognize the cost of implementation (disposal design, closure, maintenance, etc.), and intra- and inter-generational equity in order to ensure that the best possible radioactive waste management decisions are made for the protection of both current and future generations. In most cases this requires consideration of population or collective risk. (authors)
Fukuda, Haruhisa; Shimizu, Sayuri; Ishizaki, Tatsuro
2015-01-01
To assess the value of organized care by comparing the clinical outcomes and healthcare expenditure between the conventional Japanese "integrated care across specialties within one hospital" mode of providing healthcare and the prospective approach of "organized care across separate facilities within a community". Retrospective cohort study. Two groups of hospitals were categorized according to healthcare delivery approach: the first group included 3 hospitals autonomously providing integrated care across specialties, and the second group included 4 acute care hospitals and 7 rehabilitative care hospitals providing organized care across separate facilities. Patients aged 65 years and above who had undergone hip fracture surgery. Regression models adjusting for patient characteristics and clinical variables were used to investigate the impact of organized care on the improvements to the mobility capability of patients before and after hospitalization and the differences in healthcare resource utilization. The sample for analysis included 837 hip fracture surgery cases. The proportion of patients with either unchanged or improved mobility capability was not statistically associated with the healthcare delivery approaches. Total adjusted mean healthcare expenditure for integrated care and organized care were US$28,360 (95% confidence interval: 27,787-28,972) and US$21,951 (21,511-22,420), respectively, indicating an average saving of US$6,409 with organized care. Our cost-consequence analysis underscores the need to further investigate the actual contribution of organized care to the provision of efficient and high-quality healthcare.
Experiences of Cigarette Smoking among Iranian Educated Women: A Qualitative Study.
Baheiraei, Azam; Mirghafourvand, Mojgan; Mohammadi, Eesa; Majdzadeh, Reza
2016-01-01
Smoking is a well-known public health problem in women as well as men. In many countries including Iran, there is an increase in tobacco use among women. Exploring the experience of smoking by educated women in order to develop effective tobacco prevention programs in these women is necessary. This study aimed to explore the experiences of smoking among Iranian educated women. This study used a method of qualitative content analysis with the deep individual, semi-structured interviews on a sample of 14 educated female smokers, selected purposefully. Data were analyzed using qualitative content analysis with conventional approach while being collected. The data analysis led to 16 subcategories which were divided into four main categories: (1) Personal factors including subcategories of imitation, show-off and independence, inexperience and curiosity, personal interest and desire, improved mood, and social defiance; (2) family factors including smokers in the family, intrafamily conflicts, and family strictures and limitations; (3) social factors including subcategories of effects of work and school environment, gender equality symbols, peer pressure, and acceptance among friends; and (4) negative consequences of smoking including subcategories of a sense of being physically hurt, psychological and emotional stress, and being looked upon in a negative and judgmental manner. The findings of this study showed that smoking among Iranian educated women is a multifactorial problem. Thus, it is necessary to address smoking among educated women in a holistic approach that focuses on different determinants including personal, family, and social factors particularly the gender roles and stereotypes.
CFD Methods and Tools for Multi-Element Airfoil Analysis
NASA Technical Reports Server (NTRS)
Rogers, Stuart E.; George, Michael W. (Technical Monitor)
1995-01-01
This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined, one which uses multi-block patched grids, the other uses overset chimera grids. Turbulence and transition modeling will be discussed.
Effects of huangqi and bear bile on recurrent parotitis in children: a new clinical approach.
Ruan, Wen-hua; Huang, Mei-li; He, Xiao-lei; Zhang, Feng; Tao, Hai-biao
2013-03-01
To evaluate the pharmacological effects of traditional Chinese medicine, bear bile capsule and Huangqi granule, on recurrent parotitis in children. In this prospective, controlled, and randomized study, a total of 151 young children were divided into three groups: Group A included massaging the children's parotid region and melting vitamin C in their mouth daily; Group B included swallowing bear bile capsule and Huangqi granule daily; and Group C included massages and vitamin C as prescribed in Group A, and traditional Chinese medicine as prescribed in Group B. Children were treated individually for one month and then a follow-up study was conducted for 1 to 3.5 years. Analysis of variance (ANOVA) and Ridit analysis were employed for statistical analysis. The recurrence rate decreased in every group, but significantly more so in Groups B and C than in Group A. The recurrences significantly decreased (P<0.01) in Group B and their recovery rate was as high as 63%, significantly better than those of the other groups (P<0.01). Huangqi and bear bile could be a novel clinical approach for treating recurrent parotitis in children.
Evaluation of UK Integrated Care Pilots: research protocol.
Ling, Tom; Bardsley, Martin; Adams, John; Lewis, Richard; Roland, Martin
2010-09-27
In response to concerns that the needs of the aging population for well-integrated care were increasing, the English National Health Service (NHS) appointed 16 Integrated Care Pilots following a national competition. The pilots have a range of aims including development of new organisational structures to support integration, changes in staff roles, reducing unscheduled emergency hospital admissions, reduced length of hospital stay, increasing patient satisfaction, and reducing cost. This paper describes the evaluation of the initiative which has been commissioned. A mixed methods approach has been adopted including interviews with staff and patients, non-participant observation of meetings, structured written feedback from sites, questionnaires to patients and staff, and analysis of routinely collected hospital utilisation data for patients/service users. The qualitative analysis aims to identify the approaches taken to integration by the sites, the benefits which result, the context in which benefits have resulted, and the mechanisms by which they occur. The quantitative analysis adopts a 'difference in differences' approach comparing health care utilisation before and after the intervention with risk-matched controls. The qualitative data analysis adopts a 'theory of change' approach in which we triangulate data from the quantitative analysis with qualitative data in order to describe causal effects (what happens when an independent variable changes) and causal mechanisms (what connects causes to their effects). An economic analysis will identify what incremental resources are required to make integration succeed and how they can be combined efficiently to produce better outcomes for patients. This evaluation will produce a portfolio of evidence aimed at strengthening the evidence base for integrated care, and in particular identifying the context in which interventions are likely to be effective. These data will support a series of evaluation judgements aimed at reducing uncertainties about the role of integrated care in improving the efficient and effective delivery of healthcare.
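The quantitative arm hinges on the difference-in-differences estimator, whose essence fits in a few lines: the treatment effect is the coefficient on the interaction between pilot membership and the post-intervention period. The sketch below uses synthetic data with hypothetical variable names, not the evaluation's actual dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Minimal difference-in-differences sketch on synthetic admissions data;
# all variable names are hypothetical stand-ins for the evaluation's data.
rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({
    "pilot": rng.integers(0, 2, n),   # 1 = integrated care pilot site
    "post":  rng.integers(0, 2, n),   # 1 = after the intervention
})
# Built-in "true" effect: pilots reduce admissions by 0.5 post-intervention.
df["admissions"] = (5 + 0.3 * df.pilot + 0.2 * df.post
                    - 0.5 * df.pilot * df.post + rng.normal(0, 1, n))

# The coefficient on pilot:post is the difference-in-differences estimate.
model = smf.ols("admissions ~ pilot * post", data=df).fit()
print(model.params["pilot:post"])
```

Risk-matching the controls, as the protocol describes, strengthens the parallel-trends assumption on which this estimator rests.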
The MOD-OA 200 kilowatt wind turbine generator design and analysis report
NASA Astrophysics Data System (ADS)
Andersen, T. S.; Bodenschatz, C. A.; Eggers, A. G.; Hughes, P. S.; Lampe, R. F.; Lipner, M. H.; Schornhorst, J. R.
1980-08-01
The project requirements, approach, system description, design requirements, design, analysis, system tests, installation safety considerations, failure modes and effects analysis, data acquisition, and initial performance for the MOD-OA 200 kw wind turbine generator are discussed. The components, comprising the rotor, drive train, nacelle equipment, yaw drive mechanism and brake, tower, foundation, electrical system, and control systems, are presented. The rotor includes the blades, hub and pitch change mechanism. The drive train includes the low speed shaft, speed increaser, high speed shaft, and rotor brake. The electrical system includes the generator, switchgear, transformer, and utility connection. The control systems are the blade pitch, yaw, and generator control, and the safety system. Manual, automatic, and remote control are described, and dynamic loads and fatigue are analyzed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zubko, I. Yu., E-mail: zoubko@list.ru; Kochurov, V. I.
2015-10-27
With the aim of controlling crystal temperature, a computational-statistical approach to studying the thermo-mechanical properties of finite-sized crystals is presented. The approach is based on combining high-performance computational techniques with statistical analysis of the crystal response to external thermo-mechanical actions for specimens with a statistically small number of atoms (for instance, nanoparticles). The heat motion of atoms is imitated within the statics approach by including independent degrees of freedom for the atoms associated with their oscillations. We find that under heating the response of the graphene material is nonsymmetric.
Ocké, Marga C
2013-05-01
This paper aims to describe different approaches for studying the overall diet with advantages and limitations. Studies of the overall diet have emerged because the relationship between dietary intake and health is very complex with all kinds of interactions. These cannot be captured well by studying single dietary components. Three main approaches to study the overall diet can be distinguished. The first method is researcher-defined scores or indices of diet quality. These are usually based on guidelines for a healthy diet or on diets known to be healthy. The second approach, using principal component or cluster analysis, is driven by the underlying dietary data. In principal component analysis, scales are derived based on the underlying relationships between food groups, whereas in cluster analysis, subgroups of the population are created with people that cluster together based on their dietary intake. A third approach includes methods that are driven by a combination of biological pathways and the underlying dietary data. Reduced rank regression defines linear combinations of food intakes that maximally explain nutrient intakes or intermediate markers of disease. Decision tree analysis identifies subgroups of a population whose members share dietary characteristics that influence (intermediate markers of) disease. It is concluded that all approaches have advantages and limitations and essentially answer different questions. The third approach is still more in an exploration phase, but seems to have great potential with complementary value. More insight into the utility of conducting studies on the overall diet can be gained if more attention is given to methodological issues.
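For the data-driven second approach, a principal component analysis of food-group intakes can be sketched in a few lines. The food groups and sample here are synthetic stand-ins, and real analyses typically rotate and inspect the loadings before naming dietary patterns.

```python
import numpy as np

# Data-driven dietary patterns via principal component analysis.
# Rows: participants; columns: food-group intakes (hypothetical groups,
# e.g. fruit, fish, meat, dairy, grains, sweets).
rng = np.random.default_rng(4)
intakes = rng.gamma(2.0, 1.0, size=(500, 6))

# Standardize, then take the leading eigenvectors of the correlation matrix.
z = (intakes - intakes.mean(0)) / intakes.std(0)
cov = np.cov(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]

loadings = eigvecs[:, order[:2]]   # how food groups define each pattern
scores = z @ loadings              # each person's score on each pattern
print(loadings.round(2))
```

Cluster analysis differs in output: instead of continuous pattern scores, each participant is assigned to one subgroup with a shared intake profile.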
Taylor, William J
2016-03-01
Conjoint analysis of choice or preference data has been used in marketing for over 40 years but has appeared in healthcare settings much more recently. It may be a useful technique for applications within the rheumatology field. Conjoint analysis in rheumatology contexts has mainly used the approaches implemented by 1000Minds Ltd (Dunedin, New Zealand) and Sawtooth Software (Orem, UT, USA). Examples include classification criteria, composite response criteria, service prioritization tools and utilities assessment. Limitations imposed by very many attributes can be managed using new techniques. Conjoint analysis studies of classification and response criteria suggest that the assumption of equal weighting of attributes cannot be met, which challenges traditional approaches to composite criteria construction. Weights elicited through choice experiments with experts can derive more accurate classification criteria than unweighted criteria. Studies that find significant variation in attribute weights for composite response criteria for gout make construction of such criteria problematic. Better understanding of various multiattribute phenomena is likely to increase with increased use of conjoint analysis, especially when the attributes concern individual perceptions or opinions. In addition to classification criteria, some applications for conjoint analysis that are emerging in rheumatology include prioritization tools, remission criteria, and utilities for life areas.
Christopher B. Dow; Brandon M. Collins; Scott L. Stephens
2016-01-01
Finding novel ways to plan and implement landscape-level forest treatments that protect sensitive wildlife and other key ecosystem components, while also reducing the risk of large-scale, high-severity fires, can prove to be difficult. We examined alternative approaches to landscape-scale fuel-treatment design for the same landscape. These approaches included two...
ERIC Educational Resources Information Center
Castro-Schilo, Laura; Ferrer, Emilio
2013-01-01
We illustrate the idiographic/nomothetic debate by comparing 3 approaches to using daily self-report data on affect for predicting relationship quality and breakup. The 3 approaches included (a) the first day in the series of daily data; (b) the mean and variability of the daily series; and (c) parameters from dynamic factor analysis, a…
NASA Astrophysics Data System (ADS)
Liu, Zhangjun; Liu, Zenghui
2018-06-01
This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. Based on this, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured by just a few hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structural response have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
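For orientation, the sketch below simulates a stationary scalar process with the classic spectral representation formula. The paper's contribution, reducing the many random phases to two elementary random variables via random functions, is not reproduced here, and the spectrum used is a hypothetical stand-in for a turbulence spectrum.

```python
import numpy as np

rng = np.random.default_rng(5)

def osr_sample(S, w_max, N, t):
    """One sample path via the original spectral representation (OSR):
    a sum of cosines with amplitudes set by the target spectrum S and
    independent uniform random phases.
    """
    dw = w_max / N
    w = (np.arange(N) + 0.5) * dw              # frequency grid
    phi = rng.uniform(0, 2 * np.pi, N)         # independent random phases
    amp = np.sqrt(2 * S(w) * dw)
    return (amp[:, None] * np.cos(w[:, None] * t + phi[:, None])).sum(0)

# Hypothetical one-sided spectrum standing in for a turbulence spectrum.
S = lambda w: 1.0 / (1.0 + w**2)
t = np.linspace(0, 60, 2048)
x = osr_sample(S, w_max=10.0, N=256, t=t)
```

The N random phases here are exactly the high-dimensional variables that the random-function constraint collapses to two, which is what makes a small representative point set sufficient for the PDEM stage.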
Ebrahim, Shanil; Johnston, Bradley C; Akl, Elie A; Mustafa, Reem A; Sun, Xin; Walter, Stephen D; Heels-Ansdell, Diane; Alonso-Coello, Pablo; Guyatt, Gordon H
2014-05-01
We previously developed an approach to address the impact of missing participant data in meta-analyses of continuous variables in trials that used the same measurement instrument. We extend this approach to meta-analyses including trials that use different instruments to measure the same construct. We reviewed the available literature, conducted an iterative consultative process, and developed an approach involving a complete-case analysis complemented by sensitivity analyses that apply a series of increasingly stringent assumptions about results in patients with missing continuous outcome data. Our approach involves choosing the reference measurement instrument; converting scores from different instruments to the units of the reference instrument; developing four successively more stringent imputation strategies for addressing missing participant data; calculating a pooled mean difference for the complete-case analysis and imputation strategies; calculating the proportion of patients who experienced an important treatment effect; and judging the impact of the imputation strategies on the confidence in the estimate of effect. We applied our approach to an example systematic review of respiratory rehabilitation for chronic obstructive pulmonary disease. Our extended approach provides quantitative guidance for addressing missing participant data in systematic reviews of trials using different instruments to measure the same construct. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
WANG, D.; Wang, Y.; Zeng, X.
2017-12-01
Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, Wavelet De-noising (WD) combined with Rank-Set Pair Analysis (RSPA), that exploits the strengths of both techniques to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influence at representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (error back-propagation (BP), multilayer perceptron (MLP), and radial basis function (RBF) networks), and RSPA alone. Nine error metrics are used to evaluate the model performance. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various methods compared in this paper, even when extreme events are included within a time series.
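The WD half of the hybrid can be illustrated with a standard wavelet shrinkage step. The wavelet family, decomposition level, and universal threshold below are common defaults assumed for the sketch, since the abstract does not specify them.

```python
import numpy as np
import pywt

def wd_denoise(series, wavelet="db4", level=3):
    """Wavelet de-noising: decompose, soft-threshold the detail
    coefficients, reconstruct. Wavelet and threshold rule are
    illustrative assumptions, not the paper's stated choices.
    """
    coeffs = pywt.wavedec(series, wavelet, level=level)
    # Universal threshold, with noise scale estimated from the
    # finest-scale detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(series)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(series)]

rng = np.random.default_rng(6)
flow = np.sin(np.linspace(0, 8 * np.pi, 512)) + rng.normal(0, 0.3, 512)
clean = wd_denoise(flow)   # the de-noised series feeds the RSPA stage
```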
McSherry, Robert; Timmins, Fiona; de Vries, Jan M A; McSherry, Wilfred
2018-06-22
Following declining health care practices at one UK health care site, the subsequent and much-publicized Francis Report made several far-reaching recommendations aimed at recovering optimal levels of care, including stringent monitoring of practice. The aftermath of these deliberations has had resounding consequences for quality care both nationally and internationally. A reflective appreciative qualitative inquiry, using a hybrid approach combining case study and thematic analysis, outlines the development and analysis of a solution-focused intervention aimed at restoring staff confidence and optimal care levels at one key UK hospital site. Personal diaries were used to collect data. Data were analysed using descriptive thematic analysis. The implications of the five emerging themes and the 10-step approach used are discussed in the context of understanding care erosion and ways to effect organisational change. A novel approach to addressing care deficits, which provides a promising bottom-up approach initiated by health care policy makers, is suggested for use in other health care settings when concerns about care arise. It is anticipated this approach will prove useful for nurse managers, particularly in relation to finding positive solutions to addressing problems that surround potential failing standards of care in hospitals. © 2018 John Wiley & Sons Ltd.
A quantile-based scenario analysis approach to biomass supply chain optimization under uncertainty
Zamar, David S.; Gopaluni, Bhushan; Sokhansanj, Shahab; ...
2016-11-21
Supply chain optimization for biomass-based power plants is an important research area due to greater emphasis on renewable energy sources. Biomass supply chain design and operational planning models are often formulated and studied using deterministic mathematical models. While these models are beneficial for making decisions, their applicability to real world problems may be limited because they do not capture all the complexities in the supply chain, including uncertainties in the parameters. This study develops a statistically robust quantile-based approach for stochastic optimization under uncertainty, which builds upon scenario analysis. We apply and evaluate the performance of our approach to address the problem of analyzing competing biomass supply chains subject to stochastic demand and supply. Finally, the proposed approach was found to outperform alternative methods in terms of computational efficiency and ability to meet the stochastic problem requirements.
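A toy version of the quantile-based idea: rank candidate decisions by a high quantile of cost across sampled scenarios rather than by expected cost, so the chosen decision stays robust under unfavorable demand draws. All numbers below are hypothetical, not from the study.

```python
import numpy as np

rng = np.random.default_rng(7)
demand = rng.normal(100, 15, 1000)   # sampled stochastic demand scenarios

def cost(order, demand, buy=1.0, shortfall=4.0):
    """Procurement cost plus a penalty for unmet demand (toy model)."""
    return order * buy + shortfall * np.maximum(demand - order, 0)

candidates = np.arange(80, 140)
# Rank candidates by the 0.9 quantile of cost rather than the mean;
# this tail focus is the robustness idea behind the quantile approach.
q90 = [np.quantile(cost(o, demand), 0.9) for o in candidates]
best = candidates[int(np.argmin(q90))]
print(best)
```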
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Tsujimoto, Yutaka
2016-07-01
We develop a general framework to study the time and frequency domain characteristics of detrending-operation-based scaling analysis methods, such as detrended fluctuation analysis (DFA) and detrending moving average (DMA) analysis. In this framework, using either the time or frequency domain approach, the frequency responses of detrending operations are calculated analytically. Although the frequency domain approach based on conventional linear analysis techniques is only applicable to linear detrending operations, the time domain approach presented here is applicable to both linear and nonlinear detrending operations. Furthermore, using the relationship between the time and frequency domain representations of the frequency responses, the frequency domain characteristics of nonlinear detrending operations can be obtained. Based on the calculated frequency responses, it is possible to establish a direct connection between the root-mean-square deviation of the detrending-operation-based scaling analysis and the power spectrum for linear stochastic processes. Here, by applying our methods to DFA and DMA, including higher-order cases, exact frequency responses are calculated. In addition, we analytically investigate the cutoff frequencies of DFA and DMA detrending operations and show that these frequencies are not optimally adjusted to coincide with the corresponding time scale.
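For readers unfamiliar with the methods being analyzed, a plain first-order DFA implementation is sketched below. The paper's analytical frequency responses characterize exactly what the window-wise detrending step inside this loop does to each frequency component of the signal.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Plain detrended fluctuation analysis: integrate the series, split
    the profile into windows, remove a polynomial trend of the given order
    in each window, and return the RMS fluctuation at each scale.
    """
    y = np.cumsum(x - np.mean(x))              # profile (integrated series)
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[: n * s].reshape(n, s)
        t = np.arange(s)
        resid = [seg - np.polyval(np.polyfit(t, seg, order), t)
                 for seg in segs]
        F.append(np.sqrt(np.mean(np.square(resid))))
    return np.array(F)

rng = np.random.default_rng(8)
white = rng.normal(size=4096)
scales = np.array([16, 32, 64, 128, 256])
# For white noise the slope of log F(s) vs log s is close to 0.5.
alpha = np.polyfit(np.log(scales), np.log(dfa(white, scales)), 1)[0]
print(round(alpha, 2))
```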
Schwingshackl, Lukas; Chaimani, Anna; Hoffmann, Georg; Schwedhelm, Carolina; Boeing, Heiner
2017-03-20
Dietary advice is one of the cornerstones in the management of type 2 diabetes mellitus. The American Diabetes Association recommended a hypocaloric diet for overweight or obese adults with type 2 diabetes in order to induce weight loss. However, there is limited evidence on the optimal approaches to control hyperglycemia in type 2 diabetes patients. The aim of the present study is to assess the comparative efficacy of different dietary approaches on glycemic control and blood lipids in patients with type 2 diabetes mellitus in a systematic review including a standard pairwise and network meta-analysis of randomized trials. We will conduct searches in Cochrane Central Register of Controlled Trials (CENTRAL) on the Cochrane Library, PubMed (from 1966), and Google Scholar. Citations, abstracts, and relevant papers will be screened for eligibility by two reviewers independently. Randomized controlled trials (with a control group or randomized trials with at least two intervention groups) will be included if they meet the following criteria: (1) include patients with type 2 diabetes mellitus, (2) include patients aged ≥18 years, (3) include dietary intervention (different type of diets: e.g., Mediterranean dietary pattern, low-carbohydrate diet, low-fat diet, vegetarian diet, high protein diet), either hypocaloric, isocaloric, or ad libitum diets, (4) minimum intervention period of 12 weeks. For each outcome measure of interest, random effects pairwise and network meta-analyses will be performed in order to determine the pooled relative effect of each intervention relative to every other intervention in terms of the post-intervention values (or mean differences between the changes from baseline value scores). Subgroup analyses are planned for study length, sample size, age, and sex. This systematic review will synthesize the available evidence on the comparative efficacy of different dietary approaches in the management of glycosylated hemoglobin (primary outcome), fasting glucose, and cardiovascular risk factors in type 2 diabetes mellitus patients. The results of the present network meta-analysis will influence evidence-based treatment decisions since it will provide the foundation for evidence-based recommendations in the management of type 2 diabetes. PROSPERO 42016047464.
Charting the Learning Journey of a Group of Adults Returning to Education
ERIC Educational Resources Information Center
Mooney, Des
2011-01-01
Using a qualitative case study method, the researcher studied a group of adult returning students completing a childcare course. Methods used included focus groups, a questionnaire and observations. Using a holistic analysis approach (Yin, 2003), the researcher then focused on a number of key issues in the case. From this analysis the themes of…
Analysis of High School English Curriculum Materials through Rasch Measurement Model and Maxqda
ERIC Educational Resources Information Center
Batdi, Veli; Elaldi, Senel
2016-01-01
The purpose of the study is to analyze high school English curriculum materials (ECM) through FACETS analysis and MAXQDA-11 programs. The mixed methods approach, both quantitative and qualitative methods, were used in three samples including English teachers in Elazig during the 2014-2015 academic year. While the quantitative phase of the study…
ERIC Educational Resources Information Center
Kadiam, Subhash Chandra Bose S. V.; Mohammed, Ahmed Ali; Nguyen, Duc T.
2010-01-01
In this paper, we describe an approach to analyze 2D truss/frame/beam structures in a Flash-based environment. A Stiffness Matrix Method (SMM) module was developed as part of ongoing projects on a broad topic "Students' Learning Improvements in Science, Technology, Engineering and Mathematics (STEM) Related Areas" at Old Dominion…
ERIC Educational Resources Information Center
Golden, Thomas P.; Karpur, Arun
2012-01-01
This study is a comparative analysis of the impact of traditional face-to-face training contrasted with a blended learning approach, as it relates to improving skills, knowledge and attitudes for enhancing practices for achieving improved employment outcomes for individuals with disabilities. The study included two intervention groups: one…
ERIC Educational Resources Information Center
von Eye, Alexander; Mun, Eun Young; Bogat, G. Anne
2008-01-01
This article reviews the premises of configural frequency analysis (CFA), including methods of choosing significance tests and base models, as well as protecting α, and discusses why CFA is a useful approach when conducting longitudinal person-oriented research. CFA operates at the manifest variable level. Longitudinal CFA seeks to identify…
Development and Application of an Integrated Approach toward NASA Airspace Systems Research
NASA Technical Reports Server (NTRS)
Barhydt, Richard; Fong, Robert K.; Abramson, Paul D.; Koenke, Ed
2008-01-01
The National Aeronautics and Space Administration's (NASA) Airspace Systems Program is contributing air traffic management research in support of the 2025 Next Generation Air Transportation System (NextGen). Contributions support research and development needs provided by the interagency Joint Planning and Development Office (JPDO). These needs generally call for integrated technical solutions that improve system-level performance and work effectively across multiple domains and planning time horizons. In response, the Airspace Systems Program is pursuing an integrated research approach and has adapted systems engineering best practices for application in a research environment. Systems engineering methods aim to enable researchers to methodically compare different technical approaches, consider system-level performance, and develop compatible solutions. Systems engineering activities are performed iteratively as the research matures. Products of this approach include a demand and needs analysis, system-level descriptions focusing on NASA research contributions, system assessment and design studies, and common system-level metrics, scenarios, and assumptions. Results from the first systems engineering iteration include a preliminary demand and needs analysis; a functional modeling tool; and initial system-level metrics, scenario characteristics, and assumptions. Demand and needs analysis results suggest that several advanced concepts can mitigate demand/capacity imbalances for NextGen, but fall short of enabling three-times current-day capacity at the nation's busiest airports and airspace. Current activities are focusing on standardizing metrics, scenarios, and assumptions, conducting system-level performance assessments of integrated research solutions, and exploring key system design interfaces.
Moment-based metrics for global sensitivity analysis of hydrological systems
NASA Astrophysics Data System (ADS)
Dell'Oca, Aronne; Riva, Monica; Guadagnini, Alberto
2017-12-01
We propose new metrics to assist global sensitivity analysis (GSA) of hydrological and Earth systems. Our approach allows assessing the impact of uncertain parameters on main features of the probability density function (pdf) of a target model output, y. These include the expected value of y, the spread around the mean, and the degree of symmetry and tailedness of the pdf of y. Since reliable assessment of higher-order statistical moments can be computationally demanding, we couple our GSA approach with a surrogate model, approximating the full model response at a reduced computational cost. Here, we consider the generalized polynomial chaos expansion (gPCE), other model reduction techniques being fully compatible with our theoretical framework. We demonstrate our approach through three test cases, including an analytical benchmark, a simplified scenario mimicking pumping in a coastal aquifer and a laboratory-scale conservative transport experiment. Our results allow ascertaining which parameters can impact some moments of the model output pdf while being uninfluential to others. We also investigate the error associated with the evaluation of our sensitivity metrics by replacing the original system model through a gPCE. Our results indicate that the construction of a surrogate model with increasing level of accuracy might be required depending on the statistical moment considered in the GSA. The approach is fully compatible with (and can assist the development of) analysis techniques employed in the context of reduction of model complexity, model calibration, design of experiment, uncertainty quantification and risk assessment.
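The moment-based flavor of this GSA can be sketched with plain Monte Carlo on a toy function, no surrogate involved. In this sketch the sensitivity of each pdf moment to a parameter is taken as the average absolute deviation of the binned conditional moment from its unconditional value, normalized by the latter; this follows the spirit of the paper's metrics but is not necessarily their exact definition, and the three-parameter model function is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x):  # toy model standing in for a hydrological simulator
    return x[:, 0] ** 2 + 3.0 * x[:, 1] + 0.5 * x[:, 0] * x[:, 2]

n, d, n_bins = 200_000, 3, 40
x = rng.uniform(-1.0, 1.0, size=(n, d))
y = model(x)

def moments(v):
    mu, sd = v.mean(), v.std()
    z = (v - mu) / sd
    return np.array([mu, v.var(), (z ** 3).mean(), (z ** 4).mean()])

m_ref = moments(y)
names = ["mean", "variance", "skewness", "kurtosis"]

# Sensitivity of each pdf moment to each parameter: average absolute
# change of the conditional moment when the parameter is fixed (binned).
for i in range(d):
    edges = np.quantile(x[:, i], np.linspace(0, 1, n_bins + 1))
    bins = np.clip(np.digitize(x[:, i], edges[1:-1]), 0, n_bins - 1)
    cond = np.array([moments(y[bins == b]) for b in range(n_bins)])
    idx = np.abs(cond - m_ref).mean(axis=0) / np.maximum(np.abs(m_ref), 1e-12)
    print(f"x{i}: " + ", ".join(f"{nm}={v:.2f}" for nm, v in zip(names, idx)))
```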
Systems and context modeling approach to requirements analysis
NASA Astrophysics Data System (ADS)
Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick
2014-08-01
Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition and a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.
An approach to studying scale for students in higher education: a Rasch measurement model analysis.
Waugh, R F; Hii, T K; Islam, A
2000-01-01
A questionnaire comprising 80 self-report items was designed to measure student Approaches to Studying in a higher education context. The items were conceptualized and designed from five learning orientations: a Deep Approach, a Surface Approach, a Strategic Approach, Clarity of Direction, and Academic Self-Confidence, comprising 40 attitude items and 40 corresponding behavior items. The study aimed to create a scale and investigate its psychometric properties using a Rasch measurement model. The convenience sample consisted of 350 students at an Australian university in 1998. The analysis supported the conceptual structure of the Scale as involving studying attitudes and behaviors towards five orientations to learning. Attitude items were mostly easier to endorse than the corresponding behavior items, in line with the theory. Sixty-eight items fit the model and have good psychometric properties. The proportion of observed variance considered true is 92%, and the Scale is well targeted to the students. Some harder items are needed to improve the targeting, and some further testing work needs to be done on the Surface Approach. In the Surface Approach and Clarity of Direction in Studying, attitudes make a lesser contribution than behaviors to the variable, Approaches to Studying.
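For readers unfamiliar with the measurement model, the following is a minimal sketch of the dichotomous Rasch model and a Newton-Raphson maximum-likelihood estimate of a person's ability given item difficulties; the difficulties and response pattern are invented, not taken from the Scale.

```python
import numpy as np

def rasch_p(theta, b):
    """P(endorse) under the dichotomous Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def ml_ability(responses, b, theta=0.0, iters=25):
    """Newton-Raphson ML estimate of ability for one person.
    Requires 0 < raw score < number of items."""
    r = responses.sum()
    for _ in range(iters):
        p = rasch_p(theta, b)
        theta += (r - p.sum()) / (p * (1 - p)).sum()   # score / information
    return theta

# Hypothetical difficulties: attitude items easier (lower b) than behaviors.
b = np.array([-1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0])
resp = np.array([1, 1, 1, 1, 0, 1, 0, 0])
print("Estimated ability (logits):", round(ml_ability(resp, b), 2))
```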
A Disciplined Architectural Approach to Scaling Data Analysis for Massive, Scientific Data
NASA Astrophysics Data System (ADS)
Crichton, D. J.; Braverman, A. J.; Cinquini, L.; Turmon, M.; Lee, H.; Law, E.
2014-12-01
Data collections across remote sensing and ground-based instruments in astronomy, Earth science, and planetary science are outpacing scientists' ability to analyze them. Furthermore, the distribution, structure, and heterogeneity of the measurements themselves pose challenges that limit the scalability of data analysis using traditional approaches. Methods for developing science data processing pipelines, distributing scientific datasets, and performing analysis will require innovative approaches that integrate cyber-infrastructure, algorithms, and data into more systematic approaches that can more efficiently compute and reduce data, particularly distributed data. This requires the integration of computer science, machine learning, statistics and domain expertise to identify scalable architectures for data analysis. The volume of data returned from Earth science observing satellites and from climate model output is predicted to grow into the tens of petabytes, challenging current data analysis paradigms. This same kind of growth is present in astronomy and planetary science data. One of the major challenges in data science and related disciplines is defining new approaches to scaling systems and analysis in order to increase scientific productivity and yield. Specific needs include: 1) identification of optimized system architectures for analyzing massive, distributed data sets; 2) algorithms for systematic analysis of massive data sets in distributed environments; and 3) the development of software infrastructures that are capable of performing massive, distributed data analysis across a comprehensive data science framework. NASA/JPL has begun an initiative in data science to address these challenges. Our goal is to evaluate how scientific productivity can be improved through optimized architectural topologies that identify how to deploy and manage the access, distribution, computation, and reduction of massive, distributed data, while managing the uncertainties of scientific conclusions derived from such capabilities. This talk will provide an overview of JPL's efforts in developing a comprehensive architectural approach to data science.
Microwave Landing System signal requirements for conventional aircraft
DOT National Transportation Integrated Search
1972-07-01
The results of analysis directed towards determining Microwave Landing System (MLS) signal requirements for conventional aircraft are discussed. The phases of flight considered include straight-in final approach, flareout, and rollout. A limited numb...
A two-stage approach for estimating a statewide truck trip table.
DOT National Transportation Integrated Search
2014-05-01
Statewide models, including passenger and freight movements, are frequently used for supporting numerous statewide planning activities. Many states use them for traffic impact studies, air quality conformity analysis, freight planning, economic d...
Ninth NASTRAN (R) Users' Colloquium
NASA Technical Reports Server (NTRS)
1980-01-01
The general application of finite element methodology and the specific application of NASTRAN to a wide variety of static and dynamic structural problems is addressed. Comparisons with other approaches and new methods of analysis with NASTRAN are included.
Séblain, D; Bourlet, J; Sigaux, N; Khonsari, R H; Chauvel Picard, J; Gleizal, A
2018-06-01
Compare literature-reported efficiency and complications of standard maxillary advancement surgery with those of a minimally invasive mucosal approach in patients with CL/P requiring Le Fort 1 osteotomy. Meta-analysis vs. retrospective analysis of 18 consecutive cases. Department of maxillofacial surgery at a tertiary-level public general hospital. The meta-analysis encompassed Medline, Embase and Cochrane, years 1990 to 2014, inclusive. The local series concerned all skeletally mature adolescents with non-syndromic CL/P who underwent orthognathic surgery between 30 April 2004 and 27 January 2012. Minimally invasive approach and perioperative orthodontics, including intermaxillary fixation for 3 months after surgery. Assessment of complications. Standard lateral cephalograms were taken before surgery, then <1 week and 12 months after surgery. Delaire's cephalometric analysis was performed and the position of the maxilla was recorded. There were no significant differences between the literature and our series regarding sex and type of deformity (P=0.634 and 0.779, respectively). The mean horizontal and vertical relapse rates were 0.61 and 1.17mm (vs. 1.29 and 1.48mm in the meta-analysis) and the overall complication rate was 22.2% (vs. 12.76%, but P=0.271). There was a significant difference regarding the palatal fistula rate (0 here vs. 21.43% in the meta-analysis, P=0.028). The minimally invasive approach showed trends toward less relapse and fewer complications than conventional approaches. This technique seems adapted to the management of patients with CL/P sequelae. Studies of other patient groups likely to benefit are underway. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study.
Vaismoradi, Mojtaba; Turunen, Hannele; Bondas, Terese
2013-09-01
Qualitative content analysis and thematic analysis are two commonly used approaches in data analysis of nursing research, but boundaries between the two have not been clearly specified. In other words, they are being used interchangeably and it seems difficult for the researcher to choose between them. In this respect, this paper describes and discusses the boundaries between qualitative content analysis and thematic analysis and presents implications to improve the consistency between the purpose of related studies and the method of data analyses. This is a discussion paper, comprising an analytical overview and discussion of the definitions, aims, philosophical background, data gathering, and analysis of content analysis and thematic analysis, and addressing their methodological subtleties. It is concluded that in spite of many similarities between the approaches, including cutting across data and searching for patterns and themes, their main difference lies in the opportunity for quantification of data: content analysis permits measuring the frequency of different categories and themes, which may cautiously serve as a proxy for significance. © 2013 Wiley Publishing Asia Pty Ltd.
Robust Flutter Margin Analysis that Incorporates Flight Data
NASA Technical Reports Server (NTRS)
Lind, Rick; Brenner, Martin J.
1998-01-01
An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, mu, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The mu margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 Systems Research Aircraft using uncertainty sets generated by flight data analysis. The robust margins demonstrate flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.
Characteristics of caring self-efficacy in pediatric nurses: a qualitative study.
Alavi, Azam; Bahrami, Masoud; Zargham-Boroujeni, Ali; Yousefy, Alireza
2015-07-01
The present study was conducted to clarify pediatric nurses' characteristics of caring self-efficacy. This study was conducted using a qualitative content analysis approach. The participants included 27 pediatric nurses and clinical instructors, selected purposively. Data were collected using semi-structured interviews and were analyzed using the content analysis method. Data analysis generated four main themes as attributes of a self-efficient pediatric nurse including: (a) professional communications; (b) management of care; (c) altruism; and (d) proficiency. Nursing managers and instructors can use these results to help develop nurses' empowerment and self-efficacy, especially in pediatric care. © 2015, Wiley Periodicals, Inc.
Effect-directed analysis supporting monitoring of aquatic environments--An in-depth overview.
Brack, Werner; Ait-Aissa, Selim; Burgess, Robert M; Busch, Wibke; Creusot, Nicolas; Di Paolo, Carolina; Escher, Beate I; Mark Hewitt, L; Hilscherova, Klara; Hollender, Juliane; Hollert, Henner; Jonker, Willem; Kool, Jeroen; Lamoree, Marja; Muschket, Matthias; Neumann, Steffen; Rostkowski, Pawel; Ruttkies, Christoph; Schollee, Jennifer; Schymanski, Emma L; Schulze, Tobias; Seiler, Thomas-Benjamin; Tindall, Andrew J; De Aragão Umbuzeiro, Gisela; Vrana, Branislav; Krauss, Martin
2016-02-15
Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone but tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and attracts increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxicant identification. The latest approaches, tools, software and databases for target, suspect, and non-target screening as well as unknown identification are discussed together with analytical and toxicological confirmation approaches. A better understanding of optimal use and combination of EDA tools will help to design efficient and successful toxicant identification studies in the context of quality monitoring in multiply stressed environments. Copyright © 2015 Elsevier B.V. All rights reserved.
Ferranti, Pasquale; Nasi, Antonella; Bruno, Milena; Basile, Adriana; Serpe, Luigi; Gallo, Pasquale
2011-05-15
In recent years, the occurrence of cyanobacterial blooms in eutrophic freshwaters has been described all over the world, including most European countries. Blooms of cyanobacteria may produce mixtures of toxic secondary metabolites, called cyanotoxins. Among these, the most studied are microcystins, a group of cyclic heptapeptides, because of their potent hepatotoxicity and activity as tumour promoters. Other peptide cyanotoxins have been described whose structure and toxicity have not been thoroughly studied. Herein we present a peptidomic approach aimed to characterise and quantify the peptide cyanotoxins produced in two Italian lakes, Averno and Albano. The procedure was based on matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry (MALDI-TOF-MS) analysis for rapid detection and profiling of the peptide mixture complexity, combined with liquid chromatography/electrospray ionisation quadrupole time-of-flight tandem mass spectrometry (LC/ESI-Q-TOF-MS/MS), which provided unambiguous structural identification of the main compounds, as well as accurate quantitative analysis of microcystins. In the case of Lake Averno, a novel variant of microcystin-RR and two novel anabaenopeptin variants (Anabaenopeptin B(1) and Anabaenopeptin F(1)), presenting homoarginine in place of the commonly found arginine, were detected and characterised. In Lake Albano, the peculiar peptide patterns in different years were compared, as an example of the potential of the peptidomic approach for fast screening analysis, prior to fine structural analysis and determination of cyanotoxins, which included six novel aeruginosin variants. This approach allows for wide-range monitoring of cyanobacterial blooms and for collecting data to evaluate possible health risks to consumers, through the panel of the compounds produced over different years. Copyright © 2011 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
Bailey, Christine I.
2014-01-01
Drawing upon a postmodern ethnographic approach, the modes of inquiry in this qualitative study included observation and data analysis in order to represent a particular community of students: first-year college freshmen from a mid-size, religiously affiliated university in the southern United States. The methods included artifact…
ERIC Educational Resources Information Center
Yasar, Okan; Seremet, Mehmet
2007-01-01
This study takes a comparative approach to the pictures included in secondary school (ages 14-17) textbooks taught in Turkey. In this respect, following the classification of the pictures (line drawings and photographs) included in secondary school geography textbooks, the photographs in the books in question are evaluated in terms of…
A Curriculum Development Route Map for a Technology Enhanced Learning Era
ERIC Educational Resources Information Center
Castañeda, Linda; Prendes, Paz
2013-01-01
In this paper we present a model of analysis that takes a comprehensive perspective on the state of the art in the specialized literature on curriculum development. From this theoretical approach, we derive a complete curriculum overview, including insights into the principal elements of the curriculum and what we already know…
Meta-analysis of individual registry results enhances international registry collaboration.
Paxton, Elizabeth W; Mohaddes, Maziar; Laaksonen, Inari; Lorimer, Michelle; Graves, Stephen E; Malchau, Henrik; Namba, Robert S; Kärrholm, John; Rolfson, Ola; Cafri, Guy
2018-03-28
Background and purpose - Although common in medical research, meta-analysis has not been widely adopted in registry collaborations. A meta-analytic approach, in which each registry conducts a standardized analysis on its own data followed by a meta-analysis to calculate a weighted average of the estimates, allows collaboration without sharing patient-level data. The value of meta-analysis as an alternative to individual patient data analysis is illustrated in this study by comparing the risk of revision of porous tantalum cups versus other uncemented cups in primary total hip arthroplasties from Sweden, Australia, and a US registry (2003-2015). Patients and methods - For both the individual patient data analysis and the meta-analysis approach, a Cox proportional hazards model was fit for time to revision, comparing porous tantalum (n = 23,201) with other uncemented cups (n = 128,321). Covariates included age, sex, diagnosis, head size, and stem fixation. In the meta-analysis approach, the treatment effect size (i.e., the Cox model hazard ratio) was calculated within each registry and a weighted average of the individual registries' estimates was calculated. Results - The patient-level data analysis and meta-analytic approaches yielded the same results, with the porous tantalum cups having a higher risk of revision than other uncemented cups (HR (95% CI) 1.6 (1.4-1.7) and HR (95% CI) 1.5 (1.4-1.7), respectively). Adding the US cohort to the meta-analysis led to greater generalizability, increased precision of the treatment effect, and similar findings (HR (95% CI) 1.6 (1.4-1.7)) with increased risk for porous tantalum cups. Interpretation - The meta-analytic technique is a viable option to address privacy, security, and data ownership concerns, allowing more expansive registry collaboration, greater generalizability, and increased precision of treatment effects.
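The pooling step can be sketched in a few lines: each registry contributes a log hazard ratio and its standard error, and an inverse-variance weighted average yields the combined estimate. The numbers below are illustrative only (not the study's results), and the sketch uses simple fixed-effect weights, whereas the actual analysis may weight differently.

```python
import numpy as np

# Illustrative per-registry Cox results (log HR and SE), not the study's data.
log_hr = np.log(np.array([1.55, 1.62, 1.48]))   # e.g., Sweden, Australia, US
se = np.array([0.07, 0.06, 0.10])

w = 1.0 / se ** 2                                # inverse-variance weights
pooled = np.sum(w * log_hr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"Pooled HR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```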
Bayesian evidence computation for model selection in non-linear geoacoustic inference problems.
Dettmer, Jan; Dosso, Stan E; Osler, John C
2010-12-01
This paper applies a general Bayesian inference approach, based on Bayesian evidence computation, to geoacoustic inversion of interface-wave dispersion data. Quantitative model selection is carried out by computing the evidence (normalizing constants) for several model parameterizations using annealed importance sampling. The resulting posterior probability density estimate is compared to estimates obtained from Metropolis-Hastings sampling to ensure consistent results. The approach is applied to invert interface-wave dispersion data collected on the Scotian Shelf, off the east coast of Canada for the sediment shear-wave velocity profile. Results are consistent with previous work on these data but extend the analysis to a rigorous approach including model selection and uncertainty analysis. The results are also consistent with core samples and seismic reflection measurements carried out in the area.
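A minimal sketch of evidence computation by annealed importance sampling, using a toy Gaussian-mean problem; the model, tempering schedule, and Metropolis step size are all assumptions for illustration, and this is generic AIS rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(1.0, 1.0, size=20)      # toy data, unknown mean mu

def log_prior(mu):                        # mu ~ N(0, 10^2)
    return -0.5 * (mu / 10.0) ** 2 - np.log(10.0) - 0.5 * np.log(2 * np.pi)

def log_like(mu):                         # y_i ~ N(mu, 1)
    mu = np.asarray(mu)
    sq = ((data[None, :] - mu[:, None]) ** 2).sum(axis=1)
    return -0.5 * sq - 0.5 * data.size * np.log(2 * np.pi)

def ais_log_evidence(n_chains=500, n_temps=60, step=0.4):
    betas = np.linspace(0.0, 1.0, n_temps)        # tempering schedule
    mu = rng.normal(0.0, 10.0, size=n_chains)     # start from the prior
    logw = np.zeros(n_chains)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw += (b - b_prev) * log_like(mu)       # importance weight update
        prop = mu + step * rng.normal(size=n_chains)   # one Metropolis move
        log_a = (log_prior(prop) + b * log_like(prop)
                 - log_prior(mu) - b * log_like(mu))
        accept = np.log(rng.uniform(size=n_chains)) < log_a
        mu = np.where(accept, prop, mu)
    # log of the mean weight, via log-sum-exp for numerical stability
    return np.logaddexp.reduce(logw) - np.log(n_chains)

print("AIS log-evidence:", ais_log_evidence())
```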
Use of video-feedback, reflection, and interactive analysis to improve nurse leadership practices.
Crenshaw, Jeannette T
2012-01-01
The chronic shortage of registered nurses (RNs) affects patient safety and health care quality. Many factors affect the RN shortage in the workforce, including negative work environments, exacerbated by ineffective leadership approaches. Improvements in the use of relationship-based leadership approaches lead to healthier work environments that foster RN satisfaction and reduce RN turnover and vacancy rates in acute care settings. In this article, an innovative approach to reduce nurse turnover and decrease vacancy rates in acute care settings is described. Video feedback with reflection and interactive analysis is an untapped resource for nurse leaders and aspiring nurse leaders in their development of effective leadership skills. This unique method may be an effective leadership strategy for addressing recruitment and retention issues in a diverse workforce.
Kawarazuka, Nozomi; Locke, Catherine; McDougall, Cynthia; Kantor, Paula; Morgan, Miranda
2017-03-01
The demand for gender analysis is now increasingly orthodox in natural resource programming, including that for small-scale fisheries. Whilst the analysis of social-ecological resilience has made valuable contributions to integrating social dimensions into research and policy-making on natural resource management, it has so far demonstrated limited success in effectively integrating considerations of gender equity. This paper reviews the challenges in, and opportunities for, bringing a gender analysis together with social-ecological resilience analysis in the context of small-scale fisheries research in developing countries. We conclude that rather than searching for a single unifying framework for gender and resilience analysis, it will be more effective to pursue a plural solution in which closer engagement is fostered between analysis of gender and social-ecological resilience whilst preserving the strengths of each approach. This approach can make an important contribution to developing a better evidence base for small-scale fisheries management and policy.
Predicting the Consequences of Workload Management Strategies with Human Performance Modeling
NASA Technical Reports Server (NTRS)
Mitchell, Diane Kuhl; Samma, Charneta
2011-01-01
Human performance modelers at the US Army Research Laboratory have developed an approach for establishing high Soldier workload that can be used for analyses of proposed system designs. Their technique includes three key components. To implement the approach in an experiment, the researcher would create two experimental conditions: a baseline and a design alternative. Next, they would identify a scenario in which the test participants perform all their representative concurrent interactions with the system. This scenario should include any events that would trigger a different set of goals for the human operators. They would collect workload values during both the baseline and alternative design conditions to see if the alternative increased workload and decreased performance. They have successfully implemented this approach for military vehicle designs using the human performance modeling tool IMPRINT. Although ARL researchers use IMPRINT to implement their approach, it can be applied to any workload analysis. Researchers using other modeling and simulation tools or conducting experiments or field tests can use the same approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whipple, C
Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.
Using "big data" to optimally model hydrology and water quality across expansive regions
Roehl, E.A.; Cook, J.B.; Conrads, P.A.
2009-01-01
This paper describes a new divide and conquer approach that leverages big environmental data, utilizing all available categorical and time-series data without subjectivity, to empirically model hydrologic and water-quality behaviors across expansive regions. The approach decomposes large, intractable problems into smaller ones that are optimally solved; decomposes complex signals into behavioral components that are easier to model with "sub-models"; and employs a sequence of numerically optimizing algorithms that include time-series clustering, nonlinear, multivariate sensitivity analysis and predictive modeling using multi-layer perceptron artificial neural networks, and classification for selecting the best sub-models to make predictions at new sites. This approach has many advantages over traditional modeling approaches, including being faster and less expensive, more comprehensive in its use of available data, and more accurate in representing a system's physical processes. This paper describes the application of the approach to model groundwater levels in Florida, stream temperatures across Western Oregon and Wisconsin, and water depths in the Florida Everglades. © 2009 ASCE.
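The divide-and-conquer pattern can be sketched with scikit-learn: cluster sites by their time-series behavior, then fit one multi-layer perceptron sub-model per cluster. The synthetic data, cluster count, and network size below are assumptions; the actual system also performs signal decomposition and sensitivity analysis, which are omitted here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in: 60 sites in two hydrologic sub-regions; water level
# depends on two static site attributes differently in each sub-region.
attrs = rng.normal(size=(60, 2))
region = np.repeat([0, 1], 30)
level = np.where(region == 0, 2.0 + attrs[:, 0], 8.0 - 2.0 * attrs[:, 1])
signals = (level[:, None] + np.sin(2 * np.pi * np.linspace(0, 1, 100))
           + 0.1 * rng.normal(size=(60, 100)))   # site level time series

# 1) Decompose the problem: cluster sites by their time-series behavior.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(signals)

# 2) Conquer: fit one MLP sub-model per cluster (attributes -> mean level).
for c in np.unique(labels):
    X, y = attrs[labels == c], signals[labels == c].mean(axis=1)
    mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                       random_state=0).fit(X, y)
    print(f"cluster {c}: {len(y)} sites, in-sample R^2 = {mlp.score(X, y):.2f}")
```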
Vibro-Acoustic Analysis of NASA's Space Shuttle Launch Pad 39A Flame Trench Wall
NASA Technical Reports Server (NTRS)
Margasahayam, Ravi N.
2009-01-01
A vital element of NASA's manned space flight launch operations is the Kennedy Space Center Launch Complex 39's launch pads A and B. Originally designed and constructed in the 1960s for the Saturn V rockets used for the Apollo missions, these pads were modified above grade to support Space Shuttle missions. But below grade, each pad's original walls (including the flame trench, a tunnel 42 feet deep, 58 feet wide, and 450 feet long designed to deflect flames and exhaust gases) remained unchanged. On May 31, 2008, during the launch of STS-124, over 3500 of the 22,000 interlocking refractory bricks that lined the east wall of the flame trench, protecting the pad structure, were liberated from pad 39A. The STS-124 launch anomaly spawned an agency-wide initiative to determine the failure root cause, to assess the impact of debris on vehicle and ground support equipment safety, and to prescribe corrective action. The investigation encompassed radar imaging, infrared video review, debris transport mechanism analysis using computational fluid dynamics, destructive testing, and non-destructive evaluation, including vibro-acoustic analysis, in order to validate the corrective action. The primary focus of this paper is on the analytic approach, including static, modal, and vibro-acoustic analysis, required to certify the corrective action and ensure integrity and operational reliability for future launches. Due to the absence of instrumentation (including pressure transducers, acoustic pressure sensors, and accelerometers) in the flame trench, defining an accurate acoustic signature of the launch environment during shuttle main engine/solid rocket booster ignition and vehicle ascent posed a significant challenge. Details of the analysis, including the derivation of launch environments, the finite element approach taken, and analysis/test/launch data correlation, are discussed. Data obtained from the recent launch of STS-126 from Pad 39A was instrumental in validating the design analysis philosophies outlined in this paper.
Norström, Madelaine; Kristoffersen, Anja Bråthen; Görlach, Franziska Sophie; Nygård, Karin; Hopp, Petter
2015-01-01
In order to facilitate foodborne outbreak investigations there is a need to improve the methods for identifying the food products that should be sampled for laboratory analysis. The aim of this study was to examine the applicability of a likelihood ratio approach, previously developed on simulated data, to real outbreak data. We used human case and food product distribution data from the Norwegian enterohaemorrhagic Escherichia coli outbreak in 2006. The approach was adjusted to include time and space smoothing and to handle missing or misclassified information. The performance of the adjusted likelihood ratio approach on the data originating from the HUS outbreak and on control data indicates that the adjusted approach is promising and could be a useful tool to assist and facilitate the investigation of foodborne outbreaks in the future, provided good traceability data are available and implemented in the distribution chain. However, the approach needs to be further validated on other outbreak data, also including food products other than meat products, in order to draw a more general conclusion about the applicability of the developed approach. PMID:26237468
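The core of a likelihood ratio approach (without the paper's time and space-smoothing adjustments) can be sketched as follows: each candidate product is scored by the multinomial log-likelihood ratio of the regional case counts under the product's distribution versus the background population distribution. All counts and distributions below are invented for illustration.

```python
import numpy as np

# Cases per region and, for each candidate product, the share of that
# product distributed to each region (illustrative numbers).
cases = np.array([12, 1, 0, 7, 2])
population = np.array([0.30, 0.25, 0.15, 0.20, 0.10])   # background exposure
products = {
    "product_A": np.array([0.50, 0.05, 0.02, 0.35, 0.08]),
    "product_B": np.array([0.20, 0.30, 0.20, 0.15, 0.15]),
}

def log_lr(counts, p_product, p_background, eps=1e-9):
    """Multinomial log-likelihood ratio: product distribution vs background."""
    return np.sum(counts * (np.log(p_product + eps) - np.log(p_background + eps)))

# Rank candidate products; the best match to the case pattern scores highest.
for name, dist in sorted(products.items(),
                         key=lambda kv: -log_lr(cases, kv[1], population)):
    print(f"{name}: log LR = {log_lr(cases, dist, population):+.2f}")
```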
In silico approaches to study mass and energy flows in microbial consortia: a syntrophic case study
2009-01-01
Background Three methods were developed for the application of stoichiometry-based network analysis approaches including elementary mode analysis to the study of mass and energy flows in microbial communities. Each has distinct advantages and disadvantages suitable for analyzing systems with different degrees of complexity and a priori knowledge. These approaches were tested and compared using data from the thermophilic, phototrophic mat communities from Octopus and Mushroom Springs in Yellowstone National Park (USA). The models were based on three distinct microbial guilds: oxygenic phototrophs, filamentous anoxygenic phototrophs, and sulfate-reducing bacteria. Two phases, day and night, were modeled to account for differences in the sources of mass and energy and the routes available for their exchange. Results The in silico models were used to explore fundamental questions in ecology including the prediction of and explanation for measured relative abundances of primary producers in the mat, theoretical tradeoffs between overall productivity and the generation of toxic by-products, and the relative robustness of various guild interactions. Conclusion The three modeling approaches represent a flexible toolbox for creating cellular metabolic networks to study microbial communities on scales ranging from cells to ecosystems. A comparison of the three methods highlights considerations for selecting the one most appropriate for a given microbial system. For instance, communities represented only by metagenomic data can be modeled using the pooled method which analyzes a community's total metabolic potential without attempting to partition enzymes to different organisms. Systems with extensive a priori information on microbial guilds can be represented using the compartmentalized technique, employing distinct control volumes to separate guild-appropriate enzymes and metabolites. If the complexity of a compartmentalized network creates an unacceptable computational burden, the nested analysis approach permits greater scalability at the cost of more user intervention through multiple rounds of pathway analysis. PMID:20003240
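As a toy illustration of the stoichiometry-based analysis these methods build on: at steady state the admissible flux distributions satisfy S·v = 0, i.e., they lie in the null space of the stoichiometric matrix S. The sketch below computes a null-space basis for an invented five-reaction network with SciPy; true elementary mode analysis additionally imposes irreversibility and support-minimality constraints and needs dedicated enumeration tools.

```python
import numpy as np
from scipy.linalg import null_space

# Toy network, internal metabolites A and B (rows), five reactions (cols):
#   R1: -> A,  R2: A -> B,  R3: B ->,  R4: A ->,  R5: -> B
S = np.array([[1, -1,  0, -1,  0],    # A balance
              [0,  1, -1,  0,  1]])   # B balance

# Steady state S v = 0: admissible flux distributions live in null(S).
N = null_space(S)
print("degrees of freedom:", N.shape[1])
for k, v in enumerate(N.T):
    print(f"basis flux mode {k}:", np.round(v, 2))
```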
Meta-shell Approach for Constructing Lightweight and High Resolution X-Ray Optics
NASA Technical Reports Server (NTRS)
McClelland, Ryan S.
2016-01-01
Lightweight and high resolution optics are needed for future space-based x-ray telescopes to achieve advances in high-energy astrophysics. Past missions such as Chandra and XMM-Newton have achieved excellent angular resolution using a full shell mirror approach. Other missions such as Suzaku and NuSTAR have achieved lightweight mirrors using a segmented approach. This paper describes a new approach, called meta-shells, which combines the fabrication advantages of segmented optics with the alignment advantages of full shell optics. Meta-shells are built by layering overlapping mirror segments onto a central structural shell. The resulting optic has the stiffness and rotational symmetry of a full shell, but with an order of magnitude greater collecting area. Several meta-shells so constructed can be integrated into a large x-ray mirror assembly by proven methods used for Chandra and XMM-Newton. The mirror segments are mounted to the meta-shell using a novel four point semi-kinematic mount. The four point mount deterministically locates the segment in its most performance sensitive degrees of freedom. Extensive analysis has been performed to demonstrate the feasibility of the four point mount and meta-shell approach. A mathematical model of a meta-shell constructed with mirror segments bonded at four points and subject to launch loads has been developed to determine the optimal design parameters, namely bond size, mirror segment span, and number of layers per meta-shell. The parameters of an example 1.3 m diameter mirror assembly are given including the predicted effective area. To verify the mathematical model and support opto-mechanical analysis, a detailed finite element model of a meta-shell was created. Finite element analysis predicts low gravity distortion and low thermal distortion. Recent results are discussed including Structural Thermal Optical Performance (STOP) analysis as well as vibration and shock testing of prototype meta-shells.
Features of Cross-Correlation Analysis in a Data-Driven Approach for Structural Damage Assessment.
Camacho Navarro, Jhonatan; Ruiz, Magda; Villamizar, Rodolfo; Mujica, Luis; Quiroga, Jabid
2018-05-15
This work discusses the advantage of using cross-correlation analysis in a data-driven approach based on principal component analysis (PCA) and piezodiagnostics to obtain successful diagnosis of events in structural health monitoring (SHM). In this sense, the identification of noisy data and outliers, as well as the management of data cleansing stages can be facilitated through the implementation of a preprocessing stage based on cross-correlation functions. Additionally, this work evidences an improvement in damage detection when the cross-correlation is included as part of the whole damage assessment approach. The proposed methodology is validated by processing data measurements from piezoelectric devices (PZT), which are used in a piezodiagnostics approach based on PCA and baseline modeling. Thus, the influence of cross-correlation analysis used in the preprocessing stage is evaluated for damage detection by means of statistical plots and self-organizing maps. Three laboratory specimens were used as test structures in order to demonstrate the validity of the methodology: (i) a carbon steel pipe section with leak and mass damage types, (ii) an aircraft wing specimen, and (iii) a blade of a commercial aircraft turbine, where damages are specified as mass-added. As the main concluding remark, the suitability of cross-correlation features combined with a PCA-based piezodiagnostic approach in order to achieve a more robust damage assessment algorithm is verified for SHM tasks.
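A minimal sketch of this kind of pipeline on synthetic signals: cross-correlation with a reference excitation as the preprocessing stage, a PCA baseline built from healthy records, and the squared prediction error (Q statistic) as the damage indicator. The signals, the way damage is simulated (a small phase shift), and the threshold choice are all assumptions, not the authors' data or exact statistics.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
ref = np.sin(2 * np.pi * 25 * t)                      # excitation reference

def record(shift=0.0, noise=0.05):
    return np.sin(2 * np.pi * 25 * (t - shift)) + noise * rng.normal(size=t.size)

def features(sig):
    """Preprocessing: normalized cross-correlation with the reference."""
    xc = np.correlate(sig - sig.mean(), ref - ref.mean(), mode="full")
    return xc / np.max(np.abs(xc))

healthy = np.array([features(record()) for _ in range(40)])
pca = PCA(n_components=5).fit(healthy)

def q_stat(x):
    """Squared prediction error against the healthy PCA baseline."""
    r = x - pca.inverse_transform(pca.transform(x[None, :]))[0]
    return float(r @ r)

threshold = np.quantile([q_stat(h) for h in healthy], 0.99)
damaged = features(record(shift=0.002))               # phase shift mimics damage
print(f"Q healthy-like: {q_stat(features(record())):.3f}, "
      f"Q damaged: {q_stat(damaged):.3f}, threshold ~ {threshold:.3f}")
```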
Close Approach Prediction Analysis of the Earth Science Constellation with the Fengyun-1C Debris
NASA Technical Reports Server (NTRS)
Duncan, Matthew; Rand, David K.
2008-01-01
Routine satellite operations for the Earth Science Constellation (ESC) include collision risk assessment between members of the constellation and other orbiting space objects. Each day, close approach predictions are generated by a U.S. Department of Defense Joint Space Operations Center Orbital Safety Analyst using the high accuracy Space Object Catalog maintained by the Air Force's 1st Space Control Squadron. Prediction results and other ancillary data such as state vector information are sent to NASA/Goddard Space Flight Center's (GSFC's) Collision Risk Assessment analysis team for review. Collision analysis is performed and the GSFC team works with the ESC member missions to develop risk reduction strategies as necessary. This paper presents various close approach statistics for the ESC. The ESC missions have been affected by debris from the recent anti-satellite test which destroyed the Chinese Fengyun-1C satellite. The paper also presents the percentage of close approach events induced by the Fengyun-1C debris, and presents analysis results which predict the future effects on the ESC caused by this event. Specifically, the Fengyun-1C debris is propagated for twenty years using high-performance computing technology and close approach predictions are generated for the ESC. The percent increase in the total number of conjunction events is considered to be an estimate of the collision risk due to the Fengyun-1C break-up.
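The basic screening geometry behind such predictions can be sketched simply: given two ephemerides sampled at common epochs, find the time and distance of closest approach and compare against a screening threshold. The straight-line ephemerides and the 10 km threshold below are placeholders; operational screening uses propagated catalog states and covariance information.

```python
import numpy as np

def closest_approach(times, r1, r2):
    """Time and miss distance of closest approach between two ephemerides.

    times : (n,) epochs in seconds; r1, r2 : (n, 3) positions in km.
    """
    d = np.linalg.norm(r1 - r2, axis=1)
    k = int(np.argmin(d))
    return times[k], d[k]

# Illustrative straight-line ephemerides (km); real inputs would come from
# propagated catalog states.
t = np.linspace(0.0, 600.0, 6001)
r_sat = np.column_stack([7.5 * t, np.full_like(t, 7000.0), np.zeros_like(t)])
r_deb = np.column_stack([7.5 * (600.0 - t), np.full_like(t, 7002.0),
                         np.zeros_like(t)])

tca, miss = closest_approach(t, r_sat, r_deb)
print(f"TCA = {tca:.1f} s, miss distance = {miss:.2f} km")
if miss < 10.0:                     # example screening threshold
    print("-> flag for collision risk assessment")
```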
A framework for the meta-analysis of Bland-Altman studies based on a limits of agreement approach.
Tipton, Elizabeth; Shuster, Jonathan
2017-10-15
Bland-Altman method comparison studies are common in the medical sciences and are used to compare a new measure to a gold-standard (often costlier or more invasive) measure. The distribution of the differences between the two measures is summarized by two statistics, the 'bias' and standard deviation, and these measures are combined to provide estimates of the limits of agreement (LoA). When these LoA are within the bounds of clinically insignificant differences, the new non-invasive measure is preferred. Very often, multiple Bland-Altman studies have been conducted comparing the same two measures, and random-effects meta-analysis provides a means to pool these estimates. We provide a framework for the meta-analysis of Bland-Altman studies, including methods for estimating the LoA and measures of uncertainty (i.e., confidence intervals). Importantly, these LoA are likely to be wider than those typically reported in Bland-Altman meta-analyses. Frequently, Bland-Altman studies report results based on repeated measures designs but do not properly adjust for this design in the analysis. Meta-analyses of Bland-Altman studies frequently exclude these studies for this reason. We provide a meta-analytic approach that allows inclusion of estimates from these studies. This includes adjustments to the estimate of the standard deviation and a method for pooling the estimates based upon robust variance estimation. An example is included based on a previously published meta-analysis. Copyright © 2017 John Wiley & Sons, Ltd.
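A minimal sketch of the pooling idea, assuming per-study summaries of bias, SD of differences, and sample size: the biases are pooled with a DerSimonian-Laird random-effects model, and the meta-analytic LoA are widened by the between-study heterogeneity, which is why they tend to be wider than commonly reported. The estimators in the paper differ in detail (including the repeated-measures adjustments), and the numbers are invented.

```python
import numpy as np

# Per-study Bland-Altman summaries (illustrative): mean difference (bias),
# SD of the differences, and sample size.
bias = np.array([0.8, 1.2, 0.5, 1.0])
sd = np.array([2.0, 2.5, 1.8, 2.2])
n = np.array([40, 60, 35, 50])

# Random-effects (DerSimonian-Laird) pooled bias.
v = sd ** 2 / n                                  # variance of each study bias
w = 1.0 / v
mu_fixed = np.sum(w * bias) / np.sum(w)
q = np.sum(w * (bias - mu_fixed) ** 2)
tau2 = max(0.0, (q - (len(bias) - 1)) /
           (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
w_re = 1.0 / (v + tau2)
mu = np.sum(w_re * bias) / np.sum(w_re)

# Pooled within-study SD; heterogeneity widens the limits of agreement.
sd_pooled = np.sqrt(np.sum((n - 1) * sd ** 2) / np.sum(n - 1))
half_width = 1.96 * np.sqrt(sd_pooled ** 2 + tau2)
print(f"pooled bias = {mu:.2f}, "
      f"LoA = ({mu - half_width:.2f}, {mu + half_width:.2f})")
```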
Phelps, Charles E; Lakdawalla, Darius N; Basu, Anirban; Drummond, Michael F; Towse, Adrian; Danzon, Patricia M
2018-02-01
The fifth section of our Special Task Force report identifies and discusses two aggregation issues: 1) aggregation of cost and benefit information across individuals to a population level for benefit plan decision making and 2) combining multiple elements of value into a single value metric for individuals. First, we argue that additional elements could be included in measures of value, but such elements have not generally been included in measures of quality-adjusted life-years. For example, we describe a recently developed extended cost-effectiveness analysis (ECEA) that provides a good example of how to use a broader concept of utility. ECEA adds two features: measures of financial risk protection and income distributional consequences. We then discuss a further option for expanding this approach, augmented CEA, which can introduce many value measures. Neither of these approaches, however, provides a comprehensive measure of value. To resolve this issue, we review a technique called multicriteria decision analysis that can provide a comprehensive measure of value. We then discuss budget-setting and prioritization using multicriteria decision analysis, issues not yet fully resolved. Next, we discuss deliberative processes, which represent another important approach for population- or plan-level decisions used by many health technology assessment bodies. These use quantitative information on CEA and other elements, but the group decisions are reached by a deliberative voting process. Finally, we briefly discuss the use of stated preference methods for developing "hedonic" value frameworks, and conclude with some recommendations in this area. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Cole, Tyler; Veeravagu, Anand; Zhang, Michael; Azad, Tej D; Desai, Atman; Ratliff, John K
2015-07-01
Retrospective 2:1 propensity score-matched analysis of a national longitudinal database between 2006 and 2010. To compare rates of adverse events, revision procedure rates, and payment differences between anterior cervical fusion procedures and posterior laminectomy and fusion procedures with at least 3 levels of instrumentation. The comparative benefits of anterior versus posterior approaches to multilevel degenerative cervical disease remain controversial. Recent systematic reviews have reached conflicting conclusions. We demonstrate the comparative economic and clinical outcomes of anterior and posterior approaches for multilevel cervical degenerative disk disease. We identified 13,662 patients in a national billing claims database who underwent anterior or posterior cervical fusion procedures with 3 or more levels of instrumentation. Cohorts were balanced using 2:1 propensity score matching and outcomes were compared using bivariate analysis. With the exception of dysphagia (6.4% in anterior and 1.4% in posterior), overall 30-day complication rates were lower in the anterior approach group. The rate of any complication excluding dysphagia with anterior approaches was 12.3%, significantly lower (P < 0.0001) than that of posterior approaches, 17.8%. Anterior approaches resulted in lower hospital ($18,346 vs. $23,638) and total payments ($28,963 vs. $33,526). Patients receiving an anterior surgical approach demonstrated a significantly lower rate of 30-day readmission (5.1% vs. 9.9%, P < 0.0001), were less likely to require revision surgery (12.8% vs. 18.1%, P < 0.0001), and had a shorter length of stay by 1.5 nights (P < 0.0001). Anterior approaches in the surgical management of multilevel degenerative cervical disease provide clinical advantages over posterior approaches, including lower overall complication rates, revision procedure rates, and decreased length of stay. Anterior approach procedures are also associated with decreased overall payments. These findings must be interpreted in light of limitations inherent to retrospective longitudinal studies, including the absence of subjective and radiographic outcomes. Level of evidence: 3.
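The matching design can be sketched on synthetic data: estimate propensity scores by logistic regression, then greedily match each treated patient to its two nearest controls by score, and compare outcomes in the matched sample. Covariates, effect sizes, and the greedy matching rule below are assumptions for illustration, not the study's method in detail.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 3000
X = rng.normal(size=(n, 3))                     # scaled covariates
treat = rng.uniform(size=n) < 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.5)))
outcome = (rng.uniform(size=n) <
           1 / (1 + np.exp(-(-2.0 + 0.5 * X[:, 1] + 0.4 * treat))))

# Propensity scores from logistic regression on the covariates.
ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]

# Greedy 2:1 nearest-neighbor matching without replacement.
treated = np.flatnonzero(treat)
controls = list(np.flatnonzero(~treat))
matched_c = []
for i in treated:
    controls.sort(key=lambda j: abs(ps[j] - ps[i]))
    matched_c += [controls.pop(0), controls.pop(0)]   # two closest controls

print(f"event rate: treated {outcome[treated].mean():.3f} "
      f"vs matched controls {outcome[matched_c].mean():.3f}")
```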
Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
2017-01-01
Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
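The genetic-algorithm component can be sketched generically: a population of protocol messages is evolved by mutation and crossover against a fitness function, which in the real system is code coverage measured on the instrumented target. The stand-in fitness below, and the seed message, are placeholders only.

```python
import random

random.seed(1)
SEED = b"GET /index HTTP/1.0\r\n\r\n"

def coverage(msg: bytes) -> int:
    """Stand-in fitness: in a real fuzzer this would be the number of
    basic blocks covered when the instrumented target parses `msg`."""
    return len(set(msg)) + (10 if b"//" in msg else 0)

def mutate(msg: bytes) -> bytes:
    b = bytearray(msg)
    for _ in range(random.randint(1, 3)):
        b[random.randrange(len(b))] = random.randrange(256)  # random byte flip
    return bytes(b)

def crossover(a: bytes, b: bytes) -> bytes:
    cut = random.randrange(1, min(len(a), len(b)))
    return a[:cut] + b[cut:]

pop = [mutate(SEED) for _ in range(50)]
for gen in range(30):
    pop.sort(key=coverage, reverse=True)
    elite = pop[:10]                              # keep best test cases
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(40)]
print("best fitness:", coverage(max(pop, key=coverage)))
```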
Application of tire dynamics to aircraft landing gear design analysis
NASA Technical Reports Server (NTRS)
Black, R. J.
1983-01-01
The tire plays a key part in many analyses used for design of aircraft landing gear. Examples include structural design of wheels, landing gear shimmy, brake whirl, chatter and squeal, complex combinations of chatter and shimmy on main landing gear (MLG) systems, anti-skid performance, gear walk, and rough terrain loads and performance. Tire parameters needed in the various analyses are discussed. Two tire models are discussed for shimmy analysis, the modified Moreland approach and the von Schlippe-Dietrich approach. It is shown that the Moreland model can be derived from the von Schlippe-Dietrich model by certain approximations. The remaining analysis areas are discussed in general terms and the tire parameters needed for each are identified. Accurate tire data allow more accurate design analysis and correct prediction of the dynamic performance of aircraft landing gear.
Random-effects meta-analysis: the number of studies matters.
Guolo, Annamaria; Varin, Cristiano
2017-06-01
This paper investigates the impact of the number of studies on meta-analysis and meta-regression within the random-effects model framework. It is frequently neglected that inference in random-effects models requires a substantial number of studies in the meta-analysis to guarantee reliable conclusions. Several authors warn about the risk of inaccurate results of the traditional DerSimonian and Laird approach, especially in the common case of meta-analyses involving a limited number of studies. This paper presents a selection of likelihood and non-likelihood methods for inference in meta-analysis proposed to overcome the limitations of the DerSimonian and Laird procedure, with a focus on the effect of the number of studies. The applicability and the performance of the methods are investigated in terms of Type I error rates and empirical power to detect effects, according to scenarios of practical interest. Simulation studies and applications to real meta-analyses highlight that it is not possible to identify an approach uniformly superior to alternatives. The overall recommendation is to avoid the DerSimonian and Laird method when the number of meta-analysis studies is modest and to prefer a more comprehensive procedure that compares alternative inferential approaches. R code for meta-analysis according to all of the inferential methods examined in the paper is provided.
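The paper provides R code for the methods it examines; as a point of reference, here is a minimal Python implementation of just the DerSimonian and Laird baseline that the paper cautions against for small numbers of studies, applied to five invented effect sizes.

```python
import numpy as np

def dersimonian_laird(y, v):
    """Random-effects pooled estimate (DerSimonian-Laird) from study
    effects y and within-study variances v."""
    w = 1.0 / v
    mu_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fixed) ** 2)
    tau2 = max(0.0, (q - (len(y) - 1)) /
               (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, se, tau2

# Small meta-analysis (5 studies) -- exactly the regime where the paper
# cautions that DL confidence intervals can be too narrow.
y = np.array([0.10, 0.30, -0.05, 0.25, 0.40])      # study effect sizes
v = np.array([0.02, 0.03, 0.015, 0.025, 0.04])     # within-study variances
mu, se, tau2 = dersimonian_laird(y, v)
print(f"mu = {mu:.3f} +/- {1.96 * se:.3f}, tau^2 = {tau2:.4f}")
```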
Adaptive windowing and windowless approaches to estimate dynamic functional brain connectivity
NASA Astrophysics Data System (ADS)
Yaesoubi, Maziar; Calhoun, Vince D.
2017-08-01
In this work, we discuss estimation of the dynamic dependence of a multivariate signal. Commonly used approaches are often based on a locality assumption (e.g., sliding window) and can miss spontaneous changes due to blurring with local but unrelated changes. We discuss recent approaches to overcome this limitation, including 1) a wavelet-space approach, essentially adapting the window to the underlying frequency content, and 2) a sparse signal representation, which removes any locality assumption. The latter is especially useful when there is no prior knowledge of the validity of such an assumption, as in brain analysis. Results on several large resting-fMRI data sets highlight the potential of these approaches.
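The locality assumption being discussed is easy to see in code: a sliding-window correlation estimate with an arbitrary window length, applied to synthetic signals whose coupling switches on halfway through. The window length and signals are illustrative; the wavelet-space and sparse-representation approaches in the paper are precisely attempts to remove this hand-picked parameter.

```python
import numpy as np

rng = np.random.default_rng(0)
T, win = 600, 60
x = rng.normal(size=T)
y = rng.normal(size=T)
y[300:] = 0.9 * x[300:] + 0.1 * rng.normal(size=300)   # coupling switches on

def sliding_corr(a, b, w):
    """Windowed Pearson correlation -- the locality assumption that the
    wavelet and sparse-representation approaches aim to relax."""
    return np.array([np.corrcoef(a[i:i + w], b[i:i + w])[0, 1]
                     for i in range(len(a) - w)])

r = sliding_corr(x, y, win)
print("mean |r| before change:", np.abs(r[:200]).mean().round(2))
print("mean |r| after change: ", np.abs(r[350:]).mean().round(2))
```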
Wave-Sediment Interaction in Muddy Environments: A Field Experiment
2009-01-01
Geosciences project includes a field experiment on the Atchafalaya shelf, Louisiana, in Years 1 and 2 (2007-2008) and a data analysis and modeling effort in... analysis procedures. During the major field experiment effort in 2008 (Year 2), a total of 5 tripods were deployed at locations fronting the Atchafalaya... experiment effort. This final year of the project (2009, Year 3) has been focused upon data analysis and preparation of publications.
A Bayesian approach to meta-analysis of plant pathology studies.
Mila, A L; Ngugi, H K
2011-01-01
Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard) which was evaluated only in seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework. Bayesian meta-analysis can readily include information not easily incorporated in classical methods, and allow for a full evaluation of competing models. Given the power and flexibility of Bayesian methods, we expect them to become widely adopted for meta-analysis of plant pathology studies.
Ocean tides and quasi-stationary departures from the marine geoid investigation
NASA Technical Reports Server (NTRS)
Siry, J. W.; Kahn, W. D.; Bryan, J. W.; Vonbun, F. O.
1973-01-01
The detection of tides and/or currents through the analysis of data generated in connection with the Ocean Geoid Determination Investigation is presented. A discussion of the detailed objectives and approach is included.
Improved Regression Analysis of Temperature-Dependent Strain-Gage Balance Calibration Data
NASA Technical Reports Server (NTRS)
Ulbrich, N.
2015-01-01
An improved approach is discussed that may be used to directly include first and second order temperature effects in the load prediction algorithm of a wind tunnel strain-gage balance. The improved approach was designed for the Iterative Method that fits strain-gage outputs as a function of calibration loads and uses a load iteration scheme during the wind tunnel test to predict loads from measured gage outputs. The improved approach assumes that the strain-gage balance is at a constant uniform temperature when it is calibrated and used. First, the method introduces a new independent variable for the regression analysis of the balance calibration data. The new variable is defined as the difference between the uniform temperature of the balance and a global reference temperature. This reference temperature should be the primary calibration temperature of the balance so that, if needed, a tare load iteration can be performed. Then, two temperature-dependent terms are included in the regression models of the gage outputs. They are the temperature difference itself and the square of the temperature difference. Simulated temperature-dependent data obtained from Triumph Aerospace's 2013 calibration of NASA's ARC-30K five-component semi-span balance is used to illustrate the application of the improved approach.
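The regression structure can be sketched with ordinary least squares: the gage output is fit against the load plus the two added terms, the temperature difference and its square. The synthetic coefficients and the single-gage, single-load simplification below are assumptions; the actual method handles multi-component balances and the load iteration scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
load = rng.uniform(-100.0, 100.0, n)          # applied calibration load
T = rng.uniform(10.0, 40.0, n)                # balance temperature, deg C
T_ref = 20.0                                  # primary calibration temperature
dT = T - T_ref

# Synthetic gage output with first- and second-order temperature effects.
out = 50.0 + 2.0 * load + 0.3 * dT - 0.02 * dT ** 2 + rng.normal(0, 0.5, n)

# Regression model: intercept, load, dT, dT^2 (the two added temperature terms).
A = np.column_stack([np.ones(n), load, dT, dT ** 2])
coef, *_ = np.linalg.lstsq(A, out, rcond=None)
print("fitted [intercept, load, dT, dT^2]:", np.round(coef, 3))
```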
Metabolomics, Standards, and Metabolic Modeling for Synthetic Biology in Plants
Hill, Camilla Beate; Czauderna, Tobias; Klapperstück, Matthias; Roessner, Ute; Schreiber, Falk
2015-01-01
Life on earth depends on dynamic chemical transformations that enable cellular functions, including electron transfer reactions, as well as synthesis and degradation of biomolecules. Biochemical reactions are coordinated in metabolic pathways that interact in a complex way to allow adequate regulation. Biotechnology, food, biofuel, agricultural, and pharmaceutical industries are highly interested in metabolic engineering as an enabling technology of synthetic biology to exploit cells for the controlled production of metabolites of interest. These approaches have only recently been extended to plants due to their greater metabolic complexity (such as primary and secondary metabolism) and highly compartmentalized cellular structures and functions (including plant-specific organelles) compared with bacteria and other microorganisms. Technological advances in analytical instrumentation in combination with advances in data analysis and modeling have opened up new approaches to engineer plant metabolic pathways and allow the impact of modifications to be predicted more accurately. In this article, we review challenges in the integration and analysis of large-scale metabolic data, present an overview of current bioinformatics methods for the modeling and visualization of metabolic networks, and discuss approaches for interfacing bioinformatics approaches with metabolic models of cellular processes and flux distributions in order to predict phenotypes derived from specific genetic modifications or subjected to different environmental conditions. PMID:26557642
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Yuanshun; Baek, Seung H.; Garcia-Diza, Alberto
2012-01-01
This paper designs a comprehensive approach based on the engineering machine/system concept to model, analyze, and assess the level of CO2 exchange between the atmosphere and terrestrial ecosystems, which is an important factor in understanding changes in global climate. The focus of this article is on spatial patterns and on the correlation between levels of CO2 fluxes and a variety of influencing factors in eco-environments. The engineering/machine concept used is a system protocol that includes the sequential activities of design, test, observe, and model. This concept is applied to explicitly include various influencing factors and interactions associated with CO2 fluxes. To formulate effective models of a large and complex climate system, this article introduces a modeling technique referred to as Stochastic Filtering Analysis of Variance (SF-ANOVA). The CO2 flux data observed at several AmeriFlux sites are used to illustrate and validate the analysis, prediction, and globalization capabilities of the proposed engineering approach and the SF-ANOVA technique. The SF-ANOVA modeling approach was compared to stepwise regression, ridge regression, and neural networks. The comparison indicated that the proposed approach is a valid and effective tool with similar accuracy and less complexity than the other procedures.
How Older Persons Perceive the Loss of Independence: The Need of a Holistic Approach to Frailty.
Escourrou, E; Cesari, M; Chicoulaa, B; Fougère, B; Vellas, B; Andrieu, S; Oustric, S
2017-01-01
Since 2004, the definition of the frailty syndrome has shifted from purely physical criteria to a more comprehensive consideration of the individual, including psychosocial criteria. In this study, qualitative research methods were used as a complementary approach in order to enrich the existing quantitative results in this area. The aim was to understand the views of older persons on the risk of loss of independence. The study population comprised people over 75 years of age who were living at home in the south-west of France and were considered to be at risk of losing their independence. Data were collected using individual semi-structured in-depth interviews, accompanied by observations. Inductive analysis was carried out according to grounded theory methods. Fifteen individual interviews were conducted to achieve theoretical data saturation. Analysis of the content of the interviews revealed seven risk factors for the loss of independence: poor mental health, poor physical health, social isolation, no longer leaving the home, an unsuitable environment, unsuitable living conditions, and few resources. These results complement the purely physical approach to screening for the frailty syndrome and lead us to reconsider our screening approach to include a more holistic view of the older person and their circumstances.
Api, A M; Belsito, D; Bruze, M; Cadby, P; Calow, P; Dagli, M L; Dekant, W; Ellis, G; Fryer, A D; Fukayama, M; Griem, P; Hickey, C; Kromidas, L; Lalko, J F; Liebler, D C; Miyachi, Y; Politano, V T; Renskers, K; Ritacco, G; Salvito, D; Schultz, T W; Sipes, I G; Smith, B; Vitale, D; Wilcox, D K
2015-08-01
The Research Institute for Fragrance Materials, Inc. (RIFM) has been engaged in the generation and evaluation of safety data for fragrance materials since its inception over 45 years ago. Over time, RIFM's approach to gathering data, estimating exposure and assessing safety has evolved as the tools for risk assessment evolved. This publication is designed to update the RIFM safety assessment process, which follows a series of decision trees, reflecting advances in approaches in risk assessment and new and classical toxicological methodologies employed by RIFM over the past ten years. These changes include incorporating 1) new scientific information including a framework for choosing structural analogs, 2) consideration of the Threshold of Toxicological Concern (TTC), 3) the Quantitative Risk Assessment (QRA) for dermal sensitization, 4) the respiratory route of exposure, 5) aggregate exposure assessment methodology, 6) the latest methodology and approaches to risk assessments, 7) the latest alternatives to animal testing methodology and 8) environmental risk assessment. The assessment begins with a thorough analysis of existing data followed by in silico analysis, identification of 'read across' analogs, generation of additional data through in vitro testing as well as consideration of the TTC approach. If necessary, risk management may be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
A polyphasic taxonomic approach in isolated strains of Cyanobacteria from thermal springs of Greece.
Bravakos, Panos; Kotoulas, Georgios; Skaraki, Katerina; Pantazidou, Adriani; Economou-Amilli, Athena
2016-05-01
Strains of Cyanobacteria isolated from mats of 9 thermal springs of Greece were studied for their taxonomic evaluation. A polyphasic taxonomic approach was employed which included: morphological observations by light microscopy and scanning electron microscopy; maximum parsimony, maximum likelihood and Bayesian analysis of 16S rDNA sequences; secondary structural comparisons of 16S-23S rRNA Internal Transcribed Spacer sequences; and finally environmental data. The 17 cyanobacterial isolates formed a diverse group that contained filamentous, coccoid and heterocytous strains. These included representatives of the polyphyletic genera Synechococcus and Phormidium, and of the orders Oscillatoriales, Spirulinales, Chroococcales and Nostocales. The analysis supports at least 6 new taxa at the genus level, providing new evidence for the taxonomy of Cyanobacteria and highlighting the abundant diversity of thermal spring environments, with many potential endemic species or ecotypes. Copyright © 2016 Elsevier Inc. All rights reserved.
Tuan, Nguyen Ngoc; Chang, Yi-Chia; Yu, Chang-Ping; Huang, Shir-Ly
2014-01-01
In this study, the first survey of the microbial community in a thermophilic anaerobic digester using swine manure as sole feedstock was performed by multiple approaches, including denaturing gradient gel electrophoresis (DGGE), clone library, and pyrosequencing techniques. The integrated analysis of 21 DGGE bands, 126 clones, and 8506 pyrosequencing read sequences revealed that Clostridia, from the phylum Firmicutes, were the most dominant Bacteria. In addition, our analysis identified taxa that were missed by previous studies, including members of the bacterial phyla Synergistetes, Planctomycetes, Armatimonadetes, Chloroflexi and Nitrospira, which might also play a role in the thermophilic anaerobic digester. Most archaeal 16S rRNA sequences could be assigned to the order Methanobacteriales rather than Methanomicrobiales, in contrast to previous studies. In addition, this study is the first to report a member of the genus Methanothermobacter in a thermophilic anaerobic digester. Copyright © 2014 Elsevier GmbH. All rights reserved.
NASA Astrophysics Data System (ADS)
Szafranko, Elżbieta
2017-10-01
Assessment of variant solutions developed for a building investment project needs to be made at the planning stage. While considering alternative solutions, the investor defines various criteria, but a direct evaluation of the degree to which the developed variants fulfil those criteria can be very difficult. In practice, there are different methods that enable the user to include a large number of parameters in an analysis, but their implementation can be challenging. Some methods require advanced mathematical computations, preceded by complicated input-data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author developed a systemic approach that involves several methods and compares their outcomes. The final stage of the proposed method consists of a graphic interpretation of the results. The method has been tested on a variety of building and development projects.
A Sociocultural Approach to Children's Perceptions of Death and Loss.
Yang, Sungeun; Park, Soyeon
2017-11-01
By employing the phenomenographic approach, the present study explored children's cognitive understanding of and emotional responses to death and bereavement. Participants included 52 Korean, 16 Chinese, and 16 Chinese American children aged 5-6. Thematic analysis of children's drawings and open-ended interviews revealed that most children associated death with negative emotions such as fear, anxiety, and sadness. The majority of children used realistic expressions to narrate death. The core themes from their drawings included causes of death, attempts to stop the dying, and situations after death. This study contributes to the literature by targeting young children, who have been relatively excluded in death studies, and provides evidence of the usefulness of drawings as a developmentally appropriate data collection tool. The findings also enrich our knowledge about children's understanding of death and bereavement, rooted in the inductive analysis of empirical data with children from culturally diverse backgrounds.
A queueing theory based model for business continuity in hospitals.
Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R
2013-01-01
Clinical activities can be seen as the result of a precise and defined succession of events, where every phase is characterized by a waiting time that includes working duration and possible delay, and technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough; a risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate the possible interventions and to protect the whole system from technology failures. The following paper reports a case study on the application of the proposed integrated model, including a risk analysis approach and a queueing theory model, for defining the number of devices essential to guarantee medical activity and comply with business continuity management requirements in hospitals.
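As a rough illustration of the queueing-theory component, the number of devices can be sized with a standard M/M/c model so that the probability a clinical request has to wait stays below a target. This is a minimal sketch with invented arrival and service rates, not the model from the case study.

```python
# Minimal M/M/c sizing sketch: choose the smallest device count c keeping
# the Erlang-C waiting probability below a target. Rates are hypothetical.
from math import factorial

def erlang_c(c, lam, mu):
    """Probability that an arriving request must wait in an M/M/c queue."""
    a = lam / mu                       # offered load
    rho = a / c                        # utilization; model requires rho < 1
    if rho >= 1:
        return 1.0
    s = sum(a**k / factorial(k) for k in range(c))
    top = a**c / factorial(c) * (1 / (1 - rho))
    return top / (s + top)

lam, mu = 8.0, 3.0                     # e.g., 8 requests/hour, 3 served/hour/device
c = 1
while erlang_c(c, lam, mu) > 0.05:     # target: <5% chance a request waits
    c += 1
print("devices needed:", c)
```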
Understanding poisson regression.
Hayat, Matthew J; Higgins, Melinda
2014-04-01
Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
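A minimal worked example of the modeling strategy described above, using simulated counts rather than the ENSPIRE data: fit a Poisson regression, check a deviance-based dispersion statistic, and fall back to a negative binomial model if overdispersion is present. Variable names are illustrative.

```python
# Hedged illustration of Poisson regression with an overdispersion check.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.normal(size=500)               # a single predictor (simulated)
mu = np.exp(0.3 + 0.5 * x)
y = rng.poisson(mu)                    # simulated count outcome

X = sm.add_constant(x)
poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(poisson_fit.params)

# If deviance/df is well above 1, the counts are overdispersed; a negative
# binomial model is one standard alternative.
print("deviance/df:", poisson_fit.deviance / poisson_fit.df_resid)
nb_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(nb_fit.params)
```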
An approach to evaluating reactive airborne wind shear systems
NASA Technical Reports Server (NTRS)
Gibson, Joseph P., Jr.
1992-01-01
An approach to evaluating reactive airborne windshear detection systems was developed to support a deployment study for future FAA ground-based windshear detection systems. The deployment study methodology assesses potential future safety enhancements beyond planned capabilities. The reactive airborne systems will be an integral part of planned windshear safety enhancements. The approach to evaluating reactive airborne systems involves separate analyses for the landing and take-off scenarios. The analysis estimates the probability of effective warning considering several factors, including NASA energy-height-loss characteristics, reactive alert timing, and a probability distribution for microburst strength.
Detection and categorization of bacteria habitats using shallow linguistic analysis
2015-01-01
Background: Information regarding bacteria biotopes is important for several research areas including health sciences, microbiology, and food processing and preservation. One of the challenges for scientists in these domains is the huge amount of information buried in the text of electronic resources. Developing methods to automatically extract bacteria habitat relations from the text of these electronic resources is crucial for facilitating research in these areas. Methods: We introduce a linguistically motivated rule-based approach for recognizing and normalizing names of bacteria habitats in biomedical text by using an ontology. Our approach is based on shallow syntactic analysis of the text that includes sentence segmentation, part-of-speech (POS) tagging, partial parsing, and lemmatization. In addition, we propose two methods for identifying bacteria habitat localization relations. The underlying assumption for the first method is that discourse changes with a new paragraph; therefore, it operates on a paragraph basis. The second method performs a more fine-grained analysis of the text and operates on a sentence basis. We also develop a novel anaphora resolution method for bacteria coreferences and incorporate it with the sentence-based relation extraction approach. Results: We participated in the Bacteria Biotope (BB) Task of the BioNLP Shared Task 2013. Our system (Boun) achieved the second-best performance with a 68% Slot Error Rate (SER) in Sub-task 1 (Entity Detection and Categorization), and ranked third with an F-score of 27% in Sub-task 2 (Localization Event Extraction). This paper reports the system implemented for the shared task, including the novel methods developed and the improvements obtained after the official evaluation. The extensions include the expansion of the OntoBiotope ontology using the training set for Sub-task 1, and the novel sentence-based relation extraction method incorporated with anaphora resolution for Sub-task 2. These extensions yielded promising results for Sub-task 1 with a SER of 68%, and state-of-the-art performance for Sub-task 2 with an F-score of 53%. Conclusions: Our results show that a linguistically oriented approach based on shallow syntactic analysis of the text is as effective as machine learning approaches for the detection and ontology-based normalization of habitat entities. Furthermore, the newly developed sentence-based relation extraction system with the anaphora resolution module significantly outperforms the paragraph-based one, as well as the other systems that participated in the BB Shared Task 2013. PMID:26201262
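The rule-based pipeline described above can be caricatured in a few lines: shallow analysis (sentence splitting and lemmatization), followed by lemma lookup in an ontology. The toy ontology entries and example text below are invented; the actual system adds POS tagging, partial parsing, and the full OntoBiotope ontology.

```python
# Toy caricature of the rule-based approach: naive sentence splitting and
# lemmatization, then lemma lookup in a tiny mock ontology of habitat terms.
# The OBT identifiers and example text are invented placeholders.
import re

ONTOLOGY = {"soil": "OBT:000001", "water": "OBT:000002", "gut": "OBT:000003"}

def lemmatize(token: str) -> str:
    """Extremely naive lemmatizer: lowercase and strip a plural 's'."""
    t = token.lower()
    return t[:-1] if t.endswith("s") and len(t) > 3 else t

text = "The strains were isolated from soils. They also survive in the human gut."
for sentence in re.split(r"(?<=[.!?])\s+", text):
    for token in re.findall(r"[A-Za-z]+", sentence):
        lemma = lemmatize(token)
        if lemma in ONTOLOGY:
            print(f"habitat mention in {sentence!r}: {token!r} -> {ONTOLOGY[lemma]}")
```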
NASA Astrophysics Data System (ADS)
Zhao, Zhanfeng; Illman, Walter A.
2018-04-01
Previous studies have shown that geostatistics-based transient hydraulic tomography (THT) is robust for subsurface heterogeneity characterization through the joint inverse modeling of multiple pumping tests. However, the hydraulic conductivity (K) and specific storage (Ss) estimates can be smooth or even erroneous for areas where pumping/observation densities are low. This renders the imaging of interlayer and intralayer heterogeneity of highly contrasting materials including their unit boundaries difficult. In this study, we further test the performance of THT by utilizing existing and newly collected pumping test data of longer durations that showed drawdown responses in both aquifer and aquitard units at a field site underlain by a highly heterogeneous glaciofluvial deposit. The robust performance of the THT is highlighted through the comparison of different degrees of model parameterization including: (1) the effective parameter approach; (2) the geological zonation approach relying on borehole logs; and (3) the geostatistical inversion approach considering different prior information (with/without geological data). Results reveal that the simultaneous analysis of eight pumping tests with the geostatistical inverse model yields the best results in terms of model calibration and validation. We also find that the joint interpretation of long-term drawdown data from aquifer and aquitard units is necessary in mapping their full heterogeneous patterns including intralayer variabilities. Moreover, as geological data are included as prior information in the geostatistics-based THT analysis, the estimated K values increasingly reflect the vertical distribution patterns of permeameter-estimated K in both aquifer and aquitard units. Finally, the comparison of various THT approaches reveals that differences in the estimated K and Ss tomograms result in significantly different transient drawdown predictions at observation ports.
Cooper, Chris; Lovell, Rebecca; Husk, Kerryn; Booth, Andrew; Garside, Ruth
2018-06-01
We undertook a systematic review to evaluate the health benefits of environmental enhancement and conservation activities. We were concerned that a conventional process of study identification, focusing on exhaustive searches of bibliographic databases as the primary search method, would be ineffective, offering limited value. The focus of this study is comparing study identification methods. We compare (1) an approach led by searches of bibliographic databases with (2) an approach led by supplementary search methods. We retrospectively assessed the effectiveness and value of both approaches. Effectiveness was determined by comparing (1) the total number of studies identified and screened and (2) the number of includable studies uniquely identified by each approach. Value was determined by comparing included study quality and by using qualitative sensitivity analysis to explore the contribution of studies to the synthesis. The bibliographic databases approach identified 21,409 studies to screen, and 2 included qualitative studies were uniquely identified. Study quality was moderate, and contribution to the synthesis was minimal. The supplementary search approach identified 453 studies to screen, and 9 included studies were uniquely identified. Four quantitative studies were poor quality but made a substantive contribution to the synthesis; 5 studies were qualitative: 3 were good quality, 1 was moderate quality, and 1 was excluded from the synthesis due to poor quality. All 4 included qualitative studies made significant contributions to the synthesis. This case study found value in aligning primary methods of study identification to maximise location of relevant evidence. Copyright © 2017 John Wiley & Sons, Ltd.
Lanza, Stephanie T.; Coffman, Donna L.
2013-01-01
Prevention scientists use latent class analysis (LCA) with increasing frequency to characterize complex behavior patterns and profiles of risk. Often, the most important research questions in these studies involve establishing characteristics that predict membership in the latent classes, thus describing the composition of the subgroups and suggesting possible points of intervention. More recently, prevention scientists have begun to adopt modern methods for drawing causal inference from observational data because of the bias that can be introduced by confounders. This same issue of confounding exists in any analysis of observational data, including prediction of latent class membership. This study demonstrates a straightforward approach to causal inference in LCA that builds on propensity score methods. We demonstrate this approach by examining the causal effect of early sex on subsequent delinquency latent classes using data from 1,890 adolescents in 11th and 12th grade from wave I of the National Longitudinal Study of Adolescent Health. Prior to the statistical adjustment for potential confounders, early sex was significantly associated with delinquency latent class membership for both genders (p=0.02). However, the propensity score adjusted analysis indicated no evidence for a causal effect of early sex on delinquency class membership (p=0.76) for either gender. Sample R and SAS code is included in an Appendix in the ESM so that prevention scientists may adopt this approach to causal inference in LCA in their own work. PMID:23839479
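The propensity-score step generalizes readily; the sketch below shows inverse-probability weighting in Python under the assumption that latent class labels have already been estimated (the paper's ESM provides the authors' own R and SAS code, which this does not reproduce). All data here are simulated and all names are hypothetical.

```python
# Minimal sketch of propensity-score weighting before testing an
# exposure-class association. The LCA step itself is not shown.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 1000
confounder = rng.normal(size=n)                              # e.g., parental monitoring
exposure = rng.binomial(1, 1 / (1 + np.exp(-confounder)))    # binary exposure indicator
latent_class = rng.binomial(1, 0.3, size=n)                  # stand-in for an LCA label

# 1. Propensity model: P(exposure | confounders)
ps = LogisticRegression().fit(confounder.reshape(-1, 1), exposure)
p = ps.predict_proba(confounder.reshape(-1, 1))[:, 1]

# 2. Inverse-probability-of-treatment weights
w = np.where(exposure == 1, 1 / p, 1 / (1 - p))

# 3. Weighted outcome model: class membership regressed on exposure
outcome = LogisticRegression().fit(
    exposure.reshape(-1, 1), latent_class, sample_weight=w)
print("weighted log-odds of class membership per exposure:", outcome.coef_)
```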
Austin, Christine; Gennings, Chris; Tammimies, Kristiina; Bölte, Sven; Arora, Manish
2017-01-01
Environmental exposures to essential and toxic elements may alter health trajectories, depending on the timing, intensity, and mixture of exposures. In epidemiologic studies, these factors are typically analyzed as a function of elemental concentrations in biological matrices measured at one or more points in time. Such an approach, however, fails to account for the temporal cyclicity in the metabolism of environmental chemicals, which if perturbed may lead to adverse health outcomes. Here, we conceptualize and apply a non-linear method, recurrence quantification analysis (RQA), to quantify cyclical components of prenatal and early postnatal exposure profiles for elements essential to normal development, including Zn, Mn, Mg, and Ca, and elements associated with deleterious health effects or narrow tolerance ranges, including Pb, As, and Cr. We found robust evidence of cyclical patterns in the metabolic profiles of nutrient elements, which we validated against randomized twin-surrogate time-series, and further found that nutrient dynamical properties differ from those of Cr, As, and Pb. Furthermore, we extended this approach to provide a novel method of quantifying dynamic interactions between two environmental exposures. To achieve this, we used cross-recurrence quantification analysis (CRQA), and found that elemental nutrient-nutrient interactions differed from those involving toxicants. These rhythmic regulatory interactions, which we characterize in two geographically distinct cohorts, have not previously been uncovered using traditional regression-based approaches, and may provide a critical unit of analysis for environmental and dietary exposures in epidemiological studies. PMID:29112980
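The basic RQA object is easy to construct: threshold the pairwise distance matrix of a series to obtain a recurrence matrix, then summarize it (here with the recurrence rate). The sketch below uses a synthetic cyclical signal rather than the cohort's elemental profiles, and omits the delay embedding typically applied first.

```python
# Sketch of the core RQA computation on a 1-D series: recurrence matrix
# and recurrence rate. The threshold and signal are illustrative only.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(200)
series = np.sin(2 * np.pi * t / 25) + 0.2 * rng.normal(size=t.size)  # cyclical signal

eps = 0.3                                     # recurrence threshold (hypothetical)
dist = np.abs(series[:, None] - series[None, :])
R = (dist < eps).astype(int)                  # recurrence matrix

recurrence_rate = R.mean()                    # fraction of recurrent point pairs
print("recurrence rate:", recurrence_rate)
```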
The application of cluster analysis in the intercomparison of loop structures in RNA.
Huang, Hung-Chung; Nagaswamy, Uma; Fox, George E
2005-04-01
We have developed a computational approach for the comparison and classification of RNA loop structures. Hairpin or interior loops identified in atomic resolution RNA structures were intercompared by conformational matching. The root-mean-square deviation (RMSD) values between all pairs of RNA fragments of interest, even if from different molecules, are calculated. Subsequently, cluster analysis is performed on the resulting matrix of RMSD distances using the unweighted pair group method with arithmetic mean (UPGMA). The cluster analysis objectively reveals groups of folds that resemble one another. To demonstrate the utility of the approach, a comprehensive analysis of all the terminal hairpin tetraloops that have been observed in 15 RNA structures that have been determined by X-ray crystallography was undertaken. The method found major clusters corresponding to the well-known GNRA and UNCG types. In addition, two tetraloops with the unusual primary sequence UMAC (M is A or C) were successfully assigned to the GNRA cluster. Larger loop structures were also examined and the clustering results confirmed the occurrence of variations of the GNRA and UNCG tetraloops in these loops and provided a systematic means for locating them. Nineteen examples of larger loops that closely resemble either the GNRA or UNCG tetraloop were found in the large ribosomal RNAs. When the clustering approach was extended to include all structures in the SCOR database, novel relationships were detected including one between the ANYA motif and a less common folding of the GAAA tetraloop sequence.
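The clustering step maps directly onto standard tools: UPGMA is average-linkage hierarchical clustering applied to the condensed RMSD distance matrix. The 4x4 RMSD values in this sketch are invented placeholders, not measured loop geometries.

```python
# Sketch of the clustering step: UPGMA (average linkage) on a pairwise
# RMSD matrix between loop structures. Values below are placeholders.
import numpy as np
from scipy.cluster.hierarchy import average, fcluster
from scipy.spatial.distance import squareform

rmsd = np.array([[0.0, 0.5, 2.1, 2.3],
                 [0.5, 0.0, 2.2, 2.4],
                 [2.1, 2.2, 0.0, 0.6],
                 [2.3, 2.4, 0.6, 0.0]])      # pairwise RMSD between loops (Angstroms)

Z = average(squareform(rmsd))                # UPGMA on the condensed distance matrix
labels = fcluster(Z, t=1.0, criterion="distance")
print("cluster assignments:", labels)        # two clusters expected here
```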
Conducting high-value secondary dataset analysis: an introductory guide and resources.
Smith, Alexander K; Ayanian, John Z; Covinsky, Kenneth E; Landon, Bruce E; McCarthy, Ellen P; Wee, Christina C; Steinman, Michael A
2011-08-01
Secondary analyses of large datasets provide a mechanism for researchers to address high-impact questions that would otherwise be prohibitively expensive and time-consuming to study. This paper presents a guide to assist investigators interested in conducting secondary data analysis, including advice on the process of successful secondary data analysis as well as a brief summary of high-value datasets and online resources for researchers, including the SGIM dataset compendium (www.sgim.org/go/datasets). The same basic research principles that apply to primary data analysis apply to secondary data analysis, including the development of a clear and clinically relevant research question, study sample, appropriate measures, and a thoughtful analytic approach. A real-world case description illustrates key steps: (1) define your research topic and question; (2) select a dataset; (3) get to know your dataset; and (4) structure your analysis and presentation of findings in a way that is clinically meaningful. Secondary dataset analysis is a well-established methodology. Secondary analysis is particularly valuable for junior investigators, who have limited time and resources to demonstrate expertise and productivity.
Decerns: A framework for multi-criteria decision analysis
Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...
2015-02-27
A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical problems in risk management is introduced. The Decerns framework contains a library of modules that are the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on the analysis of a multicriteria location problem.
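For readers unfamiliar with MCDA, the simplest method included in such libraries is a weighted sum over normalized criteria. The sketch below is a generic illustration with hypothetical weights and scores, not the Decerns implementation.

```python
# Generic weighted-sum MCDA scoring of alternatives on normalized criteria.
# Weights, scores, and the criteria themselves are hypothetical.
import numpy as np

criteria = ["cost", "risk", "benefit"]           # cost and risk: lower is better
weights = np.array([0.4, 0.3, 0.3])              # must sum to 1
scores = np.array([[0.2, 0.5, 0.9],              # alternative A
                   [0.7, 0.2, 0.6],              # alternative B
                   [0.4, 0.4, 0.8]])             # alternative C

# Flip "lower is better" criteria so that larger is always better
adjusted = scores.copy()
adjusted[:, :2] = 1.0 - adjusted[:, :2]

totals = adjusted @ weights
best = totals.argmax()
print("scores:", totals, "-> best alternative:", "ABC"[best])
```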
Computation of ancestry scores with mixed families and unrelated individuals.
Zhou, Yi-Hui; Marron, James S; Wright, Fred A
2018-03-01
The issue of robustness to family relationships in computing genotype ancestry scores such as eigenvector projections has received increased attention in genetic association, and is particularly challenging when sets of both unrelated individuals and closely related family members are included. The current standard is to compute loadings (left singular vectors) using unrelated individuals and to compute projected scores for remaining family members. However, projected ancestry scores from this approach suffer from shrinkage toward zero. We consider two main novel strategies: (i) matrix substitution based on decomposition of a target family-orthogonalized covariance matrix, and (ii) using family-averaged data to obtain loadings. We illustrate the performance via simulations, including resampling from 1000 Genomes Project data, and analysis of a cystic fibrosis dataset. The matrix substitution approach has similar performance to the current standard, but is simple and uses only a genotype covariance matrix, while the family-average method shows superior performance. Our approaches are accompanied by novel ancillary approaches that provide considerable insight, including individual-specific eigenvalue scree plots. © 2017 The Authors. Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
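The "current standard" the authors improve on can be sketched in a few lines: compute loadings from the unrelated individuals via SVD, then project the family members onto them. The genotypes below are simulated, and the shrinkage toward zero that motivates the paper is deliberately left uncorrected.

```python
# Sketch of the standard projection approach: SVD loadings from unrelated
# individuals, projection for related samples. Genotypes are simulated.
import numpy as np

rng = np.random.default_rng(4)
n_unrel, n_fam, n_snp = 100, 20, 500
G_unrel = rng.binomial(2, 0.3, size=(n_unrel, n_snp)).astype(float)
G_fam = rng.binomial(2, 0.3, size=(n_fam, n_snp)).astype(float)

# Center using allele frequencies estimated from the unrelated set
mean = G_unrel.mean(axis=0)
U, s, Vt = np.linalg.svd(G_unrel - mean, full_matrices=False)

k = 2
scores_unrel = U[:, :k] * s[:k]              # ancestry scores, unrelated individuals
scores_fam = (G_fam - mean) @ Vt[:k].T       # projected scores (shrunk toward zero)
print(scores_unrel[:3], scores_fam[:3], sep="\n")
```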
Review of Instructional Approaches in Ethics Education.
Mulhearn, Tyler J; Steele, Logan M; Watts, Logan L; Medeiros, Kelsey E; Mumford, Michael D; Connelly, Shane
2017-06-01
Increased investment in ethics education has prompted a variety of instructional objectives and frameworks. Yet, no systematic procedure to classify these varying instructional approaches has been attempted. In the present study, a quantitative clustering procedure was conducted to derive a typology of instruction in ethics education. In total, 330 ethics training programs were included in the cluster analysis. The training programs were appraised with respect to four instructional categories including instructional content, processes, delivery methods, and activities. Eight instructional approaches were identified through this clustering procedure, and these instructional approaches showed different levels of effectiveness. Instructional effectiveness was assessed based on one of nine commonly used ethics criteria. With respect to specific training types, Professional Decision Processes Training (d = 0.50) and Field-Specific Compliance Training (d = 0.46) appear to be viable approaches to ethics training based on Cohen's d effect size estimates. By contrast, two commonly used approaches, General Discussion Training (d = 0.31) and Norm Adherence Training (d = 0.37), were found to be considerably less effective. The implications for instruction in ethics training are discussed.
Theoretical orientations in environmental planning: An inquiry into alternative approaches
NASA Astrophysics Data System (ADS)
Briassoulis, Helen
1989-07-01
In the process of devising courses of action to resolve problems arising at the society-environment interface, a variety of planning approaches are followed, whose adoption is influenced by, among other things, the characteristics of environmental problems, the nature of the decision-making context, and the intellectual traditions of the disciplines contributing to the study of these problems. This article provides a systematic analysis of six alternative environmental planning approaches: comprehensive/rational, incremental, adaptive, contingency, advocacy, and participatory/consensual. The relative influence of the abovementioned factors is examined, the occurrence of these approaches in real-world situations is noted, and their environmental soundness and political realism are evaluated. Because of the disparity between plan formulation and implementation and between theoretical form and empirical reality, a synthetic view of environmental planning approaches is taken: approaches in action are identified, which characterize the totality of the planning process from problem definition to plan implementation, as well as approaches in the becoming, which may be on the horizon of environmental planning of tomorrow. The suggested future research directions include case studies to verify and detail the presence of the approaches discussed, the development of measures of success of a given approach in a given decision setting, and an intertemporal analysis of environmental planning approaches.
A Study on Technology Architecture and Serving Approaches of Electronic Government System
NASA Astrophysics Data System (ADS)
Liu, Chunnian; Huang, Yiyun; Pan, Qin
As E-government becomes a very active research area, many solutions to meet citizens' needs are being deployed. This paper presents a technology architecture for E-government systems and approaches to service delivery in public administrations. The proposed electronic system addresses the basic E-government requirements of user friendliness, security, interoperability, transparency, and effectiveness in the communication between small and medium-sized public organizations and their citizens, businesses, and other public organizations. The paper describes several service approaches for E-government, including SOA, web services, mobile E-government, and public libraries; each has its own characteristics and application scenarios. Still, a number of E-government issues remain for further research, including organization structure change, research methodology, and data collection and analysis.
On the equivalence of the RTI and SVM approaches to time correlated analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Croft, S.; Favalli, A.; Henzlova, D.
2014-11-21
Recently two papers on how to perform passive neutron auto-correlation analysis on time-gated histograms formed from pulse train data, generically called time correlation analysis (TCA), have appeared in this journal [1,2]. For those of us working in international nuclear safeguards these treatments are of particular interest because passive neutron multiplicity counting is a widely deployed technique for the quantification of plutonium. The purpose of this letter is to show that the skewness-variance-mean (SVM) approach developed in [1] is equivalent in terms of assay capability to the random trigger interval (RTI) analysis laid out in [2]. Mathematically we could also use other numerical ways to extract the time-correlated information from the histogram data, including, for example, what we might call the mean, mean-square, and mean-cube approach. The important feature, however, from the perspective of real-world applications, is that the correlated information extracted is the same, and it is subsequently interpreted in the same way based on the same underlying physics model.
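Both approaches start from the same raw statistics of the time-gated histogram; the sketch below merely computes the mean, variance, and third central moment of synthetic gate counts. How these moments map to correlated singles/doubles/triples rates follows the cited references and is not reproduced here.

```python
# Illustrative computation of the raw histogram statistics underlying the
# SVM approach: mean, variance, third central moment of gate counts.
# The gate-count data here are synthetic, not a measured pulse train.
import numpy as np

rng = np.random.default_rng(5)
gate_counts = rng.poisson(4.0, size=100_000)   # counts per time gate (synthetic)

mean = gate_counts.mean()
var = gate_counts.var()
m3 = ((gate_counts - mean) ** 3).mean()        # third central moment

print(f"mean={mean:.3f}  variance={var:.3f}  third central moment={m3:.3f}")
```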
Ethical analysis in HTA of complex health interventions.
Lysdahl, Kristin Bakke; Oortwijn, Wija; van der Wilt, Gert Jan; Refolo, Pietro; Sacchini, Dario; Mozygemba, Kati; Gerhardus, Ansgar; Brereton, Louise; Hofmann, Bjørn
2016-03-22
In the field of health technology assessment (HTA), there are several approaches that can be used for ethical analysis. However, there is a scarcity of literature that critically evaluates and compares the strengths and weaknesses of these approaches when they are applied in practice. In this paper, we analyse the applicability of some selected approaches for addressing ethical issues in HTA in the field of complex health interventions. Complex health interventions have been the focus of methodological attention in HTA; however, the potential methodological challenges for ethical analysis are as yet unknown. Six of the most frequently described and applied ethical approaches in HTA were critically assessed against a set of five characteristics of complex health interventions: multiple and changing perspectives, indeterminate phenomena, uncertain causality, unpredictable outcomes, and ethical complexity. The assessments are based on the literature and the authors' experiences of developing, applying and assessing the approaches. The Interactive, participatory HTA approach is, by its nature and flexibility, applicable across most complexity characteristics. Wide Reflective Equilibrium is also flexible, and its openness to different perspectives makes it better suited for complex health interventions than more rigid conventional approaches, such as Principlism and Casuistry. Approaches developed specifically for HTA, such as the HTA Core Model® and the Socratic approach, are fairly applicable to complex health interventions, as one might expect, because they include various ethical perspectives. This study shows how the applicability of the selected ethical approaches for addressing ethical issues in HTA of complex health interventions differs. Knowledge about these differences may be helpful when choosing and applying an approach for ethical analyses in HTA. We believe that the study contributes to increasing awareness of and interest in the ethical aspects of complex health interventions in general.
Historic range of variability for upland vegetation in the Medicine Bow National Forest, Wyoming
Gregory K. Dillon; Dennis H. Knight; Carolyn B. Meyer
2005-01-01
An approach for synthesizing the results of ecological research pertinent to land management is the analysis of the historic range of variability (HRV) for key ecosystem variables that are affected by management activities. This report provides an HRV analysis for the upland vegetation of the Medicine Bow National Forest in southeastern Wyoming. The variables include...
Historic range of variability for upland vegetation in the Bighorn National Forest, Wyoming
Carolyn B. Meyer; Dennis H. Knight; Gregory K. Dillon
2005-01-01
An approach for synthesizing the results of ecological research pertinent to land management is the analysis of the historic range of variability (HRV) for key ecosystem variables that are affected by management activities. This report provides an HRV analysis for the upland vegetation of the Bighorn National Forest in northcentral Wyoming. The variables include live...
ERIC Educational Resources Information Center
Whisman, Andy; Chapman, Don
2013-01-01
A statewide analysis was conducted on school disciplinary incidents reported during the 2012-2013 school year--the first full year under the revised Policy 4373. Findings from the analysis are provided to help inform districts and schools about what supports they may need to improve school climate, including more positive approaches to student…
A human factors methodology for real-time support applications
NASA Technical Reports Server (NTRS)
Murphy, E. D.; Vanbalen, P. M.; Mitchell, C. M.
1983-01-01
A general approach to the human factors (HF) analysis of new or existing projects at NASA/Goddard is delineated. Because the methodology evolved from HF evaluations of the Mission Planning Terminal (MPT) and the Earth Radiation Budget Satellite Mission Operations Room (ERBS MOR), it is directed specifically to the HF analysis of real-time support applications. Major topics included for discussion are the process of establishing a working relationship between the Human Factors Group (HFG) and the project, orientation of HF analysts to the project, human factors analysis and review, and coordination with major cycles of system development. Sub-topics include specific areas for analysis and appropriate HF tools. Management support functions are outlined. References provide a guide to sources of further information.
Computer-aided operations engineering with integrated models of systems and operations
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Ryan, Dan; Fleming, Land
1994-01-01
CONFIG 3 is a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operation of system designs. Integration and reuse of models are supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. Integration is supported among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. Support is provided for integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems. CONFIG supports abstracted qualitative and symbolic modeling for early conceptual design. System models are component structure models with operating modes, with embedded time-related behavior models. CONFIG supports failure modeling and modeling of state or configuration changes that result in dynamic changes in dependencies among components. Operations and procedure models are activity structure models that interact with system models. CONFIG is designed to support evaluation of system operability, diagnosability and fault tolerance, and analysis of the development of system effects of problems over time, including faults, failures, and procedural or environmental difficulties.
Neutron spectrometry for UF6 enrichment verification in storage cylinders
Mengesha, Wondwosen; Kiff, Scott D.
2015-01-29
Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used for safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5- and Geant4-simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). PCA is a well-established technique with a wide area of application, including feature analysis, outlier detection, and gamma-ray spectral analysis. The results obtained demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. The results also show that difficulties associated with the UF6 filling profile, observed in other unattended passive neutron measurements, can possibly be overcome using the approach presented.
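The PCA step itself is conventional; as a hedged illustration, the sketch below decomposes a set of synthetic spectra whose peak position drifts with a pretend enrichment parameter, standing in for the MCNP5/Geant4 simulations.

```python
# Sketch of PCA-based spectral feature analysis on synthetic spectra.
# Energy grid, spectral shapes, and the enrichment range are hypothetical.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
energy = np.linspace(0.0, 10.0, 128)            # MeV bins (hypothetical)
spectra = []
for enrich in np.linspace(0.7, 5.0, 40):        # pretend enrichment levels, wt%
    shape = np.exp(-(energy - 1.0 - 0.05 * enrich) ** 2)
    spectra.append(shape + 0.01 * rng.normal(size=energy.size))
spectra = np.array(spectra)

pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)
print("explained variance ratios:", pca.explained_variance_ratio_)
# The scores along the first component track the enrichment-driven shift.
```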
Multifidelity Analysis and Optimization for Supersonic Design
NASA Technical Reports Server (NTRS)
Kroo, Ilan; Willcox, Karen; March, Andrew; Haas, Alex; Rajnarayan, Dev; Kays, Cory
2010-01-01
Supersonic aircraft design is a computationally expensive optimization problem, and multifidelity approaches offer a significant opportunity to reduce design time and computational cost. This report presents tools developed to improve supersonic aircraft design capabilities, including: aerodynamic tools for supersonic aircraft configurations; a systematic way to manage model uncertainty; and multifidelity model management concepts that incorporate uncertainty. The aerodynamic analysis tools developed are appropriate for use in a multifidelity optimization framework and include four analysis routines to estimate the lift and drag of a supersonic airfoil, as well as a multifidelity supersonic drag code that estimates the drag of aircraft configurations with three different methods: an area rule method, a panel method, and an Euler solver. In addition, five multifidelity optimization methods are developed, which include local and global methods as well as gradient-based and gradient-free techniques.
NASA Technical Reports Server (NTRS)
Chwalowski, Pawel; Samareh, Jamshid A.; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.
2009-01-01
The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, the approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use the medium- to high-fidelity tools early in the design stages when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general-purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts. Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules, as well as examples for a telescoping wing, a folding wing, and a bat-like wing. The paper also includes the verification of a medium-fidelity aerodynamic tool used for the aerodynamic database generation with a steady and unsteady high-fidelity CFD analysis tool for a folding wing example.
Architectural Analysis of Dynamically Reconfigurable Systems
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly
2010-01-01
Topics include: the problem (increased flexibility of architectural styles decreases analyzability, behavior emerges and varies depending on the configuration, does the resulting system run according to the intended design, and architectural decisions can impede or facilitate testing); a top-down approach to architecture analysis, detection of defects and deviations, and architecture and its testability; currently targeted projects GMSEC and CFS; analyzing software architectures; analyzing runtime events; actual architecture recognition; GMPUB in Dynamic SAVE; sample output from the new approach; taking message timing delays into account; CFS examples of architecture and testability; some recommendations for improved testability; CFS examples of abstract interfaces and testability; and a CFS example of opening some internal details.
Numerical Analysis of Coolant Flow and Heat Transfer in ITER Diagnostic First Wall
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khodak, A.; Loesser, G.; Zhai, Y.
2015-07-24
We performed numerical simulations of the ITER Diagnostic First Wall (DFW) using ANSYS Workbench. During operation the DFW will include a solid main body as well as liquid coolant. Thus thermal and hydraulic analysis of the DFW was performed using a conjugated heat transfer approach, in which heat transfer was resolved in both the solid and liquid parts, and fluid dynamics analysis was performed simultaneously in the liquid part only. This approach includes the interface between the solid and liquid parts of the system. The analysis was performed using ANSYS CFX software, which allows solution of the heat transfer equations in the solid and liquid parts and solution of the flow equations in the liquid part. Coolant flow in the DFW was assumed turbulent and was resolved using the Reynolds-averaged Navier-Stokes equations with the Shear Stress Transport turbulence model. Meshing was performed using the CFX method available within ANSYS. The data cloud for thermal loading, consisting of volumetric heating and surface heating, was imported into CFX. The volumetric heating source was generated using Attila software; surface heating was obtained using a radiation heat transfer analysis. Our results allowed us to identify areas of excessive heating. Proposals for cooling-channel relocation were made, and additional suggestions were made to improve the hydraulic performance of the cooling system.
Wake Encounter Analysis for a Closely Spaced Parallel Runway Paired Approach Simulation
NASA Technical Reports Server (NTRS)
Mckissick,Burnell T.; Rico-Cusi, Fernando J.; Murdoch, Jennifer; Oseguera-Lohr, Rosa M.; Stough, Harry P, III; O'Connor, Cornelius J.; Syed, Hazari I.
2009-01-01
A Monte Carlo simulation of simultaneous approaches performed by two transport category aircraft from the final approach fix to a pair of closely spaced parallel runways was conducted to explore the aft boundary of the safe zone in which separation assurance and wake avoidance are provided. The simulation included variations in runway centerline separation, initial longitudinal spacing of the aircraft, crosswind speed, and aircraft speed during the approach. The data from the simulation showed that the majority of the wake encounters occurred near or over the runway and the aft boundaries of the safe zones were identified for all simulation conditions.
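The structure of such a study can be suggested with a purely schematic Monte Carlo loop: sample the varied parameters and count the cases that violate a geometric criterion. Everything below (the drift model, thresholds, and distributions) is invented for illustration and is not the study's simulation.

```python
# Purely schematic Monte Carlo loop in the spirit of the study: sample
# crosswind and trailing-aircraft spacing, then count cases where a toy
# drift model carries the wake across the adjacent approach path.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
runway_sep = 230.0                         # m, centerline separation (hypothetical)
crosswind = rng.normal(0.0, 5.0, n)        # m/s, toward the adjacent runway
spacing = rng.uniform(10.0, 40.0, n)       # s, trailing-aircraft arrival delay

drift = crosswind * spacing                # lateral wake drift before the trailer arrives
encounters = np.abs(drift) > runway_sep    # wake crosses the adjacent approach path
print("encounter probability:", encounters.mean())
```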
A Mixed-Fidelity Approach for Design of Low-Boom Supersonic Aircraft
NASA Technical Reports Server (NTRS)
Li, Wu; Shields, Elwood; Geiselhart, Karl A.
2010-01-01
This paper documents a mixed-fidelity approach for the design of low-boom supersonic aircraft and demonstrates it as a viable path to a practical low-boom supersonic configuration. A low-boom configuration that is based on low-fidelity analysis is used as the baseline. Tail lift is included to help tailor the aft portion of the ground signature. A comparison of low- and high-fidelity analysis results demonstrates the necessity of using computational fluid dynamics (CFD) analysis in a low-boom supersonic configuration design process. The fuselage shape is modified iteratively to obtain a configuration with a CFD equivalent-area distribution that matches a predetermined low-boom target distribution. The mixed-fidelity approach can easily refine the low-fidelity low-boom baseline into a low-boom configuration with the use of CFD equivalent-area analysis. The ground signature of the final configuration is calculated by using a state-of-the-art CFD-based boom analysis method that generates accurate midfield pressure distributions for propagation to the ground with ray tracing. The ground signature that is propagated from a midfield pressure distribution has a shaped ramp front, which is similar to the ground signature that is propagated from the CFD equivalent-area distribution. This result confirms the validity of the low-boom supersonic configuration design by matching a low-boom equivalent-area target, which is easier to accomplish than matching a low-boom midfield pressure target.
Approaches in Characterizing Genetic Structure and Mapping in a Rice Multiparental Population.
Raghavan, Chitra; Mauleon, Ramil; Lacorte, Vanica; Jubay, Monalisa; Zaw, Hein; Bonifacio, Justine; Singh, Rakesh Kumar; Huang, B Emma; Leung, Hei
2017-06-07
Multi-parent Advanced Generation Intercross (MAGIC) populations are fast becoming mainstream tools for research and breeding, along with the technology and tools for their analysis. This paper demonstrates the analysis of a rice MAGIC population, from data filtering through imputation and processing of genotypic data to characterization of genomic structure and, finally, quantitative trait locus (QTL) mapping. In this study, 1316 S6:8 indica MAGIC (MI) lines and the eight founders were sequenced using Genotyping by Sequencing (GBS). As the GBS approach often yields missing data, the first step was to impute the missing SNPs. The observable number of recombinations in the population was then explored. Based on this case study, a general outline of procedures for a MAGIC analysis workflow is provided, including QTL mapping of agronomic traits and of biotic and abiotic stress, using the results from both association and interval mapping approaches. QTL for agronomic traits (yield, flowering time, and plant height), physical (grain length and grain width) and cooking (amylose content) properties of the rice grain, abiotic stress (submergence tolerance), and biotic stress (brown spot disease) were mapped. Through presenting this extensive analysis of the MI population in rice, we highlight important considerations when choosing analytical approaches. The methods and results reported in this paper will provide a guide to future genetic analysis methods applied to multi-parent populations. Copyright © 2017 Raghavan et al.
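As a minimal illustration of the association-mapping step, the sketch below runs a toy single-marker scan: regress a simulated trait on each SNP and record p-values. It ignores population structure, imputation, and the interval-mapping machinery used in the paper; genotypes and the trait are simulated, not the MI-population data.

```python
# Toy single-marker association scan of the kind used for QTL mapping.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n_lines, n_snps = 300, 1000
geno = rng.binomial(2, 0.4, size=(n_lines, n_snps)).astype(float)
trait = 0.8 * geno[:, 500] + rng.normal(size=n_lines)   # one planted causal SNP

pvals = np.array([stats.linregress(geno[:, j], trait).pvalue
                  for j in range(n_snps)])
print("top hit:", pvals.argmin(), "p =", pvals.min())
```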
NASA Astrophysics Data System (ADS)
Bleier, T.; Heraud, J. A.; Dunson, J. C.
2015-12-01
QuakeFinder (QF) and its international collaborators have installed and currently maintain 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile and Sumatra. The data from these instruments are being analyzed for pre-quake signatures. This analysis consists of both private research by QuakeFinder and work with institutional collaborators (PUCP in Peru, NCU in Taiwan, PUCC in Chile, NOA in Greece, Syiah Kuala University in Indonesia, LASP at U. of Colo., Stanford, and USGS). Recently, NASA Hq. and QuakeFinder tried a new approach to help with the analysis of this huge (50+ TB) data archive. A collaboration with Appirio/TopCoder, Harvard University, Amazon, QuakeFinder, and NASA Hq. resulted in an open algorithm development contest called "Quest for Quakes", in which contestants (freelance algorithm developers) attempted to identify quakes from a subset of the QuakeFinder data (3 TB). The contest included a $25K prize pool and contained 100 cases where earthquakes (and null sets) included data from up to 5 remote sites, near and far from quakes greater than M4. These data sets were made available through Amazon.com to hundreds of contestants over a two-week contest period. In a more traditional approach, several new algorithms were tried by actively sharing the QF data with universities over a longer period. These algorithms included Principal Component Analysis (PCA) and deep neural networks in an effort to automatically identify earthquake signals within typical, noise-filled environments. This presentation examines the pros and cons of employing these two approaches, from both logistical and scientific perspectives.
Fukuda, Haruhisa; Shimizu, Sayuri; Ishizaki, Tatsuro
2015-01-01
Objectives To assess the value of organized care by comparing the clinical outcomes and healthcare expenditure between the conventional Japanese “integrated care across specialties within one hospital” mode of providing healthcare and the prospective approach of “organized care across separate facilities within a community”. Design Retrospective cohort study. Setting Two groups of hospitals were categorized according to healthcare delivery approach: the first group included 3 hospitals autonomously providing integrated care across specialties, and the second group included 4 acute care hospitals and 7 rehabilitative care hospitals providing organized care across separate facilities. Participants Patients aged 65 years and above who had undergone hip fracture surgery. Measurements Regression models adjusting for patient characteristics and clinical variables were used to investigate the impact of organized care on the improvement in patient mobility capability before and after hospitalization and the differences in healthcare resource utilization. Results The sample for analysis included 837 hip fracture surgery cases. The proportion of patients with either unchanged or improved mobility capability was not statistically associated with the healthcare delivery approach. Total adjusted mean healthcare expenditures for integrated care and organized care were US$28,360 (95% confidence interval: 27,787-28,972) and US$21,951 (21,511-22,420), respectively, an average reduction of US$6,409 under organized care. Conclusion Our cost-consequence analysis underscores the need to further investigate the actual contribution of organized care to the provision of efficient and high-quality healthcare. PMID:26208322
HELIOSEISMOLOGY OF PRE-EMERGING ACTIVE REGIONS. I. OVERVIEW, DATA, AND TARGET SELECTION CRITERIA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leka, K. D.; Barnes, G.; Birch, A. C.
2013-01-10
This first paper in a series describes the design of a study testing whether pre-appearance signatures of solar magnetic active regions were detectable using various tools of local helioseismology. The ultimate goal is to understand flux-emergence mechanisms by setting observational constraints on pre-appearance subsurface changes, for comparison with results from simulation efforts. This first paper provides details of the data selection and preparation of the samples, each containing over 100 members, of two populations: regions on the Sun that produced a numbered NOAA active region, and a 'control' sample of areas that did not. The seismology is performed on data from the GONG network; accompanying magnetic data from SOHO/MDI are used for co-temporal analysis of the surface magnetic field. Samples are drawn from 2001-2007, and each target is analyzed for 27.7 hr prior to an objectively determined time of emergence. The results of two analysis approaches are published separately: one based on averages of the seismology- and magnetic-derived signals over the samples, another based on Discriminant Analysis of these signals, for a statistical test of detectable differences between the two populations. We include here descriptions of a new potential-field calculation approach and the algorithm for matching sample distributions over multiple variables. We describe known sources of bias and the approaches used to mitigate them. We also describe unexpected bias sources uncovered during the course of the study and include a discussion of refinements that should be included in future work on this topic.
NASA Technical Reports Server (NTRS)
Thompson, E.
1979-01-01
A finite element computer code for the analysis of mantle convection is described. The coupled equations for creeping viscous flow and heat transfer can be solved for either a transient analysis or a steady-state analysis. For transient analyses, either a control volume or a control mass approach can be used. Non-Newtonian fluids with viscosities that have thermal and spatial dependencies can be easily incorporated. All material parameters may be written as function statements by the user or simply specified as constants. A wide range of boundary conditions, both for the thermal analysis and the viscous flow analysis, can be specified. For steady-state analyses, elastic strain rates can be included. Although this manual was specifically written for users interested in mantle convection, the code is equally well suited for analysis in a number of other areas, including metal forming, glacial flows, and creep of rock and soil.
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
Promoting Metacognition in Introductory Calculus-based Physics Labs
NASA Astrophysics Data System (ADS)
Grennell, Drew; Boudreaux, Andrew
2010-10-01
In the Western Washington University physics department, a project is underway to develop research-based laboratory curriculum for the introductory calculus-based course. Instructional goals include not only supporting students' conceptual understanding and reasoning ability, but also providing students with opportunities to engage in metacognition. For the latter, our approach has been to scaffold reflective thinking with guided questions. Specific instructional strategies include analysis of alternate reasoning presented in fictitious dialogues and comparison of students' initial ideas with their lab group's final, consensus understanding. Assessment of student metacognition includes pre- and post-course data from selected questions on the CLASS survey, analysis of written lab worksheets, and student opinion surveys. CLASS results are similar to those of a traditional physics course, and analysis of lab sheets shows that students struggle to engage in a metacognitive process. Future directions include video studies, as well as use of additional written assessments adapted from educational psychology.
Sebastiani, Paola; Zhao, Zhenming; Abad-Grau, Maria M; Riva, Alberto; Hartley, Stephen W; Sedgewick, Amanda E; Doria, Alessandro; Montano, Monty; Melista, Efthymia; Terry, Dellara; Perls, Thomas T; Steinberg, Martin H; Baldwin, Clinton T
2008-01-01
Background One of the challenges of the analysis of pooling-based genome wide association studies is to identify authentic associations among potentially thousands of false positive associations. Results We present a hierarchical and modular approach to the analysis of genome wide genotype data that incorporates quality control, linkage disequilibrium, physical distance and gene ontology to identify authentic associations among those found by statistical association tests. The method is developed for the allelic association analysis of pooled DNA samples, but it can be easily generalized to the analysis of individually genotyped samples. We evaluate the approach using data sets from diverse genome wide association studies including fetal hemoglobin levels in sickle cell anemia and a sample of centenarians and show that the approach is highly reproducible and allows for discovery at different levels of synthesis. Conclusion Results from the integration of Bayesian tests and other machine learning techniques with linkage disequilibrium data suggest that we do not need to use too stringent thresholds to reduce the number of false positive associations. This method yields increased power even with relatively small samples. In fact, our evaluation shows that the method can reach almost 70% sensitivity with samples of only 100 subjects. PMID:18194558
Overcoming the matched-sample bottleneck: an orthogonal approach to integrate omic data.
Nguyen, Tin; Diaz, Diana; Tagett, Rebecca; Draghici, Sorin
2016-07-12
MicroRNAs (miRNAs) are small non-coding RNA molecules whose primary function is to regulate the expression of gene products via hybridization to mRNA transcripts, resulting in suppression of translation or mRNA degradation. Although miRNAs have been implicated in complex diseases, including cancer, their impact on distinct biological pathways and phenotypes is largely unknown. Current integration approaches require sample-matched miRNA/mRNA datasets, resulting in limited applicability in practice. Since these approaches cannot integrate heterogeneous information available across independent experiments, they neither account for bias inherent in individual studies, nor do they benefit from increased sample size. Here we present a novel framework able to integrate miRNA and mRNA data (vertical data integration) available in independent studies (horizontal meta-analysis) allowing for a comprehensive analysis of the given phenotypes. To demonstrate the utility of our method, we conducted a meta-analysis of pancreatic and colorectal cancer, using 1,471 samples from 15 mRNA and 14 miRNA expression datasets. Our two-dimensional data integration approach greatly increases the power of statistical analysis and correctly identifies pathways known to be implicated in the phenotypes. The proposed framework is sufficiently general to integrate other types of data obtained from high-throughput assays.
Wang, Fang-Xu; Yuan, Jian-Chao; Kang, Li-Ping; Pang, Xu; Yan, Ren-Yi; Zhao, Yang; Zhang, Jie; Sun, Xin-Guang; Ma, Bai-Ping
2016-09-10
An ultra high-performance liquid chromatography quadrupole time-of-flight tandem mass spectrometry approach coupled with multivariate statistical analysis was established and applied to rapidly distinguish the chemical differences between the fibrous root and rhizome of Anemarrhena asphodeloides. The datasets of tR-m/z pairs, ion intensity and sample code were processed by principal component analysis and orthogonal partial least squares discriminant analysis. Chemical markers could be identified based on their exact mass data, fragmentation characteristics, and retention times. New compounds among the chemical markers could then be rapidly isolated under the guidance of the ultra high-performance liquid chromatography quadrupole time-of-flight tandem mass spectrometry, and their definitive structures further elucidated by NMR spectra. Using this approach, twenty-four markers were identified on-line, including nine new saponins, five of which (new steroidal saponins) were obtained in pure form. The study validated the proposed approach as a suitable method for identifying the chemical differences between various medicinal parts, in order to expand usable medicinal parts and increase the utilization rate of resources. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Zalameda, Joseph N.; Bolduc, Sean; Harman, Rebecca
2017-01-01
A composite fuselage aircraft forward section was inspected with flash thermography. The fuselage section is 24 feet long and approximately 8 feet in diameter. The structure is primarily a composite sandwich of carbon fiber face sheets with a Nomex™ honeycomb core. The outer surface area was inspected. The thermal data consisted of 477 data sets totaling over 227 gigabytes. Principal component analysis (PCA) was used to process the data sets for substructure and defect detection. A fixed-eigenvector approach using a global covariance matrix was used and compared to a varying-eigenvector approach. The fixed-eigenvector approach was demonstrated to be a practical analysis method for the detection and interpretation of various defects such as paint thickness variation, possible water intrusion damage, and delamination damage. In addition, inspection considerations are discussed, including coordinate system layout, manipulation of the fuselage section, and the manual scanning technique used for full coverage.
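As an illustration of the fixed-eigenvector idea, the sketch below builds one global covariance matrix from the per-pixel time profiles of several mock image stacks, extracts its leading eigenvectors once, and projects each stack onto them. It is a minimal reconstruction of the general technique under assumed array shapes and random data, not the authors' implementation.

```python
# Minimal sketch of fixed-eigenvector PCA for flash-thermography stacks.
# Assumptions (not from the paper): stacks are arrays of shape
# (n_frames, height, width) and share the same frame count.
import numpy as np

def global_eigenvectors(stacks, n_components=3):
    # Pool per-pixel time profiles: rows = pixels, columns = frames,
    # so the covariance is taken over the time dimension.
    profiles = np.concatenate(
        [s.reshape(s.shape[0], -1).T for s in stacks], axis=0)
    cov = np.cov(profiles, rowvar=False)          # (frames, frames)
    vals, vecs = np.linalg.eigh(cov)              # eigenvalues ascending
    return vecs[:, ::-1][:, :n_components]        # leading eigenvectors

def project(stack, eigvecs):
    # Project one stack onto the fixed eigenvectors -> component images.
    flat = stack.reshape(stack.shape[0], -1)      # (frames, pixels)
    centered = flat - flat.mean(axis=1, keepdims=True)
    scores = eigvecs.T @ centered                 # (components, pixels)
    return scores.reshape(-1, *stack.shape[1:])

rng = np.random.default_rng(0)
stacks = [rng.random((50, 32, 32)) for _ in range(3)]   # mock thermal data
component_images = project(stacks[0], global_eigenvectors(stacks))
print(component_images.shape)                     # (3, 32, 32)
```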
Evaluating disease management program effectiveness: an introduction to survival analysis.
Linden, Ariel; Adams, John L; Roberts, Nancy
2004-01-01
Currently, the most widely used method in the disease management industry for evaluating program effectiveness is the "total population approach." This model is a pretest-posttest design, whose most basic limitation is that, without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Survival analysis allows for the inclusion of data from censored cases: those subjects who either "survived" the program without experiencing the event (e.g., achievement of target clinical levels, hospitalization), left the program prematurely due to disenrollment from the health plan or program, or were lost to follow-up. Additionally, independent variables may be included in the model to help explain the variability in the outcome measure. In order to maximize the potential of this statistical method, the validity of the model and research design must be assured. This paper reviews survival analysis as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
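To make the censoring argument concrete, here is a minimal Kaplan-Meier sketch in which censored participants (disenrolled or lost to follow-up) still contribute to the at-risk count up to their exit time. The data and variable names are hypothetical, and this is a generic estimator rather than any specific disease-management evaluation tool.

```python
# Minimal Kaplan-Meier survival estimate with right-censoring.
# 'times' are follow-up durations; 'events' flags whether the endpoint
# (e.g. hospitalization) occurred (1) or the case was censored (0).
import numpy as np

def kaplan_meier(times, events):
    times, events = np.asarray(times, float), np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, curve = 1.0, []
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)          # censored cases count until t
        failures = np.sum((times == t) & (events == 1))
        surv *= 1.0 - failures / at_risk
        curve.append((float(t), surv))
    return curve

# Hypothetical program data: months of follow-up, 0 = censored (disenrolled).
print(kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 0, 1, 0]))
```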
Li, Yingxue; Hu, Yiying; Yang, Jingang; Li, Xiang; Liu, Haifeng; Xie, Guotong; Xu, Meilin; Hu, Jingyi; Yang, Yuejin
2017-01-01
Treatment effectiveness plays a fundamental role in patient therapies. In most observational studies, researchers design an analysis pipeline for a specific treatment based on the study cohort. To evaluate other treatments in the data set, much repeated and multifarious work, including cohort construction and statistical analysis, needs to be done. In addition, as treatments often have an intrinsic hierarchical relationship, many rational comparable treatment pairs can be derived from the original cohort data set as new treatment variables beyond the original single-treatment variable. In this paper, we propose an automatic treatment-effectiveness analysis approach to solve this problem. With our approach, clinicians can assess the effect of treatments not only more conveniently but also more thoroughly and comprehensively. We applied this method to a real-world case of estimating drug effectiveness on the Chinese Acute Myocardial Infarction (CAMI) data set, and some meaningful results were obtained for potential improvement of patient treatments.
Thermal Damage Analysis in Biological Tissues Under Optical Irradiation: Application to the Skin
NASA Astrophysics Data System (ADS)
Fanjul-Vélez, Félix; Ortega-Quijano, Noé; Solana-Quirós, José Ramón; Arce-Diego, José Luis
2009-07-01
The use of optical sources in medical practice is increasing nowadays. In this study, different approaches using thermo-optical principles that allow us to predict thermal damage in irradiated tissues are analyzed. Optical propagation is studied by means of the radiation transport theory (RTT) equation, solved via a Monte Carlo analysis. The data obtained are fed into a bio-heat equation, solved via a numerical finite-difference approach. Optothermal properties are taken into account for the model to be accurate and reliable. The thermal distribution is calculated as a function of the optical source parameters, mainly optical irradiance, wavelength and exposure time. Two thermal damage models, the cumulative equivalent minutes (CEM) 43 °C approach and the Arrhenius analysis, are used. The former is appropriate when dealing with dosimetry considerations at constant temperature. The latter is adequate for predicting thermal damage with arbitrary temperature time dependence. Both models are applied and compared for the particular application of skin thermotherapy irradiation.
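Both damage metrics have compact closed forms: CEM43 accumulates dt * R^(43 - T), with R = 0.5 for T >= 43 degC and R = 0.25 below, while the Arrhenius model integrates Omega = integral of A * exp(-Ea / (R_gas * T)) dt over the temperature history. The sketch below implements both under assumed parameter values; the A and Ea shown are the frequently cited Henriques skin coefficients, not values taken from this study.

```python
# Sketch of the CEM43 and Arrhenius damage metrics for a temperature
# history T(t) sampled at a fixed interval. Parameter values are
# illustrative assumptions; A and Ea are tissue-specific coefficients.
import numpy as np

def cem43(temps_c, dt_min):
    """Cumulative equivalent minutes at 43 degC: sum dt * R**(43 - T)."""
    temps_c = np.asarray(temps_c, float)
    R = np.where(temps_c >= 43.0, 0.5, 0.25)
    return float(np.sum(dt_min * R ** (43.0 - temps_c)))

def arrhenius_damage(temps_c, dt_s, A=3.1e98, Ea=6.28e5):
    """Damage integral Omega = int A*exp(-Ea/(Rgas*T)) dt, T in kelvin."""
    Rgas = 8.314                                   # J mol^-1 K^-1
    temps_k = np.asarray(temps_c, float) + 273.15
    return float(np.sum(dt_s * A * np.exp(-Ea / (Rgas * temps_k))))

t = np.linspace(0.0, 60.0, 601)                    # 60 s exposure, dt = 0.1 s
temps = 37.0 + 10.0 * np.exp(-((t - 30.0) / 12.0) ** 2)  # heating pulse
print(cem43(temps, dt_min=0.1 / 60.0), arrhenius_damage(temps, dt_s=0.1))
```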
A Variational Approach to the Analysis of Dissipative Electromechanical Systems
Allison, Andrew; Pearce, Charles E. M.; Abbott, Derek
2014-01-01
We develop a method for systematically constructing Lagrangian functions for dissipative mechanical, electrical, and electromechanical systems. We derive the equations of motion for some typical electromechanical systems using deterministic principles that are strictly variational. We do not use any ad hoc features that are added on after the analysis has been completed, such as the Rayleigh dissipation function. We generalise the concept of potential, and define generalised potentials for dissipative lumped system elements. Our innovation offers a unified approach to the analysis of electromechanical systems where there are energy and power terms in both the mechanical and electrical parts of the system. Using our novel technique, we can take advantage of the analytic approach from mechanics, and we can apply these powerful analytical methods to electrical and to electromechanical systems. We can analyse systems that include non-conservative forces. Our methodology is deterministic, does not require any special intuition, and is thus suitable for automation via a computer-based algebra package. PMID:24586221
Analysis Commons, A Team Approach to Discovery in a Big-Data Environment for Genetic Epidemiology
Brody, Jennifer A.; Morrison, Alanna C.; Bis, Joshua C.; O'Connell, Jeffrey R.; Brown, Michael R.; Huffman, Jennifer E.; Ames, Darren C.; Carroll, Andrew; Conomos, Matthew P.; Gabriel, Stacey; Gibbs, Richard A.; Gogarten, Stephanie M.; Gupta, Namrata; Jaquish, Cashell E.; Johnson, Andrew D.; Lewis, Joshua P.; Liu, Xiaoming; Manning, Alisa K.; Papanicolaou, George J.; Pitsillides, Achilleas N.; Rice, Kenneth M.; Salerno, William; Sitlani, Colleen M.; Smith, Nicholas L.; Heckbert, Susan R.; Laurie, Cathy C.; Mitchell, Braxton D.; Vasan, Ramachandran S.; Rich, Stephen S.; Rotter, Jerome I.; Wilson, James G.; Boerwinkle, Eric; Psaty, Bruce M.; Cupples, L. Adrienne
2017-01-01
The exploding volume of whole-genome sequence (WGS) and multi-omics data requires new approaches for analysis. As one solution, we have created a cloud-based Analysis Commons, which brings together genotype and phenotype data from multiple studies in a setting that is accessible by multiple investigators. This framework addresses many of the challenges of multi-center WGS analyses, including data sharing mechanisms, phenotype harmonization, integrated multi-omics analyses, annotation, and computational flexibility. In this setting, the computational pipeline facilitates a sequence-to-discovery analysis workflow, illustrated here by an analysis of plasma fibrinogen levels in 3996 individuals from the National Heart, Lung, and Blood Institute (NHLBI) Trans-Omics for Precision Medicine (TOPMed) WGS program. The Analysis Commons represents a novel model for transforming WGS resources from a massive quantity of phenotypic and genomic data into knowledge of the determinants of health and disease risk in diverse human populations. PMID:29074945
ERIC Educational Resources Information Center
Almond, Russell; Deane, Paul; Quinlan, Thomas; Wagner, Michael; Sydorenko, Tetyana
2012-01-01
The Fall 2007 and Spring 2008 pilot tests for the "CBAL"™ Writing assessment included experimental keystroke logging capabilities. This report documents the approaches used to capture the keystroke logs and the algorithms used to process the outputs. It also includes some preliminary findings based on the pilot data. In particular, it…
ERIC Educational Resources Information Center
Blake, Anthony; Francis, David
1973-01-01
Approaches to developing management ability include systematic techniques, mental enlargement, self-analysis, and job-related counseling. A method is proposed to integrate them into a responsive program involving depth understanding, vision of the future, specialization, commitment to change, and self-monitoring control. (MS)
The Evolution of Web Searching.
ERIC Educational Resources Information Center
Green, David
2000-01-01
Explores the interrelation between Web publishing and information retrieval technologies and lists new approaches to Web indexing and searching. Highlights include Web directories; search engines; portalisation; Internet service providers; browser providers; meta search engines; popularity based analysis; natural language searching; links-based…
Architectural Strategies for Enabling Data-Driven Science at Scale
NASA Astrophysics Data System (ADS)
Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.
2017-12-01
The analysis of large data collections from NASA or other agencies is often executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Alternatively, data are hauled to large computational environments that provide centralized data analysis via traditional High Performance Computing (HPC). Scientific data archives, however, are not only growing massive, but are also becoming highly distributed. Neither traditional approach provides a good solution for optimizing analysis into the future. Assumptions across the NASA mission and science data lifecycle, which historically assume that all data can be collected, transmitted, processed, and archived, will not scale as more capable instruments stress legacy-based systems. New paradigms are needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural and analytical choices are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections, from point of collection (e.g., onboard) to analysis and decision support. The most effective approach to analyzing a distributed set of massive data may involve some exploration and iteration, putting a premium on the flexibility afforded by the architectural framework. The framework should enable scientist users to assemble workflows efficiently, manage the uncertainties related to data analysis and inference, and optimize deep-dive analytics to enhance scalability. In many cases, this "data ecosystem" needs to be able to integrate multiple observing assets, ground environments, archives, and analytics, evolving from stewardship of measurements of data to using computational methodologies to better derive insight from the data that may be fused with other sets of data. This presentation will discuss architectural strategies, including a 2015-2016 NASA AIST Study on Big Data, for evolving scientific research towards massively distributed data-driven discovery. It will include example use cases across earth science, planetary science, and other disciplines.
Towards a systems approach for chronic diseases, based on health state modeling
Rebhan, Michael
2017-01-01
Rising pressure from chronic diseases means that we need to learn how to deal with challenges at a different level, including the use of systems approaches that better connect across fragments, such as disciplines, stakeholders, institutions, and technologies. By learning from progress in leading areas of health innovation (including oncology and AIDS), as well as complementary indications (Alzheimer’s disease), I try to extract the most enabling innovation paradigms, and discuss their extension to additional areas of application within a systems approach. To facilitate such work, a Precision, P4 or Systems Medicine platform is proposed, which is centered on the representation of health states that enable the definition of time in the vision to provide the right intervention for the right patient at the right time and dose. Modeling of such health states should allow iterative optimization, as longitudinal human data accumulate. This platform is designed to facilitate the discovery of links between opportunities related to a) the modernization of diagnosis, including the increased use of omics profiling, b) patient-centric approaches enabled by technology convergence, including digital health and connected devices, c) increasing understanding of the pathobiological, clinical and health economic aspects of disease progression stages, d) design of new interventions, including therapies as well as preventive measures, including sequential intervention approaches. Probabilistic Markov models of health states, e.g. those used for health economic analysis, are discussed as a simple starting point for the platform. A path towards extension into other indications, data types and uses is discussed, with a focus on regenerative medicine and relevant pathobiology. PMID:28529704
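As a toy version of the Markov health-state starting point mentioned above, the following sketch propagates a cohort through hypothetical states with an assumed annual transition matrix. All state names and probabilities are illustrative, not taken from the paper.

```python
# A minimal discrete-time Markov cohort model over hypothetical health
# states, in the spirit of the health-economic models cited as a starting
# point. States, probabilities and cycle length are invented for illustration.
import numpy as np

states = ["healthy", "early_disease", "late_disease", "dead"]
P = np.array([                       # row -> column transition per 1-year cycle
    [0.90, 0.08, 0.00, 0.02],
    [0.05, 0.75, 0.15, 0.05],
    [0.00, 0.00, 0.85, 0.15],
    [0.00, 0.00, 0.00, 1.00],
])
assert np.allclose(P.sum(axis=1), 1.0)   # rows must be probability vectors

cohort = np.array([1.0, 0.0, 0.0, 0.0])  # everyone starts healthy
for year in range(10):
    cohort = cohort @ P                  # propagate state occupancy
print(dict(zip(states, cohort.round(3))))
```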
Mateus, Octávio; Benson, Roger B.J.
2015-01-01
Diplodocidae are among the best known sauropod dinosaurs. Several species were described in the late 1800s or early 1900s from the Morrison Formation of North America. Since then, numerous additional specimens were recovered in the USA, Tanzania, Portugal, and Argentina, as well as possibly Spain, England, Georgia, Zimbabwe, and Asia. To date, the clade includes about 12 to 15 nominal species, some of them with questionable taxonomic status (e.g., ‘Diplodocus’ hayi or Dyslocosaurus polyonychius), and ranging in age from Late Jurassic to Early Cretaceous. However, intrageneric relationships of the iconic, multi-species genera Apatosaurus and Diplodocus are still poorly known. The way to resolve this issue is a specimen-based phylogenetic analysis, which has been previously implemented for Apatosaurus, but is here performed for the first time for the entire clade of Diplodocidae. The analysis includes 81 operational taxonomic units, 49 of which belong to Diplodocidae. The set of OTUs includes all name-bearing type specimens previously proposed to belong to Diplodocidae, alongside a set of relatively complete referred specimens, which increase the amount of anatomically overlapping material. Non-diplodocid outgroups were selected to test the affinities of potential diplodocid specimens that have subsequently been suggested to belong outside the clade. The specimens were scored for 477 morphological characters, representing one of the most extensive phylogenetic analyses of sauropod dinosaurs. Character states were figured and tables given in the case of numerical characters. The resulting cladogram recovers the classical arrangement of diplodocid relationships. Two numerical approaches were used to increase reproducibility in our taxonomic delimitation of species and genera. This resulted in the proposal that some species previously included in well-known genera like Apatosaurus and Diplodocus are generically distinct. Of particular note is that the famous genus Brontosaurus is considered valid by our quantitative approach. Furthermore, “Diplodocus” hayi represents a unique genus, which will herein be called Galeamopus gen. nov. On the other hand, these numerical approaches imply synonymization of “Dinheirosaurus” from the Late Jurassic of Portugal with the Morrison Formation genus Supersaurus. Our use of a specimen-, rather than species-based approach increases knowledge of intraspecific and intrageneric variation in diplodocids, and the study demonstrates how specimen-based phylogenetic analysis is a valuable tool in sauropod taxonomy, and potentially in paleontology and taxonomy as a whole. PMID:25870766
NASA Astrophysics Data System (ADS)
Carpenter, Matthew H.; Jernigan, J. G.
2007-05-01
We present examples of an analysis progression consisting of a synthesis of the Photon Clean Method (Carpenter, Jernigan, Brown, Beiersdorfer 2007) and bootstrap methods to quantify errors and variations in many-parameter models. The Photon Clean Method (PCM) works well for model spaces with large numbers of parameters proportional to the number of photons, therefore a Monte Carlo paradigm is a natural numerical approach. Consequently, PCM, an "inverse Monte-Carlo" method, requires a new approach for quantifying errors as compared to common analysis methods for fitting models of low dimensionality. This presentation will explore the methodology and presentation of analysis results derived from a variety of public data sets, including observations with XMM-Newton, Chandra, and other NASA missions. Special attention is given to the visualization of both data and models including dynamic interactive presentations. This work was performed under the auspices of the Department of Energy under contract No. W-7405-Eng-48. We thank Peter Beiersdorfer and Greg Brown for their support of this technical portion of a larger program related to science with the LLNL EBIT program.
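The bootstrap half of this progression can be sketched generically: resample the photon list with replacement, recompute the statistic of interest, and read error bars from the percentiles of the resampled statistics. The example below uses a stand-in statistic (the mean of mock photon energies) rather than a full PCM model fit.

```python
# Nonparametric bootstrap sketch for parameter uncertainty: refit a
# statistic to resampled data and report percentile intervals.
import numpy as np

def bootstrap_ci(data, statistic, n_boot=2000, alpha=0.05, seed=1):
    rng = np.random.default_rng(seed)
    stats = np.array([
        statistic(rng.choice(data, size=data.size, replace=True))
        for _ in range(n_boot)
    ])
    lo, hi = np.quantile(stats, [alpha / 2.0, 1.0 - alpha / 2.0])
    return statistic(data), (lo, hi)

# Mock photon energies in keV; a real analysis would use an event list.
photon_energies = np.random.default_rng(0).gamma(2.0, 1.5, size=500)
est, (lo, hi) = bootstrap_ci(photon_energies, np.mean)
print(f"mean = {est:.3f} keV, 95% CI = ({lo:.3f}, {hi:.3f})")
```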
Nanoscale Structure of Type I Collagen Fibrils: Quantitative Measurement of D-spacing
Erickson, Blake; Fang, Ming; Wallace, Joseph M.; Orr, Bradford G.; Les, Clifford M.; Holl, Mark M. Banaszak
2012-01-01
This paper details a quantitative method to measure the D-periodic spacing of Type I collagen fibrils using Atomic Force Microscopy coupled with analysis using a 2D Fast Fourier Transform approach. Instrument calibration, data sampling and data analysis are all discussed and comparisons of the data to the complementary methods of electron microscopy and X-ray scattering are made. Examples of the application of this new approach to the analysis of Type I collagen morphology in disease models of estrogen depletion and Osteogenesis Imperfecta are provided. We demonstrate that it is the D-spacing distribution, not the D-spacing mean, that showed statistically significant differences in estrogen depletion associated with early stage Osteoporosis and Osteogenesis Imperfecta. The ability to quantitatively characterize nanoscale morphological features of Type I collagen fibrils will provide important structural information regarding Type I collagen in many research areas, including tissue aging and disease, tissue engineering, and gene knock out studies. Furthermore, we also envision potential clinical applications including evaluation of tissue collagen integrity under the impact of diseases or drug treatments. PMID:23027700
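The core of the FFT measurement reduces to locating the spatial-frequency peak of the fibril banding and inverting it. Below is a minimal one-dimensional sketch with a synthetic 67 nm stripe pattern; the field of view and image size are assumptions, and the FFT bin spacing limits the recovered precision.

```python
# Sketch of D-spacing recovery by FFT: synthesize a fibril-like stripe
# pattern with 67 nm periodicity and read the spacing back from the peak
# spatial frequency. Real AFM data would replace the synthetic image.
import numpy as np

size_px, size_nm = 512, 2000.0               # field of view (assumed)
x = np.linspace(0.0, size_nm, size_px, endpoint=False)
d_true = 67.0                                # nm, canonical collagen D-period
row = 1.0 + 0.5 * np.sin(2.0 * np.pi * x / d_true)
image = np.tile(row, (size_px, 1))           # stripes along one axis

profile = image.mean(axis=0)                 # average rows, then 1D FFT
spectrum = np.abs(np.fft.rfft(profile))
freqs = np.fft.rfftfreq(size_px, d=size_nm / size_px)  # cycles per nm
peak = np.argmax(spectrum[1:]) + 1           # skip the DC component
print(f"measured D-spacing: {1.0 / freqs[peak]:.1f} nm")
```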
Ellrott, Kyle; Bailey, Matthew H; Saksena, Gordon; Covington, Kyle R; Kandoth, Cyriac; Stewart, Chip; Hess, Julian; Ma, Singer; Chiotti, Kami E; McLellan, Michael; Sofia, Heidi J; Hutter, Carolyn; Getz, Gad; Wheeler, David; Ding, Li
2018-03-28
The Cancer Genome Atlas (TCGA) cancer genomics dataset includes over 10,000 tumor-normal exome pairs across 33 different cancer types, in total >400 TB of raw data files requiring analysis. Here we describe the Multi-Center Mutation Calling in Multiple Cancers project, our effort to generate a comprehensive encyclopedia of somatic mutation calls for the TCGA data to enable robust cross-tumor-type analyses. Our approach accounts for variance and batch effects introduced by the rapid advancement of DNA extraction, hybridization-capture, sequencing, and analysis methods over time. We present best practices for applying an ensemble of seven mutation-calling algorithms with scoring and artifact filtering. The dataset created by this analysis includes 3.5 million somatic variants and forms the basis for PanCan Atlas papers. The results have been made available to the research community along with the methods used to generate them. This project is the result of collaboration from a number of institutes and demonstrates how team science drives extremely large genomics projects. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Van Belle, Vanya; Pelckmans, Kristiaan; Van Huffel, Sabine; Suykens, Johan A K
2011-10-01
To compare and evaluate ranking, regression and combined machine learning approaches for the analysis of survival data. The literature describes two approaches based on support vector machines to deal with censored observations. In the first approach the key idea is to rephrase the task as a ranking problem via the concordance index, a problem which can be solved efficiently in a context of structural risk minimization and convex optimization techniques. In the second approach, one uses a regression approach, dealing with censoring by means of inequality constraints. The goal of this paper is then twofold: (i) introducing a new model combining the ranking and regression strategy, which retains the link with existing survival models such as the proportional hazards model via transformation models; and (ii) comparing the three techniques on 6 clinical and 3 high-dimensional datasets and discussing their relevance relative to classical approaches for survival data. We compare SVM-based survival models based on ranking constraints, on regression constraints, and on both ranking and regression constraints. The performance of the models is compared by means of three different measures: (i) the concordance index, measuring the model's discriminating ability; (ii) the logrank test statistic, indicating whether patients with a prognostic index lower than the median prognostic index have a significantly different survival than patients with a prognostic index higher than the median; and (iii) the hazard ratio after normalization to restrict the prognostic index between 0 and 1. Our results indicate a significantly better performance for models including regression constraints over models based only on ranking constraints. This work gives empirical evidence that SVM-based models using regression constraints perform significantly better than SVM-based models based on ranking constraints. Our experiments show a comparable performance for methods including only regression or both regression and ranking constraints on clinical data. On high-dimensional data, the former model performs better. However, this approach does not have a theoretical link with standard statistical models for survival data. This link can be made by means of transformation models when ranking constraints are included. Copyright © 2011 Elsevier B.V. All rights reserved.
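Since all three model families are scored with the concordance index, a small sketch of Harrell's c-index may help. The survival times, event flags, and prognostic scores below are hypothetical.

```python
# Sketch of Harrell's concordance index for censored survival data: among
# comparable pairs, count how often the predicted risk orders the observed
# survival times correctly (ties in risk count as half).
def concordance_index(times, events, risk_scores):
    """Higher risk score should mean shorter survival time."""
    conc, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # pair is comparable if i failed and survived less long than j
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    conc += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    conc += 0.5
    return conc / comparable

times = [5, 8, 3, 12, 7]
events = [1, 0, 1, 1, 0]            # 0 = censored
scores = [2.1, 0.7, 3.0, 0.2, 1.0]  # hypothetical model output
print(concordance_index(times, events, scores))   # 1.0 = perfect ranking
```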
Cost-effectiveness analysis in minimally invasive spine surgery.
Al-Khouja, Lutfi T; Baron, Eli M; Johnson, J Patrick; Kim, Terrence T; Drazin, Doniel
2014-06-01
Medical care has been evolving with the increased influence of a value-based health care system. As a result, more emphasis is being placed on ensuring cost-effectiveness and utility in the services provided to patients. This study examines this development with respect to minimally invasive spine surgery (MISS) costs. A literature review using PubMed, the Cost-Effectiveness Analysis (CEA) Registry, and the National Health Service Economic Evaluation Database (NHS EED) was performed. Papers were included in the study if they reported costs associated with MISS; if there was no mention of cost, CEA, cost-utility analysis (CUA), quality-adjusted life year (QALY), quality, or outcomes, the article was excluded. Fourteen studies reporting costs associated with MISS in 12,425 patients (3675 undergoing minimally invasive procedures and 8750 undergoing open procedures) were identified through PubMed, the CEA Registry, and NHS EED. The percent cost difference between minimally invasive and open approaches ranged from 2.54% to 33.68%, all indicating cost savings with a minimally invasive surgical approach. Average length of stay (LOS) for minimally invasive surgery ranged from 0.93 days to 5.1 days, compared with 1.53 days to 12 days for an open approach. All studies reporting estimated blood loss (EBL) reported lower volume loss with an MISS approach (range 10-392.5 ml) than with an open approach (range 55-535.5 ml). An insufficient number of studies reporting the costs of MISS have been published to date. Of the studies published, none have followed a standardized method of reporting and analyzing cost data. Preliminary findings from the 14 studies showed both cost savings and better outcomes with MISS compared with an open approach. However, more Level I CEA/CUA studies, including cost/QALY evaluations with specifics of the techniques utilized, need to be reported in a standardized manner to allow more accurate conclusions on the cost-effectiveness of minimally invasive spine surgery.
Error analysis of satellite attitude determination using a vision-based approach
NASA Astrophysics Data System (ADS)
Carozza, Ludovico; Bevilacqua, Alessandro
2013-09-01
Improvements in communication and processing technologies have opened the doors to exploiting on-board cameras to compute an object's spatial attitude using only the visual information from sequences of remotely sensed images. The strategies and the algorithmic approach used to extract such information affect the estimation accuracy of the three-axis orientation of the object. This work presents a method for analyzing the most relevant error sources, including numerical ones, possible drift effects, and their influence on the overall accuracy, with reference to vision-based approaches. The method focuses in particular on the analysis of the image registration algorithm, carried out through purpose-built simulations. The overall accuracy has been assessed on a challenging case study, for which accuracy represents the fundamental requirement. In particular, attitude determination has been analyzed for small satellites, by comparing theoretical findings to metric results from simulations on realistic ground-truth data. Significant laboratory experiments, using a numerical control unit, have further confirmed the outcome. We believe that our analysis approach, as well as our findings in terms of error characterization, can be useful at proof-of-concept design and planning levels, since they emphasize the main sources of error for vision-based approaches employed for satellite attitude estimation. Nevertheless, the approach we present is also of general interest for all related application domains that require an accurate estimation of three-dimensional orientation parameters (i.e., robotics, airborne stabilization).
Conducting systematic reviews of association (etiology): The Joanna Briggs Institute's approach.
Moola, Sandeep; Munn, Zachary; Sears, Kim; Sfetcu, Raluca; Currie, Marian; Lisy, Karolina; Tufanaru, Catalin; Qureshi, Rubab; Mattis, Patrick; Mu, Peifan
2015-09-01
The systematic review of evidence is the research method which underpins the traditional approach to evidence-based healthcare. There is currently no uniform methodology for conducting a systematic review of association (etiology). This study outlines and describes the Joanna Briggs Institute's approach and guidance for synthesizing evidence related to association with a predominant focus on etiology and contributes to the emerging field of systematic review methodologies. It should be noted that questions of association typically address etiological or prognostic issues. The systematic review of studies to answer questions of etiology follows the same basic principles of systematic review of other types of data. An a priori protocol must inform the conduct of the systematic review, comprehensive searching must be performed and critical appraisal of retrieved studies must be carried out. The overarching objective of systematic reviews of etiology is to identify and synthesize the best available evidence on the factors of interest that are associated with a particular disease or outcome. The traditional PICO (population, interventions, comparators and outcomes) format for systematic reviews of effects does not align with questions relating to etiology. A systematic review of etiology should include the following aspects: population, exposure of interest (independent variable) and outcome (dependent variable). Studies of etiology are predominantly explanatory or predictive. The objective of reviews of explanatory or predictive studies is to contribute to, and improve our understanding of, the relationship of health-related events or outcomes by examining the association between variables. When interpreting possible associations between variables based on observational study data, caution must be exercised due to the likely presence of confounding variables or moderators that may impact on the results. As with all systematic reviews, there are various approaches to present the results, including a narrative, graphical or tabular summary, or meta-analysis. When meta-analysis is not possible, a set of alternative methods for synthesizing research is available. On the basis of the research question and objectives, narrative, tabular and/or visual approaches can be used for data synthesis. There are some special considerations when conducting meta-analysis for questions related to risk and correlation. These include, but are not limited to, causal inference. Systematic review and meta-analysis of studies related to etiology is an emerging methodology in the field of evidence synthesis. These reviews can provide useful information for healthcare professionals and policymakers on the burden of disease. The standardized Joanna Briggs Institute approach offers a rigorous and transparent method to conduct reviews of etiology.
Application of Chimera Grid Scheme to Combustor Flowfields at all Speeds
NASA Technical Reports Server (NTRS)
Yungster, Shaye; Chen, Kuo-Huey
1997-01-01
A CFD method for solving combustor flowfields at all speeds on complex configurations is presented. The approach is based on the ALLSPD-3D code which uses the compressible formulation of the flow equations including real gas effects, nonequilibrium chemistry and spray combustion. To facilitate the analysis of complex geometries, the chimera grid method is utilized. To the best of our knowledge, this is the first application of the chimera scheme to reacting flows. In order to evaluate the effectiveness of this numerical approach, several benchmark calculations of subsonic flows are presented. These include steady and unsteady flows, and bluff-body stabilized spray and premixed combustion flames.
Correlational approach to study interactions between dust Brownian particles in a plasma
NASA Astrophysics Data System (ADS)
Lisin, E. A.; Vaulina, O. S.; Petrov, O. F.
2018-01-01
A general approach to the correlational analysis of Brownian motion of strongly coupled particles in open dissipative systems is described. This approach can be applied to the theoretical description of various non-ideal statistically equilibrium systems (including non-Hamiltonian systems), as well as to the analysis of experimental data. In this paper, we consider an application of the correlational approach to the problem of experimentally exploring the wake-mediated nonreciprocal interactions in complex plasmas. We derive simple analytic equations which allow one to calculate the gradients of the forces acting on a microparticle due to each of the other particles, as well as the gradients of the external field, knowing only the time-averaged correlations of particle displacements and velocities. We show the importance of taking dissipative and random processes into account, without which treating a system with nonreciprocal interparticle interactions as linearly coupled oscillators leads to significant errors in determining the characteristic frequencies of the system. In numerical simulation examples, we demonstrate that the proposed approach could be an effective instrument for exploring the longitudinal wake structure of a microparticle in a plasma. Unlike previous attempts to study wake-mediated interactions in complex plasmas, our method does not require any external perturbations and is based on Brownian motion analysis only.
Catallo, Cristina; Jack, Susan M.; Ciliska, Donna; MacMillan, Harriet L.
2013-01-01
Little is known about how to systematically integrate complex qualitative studies within the context of randomized controlled trials (RCTs). A two-phase sequential explanatory mixed methods study was conducted in Canada to understand how women decide to disclose intimate partner violence in emergency department settings. Mixing an RCT (with a subanalysis of data) with a grounded theory approach required methodological modifications to maintain the overall rigour of this mixed methods study. Modifications were made to the following areas of the grounded theory approach to support the overall integrity of the mixed methods study design: recruitment of participants, maximum variation and negative case sampling, data collection, and analysis methods. Recommendations for future studies include: (1) planning at the outset to incorporate a qualitative approach with an RCT and to determine logical points during the RCT to integrate the qualitative component; and (2) consideration of the time needed to carry out an RCT and a grounded theory approach, especially to support recruitment, data collection, and analysis. Data mixing strategies should be considered during the early stages of the study, so that appropriate measures can be developed and used in the RCT to support the initial coding structures and data analysis needs of the grounded theory phase. PMID:23577245
ERIC Educational Resources Information Center
Tuma, Nancy Brandon; Hannan, Michael T.
The document, part of a series of chapters described in SO 011 759, considers the problem of censoring in the analysis of event-histories (data on dated events, including dates of change from one qualitative state to another). Censoring refers to the lack of information on events that occur before or after the period for which data are available.…
Preliminary design package for prototype solar heating system
NASA Technical Reports Server (NTRS)
1978-01-01
A summary is given of the preliminary analysis and design activity on solar heating systems. The analysis was made without site specific data other than weather; therefore, the results indicate performance expected under these special conditions. Major items include system candidates, design approaches, trade studies and other special data required to evaluate the preliminary analysis and design. The program calls for the development and delivery of eight prototype solar heating and cooling systems for installation and operational test.
Fault Tree Analysis: A Bibliography
NASA Technical Reports Server (NTRS)
2000-01-01
Fault tree analysis is a top-down approach to the identification of process hazards. It is one of the best methods for systematically identifying and graphically displaying the many ways something can go wrong. This bibliography references 266 documents in the NASA STI Database that contain the major concepts, fault tree analysis and risk and probability theory, in the basic index or as major subject terms. An abstract is included with most citations, followed by the applicable subject terms.
Paleomagnetism.org: An online multi-platform open source environment for paleomagnetic data analysis
NASA Astrophysics Data System (ADS)
Koymans, Mathijs R.; Langereis, Cor G.; Pastor-Galán, Daniel; van Hinsbergen, Douwe J. J.
2016-08-01
This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported. The Paleomagnetism.org application is split into an interpretation portal, a statistics portal, and a portal for miscellaneous paleomagnetic tools. In the interpretation portal, principal component analysis can be performed on visualized demagnetization diagrams. Interpreted directions and great circles can be combined to find great circle solutions. These directions can be used in the statistics portal, or exported as data and figures. The tools in the statistics portal cover standard Fisher statistics for directions and VGPs, including other statistical parameters used as reliability criteria. Other available tools include an eigenvector-approach foldtest, two reversal tests including a Monte Carlo simulation on mean directions, and a coordinate bootstrap on the original data. An implementation is included for the detection and correction of inclination shallowing in sediments following TK03.GAD. Finally, we provide a module to visualize VGPs and expected paleolatitudes, declinations, and inclinations relative to widely used global apparent polar wander path models in coordinates of the major continent-bearing plates. The tools in the miscellaneous portal include a net tectonic rotation (NTR) analysis to restore a body to its paleo-vertical and a bootstrapped oroclinal test using linear regression techniques, including a modified foldtest around a vertical axis. Paleomagnetism.org provides an integrated approach for researchers to work with visualized (e.g., hemisphere projections, Zijderveld diagrams) paleomagnetic data. The application constructs a custom exportable file that can be shared freely and included in public databases. This exported file contains all data and can later be imported to the application by other researchers. The accessibility and simplicity with which paleomagnetic data can be interpreted, analyzed, visualized, and shared makes Paleomagnetism.org of interest to the community.
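The Fisher statistics underlying the statistics portal follow a standard recipe: sum the direction unit vectors, then derive the precision parameter k = (N - 1) / (N - R) and the alpha95 confidence cone from the resultant length R. Below is a minimal sketch with illustrative directions; it is a generic implementation of Fisher (1953) statistics, not an excerpt of the Paleomagnetism.org source.

```python
# Sketch of standard Fisher statistics for paleomagnetic mean directions:
# convert declination/inclination pairs to unit vectors, compute the
# resultant, the precision parameter k, and the alpha95 confidence cone.
import numpy as np

def fisher_mean(decs_deg, incs_deg, p=0.05):
    d, i = np.radians(decs_deg), np.radians(incs_deg)
    xyz = np.column_stack((np.cos(i) * np.cos(d),
                           np.cos(i) * np.sin(d),
                           np.sin(i)))
    r_vec = xyz.sum(axis=0)
    R, N = np.linalg.norm(r_vec), len(d)
    mean_dec = np.degrees(np.arctan2(r_vec[1], r_vec[0])) % 360.0
    mean_inc = np.degrees(np.arcsin(r_vec[2] / R))
    k = (N - 1) / (N - R)                        # precision parameter
    cos_a = 1.0 - ((N - R) / R) * ((1.0 / p) ** (1.0 / (N - 1)) - 1.0)
    return mean_dec, mean_inc, k, np.degrees(np.arccos(cos_a))

decs = [357.0, 3.0, 5.0, 352.0, 1.0, 8.0]        # illustrative directions
incs = [48.0, 52.0, 45.0, 50.0, 55.0, 47.0]
print(fisher_mean(decs, incs))                   # dec, inc, k, alpha95
```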
Schoer, Karl; Wood, Richard; Arto, Iñaki; Weinzettel, Jan
2013-12-17
The mass of material consumed by a population has become a useful proxy for measuring environmental pressure. The "raw material equivalents" (RME) metric of material consumption addresses the issue of including the full supply chain (including imports) when calculating national or product-level material impacts. The RME calculation suffers from data availability issues, however, as quantitative data on production practices along the full supply chain (in different regions) are required. Hence, the RME is currently estimated by three main approaches: (1) assuming domestic technology in foreign economies, (2) utilizing region-specific life-cycle inventories (in a hybrid framework), and (3) utilizing multi-regional input-output (MRIO) analysis to explicitly cover all regions of the supply chain. While the first approach has been shown to give inaccurate results, this paper focuses on the benefits and costs of the latter two approaches. We analyze results from two key (MRIO and hybrid) projects modeling raw material equivalents, adjusting the models in a stepwise manner in order to quantify the effects of individual conceptual elements. We attempt to isolate the MRIO gap, which denotes the quantitative impact of calculating the RME of imports by an MRIO approach instead of the hybrid model, focusing on the RME of EU external trade imports. While the models give quantitatively similar results, differences become more pronounced when tracking more detailed material flows. We assess the advantages and disadvantages of the two approaches and look forward to ways to further harmonize data and approaches.
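The MRIO side of an RME calculation reduces to the Leontief identity: total embodied material is e = c (I - A)^-1 y, where A holds inter-regional technical coefficients, c material-extraction intensities, and y final demand. A toy two-region, two-sector sketch with invented numbers follows; it illustrates the general accounting logic, not either project's model.

```python
# Toy multi-regional input-output sketch of the RME logic. A is the
# inter-regional technical coefficient matrix (2 regions x 2 sectors),
# c holds extraction intensities (kg per unit output), y is final demand.
# All numbers are illustrative.
import numpy as np

A = np.array([
    [0.10, 0.05, 0.02, 0.00],
    [0.20, 0.15, 0.01, 0.03],
    [0.03, 0.00, 0.12, 0.07],
    [0.00, 0.04, 0.25, 0.10],
])
c = np.array([4.0, 0.5, 3.0, 0.6])      # extraction intensity per sector
y = np.array([100.0, 80.0, 0.0, 0.0])   # final demand in region 1 only

L = np.linalg.inv(np.eye(4) - A)        # Leontief inverse
x = L @ y                               # total output required everywhere
print("raw material equivalents:", float(c @ x))
```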
Asquith, William H.; Slade, R.M.
1999-01-01
The U.S. Geological Survey, in cooperation with the Texas Department of Transportation, has developed a computer program to estimate peak-streamflow frequency for ungaged sites in natural basins in Texas. Peak-streamflow frequency refers to the peak streamflows for recurrence intervals of 2, 5, 10, 25, 50, and 100 years. Peak-streamflow frequency estimates are needed by planners, managers, and design engineers for flood-plain management; for objective assessment of flood risk; for cost-effective design of roads and bridges; and for the design of culverts, dams, levees, and other flood-control structures. The program estimates peak-streamflow frequency using a site-specific approach and multivariate generalized least-squares linear regression. A site-specific approach differs from a traditional regional regression approach by developing unique equations to estimate peak-streamflow frequency specifically for the ungaged site. The stations included in the regression are selected using an informal cluster analysis that compares the basin characteristics of the ungaged site to the basin characteristics of all the stations in the data base. The program provides several choices for selecting the stations. Selecting the stations using cluster analysis ensures that the stations included in the regression have the most pertinent information about the flooding characteristics of the ungaged site and therefore provide the basis for potentially improved peak-streamflow frequency estimation. An evaluation of the site-specific approach in estimating peak-streamflow frequency for gaged sites indicates that the site-specific approach is at least as accurate as a traditional regional regression approach.
Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment
NASA Astrophysics Data System (ADS)
Taner, M. U.; Wi, S.; Brown, C.
2017-12-01
The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty (including internal climate variability and anthropogenic climate change), such as scenario-based and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response, and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic, and water management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.
Black, Nicola; Mullan, Barbara; Sharpe, Louise
2016-09-01
The current aim was to examine the effectiveness of behaviour change techniques (BCTs), theory and other characteristics in increasing the effectiveness of computer-delivered interventions (CDIs) to reduce alcohol consumption. Included were randomised studies with a primary aim of reducing alcohol consumption, which compared self-directed CDIs to assessment-only control groups. CDIs were coded for the use of 42 BCTs from an alcohol-specific taxonomy, the use of theory according to a theory coding scheme and general characteristics such as length of the CDI. Effectiveness of CDIs was assessed using random-effects meta-analysis and the association between the moderators and effect size was assessed using univariate and multivariate meta-regression. Ninety-three CDIs were included in at least one analysis and produced small, significant effects on five outcomes (d+ = 0.07-0.15). Larger effects occurred with some personal contact, provision of normative information or feedback on performance, prompting commitment or goal review, the social norms approach and in samples with more women. Smaller effects occurred when information on the consequences of alcohol consumption was provided. These findings can be used to inform both intervention- and theory-development. Intervention developers should focus on, including specific, effective techniques, rather than many techniques or more-elaborate approaches.
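The random-effects pooling behind such an analysis can be sketched with the DerSimonian-Laird estimator: compute heterogeneity from the fixed-effect fit, add the between-study variance tau^2 to each study's variance, and re-pool. The effect sizes and variances below are invented for illustration, not taken from the review.

```python
# Sketch of DerSimonian-Laird random-effects pooling of standardized mean
# differences (d) with their within-study variances.
import numpy as np

def random_effects(d, v):
    d, v = np.asarray(d, float), np.asarray(v, float)
    w = 1.0 / v                                  # fixed-effect weights
    d_fixed = np.sum(w * d) / np.sum(w)
    Q = np.sum(w * (d - d_fixed) ** 2)           # heterogeneity statistic
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(d) - 1)) / C)      # between-study variance
    w_star = 1.0 / (v + tau2)                    # random-effects weights
    d_pooled = np.sum(w_star * d) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return d_pooled, (d_pooled - 1.96 * se, d_pooled + 1.96 * se)

effects = [0.12, 0.05, 0.22, 0.08, 0.15]         # per-study d values (mock)
variances = [0.010, 0.008, 0.020, 0.005, 0.015]
print(random_effects(effects, variances))        # pooled d and 95% CI
```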
3D Simulation of External Flooding Events for the RISMC Pathway
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad
2015-09-01
Incorporating 3D simulations as part of the Risk-Informed Safety Margin Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have recently become more important; these can be analyzed with existing and validated physics simulation toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial/visual aspect to the design, improves the realism of results, and can provide visual understanding to validate the flooding analysis.
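The essence of Smoothed Particle Hydrodynamics is that fluid fields are kernel-weighted sums over neighbouring particles. The sketch below shows only the density estimate with a standard 2D cubic-spline kernel, as a generic illustration rather than the toolkit's implementation; particle counts, smoothing length, and positions are assumed.

```python
# Minimal SPH density estimate: rho_i = sum_j m_j W(|r_i - r_j|, h),
# using the standard 2D cubic-spline kernel with support radius 2h.
import numpy as np

def cubic_spline_kernel(r, h):
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h ** 2)        # 2D normalization
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
         np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
    return sigma * w

def density(positions, masses, h):
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diff, axis=-1)            # pairwise distances
    return (masses[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)

rng = np.random.default_rng(2)
pos = rng.random((200, 2))                       # particles in a unit box
m = np.full(200, 1.0 / 200)                      # equal masses, total mass 1
print(density(pos, m, h=0.1)[:5])                # densities near 1 inside box
```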
NASA Technical Reports Server (NTRS)
Glenn, G. M.
1977-01-01
A preflight analysis of the ALT separation reference trajectories for the tailcone on, forward, and aft cg orbiter configurations is documented. The ALT separation reference trajectories encompass the time from physical separation of the orbiter from the carrier to orbiter attainment of the maximum ALT interface airspeed. The trajectories include post separation roll maneuvers by both vehicles and are generated using the final preflight data base. The trajectories so generated satisfy all known separation design criteria and violate no known constraints. The requirement for this analysis is given along with the specifications, assumptions, and analytical approach used to generate the separation trajectories. The results of the analytical approach are evaluated, and conclusions and recommendations are summarized.
Near-infrared photon time-of-flight spectroscopy of turbid materials up to 1400 nm
NASA Astrophysics Data System (ADS)
Svensson, Tomas; Alerstam, Erik; Khoptyar, Dmitry; Johansson, Jonas; Folestad, Staffan; Andersson-Engels, Stefan
2009-06-01
Photon time-of-flight spectroscopy (PTOFS) is a powerful tool for analysis of turbid materials. We have constructed a time-of-flight spectrometer based on a supercontinuum fiber laser, acousto-optical tunable filtering, and an InP/InGaAsP microchannel plate photomultiplier tube. The system is capable of performing PTOFS up to 1400 nm, and thus covers an important region for vibrational spectroscopy of solid samples. The development significantly increases the applicability of PTOFS for analysis of chemical content and physical properties of turbid media. The great value of the proposed approach is illustrated by revealing the distinct absorption features of turbid epoxy resin. Promising future applications of the approach are discussed, including quantitative assessment of pharmaceuticals, powder analysis, and calibration-free near-infrared spectroscopy.
Evaluation of various modelling approaches in flood routing simulation and flood area mapping
NASA Astrophysics Data System (ADS)
Papaioannou, George; Loukas, Athanasios; Vasiliades, Lampros; Aronica, Giuseppe
2016-04-01
An essential process of flood hazard analysis and mapping is floodplain modelling. The selection of the modelling approach, especially in complex riverine topographies such as urban and suburban areas and in ungauged watersheds, may affect the accuracy of the outcomes in terms of flood depths and flood inundation area. In this study, a sensitivity analysis was implemented using several hydraulic-hydrodynamic modelling approaches (1D, 2D, 1D/2D), and the effect of the modelling approach on flood modelling and flood mapping was investigated. The digital terrain model (DTM) used in this study was generated from Terrestrial Laser Scanning (TLS) point cloud data. The modelling approaches included 1-dimensional hydraulic-hydrodynamic models (1D), 2-dimensional hydraulic-hydrodynamic models (2D) and coupled 1D/2D models. The 1D models used were HECRAS, MIKE11, LISFLOOD and XPSTORM. The 2D models used were MIKE21, MIKE21FM, HECRAS (2D), XPSTORM, LISFLOOD and FLO2d. The coupled 1D/2D models employed were HECRAS (1D/2D), MIKE11/MIKE21 (MIKE FLOOD platform), MIKE11/MIKE21 FM (MIKE FLOOD platform) and XPSTORM (1D/2D). Flood extent was validated using 2x2 contingency tables between the simulated and the observed flooded area for an extreme historical flash flood event, with the Critical Success Index used as the skill score. The modelling approaches were also evaluated for simulation time and required computing power. The methodology was implemented in a suburban ungauged watershed of the Xerias river at Volos, Greece. The results of the analysis indicate the necessity of applying sensitivity analysis with different hydraulic-hydrodynamic modelling approaches, especially for areas with complex terrain.
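For concreteness, the Critical Success Index used in the validation step is computed from the 2x2 contingency table as CSI = hits / (hits + misses + false alarms); a minimal Python sketch with synthetic flood rasters:

```python
import numpy as np

def critical_success_index(simulated, observed):
    """Critical Success Index (threat score) over flooded/non-flooded cells:
    CSI = hits / (hits + misses + false alarms)."""
    sim, obs = simulated.astype(bool), observed.astype(bool)
    hits = np.sum(sim & obs)           # flooded in both model and observation
    false_alarms = np.sum(sim & ~obs)  # flooded in model only
    misses = np.sum(~sim & obs)        # flooded in observation only
    return hits / (hits + misses + false_alarms)

# Hypothetical 100x100 rasters of observed and simulated flood extent
rng = np.random.default_rng(0)
obs = rng.random((100, 100)) < 0.3
sim = obs ^ (rng.random((100, 100)) < 0.1)   # model agrees ~90% of the time
print(round(critical_success_index(sim, obs), 3))
```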
ERIC Educational Resources Information Center
Neman, Robert Lynn
This study was designed to assess the effects of the problem-oriented method compared to those of the traditional approach in general chemistry at the college level. The problem-oriented course included topics such as air and water pollution, drug addiction and analysis, tetraethyl-lead additives, insecticides in the environment, and recycling of…
Remote sensing based approach for monitoring urban growth in Mexico city, Mexico: A case study
NASA Astrophysics Data System (ADS)
Obade, Vincent
The world is experiencing a rapid rate of urban expansion, driven largely by population growth. Other factors supporting urban growth include improved efficiency in the transportation sector and increasing dependence on cars as a means of transport. The problems attributed to urban growth include depletion of energy resources, water and air pollution, loss of landscapes and wildlife, loss of agricultural land, inadequate social security, and lack of employment or underemployment. Aerial photography is one of the popular techniques for analyzing, planning and minimizing urbanization-related problems. However, with advances in space technology, satellite remote sensing is increasingly being utilized in the analysis and planning of the urban environment. This article outlines the strengths and limitations of potential remote sensing techniques for monitoring urban growth. The selected methods include principal component analysis, maximum likelihood classification and the decision tree. The results indicate that the decision-tree approach is the most promising for monitoring urban change, given its improved accuracy and smooth transition between the various land cover classes.
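As a hypothetical sketch of the decision-tree classification step described above (the article's actual training data and spectral bands are not given), the following Python fragment trains a tree on invented two-band reflectances and flags changed pixels between two dates:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training pixels: rows are [red, NIR] reflectances; labels are
# land-cover classes (0 = urban, 1 = vegetation, 2 = water).
X = np.array([[0.30, 0.25], [0.28, 0.22], [0.35, 0.30],   # urban
              [0.08, 0.50], [0.10, 0.55], [0.07, 0.48],   # vegetation
              [0.05, 0.03], [0.06, 0.02], [0.04, 0.04]])  # water
y = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])

clf = DecisionTreeClassifier(max_depth=3).fit(X, y)

# Classify the same two pixels on two dates and flag class changes
scene_t1 = np.array([[0.09, 0.52], [0.31, 0.26]])
scene_t2 = np.array([[0.29, 0.24], [0.32, 0.27]])
changed = clf.predict(scene_t1) != clf.predict(scene_t2)
print(changed)    # True where land cover changed (e.g. vegetation -> urban)
```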
Spectrum Modal Analysis for the Detection of Low-Altitude Windshear with Airborne Doppler Radar
NASA Technical Reports Server (NTRS)
Kunkel, Matthew W.
1992-01-01
A major obstacle in the estimation of windspeed patterns associated with low-altitude windshear with an airborne pulsed Doppler radar system is the presence of strong ground clutter, which can heavily bias a windspeed estimate. Typical solutions attempt to remove the clutter energy from the return through clutter rejection filtering. Proposed here is a method whereby both the weather and clutter modes present in a return spectrum can be identified to yield an unbiased estimate of the weather mode without the need for clutter rejection filtering. An attempt is made to show that modeling through a second-order extended Prony approach is sufficient for identification of the weather mode. A pattern recognition approach to windspeed estimation from the identified modes is derived and applied to both simulated and actual flight data. Comparisons between windspeed estimates derived from modal analysis and the pulse-pair estimator are presented, along with associated hazard factors. Also included is a computationally attractive method for estimating windspeeds directly from the coefficients of a second-order autoregressive model. Extensions and recommendations for further study are given.
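As a rough sketch of the modal idea (not the thesis's actual implementation), the following Python fragment fits a complex second-order autoregressive model via the Yule-Walker equations and converts each pole angle to a radial velocity; all radar parameters and the simulated return are hypothetical:

```python
import numpy as np

def ar2_doppler_velocities(x, prt, wavelength):
    """Estimate radial velocities from the pole angles of a complex AR(2)
    model: each pole corresponds to one spectral mode (weather or clutter)."""
    n = len(x)
    # Biased autocorrelation estimates at lags 0, 1, 2
    r = np.array([np.vdot(x[:n - k], x[k:]) / n for k in range(3)])
    # Yule-Walker equations for the AR(2) coefficients a1, a2
    R = np.array([[r[0], np.conj(r[1])],
                  [r[1], r[0]]])
    a = np.linalg.solve(R, -r[1:3])
    poles = np.roots(np.concatenate(([1.0], a)))   # roots of z^2 + a1 z + a2
    freqs = np.angle(poles) / (2 * np.pi * prt)    # modal Doppler frequencies
    return freqs * wavelength / 2.0                # v = f * lambda / 2

# Hypothetical return: weather mode at 12 m/s plus stationary ground clutter
prt, lam = 0.25e-3, 0.03
t = np.arange(256) * prt
rng = np.random.default_rng(0)
x = (np.exp(2j * np.pi * (2 * 12.0 / lam) * t)     # weather echo
     + 3.0                                         # zero-Doppler clutter
     + 0.1 * (rng.standard_normal(256) + 1j * rng.standard_normal(256)))
print(ar2_doppler_velocities(x, prt, lam))         # approx. [12, 0] m/s
```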
MOD-0A 200 kW wind turbine generator design and analysis report
NASA Astrophysics Data System (ADS)
Anderson, T. S.; Bodenschatz, C. A.; Eggers, A. G.; Hughes, P. S.; Lampe, R. F.; Lipner, M. H.; Schornhorst, J. R.
1980-08-01
The design, analysis, and initial performance of the MOD-OA 200 kW wind turbine generator at Clayton, NM is documented. The MOD-OA was designed and built to obtain operation and performance data and experience in utility environments. The project requirements, approach, system description, design requirements, design, analysis, system tests, installation, safety considerations, failure modes and effects analysis, data acquisition, and initial performance for the wind turbine are discussed. The design and analysis of the rotor, drive train, nacelle equipment, yaw drive mechanism and brake, tower, foundation, electrical system, and control systems are presented. The rotor includes the blades, hub, and pitch change mechanism. The drive train includes the low speed shaft, speed increaser, high speed shaft, and rotor brake. The electrical system includes the generator, switchgear, transformer, and utility connection. The control systems are the blade pitch, yaw, and generator control, and the safety system. Manual, automatic, and remote control are discussed. Systems analyses on dynamic loads and fatigue are presented.
Time Series Expression Analyses Using RNA-seq: A Statistical Approach
Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.
2013-01-01
RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulation of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model time-course RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
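Of the methods listed, the autoregressive time-lagged regression is the simplest to sketch; the following hypothetical Python fragment fits a per-gene AR(1) model by least squares (the expression matrix is invented):

```python
import numpy as np

def ar1_lagged_regression(expr):
    """Fit a lag-1 autoregressive model y_t = b0 + b1 * y_{t-1} per gene and
    return each gene's autoregressive coefficient b1 and residual variance."""
    y_t, y_lag = expr[:, 1:], expr[:, :-1]
    results = []
    for cur, lag in zip(y_t, y_lag):
        X = np.column_stack([np.ones_like(lag), lag])
        beta = np.linalg.lstsq(X, cur, rcond=None)[0]
        resid = cur - X @ beta
        results.append((beta[1], resid.var(ddof=2)))
    return np.array(results)

# Hypothetical log-expression matrix: 3 genes x 6 time points
expr = np.array([[2.0, 2.3, 2.9, 3.5, 3.9, 4.1],    # steadily increasing
                 [5.0, 5.1, 4.9, 5.0, 5.1, 5.0],    # flat
                 [1.0, 2.5, 1.2, 2.6, 1.1, 2.4]])   # oscillating
print(ar1_lagged_regression(expr))
```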
Storm, Lance; Tressoldi, Patrizio E; Utts, Jessica
2013-01-01
Rouder, Morey, and Province (2013) stated that (a) the evidence-based case for psi in Storm, Tressoldi, and Di Risio's (2010) meta-analysis is supported only by a number of studies that used manual randomization, and (b) when these studies are excluded so that only investigations using automatic randomization are evaluated (and some additional studies previously omitted by Storm et al., 2010, are included), the evidence for psi is "unpersuasive." Rouder et al. used a Bayesian approach, and we adopted the same methodology, finding that our case is upheld. Because of recent updates and corrections, we reassessed the free-response databases of Storm et al. using a frequentist approach. We discuss and critique the assumptions and findings of Rouder et al.
Formulation for Simultaneous Aerodynamic Analysis and Design Optimization
NASA Technical Reports Server (NTRS)
Hou, G. W.; Taylor, A. C., III; Mani, S. V.; Newman, P. A.
1993-01-01
An efficient approach for simultaneous aerodynamic analysis and design optimization is presented. This approach does not require the performance of many flow analyses at each design optimization step, which can be an expensive procedure. Thus, this approach brings us one step closer to meeting the challenge of incorporating computational fluid dynamic codes into gradient-based optimization techniques for aerodynamic design. An adjoint-variable method is introduced to nullify the effect of the increased number of design variables in the problem formulation. The method has been successfully tested on one-dimensional nozzle flow problems, including a sample problem with a normal shock. Implementations of the above algorithm are also presented that incorporate Newton iterations to secure a high-quality flow solution at the end of the design process. Implementations with iterative flow solvers are possible and will be required for large, multidimensional flow problems.
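The flavor of such a simultaneous (one-shot) approach can be shown on a scalar toy problem, where a Newton step on the state and an adjoint-based gradient step on the design are interleaved rather than fully converging the flow solver at every design iteration; this sketch is purely illustrative and is not the paper's formulation:

```python
# Toy "one-shot" simultaneous analysis and design:
#   state equation   R(u, a) = a * u - 1 = 0      (stand-in for a flow solver)
#   objective        J(u)    = (u - u_target)**2  (stand-in for a design goal)
# The state u and design variable a advance together; an adjoint variable lam
# supplies the design gradient without extra flow solves.

u_target = 2.0
u, a = 1.0, 1.0          # initial state and design
step = 0.02              # design step size (chosen small for stability)

for it in range(500):
    R = a * u - 1.0                    # state residual
    dRdu, dRda = a, u                  # residual Jacobians
    dJdu = 2.0 * (u - u_target)
    lam = dJdu / dRdu                  # adjoint equation (scalar case)
    dJda = -lam * dRda                 # design sensitivity via the adjoint
    u -= R / dRdu                      # one Newton step on the state
    a -= step * dJda                   # one gradient step on the design

print(f"design a = {a:.4f}, state u = {u:.4f} (target {u_target})")
```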
NASA Technical Reports Server (NTRS)
Almroth, B. O.; Brogan, F. A.
1978-01-01
Basic information about the computer code STAGS (Structural Analysis of General Shells) is presented to describe to potential users the scope of the code and the solution procedures that are incorporated. Primarily, STAGS is intended for analysis of shell structures, although it has been extended to more complex shell configurations through the inclusion of spring and beam elements. The formulation is based on a variational approach in combination with local two-dimensional power series representations of the displacement components. The computer code includes options for analysis of linear or nonlinear static stress, stability, vibrations, and transient response. Material as well as geometric nonlinearities are included. A few examples of applications of the code are presented for further illustration of its scope.
Statistical Approaches to Adjusting Weights for Dependent Arms in Network Meta-analysis.
Su, Yu-Xuan; Tu, Yu-Kang
2018-05-22
Network meta-analysis compares multiple treatments in terms of their efficacy and harm by including evidence from randomised controlled trials. Most clinical trials use a parallel design, where patients are randomly allocated to different treatments and receive only one treatment. However, some trials use within-person designs such as split-body, split-mouth and cross-over designs, where each patient may receive more than one treatment. Data from treatment arms within these trials are no longer independent, so the correlations between dependent arms need to be accounted for within the statistical analyses. Ignoring these correlations may result in incorrect conclusions. The main objective of this study is to develop statistical approaches to adjusting weights for dependent arms within special-design trials. In this study, we demonstrate the following three approaches: the data augmentation approach, the adjusting variance approach, and the reducing weight approach. These three methods can be readily applied in current statistical tools such as R and Stata. An example of periodontal regeneration was used to demonstrate how these approaches can be undertaken and implemented within statistical software packages, and to compare results from the different approaches. The adjusting variance approach can be implemented within the network package in Stata, while the reducing weight approach requires computer programming to set up the within-study variance-covariance matrix.
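As a hypothetical illustration of the within-study variance-covariance matrix mentioned for the reducing-weight approach (the paper's actual construction may differ), this Python sketch fills in covariances between dependent arms under an assumed within-person correlation rho:

```python
import numpy as np

def within_study_covariance(variances, dependent, rho=0.5):
    """Build the within-study variance-covariance matrix for a trial whose
    arms may be dependent (e.g. split-mouth or cross-over designs).
    `dependent` flags which arms share patients; `rho` is the assumed
    within-person correlation between dependent arms."""
    v = np.asarray(variances, dtype=float)
    cov = np.diag(v)
    k = len(v)
    for i in range(k):
        for j in range(i + 1, k):
            if dependent[i] and dependent[j]:
                cov[i, j] = cov[j, i] = rho * np.sqrt(v[i] * v[j])
    return cov

# Hypothetical 3-arm split-mouth trial: arms 0 and 1 share patients
print(within_study_covariance([0.04, 0.05, 0.06], [True, True, False]))
```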
Comparative Analysis of Sustainable Approaches and Systems for Scientific Data Stewardship
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2012-12-01
Sustainable data systems are critical components of the cyberinfrastructure needed to provide long-term stewardship of scientific data, including Earth science data, throughout their entire life cycle. A variety of approaches may help ensure the sustainability of such systems, but these approaches must be able to survive the demands of competing priorities and decreasing budgets. Analyzing and comparing alternative approaches can identify viable aspects of each approach and inform decisions for developing, managing, and supporting the cyberinfrastructure needed to facilitate discovery, access, and analysis of data by future communities of users. A typology of sustainability approaches is proposed, and example use cases are offered for comparing the approaches over time. These examples demonstrate the potential strengths and weaknesses of each approach under various conditions and with regard to different objectives, e.g., open vs. limited access. By applying the results of these analyses to their particular circumstances, systems stakeholders can assess their options for a sustainable systems approach along with other metrics and identify alternative strategies to ensure the sustainability of the scientific data and information for which they are responsible. In addition, comparing sustainability approaches should inform the design of new systems and the improvement of existing systems to meet the needs for long-term stewardship of scientific data, and support education and workforce development efforts needed to ensure that the appropriate scientific and technical skills are available to operate and further develop sustainable cyberinfrastructure.
On selecting a prior for the precision parameter of Dirichlet process mixture models
Dorazio, R.M.
2009-01-01
In hierarchical mixture models the Dirichlet process is used to specify latent patterns of heterogeneity, particularly when the distribution of latent parameters is thought to be clustered (multimodal). The parameters of a Dirichlet process include a precision parameter α and a base probability measure G0. In problems where α is unknown and must be estimated, inferences about the level of clustering can be sensitive to the choice of prior assumed for α. In this paper an approach is developed for computing a prior for the precision parameter α that can be used in the presence or absence of prior information about the level of clustering. This approach is illustrated in an analysis of counts of stream fishes. The results of this fully Bayesian analysis are compared with an empirical Bayes analysis of the same data and with a Bayesian analysis based on an alternative commonly used prior.
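One way to reason about priors for the precision parameter is through the induced expected number of clusters, E[K_n] = sum over i from 0 to n-1 of α/(α+i); this small Python sketch (illustrative only, not the paper's method) tabulates that quantity so a prior for α can be centred on a plausible level of clustering:

```python
import numpy as np

def expected_clusters(alpha, n):
    """Expected number of distinct clusters among n observations under a
    Dirichlet process with precision alpha: E[K_n] = sum_i alpha/(alpha+i)."""
    i = np.arange(n)
    return np.sum(alpha / (alpha + i))

# Tabulate E[K_n] for several candidate precision values (n = 100 sites)
n = 100
for alpha in (0.1, 0.5, 1.0, 5.0):
    print(f"alpha = {alpha:4.1f} -> E[K_{n}] = {expected_clusters(alpha, n):.1f}")
```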
Parallel computing for probabilistic fatigue analysis
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.
1993-01-01
This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared- and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single- or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep large numbers of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architecture is preferable to shared memory for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.
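Probabilistic fatigue analysis parallelizes naturally because Monte Carlo samples are independent; as a hypothetical sketch (invented Basquin-law parameters, not the paper's codes), this Python fragment distributes samples across processes:

```python
import numpy as np
from multiprocessing import Pool

def fatigue_life_sample(seed):
    """Draw one Monte Carlo fatigue-life sample using Basquin's law,
    sigma_a = sigma_f * (2N)**b, with random material parameters.
    All distributions and constants here are hypothetical."""
    rng = np.random.default_rng(seed)
    sigma_a = rng.normal(300.0, 15.0)              # stress amplitude, MPa
    sigma_f = rng.normal(900.0, 45.0)              # fatigue strength coefficient
    b = rng.normal(-0.09, 0.005)                   # fatigue strength exponent
    return 0.5 * (sigma_a / sigma_f) ** (1.0 / b)  # cycles to failure N

if __name__ == "__main__":
    # Embarrassingly parallel: each worker evaluates independent samples,
    # which is why such codes scale well across many processors.
    with Pool() as pool:
        lives = np.array(pool.map(fatigue_life_sample, range(10_000)))
    print(f"P(N < 1e6 cycles) = {np.mean(lives < 1e6):.3f}")
```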
Lennox, L; Maher, L; Reed, J
2018-02-09
Improvement initiatives offer a valuable mechanism for delivering and testing innovations in healthcare settings. Many of these initiatives deliver meaningful and necessary changes to patient care and outcomes. However, many improvement initiatives fail to sustain to a point where their full benefits can be realised. This has led many researchers and healthcare practitioners to develop frameworks, models and tools to support and monitor sustainability. This work aimed to identify what approaches are available to assess and influence sustainability in healthcare and to describe the different perspectives, applications and constructs within these approaches to guide their future use. A systematic review was carried out following PRISMA guidelines to identify publications that reported approaches to support or influence sustainability in healthcare. Eligibility criteria were defined through an iterative process in which two reviewers independently assessed 20% of articles to test the objectivity of the selection criteria. Data were extracted from the identified articles, and a template analysis was undertaken to identify and assess the sustainability constructs within each reported approach. The search strategy identified 1748 publications with 227 articles retrieved in full text for full documentary analysis. In total, 62 publications identifying a sustainability approach were included in this review (32 frameworks, 16 models, 8 tools, 4 strategies, 1 checklist and 1 process). Constructs across approaches were compared and 40 individual constructs for sustainability were found. Comparison across approaches demonstrated consistent constructs were seen regardless of proposed interventions, setting or level of application with 6 constructs included in 75% of the approaches. Although similarities were found, no approaches contained the same combination of the constructs nor did any single approach capture all identified constructs. From these results, a consolidated framework for sustainability constructs in healthcare was developed. Choosing a sustainability method can pose a challenge because of the diverse approaches reported in the literature. This review provides a valuable resource to researchers, healthcare professionals and improvement practitioners by providing a summary of available sustainability approaches and their characteristics. This review was registered on the PROSPERO database: CRD42016040081 in June 2016.
Logsdon, M Cynthia; Mittelberg, Meghan; Morrison, David; Robertson, Ashley; Luther, James F; Wisniewski, Stephen R; Confer, Andrea; Eng, Heather; Sit, Dorothy K Y; Wisner, Katherine L
2014-12-01
The purpose of this study was to determine which of four common approaches to coding maternal-infant interaction best discriminates between mothers with and without postpartum depression. After extensive training, four research assistants coded 83 three-minute videotapes of maternal-infant interaction at 12-month postpartum visits. Four theoretical approaches to coding (Maternal Behavior Q-Sort, the Dyadic Mini Code, Ainsworth Maternal Sensitivity Scale, and the Child-Caregiver Mutual Regulation Scale) were used. Twelve-month data were chosen to allow the maximum possible exposure of the infant to maternal depression during the first postpartum year. The videotapes were created in a laboratory with standard procedures. Inter-rater reliabilities for each coding method ranged from .7 to .9. The coders were blind to the depression status of the mother. Twenty-seven of the women had major depressive disorder during the 12-month postpartum period. Receiver operating characteristic analysis indicated that none of the four methods of analyzing maternal-infant interaction discriminated between mothers with and without major depressive disorder. Limitations of the study include the cross-sectional design and the low number of women with major depressive disorder. Further analysis should include data from videotapes at earlier postpartum time periods, and alternative coding approaches should be considered. Nurses should continue to examine culturally appropriate ways in which new mothers can be supported in how to best nurture their babies.
A Research Roadmap for Computation-Based Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald; Mandelli, Diego; Joe, Jeffrey
2015-08-01
The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full-scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing the modeling uncertainty found in current plant risk models.
Estimation of indirect effect when the mediator is a censored variable.
Wang, Jian; Shete, Sanjay
2017-01-01
A mediation model explores the direct and indirect effects of an initial variable (X) on an outcome variable (Y) by including a mediator (M). In many realistic scenarios, investigators observe censored data instead of the complete data. Current research in mediation analysis for censored data focuses mainly on censored outcomes, but not censored mediators. In this study, we proposed a strategy based on the accelerated failure time model and a multiple imputation approach. We adapted a measure of the indirect effect for the mediation model with a censored mediator, which can assess the indirect effect at both the group and individual levels. Based on simulation, we established the bias in the estimations of different paths (i.e. the effects of X on M [a], of M on Y [b] and of X on Y given mediator M [c']) and indirect effects when analyzing the data using the existing approaches, including a naïve approach implemented in software such as Mplus, complete-case analysis, and the Tobit mediation model. We conducted simulation studies to investigate the performance of the proposed strategy compared to that of the existing approaches. The proposed strategy accurately estimates the coefficients of different paths, indirect effects and percentages of the total effects mediated. We applied these mediation approaches to the study of SNPs, age at menopause and fasting glucose levels. Our results indicate that there is no indirect effect of association between SNPs and fasting glucose level that is mediated through the age at menopause.
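The bias of the naive and complete-case approaches is easy to reproduce in simulation; this hypothetical Python sketch censors the mediator and compares the estimated a-path against its true value (it illustrates the problem the paper addresses, not the authors' accelerated-failure-time/multiple-imputation remedy):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
a, b, c_prime = 0.5, 0.7, 0.3           # true path coefficients

x = rng.normal(size=n)                   # initial variable X
m = a * x + rng.normal(size=n)           # mediator M
y = c_prime * x + b * m + rng.normal(size=n)

# Right-censor the mediator at a fixed threshold (as in survival-type data)
cens = 1.0
m_obs = np.minimum(m, cens)
observed = m < cens

def ols_slope(u, v):
    """Simple least-squares slope of v on u (with intercept)."""
    X = np.column_stack([np.ones_like(u), u])
    return np.linalg.lstsq(X, v, rcond=None)[0][1]

# Naive analysis treats censored values as observed; complete-case drops them
a_naive = ols_slope(x, m_obs)
a_cc = ols_slope(x[observed], m[observed])
print(f"true a = {a},  naive a = {a_naive:.3f},  complete-case a = {a_cc:.3f}")
```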
Current situation of oil refinery in Bulgaria
NASA Astrophysics Data System (ADS)
Vershkova, Elena; Petkova, Petinka; Grinkevich, Anastasia
2016-09-01
This article deals with the classification approach for oil refineries in international practice. Criteria for refinery assessment, including estimation of financial status, have been investigated. The object of analysis is the activity of "Lukoil Neftochim Bourgas" AD (LNCHB), a leading enterprise in Bulgaria. An analysis of LNCHB operations has been performed using the energy intensity index, the operating cost index, and the return on investment index.
Historical Analysis of Population Reactions to Stimuli - a Case Study of Fiji
2007-03-01
multidisciplinary approach taken from such disciplines as operations research, political science, anthropology and qualitative historical analysis. These... anthropology, historical linguistics and ethnography have all provided evidence for the formidable warrior culture that pervaded Fiji [7, 75]. The first...diagram'. The diagrams have also been colour-coded, with legends included. The matrix in Appendix C.3 also demonstrates how highly dependent the
SELECTION AND CALIBRATION OF SUBSURFACE REACTIVE TRANSPORT MODELS USING A SURROGATE-MODEL APPROACH
While standard techniques for uncertainty analysis have been successfully applied to groundwater flow models, extension to reactive transport is frustrated by numerous difficulties, including excessive computational burden and parameter non-uniqueness. This research introduces a...
The use of multivariate statistics in studies of wildlife habitat
David E. Capen
1981-01-01
This report contains edited and reviewed versions of papers presented at a workshop held at the University of Vermont in April 1980. Topics include sampling avian habitats, multivariate methods, applications, examples, and new approaches to analysis and interpretation.
Preliminary design package for Sunair SEC-601 solar collector
NASA Technical Reports Server (NTRS)
1978-01-01
The preliminary design of the Owens-Illinois model Sunair SEC-601 tubular air solar collector is presented. Information in this package includes the subsystem design and development approaches, hazard analysis, and detailed drawings available as the preliminary design review.
An integrated hybrid spatial-compartmental modeling approach is presented for analyzing the dynamic distribution of chemicals in the multimedia environment. Information obtained from such analysis, which includes temporal chemical concentration profiles in various media, mass ...