A Survey of Model Evaluation Approaches with a Tutorial on Hierarchical Bayesian Methods
ERIC Educational Resources Information Center
Shiffrin, Richard M.; Lee, Michael D.; Kim, Woojae; Wagenmakers, Eric-Jan
2008-01-01
This article reviews current methods for evaluating models in the cognitive sciences, including theoretically based approaches, such as Bayes factors and minimum description length measures; simulation approaches, including model mimicry evaluations; and practical approaches, such as validation and generalization measures. This article argues…
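For readers unfamiliar with the first class of measures mentioned here, the Bayes factor compares two candidate models by the ratio of their marginal likelihoods (standard definition, not specific to this article):

\[
\mathrm{BF}_{12} = \frac{p(D \mid M_1)}{p(D \mid M_2)}
= \frac{\int p(D \mid \theta_1, M_1)\, p(\theta_1 \mid M_1)\, d\theta_1}{\int p(D \mid \theta_2, M_2)\, p(\theta_2 \mid M_2)\, d\theta_2},
\]

where D denotes the observed data; values greater than 1 favor model M1 over model M2.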
A framework for the social valuation of ecosystem services.
Felipe-Lucia, María R; Comín, Francisco A; Escalera-Reyes, Javier
2015-05-01
Methods to assess ecosystem services using ecological or economic approaches are considerably better defined than methods for the social approach. To identify why the social approach remains unclear, we reviewed current trends in the literature. We found two main reasons: (i) the cultural ecosystem services are usually used to represent the whole social approach, and (ii) the economic valuation based on social preferences is typically included in the social approach. Next, we proposed a framework for the social valuation of ecosystem services that provides alternatives to economic methods, enables comparison across studies, and supports decision-making in land planning and management. The framework includes the agreements that emerged from the review, such as considering spatial-temporal flows, including stakeholders from all social ranges, and using two complementary methods to value ecosystem services. Finally, we provided practical recommendations learned from the application of the proposed framework in a case study.
Mistry, Rashmita S; White, Elizabeth S; Chow, Kirby A; Griffin, Katherine M; Nenadal, Lindsey
2016-01-01
Mixed methods research approaches are gaining traction across various social science disciplines, including among developmental scientists. In this chapter, we discuss the utility of a mixed methods research approach in examining issues related to equity and justice. We incorporate a brief overview of quantitative and qualitative monomethod research approaches in our larger discussion of the advantages, procedures, and considerations of employing a mixed methods design to advance developmental science from an equity and justice perspective. To better illustrate the theoretical and practical significance of a mixed methods research approach, we include examples of research conducted on children and adolescents' conceptions of economic inequality as one example of developmental science research with an equity and justice frame. © 2016 Elsevier Inc. All rights reserved.
An overview of very high level software design methods
NASA Technical Reports Server (NTRS)
Asdjodi, Maryam; Hooper, James W.
1988-01-01
Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine-executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods, different approaches for higher level software design are being developed. Though one finds that a given approach does not always fall exactly into any specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.
The Impact of a Multifaceted Approach to Teaching Research Methods on Students' Attitudes
ERIC Educational Resources Information Center
Ciarocco, Natalie J.; Lewandowski, Gary W., Jr.; Van Volkom, Michele
2013-01-01
A multifaceted approach to teaching five experimental designs in a research methodology course was tested. Participants included 70 students enrolled in an experimental research methods course in the semester both before and after the implementation of instructional change. When using a multifaceted approach to teaching research methods that…
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Lindgren, Eric A.
2018-04-01
This paper expands on the objective and motivation for NDE-based characterization and includes a discussion of the current approach using model-assisted inversion being pursued within the Air Force Research Laboratory (AFRL). This includes a discussion of the multiple model-based methods that can be used, including physics-based models, deep machine learning, and heuristic approaches. The benefits and drawbacks of each method are reviewed and the potential to integrate multiple methods is discussed. Initial successes are included to highlight the ability to obtain quantitative values of damage. Additional steps remaining to realize this capability with statistical metrics of accuracy are discussed, and how these results can be used to enable probabilistic life management is addressed. The outcome of this initiative will realize the long-term desired capability of NDE methods to provide quantitative characterization to accelerate certification of new materials and enhance life management of engineered systems.
Cooper, Chris; Lovell, Rebecca; Husk, Kerryn; Booth, Andrew; Garside, Ruth
2018-06-01
We undertook a systematic review to evaluate the health benefits of environmental enhancement and conservation activities. We were concerned that a conventional process of study identification, focusing on exhaustive searches of bibliographic databases as the primary search method, would be ineffective, offering limited value. The focus of this study is comparing study identification methods. We compare (1) an approach led by searches of bibliographic databases with (2) an approach led by supplementary search methods. We retrospectively assessed the effectiveness and value of both approaches. Effectiveness was determined by comparing (1) the total number of studies identified and screened and (2) the number of includable studies uniquely identified by each approach. Value was determined by comparing included study quality and by using qualitative sensitivity analysis to explore the contribution of studies to the synthesis. The bibliographic databases approach identified 21 409 studies to screen and 2 included qualitative studies were uniquely identified. Study quality was moderate, and contribution to the synthesis was minimal. The supplementary search approach identified 453 studies to screen and 9 included studies were uniquely identified. Four quantitative studies were poor quality but made a substantive contribution to the synthesis; 5 studies were qualitative: 3 studies were good quality, one was moderate quality, and 1 study was excluded from the synthesis due to poor quality. All 4 included qualitative studies made significant contributions to the synthesis. This case study found value in aligning primary methods of study identification to maximise location of relevant evidence. Copyright © 2017 John Wiley & Sons, Ltd.
Petticrew, Mark; Rehfuess, Eva; Noyes, Jane; Higgins, Julian P T; Mayhew, Alain; Pantoja, Tomas; Shemilt, Ian; Sowden, Amanda
2013-11-01
Although there is increasing interest in the evaluation of complex interventions, there is little guidance on how evidence from complex interventions may be reviewed and synthesized, and the relevance of the plethora of evidence synthesis methods to complexity is unclear. This article aims to explore how different meta-analytical approaches can be used to examine aspects of complexity; describe the contribution of various narrative, tabular, and graphical approaches to synthesis; and give an overview of the potential choice of selected qualitative and mixed-method evidence synthesis approaches. The methodological discussions presented here build on a 2-day workshop held in Montebello, Canada, in January 2012, involving methodological experts from the Campbell and Cochrane Collaborations and from other international review centers (Anderson L, Petticrew M, Chandler J, et al. Systematic reviews of complex interventions. In press). These systematic review methodologists discussed the broad range of existing methods and considered the relevance of these methods to reviews of complex interventions. The evidence from primary studies of complex interventions may be qualitative or quantitative. There is a wide range of methodological options for reviewing and presenting this evidence. Specific contributions of statistical approaches include the use of meta-analysis, meta-regression, and Bayesian methods, whereas narrative summary approaches provide valuable precursors or alternatives to these. Qualitative and mixed-method approaches include thematic synthesis, framework synthesis, and realist synthesis. A suitable combination of these approaches allows synthesis of evidence for understanding complex interventions. Reviewers need to consider which aspects of complex interventions should be a focus of their review and what types of quantitative and/or qualitative studies they will be including, and this will inform their choice of review methods. These may range from standard meta-analysis through to more complex mixed-method synthesis and synthesis approaches that incorporate theory and/or users' perspectives. Copyright © 2013 Elsevier Inc. All rights reserved.
Heidema, A Geert; Boer, Jolanda M A; Nagelkerke, Nico; Mariman, Edwin C M; van der A, Daphne L; Feskens, Edith J M
2006-04-21
Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods have been developed for analyzing the relation of large numbers of genetic and environmental predictors to disease or disease-related variables in genetic association studies. In this commentary we discuss logistic regression analysis, neural networks, including the parameter decreasing method (PDM) and genetic programming optimized neural networks (GPNN), and several non-parametric methods, which include the set association approach, combinatorial partitioning method (CPM), restricted partitioning method (RPM), multifactor dimensionality reduction (MDR) method and the random forests approach. The relative strengths and weaknesses of these methods are highlighted. Logistic regression and neural networks can handle only a limited number of predictor variables, depending on the number of observations in the dataset. Therefore, they are less useful than the non-parametric methods to approach association studies with large numbers of predictor variables. GPNN on the other hand may be a useful approach to select and model important predictors, but its performance in selecting the important effects in the presence of large numbers of predictors needs to be examined. Both the set association approach and random forests approach are able to handle a large number of predictors and are useful in reducing these predictors to a subset of predictors with an important contribution to disease. The combinatorial methods give more insight into combination patterns for sets of genetic and/or environmental predictor variables that may be related to the outcome variable. As the non-parametric methods have different strengths and weaknesses we conclude that to approach genetic association studies using the case-control design, the application of a combination of several methods, including the set association approach, MDR and the random forests approach, will likely be a useful strategy to find the important genes and interaction patterns involved in complex diseases.
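As a hedged illustration of one of the non-parametric approaches discussed above, the following minimal random forests screen of genetic markers uses scikit-learn; the genotype coding (0/1/2 minor-allele counts), sample sizes, and marker indices are assumptions for the sketch, not data from the study:

    # Minimal sketch: random forests for a case-control genetic association screen.
    # Genotypes are coded 0/1/2 (minor-allele counts); all data below are synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    n_subjects, n_markers = 200, 50
    X = rng.integers(0, 3, size=(n_subjects, n_markers))  # synthetic genotypes
    y = rng.integers(0, 2, size=n_subjects)                # case (1) / control (0) labels

    forest = RandomForestClassifier(n_estimators=500, random_state=0)
    forest.fit(X, y)

    # Rank markers by importance to shortlist candidates for follow-up analysis.
    ranked = np.argsort(forest.feature_importances_)[::-1]
    print("Top 5 candidate markers:", ranked[:5])

The importance ranking is what makes this approach useful for reducing a large set of predictors to a manageable subset, as the commentary notes.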
ERIC Educational Resources Information Center
Kim, Kyung Hi
2014-01-01
This research, based on a case study of vulnerable children in Korea, used a mixed methods transformative approach to explore strategies to support and help disadvantaged children. The methodological approach includes three phases: a mixed methods contextual analysis, a qualitative dominant analysis based on Sen's capability approach and critical…
Symbolic algebra approach to the calculation of intraocular lens power following cataract surgery
NASA Astrophysics Data System (ADS)
Hjelmstad, David P.; Sayegh, Samir I.
2013-03-01
We present a symbolic approach based on matrix methods that allows for the analysis and computation of intraocular lens power following cataract surgery. We extend the basic matrix approach corresponding to paraxial optics to include astigmatism and other aberrations. The symbolic approach allows for a refined analysis of the potential sources of errors ("refractive surprises"). We demonstrate the computation of lens powers including toric lenses that correct for both defocus (myopia, hyperopia) and astigmatism. A specific implementation in Mathematica allows an elegant and powerful method for the design and analysis of these intraocular lenses.
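To make the matrix bookkeeping concrete, here is a minimal paraxial (ABCD) sketch in Python; the corneal power, lens power, and spacing are illustrative placeholders rather than clinical values, and the toric/aberration extensions described in the abstract are not shown:

    # Paraxial ray-transfer matrices: thin-element refraction and reduced translation.
    # All powers and distances are placeholders for illustration only.
    import numpy as np

    def refraction(power_diopters):
        # Thin-element refraction matrix acting on the ray vector [height, n*angle].
        return np.array([[1.0, 0.0], [-power_diopters, 1.0]])

    def translation(distance_m, n_medium=1.336):
        # Translation over distance_m in a medium of refractive index n_medium.
        return np.array([[1.0, distance_m / n_medium], [0.0, 1.0]])

    cornea = refraction(43.0)      # placeholder corneal power (D)
    spacing = translation(0.004)   # placeholder 4 mm cornea-to-lens separation
    iol = refraction(20.0)         # placeholder intraocular lens power (D)

    # Light traverses the cornea first, so the system matrix multiplies right to left.
    system = iol @ spacing @ cornea
    print(system)

Promoting the entries of such matrices to symbolic quantities, as the paper does in Mathematica, is what allows the sensitivity of the refractive outcome to each parameter to be analyzed.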
Some Methods for Evaluating Program Implementation.
ERIC Educational Resources Information Center
Hardy, Roy A.
An approach to evaluating program implementation is described. This approach includes the development of a project description which includes a structure matrix, sampling from the structure matrix, and preparing an implementation evaluation plan. The implementation evaluation plan should include: (1) verification of implementation of planned…
Accessible methods for the dynamic time-scale decomposition of biochemical systems.
Surovtsova, Irina; Simus, Natalia; Lorenz, Thomas; König, Artjom; Sahle, Sven; Kummer, Ursula
2009-11-01
The growing complexity of biochemical models calls for means to rationally dissect the networks into meaningful and rather independent subnetworks. Such a decomposition should ensure an understanding of the system without relying on heuristics. Important for the success of such an approach are its accessibility and the clarity of the presentation of the results. In order to achieve this goal, we developed a method which is a modification of the classical approach of time-scale separation. This modified method, as well as the more classical approach, has been implemented for time-dependent application within the widely used software COPASI. The implementation includes different possibilities for the representation of the results, including 3D visualization. The methods are included in COPASI, which is free for academic use and available at www.copasi.org. Contact: irina.surovtsova@bioquant.uni-heidelberg.de. Supplementary data are available at Bioinformatics online.
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
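As an illustration of the sampling-based (Monte Carlo) approach mentioned in this review, the following sketch propagates assumed input distributions through a generic ingestion-dose expression; the distributions and the slope factor are illustrative assumptions, not values from the paper:

    # Monte Carlo uncertainty propagation for a generic DBP ingestion-risk estimate.
    # Every distribution and the slope factor below are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    concentration = rng.lognormal(mean=3.0, sigma=0.5, size=n)  # ug/L, assumed
    intake = rng.normal(loc=2.0, scale=0.4, size=n)             # L/day, assumed
    body_weight = rng.normal(loc=70.0, scale=10.0, size=n)      # kg, assumed
    slope_factor = 1e-5                                         # risk per ug/kg/day, assumed

    dose = concentration * intake / body_weight                 # ug/kg/day
    risk = slope_factor * dose

    print("median risk:", np.median(risk))
    print("95th percentile risk:", np.percentile(risk, 95))

The output percentiles characterize the uncertainty in the risk estimate rather than a single point value, which is the purpose of the probabilistic approaches surveyed.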
A Comparison of Trajectory Optimization Methods for the Impulsive Minimum Fuel Rendezvous Problem
NASA Technical Reports Server (NTRS)
Hughes, Steven P.; Mailhe, Laurie M.; Guzman, Jose J.
2002-01-01
In this paper we present a comparison of optimization approaches to the minimum fuel rendezvous problem. Both indirect and direct methods are compared for a variety of test cases. The indirect approach is based on primer vector theory. The direct approaches are implemented numerically and include Sequential Quadratic Programming (SQP), Quasi-Newton, Simplex, Genetic Algorithms, and Simulated Annealing. Each method is applied to a variety of test cases, including circular-to-circular coplanar orbits, LEO to GEO, and orbit phasing in highly elliptic orbits. We also compare different constrained optimization routines on complex orbit rendezvous problems with complicated, highly nonlinear constraints.
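As a hedged, toy-scale illustration of the direct SQP-style formulation (not the authors' implementation or orbit model), the following constrained minimization uses SciPy's SLSQP solver, with the objective standing in for total delta-v and the equality constraint for a rendezvous condition:

    # Toy direct-optimization sketch: minimize a stand-in for total delta-v subject to
    # an equality constraint standing in for a rendezvous condition (SLSQP is an
    # SQP-type method). The problem data are illustrative, not an orbit model.
    import numpy as np
    from scipy.optimize import minimize

    def rendezvous_constraint(x):
        # Placeholder linear condition the two impulse magnitudes must satisfy.
        return x[0] + 2.0 * x[1] - 1.0

    result = minimize(
        lambda x: x.sum(),                  # stand-in for total delta-v
        x0=np.array([0.5, 0.5]),
        method="SLSQP",
        bounds=[(0.0, None), (0.0, None)],  # impulse magnitudes are nonnegative
        constraints=[{"type": "eq", "fun": rendezvous_constraint}],
    )
    print(result.x, result.fun)

In the actual rendezvous problem the decision variables would be impulse times and vectors, and the constraints would come from propagating the orbital dynamics.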
Training: An Opportunity for People with Disabilities in School Foodservice Operations
ERIC Educational Resources Information Center
Paez, Paola; Arendt, Susan; Strohbehn, Catherine
2011-01-01
Purpose/Objectives: This study assessed current training methods and topics used at public school foodservice operations as well as school foodservice representatives' attitudes toward training employees with disabilities. Methods: A mixed method approach of data collection included two phases. Phase I used a more qualitative approach; interviews…
Qualitative research methods in renal medicine: an introduction.
Bristowe, Katherine; Selman, Lucy; Murtagh, Fliss E M
2015-09-01
Qualitative methodologies are becoming increasingly widely used in health research. However, within some specialties, including renal medicine, qualitative approaches remain under-represented in the high-impact factor journals. Qualitative research can be undertaken: (i) as a stand-alone research method, addressing specific research questions; (ii) as part of a mixed methods approach alongside quantitative approaches or (iii) embedded in clinical trials, or during the development of complex interventions. The aim of this paper is to introduce qualitative research, including the rationale for choosing qualitative approaches, and guidance for ensuring quality when undertaking and reporting qualitative research. In addition, we introduce types of qualitative data (observation, interviews and focus groups) as well as some of the most commonly encountered methodological approaches (case studies, ethnography, phenomenology, grounded theory, thematic analysis, framework analysis and content analysis). © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Driver, C.J.
1994-05-01
Criteria for determining the quality of river sediment are necessary to ensure that concentrations of contaminants in aquatic systems are within acceptable limits for the protection of aquatic and human life. Such criteria should facilitate decision-making about remediation, handling, and disposal of contaminants. Several approaches to the development of sediment quality criteria (SQC) have been described and include both descriptive and numerical methods. However, no single method measures all impacts at all times to all organisms (U.S. EPA 1992b). The U.S. EPA's interest is primarily in establishing chemically based, numerical SQC that are applicable nation-wide (Shea 1988). Of the approaches proposed for SQC development, only three are being considered for numerical SQC on a national level. These approaches include an Equilibrium Partitioning Approach, a site-specific method using bioassays (the Apparent Effects Threshold Approach), and an approach similar to EPA's water quality criteria (Pavlou and Weston 1984). Although national (or even regional) criteria address a number of political, litigative, and engineering needs, some researchers feel that protection of benthic communities requires site-specific, biologically based criteria (Baudo et al. 1990). This is particularly true for areas where complex mixtures of contaminants are present in sediments. Other scientifically valid and accepted procedures for freshwater SQC include a background concentration approach, methods using field or spiked bioassays, a screening level concentration approach, the Apparent Effects Threshold Approach, the Sediment Quality Triad, the International Joint Commission Sediment Assessment Strategy, and the National Status and Trends Program Approach. The various sediment assessment approaches are evaluated for application to the Hanford Reach, and recommendations for Hanford Site sediment quality criteria are discussed.
ERIC Educational Resources Information Center
Pekbay, Canay; Yilmaz, Serkan
2015-01-01
This study aims to explore the influence of nature of science (NOS) activities based on explicit-reflective and historical approach on preservice elementary teachers' views of NOS aspects. Mixed-method approach including both qualitative and quantitative methods was used. The sample consisted of 83 preservice elementary teachers of a public…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Millis, Andrew
Understanding the behavior of interacting electrons in molecules and solids so that one can predict new superconductors, catalysts, light harvesters, energy and battery materials and optimize existing ones is the "quantum many-body problem". This is one of the scientific grand challenges of the 21st century. A complete solution to the problem has been proven to be exponentially hard, meaning that straightforward numerical approaches fail. New insights and new methods are needed to provide accurate yet feasible approximate solutions. This CMSCN project brought together chemists and physicists to combine insights from the two disciplines to develop innovative new approaches. Outcomes included the Density Matrix Embedding method, a new, computationally inexpensive and extremely accurate approach that may enable first-principles treatment of superconducting and magnetic properties of strongly correlated materials; new techniques for existing methods, including an Adaptively Truncated Hilbert Space approach that will vastly expand the capabilities of the dynamical mean field method; a self-energy embedding theory; and a new memory-function based approach to calculations of the behavior of driven systems. The methods developed under this project are now being applied to improve our understanding of superconductivity, to calculate novel topological properties of materials, and to characterize and improve the properties of nanoscale devices.
PBL and beyond: trends in collaborative learning.
Pluta, William J; Richards, Boyd F; Mutnick, Andrew
2013-01-01
Building upon the disruption to lecture-based methods triggered by the introduction of problem-based learning, approaches to promote collaborative learning are becoming increasingly diverse, widespread and generally well accepted within medical education. Examples of relatively new, structured collaborative learning methods include team-based learning and just-in-time teaching. Examples of less structured approaches include think-pair-share, case discussions, and the flipped classroom. It is now common practice in medical education to employ a range of instructional approaches to support collaborative learning. We believe that the adoption of such approaches is entering a new and challenging era. We define collaborative learning by drawing on the broader literature, including Chi's ICAP framework that emphasizes the importance of sustained, interactive explanation and elaboration by learners. We distinguish collaborative learning from constructive, active, and passive learning and provide preliminary evidence documenting the growth of methods that support collaborative learning. We argue that the rate of adoption of collaborative learning methods will accelerate due to a growing emphasis on the development of team competencies and the increasing availability of digital media. At the same time, the adoption of collaborative learning strategies faces persistent challenges, stemming from an overdependence on comparative-effectiveness research and a lack of useful guidelines about how best to adapt collaborative learning methods to given learning contexts. The medical education community has struggled to consistently demonstrate superior outcomes when using collaborative learning methods and strategies. Despite this, support for their use will continue to expand. To select approaches with the greatest utility, instructors must carefully align conditions of the learning context with the learning approaches under consideration. Further, it is critical that modifications are made with caution and that instructors verify that modifications do not impede the desired cognitive activities needed to support meaningful collaborative learning.
Integrating structure-based and ligand-based approaches for computational drug design.
Wilson, Gregory L; Lill, Markus A
2011-04-01
Methods utilized in computer-aided drug design can be classified into two major categories: structure based and ligand based, using information on the structure of the protein or on the biological and physicochemical properties of bound ligands, respectively. In recent years there has been a trend towards integrating these two methods in order to enhance the reliability and efficiency of computer-aided drug-design approaches by combining information from both the ligand and the protein. This trend resulted in a variety of methods that include: pseudoreceptor methods, pharmacophore methods, fingerprint methods and approaches integrating docking with similarity-based methods. In this article, we will describe the concepts behind each method and selected applications.
Kisely, Stephen; Kendall, Elizabeth
2011-08-01
Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.
Baxter, Ruth; Taylor, Natalie; Kellar, Ian; Lawton, Rebecca
2016-01-01
Background The positive deviance approach focuses on those who demonstrate exceptional performance, despite facing the same constraints as others. ‘Positive deviants’ are identified and hypotheses about how they succeed are generated. These hypotheses are tested and then disseminated within the wider community. The positive deviance approach is being increasingly applied within healthcare organisations, although limited guidance exists and different methods, of varying quality, are used. This paper systematically reviews healthcare applications of the positive deviance approach to explore how positive deviance is defined, the quality of existing applications and the methods used within them, including the extent to which staff and patients are involved. Methods Peer-reviewed articles, published prior to September 2014, reporting empirical research on the use of the positive deviance approach within healthcare, were identified from seven electronic databases. A previously defined four-stage process for positive deviance in healthcare was used as the basis for data extraction. Quality assessments were conducted using a validated tool, and a narrative synthesis approach was followed. Results 37 of 818 articles met the inclusion criteria. The positive deviance approach was most frequently applied within North America, in secondary care, and to address healthcare-associated infections. Research predominantly identified positive deviants and generated hypotheses about how they succeeded. The approach and processes followed were poorly defined. Research quality was low, articles lacked detail and comparison groups were rarely included. Applications of positive deviance typically lacked staff and/or patient involvement, and the methods used often required extensive resources. Conclusion Further research is required to develop high quality yet practical methods which involve staff and patients in all stages of the positive deviance approach. The efficacy and efficiency of positive deviance must be assessed and compared with other quality improvement approaches. PROSPERO registration number CRD42014009365. PMID:26590198
ERIC Educational Resources Information Center
Derfoufi, Sanae; Benmoussa, Adnane; El Harti, Jaouad; Ramli, Youssef; Taoufik, Jamal; Chaouir, Souad
2015-01-01
This study investigates the positive impact of the Case Method implemented during a 4-hour tutorial in the "therapeutic chemistry" module. We view the Case Method as one particular approach within the broader spectrum of problem-based or inquiry-based learning approaches. Sixty students were included in data analysis. A pre-test and…
A Reconstruction Approach to High-Order Schemes Including Discontinuous Galerkin for Diffusion
NASA Technical Reports Server (NTRS)
Huynh, H. T.
2009-01-01
We introduce a new approach to high-order accuracy for the numerical solution of diffusion problems by solving the equations in differential form using a reconstruction technique. The approach has the advantages of simplicity and economy. It results in several new high-order methods including a simplified version of discontinuous Galerkin (DG). It also leads to new definitions of common value and common gradient quantities at each interface shared by the two adjacent cells. In addition, the new approach clarifies the relations among the various choices of new and existing common quantities. Fourier stability and accuracy analyses are carried out for the resulting schemes. Extensions to the case of quadrilateral meshes are obtained via tensor products. For the two-point boundary value problem (steady state), it is shown that these schemes, which include most popular DG methods, yield exact common interface quantities as well as exact cell average solutions for nearly all cases.
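For readers unfamiliar with the terminology, one simple and widely used choice for the common quantities at an interface shared by a left cell and a right cell is the arithmetic average of the two one-sided traces; this is only a generic illustration, not the specific definitions derived in the paper:

\[
u^{*} = \tfrac{1}{2}\,(u_L + u_R), \qquad
u_x^{*} = \tfrac{1}{2}\,\big((u_x)_L + (u_x)_R\big).
\]

The reconstruction approach introduced here leads to refined definitions of these common values and gradients and clarifies how the various existing choices relate to one another.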
What approach to brain partial volume correction is best for PET/MRI?
NASA Astrophysics Data System (ADS)
Hutton, B. F.; Thomas, B. A.; Erlandsson, K.; Bousse, A.; Reilhac-Laborde, A.; Kazantsev, D.; Pedemonte, S.; Vunckx, K.; Arridge, S. R.; Ourselin, S.
2013-02-01
Many partial volume correction approaches make use of anatomical information, which is readily available in PET/MRI systems, but it is not clear which approach is best. Seven novel approaches to partial volume correction were evaluated, including several post-reconstruction methods and several reconstruction methods that incorporate anatomical information. These were compared with an MRI-independent approach (reblurred van Cittert) and uncorrected data. Monte Carlo PET data were generated for activity distributions representing both 18F-FDG and amyloid tracer uptake. Post-reconstruction methods provided the best recovery with ideal segmentation but were particularly sensitive to mis-registration. Alternative approaches performed better in maintaining lesion contrast (unseen in MRI) with good noise control. These were also relatively insensitive to mis-registration errors. The choice of method will depend on the specific application and the reliability of segmentation and registration algorithms.
Students' Attitudes toward Statistics across the Disciplines: A Mixed-Methods Approach
ERIC Educational Resources Information Center
Griffith, James D.; Adams, Lea T.; Gu, Lucy L.; Hart, Christian L.; Nichols-Whitehead, Penney
2012-01-01
Students' attitudes toward statistics were investigated using a mixed-methods approach including a discovery-oriented qualitative methodology among 684 undergraduate students across business, criminal justice, and psychology majors where at least one course in statistics was required. Students were asked about their attitudes toward statistics and…
Bayesian Computation for Log-Gaussian Cox Processes: A Comparative Analysis of Methods
Teng, Ming; Nathoo, Farouk S.; Johnson, Timothy D.
2017-01-01
The Log-Gaussian Cox Process is a commonly used model for the analysis of spatial point pattern data. Fitting this model is difficult because of its doubly-stochastic property, i.e., it is a hierarchical combination of a Poisson process at the first level and a Gaussian process at the second level. Various methods have been proposed to estimate such a process, including traditional likelihood-based approaches as well as Bayesian methods. We focus here on Bayesian methods and several approaches that have been considered for model fitting within this framework, including Hamiltonian Monte Carlo, the integrated nested Laplace approximation, and variational Bayes. We consider these approaches and make comparisons with respect to statistical and computational efficiency. These comparisons are made through several simulation studies as well as through two applications, the first examining ecological data and the second involving neuroimaging data. PMID:29200537
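The model being fitted can be stated compactly (standard definition of a log-Gaussian Cox process, not specific to this paper): conditional on a latent Gaussian field, the points follow an inhomogeneous Poisson process whose intensity is the exponentiated field,

\[
\Lambda(s) = \exp\{Z(s)\}, \qquad Z(\cdot) \sim \mathcal{GP}\big(\mu(\cdot), k(\cdot, \cdot)\big), \qquad
X \mid \Lambda \sim \mathrm{PoissonProcess}(\Lambda),
\]

so the number of points falling in a region B is Poisson with mean \(\int_B \Lambda(s)\, ds\). The doubly-stochastic structure referred to in the abstract is exactly this two-level hierarchy.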
State space approach to mixed boundary value problems.
NASA Technical Reports Server (NTRS)
Chen, C. F.; Chen, M. M.
1973-01-01
A state-space procedure for the formulation and solution of mixed boundary value problems is established. This procedure is a natural extension of the method used in initial value problems; however, certain special theorems and rules must be developed. The scope of the applications of the approach includes beam, arch, and axisymmetric shell problems in structural analysis, boundary layer problems in fluid mechanics, and eigenvalue problems for deformable bodies. Many classical methods in these fields developed by Holzer, Prohl, Myklestad, Thomson, Love-Meissner, and others can be either simplified or unified under new light shed by the state-variable approach. A beam problem is included as an illustration.
Survey of methods for soil moisture determination
NASA Technical Reports Server (NTRS)
Schmugge, T. J.; Jackson, T. J.; Mckim, H. L.
1979-01-01
Existing and proposed methods for soil moisture determination are discussed. These include: (1) in situ investigations including gravimetric, nuclear, and electromagnetic techniques; (2) remote sensing approaches that use the reflected solar, thermal infrared, and microwave portions of the electromagnetic spectrum; and (3) soil physics models that track the behavior of water in the soil in response to meteorological inputs (precipitation) and demands (evapotranspiration). The capacities of these approaches to satisfy various user needs for soil moisture information vary from application to application, but a conceptual scheme for merging these approaches into integrated systems to provide soil moisture information is proposed that has the potential for meeting various application requirements.
Analyzing Human-Landscape Interactions: Tools That Integrate
NASA Astrophysics Data System (ADS)
Zvoleff, Alex; An, Li
2014-01-01
Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature—in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches of data fusion and integration across sites or disciplines pose an important challenge for future work in integrating human and landscape components.
Agile methods in biomedical software development: a multi-site experience report.
Kane, David W; Hohman, Moses M; Cerami, Ethan G; McCormick, Michael W; Kuhlmman, Karl F; Byrd, Jeff A
2006-05-30
Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations include academic, commercial and government development teams, and included both bioinformatics and clinical support applications. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods.
NASA Technical Reports Server (NTRS)
Maskew, B.
1976-01-01
A discrete singularity method has been developed for calculating the potential flow around two-dimensional airfoils. The objective was to calculate velocities at any arbitrary point in the flow field, including points that approach the airfoil surface. That objective was achieved and is demonstrated here on a Joukowski airfoil. The method used combined vortices and sources "submerged" a small distance below the airfoil surface and incorporated a near-field subvortex technique developed earlier. When a velocity calculation point approached the airfoil surface, the number of discrete singularities effectively increased (but only locally) to keep the point just outside the error region of the submerged singularity discretization. The method could be extended to three dimensions, and should improve nonlinear methods, which calculate interference effects between multiple wings, and which include the effects of force-free trailing vortex sheets. The capability demonstrated here would extend the scope of such calculations to allow the close approach of wings and vortex sheets (or vortices).
Approaches toward a blue semiconductor laser
NASA Technical Reports Server (NTRS)
Ladany, I.
1989-01-01
Possible approaches for obtaining semiconductor diode laser action in the blue region of the spectrum are surveyed. A discussion of diode lasers is included along with a review of the current status of visible emitters, presently limited to 670 nm. Methods are discussed for shifting laser emission toward shorter wavelengths, including the use of II-VI materials, the increase in the bandgap of III-V materials by addition of nitrogen, and changing the bandstructure from indirect to direct by incorporating interstitial atoms or by constructing superlattices. Non-pn-junction injection methods are surveyed, including avalanche breakdown, Langmuir-Blodgett diodes, heterostructures, carrier accumulation, and Berglund diodes. Prospects of inventing new multinary semiconducting materials are discussed, and a number of novel materials described in the literature are tabulated. New approaches available through the development of quantum wells and superlattices are described, including resonant tunneling and the synthesis of arbitrary bandgap materials through multiple quantum wells.
Forestry sector analysis for developing countries: issues and methods.
R.W. Haynes
1993-01-01
A satellite meeting of the 10th Forestry World Congress focused on the methods used for forest sector analysis and their applications in both developed and developing countries. The results of that meeting are summarized, and a general approach for forest sector modeling is proposed. The approach includes models derived from the existing...
An Empirical Comparison of Heterogeneity Variance Estimators in 12,894 Meta-Analyses
ERIC Educational Resources Information Center
Langan, Dean; Higgins, Julian P. T.; Simmonds, Mark
2015-01-01
Heterogeneity in meta-analysis is most commonly estimated using a moment-based approach described by DerSimonian and Laird. However, this method has been shown to produce biased estimates. Alternative methods to estimate heterogeneity include the restricted maximum likelihood approach and those proposed by Paule and Mandel, Sidik and Jonkman, and…
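For reference, the DerSimonian and Laird moment estimator referred to above has the standard closed form (with k studies, effect estimates y_i, fixed-effect weights w_i = 1/v_i, and Q the Cochran heterogeneity statistic):

\[
\hat{\tau}^2_{\mathrm{DL}} = \max\!\left(0,\; \frac{Q - (k - 1)}{\sum_i w_i - \sum_i w_i^2 \big/ \sum_i w_i}\right),
\qquad Q = \sum_i w_i\,(y_i - \bar{y}_w)^2,
\]

where \(\bar{y}_w\) is the weighted mean of the effect estimates. The alternative estimators compared in the study replace this moment equation with likelihood-based or other weighting schemes.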
The Vibrational Frequencies of CaO2, ScO2, and TiO2: A Comparison of Theoretical Methods
NASA Technical Reports Server (NTRS)
Rosi, Marzio; Bauschlicher, Charles W., Jr.; Chertihin, George V.; Andrews, Lester; Arnold, James O. (Technical Monitor)
1997-01-01
The vibrational frequencies of several states of CaO2, ScO2, and TiO2 are computed using density functional theory (DFT), the Hartree-Fock approach, second-order Moller-Plesset perturbation theory (MP2), and the complete-active-space self-consistent-field theory. Three different functionals are used in the DFT calculations, including two hybrid functionals. The coupled cluster singles and doubles approach including the effect of unlinked triples, determined using perturbation theory, is applied to selected states. The Becke-Perdew 86 functional appears to be the cost-effective method of choice, although even this functional does not perform well for one state of CaO2. The MP2 approach is significantly inferior to the DFT approaches.
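As a rough illustration of setting up a hybrid-functional DFT calculation of the kind compared in this abstract, the following sketch uses the PySCF library; the geometry, basis set, and functional are placeholders rather than the paper's actual settings, and the vibrational frequencies themselves would additionally require a nuclear Hessian calculation:

    # Sketch: single-point hybrid-functional DFT (B3LYP) on a TiO2-like molecule.
    # Geometry, basis, and functional are placeholders, not the paper's setup; a
    # frequency calculation would build on the nuclear Hessian of this wavefunction.
    from pyscf import gto, dft

    mol = gto.M(
        atom="Ti 0.0 0.0 0.0; O 0.0 0.0 1.65; O 0.0 1.43 -0.82",  # placeholder geometry (Angstrom)
        basis="def2-svp",
        spin=0,
    )
    mf = dft.RKS(mol)
    mf.xc = "b3lyp"
    energy = mf.kernel()
    print("B3LYP total energy (Hartree):", energy)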
Electronic measurement of variable torques in precision work technology
NASA Technical Reports Server (NTRS)
Maehr, M.
1978-01-01
Approaches for the determination of torques on the basis of length measurements are discussed. Attention is given to torque determinations in which the deformation of a shaft is measured, an electric measurement of the torsion angle, and an approach proposed by Buschmann (1970). Methods for a torque determination conducted with the aid of force measurements make use of piezoelectric approaches. The components used by these methods include a quartz crystal and a charge amplifier.
Sterilization: A Review and Update.
Moss, Chailee; Isley, Michelle M
2015-12-01
Sterilization is a frequently used method of contraception. Female sterilization is performed 3 times more frequently than male sterilization, and it can be performed immediately postpartum or as an interval procedure. Methods include mechanical occlusion, coagulation, or tubal excision. Female sterilization can be performed using an abdominal approach, or via laparoscopy or hysteroscopy. When an abdominal approach or laparoscopy is used, sterilization occurs immediately. When hysteroscopy is used, tubal occlusion occurs over time, and additional testing is needed to confirm tubal occlusion. Comprehensive counseling about sterilization should include discussion about male sterilization (vasectomy) and long-acting reversible contraceptive methods. Copyright © 2015 Elsevier Inc. All rights reserved.
Advanced Numerical Methods and Software Approaches for Semiconductor Device Simulation
Carey, Graham F.; Pardhanani, A. L.; Bova, S. W.
2000-01-01
In this article we concisely present several modern strategies that are applicable to drift-dominated carrier transport in higher-order deterministic models such as the drift-diffusion, hydrodynamic, and quantum hydrodynamic systems. The approaches include extensions of “upwind” and artificial dissipation schemes, generalization of the traditional Scharfetter-Gummel approach, Petrov-Galerkin and streamline-upwind Petrov-Galerkin (SUPG), “entropy” variables, transformations, least-squares mixed methods and other stabilized Galerkin schemes such as Galerkin least squares and discontinuous Galerkin schemes. The treatment is representative rather than an exhaustive review and several schemes are mentioned only briefly with appropriate reference to the literature. Some of the methods have been applied to the semiconductor device problem while others are still in the early stages of development for this class of applications. We have included numerical examples from our recent research tests with some of the methods. A second aspect of the work deals with algorithms that employ unstructured grids in conjunction with adaptive refinement strategies. The full benefits of such approaches have not yet been developed in this application area and we emphasize the need for further work on analysis, data structures and software to support adaptivity. Finally, we briefly consider some aspects of software frameworks. These include dial-an-operator approaches such as that used in the industrial simulator PROPHET, and object-oriented software support such as those in the SANDIA National Laboratory framework SIERRA.
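For context, the drift-diffusion system that these discretization strategies target couples Poisson's equation for the electrostatic potential with carrier continuity equations (standard form, not taken from the article):

\[
\nabla \cdot (\varepsilon \nabla \psi) = -q\,(p - n + N_D^{+} - N_A^{-}), \qquad
\mathbf{J}_n = q\,\mu_n n\,\mathbf{E} + q\,D_n \nabla n, \qquad
\mathbf{J}_p = q\,\mu_p p\,\mathbf{E} - q\,D_p \nabla p,
\]

with \(\nabla \cdot \mathbf{J}_n = qR\) and \(\nabla \cdot \mathbf{J}_p = -qR\) in steady state, where R is the net recombination rate. The stiffness of this coupled system in the drift-dominated regime is what motivates the upwind, Scharfetter-Gummel, and stabilized Galerkin schemes surveyed.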
NASA Astrophysics Data System (ADS)
Liu, Shun; Xu, Jinglei; Yu, Kaikai
2017-06-01
This paper proposes an improved approach for extraction of pressure fields from velocity data, such as obtained by particle image velocimetry (PIV), especially for steady compressible flows with strong shocks. The principle of this approach is derived from the Navier-Stokes equations, assuming an adiabatic condition and neglecting viscosity at the boundaries of the flow field measured by PIV. The computing method is based on MacCormack's technique in computational fluid dynamics. Thus, this approach is called the MacCormack method. Moreover, the MacCormack method is compared with several approaches proposed in previous literature, including the isentropic method, spatial integration, and the Poisson method. The effects of velocity error level and PIV spatial resolution on these approaches are also quantified by using artificial velocity data containing shock waves. The results demonstrate that the MacCormack method has higher reconstruction accuracy than the other approaches, and its advantages become more remarkable with shock strengthening. Furthermore, the performance of the MacCormack method is also validated by using synthetic PIV images with an oblique shock wave, confirming the feasibility and advantage of this approach in real PIV experiments. This work is highly significant for studies in aerospace engineering, especially the outer flow fields of supersonic aircraft and the internal flow fields of ramjets.
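The common starting point of the extraction approaches compared here is the momentum balance evaluated from the measured velocity field; for a steady flow with viscosity neglected, as assumed above, the pressure gradient follows directly from the velocity data (standard form, shown for context):

\[
\nabla p = -\rho\,(\mathbf{u} \cdot \nabla)\,\mathbf{u},
\]

with the local density \(\rho\) supplied by a thermodynamic relation (here the adiabatic assumption) in the compressible case. The approaches then differ in how this gradient information is integrated over the PIV domain, for example by direct spatial integration or by solving a pressure Poisson equation.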
Monte Carlo approaches to sampling forested tracts with lines or points
Harry T. Valentine; Jeffrey H. Gove; Timothy G. Gregoire
2001-01-01
Several line- and point-based sampling methods can be employed to estimate the aggregate dimensions of trees standing on a forested tract or pieces of coarse woody debris lying on the forest floor. Line methods include line intersect sampling, horizontal line sampling, and transect relascope sampling; point methods include variable- and fixed-radius plot sampling, and...
System and method for determining stability of a neural system
NASA Technical Reports Server (NTRS)
Curtis, Steven A. (Inventor)
2011-01-01
Disclosed are methods, systems, and computer-readable media for determining stability of a neural system. The method includes tracking a function world line of an N-element neural system within at least one behavioral space, determining whether the tracked function world line is approaching a psychological stability surface, and implementing a quantitative solution that corrects instability if the tracked function world line is approaching the psychological stability surface.
Case Study Research Methodology in Nursing Research.
Cope, Diane G
2015-11-01
Through data collection methods using a holistic approach that focuses on variables in a natural setting, qualitative research methods seek to understand participants' perceptions and interpretations. Common qualitative research methods include ethnography, phenomenology, grounded theory, and historic research. Another type of methodology that has a similar qualitative approach is case study research, which seeks to understand a phenomenon or case from multiple perspectives within a given real-world context.
[Alternatives to animal testing].
Fabre, Isabelle
2009-11-01
The use of alternative methods to animal testing is an integral part of the 3Rs concept (refine, reduce, replace) defined by Russell & Burch in 1959. These approaches include in silico methods (databases and computer models), in vitro physicochemical analysis, biological methods using bacteria or isolated cells, reconstructed enzyme systems, and reconstructed tissues. Emerging "omic" methods used in integrated approaches further help to reduce animal use, while stem cells offer promising approaches to toxicologic and pathophysiologic studies, along with organotypic cultures and bio-artificial organs. Only a few alternative methods can so far be used in stand-alone tests as substitutes for animal testing. The best way to use these methods is to integrate them into tiered testing strategies (ITS), in which animals are only used as a last resort.
A review of numerical techniques approaching microstructures of crystalline rocks
NASA Astrophysics Data System (ADS)
Zhang, Yahui; Wong, Louis Ngai Yuen
2018-06-01
The macro-mechanical behavior of crystalline rocks, including strength, deformability, and failure pattern, is dominantly influenced by their grain-scale structures. Numerical techniques are commonly used to assist in understanding the complicated mechanisms from a microscopic perspective. Each numerical method has its respective strengths and limitations. This review paper elucidates how numerical techniques take geometrical aspects of the grain into consideration. Four categories of numerical methods are examined: particle-based methods, block-based methods, grain-based methods, and node-based methods. Focusing on grain-scale characteristics, specific relevant issues, including the increasing complexity of micro-structure, deformation and breakage of model elements, and the fracturing and fragmentation process, are described in more detail. Therefore, the intrinsic capabilities and limitations of different numerical approaches in terms of accounting for the micro-mechanics of crystalline rocks and their phenomenological mechanical behavior are explicitly presented.
Inquiring into the Real: A Realist Phenomenological Approach
ERIC Educational Resources Information Center
Budd, John M.; Hill, Heather; Shannon, Brooke
2010-01-01
The need for postpositivist or antipositivist methods in the social sciences, including library and information science, is well documented. A promising alternative synthesizes critical realism and phenomenology. This method embraces ontological reality in all things, including human and social action. The ontology underlying the realist…
A geologic approach to field methods in fluvial geomorphology
Fitzpatrick, Faith A.; Thornbush, Mary J; Allen, Casey D; Fitzpatrick, Faith A.
2014-01-01
A geologic approach to field methods in fluvial geomorphology is useful for understanding causes and consequences of past, present, and possible future perturbations in river behavior and floodplain dynamics. Field methods include characterizing river planform and morphology changes and floodplain sedimentary sequences over long periods of time along a longitudinal river continuum. Techniques include topographic and bathymetric surveying of fluvial landforms in valley bottoms and describing floodplain sedimentary sequences through coring, trenching, and examining pits and exposures. Historical sediment budgets that include floodplain sedimentary records can characterize past and present sources and sinks of sediment along a longitudinal river continuum. Describing paleochannels and floodplain vertical accretion deposits, estimating long-term sedimentation rates, and constructing historical sediment budgets can assist in management of aquatic resources, habitat, sedimentation, and flooding issues.
How effects on health equity are assessed in systematic reviews of interventions.
Welch, Vivian; Tugwell, Peter; Petticrew, Mark; de Montigny, Joanne; Ueffing, Erin; Kristjansson, Betsy; McGowan, Jessie; Benkhalti Jandu, Maria; Wells, George A; Brand, Kevin; Smylie, Janet
2010-12-08
Enhancing health equity has now achieved international political importance with endorsement from the World Health Assembly in 2009. The failure of systematic reviews to consider effects on health equity is cited by decision-makers as a limitation to their ability to inform policy and program decisions. To systematically review methods to assess effects on health equity in systematic reviews of effectiveness. We searched the following databases up to July 2, 2010: MEDLINE, PsycINFO, the Cochrane Methodology Register, CINAHL, Education Resources Information Center, Education Abstracts, Criminal Justice Abstracts, Index to Legal Periodicals, PAIS International, Social Services Abstracts, Sociological Abstracts, Digital Dissertations and the Health Technology Assessment Database. We searched SCOPUS on October 7, 2010 to identify articles that cited any of the included studies. We included empirical studies of cohorts of systematic reviews that assessed methods for measuring effects on health inequalities. Data were extracted using a pre-tested form by two independent reviewers. Risk of bias was appraised for included studies according to the potential for bias in selection and detection of systematic reviews. Thirty-four methodological studies were included. The methods used by these included studies were: 1) targeted approaches (n=22); 2) gap approaches (n=12); and 3) the gradient approach (n=1). Gender or sex was assessed in eight out of 34 studies, socioeconomic status in ten studies, race/ethnicity in seven studies, age in seven studies, low- and middle-income countries in 14 studies, and two studies assessed multiple factors across which health inequity may exist. Only three studies provided a definition of health equity. Four methodological approaches to assessing effects on health equity were identified: 1) descriptive assessment of reporting and analysis in systematic reviews (all 34 studies used a type of descriptive method); 2) descriptive assessment of reporting and analysis in original trials (12/34 studies); 3) analytic approaches (10/34 studies); and 4) applicability assessment (11/34 studies). Neither analytic nor applicability approaches were reported transparently or in sufficient detail to judge their credibility. There is a need for improvement in conceptual clarity about the definition of health equity, sufficient detail in describing analytic approaches (including subgroup analyses), and transparent reporting of the judgments required for applicability assessments in order to assess and report effects on health equity in systematic reviews.
NASA Astrophysics Data System (ADS)
Ortleb, Sigrun; Seidel, Christian
2017-07-01
In this second symposium at the limits of experimental and numerical methods, recent research is presented on practically relevant problems. Presentations discuss experimental investigation as well as numerical methods with a strong focus on application. In addition, problems are identified which require a hybrid experimental-numerical approach. Topics include fast explicit diffusion applied to a geothermal energy storage tank, noise in experimental measurements of electrical quantities, thermal fluid structure interaction, tensegrity structures, experimental and numerical methods for Chladni figures, optimized construction of hydroelectric power stations, experimental and numerical limits in the investigation of rain-wind induced vibrations as well as the application of exponential integrators in a domain-based IMEX setting.
ERIC Educational Resources Information Center
Pliske, Rebecca M.; Caldwell, Tracy L.; Calin-Jageman, Robert J.; Taylor-Ritzler, Tina
2015-01-01
We developed a two-semester series of intensive (six contact hours per week) behavioral research methods courses with an integrated statistics curriculum. Our approach includes the use of team-based learning, authentic projects, and Excel and SPSS. We assessed the effectiveness of our approach by examining our students' content area scores on the…
ERIC Educational Resources Information Center
Bieg, Madeleine; Goetz, Thomas; Sticca, Fabio; Brunner, Esther; Becker, Eva; Morger, Vinzenz; Hubbard, Kyle
2017-01-01
Various theoretical approaches propose that emotions in the classroom are elicited by appraisal antecedents, with subjective experiences of control playing a crucial role in this context. Perceptions of control, in turn, are expected to be influenced by the classroom social environment, which can include the teaching methods being employed (e.g.,…
A modeling approach to compare ΣPCB concentrations between congener-specific analyses
Gibson, Polly P.; Mills, Marc A.; Kraus, Johanna M.; Walters, David M.
2017-01-01
Changes in analytical methods over time pose problems for assessing long-term trends in environmental contamination by polychlorinated biphenyls (PCBs). Congener-specific analyses vary widely in the number and identity of the 209 distinct PCB chemical configurations (congeners) that are quantified, leading to inconsistencies among summed PCB concentrations (ΣPCB) reported by different studies. Here we present a modeling approach using linear regression to compare ΣPCB concentrations derived from different congener-specific analyses measuring different co-eluting groups. The approach can be used to develop a specific conversion model between any two sets of congener-specific analytical data from similar samples (similar matrix and geographic origin). We demonstrate the method by developing a conversion model for an example data set that includes data from two different analytical methods, a low resolution method quantifying 119 congeners and a high resolution method quantifying all 209 congeners. We used the model to show that the 119-congener set captured most (93%) of the total PCB concentration (i.e., Σ209PCB) in sediment and biological samples. ΣPCB concentrations estimated using the model closely matched measured values (mean relative percent difference = 9.6). General applications of the modeling approach include (a) generating comparable ΣPCB concentrations for samples that were analyzed for different congener sets; and (b) estimating the proportional contribution of different congener sets to ΣPCB. This approach may be especially valuable for enabling comparison of long-term remediation monitoring results even as analytical methods change over time.
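To make the conversion idea above concrete, the short sketch below fits a linear model between ΣPCB totals computed from two congener sets on paired samples and applies it to new measurements. It assumes numpy is available; the data values and the proportion calculation are invented for illustration and do not reproduce the study's congener groupings or regression details.

```python
import numpy as np

# Hypothetical paired samples: ΣPCB (ng/g) from a 119-congener method (x)
# and from a 209-congener method (y), measured on the same samples.
sum_pcb_119 = np.array([12.1, 30.5, 55.2, 80.9, 140.3])
sum_pcb_209 = np.array([13.0, 32.8, 59.6, 86.1, 151.7])

# Fit a simple linear conversion model y = a * x + b by least squares.
a, b = np.polyfit(sum_pcb_119, sum_pcb_209, deg=1)
print(f"conversion model: sum209 ~ {a:.3f} * sum119 + {b:.3f}")

# Apply the model to report 119-congener results on the 209-congener scale.
new_sum_119 = np.array([25.0, 100.0])
print("estimated sum209:", a * new_sum_119 + b)

# Proportional contribution of the smaller congener set to the full total,
# analogous to the percentage-capture figure discussed in the text.
print("mean proportion:", float(np.mean(sum_pcb_119 / sum_pcb_209)))
```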
A non-iterative extension of the multivariate random effects meta-analysis.
Makambi, Kepher H; Seung, Hyunuk
2015-01-01
Multivariate methods in meta-analysis are becoming popular and more accepted in biomedical research despite computational issues in some of the techniques. A number of approaches, both iterative and non-iterative, have been proposed, including the multivariate DerSimonian and Laird method by Jackson et al. (2010), which is non-iterative. In this study, we propose an extension of the method by Hartung and Makambi (2002) and Makambi (2001) to multivariate situations. A comparison of the bias and mean square error from a simulation study indicates that, in some circumstances, the proposed approach performs better than the multivariate DerSimonian-Laird approach. An example is presented to demonstrate the application of the proposed approach.
Interdisciplinary research on patient-provider communication: a cross-method comparison.
Chou, Wen-Ying Sylvia; Han, Paul; Pilsner, Alison; Coa, Kisha; Greenberg, Larrie; Blatt, Benjamin
2011-01-01
Patient-provider communication, a key aspect of healthcare delivery, has been assessed through multiple methods for purposes of research, education, and quality control. Common techniques include satisfaction ratings and quantitatively- and qualitatively-oriented direct observations. Identifying the strengths and weaknesses of different approaches is critically important in determining the appropriate assessment method for a specific research or practical goal. Analyzing ten videotaped simulated encounters between medical students and Standardized Patients (SPs), this study compared three existing assessment methods through the same data set. Methods included: (1) dichotomized SP ratings on students' communication skills; (2) Roter Interaction Analysis System (RIAS) analysis; and (3) inductive discourse analysis informed by sociolinguistic theories. The large dichotomous contrast between good and poor ratings in (1) was not evidenced in any of the other methods. Following a discussion of strengths and weaknesses of each approach, we pilot-tested a combined assessment done by coders blinded to results of (1)-(3). This type of integrative approach has the potential of adding a quantifiable dimension to qualitative, discourse-based observations. Subjecting the same data set to separate analytic methods provides an excellent opportunity for methodological comparisons with the goal of informing future assessment of clinical encounters.
Melendez-Torres, G J; Grant, Sean; Bonell, Chris
2015-12-01
Reciprocal translation, the understanding of one study's findings in terms of another's, is the foundation of most qualitative metasynthetic methods. In light of the proliferation of metasynthesis methods, the current review sought to create a taxonomy of operations of reciprocal translation using recently published qualitative metasyntheses. On 19 August 2013, MEDLINE, Embase and PsycINFO were searched. Included articles were full reports of metasyntheses of qualitative studies published in 2012 in English-language peer-reviewed journals. Two reviewers, working independently, screened records, assessed full texts for inclusion and extracted data on methods from each included metasynthesis. Systematic review methods used were summarised, and metasynthetic methods were inductively analysed to develop the taxonomy. Of 61 included metasyntheses, 21 (34%) reported fully replicable search strategies and 51 (84%) critically appraised included studies. Based on methods in these metasyntheses, we developed a taxonomy of reciprocal translation with four overlapping categories: visual representation; key paper integration; data reduction and thematic extraction; and line-by-line coding. This systematic review presents an update on methods and reporting currently used in qualitative metasynthesis. It also goes beyond the proliferation of approaches to offer a parsimonious approach to understanding how reciprocal translations are accomplished across metasynthesis methods. Copyright © 2015 John Wiley & Sons, Ltd.
Prediction of a service demand using combined forecasting approach
NASA Astrophysics Data System (ADS)
Zhou, Ling
2017-08-01
Forecasting facilitates cutting operational and management costs while ensuring service levels for a logistics service provider. Our case study investigates how to forecast short-term logistics demand for an LTL (less-than-truckload) carrier. A combined approach relies on several forecasting methods simultaneously instead of a single method; it can offset the weaknesses of one method with the strengths of another, improving prediction accuracy. The main issues in combined forecast modeling are how to select the methods to combine and how to determine the weight coefficients among them. The selection principles are that each method should be applicable to the forecasting problem itself and that the methods should differ from each other in character as much as possible. Based on these principles, exponential smoothing, ARIMA, and a neural network are chosen to form the combined approach. A least-squares technique is then employed to determine the optimal weight coefficients among the forecasting methods. Simulation results show the advantage of the combined approach over the three single methods. The work helps managers select prediction methods in practice.
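The weight-fitting step described above lends itself to a brief sketch. Assuming historical forecasts from the three component methods are available alongside actual demand (all values below are hypothetical), ordinary least squares yields the combination weights, which are then applied to new forecasts. The original paper may additionally constrain the weights (for example, to sum to one); that refinement is omitted here.

```python
import numpy as np

# Hypothetical historical data: actual demand and the forecasts produced by
# three component methods (exponential smoothing, ARIMA, neural network).
actual = np.array([105.0, 98.0, 120.0, 130.0, 125.0, 140.0])
forecasts = np.column_stack([
    [102.0, 101.0, 115.0, 128.0, 122.0, 138.0],   # exponential smoothing
    [107.0,  96.0, 123.0, 133.0, 127.0, 141.0],   # ARIMA
    [104.0,  99.0, 118.0, 129.0, 124.0, 139.0],   # neural network
])

# Least-squares weights w minimizing ||forecasts @ w - actual||^2.
weights, *_ = np.linalg.lstsq(forecasts, actual, rcond=None)
print("combination weights:", weights)

# Combined forecast for a new period, given the three individual forecasts.
new_individual = np.array([135.0, 142.0, 138.0])
print("combined forecast:", float(new_individual @ weights))
```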
Staying Alive: Problems of Survival.
ERIC Educational Resources Information Center
Stalheim, Bill
1990-01-01
Presented is an approach to the teaching of biological diversity using the theme of survival. Teaching methods for this approach and the advantages of its use are discussed. A suggested course outline is included. (CW)
Norbash, Alexander
2017-06-01
To suggest a methodical approach for refining transitional management abilities, including empowerment of a growing leader, leading in an unfamiliar organization, or leading in an organization that is changing. Management approaches drawn from the body of work on leadership studies, transitions, leadership during times of transition, and change management were consolidated and categorized. Transitional leaders can benefit from effective leadership training, including defining and prospectively accruing necessary experiences and skills; strengthening information gathering skills; effectively self-assessing; valuing and implementing mentoring; formulating strategy; and communicating. A categorical approach to transitional leadership may be implemented through a systems-based and methodical approach to gaining the definable and distinct sets of skills and abilities necessary for transitional leadership success. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Willke, Richard J; Zheng, Zhiyuan; Subedi, Prasun; Althin, Rikard; Mullins, C Daniel
2012-12-13
Implicit in the growing interest in patient-centered outcomes research is a growing need for better evidence regarding how responses to a given intervention or treatment may vary across patients, referred to as heterogeneity of treatment effect (HTE). A variety of methods are available for exploring HTE, each associated with unique strengths and limitations. This paper reviews a selected set of methodological approaches to understanding HTE, focusing largely but not exclusively on their uses with randomized trial data. It is oriented for the "intermediate" outcomes researcher, who may already be familiar with some methods, but would value a systematic overview of both more and less familiar methods with attention to when and why they may be used. Drawing from the biomedical, statistical, epidemiological and econometrics literature, we describe the steps involved in choosing an HTE approach, focusing on whether the intent of the analysis is for exploratory, initial testing, or confirmatory testing purposes. We also map HTE methodological approaches to data considerations as well as the strengths and limitations of each approach. Methods reviewed include formal subgroup analysis, meta-analysis and meta-regression, various types of predictive risk modeling including classification and regression tree analysis, series of n-of-1 trials, latent growth and growth mixture models, quantile regression, and selected non-parametric methods. In addition to an overview of each HTE method, examples and references are provided for further reading. By guiding the selection of the methods and analysis, this review is meant to better enable outcomes researchers to understand and explore aspects of HTE in the context of patient-centered outcomes research.
Towards a Viscous Wall Model for Immersed Boundary Methods
NASA Technical Reports Server (NTRS)
Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.
2016-01-01
Immersed boundary methods are frequently employed for simulating flows at low Reynolds numbers or for applications where viscous boundary layer effects can be neglected. The primary shortcoming of Cartesian mesh immersed boundary methods is their inability to efficiently resolve thin turbulent boundary layers in high-Reynolds-number flow applications. This inefficiency stems from the use of constant-aspect-ratio Cartesian grid cells, whereas conventional CFD approaches can efficiently resolve the large wall-normal gradients by utilizing large-aspect-ratio cells near the wall. This paper presents different approaches for immersed boundary methods to account for the interaction of the viscous boundary layer with the flow field away from the walls. Wall-modeling approaches proposed in previous research studies are addressed and compared to a new integral boundary layer based approach. In contrast to common wall-modeling approaches that usually utilize only local flow information, the integral boundary layer based approach keeps the streamwise history of the boundary layer. This allows the method to remain effective at much larger y+ values than local wall-modeling approaches. After a theoretical discussion of the different approaches, the method is applied to increasingly challenging flow fields including fully attached, separated, and shock-induced separated (laminar and turbulent) flows.
Approaches in Health Human Resource Forecasting: A Roadmap for Improvement
Rafiei, Sima; Mohebbifar, Rafat; Hashemi, Fariba; Ezzatabadi, Mohammad Ranjbar; Farzianpour, Fereshteh
2016-01-01
Introduction: Forecasting the demand and supply of health manpower in an accurate manner makes appropriate planning possible. The aim of this paper was to review approaches and methods for health manpower forecasting and consequently propose the features that improve the effectiveness of this important process of health manpower planning. Methods: A literature review was conducted for studies published in English from 1990–2014 using PubMed, Science Direct, ProQuest, and Google Scholar databases. Review articles, qualitative studies, and retrospective and prospective studies describing or applying various types of forecasting approaches and methods in health manpower forecasting were included in the review. The authors designed an extraction data sheet based on the study questions to collect data on studies' references, designs, and types of forecasting approaches, whether discussed or applied, with their strengths and weaknesses. Results: Forty studies were included in the review. Two main categories of approaches (conceptual and analytical) for health manpower forecasting were identified. Each approach had several strengths and weaknesses. As a whole, most of them were faced with challenges such as being static and unable to capture dynamic variables and causal relationships in manpower forecasting. They also lacked the capacity to benefit from scenario making to assist policy makers in effective decision making. Conclusions: An effective forecasting approach should resolve the deficits of current approaches and meet the key features found in the literature in order to develop the open, dynamic, and comprehensive method necessary for today's complex health care systems. PMID:27790343
Zhang, Zhe; Schindler, Christina E. M.; Lange, Oliver F.; Zacharias, Martin
2015-01-01
The high-resolution refinement of docked protein-protein complexes can provide valuable structural and mechanistic insight into protein complex formation complementing experiment. Monte Carlo (MC) based approaches are frequently applied to sample putative interaction geometries of proteins including also possible conformational changes of the binding partners. In order to explore efficiency improvements of the MC sampling, several enhanced sampling techniques, including temperature or Hamiltonian replica exchange and well-tempered ensemble approaches, have been combined with the MC method and were evaluated on 20 protein complexes using unbound partner structures. The well-tempered ensemble method combined with a 2-dimensional temperature and Hamiltonian replica exchange scheme (WTE-H-REMC) was identified as the most efficient search strategy. Comparison with prolonged MC searches indicates that the WTE-H-REMC approach requires approximately 5 times fewer MC steps to identify near native docking geometries compared to conventional MC searches. PMID:26053419
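As a small aside on the temperature replica-exchange ingredient mentioned above, the standard Metropolis swap criterion between two replicas takes only a few lines. This is a generic parallel-tempering sketch with invented energies and temperatures, not the WTE-H-REMC protocol itself.

```python
import math
import random

def accept_temperature_swap(energy_i, temp_i, energy_j, temp_j, k_b=1.0):
    """Metropolis criterion for exchanging configurations between two replicas
    at different temperatures (parallel tempering / temperature replica exchange)."""
    beta_i, beta_j = 1.0 / (k_b * temp_i), 1.0 / (k_b * temp_j)
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    return delta >= 0.0 or random.random() < math.exp(delta)

# Example: a hot replica holding a lower-energy configuration than the cold one
# is very likely to be swapped, letting favorable geometries reach low temperature.
print(accept_temperature_swap(energy_i=-120.0, temp_i=300.0,
                              energy_j=-150.0, temp_j=600.0))
```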
Consistent approach to describing aircraft HIRF protection
NASA Technical Reports Server (NTRS)
Rimbey, P. R.; Walen, D. B.
1995-01-01
The high intensity radiated fields (HIRF) certification process as currently implemented comprises an inconsistent combination of factors that tend to emphasize worst-case scenarios in assessing commercial airplane certification requirements. By examining these factors, which include the process definition, the external HIRF environment, the aircraft coupling and corresponding internal fields, and methods of measuring equipment susceptibilities, activities leading to an approach for appraising airplane vulnerability to HIRF are proposed. This approach utilizes technically based criteria to evaluate the nature of the threat, including the probability of encountering the external HIRF environment. No single test or analytic method comprehensively addresses the full HIRF threat frequency spectrum. Additional tools such as statistical methods must be adopted to arrive at more realistic requirements that reflect commercial aircraft vulnerability to the HIRF threat. Test and analytic data are provided to support the conclusions of this report. This work was performed under NASA contract NAS1-19360, Task 52.
The Effect of Laminar Flow on Rotor Hover Performance
NASA Technical Reports Server (NTRS)
Overmeyer, Austin D.; Martin, Preston B.
2017-01-01
The topic of laminar flow effects on hover performance is introduced with respect to some historical efforts where laminar flow was either measured or attempted. An analysis method is outlined using a combined blade element-momentum method coupled to an airfoil analysis method that includes the full e(sup N) transition model. The analysis results compared well with the measured hover performance, including the measured location of transition on both the upper and lower blade surfaces. The analysis method is then used to understand the upper limits of hover efficiency as a function of disk loading. The impact of laminar flow is higher at low disk loading, but significant improvement in terms of power loading appears possible even at high disk loadings approaching 20 psf. An optimum planform design equation is derived for cases of zero profile drag and for finite drag levels. These results are intended to be a guide for design studies and as a benchmark against which to compare higher fidelity analysis results. The details of the analysis method are given to enable other researchers to use the same approach for comparison to other approaches.
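For context on the disk-loading trend noted above, simple momentum theory relates ideal hover power loading to disk loading; the sketch below evaluates that relation for an assumed figure of merit. The values are illustrative and unrelated to the paper's blade-element analysis.

```python
import math

RHO = 1.225  # sea-level air density, kg/m^3

def power_loading(disk_loading_n_m2: float, figure_of_merit: float = 0.75) -> float:
    """Hover power loading T/P (N/W) from momentum theory:
    P_ideal = T * sqrt(DL / (2 rho)),  P = P_ideal / FM."""
    return figure_of_merit * math.sqrt(2.0 * RHO / disk_loading_n_m2)

for dl in (200.0, 500.0, 1000.0):   # disk loading in N/m^2
    print(f"DL = {dl:6.0f} N/m^2 -> T/P = {power_loading(dl):.4f} N/W")
```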
Wang, Wei; Song, Wei-Guo; Liu, Shi-Xing; Zhang, Yong-Ming; Zheng, Hong-Yang; Tian, Wei
2011-04-01
An improved cloud-detection method combining K-means clustering and a multi-spectral threshold approach is described. On the basis of landmark spectrum analysis, MODIS data are first categorized into two major classes by the K-means method. The first class includes clouds, smoke, and snow; the second includes vegetation, water, and land. A multi-spectral threshold detection is then applied to the first class to eliminate interference such as smoke and snow. The method is tested on MODIS data acquired at different times under different underlying surface conditions. Visual inspection of the results shows that the algorithm can effectively detect small areas of cloud pixels and exclude the interference of the underlying surface, which provides a good foundation for a subsequent fire detection approach.
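A minimal sketch of the two-step idea, assuming scikit-learn is available, is given below. The band indices, cluster-selection rule, and threshold values are placeholders; the paper's actual MODIS bands and thresholds are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical MODIS-like reflectance/brightness values, flattened to
# (n_pixels, n_bands); random data stands in for a real granule.
rng = np.random.default_rng(0)
pixels = rng.random((10000, 4))

# Step 1: coarse K-means split into two spectral classes
# ("cloud-like" vs. "surface-like").
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
labels = kmeans.labels_

# Assume the cloud-like cluster is the one with higher mean visible reflectance.
cloud_like = int(np.argmax(kmeans.cluster_centers_[:, 0]))

# Step 2: multi-spectral thresholds applied only to the cloud-like class to
# reject snow and smoke (illustrative conditions in normalized units).
visible, nir, swir, tir = pixels.T
cloud_mask = (
    (labels == cloud_like)
    & (visible > 0.6)           # bright in the visible
    & (tir < 0.4)               # cold in the thermal channel
    & ((visible - swir) < 0.3)  # not strongly snow-like (NDSI-style test)
)
print("cloud pixels:", int(cloud_mask.sum()))
```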
Spacelike matching to null infinity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zenginoglu, Anil; Tiglio, Manuel
2009-07-15
We present two methods to include the asymptotic domain of a background spacetime in null directions for numerical solutions of evolution equations so that both the radiation extraction problem and the outer boundary problem are solved. The first method is based on the geometric conformal approach; the second is a coordinate-based approach. We apply these methods to the case of a massless scalar wave equation on a Kerr spacetime. Our methods are designed to allow existing codes to reach the radiative zone by including future null infinity in the computational domain with relatively minor modifications. We demonstrate the flexibility of the methods by considering both Boyer-Lindquist and ingoing Kerr coordinates near the black hole. We also numerically confirm, for the first time, predictions due to Hod concerning tail decay rates for scalar fields at null infinity in Kerr spacetime.
A Comparison of Trajectory Optimization Methods for the Impulsive Minimum Fuel Rendezvous Problem
NASA Technical Reports Server (NTRS)
Hughes, Steven P.; Mailhe, Laurie M.; Guzman, Jose J.
2003-01-01
In this paper we present a comparison of trajectory optimization approaches for the minimum-fuel rendezvous problem. Both indirect and direct methods are compared for a variety of test cases. The indirect approach is based on primer vector theory. The direct approaches are implemented numerically and include Sequential Quadratic Programming (SQP), quasi-Newton, and Nelder-Mead simplex methods. Several cost function parameterizations are considered for the direct approach. We choose one direct approach that appears to be the most flexible. Both the direct and indirect methods are applied to a variety of test cases chosen to demonstrate the performance of each method in different flight regimes. The first test case is a simple circular-to-circular coplanar rendezvous. The second test case is an elliptic-to-elliptic line-of-apsides rotation. The final test case is an orbit phasing maneuver sequence in a highly elliptic orbit. For each test case we present a comparison of the performance of all methods considered in this paper.
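For the first test case (circular-to-circular coplanar rendezvous), the two-impulse Hohmann transfer provides a convenient analytic benchmark against which optimized solutions can be checked. The sketch below computes its total delta-v from the vis-viva relation; the gravitational parameter and radii are illustrative values, not taken from the paper.

```python
import math

MU_EARTH = 398600.4418  # km^3/s^2

def hohmann_delta_v(r1_km: float, r2_km: float, mu: float = MU_EARTH) -> float:
    """Total delta-v (km/s) for a two-impulse Hohmann transfer between
    coplanar circular orbits of radii r1 and r2."""
    v1 = math.sqrt(mu / r1_km)                  # initial circular speed
    v2 = math.sqrt(mu / r2_km)                  # final circular speed
    a_t = 0.5 * (r1_km + r2_km)                 # transfer orbit semi-major axis
    dv1 = abs(math.sqrt(mu * (2.0 / r1_km - 1.0 / a_t)) - v1)  # first burn
    dv2 = abs(v2 - math.sqrt(mu * (2.0 / r2_km - 1.0 / a_t)))  # second burn
    return dv1 + dv2

# Example: LEO (6678 km) to a slightly higher circular orbit (7178 km).
print(f"total delta-v: {hohmann_delta_v(6678.0, 7178.0):.4f} km/s")
```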
Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method
NASA Astrophysics Data System (ADS)
Lee, G.; Jun, K. S.; Chung, E.-S.
2015-04-01
This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with data fuzzification to quantify spatial flood vulnerability considering multiple criteria. In general, a GDM method is an effective tool for formulating a compromise solution that involves various decision makers, since stakeholders may have different perspectives on flood risk and vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-based method can be used to obtain a solution that is closest to the ideal according to all established criteria. By combining the GDM approach and the fuzzy VIKOR method, the framework can propose compromise decisions. The spatial flood vulnerability of the southern Han River obtained using the GDM approach combined with the fuzzy VIKOR method was compared with the vulnerability obtained using general MCDM methods, such as fuzzy TOPSIS, and classical GDM methods (i.e., Borda, Condorcet, and Copeland). As a result, the proposed fuzzy GDM approach can reduce the uncertainty in data confidence and weight derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
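As a rough illustration of the VIKOR ranking step, here in its crisp (non-fuzzy) form and without the group-decision and data-fuzzification layers described above, the sketch below computes the S, R, and Q measures for a small hypothetical set of spatial units and criteria. The weights, scores, and strategy parameter v are invented for illustration.

```python
import numpy as np

# Hypothetical decision matrix: rows = spatial units, columns = flood
# vulnerability criteria (higher value = more desirable here).
scores = np.array([
    [0.7, 0.4, 0.9],
    [0.5, 0.8, 0.6],
    [0.9, 0.3, 0.4],
    [0.6, 0.6, 0.7],
])
weights = np.array([0.5, 0.3, 0.2])  # criterion weights (sum to 1)
v = 0.5                              # weight of the "group utility" strategy

f_best = scores.max(axis=0)   # ideal value per criterion
f_worst = scores.min(axis=0)  # anti-ideal value per criterion

# Weighted, normalized distance of each alternative from the ideal.
d = weights * (f_best - scores) / (f_best - f_worst)
S = d.sum(axis=1)   # group utility measure
R = d.max(axis=1)   # individual regret measure

Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))

# Lower Q means closer to the compromise (near-ideal) solution.
print("ranking (best first):", np.argsort(Q))
```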
ERIC Educational Resources Information Center
Perera, Indika
2010-01-01
ICT (information and communication technologies) add enormous approaches to utilize computing into users' daily lives. Every aspect of social needs has been touched by ICT, including learning. VL (virtual learning), with the life span of slightly above a decade, still looks for possible approaches to enhance its functions with significant pressure…
ERIC Educational Resources Information Center
Konstantinidis-Pereira, Alicja
2018-01-01
This paper summarises a new method of grouping postgraduate taught (PGT) courses introduced at Oxford Brookes University as a part of a Portfolio Review. Instead of classifying courses by subject, the new cluster approach uses statistical methods to group the courses based on factors including flexibility of study options, level of specialisation,…
ERIC Educational Resources Information Center
Mundia, Lawrence
2012-01-01
This mixed-methods study incorporated elements of survey, case study and action research approaches in investigating an at-risk child. Using an in-take interview, a diagnostic test, an error analysis, and a think-aloud clinical interview, the study identified the child's major presenting difficulties. These included: inability to use the four…
ERIC Educational Resources Information Center
Oraif, Iman M.
2016-01-01
The aim of this paper is to describe the different approaches applied to teaching writing in the L2 context and the way these different methods have been established so far. The perspectives include a product approach, genre approach and process approach. Each has its own merits and objectives for application. Regarding the study context, it may…
Political Science, The Judicial Process, and A Legal Education
ERIC Educational Resources Information Center
Funston, Richard
1975-01-01
Application of the behavioral approach to the study of the judicial process is examined including methodological approaches used, typical findings, and "behavioralists'" rejection of the case method of studying law. The author concludes that the behavioral approach to the study of judicial politics has not been substantially productive. (JT)
A Comparison of Two Methods for Boolean Query Relevancy Feedback.
ERIC Educational Resources Information Center
Salton, G.; And Others
1984-01-01
Evaluates and compares two recently proposed automatic methods for relevance feedback of Boolean queries (the Dillon method, which uses a probabilistic approach as its basis, and the disjunctive normal form method). Conclusions are drawn concerning the use of effective feedback methods in a Boolean query environment. Nineteen references are included. (EJS)
Novel approaches for targeting the adenosine A2A receptor.
Yuan, Gengyang; Gedeon, Nicholas G; Jankins, Tanner C; Jones, Graham B
2015-01-01
The adenosine A2A receptor (A2AR) represents a drug target for a wide spectrum of diseases. Approaches for targeting this membrane-bound protein have been greatly advanced by new stabilization techniques. The resulting X-ray crystal structures and subsequent analyses provide deep insight into the A2AR from both static and dynamic perspectives. Application of these, along with other biophysical methods combined with fragment-based drug design (FBDD), has become a standard approach in targeting the A2AR. Complementary in silico screening-based and biophysical screening-assisted FBDD approaches are likely to feature in future efforts to identify novel ligands against this key receptor. This review describes the evolution of the above approaches for targeting the A2AR and highlights key modulators identified. It includes a review of adenosine receptor structures, homology modeling, X-ray structural analysis, rational drug design, biophysical methods, FBDD, and in silico screening. As a drug target, the A2AR is attractive because its function plays a role in a wide spectrum of diseases including oncologic, inflammatory, Parkinson's, and cardiovascular diseases. Although traditional approaches such as high-throughput screening and homology model-based virtual screening (VS) have played a role in targeting the A2AR, numerous shortcomings have generally restricted their applications to specific ligand families. Using stabilization methods for crystallization, X-ray structures of the A2AR have greatly accelerated drug discovery and influenced the development of biophysical-in silico hybrid screening methods. Application of these new methods to other ARs and G-protein-coupled receptors is anticipated in the future.
IMPLICIT DUAL CONTROL BASED ON PARTICLE FILTERING AND FORWARD DYNAMIC PROGRAMMING.
Bayard, David S; Schumitzky, Alan
2010-03-01
This paper develops a sampling-based approach to implicit dual control. Implicit dual control methods synthesize stochastic control policies by systematically approximating the stochastic dynamic programming equations of Bellman, in contrast to explicit dual control methods that artificially induce probing into the control law by modifying the cost function to include a term that rewards learning. The proposed implicit dual control approach is novel in that it combines a particle filter with a policy-iteration method for forward dynamic programming. The integration of the two methods provides a complete sampling-based approach to the problem. Implementation of the approach is simplified by making use of a specific architecture denoted as an H-block. Practical suggestions are given for reducing computational loads within the H-block for real-time applications. As an example, the method is applied to the control of a stochastic pendulum model having unknown mass, length, initial position and velocity, and unknown sign of its dc gain. Simulation results indicate that active controllers based on the described method can systematically improve closed-loop performance with respect to other more common stochastic control approaches.
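The state-estimation half of the approach can be illustrated with a minimal bootstrap (sequential importance resampling) particle filter. The pendulum-like dynamics, noise levels, and measurement model below are stand-ins rather than the paper's model, and the dual-control policy iteration over the H-block is omitted entirely.

```python
import numpy as np

rng = np.random.default_rng(1)

def propagate(particles, dt=0.05):
    """Hypothetical pendulum-like dynamics with process noise.
    particles: (N, 2) array of [angle, angular_velocity]."""
    theta, omega = particles[:, 0], particles[:, 1]
    omega_new = omega - 9.81 * np.sin(theta) * dt + rng.normal(0, 0.05, omega.shape)
    theta_new = theta + omega_new * dt
    return np.column_stack([theta_new, omega_new])

def update_weights(particles, measurement, sigma=0.1):
    """Gaussian likelihood of a noisy angle measurement for each particle."""
    residual = measurement - particles[:, 0]
    weights = np.exp(-0.5 * (residual / sigma) ** 2)
    return weights / weights.sum()

def resample(particles, weights):
    """Multinomial resampling to combat weight degeneracy."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# One filtering cycle on N particles with an invented measurement.
particles = rng.normal([0.3, 0.0], [0.5, 0.2], size=(1000, 2))
particles = propagate(particles)
weights = update_weights(particles, measurement=0.25)
particles = resample(particles, weights)
print("state estimate:", particles.mean(axis=0))
```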
NASA Astrophysics Data System (ADS)
Sosa, Germán. D.; Cruz-Roa, Angel; González, Fabio A.
2015-01-01
This work addresses the problem of lung sound classification, in particular the problem of distinguishing between wheeze and normal sounds. Wheezing sound detection is an important step in associating lung sounds with an abnormal state of the respiratory system, usually associated with tuberculosis or other chronic obstructive pulmonary diseases (COPD). The paper presents an approach for automatic lung sound classification which uses different state-of-the-art sound features in combination with a C-weighted support vector machine (SVM) classifier that works better for unbalanced data. The feature extraction methods used here are commonly applied in speech recognition and related problems because they capture the most informative spectral content of the original signals. The evaluated methods were: the Fourier transform (FT), wavelet decomposition using a Wavelet Packet Transform (WPT) bank of filters, and Mel Frequency Cepstral Coefficients (MFCC). For comparison, we evaluated and contrasted the proposed approach against previous works using different combinations of features and/or classifiers. The different methods were evaluated on a set of lung sounds including normal and wheezing sounds. A leave-two-out per-case cross-validation approach was used which, in each fold, chooses as the validation set a pair of cases, one including normal sounds and the other including wheezing sounds. Experimental results are reported in terms of traditional classification performance measures: sensitivity, specificity, and balanced accuracy. Our best results using the suggested approach, C-weighted SVM and MFCC, achieve 82.1% balanced accuracy, the best result reported for this problem to date. These results suggest that supervised classifiers based on kernel methods are able to learn better models for this challenging classification problem, even using the same feature extraction methods.
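A minimal sketch of the MFCC-plus-class-weighted-SVM pipeline is shown below, assuming librosa and scikit-learn are available and that recordings are already loaded as audio arrays. The synthetic signals, labels, and the mean/std aggregation of frame-level MFCCs are placeholders rather than the study's exact protocol.

```python
import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def mfcc_features(signal, sr=8000, n_mfcc=13):
    """Mean and standard deviation of frame-level MFCCs as a fixed-length
    feature vector for one recording."""
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Hypothetical dataset: synthetic signals stand in for normal/wheeze recordings.
rng = np.random.default_rng(0)
signals = [rng.normal(size=8000) for _ in range(40)]
labels = np.array([0] * 30 + [1] * 10)   # unbalanced: 0 = normal, 1 = wheeze

X = np.vstack([mfcc_features(s) for s in signals])

# class_weight="balanced" penalizes errors on the minority (wheeze) class more,
# playing the role of the C-weighted SVM described in the text.
clf = SVC(kernel="rbf", C=1.0, class_weight="balanced")
print(cross_val_score(clf, X, labels, cv=5, scoring="balanced_accuracy"))
```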
An approach to constrained aerodynamic design with application to airfoils
NASA Technical Reports Server (NTRS)
Campbell, Richard L.
1992-01-01
An approach was developed for incorporating flow and geometric constraints into the Direct Iterative Surface Curvature (DISC) design method. In this approach, an initial target pressure distribution is developed using a set of control points. The chordwise locations and pressure levels of these points are initially estimated either from empirical relationships and observed characteristics of pressure distributions for a given class of airfoils or by fitting the points to an existing pressure distribution. These values are then automatically adjusted during the design process to satisfy the flow and geometric constraints. The flow constraints currently available are lift, wave drag, pitching moment, pressure gradient, and local pressure levels. The geometric constraint options include maximum thickness, local thickness, leading-edge radius, and a 'glove' constraint involving inner and outer bounding surfaces. This design method was also extended to include the successive constraint release (SCR) approach to constrained minimization.
Abrahamse, Mariëlle E; Jonkman, Caroline S; Harting, Janneke
2018-04-10
The large number of children that grow up in poverty is concerning, especially given the negative developmental outcomes that can persist into adulthood. Poverty has been found to be a risk factor that negatively affects academic achievement and health outcomes in children. Interdisciplinary interventions can be an effective way to promote health and academic achievement. The present study aims to evaluate a school-based interdisciplinary approach to child health, poverty, and academic achievement using a mixed-methods design. Taken together, the outcomes of this study will increase knowledge about effective ways to give disadvantaged children equal chances early in their lives. An observational study with a mixed-methods design including both quantitative and qualitative data collection methods will be used to evaluate the interdisciplinary approach. The overall research project consists of three study parts: a longitudinal study, a cross-sectional study, and a process evaluation. Using a multi-source approach we will assess child health as the primary outcome. Child poverty and child academic achievement will be assessed as secondary outcomes. The process evaluation will observe the program's effects on the school environment and the program's implementation in order to obtain more knowledge on how to disseminate the interdisciplinary approach to other schools and neighborhoods. The implementation of a school-based interdisciplinary approach via primary schools combining the cross-sectoral domains of health, poverty, and academic achievement is innovative and a step forward in reaching an ethnic minority population. However, the large variety of interventions and activities within the approach can limit the validity of the study. Including a process evaluation will therefore help to improve the interpretation of our findings. In order to contribute to policy and practice focused on decreasing the unequal chances of children growing up in deprived neighborhoods, it is important to study whether the intervention leads to positive developmental outcomes in children. (NTR 6571; retrospectively registered on August 4, 2017.)
NASA Astrophysics Data System (ADS)
Lee, Mee-Kyeong
The purposes of the study were (1) to investigate the effects of the 2000 Iowa Professional Development Program on classroom teaching and student learning and (2) to examine the effectiveness of Constructivist/STS approaches in terms of student perceptions regarding their science classrooms, student attitudes toward science, and student creativity. The 2000 Iowa Professional Development Program which focused on Constructivist/STS approaches was carried out at the University of Iowa for visiting Korean physics teachers. Several methods of data collection were used, including observations by means of classroom videotapes, teacher perception surveys, teacher interviews, and student surveys. The data collected was analyzed using both quantitative and qualitative methods. Major findings include: (1) The 2000 Iowa Professional Development Program did not significantly influence teacher perceptions concerning their teaching in terms of Constructivist/STS approaches in their classrooms. (2) The 2000 Iowa Professional Development Program significantly influenced improvement in teaching practices regarding Constructivist/STS approaches. (3) Students taught with Constructivist/STS approaches perceived their learning environments as more constructivist than did those taught with traditional methods. (4) Students taught with Constructivist/STS approaches improved significantly in the development of more positive attitudes toward science, while such positive attitudes decreased among students taught with traditional methods. (5) Students taught with Constructivist/STS approaches improved significantly in their use of creativity skills over those taught in traditional classrooms. (6) Most teachers favored the implementation of Constructivist/STS approaches. They perceived that students became more interested in lessons utilizing such approaches over time. The major difficulties which the teachers experienced with regard to the implementation of Constructivist/STS teaching include: inability to cover required curriculum content; getting away from textbooks; acceptance by parents, community, and supervisors; motivating students to be involved in classroom activities; and lack of materials for Constructivist/STS teaching. The results imply that efforts to improve educational conditions, in tandem with more consistent and ongoing professional development programs, are necessary to encourage teachers to use what they learned, to keep their initial interest and ideas alive, and to contribute specifically to the reform of science education.
Legendre spectral-collocation method for solving some types of fractional optimal control problems
Sweilam, Nasser H.; Al-Ajami, Tamer M.
2014-01-01
In this paper, the Legendre spectral-collocation method was applied to obtain approximate solutions for some types of fractional optimal control problems (FOCPs). The fractional derivative was described in the Caputo sense. Two different approaches were presented, in the first approach, necessary optimality conditions in terms of the associated Hamiltonian were approximated. In the second approach, the state equation was discretized first using the trapezoidal rule for the numerical integration followed by the Rayleigh–Ritz method to evaluate both the state and control variables. Illustrative examples were included to demonstrate the validity and applicability of the proposed techniques. PMID:26257937
Nonlinear flap-lag axial equations of a rotating beam
NASA Technical Reports Server (NTRS)
Kaza, K. R. V.; Kvaternik, R. G.
1977-01-01
It is possible to identify essentially four approaches by which analysts have established either the linear or nonlinear governing equations of motion for a particular problem related to the dynamics of rotating elastic bodies. The approaches include the effective applied load artifice in combination with a variational principle and the use of Newton's second law, written as D'Alembert's principle, applied to the deformed configuration. A third approach is a variational method in which nonlinear strain-displacement relations and a first-degree displacement field are used. The method introduced by Vigneron (1975) for deriving the linear flap-lag equations of a rotating beam constitutes the fourth approach. The reported investigation shows that all four approaches make use of the geometric nonlinear theory of elasticity. An alternative method for deriving the nonlinear coupled flap-lag-axial equations of motion is also discussed.
NASA Astrophysics Data System (ADS)
Szafranko, Elżbieta
2017-10-01
Assessment of variant solutions developed for a building investment project needs to be made at the planning stage. While considering alternative solutions, the investor defines various criteria, but directly evaluating the degree to which the developed variants fulfil them can be very difficult. In practice, different methods enable the user to include a large number of parameters in an analysis, but their implementation can be challenging. Some methods require advanced mathematical computations preceded by complicated input data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systemic approach, which involves several methods and whose goal is to compare their outcomes. The final stage of the proposed method consists of graphic interpretation of the results. The method has been tested on a variety of building and development projects.
Joint Data Management for MOVINT Data-to-Decision Making
2011-07-01
… flux tensor, aligned motion history images, and related approaches have been shown to be versatile [12, 16, 17, 18]. … methods include voting, neural networks, fuzzy logic, neuro-dynamic programming, support vector machines, Bayesian and Dempster-Shafer methods. …
ERIC Educational Resources Information Center
Kalchman, Mindy; Kozoll, Richard H.
2017-01-01
Methods for teaching early childhood mathematics and science are often addressed in a single, dual-content course. Approaches to teaching this type of course include integrating the content and the pedagogy of both subjects, or keeping the subject areas distinct. In this article, the authors discuss and illustrate their approach to such a combined…
Mori, Genki; Nonaka, Satoru; Oda, Ichiro; Abe, Seiichiro; Suzuki, Haruhisa; Yoshinaga, Shigetaka; Nakajima, Takeshi; Saito, Yutaka
2015-01-01
Background and study aims: Endoscopic submucosal dissection (ESD) using insulation-tipped knives (IT knives) to treat gastric lesions located on the greater curvature of the gastric body remains technically challenging because of the associated bleeding, control of which can be difficult and time consuming. To eliminate these difficulties, we developed a novel strategy, which we have called the "near-side approach method," and assessed its utility. Patients and methods: We reviewed patients who underwent ESD for solitary early gastric cancer located on the greater curvature of the gastric body from January 2003 to September 2014. The technical results of ESD were compared between the group treated with the novel near-side approach method and the group treated with the conventional method. Results: This study included 238 patients with 238 lesions, 118 of which were removed using the near-side approach method and 120 of which were removed using the conventional method. The median procedure time was 92 minutes for the near-side approach method and 120 minutes for the conventional method. The procedure time was significantly shorter in the near-side approach arm. Although the procedure time required by an experienced endoscopist was not significantly different between the two groups (100 vs. 110 minutes), the near-side approach group showed a significantly shorter procedure time for less-experienced endoscopists (90 vs. 120 minutes). Conclusions: The near-side approach method appears to require less time to complete gastric ESD than the conventional method using IT knives for technically challenging lesions located on the greater curvature of the gastric body, especially if the procedure is performed by less-experienced endoscopists. PMID:26528496
A sensor and video based ontology for activity recognition in smart environments.
Mitchell, D; Morrow, Philip J; Nugent, Chris D
2014-01-01
Activity recognition is used in a wide range of applications including healthcare and security. In a smart environment activity recognition can be used to monitor and support the activities of a user. There have been a range of methods used in activity recognition including sensor-based approaches, vision-based approaches and ontological approaches. This paper presents a novel approach to activity recognition in a smart home environment which combines sensor and video data through an ontological framework. The ontology describes the relationships and interactions between activities, the user, objects, sensors and video data.
3D/3D registration of coronary CTA and biplane XA reconstructions for improved image guidance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dibildox, Gerardo, E-mail: g.dibildox@erasmusmc.nl; Baka, Nora; Walsum, Theo van
2014-09-15
Purpose: The authors aim to improve image guidance during percutaneous coronary interventions of chronic total occlusions (CTO) by providing information obtained from computed tomography angiography (CTA) to the cardiac interventionist. To this end, the authors investigate a method to register a 3D CTA model to biplane reconstructions. Methods: The authors developed a method for registering preoperative coronary CTA with intraoperative biplane x-ray angiography (XA) images via 3D models of the coronary arteries. The models are extracted from the CTA and biplane XA images, and are temporally aligned based on CTA reconstruction phase and XA ECG signals. Rigid spatial alignment is achieved with a robust probabilistic point set registration approach using Gaussian mixture models (GMMs). This approach is extended by including orientation in the Gaussian mixtures and by weighting bifurcation points. The method is evaluated on retrospectively acquired coronary CTA datasets of 23 CTO patients for which biplane XA images are available. Results: The Gaussian mixture model approach achieved a median registration accuracy of 1.7 mm. The extended GMM approach including orientation was not significantly different (P > 0.1) but did improve robustness with regards to the initialization of the 3D models. Conclusions: The authors demonstrated that the GMM approach can effectively be applied to register CTA to biplane XA images for the purpose of improving image guidance in percutaneous coronary interventions.
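A stripped-down version of the probabilistic point-set alignment idea can be sketched as follows: the transformed source points define an isotropic Gaussian mixture, and a rigid transform is sought that maximizes the likelihood of the target points. This omits the orientation term, the bifurcation weighting, and the EM-style optimization a full GMM registration would use; the centerline points and parameter values are synthetic.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def negative_log_likelihood(params, source, target, sigma=2.0):
    """Negative log-likelihood of target points under an isotropic GMM whose
    components are centered at the rigidly transformed source points.
    params = [rx, ry, rz, tx, ty, tz] (rotation vector + translation, mm)."""
    rot = Rotation.from_rotvec(params[:3])
    moved = rot.apply(source) + params[3:]
    # Squared distances between every target point and every mixture center.
    d2 = ((target[:, None, :] - moved[None, :, :]) ** 2).sum(axis=2)
    # log-sum-exp over mixture components for numerical stability.
    log_comp = -d2 / (2.0 * sigma ** 2)
    m = log_comp.max(axis=1, keepdims=True)
    log_p = m[:, 0] + np.log(np.exp(log_comp - m).sum(axis=1))
    return -log_p.sum()

# Hypothetical centerline points (mm) from a CTA model and an XA reconstruction.
rng = np.random.default_rng(0)
cta_points = rng.normal(scale=20.0, size=(200, 3))
true_rot = Rotation.from_rotvec([0.05, -0.1, 0.02])
xa_points = (true_rot.apply(cta_points) + np.array([3.0, -2.0, 1.5])
             + rng.normal(scale=0.5, size=(200, 3)))

result = minimize(negative_log_likelihood, x0=np.zeros(6),
                  args=(cta_points, xa_points), method="Nelder-Mead")
print("estimated rotation vector and translation:", result.x)
```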
Implementation of Potential of the Transdisciplinary Approaches in Economic Studies
ERIC Educational Resources Information Center
Stepanova, Tatiana E.; Manokhina, Nadeghda V.; Konovalova, Maria E.; Kuzmina, Olga Y.; Andryukhina, Lyudmila M.
2016-01-01
The relevance of the researched problem is caused by the increasing interest in using potential of transdisciplinary approaches, and mathematical methods, which include the game theory in analysis of public and economic processes. The aim of the article is studying a possibility of implementation of the transdisciplinary approaches in economic…
Allnutt, Thomas F.; McClanahan, Timothy R.; Andréfouët, Serge; Baker, Merrill; Lagabrielle, Erwann; McClennen, Caleb; Rakotomanjaka, Andry J. M.; Tianarisoa, Tantely F.; Watson, Reg; Kremen, Claire
2012-01-01
The Government of Madagascar plans to increase marine protected area coverage by over one million hectares. To assist this process, we compare four methods for marine spatial planning of Madagascar's west coast. Input data for each method was drawn from the same variables: fishing pressure, exposure to climate change, and biodiversity (habitats, species distributions, biological richness, and biodiversity value). The first method compares visual color classifications of primary variables, the second uses binary combinations of these variables to produce a categorical classification of management actions, the third is a target-based optimization using Marxan, and the fourth is conservation ranking with Zonation. We present results from each method, and compare the latter three approaches for spatial coverage, biodiversity representation, fishing cost and persistence probability. All results included large areas in the north, central, and southern parts of western Madagascar. Achieving 30% representation targets with Marxan required twice the fish catch loss than the categorical method. The categorical classification and Zonation do not consider targets for conservation features. However, when we reduced Marxan targets to 16.3%, matching the representation level of the “strict protection” class of the categorical result, the methods show similar catch losses. The management category portfolio has complete coverage, and presents several management recommendations including strict protection. Zonation produces rapid conservation rankings across large, diverse datasets. Marxan is useful for identifying strict protected areas that meet representation targets, and minimize exposure probabilities for conservation features at low economic cost. We show that methods based on Zonation and a simple combination of variables can produce results comparable to Marxan for species representation and catch losses, demonstrating the value of comparing alternative approaches during initial stages of the planning process. Choosing an appropriate approach ultimately depends on scientific and political factors including representation targets, likelihood of adoption, and persistence goals. PMID:22359534
Summary of tracking and identification methods
NASA Astrophysics Data System (ADS)
Blasch, Erik; Yang, Chun; Kadar, Ivan
2014-06-01
Over the last two decades, many solutions have arisen to combine target tracking estimation with classification methods. Target tracking includes developments from linear to non-linear and Gaussian to non-Gaussian processing. Pattern recognition includes detection, classification, recognition, and identification methods. Integrating tracking and pattern recognition has resulted in numerous approaches, and this paper seeks to organize them. We discuss the terminology so as to have a common framework for various standards such as the NATO STANAG 4162 - Identification Data Combining Process. In a use case, we provide a comparative example showing that location information, combined with additional mission objectives from geographical, human, social, cultural, and behavioral modeling, is needed to determine identification, since classification alone does not establish identification or intent.
Ye, Tao; Zhou, Fuqiang
2015-04-10
When imaged by detectors, space targets (including satellites and debris) and background stars have similar point-spread functions, and both objects appear to change as detectors track targets. Therefore, traditional tracking methods cannot separate targets from stars and cannot directly recognize targets in 2D images. Consequently, we propose an autonomous space target recognition and tracking approach using a star sensor technique and a Kalman filter (KF). A two-step method for subpixel-scale detection of star objects (including stars and targets) is developed, and the combination of the star sensor technique and a KF is used to track targets. The experimental results show that the proposed method is adequate for autonomously recognizing and tracking space targets.
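As a concrete illustration of the tracking half of such a pipeline, the following is a minimal constant-velocity Kalman filter in Python/NumPy. It is a generic sketch, not the authors' star-sensor implementation; the state layout, noise levels, and measurement values are invented placeholders.

```python
import numpy as np

# Minimal constant-velocity Kalman filter for a target in 2D image coordinates.
# State x = [u, v, du, dv]; measurements are the detected (u, v) centroids.
dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)      # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)      # measurement model
Q = 1e-3 * np.eye(4)                           # process noise (placeholder)
R = 0.25 * np.eye(2)                           # measurement noise (placeholder)

x = np.array([10.0, 20.0, 0.0, 0.0])           # initial state guess
P = np.eye(4)

def kf_step(x, P, z):
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with measurement z = [u, v]
    y = z - H @ x_pred                         # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

for z in [np.array([10.4, 20.9]), np.array([10.9, 21.8]), np.array([11.5, 22.6])]:
    x, P = kf_step(x, P, z)
print("estimated position:", x[:2], "velocity:", x[2:])
```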
Chemical Mixture Risk Assessment Additivity-Based Approaches
PowerPoint presentation includes additivity-based chemical mixture risk assessment methods. Basic concepts, theory, and example calculations are included. Several slides discuss the use of "common adverse outcomes" in analyzing phthalate mixtures.
Probabilistic segmentation and intensity estimation for microarray images.
Gottardo, Raphael; Besag, Julian; Stephens, Matthew; Murua, Alejandro
2006-01-01
We describe a probabilistic approach to simultaneous image segmentation and intensity estimation for complementary DNA microarray experiments. The approach overcomes several limitations of existing methods. In particular, it (a) uses a flexible Markov random field approach to segmentation that allows for a wider range of spot shapes than existing methods, including relatively common 'doughnut-shaped' spots; (b) models the image directly as background plus hybridization intensity, and estimates the two quantities simultaneously, avoiding the common logical error that estimates of foreground may be less than those of the corresponding background if the two are estimated separately; and (c) uses a probabilistic modeling approach to simultaneously perform segmentation and intensity estimation, and to compute spot quality measures. We describe two approaches to parameter estimation: a fast algorithm, based on the expectation-maximization and the iterated conditional modes algorithms, and a fully Bayesian framework. These approaches produce comparable results, and both appear to offer some advantages over other methods. We use an HIV experiment to compare our approach to two commercial software products: Spot and Arrayvision.
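For readers unfamiliar with the estimation machinery, the sketch below shows a bare-bones expectation-maximization fit of a two-component (background vs. spot) intensity mixture. It deliberately omits the Markov random field prior and spot-shape modeling that distinguish the authors' approach; the pixel values are simulated.

```python
import numpy as np

def em_two_gaussians(pixels, n_iter=50):
    """Toy EM for a two-component Gaussian mixture (background vs. spot).
    Simplification only: the paper's model adds an MRF prior over labels."""
    x = np.asarray(pixels, dtype=float)
    mu = np.array([x.min(), x.max()])           # crude initialisation
    sigma = np.array([x.std(), x.std()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each pixel
        dens = np.stack([pi[k] / (sigma[k] * np.sqrt(2 * np.pi)) *
                         np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2)
                         for k in range(2)])
        resp = dens / dens.sum(axis=0)
        # M-step: update mixture parameters
        nk = resp.sum(axis=1)
        mu = (resp * x).sum(axis=1) / nk
        sigma = np.sqrt((resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk) + 1e-6
        pi = nk / nk.sum()
    return mu, sigma, pi, resp

rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(100, 10, 400),    # background
                         rng.normal(400, 40, 100)])   # hybridisation signal
mu, sigma, pi, resp = em_two_gaussians(pixels)
print("background/foreground means:", mu)
```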
Hybrid finite difference/finite element immersed boundary method.
Griffith, Boyce E; Luo, Xiaoyu
2017-12-01
The immersed boundary method is an approach to fluid-structure interaction that uses a Lagrangian description of the structural deformations, stresses, and forces along with an Eulerian description of the momentum, viscosity, and incompressibility of the fluid-structure system. The original immersed boundary methods described immersed elastic structures using systems of flexible fibers, and even now, most immersed boundary methods still require Lagrangian meshes that are finer than the Eulerian grid. This work introduces a coupling scheme for the immersed boundary method to link the Lagrangian and Eulerian variables that facilitates independent spatial discretizations for the structure and background grid. This approach uses a finite element discretization of the structure while retaining a finite difference scheme for the Eulerian variables. We apply this method to benchmark problems involving elastic, rigid, and actively contracting structures, including an idealized model of the left ventricle of the heart. Our tests include cases in which, for a fixed Eulerian grid spacing, coarser Lagrangian structural meshes yield discretization errors that are as much as several orders of magnitude smaller than errors obtained using finer structural meshes. The Lagrangian-Eulerian coupling approach developed in this work enables the effective use of these coarse structural meshes with the immersed boundary method. This work also contrasts two different weak forms of the equations, one of which is demonstrated to be more effective for the coarse structural discretizations facilitated by our coupling approach. © 2017 The Authors International Journal for Numerical Methods in Biomedical Engineering Published by John Wiley & Sons Ltd.
Stamer, M; Güthlin, C; Holmberg, C; Karbach, U; Patzelt, C; Meyer, T
2015-12-01
The third and final discussion paper of the German Network of Health Services Research's (DNVF) "Qualitative Methods Working Group" presents methods for evaluating the quality of qualitative research in health services research. In this paper we discuss approaches described for evaluating qualitative studies, including: an orientation to the general principles of empirical research, an approach-specific course of action, as well as procedures based on the research-process and criteria-oriented approaches. Divided into general and specific aspects to be considered in evaluating the quality of a qualitative study, the discussion paper focuses on an extensive examination of the process- and criteria-oriented approaches. The general aspects include the participation of relevant groups in the research process as well as ethical aspects of the research and data protection issues. The more specific aspects in evaluating the quality of qualitative research include considerations about the research interest, research questions, and the selection of data collection methods and types of analyses. The formulated questions are intended to guide reviewers and researchers to evaluate and to develop qualitative research projects appropriately. The intention of this discussion paper is to ensure a transparent research culture, and to reflect on and discuss the methodological and research approach of qualitative studies in health services research. With this paper we aim to initiate a discussion on high quality evaluation of qualitative health services research. © Georg Thieme Verlag KG Stuttgart · New York.
NASA Astrophysics Data System (ADS)
Shaltout, Abdallah A.; Moharram, Mohammed A.; Mostafa, Nasser Y.
2012-01-01
This work is the first attempt to quantify trace elements in the Catha edulis plant (Khat) with a fundamental parameter approach. C. edulis is a famous drug plant in East Africa and the Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint, and green tea were included in the present investigation, and trifolium leaves were included as a non-related plant. The elemental analyses of the plants were done by Wavelength Dispersive X-Ray Fluorescence (WDXRF) spectroscopy. Standard-less quantitative WDXRF analysis was carried out based on the fundamental parameter approaches. According to the standard-less analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on differential thermal analysis, was successfully used for the organic material determination. The obtained results based on this approach were in good agreement with the commonly used methods. Using the developed method, quantitative analysis results for eighteen elements, including Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn, were obtained for each plant. The results of the certified reference materials of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method.
Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana
2016-01-01
Objectives: There is no consensus on whether studies with no observed events in the treatment and control arms, the so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handle them differently depending on the choice of effect measure and the authors' discretion. Our objective is to evaluate the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analysis of RCTs with rare outcome events through a simulation study. Method: We simulated 2500 data sets for different scenarios varying the parameters of baseline event rate, treatment effect and number of patients in each trial, and between-study variance. We evaluated the performance of commonly used pooling methods in classical meta-analysis (Peto, Mantel-Haenszel with fixed-effects and random-effects models, and the inverse variance method with fixed-effects and random-effects models) using bias, root mean square error, length of 95% CI and coverage. Results: The overall performance of the approaches of including or excluding BA0E studies in meta-analysis varied according to the magnitude of true treatment effect. Including BA0E studies introduced very little bias, decreased mean square error, narrowed the 95% CI and increased the coverage when no true treatment effect existed. However, when a true treatment effect existed, the estimates from the approach of excluding BA0E studies led to smaller bias than including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions: We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect. Providing results of including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment effects are unclear. PMID:27531725
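The Peto one-step pooling mentioned above is easy to state; the following sketch shows the calculation with and without both-armed zero-event trials, using invented 2x2 counts rather than the simulated scenarios of the study.

```python
import math

# Each trial: (events_treat, n_treat, events_ctrl, n_ctrl); the last two are both-armed zero-event (BA0E).
trials = [(2, 50, 5, 50), (1, 100, 3, 100), (0, 40, 0, 40), (0, 80, 0, 80)]

def peto_pooled_or(data):
    """One-step Peto odds ratio: exp(sum(O - E) / sum(V))."""
    num, den = 0.0, 0.0
    for a, n1, c, n2 in data:
        N, m1 = n1 + n2, a + c
        m2 = N - m1
        O, E = a, n1 * m1 / N
        V = n1 * n2 * m1 * m2 / (N ** 2 * (N - 1))   # hypergeometric variance
        num += O - E
        den += V
    log_or = num / den
    se = math.sqrt(1.0 / den)
    return math.exp(log_or), (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))

including = peto_pooled_or(trials)
excluding = peto_pooled_or([t for t in trials if t[0] + t[2] > 0])  # drop BA0E trials
print("Peto OR including BA0E:", including)
print("Peto OR excluding BA0E:", excluding)
```

In this unadjusted form the BA0E trials add zero to both the numerator and the variance sum, so the two calls coincide; differences reported in practice arise mainly with other estimators (e.g., Mantel-Haenszel or inverse variance) or when continuity corrections are applied.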
Ojeda-May, Pedro; Nam, Kwangho
2017-08-08
The strategy and implementation of scalable and efficient semiempirical (SE) QM/MM methods in CHARMM are described. The serial version of the code was first profiled to identify routines that required parallelization. Afterward, the code was parallelized and accelerated with three approaches. The first approach was the parallelization of the entire QM/MM routines, including the Fock matrix diagonalization routines, using the CHARMM message passing interface (MPI) machinery. In the second approach, two different self-consistent field (SCF) energy convergence accelerators were implemented using density and Fock matrices as targets for their extrapolations in the SCF procedure. In the third approach, the entire QM/MM and MM energy routines were accelerated by implementing the hybrid MPI/open multiprocessing (OpenMP) model in which both the task- and loop-level parallelization strategies were adopted to balance loads between different OpenMP threads. The present implementation was tested on two solvated enzyme systems (including <100 QM atoms) and an SN2 symmetric reaction in water. The MPI version outperformed existing SE QM methods in CHARMM, which include the SCC-DFTB and SQUANTUM methods, by at least 4-fold. The use of SCF convergence accelerators further accelerated the code by ∼12-35% depending on the size of the QM region and the number of CPU cores used. Although the MPI version displayed good scalability, the performance was diminished for large numbers of MPI processes due to the overhead associated with MPI communications between nodes. This issue was partially overcome by the hybrid MPI/OpenMP approach which displayed a better scalability for a larger number of CPU cores (up to 64 CPUs in the tested systems).
Surgical Intervention for Instability of the Craniovertebral Junction
TAKAYASU, Masakazu; AOYAMA, Masahiro; JOKO, Masahiro; TAKEUCHI, Mikinobu
2016-01-01
Surgical approaches for stabilizing the craniovertebral junction (CVJ) are classified as either anterior or posterior approaches. Among the anterior approaches, the established method is anterior odontoid screw fixation. Posterior approaches are classified as either atlanto-axial fixation or occipito-cervical (O-C) fixation. Spinal instrumentation using anchor screws and rods has become a popular method for posterior cervical fixation. Because this method achieves greater stability and higher success rates for fusion without the risk of sublaminar wiring, it has become a substitute for previous methods that used bone grafting and wiring. Several types of anchor screws are available, including C1/2 transarticular, C1 lateral mass, C2 pedicle, and translaminar screws. Appropriate anchor screws should be selected according to characteristics such as technical feasibility, safety, and strength. With these stronger anchor screws, shorter fixation has become possible. The present review discusses the current status of surgical interventions for stabilizing the CVJ. PMID:27041630
Multiview echocardiography fusion using an electromagnetic tracking system.
Punithakumar, Kumaradevan; Hareendranathan, Abhilash R; Paakkanen, Riitta; Khan, Nehan; Noga, Michelle; Boulanger, Pierre; Becher, Harald
2016-08-01
Three-dimensional ultrasound is an emerging modality for the assessment of complex cardiac anatomy and function. The advantages of this modality include lack of ionizing radiation, portability, low cost, and high temporal resolution. Major limitations include limited field-of-view, reliance on frequently limited acoustic windows, and poor signal-to-noise ratio. This study proposes a novel approach to combine multiple views into a single image using an electromagnetic tracking system in order to improve the field-of-view. The novel method has several advantages: 1) it does not rely on image information for alignment, and therefore, the method does not require image overlap; 2) the alignment accuracy of the proposed approach is not affected by poor image quality, as in the case of image registration based approaches; 3) in contrast to previous optical tracking based systems, the proposed approach does not suffer from line-of-sight limitations; and 4) it does not require any initial calibration. In this pilot project, we were able to show that, using a heart phantom, our method can fuse multiple echocardiographic images and improve the field-of-view. Quantitative evaluations showed that the proposed method yielded a nearly optimal alignment of image data sets in three-dimensional space. The proposed method demonstrates that the electromagnetic system can be used for the fusion of multiple echocardiography images with a seamless integration of sensors to the transducer.
Environmental Chemicals in Urine and Blood: Improving Methods for Creatinine and Lipid Adjustment.
O'Brien, Katie M; Upson, Kristen; Cook, Nancy R; Weinberg, Clarice R
2016-02-01
Investigators measuring exposure biomarkers in urine typically adjust for creatinine to account for dilution-dependent sample variation in urine concentrations. Similarly, it is standard to adjust for serum lipids when measuring lipophilic chemicals in serum. However, there is controversy regarding the best approach, and existing methods may not effectively correct for measurement error. We compared adjustment methods, including novel approaches, using simulated case-control data. Using a directed acyclic graph framework, we defined six causal scenarios for epidemiologic studies of environmental chemicals measured in urine or serum. The scenarios include variables known to influence creatinine (e.g., age and hydration) or serum lipid levels (e.g., body mass index and recent fat intake). Over a range of true effect sizes, we analyzed each scenario using seven adjustment approaches and estimated the corresponding bias and confidence interval coverage across 1,000 simulated studies. For urinary biomarker measurements, our novel method, which incorporates both covariate-adjusted standardization and the inclusion of creatinine as a covariate in the regression model, had low bias and possessed 95% confidence interval coverage of nearly 95% for most simulated scenarios. For serum biomarker measurements, a similar approach involving standardization plus serum lipid level adjustment generally performed well. To control measurement error bias caused by variations in serum lipids or by urinary diluteness, we recommend improved methods for standardizing exposure concentrations across individuals.
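One plausible reading of the recommended two-part adjustment is sketched below with simulated data and statsmodels; the covariates, the log-scale creatinine model, and the exact standardization formula are illustrative assumptions rather than a verbatim reproduction of the authors' method.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "age": rng.uniform(20, 70, n),
    "bmi": rng.normal(27, 4, n),
})
# Simulated urinary creatinine depends on covariates plus noise (toy model).
df["creatinine"] = np.exp(0.5 + 0.005 * df.age + 0.02 * df.bmi + rng.normal(0, 0.3, n))
df["exposure"] = np.exp(rng.normal(0, 0.5, n)) * df["creatinine"]      # dilution-dependent measurement
df["case"] = rng.binomial(1, 0.3, n)

# Step 1: covariate-adjusted standardization -- predict creatinine from covariates,
# then rescale the measured exposure by predicted/observed creatinine.
X_cr = sm.add_constant(df[["age", "bmi"]])
cr_fit = sm.OLS(np.log(df["creatinine"]), X_cr).fit()
pred_cr = np.exp(cr_fit.fittedvalues)
df["exposure_std"] = df["exposure"] * pred_cr / df["creatinine"]

# Step 2: also include creatinine as a covariate in the outcome model.
X_out = sm.add_constant(df[["exposure_std", "creatinine", "age", "bmi"]])
outcome_fit = sm.Logit(df["case"], X_out).fit(disp=0)
print(outcome_fit.params)
```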
Breakthroughs In Low-profile Leaky-Wave HPM Antennas
2015-06-18
[Only text fragments of this report are recoverable: the approach is intended "to help us finally to include, and manage quantitatively, this essential piece of the theoretical puzzle"; the authors acknowledge ONR's continuing support for the R&D; footnotes reference faculty pages at uttyler.edu and note that the methods used include Variational Methods, the Integral Equation Method, and Equivalent ...]
Wang, Penghao; Wilson, Susan R
2013-01-01
Mass spectrometry-based protein identification is a very challenging task. The main identification approaches include de novo sequencing and database searching. Both approaches have shortcomings, so an integrative approach has been developed. The integrative approach firstly infers partial peptide sequences, known as tags, directly from tandem spectra through de novo sequencing, and then puts these sequences into a database search to see if a close peptide match can be found. However the current implementation of this integrative approach has several limitations. Firstly, simplistic de novo sequencing is applied and only very short sequence tags are used. Secondly, most integrative methods apply an algorithm similar to BLAST to search for exact sequence matches and do not accommodate sequence errors well. Thirdly, by applying these methods the integrated de novo sequencing makes a limited contribution to the scoring model which is still largely based on database searching. We have developed a new integrative protein identification method which can integrate de novo sequencing more efficiently into database searching. Evaluated on large real datasets, our method outperforms popular identification methods.
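To make the tag-then-search idea concrete, the sketch below scans de novo sequence tags against a toy protein database with a small mismatch tolerance; the sequences, tags, and scoring are invented and far simpler than the integrated model described.

```python
def tag_hits(tag, protein, max_mismatches=1):
    """Return positions where `tag` aligns to `protein` with at most
    `max_mismatches` substitutions (a crude stand-in for error-tolerant matching)."""
    hits = []
    for i in range(len(protein) - len(tag) + 1):
        window = protein[i:i + len(tag)]
        mismatches = sum(1 for a, b in zip(tag, window) if a != b)
        if mismatches <= max_mismatches:
            hits.append((i, mismatches))
    return hits

# Toy database and de novo sequence tags (illustrative only).
database = {
    "prot1": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
    "prot2": "MSGRGKGGKGLGKGGAKRHRKVLRDNIQGIT",
}
tags = ["KQRQIS", "GKGGAK", "KQRQIT"]   # the last tag carries one sequencing error

for tag in tags:
    for name, seq in database.items():
        for pos, mm in tag_hits(tag, seq):
            print(f"tag {tag} matches {name} at {pos} with {mm} mismatch(es)")
```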
Chigerwe, Munashe; Ilkiw, Jan E; Boudreaux, Karen A
2011-01-01
The objectives of the present study were to evaluate first-, second-, third-, and fourth-year veterinary medical students' approaches to studying and learning as well as the factors within the curriculum that may influence these approaches. A questionnaire consisting of the short version of the Approaches and Study Skills Inventory for Students (ASSIST) was completed by 405 students, and it included questions relating to conceptions about learning, approaches to studying, and preferences for different types of courses and teaching. Descriptive statistics, factor analysis, Cronbach's alpha analysis, and log-linear analysis were performed on the data. Deep, strategic, and surface learning approaches emerged. There were a few differences between our findings and those presented in previous studies in terms of the correlation of the subscale monitoring effectiveness, which showed loading with both the deep and strategic learning approaches. In addition, the subscale alertness to assessment demands showed correlation with the surface learning approach. The perception of high workloads, the use of previous test files as a method for studying, and examinations that are based only on material provided in lecture notes were positively associated with the surface learning approach. Focusing on improving specific teaching and assessment methods that enhance deep learning is anticipated to enhance students' positive learning experience. These teaching methods include instructors who encourage students to be critical thinkers, the integration of course material in other disciplines, courses that encourage thinking and reading about the learning material, and books and articles that challenge students while providing explanations beyond lecture material.
Al-Khatib, Ra'ed M; Rashid, Nur'Aini Abdul; Abdullah, Rosni
2011-08-01
The secondary structure of RNA pseudoknots has been extensively inferred and scrutinized by computational approaches. Experimental methods for determining RNA structure are time consuming and tedious; therefore, predictive computational approaches are required. Predicting the most accurate and energy-stable pseudoknot RNA secondary structure has been proven to be an NP-hard problem. In this paper, a new RNA folding approach, termed MSeeker, is presented; it includes KnotSeeker (a heuristic method) and Mfold (a thermodynamic algorithm). The global optimization of this thermodynamic heuristic approach was further enhanced by using a case-based reasoning technique as a local optimization method. MSeeker is a proposed algorithm for predicting RNA pseudoknot structure from individual sequences, especially long ones. This research demonstrates that MSeeker improves the sensitivity and specificity of existing RNA pseudoknot structure predictions. The performance and structural results from this proposed method were evaluated against seven other state-of-the-art pseudoknot prediction methods. The MSeeker method had better sensitivity than the DotKnot, FlexStem, HotKnots, pknotsRG, ILM, NUPACK and pknotsRE methods, with 79% of the predicted pseudoknot base-pairs being correct.
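Sensitivity and positive predictive value (the latter is often what this literature reports as "specificity") for predicted base pairs can be computed as in the sketch below; the reference and predicted pairs are invented.

```python
def evaluate_base_pairs(predicted, reference):
    """Compare predicted and reference base pairs (sets of (i, j) index tuples, i < j).
    Sensitivity = TP / reference pairs; PPV = TP / predicted pairs."""
    predicted, reference = set(predicted), set(reference)
    tp = len(predicted & reference)
    sensitivity = tp / len(reference) if reference else 0.0
    ppv = tp / len(predicted) if predicted else 0.0
    return sensitivity, ppv

# Toy example: a short pseudoknotted reference and an imperfect prediction.
reference = {(1, 20), (2, 19), (3, 18), (8, 25), (9, 24)}   # (8,25) and (9,24) cross the stem -> pseudoknot
predicted = {(1, 20), (2, 19), (8, 25), (10, 24)}

sens, ppv = evaluate_base_pairs(predicted, reference)
print(f"sensitivity = {sens:.2f}, PPV = {ppv:.2f}")
```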
Grégory, Dubourg; Chaudet, Hervé; Lagier, Jean-Christophe; Raoult, Didier
2018-03-01
Describing the human gut microbiota is one of the most exciting challenges of the 21st century. Currently, high-throughput sequencing methods are considered the gold standard for this purpose; however, they suffer from several drawbacks, including their inability to detect minority populations. The advent of mass-spectrometric (MS) approaches to identify cultured bacteria in clinical microbiology enabled the creation of the culturomics approach, which aims to establish a comprehensive repertoire of cultured prokaryotes from human specimens using extensive culture conditions. Areas covered: This review first underlines how mass spectrometric approaches have revolutionized clinical microbiology. It then highlights the contribution of MS-based methods to culturomics studies, paying particular attention to the extension of the human gut microbiota repertoire through the discovery of new bacterial species. Expert commentary: MS-based approaches have enabled cultivation methods to be resuscitated to study the human gut microbiota and thus to fill in the blanks left by high-throughput sequencing methods in terms of culturing minority populations. Continued efforts to recover new taxa using culture methods, combined with their rapid implementation in genomic databases, would allow for an exhaustive analysis of the gut microbiota through the use of a comprehensive approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, Charles; Penchoff, Deborah A.; Wilson, Angela K., E-mail: wilson@chemistry.msu.edu
2015-11-21
An effective approach for the determination of lanthanide energetics, as demonstrated by application to the third ionization energy (in the gas phase) for the first half of the lanthanide series, has been developed. This approach uses a combination of highly correlated and fully relativistic ab initio methods to accurately describe the electronic structure of heavy elements. Both scalar and fully relativistic methods are used to achieve an approach that is both computationally feasible and accurate. The impact of basis set choice and the number of electrons included in the correlation space has also been examined.
A proposed configurable approach for recommendation systems via data mining techniques
NASA Astrophysics Data System (ADS)
Khedr, Ayman E.; Idrees, Amira M.; Hegazy, Abd El-Fatah; El-Shewy, Samir
2018-02-01
This study presents a configurable approach for recommendations that determines the suitable recommendation method for each field based on the characteristics of its data. The method includes determining the suitable technique for selecting a representative sample of the provided data, then selecting the suitable feature weighting measure to provide a correct weight for each feature based on its effect on the recommendations, and finally selecting the suitable algorithm to provide the required recommendations. The proposed configurable approach can be applied to different domains. The experiments revealed that the approach is able to provide recommendations with an error rate of only 0.89%.
Centrifuge Rotor Models: A Comparison of the Euler-Lagrange and the Bond Graph Modeling Approach
NASA Technical Reports Server (NTRS)
Granda, Jose J.; Ramakrishnan, Jayant; Nguyen, Louis H.
2006-01-01
A viewgraph presentation on centrifuge rotor models with a comparison using Euler-Lagrange and bond graph methods is shown. The topics include: 1) Objectives; 2) Modeling Approach Comparisons; 3) Model Structures; and 4) Application.
Exploration of Opinion-aware Approach to Contextual Suggestion
2014-11-01
[Only fragments of this report are recoverable: the TREC Contextual Suggestion Track gives researchers the chance to test their methods on providing suggestions; candidate information includes the name, average rating, address, business hours, all ratings, and the associated text reviews; a reference-list fragment cites Annals of Statistics, 29:1189-1232, 2000, and K. Ganesan, C. Zhai, and J. Han, "Opinosis: a graph-based approach to abstractive summarization of highly ..."]
Mixed Methods in Biomedical and Health Services Research
Curry, Leslie A.; Krumholz, Harlan M.; O’Cathain, Alicia; Plano Clark, Vicki L.; Cherlin, Emily; Bradley, Elizabeth H.
2013-01-01
Mixed methods studies, in which qualitative and quantitative methods are combined in a single program of inquiry, can be valuable in biomedical and health services research, where the complementary strengths of each approach can yield greater insight into complex phenomena than either approach alone. Although interest in mixed methods is growing among science funders and investigators, written guidance on how to conduct and assess rigorous mixed methods studies is not readily accessible to the general readership of peer-reviewed biomedical and health services journals. Furthermore, existing guidelines for publishing mixed methods studies are not well known or applied by researchers and journal editors. Accordingly, this paper is intended to serve as a concise, practical resource for readers interested in core principles and practices of mixed methods research. We briefly describe mixed methods approaches and present illustrations from published biomedical and health services literature, including in cardiovascular care, summarize standards for the design and reporting of these studies, and highlight four central considerations for investigators interested in using these methods. PMID:23322807
Intercomparison of 3D pore-scale flow and solute transport simulation methods
Mehmani, Yashar; Schoenherr, Martin; Pasquali, Andrea; ...
2015-09-28
Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include 1) methods that explicitly model the three-dimensional geometry of pore spaces and 2) methods that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of the first type, using computational fluid dynamics (CFD) codes employing a standard finite volume method (FVM), against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of the first type based on the lattice Boltzmann method (LBM) and smoothed particle hydrodynamics (SPH), as well as a model of the second type, a pore-network model (PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (FVM-based CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and (for capable codes) nonreactive solute transport, and intercompare the model results. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations). Generally good agreement was achieved among the various approaches, but some differences were observed depending on the model context. The intercomparison work was challenging because of variable capabilities of the codes, and inspired some code enhancements to allow consistent comparison of flow and transport simulations across the full suite of methods. This paper provides support for confidence in a variety of pore-scale modeling methods and motivates further development and application of pore-scale simulation methods.
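As an aside on the macroscopic comparison variables, permeability can be recovered from any such simulation via Darcy's law; the sketch below uses placeholder values rather than outputs of the codes discussed.

```python
def darcy_permeability(q_total, mu, length, area, dp):
    """Permeability k from Darcy's law: q = (k * A / mu) * dP / L,
    so k = q * mu * L / (A * dP).  SI units throughout."""
    return q_total * mu * length / (area * dp)

# Placeholder values for a small simulated sample.
q_total = 2.0e-9        # m^3/s, total flow through the outlet face
mu = 1.0e-3             # Pa*s, water viscosity
length = 5.0e-3         # m, sample length in the flow direction
area = 2.5e-5           # m^2, cross-sectional area
dp = 100.0              # Pa, imposed pressure drop

k = darcy_permeability(q_total, mu, length, area, dp)
print(f"permeability = {k:.3e} m^2  (= {k / 9.869e-13:.3f} darcy)")
```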
Intercomparison of 3D pore-scale flow and solute transport simulation methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Xiaofan; Mehmani, Yashar; Perkins, William A.
2016-09-01
Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include 1) methods that explicitly model the three-dimensional geometry of pore spaces and 2) methods that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of the first type, using computational fluid dynamics (CFD) codes employing a standard finite volume method (FVM), against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of the first type based on the lattice Boltzmann method (LBM) and smoothed particle hydrodynamics (SPH), as well as a model of the second type, a pore-network model (PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (FVM-based CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and (for capable codes) nonreactive solute transport, and intercompare the model results. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations). Generally good agreement was achieved among the various approaches, but some differences were observed depending on the model context. The intercomparison work was challenging because of variable capabilities of the codes, and inspired some code enhancements to allow consistent comparison of flow and transport simulations across the full suite of methods. This study provides support for confidence in a variety of pore-scale modeling methods and motivates further development and application of pore-scale simulation methods.
[Introduction of active learning and student readership in teaching by the pharmaceutical faculty].
Sekiguchi, Masaki; Yamato, Ippei; Kato, Tetsuta; Torigoe, Kojyun
2005-07-01
We have introduced improvements and new approaches into our teaching methods by exploiting 4 active learning methods for first-year pharmacy students. The 4 teaching methods for each lesson or take home assignment are as follows: 1) problem-based learning (clinical case) including a student presentation of the clinical case, 2) schematic drawings of the human organs, one drawing done in 15-20 min during the week following a lecture and a second drawing done with reference to a professional textbook, 3) learning of professional themes in take home assignments, and 4) a short test, on paper or computer, to confirm the understanding of technical terms. These improvements and new methods provide active approaches for pharmacy students (as opposed to passive memorization of words and image study). In combination, they have proven to be useful as a learning method to acquire expert knowledge and to shift pharmacy students in the classroom from a passive to an active learning approach.
Integrating qualitative research into occupational health: a case study among hospital workers.
Gordon, Deborah R; Ames, Genevieve M; Yen, Irene H; Gillen, Marion; Aust, Birgit; Rugulies, Reiner; Frank, John W; Blanc, Paul D
2005-04-01
We sought to better use qualitative approaches in occupational health research and integrate them with quantitative methods. We systematically reviewed, selected, and adapted qualitative research methods as part of a multisite study of the predictors and outcomes of work-related musculoskeletal disorders among hospital workers in two large urban tertiary hospitals. The methods selected included participant observation; informal, open-ended, and semistructured interviews with individuals or small groups; and archival study. The nature of the work and social life of the hospitals and the foci of the study all favored using more participant observation methods in the case study than initially anticipated. Exploiting the full methodological spectrum of qualitative methods in occupational health is increasingly relevant. Although labor-intensive, these approaches may increase the yield of established quantitative approaches otherwise used in isolation.
A new automated NaCl based robust method for routine production of gallium-68 labeled peptides
Schultz, Michael K.; Mueller, Dirk; Baum, Richard P.; Watkins, G. Leonard; Breeman, Wouter A. P.
2017-01-01
A new NaCl-based method for preparation of gallium-68 labeled radiopharmaceuticals has been adapted for use with an automated gallium-68 generator system. The method was evaluated based on 56 preparations of [68Ga]DOTATOC and compared to a similar acetone-based approach. Advantages of the new NaCl approach include reduced preparation time (< 15 min) and removal of organic solvents. The method produces a high peptide-bound fraction (> 97%) and high specific activity (> 40 MBq nmol⁻¹ [68Ga]DOTATOC) and is well-suited for clinical production of radiopharmaceuticals. PMID:23026223
CFD Methods and Tools for Multi-Element Airfoil Analysis
NASA Technical Reports Server (NTRS)
Rogers, Stuart E.; George, Michael W. (Technical Monitor)
1995-01-01
This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined, one using multi-block patched grids, the other using overset chimera grids. Turbulence and transition modeling will be discussed.
Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method
NASA Astrophysics Data System (ADS)
Lee, G.; Jun, K. S.; Cung, E. S.
2014-09-01
This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with fuzzified data to quantify the spatial flood vulnerability including multi-criteria evaluation indicators. In general, the GDM method is an effective tool for formulating a compromise solution that involves various decision makers, since various stakeholders may have different perspectives on their flood risk/vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise ranking method can be used to obtain a nearly ideal solution according to all established criteria. Triangular fuzzy numbers are used to consider the uncertainty of weights and the crisp data of proxy variables. This approach can effectively propose compromise decisions by combining the GDM method and the fuzzy VIKOR method. The spatial flood vulnerability of the south Han River evaluated with the GDM approach combined with the fuzzy VIKOR method was compared with the results from general MCDM methods, such as fuzzy TOPSIS, and classical GDM methods, such as those developed by Borda, Condorcet, and Copeland. The evaluated priorities were significantly dependent on the employed decision-making method. The proposed fuzzy GDM approach can reduce the uncertainty in the data confidence and weight derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
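A minimal crisp VIKOR ranking is sketched below to show the S, R, and Q measures; the indicator matrix, weights, and the choice of reference point are invented placeholders, and the triangular-fuzzy extension used in the paper is omitted.

```python
import numpy as np

# Rows: spatial units, columns: vulnerability indicators (scaled so larger = more vulnerable).
X = np.array([[0.7, 120.0, 3.1],
              [0.4, 300.0, 2.2],
              [0.9,  80.0, 4.0],
              [0.5, 210.0, 2.8]])
w = np.array([0.5, 0.3, 0.2])        # indicator weights (placeholders)
v = 0.5                              # weight of the "group utility" strategy

# Take the most vulnerable observed value of each indicator as the reference point,
# so that small Q flags the units to prioritise (an illustrative framing, not the paper's).
f_ref, f_opp = X.max(axis=0), X.min(axis=0)
d = (f_ref - X) / (f_ref - f_opp)    # normalised distance from the reference point

S = (w * d).sum(axis=1)              # group utility measure
R = (w * d).max(axis=1)              # individual regret measure
Q = v * (S - S.min()) / (S.max() - S.min()) + (1 - v) * (R - R.min()) / (R.max() - R.min())

for rank, j in enumerate(np.argsort(Q), start=1):
    print(f"priority rank {rank}: unit {j}  S={S[j]:.3f} R={R[j]:.3f} Q={Q[j]:.3f}")
```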
Uncertainty modelling of real-time observation of a moving object: photogrammetric measurements
NASA Astrophysics Data System (ADS)
Ulrich, Thomas
2015-04-01
Photogrammetric systems are widely used in the field of industrial metrology to measure kinematic tasks such as tracking robot movements. In order to assess spatiotemporal deviations of a kinematic movement, it is crucial to have a reliable uncertainty of the kinematic measurements. Common methods to evaluate the uncertainty in kinematic measurements include approximations specified by the manufacturers, various analytical adjustment methods, and Kalman filters. Here a hybrid system estimator in conjunction with a kinematic measurement model is applied. This method can be applied to processes which include various types of kinematic behaviour: constant velocity, variable acceleration, or variable turn rates. Additionally, it has been shown that the approach is in accordance with GUM (Guide to the Expression of Uncertainty in Measurement). The approach is compared to the Kalman filter using simulated data to achieve an overall error calculation. Furthermore, the new approach is used for the analysis of a rotating system as this system has both a constant and a variable turn rate. As the new approach reduces overshoots it is more appropriate for analysing kinematic processes than the Kalman filter. In comparison with the manufacturer's approximations, the new approach takes account of kinematic behaviour, with an improved description of the real measurement process. Therefore, this approach is well-suited to the analysis of kinematic processes with unknown changes in kinematic behaviour.
Methods for Dichoptic Stimulus Presentation in Functional Magnetic Resonance Imaging - A Review
Choubey, Bhaskar; Jurcoane, Alina; Muckli, Lars; Sireteanu, Ruxandra
2009-01-01
Dichoptic stimuli (different stimuli displayed to each eye) are increasingly being used in functional brain imaging experiments using visual stimulation. These studies include investigation into binocular rivalry, interocular information transfer, three-dimensional depth perception as well as impairments of the visual system like amblyopia and stereodeficiency. In this paper, we review various approaches of displaying dichoptic stimulus used in functional magnetic resonance imaging experiments. These include traditional approaches of using filters (red-green, red-blue, polarizing) with optical assemblies as well as newer approaches of using bi-screen goggles. PMID:19526076
Application of Chimera Grid Scheme to Combustor Flowfields at all Speeds
NASA Technical Reports Server (NTRS)
Yungster, Shaye; Chen, Kuo-Huey
1997-01-01
A CFD method for solving combustor flowfields at all speeds on complex configurations is presented. The approach is based on the ALLSPD-3D code which uses the compressible formulation of the flow equations including real gas effects, nonequilibrium chemistry and spray combustion. To facilitate the analysis of complex geometries, the chimera grid method is utilized. To the best of our knowledge, this is the first application of the chimera scheme to reacting flows. In order to evaluate the effectiveness of this numerical approach, several benchmark calculations of subsonic flows are presented. These include steady and unsteady flows, and bluff-body stabilized spray and premixed combustion flames.
Diversified Research Methods Education in LIS: Thinking outside the Box
ERIC Educational Resources Information Center
Luo, Lili
2017-01-01
A small number of LIS degree programs have adopted a diversified approach to research methods education, including offering an array of specialized research methods courses in addition to a general introductory course. The current study conducted an in-depth investigation of the diversified research methods curriculum of the LIS program at San…
Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken
2013-11-01
Elicitation is a technique that can be used to obtain probability distribution from experts about unknown quantities. We conducted a methodology review of reports where probability distributions had been elicited from experts to be used in model-based health technology assessments. Databases including MEDLINE, EMBASE and the CRD database were searched from inception to April 2013. Reference lists were checked and citation mapping was also used. Studies describing their approach to the elicitation of probability distributions were included. Data was abstracted on pre-defined aspects of the elicitation technique. Reports were critically appraised on their consideration of the validity, reliability and feasibility of the elicitation exercise. Fourteen articles were included. Across these studies, the most marked features were heterogeneity in elicitation approach and failure to report key aspects of the elicitation method. The most frequently used approaches to elicitation were the histogram technique and the bisection method. Only three papers explicitly considered the validity, reliability and feasibility of the elicitation exercises. Judged by the studies identified in the review, reports of expert elicitation are insufficient in detail and this impacts on the perceived usability of expert-elicited probability distributions. In this context, the wider credibility of elicitation will only be improved by better reporting and greater standardisation of approach. Until then, the advantage of eliciting probability distributions from experts may be lost.
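The bisection technique elicits a median and then quartiles by repeatedly splitting intervals; once quantiles are available, a parametric distribution can be fitted to them. The sketch below fits a beta distribution to invented elicited quartiles; the fitting heuristic is the editor's own crude construction, not a method taken from the reviewed reports.

```python
from scipy import stats
from scipy.optimize import brentq

# Quantiles elicited from an expert via the bisection technique (illustrative values):
# "equally likely above or below 0.35" (median), "25% chance below 0.25", "25% chance above 0.50".
elicited = {0.25: 0.25, 0.50: 0.35, 0.75: 0.50}

def fit_beta(q25, q50, q75):
    """Crude fit: fix the beta mean at the elicited median, then choose the
    concentration so the interquartile range matches the elicited quartiles."""
    def gap(conc):
        a, b = conc * q50, conc * (1 - q50)
        return (stats.beta.ppf(0.75, a, b) - stats.beta.ppf(0.25, a, b)) - (q75 - q25)
    conc = brentq(gap, 0.5, 500.0)     # root-find the concentration parameter
    return conc * q50, conc * (1 - q50)

a, b = fit_beta(elicited[0.25], elicited[0.50], elicited[0.75])
print(f"fitted Beta({a:.2f}, {b:.2f}); quartiles:",
      [round(stats.beta.ppf(p, a, b), 3) for p in (0.25, 0.5, 0.75)])
```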
Detection and categorization of bacteria habitats using shallow linguistic analysis
2015-01-01
Background: Information regarding bacteria biotopes is important for several research areas including health sciences, microbiology, and food processing and preservation. One of the challenges for scientists in these domains is the huge amount of information buried in the text of electronic resources. Developing methods to automatically extract bacteria habitat relations from the text of these electronic resources is crucial for facilitating research in these areas. Methods: We introduce a linguistically motivated rule-based approach for recognizing and normalizing names of bacteria habitats in biomedical text by using an ontology. Our approach is based on the shallow syntactic analysis of the text that includes sentence segmentation, part-of-speech (POS) tagging, partial parsing, and lemmatization. In addition, we propose two methods for identifying bacteria habitat localization relations. The underlying assumption for the first method is that discourse changes with a new paragraph. Therefore, it operates on a paragraph basis. The second method performs a more fine-grained analysis of the text and operates on a sentence basis. We also develop a novel anaphora resolution method for bacteria coreferences and incorporate it with the sentence-based relation extraction approach. Results: We participated in the Bacteria Biotope (BB) Task of the BioNLP Shared Task 2013. Our system (Boun) achieved the second best performance with a 68% Slot Error Rate (SER) in Sub-task 1 (Entity Detection and Categorization), and ranked third with an F-score of 27% in Sub-task 2 (Localization Event Extraction). This paper reports the system implemented for the shared task, including the novel methods developed and the improvements obtained after the official evaluation. The extensions include the expansion of the OntoBiotope ontology using the training set for Sub-task 1, and the novel sentence-based relation extraction method incorporated with anaphora resolution for Sub-task 2. These extensions resulted in promising results for Sub-task 1 with a SER of 68%, and state-of-the-art performance for Sub-task 2 with an F-score of 53%. Conclusions: Our results show that a linguistically-oriented approach based on the shallow syntactic analysis of the text is as effective as machine learning approaches for the detection and ontology-based normalization of habitat entities. Furthermore, the newly developed sentence-based relation extraction system with the anaphora resolution module significantly outperforms the paragraph-based one, as well as the other systems that participated in the BB Shared Task 2013. PMID:26201262
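A toy version of the ontology-normalization step is sketched below: habitat mentions are matched by lemma against a small ontology fragment. The ontology entries, the lemmatizer, and the matching rule are stand-ins; the actual system uses full shallow parsing and the OntoBiotope ontology.

```python
import re

# Tiny stand-in for an OntoBiotope-style ontology: concept id -> synonyms (lemmatised).
ontology = {
    "OBT:001": {"human", "human being", "person"},
    "OBT:002": {"gut", "intestine", "intestinal tract"},
    "OBT:003": {"soil"},
}

# Extremely small lemmatiser stand-in (real systems use POS tagging plus a proper lemmatiser).
LEMMAS = {"humans": "human", "guts": "gut", "intestines": "intestine", "soils": "soil"}
def lemmatise(token):
    return LEMMAS.get(token, token)

def normalise_habitats(sentence):
    tokens = [lemmatise(t) for t in re.findall(r"[a-z]+", sentence.lower())]
    found = []
    # Try the longest ontology synonym starting at each token position.
    for i in range(len(tokens)):
        for span in (3, 2, 1):
            phrase = " ".join(tokens[i:i + span])
            for concept, synonyms in ontology.items():
                if phrase in synonyms:
                    found.append((phrase, concept))
    return found

print(normalise_habitats("Bifidobacterium longum colonises the intestines of healthy humans."))
```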
ERIC Educational Resources Information Center
Touval, Ayana
2009-01-01
The kinematics teaching strategy is a teaching method that stimulates kinesthetic intelligence and thus offers students an unconventional approach for exploring mathematical ideas through movement. This article describes how to use the kinesthetic approach to introduce radian measure. The article includes detailed descriptions of easy-to-use…
Using Mixed Methods and Collaboration to Evaluate an Education and Public Outreach Program (Invited)
NASA Astrophysics Data System (ADS)
Shebby, S.; Shipp, S. S.
2013-12-01
Traditional indicators (such as the number of participants or Likert-type ratings of participant perceptions) are often used to provide stakeholders with basic information about program outputs and to justify funding decisions. However, use of qualitative methods can strengthen the reliability of these data and provide stakeholders with more meaningful information about program challenges, successes, and ultimate impacts (Stern, Stame, Mayne, Forss, David & Befani, 2012). In this session, presenters will discuss how they used a mixed methods evaluation to determine the impact of an education and public outreach (EPO) program. EPO efforts were intended to foster more effective, sustainable, and efficient utilization of science discoveries and learning experiences through three main goals 1) increase engagement and support by leveraging of resources, expertise, and best practices; 2) organize a portfolio of resources for accessibility, connectivity, and strategic growth; and 3) develop an infrastructure to support coordination. The evaluation team used a mixed methods design to conduct the evaluation. Presenters will first discuss five potential benefits of mixed methods designs: triangulation of findings, development, complementarity, initiation, and value diversity (Greene, Caracelli & Graham, 2005). They will next demonstrate how a 'mix' of methods, including artifact collection, surveys, interviews, focus groups, and vignettes, was included in the EPO project's evaluation design, providing specific examples of how alignment between the program theory and the evaluation plan was best achieved with a mixed methods approach. The presentation will also include an overview of different mixed methods approaches and information about important considerations when using a mixed methods design, such as selection of data collection methods and sources, and the timing and weighting of quantitative and qualitative methods (Creswell, 2003). Ultimately, this presentation will provide insight into how a mixed methods approach was used to provide stakeholders with important information about progress toward program goals. Creswell, J.W. (2003). Research design: Qualitative, quantitative, and mixed approaches. Thousand Oaks, CA: Sage. Greene, J. C., Caracelli, V. J., & Graham, W. D. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255-274. Stern, E; Stame, N; Mayne, J; Forss, K; Davis, R & Befani, B (2012) Broadening the range of designs and methods for impact evaluation. Department for International Development.
Nonlinear Attitude Filtering Methods
NASA Technical Reports Server (NTRS)
Markley, F. Landis; Crassidis, John L.; Cheng, Yang
2005-01-01
This paper provides a survey of modern nonlinear filtering methods for attitude estimation. Early applications relied mostly on the extended Kalman filter for attitude estimation. Since these applications, several new approaches have been developed that have proven to be superior to the extended Kalman filter. Several of these approaches maintain the basic structure of the extended Kalman filter, but employ various modifications in order to provide better convergence or improve other performance characteristics. Examples of such approaches include: filter QUEST, extended QUEST, the super-iterated extended Kalman filter, the interlaced extended Kalman filter, and the second-order Kalman filter. Filters that propagate and update a discrete set of sigma points rather than using linearized equations for the mean and covariance are also reviewed. A two-step approach is discussed with a first-step state that linearizes the measurement model and an iterative second step to recover the desired attitude states. These approaches are all based on the Gaussian assumption that the probability density function is adequately specified by its mean and covariance. Other approaches that do not require this assumption are reviewed, including particle filters and a Bayesian filter based on a non-Gaussian, finite-parameter probability density function on SO(3). Finally, the predictive filter, nonlinear observers and adaptive approaches are shown. The strengths and weaknesses of the various approaches are discussed.
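For the sigma-point family mentioned above, the following sketch generates standard unscented-transform sigma points and pushes them through a toy nonlinearity; attitude filters additionally have to respect the structure of SO(3), which is omitted here, and the function and parameter values are placeholders.

```python
import numpy as np

def sigma_points(x, P, kappa=1.0):
    """Standard unscented-transform sigma points and weights for mean x, covariance P."""
    n = len(x)
    L = np.linalg.cholesky((n + kappa) * P)
    pts = [x] + [x + L[:, i] for i in range(n)] + [x - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def unscented_transform(f, x, P, kappa=1.0):
    pts, w = sigma_points(x, P, kappa)
    y = np.array([f(p) for p in pts])
    mean = w @ y
    cov = sum(wi * np.outer(yi - mean, yi - mean) for wi, yi in zip(w, y))
    return mean, cov

# Toy nonlinearity standing in for a measurement model.
f = lambda v: np.array([np.sin(v[0]) + v[1], v[0] * v[1]])
mean, cov = unscented_transform(f, x=np.array([0.1, 0.5]), P=0.01 * np.eye(2))
print("propagated mean:", mean, "\npropagated covariance:\n", cov)
```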
Fuzzy architecture assessment for critical infrastructure resilience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muller, George
2012-12-01
This paper presents an approach for the selection of alternative architectures in a connected infrastructure system to increase resilience of the overall infrastructure system. The paper begins with a description of resilience and critical infrastructure, then summarizes existing approaches to resilience, and presents a fuzzy-rule based method of selecting among alternative infrastructure architectures. This methodology includes considerations which are most important when deciding on an approach to resilience. The paper concludes with a proposed approach which builds on existing resilience architecting methods by integrating key system aspects using fuzzy memberships and fuzzy rule sets. This novel approach aids the systems architect in considering resilience for the evaluation of architectures for adoption into the final system architecture.
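A minimal Mamdani-style evaluation of fuzzy memberships and rules is sketched below, producing a resilience score from two invented system aspects; the membership breakpoints and rules are placeholders, not those of the cited methodology.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def resilience_score(redundancy, recovery_hours):
    """Tiny Mamdani-style inference: two inputs, two rules, centroid defuzzification."""
    # Input memberships (breakpoints are illustrative assumptions).
    red_high = tri(redundancy, 0.4, 1.0, 1.6)
    rec_fast = tri(recovery_hours, -24.0, 0.0, 24.0)
    # Rule 1: IF redundancy high AND recovery fast THEN resilience high (min for AND).
    fire_high = min(red_high, rec_fast)
    # Rule 2: IF redundancy low OR recovery slow THEN resilience low (max for OR).
    fire_low = max(1.0 - red_high, 1.0 - rec_fast)
    # Output memberships over a resilience scale [0, 1], clipped by firing strengths.
    z = np.linspace(0.0, 1.0, 101)
    agg = np.maximum(np.minimum(tri(z, 0.5, 1.0, 1.5), fire_high),
                     np.minimum(tri(z, -0.5, 0.0, 0.5), fire_low))
    return float((z * agg).sum() / agg.sum())          # centroid defuzzification

for arch in [{"redundancy": 0.9, "recovery_hours": 4.0},
             {"redundancy": 0.3, "recovery_hours": 36.0}]:
    print(arch, "->", round(resilience_score(**arch), 3))
```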
Methods to enable the design of bioactive small molecules targeting RNA
Disney, Matthew D.; Yildirim, Ilyas; Childs-Disney, Jessica L.
2014-01-01
RNA is an immensely important target for small molecule therapeutics or chemical probes of function. However, methods that identify, annotate, and optimize RNA-small molecule interactions that could enable the design of compounds that modulate RNA function are in their infancies. This review describes recent approaches that have been developed to understand and optimize RNA motif-small molecule interactions, including Structure-Activity Relationships Through Sequencing (StARTS), quantitative structure-activity relationships (QSAR), chemical similarity searching, structure-based design and docking, and molecular dynamics (MD) simulations. Case studies described include the design of small molecules targeting RNA expansions, the bacterial A-site, viral RNAs, and telomerase RNA. These approaches can be combined to afford a synergistic method to exploit the myriad of RNA targets in the transcriptome. PMID:24357181
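The chemical similarity searching mentioned above can be illustrated with Tanimoto similarity over binary fingerprints; the fingerprints below are toy bit sets rather than real Morgan/ECFP fingerprints from a cheminformatics toolkit.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two fingerprints given as sets of on-bit indices."""
    inter = len(fp_a & fp_b)
    return inter / (len(fp_a) + len(fp_b) - inter) if (fp_a or fp_b) else 0.0

# Toy fingerprints: a query scaffold vs. a small library (on-bit indices are invented).
query = {3, 17, 42, 88, 120, 256}
library = {
    "cmpd_A": {3, 17, 42, 88, 121, 256},
    "cmpd_B": {5, 90, 200, 310},
    "cmpd_C": {3, 17, 42, 88, 120, 256, 400},
}

ranked = sorted(((tanimoto(query, fp), name) for name, fp in library.items()), reverse=True)
for score, name in ranked:
    print(f"{name}: Tanimoto = {score:.2f}")
```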
Dictionary Indexing of Electron Channeling Patterns.
Singh, Saransh; De Graef, Marc
2017-02-01
The dictionary-based approach to the indexing of diffraction patterns is applied to electron channeling patterns (ECPs). The main ingredients of the dictionary method are introduced, including the generalized forward projector (GFP), the relevant detector model, and a scheme to uniformly sample orientation space using the "cubochoric" representation. The GFP is used to compute an ECP "master" pattern. Derivative-free optimization algorithms, including the Nelder-Mead simplex and bound optimization by quadratic approximation, are used to determine the correct detector parameters and to refine the orientation obtained from the dictionary approach. The indexing method is applied to polysilicon, and the recovered parameters show excellent agreement with the calibrated values. Finally, it is shown that the method yields a mean disorientation error of 1.0° with 0.5° SD over a range of detector parameters.
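The derivative-free refinement step can be illustrated with SciPy's Nelder-Mead implementation; the dot-product objective and the placeholder forward model below are assumptions for illustration and are not the actual ECP dictionary-indexing code.

    import numpy as np
    from scipy.optimize import minimize

    def dissimilarity(orientation, experimental, simulate):
        """Negative normalized dot product between simulated and experimental patterns."""
        sim = simulate(orientation)
        return -np.dot(sim, experimental) / (np.linalg.norm(sim) * np.linalg.norm(experimental))

    # Placeholder forward model standing in for the GFP/master-pattern projector
    simulate = lambda eu: np.cos(np.outer(eu, np.arange(300))).sum(axis=0)

    experimental = simulate(np.array([0.52, 0.80, 0.15]))   # synthetic "measured" pattern
    x0 = np.array([0.50, 0.78, 0.20])                        # best match from the dictionary search
    result = minimize(dissimilarity, x0, args=(experimental, simulate), method="Nelder-Mead")
    refined_orientation = result.x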
Recent developments in structural proteomics for protein structure determination.
Liu, Hsuan-Liang; Hsu, Jyh-Ping
2005-05-01
The major challenges in structural proteomics include identifying all the proteins on a genome-wide scale, determining their structure-function relationships, and outlining the precise three-dimensional structures of the proteins. Protein structures are typically determined by experimental approaches such as X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. However, the coverage of three-dimensional structure space by these techniques is still limited. Thus, computational methods such as comparative and de novo approaches and molecular dynamics simulations are intensively used as alternative tools to predict the three-dimensional structures and dynamic behavior of proteins. This review summarizes recent developments in structural proteomics for protein structure determination, including instrumental methods such as X-ray crystallography and NMR spectroscopy, and computational methods such as comparative and de novo structure prediction and molecular dynamics simulations.
An international survey and modified Delphi approach revealed numerous rapid review methods.
Tricco, Andrea C; Zarin, Wasifa; Antony, Jesmin; Hutton, Brian; Moher, David; Sherifali, Diana; Straus, Sharon E
2016-02-01
To solicit experiences with and perceptions of rapid reviews from stakeholders, including researchers, policy makers, industry, journal editors, and health care providers. An international survey of rapid review producers and modified Delphi. Forty rapid review producers responded to our survey (63% response rate). Eighty-eight rapid reviews with 31 different names were reported. Rapid review commissioning organizations were predominantly government (78%) and health care (58%) organizations. Several rapid review approaches were identified, including updating the literature search of previous reviews (92%); limiting the search strategy by date of publication (88%); and having only one reviewer screen (85%), abstract data (84%), and assess the quality of studies (86%). The modified Delphi included input from 113 stakeholders on the rapid review approaches from the survey. Approach 1 (search limited by date and language; study selection by one reviewer only, and data abstraction and quality appraisal conducted by one reviewer and one verifier) was ranked the most feasible (72%, 81/113 responses), with the lowest perceived risk of bias (12%, 12/103); it also ranked second in timeliness (37%, 38/102) and fifth in comprehensiveness (5%, 5/100). Rapid reviews have many names and approaches, and some methods might be more desirable than others. Copyright © 2016 Elsevier Inc. All rights reserved.
Systematic Approach to Identifying Deeply Buried Archeological Deposits
DOT National Transportation Integrated Search
2018-02-01
Traditional methods used to discover archeological sites include pedestrian surface surveys and relatively shallow hand-dug shovel or soil core testing. While these methods are appropriate for locating surface and near-surface sites on ridges, hillto...
Mathematical modeling and computational prediction of cancer drug resistance.
Sun, Xiaoqiang; Hu, Bin
2017-06-23
Diverse forms of resistance to anticancer drugs can lead to the failure of chemotherapy. Drug resistance is one of the most intractable issues for successfully treating cancer in current clinical practice. Effective clinical approaches that could counter drug resistance by restoring the sensitivity of tumors to the targeted agents are urgently needed. As numerous experimental results on resistance mechanisms have been obtained and a mass of high-throughput data has been accumulated, mathematical modeling and computational predictions using systematic and quantitative approaches have become increasingly important, as they can potentially provide deeper insights into resistance mechanisms, generate novel hypotheses or suggest promising treatment strategies for future testing. In this review, we first briefly summarize the current progress of experimentally revealed resistance mechanisms of targeted therapy, including genetic mechanisms, epigenetic mechanisms, posttranslational mechanisms, cellular mechanisms, microenvironmental mechanisms and pharmacokinetic mechanisms. Subsequently, we list several currently available databases and Web-based tools related to drug sensitivity and resistance. Then, we focus primarily on introducing some state-of-the-art computational methods used in drug resistance studies, including mechanism-based mathematical modeling approaches (e.g. molecular dynamics simulation, kinetic model of molecular networks, ordinary differential equation model of cellular dynamics, stochastic model, partial differential equation model, agent-based model, pharmacokinetic-pharmacodynamic model, etc.) and data-driven prediction methods (e.g. omics data-based conventional screening approach for node biomarkers, static network approach for edge biomarkers and module biomarkers, dynamic network approach for dynamic network biomarkers and dynamic module network biomarkers, etc.). Finally, we discuss several further questions and future directions for the use of computational methods for studying drug resistance, including inferring drug-induced signaling networks, multiscale modeling, drug combinations and precision medicine. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
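As one deliberately simplified instance of the ODE-based cellular-dynamics models mentioned in the review, the sketch below integrates a two-compartment logistic model of drug-sensitive and drug-resistant tumor cells under a constant drug kill rate; all parameter values and the model form are illustrative assumptions.

    import numpy as np
    from scipy.integrate import solve_ivp

    def tumor_model(t, y, r_s=0.05, r_r=0.03, K=1e9, mu=1e-6, kill=0.04):
        """S: drug-sensitive cells, R: resistant cells; logistic growth, mutation, drug kill."""
        S, R = y
        crowding = 1.0 - (S + R) / K
        dS = r_s * S * crowding - kill * S - mu * S     # sensitive cells killed by the drug
        dR = r_r * R * crowding + mu * S                # resistant cells seeded by mutation
        return [dS, dR]

    sol = solve_ivp(tumor_model, t_span=(0.0, 365.0), y0=[1e7, 1e2])
    S_end, R_end = sol.y[:, -1]   # under sustained treatment the resistant clone eventually dominates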
Bravata, Dena M; McDonald, Kathryn M; Shojania, Kaveh G; Sundaram, Vandana; Owens, Douglas K
2005-06-21
Some important health policy topics, such as those related to the delivery, organization, and financing of health care, present substantial challenges to established methods for evidence synthesis. For example, such reviews may ask: What is the effect of for-profit versus not-for-profit delivery of care on patient outcomes? Or, which strategies are the most effective for promoting preventive care? This paper describes innovative methods for synthesizing evidence related to the delivery, organization, and financing of health care. We found 13 systematic reviews on these topics that described novel methodologic approaches. Several of these syntheses used 3 approaches: conceptual frameworks to inform problem formulation, systematic searches that included nontraditional literature sources, and hybrid synthesis methods that included simulations to address key gaps in the literature. As the primary literature on these topics expands, so will opportunities to develop additional novel methods for performing high-quality comprehensive syntheses.
Discrete Optimization of Electronic Hyperpolarizabilities in a Chemical Subspace
2009-05-01
molecular design. Methods for optimization in discrete spaces have been studied extensively and recently reviewed (5). Optimization methods include... integer programming, as in branch-and-bound techniques (including dead-end elimination [6]), simulated annealing (7), and genetic algorithms (8)... These algorithms have found renewed interest and application in molecular and materials design (9-12). Recently, new approaches have been...
A longitudinal multilevel CFA-MTMM model for interchangeable and structurally different methods
Koch, Tobias; Schultze, Martin; Eid, Michael; Geiser, Christian
2014-01-01
One of the key interests in the social sciences is the investigation of change and stability of a given attribute. Although numerous models have been proposed in the past for analyzing longitudinal data, including multilevel and/or latent variable modeling approaches, only a few modeling approaches have been developed for studying construct validity in longitudinal multitrait-multimethod (MTMM) measurement designs. The aim of the present study was to extend the spectrum of current longitudinal modeling approaches for MTMM analysis. Specifically, a new longitudinal multilevel CFA-MTMM model for measurement designs with structurally different and interchangeable methods (called the Latent-State-Combination-Of-Methods model, LS-COM) is presented. Interchangeable methods are methods that are randomly sampled from a set of equivalent methods (e.g., multiple student ratings of teaching quality), whereas structurally different methods are methods that cannot be easily replaced by one another (e.g., teacher ratings, self-ratings, principal ratings). Results of a simulation study indicate that the parameters and standard errors in the LS-COM model are well recovered even in conditions with only five observations per estimated model parameter. The advantages and limitations of the LS-COM model relative to other longitudinal MTMM modeling approaches are discussed. PMID:24860515
ERIC Educational Resources Information Center
Neman, Robert Lynn
This study was designed to assess the effects of the problem-oriented method compared to those of the traditional approach in general chemistry at the college level. The problem-oriented course included topics such as air and water pollution, drug addiction and analysis, tetraethyl-lead additives, insecticides in the environment, and recycling of…
Changing physician behavior: what works?
Mostofian, Fargoi; Ruban, Cynthiya; Simunovic, Nicole; Bhandari, Mohit
2015-01-01
There are various interventions for guideline implementation in clinical practice, but the effects of these interventions are generally unclear. We conducted a systematic review to identify effective methods of implementing clinical research findings and clinical guidelines to change physician practice patterns, in surgical and general practice. Systematic review of reviews. We searched electronic databases (MEDLINE, EMBASE, and PubMed) for systematic reviews published in English that evaluated the effectiveness of different implementation methods. Two reviewers independently assessed eligibility for inclusion and methodological quality, and extracted relevant data. Fourteen reviews covering a wide range of interventions were identified. The intervention methods used include: audit and feedback, computerized decision support systems, continuing medical education, financial incentives, local opinion leaders, marketing, passive dissemination of information, patient-mediated interventions, reminders, and multifaceted interventions. Active approaches, such as academic detailing, led to greater effects than traditional passive approaches. According to the findings of 3 reviews, 71% of studies included in these reviews showed positive change in physician behavior when exposed to active educational methods and multifaceted interventions. Active forms of continuing medical education and multifaceted interventions were found to be the most effective methods for implementing guidelines into general practice. Additionally, active approaches to changing physician performance were shown to improve practice to a greater extent than traditional passive methods. Further primary research is necessary to evaluate the effectiveness of these methods in a surgical setting.
Structural issues affecting mixed methods studies in health research: a qualitative study.
O'Cathain, Alicia; Nicholl, Jon; Murphy, Elizabeth
2009-12-09
Health researchers undertake studies which combine qualitative and quantitative methods. Little attention has been paid to the structural issues affecting this mixed methods approach. We explored the facilitators and barriers to undertaking mixed methods studies in health research. Face-to-face semi-structured interviews with 20 researchers experienced in mixed methods research in health in the United Kingdom. Structural facilitators for undertaking mixed methods studies included a perception that funding bodies promoted this approach, and the multidisciplinary constituency of some university departments. Structural barriers to exploiting the potential of these studies included a lack of education and training in mixed methods research, and a lack of templates for reporting mixed methods articles in peer-reviewed journals. The 'hierarchy of evidence' relating to effectiveness studies in health care research, with the randomised controlled trial as the gold standard, appeared to pervade the health research infrastructure. Thus integration of data and findings from qualitative and quantitative components of mixed methods studies, and dissemination of integrated outputs, tended to occur through serendipity and effort, further highlighting the presence of structural constraints. Researchers are agents who may also support current structures - journal reviewers and editors, and directors of postgraduate training courses - and thus have the ability to improve the structural support for exploiting the potential of mixed methods research. The environment for health research in the UK appears to be conducive to mixed methods research but not to exploiting the potential of this approach. Structural change, as well as change in researcher behaviour, will be necessary if researchers are to fully exploit the potential of using mixed methods research.
An Aggregated Method for Determining Railway Defects and Obstacle Parameters
NASA Astrophysics Data System (ADS)
Loktev, Daniil; Loktev, Alexey; Stepanov, Roman; Pevzner, Viktor; Alenov, Kanat
2018-03-01
A method combining image blur analysis and stereo vision algorithms is proposed to determine the distance to objects (including external defects of railway tracks) and the speed of moving obstacles. To estimate how the distance deviation depends on blur, a statistical approach and logarithmic, exponential, and linear standard functions are used. The statistical approach includes least-squares estimation and the method of least absolute deviations (least modules). The accuracy of determining the distance to the object, its speed, and its direction of movement is obtained. The paper develops a method of determining distances to objects by analyzing a series of images and assessing depth from defocus, aggregated with stereoscopic vision. This method is based on the physical dependence of the determined distance to the object in the acquired image on the focal length or aperture of the lens. In calculating the blur-spot diameter it is assumed that blur spreads from a point equally in all directions. Under the proposed approach, the distance to the studied object and its blur can be determined by analyzing a series of images obtained with the video detector at different settings. The article proposes and scientifically substantiates new and improved methods for detecting the parameters of static and moving objects under monitoring, and also compares the results of the various methods and of the experiments. It is shown that the aggregated method gives the best approximation to the real distances.
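A minimal sketch of the least-squares step described above, fitting an assumed inverse relation between blur-spot size and distance from a small calibration set; the data points and model form are made up for illustration.

    import numpy as np
    from scipy.optimize import curve_fit

    def blur_model(distance, a, b):
        """Assumed relation: blur-spot diameter decreases roughly as 1/distance."""
        return a / distance + b

    # Hypothetical calibration data: distance (m) vs measured blur diameter (px)
    distances = np.array([5.0, 10.0, 15.0, 20.0, 30.0])
    blur_px = np.array([8.1, 4.2, 2.9, 2.3, 1.6])

    (a, b), cov = curve_fit(blur_model, distances, blur_px)
    estimate_distance = lambda blur: a / (blur - b)   # invert the fitted model for new measurements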
Good Laboratory Practices of Materials Testing at NASA White Sands Test Facility
NASA Technical Reports Server (NTRS)
Hirsch, David; Williams, James H.
2005-01-01
An approach to good laboratory practices of materials testing at NASA White Sands Test Facility is presented. The contents include: 1) Current approach; 2) Data analysis; and 3) Improvements sought by WSTF to enhance the diagnostic capability of existing methods.
From Continuous Improvement to Organisational Learning: Developmental Theory.
ERIC Educational Resources Information Center
Murray, Peter; Chapman, Ross
2003-01-01
Explores continuous improvement methods, which underlie total quality management, finding barriers to implementation in practice that are related to a one-dimensional approach. Suggests a multiple, unbounded learning cycle, a holistic approach that includes adaptive learning, learning styles, generative learning, and capability development.…
Local algebraic analysis of differential systems
NASA Astrophysics Data System (ADS)
Kaptsov, O. V.
2015-06-01
We propose a new approach for studying the compatibility of partial differential equations. This approach is a synthesis of the Riquier method, Gröbner basis theory, and elements of algebraic geometry. As applications, we consider systems including the wave equation and the sine-Gordon equation.
An Extended Spectral-Spatial Classification Approach for Hyperspectral Data
NASA Astrophysics Data System (ADS)
Akbari, D.
2017-11-01
In this paper an extended classification approach for hyperspectral imagery based on both spectral and spatial information is proposed. The spatial information is obtained by an enhanced marker-based minimum spanning forest (MSF) algorithm. Three different methods of dimension reduction are first used to obtain the subspace of the hyperspectral data: (1) unsupervised feature extraction methods, including principal component analysis (PCA), independent component analysis (ICA), and minimum noise fraction (MNF); (2) supervised feature extraction, including decision boundary feature extraction (DBFE), discriminant analysis feature extraction (DAFE), and nonparametric weighted feature extraction (NWFE); and (3) a genetic algorithm (GA). The spectral features obtained are then fed into the enhanced marker-based MSF classification algorithm. In the enhanced MSF algorithm, the markers are extracted from the classification maps obtained by both the SVM and the watershed segmentation algorithm. To evaluate the proposed approach, it is tested on the Pavia University hyperspectral data. Experimental results show that the proposed approach using GA achieves approximately 8% higher overall accuracy than the original MSF-based algorithm.
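As a hedged sketch of the spectral part of this pipeline only (PCA feature extraction followed by an SVM classifier, with the marker-based MSF spatial step omitted), using scikit-learn on a synthetic data cube whose shapes and labels are placeholders:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC

    # Synthetic hyperspectral cube: 100 x 100 pixels, 103 bands (shapes are illustrative)
    cube = np.random.rand(100, 100, 103)
    labels = np.random.randint(0, 9, size=(100, 100))            # placeholder ground truth

    X = cube.reshape(-1, cube.shape[-1])                          # pixels x bands
    X_reduced = PCA(n_components=10).fit_transform(X)             # spectral dimension reduction

    train_idx = np.random.choice(X.shape[0], 2000, replace=False) # small labelled subset
    svm = SVC(kernel="rbf", C=100.0, gamma="scale").fit(X_reduced[train_idx], labels.ravel()[train_idx])
    spectral_map = svm.predict(X_reduced).reshape(100, 100)       # per-pixel map fed to the spatial step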
Exploiting Quantum Resonance to Solve Combinatorial Problems
NASA Technical Reports Server (NTRS)
Zak, Michail; Fijany, Amir
2006-01-01
Quantum resonance would be exploited in a proposed quantum-computing approach to the solution of combinatorial optimization problems. In quantum computing in general, one takes advantage of the fact that an algorithm cannot be decoupled from the physical effects available to implement it. Prior approaches to quantum computing have involved exploitation of only a subset of known quantum physical effects, notably including parallelism and entanglement, but not including resonance. In the proposed approach, one would utilize the combinatorial properties of tensor-product decomposability of unitary evolution of many-particle quantum systems for physically simulating solutions to NP-complete problems (a class of problems that are intractable with respect to classical methods of computation). In this approach, reinforcement and selection of a desired solution would be executed by means of quantum resonance. Classes of NP-complete problems that are important in practice and could be solved by the proposed approach include planning, scheduling, search, and optimal design.
Inverse transonic airfoil design methods including boundary layer and viscous interaction effects
NASA Technical Reports Server (NTRS)
Carlson, L. A.
1979-01-01
The development and incorporation into TRANDES of a fully conservative analysis method utilizing the artificial compressibility approach is described. The method allows for lifting cases and finite thickness airfoils and utilizes a stretched coordinate system. Wave drag and massive separation studies are also discussed.
ERIC Educational Resources Information Center
Aagaard, Jesper
2017-01-01
In time, phenomenology has become a viable approach to conducting qualitative studies in education. Popular and well-established methods include descriptive and hermeneutic phenomenology. Based on critiques of the essentialism and receptivity of these two methods, however, this article offers a third variation of empirical phenomenology:…
Strategy to Promote Active Learning of an Advanced Research Method
ERIC Educational Resources Information Center
McDermott, Hilary J.; Dovey, Terence M.
2013-01-01
Research methods courses aim to equip students with the knowledge and skills required for research yet seldom include practical aspects of assessment. This reflective practitioner report describes and evaluates an innovative approach to teaching and assessing advanced qualitative research methods to final-year psychology undergraduate students. An…
77 FR 40866 - Applications for New Awards; Innovative Approaches to Literacy Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-11
... supported by the methods that have been employed. The term includes, appropriate to the research being... observational methods that provide reliable data; (iv) making claims of causal relationships only in random...; and (vii) using research designs and methods appropriate to the research question posed...
Verification and Validation of Monte Carlo N-Particle 6 for Computing Gamma Protection Factors
2015-03-26
methods for evaluating RPFs, which it used for the subsequent 30 years. These approaches included computational modeling, radioisotopes, and a high... Contents excerpts: Past Methods of Experimental Evaluation; Modeling Efforts; Other Considerations; Monte Carlo Methods.
NASA Astrophysics Data System (ADS)
Mow, M.; Zbijewski, W.; Sisniega, A.; Xu, J.; Dang, H.; Stayman, J. W.; Wang, X.; Foos, D. H.; Koliatsos, V.; Aygun, N.; Siewerdsen, J. H.
2017-03-01
Purpose: To improve the timely detection and treatment of intracranial hemorrhage or ischemic stroke, recent efforts include the development of cone-beam CT (CBCT) systems for perfusion imaging and new approaches to estimate perfusion parameters despite slow rotation speeds compared to multi-detector CT (MDCT) systems. This work describes development of a brain perfusion CBCT method using a reconstruction of difference (RoD) approach to enable perfusion imaging on a newly developed CBCT head scanner prototype. Methods: A new reconstruction approach using RoD within a penalized-likelihood framework was developed to image the temporal dynamics of vascular enhancement. A digital perfusion simulation was developed to give a realistic representation of brain anatomy, artifacts, noise, scanner characteristics, and hemodynamic properties. This simulation includes a digital brain phantom, time-attenuation curves and noise parameters, a novel forward projection method for improved computational efficiency, and perfusion parameter calculation. Results: Our results show the feasibility of estimating perfusion parameters from a set of images reconstructed from slow scans, sparse data sets, and arc lengths as short as 60 degrees. The RoD framework significantly reduces noise and time-varying artifacts from inconsistent projections. Proper regularization and the use of overlapping reconstructed arcs can potentially further decrease bias and increase temporal resolution, respectively. Conclusions: A digital brain perfusion simulation with an RoD imaging approach has been developed and supports the feasibility of using a CBCT head scanner for perfusion imaging. Future work will include testing with data acquired using a 3D-printed perfusion phantom and translation to preclinical and clinical studies.
NASA Astrophysics Data System (ADS)
Carpenter, Matthew H.; Jernigan, J. G.
2007-05-01
We present examples of an analysis progression consisting of a synthesis of the Photon Clean Method (Carpenter, Jernigan, Brown, Beiersdorfer 2007) and bootstrap methods to quantify errors and variations in many-parameter models. The Photon Clean Method (PCM) works well for model spaces with large numbers of parameters proportional to the number of photons, therefore a Monte Carlo paradigm is a natural numerical approach. Consequently, PCM, an "inverse Monte-Carlo" method, requires a new approach for quantifying errors as compared to common analysis methods for fitting models of low dimensionality. This presentation will explore the methodology and presentation of analysis results derived from a variety of public data sets, including observations with XMM-Newton, Chandra, and other NASA missions. Special attention is given to the visualization of both data and models including dynamic interactive presentations. This work was performed under the auspices of the Department of Energy under contract No. W-7405-Eng-48. We thank Peter Beiersdorfer and Greg Brown for their support of this technical portion of a larger program related to science with the LLNL EBIT program.
NASA Technical Reports Server (NTRS)
Deshpande, M. D.
1997-01-01
The dyadic Green's function for an electric current source placed in a rectangular waveguide is derived using a magnetic vector potential approach. A complete solution for the electric and magnetic fields, including at the source location, is obtained by simple differentiation of the vector potential around the source location. This simple differentiation approach, which gives electric and magnetic fields identical to an earlier derivation, was overlooked by earlier workers in deriving the dyadic Green's function, particularly around the source location. Numerical results obtained using the Green's function approach are compared with results obtained using the Finite Element Method (FEM).
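For context, the standard time-harmonic vector-potential relations that underlie such a derivation can be written as (textbook notation with an e^{jωt} convention; this is not a reproduction of the report's own equations):

    \mathbf{H} \;=\; \frac{1}{\mu}\,\nabla \times \mathbf{A},
    \qquad
    \mathbf{E} \;=\; -\,j\omega\,\mathbf{A} \;+\; \frac{1}{j\omega\mu\varepsilon}\,\nabla\!\left(\nabla \cdot \mathbf{A}\right),

so that, once A is known for the waveguide source, both fields follow by differentiation, including in the neighborhood of the source.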
Teaching Light Compensation Point: A New Practical Approach.
ERIC Educational Resources Information Center
Aston, T. J.; Robinson, G.
1986-01-01
Describes a simple method for measuring respiration, net photosynthesis, and compensation points of plants in relation to light intensity. Outlines how the method can be used in teaching physiological adaptation. Includes a set of the experiment's results. (ML)
Medina, Cintia Débora; Avila, Luciano Javier; Sites, Jack Walter; Santos, Juan; Morando, Mariana
2018-03-01
We present different approaches to a multi-locus phylogeny for the Liolaemus elongatus-kriegi group, including almost all species and recognized lineages. We sequenced two mitochondrial and five nuclear gene regions for 123 individuals from 35 taxa, and compared relationships resolved from concatenated and species tree methods. The L. elongatus-kriegi group was inferred as monophyletic in three of the five analyses (concatenated mitochondrial, concatenated mitochondrial + nuclear gene trees, and SVD quartet species tree). The mitochondrial gene tree resolved four haploclades, three corresponding to the previously recognized complexes: L. elongatus, L. kriegi and L. petrophilus complexes, and the L. punmahuida group. The BEAST species tree approach included the L. punmahuida group within the L. kriegi complex, but the SVD quartet method placed it as sister to the L. elongatus-kriegi group. BEAST inferred species of the L. elongatus and L. petrophilus complexes as one clade, while SVDquartet inferred these two complexes as monophyletic (although with no statistical support for the L. petrophilus complex). The species tree approach also included the L. punmahuida group as part of the L. elongatus-kriegi group. Our study provides detailed multilocus phylogenetic hypotheses for the L. elongatus-kriegi group, and we discuss possible reasons for differences in the concatenation and species tree methods. Copyright © 2017 Elsevier Inc. All rights reserved.
Theoretical development and first-principles analysis of strongly correlated systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Chen
A variety of quantum many-body methods have been developed for studying strongly correlated electron systems. We have also proposed a computationally efficient and accurate approach, named the correlation matrix renormalization (CMR) method, to address these challenges. The initial implementation of the CMR method is designed for molecules, which offer theoretical advantages, including small system size, transparent mechanisms, and strong correlation effects such as the bond-breaking process. The theoretical development and benchmark tests of the CMR method are included in this thesis. Meanwhile, the ground-state total energy is the most important property in electronic structure calculations. We also investigated an alternative approach to calculate the total energy, and extended this method to the magnetic anisotropy energy (MAE) of ferromagnetic materials. In addition, another theoretical tool, dynamical mean-field theory (DMFT) on top of DFT, has also been used in electronic structure calculations for an iridium oxide to study the phase transition, which results from an interplay of the d electrons' internal degrees of freedom.
NASA Astrophysics Data System (ADS)
Yang, Jinping; Li, Peizhen; Yang, Youfa; Xu, Dian
2018-04-01
Empirical mode decomposition (EMD) is a highly adaptable signal processing method. However, the EMD approach has certain drawbacks, including distortions from end effects and mode mixing. In the present study, these two problems are addressed using an end extension method based on the support vector regression machine (SVRM) and a modal decomposition method based on the characteristics of the Hilbert transform. The algorithm includes two steps: using the SVRM, the time series data are extended at both endpoints to reduce the end effects, and then, a modified EMD method using the characteristics of the Hilbert transform is performed on the resulting signal to reduce mode mixing. A new combined static-dynamic method for identifying structural damage is presented. This method combines the static and dynamic information in an equilibrium equation that can be solved using the Moore-Penrose generalized matrix inverse. The combination method uses the differences in displacements of the structure with and without damage and variations in the modal force vector. Tests on a four-story, steel-frame structure were conducted to obtain static and dynamic responses of the structure. The modal parameters are identified using data from the dynamic tests and improved EMD method. The new method is shown to be more accurate and effective than the traditional EMD method. Through tests with a shear-type test frame, the higher performance of the proposed static-dynamic damage detection approach, which can detect both single and multiple damage locations and the degree of the damage, is demonstrated. For structures with multiple damage, the combined approach is more effective than either the static or dynamic method. The proposed EMD method and static-dynamic damage detection method offer improved modal identification and damage detection, respectively, in structures.
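The final identification step amounts to solving an overdetermined linear system with the Moore-Penrose pseudoinverse; the sketch below uses a random placeholder sensitivity matrix and synthetic response differences rather than the paper's test-frame data.

    import numpy as np

    # S maps element-level stiffness reductions (damage parameters) to changes in the
    # combined static displacements and modal quantities; b is the measured change vector.
    n_measurements, n_elements = 24, 8                     # illustrative sizes
    S = np.random.rand(n_measurements, n_elements)          # placeholder sensitivity matrix
    true_damage = np.zeros(n_elements)
    true_damage[2], true_damage[5] = 0.15, 0.30
    b = S @ true_damage + 1e-3 * np.random.randn(n_measurements)   # noisy measured differences

    damage_estimate = np.linalg.pinv(S) @ b                 # Moore-Penrose least-squares solution
    damaged_elements = np.where(damage_estimate > 0.05)[0]  # elements flagged above a threshold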
ERIC Educational Resources Information Center
Engel, Mimi
2013-01-01
Purpose: Relatively little is known about how principals make decisions about teacher hiring. This article uses mixed methods to examine what characteristics principals look for in teachers. Research Methods: Data were gathered using a mixed method approach, including in-depth interviews with a representative sample of 31 principals as well as an…
Deccache, A
1997-06-01
Health promotion and health education have often been limited to evaluation of the effectiveness of actions and programmes. However, since 1996 with the Third European Conference on Health Promotion and Education Effectiveness, many researchers have become interested in "quality assessment" and new ways of thinking have emerged. Quality assurance is a concept and activity developed in industry with the objective of increasing production efficiency. There are two distinct approaches: External Standard Inspection (ESI) and Continuous Quality Improvement (CQI). ESI involves establishing criteria of quality, evaluating them and improving whatever needs improvement. CQI views the activity or service as a process and includes the quality assessment as part of the process. This article attempts to answer the questions of whether these methods are sufficient and suitable for operationalising the concepts of evaluation, effectiveness and quality in health promotion and education, whether it is necessary to complement them with other methods, and whether the ESI approach is appropriate. The first section of the article explains that health promotion is based on various paradigms from epidemiology to psychology and anthropology. Many authors warn against the exclusive use of public health disciplines for understanding, implementing and evaluating health promotion. The author argues that in practice, health promotion: -integrates preventive actions with those aiming to maintain and improve health, a characteristic which widens the actions of health promotion from those of classic public health which include essentially an epidemiological or "risk" focus; -aims to replace vertical approaches to prevention with a global approach based on educational sciences; -involves a community approach which includes the individual in a "central position of power" as much in the definition of needs as in the evaluation of services; -includes the participation and socio-political actions which necessitate the use of varied and specific instruments for action and evaluation. With the choice of health promotion ideology, there exist corresponding theories, concepts of quality, and therefore methods and techniques that differ from those used until now. The educational sciences have led to a widening of the definition of process to include both "throughput and input", which has meant that the methods of needs analysis, objective and priority setting and project development in health promotion have become objects of quality assessment. Also, the modes of action and interaction among actors are included, which has led to evaluation of ethical and ideological aspects of projects. The second section of the article discusses quality assessment versus evaluation of effectiveness. Different paradigms of evaluation such as the public health approach based on the measurement of (epidemiological) effectiveness, social marketing and communication, and the anthropological approach are briefly discussed, pointing out that there are many approaches which can both complement and contradict one another. The author explains the difference between impact (the intermediate effects, direct or indirect, planned or not planned, changes in practical or theoretical knowledge, perceptions, and attitudes) and results (final effects of mid to long term changes such as changes in morbidity, mortality, or access to services or cost of health care). 
He argues that by being too concerned with the results of programmes, we have often ignored the issue of impact. Also, by limiting ourselves to evaluating effectiveness (i.e. that the expected effects were obtained), we ignore other possible unexpected, unplanned, positive and negative secondary effects. There are therefore many reasons to: -evaluate all possible effects rather than only those linked to objectives; -evaluate the entire process rather than only the resources, procedures and costs; -evaluate the impact rather than results; -evalu
Pluralistic Approaches to Art Criticism.
ERIC Educational Resources Information Center
Blandy, Doug, Ed.; Congdon, Kristin G., Ed.
Contributors to this anthology analyze the contemporary academic methods for critiquing art and suggest new ways that might further the understandings of art created by diverse individuals and groups. Essays are organized into three sections. Part 1, "Changes and Extensions in Critical Approaches" includes essays: (1) "Beyond Universalism in Art…
Three Lectures on Theorem-proving and Program Verification
NASA Technical Reports Server (NTRS)
Moore, J. S.
1983-01-01
Topics concerning theorem proving and program verification are discussed with particular emphasis on the Boyer/Moore theorem prover, and on approaches to program verification such as the functional and interpreter methods and the inductive assertion approach. A history of the discipline and specific program examples are included.
Mueller, Katharina Felicitas; Meerpohl, Joerg J; Briel, Matthias; Antes, Gerd; von Elm, Erik; Lang, Britta; Motschall, Edith; Schwarzer, Guido; Bassler, Dirk
2016-12-01
To systematically review methodological articles which focus on nonpublication of studies and to describe methods of detecting and/or quantifying and/or adjusting for dissemination bias in meta-analyses. To evaluate whether the methods have been applied to an empirical data set for which one can be reasonably confident that all studies conducted have been included. We systematically searched Medline, the Cochrane Library, and Web of Science for methodological articles that describe at least one method of detecting and/or quantifying and/or adjusting for dissemination bias in meta-analyses. The literature search retrieved 2,224 records, of which we finally included 150 full-text articles. A great variety of methods to detect, quantify, or adjust for dissemination bias were described. Methods included graphical methods mainly based on funnel plot approaches; statistical methods such as regression tests, selection models, and sensitivity analyses; and a great number of more recent statistical approaches. Only a few methods have been validated in empirical evaluations using unpublished studies obtained from regulators (Food and Drug Administration, European Medicines Agency). We present an overview of existing methods to detect, quantify, or adjust for dissemination bias. It remains difficult to advise which method should be used, as they are all limited and their validity has rarely been assessed. Therefore, a thorough literature search remains crucial in systematic reviews, and further steps to increase the availability of all research results need to be taken. Copyright © 2016 Elsevier Inc. All rights reserved.
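One widely used member of the funnel-plot regression family is Egger's test; a minimal statsmodels sketch is shown below with made-up effect sizes and standard errors.

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical study-level effect estimates and their standard errors
    effects = np.array([0.42, 0.31, 0.55, 0.12, 0.60, 0.25, 0.48])
    se = np.array([0.10, 0.15, 0.22, 0.08, 0.30, 0.12, 0.25])

    # Egger's test: regress the standardized effect on precision;
    # an intercept far from zero suggests funnel-plot asymmetry.
    standardized = effects / se
    precision = 1.0 / se
    fit = sm.OLS(standardized, sm.add_constant(precision)).fit()
    intercept, intercept_p = fit.params[0], fit.pvalues[0]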
Phase shifts in I = 2 ππ-scattering from two lattice approaches
NASA Astrophysics Data System (ADS)
Kurth, T.; Ishii, N.; Doi, T.; Aoki, S.; Hatsuda, T.
2013-12-01
We present a lattice QCD study of the phase shift of I = 2 ππ scattering on the basis of two different approaches: the standard finite-volume approach by Lüscher and the recently introduced HAL QCD potential method. Quenched QCD simulations are performed on lattices with extents N_s = 16, 24, 32, 48 and N_t = 128, a lattice spacing of a ≈ 0.115 fm, and a pion mass of m_π ≈ 940 MeV. The phase shift and the scattering length are calculated with these two methods. In the potential method, the error is dominated by the systematic uncertainty associated with the violation of rotational symmetry due to the finite lattice spacing. In Lüscher's approach, such systematic uncertainty is difficult to evaluate and thus is not included in this work. A systematic uncertainty attributed to the quenched approximation, however, is not evaluated in either method. In the case of the potential method, the phase shift can be calculated for arbitrary energies below the inelastic threshold. The energy dependence of the phase shift is also obtained from Lüscher's method using different volumes and/or its non-rest-frame extension. The results are found to agree well with the potential method.
Goddard, Katrina A.B.; Knaus, William A.; Whitlock, Evelyn; Lyman, Gary H.; Feigelson, Heather Spencer; Schully, Sheri D.; Ramsey, Scott; Tunis, Sean; Freedman, Andrew N.; Khoury, Muin J.; Veenstra, David L.
2013-01-01
Background The clinical utility is uncertain for many cancer genomic applications. Comparative effectiveness research (CER) can provide evidence to clarify this uncertainty. Objectives To identify approaches to help stakeholders make evidence-based decisions, and to describe potential challenges and opportunities using CER to produce evidence-based guidance. Methods We identified general CER approaches for genomic applications through literature review, the authors’ experiences, and lessons learned from a recent, seven-site CER initiative in cancer genomic medicine. Case studies illustrate the use of CER approaches. Results Evidence generation and synthesis approaches include comparative observational and randomized trials, patient reported outcomes, decision modeling, and economic analysis. We identified significant challenges to conducting CER in cancer genomics: the rapid pace of innovation, the lack of regulation, the limited evidence for clinical utility, and the beliefs that genomic tests could have personal utility without having clinical utility. Opportunities to capitalize on CER methods in cancer genomics include improvements in the conduct of evidence synthesis, stakeholder engagement, increasing the number of comparative studies, and developing approaches to inform clinical guidelines and research prioritization. Conclusions CER offers a variety of methodological approaches to address stakeholders’ needs. Innovative approaches are needed to ensure an effective translation of genomic discoveries. PMID:22516979
Keogh, Ruth H; Daniel, Rhian M; VanderWeele, Tyler J; Vansteelandt, Stijn
2018-05-01
Estimation of causal effects of time-varying exposures using longitudinal data is a common problem in epidemiology. When there are time-varying confounders, which may include past outcomes, affected by prior exposure, standard regression methods can lead to bias. Methods such as inverse probability weighted estimation of marginal structural models have been developed to address this problem. However, in this paper we show how standard regression methods can be used, even in the presence of time-dependent confounding, to estimate the total effect of an exposure on a subsequent outcome by controlling appropriately for prior exposures, outcomes, and time-varying covariates. We refer to the resulting estimation approach as sequential conditional mean models (SCMMs), which can be fitted using generalized estimating equations. We outline this approach and describe how including propensity score adjustment is advantageous. We compare the causal effects being estimated using SCMMs and marginal structural models, and we compare the two approaches using simulations. SCMMs enable more precise inferences, with greater robustness against model misspecification via propensity score adjustment, and easily accommodate continuous exposures and interactions. A new test for direct effects of past exposures on a subsequent outcome is described.
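A minimal sketch of fitting one such sequential conditional mean model by GEE with propensity-score adjustment, using statsmodels; the data file, variable names, and model formula are hypothetical placeholders, not the authors' analysis code.

    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Long-format data, one row per subject-visit, with hypothetical columns:
    # id, y (outcome at t), x (exposure at t-1), x_prev (exposure at t-2),
    # y_prev (outcome at t-1), l (time-varying covariate at t-1)
    df = pd.read_csv("longitudinal.csv")          # placeholder data source

    # Step 1: propensity score for the current exposure given measured history
    ps_fit = smf.logit("x ~ x_prev + y_prev + l", data=df).fit()
    df["ps"] = ps_fit.predict(df)

    # Step 2: SCMM fitted with GEE, clustering on subject for robust standard errors
    scmm = smf.gee("y ~ x + x_prev + y_prev + l + ps", groups="id", data=df,
                   family=sm.families.Gaussian()).fit()
    total_effect = scmm.params["x"]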
Designs of Empirical Evaluations of Nonexperimental Methods in Field Settings.
Wong, Vivian C; Steiner, Peter M
2018-01-01
Over the last three decades, a research design has emerged to evaluate the performance of nonexperimental (NE) designs and design features in field settings. It is called the within-study comparison (WSC) approach or the design replication study. In the traditional WSC design, treatment effects from a randomized experiment are compared to those produced by an NE approach that shares the same target population. The nonexperiment may be a quasi-experimental design, such as a regression-discontinuity or an interrupted time-series design, or an observational study approach that includes matching methods, standard regression adjustments, and difference-in-differences methods. The goals of the WSC are to determine whether the nonexperiment can replicate results from a randomized experiment (which provides the causal benchmark estimate), and the contexts and conditions under which these methods work in practice. This article presents a coherent theory of the design and implementation of WSCs for evaluating NE methods. It introduces and identifies the multiple purposes of WSCs, required design components, common threats to validity, design variants, and causal estimands of interest in WSCs. It highlights two general approaches for empirical evaluations of methods in field settings, WSC designs with independent and dependent benchmark and NE arms. This article highlights advantages and disadvantages for each approach, and conditions and contexts under which each approach is optimal for addressing methodological questions.
Norris, Peter M; da Silva, Arlindo M
2016-07-01
A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.
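A minimal random-walk Metropolis sketch in the spirit of the MCMC step described above; the stand-in Gaussian log-posterior is an assumption for illustration and is not the paper's moisture/cloud likelihood.

    import numpy as np

    def metropolis(log_post, x0, n_steps=5000, step=0.5, seed=0):
        """Random-walk Metropolis: accept a jump with probability min(1, posterior ratio)."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        lp = log_post(x)
        samples = np.empty((n_steps, x.size))
        for i in range(n_steps):
            proposal = x + step * rng.standard_normal(x.size)
            lp_prop = log_post(proposal)
            if np.log(rng.uniform()) < lp_prop - lp:     # jumps can reach regions a gradient method cannot
                x, lp = proposal, lp_prop
            samples[i] = x
        return samples

    # Stand-in log-posterior (2-D Gaussian); a real use would evaluate the sub-gridcolumn moisture model
    log_post = lambda x: -0.5 * np.sum((x - np.array([0.3, 0.7])) ** 2 / 0.04)
    chain = metropolis(log_post, x0=[0.0, 0.0])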
Musoke, David; Miiro, George; Karani, George; Morris, Keith; Kasasa, Simon; Ndejjo, Rawlance; Nakiyingi-Miiro, Jessica; Guwatudde, David; Musoke, Miph Boses
2015-01-01
Background The World Health Organization recommends use of multiple approaches to control malaria. The integrated approach to malaria prevention advocates the use of several malaria prevention methods in a holistic manner. This study assessed perceptions and practices on integrated malaria prevention in Wakiso district, Uganda. Methods A clustered cross-sectional survey was conducted among 727 households from 29 villages using both quantitative and qualitative methods. Assessment was done on awareness of various malaria prevention methods, potential for use of the methods in a holistic manner, and reasons for dislike of certain methods. Households were classified as using integrated malaria prevention if they used at least two methods. Logistic regression was used to test for factors associated with the use of integrated malaria prevention while adjusting for clustering within villages. Results Participants knew of the various malaria prevention methods in the integrated approach, including use of insecticide-treated nets (97.5%), removing mosquito breeding sites (89.1%), clearing overgrown vegetation near houses (97.9%), and closing windows and doors early in the evenings (96.4%). If trained, most participants (68.6%) would use all the suggested malaria prevention methods of the integrated approach. Among those who would not use all methods, the main reasons given were that there were too many methods (70.2%) and cost (32.0%). Only 33.0% of households were using the integrated approach to prevent malaria. Use of integrated malaria prevention by households was associated with reading newspapers (AOR 0.34; 95% CI 0.22–0.53) and ownership of a motorcycle/car (AOR 1.75; 95% CI 1.03–2.98). Conclusion Although knowledge of malaria prevention methods was high and perceptions of the integrated approach were promising, the practice of integrated malaria prevention was relatively low. The use of the integrated approach can be improved by promoting the use of multiple malaria prevention methods through various communication channels such as mass media. PMID:25837978
SPF Full-scale emissions test method development status ...
This is a non-technical presentation that is intended to inform ASTM task group members about our intended approach to full-scale emissions testing that includes the application of spray foam in an environmental chamber. The presentation describes the approach to emissions characterization, the types of measurement systems employed, and the expected outcomes from the planned tests. The purpose of this presentation is to update the ASTM D22.05 work group on the status of our full-scale emissions test method development.
An improved adaptive weighting function method for State Estimation in Power Systems with VSC-MTDC
NASA Astrophysics Data System (ADS)
Zhao, Kun; Yang, Xiaonan; Lang, Yansheng; Song, Xuri; Wang, Minkun; Luo, Yadi; Wu, Lingyun; Liu, Peng
2017-04-01
This paper presents an effective approach, called the improved adaptive weighting function method, for state estimation in power systems that include multi-terminal voltage-source-converter-based high-voltage direct current (VSC-MTDC). The proposed approach is simplified in that the VSC-MTDC system is solved first, followed by the AC system, because the new state estimation method only changes the weights and keeps the matrix dimension unchanged. Accurate and fast convergence for the AC/DC system can be achieved with the adaptive weighting function method. The method also provides technical support for the simulation analysis and accurate regulation of AC/DC systems. Both theoretical analysis and numerical tests verify the practicability, validity, and convergence of the new method.
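Power-system state estimation of this kind is commonly a weighted-least-squares (Gauss-Newton) iteration in which an adaptive weighting scheme enters only through the measurement weight matrix, leaving the matrix dimensions unchanged; the sketch below uses a generic toy measurement model, not the paper's AC/DC formulation.

    import numpy as np

    def wls_state_estimation(z, h, H, W, x0, n_iter=10):
        """Gauss-Newton WLS: minimize (z - h(x))' W (z - h(x)); W carries the (adaptive) weights."""
        x = np.asarray(x0, dtype=float)
        for _ in range(n_iter):
            r = z - h(x)                          # measurement residuals
            J = H(x)                              # measurement Jacobian
            G = J.T @ W @ J                       # gain matrix; its dimension does not depend on W's values
            x = x + np.linalg.solve(G, J.T @ W @ r)
        return x

    # Toy linear measurement model for illustration
    A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 2.0]])
    z = A @ np.array([0.98, 0.20]) + 0.01 * np.random.randn(3)
    x_hat = wls_state_estimation(z, h=lambda x: A @ x, H=lambda x: A,
                                 W=np.diag([1.0, 1.0, 0.5]), x0=[1.0, 0.0])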
Time Series Expression Analyses Using RNA-seq: A Statistical Approach
Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.
2013-01-01
RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
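The AR(1) time-lagged regression mentioned above can be illustrated as a lag-one regression of a single gene's log-transformed expression on its previous time point; the counts below are invented for illustration.

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical normalized counts for one gene over 8 time points
    expr = np.log2(np.array([110.0, 130.0, 180.0, 260.0, 400.0, 520.0, 610.0, 650.0]) + 1.0)

    # AR(1): regress x_t on x_(t-1); a positive drift with a lag coefficient near 1
    # indicates a persistent, time-dependent change in expression.
    y, x_lag = expr[1:], expr[:-1]
    ar1 = sm.OLS(y, sm.add_constant(x_lag)).fit()
    drift, lag_coef = ar1.params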
Bai, Yuqiang; Nichols, Jason J
2017-05-01
The thickness of the tear film has been investigated using both invasive and non-invasive methods. While invasive methods are largely historical, more recent non-invasive methods are generally based on optical approaches that provide accurate, precise, and rapid measures. Optical microscopy, interferometry, and optical coherence tomography (OCT) have been developed to characterize the thickness of the tear film or certain aspects of it (e.g., the lipid layer). This review provides an in-depth overview of contemporary optical techniques used in studying the tear film, including both the advantages and the limitations of these approaches. It is anticipated that further developments of high-resolution OCT and other interferometric methods will enable a more accurate and precise measurement of the thickness of the tear film and its related dynamic properties. Copyright © 2017 Elsevier Ltd. All rights reserved.
A numerical fragment basis approach to SCF calculations.
NASA Astrophysics Data System (ADS)
Hinde, Robert J.
1997-11-01
The counterpoise method is often used to correct for basis set superposition error in calculations of the electronic structure of bimolecular systems. One drawback of this approach is the need to specify a "reference state" for the system; for reactive systems, the choice of an unambiguous reference state may be difficult. An example is the reaction F^- + HCl → HF + Cl^-. Two obvious reference states for this reaction are F^- + HCl and HF + Cl^-; however, different counterpoise-corrected interaction energies are obtained using these two reference states. We outline a method for performing SCF calculations which employs numerical basis functions; this method attempts to eliminate basis set superposition errors in an a priori fashion. We test the proposed method on two one-dimensional, three-center systems and discuss the possibility of extending our approach to include electron correlation effects.
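For reference, the conventional counterpoise-corrected interaction energy that requires the reference-state choice discussed above is usually written as (standard notation; superscripts denote the basis used, subscripts the fragment evaluated):

    E_{\mathrm{int}}^{\mathrm{CP}} \;=\; E_{AB}^{AB} \;-\; E_{A}^{AB} \;-\; E_{B}^{AB}.

For a reactive system such as F^- + HCl → HF + Cl^-, the fragments A and B (and hence every term above) differ between the reactant-side and product-side choices, which is the ambiguity the numerical fragment basis approach aims to avoid.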
Generating Dynamic Persistence in the Time Domain
NASA Astrophysics Data System (ADS)
Guerrero, A.; Smith, L. A.; Smith, L. A.; Kaplan, D. T.
2001-12-01
Many dynamical systems exhibit long-range correlations. Physically, these systems range from biological to economic, including geological or urban systems. Important geophysical candidates for this type of behaviour include weather (or climate) and earthquake sequences. Persistence is characterised by a slowly decaying correlation function that, in theory, never dies out. The persistence exponent reflects the degree of memory in the system, and much effort has been expended creating and analysing methods that estimate this parameter and model data that exhibit persistence. The most widely used methods for generating long-correlated time series are not dynamical systems in the time domain, but instead are derived from a given spectral density. Little attention has been paid to modelling persistence in the time domain. The time domain approach has the advantage that an observation at a certain time can be calculated from previous observations, which is particularly suitable when investigating the predictability of a long-memory process. We describe two of these methods in the time domain. One is a traditional approach using fractional ARIMA (autoregressive integrated moving average) models; the second uses a novel approach to extending a given series using random Fourier basis functions. The statistical quality of the two methods is compared, and they are contrasted with weather data which reportedly shows persistence. The suitability of this approach both for estimating predictability and for making predictions is discussed.
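A time-domain construction of the kind the abstract refers to can be sketched as follows: an ARFIMA(0, d, 0) series is built by applying a truncated expansion of the fractional differencing operator to white noise. The parameter values and truncation length below are illustrative assumptions, not the authors' settings.

```python
# Minimal time-domain sketch: generate an ARFIMA(0, d, 0) long-memory series
# by truncating the MA(inf) expansion of (1 - B)^(-d) applied to white noise.
import numpy as np

def arfima_0d0(n, d=0.3, trunc=1000, seed=0):
    rng = np.random.default_rng(seed)
    eps = rng.normal(size=n + trunc)
    # psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k  (coefficients of (1 - B)^(-d))
    psi = np.empty(trunc + 1)
    psi[0] = 1.0
    for k in range(1, trunc + 1):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    # x_t = sum_k psi_k * eps_{t-k}; 'valid' convolution performs the truncated sum
    return np.convolve(eps, psi, mode="valid")[:n]

x = arfima_0d0(4096, d=0.3)
```

Because each new value is an explicit (truncated) function of past innovations, the same recursion can be rolled forward to produce forecasts, which is the predictability advantage of the time-domain formulation mentioned above.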
NASA Technical Reports Server (NTRS)
Gorman, D.; Grant, C.; Kyrias, G.; Lord, C.; Rombach, J. P.; Salis, M.; Skidmore, R.; Thomas, R.
1975-01-01
A sound, practical approach for the assembly and maintenance of very large structures in space is presented. The methods and approaches for assembling two large structures are examined. The maintenance objectives include the investigation of methods to maintain five geosynchronous satellites. The two assembly examples are a 200-meter-diameter radio astronomy telescope and a 1,000-meter-diameter microwave power transmission system. The radio astronomy telescope operates at an 8,000-mile altitude and receives RF signals from space. The microwave power transmission system is part of a solar power satellite that will be used to transmit converted solar energy to microwave ground receivers. Illustrations are included.
Cost approach of health care entity intangible asset valuation.
Reilly, Robert F
2012-01-01
In the valuation synthesis and conclusion process, the analyst should consider the following question: Do the selected valuation approach(es) and method(s) accomplish the analyst's assignment? Also, does the selected valuation approach and method actually quantify the desired objective of the intangible asset analysis? The analyst should also consider whether the selected valuation approach and method analyzes the appropriate bundle of legal rights, and whether there were sufficient empirical data available to perform the selected valuation approach and method. The valuation synthesis should consider whether there were sufficient data available to make the analyst comfortable with the value conclusion. The valuation analyst should consider whether the selected approach and method will be understandable to the intended audience. In the valuation synthesis and conclusion, the analyst should also consider which approaches and methods deserve the greatest consideration with respect to the intangible asset's RUL. The intangible asset RUL is a consideration in each valuation approach. In the income approach, the RUL may affect the projection period for the intangible asset income subject to either yield capitalization or direct capitalization. In the cost approach, the RUL may affect the total amount of obsolescence, if any, from the estimated cost measure (that is, the intangible reproduction cost new or replacement cost new). In the market approach, the RUL may affect the selection, rejection, and/or adjustment of the comparable or guideline intangible asset sale and license transactional data. The experienced valuation analyst will use professional judgment to weight the various value indications to conclude a final intangible asset value, based on: the analyst's confidence in the quantity and quality of available data; the analyst's level of due diligence performed on that data; the relevance of the valuation method to the intangible asset life cycle stage and degree of marketability; and the degree of variation in the range of value indications. Valuation analysts value health care intangible assets for a number of reasons. In addition to regulatory compliance, these reasons include various transaction, taxation, financing, litigation, accounting, bankruptcy, and planning purposes. The valuation analyst should consider all generally accepted intangible asset valuation approaches, methods, and procedures. Many valuation analysts are more familiar with market approach and income approach valuation methods. However, there are numerous instances when cost approach valuation methods are also applicable to health care intangible asset valuation. This discussion summarized the analyst's procedures and considerations with regard to the cost approach. The cost approach is often applicable to the valuation of intangible assets in the health care industry. However, the cost approach is only applicable if the valuation analyst (1) appropriately considers all of the cost components and (2) appropriately identifies and quantifies all obsolescence allowances. Regardless of the health care intangible asset or the reason for the valuation, the analyst should be familiar with all generally accepted valuation approaches and methods. And the valuation analyst should have a clear, convincing, and cogent rationale for (1) accepting each approach and method applied and (2) rejecting each approach and method not applied.
That way, the valuation analyst will best achieve the purpose and objective of the health care intangible asset valuation.
Belger, Mark; Haro, Josep Maria; Reed, Catherine; Happich, Michael; Kahle-Wrobleski, Kristin; Argimon, Josep Maria; Bruno, Giuseppe; Dodel, Richard; Jones, Roy W; Vellas, Bruno; Wimo, Anders
2016-07-18
Missing data are a common problem in prospective studies with a long follow-up, and the volume, pattern and reasons for missing data may be relevant when estimating the cost of illness. We aimed to evaluate the effects of different methods for dealing with missing longitudinal cost data and for costing caregiver time on total societal costs in Alzheimer's disease (AD). GERAS is an 18-month observational study of costs associated with AD. Total societal costs included patient health and social care costs, and caregiver health and informal care costs. Missing data were classified as missing completely at random (MCAR), missing at random (MAR) or missing not at random (MNAR). Simulation datasets were generated from baseline data with 10-40 % missing total cost data for each missing data mechanism. Datasets were also simulated to reflect the missing cost data pattern at 18 months using MAR and MNAR assumptions. Naïve and multiple imputation (MI) methods were applied to each dataset and results compared with complete GERAS 18-month cost data. Opportunity and replacement cost approaches were used for caregiver time, which was costed with and without supervision included and with time for working caregivers only being costed. Total costs were available for 99.4 % of 1497 patients at baseline. For MCAR datasets, naïve methods performed as well as MI methods. For MAR, MI methods performed better than naïve methods. All imputation approaches were poor for MNAR data. For all approaches, percentage bias increased with missing data volume. For datasets reflecting 18-month patterns, a combination of imputation methods provided more accurate cost estimates (e.g. bias: -1 % vs -6 % for single MI method), although different approaches to costing caregiver time had a greater impact on estimated costs (29-43 % increase over base case estimate). Methods used to impute missing cost data in AD will affect the accuracy of cost estimates, although varying approaches to costing informal caregiver time have the greatest impact on total costs. Tailoring imputation methods to the reason for missing data will further our understanding of the best analytical approach for studies involving cost outcomes.
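The contrast between naive and imputation-based estimates under MAR missingness can be illustrated with a toy simulation. The sketch below uses invented numbers (it is not the GERAS data) and a single regression imputation rather than full multiple imputation.

```python
# Minimal sketch: simulate MAR missingness in a cost variable and compare a
# naive complete-case estimate with a simple regression-based imputation.
# All variable names and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 1497
severity = rng.normal(0, 1, n)                             # observed covariate
cost = 20000 + 8000 * severity + rng.normal(0, 3000, n)    # true total cost

# MAR: probability of a missing cost depends on the observed severity score
p_miss = 1 / (1 + np.exp(-(severity - 0.5)))                # more severe -> more dropout
missing = rng.random(n) < 0.30 * p_miss / p_miss.mean()
obs = ~missing

naive_mean = cost[obs].mean()                               # complete-case estimate

# Regression imputation fitted on the observed cases only
b1, b0 = np.polyfit(severity[obs], cost[obs], 1)
cost_imp = cost.copy()
cost_imp[missing] = b0 + b1 * severity[missing]

print(f"true {cost.mean():.0f}  complete-case {naive_mean:.0f}  imputed {cost_imp.mean():.0f}")
```

Under MAR driven by severity, the complete-case mean understates total costs, while the regression imputation recovers most of the gap; multiple imputation additionally propagates the uncertainty in the imputed values.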
A Technical Approach to Marking Explosives, Propellants, and Precursor Chemicals
1998-08-01
polymerase chain reaction (PCR) methods whereby small strands are cut and analyzed under specified temperature-mediated enzymatic/molecular reactions (4...such as these are often overlooked. Several other companies have been investigating other methods including immunoassay techniques, microencapsulated
Integrating cell biology and proteomic approaches in plants.
Takáč, Tomáš; Šamajová, Olga; Šamaj, Jozef
2017-10-03
Significant improvements of protein extraction, separation, mass spectrometry and bioinformatics nurtured advancements of proteomics during the past years. The usefulness of proteomics in the investigation of biological problems can be enhanced by integration with other experimental methods from cell biology, genetics, biochemistry, pharmacology, molecular biology and other omics approaches including transcriptomics and metabolomics. This review aims to summarize current trends integrating cell biology and proteomics in plant science. Cell biology approaches are most frequently used in proteomic studies investigating subcellular and developmental proteomes; however, they were also employed in proteomic studies exploring abiotic and biotic stress responses, vesicular transport, cytoskeleton and protein posttranslational modifications. They are used either for detailed cellular or ultrastructural characterization of the object subjected to proteomic study, validation of proteomic results or to expand proteomic data. In this respect, a broad spectrum of methods is employed to support proteomic studies including ultrastructural electron microscopy studies, histochemical staining, immunochemical localization, in vivo imaging of fluorescently tagged proteins and visualization of protein-protein interactions. Thus, cell biological observations on fixed or living cell compartments, cells, tissues and organs are feasible, and in some cases fundamental for the validation and complementation of proteomic data. Validation of proteomic data by independent experimental methods requires development of new complementary approaches. Benefits of cell biology methods and techniques are not sufficiently highlighted in current proteomic studies. This encouraged us to review the most popular cell biology methods used in proteomic studies and to evaluate their relevance and potential for proteomic data validation and enrichment of purely proteomic analyses. We also provide examples of representative studies combining proteomic and cell biology methods for various purposes. Integrating cell biology approaches with proteomic ones allows validation and better interpretation of proteomic data. Moreover, cell biology methods remarkably extend the knowledge provided by proteomic studies and might be fundamental for the functional complementation of proteomic data. This review article summarizes current literature linking proteomics with cell biology. Copyright © 2017 Elsevier B.V. All rights reserved.
Environmental Chemicals in Urine and Blood: Improving Methods for Creatinine and Lipid Adjustment
O’Brien, Katie M.; Upson, Kristen; Cook, Nancy R.; Weinberg, Clarice R.
2015-01-01
Background: Investigators measuring exposure biomarkers in urine typically adjust for creatinine to account for dilution-dependent sample variation in urine concentrations. Similarly, it is standard to adjust for serum lipids when measuring lipophilic chemicals in serum. However, there is controversy regarding the best approach, and existing methods may not effectively correct for measurement error. Objectives: We compared adjustment methods, including novel approaches, using simulated case–control data. Methods: Using a directed acyclic graph framework, we defined six causal scenarios for epidemiologic studies of environmental chemicals measured in urine or serum. The scenarios include variables known to influence creatinine (e.g., age and hydration) or serum lipid levels (e.g., body mass index and recent fat intake). Over a range of true effect sizes, we analyzed each scenario using seven adjustment approaches and estimated the corresponding bias and confidence interval coverage across 1,000 simulated studies. Results: For urinary biomarker measurements, our novel method, which incorporates both covariate-adjusted standardization and the inclusion of creatinine as a covariate in the regression model, had low bias and possessed 95% confidence interval coverage of nearly 95% for most simulated scenarios. For serum biomarker measurements, a similar approach involving standardization plus serum lipid level adjustment generally performed well. Conclusions: To control measurement error bias caused by variations in serum lipids or by urinary diluteness, we recommend improved methods for standardizing exposure concentrations across individuals. Citation: O'Brien KM, Upson K, Cook NR, Weinberg CR. 2016. Environmental chemicals in urine and blood: improving methods for creatinine and lipid adjustment. Environ Health Perspect 124:220–227; http://dx.doi.org/10.1289/ehp.1509693 PMID:26219104
Rajaraman, Prathish K; Manteuffel, T A; Belohlavek, M; Heys, Jeffrey J
2017-01-01
A new approach has been developed for combining and enhancing the results from an existing computational fluid dynamics model with experimental data using the weighted least-squares finite element method (WLSFEM). Development of the approach was motivated by the existence of both limited experimental blood velocity in the left ventricle and inexact numerical models of the same flow. Limitations of the experimental data include measurement noise and having data only along a two-dimensional plane. Most numerical modeling approaches do not provide the flexibility to assimilate noisy experimental data. We previously developed an approach that could assimilate experimental data into the process of numerically solving the Navier-Stokes equations, but the approach was limited because it required the use of specific finite element methods for solving all model equations and did not support alternative numerical approximation methods. The new approach presented here allows virtually any numerical method to be used for approximately solving the Navier-Stokes equations, and then the WLSFEM is used to combine the experimental data with the numerical solution of the model equations in a final step. The approach dynamically adjusts the influence of the experimental data on the numerical solution so that more accurate data are more closely matched by the final solution and less accurate data are not closely matched. The new approach is demonstrated on different test problems and provides significantly reduced computational costs compared with many previous methods for data assimilation. Copyright © 2016 John Wiley & Sons, Ltd.
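Schematically, the final assimilation step described above minimizes a weighted least-squares functional of the following general form; the notation and weighting scheme are assumptions for illustration, not the authors' exact formulation:

$$J(\mathbf{u},p)\;=\;\sum_{e}\big\|\mathcal{R}_{\mathrm{NS}}(\mathbf{u},p)\big\|_{0,\Omega_e}^{2}\;+\;\sum_{i} w_i\,\big\|\mathbf{u}(\mathbf{x}_i)-\mathbf{u}_i^{\mathrm{exp}}\big\|^{2},$$

where $\mathcal{R}_{\mathrm{NS}}$ is the residual of the Navier-Stokes equations on each element, $\mathbf{u}_i^{\mathrm{exp}}$ are the measured velocities, and the weights $w_i$ are chosen larger for more trustworthy measurements, which is how the influence of noisy data is dialed down.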
Methods for measuring denitrification: Diverse approaches to a difficult problem
Groffman, Peter M; Altabet, Mary A.; Böhlke, J.K.; Butterbach-Bahl, Klaus; David, Mary B.; Firestone, Mary K.; Giblin, Anne E.; Kana, Todd M.; Nielsen , Lars Peter; Voytek, Mary A.
2006-01-01
Denitrification, the reduction of the nitrogen (N) oxides, nitrate (NO3−) and nitrite (NO2−), to the gases nitric oxide (NO), nitrous oxide (N2O), and dinitrogen (N2), is important to primary production, water quality, and the chemistry and physics of the atmosphere at ecosystem, landscape, regional, and global scales. Unfortunately, this process is very difficult to measure, and existing methods are problematic for different reasons in different places at different times. In this paper, we review the major approaches that have been taken to measure denitrification in terrestrial and aquatic environments and discuss the strengths, weaknesses, and future prospects for the different methods. Methodological approaches covered include (1) acetylene-based methods, (2) 15N tracers, (3) direct N2 quantification, (4) N2:Ar ratio quantification, (5) mass balance approaches, (6) stoichiometric approaches, (7) methods based on stable isotopes, (8) in situ gradients with atmospheric environmental tracers, and (9) molecular approaches. Our review makes it clear that the prospects for improved quantification of denitrification vary greatly in different environments and at different scales. While current methodology allows for the production of accurate estimates of denitrification at scales relevant to water and air quality and ecosystem fertility questions in some systems (e.g., aquatic sediments, well-defined aquifers), methodology for other systems, especially upland terrestrial areas, still needs development. Comparison of mass balance and stoichiometric approaches that constrain estimates of denitrification at large scales with point measurements (made using multiple methods), in multiple systems, is likely to propel more improvement in denitrification methods over the next few years.
Phelps, Kevin D; Harmer, Luke S; Crickard, Colin V; Hamid, Nady; Sample, Katherine M; Andrews, Erica B; Seymour, Rachel B; Hsu, Joseph R
2018-06-01
Extensile approaches to the humerus are often needed when treating complex proximal or distal fractures that have extension into the humeral shaft or in those fractures that occur around implants. The 2 most commonly used approaches for more complex fractures include the modified lateral paratricipital approach and the deltopectoral approach with distal anterior extension. Although the former is well described and quantified, the latter is often associated with variable nomenclature with technical descriptions that can be confusing. Furthermore, a method to expose the entire humerus through an anterior extensile approach has not been described. Here, we illustrate and quantify a technique for connecting anterior humeral approaches in a stepwise fashion to form an aggregate anterior approach (AAA). We also describe a method for further distal extension to expose 100% of the length of the humerus and compare this approach with both the AAA and the lateral paratricipital in terms of access to critical bony landmarks, as well as the length and area of bone exposed.
Seismic data fusion anomaly detection
NASA Astrophysics Data System (ADS)
Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David
2014-06-01
Detecting anomalies in non-stationary signals has valuable applications in many fields, including medicine and meteorology. These include identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes via seismographic data. Given the many available anomaly detection algorithms, it is important to compare possible methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach involves using an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other method uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives" or transformations of the observed signal for anomalies. Possible perspectives may include wavelet de-noising, the Fourier transform, peak-filtering, etc. In order to evaluate these techniques via signal fusion metrics, we must apply signal preprocessing techniques such as de-noising methods to the original signal and then use a neural network to find anomalies in the generated signal. From this secondary result it is possible to use data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The result will show which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.
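A minimal sketch of the first stage of such a pipeline is shown below: wavelet de-noising followed by a simple residual threshold standing in for the ANN detector. Function names, the wavelet choice, and the thresholds are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: wavelet de-noising plus residual thresholding as a stand-in
# for the ANN-based anomaly detector discussed above.
import numpy as np
import pywt

def denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # universal threshold estimated from the finest-scale detail coefficients
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

def flag_anomalies(signal, k=4.0):
    resid = signal - denoise(signal)
    return np.where(np.abs(resid) > k * resid.std())[0]  # indices of candidate anomalies
```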
Singh, Gurpreet; Ravi, Koustuban; Wang, Qian; Ho, Seng-Tiong
2012-06-15
A complex-envelope (CE) alternating-direction-implicit (ADI) finite-difference time-domain (FDTD) approach to treat light-matter interaction self-consistently with electromagnetic field evolution, for efficient simulation of active photonic devices, is presented for the first time (to the best of our knowledge). The active medium (AM) is modeled using an efficient multilevel system of carrier rate equations to yield the correct carrier distributions, suitable for accurately modeling semiconductor/solid-state media. To include the AM in the CE-ADI-FDTD method, a first-order differential system involving CE fields in the AM is first set up. The system matrix that includes AM parameters is then split into two time-dependent submatrices that are used in an efficient ADI splitting formula. The proposed CE-ADI-FDTD approach with AM takes 22% of the time of the corresponding explicit FDTD approach, as validated by semiconductor microdisk laser simulations.
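For orientation, a generic ADI update of Peaceman-Rachford type for a split system $d\mathbf{v}/dt=(\mathbf{A}_1+\mathbf{A}_2)\mathbf{v}$ reads as follows; this is the textbook form of the splitting, not the paper's specific submatrices, which also carry the active-medium parameters and their time dependence:

$$\Big(\mathbf{I}-\tfrac{\Delta t}{2}\mathbf{A}_1\Big)\mathbf{v}^{\,n+1/2}=\Big(\mathbf{I}+\tfrac{\Delta t}{2}\mathbf{A}_2\Big)\mathbf{v}^{\,n},\qquad \Big(\mathbf{I}-\tfrac{\Delta t}{2}\mathbf{A}_2\Big)\mathbf{v}^{\,n+1}=\Big(\mathbf{I}+\tfrac{\Delta t}{2}\mathbf{A}_1\Big)\mathbf{v}^{\,n+1/2}.$$

Each half-step only requires a solve with one of the two submatrices, which is what makes the implicit update affordable.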
Structural issues affecting mixed methods studies in health research: a qualitative study
2009-01-01
Background: Health researchers undertake studies which combine qualitative and quantitative methods. Little attention has been paid to the structural issues affecting this mixed methods approach. We explored the facilitators and barriers to undertaking mixed methods studies in health research. Methods: Face-to-face semi-structured interviews with 20 researchers experienced in mixed methods research in health in the United Kingdom. Results: Structural facilitators for undertaking mixed methods studies included a perception that funding bodies promoted this approach, and the multidisciplinary constituency of some university departments. Structural barriers to exploiting the potential of these studies included a lack of education and training in mixed methods research, and a lack of templates for reporting mixed methods articles in peer-reviewed journals. The 'hierarchy of evidence' relating to effectiveness studies in health care research, with the randomised controlled trial as the gold standard, appeared to pervade the health research infrastructure. Thus integration of data and findings from qualitative and quantitative components of mixed methods studies, and dissemination of integrated outputs, tended to occur through serendipity and effort, further highlighting the presence of structural constraints. Researchers are agents who may also support current structures - journal reviewers and editors, and directors of postgraduate training courses - and thus have the ability to improve the structural support for exploiting the potential of mixed methods research. Conclusion: The environment for health research in the UK appears to be conducive to mixed methods research but not to exploiting the potential of this approach. Structural change, as well as change in researcher behaviour, will be necessary if researchers are to fully exploit the potential of using mixed methods research. PMID:20003210
Biogenic carbon in combustible waste: waste composition, variability and measurement uncertainty.
Larsen, Anna W; Fuglsang, Karsten; Pedersen, Niels H; Fellner, Johann; Rechberger, Helmut; Astrup, Thomas
2013-10-01
Obtaining accurate data for the contents of biogenic and fossil carbon in thermally-treated waste is essential for determination of the environmental profile of waste technologies. Relations between the variability of waste chemistry and the biogenic and fossil carbon emissions are not well described in the literature. This study addressed the variability of biogenic and fossil carbon in combustible waste received at a municipal solid waste incinerator. Two approaches were compared: (1) radiocarbon dating ((14)C analysis) of carbon dioxide sampled from the flue gas, and (2) mass and energy balance calculations using the balance method. The ability of the two approaches to accurately describe short-term day-to-day variations in carbon emissions, and to which extent these short-term variations could be explained by controlled changes in waste input composition, was evaluated. Finally, the measurement uncertainties related to the two approaches were determined. Two flue gas sampling campaigns at a full-scale waste incinerator were included: one during normal operation and one with controlled waste input. Estimation of carbon contents in the main waste types received was included. Both the (14)C method and the balance method represented promising methods able to provide good quality data for the ratio between biogenic and fossil carbon in waste. The relative uncertainty in the individual experiments was 7-10% (95% confidence interval) for the (14)C method and slightly lower for the balance method.
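The (14)C approach rests on a simple mixing relation: contemporary biogenic carbon carries a known radiocarbon signature while fossil carbon carries essentially none. Expressed in percent modern carbon (pMC), one plausible statement of the relation is given below; the reference value for biogenic material is an assumption that must be matched to the actual waste mix:

$$x_{\mathrm{bio}}\;=\;\frac{\mathrm{pMC}_{\mathrm{flue\ gas}}}{\mathrm{pMC}_{\mathrm{bio}}},\qquad x_{\mathrm{fossil}}\;=\;1-x_{\mathrm{bio}},$$

where $\mathrm{pMC}_{\mathrm{bio}}$ is the signature of contemporary biomass, somewhat above 100 pMC because of residual bomb-test carbon.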
Economic evaluation: Concepts, selected studies, system costs, and a proposed program
NASA Technical Reports Server (NTRS)
Osterhoudt, F. H. (Principal Investigator)
1979-01-01
The more usual approaches to valuing crop information are reviewed and an integrated approach is recommended. Problems associated with implementation are examined. What has already been accomplished in the economic evaluation of LACIE-type information is reported including various studies of benefits. The costs of the existing and proposed systems are considered. A method and approach is proposed for further studies.
Handbook of Research Methods in Social and Personality Psychology
NASA Astrophysics Data System (ADS)
Reis, Harry T.; Judd, Charles M.
2000-03-01
This volume provides an overview of research methods in contemporary social psychology. Coverage includes conceptual issues in research design, methods of research, and statistical approaches. Because the range of research methods available for social psychology has expanded extensively in the past decade, both traditional and innovative methods are presented. The goal is to introduce new and established researchers alike to new methodological developments in the field.
Three-dimensional Stress Analysis Using the Boundary Element Method
NASA Technical Reports Server (NTRS)
Wilson, R. B.; Banerjee, P. K.
1984-01-01
The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response are indicated.
Using default methodologies to derive an acceptable daily exposure (ADE).
Faria, Ellen C; Bercu, Joel P; Dolan, David G; Morinello, Eric J; Pecquet, Alison M; Seaman, Christopher; Sehner, Claudia; Weideman, Patricia A
2016-08-01
This manuscript discusses the different historical and more recent default approaches that have been used to derive an acceptable daily exposure (ADE). While it is preferable to derive a health-based ADE based on a complete nonclinical and clinical data package, this is not always possible. For instance, for drug candidates in early development there may be no or limited nonclinical or clinical trial data. Alternative approaches that can support decision making with less complete data packages represent a variety of methods that rely on default assumptions or data inputs where chemical-specific data on health effects are lacking. A variety of default approaches are used including those based on certain toxicity estimates, a fraction of the therapeutic dose, cleaning-based limits, the threshold of toxicological concern (TTC), and application of hazard banding tools such as occupational exposure banding (OEB). Each of these default approaches is discussed in this manuscript, including their derivation, application, strengths, and limitations. In order to ensure patient safety when faced with toxicological and clinical data-gaps, default ADE methods should be purposefully as or more protective than ADEs derived from full data packages. Reliance on the subset of default approaches (e.g., TTC or OEB) that are based on toxicological data is preferred over other methods for establishing ADEs in early development while toxicology and clinical data are still being collected. Copyright © 2016. Published by Elsevier Inc.
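When a full data package is available, the health-based ADE that these default approaches are meant to approximate conservatively has the general form below; this is a generic formulation, and the symbols and choice of adjustment factors vary between guidelines:

$$\mathrm{ADE}\ \left(\tfrac{\mathrm{mg}}{\mathrm{day}}\right)\;=\;\frac{\mathrm{PoD}\ \left(\tfrac{\mathrm{mg}}{\mathrm{kg\cdot day}}\right)\times BW\ (\mathrm{kg})}{UF_{\mathrm{C}}\times MF\times \alpha},$$

where PoD is the point of departure (e.g., a NOAEL), BW is body weight, $UF_{\mathrm{C}}$ is the composite uncertainty factor, MF is a modifying factor, and $\alpha$ is a pharmacokinetic or bioavailability adjustment. The default approaches discussed above substitute conservative surrogates when one or more of these inputs is unavailable.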
Kawaguchi, Tomohiro; Arakawa, Kazuya; Nomura, Kazuhiro; Ogawa, Yoshikazu; Katori, Yukio; Tominaga, Teiji
2017-12-01
Endoscopic endonasal surgery, an innovative surgical technique, is used to approach sinus lesions, lesions of the skull base, and intradural tumors. The cooperation of experienced otolaryngologists and neurosurgeons is important to achieve safe and reliable surgical results. The bath plug closure method is a treatment option for patients with cerebrospinal fluid (CSF) leakage. Although it includes dural and/or intradural procedures, surgery tends to be performed by otolaryngologists because its indications, detailed maneuvers, and pitfalls are not well recognized by neurosurgeons. We reviewed the cases of patients with CSF leakage treated by using the bath plug closure method with an endoscopic endonasal approach at our institution. Three patients were treated using the bath plug closure method. CSF leakage was caused by a meningocele in two cases and trauma in one case. No postoperative intracranial complications or recurrence of CSF leakage were observed. The bath plug closure method is an effective treatment strategy and allows neurosurgeons to gain in-depth knowledge of the treatment options for CSF leakage by using an endoscopic endonasal approach.
ERIC Educational Resources Information Center
Haber-Curran, Paige; Tillapaugh, Daniel
2013-01-01
This qualitative study examines student learning about leadership across three sections of a capstone course in an undergraduate leadership minor. Qualitative methods were informed by exploratory case study analysis and phenomenology. Student-centered and inquiry-focused pedagogical approaches, including case-in-point, action inquiry, and…
Beyond Learning by Doing: The Brain Compatible Approach.
ERIC Educational Resources Information Center
Roberts, Jay W.
2002-01-01
Principles of brain-based learning, including pattern and meaning making, parallel processing, and the role of stress and threat, are explained, along with their connections to longstanding practices of experiential education. The Brain Compatible Approach is one avenue for clarifying to mainstream educators how and why experiential methods are…
Teaching Methodologies for Population Education: Inquiry/Discovery Approach, Values Clarification.
ERIC Educational Resources Information Center
United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and the Pacific.
Divided into two sections, this booklet demonstrates how the discovery/inquiry approach and values clarification can be used to teach population education. Each part presents a theoretical discussion of a teaching method including its definition, its relevance to population education, some outstanding characteristics that make it suitable for…
Qualitative Approaches to Evaluating Education.
ERIC Educational Resources Information Center
Fetterman, David M.
This paper explores the variety of qualitative methods available, in the context of a larger quantitative-qualitative debate in the field of educational evaluation. Each approach is reviewed in terms of the work of its major proponents. The dominant forms of qualitative evaluation include: (1) ethnography; (2) naturalistic inquiry; (3) generic…
Research in Distance Education: A System Modeling Approach.
ERIC Educational Resources Information Center
Saba, Farhad; Twitchell, David
1988-01-01
Describes how a computer simulation research method can be used for studying distance education systems. Topics discussed include systems research in distance education; a technique of model development using the System Dynamics approach and DYNAMO simulation language; and a computer simulation of a prototype model. (18 references) (LRW)
Qualitative approaches to use of the RE-AIM framework: rationale and methods.
Holtrop, Jodi Summers; Rabin, Borsika A; Glasgow, Russell E
2018-03-13
There have been over 430 publications using the RE-AIM model for planning and evaluation of health programs and policies, as well as numerous applications of the model in grant proposals and national programs. Full use of the model includes use of qualitative methods to understand why and how results were obtained on different RE-AIM dimensions; however, recent reviews have revealed that qualitative methods have been used infrequently. Having quantitative and qualitative methods and results iteratively inform each other should enhance understanding and lessons learned. Because there have been few published examples of qualitative approaches and methods using RE-AIM for planning or assessment and no guidance on how qualitative approaches can inform these processes, we provide guidance on qualitative methods to address the RE-AIM model and its various dimensions. The intended audience is researchers interested in applying RE-AIM or similar implementation models, but the methods discussed should also be relevant to those in community or clinical settings. We present directions for, examples of, and guidance on how qualitative methods can be used to address each of the five RE-AIM dimensions. Formative qualitative methods can be helpful in planning interventions and designing for dissemination. Summative qualitative methods are useful when used in an iterative, mixed methods approach for understanding how and why different patterns of results occur. In summary, qualitative and mixed methods approaches to RE-AIM help understand complex situations and results, why and how outcomes were obtained, and contextual factors not easily assessed using quantitative measures.
Rivas, Elena; Lang, Raymond; Eddy, Sean R
2012-02-01
The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.
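For readers unfamiliar with this family of methods, the simplest single-sequence recursion that these grammars generalize is the Nussinov base-pair-maximization dynamic program sketched below. It is far cruder than the nearest-neighbor or SCFG models evaluated in TORNADO and is included only to show the shape of the recursion involved.

```python
# Minimal Nussinov sketch: maximum number of nested base pairs in an RNA
# sequence (illustrative only; not the scoring models discussed above).
def nussinov_pairs(seq, min_loop=3):
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):           # widen the subsequence i..j
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]                    # case 1: j unpaired
            for k in range(i, j - min_loop):       # case 2: j paired with k
                if (seq[k], seq[j]) in pairs:
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + dp[k + 1][j - 1] + 1)
            dp[i][j] = best
    return dp[0][n - 1] if n else 0

print(nussinov_pairs("GGGAAAUCC"))
```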
Automatic peak selection by a Benjamini-Hochberg-based algorithm.
Abbas, Ahmed; Kong, Xin-Bing; Liu, Zhi; Jing, Bing-Yi; Gao, Xin
2013-01-01
A common issue in bioinformatics is that computational methods often generate a large number of predictions sorted according to certain confidence scores. A key problem is then determining how many predictions must be selected to include most of the true predictions while maintaining reasonably high precision. In nuclear magnetic resonance (NMR)-based protein structure determination, for instance, computational peak picking methods are becoming more and more common, although expert-knowledge remains the method of choice to determine how many peaks among thousands of candidate peaks should be taken into consideration to capture the true peaks. Here, we propose a Benjamini-Hochberg (B-H)-based approach that automatically selects the number of peaks. We formulate the peak selection problem as a multiple testing problem. Given a candidate peak list sorted by either volumes or intensities, we first convert the peaks into [Formula: see text]-values and then apply the B-H-based algorithm to automatically select the number of peaks. The proposed approach is tested on the state-of-the-art peak picking methods, including WaVPeak [1] and PICKY [2]. Compared with the traditional fixed number-based approach, our approach returns significantly more true peaks. For instance, by combining WaVPeak or PICKY with the proposed method, the missing peak rates are on average reduced by 20% and 26%, respectively, in a benchmark set of 32 spectra extracted from eight proteins. The consensus of the B-H-selected peaks from both WaVPeak and PICKY achieves 88% recall and 83% precision, which significantly outperforms each individual method and the consensus method without using the B-H algorithm. The proposed method can be used as a standard procedure for any peak picking method and straightforwardly applied to some other prediction selection problems in bioinformatics. The source code, documentation and example data of the proposed method is available at http://sfb.kaust.edu.sa/pages/software.aspx.
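The selection step itself is the standard Benjamini-Hochberg procedure, which the sketch below applies to a candidate list; the conversion of peak volumes or intensities into p-values is assumed to have been done upstream, and alpha is an illustrative choice.

```python
# Minimal sketch of the Benjamini-Hochberg selection step described above.
import numpy as np

def bh_select(pvalues, alpha=0.05):
    """Return indices of items kept by the Benjamini-Hochberg procedure."""
    p = np.asarray(pvalues, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    below = ranked <= alpha * np.arange(1, m + 1) / m   # p_(k) <= (k/m) * alpha
    if not below.any():
        return np.array([], dtype=int)
    k = np.max(np.nonzero(below)[0])                    # largest k satisfying the bound
    return order[: k + 1]                               # keep the k smallest p-values

selected = bh_select([0.001, 0.008, 0.012, 0.04, 0.20, 0.50])
```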
Smith, Eric G.
2015-01-01
Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently-identified phenomenon of “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met. Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions: To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward. The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
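The abstract describes the estimator only verbally. One plausible algebraic reading, under the assumption that confounding bias is multiplied by an amplification factor $\hat{A}$ when the strongly treatment-predictive variable $Z$ is added to the propensity model, is

$$\widehat{\mathrm{bias}}_{\mathrm{residual}}\;\approx\;\frac{\hat{\theta}_{\mathrm{PS}+Z}-\hat{\theta}_{\mathrm{PS}}}{\hat{A}-1},$$

after removing any contribution of $Z$'s direct association with the outcome; this formula is an interpretation for illustration, not one stated in the source.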
Insel, Paul A; Amara, Susan G; Blaschke, Terrence F; Meyer, Urs A
2017-01-06
Major advances in scientific discovery and insights can result from the development and use of new techniques, as exemplified by the work of Solomon Snyder, who writes a prefatory article in this volume. The Editors have chosen "New Methods and Novel Therapeutic Approaches in Pharmacology and Toxicology" as the Theme for a number of articles in this volume. These include ones that review the development and use of new experimental tools and approaches (e.g., nanobodies and techniques to explore protein-protein interactions), new types of therapeutics (e.g., aptamers and antisense oligonucleotides), and systems pharmacology, which assembles (big) data derived from omics studies together with information regarding drugs and patients. The application of these new methods and therapeutic approaches has the potential to have a major impact on basic and clinical research in pharmacology and toxicology as well as on patient care.
Anonymizing and Sharing Medical Text Records
Li, Xiao-Bai; Qin, Jialun
2017-01-01
Health information technology has increased accessibility of health and medical data and benefited medical research and healthcare management. However, there are rising concerns about patient privacy in sharing medical and healthcare data. A large amount of these data are in free text form. Existing techniques for privacy-preserving data sharing deal largely with structured data. Current privacy approaches for medical text data focus on detection and removal of patient identifiers from the data, which may be inadequate for protecting privacy or preserving data quality. We propose a new systematic approach to extract, cluster, and anonymize medical text records. Our approach integrates methods developed in both data privacy and health informatics fields. The key novel elements of our approach include a recursive partitioning method to cluster medical text records based on the similarity of the health and medical information and a value-enumeration method to anonymize potentially identifying information in the text data. An experimental study is conducted using real-world medical documents. The results of the experiments demonstrate the effectiveness of the proposed approach. PMID:29569650
Selection of suitable e-learning approach using TOPSIS technique with best ranked criteria weights
NASA Astrophysics Data System (ADS)
Mohammed, Husam Jasim; Kasim, Maznah Mat; Shaharanee, Izwan Nizal Mohd
2017-11-01
This paper compares the performance of four rank-based weighting assessment techniques, Rank Sum (RS), Rank Reciprocal (RR), Rank Exponent (RE), and Rank Order Centroid (ROC), on five identified e-learning criteria in order to select the best weighting method. A total of 35 experts in a public university in Malaysia were asked to rank the criteria and to evaluate five e-learning approaches: blended learning, flipped classroom, ICT-supported face-to-face learning, synchronous learning, and asynchronous learning. The best-ranked criteria weights, defined as the weights with the least total absolute difference from the geometric mean of all weights, were then used to select the most suitable e-learning approach using the TOPSIS method. The results show that the RR weights perform best, and that the flipped classroom is the most suitable e-learning approach to implement. This paper has developed a decision framework to aid decision makers (DMs) in choosing the most suitable weighting method for solving MCDM problems.
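The four rank-based weighting schemes compared above have simple closed forms; the sketch below computes them for criteria ranked 1 (most important) through n. The formulas are the standard textbook definitions, and the Rank Exponent parameter p is an illustrative assumption.

```python
# Minimal sketch of the RS, RR, RE, and ROC rank-based weighting schemes.
import numpy as np

def rank_weights(ranks, method="ROC", p=2.0):
    r = np.asarray(ranks, dtype=float)   # 1 = most important criterion
    n = r.size
    if method == "RS":                   # Rank Sum
        w = n - r + 1
    elif method == "RR":                 # Rank Reciprocal
        w = 1.0 / r
    elif method == "RE":                 # Rank Exponent (exponent p assumed)
        w = (n - r + 1) ** p
    elif method == "ROC":                # Rank Order Centroid
        w = np.array([np.sum(1.0 / np.arange(k, n + 1)) / n for k in r.astype(int)])
    else:
        raise ValueError(method)
    return w / w.sum()

# Five criteria ranked 1..5 (illustrative ordering only)
for m in ("RS", "RR", "RE", "ROC"):
    print(m, np.round(rank_weights([1, 2, 3, 4, 5], m), 3))
```

The resulting weight vector feeds directly into the TOPSIS ranking of the five e-learning alternatives.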
Hey, Jody; Nielsen, Rasmus
2007-01-01
In 1988, Felsenstein described a framework for assessing the likelihood of a genetic data set in which all of the possible genealogical histories of the data are considered, each in proportion to their probability. Although not analytically solvable, several approaches, including Markov chain Monte Carlo methods, have been developed to find approximate solutions. Here, we describe an approach in which Markov chain Monte Carlo simulations are used to integrate over the space of genealogies, whereas other parameters are integrated out analytically. The result is an approximation to the full joint posterior density of the model parameters. For many purposes, this function can be treated as a likelihood, thereby permitting likelihood-based analyses, including likelihood ratio tests of nested models. Several examples, including an application to the divergence of chimpanzee subspecies, are provided. PMID:17301231
IDENTIFICATION OF REGIME SHIFTS IN TIME SERIES USING NEIGHBORHOOD STATISTICS
The identification of alternative dynamic regimes in ecological systems requires several lines of evidence. Previous work on time series analysis of dynamic regimes includes mainly model-fitting methods. We introduce two methods that do not use models. These approaches use state-...
Beginning Reading at All Grade Levels.
ERIC Educational Resources Information Center
Naumann, Nancy
1980-01-01
A third-grade teacher's account of her struggle to determine the most appropriate methods for teaching reading skills includes grouping techniques, methods for creating interest in reading among the students, techniques for diagnosing reading levels, and a fifth dimensional approach to teaching beginning reading. (JN)
2014-01-01
Background: To improve quality of care and patient outcomes, health system decision-makers need to identify and implement effective interventions. An increasing number of systematic reviews document the effects of quality improvement programs to assist decision-makers in developing new initiatives. However, limitations in the reporting of primary studies and current meta-analysis methods (including approaches for exploring heterogeneity) reduce the utility of existing syntheses for health system decision-makers. This study will explore the role of innovative meta-analysis approaches and the added value of enriched and updated data for increasing the utility of systematic reviews of complex interventions. Methods/Design: We will use the dataset from our recent systematic review of 142 randomized trials of diabetes quality improvement programs to evaluate novel approaches for exploring heterogeneity. These will include exploratory methods, such as multivariate meta-regression analyses and all-subsets combinatorial meta-analysis. We will then update our systematic review to include new trials and enrich the dataset by surveying authors of all included trials. In doing so, we will explore the impact of variables not reported in previous publications, such as details of study context, on the effectiveness of the intervention. We will use innovative analytical methods on the enriched and updated dataset to identify key success factors in the implementation of quality improvement interventions for diabetes. Decision-makers will be involved throughout to help identify and prioritize variables to be explored and to aid in the interpretation and dissemination of results. Discussion: This study will inform future systematic reviews of complex interventions and describe the value of enriching and updating data for exploring heterogeneity in meta-analysis. It will also result in an updated comprehensive systematic review of diabetes quality improvement interventions that will be useful to health system decision-makers in developing interventions to improve outcomes for people with diabetes. Systematic review registration: PROSPERO registration no. CRD42013005165 PMID:25115289
Vázquez-Rowe, Ian; Iribarren, Diego
2015-01-01
Life-cycle (LC) approaches play a significant role in energy policy making to determine the environmental impacts associated with the choice of energy source. Data envelopment analysis (DEA) can be combined with LC approaches to provide quantitative benchmarks that orientate the performance of energy systems towards environmental sustainability, with different implications depending on the selected LC + DEA method. The present paper examines currently available LC + DEA methods and develops a novel method combining carbon footprinting (CFP) and DEA. Thus, the CFP + DEA method is proposed, a five-step structure including data collection for multiple homogenous entities, calculation of target operating points, evaluation of current and target carbon footprints, and result interpretation. As the current context for energy policy implies an anthropocentric perspective with focus on the global warming impact of energy systems, the CFP + DEA method is foreseen to be the most consistent LC + DEA approach to provide benchmarks for energy policy making. The fact that this method relies on the definition of operating points with optimised resource intensity helps to moderate the concerns about the omission of other environmental impacts. Moreover, the CFP + DEA method benefits from CFP specifications in terms of flexibility, understanding, and reporting.
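The benchmarking core of such a CFP + DEA method can be sketched as an input-oriented, constant-returns DEA program in which the carbon footprint plays the role of the input to be contracted; this is a generic formulation with assumed notation, not the paper's full five-step procedure:

$$\min_{\theta,\,\lambda}\ \theta\quad\text{s.t.}\quad\sum_{j}\lambda_j\,\mathrm{CF}_j\;\le\;\theta\,\mathrm{CF}_o,\qquad\sum_{j}\lambda_j\,y_{rj}\;\ge\;y_{ro}\ \ \forall r,\qquad\lambda_j\ge 0,$$

where entity $o$ is the unit being benchmarked, $\mathrm{CF}_j$ are the peers' carbon footprints, and $y_{rj}$ their outputs; the optimal $\theta$ fixes the target carbon footprint used in the subsequent target-setting and interpretation steps.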
Dave, Vivek S; Gupta, Deepak; Yu, Monica; Nguyen, Phuong; Varghese Gupta, Sheeba
2017-02-01
The Biopharmaceutics Classification System (BCS) classifies pharmaceutical compounds based on their aqueous solubility and intestinal permeability. The BCS Class III compounds are hydrophilic molecules (high aqueous solubility) with low permeability across the biological membranes. While these compounds are pharmacologically effective, poor absorption due to low permeability becomes the rate-limiting step in achieving adequate bioavailability. Several approaches have been explored and utilized for improving the permeability profiles of these compounds. The approaches include traditional methods such as prodrugs, permeation enhancers, ion-pairing, etc., as well as relatively modern approaches such as nanoencapsulation and nanosizing. The most recent approaches include a combination/hybridization of one or more traditional approaches to improve drug permeability. While some of these approaches have been extremely successful, i.e. drug products utilizing the approach have progressed through the USFDA approval for marketing; others require further investigation to be applicable. This article discusses the commonly studied approaches for improving the permeability of BCS Class III compounds.
Inter-departmental dosimetry audits – development of methods and lessons learned
Eaton, David J.; Bolton, Steve; Thomas, Russell A. S.; Clark, Catharine H.
2015-01-01
External dosimetry audits give confidence in the safe and accurate delivery of radiotherapy. In the United Kingdom, such audits have been performed for almost 30 years. From the start, they included clinically relevant conditions, as well as reference machine output. Recently, national audits have tested new or complex techniques, but these methods are then used in regional audits by a peer-to-peer approach. This local approach builds up the radiotherapy community, facilitates communication, and brings synergy to medical physics. PMID:26865753
NASA Technical Reports Server (NTRS)
Kowalski, Marc Edward
2009-01-01
A method for the prediction of time-domain signatures of chafed coaxial cables is presented. The method is quasi-static in nature, and is thus efficient enough to be included in inference and inversion routines. Unlike previous models proposed, no restriction on the geometry or size of the chafe is required in the present approach. The model is validated and its speed is illustrated via comparison to simulations from a commercial, three-dimensional electromagnetic simulator.
Use and misuse of mixed methods in population oral health research: A scoping review.
Gupta, A; Keuskamp, D
2018-05-30
Despite the known benefits of a mixed methods approach in health research, little is known of its use in the field of population oral health. This review aimed to map the extent of literature using a mixed methods approach to examine population oral health outcomes. For a comprehensive search of all the available literature published in the English language, databases including PubMed, Dentistry and Oral Sciences Source (DOSS), CINAHL, Web of Science and EMBASE (including Medline) were searched using a range of keywords from inception to October 2017. Only peer-reviewed, population-based studies of oral health outcomes conducted among non-institutionalised participants and using mixed methods were considered eligible for inclusion. Only nine studies met the inclusion criteria and were included in the review. The most frequent oral health outcome investigated was caries experience. However, most studies lacked a theoretical rationale or framework for using mixed methods, or supporting the use of qualitative data. Concurrent triangulation with a convergent design was the most commonly used mixed methods typology for integrating quantitative and qualitative data. The tools used to collect quantitative and qualitative data were mostly limited to surveys and interviews. With growing complexity recognised in the determinants of oral disease, future studies addressing population oral health outcomes are likely to benefit from the use of mixed methods. Explicit consideration of theoretical framework and methodology will strengthen those investigations. Copyright © 2018 Dennis Barber Ltd.
A Mixed Methods Approach to Identify Cognitive Warning Signs for Suicide Attempts.
Adler, Abby; Bush, Ashley; Barg, Frances K; Weissinger, Guy; Beck, Aaron T; Brown, Gregory K
2016-01-01
This study used a mixed methods approach to examine pathways to suicidal behavior by identifying cognitive warning signs that occurred within 1 day of a suicide attempt. Transcripts of cognitive therapy sessions from 35 patients who attempted suicide were analyzed using a modified grounded theory approach. Cognitive themes emerging from these transcripts included: state hopelessness, focus on escape, suicide as a solution, fixation on suicide, and aloneness. Differences in demographic and baseline diagnostic and symptom data were explored in relation to each cognitive theme. We propose a potential conceptual model of cognitive warning signs for suicide attempts that requires further testing.
Mixed Methods Designs for Sports Medicine Research.
Kay, Melissa C; Kucera, Kristen L
2018-07-01
Mixed methods research is a relatively new approach in the field of sports medicine, where the benefits of qualitative and quantitative research are combined while offsetting each other's flaws. Despite its known and successful use in other populations, it has been used minimally in sports medicine, including studies of the clinician perspective, concussion, and patient outcomes. Therefore, there is a need for this approach to be applied in other topic areas not easily addressed by one type of research approach in isolation, such as retirement from sport, effects of and return from injury, and catastrophic injury. Copyright © 2018 Elsevier Inc. All rights reserved.
Nesvizhskii, Alexey I.
2010-01-01
This manuscript provides a comprehensive review of the peptide and protein identification process using tandem mass spectrometry (MS/MS) data generated in shotgun proteomic experiments. The commonly used methods for assigning peptide sequences to MS/MS spectra are critically discussed and compared, from basic strategies to advanced multi-stage approaches. Particular attention is paid to the problem of false-positive identifications. Existing statistical approaches for assessing the significance of peptide to spectrum matches are surveyed, ranging from single-spectrum approaches such as expectation values to global error rate estimation procedures such as false discovery rates and posterior probabilities. The importance of using auxiliary discriminant information (mass accuracy, peptide separation coordinates, digestion properties, etc.) is discussed, and advanced computational approaches for joint modeling of multiple sources of information are presented. This review also includes a detailed analysis of the issues affecting the interpretation of data at the protein level, including the amplification of error rates when going from peptide to protein level, and the ambiguities in inferring the identities of sample proteins in the presence of shared peptides. Commonly used methods for computing protein-level confidence scores are discussed in detail. The review concludes with a discussion of several outstanding computational issues. PMID:20816881
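As a concrete illustration of the error-rate ideas surveyed above, the following is a minimal sketch of target-decoy false discovery rate (FDR) estimation for peptide-spectrum matches (PSMs). It is a toy example and not code from the review: the simple decoys/targets estimator, the function names, and the example scores are all illustrative assumptions.

# Minimal target-decoy FDR sketch: each PSM carries a score (higher is better)
# and a flag indicating whether it matched the decoy database.
def fdr_at_threshold(psms, threshold):
    """psms: list of (score, is_decoy) tuples."""
    targets = sum(1 for score, is_decoy in psms if score >= threshold and not is_decoy)
    decoys = sum(1 for score, is_decoy in psms if score >= threshold and is_decoy)
    return decoys / targets if targets else 0.0

def score_cutoff_for_fdr(psms, fdr_limit=0.01):
    """Return the lowest score threshold whose estimated FDR stays below fdr_limit."""
    for score in sorted({s for s, _ in psms}):
        if fdr_at_threshold(psms, score) <= fdr_limit:
            return score
    return None

psms = [(45.0, False), (44.1, False), (12.3, True), (30.2, False), (11.0, True)]
print(score_cutoff_for_fdr(psms, fdr_limit=0.01))   # 30.2 for this toy list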
Nam, Kijoeng; Henderson, Nicholas C; Rohan, Patricia; Woo, Emily Jane; Russek-Cohen, Estelle
2017-01-01
The Vaccine Adverse Event Reporting System (VAERS) and other product surveillance systems compile reports of product-associated adverse events (AEs), and these reports may include a wide range of information including age, gender, and concomitant vaccines. Controlling for possible confounding variables such as these is an important task when utilizing surveillance systems to monitor post-market product safety. A common method for handling possible confounders is to compare observed product-AE combinations with adjusted baseline frequencies where the adjustments are made by stratifying on observable characteristics. Though approaches such as these have proven to be useful, in this article we propose a more flexible logistic regression approach which allows for covariates of all types rather than relying solely on stratification. Indeed, a main advantage of our approach is that the general regression framework provides flexibility to incorporate additional information such as demographic factors and concomitant vaccines. As part of our covariate-adjusted method, we outline a procedure for signal detection that accounts for multiple comparisons and controls the overall Type 1 error rate. To demonstrate the effectiveness of our approach, we illustrate our method with an example involving febrile convulsion, and we further evaluate its performance in a series of simulation studies.
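To make the general regression framework described above more concrete, here is a hedged, minimal Python sketch of the idea: for each adverse event (AE), regress the AE indicator on a vaccine-exposure indicator plus covariates such as age and gender, then adjust the exposure p-values across all AEs for multiple comparisons. The simulated data, variable names, and the Bonferroni adjustment are illustrative assumptions, not the authors' exact model or signal-detection procedure.

# Illustrative covariate-adjusted logistic screen over several adverse events.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n = 2000
age = rng.uniform(0, 80, n)
female = rng.integers(0, 2, n)
vaccine_x = rng.integers(0, 2, n)           # report mentions vaccine X (1) or not (0)
p_values = []
for _ in range(5):                          # loop over adverse events of interest
    logit_p = -2.0 + 0.01 * age + 0.3 * vaccine_x
    ae = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
    X = sm.add_constant(np.column_stack([vaccine_x, age, female]))
    fit = sm.Logit(ae, X).fit(disp=0)
    p_values.append(fit.pvalues[1])         # p-value for the vaccine term
reject, p_adj, _, _ = multipletests(p_values, alpha=0.05, method="bonferroni")
print(list(zip(p_adj.round(4), reject)))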
Hubble Space Telescope Angular Velocity Estimation During the Robotic Servicing Mission
NASA Technical Reports Server (NTRS)
Thienel, Julie K.; Sanner, Robert M.
2005-01-01
In 2004 NASA began investigation of a robotic servicing mission for the Hubble Space Telescope (HST). Such a mission would require estimates of the HST attitude and rates in order to achieve a capture by the proposed Hubble robotic vehicle (HRV). HRV was to be equipped with vision-based sensors, capable of estimating the relative attitude between HST and HRV. The inertial HST attitude is derived from the measured relative attitude and the HRV computed inertial attitude. However, the relative rate between HST and HRV cannot be measured directly. Therefore, the HST rate with respect to inertial space is not known. Two approaches are developed to estimate the HST rates. Both methods utilize the measured relative attitude and the HRV inertial attitude and rates. First, a nonlinear estimator is developed. The nonlinear approach estimates the HST rate through an estimation of the inertial angular momentum. The development includes an analysis of the estimator stability given errors in the measured attitude. Second, a linearized approach is developed. The linearized approach is a pseudo-linear Kalman filter. Simulation test results for both methods are given, including scenarios with erroneous measured attitudes. Even though the development began as an application for the HST robotic servicing mission, the methods presented are applicable to any rendezvous/capture mission involving a non-cooperative target spacecraft.
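For readers unfamiliar with the filtering machinery referenced above, the following is a generic linear Kalman filter sketch that tracks an angle and angular rate from noisy angle measurements. It only illustrates the predict/update cycle; the dynamics, noise covariances, and measurement model are invented for illustration and it is not the pseudo-linear filter developed for the HST servicing mission.

# Generic constant-rate Kalman filter: state = [angle, rate], measure angle only.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])              # measurement matrix
Q = 1e-5 * np.eye(2)                    # process noise covariance (assumed)
R = np.array([[1e-2]])                  # measurement noise covariance (assumed)

x = np.zeros(2)                         # state estimate
P = np.eye(2)                           # estimate covariance
rng = np.random.default_rng(1)
true_rate = 0.05
for k in range(200):
    z = true_rate * dt * k + rng.normal(0.0, 0.1)   # noisy angle measurement
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
print("estimated rate:", x[1])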
Minenkov, Yury; Bistoni, Giovanni; Riplinger, Christoph; Auer, Alexander A; Neese, Frank; Cavallo, Luigi
2017-04-05
In this work, we tested canonical and domain based pair natural orbital coupled cluster methods (CCSD(T) and DLPNO-CCSD(T), respectively) for a set of 32 ligand exchange and association/dissociation reaction enthalpies involving ionic complexes of Li, Be, Na, Mg, Ca, Sr, Ba and Pb(II). Two strategies were investigated: in the former, only valence electrons were included in the correlation treatment, giving rise to the computationally very efficient FC (frozen core) approach; in the latter, all non-ECP electrons were included in the correlation treatment, giving rise to the AE (all electron) approach. Apart from reactions involving Li and Be, the FC approach resulted in non-homogeneous performance. The FC approach leads to very small errors (<2 kcal mol⁻¹) for some reactions of Na, Mg, Ca, Sr, Ba and Pb, while for a few reactions of Ca and Ba deviations up to 40 kcal mol⁻¹ have been obtained. Large errors are both due to artificial mixing of the core (sub-valence) orbitals of metals and the valence orbitals of oxygen and halogens in the molecular orbitals treated as core, and due to neglecting core-core and core-valence correlation effects. These large errors are reduced to a few kcal mol⁻¹ if the AE approach is used or the sub-valence orbitals of metals are included in the correlation treatment. On the technical side, the CCSD(T) and DLPNO-CCSD(T) results differ by a fraction of a kcal mol⁻¹, indicating the latter method as the perfect choice when CPU efficiency is essential. For completely black-box applications, as requested in catalysis or thermochemical calculations, we recommend the DLPNO-CCSD(T) method with all electrons that are not covered by effective core potentials included in the correlation treatment and correlation-consistent polarized core valence basis sets of cc-pwCVQZ(-PP) quality.
On Multifunctional Collaborative Methods in Engineering Science
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.
2001-01-01
Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple method approach is advantageous when interfacing diverse disciplines in which each method's strengths are utilized.
Laboratory test methods for combustion stability properties of solid propellants
NASA Technical Reports Server (NTRS)
Strand, L. D.; Brown, R. S.
1992-01-01
An overview is presented of experimental methods for determining the combustion-stability properties of solid propellants. The methods are generally based either on the temporal response to an initial disturbance or on external methods for generating the required oscillations. The size distribution of condensed-phase combustion products is characterized by means of the experimental approaches. The 'T-burner' approach is shown to assist in the derivation of pressure-coupled driving contributions and particle damping in solid-propellant rocket motors. Other techniques examined include the rotating-valve apparatus, the impedance tube, the modulated throat-acoustic damping burner, and the magnetic flowmeter. The paper shows that experimental methods do not exist for measuring the interactions between acoustic velocity oscillations and burning propellant.
Engineering large-scale agent-based systems with consensus
NASA Technical Reports Server (NTRS)
Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.
1994-01-01
The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.
NASA Technical Reports Server (NTRS)
Oden, J. Tinsley; Fly, Gerald W.; Mahadevan, L.
1987-01-01
A hybrid stress finite element method is developed for accurate stress and vibration analysis of problems in linear anisotropic elasticity. A modified form of the Hellinger-Reissner principle is formulated for dynamic analysis and an algorithm for the determination of the anisotropic elastic and compliance constants from experimental data is developed. These schemes were implemented in a finite element program for static and dynamic analysis of linear anisotropic two dimensional elasticity problems. Specific numerical examples are considered to verify the accuracy of the hybrid stress approach and compare it with that of the standard displacement method, especially for highly anisotropic materials. It is shown that the hybrid stress approach gives much better results than the displacement method. Preliminary work on extensions of this method to three dimensional elasticity is discussed, and the stress shape functions necessary for this extension are included.
SEE rate estimation based on diffusion approximation of charge collection
NASA Astrophysics Data System (ADS)
Sogoyan, Armen V.; Chumakov, Alexander I.; Smolin, Anatoly A.
2018-03-01
The integral rectangular parallelepiped (IRPP) method remains the main approach to single event rate (SER) prediction for aerospace systems, despite the growing number of issues impairing the method's validity when applied to scaled technology nodes. One such issue is uncertainty in parameter extraction in the IRPP method, which can lead to a spread of several orders of magnitude in the subsequently calculated SER. The paper presents an alternative approach to SER estimation based on diffusion approximation of the charge collection by an IC element and geometrical interpretation of SEE cross-section. In contrast to the IRPP method, the proposed model includes only two parameters which are uniquely determined from the experimental data for normal incidence irradiation at an ion accelerator. This approach eliminates the necessity of arbitrary decisions during parameter extraction and, thus, greatly simplifies the calculation procedure and increases the robustness of the forecast.
Pritikin, Joshua N; Brick, Timothy R; Neale, Michael C
2018-04-01
A novel method for the maximum likelihood estimation of structural equation models (SEM) with both ordinal and continuous indicators is introduced using a flexible multivariate probit model for the ordinal indicators. A full information approach ensures unbiased estimates for data missing at random. Exceeding the capability of prior methods, up to 13 ordinal variables can be included before integration time increases beyond 1 s per row. The method relies on the axiom of conditional probability to split apart the distribution of continuous and ordinal variables. Due to the symmetry of the axiom, two similar methods are available. A simulation study provides evidence that the two similar approaches offer equal accuracy. A further simulation is used to develop a heuristic to automatically select the most computationally efficient approach. Joint ordinal continuous SEM is implemented in OpenMx, free and open-source software.
NASA Astrophysics Data System (ADS)
Drwal, Malgorzata N.; Agama, Keli; Pommier, Yves; Griffith, Renate
2013-12-01
Purely structure-based pharmacophores (SBPs) are an alternative method to ligand-based approaches and have the advantage of describing the entire interaction capability of a binding pocket. Here, we present the development of SBPs for topoisomerase I, an anticancer target with an unusual ligand binding pocket consisting of protein and DNA atoms. Different approaches to cluster and select pharmacophore features are investigated, including hierarchical clustering and energy calculations. In addition, the performance of SBPs is evaluated retrospectively and compared to the performance of ligand- and complex-based pharmacophores. SBPs emerge as a valid method in virtual screening and a complementary approach to ligand-focussed methods. The study further reveals that the choice of pharmacophore feature clustering and selection methods has a large impact on the virtual screening hit lists. A prospective application of the SBPs in virtual screening reveals that they can be used successfully to identify novel topoisomerase inhibitors.
ERIC Educational Resources Information Center
Seilheimer, Steven D.
1988-01-01
Outlines procedures for developing a microcomputer laboratory for use by students in an academic organization, based on experiences at Niagara University. The four phases described include: (1) needs assessment; (2) establishment, including software and hardware selection and physical facilities; (3) operation, including staffing, maintenance,…
A Structural Modeling Approach to a Multilevel Random Coefficients Model.
ERIC Educational Resources Information Center
Rovine, Michael J.; Molenaar, Peter C. M.
2000-01-01
Presents a method for estimating the random coefficients model using covariance structure modeling and allowing one to estimate both fixed and random effects. The method is applied to real and simulated data, including marriage data from J. Belsky and M. Rovine (1990). (SLD)
DOT National Transportation Integrated Search
1997-08-01
An experimental construction method was evaluated at the Lost River Bridge in Klamath County to reduce the discontinuity between the bridge and the roadway. The method included combining soil in six 300-mm lifts interlaced with geotextile reinforceme...
Teaching Geographic Field Methods Using Paleoecology
ERIC Educational Resources Information Center
Walsh, Megan K.
2014-01-01
Field-based undergraduate geography courses provide numerous pedagogical benefits including an opportunity for students to acquire employable skills in an applied context. This article presents one unique approach to teaching geographic field methods using paleoecological research. The goals of this course are to teach students key geographic…
Silva, Nuno Miguel; Rio, Jeremy; Currat, Mathias
2017-12-15
Recent advances in sequencing technologies have allowed for the retrieval of ancient DNA data (aDNA) from skeletal remains, providing direct genetic snapshots from diverse periods of human prehistory. Comparing samples taken in the same region but at different times, hereafter called "serial samples", may indicate whether there is continuity in the peopling history of that area or whether an immigration of a genetically different population has occurred between the two sampling times. However, the exploration of genetic relationships between serial samples generally ignores their geographical locations and the spatiotemporal dynamics of populations. Here, we present a new coalescent-based, spatially explicit modelling approach to investigate population continuity using aDNA, which includes two fundamental elements neglected in previous methods: population structure and migration. The approach also considers the extensive temporal and geographical variance that is commonly found in aDNA population samples. We first showed that our spatially explicit approach is more conservative than the previous (panmictic) approach and should be preferred to test for population continuity, especially when small and isolated populations are considered. We then applied our method to two mitochondrial datasets from Germany and France, both including modern and ancient lineages dating from the early Neolithic. The results clearly reject population continuity for the maternal line over the last 7500 years for the German dataset but not for the French dataset, suggesting regional heterogeneity in post-Neolithic migratory processes. Here, we demonstrate the benefits of using a spatially explicit method when investigating population continuity with aDNA. It constitutes an improvement over panmictic methods by considering the spatiotemporal dynamics of genetic lineages and the precise location of ancient samples. The method can be used to investigate population continuity between any pair of serial samples (ancient-ancient or ancient-modern) and to investigate more complex evolutionary scenarios. Although we based our study on mitochondrial DNA sequences, diploid molecular markers of different types (DNA, SNP, STR) can also be simulated with our approach. It thus constitutes a promising tool for the analysis of the numerous aDNA datasets being produced, including genome wide data, in humans but also in many other species.
A biologically inspired neural network for dynamic programming.
Francelin Romero, R A; Kacpryzk, J; Gomide, F
2001-12-01
An artificial neural network with a two-layer feedback topology and generalized recurrent neurons, for solving nonlinear discrete dynamic optimization problems, is developed. A direct method to assign the weights of neural networks is presented. The method is based on Bellman's Optimality Principle and on the interchange of information which occurs during the synaptic chemical processing among neurons. The neural network based algorithm is an advantageous approach for dynamic programming due to the inherent parallelism of the neural networks; further, it reduces the severity of computational problems that can occur in conventional methods. Some illustrative application examples are presented to show how this approach works out, including the shortest path and fuzzy decision making problems.
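For orientation, the following is a conventional (non-neural) dynamic-programming shortest-path sketch shown only to illustrate the Bellman Optimality Principle referenced above; the paper maps this kind of recursion onto a recurrent neural network, which is not reproduced here. The graph and edge costs are invented.

# Bellman-Ford style relaxation: cost-to-go to a target node.
def shortest_path_costs(graph, target):
    """graph: {node: {neighbor: edge_cost}}. Returns cost-to-go to target."""
    cost = {node: float("inf") for node in graph}
    cost[target] = 0.0
    for _ in range(len(graph) - 1):              # repeat relaxations until stable
        for node, edges in graph.items():
            for neighbor, w in edges.items():
                cost[node] = min(cost[node], w + cost[neighbor])
    return cost

graph = {"A": {"B": 1, "C": 4}, "B": {"C": 2, "D": 6}, "C": {"D": 3}, "D": {}}
print(shortest_path_costs(graph, "D"))   # {'A': 6.0, 'B': 5.0, 'C': 3.0, 'D': 0.0}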
Use of Multiscale Entropy to Facilitate Artifact Detection in Electroencephalographic Signals
Mariani, Sara; Borges, Ana F. T.; Henriques, Teresa; Goldberger, Ary L.; Costa, Madalena D.
2016-01-01
Electroencephalographic (EEG) signals present a myriad of challenges to analysis, beginning with the detection of artifacts. Prior approaches to noise detection have utilized multiple techniques, including visual methods, independent component analysis and wavelets. However, no single method is broadly accepted, inviting alternative ways to address this problem. Here, we introduce a novel approach based on a statistical physics method, multiscale entropy (MSE) analysis, which quantifies the complexity of a signal. We postulate that noise corrupted EEG signals have lower information content, and, therefore, reduced complexity compared with their noise free counterparts. We test the new method on an open-access database of EEG signals with and without added artifacts due to electrode motion. PMID:26738116
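A minimal sketch of the multiscale entropy (MSE) calculation described above: coarse-grain the signal at each scale, then compute sample entropy on each coarse-grained series. The parameter choices (m=2, r=0.15*std) are common defaults and the noisy-versus-clean comparison is illustrative only, not the paper's exact implementation.

import numpy as np

def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()
    n = len(x)
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(n - mm)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count
    b = count_matches(m)        # template matches of length m
    a = count_matches(m + 1)    # template matches of length m+1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5, m=2, r=None):
    x = np.asarray(x, dtype=float)
    out = []
    for scale in range(1, max_scale + 1):
        n = (len(x) // scale) * scale
        coarse = x[:n].reshape(-1, scale).mean(axis=1)   # coarse-graining step
        out.append(sample_entropy(coarse, m=m, r=r))
    return out

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 20 * np.pi, 2000))
noisy = clean + rng.normal(0, 1.0, 2000)
print(multiscale_entropy(clean), multiscale_entropy(noisy))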
A comparison of methods for DPLL loop filter design
NASA Technical Reports Server (NTRS)
Aguirre, S.; Hurd, W. J.; Kumar, R.; Statman, J.
1986-01-01
Four design methodologies for loop filters for a class of digital phase-locked loops (DPLLs) are presented. The first design maps an optimum analog filter into the digital domain; the second approach designs a filter that minimizes in discrete time a weighted combination of the variance of the phase error due to noise and the sum square of the deterministic phase error component; the third method uses Kalman filter estimation theory to design a filter composed of a least squares fading memory estimator and a predictor. The last design relies on classical theory, including rules for the design of compensators. Linear analysis is used throughout the article to compare the different designs, and includes stability, steady state performance and transient behavior of the loops. Design methodology is not critical when the loop update rate can be made high relative to loop bandwidth, as the performance approaches that of continuous time. For low update rates, however, the minimization method is significantly superior to the other methods.
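For context, the sketch below is a generic second-order DPLL with a proportional-plus-integral loop filter and a numerically controlled oscillator (NCO). The gains are ad hoc and the loop does not reproduce any of the four design methodologies compared in the article; it only shows the basic update cycle a loop filter sits inside.

import numpy as np

k_p, k_i = 0.15, 0.01          # proportional and integral loop-filter gains (assumed)
phase_est, freq_est, integ = 0.0, 0.0, 0.0
true_freq = 0.02               # radians per update of the incoming carrier
true_phase = 0.5
errors = []
for k in range(500):
    true_phase += true_freq
    err = np.angle(np.exp(1j * (true_phase - phase_est)))   # wrapped phase error
    integ += k_i * err                                      # integrator branch
    freq_est = integ + k_p * err                            # loop-filter output
    phase_est += freq_est                                   # NCO update
    errors.append(err)
print("steady-state phase error:", np.mean(np.abs(errors[-50:])))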
Grošelj, Petra; Zadnik Stirn, Lidija
2015-09-15
Environmental management problems can be dealt with by combining participatory methods, which make it possible to include various stakeholders in a decision-making process, and multi-criteria methods, which offer a formal model for structuring and solving a problem. This paper proposes a three-phase decision making approach based on the analytic network process and SWOT (strengths, weaknesses, opportunities and threats) analysis. The approach enables inclusion of various stakeholders or groups of stakeholders in particular stages of decision making. The structure of the proposed approach is composed of a network consisting of an objective cluster, a cluster of strategic goals, a cluster of SWOT factors and a cluster of alternatives. The application of the suggested approach is applied to a management problem of Pohorje, a mountainous area in Slovenia. Stakeholders from sectors that are important for Pohorje (forestry, agriculture, tourism and nature protection agencies) who can offer a wide range of expert knowledge were included in the decision-making process. The results identify the alternative of "sustainable development" as the most appropriate for development of Pohorje. The application in the paper offers an example of employing the new approach to an environmental management problem. This can also be applied to decision-making problems in various other fields. Copyright © 2015 Elsevier Ltd. All rights reserved.
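As a small illustration of the priority calculation underlying ANP/AHP-style methods such as the one above, the fragment below derives weights from a reciprocal pairwise-comparison matrix as its principal eigenvector and reports a consistency index. The matrix values are invented for illustration; the full network structure of the proposed three-phase approach is not reproduced.

import numpy as np

A = np.array([[1.0, 3.0, 0.5],
              [1/3., 1.0, 0.25],
              [2.0, 4.0, 1.0]])          # hypothetical pairwise comparisons of three SWOT factors
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, principal].real)
w /= w.sum()                             # normalised priority weights
ci = (eigvals.real[principal] - len(A)) / (len(A) - 1)   # consistency index
print(w.round(3), round(ci, 3))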
A multi-method approach toward de novo glycan characterization: a Man-5 case study.
Prien, Justin M; Prater, Bradley D; Cockrill, Steven L
2010-05-01
Regulatory agencies' expectations for biotherapeutic approval are becoming more stringent with regard to product characterization, where minor species as low as 0.1% of a given profile are typically identified. The mission of this manuscript is to demonstrate a multi-method approach toward de novo glycan characterization and quantitation, including minor species at or approaching the 0.1% benchmark. Recently, unexpected isomers of the Man(5)GlcNAc(2) (M(5)) were reported (Prien JM, Ashline DJ, Lapadula AJ, Zhang H, Reinhold VN. 2009. The high mannose glycans from bovine ribonuclease B isomer characterization by ion trap mass spectrometry (MS). J Am Soc Mass Spectrom. 20:539-556). In the current study, quantitative analysis of these isomers found in commercial M(5) standard demonstrated that they are in low abundance (<1% of the total) and therefore an exemplary "litmus test" for minor species characterization. A simple workflow devised around three core well-established analytical procedures: (1) fluorescence derivatization; (2) online rapid resolution reversed-phase separation coupled with negative-mode sequential mass spectrometry (RRRP-(-)-MS(n)); and (3) permethylation derivatization with nanospray sequential mass spectrometry (NSI-MS(n)) provides comprehensive glycan structural determination. All methods have limitations; however, a multi-method workflow is an at-line stopgap/solution which mitigates each method's individual shortcoming(s), providing greater opportunity for more comprehensive characterization. This manuscript is the first to demonstrate quantitative chromatographic separation of the M(5) isomers and the use of a commercially available stable isotope variant of 2-aminobenzoic acid to detect and chromatographically resolve multiple M(5) isomers in bovine ribonuclease B. With this multi-method approach, we have the capabilities to comprehensively characterize a biotherapeutic's glycan array in a de novo manner, including structural isomers at ≥0.1% of the total chromatographic peak area.
Searching for life in the Universe: unconventional methods for an unconventional problem.
Nealson, K H; Tsapin, A; Storrie-Lombardi, M
2002-12-01
The search for life, on and off our planet, can be done by conventional methods with which we are all familiar. These methods are sensitive and specific, and are often capable of detecting even single cells. However, if the search broadens to include life that may be different (even subtly different) in composition, the methods and even the approach must be altered. Here we discuss the development of what we call non-earthcentric life detection--detecting life with methods that could detect life no matter what its form or composition. To develop these methods, we simply ask, can we define life in terms of its general properties and particularly those that can be measured and quantified? Taking such an approach we can search for life using physics and chemistry to ask questions about structure, chemical composition, thermodynamics, and kinetics. Structural complexity can be searched for using computer algorithms that recognize complex structures. Once identified, these structures can be examined for a variety of chemical traits, including elemental composition, chirality, and complex chemistry. A second approach involves defining our environment in terms of energy sources (i.e., reductants), and oxidants (e.g. what is available to eat and breathe), and then looking for areas in which such phenomena are inexplicably out of chemical equilibrium. These disequilibria, when found, can then be examined in detail for the presence of the structural and chemical complexity that presumably characterizes any living systems. By this approach, we move the search for life to one that should facilitate the detection of any earthly life it encountered, as well as any non-conventional life forms that have structure, complex chemistry, and live via some form of redox chemistry.
Validity of using ad hoc methods to analyze secondary traits in case-control association studies.
Yung, Godwin; Lin, Xihong
2016-12-01
Case-control association studies often collect from their subjects information on secondary phenotypes. Reusing the data and studying the association between genes and secondary phenotypes provide an attractive and cost-effective approach that can lead to discovery of new genetic associations. A number of approaches have been proposed, including simple and computationally efficient ad hoc methods that ignore ascertainment or stratify on case-control status. Justification for these approaches relies on the assumption of no covariates and the correct specification of the primary disease model as a logistic model. Both might not be true in practice, for example, in the presence of population stratification or the primary disease model following a probit model. In this paper, we investigate the validity of ad hoc methods in the presence of covariates and possible disease model misspecification. We show that in taking an ad hoc approach, it may be desirable to include covariates that affect the primary disease in the secondary phenotype model, even though these covariates are not necessarily associated with the secondary phenotype. We also show that when the disease is rare, ad hoc methods can lead to severely biased estimation and inference if the true disease model follows a probit model instead of a logistic model. Our results are justified theoretically and via simulations. Applied to real data analysis of genetic associations with cigarette smoking, ad hoc methods collectively identified as highly significant (P < 10⁻⁵) single nucleotide polymorphisms from over 10 genes, genes that were identified in previous studies of smoking cessation. © 2016 WILEY PERIODICALS, INC.
Intercomparison of 3D pore-scale flow and solute transport simulation methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Xiaofan; Mehmani, Yashar; Perkins, William A.
2016-09-01
Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include methods that 1) explicitly model the three-dimensional geometry of pore spaces and 2) those that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of class 1, based on direct numerical simulation using computational fluid dynamics (CFD) codes, against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of class 1 based on the immersed-boundary method (IMB), lattice Boltzmann method (LBM), smoothed particle hydrodynamics (SPH), as well as a model of class 2 (a pore-network model or PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and nonreactive solute transport, and intercompare the model results with previously reported experimental observations. Experimental observations are limited to measured pore-scale velocities, so solute transport comparisons are made only among the various models. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations).
Delgado, Alejandra; Posada-Ureta, Oscar; Olivares, Maitane; Vallejo, Asier; Etxebarria, Nestor
2013-12-15
In this study, priority organic pollutants usually found in environmental water samples were considered in order to develop two extraction and analysis approaches. Among those compounds, organochlorine compounds, pesticides, phthalates, phenols and residues of pharmaceutical and personal care products were included. The extraction and analysis steps were based on silicone rod extraction (SR) followed by liquid desorption in combination with large volume injection-programmable temperature vaporiser (LVI-PTV) and gas chromatography-mass spectrometry (GC-MS). Variables affecting the analytical response as a function of the programmable temperature vaporiser (PTV) parameters were firstly optimised following an experimental design approach. The SR extraction and desorption conditions were assessed afterwards, including matrix modification, extraction time, and stripping solvent composition. Subsequently, the possibility of performing membrane enclosed sorptive coating extraction (MESCO) as a modified extraction approach was also evaluated. The optimised method showed low method detection limits (3-35 ng L⁻¹), acceptable accuracy (78-114%) and precision values (<13%) for most of the studied analytes regardless of the aqueous matrix. Finally, the developed approach was successfully applied to the determination of target analytes in aqueous environmental matrices including estuarine and wastewater samples. © 2013 Elsevier B.V. All rights reserved.
Prediction of protein post-translational modifications: main trends and methods
NASA Astrophysics Data System (ADS)
Sobolev, B. N.; Veselovsky, A. V.; Poroikov, V. V.
2014-02-01
The review summarizes main trends in the development of methods for the prediction of protein post-translational modifications (PTMs) by considering the three most common types of PTMs — phosphorylation, acetylation and glycosylation. Considerable attention is given to general characteristics of regulatory interactions associated with PTMs. Different approaches to the prediction of PTMs are analyzed. Most of the methods are based only on the analysis of the neighbouring environment of modification sites. The related software is characterized by relatively low accuracy of PTM predictions, which may be due both to the incompleteness of training data and to the features of PTM regulation. Advantages and limitations of the phylogenetic approach are considered. The prediction of PTMs using data on regulatory interactions, including the modular organization of interacting proteins, is a promising field, provided that more carefully selected training data are used. The bibliography includes 145 references.
Understanding poisson regression.
Hayat, Matthew J; Higgins, Melinda
2014-04-01
Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
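A hedged illustration of the modeling approach described above, using statsmodels in Python: a Poisson regression of a count outcome on a covariate, plus a negative-binomial fit as the overdispersion-robust alternative mentioned in the article. The data here are simulated and purely illustrative, not the ENSPIRE study data.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
age = rng.uniform(20, 90, n)
comorbidities = rng.poisson(np.exp(-1.0 + 0.03 * age))   # simulated count outcome
X = sm.add_constant(age)

poisson_fit = sm.GLM(comorbidities, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(comorbidities, X, family=sm.families.NegativeBinomial()).fit()
print(poisson_fit.params, negbin_fit.params)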
USDA-ARS?s Scientific Manuscript database
The brown sugar flotation and hot water methods are accepted procedures for detecting larval western cherry fruit fly, Rhagoletis indifferens Curran, in sweet cherry [Prunus avium (L.) L.] and could be included in a systems approach for showing the absence of larvae in fruit. The methods require cr...
Teaching and the Case Method. Text, Cases, and Readings. Third Edition.
ERIC Educational Resources Information Center
Barnes, Louis B.; And Others
This volume includes text, cases, and readings for a college faculty seminar to develop the knowledge, skills, and attitudes necessary for utilization of the case method approach to instruction. It builds on a long-term clinical research effort on the dynamics of the case method of teaching and application at Harvard Business School. In addition…
ERIC Educational Resources Information Center
Genemo, Hussein; Miah, Shah Jahan; McAndrew, Alasdair
2016-01-01
Assessment has been defined as an authentic method that plays an important role in evaluating students' learning attitude in acquiring lifelong knowledge. Traditional methods of assessment including the Computer-Aided Assessment (CAA) for mathematics show limited ability to assess students' full work unless multi-step questions are sub-divided…
The child with developmental delay: An approach to etiology
Meschino, Wendy S
2003-01-01
OBJECTIVE: To describe an approach to history, physical examination and investigation for the developmentally delayed child. METHODS: A review of electronic databases from 1997 to 2001 was done searching for articles relating to the approach to or investigations of children with developmental delay. Five studies, including a review of a consensus conference on evaluation of mental retardation, were chosen because of their general approaches to developmental delay and/or mental retardation, or specific evaluations of a particular laboratory investigation. CONCLUSIONS: A diagnosis or cause of mental retardation can be identified in 20% to 60% of cases. Evaluation of the developmentally delayed child should include a detailed history and physical examination, taking special care to record a three-generation pedigree, as well as to look for dysmorphic features. If no other cause is apparent, routine investigations should include a chromosome study and fragile X studies. Further investigations are warranted depending on the clinical features. PMID:20011550
Optimal guidance law development for an advanced launch system
NASA Technical Reports Server (NTRS)
Calise, Anthony J.; Leung, Martin S. K.
1995-01-01
The objective of this research effort was to develop a real-time guidance approach for launch vehicles ascent to orbit injection. Various analytical approaches combined with a variety of model order and model complexity reduction have been investigated. Singular perturbation methods were first attempted and found to be unsatisfactory. The second approach based on regular perturbation analysis was subsequently investigated. It also fails because the aerodynamic effects (ignored in the zero order solution) are too large to be treated as perturbations. Therefore, the study demonstrates that perturbation methods alone (both regular and singular perturbations) are inadequate for use in developing a guidance algorithm for the atmospheric flight phase of a launch vehicle. During a second phase of the research effort, a hybrid analytic/numerical approach was developed and evaluated. The approach combines the numerical methods of collocation and the analytical method of regular perturbations. The concept of choosing intelligent interpolating functions is also introduced. Regular perturbation analysis allows the use of a crude representation for the collocation solution, and intelligent interpolating functions further reduce the number of elements without sacrificing the approximation accuracy. As a result, the combined method forms a powerful tool for solving real-time optimal control problems. Details of the approach are illustrated in a fourth order nonlinear example. The hybrid approach is then applied to the launch vehicle problem. The collocation solution is derived from a bilinear tangent steering law, and results in a guidance solution for the entire flight regime that includes both atmospheric and exoatmospheric flight phases.
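For reference, the bilinear tangent steering law mentioned above has the general textbook form

tan θ(t) = (c1·t + c2) / (c3·t + c4)

where θ is the steering (pitch) angle and c1 through c4 are free parameters determined by the optimization; the exact parameterisation used in the paper may differ from this standard form.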
Toyota Prius HEV neurocontrol and diagnostics.
Prokhorov, Danil V
2008-01-01
A neural network controller for improved fuel efficiency of the Toyota Prius hybrid electric vehicle is proposed. A new method to detect and mitigate a battery fault is also presented. The approach is based on recurrent neural networks and includes the extended Kalman filter. The proposed approach is quite general and applicable to other control systems.
Children's Perspectives on Conceptual Games Teaching: A Value-Adding Experience
ERIC Educational Resources Information Center
Fry, Joan Marian; Tan, Clara Wee Keat; McNeill, Michael; Wright, Steven
2010-01-01
Background: Revisions of the Singaporean physical education (PE) syllabus in 1999 and 2006 included a conceptual approach to teaching games. The games concept approach (GCA), a form of constructivist pedagogy, was a distinct departure from the direct teaching methods traditionally used in the country. Following the GCA's introduction into a PE…
USDA-ARS?s Scientific Manuscript database
This paper describes a method for the detection and quantification of 38 of the most widely used anthelmintics (including benzimidazoles, macrocyclic lactones and flukicides) in bovine liver at MRL and non-MRL level. A dual validation approach was adapted to reliably detect anthelmintic residues ov...
Comparison of Modern Methods for Analyzing Repeated Measures Data with Missing Values
ERIC Educational Resources Information Center
Vallejo, G.; Fernandez, M. P.; Livacic-Rojas, P. E.; Tuero-Herrero, E.
2011-01-01
Missing data are a pervasive problem in many psychological applications in the real world. In this article we study the impact of dropout on the operational characteristics of several approaches that can be easily implemented with commercially available software. These approaches include the covariance pattern model based on an unstructured…
Different perspectives on economic base.
Lisa K. Crone; Richard W. Haynes; Nicholas E. Reyna
1999-01-01
Two general approaches for measuring the economic base are discussed. Each method is used to define the economic base for each of the counties included in the Interior Columbia Basin Ecosystem Management Project area. A more detailed look at four selected counties results in similar findings from different approaches. Limitations of economic base analysis also are...
ERIC Educational Resources Information Center
Kunkle, Wanda M.
2010-01-01
Many students experience difficulties learning to program. They find learning to program in the object-oriented paradigm particularly challenging. As a result, computing educators have tried a variety of instructional methods to assist beginning programmers. These include developing approaches geared specifically toward novices and experimenting…
ERIC Educational Resources Information Center
Perry, Thomas
2017-01-01
Value-added (VA) measures are currently the predominant approach used to compare the effectiveness of schools. Recent educational effectiveness research, however, has developed alternative approaches including the regression discontinuity (RD) design, which also allows estimation of absolute school effects. Initial research suggests RD is a viable…
On Seeing Red with the "Silent Way".
ERIC Educational Resources Information Center
Seely, Jonathan
As a learner-oriented approach, Gattegno's "Silent Way" has recently been receiving much attention in the teaching of English as a second language. Whereas the cognitive approach to the teaching of language deserves praise, an integral aspect of Gattegno's method includes the introduction of a 37-color alphabet, used in a one-to-one…
Lessons Learned from the Whole Child and Coordinated School Health Approaches
ERIC Educational Resources Information Center
Rasberry, Catherine N.; Slade, Sean; Lohrmann, David K.; Valois, Robert F.
2015-01-01
Background: The new Whole School, Whole Community, Whole Child (WSCC) model, designed to depict links between health and learning, is founded on concepts of coordinated school health (CSH) and a whole child approach to education. Methods: The existing literature, including scientific articles and key publications from national agencies and…
NASA Technical Reports Server (NTRS)
Webb, J. T.
1988-01-01
A new approach to the training, certification, recertification, and proficiency maintenance of the Shuttle launch team is proposed. Previous training approaches are first reviewed. Short term program goals include expanding current training methods, improving the existing simulation capability, and scheduling training exercises with the same priority as hardware tests. Long-term goals include developing user requirements which would take advantage of state-of-the-art tools and techniques. Training requirements for the different groups of people to be trained are identified, and future goals are outlined.
NASA Technical Reports Server (NTRS)
Schweikhard, W. G.; Dennon, S. R.
1986-01-01
A review of the Melick method of inlet flow dynamic distortion prediction by statistical means is provided. The developments reviewed include the general Melick approach with full dynamic measurements, a limited dynamic measurement approach, and a turbulence modelling approach which requires no dynamic rms pressure fluctuation measurements. These modifications are evaluated by comparing predicted and measured peak instantaneous distortion levels from provisional inlet data sets. A nonlinear mean-line following vortex model is proposed and evaluated as a potential criterion for improving the peak instantaneous distortion map generated from the conventional linear vortex of the Melick method. The model is simplified to a series of linear vortex segments which lie along the mean line. Maps generated with this new approach are compared with conventionally generated maps, as well as measured peak instantaneous maps. Inlet data sets include subsonic, transonic, and supersonic inlets under various flight conditions.
Fuzzy set methods for object recognition in space applications
NASA Technical Reports Server (NTRS)
Keller, James M.
1992-01-01
Progress on the following tasks is reported: feature calculation; membership calculation; clustering methods (including initial experiments on pose estimation); and acquisition of images (including camera calibration information for digitization of the model). The report consists of 'stand-alone' sections describing the activities in each task. We would like to highlight the fact that during this quarter, we believe we have made a major breakthrough in the area of fuzzy clustering. We have discovered a method to remove the probabilistic constraint that the sum of the memberships across all classes must add up to 1 (as in the fuzzy c-means). A paper describing this approach is included.
ERIC Educational Resources Information Center
Shannon, Kathleen
2018-01-01
This paper describes, as an alternative to the Moore Method or a purely flipped classroom, a student-driven, textbook-supported method for teaching that allows movement through the standard course material with differing depths, but the same pace. This method, which includes a combination of board work followed by class discussion, on-demand brief…
"Tools For Analysis and Visualization of Large Time- Varying CFD Data Sets"
NASA Technical Reports Server (NTRS)
Wilhelms, Jane; vanGelder, Allen
1999-01-01
During the four years of this grant (including the one year extension), we have explored many aspects of the visualization of large CFD (Computational Fluid Dynamics) datasets. These have included new direct volume rendering approaches, hierarchical methods, volume decimation, error metrics, parallelization, hardware texture mapping, and methods for analyzing and comparing images. First, we implemented an extremely general direct volume rendering approach that can be used to render rectilinear, curvilinear, or tetrahedral grids, including overlapping multiple zone grids, and time-varying grids. Next, we developed techniques for associating the sample data with a k-d tree, a simple hierarchical data model to approximate samples in the regions covered by each node of the tree, and an error metric for the accuracy of the model. We also explored a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH (Association for Computing Machinery Special Interest Group on Computer Graphics) '96. In our initial implementation, we automatically image the volume from 32 approximately evenly distributed positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation.
Catallo, Cristina; Jack, Susan M.; Ciliska, Donna; MacMillan, Harriet L.
2013-01-01
Little is known about how to systematically integrate complex qualitative studies within the context of randomized controlled trials. A two-phase sequential explanatory mixed methods study was conducted in Canada to understand how women decide to disclose intimate partner violence in emergency department settings. Mixing an RCT (with a subanalysis of data) with a grounded theory approach required methodological modifications to maintain the overall rigour of this mixed methods study. Modifications were made to the following areas of the grounded theory approach to support the overall integrity of the mixed methods study design: recruitment of participants, maximum variation and negative case sampling, data collection, and analysis methods. Recommendations for future studies include: (1) planning at the outset to incorporate a qualitative approach with an RCT and to determine logical points during the RCT to integrate the qualitative component and (2) consideration of the time needed to carry out an RCT and a grounded theory approach, especially to support recruitment, data collection, and analysis. Data mixing strategies should be considered during early stages of the study, so that appropriate measures can be developed and used in the RCT to support initial coding structures and data analysis needs of the grounded theory phase. PMID:23577245
How to choose methods for lake greenhouse gas flux measurements?
NASA Astrophysics Data System (ADS)
Bastviken, David
2017-04-01
Lake greenhouse gas (GHG) fluxes are increasingly recognized as important for lake ecosystems as well as for large scale carbon and GHG budgets. However, many of our flux estimates are uncertain, and it is debatable whether the presently available data are representative of the systems studied. Data are also very limited for some important flux pathways. Hence, many ongoing efforts try to better constrain fluxes and understand flux regulation. A fundamental challenge towards improved knowledge, and when starting new studies, is which methods to choose. A variety of approaches to measure aquatic GHG exchange is used, and data from different methods and methodological approaches have often been treated as equally valid to create large datasets for extrapolations and syntheses. However, data from different approaches may cover different flux pathways or spatio-temporal domains and are thus not always comparable. Method inter-comparisons and critical method evaluations addressing these issues are rare. Emerging efforts to organize systematic multi-lake monitoring networks for GHG fluxes lead to method choices that may set the foundation for decades of data generation and therefore require fundamental evaluation of different approaches. The method choices concern not only the equipment but also, for example, the overall measurement design and field approaches, the relevant spatial and temporal resolution for different flux components, and the accessory variables to measure. In addition, consideration is needed of how to design monitoring approaches that are affordable, suitable for widespread (global) use, and comparable across regions. Inspired by discussions with Prof. Dr. Cristian Blodau during the EGU General Assembly 2016, this presentation aims to (1) illustrate fundamental pros and cons for a number of common methods, (2) show how common methodological approaches originally adapted for other environments can be improved for lake flux measurements, (3) suggest how consideration of spatio-temporal dimensions of flux variability can lead to more optimized approaches, and (4) highlight possibilities of efficient ways forward, including low-cost technologies that have potential for world-wide use.
Multifidelity Analysis and Optimization for Supersonic Design
NASA Technical Reports Server (NTRS)
Kroo, Ilan; Willcox, Karen; March, Andrew; Haas, Alex; Rajnarayan, Dev; Kays, Cory
2010-01-01
Supersonic aircraft design is a computationally expensive optimization problem, and multifidelity approaches offer a significant opportunity to reduce design time and computational cost. This report presents tools developed to improve supersonic aircraft design capabilities including: aerodynamic tools for supersonic aircraft configurations; a systematic way to manage model uncertainty; and multifidelity model management concepts that incorporate uncertainty. The aerodynamic analysis tools developed are appropriate for use in a multifidelity optimization framework, and include four analysis routines to estimate the lift and drag of a supersonic airfoil, and a multifidelity supersonic drag code that estimates the drag of aircraft configurations with three different methods: an area rule method, a panel method, and an Euler solver. In addition, five multifidelity optimization methods are developed, which include local and global methods as well as gradient-based and gradient-free techniques.
Basic analytical methods for identification of erythropoiesis-stimulating agents in doping control
NASA Astrophysics Data System (ADS)
Postnikov, P. V.; Krotov, G. I.; Efimova, Yu A.; Rodchenkov, G. M.
2016-02-01
The design of new erythropoiesis-stimulating agents for clinical use necessitates constant development of methods for detecting the abuse of these substances, which are prohibited under the World Anti-Doping Code and are included in the World Anti-Doping Agency (WADA) prohibited list. This review integrates and describes systematically the published data on the key methods currently used by WADA-accredited anti-doping laboratories around the world to detect the abuse of erythropoiesis-stimulating agents, including direct methods (various polyacrylamide gel electrophoresis techniques, enzyme-linked immunosorbent assay, membrane enzyme immunoassay and mass spectrometry) and indirect methods (athlete biological passport). Particular attention is given to promising approaches and investigations that can be used to control prohibited erythropoietins in the near future. The bibliography includes 122 references.
Crystallization mosaic effect generation by superpixels
NASA Astrophysics Data System (ADS)
Xie, Yuqi; Bo, Pengbo; Yuan, Ye; Wang, Kuanquan
2015-03-01
Art effect generation from digital images using computational tools has been a hot research topic in recent years. We propose a new method for generating crystallization mosaic effects from color images. Two key problems in generating a pleasant mosaic effect are studied: grouping pixels into mosaic tiles and arranging mosaic tiles to adapt to image features. To give a visually pleasant mosaic effect, we propose to create mosaic tiles by pixel clustering in the feature space of color information, taking the compactness of tiles into consideration as well. Moreover, we propose a method for processing feature boundaries in images that guides the arrangement of mosaic tiles near image features. This method gives nearly uniform mosaic tile shapes, adapting to feature lines in an esthetic way. The new approach considers both the color distance and the Euclidean distance of pixels, and is thus capable of producing mosaic tiles in a more pleasing manner. Some experiments are included to demonstrate the computational efficiency of the present method and its capability of generating visually pleasant mosaic tiles. Comparisons with existing approaches are also included to show the superiority of the new method.
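One plausible way to realize the clustering step described above (not the authors' exact algorithm): a k-means grouping in a joint color-plus-position feature space, where a compactness weight trades color distance against Euclidean distance. Function names and parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def mosaic_labels(image, n_tiles=200, compactness=0.5):
    """Group pixels of an RGB image (H x W x 3 uint8) into compact, color-coherent tiles."""
    h, w, _ = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # normalize color and position so the compactness weight is meaningful
    color = image.reshape(-1, 3).astype(float) / 255.0
    pos = np.stack([ys.ravel() / h, xs.ravel() / w], axis=1)
    features = np.hstack([color, compactness * pos])
    labels = KMeans(n_clusters=n_tiles, n_init=4, random_state=0).fit_predict(features)
    return labels.reshape(h, w)

# usage (assuming PIL is available):
#   labels = mosaic_labels(np.asarray(Image.open("photo.png").convert("RGB")))
```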
2017-01-01
Background The use of telemedicine technologies in health care has increased substantially, together with a growing interest in participatory design methods when developing telemedicine approaches. Objective We present lessons learned from a case study involving patients with heart disease and health care professionals in the development of a personalized Web-based health care intervention. Methods We used a participatory design approach inspired by the method for feasibility studies in software development. We collected qualitative data using multiple methods in 3 workshops and analyzed the data using thematic analysis. Participants were 7 patients with a diagnosis of heart disease, 2 nurses, 1 physician, 2 systems architects, 3 moderators, and 3 observers. Results We present findings in 2 parts. (1) Outcomes of the participatory design process: users gave valuable feedback on the ease of use of the platforms’ tracking tools, platform design, and terminology, and provided insights into patients’ monitoring needs, information and communication technology skills, and preferences for self-management tools. (2) Experiences from the participatory design process: patients and health care professionals contributed different perspectives, with the patients using an experience-based approach and the health care professionals using a more attitude-based approach. Conclusions The essential lessons learned concern the planning and organization of workshops, including the finding that patients engaged actively and willingly in a participatory design process, whereas it was more challenging to include and engage health care professionals. PMID:28526674
Evaluating the Impact of the U.S. National Toxicology Program: A Case Study on Hexavalent Chromium
Xie, Yun; Holmgren, Stephanie; Andrews, Danica M. K.; Wolfe, Mary S.
2016-01-01
Background: Evaluating the impact of federally funded research with a broad, methodical, and objective approach is important to ensure that public funds advance the mission of federal agencies. Objectives: We aimed to develop a methodical approach that would yield a broad assessment of National Toxicology Program’s (NTP’s) effectiveness across multiple sectors and demonstrate the utility of the approach through a case study. Methods: A conceptual model was developed with defined activities, outputs (products), and outcomes (proximal, intermediate, distal) and applied retrospectively to NTP’s research on hexavalent chromium (CrVI). Proximal outcomes were measured by counting views of and requests for NTP’s products by external stakeholders. Intermediate outcomes were measured by bibliometric analysis. Distal outcomes were assessed through Web and LexisNexis searches for documents related to legislation or regulation changes. Results: The approach identified awareness of NTP’s work on CrVI by external stakeholders (proximal outcome) and citations of NTP’s research in scientific publications, reports, congressional testimonies, and legal and policy documents (intermediate outcome). NTP’s research was key to the nation’s first-ever drinking water standard for CrVI adopted by California in 2014 (distal outcome). By applying this approach to a case study, the utility and limitations of the approach were identified, including challenges to evaluating the outcomes of a research program. Conclusions: This study identified a broad and objective approach for assessing NTP’s effectiveness, including methodological needs for more thorough and efficient impact assessments in the future. Citation: Xie Y, Holmgren S, Andrews DMK, Wolfe MS. 2017. Evaluating the impact of the U.S. National Toxicology Program: a case study on hexavalent chromium. Environ Health Perspect 125:181–188; http://dx.doi.org/10.1289/EHP21 PMID:27483499
Brandt, Marc; Becker, Eva; Jöhncke, Ulrich; Sättler, Daniel; Schulte, Christoph
2016-01-01
One important purpose of the European REACH Regulation (EC No. 1907/2006) is to promote the use of alternative methods for the assessment of hazards of substances in order to avoid animal testing. Experience with environmental hazard assessment under REACH shows that efficient alternative methods are needed in order to assess chemicals when standard test data are missing. One such assessment method is the weight-of-evidence (WoE) approach. In this study, the WoE approach was used to assess the persistence of certain phenolic benzotriazoles, a group that also includes substances of very high concern (SVHC). For phenolic benzotriazoles, assessment of environmental persistence is challenging because standard information, i.e. simulation tests on biodegradation, is not available. Thus, the WoE approach was used: overall information from many sources was considered, and the individual uncertainties of each source were analysed separately. In a second step, all information was aggregated, giving an overall picture of persistence that allowed the degradability of the phenolic benzotriazoles under consideration to be assessed even though the reliability of individual sources was incomplete. Overall, the evidence suggesting that phenolic benzotriazoles are very persistent in the environment is unambiguous. This was demonstrated by a WoE approach that considers the prerequisites of REACH and combines several limited information sources. The combination enabled a clear overall assessment which can be reliably used for SVHC identification. Finally, it is recommended to include WoE approaches as an important tool in future environmental risk assessments.
Life Support Catalyst Regeneration Using Ionic Liquids and In Situ Resources
NASA Technical Reports Server (NTRS)
Abney, Morgan B.; Karr, Laurel J.; Paley, Mark S.; Donovan, David N.; Kramer, Teersa J.
2016-01-01
Oxygen recovery from metabolic carbon dioxide is an enabling capability for long-duration manned space flight. Complete recovery of oxygen (100%) involves the production of solid carbon. Catalytic approaches for this purpose, such as Bosch technology, have been limited in trade analyses due in part to the mass penalty for high catalyst resupply caused by carbon fouling of the iron or nickel catalyst. In an effort to mitigate this challenge, several technology approaches have been proposed. These approaches have included methods to prolong the life of the catalysts by increasing the total carbon mass loading per mass catalyst, methods for simplified catalyst introduction and removal to limit the resupply container mass, methods of using in situ resources, and methods to regenerate catalyst material. Research and development into these methods is ongoing, but only use of in situ resources and/or complete regeneration of catalyst material has the potential to entirely eliminate the need for resupply. The use of ionic liquids provides an opportunity to combine these methods in a technology approach designed to eliminate the need for resupply of oxygen recovery catalyst. Here we describe the results of an initial feasibility study using ionic liquids and in situ resources for life support catalyst regeneration, we discuss the key challenges with the approach, and we propose future efforts to advance the technology.
Chernikova, Valeriya; Shekhah, Osama; Eddaoudi, Mohamed
2016-08-10
Here, we report a new and advanced method for the fabrication of highly oriented/polycrystalline metal-organic framework (MOF) thin films. Building on the attractive features of the liquid-phase epitaxy (LPE) approach, a facile spin coating method was implemented to generate MOF thin films in a high-throughput fashion. Advantageously, this approach offers a great prospect for cost-effectively constructing thin films with a significantly shortened preparation time and reduced chemical and solvent consumption, as compared to the conventional LPE process. Indeed, this new spin-coating approach has been implemented successfully to construct various MOF thin films, ranging in thickness from a few micrometers down to the nanometer scale, spanning 2-D and 3-D benchmark MOF materials including Cu2(bdc)2·xH2O, Zn2(bdc)2·xH2O, HKUST-1, and ZIF-8. This method was appraised and proved effective on a variety of substrates comprising functionalized gold, silicon, glass, porous stainless steel, and aluminum oxide. The facile, high-throughput and cost-effective nature of this approach, coupled with the successful thin film growth and substrate versatility, represents the next generation of methods for MOF thin film fabrication, paving the way for these unique MOF materials to address a wide range of challenges in the areas of sensing devices and membrane technology.
NASA Astrophysics Data System (ADS)
Käser, Martin; Dumbser, Michael; de la Puente, Josep; Igel, Heiner
2007-01-01
We present a new numerical method to solve the heterogeneous anelastic, seismic wave equations with arbitrary high order accuracy in space and time on 3-D unstructured tetrahedral meshes. Using the velocity-stress formulation provides a linear hyperbolic system of equations with source terms that is completed by additional equations for the anelastic functions including the strain history of the material. These additional equations result from the rheological model of the generalized Maxwell body and permit the incorporation of realistic attenuation properties of viscoelastic material accounting for the behaviour of elastic solids and viscous fluids. The proposed method combines the Discontinuous Galerkin (DG) finite element (FE) method with the ADER approach using Arbitrary high order DERivatives for flux calculations. The DG approach, in contrast to classical FE methods, uses a piecewise polynomial approximation of the numerical solution which allows for discontinuities at element interfaces. Therefore, the well-established theory of numerical fluxes across element interfaces obtained by the solution of Riemann problems can be applied as in the finite volume framework. The main idea of the ADER time integration approach is a Taylor expansion in time in which all time derivatives are replaced by space derivatives using the so-called Cauchy-Kovalewski procedure which makes extensive use of the governing PDE. Due to the ADER time integration technique the same approximation order in space and time is achieved automatically and the method is a one-step scheme advancing the solution for one time step without intermediate stages. To this end, we introduce a new unrolled recursive algorithm for efficiently computing the Cauchy-Kovalewski procedure by making use of the sparsity of the system matrices. The numerical convergence analysis demonstrates that the new schemes provide very high order accuracy even on unstructured tetrahedral meshes while computational cost and storage space for a desired accuracy can be reduced when applying higher degree approximation polynomials. In addition, we investigate the increase in computing time, when the number of relaxation mechanisms due to the generalized Maxwell body are increased. An application to a well-acknowledged test case and comparisons with analytic and reference solutions, obtained by different well-established numerical methods, confirm the performance of the proposed method. Therefore, the development of the highly accurate ADER-DG approach for tetrahedral meshes including viscoelastic material provides a novel, flexible and efficient numerical technique to approach 3-D wave propagation problems including realistic attenuation and complex geometry.
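A schematic of the formulation summarized above, in simplified notation (the anelastic coupling is compressed into a single source term, and the expansion is shown for the source-free case):

```latex
% Velocity-stress formulation as a linear hyperbolic system with source term:
\frac{\partial \mathbf{q}}{\partial t}
  + \mathbf{A}\,\frac{\partial \mathbf{q}}{\partial x}
  + \mathbf{B}\,\frac{\partial \mathbf{q}}{\partial y}
  + \mathbf{C}\,\frac{\partial \mathbf{q}}{\partial z}
  = \mathbf{S}(\mathbf{q}),
% where q collects stresses, velocities and the anelastic functions.
% ADER time integration: Taylor expansion in time, with time derivatives replaced
% by space derivatives via the Cauchy-Kovalewski procedure,
\mathbf{q}(t_0 + \Delta t)
  = \sum_{k=0}^{N} \frac{\Delta t^{k}}{k!}\,
    \left.\frac{\partial^{k}\mathbf{q}}{\partial t^{k}}\right|_{t_0},
\qquad
\frac{\partial^{k}\mathbf{q}}{\partial t^{k}}
  = \left(-\mathbf{A}\,\partial_x - \mathbf{B}\,\partial_y - \mathbf{C}\,\partial_z\right)^{k}\mathbf{q}
  \quad (\text{for } \mathbf{S}=0).
```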
Embedded high-contrast distributed grating structures
Zubrzycki, Walter J.; Vawter, Gregory A.; Allerman, Andrew A.
2002-01-01
A new class of fabrication methods for embedded distributed grating structures is claimed, together with optical devices which include such structures. These new methods are the only known approach to making defect-free high-dielectric contrast grating structures, which are smaller and more efficient than are conventional grating structures.
State-of-the-art report on non-traditional traffic counting methods
DOT National Transportation Integrated Search
2001-10-01
The purpose of this report is to look at the state-of-the-art of non-traditional traffic counting methods. This is done through a three-fold approach that includes an assessment of currently available technology, a survey of State Department of Trans...
Methods and compositions for protection of cells and tissues from computed tomography radiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grdina, David J.
Described are methods for preventing or inhibiting genomic instability in cells affected by diagnostic radiology procedures employing ionizing radiation. Embodiments include methods of preventing or inhibiting genomic instability in cells affected by computed tomography (CT) radiation. Subjects receiving ionizing radiation may be persons suspected of having cancer, cancer patients having received or currently receiving cancer therapy, and/or patients having received previous ionizing radiation, including those who are approaching or have exceeded the recommended total radiation dose for a person.
Introducing gender equity to adolescent school children: A mixed methods' study.
Syed, Saba
2017-01-01
Over the past decade, gender equality and women's empowerment have been explicitly recognized as key not only to the health of nations but also to social and economic development. The aim of the present study was to assess the effectiveness of a mixed methods participatory group education approach to introduce gender equity to adolescent school children. It also assessed baseline and postintervention knowledge, attitudes, and practices regarding gender equity and sexual and reproductive health among adolescent students in government-aided schools, and finally compared the pre- and post-intervention gender equitable (GE) attitudes among the study participants. A government-aided school was selected by nonprobabilistic intentional sampling. On 5 predesignated days, willing students were included in the intervention, which comprised a pretest, a group education-based participatory mixed methods intervention, and a posttest assessment. A total of 186 students participated in the study. Girls had better baseline GE scores than boys, and they also improved more on their baseline scores following the intervention. The present mixed methods approach to introducing gender equity to adolescent school children through a group education-based intervention proved effective in initiating dialog and sensitizing adolescents on gender equity and violence within a school setting.
Perspectives on the Future Search for Life on Mars and Beyond
NASA Technical Reports Server (NTRS)
Nealson, K. H.
1998-01-01
One can view the search for life on Mars in two ways: first, as the initial step in the search for life elsewhere, and second, as the one place where in situ methods for life detection can be tested and proved via sample return. After Mars, most life detection will be done via in situ studies with data return. Mars offers us the opportunity to fine-tune our methods - perhaps for a long time to come. Our group is involved in the development of methods for life detection that are independent of the specific signals used for detection of life on Earth. These approaches include general indicators of metabolic activity and of organismal structure and composition. Using such approaches, we hope to detect signals of life (biosignatures) that are independent of preconceived notions and yet are convincing and unambiguous. The approaches we are focusing on include stable isotopic analyses of metals, mineral formation and dissolution, and elemental analysis. These methods allow us to examine samples at a variety of scales, looking for nonequilibrium distributions of elements that serve as biosignatures. For future studies of Mars and beyond, they, or some variation of them, should allow inference or proof of life in non-Earth locations.
Developing consumer involvement in rural HIV primary care programmes.
Mamary, Edward M; Toevs, Kim; Burnworth, Karla B; Becker, Lin
2004-06-01
As part of a broader medical and psychosocial needs assessment in a rural region of northern California, USA, five focus groups were conducted to explore innovative approaches to creating a system of consumer involvement in the delivery of HIV primary care services in the region. A total of five focus groups (n = 30) were conducted with clients from three of five counties in the region with the highest number of HIV patients receiving primary care. Participants were recruited by their HIV case managers. They were adults living with HIV, who were receiving health care, and who resided in a rural mountain region of northern California. Group discussions explored ideas for new strategies and examined traditional methods of consumer involvement, considering ways they could be adapted for a rural environment. Recommendations for consumer involvement included a multi-method approach consisting of traditional written surveys, a formal advisory group, and monthly consumer led social support/informal input groups. Specific challenges discussed included winter weather conditions, transportation barriers, physical limitations, confidentiality concerns, and needs for social support and education. A multiple-method approach would ensure more comprehensive consumer involvement in the programme planning process. It is also evident that methods for incorporating consumer involvement must be adapted to the specific context and circumstances of a given programme.
Learning the Relationship between Galaxy Spectra and Star Formation Histories
NASA Astrophysics Data System (ADS)
Lovell, Christopher; Acquaviva, Viviana; Iyer, Kartheik; Gawiser, Eric
2018-01-01
We explore novel approaches to the problem of predicting a galaxy’s star formation history (SFH) from its Spectral Energy Distribution (SED). Traditional approaches to SED template fitting use constant or exponentially declining SFHs, and are known to incur significant bias in the inferred SFHs, which are typically skewed toward younger stellar populations. Machine learning approaches, including tree ensemble methods and convolutional neural networks, would not be affected by the same bias, and may work well in recovering unbiased and multi-episodic star formation histories. We use a supervised approach whereby models are trained using synthetic spectra, generated from three state of the art hydrodynamical simulations, including nebular emission. We explore how SED feature maps can be used to highlight areas of the spectrum with the highest predictive power and discuss the limitations of the approach when applied to real data.
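A minimal sketch of the supervised tree-ensemble setup described above, using random placeholders in place of the simulated spectra and binned star formation histories; array shapes and hyperparameters are illustrative assumptions, not values from the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
spectra = rng.normal(size=(5000, 200))   # placeholder for simulated SEDs (flux per bin)
sfh = rng.random(size=(5000, 8))         # placeholder for SFHs binned in lookback time

X_train, X_test, y_train, y_test = train_test_split(spectra, sfh, random_state=0)
model = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
model.fit(X_train, y_train)

# feature importances highlight spectral regions with the most predictive power,
# loosely analogous to the SED feature maps mentioned in the abstract
top_bins = np.argsort(model.feature_importances_)[::-1][:10]
print(top_bins, model.score(X_test, y_test))
```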
Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh
2011-03-01
It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a subject of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers to clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory. Copyright © 2010 Elsevier Ltd. All rights reserved.
The method of complex characteristics for design of transonic blade sections
NASA Technical Reports Server (NTRS)
Bledsoe, M. R.
1986-01-01
A variety of computational methods were developed to obtain shockless or near-shockless flow past two-dimensional airfoils. The approach used was the method of complex characteristics, which determines smooth solutions to the transonic flow equations based on an input speed distribution. General results from fluid mechanics are presented. An account of the method of complex characteristics is given, including a description of the particular spaces and coordinates, conformal transformations, and numerical procedures that are used. The operation of the computer program COMPRES is presented along with examples of blade sections designed with the code. A user manual is included, with a glossary to provide additional information which may be helpful. The computer program in Fortran, including numerous comment cards, is listed.
Material characterization of structural adhesives in the lap shear mode
NASA Technical Reports Server (NTRS)
Sancaktar, E.; Schenck, S. C.
1983-01-01
A general method for characterizing structural adhesives in the bonded lap shear mode is proposed. Two approaches are used: a semiempirical approach and a theoretical approach. The semiempirical approach uses Ludwik's and Zhurkov's equations to describe, respectively, the failure stresses in the constant strain rate and constant stress loading modes, with the inclusion of temperature effects. The theoretical approach is used to describe adhesive shear stress-strain behavior with the use of viscoelastic or nonlinear elastic constitutive equations. Two different model adhesives are used in the single lap shear mode with titanium adherends. These adhesives (one of which was developed at NASA Langley Research Center) are currently being considered by NASA for possible aerospace applications. The use of different model adhesives helps in assessing the generality of the method.
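For reference, commonly cited forms of the two semiempirical relations named above are sketched below; the report's exact temperature-modified constants and notation may differ.

```latex
% Ludwik-type rate dependence of the failure stress under constant strain rate:
\tau_{f} \;=\; \tau_{0} \;+\; \tau' \,\log\!\left(\frac{\dot{\gamma}}{\dot{\gamma}_{0}}\right),
% Zhurkov's kinetic relation for the time to failure under constant stress:
t_{f} \;=\; t_{0}\,\exp\!\left(\frac{U_{0} - \gamma\,\tau}{kT}\right),
% where U_0 is an activation energy, \gamma an activation volume, and kT the thermal energy.
```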
Teaching about Fascism: An Interdisciplinary Approach.
ERIC Educational Resources Information Center
Hirshfield, Claire
1980-01-01
Describes a university course which teaches the history of fascism and nazism through interdisciplinary methods: philosophy, film, literature, and art. Visiting lecturers include survivors of concentration camps. (KC)
CW-SSIM kernel based random forest for image classification
NASA Astrophysics Data System (ADS)
Fan, Guangzhe; Wang, Zhou; Wang, Jiheng
2010-07-01
The complex wavelet structural similarity (CW-SSIM) index has been proposed as a powerful image similarity metric that is robust to translation, scaling and rotation of images, but how to employ it in image classification applications has not been deeply investigated. In this paper, we incorporate CW-SSIM as a kernel function into a random forest learning algorithm. This leads to a novel image classification approach that does not require a feature extraction or dimension reduction stage at the front end. We use hand-written digit recognition as an example to demonstrate our algorithm. We compare the performance of the proposed approach with random forest learning based on other kernels, including the widely adopted Gaussian and inner product kernels. Empirical evidence shows that the proposed method is superior in its classification power. We also compare our proposed approach with the direct random forest method without a kernel and with the popular kernel-learning method, the support vector machine. Our test results based on both simulated and real-world data suggest that the proposed approach works better than traditional methods without the feature selection procedure.
The opportunities and challenges of large-scale molecular approaches to songbird neurobiology
Mello, C.V.; Clayton, D.F.
2014-01-01
High-throughput methods for analyzing genome structure and function are having a large impact in songbird neurobiology. Methods include genome sequencing and annotation, comparative genomics, DNA microarrays and transcriptomics, and the development of a brain atlas of gene expression. Key emerging findings include the identification of complex transcriptional programs active during singing, the robust brain expression of non-coding RNAs, evidence of profound variations in gene expression across brain regions, and the identification of molecular specializations within song production and learning circuits. Current challenges include the statistical analysis of large datasets, effective genome curation, the efficient localization of gene expression changes to specific neuronal circuits and cells, and the dissection of behavioral and environmental factors that influence brain gene expression. The field requires efficient methods for comparisons with organisms like the chicken, which offer important anatomical, functional and behavioral contrasts. As sequencing costs plummet, opportunities emerge for comparative approaches that may help reveal evolutionary transitions contributing to vocal learning, social behavior and other properties that make songbirds such compelling research subjects. PMID:25280907
NASA Technical Reports Server (NTRS)
Pliutau, Denis; Prasad, Narasimha S.
2012-01-01
In this paper, a modeling method based on data reductions is investigated which includes pre-analyzed MERRA atmospheric fields for quantitative estimates of the uncertainties introduced in integrated path differential absorption methods for the sensing of various molecules, including CO2. This approach extends our previously developed lidar modeling framework and allows effective on- and offline wavelength optimization and weighting function analysis to minimize interference effects such as those due to temperature sensitivity and water vapor absorption. The new simulation methodology differs from the previous implementation in that the data reduction methods employed allow analysis of atmospheric effects over annual spans and over the entire Earth coverage. The effectiveness of the proposed simulation approach is demonstrated with application to mixing ratio retrievals for the future ASCENDS mission. Independent analysis of multiple accuracy-limiting factors, including temperature, water vapor interferences, and selected system parameters, is further used to identify favorable spectral regions as well as wavelength combinations facilitating the reduction of total errors in the retrieved XCO2 values.
Cejka, Pavel; Culík, Jiří; Horák, Tomáš; Jurková, Marie; Olšovská, Jana
2013-12-26
The rate of beer aging is affected by storage conditions, chiefly time and temperature. Although bottled beer is commonly stored for up to 1 year, sensory damage is quite frequent. Therefore, a method for the retrospective determination of the storage temperature of beer was developed. The method is based on the determination of selected carbonyl compounds known as "aging indicators", which are formed during beer aging. The aging indicators were determined using GC-MS after precolumn derivatization with O-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine hydrochloride, and their profile was correlated with the development of old flavor evolving under defined conditions (temperature, time) using a mathematical and statistical apparatus. Three approaches, including calculation from a regression graph, multiple linear regression, and neural networks, were employed. The ultimate uncertainty of the method ranged from 3.0 to 11.0 °C depending on the approach used. Furthermore, the assay was extended to include prediction of a beer's tendency to sensory aging from freshly bottled beer.
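A minimal sketch of the multiple-linear-regression variant among the three approaches listed above; the indicator concentrations and temperatures below are hypothetical placeholders, not data from the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical GC-MS "aging indicator" concentrations (columns) for beers stored
# at known temperatures (target), used to fit a retrospective temperature model.
X_train = np.array([
    [0.12, 0.30, 0.05],   # indicator concentrations for beer stored at 5 degC
    [0.25, 0.55, 0.11],   # 15 degC
    [0.41, 0.90, 0.22],   # 25 degC
    [0.60, 1.30, 0.35],   # 35 degC
])
t_train = np.array([5.0, 15.0, 25.0, 35.0])

model = LinearRegression().fit(X_train, t_train)

# retrospective estimate of the storage temperature of a beer with unknown history
unknown = np.array([[0.33, 0.72, 0.17]])
print(model.predict(unknown))
```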
A review of parametric approaches specific to aerodynamic design process
NASA Astrophysics Data System (ADS)
Zhang, Tian-tian; Wang, Zhen-guo; Huang, Wei; Yan, Li
2018-04-01
Parametric modeling of aircraft plays a crucial role in the aerodynamic design process. Effective parametric approaches provide a large design space with few variables. Parametric methods that are commonly used nowadays are summarized in this paper, and their principles are introduced briefly. Two-dimensional parametric methods include the B-Spline method, the Class/Shape function transformation method, the Parametric Section method, the Hicks-Henne method and the Singular Value Decomposition method, all of which are widely applied in airfoil design. This survey compares them to assess their abilities in airfoil design, and the results show that the Singular Value Decomposition method has the best parametric accuracy. The development of three-dimensional parametric methods is limited, and the most popular one is the Free-form deformation method. Methods extended from two-dimensional parametric methods have promising prospects in aircraft modeling. Since different parametric methods differ in their characteristics, the real design process requires a flexible choice among them to adapt to the subsequent optimization procedure.
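As an example of one of the listed two-dimensional methods, the Class/Shape function transformation is commonly written as follows (standard form; the survey's notation may differ):

```latex
% Non-dimensional airfoil ordinate as a function of chordwise position \psi = x/c:
\zeta(\psi) \;=\; C_{N_1}^{N_2}(\psi)\, S(\psi) \;+\; \psi\,\zeta_{TE},
\qquad
C_{N_1}^{N_2}(\psi) \;=\; \psi^{N_1}\,(1-\psi)^{N_2},
% with the shape function expressed in a Bernstein-polynomial basis whose
% coefficients A_i are the design variables:
S(\psi) \;=\; \sum_{i=0}^{n} A_i \binom{n}{i}\, \psi^{i}\,(1-\psi)^{\,n-i},
% where N_1 = 0.5 and N_2 = 1.0 recover a round-nosed airfoil with a sharp trailing edge.
```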
Semi-automating the manual literature search for systematic reviews increases efficiency.
Chapman, Andrea L; Morgan, Laura C; Gartlehner, Gerald
2010-03-01
To minimise retrieval bias, manual literature searches are a key part of the search process of any systematic review. Considering the need for accurate information, valid results of the manual literature search are essential to ensure scientific standards; likewise, efficient approaches that minimise the amount of personnel time required to conduct a manual literature search are of great interest. The objective of this project was to determine the validity and efficiency of a new manual search method that utilises the Scopus database. We used the traditional manual search approach as the gold standard to determine the validity and efficiency of the proposed Scopus method. Outcome measures included completeness of article detection and personnel time involved. Using both methods independently, we compared the results based on accuracy of the results (validity) and time spent conducting the search (efficiency). Regarding accuracy, the Scopus method identified the same studies as the traditional approach, indicating its validity. In terms of efficiency, using Scopus led to a time saving of 62.5% compared with the traditional approach (3 h versus 8 h). The Scopus method can significantly improve the efficiency of manual searches and thus of systematic reviews.
Task Training Emphasis for Determining Training Priority.
1987-08-01
…the relative time spent on tasks performed in their current jobs. Supervisors also rated the tasks on several different task factors, including Task Difficulty, Probable Consequences of Inadequate Performance, Task Delay Tolerance, and Recommended Training Emphasis.
Developing a Mind-Body Exercise Programme for Stressed Children
ERIC Educational Resources Information Center
Wang, Claudia; Seo, Dong-Chul; Geib, Roy W
2017-01-01
Objective: To describe the process of developing a Health Qigong programme for stressed children using a formative evaluation approach. Methods: A multi-step formative evaluation method was utilised. These steps included (1) identifying programme content and drafting the curriculum, (2) synthesising effective and age-appropriate pedagogies, (3)…
Solving ay'' + by' + cy = 0 with a Simple Product Rule Approach
ERIC Educational Resources Information Center
Tolle, John
2011-01-01
When elementary ordinary differential equations (ODEs) of first and second order are included in the calculus curriculum, second-order linear constant coefficient ODEs are typically solved by a method more appropriate to differential equations courses. This method involves the characteristic equation and its roots, complex-valued solutions, and…
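For context, the conventional approach the abstract contrasts with (not the article's product-rule alternative) proceeds via the characteristic equation:

```latex
% Substituting y = e^{rx} into a y'' + b y' + c y = 0 gives the characteristic equation
a r^{2} + b r + c = 0,
% whose roots r_{1,2} determine the general solution; for distinct real roots,
y(x) = C_{1} e^{r_{1} x} + C_{2} e^{r_{2} x}.
```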
NASA Technical Reports Server (NTRS)
Bowen, Howard S.; Cunningham, Douglas M.
2007-01-01
The contents include: 1) Brief history of related events; 2) Overview of original method used to establish absolute radiometric accuracy of remote sensing instruments using stellar sources; and 3) Considerations to improve the stellar calibration approach.
Behavioral Changes Based on a Course in Agroecology: A Mixed Methods Study
ERIC Educational Resources Information Center
Harms, Kristyn; King, James; Francis, Charles
2009-01-01
This study evaluated and described student perceptions of a course in agroecology to determine if participants experienced changed perceptions and behaviors resulting from the Agroecosystems Analysis course. A triangulation validating quantitative data mixed methods approach included a written survey comprised of both quantitative and open-ended…
The Journal of Suggestive-Accelerative Learning and Teaching, Volume 5, Number 2.
ERIC Educational Resources Information Center
Journal of Suggestive-Accelerative Learning and Teaching, 1980
1980-01-01
A collection of articles concerning suggestive-accelerative learning and teaching (SALT) methods includes: "Suggestive Teaching Methods in the Soviet Union" (Eva Szalontai); "SALT Applied to Remedial Reading: A Critical Review" (Allyn Prichard and Jean Taylor); "The Waldorf Schools: An Artistic Approach to Education"…
Automatic building LOD copies for multitextured objects
NASA Astrophysics Data System (ADS)
Souetov, Andrew E.
2000-01-01
This article is dedicated to research on geometry level-of-detail technology for systems of real-time 3D visualization. The article includes the conditions of applicability of the method, an overview of existing approaches, and their drawbacks and advantages. New technology guidelines are suggested as an alternative to existing methods.
New Method for Analysis of Multiple Anthelmintic Residues in Animal Tissue
USDA-ARS?s Scientific Manuscript database
For the first time, 39 of the major anthelmintics can be detected in one rapid and sensitive LC-MS/MS method, including the flukicides, which have been generally overlooked in surveillance programs. Utilizing the QuEChERS approach, residues were extracted from liver and milk using acetonitrile, sod...
Presenting the Iterative Curriculum Discourse Analysis (ICDA) Approach
ERIC Educational Resources Information Center
Iversen, Lars Laird
2014-01-01
The article presents a method for analysing recurring curriculum documents using discourse theory inspired by Ernesto Laclau and Chantal Mouffe. The article includes a presentation of the method in seven practical steps, and is illustrated and discussed throughout using the author's recent case study on religion, identity and values in Norwegian…
In order to screen large numbers of chemicals for their potential to produce developmental neurotoxicity new, in vitro methods are needed. One approach is to develop methods based on the biologic processes which underlie brain development including the growth and maturation of ce...
Comparison of multiple gene assembly methods for metabolic engineering
Chenfeng Lu; Karen Mansoorabadi; Thomas Jeffries
2007-01-01
A universal, rapid DNA assembly method for efficient multigene plasmid construction is important for biological research and for optimizing gene expression in industrial microbes. Three different approaches to achieve this goal were evaluated. These included creating long complementary extensions using a uracil-DNA glycosylase technique, overlap extension polymerase...
Nursing Admission Practices to Discern "Fit": A Case Study Exemplar
ERIC Educational Resources Information Center
Sinutko, Jaime M.
2014-01-01
Admission to a baccalaureate nursing school in the United States is currently a challenging proposition for a variety of reasons. This research explored a holistic nursing school admission process at a small, private, baccalaureate college using a retrospective, mixed-method approach. The holistic method included multiple admission criteria, both…
101 Short Problems from EQUALS = 101 Problemas Cortos del programma EQUALS.
ERIC Educational Resources Information Center
Stenmark, Jean Kerr, Ed.
EQUALS is a teacher advisory program that helps elementary and secondary educators acquire methods and materials to attract minority and female students to mathematics. The program supports a problem-solving approach to mathematics, including having students working in groups, using active assessment methods, and incorporating a broad mathematics…
Methods for Instructional Diagnosis with Limited Available Resources.
ERIC Educational Resources Information Center
Gillmore, Gerald M.; Clark, D. Joseph
College teaching should be approached with the same careful delineation of problems and systematic attempts to find solutions which characterize research. Specific methods for the diagnosis of instructional problems include audio-video taping, use of teaching assistants, colleague assistance, classroom tests, student projects in and out of class,…
ERIC Educational Resources Information Center
Eckhoff, Angela
2017-01-01
This article documents a collaborative project involving preservice early childhood education students' development of inquiry-based learning experiences alongside kindergarten students within a science methods course. To document this project, I utilized a multiple methods approach and data included classroom observations, transcripts from lesson…
Selecting the Right Construction Delivery Method for a Specific Project.
ERIC Educational Resources Information Center
Klinger, Jeff; Booth, Scott
2002-01-01
Discusses the costs and benefits of various construction delivery methods for higher education facility projects, including the traditional lump sum general contracting approach (also known as design/bid/build); design-build; and, in the case of private institutions, guaranteed maximum pricing offered by those firms willing to perform construction…
ERIC Educational Resources Information Center
Curtis, Dan
2010-01-01
This article gives a simple method for determining the maximum interval of existence for a solution of a single, autonomous, first-order differential equation as well as the behavior of the solution as the independent variable approaches the ends of the interval. The methods used are elementary enough to be included in an introductory differential…
Research in Distance Education: A System Modeling Approach.
ERIC Educational Resources Information Center
Saba, Farhad; Twitchell, David
This demonstration of the use of a computer simulation research method based on the System Dynamics modeling technique for studying distance education reviews research methods in distance education, including the broad categories of conceptual and case studies, and presents a rationale for the application of systems research in this area. The…
A voxel-based investigation for MRI-only radiotherapy of the brain using ultra short echo times
NASA Astrophysics Data System (ADS)
Edmund, Jens M.; Kjer, Hans M.; Van Leemput, Koen; Hansen, Rasmus H.; Andersen, Jon AL; Andreasen, Daniel
2014-12-01
Radiotherapy (RT) based on magnetic resonance imaging (MRI) as the only modality, so-called MRI-only RT, would remove the systematic registration error between MR and computed tomography (CT) and provide co-registered MRI for assessment of treatment response and adaptive RT. Electron densities, however, need to be assigned to the MRI images for dose calculation and for patient setup based on digitally reconstructed radiographs (DRRs). Here, we investigate the geometric and dosimetric performance of a number of popular voxel-based methods for generating a so-called pseudo CT (pCT). Five patients receiving cranial irradiation were included, each with a co-registered MRI and CT scan. An ultra-short echo time MRI sequence for bone visualization was used. Six methods were investigated, covering three popular types of voxel-based approaches: (1) threshold-based segmentation, (2) Bayesian segmentation and (3) statistical regression. Each approach contained two methods. Approach 1 used bulk density assignment of MRI voxels into air, soft tissue and bone based on logical masks and the transverse relaxation time T2 of the bone. Approach 2 used similar bulk density assignments with Bayesian statistics, including or excluding additional spatial information. Approach 3 used a statistical regression correlating MRI voxels with their corresponding CT voxels. A similar photon and proton treatment plan was generated for a target positioned between the nasal cavity and the brainstem for all patients. The agreement of each method's pCT with the CT was quantified and compared with the other methods geometrically and dosimetrically, using a number of previously reported metrics as well as some novel metrics introduced here. The best geometric agreement with CT was obtained with the statistical regression methods, which performed significantly better than the threshold and Bayesian segmentation methods (excluding spatial information). All methods agreed significantly better with CT than a reference water MRI comparison. The mean dosimetric deviation for photons and protons compared to the CT was about 2% and was highest in the gradient dose region of the brainstem. Both the threshold-based method and the statistical regression methods showed the highest dosimetric agreement. Generation of pCTs using statistical regression seems to be the most promising candidate for MRI-only RT of the brain. Further, the total amount of different tissues needs to be taken into account for dosimetric considerations regardless of their correct geometrical position.
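A minimal sketch of the bulk-assignment idea behind approach (1); the thresholds and Hounsfield unit values below are hypothetical, and the study's actual masks are derived from the UTE signal and the bone T2 rather than these placeholder rules.

```python
import numpy as np

def pseudo_ct_bulk(ute_image, t2_image, air_thresh=50.0, bone_t2_thresh=1.0):
    """Assign one representative CT number (HU) per tissue class from logical masks."""
    pct = np.full(ute_image.shape, 0.0)                     # start water-equivalent (0 HU)
    air_mask = ute_image < air_thresh                       # low UTE signal -> air
    bone_mask = (~air_mask) & (t2_image < bone_t2_thresh)   # short T2 -> bone
    pct[air_mask] = -1000.0                                 # HU of air
    pct[bone_mask] = 700.0                                  # representative HU for bone
    return pct
```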
Quantum chemical approach to estimating the thermodynamics of metabolic reactions.
Jinich, Adrian; Rappoport, Dmitrij; Dunn, Ian; Sanchez-Lengeling, Benjamin; Olivares-Amaya, Roberto; Noor, Elad; Even, Arren Bar; Aspuru-Guzik, Alán
2014-11-12
Thermodynamics plays an increasingly important role in modeling and engineering metabolism. We present the first nonempirical computational method for estimating standard Gibbs reaction energies of metabolic reactions based on quantum chemistry, which can help fill in the gaps in the existing thermodynamic data. When applied to a test set of reactions from core metabolism, the quantum chemical approach is comparable in accuracy to group contribution methods for isomerization and group transfer reactions and for reactions not including multiply charged anions. The errors in standard Gibbs reaction energy estimates are correlated with the charges of the participating molecules. The quantum chemical approach is amenable to systematic improvements and holds potential for providing thermodynamic data for all of metabolism.
The viability of ADVANTG deterministic method for synthetic radiography generation
NASA Astrophysics Data System (ADS)
Bingham, Andrew; Lee, Hyoung K.
2018-07-01
Fast simulation techniques to generate synthetic radiographic images at high resolution are helpful when new radiation imaging systems are designed. However, the standard stochastic approach requires lengthy run times, with poorer statistics at higher resolution. The viability of a deterministic approach to synthetic radiography image generation was therefore explored, with the aim of quantifying the decrease in computational time relative to the stochastic method. ADVANTG was compared to MCNP in multiple scenarios, including a small radiography system prototype, to simulate high-resolution radiography images. By using the ADVANTG deterministic code to simulate radiography images, the computational time was found to decrease by a factor of 10 to 13 compared to the MCNP stochastic approach while retaining image quality.
ERIC Educational Resources Information Center
Rabgay, Tshewang
2018-01-01
The study investigated the effect of using cooperative learning method on tenth grade students' learning achievement in biology and their attitude towards the subject in a Higher Secondary School in Bhutan. The study used a mixed method approach. The quantitative component included an experimental design where cooperative learning was the…
ERIC Educational Resources Information Center
Pinheiro, Sandro O.; Rohrer, Jonathan D.; Heimann, C. F. Larry
This paper describes a mixed method evaluation study that was developed to assess faculty teaching behavior change in a faculty development fellowship program for community-based hospital faculty. Principles of adult learning were taught to faculty participants over the fellowship period. These included instruction in teaching methods, group…
Vancomycin Dosing in Obese Patients: Special Considerations and Novel Dosing Strategies.
Durand, Cheryl; Bylo, Mary; Howard, Brian; Belliveau, Paul
2018-06-01
To review the literature regarding vancomycin pharmacokinetics in obese patients and strategies used to improve dosing in this population. PubMed, EMBASE (1974 to November 2017), and Google Scholar searches were conducted using the search terms vancomycin, obese, obesity, pharmacokinetics, strategy, and dosing. Additional articles were selected from reference lists of selected studies. Included articles were those published in English with a primary focus on vancomycin pharmacokinetic parameters in obese patients and practical vancomycin dosing strategies, clinical experiences, or challenges of dosing vancomycin in this population. Volume of distribution and clearance are the pharmacokinetic parameters that most often affect vancomycin dosing in obese patients; both are increased in this population. Challenges with dosing in obese patients include inconsistent and inadequate dosing, observations that the obese population may not be homogeneous, and reports of an increased likelihood of supratherapeutic trough concentrations. Investigators have revised and developed dosing and monitoring protocols to address these challenges. These approaches improved target trough attainment to varying degrees. Some of the vancomycin dosing approaches provided promising results in obese patients, but there were notable differences in methods used to develop these approaches, and sample sizes were small. Although some approaches can be considered for validation in individual institutions, further research is warranted. This may include validating approaches in larger populations with narrower obesity severity ranges, investigating target attainment in indication-specific target ranges, and evaluating the impact of different dosing weights and methods of creatinine clearance calculation.
Laplace Inversion of Low-Resolution NMR Relaxometry Data Using Sparse Representation Methods
Berman, Paula; Levi, Ofer; Parmet, Yisrael; Saunders, Michael; Wiesman, Zeev
2013-01-01
Low-resolution nuclear magnetic resonance (LR-NMR) relaxometry is a powerful tool that can be harnessed for characterizing constituents in complex materials. Conversion of the relaxation signal into a continuous distribution of relaxation components is an ill-posed inverse Laplace transform problem. The most common numerical method implemented today for dealing with this kind of problem is based on L2-norm regularization. However, sparse representation methods via L1 regularization and convex optimization are a relatively new approach for effective analysis and processing of digital images and signals. In this article, a numerical optimization method for analyzing LR-NMR data by including non-negativity constraints and L1 regularization and by applying a convex optimization solver PDCO, a primal-dual interior method for convex objectives, that allows general linear constraints to be treated as linear operators is presented. The integrated approach includes validation of analyses by simulations, testing repeatability of experiments, and validation of the model and its statistical assumptions. The proposed method provides better resolved and more accurate solutions when compared with those suggested by existing tools. © 2013 Wiley Periodicals, Inc. Concepts Magn Reson Part A 42A: 72–88, 2013. PMID:23847452
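Schematically, the inverse problem and the regularized formulation described above can be written as:

```latex
% The measured relaxation signal is a Laplace-type superposition of relaxation components,
y(t_i) \;=\; \sum_{j} K_{ij}\, f_j + \varepsilon_i,
\qquad K_{ij} \;=\; e^{-t_i / T_{2,j}},
% and the distribution f over relaxation times is recovered by solving the
% non-negativity-constrained, L1-regularized least-squares problem
\min_{f \ge 0}\; \tfrac{1}{2}\,\| K f - y \|_2^{2} \;+\; \lambda\, \| f \|_1 .
```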
Brown, Lynette; Green, Cherie L; Jones, Nicholas; Stewart, Jennifer J; Fraser, Stephanie; Howell, Kathy; Xu, Yuanxin; Hill, Carla G; Wiwi, Christopher A; White, Wendy I; O'Brien, Peter J; Litwin, Virginia
2015-03-01
The objective of this manuscript is to present an approach for evaluating specimen stability for flow cytometric methods used during drug development. While this approach specifically addresses stability assessment for assays to be used in clinical trials with centralized testing facilities, the concepts can be applied to any stability assessment for flow cytometric methods. The proposed approach is implemented during assay development and optimization, and includes suggestions for designing a stability assessment plan, data evaluation and acceptance criteria. Given that no single solution will be applicable in all scenarios, this manuscript offers the reader a roadmap for stability assessment and is intended to guide the investigator during both the method development phase and in the experimental design of the validation plan. Copyright © 2015 Elsevier B.V. All rights reserved.
Flight-Test Evaluation of Flutter-Prediction Methods
NASA Technical Reports Server (NTRS)
Lind, Rick; Brenner, Marty
2003-01-01
The flight-test community routinely spends considerable time and money to determine a range of flight conditions, called a flight envelope, within which an aircraft is safe to fly. The cost of determining a flight envelope could be greatly reduced if there were a method of safely and accurately predicting the speed associated with the onset of an instability called flutter. Several methods have been developed with the goal of predicting flutter speeds to improve the efficiency of flight testing. These methods include (1) data-based methods, in which one relies entirely on information obtained from the flight tests and (2) model-based approaches, in which one relies on a combination of flight data and theoretical models. The data-driven methods include one based on extrapolation of damping trends, one that involves an envelope function, one that involves the Zimmerman-Weissenburger flutter margin, and one that involves a discrete-time auto-regressive model. An example of a model-based approach is that of the flutterometer. These methods have all been shown to be theoretically valid and have been demonstrated on simple test cases; however, until now, they have not been thoroughly evaluated in flight tests. An experimental apparatus called the Aerostructures Test Wing (ATW) was developed to test these prediction methods.
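A hedged sketch of the simplest of the listed data-driven ideas, extrapolation of a damping trend: modal damping estimates from several test airspeeds are fit with a low-order polynomial and extrapolated to the zero-damping crossing. The numbers are hypothetical, and real flutter clearance layers far more conservatism and uncertainty handling on top of this.

```python
import numpy as np

speeds = np.array([150.0, 175.0, 200.0, 225.0, 250.0])     # hypothetical test points (KEAS)
damping = np.array([0.060, 0.055, 0.046, 0.033, 0.018])    # estimated modal damping ratios

# fit a quadratic damping trend and locate where it would cross zero
coeffs = np.polyfit(speeds, damping, deg=2)
roots = np.roots(coeffs)

# keep real roots beyond the tested range as candidate flutter-onset speeds
candidates = [r.real for r in roots if abs(r.imag) < 1e-9 and r.real > speeds[-1]]
print(min(candidates) if candidates else "no zero-damping crossing predicted")
```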
EPA is working to develop methods and guidance to manage and clean up contaminated land, groundwater and nutrient pollution as well as develop innovative approaches to managing materials and waste including energy recovery.
The role of simulation in mixed-methods research: a framework & application to patient safety.
Guise, Jeanne-Marie; Hansen, Matthew; Lambert, William; O'Brien, Kerth
2017-05-04
Research in patient safety is an important area of health services research and is a national priority. It is challenging to investigate rare occurrences, explore potential causes, and account for the complex, dynamic context of healthcare - yet all are required in patient safety research. Simulation technologies have become widely accepted as education and clinical tools, but have yet to become a standard tool for research. We developed a framework for research that integrates accepted patient safety models with mixed-methods research approaches and describe the performance of the framework in a working example of a large National Institutes of Health (NIH)-funded R01 investigation. This worked example of a framework in action identifies the strengths and limitations of qualitative and quantitative research approaches commonly used in health services research. Each approach builds essential layers of knowledge. We describe how the use of simulation ties these layers of knowledge together and adds new and unique dimensions of knowledge. A mixed-methods research approach that includes simulation provides a broad multi-dimensional approach to health services and patient safety research.
A Constructivist Approach to HIV/AIDS Education for Women Within the Maritime Provinces of Canada
ERIC Educational Resources Information Center
Bulman, Donna E.
2005-01-01
The primary objective of this research was to increase understanding of how women in the Maritime Provinces of Canada learn about the HIV/AIDS epidemic. This research utilised a qualitative approach with specific methods including interviews, joint interviews and focus groups. Overall 44 women participated in this research. The data was analysed…
Which Types of Leadership Styles Do Followers Prefer? A Decision Tree Approach
ERIC Educational Resources Information Center
Salehzadeh, Reza
2017-01-01
Purpose: The purpose of this paper is to propose a new method to find the appropriate leadership styles based on the followers' preferences using the decision tree technique. Design/methodology/approach: The statistical population comprises students of the University of Isfahan. In total, 750 questionnaires were distributed, of which 680…
The Express-Lane Edit: Making Editing Useful for Young Adolescents
ERIC Educational Resources Information Center
Anderson, Jeff
2008-01-01
Editing is a powerful tool for writers, but are our methods of teaching it really demonstrating that power for young adolescents? The author, frustrated with students' inability to edit, blames his own approach and, beginning with a grocery store epiphany, works to develop a more effective system. Elements of his successful approach include time…
Studying Distance Students: Methods, Findings, Actions
ERIC Educational Resources Information Center
Wahl, Diane; Avery, Beth; Henry, Lisa
2013-01-01
University of North Texas (UNT) Libraries began studying the library needs of distance learners in 2009 using a variety of approaches to explore and confirm these needs as well as obtain input into how to meet them. Approaches used to date include analysis of both quantitative and qualitative responses by online students to the LibQUAL+[R] surveys…
ERIC Educational Resources Information Center
Hulstijn, Jan H.; Young, Richard F.; Ortega, Lourdes; Bigelow, Martha; DeKeyser, Robert; Ellis, Nick C.; Lantolf, James P.; Mackey, Alison; Talmy, Steven
2014-01-01
For some, research in learning and teaching of a second language (L2) runs the risk of disintegrating into irreconcilable approaches to L2 learning and use. On the one side, we find researchers investigating linguistic-cognitive issues, often using quantitative research methods including inferential statistics; on the other side, we find…
Tromson, Clara; Bulle, Cécile; Deschênes, Louise
2017-03-01
In life cycle assessment (LCA), the potential terrestrial ecotoxicity effect of metals, calculated as the effect factor (EF), is usually extrapolated from aquatic ecotoxicological data using the equilibrium partitioning method (EqP), as such data are more readily available than terrestrial data. However, when following the AMI recommendations (i.e., with enough species to represent at least three different phyla), there are not enough terrestrial data for which soil properties or metal speciation during ecotoxicological testing are specified to account for the influence of soil property variations on metal speciation when using this approach. Alternatively, the TBLM (Terrestrial Biotic Ligand Model) has been used to determine an EF that accounts for speciation, but it is not available for all metals; hence it cannot be consistently applied in an LCA context. This paper proposes an approach to include metal speciation by regionalizing the EqP method for Cu, Ni and Zn with a geochemical speciation model (the Windermere Humic Aqueous Model 7.0), for 5213 soils selected from the Harmonized World Soil Database. Results obtained by this approach (EF_EqP,regionalized) are compared to the EFs calculated with the conventional EqP method, to the EFs based on available terrestrial data and to the EFs calculated with the TBLM (EF_TBLM,regionalized) when available. The contribution of the spatial variability of the EF to the overall spatial variability of the characterization factor (CF) has been analyzed. It was found that the EF_EqP,regionalized values show significant spatial variability. The EFs calculated with the two non-regionalized methods (EqP and terrestrial data) fall within the range of the EF_EqP,regionalized values. The EF_TBLM,regionalized values cover a larger range than the EF_EqP,regionalized values, but the two methods are not correlated. This paper highlights the importance of including speciation in the terrestrial EF and shows that using the regionalized EqP approach is not an acceptable proxy for terrestrial ecotoxicological data even if it can be applied to all metals. Copyright © 2016. Published by Elsevier B.V.
Design of compound libraries for fragment screening
NASA Astrophysics Data System (ADS)
Blomberg, Niklas; Cosgrove, David A.; Kenny, Peter W.; Kolmodin, Karin
2009-08-01
Approaches to the design of libraries for fragment screening are illustrated with reference to a 20 k generic fragment screening library and a 1.2 k generic NMR screening library. Tools and methods for library design that have been developed within AstraZeneca are described, including Foyfi fingerprints and the Flush program for neighborhood characterization. It will be shown how Flush and the BigPicker, which selects maximally diverse sets of compounds, are used to apply the Core and Layer method for library design. Approaches to partitioning libraries into cocktails are also described.
NASA Technical Reports Server (NTRS)
Miller, G.; Heimann, Paula J.; Scheiman, Daniel A.; Duffy, Kirsten P.; Johnston, J. Chris; Roberts, Gary D.
2013-01-01
Vibration mitigation in composite structures has been demonstrated through widely varying methods, which include both active and passive damping. Recently, nanomaterials have been investigated as a viable approach to composite vibration damping due to the large surface area available to generate energy dissipation through friction. This work evaluates the influence of dispersed nanoparticles on the damping ratio of an epoxy matrix. Limited benefit was observed with dispersion methods; however, applying the nanoparticles as a coating resulted in up to a three-fold increase in damping.
NASA Astrophysics Data System (ADS)
Mikhailov, S. Ia.; Tumatov, K. I.
The paper compares the results obtained using two methods to calculate the amplitude of a short-wave signal field incident on or reflected from a perfectly conducting earth. A technique is presented for calculating the geometric characteristics of the field based on the waveguide approach. It is shown that applying an extended system of characteristic equations to calculate the field amplitude is inadmissible in models that include discontinuities in the second derivatives of the permittivity, unless a suitable treatment of the discontinuity points is applied.
A mean field approach to the Ising chain in a transverse magnetic field
NASA Astrophysics Data System (ADS)
Osácar, C.; Pacheco, A. F.
2017-07-01
We evaluate a mean field method to describe the properties of the ground state of the Ising chain in a transverse magnetic field. Specifically, a method of the Bethe-Peierls type is used, solving spin blocks with a self-consistency condition at the borders. The computations include the critical point of the phase transition, the magnetisation exponent and the energy density. All results are obtained using basic quantum mechanics at an undergraduate level. The advantages and the limitations of the approach are emphasised.
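For orientation, a simpler single-site mean-field sketch (not the Bethe-Peierls block construction used in the paper) shows the self-consistency idea. The Hamiltonian convention H = -J Σ σ^z_i σ^z_{i+1} - h Σ σ^x_i and the fixed-point iteration below are assumptions for illustration only.

```python
import numpy as np

def mean_field_magnetisation(J, h, m0=0.5, tol=1e-10, max_iter=10000):
    """Self-consistent longitudinal magnetisation of the transverse-field Ising chain.

    Each spin feels an effective field 2*J*m from its two neighbours, so the
    single-site ground state gives m = 2*J*m / sqrt((2*J*m)**2 + h**2).
    """
    m = m0
    for _ in range(max_iter):
        field = 2.0 * J * m
        m_new = field / np.hypot(field, h) if (field or h) else 0.0
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

J = 1.0
for h in [0.5, 1.0, 1.5, 2.0, 2.5]:
    print(f"h = {h:.1f}  m = {mean_field_magnetisation(J, h):.4f}")
# This mean field places the transition at h = 2J, whereas the exact result is h = J,
# illustrating the kind of limitation the paper discusses.
```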
Human Pose Estimation from Monocular Images: A Comprehensive Survey
Gong, Wenjuan; Zhang, Xuena; Gonzàlez, Jordi; Sobral, Andrews; Bouwmans, Thierry; Tu, Changhe; Zahzah, El-hadi
2016-01-01
Human pose estimation refers to the estimation of the location of body parts and how they are connected in an image. Human pose estimation from monocular images has wide applications (e.g., image indexing). Several surveys on human pose estimation can be found in the literature, but they focus on a certain category; for example, model-based approaches or human motion analysis, etc. As far as we know, an overall review of this problem domain has yet to be provided. Furthermore, recent advancements based on deep learning have brought novel algorithms for this problem. In this paper, a comprehensive survey of human pose estimation from monocular images is carried out including milestone works and recent advancements. Based on one standard pipeline for the solution of computer vision problems, this survey splits the problem into several modules: feature extraction and description, human body models, and modeling methods. Problem modeling methods are approached based on two means of categorization in this survey. One way to categorize includes top-down and bottom-up methods, and another way includes generative and discriminative methods. Considering the fact that one direct application of human pose estimation is to provide initialization for automatic video surveillance, there are additional sections for motion-related methods in all modules: motion features, motion models, and motion-based methods. Finally, the paper also collects 26 publicly available data sets for validation and provides error measurement methods that are frequently used. PMID:27898003
Intrinsic ethics regarding integrated assessment models for climate management.
Schienke, Erich W; Baum, Seth D; Tuana, Nancy; Davis, Kenneth J; Keller, Klaus
2011-09-01
In this essay we develop and argue for the adoption of a more comprehensive model of research ethics than is included within current conceptions of responsible conduct of research (RCR). We argue that our model, which we label the ethical dimensions of scientific research (EDSR), is a more comprehensive approach to encouraging ethically responsible scientific research compared to the currently typically adopted approach in RCR training. This essay focuses on developing a pedagogical approach that enables scientists to better understand and appreciate one important component of this model, what we call intrinsic ethics. Intrinsic ethical issues arise when values and ethical assumptions are embedded within scientific findings and analytical methods. Through a close examination of a case study and its application in teaching, namely, evaluation of climate change integrated assessment models, this paper develops a method and case for including intrinsic ethics within research ethics training to provide scientists with a comprehensive understanding and appreciation of the critical role of values and ethical choices in the production of research outcomes.
Starting from the bench--prevention and control of foodborne and zoonotic diseases.
Vongkamjan, Kitiya; Wiedmann, Martin
2015-02-01
Foodborne diseases are estimated to cause around 50 million disease cases and 3000 deaths a year in the US. Worldwide, food and waterborne diseases are estimated to cause more than 2 million deaths per year. Lab-based research is a key component of efforts to prevent and control foodborne diseases. Over the last two decades, molecular characterization of pathogen isolates has emerged as a key component of foodborne and zoonotic disease prevention and control. Characterization methods have evolved from banding pattern-based subtyping methods to sequence-based approaches, including full genome sequencing. Molecular subtyping methods not only play a key role in characterizing pathogen transmission and detecting disease outbreaks, but also allow for identification of clonal pathogen groups that show distinct transmission characteristics. Importantly, the data generated from molecular characterization of foodborne pathogens also represent critical inputs for epidemiological and modeling studies. Continued and enhanced collaborations between infectious disease-related laboratory sciences and epidemiologists, modelers, and other quantitative scientists will be critical to a One-Health approach that delivers societal benefits, including improved surveillance systems and prevention approaches for zoonotic and foodborne pathogens. Copyright © 2014 Elsevier B.V. All rights reserved.
An overview: modern techniques for railway vehicle on-board health monitoring systems
NASA Astrophysics Data System (ADS)
Li, Chunsheng; Luo, Shihui; Cole, Colin; Spiryagin, Maksym
2017-07-01
Health monitoring systems with low-cost sensor networks and smart algorithms are always needed in both passenger trains and heavy haul trains due to the increasing need for reliability and safety in the railway industry. This paper focuses on an overview of existing approaches applied for railway vehicle on-board health monitoring systems. The approaches applied in the data measurement systems and the data analysis systems in railway on-board health monitoring systems are presented in this paper, including methodologies, theories and applications. The pros and cons of the various approaches are analysed to determine appropriate benchmarks for an effective and efficient railway vehicle on-board health monitoring system. According to this review, inertial sensors are the most popular due to their advantages of low cost, robustness and low power consumption. Linearisation methods are required for the model-based methods which would inevitably introduce error to the estimation results, and it is time-consuming to include all possible conditions in the pre-built database required for signal-based methods. Based on this review, future development trends in the design of new low-cost health monitoring systems for railway vehicles are discussed.
Operational Retrievals of Evapotranspiration: Are we there yet?
NASA Astrophysics Data System (ADS)
Neale, C. M. U.; Anderson, M. C.; Hain, C.; Schull, M.; Isidro, C., Sr.; Goncalves, I. Z.
2017-12-01
Remote sensing based retrievals of evapotranspiration (ET) have progressed significantly over the last two decades with the improvement of methods and algorithms and the availability of multiple satellite sensors with shortwave and thermal infrared bands on polar orbiting platforms. The modeling approaches include simpler vegetation index (VI) based methods, such as the reflectance-based crop coefficient approach coupled with surface reference evapotranspiration estimates to derive actual evapotranspiration of crops, or direct inputs to the Penman-Monteith equation through VI relationships with certain input variables. Methods that are more complex include one-layer or two-layer energy balance approaches that make use of both shortwave and longwave spectral band information to estimate different inputs to the energy balance equation. These models mostly differ in the estimation of sensible heat fluxes. For continental and global scale applications, other satellite-based products such as solar radiation, vegetation leaf area and cover are used as inputs, along with gridded re-analysis weather information. This presentation will review the state-of-the-art in satellite-based evapotranspiration estimation, giving examples of existing efforts to obtain operational ET retrievals over continental and global scales and discussing difficulties and challenges.
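A minimal sketch of the reflectance-based crop coefficient idea mentioned above: a crop coefficient Kc is estimated from a vegetation index and multiplied by reference ET. The linear Kc-NDVI relation and its coefficients are placeholder assumptions, since operational coefficients are crop- and sensor-specific.

```python
import numpy as np

def actual_et(ndvi, et_ref, slope=1.25, intercept=-0.15):
    """Reflectance-based crop coefficient approach (illustrative coefficients).

    Kc is assumed to scale linearly with NDVI; actual ET is Kc times the
    reference (well-watered grass or alfalfa) evapotranspiration.
    """
    kc = np.clip(slope * ndvi + intercept, 0.0, 1.25)
    return kc * et_ref

ndvi = np.array([0.20, 0.45, 0.70, 0.85])   # derived from red/NIR reflectances
et_ref = 6.0                                # reference ET in mm/day (weather based)
print(actual_et(ndvi, et_ref))              # estimated crop ET in mm/day
```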
Probabilistic analysis of wind-induced vibration mitigation of structures by fluid viscous dampers
NASA Astrophysics Data System (ADS)
Chen, Jianbing; Zeng, Xiaoshu; Peng, Yongbo
2017-11-01
High-rise buildings often suffer from excessively large wind-induced vibrations, and thus vibration control systems might be necessary. Fluid viscous dampers (FVDs) with a nonlinear power law against velocity are widely employed. With the transition of design methods from traditional frequency-domain approaches to more refined direct time-domain approaches, time integration of these systems can become difficult. In the present paper, the underlying reason for the difficulty is first revealed by identifying that the equations of motion of high-rise buildings installed with FVDs are sometimes stiff differential equations. An approach effective for stiff differential systems, i.e., the backward differentiation formula (BDF), is then introduced and verified to be effective for the equation of motion of wind-induced vibration controlled systems. Comparative studies are performed among some methods, including the Newmark method, the KR-alpha method, the energy-based linearization method and the statistical linearization method. Based on the above results, a 20-story steel frame structure is taken as a practical example. Particularly, the randomness of structural parameters and of wind loading input is emphasized. The extreme values of the responses are examined, showing the effectiveness of the proposed approach and demonstrating the need for refined probabilistic analysis in the design of wind-induced vibration mitigation systems.
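A minimal sketch of why an implicit stiff solver helps, assuming a single-degree-of-freedom structure with a power-law fluid viscous damper and a crude harmonic stand-in for the wind load; the parameter values are placeholders, not the 20-story example of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative SDOF structure with a power-law fluid viscous damper:
#   m*a + c_d*sign(v)*|v|**alpha + k*x = F(t)
m, k = 2.0e5, 8.0e6          # mass (kg) and stiffness (N/m), placeholder values
c_d, alpha = 4.0e5, 0.35     # damper coefficient and velocity exponent
F0, omega = 5.0e4, 1.2       # crude harmonic stand-in for a wind load

def rhs(t, y):
    x, v = y
    damper = c_d * np.sign(v) * np.abs(v) ** alpha
    return [v, (F0 * np.sin(omega * t) - damper - k * x) / m]

# A small velocity exponent makes the damper force change sharply near v = 0,
# which is what makes the system stiff; an implicit BDF solver handles this.
sol = solve_ivp(rhs, (0.0, 60.0), [0.0, 0.0], method="BDF",
                max_step=0.05, rtol=1e-6, atol=1e-9)
print("peak displacement (m):", np.max(np.abs(sol.y[0])))
```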
Life course approach in social epidemiology: an overview, application and future implications.
Cable, Noriko
2014-01-01
The application of the life course approach to social epidemiology has helped epidemiologists theoretically examine social gradients in population health. Longitudinal data with rich contextual information collected repeatedly and advanced statistical approaches have made this challenging task easier. This review paper provides an overview of the life course approach in epidemiology, its research application, and future challenges. In summary, a systematic approach to methods, including theoretically guided measurement of socioeconomic position, would assist researchers in gathering evidence for reducing social gradients in health, and collaboration across individual disciplines will make this task achievable.
Diffendorfer, James E.; Beston, Julie A.; Merrill, Matthew; Stanton, Jessica C.; Corum, Margo D.; Loss, Scott R.; Thogmartin, Wayne E.; Johnson, Douglas H.; Erickson, Richard A.; Heist, Kevin W.
2016-01-01
For this study, a methodology was developed for assessing impacts of wind energy generation on populations of birds and bats at regional to national scales. The approach combines existing methods in applied ecology for prioritizing species in terms of their potential risk from wind energy facilities and estimating impacts of fatalities on population status and trend caused by collisions with wind energy infrastructure. Methods include a qualitative prioritization approach, demographic models, and potential biological removal. The approach can be used to prioritize species in need of more thorough study as well as to identify species with minimal risk. However, the components of this methodology require simplifying assumptions and the data required may be unavailable or of poor quality for some species. These issues should be carefully considered before using the methodology. The approach will increase in value as more data become available and will broaden the understanding of anthropogenic sources of mortality on bird and bat populations.
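One of the components named above, potential biological removal, has a simple closed form; the sketch below uses a standard formulation (PBR = N_min x R_max/2 x F_r) with placeholder inputs rather than the species-specific values estimated in the study.

```python
def potential_biological_removal(n_min, r_max, recovery_factor):
    """PBR = N_min * (R_max / 2) * F_r  (one standard formulation).

    n_min            conservative (minimum) population estimate
    r_max            maximum intrinsic population growth rate
    recovery_factor  F_r in (0, 1], smaller for species of higher concern
    """
    return n_min * 0.5 * r_max * recovery_factor

# Placeholder numbers for an illustrative bat population.
print(potential_biological_removal(n_min=100_000, r_max=0.10, recovery_factor=0.5))
# -> 2500 individuals per year that could be removed without preventing recovery
```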
Improving Anatomic Pathology in Sub-Saharan Africa to Support Cancer Care.
Wilson, Michael L; Ayers, Stephanie; Berney, Daniel; Eslan, Alexia; Guarner, Jeannette; Lester, Susan; Masia, Ricard; Moloo, Zahir; Mutuku, Angela; Roberts, Drucilla; Stall, Jennifer; Sayed, Shahin
2018-03-07
Cancer care requires both accurate pathologic diagnosis as well as pathologic cancer staging. We evaluated three approaches to training pathologists in sub-Saharan Africa to perform pathologic cancer staging of breast, cervix, prostate, and colorectal cancers. One of three training methods was used at each workshop: didactic, case-based testing (CBT), or a blended approach. The project involved 52 participants from 16 pathology departments in 11 countries in East, Central, and Southern Africa. Evaluation of each method included pre- and postworkshop knowledge assessments, online pre- and postworkshop surveys of practice changes at the individual and institutional levels, and selected site visits. While CBT resulted in the highest overall average postassessment individual scores, both CBT and blended approaches resulted in 19% increases in average scores from pre- to postworkshop assessments. Institutions that participated in the blended workshop had increased changes in practice as indicated by the institutional survey. Both CBT and a blended approach are effective methods for training pathologists in pathologic cancer staging. Both are superior to traditional lectures alone.
NASA Technical Reports Server (NTRS)
Oconnell, R. F.; Hassig, H. J.; Radovcich, N. A.
1976-01-01
Results of a study of the development of flutter modules applicable to automated structural design of advanced aircraft configurations, such as a supersonic transport, are presented. Automated structural design is restricted to automated sizing of the elements of a given structural model. It includes a flutter optimization procedure; i.e., a procedure for arriving at a structure with minimum mass for satisfying flutter constraints. Methods of solving the flutter equation and computing the generalized aerodynamic force coefficients in the repetitive analysis environment of a flutter optimization procedure are studied, and recommended approaches are presented. Five approaches to flutter optimization are explained in detail and compared. An approach to flutter optimization incorporating some of the methods discussed is presented. Problems related to flutter optimization in a realistic design environment are discussed and an integrated approach to the entire flutter task is presented. Recommendations for further investigations are made. Results of numerical evaluations, applying the five methods of flutter optimization to the same design task, are presented.
Holovachov, Oleksandr
2016-01-01
Metabarcoding is becoming a common tool used to assess and compare diversity of organisms in environmental samples. Identification of OTUs is one of the critical steps in the process, and several taxonomy assignment methods have been proposed to accomplish this task. This publication evaluates the quality of reference datasets, alongside several alignment and phylogeny inference methods used in one of the taxonomy assignment methods, called the tree-based approach. This approach assigns anonymous OTUs to taxonomic categories based on the relative placements of OTUs and reference sequences on the cladogram and the support that these placements receive. In the tree-based taxonomy assignment approach, reliable identification of anonymous OTUs is based on their placement in monophyletic and highly supported clades together with identified reference taxa. Therefore, it requires a high-quality reference dataset. The resolution of phylogenetic trees is strongly affected by the presence of erroneous sequences as well as by the alignment and phylogeny inference methods used in the process. Two preparation steps are essential for the successful application of the tree-based taxonomy assignment approach. Curated collections of genetic information do include erroneous sequences. These sequences have a detrimental effect on the resolution of cladograms used in the tree-based approach. They must be identified and excluded from the reference dataset beforehand. Various combinations of multiple sequence alignment and phylogeny inference methods provide cladograms with different topology and bootstrap support. These combinations of methods need to be tested in order to determine the one that gives the highest resolution for the particular reference dataset. Completing the above-mentioned preparation steps is expected to decrease the number of unassigned OTUs and thus improve the results of the tree-based taxonomy assignment approach.
Stochastic and deterministic multiscale models for systems biology: an auxin-transport case study.
Twycross, Jamie; Band, Leah R; Bennett, Malcolm J; King, John R; Krasnogor, Natalio
2010-03-26
Stochastic and asymptotic methods are powerful tools in developing multiscale systems biology models; however, little has been done in this context to compare the efficacy of these methods. The majority of current systems biology modelling research, including that of auxin transport, uses numerical simulations to study the behaviour of large systems of deterministic ordinary differential equations, with little consideration of alternative modelling frameworks. In this case study, we solve an auxin-transport model using analytical methods, deterministic numerical simulations and stochastic numerical simulations. Although the three approaches in general predict the same behaviour, the approaches provide different information that we use to gain distinct insights into the modelled biological system. We show in particular that the analytical approach readily provides straightforward mathematical expressions for the concentrations and transport speeds, while the stochastic simulations naturally provide information on the variability of the system. Our study provides a constructive comparison which highlights the advantages and disadvantages of each of the considered modelling approaches. This will prove helpful to researchers when weighing up which modelling approach to select. In addition, the paper goes some way to bridging the gap between these approaches, which in the future we hope will lead to integrative hybrid models.
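Not the auxin-transport model itself, but a minimal sketch of the contrast drawn above: the same first-order transport/decay step solved as a deterministic ODE and as a Gillespie stochastic simulation. The rate constant and molecule count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
k = 0.1            # first-order rate of leaving the compartment (1/s), illustrative
n0, t_end = 200, 40.0

# Deterministic ODE: dn/dt = -k*n  =>  n(t) = n0*exp(-k*t)
t_grid = np.linspace(0.0, t_end, 200)
n_det = n0 * np.exp(-k * t_grid)

# Gillespie stochastic simulation of the same reaction.
def gillespie(n0, k, t_end):
    t, n = 0.0, n0
    times, counts = [t], [n]
    while n > 0 and t < t_end:
        t += rng.exponential(1.0 / (k * n))   # waiting time to the next event
        n -= 1
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

times, counts = gillespie(n0, k, t_end)
print("deterministic n(t_end):", n_det[-1])
print("one stochastic run n(t_end):", counts[times <= t_end][-1])
```

The deterministic curve gives the mean behaviour, while repeated stochastic runs expose the variability the abstract highlights.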
Quantification of Uncertainty in the Flood Frequency Analysis
NASA Astrophysics Data System (ADS)
Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.
2017-12-01
Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to variability in sample representation, selection of the distribution, and estimation of the distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of a prediction interval as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in FFA uses a multi-objective optimization approach to construct the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and at Banff, Canada) are used. The major focus of the present study was to evaluate the changes in the magnitude of flood quantiles due to the recent extreme flood event that occurred in 2013. In addition, the efficacy of the proposed method was further verified using standard bootstrap-based sampling approaches, and it was found that the proposed method is reliable in modeling extreme floods as compared to the bootstrap methods.
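The multi-objective optimization scheme is not reproduced here; the sketch below shows the bootstrap-based baseline it is compared against: fit a distribution to annual maxima and bootstrap the 100-year quantile to obtain an interval. The synthetic record and the choice of a Gumbel distribution are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic annual-maximum flows (m^3/s); a real study would use gauge records.
annual_max = stats.gumbel_r.rvs(loc=800.0, scale=250.0, size=60, random_state=rng)

return_period = 100.0
p = 1.0 - 1.0 / return_period          # non-exceedance probability of the T-year flood

def q100(sample):
    loc, scale = stats.gumbel_r.fit(sample)
    return stats.gumbel_r.ppf(p, loc=loc, scale=scale)

point = q100(annual_max)
boot = np.array([q100(rng.choice(annual_max, size=annual_max.size, replace=True))
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"100-year flood: {point:.0f} m^3/s  (95% bootstrap interval {lo:.0f}-{hi:.0f})")
```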
Fast, accurate semiempirical molecular orbital calculations for macromolecules
NASA Astrophysics Data System (ADS)
Dixon, Steven L.; Merz, Kenneth M., Jr.
1997-07-01
A detailed review of the semiempirical divide-and-conquer (D&C) method is given, including a new approach to subsetting, which involves dual buffer regions. Comparisons are drawn between this method and other semiempirical macromolecular schemes. D&C calculations are carried out using a basic 32 Mbyte memory workstation on a variety of peptide systems, including proteins containing up to 1960 atoms. Aspects of storage and SCF convergence are addressed, and parallelization of the D&C algorithm is discussed.
An inviscid-viscous interaction approach to the calculation of dynamic stall initiation on airfoils
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cebeci, T.; Platzer, M.F.; Jang, H.M.
An interactive boundary-layer method is described for computing unsteady incompressible flow over airfoils, including the initiation of dynamic stall. The inviscid unsteady panel method developed by Platzer and Teng is extended to include viscous effects. The solutions of the boundary-layer equations are obtained with an inverse finite-difference method employing an interaction law based on the Hilbert integral, and the algebraic eddy-viscosity formulation of Cebeci and Smith. The method is applied to airfoils subject to periodic and ramp-type motions and its abilities are examined for a range of angles of attack, reduced frequency, and pitch rate.
Using groundwater levels to estimate recharge
Healy, R.W.; Cook, P.G.
2002-01-01
Accurate estimation of groundwater recharge is extremely important for proper management of groundwater systems. Many different approaches exist for estimating recharge. This paper presents a review of methods that are based on groundwater-level data. The water-table fluctuation method may be the most widely used technique for estimating recharge; it requires knowledge of specific yield and changes in water levels over time. Advantages of this approach include its simplicity and an insensitivity to the mechanism by which water moves through the unsaturated zone. Uncertainty in estimates generated by this method relate to the limited accuracy with which specific yield can be determined and to the extent to which assumptions inherent in the method are valid. Other methods that use water levels (mostly based on the Darcy equation) are also described. The theory underlying the methods is explained. Examples from the literature are used to illustrate applications of the different methods.
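A minimal sketch of the water-table fluctuation calculation described above, assuming a specific yield and a simplified rise-detection rule (the full method extrapolates the antecedent recession before measuring each rise); the water-level record is invented for illustration.

```python
import numpy as np

specific_yield = 0.15                       # assumed; commonly on the order of 0.01-0.30
# Daily water-table elevations in metres (illustrative record with two recharge events).
head = np.array([10.00, 10.00, 10.12, 10.30, 10.27, 10.25, 10.24,
                 10.40, 10.55, 10.52, 10.50])

# Water-table fluctuation method: recharge = Sy * (sum of water-level rises).
rises = np.diff(head)
total_rise = rises[rises > 0].sum()
recharge = specific_yield * total_rise
print(f"estimated recharge: {recharge * 1000:.1f} mm")
```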
Multiple shooting shadowing for sensitivity analysis of chaotic dynamical systems
NASA Astrophysics Data System (ADS)
Blonigan, Patrick J.; Wang, Qiqi
2018-02-01
Sensitivity analysis methods are important tools for research and design with simulations. Many important simulations exhibit chaotic dynamics, including scale-resolving turbulent fluid flow simulations. Unfortunately, conventional sensitivity analysis methods are unable to compute useful gradient information for long-time-averaged quantities in chaotic dynamical systems. Sensitivity analysis with least squares shadowing (LSS) can compute useful gradient information for a number of chaotic systems, including simulations of chaotic vortex shedding and homogeneous isotropic turbulence. However, this gradient information comes at a very high computational cost. This paper presents multiple shooting shadowing (MSS), a more computationally efficient shadowing approach than the original LSS approach. Through an analysis of the convergence rate of MSS, it is shown that MSS can have lower memory usage and run time than LSS.
Bunck, C.M.; Chen, C.-L.; Pollock, K.H.
1995-01-01
Traditional methods of estimating survival from radio-telemetry studies use either the Trent-Rongstad approach (Trent and Rongstad 1974, Heisey and Fuller 1985) or the Kaplan-Meier approach (Kaplan and Meier 1958; Pollock et al. 1989a,b). Both methods appear to require the assumption that relocation probability for animals with a functioning radio is 1. In practice this may not always be reasonable and, in fact, is unnecessary. The number of animals at risk (i.e., risk set) can be modified to account for uncertain relocation of individuals. This involves including only relocated animals in the risk set instead of also including animals not relocated but that were seen later. Simulation results show that estimators and tests for comparing survival curves should be based on this modification.
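A small sketch of the modified risk set in a Kaplan-Meier-style calculation: at each occasion only animals actually relocated are counted at risk. The weekly relocation codes and the discrete-occasion treatment are illustrative assumptions, not the estimators analyzed in the simulations.

```python
import numpy as np

# Illustrative weekly telemetry data: one row per animal, one column per occasion.
#   1 = relocated alive, 0 = relocated dead, -1 = not relocated that week.
obs = np.array([
    [ 1,  1,  1,  0, -1, -1],
    [ 1,  1, -1,  1,  1,  1],
    [ 1, -1,  1,  1,  0, -1],
    [ 1,  1,  1,  1,  1,  1],
    [ 1,  1,  1, -1,  1,  1],
])

survival = 1.0
for week in range(obs.shape[1]):
    col = obs[:, week]
    at_risk = np.sum(col >= 0)        # modified risk set: relocated animals only
    deaths = np.sum(col == 0)
    if at_risk > 0:
        survival *= 1.0 - deaths / at_risk
    print(f"week {week + 1}: at risk {at_risk}, deaths {deaths}, S = {survival:.3f}")
```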
Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi
2017-01-01
Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
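The specific threshold definition proposed in the paper is not reproduced here; the sketch below shows one generic way such a limit can be derived from characterized background noise (mean plus k standard deviations of read counts at known-negative positions), with invented counts and an assumed coverage factor.

```python
import numpy as np

# Illustrative background-noise read counts observed at positions where no
# true allele is expected (values are made up for demonstration).
noise_counts = np.array([0, 1, 0, 2, 3, 1, 0, 1, 4, 2, 1, 0, 2, 1, 3])

k = 3.0   # coverage factor; the choice of k drives the false-positive rate
analytical_threshold = noise_counts.mean() + k * noise_counts.std(ddof=1)
print(f"analytical threshold: {analytical_threshold:.1f} reads")
# Calls with read counts at or below this value would be treated as noise.
```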
Mioni, Roberto; Marega, Alessandra; Lo Cicero, Marco; Montanaro, Domenico
2016-11-01
The approach to acid-base chemistry in medicine includes several methods. Currently, the two most popular procedures are derived from Stewart's studies and from the bicarbonate/BE-based classical formulation. Another method, unfortunately little known, follows the Kildeberg theory applied to acid-base titration. By using the data produced by Dana Atchley in 1933, regarding electrolytes and blood gas analysis applied to diabetes, we compared the three aforementioned methods, in order to highlight their strengths and their weaknesses. The results obtained, by reprocessing the data of Atchley, have shown that Kildeberg's approach, unlike the other two methods, is consistent, rational and complete for describing the organ-physiological behavior of the hydrogen ion turnover in human organism. In contrast, the data obtained using the Stewart approach and the bicarbonate-based classical formulation are misleading and fail to specify which organs or systems are involved in causing or maintaining the diabetic acidosis. Stewart's approach, despite being considered 'quantitative', does not propose in any way the concept of 'an amount of acid' and becomes even more confusing, because it is not clear how to distinguish between 'strong' and 'weak' ions. As for Stewart's approach, the classical method makes no distinction between hydrogen ions managed by the intermediate metabolism and hydroxyl ions handled by the kidney, but, at least, it is based on the concept of titration (base-excess) and indirectly defines the concept of 'an amount of acid'. In conclusion, only Kildeberg's approach offers a complete understanding of the causes and remedies against any type of acid-base disturbance.
Gorban, A N; Mirkes, E M; Zinovyev, A
2016-12-01
Most machine learning approaches have stemmed from the principle of minimizing the mean squared distance, based on computationally efficient quadratic optimization methods. However, when faced with high-dimensional and noisy data, quadratic error functionals demonstrate many weaknesses, including high sensitivity to contaminating factors and the curse of dimensionality. Therefore, many recent applications in machine learning exploit properties of non-quadratic error functionals based on the L1 norm or even sub-linear potentials corresponding to quasinorms Lp (0 < p < 1).
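A small sketch of the contrast drawn above: an ordinary least-squares (quadratic error) line fit versus a least-absolute-deviation (L1) fit on synthetic data with a single contaminating outlier; the data and optimizer choice are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 30)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, x.size)
y[25] += 30.0                       # one contaminating outlier

# Quadratic error (ordinary least squares).
slope_l2, intercept_l2 = np.polyfit(x, y, 1)

# L1 error (least absolute deviations), minimised numerically.
def l1_loss(params):
    a, b = params
    return np.abs(y - (a * x + b)).sum()

a_l1, b_l1 = minimize(l1_loss, x0=[slope_l2, intercept_l2], method="Nelder-Mead").x
print(f"L2 fit: slope {slope_l2:.2f}, intercept {intercept_l2:.2f}")
print(f"L1 fit: slope {a_l1:.2f}, intercept {b_l1:.2f}  (typically closer to the true 2.0 and 1.0)")
```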
Uncertainty estimation and multi sensor fusion for kinematic laser tracker measurements
NASA Astrophysics Data System (ADS)
Ulrich, Thomas
2013-08-01
Laser trackers are widely used to measure kinematic tasks such as tracking robot movements. Common methods to evaluate the uncertainty in the kinematic measurement include approximations specified by the manufacturers, various analytical adjustment methods and the Kalman filter. In this paper a new, real-time technique is proposed, which estimates the 4D-path (3D-position + time) uncertainty of an arbitrary path in space. Here a hybrid system estimator is applied in conjunction with the kinematic measurement model. This method can be applied to processes, which include various types of kinematic behaviour, constant velocity, variable acceleration or variable turn rates. The new approach is compared with the Kalman filter and a manufacturer's approximations. The comparison was made using data obtained by tracking an industrial robot's tool centre point with a Leica laser tracker AT901 and a Leica laser tracker LTD500. It shows that the new approach is more appropriate to analysing kinematic processes than the Kalman filter, as it reduces overshoots and decreases the estimated variance. In comparison with the manufacturer's approximations, the new approach takes account of kinematic behaviour with an improved description of the real measurement process and a reduction in estimated variance. This approach is therefore well suited to the analysis of kinematic processes with unknown changes in kinematic behaviour as well as the fusion among laser trackers.
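The hybrid system estimator itself is not reproduced here; for orientation, the sketch below shows the kind of constant-velocity Kalman filter the new approach is compared against, reduced to one dimension with assumed noise levels and simulated measurements.

```python
import numpy as np

rng = np.random.default_rng(3)
dt, n = 0.01, 200
true_vel = 0.5                                  # m/s, constant-velocity target
truth = true_vel * dt * np.arange(n)
meas = truth + rng.normal(0.0, 0.002, n)        # 2 mm measurement noise (assumed)

F = np.array([[1.0, dt], [0.0, 1.0]])           # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                      # position is observed
Q = np.diag([1e-8, 1e-6])                       # process noise (assumed)
R = np.array([[0.002 ** 2]])                    # measurement noise variance

x = np.zeros((2, 1))
P = np.eye(2)
for z in meas:
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([[z]]) - H @ x)
    P = (np.eye(2) - K @ H) @ P

print("estimated velocity (m/s):", float(x[1, 0]))
```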
ERIC Educational Resources Information Center
Norton, Cynthia G.; Gildensoph, Lynne H.; Phillips, Martha M.; Wygal, Deborah D.; Olson, Kurt H.; Pellegrini, John J.; Tweeten, Kathleen A.
1997-01-01
Describes the reform of an introductory biology curriculum to reverse high attrition rates. Objectives include fostering self-directed learning, emphasizing process over content, and offering laboratory experiences that model the way to acquire scientific knowledge. Teaching methods include discussion, group mentoring, laboratory sections, and…
ERIC Educational Resources Information Center
Danielsen, Dina; Bruselius-Jensen, Maria; Laitsch, Daniel
2017-01-01
Health promotion and education researchers and practitioners advocate for more democratic approaches to school-based health education, including participatory teaching methods and the promotion of a broad and positive concept of health and health knowledge, including aspects of the German educational concept of "bildung." Although…
Fraccaro, Paolo; Nicolo, Massimo; Bonetto, Monica; Giacomini, Mauro; Weller, Peter; Traverso, Carlo Enrico; Prosperi, Mattia; OSullivan, Dympna
2015-01-27
To investigate machine learning methods, ranging from simpler interpretable techniques to complex (non-linear) "black-box" approaches, for automated diagnosis of Age-related Macular Degeneration (AMD). Data from healthy subjects and patients diagnosed with AMD or other retinal diseases were collected during routine visits via an Electronic Health Record (EHR) system. Patients' attributes included demographics and, for each eye, presence/absence of major AMD-related clinical signs (soft drusen, retinal pigment epithelium defects/pigment mottling, depigmentation area, subretinal haemorrhage, subretinal fluid, macula thickness, macular scar, subretinal fibrosis). Interpretable techniques known as white-box methods, including logistic regression and decision trees, as well as less interpretable techniques known as black-box methods, such as support vector machines (SVM), random forests and AdaBoost, were used to develop models (trained and validated on unseen data) to diagnose AMD. The gold standard was confirmed diagnosis of AMD by physicians. Sensitivity, specificity and area under the receiver operating characteristic curve (AUC) were used to assess performance. The study population included 487 patients (912 eyes). In terms of AUC, random forests, logistic regression and AdaBoost showed a mean performance of 0.92, followed by SVM and decision trees (0.90). All machine learning models identified soft drusen and age as the most discriminating variables in clinicians' decision pathways to diagnose AMD. Both black-box and white-box methods performed well in identifying diagnoses of AMD and their decision pathways. Machine learning models developed through the proposed approach, relying on clinical signs identified by retinal specialists, could be embedded into the EHR to provide physicians with real-time (interpretable) support.
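The EHR dataset is not public, so the sketch below uses a synthetic stand-in to mirror the comparison pattern of the study: white-box models (logistic regression, decision tree) against black-box models (random forest, AdaBoost, SVM), scored by cross-validated AUC.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.svm import SVC

# Synthetic stand-in for the clinical data (binary signs plus demographics).
X, y = make_classification(n_samples=900, n_features=10, n_informative=5,
                           random_state=0)

models = {
    "logistic regression (white box)": LogisticRegression(max_iter=1000),
    "decision tree (white box)": DecisionTreeClassifier(max_depth=4, random_state=0),
    "random forest (black box)": RandomForestClassifier(n_estimators=200, random_state=0),
    "AdaBoost (black box)": AdaBoostClassifier(random_state=0),
    "SVM (black box)": SVC(probability=True, random_state=0),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: AUC = {auc:.2f}")
```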
Determination of the transmission coefficients for quantum structures using FDTD method.
Peng, Yangyang; Wang, Xiaoying; Sui, Wenquan
2011-12-01
The purpose of this work is to develop a simple method to incorporate quantum effects in traditional finite-difference time-domain (FDTD) simulators, which would make it possible to co-simulate systems that include quantum structures and traditional components. In this paper, the tunneling transmission coefficient is calculated by solving the time-domain Schrödinger equation with a developed FDTD technique, called the FDTD-S method. To validate the feasibility of the method, a simple resonant tunneling diode (RTD) structure model has been simulated using the proposed method. The good agreement between the numerical and analytical results proves its accuracy. The effectiveness and accuracy of this approach make it a potential method for the analysis and design of hybrid systems that include quantum structures and traditional components.
A data-driven approach to quality risk management
Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David
2013-01-01
Aim: An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real-time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Materials and Methods: Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify the presence of association between risk factors and the occurrence of quality issues, applied to data on quality of clinical trials sponsored by Pfizer. Results: Only a subset of the risk factors had a significant association with quality issues; these included whether the study used a placebo, whether the agent was a biologic, unusual packaging label, complex dosing, and more than 25 planned procedures. Conclusion: Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety. PMID:24312890
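The trial-level data are proprietary, so the sketch below applies the two named analyses, the Wilcoxon rank-sum test and logistic regression, to an invented table of risk factors and a binary quality-issue flag.

```python
import numpy as np
from scipy.stats import ranksums
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 300
planned_procedures = rng.poisson(20, n)                 # candidate risk factor
uses_placebo = rng.integers(0, 2, n)                    # candidate risk factor
# Synthetic outcome: quality issues more likely with many planned procedures.
p = 1 / (1 + np.exp(-(0.15 * (planned_procedures - 20) + 0.3 * uses_placebo - 1.0)))
quality_issue = rng.binomial(1, p)

# Wilcoxon rank-sum: do trials with and without issues differ in procedure count?
stat, pval = ranksums(planned_procedures[quality_issue == 1],
                      planned_procedures[quality_issue == 0])
print(f"rank-sum p-value: {pval:.4f}")

# Logistic regression with both risk factors.
X = np.column_stack([planned_procedures, uses_placebo])
model = LogisticRegression().fit(X, quality_issue)
print("coefficients (procedures, placebo):", model.coef_[0])
```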
Suemitsu, Atsuo; Dang, Jianwu; Ito, Takayuki; Tiede, Mark
2015-10-01
Articulatory information can support learning or remediating pronunciation of a second language (L2). This paper describes an electromagnetic articulometer-based visual-feedback approach using an articulatory target presented in real time to facilitate L2 pronunciation learning. This approach trains learners to adjust articulatory positions to match targets for an L2 vowel estimated from productions of vowels that overlap in both L1 and L2. Training of Japanese learners for the American English vowel /æ/ that included visual training improved its pronunciation regardless of whether audio training was also included. Articulatory visual feedback is shown to be an effective method for facilitating L2 pronunciation learning.
Dynamic adaptive learning for decision-making supporting systems
NASA Astrophysics Data System (ADS)
He, Haibo; Cao, Yuan; Chen, Sheng; Desai, Sachi; Hohil, Myron E.
2008-03-01
This paper proposes a novel adaptive learning method for data mining in support of decision-making systems. Due to the inherent information ambiguity/uncertainty, high dimensionality, and noise in many homeland security and defense applications, such as surveillance, monitoring, the net-centric battlefield, and others, it is critical to develop autonomous learning methods to efficiently learn useful information from raw data to help the decision-making process. The proposed method is based on a dynamic learning principle in the feature spaces. Generally speaking, conventional approaches to learning from high-dimensional data sets include various feature extraction (principal component analysis, wavelet transform, and others) and feature selection (embedded approach, wrapper approach, filter approach, and others) methods. However, only a limited understanding of adaptive learning across different feature spaces has been achieved. We propose an integrative approach that takes advantage of feature selection and hypothesis ensemble techniques to achieve our goal. Based on the training data distributions, a feature score function is used to provide a measurement of the importance of different features for learning purposes. Then multiple hypotheses are iteratively developed in different feature spaces according to their learning capabilities. Unlike the pre-set iteration steps in many of the existing ensemble learning approaches, such as the adaptive boosting (AdaBoost) method, the iterative learning process will automatically stop when the intelligent system cannot provide a better understanding than a random guess in that particular subset of feature spaces. Finally, a voting algorithm is used to combine all the decisions from different hypotheses to provide the final prediction results. Simulation analyses of the proposed method on classification of different US military aircraft databases show the effectiveness of this method.
Xu, Y.; Xia, J.; Miller, R.D.
2007-01-01
The need for incorporating the traction-free condition at the air-earth boundary for finite-difference modeling of seismic wave propagation has been discussed widely. A new implementation has been developed for simulating elastic wave propagation in which the free-surface condition is replaced by an explicit acoustic-elastic boundary. Detailed comparisons of seismograms with different implementations for the air-earth boundary were undertaken using the (2,2) (the finite-difference operators are second order in time and space) and the (2,6) (second order in time and sixth order in space) standard staggered-grid (SSG) schemes. Methods used in these comparisons to define the air-earth boundary included the stress image method (SIM), the heterogeneous approach, the scheme of modifying material properties based on transversely isotropic medium approach, the acoustic-elastic boundary approach, and an analytical approach. The method proposed achieves the same or higher accuracy of modeled body waves relative to the SIM. Rayleigh waves calculated using the explicit acoustic-elastic boundary approach differ slightly from those calculated using the SIM. Numerical results indicate that when using the (2,2) SSG scheme for SIM and our new method, a spatial step of 16 points per minimum wavelength is sufficient to achieve 90% accuracy; 32 points per minimum wavelength achieves 95% accuracy in modeled Rayleigh waves. When using the (2,6) SSG scheme for the two methods, a spatial step of eight points per minimum wavelength achieves 95% accuracy in modeled Rayleigh waves. Our proposed method is physically reasonable and, based on dispersive analysis of simulated seismographs from a layered half-space model, is highly accurate. As a bonus, our proposed method is easy to program and slightly faster than the SIM. ?? 2007 Society of Exploration Geophysicists.
Physical Processes in the MAGO/MTF Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garanin, Sergey F; Reinovsky, Robert E.
2015-03-23
The Monograph is devoted to theoretical discussion of the physical effects which are most significant for an alternative approach to the problem of controlled thermonuclear fusion (CTF): the MAGO/MTF approach. The book includes a description of the approach and its difference from the major CTF systems, magnetic confinement and inertial confinement. General physical methods for simulating the processes in this approach are considered, including plasma transport phenomena and radiation, the theory of transverse collisionless shock waves, and the theory of surface discharges, all important for this kind of research. Different flows and magnetohydrodynamic plasma instabilities occurring within this approach are also considered. By virtue of the general physical nature of the considered phenomena, the presented results are applicable to a wide range of plasma physics and hydrodynamics processes. The book is intended for plasma physics and hydrodynamics specialists, postgraduate students, and senior physics students.
Balliu, Brunilda; Tsonaka, Roula; Boehringer, Stefan; Houwing-Duistermaat, Jeanine
2015-03-01
Integrative omics, the joint analysis of outcome and multiple types of omics data, such as genomics, epigenomics, and transcriptomics data, constitute a promising approach for powerful and biologically relevant association studies. These studies often employ a case-control design, and often include nonomics covariates, such as age and gender, that may modify the underlying omics risk factors. An open question is how to best integrate multiple omics and nonomics information to maximize statistical power in case-control studies that ascertain individuals based on the phenotype. Recent work on integrative omics have used prospective approaches, modeling case-control status conditional on omics, and nonomics risk factors. Compared to univariate approaches, jointly analyzing multiple risk factors with a prospective approach increases power in nonascertained cohorts. However, these prospective approaches often lose power in case-control studies. In this article, we propose a novel statistical method for integrating multiple omics and nonomics factors in case-control association studies. Our method is based on a retrospective likelihood function that models the joint distribution of omics and nonomics factors conditional on case-control status. The new method provides accurate control of Type I error rate and has increased efficiency over prospective approaches in both simulated and real data. © 2015 Wiley Periodicals, Inc.
Indirect methods for reference interval determination - review and recommendations.
Jones, Graham R D; Haeckel, Rainer; Loh, Tze Ping; Sikaris, Ken; Streichert, Thomas; Katayev, Alex; Barth, Julian H; Ozarda, Yesim
2018-04-19
Reference intervals are a vital part of the information supplied by clinical laboratories to support interpretation of numerical pathology results such as are produced in clinical chemistry and hematology laboratories. The traditional method for establishing reference intervals, known as the direct approach, is based on collecting samples from members of a preselected reference population, making the measurements and then determining the intervals. An alternative approach is to perform analysis of results generated as part of routine pathology testing and using appropriate statistical techniques to determine reference intervals. This is known as the indirect approach. This paper from a working group of the International Federation of Clinical Chemistry (IFCC) Committee on Reference Intervals and Decision Limits (C-RIDL) aims to summarize current thinking on indirect approaches to reference intervals. The indirect approach has some major potential advantages compared with direct methods. The processes are faster, cheaper and do not involve patient inconvenience, discomfort or the risks associated with generating new patient health information. Indirect methods also use the same preanalytical and analytical techniques used for patient management and can provide very large numbers for assessment. Limitations to the indirect methods include possible effects of diseased subpopulations on the derived interval. The IFCC C-RIDL aims to encourage the use of indirect methods to establish and verify reference intervals, to promote publication of such intervals with clear explanation of the process used and also to support the development of improved statistical techniques for these studies.
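As a minimal illustration of the indirect idea, the sketch below derives an interval from simulated routine results using crude outlier trimming and central 95% percentiles; the statistical techniques the paper refers to (and their handling of diseased subpopulations) are considerably more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(11)
# Simulated routine laboratory results: mostly healthy, with a diseased tail.
healthy = rng.normal(140.0, 2.5, 9000)      # e.g. sodium, mmol/L (illustrative)
diseased = rng.normal(128.0, 5.0, 600)
results = np.concatenate([healthy, diseased])

# Crude indirect estimate: trim gross outliers, then take the central 95%.
q1, q3 = np.percentile(results, [25, 75])
iqr = q3 - q1
kept = results[(results > q1 - 3 * iqr) & (results < q3 + 3 * iqr)]
lower, upper = np.percentile(kept, [2.5, 97.5])
print(f"indirect reference interval: {lower:.1f} - {upper:.1f} mmol/L")
# The diseased tail pulls the lower limit down, illustrating the limitation
# of naive indirect estimates mentioned in the paper.
```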
Vanishing points detection using combination of fast Hough transform and deep learning
NASA Astrophysics Data System (ADS)
Sheshkus, Alexander; Ingacheva, Anastasia; Nikolaev, Dmitry
2018-04-01
In this paper we propose a novel method for vanishing point detection based on a convolutional neural network (CNN) approach and the fast Hough transform algorithm. We show how to define a fast Hough transform neural network layer and how to use it to increase the usability of the neural network approach for the vanishing point detection task. Our algorithm includes a CNN with a sequence of convolutional and fast Hough transform layers. We build an estimator for the distribution of possible vanishing points in the image. This distribution can be used to find vanishing point candidates. We provide experimental results from tests of the suggested method using images collected from videos of road trips. Our approach shows stable results on test images with different projective distortions and noise. The described approach can be implemented efficiently on mobile GPUs and CPUs.
Finite element methods and Navier-Stokes equations
NASA Astrophysics Data System (ADS)
Cuvelier, C.; Segal, A.; van Steenhoven, A. A.
This book is devoted to two- and three-dimensional FEM analysis of the Navier-Stokes (NS) equations describing the flow of a viscous incompressible fluid. Three different approaches to the NS equations are described: a direct method, a penalty method, and a method that constructs discrete solenoidal vector fields. Subjects of current research which are important from the industrial/technological viewpoint are considered, including capillary free boundaries, nonisothermal flows, turbulence, and non-Newtonian fluids.
Systems and Methods for Composable Analytics
2014-04-29
… simplistic module that performs a mathematical operation on two numbers. The most important method is the Execute() method. This will get called when it is … addition, an input control is also specified in the example below. In this example, the mathematical operator can only be chosen from a preconfigured … approaches. Some of the industries that could benefit from Composable Analytics include pharmaceuticals, health care, insurance, actuaries, and …
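The snippets above describe modules that expose an Execute() method and declare constrained input controls; the following is a hypothetical Python rendering of that pattern, with invented names, not the actual Composable Analytics API.

```python
# Hypothetical sketch of the module pattern described above: a module declares
# its inputs (one constrained to a preconfigured set of operators) and the
# framework calls execute() when the module runs. Not the real API.
ALLOWED_OPERATORS = {"+": lambda a, b: a + b,
                     "-": lambda a, b: a - b,
                     "*": lambda a, b: a * b}

class MathOperationModule:
    def __init__(self, operator: str = "+"):
        if operator not in ALLOWED_OPERATORS:   # input control: dropdown-style choice
            raise ValueError(f"operator must be one of {sorted(ALLOWED_OPERATORS)}")
        self.operator = operator

    def execute(self, a: float, b: float) -> float:
        """Called by the framework when the module runs."""
        return ALLOWED_OPERATORS[self.operator](a, b)

print(MathOperationModule("*").execute(6, 7))   # -> 42
```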
Oldroyd, Rachel A; Morris, Michelle A; Birkin, Mark
2018-06-06
Traditional methods of monitoring foodborne illness are associated with problems of untimeliness and underreporting. In recent years, alternative data sources such as social media data have been used to monitor the incidence of disease in the population (infodemiology and infoveillance). These data sources are timelier than traditional general practitioner data, can help to fill the gaps in the reporting process, and often include additional metadata that is useful for supplementary research. The aim of the study was to identify and formally analyze research papers using consumer-generated data, such as social media data or restaurant reviews, to quantify a disease or public health ailment. Studies of this nature are scarce within the food safety domain; therefore, identification and understanding of transferrable methods in other health-related fields are of particular interest. Structured scoping methods were used to identify and analyze primary research papers using consumer-generated data for disease or public health surveillance. The title, abstract, and keyword fields of 5 databases were searched using predetermined search terms. A total of 5239 papers matched the search criteria, of which 145 were taken to full-text review; 62 papers were deemed relevant and were subjected to data characterization and thematic analysis. The majority of studies (40/62, 65%) focused on the surveillance of influenza-like illness. Only 10 studies (16%) used consumer-generated data to monitor outbreaks of foodborne illness. Twitter data (58/62, 94%) and Yelp reviews (3/62, 5%) were the most commonly used data sources. Studies reporting high correlations against baseline statistics used advanced statistical and computational approaches to calculate the incidence of disease. These include classification and regression approaches, clustering approaches, and lexicon-based approaches. Although they are computationally intensive due to the requirement of training data, studies using classification approaches reported the best performance. By analyzing studies in digital epidemiology, computer science, and public health, this paper has identified and analyzed methods of disease monitoring that can be transferred to foodborne disease surveillance. These methods fall into 4 main categories: basic approach, classification and regression, clustering approaches, and lexicon-based approaches. Although studies using a basic approach to calculate disease incidence generally report good performance against baseline measures, they are sensitive to chatter generated by media reports. More computationally advanced approaches are required to filter spurious messages and protect predictive systems against false alarms. Research using consumer-generated data for monitoring influenza-like illness is expansive; however, research regarding the use of restaurant reviews and social media data in the context of food safety is limited. Considering the advantages reported in this review, methods using consumer-generated data for foodborne disease surveillance warrant further investment. ©Rachel A Oldroyd, Michelle A Morris, Mark Birkin. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 06.06.2018.
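As a minimal illustration of the classification approach highlighted above (our own sketch, not any surveyed system; the example texts and labels are invented), a TF-IDF plus logistic regression pipeline in scikit-learn can score free-text posts for possible foodborne illness reports:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy labelled examples; 1 = possible foodborne illness report, 0 = other chatter.
    texts = [
        "got food poisoning after eating at that taco place, up all night sick",
        "stomach cramps and vomiting since dinner at the buffet",
        "great tacos downtown, highly recommend",
        "news says flu season is starting early this year",
    ]
    labels = [1, 1, 0, 0]

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(texts, labels)
    print(model.predict_proba(["felt really sick after the sushi last night"])[0, 1])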
Integration of Geophysical Methods By A Generalised Probability Tomography Approach
NASA Astrophysics Data System (ADS)
Mauriello, P.; Patella, D.
In modern science, the propensity interpretative approach stands on the assumption that any physical system consists of two kinds of reality: actual and potential. Geophysical data systems, too, have potentialities that extend far beyond the few actual models normally attributed to them. Indeed, any geophysical data set is in itself quite inherently ambiguous. Classical deterministic inversion, including tomography, usually forces a measured data set to collapse into a few rather subjective models based on some available a priori information. Classical interpretation is thus an intrinsically limited approach requiring a very deep logical extension. We think that a way to highlight a system's full potentiality is to introduce probability as the leading paradigm in dealing with field data systems. Probability tomography has recently been introduced as a completely new approach to data interpretation. It was originally formulated for the self-potential method and has since been extended to geoelectric, natural source electromagnetic induction, gravity and magnetic methods. Following the same rationale, in this paper we generalize the probability tomography theory to a generic geophysical anomaly vector field, including the treatment of scalar fields as a particular case. This generalization then makes it possible to address for the first time the problem of integrating different methods by a conjoint probability tomography imaging procedure. The aim is to infer the existence of an unknown buried object through the analysis of an ad hoc occurrence probability function, blending the physical messages brought forth by a set of singularly observed anomalies.
What InSAR time-series methods are best suited for the Ecuadorian volcanoes?
NASA Astrophysics Data System (ADS)
Mirzaee, S.; Amelung, F.
2017-12-01
Ground displacement measurements from stacks of SAR images obtained using interferometric time-series approaches play an increasingly important role in volcanic hazard assessment. Inflation of the ground surface can indicate that magma is ascending to shallower levels and that a volcano is getting ready for an eruption. Commonly used InSAR time-series approaches include Small Baseline (SB), Persistent Scatterer InSAR (PSI) and SqueeSAR methods, but it remains unclear which approach is best suited for volcanic environments. In this poster we present InSAR deformation measurements for the active volcanoes of Ecuador (Cotopaxi, Tungurahua and Pichincha) using a variety of InSAR time-series methods. We discuss the pros and cons of each method given the available data stacks (TerraSAR-X, Cosmo-SkyMed and Sentinel-1) in an effort to design a comprehensive observation strategy for the Ecuadorian volcanoes. SAR data are provided in the framework of the Group on Earth Observations' Ecuadorian Volcano Geohazard Supersite.
Integration of QFD, AHP, and LPP methods in supplier development problems under uncertainty
NASA Astrophysics Data System (ADS)
Shad, Zahra; Roghanian, Emad; Mojibian, Fatemeh
2014-04-01
Quality function deployment (QFD) is a customer-driven approach, widely used to develop new products or processes to maximize customer satisfaction. Previous research used the linear physical programming (LPP) procedure to optimize QFD; however, the QFD problem involves uncertainties, or fuzziness, which must be taken into account for a more realistic study. In this paper, a set of fuzzy data is used to address linguistic values parameterized by triangular fuzzy numbers. An integrated approach including the analytic hierarchy process (AHP), QFD, and LPP is proposed to maximize overall customer satisfaction under uncertain conditions and is applied to the supplier development problem. The fuzzy AHP approach is adopted as a powerful method to obtain the relationship between the customer requirements and engineering characteristics (ECs) and to construct the house of quality in the QFD method. LPP is used to obtain the optimal achievement level of the ECs and subsequently the customer satisfaction level under different degrees of uncertainty. The effectiveness of the proposed method is illustrated by an example.
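To make the role of triangular fuzzy numbers concrete, the sketch below (our own illustration with invented numbers, not the paper's worked example) defines basic triangular fuzzy arithmetic and uses it to aggregate fuzzy customer-requirement weights with fuzzy relationship strengths for one engineering characteristic, then defuzzifies the result:

    from dataclasses import dataclass

    @dataclass
    class TFN:
        # Triangular fuzzy number (l, m, u): lower, modal and upper values.
        l: float
        m: float
        u: float

        def __add__(self, other):
            return TFN(self.l + other.l, self.m + other.m, self.u + other.u)

        def __mul__(self, other):
            # Standard approximation for multiplying positive triangular fuzzy numbers.
            return TFN(self.l * other.l, self.m * other.m, self.u * other.u)

        def defuzzify(self):
            return (self.l + self.m + self.u) / 3.0  # simple centroid defuzzification

    # Fuzzy importance of two customer requirements and their fuzzy relationship
    # strengths to one engineering characteristic (illustrative numbers only).
    cr_weights = [TFN(0.3, 0.4, 0.5), TFN(0.5, 0.6, 0.7)]
    relations  = [TFN(0.6, 0.7, 0.8), TFN(0.2, 0.3, 0.4)]

    score = TFN(0.0, 0.0, 0.0)
    for w, r in zip(cr_weights, relations):
        score = score + w * r
    print(score, score.defuzzify())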
Tugwell, Peter; Pottie, Kevin; Welch, Vivian; Ueffing, Erin; Chambers, Andrea; Feightner, John
2011-01-01
Background: This article describes the evidence review and guideline development method developed for the Clinical Preventive Guidelines for Immigrants and Refugees in Canada by the Canadian Collaboration for Immigrant and Refugee Health Guideline Committee. Methods: The Appraisal of Guidelines for Research and Evaluation (AGREE) best-practice framework was combined with the recently developed Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to produce evidence-based clinical guidelines for immigrants and refugees in Canada. Results: A systematic approach was designed to produce the evidence reviews and apply the GRADE approach, including building on evidence from previous systematic reviews, searching for and comparing evidence between general and specific immigrant populations, and applying the GRADE criteria for making recommendations. This method was used for priority health conditions that had been selected by practitioners caring for immigrants and refugees in Canada. Interpretation: This article outlines the 14-step method that was defined to standardize the guideline development process for each priority health condition. PMID:20573711
Identification of QRS complex in non-stationary electrocardiogram of sick infants.
Kota, S; Swisher, C B; Al-Shargabi, T; Andescavage, N; du Plessis, A; Govindan, R B
2017-08-01
Due to the high frequency of routine interventions in an intensive care setting, electrocardiogram (ECG) recordings from sick infants are highly non-stationary, with recurrent changes in the baseline, alterations in the morphology of the waveform, and attenuations of the signal strength. Current methods lack reliability in identifying QRS complexes (a marker of individual cardiac cycles) in the non-stationary ECG. In the current study we address this problem by proposing a novel approach to QRS complex identification. Our approach employs lowpass filtering, half-wave rectification, and the use of instantaneous Hilbert phase to identify QRS complexes in the ECG. We demonstrate the application of this method using ECG recordings from eight preterm infants undergoing intensive care, as well as from 18 normal adult volunteers available via a public database. We compared our approach to commonly used approaches including Pan and Tompkins (PT), gqrs, wavedet, and wqrs for identifying QRS complexes and then compared each with manually identified QRS complexes. For preterm infants, a comparison between the QRS complexes identified by our approach and those identified through manual annotations yielded sensitivity and positive predictive values of 99% and 99.91%, respectively. The comparison metrics for each method are as follows: PT (sensitivity: 84.49%, positive predictive value: 99.88%), gqrs (85.25%, 99.49%), wavedet (95.24%, 99.86%), and wqrs (96.99%, 96.55%). Thus, the sensitivity values of the four methods previously described are lower than the sensitivity of the method we propose; however, the positive predictive values of these other approaches are comparable to those of our method, with the exception of the wqrs approach, which yielded a slightly lower value. For adult ECG, our approach yielded a sensitivity of 99.78%, whereas PT yielded 99.79%. The positive predictive value was 99.42% for both our approach and PT. We propose a novel method for identifying QRS complexes that outperforms common currently available tools for non-stationary ECG data in infants. For stationary ECG our proposed approach and the PT approach perform equally well. ECG acquired in a clinical environment may be prone to issues related to non-stationarity, especially in critically ill patients. The approach proposed in this report offers superior reliability in these scenarios. Copyright © 2017 Elsevier Ltd. All rights reserved.
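A simplified sketch of the described pipeline (lowpass filtering, half-wave rectification, and Hilbert phase) is shown below using SciPy; the cutoff frequency and crossing rule are illustrative assumptions rather than the published parameter choices, so this is only a rough approximation of the authors' detector:

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def detect_qrs(ecg, fs, cutoff_hz=20.0):
        # Lowpass filter to suppress high-frequency noise (cutoff chosen for illustration).
        b, a = butter(4, cutoff_hz / (fs / 2.0), btype="low")
        filtered = filtfilt(b, a, ecg)

        # Half-wave rectification keeps the dominant positive deflections.
        rectified = np.clip(filtered - np.median(filtered), 0.0, None)

        # The instantaneous phase of the analytic signal sweeps through 2*pi once per cycle,
        # so upward zero crossings of the phase give approximately one marker per QRS.
        phase = np.angle(hilbert(rectified))
        return np.where((phase[:-1] < 0.0) & (phase[1:] >= 0.0))[0]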
NASA Astrophysics Data System (ADS)
DeVille, R. E. Lee; Harkin, Anthony; Holzer, Matt; Josić, Krešimir; Kaper, Tasso J.
2008-06-01
For singular perturbation problems, the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. E. 49 (1994) 4502-4511] has been shown to be an effective general approach for deriving reduced or amplitude equations that govern the long time dynamics of the system. It has been applied to a variety of problems traditionally analyzed using disparate methods, including the method of multiple scales, boundary layer theory, the WKBJ method, the Poincaré-Lindstedt method, the method of averaging, and others. In this article, we show how the RG method may be used to generate normal forms for large classes of ordinary differential equations. First, we apply the RG method to systems with autonomous perturbations, and we show that the reduced or amplitude equations generated by the RG method are equivalent to the classical Poincaré-Birkhoff normal forms for these systems up to and including terms of O(ɛ²), where ɛ is the perturbation parameter. This analysis establishes our approach and generalizes to higher order. Second, we apply the RG method to systems with nonautonomous perturbations, and we show that the reduced or amplitude equations so generated constitute time-asymptotic normal forms, which are based on KBM averages. Moreover, for both classes of problems, we show that the main coordinate changes are equivalent, up to translations between the spaces in which they are defined. In this manner, our results show that the RG method offers a new approach for deriving normal forms for nonautonomous systems, and it offers advantages since one can typically more readily identify resonant terms from naive perturbation expansions than from the nonautonomous vector fields themselves. Finally, we establish how well the solution to the RG equations approximates the solution of the original equations on time scales of O(1/ɛ).
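As a small worked illustration of the kind of reduced equation the RG procedure yields (this is the standard textbook example of a weakly damped linear oscillator, included for orientation and not taken from the article), consider

    \ddot{y} + \epsilon\,\dot{y} + y = 0 .

The naive expansion y = y_0 + \epsilon y_1 + \cdots with y_0 = A\cos(t+\phi) produces the secular term

    y_1 = -\tfrac{1}{2} A\, t \cos(t+\phi),

which grows without bound. Renormalizing the amplitude to absorb this secular growth gives the RG (amplitude) equations

    \frac{dA}{dt} = -\frac{\epsilon}{2} A, \qquad \frac{d\phi}{dt} = O(\epsilon^{2}),

so the envelope A(t) = A(0)\,e^{-\epsilon t/2} reproduces the slow decay of the exact solution to the order considered.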
Computer design of porous active materials at different dimensional scales
NASA Astrophysics Data System (ADS)
Nasedkin, Andrey
2017-12-01
The paper presents mathematical and computer modeling of the effective properties of porous piezoelectric materials of three types: with ordinary porosity, with metallized pore surfaces, and with a nanoscale porosity structure. The described integrated approach includes the effective moduli method of composite mechanics, simulation of representative volumes, and the finite element method.
A Comparative Analysis of Method Books for Class Jazz Instruction
ERIC Educational Resources Information Center
Watson, Kevin E.
2017-01-01
The purpose of this study was to analyze and compare instructional topics and teaching approaches included in selected class method books for jazz pedagogy through content analysis methodology. Frequency counts for the number of pages devoted to each defined instructional content category were compiled and percentages of pages allotted to each…
ERIC Educational Resources Information Center
Crede, Erin; Borrego, Maura
2013-01-01
As part of a sequential exploratory mixed methods study, 9 months of ethnographically guided observations and interviews were used to develop a survey examining graduate engineering student retention. Findings from the ethnographic fieldwork yielded several themes, including international diversity, research group organization and climate,…
Q and you: The application of Q methodology in recreation research
Whitney Ward
2010-01-01
Researchers have used various qualitative and quantitative methods to deal with subjectivity in studying people's recreation experiences. Q methodology has been the most effective approach for analyzing both qualitative and quantitative aspects of experience, including attitudes or perceptions. The method is composed of two main components--Q sorting and Q factor...
Cultural Accommodation as Method and Metaphor
ERIC Educational Resources Information Center
Leong, Frederick T. L.
2007-01-01
The author summarizes the cultural accommodation model (CAM) of cross-cultural psychotherapy (F. T. L. Leong & S. H. Lee, 2006). This summary is divided into 2 parts, with the 1st part describing the theoretical development of the CAM as a method of psychotherapy and the research approach underlying it. This section includes a description of the…
Narrative Inquiry: Theory and Practice
ERIC Educational Resources Information Center
Savin-Baden, Maggi; Van Niekerk, Lana
2007-01-01
This article offers an overview of the method of narrative inquiry and explores competing trends in the use of the approach. It not only examines the theories relating to the method but also offers practical guidance on using narrative inquiry, including an exploration of what might count as a narrative and ways of analysing narrative data. The…
Pluralistic Inquiry for the History of Community Psychology
ERIC Educational Resources Information Center
Kelly, James G.; Chang, Janet
2008-01-01
The authors present the case not only for studying the history of community psychology but also of adopting a pluralistic approach to historical inquiry, using multiple methods and access to resources from other disciplines (e.g., historians of science and social historians). Examples of substantive topics and methods, including social network and…
Einstein Slew Survey: Data analysis innovations
NASA Technical Reports Server (NTRS)
Elvis, Martin S.; Plummer, David; Schachter, Jonathan F.; Fabbiano, G.
1992-01-01
Several new methods were needed in order to make the Einstein Slew X-ray Sky Survey. The innovations which enabled the Slew Survey to be done are summarized. These methods included an experimental approach to large projects, parallel processing on a LAN, percolation source detection, minimum action identifications, and rapid dissemination of the whole database.
ERIC Educational Resources Information Center
Roberts, Richie; Edwards, M. Craig
2015-01-01
American education's journey has witnessed the rise and fall of various progressive education approaches, including service-learning. In many respects, however, service-learning is still undergoing formation and adoption as a teaching method, specifically in School-Based, Agricultural Education (SBAE). For this reason, the interest existed to…
Research on aviation fuel instability
NASA Technical Reports Server (NTRS)
Baker, C. E.; Bittker, D. A.; Cohen, S. M.; Seng, G. T.
1983-01-01
The underlying causes of fuel thermal degradation are discussed. Topics covered include: nature of fuel instability and its temperature dependence, methods of measuring the instability, chemical mechanisms involved in deposit formation, and instrumental methods for characterizing fuel deposits. Finally, some preliminary thoughts on design approaches for minimizing the effects of lowered thermal stability are briefly discussed.
Five Methods for Estimating Angoff Cut Scores with IRT
ERIC Educational Resources Information Center
Wyse, Adam E.
2017-01-01
This article illustrates five different methods for estimating Angoff cut scores using item response theory (IRT) models. These include maximum likelihood (ML), expected a priori (EAP), modal a priori (MAP), and weighted maximum likelihood (WML) estimators, as well as the most commonly used approach based on translating ratings through the test…
ERIC Educational Resources Information Center
Bell, Randy L.; Matkins, Juanita Jo; Gansneder, Bruce M.
2011-01-01
This mixed-methods investigation compared the relative impacts of instructional approach and context of nature of science instruction on preservice elementary teachers' understandings. The sample consisted of 75 preservice teachers enrolled in four sections of an elementary science methods course. Independent variables included instructional…
Cochrane, Anita J; Dick, Bob; King, Neil A; Hills, Andrew P; Kavanagh, David J
2017-10-16
There have been consistent recommendations for multicomponent and multidisciplinary approaches for obesity management. However, there is no clear agreement on the components, disciplines or processes to be considered within such an approach. In this study, we explored multicomponent and multidisciplinary approaches through an examination of knowledge, skills, beliefs, and recommendations of stakeholders involved in obesity management. These stakeholders included researchers, practitioners, educators, and patients. We used qualitative action research methods, including convergent interviewing and observation, to assist the process of inquiry. The consensus was that a multicomponent and multidisciplinary approach should be based on four central meta-components (patient, practitioner, process, and environmental factors), and specific components of these factors were identified. Psychologists, dieticians, exercise physiologists and general practitioners were nominated as key practitioners to be included. A complex condition like obesity requires that multiple components be addressed, and that both patients and multiple disciplines are involved in developing solutions. Implementing cycles of continuous improvement to deal with complexity, instead of trying to control for it, offers an effective way to deal with complex, changing multisystem problems like obesity.
Hügler, Michael; Böckle, Karin; Eberhagen, Ingrid; Thelen, Karin; Beimfohr, Claudia; Hambsch, Beate
2011-01-01
Monitoring of microbiological contaminants in water supplies requires fast and sensitive methods for the specific detection of indicator organisms or pathogens. We developed a protocol for the simultaneous detection of E. coli and coliform bacteria based on the Fluorescence in situ Hybridization (FISH) technology. This protocol consists of two approaches. The first allows the direct detection of single E. coli and coliform bacterial cells on the filter membranes. The second approach includes incubation of the filter membranes on a nutrient agar plate and subsequent detection of the grown micro-colonies. Both approaches were validated using drinking water samples spiked with pure cultures and naturally contaminated water samples. The effects of heat, chlorine and UV disinfection were also investigated. The micro-colony approach yielded very good results for all samples and conditions tested, and thus can be thoroughly recommended for usage as an alternative method to detect E. coli and coliform bacteria in water samples. However, during this study, some limitations became visible for the single cell approach. The method cannot be applied for water samples which have been disinfected by UV irradiation. In addition, our results indicated that green fluorescent dyes are not suitable to be used with chlorine disinfected samples.
Kovács, István A.; Palotai, Robin; Szalay, Máté S.; Csermely, Peter
2010-01-01
Background Network communities help the functional organization and evolution of complex networks. However, developing a method that is both fast and accurate, and that provides modular overlaps and partitions of a heterogeneous network, has proven to be rather difficult. Methodology/Principal Findings Here we introduce the novel concept of ModuLand, an integrative method family determining overlapping network modules as hills of an influence function-based, centrality-type community landscape, and including several widely used modularization methods as special cases. As various adaptations of the method family, we developed several algorithms, which provide an efficient analysis of weighted and directed networks, and (1) determine pervasively overlapping modules with high resolution; (2) uncover a detailed hierarchical network structure allowing an efficient, zoom-in analysis of large networks; (3) allow the determination of key network nodes and (4) help to predict network dynamics. Conclusions/Significance The concept opens a wide range of possibilities to develop new approaches and applications including network routing, classification, comparison and prediction. PMID:20824084
Local Orthogonal Cutting Method for Computing Medial Curves and Its Biomedical Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiao, Xiangmin; Einstein, Daniel R.; Dyedov, Volodymyr
2010-03-24
Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques including eigenvalue analysis, weighted least squares approximations, and numerical minimization, resulting in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.
Field Impact Evaluation Process on Electronic Tabular Display Subsystem (ETABS).
1979-10-01
structural and process techniques are described. These include a diagonal slice approach to team formulation and several different methods of team building, process control and conflict management. (Author)
Cox regression analysis with missing covariates via nonparametric multiple imputation.
Hsu, Chiu-Hsieh; Yu, Mandi
2018-01-01
We consider the situation of estimating Cox regression in which some covariates are subject to missingness, and there exists additional information (including observed event time, censoring indicator and fully observed covariates) which may be predictive of the missing covariates. We propose to use two working regression models: one for predicting the missing covariates and the other for predicting the missing probabilities. For each missing covariate observation, these two working models are used to define a nearest neighbor imputing set. This set is then used to non-parametrically impute covariate values for the missing observation. Upon the completion of imputation, Cox regression is performed on the multiply imputed datasets to estimate the regression coefficients. In a simulation study, we compare the nonparametric multiple imputation approach with the augmented inverse probability weighted (AIPW) method, which directly incorporates the two working models into estimation of Cox regression, and the predictive mean matching imputation (PMM) method. We show that all approaches can reduce bias due to a non-ignorable missing mechanism. The proposed nonparametric imputation method is robust to misspecification of either one of the two working models and robust to misspecification of the link function of the two working models. In contrast, the PMM method is sensitive to misspecification of the covariates included in imputation. The AIPW method is sensitive to the selection probability. We apply the approaches to a breast cancer dataset from the Surveillance, Epidemiology and End Results (SEER) Program.
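A minimal sketch of the general idea follows (our own simplification, not the authors' exact estimator): two working models define a score, the nearest neighbors of each incomplete case among complete cases form an imputing set from which values are drawn repeatedly, and a Cox model is fitted to each completed dataset. It assumes pandas, scikit-learn, and the lifelines package; column names are illustrative, and only point estimates are combined (Rubin's variance combination is omitted).

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LinearRegression, LogisticRegression
    from lifelines import CoxPHFitter

    def nn_multiple_imputation_cox(df, miss_col, aux_cols, time_col="time", event_col="event",
                                   n_imputations=10, n_neighbors=10, seed=0):
        rng = np.random.default_rng(seed)
        obs = df[df[miss_col].notna()]
        mis = df[df[miss_col].isna()]

        # Working model 1: predict the missing covariate from fully observed information.
        pred_x = LinearRegression().fit(obs[aux_cols], obs[miss_col])
        # Working model 2: predict the probability of being observed.
        pred_r = LogisticRegression().fit(df[aux_cols], df[miss_col].notna().astype(int))

        # Two-dimensional scores used to find nearest neighbours among complete cases.
        score = lambda d: np.column_stack([pred_x.predict(d[aux_cols]),
                                           pred_r.predict_proba(d[aux_cols])[:, 1]])
        s_obs, s_mis = score(obs), score(mis)

        estimates = []
        for _ in range(n_imputations):
            completed = df.copy()
            for i, idx in enumerate(mis.index):
                dist = np.linalg.norm(s_obs - s_mis[i], axis=1)
                pool = obs[miss_col].to_numpy()[np.argsort(dist)[:n_neighbors]]
                completed.loc[idx, miss_col] = rng.choice(pool)  # draw from the imputing set
            cph = CoxPHFitter().fit(completed[[time_col, event_col, miss_col] + aux_cols],
                                    duration_col=time_col, event_col=event_col)
            estimates.append(cph.params_)
        return pd.concat(estimates, axis=1).mean(axis=1)  # average point estimates only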
Introduction to benchmark dose methods and U.S. EPA's benchmark dose software (BMDS) version 2.1.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, J. Allen, E-mail: davis.allen@epa.gov; Gift, Jeffrey S.; Zhao, Q. Jay
2011-07-15
Traditionally, the No-Observed-Adverse-Effect-Level (NOAEL) approach has been used to determine the point of departure (POD) from animal toxicology data for use in human health risk assessments. However, this approach is subject to substantial limitations that have been well defined, such as strict dependence on the dose selection, dose spacing, and sample size of the study from which the critical effect has been identified. Also, the NOAEL approach fails to take into consideration the shape of the dose-response curve and other related information. The benchmark dose (BMD) method, originally proposed as an alternative to the NOAEL methodology in the 1980s, addresses many of the limitations of the NOAEL method. It is less dependent on dose selection and spacing, and it takes into account the shape of the dose-response curve. In addition, the estimation of a BMD 95% lower bound confidence limit (BMDL) results in a POD that appropriately accounts for study quality (i.e., sample size). With the recent advent of user-friendly BMD software programs, including the U.S. Environmental Protection Agency's (U.S. EPA) Benchmark Dose Software (BMDS), BMD has become the method of choice for many health organizations world-wide. This paper discusses the BMD methods and corresponding software (i.e., BMDS version 2.1.1) that have been developed by the U.S. EPA, and includes a comparison with recently released European Food Safety Authority (EFSA) BMD guidance.
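As a rough illustration of the benchmark dose idea (not BMDS itself; the model choice, data, and 10% benchmark response below are all illustrative), one can fit a dose-response model and solve for the dose at which the modeled extra risk reaches the benchmark response:

    import numpy as np
    from scipy.optimize import curve_fit, brentq

    # Illustrative quantal dose-response data: dose groups and observed response fractions.
    doses     = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
    responded = np.array([1, 2, 4, 9, 16])
    n_animals = np.array([20, 20, 20, 20, 20])
    frac = responded / n_animals

    def log_logistic(d, background, slope, ed50):
        # Simple quantal log-logistic model; returns P(response) at dose d.
        p = np.where(d > 0, 1.0 / (1.0 + np.exp(-slope * (np.log(d + 1e-12) - np.log(ed50)))), 0.0)
        return background + (1.0 - background) * p

    params, _ = curve_fit(log_logistic, doses, frac, p0=[0.05, 1.0, 20.0], maxfev=10000)

    bmr = 0.10  # benchmark response: 10% extra risk over background
    background = params[0]
    extra_risk = lambda d: (log_logistic(d, *params) - background) / (1.0 - background) - bmr
    bmd = brentq(extra_risk, 1e-6, doses.max())  # dose at which extra risk equals the BMR
    print("BMD (central estimate):", bmd)  # a BMDL would additionally require a confidence bound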
ERIC Educational Resources Information Center
Fomin, Eugene P.; Alekseev, Audrey A.; Fomina, Natalia E.; Dorozhkin, Vladimir E.
2016-01-01
The article illustrates a theoretical approach to scenario modeling of economic indicators of regional waste management system. The method includes a three-iterative algorithm that allows the executive authorities and investors to take a decision on logistics, bulk, technological and economic parameters of the formation of the regional long-term…
ERIC Educational Resources Information Center
Chang, EunJung; Lim, Maria; Kim, Minam
2012-01-01
In this article, three art educators reflect on their ideas and experiences in developing and implementing innovative projects for their courses focusing on art for elementary education majors. They explore three different approaches. The three areas that are discussed in depth include: (1) understanding child art; (2) visual culture; and (3)…
ERIC Educational Resources Information Center
An, Xiaomi; Xu, Shaotong; Mu, Yong; Wang, Wei; Bai, Xian Yang; Dawson, Andy; Han, Hongqi
2012-01-01
Purpose: The purpose of this paper is to propose meta-synthetic ideas and knowledge asset management approaches to build a comprehensive strategic framework for Beijing City in China. Design/methodology/approach: Methods include a review of relevant literature in both English and Chinese, case studies of different types of support frameworks in…
ERIC Educational Resources Information Center
Raykov, Tenko; Marcoulides, George A.
2014-01-01
This research note contributes to the discussion of methods that can be used to identify useful auxiliary variables for analyses of incomplete data sets. A latent variable approach is discussed, which is helpful in finding auxiliary variables with the property that if included in subsequent maximum likelihood analyses they may enhance considerably…
The Effects of Argumentation Based Science Learning Approach on Creative Thinking Skills of Students
ERIC Educational Resources Information Center
Küçük Demir, Betül; Isleyen, Tevfik
2015-01-01
The aim of this study is to explore the effects of argumentation-based science learning (ABSL) approach on 9th Grade of Secondary Education students' creative thinking skills. The sample of the study included 22 9th grade of Secondary Education students in Bayburt in 2012-2013 academic year. In this study quantitative research method and…
Real-time motion compensation for EM bronchoscope tracking with smooth output - ex-vivo validation
NASA Astrophysics Data System (ADS)
Reichl, Tobias; Gergel, Ingmar; Menzel, Manuela; Hautmann, Hubert; Wegner, Ingmar; Meinzer, Hans-Peter; Navab, Nassir
2012-02-01
Navigated bronchoscopy provides benefits for endoscopists and patients, but accurate tracking information is needed. We present a novel real-time approach for bronchoscope tracking combining electromagnetic (EM) tracking, airway segmentation, and a continuous model of output. We augment a previously published approach by including segmentation information in the tracking optimization instead of image similarity. Thus, the new approach is feasible in real-time. Since the true bronchoscope trajectory is continuous, the output is modeled using splines and the control points are optimized with respect to displacement from EM tracking measurements and spatial relation to segmented airways. Accuracy of the proposed method and its components is evaluated on a ventilated porcine ex-vivo lung with respect to ground truth data acquired from a human expert. We demonstrate the robustness of the output of the proposed method against added artificial noise in the input data. Smoothness in terms of inter-frame distance is shown to remain below 2 mm, even when up to 5 mm of Gaussian noise are added to the input. The approach is shown to be easily extensible to include other measures like image similarity.
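The continuous-output idea can be illustrated with a smoothing spline fitted through noisy 3D tracking positions, as in the sketch below (a generic SciPy smoothing spline with an invented synthetic trajectory; the published method additionally optimizes the control points against the segmented airways, which is not reproduced here):

    import numpy as np
    from scipy.interpolate import splprep, splev

    def smooth_trajectory(positions, smoothing=2.0, n_samples=200):
        # Fit a smoothing spline through noisy 3D EM-tracking positions (illustrative only;
        # the published method also penalizes distance to the segmented airways).
        x, y, z = positions.T
        tck, u = splprep([x, y, z], s=smoothing * len(x))
        u_fine = np.linspace(0.0, 1.0, n_samples)
        return np.column_stack(splev(u_fine, tck))

    # Synthetic noisy trajectory: a curved path with roughly 1 mm Gaussian noise.
    t = np.linspace(0, 1, 50)
    truth = np.column_stack([30 * t, 10 * np.sin(3 * t), 5 * t ** 2])
    noisy = truth + np.random.default_rng(0).normal(scale=1.0, size=truth.shape)
    smoothed = smooth_trajectory(noisy)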
Ray-based approach to integrated 3D visual communication
NASA Astrophysics Data System (ADS)
Naemura, Takeshi; Harashima, Hiroshi
2001-02-01
For a high sense of reality in the next-generation communications, it is very important to realize three-dimensional (3D) spatial media, instead of existing 2D image media. In order to comprehensively deal with a variety of 3D visual data formats, the authors first introduce the concept of "Integrated 3D Visual Communication," which reflects the necessity of developing a neutral representation method independent of input/output systems. Then, the following discussions are concentrated on the ray-based approach to this concept, in which any visual sensation is considered to be derived from a set of light rays. This approach is a simple and straightforward to the problem of how to represent 3D space, which is an issue shared by various fields including 3D image communications, computer graphics, and virtual reality. This paper mainly presents the several developments in this approach, including some efficient methods of representing ray data, a real-time video-based rendering system, an interactive rendering system based on the integral photography, a concept of virtual object surface for the compression of tremendous amount of data, and a light ray capturing system using a telecentric lens. Experimental results demonstrate the effectiveness of the proposed techniques.
Dieringer, Matthias A.; Deimling, Michael; Santoro, Davide; Wuerfel, Jens; Madai, Vince I.; Sobesky, Jan; von Knobelsdorff-Brenkenhoff, Florian; Schulz-Menger, Jeanette; Niendorf, Thoralf
2014-01-01
Introduction Visual but subjective reading of longitudinal relaxation time (T1) weighted magnetic resonance images is commonly used for the detection of brain pathologies. For this non-quantitative measure, diagnostic quality depends on hardware configuration, imaging parameters, radio frequency transmission field (B1+) uniformity, as well as observer experience. Parametric quantification of the tissue T1 relaxation parameter offsets the propensity for these effects, but is typically time consuming. For this reason, this study examines the feasibility of rapid 2D T1 quantification using a variable flip angles (VFA) approach at magnetic field strengths of 1.5 Tesla, 3 Tesla, and 7 Tesla. These efforts include validation in phantom experiments and application for brain T1 mapping. Methods T1 quantification included simulations of the Bloch equations to correct for slice profile imperfections, and a correction for B1+. Fast gradient echo acquisitions were conducted using three adjusted flip angles for the proposed T1 quantification approach that was benchmarked against slice profile uncorrected 2D VFA and an inversion-recovery spin-echo based reference method. Brain T1 mapping was performed in six healthy subjects, one multiple sclerosis patient, and one stroke patient. Results Phantom experiments showed a mean T1 estimation error of (-63±1.5)% for slice profile uncorrected 2D VFA and (0.2±1.4)% for the proposed approach compared to the reference method. Scan time for single slice T1 mapping including B1+ mapping could be reduced to 5 seconds using an in-plane resolution of (2×2) mm2, which equals a scan time reduction of more than 99% compared to the reference method. Conclusion Our results demonstrate that rapid 2D T1 quantification using a variable flip angle approach is feasible at 1.5T/3T/7T. It represents a valuable alternative for rapid T1 mapping due to the gain in speed versus conventional approaches. This progress may serve to enhance the capabilities of parametric MR based lesion detection and brain tissue characterization. PMID:24621588
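A rough sketch of the generic variable flip angle (DESPOT1-style) T1 fit underlying such an approach is given below (our own illustration with synthetic numbers; the study's slice-profile correction via Bloch simulations and its B1+ correction are omitted):

    import numpy as np

    def t1_from_vfa(signals, flip_angles_deg, tr_ms):
        # Linearized SPGR fit: S/sin(a) = E1 * S/tan(a) + M0*(1 - E1), with E1 = exp(-TR/T1).
        a = np.deg2rad(np.asarray(flip_angles_deg, dtype=float))
        s = np.asarray(signals, dtype=float)
        y = s / np.sin(a)
        x = s / np.tan(a)
        slope, intercept = np.polyfit(x, y, 1)  # slope is E1
        return -tr_ms / np.log(slope)

    # Illustrative values: three flip angles, TR = 10 ms, synthetic signals for T1 = 1000 ms.
    tr, t1_true, m0 = 10.0, 1000.0, 1.0
    angles = [3.0, 10.0, 20.0]
    e1 = np.exp(-tr / t1_true)
    sig = [m0 * np.sin(np.deg2rad(a)) * (1 - e1) / (1 - e1 * np.cos(np.deg2rad(a))) for a in angles]
    print(t1_from_vfa(sig, angles, tr))  # recovers ~1000 ms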
Understanding the Experience of Stroke: A Mixed-Method Research Agenda
Clarke, Philippa
2009-01-01
The use of both quantitative and qualitative strategies to examine a single research question has been a subject of considerable controversy and still remains a largely uncommon practice in the sociology of health and illness. Yet, when seeking to understand the meaning of a chronic disabling condition in later life from a social psychological perspective, a mixed-method approach is likely to provide the most comprehensive picture. This article provides an overview of the usefulness and appropriateness of a mixed-method approach to understanding the stroke experience. I comment on the current state of research on the experience of stroke, including epistemological and ontological orientations. Using real data examples, I address paradigmatic assumptions, methods of integration, as well as challenges and pitfalls in integrating methods. I conclude by considering future directions in this field of research. PMID:19386828
Can improvised somatic dance reduce acute pain for young people in hospital?
Dowler, Lisa
2016-11-08
Aim This study explores the effects of improvised somatic dance (ISD) on children and young people experiencing acute pain following orthopaedic or cardiac surgery, or post-acquired brain injury. Methods The study involved 25 children and young people and adopted a mixed methods approach. This included a descriptive qualitative approach to help the participants and witnesses verbalise their experience of ISD, and pain scores were assessed before and after ISD using validated pain assessment tools. Data were analysed using descriptive statistical analysis. Findings A total of 92% of participants experienced a reduction in pain, with 80% experiencing a >50% reduction. There was an improved sense of well-being for all. Conclusion Although not a replacement for pharmacological treatments, a multidimensional, child-centred and inclusive approach with ISD can be a useful complementary, non-pharmacological method of pain management in children and young people.
Schroeter, Timon Sebastian; Schwaighofer, Anton; Mika, Sebastian; Ter Laak, Antonius; Suelzle, Detlev; Ganzer, Ursula; Heinrich, Nikolaus; Müller, Klaus-Robert
2007-12-01
We investigate the use of different Machine Learning methods to construct models for aqueous solubility. Models are based on about 4000 compounds, including an in-house set of 632 drug discovery molecules of Bayer Schering Pharma. For each method, we also consider an appropriate way to obtain error bars, in order to estimate the domain of applicability (DOA) for each model. Here, we investigate error bars from a Bayesian model (Gaussian Process (GP)), an ensemble based approach (Random Forest), and approaches based on the Mahalanobis distance to training data (for Support Vector Machine and Ridge Regression models). We evaluate all approaches in terms of their prediction accuracy (in cross-validation, and on an external validation set of 536 molecules) and in terms of how faithfully the individual error bars represent the actual prediction error.
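A minimal illustration of obtaining per-prediction error bars from a Gaussian Process model is given below using scikit-learn (the descriptor matrix and targets are random placeholders, not the compounds used in the study, and the kernel choice is an assumption):

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Placeholder descriptor matrix and log-solubility targets (random, for illustration only).
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, 16))
    y_train = X_train[:, 0] - 0.5 * X_train[:, 1] + rng.normal(scale=0.3, size=200)

    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gp.fit(X_train, y_train)

    X_new = rng.normal(size=(5, 16))
    mean, std = gp.predict(X_new, return_std=True)  # std serves as a per-compound error bar
    for m, s in zip(mean, std):
        print(f"predicted logS = {m:.2f} +/- {s:.2f}")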
Recent developments in broadly applicable structure-biodegradability relationships.
Jaworska, Joanna S; Boethling, Robert S; Howard, Philip H
2003-08-01
Biodegradation is one of the most important processes influencing concentration of a chemical substance after its release to the environment. It is the main process for removal of many chemicals from the environment and therefore is an important factor in risk assessments. This article reviews available methods and models for predicting biodegradability of organic chemicals from structure. The first section of the article briefly discusses current needs for biodegradability estimation methods related to new and existing chemicals and in the context of multimedia exposure models. Following sections include biodegradation test methods and endpoints used in modeling, with special attention given to the Japanese Ministry of International Trade and Industry test; a primer on modeling, describing the various approaches that have been used in the structure/biodegradability relationship work, and contrasting statistical and mechanistic approaches; and recent developments in structure/biodegradability relationships, divided into group contribution, chemometric, and artificial intelligence approaches.
NASA Astrophysics Data System (ADS)
Liu, Qiong; Wang, Wen-xi; Zhu, Ke-ren; Zhang, Chao-yong; Rao, Yun-qing
2014-11-01
Mixed-model assembly line sequencing is significant in reducing the production time and overall cost of production. To improve production efficiency, a mathematical model aiming simultaneously to minimize overtime, idle time and total set-up costs is developed. To obtain high-quality and stable solutions, an advanced scatter search approach is proposed. In the proposed algorithm, a new diversification generation method based on a genetic algorithm is presented to generate a set of potentially diverse and high-quality initial solutions. Many methods, including reference set update, subset generation, solution combination and improvement methods, are designed to maintain the diversification of populations and to obtain high-quality ideal solutions. The proposed model and algorithm are applied and validated in a case company. The results indicate that the proposed advanced scatter search approach is significant for mixed-model assembly line sequencing in this company.
Incremental Transductive Learning Approaches to Schistosomiasis Vector Classification
NASA Astrophysics Data System (ADS)
Fusco, Terence; Bi, Yaxin; Wang, Haiying; Browne, Fiona
2016-08-01
Collecting epidemic disease data for our analysis purposes is a labour-intensive, time-consuming and expensive process, which results in sparse sample data from which prediction models must be developed. To address this sparse data issue, we present novel Incremental Transductive methods that circumvent repeated data collection by applying previously acquired data to provide consistent, confidence-based labelling alternatives to field survey research. We investigated various reasoning approaches for semi-supervised machine learning, including Bayesian models, for labelling data. The results show that using the proposed methods, we can label instances of data with a class of vector density at a high level of confidence. By applying the Liberal and Strict Training Approaches, we provide a labelling and classification alternative to standalone algorithms. The methods in this paper are components in the process of reducing the proliferation of the Schistosomiasis disease and its effects.
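As a generic point of comparison for the confidence-based labelling idea (this is a standard self-training baseline in scikit-learn, not the authors' incremental transductive algorithm; the data and class counts are invented), unlabelled instances can be pseudo-labelled only when the classifier's predicted probability exceeds a threshold:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.semi_supervised import SelfTrainingClassifier

    # Placeholder features for survey sites; -1 marks sites with no surveyed vector-density class.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 8))
    y = rng.integers(0, 3, size=300)    # three illustrative vector-density classes
    y_semi = y.copy()
    y_semi[rng.random(300) < 0.8] = -1  # pretend 80% of sites were never surveyed

    # Pseudo-label only when predicted probability exceeds the threshold, loosely analogous
    # to a "strict" confidence-based labelling policy.
    model = SelfTrainingClassifier(RandomForestClassifier(n_estimators=200), threshold=0.9)
    model.fit(X, y_semi)
    print((model.transduction_ != -1).mean())  # fraction of instances that ended up labelled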
A New Formulation of the Filter-Error Method for Aerodynamic Parameter Estimation in Turbulence
NASA Technical Reports Server (NTRS)
Grauer, Jared A.; Morelli, Eugene A.
2015-01-01
A new formulation of the filter-error method for estimating aerodynamic parameters in nonlinear aircraft dynamic models during turbulence was developed and demonstrated. The approach uses an estimate of the measurement noise covariance to identify the model parameters, their uncertainties, and the process noise covariance, in a relaxation method analogous to the output-error method. Prior information on the model parameters and uncertainties can be supplied, and a post-estimation correction to the uncertainty was included to account for colored residuals not considered in the theory. No tuning parameters requiring adjustment by the analyst are used in the estimation. The method was demonstrated in simulation using the NASA Generic Transport Model and then applied to flight data from the subscale T-2 jet-engine transport aircraft. Modeling results in different levels of turbulence were compared with results from time-domain output-error and frequency-domain equation-error methods to demonstrate the effectiveness of the approach.
Southam-Gerow, Michael A; Dorsey, Shannon
2014-01-01
This special issue provides examples of how qualitative and mixed methods research approaches can be used in dissemination and implementation science. In this introductory article, we provide a brief rationale for why and how qualitative and mixed methods approaches can be useful in moving the field forward. Specifically, we provide a brief primer on common qualitative methods, including a review of guidelines provided by the National Institutes of Health. Next, we introduce the six articles in the issue. The first of the articles by Palinkas represents a more thorough and authoritative discussion related to qualitative methods, using the other five articles in the issue (and other published works) as examples. The remaining five articles are empirical and/or descriptive articles of recently completed or ongoing qualitative or mixed methods studies related to dissemination and implementation of evidence-based practices for children and adolescents.
Vu, Maihan B.; Halladay, Jacqueline R.; Miller, Cassandra; Garcia, Beverly A.; Cummings, Doyle M.; Cene, Crystal W.; Hinderliter, Alan; Little, Edwin; Rachide, Marjorie; DeWalt, Darren
2014-01-01
Introduction Patient and practice perspectives can inform development of team-based approaches to improving blood pressure control in primary care. We used a community-based participatory research approach to assess patient and practice perceptions regarding the value of team-based strategies for controlling blood pressure in a rural North Carolina population from 2010 through 2012. Methods In-depth interviews were conducted with 41 adults with hypertension, purposely sampled to include diversity of sex, race, literacy, and blood pressure control, and with key office staff at 5 rural primary care practices in the southeastern US “stroke belt.” Interviews explored barriers to controlling blood pressure, the practice’s role in controlling blood pressure, and opinions on the use of team care delivery. Results Patients reported that provider strategies to optimize blood pressure control should include regular visits, medication adjustment, side-effect discussion, and behavioral counseling. When discussing team-based approaches to hypertension care, patients valued verbal encouragement, calls from the doctor’s office, and the opportunity to ask questions. However, they voiced concerns about the effect of having too many people involved in their care. Practice staff focused on multiple, broad methods to control blood pressure including counseling, regular office visits, media to improve awareness, and support groups. An explicit focus of delivering care as teams was a newer concept. Conclusion When developing a team approach to hypertension treatment, patients value high-quality communication and not losing their primary relationship with their provider. Practice staff members were open to a team-based approach but had limited knowledge of what such an approach would entail. PMID:24762533
Alberts, Johanna F; van Zyl, Willem H; Gelderblom, Wentzel C A
2016-01-01
Infection by the fumonisin-producing Fusarium spp. and subsequent fumonisin contamination of maize adversely affect international trade and economy with deleterious effects on human and animal health. In developed countries high standards of the major food suppliers and retailers are upheld and regulatory controls deter the importation and local marketing of fumonisin-contaminated food products. In developing countries regulatory measures are either lacking or poorly enforced, due to food insecurity, resulting in an increased mycotoxin exposure. The lack and poor accessibility of effective and environmentally safe control methods have led to an increased interest in practical and biological alternatives to reduce fumonisin intake. These include the application of natural resources, including plants, microbial cultures, genetic material thereof, or clay minerals pre- and post-harvest. Pre-harvest approaches include breeding for resistant maize cultivars, introduction of biocontrol microorganisms, application of phenolic plant extracts, and expression of antifungal proteins and fumonisin degrading enzymes in transgenic maize cultivars. Post-harvest approaches include the removal of fumonisins by natural clay adsorbents and enzymatic degradation of fumonisins through decarboxylation and deamination by recombinant carboxylesterase and aminotransferase enzymes. Although, the knowledge base on biological control methods has expanded, only a limited number of authorized decontamination products and methods are commercially available. As many studies detailed the use of natural compounds in vitro, concepts in reducing fumonisin contamination should be developed further for application in planta and in the field pre-harvest, post-harvest, and during storage and food-processing. In developed countries an integrated approach, involving good agricultural management practices, hazard analysis and critical control point (HACCP) production, and storage management, together with selected biologically based treatments, mild chemical and physical treatments could reduce fumonisin contamination effectively. In rural subsistence farming communities, simple, practical, and culturally acceptable hand-sorting, maize kernel washing, and dehulling intervention methods proved to be effective as a last line of defense for reducing fumonisin exposure. Biologically based methods for control of fumonisin-producing Fusarium spp. and decontamination of the fumonisins could have potential commercial application, while simple and practical intervention strategies could also impact positively on food safety and security, especially in rural populations reliant on maize as a dietary staple.
Missing Data in Alcohol Clinical Trials with Binary Outcomes
Hallgren, Kevin A.; Witkiewitz, Katie; Kranzler, Henry R.; Falk, Daniel E.; Litten, Raye Z.; O’Malley, Stephanie S.; Anton, Raymond F.
2017-01-01
Background Missing data are common in alcohol clinical trials for both continuous and binary endpoints. Approaches to handle missing data have been explored for continuous outcomes, yet no studies have compared missing data approaches for binary outcomes (e.g., abstinence, no heavy drinking days). The present study compares approaches to modeling binary outcomes with missing data in the COMBINE study. Method We included participants in the COMBINE Study who had complete drinking data during treatment and who were assigned to active medication or placebo conditions (N=1146). Using simulation methods, missing data were introduced under common scenarios with varying sample sizes and amounts of missing data. Logistic regression was used to estimate the effect of naltrexone (vs. placebo) in predicting any drinking and any heavy drinking outcomes at the end of treatment using four analytic approaches: complete case analysis (CCA), last observation carried forward (LOCF), the worst-case scenario of missing equals any drinking or heavy drinking (WCS), and multiple imputation (MI). In separate analyses, these approaches were compared when drinking data were manually deleted for those participants who discontinued treatment but continued to provide drinking data. Results WCS produced the greatest amount of bias in treatment effect estimates. MI usually yielded less biased estimates than WCS and CCA in the simulated data, and performed considerably better than LOCF when estimating treatment effects among individuals who discontinued treatment. Conclusions Missing data can introduce bias in treatment effect estimates in alcohol clinical trials. Researchers should utilize modern missing data methods, including MI, and avoid WCS and CCA when analyzing binary alcohol clinical trial outcomes. PMID:27254113
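To see how the simpler strategies named above differ in practice, the toy sketch below (invented data; multiple imputation is omitted for brevity) derives an end-of-treatment binary endpoint under complete case analysis, last observation carried forward, and the worst-case rule:

    import numpy as np
    import pandas as pd

    # Toy weekly "any heavy drinking" indicators for four participants; NaN = missing week.
    weeks = pd.DataFrame({
        "p1": [0, 0, np.nan, 0],
        "p2": [1, np.nan, np.nan, np.nan],
        "p3": [0, 1, 0, 0],
        "p4": [np.nan, 0, 0, 1],
    })

    endpoint = weeks.iloc[-1]      # end-of-treatment observation per participant
    cca  = endpoint.dropna()       # complete case analysis: drop missing endpoints
    locf = weeks.ffill().iloc[-1]  # last observation carried forward
    wcs  = endpoint.fillna(1)      # worst case: missing counted as heavy drinking
    print(pd.DataFrame({"CCA": cca, "LOCF": locf, "WCS": wcs}))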
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, Tsong-Lun; Varuttamaseni, Athi; Baek, Joo-Seok
The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I and C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling of digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).
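As a reference point for what statistical (failure-free) testing can demonstrate (a standard zero-failure binomial bound, stated here as general background rather than the specific method of the paper), n successful, operationally representative test cases support the following upper confidence bound on the per-demand failure probability:

    def failure_prob_upper_bound(n_tests, confidence=0.95):
        # Zero-failure binomial bound: p_upper = 1 - (1 - confidence) ** (1 / n_tests).
        alpha = 1.0 - confidence
        return 1.0 - alpha ** (1.0 / n_tests)

    print(failure_prob_upper_bound(3000))  # roughly 1e-3 at 95% confidence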
NASA Astrophysics Data System (ADS)
Xu, Pengcheng; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi; Liu, Jiufu; Zou, Ying; He, Ruimin
2017-12-01
Hydrometeorological data are needed for obtaining point and areal means, quantifying the spatial variability of hydrometeorological variables, and calibration and verification of hydrometeorological models. Hydrometeorological networks are utilized to collect such data. Since data collection is expensive, it is essential to design an optimal network based on the minimal number of hydrometeorological stations in order to reduce costs. This study proposes a two-phase copula entropy-based multiobjective optimization approach that includes: (1) copula entropy-based directional information transfer (CDIT) for clustering the potential hydrometeorological gauges into several groups, and (2) a multiobjective method for selecting the optimal combination of gauges for regionalized groups. Although entropy theory has been employed for network design before, the joint histogram method used for mutual information estimation has several limitations. The copula entropy-based mutual information (MI) estimation method is shown to be more effective for quantifying the uncertainty of redundant information than the joint histogram (JH) method. The effectiveness of this approach is verified by applying it to one type of hydrometeorological gauge network, with the use of three model evaluation measures, including the Nash-Sutcliffe Coefficient (NSC), the arithmetic mean of the negative copula entropy (MNCE), and MNCE/NSC. Results indicate that the two-phase copula entropy-based multiobjective technique is capable of evaluating the performance of regional hydrometeorological networks and can enable decision makers to develop strategies for water resources management.
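A minimal sketch of the copula-based mutual information idea, using a Gaussian copula as a simple stand-in for the copula entropy estimator described above (the CDIT grouping and multiobjective selection steps are not shown); pairwise values computed this way could feed a clustering of candidate gauges.

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussian_copula_mi(x, y):
    """Mutual information (in nats) between two gauge records estimated through
    a Gaussian copula: rank-transform each series to uniforms, map to normal
    scores, and use MI = -0.5 * log(1 - rho^2) for the normal-score correlation."""
    n = len(x)
    u = rankdata(x) / (n + 1.0)
    v = rankdata(y) / (n + 1.0)
    zx, zy = norm.ppf(u), norm.ppf(v)
    rho = float(np.clip(np.corrcoef(zx, zy)[0, 1], -0.999999, 0.999999))
    return -0.5 * np.log(1.0 - rho**2)

# Example: pairwise MI matrix for candidate gauges (rows = time steps, cols = gauges)
records = np.random.default_rng(1).normal(size=(365, 5))
mi = np.array([[gaussian_copula_mi(records[:, i], records[:, j])
                for j in range(records.shape[1])] for i in range(records.shape[1])])
```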
An improved current potential method for fast computation of stellarator coil shapes
NASA Astrophysics Data System (ADS)
Landreman, Matt
2017-04-01
Several fast methods for computing stellarator coil shapes are compared, including the classical NESCOIL procedure (Merkel 1987 Nucl. Fusion 27 867), its generalization using truncated singular value decomposition, and a Tikhonov regularization approach we call REGCOIL in which the squared current density is included in the objective function. Considering W7-X and NCSX geometries, and for any desired level of regularization, we find the REGCOIL approach simultaneously achieves lower surface-averaged and maximum values of both current density (on the coil winding surface) and normal magnetic field (on the desired plasma surface). This approach therefore can simultaneously improve the free-boundary reconstruction of the target plasma shape while substantially increasing the minimum distances between coils, preventing collisions between coils while improving access for ports and maintenance. The REGCOIL method also allows finer control over the level of regularization, preserves convexity to ensure that the local optimum found is the global optimum, and eliminates two pathologies of NESCOIL: the resulting coil shapes become independent of the arbitrary choice of angles used to parameterize the coil surface, and the resulting coil shapes converge rather than diverge as Fourier resolution is increased. We therefore contend that REGCOIL should be used instead of NESCOIL for applications in which a fast and robust method for coil calculation is needed, such as when targeting coil complexity in fixed-boundary plasma optimization, or for scoping new stellarator geometries.
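The core numerical step of a REGCOIL-style calculation is an ordinary Tikhonov-regularized least-squares solve. The sketch below shows that step in Python with assumed, illustrative matrices; in the actual method A maps current-potential coefficients to the normal field on the plasma surface and B to the current density on the winding surface.

```python
import numpy as np

def tikhonov_current_potential(A, b, B, lam):
    """Solve min_x ||A x - b||^2 + lam * ||B x||^2 via the normal equations.
    Increasing lam trades field accuracy for smoother, lower current density."""
    lhs = A.T @ A + lam * (B.T @ B)
    rhs = A.T @ b
    return np.linalg.solve(lhs, rhs)

# Illustrative scan over the regularization level (matrices are random stand-ins)
rng = np.random.default_rng(0)
A, B, b = rng.normal(size=(200, 50)), rng.normal(size=(200, 50)), rng.normal(size=200)
for lam in (1e-4, 1e-2, 1.0):
    x = tikhonov_current_potential(A, b, B, lam)
    print(lam, np.linalg.norm(A @ x - b), np.linalg.norm(B @ x))
```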
Ayadurai, Shamala; Hattingh, H Laetitia; Tee, Lisa B G; Md Said, Siti Norlina
2016-01-01
Background. We conducted a review of current intervention studies in type 2 diabetes and identified opportunities for pharmacists to deliver quality diabetes care. Methods. A search for randomised controlled trials (RCTs) of diabetes management by healthcare professionals, including pharmacists, published between 2010 and 2015 was conducted. Results and Discussion. Diabetes management involves multifactorial intervention covering seven factors outlined in diabetes guidelines, namely glycaemic, cholesterol, and blood pressure control, medication, lifestyle, education, and cardiovascular risk factors. Most studies do not provide evidence that the intervention methods used included all seven factors, with the exception of three RCTs, which reported HbA1c (glycated hemoglobin) reductions ranging from 0.5% to 1.8%. The varied HbA1c reduction suggests the lack of a standardised and consistent approach to diabetes care. Furthermore, the duration of most studies was from one month to two years; therefore long-term outcomes could not be established. Conclusion. Although pharmacists' contribution towards improving clinical outcomes of diabetes patients was well documented, the methods used to deliver structured, consistent, evidence-based care were not clearly stipulated. Therefore, approaches to achieving long-term continuity of care remain uncertain. An intervention strategy that encompasses all seven evidence-based factors would be useful.
Incorporating the sampling design in weighting adjustments for panel attrition
Chen, Qixuan; Gelman, Andrew; Tracy, Melissa; Norris, Fran H.; Galea, Sandro
2015-01-01
We review weighting adjustment methods for panel attrition and suggest approaches for incorporating design variables, such as strata, clusters and baseline sample weights. Design information can typically be included in attrition analysis using multilevel models or decision tree methods such as the CHAID algorithm. We use simulation to show that these weighting approaches can effectively reduce bias in the survey estimates that would occur from omitting the effect of design factors on attrition, while keeping the resulting weights stable. We provide a step-by-step illustration of creating weighting adjustments for panel attrition in the Galveston Bay Recovery Study, a survey of residents in a community following a disaster, and offer suggestions to help analysts decide among weighting approaches. PMID:26239405
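A minimal sketch of a response-propensity weighting adjustment of the kind discussed above, assuming a simple logistic model in place of the multilevel or CHAID models used in the paper; the design matrix would carry strata, cluster indicators, and baseline weights.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def attrition_adjusted_weights(X_design, responded, base_weight, trim=0.05):
    """Fit the probability of remaining in the panel from baseline/design
    variables, then divide each respondent's baseline weight by the fitted
    retention probability (trimmed to keep the adjusted weights stable)."""
    model = LogisticRegression(max_iter=1000).fit(X_design, responded)
    p_retain = model.predict_proba(X_design)[:, 1]
    weights = base_weight / np.clip(p_retain, trim, 1.0)
    return weights[responded.astype(bool)]
```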
Quantum Chemical Approach to Estimating the Thermodynamics of Metabolic Reactions
Jinich, Adrian; Rappoport, Dmitrij; Dunn, Ian; Sanchez-Lengeling, Benjamin; Olivares-Amaya, Roberto; Noor, Elad; Bar-Even, Arren; Aspuru-Guzik, Alán
2014-01-01
Thermodynamics plays an increasingly important role in modeling and engineering metabolism. We present the first nonempirical computational method for estimating standard Gibbs reaction energies of metabolic reactions based on quantum chemistry, which can help fill in the gaps in the existing thermodynamic data. When applied to a test set of reactions from core metabolism, the quantum chemical approach is comparable in accuracy to group contribution methods for isomerization and group transfer reactions and for reactions not including multiply charged anions. The errors in standard Gibbs reaction energy estimates are correlated with the charges of the participating molecules. The quantum chemical approach is amenable to systematic improvements and holds potential for providing thermodynamic data for all of metabolism. PMID:25387603
New Methods for Assessing and Reducing Uncertainty in Microgravity Studies
NASA Astrophysics Data System (ADS)
Giniaux, J. M.; Hooper, A. J.; Bagnardi, M.
2017-12-01
Microgravity surveying, also known as dynamic or 4D gravimetry, is a time-dependent geophysical method used to detect mass fluctuations within the shallow crust by analysing temporal changes in relative gravity measurements. We present here a detailed uncertainty analysis of temporal gravity measurements, considering for the first time all possible error sources, including tilt, error in drift estimations and timing errors. We find that some error sources that are commonly ignored can have a significant impact on the total error budget, and it is therefore likely that some gravity signals have been misinterpreted in previous studies. Our analysis leads to new methods for reducing some of the uncertainties associated with residual gravity estimation. In particular, we propose different approaches for drift estimation and free air correction depending on the survey set up. We also provide formulae to recalculate uncertainties for past studies and lay out a framework for best practice in future studies. We demonstrate our new approach on volcanic case studies, which include Kilauea in Hawaii and Askja in Iceland.
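To illustrate the error-budget idea, the sketch below combines several independent error sources of a residual gravity difference in quadrature; the listed sources and magnitudes are assumptions for illustration, not the values derived in the study.

```python
import numpy as np

# Assumed 1-sigma contributions to a residual gravity difference, in microGal
error_sources = {
    "instrument drift": 6.0,
    "tilt": 3.0,
    "timing / tidal correction": 2.5,
    "free-air correction (height change)": 4.0,
    "reading scatter": 5.0,
}
# Combine in quadrature, assuming the sources are independent
total = np.sqrt(sum(v**2 for v in error_sources.values()))
print(f"combined 1-sigma uncertainty: {total:.1f} microGal")
```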
Complex basis functions for molecular resonances: Methodology and applications
NASA Astrophysics Data System (ADS)
White, Alec; McCurdy, C. William; Head-Gordon, Martin
The computation of positions and widths of metastable electronic states is a challenge for molecular electronic structure theory because, in addition to the difficulty of the many-body problem, such states obey scattering boundary conditions. These resonances cannot be addressed with naïve application of traditional bound state electronic structure theory. Non-Hermitian electronic structure methods employing complex basis functions are one way to rigorously treat resonances within the framework of traditional electronic structure theory. In this talk, I will discuss our recent work in this area, including the methodological extension from single determinant SCF-based approaches to highly correlated levels of wavefunction-based theory such as equation of motion coupled cluster and many-body perturbation theory. These approaches provide a hierarchy of theoretical methods for the computation of positions and widths of molecular resonances. Within this framework, we may also examine properties of resonances, including the dependence of these parameters on molecular geometry. Some applications of these methods to temporary anions and dianions will also be discussed.
Array magnetics modal analysis for the DIII-D tokamak based on localized time-series modelling
Olofsson, K. Erik J.; Hanson, Jeremy M.; Shiraki, Daisuke; ...
2014-07-14
Here, time-series analysis of magnetics data in tokamaks is typically done using block-based fast Fourier transform methods. This work presents the development and deployment of a new set of algorithms for magnetic probe array analysis. The method is based on an estimation technique known as stochastic subspace identification (SSI). Compared with the standard coherence approach or the direct singular value decomposition approach, the new technique exhibits several beneficial properties. For example, the SSI method does not require that frequencies are orthogonal with respect to the timeframe used in the analysis. Frequencies are obtained directly as parameters of localized time-series models. The parameters are extracted by solving small-scale eigenvalue problems. Applications include maximum-likelihood regularized eigenmode pattern estimation; detection of neoclassical tearing modes, including locked mode precursors; automatic clustering of modes; and magnetics-pattern characterization of sawtooth pre- and postcursors, edge harmonic oscillations and fishbones.
Next generation system modeling of NTR systems
NASA Technical Reports Server (NTRS)
Buksa, John J.; Rider, William J.
1993-01-01
The topics are presented in viewgraph form and include the following: nuclear thermal rocket (NTR) modeling challenges; current approaches; shortcomings of current analysis methods; future needs; and present steps toward these goals.
Dziak, John J.; Bray, Bethany C.; Zhang, Jieting; Zhang, Minqiang; Lanza, Stephanie T.
2016-01-01
Several approaches are available for estimating the relationship of latent class membership to distal outcomes in latent profile analysis (LPA). A three-step approach is commonly used, but has problems with estimation bias and confidence interval coverage. Proposed improvements include the correction method of Bolck, Croon, and Hagenaars (BCH; 2004), Vermunt's (2010) maximum likelihood (ML) approach, and the inclusive three-step approach of Bray, Lanza, & Tan (2015). These methods have been studied in the related case of latent class analysis (LCA) with categorical indicators, but are less well studied for LPA with continuous indicators. We investigated the performance of these approaches in LPA with normally distributed indicators, under different conditions of distal outcome distribution, class measurement quality, relative latent class size, and strength of association between latent class and the distal outcome. The modified BCH implemented in Latent GOLD had excellent performance. The maximum likelihood and inclusive approaches were not robust to violations of distributional assumptions. These findings broadly agree with and extend the results presented by Bakk and Vermunt (2016) in the context of LCA with categorical indicators. PMID:28630602
Expectation-Based Control of Noise and Chaos
NASA Technical Reports Server (NTRS)
Zak, Michael
2006-01-01
A proposed approach to control of noise and chaos in dynamic systems would supplement conventional methods. The approach is based on fictitious forces composed of expectations governed by Fokker-Planck or Liouville equations that describe the evolution of the probability densities of the controlled parameters. These forces would be utilized as feedback control forces that would suppress the undesired diffusion of the controlled parameters. Examples of dynamic systems in which the approach is expected to prove beneficial include spacecraft, electronic systems, and coupled lasers.
NASA Astrophysics Data System (ADS)
Sahoo, Madhumita; Sahoo, Satiprasad; Dhar, Anirban; Pradhan, Biswajeet
2016-10-01
Groundwater vulnerability assessment has been an accepted practice to identify zones with relatively increased potential for groundwater contamination. DRASTIC is the most popular secondary-information-based vulnerability assessment approach. The original DRASTIC approach assigns the relative importance of features/sub-features through subjective weighting/rating values. However, variability of features at a smaller scale is not reflected in this subjective vulnerability assessment process. In contrast to the subjective approach, objective weighting-based methods provide flexibility in weight assignment depending on the variation of the local system. However, experts' opinion is not directly considered in the objective weighting-based methods. Thus the effectiveness of both subjective and objective weighting-based approaches needs to be evaluated. In the present study, three methods - the entropy information method (E-DRASTIC), the fuzzy pattern recognition method (F-DRASTIC) and single parameter sensitivity analysis (SA-DRASTIC) - were used to modify the weights of the original DRASTIC features to include local variability. Moreover, a grey incidence analysis was used to evaluate the relative performance of subjective (DRASTIC and SA-DRASTIC) and objective (E-DRASTIC and F-DRASTIC) weighting-based methods. The performance of the developed methodology was tested in an urban area of Kanpur City, India. The relative performance of the subjective and objective methods varies with the choice of water quality parameters. The methodology can be applied elsewhere, with or without modification. These evaluations establish the potential applicability of the methodology for general vulnerability assessment in urban contexts.
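A minimal sketch of the entropy information weighting step (the E-DRASTIC variant): objective weights are derived from how unevenly each feature is distributed over the mapped cells. The rating matrix here is a placeholder; in practice its rows are grid cells and its columns the DRASTIC features.

```python
import numpy as np

def entropy_weights(ratings):
    """Objective feature weights by the entropy information method: normalize
    each column, compute its Shannon entropy, and weight features by their
    degree of diversification (1 - entropy), renormalized to sum to one."""
    P = ratings / ratings.sum(axis=0)
    P = np.where(P <= 0, 1e-12, P)                    # guard against log(0)
    k = 1.0 / np.log(ratings.shape[0])
    H = -k * (P * np.log(P)).sum(axis=0)              # entropy of each feature, in [0, 1]
    d = 1.0 - H
    return d / d.sum()

# Example: 7 DRASTIC features rated over 1000 cells (random placeholder data)
ratings = np.random.default_rng(0).integers(1, 11, size=(1000, 7)).astype(float)
print(entropy_weights(ratings))
```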
NASA Astrophysics Data System (ADS)
Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn
2013-04-01
With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches treats the impact of high flows in hydrological modeling well. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, as measured by the Nash-Sutcliffe efficiency, and provides the best uncertainty estimates for low, medium and entire flows compared to standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via Bayesian methods.
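For readers unfamiliar with the sampling machinery, the following is a generic random-walk Metropolis-Hastings sketch of the kind used to explore model parameter posteriors; the log-posterior (likelihood times prior, e.g. one of the three AR(1)-based formulations) is supplied by the user, and nothing here is specific to WASMOD.

```python
import numpy as np

def metropolis_hastings(log_post, theta0, n_iter=5000, step=0.1, seed=0):
    """Generic random-walk Metropolis-Hastings sampler: propose a Gaussian
    perturbation of the current parameters and accept it with probability
    min(1, posterior ratio)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    samples = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        proposal = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(proposal)
        if np.log(rng.random()) < lp_prop - lp:       # accept or reject
            theta, lp = proposal, lp_prop
        samples[i] = theta
    return samples

# Example with a standard normal target density
draws = metropolis_hastings(lambda t: -0.5 * np.sum(t**2), np.zeros(2), step=0.5)
```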
Extended Hansen solubility approach: naphthalene in individual solvents.
Martin, A; Wu, P L; Adjei, A; Beerbower, A; Prausnitz, J M
1981-11-01
A multiple regression method using the Hansen partial solubility parameters, delta D, delta P, and delta H, was used to reproduce the solubilities of naphthalene in pure polar and nonpolar solvents and to predict its solubility in untested solvents. The method, called the extended Hansen approach, was compared with the extended Hildebrand solubility approach and the universal-functional-group-activity-coefficient (UNIFAC) method. The Hildebrand regular solution theory was also used to calculate naphthalene solubility. Naphthalene, an aromatic molecule having no side chains or functional groups, is "well-behaved", i.e., its solubility in active solvents known to interact with drug molecules is fairly regular. Because of its simplicity, naphthalene is a suitable solute with which to initiate the difficult study of solubility phenomena. The three methods tested (Hildebrand regular solution theory was introduced only for comparison of solubilities in regular solution) yielded similar results, reproducing naphthalene solubilities within approximately 30% of literature values. In some cases, however, the error was considerably greater. The UNIFAC calculation is superior in that it requires only the solute's heat of fusion, the melting point, and a knowledge of the chemical structures of solute and solvent. The extended Hansen and extended Hildebrand methods need experimental solubility data on which to carry out regression analysis. The extended Hansen approach was the method of second choice because of its adaptability to solutes and solvents from various classes. Sample calculations are included to illustrate methods of predicting solubilities in untested solvents at various temperatures. The UNIFAC method was successful in this regard.
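A minimal sketch of the regression step behind an extended-Hansen-style fit: the logarithm of the solubility ratio (or activity coefficient term) is regressed on the solvents' partial solubility parameters. The quadratic design used here is an assumed form for illustration; the published fits use specific term combinations and weights not reproduced here.

```python
import numpy as np

def extended_hansen_fit(delta_d, delta_p, delta_h, log_activity_term):
    """Least-squares fit of a solubility response to solvent partial solubility
    parameters (dispersion, polar, hydrogen bonding) and their squares."""
    X = np.column_stack([np.ones_like(delta_d),
                         delta_d, delta_d**2,
                         delta_p, delta_p**2,
                         delta_h, delta_h**2])
    coef, *_ = np.linalg.lstsq(X, log_activity_term, rcond=None)
    return coef   # apply to a new solvent's parameters to predict solubility

# Prediction for an untested solvent: build the same design row from its
# delta values and take the dot product with the fitted coefficients.
```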
New diagnostic methods for pneumonia in the ICU.
Douglas, Ivor S
2016-04-01
Pneumonia leading to severe sepsis and critical illness including respiratory failure remains a common and therapeutically challenging diagnosis. Current clinical approaches to surveillance, early detection, and conventional culture-based microbiology are inadequate for optimal targeted antibiotic treatment and stewardship. Efforts to enhance diagnosis of community-acquired and health care-acquired pneumonia, including ventilator-associated pneumonia (VAP), are the focus of recent studies reviewed here. Newer surveillance definitions are sensitive for pneumonia in the ICU including VAP but consistently underdetect patients that are clinically shown to have bacterial VAP based on clinical diagnostic criteria and response to antibiotic treatment. Routinely measured plasma biomarkers, including procalcitonin and C-reactive protein, lack sufficient precision and predictive accuracy to inform diagnosis. Novel rapid microbiological diagnostics, including nucleic-acid amplification, mass spectrometry, and fluorescence microscopy-based technologies are promising approaches for the future. Exhaled breath biomarkers, including measurement of volatile organic compounds, represent a future approach. The integration of novel diagnostics for rapid microbial identification, resistance phenotyping, and antibiotic sensitivity testing into usual care practice could significantly transform the care of patients and potentially inform significantly improved targeted antimicrobial selection, de-escalation, and stewardship.
Hierarchical Adaptive Regression Kernels for Regression with Functional Predictors.
Woodard, Dawn B; Crainiceanu, Ciprian; Ruppert, David
2013-01-01
We propose a new method for regression using a parsimonious and scientifically interpretable representation of functional predictors. Our approach is designed for data that exhibit features such as spikes, dips, and plateaus whose frequency, location, size, and shape varies stochastically across subjects. We propose Bayesian inference of the joint functional and exposure models, and give a method for efficient computation. We contrast our approach with existing state-of-the-art methods for regression with functional predictors, and show that our method is more effective and efficient for data that include features occurring at varying locations. We apply our methodology to a large and complex dataset from the Sleep Heart Health Study, to quantify the association between sleep characteristics and health outcomes. Software and technical appendices are provided in online supplemental materials.
NASA Astrophysics Data System (ADS)
Kuntman, Ertan; Canillas, Adolf; Arteaga, Oriol
2017-11-01
Experimental Mueller matrices contain a certain amount of uncertainty in their elements, and these uncertainties can create difficulties for decomposition methods based on analytic solutions. In an earlier paper [1], we proposed a decomposition method for depolarizing Mueller matrices by using certain symmetry conditions. However, because of experimental error, that method creates over-determined systems with non-unique solutions. Here we propose a least-squares minimization approach in order to improve the accuracy of our results. In this method, we take into account the number of independent parameters of the corresponding symmetry and the rank constraints on the component matrices to decide on our fitting model. This approach is illustrated with experimental Mueller matrices that include material media with different Mueller symmetries.
Modeling and Simulation of Nanoindentation
NASA Astrophysics Data System (ADS)
Huang, Sixie; Zhou, Caizhi
2017-11-01
Nanoindentation is a hardness test method applied to small volumes of material that can reveal unique small-scale effects and has sparked many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with those computational approaches, because of their length scale, predictive capability, and accuracy. This article reviews recent progress and challenges for modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method, and discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand the length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.
Teaching to Strengths: Engaging Young Boys in Learning
ERIC Educational Resources Information Center
Johnson, Cynthia; Gooliaff, Shauna
2013-01-01
Traditional teaching methods often fail to engage male students in learning. The purpose of this research was to increase student engagement in the story writing process and increase self-confidence in boys at risk. A qualitative approach included student surveys as well as teacher journaling and portfolios (including e-portfolios). The student…
ERIC Educational Resources Information Center
Herrmann, Thom, Ed.
The 35 papers included in these proceedings report on innovative approaches to teaching used by faculty members at Ontario's technical colleges and universities. Included in this collection are papers on optimum instructional methods using microcomputers, teaching French conversational classes through drama, competencies for the educational…
USDA-ARS?s Scientific Manuscript database
The objective of this analysis is to estimate and compare the cost-effectiveness of on- and off-field approaches to reducing nitrogen loadings. On-field practices include improving the timing, rate, and method of nitrogen application. Off-field practices include restoring wetlands and establishing v...
Research Methods in Health, Physical Education, and Recreation. Third Revised Edition.
ERIC Educational Resources Information Center
Hubbard, Alfred W., Ed.
This book presents new ideas and approaches in research techniques in the areas of health, physical education, and recreation. Part 1, the introduction, includes two articles, which are "Why This Research?" by Arthur H. Steinhaus and "Overview of Research: Basic Principles" by Benjamin H. Massey. Part 2 discusses preparations and includes the…
ERIC Educational Resources Information Center
Weiss, Heather B.; Little, Priscilla M. D.
2008-01-01
Heather B. Weiss and Priscilla D. Little of the Harvard Family Research Project suggest seven possible approaches to strengthening OST (Out-of-School) organizations, including methods to ensure that OST providers become stronger partners with other groups and more adept advocates for their field. Strategies discussed include: (1) Cultivate…
SAW based micro- and acousto-fluidics in biomedicine
NASA Astrophysics Data System (ADS)
Ramasamy, Mouli; Varadan, Vijay K.
2017-04-01
Protein association starts with random collisions of individual proteins. Multiple collisions and rotational diffusion bring the molecules to a state of orientation. The majority of protein associations are influenced by electrostatic interactions. Electrostatic rate enhancement, Brownian dynamics, and transient complex theory have traditionally been used to describe these effects. With recent advances in interdisciplinary sciences, an array of molecular assembly methods is being studied. Protein nanostructural assembly and macromolecular crowding are derived from subsets of biochemistry to study protein-protein interactions and protein self-assembly. This paper investigates how to enhance the protein self-association rate and bridge the gap between simulations and experimental results. The methods proposed here include electrostatic rate enhancement, macromolecular crowding, nanostructural protein assembly, microfluidics-based approaches, and magnetic-force-based approaches. Of the methods suggested, microfluidic and magnetic-force-based approaches seem best suited to serve the needs of protein assembly on a wider scale, and combining these approaches may yield better results. Although these methods are conceptually strong, a wide range of experiments is required to prevent disagreement between theory and practice. This proposal intends to study theoretical and experimental methods to successfully implement the aforementioned assembly strategies, and to conclude with an extensive analysis of experimental data to address practical feasibility.
Goldstein, Benjamin A.; Navar, Ann Marie; Carter, Rickey E.
2017-01-01
Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for development of risk prediction models. Typically presented as black box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider predicting mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction for those working on risk modelling to approach the diffuse field of machine learning. PMID:27436868
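A small sketch of the kind of comparison described above, using synthetic data in place of the electronic health record: a regression baseline and one machine-learning method are scored by cross-validated AUC. Names, sizes, and the data-generating process are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 13))                           # 13 lab markers (synthetic)
p = 1 / (1 + np.exp(-(X[:, 0] - 0.5 * X[:, 1] ** 2)))    # nonlinear mortality risk
y = rng.binomial(1, p)                                   # 1 = died during follow-up

models = {"logistic regression": LogisticRegression(max_iter=1000),
          "random forest": RandomForestClassifier(n_estimators=200, random_state=0)}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: cross-validated AUC = {auc:.2f}")
```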
Extension of a hybrid particle-continuum method for a mixture of chemical species
NASA Astrophysics Data System (ADS)
Verhoff, Ashley M.; Boyd, Iain D.
2012-11-01
Due to the physical accuracy and numerical efficiency achieved by analyzing transitional, hypersonic flow fields with hybrid particle-continuum methods, this paper describes a Modular Particle-Continuum (MPC) method and its extension to include multiple chemical species. Considerations that are specific to a hybrid approach for simulating gas mixtures are addressed, including a discussion of the Chapman-Enskog velocity distribution function (VDF) for near-equilibrium flows, and consistent viscosity models for the individual CFD and DSMC modules of the MPC method. Representative results for a hypersonic blunt-body flow are then presented, where the flow field properties, surface properties, and computational performance are compared for simulations employing full CFD, full DSMC, and the MPC method.
Automated recognition of stratigraphic marker shales from geophysical logs in iron ore deposits
NASA Astrophysics Data System (ADS)
Silversides, Katherine; Melkumyan, Arman; Wyman, Derek; Hatherly, Peter
2015-04-01
The mining of stratiform ore deposits requires a means of determining the location of stratigraphic boundaries. A variety of geophysical logs may provide the required data but, in the case of banded iron formation hosted iron ore deposits in the Hamersley Ranges of Western Australia, only one geophysical log type (natural gamma) is collected for this purpose. The information from these logs is currently processed by slow manual interpretation. In this paper we present an alternative method of automatically identifying recurring stratigraphic markers in natural gamma logs from multiple drill holes. Our approach is demonstrated using natural gamma geophysical logs that contain features corresponding to the presence of stratigraphically important marker shales. The host stratigraphic sequence is highly consistent throughout the Hamersley and the marker shales can therefore be used to identify the stratigraphic location of the banded iron formation (BIF) or BIF hosted ore. The marker shales are identified using Gaussian Processes (GP) trained by either manual or active learning methods and the results are compared to the existing geological interpretation. The manual method involves the user selecting the signatures for improving the library, whereas the active learning method uses the measure of uncertainty provided by the GP to select specific examples for the user to consider for addition. The results demonstrate that both GP methods can identify a feature, but the active learning approach has several benefits over the manual method. These benefits include greater accuracy in the identified signatures, faster library building, and an objective approach for selecting signatures that includes the full range of signatures across a deposit in the library. When using the active learning method, it was found that the current manual interpretation could be replaced in 78.4% of the holes with an accuracy of 95.7%.
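A minimal sketch of the uncertainty-driven (active learning) selection loop, using scikit-learn's Gaussian process classifier as a stand-in for the paper's GP formulation; the feature vectors are assumed to be windows or summaries already extracted from the natural gamma logs.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

def query_most_uncertain(X_library, y_library, X_pool, n_query=5):
    """Fit a GP classifier on the current signature library and return the
    indices of the pool signatures whose predicted marker-shale probability is
    closest to 0.5, i.e. the candidates the interpreter should review next."""
    gp = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
    gp.fit(X_library, y_library)
    p = gp.predict_proba(X_pool)[:, 1]
    return np.argsort(np.abs(p - 0.5))[:n_query]
```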
NASA Astrophysics Data System (ADS)
Marques, Fernando; Queiroz, Sónia; Gouveia, Luís; Vasconcelos, Manuel
2017-12-01
In Portugal, the modifications introduced in 2008 and 2012 to the National Ecological Reserve law (REN) included the mandatory study of slope instability, including slopes, natural scarps, and sea cliffs, at municipal or regional scale, with the purpose of keeping buildings and other structures out of hazardous zones. The law also indicates specific methods for these studies, with different approaches for slope instability, natural scarps and sea cliffs. The methods used to produce the maps required by the REN law, with modifications and improvements to the law-specified methods, were applied to the 71 km2 territory of Almada County and included: 1) slope instability mapping using the statistically based Information Value method, validated against the landslide inventory using ROC curves, which provided an AUC = 0.964; the higher susceptibility zones covering at least 80% of the landslides in the inventory were included in the REN map, and the map underwent a generalization process to overcome the drawbacks of a pixel-based approach; 2) natural scarp mapping, including setback areas near the top defined according to the law, and setback areas near the toe defined by applying the shadow angle calibrated with the major rockfalls that occurred in the study area; 3) sea cliff mapping, including two levels of setback zones near the top and one setback zone at the cliff toe, based on systematic inventories of cliff failures that occurred between 1947 and 2010 in a large-scale regional littoral monitoring project. The paper describes the methods used and the results obtained in this study, which correspond to the final maps of areas to be included in the REN. The results may be considered an example of good practice by municipal authorities in terms of solid, technically and scientifically supported regulation, hazard prevention, and safe and sustainable land-use management.
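As a sketch of the statistically based Information Value step used in item 1, the function below scores each thematic class by the log ratio of its landslide density to the map-wide density; summing the scores of the classes present at each cell gives the susceptibility value that the ROC validation is run against. Inputs are placeholders for the rasterized predisposing-factor maps.

```python
import numpy as np

def information_value(class_ids, landslide_mask):
    """Information Value per class: ln(landslide density within the class /
    landslide density over the whole study area)."""
    overall = landslide_mask.mean()
    scores = {}
    for c in np.unique(class_ids):
        in_class = class_ids == c
        density = landslide_mask[in_class].mean()
        scores[c] = np.log(density / overall) if density > 0 else np.nan
    return scores

# Susceptibility map: sum the IV scores of each factor's class at every cell,
# then validate the ranking against the landslide inventory with a ROC curve.
```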
An approach to achieve progress in spacecraft shielding
NASA Astrophysics Data System (ADS)
Thoma, K.; Schäfer, F.; Hiermaier, S.; Schneider, E.
2004-01-01
Progress in shield design against space debris can be achieved only when a combined approach based on several tools is used. This approach depends on the combined application of advanced numerical methods, specific material models and experimental determination of input parameters for these models. Examples of experimental methods for material characterization are given, covering the range from quasi-static to very high strain rates for materials like Nextel and carbon fiber-reinforced materials. Mesh-free numerical methods have extraordinary capabilities in the simulation of extreme material behaviour, including complete failure with phase changes, combined with shock wave phenomena and the interaction with structural components. In this paper the benefits from combining numerical methods, material modelling and detailed experimental studies for shield design are demonstrated. The following examples are given: (1) Development of a material model for Nextel and Kevlar-Epoxy to enable numerical simulation of hypervelocity impacts on complex heavy protection shields for the International Space Station. (2) The influence of projectile shape on protection performance of Whipple Shields and how experimental problems in accelerating such shapes can be overcome by systematic numerical simulation. (3) The benefits of using metallic foams in "sandwich bumper shields" for spacecraft and how to approach systematic characterization of such materials.
Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P
2014-09-01
Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
Clinical approach to incidental pancreatic cysts
Chiang, Austin L; Lee, Linda S
2016-01-01
The approach to incidentally noted pancreatic cysts is constantly evolving. While surgical resection is indicated for malignant or higher risk cysts, correctly identifying these highest risk pancreatic cystic lesions remains difficult. Using parameters including cyst size, presence of solid components, and pancreatic duct involvement, the 2012 International Association of Pancreatology (IAP) and the 2015 American Gastroenterological Association (AGA) guidelines have sought to identify the higher risk patients who would benefit from further evaluation using endoscopic ultrasound (EUS). Not only can EUS help further assess the presence of solid components and nodules, but fine needle aspiration of cyst fluid also aids in diagnosis by obtaining cellular, molecular, and genetic data. The impact of new endoscopic innovations with novel methods of direct visualization, including confocal endomicroscopy, requires further validation. This review also highlights the differences between the 2012 IAP and 2015 AGA guidelines, which include the thresholds for sending patients for EUS and surgery and the methods, interval, and duration of surveillance for unresected cysts. PMID:26811661
OXIDATIVE TREATMENT OF INDUSTRIAL WASTEWATER
This paper defines industrial waste treatment process as falling into categories of oxidative destruction, reductive destruction, and non-destructive, separation operations. The various oxidative approaches, including biological, chemical and thermal methods, are then discussed i...
Ozturk, Orgul D; McInnes, Melayne M; Blake, Christine E; Frongillo, Edward A; Jones, Sonya J
2016-01-01
The objective of this study is to develop a structured observational method for the systematic assessment of the food-choice architecture that can be used to identify key points for behavioral economic intervention intended to improve the health quality of children's diets. We use an ethnographic approach with observations at twelve elementary schools to construct our survey instrument. Elements of the structured observational method include decision environment, salience, accessibility/convenience, defaults/verbal prompts, number of choices, serving ware/method/packaging, and social/physical eating environment. Our survey reveals important "nudgeable" components of the elementary school food-choice architecture, including precommitment and default options on the lunch line.
D'Abramo, Marco; Aschi, Massimiliano; Amadei, Andrea
2014-04-28
Here, we extend a recently introduced theoretical-computational procedure [M. D'Alessandro, M. Aschi, C. Mazzuca, A. Palleschi, and A. Amadei, J. Chem. Phys. 139, 114102 (2013)] to include quantum vibrational transitions in modelling electronic spectra of atomic molecular systems in condensed phase. The method is based on the combination of Molecular Dynamics simulations and quantum chemical calculations within the Perturbed Matrix Method approach. The main aim of the presented approach is to reproduce as much as possible the spectral line shape which results from a subtle combination of environmental and intrinsic (chromophore) mechanical-dynamical features. As a case study, we were able to model the low energy UV-vis transitions of pyrene in liquid acetonitrile in good agreement with the experimental data.
Beyond Self-Report: Emerging Methods for Capturing Individual Differences in Decision-Making Process
Connors, Brenda L.; Rende, Richard; Colton, Timothy J.
2016-01-01
People vary in the way in which they approach decision-making, which impacts real-world behavior. There has been a surge of interest in moving beyond reliance on self-report measures to capture such individual differences. Particular emphasis has been placed on devising and applying a range of methodologies that include experimental, neuroscience, and observational paradigms. This paper provides a selective review of recent studies that illustrate the methods and yield of these approaches in terms of generating a deeper understanding of decision-making style and the notable differences that can be found across individuals. PMID:26973589
Community-Based Participatory Research and Smoking Cessation Interventions: A Review of the Evidence
Newman, Susan D.; Heath, Janie; Williams, Lovoria B.; Tingen, Martha S.
2011-01-01
This article presents a review of the evidence on the use of community-based participatory research (CBPR) and smoking cessation interventions. An overview of CBPR is provided, along with a description of the search methods and quality scoring. Research questions are explored to determine whether CBPR improves the quality of research methods and community involvement in cessation intervention studies, and what cessation outcomes are achieved when using CBPR approaches. Results of the review are provided along with a comprehensive table summarizing all included studies. Strengths and challenges of the CBPR approach are presented with recommendations for future research. PMID:22289400
Is multicultural psychology a-scientific?: diverse methods for diversity research.
Cauce, Ana Mari
2011-07-01
This article asks, and answers, three separate questions: What is multicultural psychology? What is psychological science? Are multicultural psychology and (empirical/positivist) psychological science incompatible? A brief overview of the history of science is provided, emphasizing the emancipatory impulses behind a modernist, empirical, positivist approach to science. It is argued that such an approach is not incompatible with multicultural psychology. The author concludes that multicultural psychology will be strengthened if psychologists draw upon both qualitative and quantitative methods, including those that come from a positivist tradition, when investigating psychological and social issues as they affect diverse populations.
Camargo Plazas, Maria del Pilar; Cameron, Brenda L
2015-06-01
Many approaches and efforts have been used to better understand chronic diseases worldwide. Yet little is known about the meaning of living with chronic illness under the pressures of globalization and neoliberal ideologies. This article uses Freire's participatory educational method as an innovative approach to understanding the multiple dimensions of living with chronic illness, addressing the impact of globalization on the daily life of chronically ill people and thus expanding the body of knowledge in nursing. This qualitative study follows an interpretive inquiry approach and uses a critical hermeneutic phenomenological method and critical research methodologies. Five participants were recruited for the participatory educational activity. Data collection methods included digitally recorded semistructured individual interviews and a session based on Freire's participatory educational method. Data analysis included thematic analysis. Participants reported lacking adequate access to healthcare services because of insurance policies, a general perception that they were an unwanted burden on the healthcare system, and a general lack of government support, advocacy, and political interest. The research activity helped participants gain a new critical perspective on the condition of others with chronic diseases, providing an opportunity to learn about the illnesses and experiences of others and to realize that others experienced the same oppression from the healthcare system. Participants became agents of change within their own families and communities. Chronic diseases have many economic and social consequences for those affected. These findings urge us to move from merely acknowledging the difficulties of people who live with chronic illness in an age of globalization to taking the actions necessary to bring about healthcare, social, and political reform through a process of conscientization and mutual transformation.
Marginalized zero-altered models for longitudinal count data.
Tabb, Loni Philip; Tchetgen, Eric J Tchetgen; Wellenius, Greg A; Coull, Brent A
2016-10-01
Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias.
[Bayesian approach for the cost-effectiveness evaluation of healthcare technologies].
Berchialla, Paola; Gregori, Dario; Brunello, Franco; Veltri, Andrea; Petrinco, Michele; Pagano, Eva
2009-01-01
The development of Bayesian statistical methods for the assessment of the cost-effectiveness of health care technologies is reviewed. Although many studies adopt a frequentist approach, several authors have advocated the use of Bayesian methods in health economics. Emphasis has been placed on the advantages of the Bayesian approach, which include: (i) the ability to make more intuitive and meaningful inferences; (ii) the ability to tackle complex problems, such as allowing for the inclusion of patients who generate no cost, thanks to the availability of powerful computational algorithms; (iii) the importance of a full use of quantitative and structural prior information to produce realistic inferences. Much literature comparing the cost-effectiveness of two treatments is based on the incremental cost-effectiveness ratio. However, new methods aimed at decision making are emerging. These methods are based on a net benefits approach. In the present context, the cost-effectiveness acceptability curves have been pointed out to be intrinsically Bayesian in their formulation. They plot the probability of a positive net benefit against the threshold cost of a unit increase in efficacy. A case study is presented in order to illustrate Bayesian statistics in cost-effectiveness analysis, with emphasis placed on the cost-effectiveness acceptability curves. Advantages and disadvantages of the methods described in this paper are compared with those of frequentist methods and discussed.
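A short sketch of how a cost-effectiveness acceptability curve is computed from posterior (or bootstrap) draws of incremental cost and incremental effectiveness; the draws here are placeholders for the output of the Bayesian model.

```python
import numpy as np

def acceptability_curve(delta_cost, delta_effect, thresholds):
    """For each willingness-to-pay value lambda, the probability that the
    incremental net benefit lambda * delta_effect - delta_cost is positive."""
    dc = np.asarray(delta_cost)[:, None]
    de = np.asarray(delta_effect)[:, None]
    lam = np.asarray(thresholds)[None, :]
    return (lam * de - dc > 0).mean(axis=0)

# Example with simulated posterior draws
rng = np.random.default_rng(0)
curve = acceptability_curve(rng.normal(500, 200, 4000),     # incremental cost draws
                            rng.normal(0.05, 0.03, 4000),   # incremental effect draws
                            np.linspace(0, 50000, 101))     # willingness-to-pay values
```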
Review of Test Theory and Methods.
1981-01-01
literature, although some books, technical reports, and unpublished literature have been included where relevant. The focus of the review is on practical... 1977) and Abu-Sayf (1977) developed new versions of formula scores, and Molenaar (1977) took a Bayesian approach to correcting for random guessing. The... Snow's (1977) book on aptitude and instructional methods is a landmark review of the research on the interaction between instructional methods and
ERIC Educational Resources Information Center
Berge, Analia
2006-01-01
Burn (2005) proposes a "genetic approach" to teaching limits of numerical sequences. The article includes an explanation of the Method of Exhaustion, a generalization of this method, and a description of how this method was used for obtaining areas and lengths in the seventeenth century. The author uses these historical and mathematical analyses…
Growth of saprotrophic fungi and bacteria in soil.
Rousk, Johannes; Bååth, Erland
2011-10-01
Bacterial and fungal growth rate measurements are sensitive variables to detect changes in environmental conditions. However, while considerable progress has been made in methods to assess the species composition and biomass of fungi and bacteria, information about growth rates remains surprisingly rudimentary. We review the recent history of approaches to assess bacterial and fungal growth rates, leading up to current methods, especially focusing on leucine/thymidine incorporation to estimate bacterial growth and acetate incorporation into ergosterol to estimate fungal growth. We present the underlying assumptions for these methods, compare estimates of turnover times for fungi and bacteria based on them, and discuss issues, including, for example, elusive conversion factors. We review what the application of fungal and bacterial growth rate methods has revealed regarding the influence of the environmental factors of temperature, moisture (including drying/rewetting) and pH, as well as the influence of substrate additions, the presence of plants, and toxins. We highlight experiments exploring the competitive and facilitative interactions between bacteria and fungi that growth rate methods have enabled. Finally, we predict that growth methods will be an important complement to molecular approaches to elucidate fungal and bacterial ecology, and we identify methodological concerns and how they should be addressed. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.
Noar, Seth M; Mehrotra, Purnima
2011-03-01
Traditional theory testing commonly applies cross-sectional (and occasionally longitudinal) survey research to test health behavior theory. Since such correlational research cannot demonstrate causality, a number of researchers have called for the increased use of experimental methods for theory testing. We introduce the multi-methodological theory-testing (MMTT) framework for testing health behavior theory. The MMTT framework introduces a set of principles that broaden the perspective of how we view evidence for health behavior theory. It suggests that while correlational survey research designs represent one method of testing theory, the weaknesses of this approach demand that complementary approaches be applied. Such approaches include randomized lab and field experiments, mediation analysis of theory-based interventions, and meta-analysis. These alternative approaches to theory testing can demonstrate causality in a much more robust way than is possible with correlational survey research methods. Such approaches should thus be increasingly applied in order to more completely and rigorously test health behavior theory. Greater application of research derived from the MMTT may lead researchers to refine and modify theory and ultimately make theory more valuable to practitioners. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Pursiainen, S; Vorwerk, J; Wolters, C H
2016-12-21
The goal of this study is to develop focal, accurate and robust finite element method (FEM) based approaches which can predict the electric potential on the surface of the computational domain given its structure and internal primary source current distribution. In EEG evaluation, placing source currents within the geometrically complex grey matter compartment is a challenging but necessary task for avoiding forward errors attributable to tissue conductivity jumps. Here, this task is approached via a mathematically rigorous formulation, in which the current field is modeled via divergence conforming H(div) basis functions. Both linear and quadratic functions are used, while the potential field is discretized via the standard linear Lagrangian (nodal) basis. The resulting model includes dipolar sources which are interpolated into a random set of positions and orientations utilizing two alternative approaches: the position based optimization (PBO) and the mean position/orientation (MPO) method. The results demonstrate that the present dipolar approach can reach or even surpass, at least in some respects, the accuracy of two classical reference methods, the partial integration (PI) and St. Venant (SV) approaches, which utilize monopolar loads instead of dipolar currents.
NASA Astrophysics Data System (ADS)
WANG, D.; Wang, Y.; Zeng, X.
2017-12-01
Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, Wavelet De-noising (WD) combined with Rank-Set Pair Analysis (RSPA), that draws on the strengths of both techniques to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influence at representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method; Artificial Neural Networks (ANNs), including error Back Propagation (BP), Multilayer Perceptron (MLP) and Radial Basis Function (RBF) networks; and RSPA alone. Nine error metrics are used to evaluate model performance. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when extreme events are included within a time series.
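For readers unfamiliar with the WD step, the following is a minimal sketch of wavelet de-noising only (the RSPA forecasting component is not shown), assuming the PyWavelets library; the wavelet family, decomposition level, and universal soft threshold are illustrative choices rather than the settings used in the paper.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(series, wavelet="db4", level=3):
    """Decompose, soft-threshold the detail coefficients, and reconstruct."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
    thresh = sigma * np.sqrt(2.0 * np.log(len(series)))     # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                              for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(series)]

# Hypothetical usage on a noisy seasonal runoff-like series:
t = np.arange(240)
noisy = np.sin(2.0 * np.pi * t / 12.0) + 0.3 * np.random.randn(t.size)
smoothed = wavelet_denoise(noisy)
```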
Computer aided analysis and optimization of mechanical system dynamics
NASA Technical Reports Server (NTRS)
Haug, E. J.
1984-01-01
The purpose is to outline a computational approach to the spatial dynamics of mechanical systems that substantially enlarges the scope of consideration to include flexible bodies, feedback control, hydraulics, and related interdisciplinary effects. Design sensitivity analysis and optimization are the ultimate goal. The approach to computer generation and solution of the system dynamic equations is outlined, together with graphical methods for creating animations as output.
ERIC Educational Resources Information Center
Dade County Public Schools, Miami, FL.
Performance objectives are stated for each of the three secondary school units included in this package prepared for the Dade County Florida Quinmester Program. The units all concern some aspect of instruction in scientific method. "The Scientific Approach to Solving Problems" introduces students to the use of experimental testing of…
Plug Load Behavioral Change Demonstration Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Metzger, I.; Kandt, A.; VanGeet, O.
2011-08-01
This report documents the methods and results of a plug load study of the Environmental Protection Agency's Region 8 Headquarters in Denver, Colorado, conducted by the National Renewable Energy Laboratory. The study quantified the effect of mechanical and behavioral change approaches on plug load energy reduction and identified effective ways to reduce plug load energy. Load reduction approaches included automated energy management systems and behavioral change strategies.
ERIC Educational Resources Information Center
Stauffer, Mary
2008-01-01
This article describes an unconventional method to teach un-contracted braille reading and writing skills to students who are blind and have additional disabilities. It includes a keyboarding curriculum that focuses on the whole language approach to literacy. A special feature is the keyboard that is adapted with braille symbols. Un-contracted…
ERIC Educational Resources Information Center
Lieberman, Marcus
The growing number of value clarification curriculum materials is an indication that moral education is becoming a major focal point of curriculum. This study looks at one social studies course that includes both a moral development component and an inquiry approach. The hypotheses of the study are that students will show significant growth in (1)…
ERIC Educational Resources Information Center
Marans, Robert W.; Edelstein, Jack Y.
2010-01-01
Purpose: The purpose of this paper is to determine the behaviors, attitudes, and levels of understanding among faculty, staff, and students in efforts to design programs aimed at reducing energy use in University of Michigan (UM) buildings. Design/methodology/approach: A multi-method approach is used in five diverse pilot buildings including focus…
ERIC Educational Resources Information Center
Castro-Schilo, Laura; Ferrer, Emilio
2013-01-01
We illustrate the idiographic/nomothetic debate by comparing 3 approaches to using daily self-report data on affect for predicting relationship quality and breakup. The 3 approaches included (a) the first day in the series of daily data; (b) the mean and variability of the daily series; and (c) parameters from dynamic factor analysis, a…
Isospin Breaking Corrections to the HVP with Domain Wall Fermions
NASA Astrophysics Data System (ADS)
Boyle, Peter; Guelpers, Vera; Harrison, James; Juettner, Andreas; Lehner, Christoph; Portelli, Antonin; Sachrajda, Christopher
2018-03-01
We present results for the QED and strong isospin breaking corrections to the hadronic vacuum polarization using Nf = 2 + 1 Domain Wall fermions. QED is included in an electro-quenched setup using two different methods, a stochastic and a perturbative approach. Results and statistical errors from both methods are directly compared with each other.
Charting the Learning Journey of a Group of Adults Returning to Education
ERIC Educational Resources Information Center
Mooney, Des
2011-01-01
Using a qualitative case study method, the researcher studied a group of adult returning students completing a childcare course. Methods used included focus groups, a questionnaire, and observations. Using a holistic analysis approach to the case (Yin 2003), the researcher then focused on a number of key issues. From this analysis the themes of…
Analysis of High School English Curriculum Materials through Rasch Measurement Model and Maxqda
ERIC Educational Resources Information Center
Batdi, Veli; Elaldi, Senel
2016-01-01
The purpose of the study is to analyze high school English curriculum materials (ECM) through FACETS analysis and MAXQDA-11 programs. A mixed methods approach, combining quantitative and qualitative methods, was used with three samples, including English teachers in Elazig during the 2014-2015 academic year. While the quantitative phase of the study…
The quest for methods to identify longleaf pine stump relicts in Southeastern Virginia
Thomas L. Eberhardt; Philip M. Sheridan; Chi-Leung So; Arvind A.R. Bhuta; Karen G. Reed
2015-01-01
The discovery of lightwood and turpentine stumps in southeastern Virginia raised questions about the true historical range for longleaf pine (Pinus palustris Mill.). Several investigative studies were therefore carried out to develop a method to determine the taxa of these relicts. Chemical approaches included the use of near infrared (NIR) spectroscopy coupled with...
ERIC Educational Resources Information Center
Lafayette, R. C.
1991-01-01
A discussion of the Total Physical Response method of second language instruction places the concept within the context of other unconventional language learning methods, reviews the rationale behind the approach, and outlines the classroom procedures used. A sampling of useful commands for classroom use is included. (19 references) (MSE)
2005-04-01
This book represents an extensive revision and updating of the 1997 first edition. It includes five new chapters commissioned for this volume. It is intended to be a comprehensive but accessible guide to 'a variety of methodological approaches to qualitative research'.
Methods of recovering alkali metals
Krumhansl, James L; Rigali, Mark J
2014-03-04
Approaches for alkali metal extraction, sequestration, and recovery are described. For example, a method of recovering alkali metals includes providing a CST or CST-like (e.g., small pore zeolite) material. An alkali metal species is scavenged from a liquid mixture by the CST or CST-like material and is subsequently extracted from that material.
A smoothed residual based goodness-of-fit statistic for nest-survival models
Rodney X. Sturdivant; Jay J. Rotella; Robin E. Russell
2008-01-01
Estimating nest success and identifying important factors related to nest-survival rates is an essential goal for many wildlife researchers interested in understanding avian population dynamics. Advances in statistical methods have led to a number of estimation techniques and modeling approaches for this problem. Recently developed models allow researchers to include a...
ERIC Educational Resources Information Center
Gittelsohn, Joel; Steckler, Allan; Johnson, Carolyn C.; Pratt, Charlotte; Grieser, Mira; Pickrel, Julie; Stone, Elaine J.; Conway, Terry; Coombs, Derek; Staten, Lisa K.
2006-01-01
Formative research uses qualitative and quantitative methods to provide information for researchers to plan intervention programs. Gaps in the formative research literature include how to define goals, implementation plans, and research questions; select methods; analyze data; and develop interventions. The National Heart, Lung, and Blood…
Clowne Science Scheme--A Method Based Course for the Early Years in Secondary Schools
ERIC Educational Resources Information Center
Burden, I. J.; And Others
1975-01-01
Describes a two-year course sequence that is team taught and theme centered. Themes include the earth, the senses, time, and rate of change. The teaching method is the discovery approach and the role of the teacher is outlined. Explains student assessment and outlines problems and observations related to the program. (GS)
Theoretical Significance in Q Methodology: A Qualitative Approach to a Mixed Method
ERIC Educational Resources Information Center
Ramlo, Susan
2015-01-01
Q methodology (Q) has offered researchers a unique scientific measure of subjectivity since William Stephenson's first article in 1935. Q's focus on subjectivity includes self-referential meaning and interpretation. Q is most often identified with its technique (Q-sort) and its method (factor analysis to group people); yet, it consists of a…
ERIC Educational Resources Information Center
Cinici, Ayhan
2016-01-01
The aim of my study was to explore the nature of changes in pre-service science teachers' (PSTs') self-efficacy beliefs toward science teaching through a mixed-methods approach. Thirty-six participants enrolled in a science methods course that included a collaborative peer microteaching ("Cope-M"). Participants' science teaching…
Theoretical Methods of Domain Structures in Ultrathin Ferroelectric Films: A Review
Liu, Jianyi; Chen, Weijin; Wang, Biao; Zheng, Yue
2014-01-01
This review covers methods and recent developments in the theoretical study of domain structures in ultrathin ferroelectric films. The review begins with an introduction to some basic concepts and theories relevant to the study of domain structures in ultrathin ferroelectric films (e.g., polarization and its modern theory, ferroelectric phase transitions, domain formation, and finite size effects). Basic techniques and recent progress of a variety of important approaches for domain structure simulation are then elaborated, including first-principles calculation, molecular dynamics, Monte Carlo simulation, the effective Hamiltonian approach and phase field modeling, as well as multiscale simulation. For each approach, its important features and relative merits over other approaches for modeling domain structures in ultrathin ferroelectric films are discussed. Finally, we review recent theoretical studies on some important issues of domain structures in ultrathin ferroelectric films, with an emphasis on the effects of interfacial electrostatics, boundary conditions and external loads. PMID:28788198
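As a toy illustration of domain formation (an assumed Ising-like lattice model, deliberately much simpler than the first-principles, effective Hamiltonian, Monte Carlo, or phase field methods the review covers), the sketch below runs a Metropolis simulation of up/down polarization on a 2D grid; the nearest-neighbor coupling J favors aligned domains and the field term E biases them.

```python
import numpy as np

def metropolis_domains(n=64, steps=200_000, J=1.0, E=0.0, kT=1.5, seed=0):
    """Metropolis Monte Carlo on an n x n lattice of +/-1 'polarizations'."""
    rng = np.random.default_rng(seed)
    p = rng.choice([-1, 1], size=(n, n))
    for _ in range(steps):
        i, j = rng.integers(n, size=2)
        neighbors = (p[(i + 1) % n, j] + p[(i - 1) % n, j]
                     + p[i, (j + 1) % n] + p[i, (j - 1) % n])
        dE = 2 * p[i, j] * (J * neighbors + E)   # energy cost of flipping site (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / kT):
            p[i, j] *= -1                        # accept the flip
    return p

domains = metropolis_domains()   # inspect or plot the resulting +/-1 pattern
```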