Integrated Sensitivity Analysis Workflow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.
2014-08-01
Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.
Perspective: Optical measurement of feature dimensions and shapes by scatterometry
NASA Astrophysics Data System (ADS)
Diebold, Alain C.; Antonelli, Andy; Keller, Nick
2018-05-01
The use of optical scattering to measure feature shape and dimensions, scatterometry, is now routine during semiconductor manufacturing. Scatterometry iteratively improves an optical model structure using simulations that are compared to experimental data from an ellipsometer. These simulations are done using the rigorous coupled wave analysis for solving Maxwell's equations. In this article, we describe the Mueller matrix spectroscopic ellipsometry based scatterometry. Next, the rigorous coupled wave analysis for Maxwell's equations is presented. Following this, several example measurements are described as they apply to specific process steps in the fabrication of gate-all-around (GAA) transistor structures. First, simulations of measurement sensitivity for the inner spacer etch back step of horizontal GAA transistor processing are described. Next, the simulated metrology sensitivity for sacrificial (dummy) amorphous silicon etch back step of vertical GAA transistor processing is discussed. Finally, we present the application of plasmonically active test structures for improving the sensitivity of the measurement of metal linewidths.
ERIC Educational Resources Information Center
Metzger, Isha; Cooper, Shauna M.; Zarrett, Nicole; Flory, Kate
2013-01-01
The current review conducted a systematic assessment of culturally sensitive risk prevention programs for African American adolescents. Prevention programs meeting the inclusion and exclusion criteria were evaluated across several domains: (1) theoretical orientation and foundation; (2) methodological rigor; (3) level of cultural integration; (4)…
Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...
2014-12-31
Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification throughmore » PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.« less
McNeill, Shalene H; Cifelli, Amy M; Roseland, Janet M; Belk, Keith E; Woerner, Dale R; Gehring, Kerri B; Savell, Jeffrey W; Brooks, J Chance; Thompson, Leslie D
2017-08-25
Knowing whether or not a food contains gluten is vital for the growing number of individuals with celiac disease and non-celiac gluten sensitivity. Questions have recently been raised about whether beef from conventionally-raised, grain-finished cattle may contain gluten. To date, basic principles of ruminant digestion have been cited in support of the prevailing expert opinion that beef is inherently gluten-free. For this study, gluten analysis was conducted in beef samples collected using a rigorous nationally representative sampling protocol to determine whether gluten was present. The findings of our research uphold the understanding of the principles of gluten digestion in beef cattle and corroborate recommendations that recognize beef as a naturally gluten-free food.
NASA Astrophysics Data System (ADS)
Wang, Qiqi; Rigas, Georgios; Esclapez, Lucas; Magri, Luca; Blonigan, Patrick
2016-11-01
Bluff body flows are of fundamental importance to many engineering applications involving massive flow separation and in particular the transport industry. Coherent flow structures emanating in the wake of three-dimensional bluff bodies, such as cars, trucks and lorries, are directly linked to increased aerodynamic drag, noise and structural fatigue. For low Reynolds laminar and transitional regimes, hydrodynamic stability theory has aided the understanding and prediction of the unstable dynamics. In the same framework, sensitivity analysis provides the means for efficient and optimal control, provided the unstable modes can be accurately predicted. However, these methodologies are limited to laminar regimes where only a few unstable modes manifest. Here we extend the stability analysis to low-dimensional chaotic regimes by computing the Lyapunov covariant vectors and their associated Lyapunov exponents. We compare them to eigenvectors and eigenvalues computed in traditional hydrodynamic stability analysis. Computing Lyapunov covariant vectors and Lyapunov exponents also enables the extension of sensitivity analysis to chaotic flows via the shadowing method. We compare the computed shadowing sensitivities to traditional sensitivity analysis. These Lyapunov based methodologies do not rely on mean flow assumptions, and are mathematically rigorous for calculating sensitivities of fully unsteady flow simulations.
McNeill, Shalene H.; Cifelli, Amy M.; Roseland, Janet M.; Belk, Keith E.; Gehring, Kerri B.; Brooks, J. Chance; Thompson, Leslie D.
2017-01-01
Knowing whether or not a food contains gluten is vital for the growing number of individuals with celiac disease and non-celiac gluten sensitivity. Questions have recently been raised about whether beef from conventionally-raised, grain-finished cattle may contain gluten. To date, basic principles of ruminant digestion have been cited in support of the prevailing expert opinion that beef is inherently gluten-free. For this study, gluten analysis was conducted in beef samples collected using a rigorous nationally representative sampling protocol to determine whether gluten was present. The findings of our research uphold the understanding of the principles of gluten digestion in beef cattle and corroborate recommendations that recognize beef as a naturally gluten-free food. PMID:28841165
ERIC Educational Resources Information Center
Hudziak, James J.; Althoff, Robert R.; Stanger, Catherine; van Beijsterveldt, C. E. M.; Nelson, Elliot C.; Hanna, Gregory L.; Boomsma, Dorret I.; Todd, Richard D.
2006-01-01
Background: The purpose of this study was to determine a score on the Obsessive Compulsive Scale (OCS) from the Child Behavior Checklist (CBCL) to screen for obsessive compulsive disorder (OCD) in children and to rigorously test the specificity and sensitivity of a single cutpoint. Methods: A receiver operating characteristic (ROC) curve analysis…
Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A
2011-01-01
The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system provide unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through themore » International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated SCALE/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.« less
Assessing Sensitivity of Early Head Start Study Findings to Manipulated Randomization Threats
ERIC Educational Resources Information Center
Green, Sheridan
2013-01-01
Increasing demands for design rigor and an emphasis on evidence-based practice on a national level indicated a need for further guidance related to successful implementation of randomized studies in education. Rigorous and meaningful experimental research and its conclusions help establish a valid theoretical and evidence base for educational…
Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I
2015-01-01
High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®
Spatial Contrast Sensitivity in Adolescents with Autism Spectrum Disorders
ERIC Educational Resources Information Center
Koh, Hwan Cui; Milne, Elizabeth; Dobkins, Karen
2010-01-01
Adolescents with autism spectrum disorders (ASD) and typically developing (TD) controls underwent a rigorous psychophysical assessment that measured contrast sensitivity to seven spatial frequencies (0.5-20 cycles/degree). A contrast sensitivity function (CSF) was then fitted for each participant, from which four measures were obtained: visual…
Tosi, L L; Detsky, A S; Roye, D P; Morden, M L
1987-01-01
Using a decision analysis model, we estimated the savings that might be derived from a mass prenatal screening program aimed at detecting open neural tube defects (NTDs) in low-risk pregnancies. Our baseline analysis showed that screening v. no screening could be expected to save approximately $8 per pregnancy given a cost of $7.50 for the maternal serum alpha-feto-protein (MSAFP) test and a cost of $42,507 for hospital and rehabilitation services for the first 10 years of life for a child with spina bifida. When a more liberal estimate of the costs of caring for such a child was used, the savings with the screening program were more substantial. We performed extensive sensitivity analyses, which showed that the savings were somewhat sensitive to the cost of the MSAFP test and highly sensitive to the specificity (but not the sensitivity) of the test. A screening program for NTDs in low-risk pregnancies may result in substantial savings in direct health care costs if the screening protocol is followed rigorously and efficiently. PMID:2433011
Rigorous electromagnetic simulation applied to alignment systems
NASA Astrophysics Data System (ADS)
Deng, Yunfei; Pistor, Thomas V.; Neureuther, Andrew R.
2001-09-01
Rigorous electromagnetic simulation with TEMPEST is used to provide benchmark data and understanding of key parameters in the design of topographical features of alignment marks. Periodic large silicon trenches are analyzed as a function of wavelength (530-800 nm), duty cycle, depth, slope and angle of incidence. The signals are well behaved except when the trench width becomes about 1 micrometers or smaller. Segmentation of the trenches to form 3D marks shows that a segmentation period of 2-5 wavelengths makes the diffraction in the (1,1) direction about 1/3 to 1/2 of that in the main first order (1,0). Transmission alignment marks nanoimprint lithography using the difference between the +1 and -1 reflected orders showed a sensitivity of the difference signal to misalignment of 0.7%/nm for rigorous simulation and 0.5%/nm for simple ray-tracing. The sensitivity to a slanted substrate indentation was 10 nm off-set per degree of tilt from horizontal.
NASA Astrophysics Data System (ADS)
Razavi, S.; Gupta, H. V.
2014-12-01
Sensitivity analysis (SA) is an important paradigm in the context of Earth System model development and application, and provides a powerful tool that serves several essential functions in modelling practice, including 1) Uncertainty Apportionment - attribution of total uncertainty to different uncertainty sources, 2) Assessment of Similarity - diagnostic testing and evaluation of similarities between the functioning of the model and the real system, 3) Factor and Model Reduction - identification of non-influential factors and/or insensitive components of model structure, and 4) Factor Interdependence - investigation of the nature and strength of interactions between the factors, and the degree to which factors intensify, cancel, or compensate for the effects of each other. A variety of sensitivity analysis approaches have been proposed, each of which formally characterizes a different "intuitive" understanding of what is meant by the "sensitivity" of one or more model responses to its dependent factors (such as model parameters or forcings). These approaches are based on different philosophies and theoretical definitions of sensitivity, and range from simple local derivatives and one-factor-at-a-time procedures to rigorous variance-based (Sobol-type) approaches. In general, each approach focuses on, and identifies, different features and properties of the model response and may therefore lead to different (even conflicting) conclusions about the underlying sensitivity. This presentation revisits the theoretical basis for sensitivity analysis, and critically evaluates existing approaches so as to demonstrate their flaws and shortcomings. With this background, we discuss several important properties of response surfaces that are associated with the understanding and interpretation of sensitivity. Finally, a new approach towards global sensitivity assessment is developed that is consistent with important properties of Earth System model response surfaces.
Prediction of coefficients of thermal expansion for unidirectional composites
NASA Technical Reports Server (NTRS)
Bowles, David E.; Tompkins, Stephen S.
1989-01-01
Several analyses for predicting the longitudinal, alpha(1), and transverse, alpha(2), coefficients of thermal expansion of unidirectional composites were compared with each other, and with experimental data on different graphite fiber reinforced resin, metal, and ceramic matrix composites. Analytical and numerical analyses that accurately accounted for Poisson restraining effects in the transverse direction were in consistently better agreement with experimental data for alpha(2), than the less rigorous analyses. All of the analyses predicted similar values of alpha(1), and were in good agreement with the experimental data. A sensitivity analysis was conducted to determine the relative influence of constituent properties on the predicted values of alpha(1), and alpha(2). As would be expected, the prediction of alpha(1) was most sensitive to longitudinal fiber properties and the prediction of alpha(2) was most sensitive to matrix properties.
Genetic and environmental effects on the muscle structure response post-mortem.
Thompson, J M; Perry, D; Daly, B; Gardner, G E; Johnston, D J; Pethick, D W
2006-09-01
This paper reviewed the mechanisms by which glycolytic rate and pre-rigor stretching of muscle impact on meat quality. If muscle is free to shorten during the rigor process extremes in glycolytic rate can impact negatively on meat quality by inducing either cold or rigor shortening. Factors that contribute to variation in glycolytic rate include the glycogen concentration at slaughter and fibre type of the muscle. Glycolysis is highly sensitive to temperature, which is an important factor in heavy grain fed carcasses. An alternative solution to controlling glycolysis is to stretch the muscle pre-rigor so that it cannot shorten, thus providing an insurance against extremes in processing conditions. Results are presented which show a large reduction in variance (both additive and phenotypic) in tenderness caused by pre-rigor stretching. Whilst this did not impact on the heritability of shear force, it did reduce genotype differences. The implications of these results on the magnitude of genotype effects on tenderness is discussed.
NASA Astrophysics Data System (ADS)
Cuntz, Matthias; Mai, Juliane; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2015-08-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
NASA Astrophysics Data System (ADS)
Mai, Juliane; Cuntz, Matthias; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2016-04-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
Chan, T M Simon; Teram, Eli; Shaw, Ian
2017-01-01
Despite growing consideration of the needs of research participants in studies related to sensitive issues, discussions of alternative ways to design sensitive research are scarce. Structured as an exchange between two researchers who used different approaches in their studies with childhood sexual abuse survivors, in this article, we seek to advance understanding of methodological and ethical issues in designing sensitive research. The first perspective, which is termed protective, promotes the gradual progression of participants from a treatment phase into a research phase, with the ongoing presence of a researcher and a social worker in both phases. In the second perspective, which is termed minimalist, we argue for clear boundaries between research and treatment processes, limiting the responsibility of researchers to ensuring that professional support is available to participants who experience emotional difficulties. Following rebuttals, lessons are drawn for ethical balancing between methodological rigor and the needs of participants. © The Author(s) 2015.
From screening to synthesis: using nvivo to enhance transparency in qualitative evidence synthesis.
Houghton, Catherine; Murphy, Kathy; Meehan, Ben; Thomas, James; Brooker, Dawn; Casey, Dympna
2017-03-01
To explore the experiences and perceptions of healthcare staff caring for people with dementia in the acute setting. This article focuses on the methodological process of conducting framework synthesis using nvivo for each stage of the review: screening, data extraction, synthesis and critical appraisal. Qualitative evidence synthesis brings together many research findings in a meaningful way that can be used to guide practice and policy development. For this purpose, synthesis must be conducted in a comprehensive and rigorous way. There has been previous discussion on how using nvivo can assist in enhancing and illustrate the rigorous processes involved. Qualitative framework synthesis. Twelve documents, or research reports, based on nine studies, were included for synthesis. The benefits of using nvivo are outlined in terms of facilitating teams of researchers to systematically and rigorously synthesise findings. nvivo functions were used to conduct a sensitivity analysis. Some valuable lessons were learned, and these are presented to assist and guide researchers who wish to use similar methods in future. Ultimately, good qualitative evidence synthesis will provide practitioners and policymakers with significant information that will guide decision-making on many aspects of clinical practice. The example provided explored how people with dementia are cared for acute settings. © 2016 The Authors. Journal of Clinical Nursing Published by John Wiley & Sons Ltd.
Rui, Jing; Runge, M Brett; Spinner, Robert J; Yaszemski, Michael J; Windebank, Anthony J; Wang, Huan
2014-10-01
Video-assisted gait kinetics analysis has been a sensitive method to assess rat sciatic nerve function after injury and repair. However, in conduit repair of sciatic nerve defects, previously reported kinematic measurements failed to be a sensitive indicator because of the inferior recovery and inevitable joint contracture. This study aimed to explore the role of physiotherapy in mitigating joint contracture and to seek motion analysis indices that can sensitively reflect motor function. Data were collected from 26 rats that underwent sciatic nerve transection and conduit repair. Regular postoperative physiotherapy was applied. Parameters regarding step length, phase duration, and ankle angle were acquired and analyzed from video recording of gait kinetics preoperatively and at regular postoperative intervals. Stride length ratio (step length of uninjured foot/step length of injured foot), percent swing of the normal paw (percentage of the total stride duration when the uninjured paw is in the air), propulsion angle (toe-off angle subtracted by midstance angle), and clearance angle (ankle angle change from toe off to midswing) decreased postoperatively comparing with baseline values. The gradual recovery of these measurements had a strong correlation with the post-nerve repair time course. Ankle joint contracture persisted despite rigorous physiotherapy. Parameters acquired from a 2-dimensional motion analysis system, that is, stride length ratio, percent swing of the normal paw, propulsion angle, and clearance angle, could sensitively reflect nerve function impairment and recovery in the rat sciatic nerve conduit repair model despite the existence of joint contractures.
Rigorous Science: a How-To Guide.
Casadevall, Arturo; Fang, Ferric C
2016-11-08
Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.
NASA Astrophysics Data System (ADS)
Pirozzi, K. L.; Long, C. J.; McAleer, C. W.; Smith, A. S. T.; Hickman, J. J.
2013-08-01
Rigorous analysis of muscle function in in vitro systems is needed for both acute and chronic biomedical applications. Forces generated by skeletal myotubes on bio-microelectromechanical cantilevers were calculated using a modified version of Stoney's thin-film equation and finite element analysis (FEA), then analyzed for regression to physical parameters. The Stoney's equation results closely matched the more intensive FEA and the force correlated to cross-sectional area (CSA). Normalizing force to measured CSA significantly improved the statistical sensitivity and now allows for close comparison of in vitro data to in vivo measurements for applications in exercise physiology, robotics, and modeling neuromuscular diseases.
Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B
2016-11-01
As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. © The Author(s) 2016.
Alves, Vinicius M.; Muratov, Eugene; Fourches, Denis; Strickland, Judy; Kleinstreuer, Nicole; Andrade, Carolina H.; Tropsha, Alexander
2015-01-01
Repetitive exposure to a chemical agent can induce an immune reaction in inherently susceptible individuals that leads to skin sensitization. Although many chemicals have been reported as skin sensitizers, there have been very few rigorously validated QSAR models with defined applicability domains (AD) that were developed using a large group of chemically diverse compounds. In this study, we have aimed to compile, curate, and integrate the largest publicly available dataset related to chemically-induced skin sensitization, use this data to generate rigorously validated and QSAR models for skin sensitization, and employ these models as a virtual screening tool for identifying putative sensitizers among environmental chemicals. We followed best practices for model building and validation implemented with our predictive QSAR workflow using random forest modeling technique in combination with SiRMS and Dragon descriptors. The Correct Classification Rate (CCR) for QSAR models discriminating sensitizers from non-sensitizers were 71–88% when evaluated on several external validation sets, within a broad AD, with positive (for sensitizers) and negative (for non-sensitizers) predicted rates of 85% and 79% respectively. When compared to the skin sensitization module included in the OECD QSAR toolbox as well as to the skin sensitization model in publicly available VEGA software, our models showed a significantly higher prediction accuracy for the same sets of external compounds as evaluated by Positive Predicted Rate, Negative Predicted Rate, and CCR. These models were applied to identify putative chemical hazards in the ScoreCard database of possible skin or sense organ toxicants as primary candidates for experimental validation. PMID:25560674
Puttarajappa, Chethan; Wijkstrom, Martin; Ganoza, Armando; Lopez, Roberto; Tevar, Amit
2018-01-01
Background Recent studies have reported a significant decrease in wound problems and hospital stay in obese patients undergoing renal transplantation by robotic-assisted minimally invasive techniques with no difference in graft function. Objective Due to the lack of cost-benefit studies on the use of robotic-assisted renal transplantation versus open surgical procedure, the primary aim of our study is to develop a Markov model to analyze the cost-benefit of robotic surgery versus open traditional surgery in obese patients in need of a renal transplant. Methods Electronic searches will be conducted to identify studies comparing open renal transplantation versus robotic-assisted renal transplantation. Costs associated with the two surgical techniques will incorporate the expenses of the resources used for the operations. A decision analysis model will be developed to simulate a randomized controlled trial comparing three interventional arms: (1) continuation of renal replacement therapy for patients who are considered non-suitable candidates for renal transplantation due to obesity, (2) transplant recipients undergoing open transplant surgery, and (3) transplant patients undergoing robotic-assisted renal transplantation. TreeAge Pro 2017 R1 TreeAge Software, Williamstown, MA, USA) will be used to create a Markov model and microsimulation will be used to compare costs and benefits for the two competing surgical interventions. Results The model will simulate a randomized controlled trial of adult obese patients affected by end-stage renal disease undergoing renal transplantation. The absorbing state of the model will be patients' death from any cause. By choosing death as the absorbing state, we will be able simulate the population of renal transplant recipients from the day of their randomization to transplant surgery or continuation on renal replacement therapy to their death and perform sensitivity analysis around patients' age at the time of randomization to determine if age is a critical variable for cost-benefit analysis or cost-effectiveness analysis comparing renal replacement therapy, robotic-assisted surgery or open renal transplant surgery. After running the model, one of the three competing strategies will result as the most cost-beneficial or cost-effective under common circumstances. To assess the robustness of the results of the model, a multivariable probabilistic sensitivity analysis will be performed by modifying the mean values and confidence intervals of key parameters with the main intent of assessing if the winning strategy is sensitive to rigorous and plausible variations of those values. Conclusions After running the model, one of the three competing strategies will result as the most cost-beneficial or cost-effective under common circumstances. To assess the robustness of the results of the model, a multivariable probabilistic sensitivity analysis will be performed by modifying the mean values and confidence intervals of key parameters with the main intent of assessing if the winning strategy is sensitive to rigorous and plausible variations of those values. PMID:29519780
1989-03-03
address global parameter space mapping issues for first order differential equations. The rigorous criteria for the existence of exact lumping by linear projective transformations was also established.
NASA Astrophysics Data System (ADS)
Zhang, Zu-Yin; Wang, Li-Na; Hu, Hai-Feng; Li, Kang-Wen; Ma, Xun-Peng; Song, Guo-Feng
2013-10-01
We investigate the sensitivity and figure of merit (FOM) of a localized surface plasmon (LSP) sensor with gold nanograting on the top of planar metallic film. The sensitivity of the localized surface plasmon sensor is 317 nm/RIU, and the FOM is predicted to be above 8, which is very high for a localized surface plasmon sensor. By employing the rigorous coupled-wave analysis (RCWA) method, we analyze the distribution of the magnetic field and find that the sensing property of our proposed system is attributed to the interactions between the localized surface plasmon around the gold nanostrips and the surface plasmon polarition on the surface of the gold planar metallic film. These findings are important for developing high FOM localized surface plasmon sensors.
Theoretical study of surface plasmon resonance sensors based on 2D bimetallic alloy grating
NASA Astrophysics Data System (ADS)
Dhibi, Abdelhak; Khemiri, Mehdi; Oumezzine, Mohamed
2016-11-01
A surface plasmon resonance (SPR) sensor based on 2D alloy grating with a high performance is proposed. The grating consists of homogeneous alloys of formula MxAg1-x, where M is gold, copper, platinum and palladium. Compared to the SPR sensors based a pure metal, the sensor based on angular interrogation with silver exhibits a sharper (i.e. larger depth-to-width ratio) reflectivity dip, which provides a big detection accuracy, whereas the sensor based on gold exhibits the broadest dips and the highest sensitivity. The detection accuracy of SPR sensor based a metal alloy is enhanced by the increase of silver composition. In addition, the composition of silver which is around 0.8 improves the sensitivity and the quality of SPR sensor of pure metal. Numerical simulations based on rigorous coupled wave analysis (RCWA) show that the sensor based on a metal alloy not only has a high sensitivity and a high detection accuracy, but also exhibits a good linearity and a good quality.
ERIC Educational Resources Information Center
Carlson, Laurie A.; Harper, Kelly S.
2011-01-01
Service provision to gay, lesbian, bisexual, and transgender (GLBT) older adults is a dynamic and sensitive area, requiring rigorous and extensive inquiry and action. Examining the readiness and assets of organizations serving GLBT older adults requires not only heart and sensitivity but also resources and a clear vision. The Community Readiness…
NASA Technical Reports Server (NTRS)
Lewis, Robert Michael; Patera, Anthony T.; Peraire, Jaume
1998-01-01
We present a Neumann-subproblem a posteriori finite element procedure for the efficient and accurate calculation of rigorous, 'constant-free' upper and lower bounds for sensitivity derivatives of functionals of the solutions of partial differential equations. The design motivation for sensitivity derivative error control is discussed; the a posteriori finite element procedure is described; the asymptotic bounding properties and computational complexity of the method are summarized; and illustrative numerical results are presented.
NASA Technical Reports Server (NTRS)
Glytsis, Elias N.; Brundrett, David L.; Gaylord, Thomas K.
1993-01-01
A review of the rigorous coupled-wave analysis as applied to the diffraction of electro-magnetic waves by gratings is presented. The analysis is valid for any polarization, angle of incidence, and conical diffraction. Cascaded and/or multiplexed gratings as well as material anisotropy can be incorporated under the same formalism. Small period rectangular groove gratings can also be modeled using approximately equivalent uniaxial homogeneous layers (effective media). The ordinary and extraordinary refractive indices of these layers depend on the gratings filling factor, the refractive indices of the substrate and superstrate, and the ratio of the freespace wavelength to grating period. Comparisons of the homogeneous effective medium approximations with the rigorous coupled-wave analysis are presented. Antireflection designs (single-layer or multilayer) using the effective medium models are presented and compared. These ultra-short period antireflection gratings can also be used to produce soft x-rays. Comparisons of the rigorous coupled-wave analysis with experimental results on soft x-ray generation by gratings are also included.
Rigorous Science: a How-To Guide
Fang, Ferric C.
2016-01-01
ABSTRACT Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. PMID:27834205
ERIC Educational Resources Information Center
Petrilli, Salvatore John, Jr.
2009-01-01
Historians of mathematics considered the nineteenth century to be the Golden Age of mathematics. During this time period many areas of mathematics, such as algebra and geometry, were being placed on rigorous foundations. Another area of mathematics which experienced fundamental change was analysis. The drive for rigor in calculus began in 1797…
[Rigor mortis -- a definite sign of death?].
Heller, A R; Müller, M P; Frank, M D; Dressler, J
2005-04-01
In the past years an ongoing controversial debate exists in Germany, regarding quality of the coroner's inquest and declaration of death by physicians. We report the case of a 90-year old female, who was found after an unknown time following a suicide attempt with benzodiazepine. The examination of the patient showed livores (mortis?) on the left forearm and left lower leg. Moreover, rigor (mortis?) of the left arm was apparent which prevented arm flexion and extension. The hypothermic patient with insufficient respiration was intubated and mechanically ventilated. Chest compressions were not performed, because central pulses were (hardly) palpable and a sinus bradycardia 45/min (AV-block 2 degrees and sole premature ventricular complexes) was present. After placement of an intravenous line (17 G, external jugular vein) the hemodynamic situation was stabilized with intermittent boli of epinephrine and with sodium bicarbonate. With improved circulation livores and rigor disappeared. In the present case a minimal central circulation was noted, which could be stabilized, despite the presence of certain signs of death ( livores and rigor mortis). Considering the finding of an abrogated peripheral perfusion (livores), we postulate a centripetal collapse of glycogen and ATP supply in the patients left arm (rigor), which was restored after resuscitation and reperfusion. Thus, it appears that livores and rigor are not sensitive enough to exclude a vita minima, in particular in hypothermic patients with intoxications. Consequently a careful ABC-check should be performed even in the presence of apparently certain signs of death, to avoid underdiagnosing a vita minima. Additional ECG- monitoring is required to reduce the rate of false positive declarations of death. To what extent basic life support by paramedics should commence when rigor and livores are present until physician DNR order, deserves further discussion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alves, Vinicius M.; Laboratory for Molecular Modeling, Division of Chemical Biology and Medicinal Chemistry, Eshelman School of Pharmacy, University of North Carolina, Chapel Hill, NC 27599; Muratov, Eugene
Repetitive exposure to a chemical agent can induce an immune reaction in inherently susceptible individuals that leads to skin sensitization. Although many chemicals have been reported as skin sensitizers, there have been very few rigorously validated QSAR models with defined applicability domains (AD) that were developed using a large group of chemically diverse compounds. In this study, we have aimed to compile, curate, and integrate the largest publicly available dataset related to chemically-induced skin sensitization, use this data to generate rigorously validated and QSAR models for skin sensitization, and employ these models as a virtual screening tool for identifying putativemore » sensitizers among environmental chemicals. We followed best practices for model building and validation implemented with our predictive QSAR workflow using Random Forest modeling technique in combination with SiRMS and Dragon descriptors. The Correct Classification Rate (CCR) for QSAR models discriminating sensitizers from non-sensitizers was 71–88% when evaluated on several external validation sets, within a broad AD, with positive (for sensitizers) and negative (for non-sensitizers) predicted rates of 85% and 79% respectively. When compared to the skin sensitization module included in the OECD QSAR Toolbox as well as to the skin sensitization model in publicly available VEGA software, our models showed a significantly higher prediction accuracy for the same sets of external compounds as evaluated by Positive Predicted Rate, Negative Predicted Rate, and CCR. These models were applied to identify putative chemical hazards in the Scorecard database of possible skin or sense organ toxicants as primary candidates for experimental validation. - Highlights: • It was compiled the largest publicly-available skin sensitization dataset. • Predictive QSAR models were developed for skin sensitization. • Developed models have higher prediction accuracy than OECD QSAR Toolbox. • Putative chemical hazards in the Scorecard database were found using our models.« less
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar
2016-01-01
This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic un- certainty, and develops a rigorous methodology for efficient refinement of epistemic un- certainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refine- ment methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then, Bayesian inference and global sensi- tivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level perfor- mance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.
NASA Astrophysics Data System (ADS)
Dehbashi, Reza; Shahabadi, Mahmoud
2013-12-01
The commonly used coordinate transformation for cylindrical cloaks is generalized. This transformation is utilized to determine an anisotropic inhomogeneous diagonal material tensors of a shell type cloak for various material types, i.e., double-positive (DPS: ɛ, μ > 0), double-negative (DNG: ɛ, μ < 0), ɛ-negative (ENG), and μ-negative (MNG). To obtain conditions of perfect cloaking for various material types, a rigorous analysis is performed. It is shown that perfect cloaking will be achieved for same type material for the cloak and its surrounding medium. Moreover, material losses are included in the analysis to demonstrate that perfect cloaking for lossy materials can be achieved for identical loss tangent of the cloak and its surrounding material. Sensitivity of the cloaking performance to losses for different material types is also investigated. The obtained analytical results are verified using a Finite-Element computational analysis.
Walmsley, Christopher W; McCurry, Matthew R; Clausen, Phillip D; McHenry, Colin R
2013-01-01
Finite element analysis (FEA) is a computational technique of growing popularity in the field of comparative biomechanics, and is an easily accessible platform for form-function analyses of biological structures. However, its rapid evolution in recent years from a novel approach to common practice demands some scrutiny in regards to the validity of results and the appropriateness of assumptions inherent in setting up simulations. Both validation and sensitivity analyses remain unexplored in many comparative analyses, and assumptions considered to be 'reasonable' are often assumed to have little influence on the results and their interpretation. HERE WE REPORT AN EXTENSIVE SENSITIVITY ANALYSIS WHERE HIGH RESOLUTION FINITE ELEMENT (FE) MODELS OF MANDIBLES FROM SEVEN SPECIES OF CROCODILE WERE ANALYSED UNDER LOADS TYPICAL FOR COMPARATIVE ANALYSIS: biting, shaking, and twisting. Simulations explored the effect on both the absolute response and the interspecies pattern of results to variations in commonly used input parameters. Our sensitivity analysis focuses on assumptions relating to the selection of material properties (heterogeneous or homogeneous), scaling (standardising volume, surface area, or length), tooth position (front, mid, or back tooth engagement), and linear load case (type of loading for each feeding type). Our findings show that in a comparative context, FE models are far less sensitive to the selection of material property values and scaling to either volume or surface area than they are to those assumptions relating to the functional aspects of the simulation, such as tooth position and linear load case. Results show a complex interaction between simulation assumptions, depending on the combination of assumptions and the overall shape of each specimen. Keeping assumptions consistent between models in an analysis does not ensure that results can be generalised beyond the specific set of assumptions used. Logically, different comparative datasets would also be sensitive to identical simulation assumptions; hence, modelling assumptions should undergo rigorous selection. The accuracy of input data is paramount, and simulations should focus on taking biological context into account. Ideally, validation of simulations should be addressed; however, where validation is impossible or unfeasible, sensitivity analyses should be performed to identify which assumptions have the greatest influence upon the results.
NASA Astrophysics Data System (ADS)
Meliga, Philippe
2017-07-01
We provide in-depth scrutiny of two methods making use of adjoint-based gradients to compute the sensitivity of drag in the two-dimensional, periodic flow past a circular cylinder (Re≲189 ): first, the time-stepping analysis used in Meliga et al. [Phys. Fluids 26, 104101 (2014), 10.1063/1.4896941] that relies on classical Navier-Stokes modeling and determines the sensitivity to any generic control force from time-dependent adjoint equations marched backwards in time; and, second, a self-consistent approach building on the model of Mantič-Lugo et al. [Phys. Rev. Lett. 113, 084501 (2014), 10.1103/PhysRevLett.113.084501] to compute semilinear approximations of the sensitivity to the mean and fluctuating components of the force. Both approaches are applied to open-loop control by a small secondary cylinder and allow identifying the sensitive regions without knowledge of the controlled states. The theoretical predictions obtained by time-stepping analysis reproduce well the results obtained by direct numerical simulation of the two-cylinder system. So do the predictions obtained by self-consistent analysis, which corroborates the relevance of the approach as a guideline for efficient and systematic control design in the attempt to reduce drag, even though the Reynolds number is not close to the instability threshold and the oscillation amplitude is not small. This is because, unlike simpler approaches relying on linear stability analysis to predict the main features of the flow unsteadiness, the semilinear framework encompasses rigorously the effect of the control on the mean flow, as well as on the finite-amplitude fluctuation that feeds back nonlinearly onto the mean flow via the formation of Reynolds stresses. Such results are especially promising as the self-consistent approach determines the sensitivity from time-independent equations that can be solved iteratively, which makes it generally less computationally demanding. We ultimately discuss the extent to which relevant information can be gained from a hybrid modeling computing self-consistent sensitivities from the postprocessing of DNS data. Application to alternative control objectives such as increasing the lift and alleviating the fluctuating drag and lift is also discussed.
McCurry, Matthew R.; Clausen, Phillip D.; McHenry, Colin R.
2013-01-01
Finite element analysis (FEA) is a computational technique of growing popularity in the field of comparative biomechanics, and is an easily accessible platform for form-function analyses of biological structures. However, its rapid evolution in recent years from a novel approach to common practice demands some scrutiny in regards to the validity of results and the appropriateness of assumptions inherent in setting up simulations. Both validation and sensitivity analyses remain unexplored in many comparative analyses, and assumptions considered to be ‘reasonable’ are often assumed to have little influence on the results and their interpretation. Here we report an extensive sensitivity analysis where high resolution finite element (FE) models of mandibles from seven species of crocodile were analysed under loads typical for comparative analysis: biting, shaking, and twisting. Simulations explored the effect on both the absolute response and the interspecies pattern of results to variations in commonly used input parameters. Our sensitivity analysis focuses on assumptions relating to the selection of material properties (heterogeneous or homogeneous), scaling (standardising volume, surface area, or length), tooth position (front, mid, or back tooth engagement), and linear load case (type of loading for each feeding type). Our findings show that in a comparative context, FE models are far less sensitive to the selection of material property values and scaling to either volume or surface area than they are to those assumptions relating to the functional aspects of the simulation, such as tooth position and linear load case. Results show a complex interaction between simulation assumptions, depending on the combination of assumptions and the overall shape of each specimen. Keeping assumptions consistent between models in an analysis does not ensure that results can be generalised beyond the specific set of assumptions used. Logically, different comparative datasets would also be sensitive to identical simulation assumptions; hence, modelling assumptions should undergo rigorous selection. The accuracy of input data is paramount, and simulations should focus on taking biological context into account. Ideally, validation of simulations should be addressed; however, where validation is impossible or unfeasible, sensitivity analyses should be performed to identify which assumptions have the greatest influence upon the results. PMID:24255817
Preoperative identification of a suspicious adnexal mass: a systematic review and meta-analysis.
Dodge, Jason E; Covens, Allan L; Lacchetti, Christina; Elit, Laurie M; Le, Tien; Devries-Aboud, Michaela; Fung-Kee-Fung, Michael
2012-07-01
To systematically review the existing literature in order to determine the optimal strategy for preoperative identification of the adnexal mass suspicious for ovarian cancer. A review of all systematic reviews and guidelines published between 1999 and 2009 was conducted as a first step. After the identification of a 2004 AHRQ systematic review on the topic, searches of MEDLINE for studies published since 2004 were also conducted to update and supplement the evidentiary base. A bivariate, random-effects meta-regression model was used to produce summary estimates of sensitivity and specificity and to plot summary ROC curves with 95% confidence regions. Four meta-analyses and 53 primary studies were included in this review. The diagnostic performance of each technology was compared and contrasted based on the summary data on sensitivity and specificity obtained from the meta-analysis. Results suggest that 3D ultrasonography has both a higher sensitivity and specificity when compared to 2D ultrasound. Established morphological scoring systems also performed with respectable sensitivity and specificity, each with equivalent diagnostic competence. Explicit scoring systems did not perform as well as other diagnostic testing methods. Assessment of an adnexal mass by colour Doppler technology was neither as sensitive nor as specific as simple ultrasonography. Of the three imaging modalities considered, MRI appeared to perform the best, although results were not statistically different from CT. PET did not perform as well as either MRI or CT. The measurement of the CA-125 tumour marker appears to be less reliable than other available assessment methods. The best available evidence was collected and included in this rigorous systematic review and meta-analysis. The abundant evidentiary base provided the context and direction for the diagnosis of early-stage ovarian cancer. Copyright © 2012 Elsevier Inc. All rights reserved.
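A simplified illustration of the pooling step is given below; it applies a univariate DerSimonian-Laird random-effects combination to logit-transformed sensitivities from hypothetical studies, whereas the review itself fitted a bivariate model that pools sensitivity and specificity jointly.

```python
import numpy as np

# Simplified univariate random-effects pooling of logit(sensitivity); hypothetical counts.
tp = np.array([45, 30, 60, 25])   # true positives per study
fn = np.array([5, 10, 8, 6])      # false negatives per study

sens = tp / (tp + fn)
logit = np.log(sens / (1 - sens))
var = 1 / tp + 1 / fn             # approximate variance of logit(sensitivity)

# DerSimonian-Laird between-study variance
w = 1 / var
fixed = np.sum(w * logit) / np.sum(w)
Q = np.sum(w * (logit - fixed) ** 2)
df = len(tp) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)

w_re = 1 / (var + tau2)
pooled_logit = np.sum(w_re * logit) / np.sum(w_re)
pooled_sens = 1 / (1 + np.exp(-pooled_logit))
print(f"pooled sensitivity ~ {pooled_sens:.3f}")
```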
Rigorous coupled wave analysis of acousto-optics with relativistic considerations.
Xia, Guoqiang; Zheng, Weijian; Lei, Zhenggang; Zhang, Ruolan
2015-09-01
A relativistic analysis of acousto-optics is presented, and a rigorous coupled wave analysis is generalized for the diffraction of the acousto-optical effect. An acoustic wave generates a grating with temporally and spatially modulated permittivity, hindering direct applications of the rigorous coupled wave analysis for the acousto-optical effect. In a reference frame which moves with the acoustic wave, the grating is static, the medium moves, and the coupled wave equations for the static grating may be derived. Floquet's theorem is then applied to cast these equations into an eigenproblem. Using a Lorentz transformation, the electromagnetic fields in the grating region are transformed to the lab frame where the medium is at rest, and relativistic Doppler frequency shifts are introduced into various diffraction orders. In the lab frame, the boundary conditions are considered and the diffraction efficiencies of various orders are determined. This method is rigorous and general, and the plane waves in the resulting expansion satisfy the dispersion relation of the medium and are propagation modes. Properties of various Bragg diffractions are results, rather than preconditions, of this method. Simulations of an acousto-optical tunable filter made of paratellurite (TeO2) are given as examples.
Sukumaran, Anuraj T; Holtcamp, Alexander J; Campbell, Yan L; Burnett, Derris; Schilling, Mark W; Dinh, Thu T N
2018-06-07
The objective of this study was to determine the effects of deboning time (pre- and post-rigor), processing steps (grinding - GB; salting - SB; batter formulation - BB), and storage time on the quality of raw beef mixtures and vacuum-packaged cooked sausage, produced using a commercial formulation with 0.25% phosphate. The pH was greater in pre-rigor GB and SB than in post-rigor GB and SB (P < .001). However, deboning time had no effect on metmyoglobin reducing activity, cooking loss, and color of raw beef mixtures. Protein solubility of pre-rigor beef mixtures (124.26 mg/kg) was greater than that of post-rigor beef (113.93 mg/kg; P = .071). TBARS were increased in BB but decreased during vacuum storage of cooked sausage (P ≤ .018). Except for chewiness and saltiness being 52.9 N-mm and 0.3 points greater in post-rigor sausage (P = .040 and .054, respectively), texture profile analysis and trained panelists detected no difference in texture between pre- and post-rigor sausage. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Stacy; English, Shawn; Briggs, Timothy
2016-05-06
Fiber-reinforced composite materials offer light-weight solutions to many structural challenges. In the development of high-performance composite structures, a thorough understanding is required of the composite materials themselves as well as methods for the analysis and failure prediction of the relevant composite structures. However, the mechanical properties required for the complete constitutive definition of a composite material can be difficult to determine through experimentation. Therefore, efficient methods are necessary that can be used to determine which properties are relevant to the analysis of a specific structure and to establish a structure's response to a material parameter that can only be defined through estimation. The objectives of this paper deal with demonstrating the potential value of sensitivity and uncertainty quantification techniques during the failure analysis of loaded composite structures; and the proposed methods are applied to the simulation of the four-point flexural characterization of a carbon fiber composite material. Utilizing a recently implemented, phenomenological orthotropic material model that is capable of predicting progressive composite damage and failure, a sensitivity analysis is completed to establish which material parameters are truly relevant to a simulation's outcome. Then, a parameter study is completed to determine the effect of the relevant material properties' expected variations on the simulated four-point flexural behavior as well as to determine the value of an unknown material property. This process demonstrates the ability to formulate accurate predictions in the absence of a rigorous material characterization effort. Finally, the presented results indicate that a sensitivity analysis and parameter study can be used to streamline the material definition process as the described flexural characterization was used for model validation.
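A minimal version of the screening step described above might look like the sketch below: a one-at-a-time perturbation of each material parameter driving a black-box simulation. The run_simulation function is a stand-in for the finite element analysis, and the parameter names and values are illustrative only.

```python
import numpy as np

# Generic one-at-a-time screen of a black-box structural simulation; run_simulation is a
# placeholder for the actual FE analysis (here a made-up analytic stand-in).
baseline = {"E11": 140e9, "E22": 9e9, "G12": 5e9, "Xt": 2.0e9}

def run_simulation(params):
    # Placeholder response (e.g., predicted flexural failure load); not a real FE model.
    # Parameters absent from this stand-in will show zero effect, mimicking "irrelevant"
    # properties identified by a real sensitivity analysis.
    return 1e-6 * params["E11"] ** 0.5 + 1e-5 * params["Xt"] ** 0.4

base_out = run_simulation(baseline)
for name, value in baseline.items():
    perturbed = dict(baseline, **{name: value * 1.10})      # +10 % perturbation
    rel_change = (run_simulation(perturbed) - base_out) / base_out
    print(f"{name:>4}: {100 * rel_change:+.2f} % change in response")
```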
Rigorous ILT optimization for advanced patterning and design-process co-optimization
NASA Astrophysics Data System (ADS)
Selinidis, Kosta; Kuechler, Bernd; Cai, Howard; Braam, Kyle; Hoppe, Wolfgang; Domnenko, Vitaly; Poonawala, Amyn; Xiao, Guangming
2018-03-01
Despite the large difficulties involved in extending 193i multiple patterning and the slow ramp of EUV lithography to full manufacturing readiness, the pace of development for new technology node variations has been accelerating. Multiple new variations of new and existing technology nodes have been introduced for a range of device applications; each variation with at least a few new process integration methods, layout constructs and/or design rules. This has led to a strong increase in the demand for predictive technology tools which can be used to quickly guide important patterning and design co-optimization decisions. In this paper, we introduce a novel hybrid predictive patterning method combining two patterning technologies which have each individually been widely used for process tuning, mask correction and process-design co-optimization. These technologies are rigorous lithography simulation and inverse lithography technology (ILT). Rigorous lithography simulation has been extensively used for process development/tuning, lithography tool user setup, photoresist hot-spot detection, photoresist-etch interaction analysis, lithography-TCAD interactions/sensitivities, source optimization and basic lithography design rule exploration. ILT has been extensively used in a range of lithographic areas including logic hot-spot fixing, memory layout correction, dense memory cell optimization, assist feature (AF) optimization, source optimization, complex patterning design rules and design-technology co-optimization (DTCO). The combined optimization capability of these two technologies will therefore have a wide range of useful applications. We investigate the benefits of the new functionality for a few of these advanced applications including correction for photoresist top loss and resist scumming hotspots.
Qualitative Methods in Field Research: An Indonesian Experience in Community Based Practice.
ERIC Educational Resources Information Center
Lysack, Catherine L.; Krefting, Laura
1994-01-01
Cross-cultural evaluation of a community-based rehabilitation project in Indonesia used three methods: focus groups, questionnaires, and key informant interviews. A continuous cyclical approach to data collection and concern for cultural sensitivity increased the rigor of the research. (SK)
Nonlinear analysis of a model of vascular tumour growth and treatment
NASA Astrophysics Data System (ADS)
Tao, Youshan; Yoshida, Norio; Guo, Qian
2004-05-01
We consider a mathematical model describing the evolution of a vascular tumour in response to traditional chemotherapy. The model is a free boundary problem for a system of partial differential equations governing intratumoural drug concentration, cancer cell density and blood vessel density. Tumour cells consist of two types of competitive cells that have different proliferation rates and different sensitivities to drugs. The balance between cell proliferation and death generates a velocity field that drives tumour cell movement. The tumour surface is a moving boundary. The purpose of this paper is to establish a rigorous mathematical analysis of the model for studying the dynamics of intratumoural blood vessels and to explore drug dosage for the successful treatment of a tumour. We also study numerically the competitive effects of the two cell types on tumour growth.
Challenges in Timeseries Analysis from Microlensing
NASA Astrophysics Data System (ADS)
Street, R. A.
2017-06-01
Despite a flood of discoveries over the last ~ 20 years, our knowledge of the exoplanet population is incomplete owing to a gap between the sensitivities of different detection techniques. However, a census of exoplanets at all separations from their host stars is essential to fully understand planet formation mechanisms. Microlensing offers an effective way to bridge the gap around 1-10 AU and is therefore one of the major science goals of the Wide Field Infrared Survey Telescope (WFIRST) mission. WFIRST's survey of the Galactic Bulge is expected to discover ~ 20,000 microlensing events, including ~ 3000 planets, which represents a substantial data analysis challenge with the modeling software currently available. This paper highlights areas where further work is needed. The community is encouraged to join new software development efforts aimed at making the modeling of microlensing events both more accessible and rigorous.
Adjoint equations and analysis of complex systems: Application to virus infection modelling
NASA Astrophysics Data System (ADS)
Marchuk, G. I.; Shutyaev, V.; Bocharov, G.
2005-12-01
Recent development of applied mathematics is characterized by ever increasing attempts to apply the modelling and computational approaches across various areas of the life sciences. The need for a rigorous analysis of the complex system dynamics in immunology has been recognized for more than three decades. The aim of the present paper is to draw attention to the method of adjoint equations. The methodology makes it possible to obtain information about physical processes and examine the sensitivity of complex dynamical systems. This provides a basis for a better understanding of the causal relationships between the immune system's performance and its parameters and helps to improve the experimental design in the solution of applied problems. We show how the adjoint equations can be used to explain the changes in hepatitis B virus infection dynamics between individual patients.
Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.
Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P
2018-03-03
Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.
Monitoring muscle optical scattering properties during rigor mortis
NASA Astrophysics Data System (ADS)
Xia, J.; Ranasinghesagara, J.; Ku, C. W.; Yao, G.
2007-09-01
The sarcomere is the fundamental functional unit of force generation in skeletal muscle. Sarcomere structure is also an important factor that affects the eating quality of muscle foods (meat). The sarcomere structure is altered significantly during rigor mortis, which is the critical stage involved in transforming muscle to meat. In this paper, we investigated optical scattering changes during the rigor process in Sternomandibularis muscles. The measured optical scattering parameters were analyzed along with the simultaneously measured passive tension, pH value, and histological observations. We found that the temporal changes of optical scattering, passive tension, pH value and fiber microstructures were closely correlated during the rigor process. These results suggested that sarcomere structure changes during rigor mortis can be monitored and characterized by optical scattering, which may find practical applications in predicting meat quality.
Tools for observational gait analysis in patients with stroke: a systematic review.
Ferrarello, Francesco; Bianchi, Valeria Anna Maria; Baccini, Marco; Rubbieri, Gaia; Mossello, Enrico; Cavallini, Maria Chiara; Marchionni, Niccolò; Di Bari, Mauro
2013-12-01
Stroke severely affects walking ability, and assessment of gait kinematics is important in defining diagnosis, planning treatment, and evaluating interventions in stroke rehabilitation. Although observational gait analysis is the most common approach to evaluate gait kinematics, tools useful for this purpose have received little attention in the scientific literature and have not been thoroughly reviewed. The aims of this systematic review were to identify tools proposed to conduct observational gait analysis in adults with a stroke, to summarize evidence concerning their quality, and to assess their implementation in rehabilitation research and clinical practice. An extensive search was performed of original articles reporting on visual/observational tools developed to investigate gait kinematics in adults with a stroke. Two reviewers independently selected studies, extracted data, assessed quality of the included studies, and scored the metric properties and clinical utility of each tool. Rigor in reporting metric properties and dissemination of the tools also was evaluated. Five tools were identified, not all of which had been tested adequately for their metric properties. Evaluation of content validity was partially satisfactory. Reliability was poorly investigated in all but one tool. Concurrent validity and sensitivity to change were shown for 3 and 2 tools, respectively. Overall, adequate levels of quality were rarely reached. The dissemination of the tools was poor. Based on critical appraisal, the Gait Assessment and Intervention Tool shows a good level of quality, and its use in stroke rehabilitation is recommended. Rigorous studies are needed for the other tools in order to establish their usefulness.
A case of instantaneous rigor?
Pirch, J; Schulz, Y; Klintschar, M
2013-09-01
The question of whether instantaneous rigor mortis (IR), the hypothetical sudden occurrence of stiffening of the muscles upon death, actually exists has been controversially debated over the last 150 years. While modern German forensic literature rejects this concept, the contemporary British literature is more willing to embrace it. We present the case of a young woman who suffered from diabetes and who was found dead in an upright standing position with back and shoulders leaned against a punchbag and a cupboard. Rigor mortis was fully established, and livor mortis was pronounced and consistent with the position in which the body was found. After autopsy and toxicological analysis, it was stated that death most probably occurred due to a ketoacidotic coma with markedly increased values of glucose and lactate in the cerebrospinal fluid as well as acetone in blood and urine. Whereas the position of the body is most unusual, a detailed analysis revealed that it is a stable position even without rigor mortis. Therefore, this case does not further support the controversial concept of IR.
Educating Part-Time MBAs for the Global Business Environment
ERIC Educational Resources Information Center
Randolph, W. Alan
2008-01-01
To be successful managers in the business world of the 21st century, MBA students must acquire global skills of business acumen, reflection, cultural sensitivity, and multi-cultural teamwork. Developing these skills requires international experience, but educating part-time MBAs creates a special challenge demanding both rigor and efficiency. This…
Approaches to Cross-Cultural Research in Art Education.
ERIC Educational Resources Information Center
Anderson, Frances E.
1979-01-01
The author defines the aims of cross-cultural research in art education and examines the problems inherent in such research, using as an illustration a summary chart of Child's cross-cultural studies of esthetic sensitivity. Emphasis is placed on the need for rigor in research design and execution. (SJL)
Tactile Perception in Adults with Autism: A Multidimensional Psychophysical Study
ERIC Educational Resources Information Center
Cascio, Carissa; McGlone, Francis; Folger, Stephen; Tannan, Vinay; Baranek, Grace; Pelphrey, Kevin A.; Essick, Gregory
2008-01-01
Although sensory problems, including unusual tactile sensitivity, are heavily associated with autism, there is a dearth of rigorous psychophysical research. We compared tactile sensation in adults with autism to controls on the palm and forearm, the latter innervated by low-threshold unmyelinated afferents subserving a social/affiliative…
ERIC Educational Resources Information Center
Elgin, Catherine Z.
2013-01-01
Virtue epistemologists hold that knowledge results from the display of epistemic virtues--open-mindedness, rigor, sensitivity to evidence, and the like. But epistemology cannot rest satisfied with a list of the virtues. What is wanted is a criterion for being an epistemic virtue. An extension of a formulation of Kant's categorical imperative…
Byron, Meg; Hall, Lisa L; Lawrence, Jeanne B
2013-01-01
Fluorescence in situ hybridization (FISH) is not a singular technique, but a battery of powerful and versatile tools for examining the distribution of endogenous genes and RNAs in precise context with each other and in relation to specific proteins or cell structures. This unit offers the details of highly sensitive and successful protocols that were initially developed largely in our lab and honed over a number of years. Our emphasis is on analysis of nuclear RNAs and DNA to address specific biological questions about nuclear structure, pre-mRNA metabolism, or the role of noncoding RNAs; however, cytoplasmic RNA detection is also discussed. Multifaceted molecular cytological approaches bring precise resolution and sensitive multicolor detection to illuminate the organization and functional roles of endogenous genes and their RNAs within the native structure of fixed cells. Solutions to several common technical pitfalls are discussed, as are cautions regarding the judicious use of digital imaging and the rigors of analyzing and interpreting complex molecular cytological results.
Alves, Vinicius M.; Muratov, Eugene; Fourches, Denis; Strickland, Judy; Kleinstreuer, Nicole; Andrade, Carolina H.; Tropsha, Alexander
2015-01-01
Skin permeability is widely considered to be mechanistically implicated in chemically-induced skin sensitization. Although many chemicals have been identified as skin sensitizers, there have been very few reports analyzing the relationships between molecular structure and skin permeability of sensitizers and non-sensitizers. The goals of this study were to: (i) compile, curate, and integrate the largest publicly available dataset of chemicals studied for their skin permeability; (ii) develop and rigorously validate QSAR models to predict skin permeability; and (iii) explore the complex relationships between skin sensitization and skin permeability. Based on the largest publicly available dataset compiled in this study, we found no overall correlation between skin permeability and skin sensitization. In addition, the cross-species correlation coefficient between human and rodent permeability data was found to be as low as R2 = 0.44. Human skin permeability models based on the random forest method have been developed and validated using an OECD-compliant QSAR modeling workflow. Their external accuracy was high (Q2ext = 0.73 for 63% of external compounds inside the applicability domain). The extended analysis using both experimentally-measured and QSAR-imputed data still confirmed the absence of any overall concordance between skin permeability and skin sensitization. This observation suggests that chemical modifications that affect skin permeability should not be presumed a priori to modulate the sensitization potential of chemicals. The models reported herein as well as those developed in the companion paper on skin sensitization suggest that it may be possible to rationally design compounds with the desired high skin permeability but low sensitization potential. PMID:25560673
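The modeling and external validation step can be sketched in a few lines; the example below trains a random forest on synthetic descriptors and computes an external Q2, and is not the curated dataset or the OECD-compliant workflow used in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Sketch of a random-forest permeability model evaluated by external Q2; descriptors and
# log Kp values are synthetic stand-ins for the curated dataset used in the paper.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 20))                                   # molecular descriptors (synthetic)
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=300)    # log Kp (synthetic)

X_tr, X_ext, y_tr, y_ext = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)

pred = model.predict(X_ext)
q2_ext = 1 - np.sum((y_ext - pred) ** 2) / np.sum((y_ext - np.mean(y_tr)) ** 2)
print(f"Q2_ext = {q2_ext:.2f}")
```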
Tight finite-key analysis for quantum cryptography
Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato
2012-01-01
Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies. PMID:22252558
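As a rough illustration of why the number of exchanged signals matters, the toy calculation below subtracts a generic finite-size penalty from an asymptotic BB84-style rate 1 - 2h(Q); the penalty term is a placeholder and is not the smooth-entropy bound derived in the paper.

```python
import numpy as np

def h2(p):
    # binary entropy
    return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def toy_key_rate(n, qber, eps=1e-10):
    # Asymptotic rate 1 - 2 h(Q), minus a generic O(sqrt(log(1/eps)/n)) finite-size
    # penalty. Placeholder only; the paper derives a tighter bound from the uncertainty
    # relation for smooth entropies.
    penalty = np.sqrt(np.log2(2 / eps) / n)
    return max(0.0, 1 - 2 * h2(qber) - penalty)

for n in (1e3, 1e4, 1e5, 1e6):
    print(f"n = {int(n):>8}: rate per signal ~ {toy_key_rate(n, 0.02):.3f}")
```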
NASA Technical Reports Server (NTRS)
Butler, R.; Williams, F. W.
1992-01-01
A computer program for obtaining the optimum (least mass) dimensions of the kind of prismatic assemblies of laminated, composite plates which occur in advanced aerospace construction is described. Rigorous buckling analysis (derived from exact member theory) and a tailored design procedure are used to produce designs which satisfy buckling and material strength constraints and configurational requirements. Analysis is two to three orders of magnitude quicker than FEM, keeps track of all the governing modes of failure and is efficiently adapted to give sensitivities and to maintain feasibility. Tailoring encourages convergence in fewer sizing cycles than competing programs and permits start designs which are a long way from feasible and/or optimum. Comparisons with its predecessor, PASCO, show that the program is more likely to produce an optimum, will do so more quickly in some cases, and remains accurate for a wider range of problems.
Wu, Xue; Sengupta, Kaushik
2018-03-19
This paper demonstrates a methodology to miniaturize THz spectroscopes into a single silicon chip by eliminating traditional solid-state architectural components such as complex tunable THz and optical sources, nonlinear mixing and amplifiers. The proposed method achieves this by extracting incident THz spectral signatures from the surface of an on-chip antenna itself. The information is sensed through the spectrally sensitive 2D distribution of the surface currents impressed on the antenna by the incident THz field. By converting the antenna from a single-port to a massively multi-port architecture with integrated electronics and deep subwavelength sensing, THz spectral estimation is converted into a linear estimation problem. We employ rigorous regression techniques and analysis to demonstrate a single silicon chip system operating at room temperature across 0.04-0.99 THz with 10 MHz accuracy in spectrum estimation of THz tones across the entire spectrum.
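Recasting spectral estimation as a linear inverse problem can be sketched as follows; the sensing matrix below is random, standing in for calibrated per-port spectral responses, and the regularized least-squares solve is one plausible choice of estimator, not necessarily the authors' regression technique.

```python
import numpy as np

# Schematic of spectrum estimation as a linear inverse problem y = A x, where x holds the
# power in each frequency bin and A the calibrated per-port spectral responses. A is random
# here, standing in for measured multi-port antenna responses.
rng = np.random.default_rng(2)
n_ports, n_bins = 128, 96
A = rng.normal(size=(n_ports, n_bins))

x_true = np.zeros(n_bins)
x_true[[10, 40, 77]] = [1.0, 0.6, 0.3]            # a sparse set of incident THz tones
y = A @ x_true + rng.normal(scale=0.01, size=n_ports)

lam = 1e-2                                         # ridge regularization strength
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_bins), A.T @ y)
print("estimated strongest bins:", np.argsort(x_hat)[-3:][::-1])
```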
MUSiC - A general search for deviations from monte carlo predictions in CMS
NASA Astrophysics Data System (ADS)
Biallass, Philipp A.; CMS Collaboration
2009-06-01
A model independent analysis approach in CMS is presented, systematically scanning the data for deviations from the Monte Carlo expectation. Such an analysis can contribute to the understanding of the detector and the tuning of the event generators. Furthermore, due to the minimal theoretical bias this approach is sensitive to a variety of models of new physics, including those not yet thought of. Events are classified into event classes according to their particle content (muons, electrons, photons, jets and missing transverse energy). A broad scan of various distributions is performed, identifying significant deviations from the Monte Carlo simulation. The importance of systematic uncertainties is outlined, which are taken into account rigorously within the algorithm. Possible detector effects and generator issues, as well as models involving Supersymmetry and new heavy gauge bosons are used as an input to the search algorithm.
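A generic sketch of how one region of a distribution might be scored is shown below: a p-value for the observed count given the Monte Carlo expectation, with a relative systematic uncertainty folded in through pseudo-experiments. This illustrates the general idea, not the MUSiC algorithm itself.

```python
import numpy as np

# Score one region of a distribution: p-value of the observed count given an MC expectation
# with a relative systematic uncertainty, estimated with pseudo-experiments.
rng = np.random.default_rng(3)

def p_value(n_obs, mc_expect, rel_syst, n_toys=200_000):
    # Smear the expectation with a (truncated) Gaussian systematic, then draw Poisson toys.
    smeared = np.clip(rng.normal(mc_expect, rel_syst * mc_expect, n_toys), 0, None)
    toys = rng.poisson(smeared)
    if n_obs >= mc_expect:                     # excess: P(toy >= observed)
        return np.mean(toys >= n_obs)
    return np.mean(toys <= n_obs)              # deficit: P(toy <= observed)

print(f"p = {p_value(n_obs=95, mc_expect=60.0, rel_syst=0.15):.4f}")
```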
MUSiC - A Generic Search for Deviations from Monte Carlo Predictions in CMS
NASA Astrophysics Data System (ADS)
Hof, Carsten
2009-05-01
We present a model independent analysis approach, systematically scanning the data for deviations from the Standard Model Monte Carlo expectation. Such an analysis can contribute to the understanding of the CMS detector and the tuning of the event generators. Furthermore, due to the minimal theoretical bias this approach is sensitive to a variety of models of new physics, including those not yet thought of. Events are classified into event classes according to their particle content (muons, electrons, photons, jets and missing transverse energy). A broad scan of various distributions is performed, identifying significant deviations from the Monte Carlo simulation. We outline the importance of systematic uncertainties, which are taken into account rigorously within the algorithm. Possible detector effects and generator issues, as well as models involving supersymmetry and new heavy gauge bosons have been used as an input to the search algorithm.
MUSiC - Model-independent search for deviations from Standard Model predictions in CMS
NASA Astrophysics Data System (ADS)
Pieta, Holger
2010-02-01
We present an approach for a model independent search in CMS. Systematically scanning the data for deviations from the standard model Monte Carlo expectations, such an analysis can help to understand the detector and tune event generators. By minimizing the theoretical bias the analysis is furthermore sensitive to a wide range of models for new physics, including the countless models not yet thought of. After sorting the events into classes defined by their particle content (leptons, photons, jets and missing transverse energy), a minimally prejudiced scan is performed on a number of distributions. Advanced statistical methods are used to determine the significance of the deviating regions, rigorously taking systematic uncertainties into account. A number of benchmark scenarios, including common models of new physics and possible detector effects, have been used to gauge the power of such a method.
Response to Ridgeway, Dunston, and Qian: On Methodological Rigor: Has Rigor Mortis Set In?
ERIC Educational Resources Information Center
Baldwin, R. Scott; Vaughn, Sharon
1993-01-01
Responds to an article in the same issue of the journal presenting a meta-analysis of reading research. Expresses concern that the authors' conclusions will promote a slavish adherence to a methodology and a rigidity of thought that reading researchers can ill afford. (RS)
NASA Technical Reports Server (NTRS)
Zapata, Edgar
2017-01-01
This review brings rigorous life cycle cost (LCC) analysis into discussions about COTS program costs. We gather publicly available cost data, review the data for credibility, check for consistency among sources, and rigorously define and analyze specific cost metrics.
Systemic Planning: An Annotated Bibliography and Literature Guide. Exchange Bibliography No. 91.
ERIC Educational Resources Information Center
Catanese, Anthony James
Systemic planning is an operational approach to using scientific rigor and qualitative judgment in a complementary manner. It integrates rigorous techniques and methods from systems analysis, cybernetics, decision theory, and work programming. The annotated reference sources in this bibliography include those works that have been most influential…
Capabilities of the new “Universal” AC-DC monitor for electropenetrography (EPG)
USDA-ARS?s Scientific Manuscript database
Electropenetrography (EPG), invented over 50 years ago, is the most rigorous and important means of studying the feeding of piercing-sucking crop pests. The 1st-generation monitor (or AC monitor) used AC applied signal voltage and had fixed amplifier sensitivity (input resistor or Ri) of 10^6 Ohms. T...
The Ebbinghaus Illusion Deceives Adults but Not Young Children
ERIC Educational Resources Information Center
Doherty, Martin J.; Campbell, Nicola M.; Tsuji, Hiromi; Phillips, William A.
2010-01-01
The sensitivity of size perception to context has been used to distinguish between "vision for action" and "vision for perception", and to study cultural, psychopathological, and developmental differences in perception. The status of that evidence is much debated, however. Here we use a rigorous double dissociation paradigm based on the Ebbinghaus…
The Personal Selling Ethics Scale: Revisions and Expansions for Teaching Sales Ethics
ERIC Educational Resources Information Center
Donoho, Casey; Heinze, Timothy
2011-01-01
The field of sales draws a large number of marketing graduates. Sales curricula used within today's marketing programs should include rigorous discussions of sales ethics. The Personal Selling Ethics Scale (PSE) provides an analytical tool for assessing and discussing students' ethical sales sensitivities. However, since the scale fails to address…
High and low rigor temperature effects on sheep meat tenderness and ageing.
Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J
2002-02-01
Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C (P<0.001). The mean sarcomere length values of meat samples for 18 and 35°C rigor at each ageing time were significantly different (P<0.001), the samples at 35°C being shorter. When the short sarcomere length values and corresponding shear force values were removed for further data analysis, the shear force values for the 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in Hunter a or b values.
Space radiator simulation system analysis
NASA Technical Reports Server (NTRS)
Black, W. Z.; Wulff, W.
1972-01-01
A transient heat transfer analysis was carried out on a space radiator heat rejection system exposed to an arbitrarily prescribed combination of aerodynamic heating, solar, albedo, and planetary radiation. A rigorous analysis was carried out for the radiation panel and tubes lying in one plane and an approximate analysis was used to extend the rigorous analysis to the case of a curved panel. The analysis permits the consideration of both gaseous and liquid coolant fluids, including liquid metals, under prescribed, time dependent inlet conditions. The analysis provided a method for predicting: (1) transient and steady-state, two dimensional temperature profiles, (2) local and total heat rejection rates, (3) coolant flow pressure in the flow channel, and (4) total system weight and protection layer thickness.
Numerical parametric studies of spray combustion instability
NASA Technical Reports Server (NTRS)
Pindera, M. Z.
1993-01-01
A coupled numerical algorithm has been developed for studies of combustion instabilities in spray-driven liquid rocket engines. The model couples gas and liquid phase physics using the method of fractional steps. Also introduced is a novel, efficient methodology for accounting for spray formation through direct solution of liquid phase equations. Preliminary parametric studies show marked sensitivity of spray penetration and geometry to droplet diameter, considerations of liquid core, and acoustic interactions. Less sensitivity was shown to the combustion model type although more rigorous (multi-step) formulations may be needed for the differences to become apparent.
NASA Astrophysics Data System (ADS)
Böbel, A.; Knapek, C. A.; Räth, C.
2018-05-01
Experiments on the recrystallization processes in two-dimensional complex plasmas are analyzed to rigorously test a recently developed scale-free phase transition theory. The "fractal-domain-structure" (FDS) theory is based on the kinetic theory of Frenkel. It assumes the formation of homogeneous domains, separated by defect lines, during crystallization and a fractal relationship between domain area and boundary length. For the defect number fraction and system energy a scale-free power-law relation is predicted. The long-range scaling behavior of the bond-order correlation function shows clearly that the complex plasma phase transitions are not of the Kosterlitz, Thouless, Halperin, Nelson, and Young type. Previous preliminary results obtained by counting the number of dislocations and applying a bond-order metric for structural analysis are reproduced. These findings are supplemented by extending the use of the bond-order metric to measure the defect number fraction and furthermore applying state-of-the-art analysis methods, allowing a systematic testing of the FDS theory with unprecedented scrutiny: A morphological analysis of lattice structure is performed via Minkowski tensor methods. Minkowski tensors form a complete family of additive, motion covariant and continuous morphological measures that are sensitive to nonlinear properties. The FDS theory is rigorously confirmed and predictions of the theory are reproduced extremely well. The predicted scale-free power-law relation between defect number fraction and system energy is verified for one more order of magnitude at high energies compared to the inherently discontinuous bond-order metric. It is found that the fractal relation between crystalline domain area and circumference is independent of the experiment, the particular Minkowski tensor method, and the particular choice of parameters. Thus, the fractal relationship seems to be inherent to two-dimensional phase transitions in complex plasmas. Minkowski tensor analysis turns out to be a powerful tool for investigations of crystallization processes. It is capable of revealing nonlinear local topological properties while still providing easily interpretable results founded on a solid mathematical framework.
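The scale-free test at the heart of the FDS comparison can be illustrated by fitting a straight line in log-log space; the sketch below recovers the exponent of a synthetic power-law relation between defect number fraction and system energy, and does not use the experimental data.

```python
import numpy as np

# Test a scale-free relation nu ~ E^alpha between defect number fraction and system energy
# via a straight-line fit in log-log space; the data below are synthetic.
rng = np.random.default_rng(4)
energy = np.logspace(0, 3, 40)
defect_fraction = 0.01 * energy ** 0.5 * rng.lognormal(sigma=0.05, size=energy.size)

slope, intercept = np.polyfit(np.log10(energy), np.log10(defect_fraction), 1)
print(f"fitted exponent alpha ~ {slope:.2f}")     # close to 0.5 for this synthetic data
```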
Towards rigorous analysis of the Levitov-Mirlin-Evers recursion
NASA Astrophysics Data System (ADS)
Fyodorov, Y. V.; Kupiainen, A.; Webb, C.
2016-12-01
This paper aims to develop a rigorous asymptotic analysis of an approximate renormalization group recursion for inverse participation ratios P_q of critical power-law random band matrices. The recursion goes back to the work by Mirlin and Evers (2000 Phys. Rev. B 62 7920) and earlier works by Levitov (1990 Phys. Rev. Lett. 64 547, 1999 Ann. Phys. 8 697-706) and aims to describe the ensuing multifractality of the eigenvectors of such matrices. We point out both similarities and dissimilarities between the LME recursion and those appearing in the theory of multiplicative cascades and branching random walks and show that the methods developed in those fields can be adapted to the present case. In particular the LME recursion is shown to exhibit a phase transition, which we expect is a freezing transition, where the role of temperature is played by the exponent q. However, the LME recursion has features that make its rigorous analysis considerably harder and we point out several open problems for further study.
Validation of Self-Image of Aging Scale for Chinese Elders
ERIC Educational Resources Information Center
Bai, Xue; Chan, K. S.; Chow, Nelson
2012-01-01
Researchers are increasingly interested in the "image of aging" concept. Models on the image of aging abound, but few have rigorously tested measures that are culturally sensitive and domain-specific. This study first translates Levy et al.'s (2004) Image of Aging Scale into the Chinese language and revises it into the Chinese Version of…
NASA Astrophysics Data System (ADS)
Toman, Blaza; Nelson, Michael A.; Bedner, Mary
2017-06-01
Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess accuracy of the measurement procedure and repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random effects meta analysis yields similar results to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography mass spectrometric detection using isotope dilution (LC-IDMS).
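A minimal Monte Carlo (GUM Supplement 1 style) propagation through a simple measurement equation is sketched below; the single-point calibration equation and all numerical values are illustrative, not the LC-UV or LC-IDMS models of the paper.

```python
import numpy as np

# Minimal GUM-Supplement-1-style Monte Carlo propagation for a single-point calibration
# measurement equation c_sample = c_std * (A_sample / A_std); all values are illustrative.
rng = np.random.default_rng(5)
N = 200_000

c_std    = rng.normal(10.00, 0.02, N)    # calibrant concentration and its standard uncertainty
A_sample = rng.normal(0.520, 0.004, N)   # sample peak area (repeatability)
A_std    = rng.normal(0.500, 0.004, N)   # calibrant peak area (repeatability)

c_sample = c_std * A_sample / A_std
lo, hi = np.percentile(c_sample, [2.5, 97.5])
print(f"c_sample = {c_sample.mean():.3f}, 95 % interval [{lo:.3f}, {hi:.3f}]")
```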
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alves, Vinicius M.; Laboratory for Molecular Modeling, Division of Chemical Biology and Medicinal Chemistry, Eshelman School of Pharmacy, University of North Carolina, Chapel Hill, NC 27599; Muratov, Eugene
Skin permeability is widely considered to be mechanistically implicated in chemically-induced skin sensitization. Although many chemicals have been identified as skin sensitizers, there have been very few reports analyzing the relationships between molecular structure and skin permeability of sensitizers and non-sensitizers. The goals of this study were to: (i) compile, curate, and integrate the largest publicly available dataset of chemicals studied for their skin permeability; (ii) develop and rigorously validate QSAR models to predict skin permeability; and (iii) explore the complex relationships between skin sensitization and skin permeability. Based on the largest publicly available dataset compiled in this study, we found no overall correlation between skin permeability and skin sensitization. In addition, the cross-species correlation coefficient between human and rodent permeability data was found to be as low as R2 = 0.44. Human skin permeability models based on the random forest method have been developed and validated using an OECD-compliant QSAR modeling workflow. Their external accuracy was high (Q2ext = 0.73 for 63% of external compounds inside the applicability domain). The extended analysis using both experimentally-measured and QSAR-imputed data still confirmed the absence of any overall concordance between skin permeability and skin sensitization. This observation suggests that chemical modifications that affect skin permeability should not be presumed a priori to modulate the sensitization potential of chemicals. The models reported herein as well as those developed in the companion paper on skin sensitization suggest that it may be possible to rationally design compounds with the desired high skin permeability but low sensitization potential. Highlights: • The largest publicly available skin permeability dataset was compiled. • Predictive QSAR models were developed for skin permeability. • No concordance between skin sensitization and skin permeability was found. • Structural rules for optimizing sensitization and penetration were established.
Polarization sensitivity testing of off-plane reflection gratings
NASA Astrophysics Data System (ADS)
Marlowe, Hannah; McEntaffer, Randal L.; DeRoo, Casey T.; Miles, Drew M.; Tutt, James H.; Laubis, Christian; Soltwisch, Victor
2015-09-01
Off-Plane reflection gratings were previously predicted to have different efficiencies when the incident light is polarized in the transverse-magnetic (TM) versus transverse-electric (TE) orientations with respect to the grating grooves. However, more recent theoretical calculations which rigorously account for finitely conducting, rather than perfectly conducting, grating materials no longer predict significant polarization sensitivity. We present the first empirical results for radially ruled, laminar groove profile gratings in the off-plane mount which demonstrate no difference in TM versus TE efficiency across our entire 300-1500 eV bandpass. These measurements together with the recent theoretical results confirm that grazing incidence off-plane reflection gratings using real, not perfectly conducting, materials are not polarization sensitive.
Real, Jordi; Forné, Carles; Roso-Llorach, Albert; Martínez-Sánchez, Jose M
2016-05-01
Controlling for confounders is a crucial step in analytical observational studies, and multivariable models are widely used as statistical adjustment techniques. However, the validation of the assumptions of the multivariable regression models (MRMs) should be made clear in scientific reporting. The objective of this study is to review the quality of statistical reporting of the most commonly used MRMs (logistic, linear, and Cox regression) that were applied in analytical observational studies published between 2003 and 2014 by journals indexed in MEDLINE. Review of a representative sample of articles indexed in MEDLINE (n = 428) with observational design and use of MRMs (logistic, linear, and Cox regression). We assessed the quality of reporting about: model assumptions and goodness-of-fit, interactions, sensitivity analysis, crude and adjusted effect estimate, and specification of more than 1 adjusted model. The tests of underlying assumptions or goodness-of-fit of the MRMs used were described in 26.2% (95% CI: 22.0-30.3) of the articles and 18.5% (95% CI: 14.8-22.1) reported the interaction analysis. Reporting of all items assessed was higher in articles published in journals with a higher impact factor. A low percentage of articles indexed in MEDLINE that used multivariable techniques provided information demonstrating rigorous application of the model selected as an adjustment method. Given the importance of these methods to the final results and conclusions of observational studies, greater rigor is required in reporting the use of MRMs in the scientific literature.
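Two of the items assessed, reporting crude alongside adjusted estimates and explicitly testing interactions, can be illustrated with a small logistic-regression example on synthetic data; the sketch assumes statsmodels is available and is not drawn from any of the reviewed articles.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

# Two reporting items from the review: a crude next to an adjusted odds ratio, and an
# explicit interaction test (likelihood-ratio). Data are synthetic.
rng = np.random.default_rng(6)
n = 2000
df = pd.DataFrame({"exposure": rng.integers(0, 2, n), "age": rng.normal(50, 10, n)})
logit_p = -2 + 0.7 * df.exposure + 0.03 * (df.age - 50)
df["outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

crude       = smf.logit("outcome ~ exposure", df).fit(disp=0)
adjusted    = smf.logit("outcome ~ exposure + age", df).fit(disp=0)
interaction = smf.logit("outcome ~ exposure * age", df).fit(disp=0)

lr = 2 * (interaction.llf - adjusted.llf)          # likelihood-ratio statistic, 1 df
print(f"crude OR    = {np.exp(crude.params['exposure']):.2f}")
print(f"adjusted OR = {np.exp(adjusted.params['exposure']):.2f}")
print(f"interaction LR test p = {chi2.sf(lr, df=1):.3f}")
```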
Ely, D. Matthew
2006-01-01
Recharge is a vital component of the ground-water budget and methods for estimating it range from extremely complex to relatively simple. The most commonly used techniques, however, are limited by the scale of application. One method that can be used to estimate ground-water recharge includes process-based models that compute distributed water budgets on a watershed scale. These models should be evaluated to determine which model parameters are the dominant controls in determining ground-water recharge. Seven existing watershed models from different humid regions of the United States were chosen to analyze the sensitivity of simulated recharge to model parameters. Parameter sensitivities were determined using a nonlinear regression computer program to generate a suite of diagnostic statistics. The statistics identify model parameters that have the greatest effect on simulated ground-water recharge and that compare and contrast the hydrologic system responses to those parameters. Simulated recharge in the Lost River and Big Creek watersheds in Washington State was sensitive to small changes in air temperature. The Hamden watershed model in west-central Minnesota was developed to investigate the relations that wetlands and other landscape features have with runoff processes. Excess soil moisture in the Hamden watershed simulation was preferentially routed to wetlands, instead of to the ground-water system, resulting in little sensitivity of any parameters to recharge. Simulated recharge in the North Fork Pheasant Branch watershed, Wisconsin, demonstrated the greatest sensitivity to parameters related to evapotranspiration. Three watersheds were simulated as part of the Model Parameter Estimation Experiment (MOPEX). Parameter sensitivities for the MOPEX watersheds, Amite River, Louisiana and Mississippi, English River, Iowa, and South Branch Potomac River, West Virginia, were similar and most sensitive to small changes in air temperature and a user-defined flow routing parameter. Although the primary objective of this study was to identify, by geographic region, the importance of the parameter value to the simulation of ground-water recharge, the secondary objectives proved valuable for future modeling efforts. The value of a rigorous sensitivity analysis is that it can (1) make the calibration process more efficient, (2) guide additional data collection, (3) identify model limitations, and (4) explain simulated results.
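The kind of diagnostic produced by such a sensitivity analysis can be sketched with dimensionless scaled sensitivities computed by central finite differences about calibrated parameter values; the simulate_recharge function and the parameter names below are placeholders, not any of the seven watershed models.

```python
import numpy as np

# Scaled sensitivities of simulated recharge to model parameters, computed by central
# finite differences about calibrated values; simulate_recharge is a placeholder.
params = {"air_temp_adj": 1.0, "routing_coef": 0.30, "soil_moist_max": 150.0}

def simulate_recharge(p):
    # Stand-in for a watershed model run returning mean annual recharge (mm/yr).
    return 250.0 - 40.0 * p["air_temp_adj"] + 80.0 * p["routing_coef"] - 0.1 * p["soil_moist_max"]

base = simulate_recharge(params)
for name, value in params.items():
    dp = 0.01 * abs(value)
    up, dn = dict(params), dict(params)
    up[name], dn[name] = value + dp, value - dp
    # dimensionless scaled sensitivity: (dR/dp) * p / R
    ss = (simulate_recharge(up) - simulate_recharge(dn)) / (2 * dp) * value / base
    print(f"{name:>15}: scaled sensitivity = {ss:+.3f}")
```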
Chomsky-Higgins, Kathryn; Seib, Carolyn; Rochefort, Holly; Gosnell, Jessica; Shen, Wen T; Kahn, James G; Duh, Quan-Yang; Suh, Insoo
2018-01-01
Guidelines for management of small adrenal incidentalomas are mutually inconsistent. No cost-effectiveness analysis has been performed to evaluate rigorously the relative merits of these strategies. We constructed a decision-analytic model to evaluate surveillance strategies for <4 cm, nonfunctional, benign-appearing adrenal incidentalomas. We evaluated 4 surveillance strategies: none, one-time, annual for 2 years, and annual for 5 years. Threshold and sensitivity analyses assessed robustness of the model. Costs were represented in 2016 US dollars and health outcomes in quality-adjusted life-years. No surveillance has an expected net cost of $262 and 26.22 quality-adjusted life-years. One-time surveillance costs $158 more and adds 0.2 quality-adjusted life-years for an incremental cost-effectiveness ratio of $778/quality-adjusted life-years. The strategies involving more surveillance were dominated by the no surveillance and one-time surveillance strategies, being less effective and more expensive. Above a 0.7% prevalence of adrenocortical carcinoma, one-time surveillance was the most effective strategy. The results were robust to all sensitivity analyses of disease prevalence, sensitivity, and specificity of diagnostic assays and imaging as well as health state utility. For patients with a <4 cm, nonfunctional, benign-appearing mass, one-time follow-up evaluation involving a noncontrast computed tomography and biochemical evaluation is cost-effective. Strategies requiring more surveillance accrue more cost without incremental benefit. Copyright © 2017 Elsevier Inc. All rights reserved.
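The underlying comparison is a standard incremental cost-effectiveness calculation with dominance screening, sketched below with placeholder costs and QALYs rather than the model's actual outputs.

```python
# Incremental cost-effectiveness with strict-dominance screening; the costs and QALYs are
# placeholders, not the outputs of the published decision model.
strategies = {                     # name: (expected cost in $, expected QALYs)
    "no surveillance":        (260.0, 26.20),
    "one-time surveillance":  (420.0, 26.40),
    "annual x2 surveillance": (700.0, 26.39),
}

def dominated(name):
    c, q = strategies[name]
    return any(c2 <= c and q2 >= q and (c2 < c or q2 > q)
               for other, (c2, q2) in strategies.items() if other != name)

# Efficient frontier ordered by effectiveness, then pairwise ICERs along it.
frontier = sorted((s for s in strategies if not dominated(s)),
                  key=lambda s: strategies[s][1])
for prev, cur in zip(frontier, frontier[1:]):
    (c0, q0), (c1, q1) = strategies[prev], strategies[cur]
    print(f"{cur} vs {prev}: ICER = ${(c1 - c0) / (q1 - q0):,.0f} per QALY gained")
for name in strategies:
    if dominated(name):
        print(f"{name}: dominated (more costly, no more effective)")
```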
2014-01-01
Background Recent innovations in sequencing technologies have provided researchers with the ability to rapidly characterize the microbial content of an environmental or clinical sample with unprecedented resolution. These approaches are producing a wealth of information that is providing novel insights into the microbial ecology of the environment and human health. However, these sequencing-based approaches produce large and complex datasets that require efficient and sensitive computational analysis workflows. Many recent tools for analyzing metagenomic-sequencing data have emerged; however, these approaches often suffer from issues of specificity and efficiency, and typically do not include a complete metagenomic analysis framework. Results We present PathoScope 2.0, a complete bioinformatics framework for rapidly and accurately quantifying the proportions of reads from individual microbial strains present in metagenomic sequencing data from environmental or clinical samples. The pipeline performs all necessary computational analysis steps, including reference genome library extraction and indexing, read quality control and alignment, strain identification, and summarization and annotation of results. We rigorously evaluated PathoScope 2.0 using simulated data and data from the 2011 outbreak of Shiga-toxigenic Escherichia coli O104:H4. Conclusions The results show that PathoScope 2.0 is a complete, highly sensitive, and efficient approach for metagenomic analysis that outperforms alternative approaches in scope, speed, and accuracy. The PathoScope 2.0 pipeline software is freely available for download at: http://sourceforge.net/projects/pathoscope/. PMID:25225611
Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry.
Morse, Janice M
2015-09-01
Criteria for determining the trustworthiness of qualitative research were introduced by Guba and Lincoln in the 1980s when they replaced terminology for achieving rigor, reliability, validity, and generalizability with dependability, credibility, and transferability. Strategies for achieving trustworthiness were also introduced. This landmark contribution to qualitative research remains in use today, with only minor modifications in format. Despite the significance of this contribution over the past four decades, the strategies recommended to achieve trustworthiness have not been critically examined. Recommendations for where, why, and how to use these strategies have not been developed, and how well they achieve their intended goal has not been examined. We do not know, for example, what impact these strategies have on the completed research. In this article, I critique these strategies. I recommend that qualitative researchers return to the terminology of social sciences, using rigor, reliability, validity, and generalizability. I then make recommendations for the appropriate use of the strategies recommended to achieve rigor: prolonged engagement, persistent observation, and thick, rich description; inter-rater reliability, negative case analysis; peer review or debriefing; clarifying researcher bias; member checking; external audits; and triangulation. © The Author(s) 2015.
Individualistic and Time-Varying Tree-Ring Growth to Climate Sensitivity
Carrer, Marco
2011-01-01
The development of dendrochronological time series for analyzing climate-growth relationships usually involves first a rigorous selection of trees and then the computation of a mean tree-growth measurement series. This study suggests a change in perspective, passing from an analysis of climate-growth relationships that typically focuses on the mean response of a species to investigating the whole range of individual responses among sample trees. Results highlight that this new approach, tested on a larch and stone pine tree-ring dataset, outperforms the classical one in terms of information obtained, with significant improvements regarding the strength, distribution and time-variability of the individual tree-ring growth response to climate. Moreover, a significant change over time in tree sensitivity to climatic variability was detected. Accordingly, the best-responder trees at any one time may not always have been the best-responders and may not continue to be so. With minor adjustments to current dendroecological protocol and the adoption of an individualistic approach, we can improve the quality and reliability of the ecological inferences derived from climate-growth relationships. PMID:21829523
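A minimal sketch of the individualistic approach described above: each tree's indexed ring-width series is correlated with a climate variable, alongside the classical mean-chronology correlation. The detrending choice, variable names, and the toy data are illustrative assumptions, not the authors' exact protocol:

```python
import numpy as np

def climate_growth_correlations(ring_width_index, climate):
    """Correlate each tree's ring-width index series with a climate variable.

    ring_width_index: (n_years, n_trees) detrended ring-width indices
    climate:          (n_years,) e.g. summer temperature anomalies
    Returns per-tree Pearson correlations and the classical mean-chronology value.
    """
    climate = (climate - climate.mean()) / climate.std()
    rwi = (ring_width_index - ring_width_index.mean(axis=0)) / ring_width_index.std(axis=0)
    per_tree = (rwi * climate[:, None]).mean(axis=0)        # Pearson r, tree by tree
    site_r = np.corrcoef(rwi.mean(axis=1), climate)[0, 1]   # classical mean-chronology approach
    return per_tree, site_r

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    years, trees = 80, 20
    climate = rng.normal(size=years)
    sensitivity = np.linspace(0.1, 0.9, trees)              # weak to strong responders
    rwi = sensitivity * climate[:, None] + rng.normal(scale=0.7, size=(years, trees))
    per_tree, site_r = climate_growth_correlations(rwi, climate)
    print(per_tree.round(2), round(site_r, 2))
```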
Tankasala, Archana; Hsueh, Yuling; Charles, James; Fonseca, Jim; Povolotskyi, Michael; Kim, Jun Oh; Krishna, Sanjay; Allen, Monica S; Allen, Jeffery W; Rahman, Rajib; Klimeck, Gerhard
2018-01-01
A detailed theoretical study of the optical absorption in doped self-assembled quantum dots is presented. A rigorous atomistic strain model as well as a sophisticated 20-band tight-binding model are used to ensure accurate prediction of the single-particle states in these devices. We show that for doped quantum dots, many-particle configuration interaction is also critical to accurately capture the optical transitions of the system. The sophisticated models presented in this work reproduce the experimental results for both undoped and doped quantum dot systems. The effects of the alloy mole fraction of the strain-controlling layer and of the quantum dot dimensions are discussed. Increasing the mole fraction of the strain-controlling layer leads to a lower energy gap and a longer absorption wavelength. Surprisingly, the absorption wavelength is highly sensitive to changes in the dot diameter, but almost insensitive to changes in dot height. This behavior is explained by a detailed sensitivity analysis of different factors affecting the optical transition energy. PMID:29719758
ERIC Educational Resources Information Center
Gibson, Jenny; Hussain, Jamilla; Holsgrove, Samina; Adams, Catherine; Green, Jonathan
2011-01-01
Direct observation of peer relating is potentially a sensitive and ecologically valid measure of child social functioning, but there has been a lack of standardised methods. The Manchester Inventory for Playground Observation (MIPO) was developed as a practical yet rigorous assessment of this kind for 5-11 year olds. We report on the initial…
From Industry to Higher Education and Libraries: Building the Fast Response Library (FRL).
ERIC Educational Resources Information Center
Apostolou, A. S.; Skiadas, C. H.
In order to be effective in the coming millennium, libraries will need to measure their performance rigorously against the expectations and real needs of their customers. The library of the future will need to be a customer sensitive, knowledge creating, agile enterprise. It must provide value to every customer, where value is the customer's…
NASA Technical Reports Server (NTRS)
1999-01-01
The full complement of EDOMP investigations called for a broad spectrum of flight hardware ranging from commercial items, modified for spaceflight, to custom designed hardware made to meet the unique requirements of testing in the space environment. In addition, baseline data collection before and after spaceflight required numerous items of ground-based hardware. Two basic categories of ground-based hardware were used in EDOMP testing before and after flight: (1) hardware used for medical baseline testing and analysis, and (2) flight-like hardware used both for astronaut training and medical testing. To ensure post-landing data collection, hardware was required at both the Kennedy Space Center (KSC) and the Dryden Flight Research Center (DFRC) landing sites. Items that were very large or sensitive to the rigors of shipping were housed permanently at the landing site test facilities. Therefore, multiple sets of hardware were required to adequately support the prime and backup landing sites plus the Johnson Space Center (JSC) laboratories. Development of flight hardware was a major element of the EDOMP. The challenges included obtaining or developing equipment that met the following criteria: (1) compact (small size and light weight), (2) battery-operated or requiring minimal spacecraft power, (3) sturdy enough to survive the rigors of spaceflight, (4) quiet enough to pass acoustics limitations, (5) shielded and filtered adequately to assure electromagnetic compatibility with spacecraft systems, (6) user-friendly in a microgravity environment, and (7) accurate and efficient operation to meet medical investigative requirements.
McMahon, Camilla M; Lerner, Matthew D; Britton, Noah
2013-01-01
In this paper, we synthesize the current literature on group-based social skills interventions (GSSIs) for adolescents (ages 10–20 years) with higher-functioning autism spectrum disorder and identify key concepts that should be addressed in future research on GSSIs. We consider the research participants, the intervention, the assessment of the intervention, and the research methodology and results to be integral and interconnected components of the GSSI literature, and we review each of these components respectively. Participant characteristics (eg, age, IQ, sex) and intervention characteristics (eg, targeted social skills, teaching strategies, duration and intensity) vary considerably across GSSIs; future research should evaluate whether participant and intervention characteristics mediate/moderate intervention efficacy. Multiple assessments (eg, parent-report, child-report, social cognitive assessments) are used to evaluate the efficacy of GSSIs; future research should be aware of the limitations of current measurement approaches and employ more accurate, sensitive, and comprehensive measurement approaches. Results of GSSIs are largely inconclusive, with few consistent findings across studies (eg, high parent and child satisfaction with the intervention); future research should employ more rigorous methodological standards for evaluating efficacy. A better understanding of these components in the current GSSI literature and a more sophisticated and rigorous analysis of these components in future research will lend clarity to key questions regarding the efficacy of GSSIs for individuals with autism spectrum disorder. PMID:23956616
ERIC Educational Resources Information Center
Conn, Katharine
2014-01-01
In the last three decades, there has been a large increase in the number of rigorous experimental and quasi-experimental evaluations of education programs in developing countries. These impact evaluations have taken place all over the globe, including a large number in Sub-Saharan Africa (SSA). The fact that the developing world is socially and…
NASA Astrophysics Data System (ADS)
Tong, Xiaojun; Cui, Minggen; Wang, Zhu
2009-07-01
The design of a new compound two-dimensional chaotic function is presented by exploiting two one-dimensional chaotic functions that switch randomly; the design is used as a chaotic sequence generator whose chaotic behavior is proved using Devaney's definition of chaos. The properties of the compound chaotic functions are also proved rigorously. In order to improve robustness against differential cryptanalysis and produce an avalanche effect, a new feedback image encryption scheme is proposed using the new compound chaos, selecting one of the two one-dimensional chaotic functions at random, and a new pixel permutation and substitution method is designed in detail with random control of array rows and columns based on the compound chaos. The results from entropy analysis, difference analysis, statistical analysis, sequence randomness analysis, and cipher sensitivity analysis with respect to key and plaintext show that the compound chaotic sequence cipher can resist cryptanalytic, statistical and brute-force attacks; in addition, it accelerates encryption speed and achieves a higher level of security. By means of the dynamical compound chaos and perturbation technology, the paper also addresses the problem of the low computational precision of one-dimensional chaotic functions.
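A generic illustration of the switching idea, two one-dimensional chaotic maps combined into a single keystream generator, is sketched below. It uses a logistic map and a tent map with a simple switching rule and a plain XOR stream; this is not the authors' compound function, feedback scheme, or cipher, and it is not secure as written:

```python
def logistic(x, r=3.99):
    return r * x * (1.0 - x)

def tent(x, mu=1.99):
    return mu * x if x < 0.5 else mu * (1.0 - x)

def compound_keystream(n_bytes, x0=0.31, y0=0.47):
    """Toy compound chaotic keystream: two 1-D maps, switched by the second map's state."""
    x, y, out = x0, y0, bytearray()
    for _ in range(n_bytes):
        x = logistic(x) if y < 0.5 else tent(x)   # illustrative switching rule
        y = tent(y)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_stream(data: bytes, keystream: bytes) -> bytes:
    """Plain XOR stream for illustration only (no feedback, no diffusion)."""
    return bytes(d ^ k for d, k in zip(data, keystream))

if __name__ == "__main__":
    plaintext = b"example pixel data"
    ks = compound_keystream(len(plaintext))
    ciphertext = xor_stream(plaintext, ks)
    assert xor_stream(ciphertext, ks) == plaintext
```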
NASA Astrophysics Data System (ADS)
Biset, S.; Nieto Deglioumini, L.; Basualdo, M.; Garcia, V. M.; Serra, M.
The aim of this work is to investigate a good preliminary plantwide control structure for the process of hydrogen production from bioethanol to be used in a proton exchange membrane (PEM) fuel cell, accounting only for steady-state information. The objective is to keep the process at the optimal operating point, that is, performing energy integration to achieve maximum efficiency. Ethanol, produced from renewable feedstocks, feeds a fuel processor investigated for steam reforming, followed by high- and low-temperature shift reactors and preferential oxidation, which are coupled to a polymeric fuel cell. Applying steady-state simulation techniques and using thermodynamic models, the performance of the complete system with two different control structures has been evaluated for the most typical perturbations. A sensitivity analysis for the key process variables, together with the rigorous operability requirements of the fuel cell, is taken into account for defining an acceptable plantwide control structure. This is the first work showing an alternative control structure applied to this kind of process.
Optimized two-frequency phase-measuring-profilometry light-sensor temporal-noise sensitivity.
Li, Jielin; Hassebrook, Laurence G; Guan, Chun
2003-01-01
Temporal frame-to-frame noise in multipattern structured light projection can significantly corrupt depth measurement repeatability. We present a rigorous stochastic analysis of phase-measuring-profilometry temporal noise as a function of the pattern parameters and the reconstruction coefficients. The analysis is used to optimize the two-frequency phase measurement technique. In phase-measuring profilometry, a sequence of phase-shifted sine-wave patterns is projected onto a surface. In two-frequency phase measurement, two sets of pattern sequences are used. The first, low-frequency set establishes a nonambiguous depth estimate, and the second, high-frequency set is unwrapped, based on the low-frequency estimate, to obtain an accurate depth estimate. If the second frequency is too low, then depth error is caused directly by temporal noise in the phase measurement. If the second frequency is too high, temporal noise triggers ambiguous unwrapping, resulting in depth measurement error. We present a solution for finding the second frequency, where intensity noise variance is at its minimum.
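The unwrapping step described above can be sketched as follows: the nonambiguous low-frequency phase is scaled up to predict the true high-frequency phase, and the wrapped high-frequency measurement is corrected by the nearest integer number of fringes. This is a minimal illustration of standard two-frequency temporal phase unwrapping, not the paper's optimized noise analysis; names and the simple noise model are assumptions:

```python
import numpy as np

def unwrap_two_frequency(phi_low, phi_high_wrapped, f_low, f_high):
    """Unwrap the high-frequency phase using the nonambiguous low-frequency estimate.

    phi_low:          nonambiguous phase from the low-frequency pattern set
    phi_high_wrapped: wrapped phase (in [-pi, pi)) from the high-frequency set
    """
    scaled = phi_low * (f_high / f_low)                 # coarse prediction of the true phase
    order = np.round((scaled - phi_high_wrapped) / (2 * np.pi))
    return phi_high_wrapped + 2 * np.pi * order

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_phase = np.linspace(0, 30 * np.pi, 500)        # high-frequency phase across the surface
    f_low, f_high = 1, 16
    phi_low = true_phase * f_low / f_high + rng.normal(scale=0.05, size=true_phase.size)
    phi_high = np.angle(np.exp(1j * (true_phase + rng.normal(scale=0.01, size=true_phase.size))))
    recovered = unwrap_two_frequency(phi_low, phi_high, f_low, f_high)
    # error stays small unless the scaled low-frequency noise triggers a wrong 2*pi fringe order,
    # which is exactly the ambiguity failure mode discussed in the abstract
    print(np.max(np.abs(recovered - true_phase)))
```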
Wu, Wan-Ru; Chung, Ue-Lin; Chang, Sophia C N
2007-06-01
The purpose of this qualitative research study was to explore the preoperative through postoperative experience of women who had undergone augmentation mammaplasty. Nine women undergoing augmentation mammaplasty were selected by purposive sampling and interviewed using semi-structured, open-ended interview guidelines. Researchers used symbolic interactionism to frame their overall perspective and analyzed data with the content analysis method. Rigor of data analysis was established using the criteria of credibility, transferability, dependability and confirmability proposed by Guba and Lincoln. The main theme of the lived experience of women who received augmentation mammaplasty could be summarized as "a journey to restore self-confidence". The categories identified within this journey included: (1) the invisible standards of breast beauty; (2) taking courageous action to make changes; and (3) conflicts between the natural and the artificial. The above findings provide initial qualitative data from the perspective of Taiwanese women. By better understanding their experience, nurses can become increasingly sensitive to patients' psychosocial adjustment and provide prudent nursing care.
Validation of Self-Image of Aging Scale for Chinese elders.
Bai, Xue; Chan, K S; Chow, Nelson
2012-01-01
Researchers are increasingly interested in the "image of aging" concept. Models of the image of aging abound, but few have rigorously tested measures that are culturally sensitive and domain-specific. This study first translates Levy et al.'s (2004) Image of Aging Scale into the Chinese language and revises it into the Chinese Version of the Self-Image of Aging Scale (SIAS-C). Based on the results of a survey of 445 elderly people in Wuhan, China, it then reports the factorial structure of the SIAS-C and some of its psychometric properties. Confirmatory factor analysis (CFA) supports a conceptually meaningful five-factor model, as suggested by an exploratory factor analysis (EFA). The 14-item SIAS-C demonstrates an acceptable level of internal consistency and test-retest reliability. Its criterion-referenced validity is demonstrated by its correlation with several criteria in the expected directions. In conclusion, the SIAS-C is a psychometrically sound instrument that is recommended for use among Chinese older people.
Nedbal, Jakub; Visitkul, Viput; Ortiz-Zapater, Elena; Weitsman, Gregory; Chana, Prabhjoat; Matthews, Daniel R; Ng, Tony; Ameer-Beg, Simon M
2015-01-01
Sensing ion or ligand concentrations, physico-chemical conditions, and molecular dimerization or conformation change is possible by assays involving fluorescence lifetime imaging. The inherently low throughput of imaging impedes rigorous statistical data analysis on large cell numbers. We address this limitation by developing a fluorescence lifetime-measuring flow cytometer for fast fluorescence lifetime quantification in living or fixed cell populations. The instrument combines a time-correlated single photon counting epifluorescence microscope with a microfluidic cell-handling system. The associated computer software performs burst integrated fluorescence lifetime analysis to assign fluorescence lifetime, intensity, and burst duration to each passing cell. The maximum safe throughput of the instrument reaches 3,000 particles per minute. Living cells expressing spectroscopic rulers of varying peptide lengths were distinguishable by Förster resonance energy transfer measured by donor fluorescence lifetime. An epidermal growth factor (EGF)-stimulation assay demonstrated the technique's capacity to selectively quantify EGF receptor phosphorylation in cells, which was impossible by measuring sensitized emission on a standard flow cytometer. Dual-color fluorescence lifetime detection and cell-specific chemical environment sensing were exemplified using di-4-ANEPPDHQ, a lipophilic environmentally sensitive dye that exhibits changes in its fluorescence lifetime as a function of membrane lipid order. To our knowledge, this instrument opens new applications in flow cytometry that were unavailable due to technological limitations of previously reported fluorescence lifetime flow cytometers. The presented technique is sensitive to lifetimes of most popular fluorophores in the 0.5–5 ns range, including fluorescent proteins, and is capable of detecting multi-exponential fluorescence lifetime decays. This instrument vastly enhances the throughput of experiments involving fluorescence lifetime measurements, thereby providing statistically significant quantitative data for analysis of large cell populations. © 2014 International Society for Advancement of Cytometry PMID:25523156
Space radiator simulation manual for computer code
NASA Technical Reports Server (NTRS)
Black, W. Z.; Wulff, W.
1972-01-01
A computer program that simulates the performance of a space radiator is presented. The program consists of a rigorous analysis, which analyzes a symmetrical fin panel, and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady-state performance, including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady-state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation are included. The input required for the execution of all program options is described, and several examples of program output are provided; sample output includes the radiator performance during ascent, reentry and orbit.
Optimal design and evaluation of a color separation grating using rigorous coupled wave analysis
NASA Astrophysics Data System (ADS)
Nagayoshi, Mayumi; Oka, Keiko; Klaus, Werner; Komai, Yuki; Kodate, Kashiko
2006-02-01
In recent years, technology that separates white light into the three primary colors of red (R), green (G) and blue (B), adjusts the optical intensity of each, and recombines R, G and B to display various colors has been required for the development and spread of color visual equipment. Various color separation devices have been proposed and put to practical use in color visual equipment. We have focused on a small, lightweight grating-type device that offers the potential for cost reduction and large-scale production and that generates only the three primary colors of R, G and B, so that a high saturation level can be obtained. To perform a rigorous analysis and design of color separation gratings, our group has developed a program based on the Rigorous Coupled Wave Analysis (RCWA). We then calculated the parameters needed to obtain a diffraction efficiency higher than 70% and a color gamut of about 70%. We report on the design, fabrication and evaluation of color separation gratings that have been optimized for fabrication by laser drawing.
Adjoint-Based Algorithms for Adaptation and Design Optimizations on Unstructured Grids
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.
2006-01-01
Schemes based on discrete adjoint algorithms present several exciting opportunities for significantly advancing the current state of the art in computational fluid dynamics. Such methods provide an extremely efficient means of obtaining discretely consistent sensitivity information for hundreds of design variables, opening the door to rigorous, automated design optimization of complex aerospace configurations using the Navier-Stokes equations. Moreover, the discrete adjoint formulation provides a mathematically rigorous foundation for mesh adaptation and systematic reduction of spatial discretization error. Error estimates are also an inherent by-product of an adjoint-based approach, valuable information that is virtually non-existent in today's large-scale CFD simulations. An overview of the adjoint-based algorithm work at NASA Langley Research Center is presented, with examples demonstrating the potential impact on complex computational problems related to design optimization as well as mesh adaptation.
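As a brief reminder of why the adjoint approach scales so well with the number of design variables, the discrete sensitivity can be written in the standard textbook form below (this is the generic formulation, not NASA Langley's specific implementation):

```latex
% R(Q, D) = 0 are the discrete flow residuals, J(Q, D) the objective, D_i the design variables.
% One adjoint solve per objective replaces one flow solve per design variable.
\[
\left(\frac{\partial R}{\partial Q}\right)^{\!T} \Lambda \;=\; \left(\frac{\partial J}{\partial Q}\right)^{\!T},
\qquad
\frac{dJ}{dD_i} \;=\; \frac{\partial J}{\partial D_i} \;-\; \Lambda^{T}\,\frac{\partial R}{\partial D_i}.
\]
```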
Heat transfer evaluation in a plasma core reactor
NASA Technical Reports Server (NTRS)
Smith, D. E.; Smith, T. M.; Stoenescu, M. L.
1976-01-01
Numerical evaluations of heat transfer in a fissioning uranium plasma core reactor cavity, operating with seeded hydrogen propellant, were performed. A two-dimensional analysis is based on an assumed flow pattern and cavity wall heat exchange rate. Various iterative schemes were required by the nature of the radiative field and by the solid seed vaporization. Approximate formulations of the radiative heat flux are generally used, owing to the complexity of solving a rigorously formulated problem. The present work analyzes the sensitivity of the results with respect to approximations of the radiative field, geometry, seed vaporization coefficients and flow pattern. The results present temperature, heat flux, density and optical depth distributions in the reactor cavity, acceptable simplifying assumptions, and iterative schemes. The present calculations, performed in Cartesian and spherical coordinates, are applicable to the most general heat transfer problems.
Rigorous Electromagnetic Analysis of the Focusing Action of Refractive Cylindrical Microlens
NASA Astrophysics Data System (ADS)
Liu, Juan; Gu, Ben-Yuan; Dong, Bi-Zhen; Yang, Guo-Zhen
The focusing action of refractive cylindrical microlenses is investigated based on rigorous electromagnetic theory using the boundary element method. The focusing behaviors of refractive microlenses with continuous and multilevel surface envelopes are characterized in terms of total electric-field patterns, the electric-field intensity distributions on the focal plane, and their diffractive efficiencies at the focal spots. The results are also compared with those obtained from Kirchhoff's scalar diffraction theory. The present numerical and graphical results may provide useful information for the analysis and design of refractive elements in micro-optics.
Rigorous diffraction analysis using geometrical theory of diffraction for future mask technology
NASA Astrophysics Data System (ADS)
Chua, Gek S.; Tay, Cho J.; Quan, Chenggen; Lin, Qunying
2004-05-01
Advanced lithographic techniques such as phase shift masks (PSM) and optical proximity correction (OPC) result in more complex mask designs and technology. In contrast to binary masks, which have only transparent and nontransparent regions, phase shift masks also include transparent features with a different optical thickness and a modified phase of the transmitted light. PSM are well known to show prominent diffraction effects, which cannot be described by the assumption of an infinitely thin mask (the Kirchhoff approach) that is used in many commercial photolithography simulators. Correct prediction of sidelobe printability, process windows and linearity of OPC masks requires the application of rigorous diffraction theory. The problem of aerial image intensity imbalance through focus with alternating phase shift masks (altPSMs) is analyzed and the results are compared between a time-domain finite-difference (TDFD) algorithm (TEMPEST) and the geometrical theory of diffraction (GTD). Using GTD, with the solutions to the canonical problems, we obtained a relationship between an edge on the mask and the disturbance in image space. The main interest is to develop useful formulations that can be readily applied to solve rigorous diffraction problems for future mask technology. Analysis of rigorous diffraction effects for altPSMs using the GTD approach is discussed.
Cooper, Glinda S.; Lunn, Ruth M.; Ågerstrand, Marlene; Glenn, Barbara S.; Kraft, Andrew D.; Luke, April M.; Ratcliffe, Jennifer M.
2016-01-01
A critical step in systematic reviews of potential health hazards is the structured evaluation of the strengths and weaknesses of the included studies; risk of bias is a term often used to represent this process, specifically with respect to the evaluation of systematic errors that can lead to inaccurate (biased) results (i.e. focusing on internal validity). Systematic review methods developed in the clinical medicine arena have been adapted for use in evaluating environmental health hazards; this expansion raises questions about the scope of risk of bias tools and the extent to which they capture the elements that can affect the interpretation of results from environmental and occupational epidemiology studies and in vivo animal toxicology studies (the studies typically available for assessment of the risk of chemicals). One such element, described here as "sensitivity", is a measure of the ability of a study to detect a true effect or hazard. This concept is similar to the concept of the sensitivity of an assay; an insensitive study may fail to show a difference that truly exists, leading to a false conclusion of no effect. Factors relating to study sensitivity should be evaluated in a systematic manner with the same rigor as the evaluation of other elements within a risk of bias framework. We discuss the importance of this component for the interpretation of individual studies, examine approaches proposed or in use to address it, and describe how it relates to other evaluation components. The evaluation domains contained within a risk of bias tool can include, or can be modified to include, some features relating to study sensitivity; the explicit inclusion of these sensitivity criteria with the same rigor and at the same stage of study evaluation as other bias-related criteria can improve the evaluation process. In some cases, these and other features may be better addressed through a separate sensitivity domain. The combined evaluation of risk of bias and sensitivity can be used to identify the most informative studies, to evaluate the confidence of the findings from individual studies and to identify those study elements that may help to explain heterogeneity across the body of literature. PMID:27156196
Image Hashes as Templates for Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.
2012-07-17
Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or of the declared configuration of fissile material in storage, may be rejected out-of-hand as being too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and for assessing the security, sensitivity, and robustness of verification using such templates, is needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption, utilizing hash functions to confirm proffered declarations, providing strong classified data security while maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing (which, strictly speaking, is not truly cryptographic hashing) has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise. Ensuring that the information contained in the hashed image data (available outside the IB) cannot be used to extract sensitive information about the imaged object is of primary concern. Thus the techniques are characterized by high unpredictability to guarantee security. We will present an assessment of the performance of our techniques with respect to security, sensitivity and robustness on the basis of a methodical and mathematically precise framework.
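To make the notion of a reduced-information image hash concrete, here is a minimal average-hash sketch with Hamming-distance comparison. This is the textbook perceptual hash used for content authentication, offered only as an illustration; it is not the secure, tamper-sensitive construction proposed in the abstract, and the toy images are hypothetical:

```python
import numpy as np

def average_hash(image, hash_size=8):
    """Reduce a grayscale image (2-D array) to a short binary signature (toy perceptual hash)."""
    h, w = image.shape
    cropped = image[: h - h % hash_size, : w - w % hash_size]
    blocks = cropped.reshape(hash_size, cropped.shape[0] // hash_size,
                             hash_size, cropped.shape[1] // hash_size).mean(axis=(1, 3))
    return (blocks > blocks.mean()).flatten()          # 64 bits for hash_size = 8

def hamming_distance(hash_a, hash_b):
    return int(np.count_nonzero(hash_a != hash_b))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = np.kron(rng.random((8, 8)), np.ones((16, 16)))         # stand-in "declared" image
    noisy = reference + rng.normal(scale=0.02, size=reference.shape)   # content-preserving noise
    tampered = reference.copy()
    tampered[32:96, 32:96] = rng.random((64, 64))                      # altered configuration
    ref_hash = average_hash(reference)
    print(hamming_distance(ref_hash, average_hash(noisy)))     # near zero: declaration confirmed
    print(hamming_distance(ref_hash, average_hash(tampered)))  # clearly larger: tampering flagged
```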
Galyean, Anne A; Filliben, James J; Holbrook, R David; Vreeland, Wyatt N; Weinberg, Howard S
2016-11-18
Asymmetric flow field-flow fractionation (AF4) has several instrumental factors that may have a direct effect on separation performance. A sensitivity analysis was applied to ascertain the relative importance of AF4 primary instrument factor settings for the separation of a complex environmental sample. The analysis evaluated the impact of instrumental factors, namely cross flow, ramp time, focus flow, injection volume, and run buffer concentration, on the multi-angle light scattering measurement of natural organic matter (NOM) molar mass (MM). A 2^(5-1) orthogonal fractional factorial design was used to minimize analysis time while preserving the accuracy and robustness in the determination of the main effects and interactions between any two instrumental factors. By assuming that separations resulting in smaller MM measurements would be more accurate, the analysis produced a ranked list of effect estimates for factors and interactions of factors based on their relative importance in minimizing the MM. The most important and statistically significant AF4 instrumental factors were buffer concentration and cross flow. The least important was ramp time. A parallel 2^(5-2) orthogonal fractional factorial design was also employed on five environmental factors for synthetic natural water samples containing silver nanoparticles (NPs), namely: NP concentration, NP size, NOM concentration, specific conductance, and pH. None of the water quality characteristic effects or interactions were found to be significant in minimizing the measured MM; however, the interaction between NP concentration and NP size was an important effect when considering NOM recovery. This work presents a structured approach for the rigorous assessment of AF4 instrument factors and optimal settings for the separation of complex samples utilizing an efficient orthogonal fractional factorial design and appropriate graphical analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
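A minimal sketch of how a 2^(5-1) fractional factorial design can be generated and main effects estimated is shown below. The factor names follow the abstract, but the choice of design generator (E = ABCD), the coded levels, and the toy response are illustrative assumptions, not the settings or data used in the study:

```python
import itertools
import numpy as np

factors = ["cross_flow", "ramp_time", "focus_flow", "injection_volume", "buffer_conc"]

# 2^(5-1) design: full factorial in the first four factors, fifth defined by E = ABCD
base = np.array(list(itertools.product([-1, 1], repeat=4)))
design = np.column_stack([base, base.prod(axis=1)])          # 16 runs, 5 coded factors

def main_effects(design, response):
    """Effect estimate for each factor: mean(response at high level) - mean(at low level)."""
    return {name: response[col == 1].mean() - response[col == -1].mean()
            for name, col in zip(factors, design.T)}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # toy response: measured molar mass dominated by buffer concentration and cross flow
    mm = 300 + 40 * design[:, 4] + 25 * design[:, 0] + rng.normal(scale=5, size=16)
    for name, effect in sorted(main_effects(design, mm).items(), key=lambda kv: -abs(kv[1])):
        print(f"{name:>16s}: {effect:+.1f}")
```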
Mehl, Steffen W.; Hill, Mary C.
2013-01-01
This report documents the addition of ghost node Local Grid Refinement (LGR2) to MODFLOW-2005, the U.S. Geological Survey modular, transient, three-dimensional, finite-difference groundwater flow model. LGR2 provides the capability to simulate groundwater flow using multiple block-shaped higher-resolution local grids (child models) within a coarser-grid parent model. LGR2 accomplishes this by iteratively coupling separate MODFLOW-2005 models such that heads and fluxes are balanced across the grid-refinement interface boundary. LGR2 can be used in two- and three-dimensional, steady-state and transient simulations and for simulations of confined and unconfined groundwater systems. Traditional one-way coupled telescopic mesh refinement methods can have large, often undetected, inconsistencies in heads and fluxes across the interface between two model grids. The iteratively coupled ghost-node method of LGR2 provides a more rigorous coupling in which the solution accuracy is controlled by convergence criteria defined by the user. In realistic problems, this can result in substantially more accurate solutions and require an increase in computer processing time. The rigorous coupling enables sensitivity analysis, parameter estimation, and uncertainty analysis that reflects conditions in both model grids. This report describes the method used by LGR2, evaluates accuracy and performance for two- and three-dimensional test cases, provides input instructions, and lists selected input and output files for an example problem. It also presents the Boundary Flow and Head (BFH2) Package, which allows the child and parent models to be simulated independently using the boundary conditions obtained through the iterative process of LGR2.
Mehl, Steffen W.; Hill, Mary C.
2006-01-01
This report documents the addition of shared node Local Grid Refinement (LGR) to MODFLOW-2005, the U.S. Geological Survey modular, transient, three-dimensional, finite-difference ground-water flow model. LGR provides the capability to simulate ground-water flow using one block-shaped higher-resolution local grid (a child model) within a coarser-grid parent model. LGR accomplishes this by iteratively coupling two separate MODFLOW-2005 models such that heads and fluxes are balanced across the shared interfacing boundary. LGR can be used in two- and three-dimensional, steady-state and transient simulations and for simulations of confined and unconfined ground-water systems. Traditional one-way coupled telescopic mesh refinement (TMR) methods can have large, often undetected, inconsistencies in heads and fluxes across the interface between two model grids. The iteratively coupled shared-node method of LGR provides a more rigorous coupling in which the solution accuracy is controlled by convergence criteria defined by the user. In realistic problems, this can result in substantially more accurate solutions and require an increase in computer processing time. The rigorous coupling enables sensitivity analysis, parameter estimation, and uncertainty analysis that reflects conditions in both model grids. This report describes the method used by LGR, evaluates LGR accuracy and performance for two- and three-dimensional test cases, provides input instructions, and lists selected input and output files for an example problem. It also presents the Boundary Flow and Head (BFH) Package, which allows the child and parent models to be simulated independently using the boundary conditions obtained through the iterative process of LGR.
Naujokaitis-Lewis, Ilona; Curtis, Janelle M R
2016-01-01
Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and the effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along with demographic parameters in sensitivity routines. GRIP 2.0 is an important decision-support tool that can be used to prioritize research, identify habitat-based thresholds and management intervention points to improve the probability of species persistence, and evaluate trade-offs of alternative management options.
Improved key-rate bounds for practical decoy-state quantum-key-distribution systems
NASA Astrophysics Data System (ADS)
Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng
2017-01-01
The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for the finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these techniques and that obtained from the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of conventional Gaussian approximations.
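For orientation, the kind of finite-size statement involved is a Chernoff-type tail bound on the observed counts; a standard textbook form (not the specific refined bound derived in the paper) is:

```latex
% For X the sum of n independent Bernoulli trials with mean \mu = E[X] and 0 < \delta < 1,
% the multiplicative Chernoff bounds give
\[
\Pr\!\left[X \ge (1+\delta)\mu\right] \le \exp\!\left(-\frac{\delta^{2}\mu}{3}\right),
\qquad
\Pr\!\left[X \le (1-\delta)\mu\right] \le \exp\!\left(-\frac{\delta^{2}\mu}{2}\right),
\]
% which replaces the heuristic Gaussian approximation to the fluctuations of the observed counts.
```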
Failure-Modes-And-Effects Analysis Of Software Logic
NASA Technical Reports Server (NTRS)
Garcia, Danny; Hartline, Thomas; Minor, Terry; Statum, David; Vice, David
1996-01-01
Rigorous analysis applied early in design effort. Method of identifying potential inadequacies and modes and effects of failures caused by inadequacies (failure-modes-and-effects analysis or "FMEA" for short) devised for application to software logic.
The DOZZ formula from the path integral
NASA Astrophysics Data System (ADS)
Kupiainen, Antti; Rhodes, Rémi; Vargas, Vincent
2018-05-01
We present a rigorous proof of the Dorn, Otto, Zamolodchikov, Zamolodchikov formula (the DOZZ formula) for the 3 point structure constants of Liouville Conformal Field Theory (LCFT) starting from a rigorous probabilistic construction of the functional integral defining LCFT given earlier by the authors and David. A crucial ingredient in our argument is a probabilistic derivation of the reflection relation in LCFT based on a refined tail analysis of Gaussian multiplicative chaos measures.
Blume-Kohout, Robin; Gamble, John King; Nielsen, Erik; ...
2017-02-15
Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone, they will depend on fault-tolerant quantum error correction (FTQEC) to compute reliably. Quantum error correction can protect against general noise if, and only if, the error in each physical qubit operation is smaller than a certain threshold. The threshold for general errors is quantified by their diamond norm. Until now, qubits have been assessed primarily by randomized benchmarking, which reports a different error rate that is not sensitive to all errors and cannot be compared directly to diamond norm thresholds. Here we use gate set tomography to completely characterize operations on a trapped-Yb+-ion qubit and demonstrate with greater than 95% confidence that they satisfy a rigorous threshold for FTQEC (diamond norm ≤6.7 × 10^-4).
Blume-Kohout, Robin; Gamble, John King; Nielsen, Erik; Rudinger, Kenneth; Mizrahi, Jonathan; Fortier, Kevin; Maunz, Peter
2017-01-01
Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone, they will depend on fault-tolerant quantum error correction (FTQEC) to compute reliably. Quantum error correction can protect against general noise if—and only if—the error in each physical qubit operation is smaller than a certain threshold. The threshold for general errors is quantified by their diamond norm. Until now, qubits have been assessed primarily by randomized benchmarking, which reports a different error rate that is not sensitive to all errors, and cannot be compared directly to diamond norm thresholds. Here we use gate set tomography to completely characterize operations on a trapped-Yb+-ion qubit and demonstrate with greater than 95% confidence that they satisfy a rigorous threshold for FTQEC (diamond norm ≤6.7 × 10−4). PMID:28198466
Does McNemar's test compare the sensitivities and specificities of two diagnostic tests?
Kim, Soeun; Lee, Woojoo
2017-02-01
McNemar's test is often used in practice to compare the sensitivities and specificities of two diagnostic tests. For correct evaluation of accuracy, an intuitive recommendation is to test the diseased and the non-diseased groups separately, so that sensitivities are compared among the diseased and specificities are compared among the healthy group. This paper provides a rigorous theoretical framework for this argument and studies the validity of McNemar's test regardless of the conditional independence assumption. We derive McNemar's test statistic under the null hypothesis considering both conditional independence and conditional dependence. We then perform power analyses to show how the result is affected by the amount of conditional dependence under the alternative hypothesis.
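As a reminder of the statistic under discussion: applied separately to the diseased group (comparing sensitivities) and to the non-diseased group (comparing specificities), McNemar's test uses only the discordant pairs,

```latex
% b = subjects positive on test 1 but negative on test 2; c = the reverse.
% Within the diseased group this compares sensitivities; within the healthy group, specificities.
\[
\chi^{2} \;=\; \frac{(b - c)^{2}}{\,b + c\,} \;\sim\; \chi^{2}_{1}
\quad \text{under } H_{0}:\ p_{b} = p_{c}.
\]
```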
Sharp Entrywise Perturbation Bounds for Markov Chains.
Thiede, Erik; VAN Koten, Brian; Weare, Jonathan
For many Markov chains of practical interest, the invariant distribution is extremely sensitive to perturbations of some entries of the transition matrix but insensitive to others; we give an example of such a chain, motivated by a problem in computational statistical physics. We derive perturbation bounds on the relative error of the invariant distribution that reveal these variations in sensitivity. Our bounds are sharp; we do not impose any structural assumptions on the transition matrix or on the perturbation, and computing the bounds has the same complexity as computing the invariant distribution or computing other bounds in the literature. Moreover, our bounds have a simple interpretation in terms of hitting times, which can be used to draw intuitive but rigorous conclusions about the sensitivity of a chain to various types of perturbations.
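A small numerical illustration of entrywise sensitivity is sketched below. The example chain (a two-block chain with a bottleneck) and the perturbation sizes are arbitrary assumptions, and the code computes the exact relative change directly rather than the hitting-time bounds derived in the paper:

```python
import numpy as np

def stationary_distribution(P):
    """Solve pi P = pi, sum(pi) = 1 for an irreducible transition matrix P."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.concatenate([np.zeros(n), [1.0]])
    return np.linalg.lstsq(A, b, rcond=None)[0]

def relative_change(P, i, j, eps):
    """Max relative error in pi after perturbing entry (i, j) by eps (row re-normalized)."""
    Q = P.copy()
    Q[i, j] += eps
    Q[i] /= Q[i].sum()
    pi, pi_pert = stationary_distribution(P), stationary_distribution(Q)
    return np.max(np.abs(pi_pert - pi) / pi)

if __name__ == "__main__":
    # a chain with a bottleneck: states {0,1} and {2,3} communicate only rarely
    P = np.array([[0.89, 0.10, 0.01, 0.00],
                  [0.10, 0.89, 0.00, 0.01],
                  [0.00, 0.01, 0.89, 0.10],
                  [0.01, 0.00, 0.10, 0.89]])
    print(relative_change(P, 0, 2, 0.005))   # rare crossing entry: noticeably larger change
    print(relative_change(P, 0, 1, 0.005))   # within-block entry: much smaller change
```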
Assessments of species' vulnerability to climate change: From pseudo to science
Wade, Alisa A.; Hand, Brian K.; Kovach, Ryan; Muhlfeld, Clint C.; Waples, Robin S.; Luikart, Gordon
2017-01-01
Climate change vulnerability assessments (CCVAs) are important tools to plan for and mitigate potential impacts of climate change. However, CCVAs often lack scientific rigor, which can ultimately lead to poor conservation prioritization and associated ecological and economic costs. We discuss the need to improve comparability and consistency of CCVAs and either validate their findings or improve assessment of CCVA uncertainty and sensitivity to methodological assumptions.
Jean-Christophe Domec; Ge Sun; Asko Noormets; Michael J. Gavazzi; Emrys A. Treasure; Erika Cohen; Jennifer J. Swenson; Steve G. McNulty; John S. King
2012-01-01
Increasing variability of rainfall patterns requires detailed understanding of the pathways of water loss from ecosystems to optimize carbon uptake and management choices. In the current study we characterized the usability of three alternative methods of different rigor for quantifying stand-level evapotranspiration (ET), partitioned ET into tree transpiration (T),...
Standards Get Boost on the Hill: Bills before Congress Aim to Raise the Bar in States
ERIC Educational Resources Information Center
Olson, Lynn
2007-01-01
This article focuses on the standards debate in the context of renewing the 5-year-old No Child Left Behind Act. The politically sensitive idea of increasing the rigor of state standards and tests by linking them to standards set at the national level is getting a push from prominent lawmakers as Congress moves to reauthorize the No Child Left…
Systematic review and meta-analysis of flow cytometry in urinary tract infection screening.
Shang, Yan-Jun; Wang, Qian-Qian; Zhang, Jian-Rong; Xu, Yu-Lian; Zhang, Wei-Wei; Chen, Yan; Gu, Ming-Li; Hu, Zhi-De; Deng, An-Mei
2013-09-23
Automated urine sediment analysis of white blood cells (WBCs) and bacteria is a promising approach for urinary tract infections (UTIs) screening. However, available data on their screening efficacy is inconsistent. English articles from Pubmed, EMBASE, and Web of Science published before December 1, 2012 were analyzed. The Quality Assessment for Studies of Diagnostic Accuracy (QUADAS) tool was used to evaluate the quality of eligible studies. Performance characteristics of WBCs and bacteria (sensitivity, specificity, and other measures of accuracy) were pooled and examined by random-effects models. Nineteen studies containing 22,305 samples were included. Pooled sensitivities were 0.87 (95% confidence interval [CI], 0.86-0.89) for WBCs and 0.92 (95% CI, 0.91-0.93) for bacteria. Corresponding pooled specificities were 0.67 (95% CI, 0.66-0.68) for WBCs and 0.60 (95% CI, 0.59-0.61) for bacteria. Areas under the summary receiver operating characteristics curves were 0.87 and 0.93 for WBCs and bacteria, respectively. The major limitation of eligible studies was that enrolled subjects were often not representative of clinical patient populations in which UTI would be suspected. WBC and bacterial measurements by the UF-100 and UF-1000i are useful indicators in UTI screening; however, the performances of these systems should be rigorously evaluated by additional studies. Copyright © 2013 Elsevier B.V. All rights reserved.
The case for treatment fidelity in active music interventions: why and how.
Wiens, Natalie; Gordon, Reyna L
2018-05-04
As the volume of studies testing the benefits of active music-making interventions increases exponentially, it is important to document what exactly is happening during music treatment sessions in order to provide evidence for the mechanisms through which music training affects other domains. Thus, to complement systematic and rigorous attention to outcomes of the treatment, we outline four vital components of treatment fidelity and discuss their implementation in nonmusic- and music-based interventions. We then describe the design of Music Impacting Language Expertise (MILEStone), a new intervention that aims to improve grammar skills in children with specific language impairment by increasing sensitivity to rhythmic structure, which may enhance general temporal processing and sensitivity to syntactic structure. We describe the approach to addressing treatment fidelity in MILEStone adapted from intervention research from other fields, including a behavioral coding system to track instructional episodes and child participation, a treatment manual, activity checklists, provider training and monitoring, a home practice log, and teacher ratings of participant engagement. This approach takes an important first step in modeling a formalized procedure for assessing treatment fidelity in active music-making intervention research, as a means of increasing methodological rigor in support of evidence-based practice in clinical and educational settings. © 2018 New York Academy of Sciences.
Scientific approaches to science policy.
Berg, Jeremy M
2013-11-01
The development of robust science policy depends on use of the best available data, rigorous analysis, and inclusion of a wide range of input. While director of the National Institute of General Medical Sciences (NIGMS), I took advantage of available data and emerging tools to analyze training time distribution by new NIGMS grantees, the distribution of the number of publications as a function of total annual National Institutes of Health support per investigator, and the predictive value of peer-review scores on subsequent scientific productivity. Rigorous data analysis should be used to develop new reforms and initiatives that will help build a more sustainable American biomedical research enterprise.
EOS imaging versus current radiography: A health technology assessment study
Mahboub-Ahari, Alireza; Hajebrahimi, Sakineh; Yusefi, Mahmoud; Velayati, Ashraf
2016-01-01
Background: EOS is a 2D/3D musculoskeletal diagnostic imaging system. The device has been developed to produce high-quality 2D, full-body radiographs in standing, sitting and squatting positions. Three-dimensional images can be reconstructed via the sterEOS software. This health technology assessment study aimed to investigate the efficacy, effectiveness and cost-effectiveness of the newly emerged EOS imaging system in comparison with conventional x-ray radiographic techniques. Methods: All cost and outcome data were assessed from the perspective of Iran's Ministry of Health. Data on clinical effectiveness were extracted using a rigorous systematic review. As clinical outcomes, the rate of x-ray emission and related quality of life were compared with Computed Radiography (CR) and Digital Radiography (DR). A standard costing method was conducted to find related direct medical costs. In order to examine the robustness of the calculated Incremental Cost Effectiveness Ratios (ICERs), we used two-way sensitivity analysis. The GDP per capita of the Islamic Republic of Iran (2012) was adopted as the cost-effectiveness threshold. Results: Review of the related literature highlighted a lack of rigorous evidence for clinical outcomes. The ultra-low-dose EOS imaging device is considered a safe intervention because of its FDA, CE and CSA certificates. The rate of emitted x-rays was 2- to 18-fold lower for EOS compared with the conventional techniques (p<0.001). The incremental cost-effectiveness ratio for EOS relative to CR was calculated at $50,706 in the baseline analysis (the first scenario), and at $50,714 and $9,446 for the second and third scenarios, respectively. Taking $42,146 as the upper limit, neither the first nor the second scenario could pass the cost-effectiveness threshold for Iran. Conclusion: The EOS imaging technique might not be considered a cost-effective intervention in the routine practice of the health system, especially within in-patient wards. Scenario analysis shows that only under optimal conditions, such as lower assembling costs and higher utilization rates, could the device be recruited for research and therapeutic purposes in pediatric orthopedic centers. PMID:27390701
Architecture of marine food webs: To be or not be a 'small-world'.
Marina, Tomás Ignacio; Saravia, Leonardo A; Cordone, Georgina; Salinas, Vanesa; Doyle, Santiago R; Momo, Fernando R
2018-01-01
The search for general properties in network structure has been a central issue for food web studies in recent years. One such property is the small-world topology, which combines high clustering with a small distance between nodes of the network. This property may increase food web resilience but may also make food webs more sensitive to the extinction of connected species. Food web theory has been developed principally from freshwater and terrestrial ecosystems, largely omitting marine habitats. Whether theory needs to be modified to accommodate observations from marine ecosystems, given major differences in several topological characteristics, is still under debate. Here we investigated whether the small-world topology is a common structural pattern in marine food webs. We developed a novel, simple and statistically rigorous method to examine the largest set of complex marine food webs to date. More than half of the analyzed marine networks exhibited a similar or lower characteristic path length than the random expectation, whereas 39% of the webs presented significantly higher clustering than their random counterparts. Our method showed that 5 out of 28 networks fulfilled both features of the small-world topology: short path length and high clustering. This work represents the first rigorous analysis of the small-world topology and its associated features in high-quality marine networks. We conclude that such topology is a structural pattern that is not maximized in marine food webs; thus it is probably not an effective model for studying the robustness, stability and feasibility of marine ecosystems.
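A condensed sketch of the kind of comparison described, observed clustering and path length versus a random-graph ensemble with the same numbers of nodes and links, is given below. This is a generic small-world test on an undirected stand-in network, not the authors' exact statistical procedure or data:

```python
import networkx as nx
import numpy as np

def small_world_summary(G, n_random=1000, seed=0):
    """Compare clustering and path length of G against same-size Erdos-Renyi graphs."""
    rng = np.random.default_rng(seed)
    n, m = G.number_of_nodes(), G.number_of_edges()
    cc_obs = nx.average_clustering(G)
    pl_obs = nx.average_shortest_path_length(G)
    cc_rand, pl_rand = [], []
    while len(cc_rand) < n_random:
        R = nx.gnm_random_graph(n, m, seed=int(rng.integers(1 << 30)))
        if not nx.is_connected(R):
            continue                        # path length is undefined on disconnected graphs
        cc_rand.append(nx.average_clustering(R))
        pl_rand.append(nx.average_shortest_path_length(R))
    # one-sided empirical p-values: is the observed web more clustered / shorter-pathed?
    p_cc = np.mean(np.array(cc_rand) >= cc_obs)
    p_pl = np.mean(np.array(pl_rand) <= pl_obs)
    return cc_obs, pl_obs, p_cc, p_pl

if __name__ == "__main__":
    web = nx.connected_watts_strogatz_graph(50, 6, 0.1, seed=1)   # stand-in for a food web
    print(small_world_summary(web, n_random=200))
```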
Photomask CD and LER characterization using Mueller matrix spectroscopic ellipsometry
NASA Astrophysics Data System (ADS)
Heinrich, A.; Dirnstorfer, I.; Bischoff, J.; Meiner, K.; Ketelsen, H.; Richter, U.; Mikolajick, T.
2014-10-01
Critical dimension and line edge roughness on photomask arrays are determined with Mueller matrix spectroscopic ellipsometry. Arrays with large sinusoidal perturbations are measured at different azimuth angles and compared with simulations based on rigorous coupled wave analysis. Experiment and simulation show that line edge roughness leads to characteristic changes in the different Mueller matrix elements. The influence of line edge roughness is interpreted as an increase in the isotropic character of the sample. The changes in the Mueller matrix elements are very similar when the arrays are statistically perturbed with rms roughness values in the nanometer range, suggesting that the results on the sinusoidal test structures are also relevant for "real" mask errors. Critical dimension errors and line edge roughness have a similar impact on the Mueller matrix spectroscopic ellipsometry measurement. To distinguish between both deviations, a strategy based on the calculation of sensitivities and correlation coefficients for all Mueller matrix elements is shown. The Mueller matrix elements M13/M31 and M34/M43 are the most suitable elements due to their high sensitivities to critical dimension errors and line edge roughness and, at the same time, a low correlation coefficient between both influences. From the simulated sensitivities, it is estimated that the measurement accuracy has to be on the order of 0.01 and 0.001 for the detection of 1 nm critical dimension error and 1 nm line edge roughness, respectively.
Teram, Eli; Schachter, Candice L; Stalker, Carol A
2005-10-01
Grounded theory and participatory action research methods are distinct approaches to qualitative inquiry. Although grounded theory has been conceptualized in constructivist terms, it has elements of positivist thinking with an image of neutral search for objective truth through rigorous data collection and analysis. Participatory action research is based on a critique of this image and calls for more inclusive research processes. It questions the possibility of objective social sciences and aspires to engage people actively in all stages of generating knowledge. The authors applied both approaches in a project designed to explore the experiences of female survivors of childhood sexual abuse with physical therapy and subsequently develop a handbook on sensitive practice for clinicians that takes into consideration the needs and perspectives of these clients. Building on this experience, they argue that the integration of grounded theory and participatory action research can empower clients to inform professional practice.
Managing unexpected events in the manufacturing of biologic medicines.
Grampp, Gustavo; Ramanan, Sundar
2013-08-01
The manufacturing of biologic medicines (biologics) requires robust process and facility design, rigorous regulatory compliance, and a well-trained workforce. Because of the complex attributes of biologics and their sensitivity to production and handling conditions, manufacturing of these medicines also requires a high-reliability manufacturing organization. As required by regulators, such an organization must monitor the state-of-control for the manufacturing process. A high-reliability organization also invests in an experienced and fully engaged technical support staff and fosters a management culture that rewards in-depth analysis of unexpected results, robust risk assessments, and timely and effective implementation of mitigation measures. Such a combination of infrastructure, technology, human capital, management, and a science-based operations culture does not occur without a strong organizational and financial commitment. These attributes of a high-reliability biologics manufacturer are difficult to achieve and may be differentiating factors as the supply of biologics diversifies in future years.
Observation of giant Goos-Hänchen and angular shifts at designed metasurfaces
Yallapragada, Venkata Jayasurya; Ravishankar, Ajith P.; Mulay, Gajendra L.; Agarwal, Girish S.; Achanta, Venu Gopal
2016-01-01
Metasurfaces with sub-wavelength features are useful in modulating the phase, amplitude or polarization of electromagnetic fields. While several applications are reported for light manipulation and control, the sharp phase changes would be useful in enhancing the beam shifts at reflection from a metasurface. In designed periodic patterns on a metal film, at surface plasmon resonance, we demonstrate a Goos-Hänchen shift on the order of 70 times the incident wavelength and angular shifts of several hundred microradians. We have designed the patterns using rigorous coupled wave analysis (RCWA) together with S-matrices, have used a complete vector theory to calculate the shifts, and demonstrate a versatile experimental setup to directly measure them. The giant shifts demonstrated could prove to be useful in enhancing the sensitivity of experiments ranging from atomic force microscopy to gravitational wave detection. PMID:26758471
Thermally driven advection for radioxenon transport from an underground nuclear explosion
NASA Astrophysics Data System (ADS)
Sun, Yunwei; Carrigan, Charles R.
2016-05-01
Barometric pumping is a ubiquitous process resulting in migration of gases in the subsurface that has been studied as the primary mechanism for noble gas transport from an underground nuclear explosion (UNE). However, at early times following a UNE, advection driven by explosion residual heat is relevant to noble gas transport. A rigorous measure is needed for demonstrating how, when, and where advection is important. In this paper three physical processes of uncertain magnitude (oscillatory advection, matrix diffusion, and thermally driven advection) are parameterized by using boundary conditions, system properties, and source term strength. Sobol' sensitivity analysis is conducted to evaluate the importance of all physical processes influencing the xenon signals. This study indicates that thermally driven advection plays a more important role in producing xenon signals than oscillatory advection and matrix diffusion at early times following a UNE, and xenon isotopic ratios are observed to have both time and spatial dependence.
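Sobol' analysis of the kind described apportions the variance of an output, here the xenon signal, among uncertain inputs. A minimal sketch with the SALib package, using placeholder parameter names and a stand-in algebraic model rather than the study's transport simulator:

import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["thermal_advection", "oscillatory_advection", "matrix_diffusion"],
    "bounds": [[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]],
}

X = saltelli.sample(problem, 1024)                # Saltelli sampling of the parameter space
Y = X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.1 * X[:, 2]  # stand-in for the simulated xenon signal
Si = sobol.analyze(problem, Y)
print(Si["S1"], Si["ST"])                         # first-order and total-order Sobol' indices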
SCOTCH: Secure Counting Of encrypTed genomiC data using a Hybrid approach
Chenghong, Wang; Jiang, Yichen; Mohammed, Noman; Chen, Feng; Jiang, Xiaoqian; Al Aziz, Md Momin; Sadat, Md Nazmus; Wang, Shuang
2017-01-01
As genomic data are usually at large scale and highly sensitive, it is essential to enable both efficient and secure analysis, by which the data owner can securely delegate both computation and storage on untrusted public cloud. Counting query of genotypes is a basic function for many downstream applications in biomedical research (e.g., computing allele frequency, calculating chi-squared statistics, etc.). Previous solutions show promise on secure counting of outsourced data but the efficiency is still a big limitation for real world applications. In this paper, we propose a novel hybrid solution to combine a rigorous theoretical model (homomorphic encryption) and the latest hardware-based infrastructure (i.e., Software Guard Extensions) to speed up the computation while preserving the privacy of both data owners and data users. Our results demonstrated efficiency by using the real data from the personal genome project. PMID:29854245
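The additively homomorphic half of the hybrid scheme can be illustrated with the Paillier cryptosystem; the sketch below uses the python-paillier package purely as an illustration of encrypted counting, not the authors' implementation (which also relies on an SGX enclave):

from phe import paillier  # python-paillier, chosen here only for illustration

public_key, private_key = paillier.generate_paillier_keypair()

# The data owner encrypts a 0/1 indicator per record: 1 if it carries the queried genotype.
indicators = [1, 0, 1, 1, 0]
ciphertexts = [public_key.encrypt(x) for x in indicators]

# An untrusted server adds ciphertexts without learning any individual value.
encrypted_count = ciphertexts[0]
for c in ciphertexts[1:]:
    encrypted_count = encrypted_count + c

# Only the key holder can decrypt the aggregate count.
print(private_key.decrypt(encrypted_count))  # 3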
Near Identifiability of Dynamical Systems
NASA Technical Reports Server (NTRS)
Hadaegh, F. Y.; Bekey, G. A.
1987-01-01
Concepts regarding approximate mathematical models treated rigorously. Paper presents new results in analysis of structural identifiability, equivalence, and near equivalence between mathematical models and physical processes they represent. Helps establish rigorous mathematical basis for concepts related to structural identifiability and equivalence revealing fundamental requirements, tacit assumptions, and sources of error. "Structural identifiability," as used by workers in this field, loosely translates as meaning ability to specify unique mathematical model and set of model parameters that accurately predict behavior of corresponding physical system.
Mechanical properties of frog skeletal muscles in iodoacetic acid rigor.
Mulvany, M J
1975-01-01
1. Methods have been developed for describing the length:tension characteristics of frog skeletal muscles which go into rigor at 4 degrees C following iodoacetic acid poisoning either in the presence of Ca2+ (Ca-rigor) or its absence (Ca-free-rigor). 2. Such rigor muscles showed less resistance to slow stretch (slow rigor resistance) than to fast stretch (fast rigor resistance). The slow and fast rigor resistances of Ca-free-rigor muscles were much lower than those of Ca-rigor muscles. 3. The slow rigor resistance of Ca-rigor muscles was proportional to the amount of overlap between the contractile filaments present when the muscles were put into rigor. 4. Withdrawing Ca2+ from Ca-rigor muscles (induced-Ca-free rigor) reduced their slow and fast rigor resistances. Readdition of Ca2+ (but not Mg2+, Mn2+ or Sr2+) reversed the effect. 5. The slow and fast rigor resistances of Ca-rigor muscles (but not of Ca-free-rigor muscles) decreased with time. 6. The sarcomere structure of Ca-rigor and induced-Ca-free rigor muscles stretched by 0.2lo was destroyed in proportion to the amount of stretch, but the lengths of the remaining intact sarcomeres were essentially unchanged. This suggests that there had been a successive yielding of the weakest sarcomeres. 7. The difference between the slow and fast rigor resistance and the effect of calcium on these resistances are discussed in relation to possible variations in the strength of crossbridges between the thick and thin filaments. PMID:1082023
Cooper, Glinda S; Lunn, Ruth M; Ågerstrand, Marlene; Glenn, Barbara S; Kraft, Andrew D; Luke, April M; Ratcliffe, Jennifer M
2016-01-01
A critical step in systematic reviews of potential health hazards is the structured evaluation of the strengths and weaknesses of the included studies; risk of bias is a term often used to represent this process, specifically with respect to the evaluation of systematic errors that can lead to inaccurate (biased) results (i.e. focusing on internal validity). Systematic review methods developed in the clinical medicine arena have been adapted for use in evaluating environmental health hazards; this expansion raises questions about the scope of risk of bias tools and the extent to which they capture the elements that can affect the interpretation of results from environmental and occupational epidemiology studies and in vivo animal toxicology studies (the studies typically available for assessment of risk of chemicals). One such element, described here as "sensitivity", is a measure of the ability of a study to detect a true effect or hazard. This concept is similar to the concept of the sensitivity of an assay; an insensitive study may fail to show a difference that truly exists, leading to a false conclusion of no effect. Factors relating to study sensitivity should be evaluated in a systematic manner with the same rigor as the evaluation of other elements within a risk of bias framework. We discuss the importance of this component for the interpretation of individual studies, examine approaches proposed or in use to address it, and describe how it relates to other evaluation components. The evaluation domains contained within a risk of bias tool can include, or can be modified to include, some features relating to study sensitivity; the explicit inclusion of these sensitivity criteria with the same rigor and at the same stage of study evaluation as other bias-related criteria can improve the evaluation process. In some cases, these and other features may be better addressed through a separate sensitivity domain. The combined evaluation of risk of bias and sensitivity can be used to identify the most informative studies, to evaluate the confidence of the findings from individual studies and to identify those study elements that may help to explain heterogeneity across the body of literature. Copyright © 2016. Published by Elsevier Ltd.
Frempong, Samuel N; Sutton, Andrew J; Davenport, Clare; Barton, Pelham
2018-02-01
There is little specific guidance on the implementation of cost-effectiveness modelling at the early stage of test development. The aim of this study was to review the literature in this field to examine the methodologies and tools that have been employed to date. Areas Covered: A systematic review to identify relevant studies in established literature databases. Five studies were identified and included for narrative synthesis. These studies revealed that there is no consistent approach in this growing field. The perspective of patients and the potential for value of information (VOI) to provide information on the value of future research is often overlooked. Test accuracy is an essential consideration, with most studies having described and included all possible test results in their analysis, and conducted extensive sensitivity analyses on important parameters. Headroom analysis was considered in some instances but at the early development stage (not the concept stage). Expert commentary: The techniques available to modellers that can demonstrate the value of conducting further research and product development (i.e. VOI analysis, headroom analysis) should be better utilized. There is the need for concerted efforts to develop rigorous methodology in this growing field to maximize the value and quality of such analysis.
Drama-induced affect and pain sensitivity.
Zillmann, D; de Wied, M; King-Jablonski, C; Jenzowsky, S
1996-01-01
This study was conducted to examine the pain-ameliorating and pain-sensitizing effects of exposure to emotionally engaging drama. Specifically, the consequences for pain sensitivity of exposure to dramatic expositions differing in both excitatory and hedonic qualities were determined. Hedonically negative, neutral, and positive affective states were induced in male respondents by exposure to excerpts from cinematic drama. Pain sensitivity was assessed by the cuff-pressure procedure before and after exposure and by the cold pressor test after exposure only. When compared against the control condition, pain sensitivity diminished under conditions of hedonically positive affect. An inverse effect was suggested for hedonically negative conditions, but proved tentative and statistically unreliable. The findings are consistent with earlier demonstrations of mood effects on pain sensitivity. Unlike inconclusive earlier findings concerning the magnitude of directional effects, however, they suggest an asymmetry that emphasizes the pain-ameliorating effect of positive affects while lending little, if any, support to the proposal of a pain-sensitizing effect of negative affects. The investigation did not accomplish the intended creation of conditions necessary to test the proposal that heightened sympathetic activity diminishes pain sensitivity. The utility of a rigorous determination of this hypothesized relationship is emphasized, and procedures for a viable test of the proposal are suggested.
3D Airborne Electromagnetic Inversion: A case study from the Musgrave Region, South Australia
NASA Astrophysics Data System (ADS)
Cox, L. H.; Wilson, G. A.; Zhdanov, M. S.; Sunwall, D. A.
2012-12-01
Geophysicists know and accept that geology is inherently 3D, and results from complex, overlapping processes related to genesis, metamorphism, deformation, alteration, weathering, and/or hydrogeology. Yet, the geophysics community has long relied on qualitative analysis, conductivity depth imaging (CDIs), 1D inversion, and/or plate modeling. There are many reasons for this deficiency, not the least of which has been the lack of capacity for historic 3D AEM inversion algorithms to invert entire surveys so as to practically affect exploration decisions. Our recent introduction of a moving sensitivity domain (footprint) methodology has been a paradigm shift in AEM interpretation. The basis of this method is that one needs only to calculate the responses and sensitivities for that part of the 3D earth model that is within the AEM system's sensitivity domain (footprint), and then superimpose all sensitivity domains into a single, sparse sensitivity matrix for the entire 3D earth model which is then updated in a regularized inversion scheme. This has made it practical to rigorously invert entire surveys with thousands of line kilometers of AEM data to mega-cell 3D models in hours using multi-processor workstations. Since 2010, over eighty individual projects have been completed for Aerodat, AEROTEM, DIGHEM, GEOTEM, HELITEM, HoisTEM, MEGATEM, RepTEM, RESOLVE, SkyTEM, SPECTREM, TEMPEST, and VTEM data from Australia, Brazil, Canada, Finland, Ghana, Peru, Tanzania, the US, and Zambia. Examples of 3D AEM inversion have been published for a variety of applications, including mineral exploration, oil sands exploration, salinity, permafrost, and bathymetry mapping. In this paper, we present a comparison of 3D inversions for SkyTEM, SPECTREM, TEMPEST and VTEM data acquired over the same area in the Musgrave region of South Australia for exploration under cover.
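The moving-footprint idea, computing each sounding's sensitivities only over the cells inside its footprint and scattering them into one sparse matrix for the whole model, can be sketched with SciPy sparse matrices. The footprint indices and sensitivity values below are placeholders, not the production algorithm:

import numpy as np
from scipy.sparse import lil_matrix, csr_matrix

def assemble_footprint_jacobian(n_data, n_cells, footprints, local_sensitivities):
    # footprints[i]: indices of the model cells inside sounding i's footprint.
    # local_sensitivities[i]: the corresponding row of sensitivities (same length).
    J = lil_matrix((n_data, n_cells))
    for i, (cells, sens) in enumerate(zip(footprints, local_sensitivities)):
        J[i, cells] = sens      # scatter the local row into the global sparse matrix
    return csr_matrix(J)        # compressed form for use in a regularized inversion

# Tiny illustration: 3 soundings, a 10-cell model, footprints of 4 cells each.
fp = [np.array([0, 1, 2, 3]), np.array([3, 4, 5, 6]), np.array([6, 7, 8, 9])]
ls = [np.ones(4), 0.5 * np.ones(4), 0.25 * np.ones(4)]
print(assemble_footprint_jacobian(3, 10, fp, ls).toarray())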
On analyticity of linear waves scattered by a layered medium
NASA Astrophysics Data System (ADS)
Nicholls, David P.
2017-10-01
The scattering of linear waves by periodic structures is a crucial phenomenon in many branches of applied physics and engineering. In this paper we establish rigorous analytic results necessary for the proper numerical analysis of a class of High-Order Perturbation of Surfaces methods for simulating such waves. More specifically, we prove a theorem on existence and uniqueness of solutions to a system of partial differential equations which model the interaction of linear waves with a multiply layered periodic structure in three dimensions. This result provides hypotheses under which a rigorous numerical analysis could be conducted for recent generalizations to the methods of Operator Expansions, Field Expansions, and Transformed Field Expansions.
Razavi, Morteza; Frick, Lauren E; LaMarr, William A; Pope, Matthew E; Miller, Christine A; Anderson, N Leigh; Pearson, Terry W
2012-12-07
We investigated the utility of an SPE-MS/MS platform in combination with a modified SISCAPA workflow for chromatography-free MRM analysis of proteotypic peptides in digested human plasma. This combination of SISCAPA and SPE-MS/MS technology allows sensitive, MRM-based quantification of peptides from plasma digests with a sample cycle time of ∼7 s, a 300-fold improvement over typical MRM analyses with analysis times of 30-40 min that use liquid chromatography upstream of MS. The optimized system includes capture and enrichment to near purity of target proteotypic peptides using rigorously selected, high affinity, antipeptide monoclonal antibodies and reduction of background peptides using a novel treatment of magnetic bead immunoadsorbents. Using this method, we have successfully quantitated LPS-binding protein and mesothelin (concentrations of ∼5000 ng/mL and ∼10 ng/mL, respectively) in human plasma. The method eliminates the need for upstream liquid-chromatography and can be multiplexed, thus facilitating quantitative analysis of proteins, including biomarkers, in large sample sets. The method is ideal for high-throughput biomarker validation after affinity enrichment and has the potential for applications in clinical laboratories.
Quantitative high-performance liquid chromatography of nucleosides in biological materials.
Gehrke, C W; Kuo, K C; Davis, G E; Suits, R D; Waalkes, T P; Borek, E
1978-03-21
A rigorous, comprehensive, and reliable reversed-phase high-performance liquid chromatographic (HPLC) method has been developed for the analysis of ribonucleosides in urine (psi, m1A, m1I, m2G, A, m2(2)G). An initial isolation of ribonucleosides with an affinity gel containing an immobilized phenylboronic acid was used to improve selectivity and sensitivity. Response for all nucleosides was linear from 0.1 to 50 nmoles injected and good quantitation was obtained for 25 microliter or less of sample placed on the HPLC column. Excellent precision of analysis for urinary nucleosides was achieved on matrix dependent and independent samples, and the high resolution of the reversed-phase column allowed the complete separation of 9 nucleosides from other unidentified UV absorbing components at the 1-ng level. Supporting experimental data are presented on precision, recovery, chromatographic methods, minimum detection limit, retention time, relative molar response, sample clean-up, stability of nucleosides, boronate gel capacity, and application to analysis of urine from patients with leukemia and breast cancer. This method is now being used routinely for the determination of the concentration and ratios of nucleosides in urine from patients with different types of cancer and in chemotherapy response studies.
[Application of evidence based medicine to the individual patient: the role of decision analysis].
Housset, B; Junod, A F
2003-11-01
The objective of evidence based medicine (EBM) is to contribute to medical decision making by providing the best possible information in terms of validity and relevance. This allows evaluation in a specific manner of the benefits and risks of a decision. The limitations and hazards of this approach are discussed in relation to a clinical case where the diagnosis of pulmonary embolism was under consideration. The individual details and the limited availability of some technical procedures illustrate the need to adapt the data of EBM to the circumstances. The choice between two diagnostic tests (d-dimers and ultrasound of the legs) and their optimal timing is analysed with integration of the consequences for the patient of the treatments proposed. This allows discussion of the concept of utility and the use of sensitivity analysis. If EBM is the cornerstone of rational and explicit practice, it should also allow for the constraints of real life. Decision analysis, which depends on the same critical demands as EBM but can also take account of the individual features of each patient and test the robustness of a decision, gives a unique opportunity to reconcile rigorous reasoning with individualisation of management.
The accurate assessment of small-angle X-ray scattering data
Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; ...
2015-01-23
Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.
Molecular Characterization of Transgenic Events Using Next Generation Sequencing Approach.
Guttikonda, Satish K; Marri, Pradeep; Mammadov, Jafar; Ye, Liang; Soe, Khaing; Richey, Kimberly; Cruse, James; Zhuang, Meibao; Gao, Zhifang; Evans, Clive; Rounsley, Steve; Kumpatla, Siva P
2016-01-01
Demand for the commercial use of genetically modified (GM) crops has been increasing in light of the projected growth of world population to nine billion by 2050. A prerequisite of paramount importance for regulatory submissions is the rigorous safety assessment of GM crops. One of the components of safety assessment is molecular characterization at DNA level which helps to determine the copy number, integrity and stability of a transgene; characterize the integration site within a host genome; and confirm the absence of vector DNA. Historically, molecular characterization has been carried out using Southern blot analysis coupled with Sanger sequencing. While this is a robust approach to characterize the transgenic crops, it is both time- and resource-consuming. The emergence of next-generation sequencing (NGS) technologies has provided a highly sensitive and cost- and labor-effective alternative for molecular characterization compared to traditional Southern blot analysis. Herein, we have demonstrated the successful application of both whole genome sequencing and target capture sequencing approaches for the characterization of single and stacked transgenic events and compared the results and inferences with the traditional method with respect to key criteria required for regulatory submissions.
Calhoun, Vince D.; Maciejewski, Paul K.; Pearlson, Godfrey D.; Kiehl, Kent A.
2009-01-01
Schizophrenia and bipolar disorder are currently diagnosed on the basis of psychiatric symptoms and longitudinal course. The determination of a reliable, biologically-based diagnostic indicator of these diseases (a biomarker) could provide the groundwork for developing more rigorous tools for differential diagnosis and treatment assignment. Recently, methods have been used to identify distinct sets of brain regions or “spatial modes” exhibiting temporally coherent brain activity. Using functional magnetic resonance imaging (fMRI) data and a multivariate analysis method, independent component analysis, we combined the temporal lobe and the default modes to discriminate subjects with bipolar disorder, chronic schizophrenia, and healthy controls. Temporal lobe and default mode networks were reliably identified in all participants. Classification results on an independent set of individuals revealed an average sensitivity and specificity of 90 and 95%, respectively. The use of coherent brain networks such as the temporal lobe and default mode networks may provide a more reliable measure of disease state than task-correlated fMRI activity. A combination of two such hemodynamic brain networks shows promise as a biomarker for schizophrenia and bipolar disorder. PMID:17894392
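The reported classification performance reduces to the usual confusion-matrix definitions of sensitivity and specificity; a minimal, self-contained sketch with illustrative labels, not the study's data:

def sensitivity_specificity(y_true, y_pred, positive=1):
    # Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative labels only (1 = patient, 0 = healthy control).
sens, spec = sensitivity_specificity([1, 1, 1, 0, 0, 0], [1, 1, 0, 0, 0, 1])
print(sens, spec)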
Surrogate decision making and intellectual virtue.
Bock, Gregory L
2014-01-01
Patients can be harmed by a religiously motivated surrogate decision maker whose decisions are contrary to the standard of care; therefore, surrogate decision making should be held to a high standard. Stewart Eskew and Christopher Meyers proposed a two-part rule for deciding which religiously based decisions to honor: (1) a secular reason condition and (2) a rationality condition. The second condition is based on a coherence theory of rationality, which they claim is accessible, generous, and culturally sensitive. In this article, I will propose strengthening the rationality condition by grounding it in a theory of intellectual virtue, which is both rigorous and culturally sensitive. Copyright 2014 The Journal of Clinical Ethics. All rights reserved.
NASA Technical Reports Server (NTRS)
Contreras, Michael T.; Trease, Brian P.; Bojanowski, Cezary; Kulakx, Ronald F.
2013-01-01
A wheel experiencing sinkage and slippage events poses a high risk to planetary rover missions as evidenced by the mobility challenges endured by the Mars Exploration Rover (MER) project. Current wheel design practice utilizes loads derived from a series of events in the life cycle of the rover which do not include (1) failure metrics related to wheel sinkage and slippage and (2) performance trade-offs based on grouser placement/orientation. Wheel designs are rigorously tested experimentally through a variety of drive scenarios and simulated soil environments; however, a robust simulation capability is still in development due to the myriad of complex interaction phenomena that contribute to wheel sinkage and slippage conditions such as soil composition, large deformation soil behavior, wheel geometry, nonlinear contact forces, terrain irregularity, etc. For the purposes of modeling wheel sinkage and slippage at an engineering scale, meshfree finite element approaches enable simulations that capture sufficient detail of wheel-soil interaction while remaining computationally feasible. This study implements the JPL wheel-soil benchmark problem in the commercial code environment utilizing the large deformation modeling capability of Smooth Particle Hydrodynamics (SPH) meshfree methods. The nominal, benchmark wheel-soil interaction model that produces numerically stable and physically realistic results is presented and simulations are shown for both wheel traverse and wheel sinkage cases. A sensitivity analysis developing the capability and framework for future flight applications is conducted to illustrate the importance of perturbations to critical material properties and parameters. Implementation of the proposed soil-wheel interaction simulation capability and associated sensitivity framework has the potential to reduce experimentation cost and improve the early stage wheel design process.
Pushing boundaries-culture-sensitive care in oncology and palliative care: a qualitative study.
Schrank, Beate; Rumpold, Tamara; Amering, Michaela; Masel, Eva Katharina; Watzke, Herbert; Schur, Sophie
2017-06-01
In increasingly globalized societies, patient-centered cancer care requires culture-sensitive approaches in order to ensure patients' well-being. While migrant patients' needs are frequently reported in the literature, staff members' perception of work with migrant patients, associated challenges, or individual work approaches are largely unknown. This study addresses this research gap through qualitative exploration of experiences of multicultural health care professionals in supportive oncology and palliative care, working with patients from different cultural backgrounds. This study aims to understand staff experience of the impact of culture on cancer care. This study was conducted at the Medical University of Vienna, including staff from different settings of oncology and palliative care, in different professional positions, and with a range of individual migration backgrounds. Semistructured interviews were conducted with 21 staff members working with patients from different cultural backgrounds. Interviews explored views on the impact of culture on care and were audio-taped, transcribed, and analyzed using a rigorous method of thematic analysis, enhanced with grounded theory techniques. Interviews revealed 4 key topics: culture-specific differences, assumed reasons for differences, consequences of multicultural care, and tools for culture-sensitive care. Strategies to better deal with migrant patients and their families were suggested to improve work satisfaction amongst staff. This study identifies relevant staff challenges in work with migrant patients. Concrete suggestions for improvement include measures on an organizational level, team level, and personal tools. The suggested measures are applicable to improve work satisfaction and culture-sensitive care not only in cancer care but also in other areas of medicine. Copyright © 2016 John Wiley & Sons, Ltd.
Telemedicine in the management of chronic pain: a cost analysis study.
Pronovost, Antoine; Peng, Philip; Kern, Ralph
2009-08-01
Telemedicine provides patients with easy and remote access to consultant expertise irrespective of geographic location. In a randomized controlled trial, this study has applied a rigorous costing methodology to the use of telemedicine in chronic pain management. We performed a randomized two-period crossover trial comparing in-person (IP) consultation with telemedicine (TM) consultation in the management of chronic pain. Over an 18-month period, 26 patients each completed two diaries capturing their direct and indirect travel costs, daily pain scores, and satisfaction with physician consultation. Costing models were developed to account for direct, indirect, fixed, and variable costs in order to perform break-even analyses. Sensitivity analysis was performed over a broad range of assumptions. Direct patient costs were significantly lower in the TM group than in the IP group, with median cost and interquartile range 133 dollars (28-377) vs 443 dollars (292-1075), respectively (P = 0.001). More patients were highly satisfied with the TM consultation than with the IP consultation (56 and 24%, respectively; P < 0.05). Break-even annual patient volume was estimated at 57 patients. A two-way sensitivity analysis controlling for annual patient volume and round-trip distance indicated that TM remains cost-effective at volumes >50 patients/year or at round-trip distances >200 km. Telemedicine is cost-effective over a broad range of assumptions, including annual patient volumes, travel distance, fuel costs, amortization, and discount rates. This study provides data from a real-world setting to determine relevant thresholds and targets for establishing a TM program for patients who are undergoing chronic pain therapy.
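The break-even volume and the two-way sensitivity analysis reported here amount to a short calculation; the cost figures below are placeholders chosen for illustration, not the study's inputs:

def break_even_volume(fixed_annual_cost, saving_per_patient):
    # Annual patient volume at which per-patient telemedicine savings offset fixed programme costs.
    return fixed_annual_cost / saving_per_patient

def two_way_sensitivity(fixed_annual_cost, travel_cost_per_km, volumes, round_trip_km):
    # Flag (volume, distance) combinations where telemedicine is at least cost-neutral.
    table = {}
    for v in volumes:
        for d in round_trip_km:
            avoided_travel = v * d * travel_cost_per_km
            table[(v, d)] = avoided_travel >= fixed_annual_cost
    return table

# Hypothetical inputs only.
print(break_even_volume(fixed_annual_cost=17100.0, saving_per_patient=300.0))
print(two_way_sensitivity(17100.0, 1.0, volumes=[25, 50, 75], round_trip_km=[100, 200, 400]))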
Shao, Huikai; Zhao, Lingguo; Chen, Fuchao; Zeng, Shengbo; Liu, Shengquan; Li, Jiajia
2015-01-01
Background In the past decades, a large number of randomized controlled trials (RCTs) on the efficacy of ligustrazine injection combined with conventional antianginal drugs for angina pectoris have been reported. However, these RCTs have not been evaluated in accordance with PRISMA systematic review standards. The aim of this study was to evaluate the efficacy of ligustrazine injection as adjunctive therapy for angina pectoris. Material/Methods The databases PubMed, Medline, Cochrane Library, Embase, Sino-Med, Wanfang Databases, Chinese Scientific Journal Database, Google Scholar, Chinese Biomedical Literature Database, China National Knowledge Infrastructure, and the Chinese Science Citation Database were searched for published RCTs. Meta-analysis was performed on the primary outcome measures, including the improvements of electrocardiography (ECG) and the reductions in angina symptoms. Sensitivity and subgroup analysis based on the M score (the refined Jadad scores) were also used to evaluate the effect of quality, sample size, and publication year of the included RCTs on the overall effect of ligustrazine injection. Results Eleven RCTs involving 870 patients with angina pectoris were selected in this study. Compared with conventional antianginal drugs alone, ligustrazine injection combined with antianginal drugs significantly increased the efficacy in symptom improvement (odds ratio [OR], 3.59; 95% confidence interval [CI]: 2.39 to 5.40) and in ECG improvement (OR, 3.42; 95% CI: 2.33 to 5.01). Sensitivity and subgroup analysis also confirmed that ligustrazine injection had better effect in the treatment of angina pectoris as adjunctive therapy. Conclusions The 11 eligible RCTs indicated that ligustrazine injection as adjunctive therapy was more effective than antianginal drugs alone. However, due to the low quality of included RCTs, more rigorously designed RCTs were still needed to verify the effects of ligustrazine injection as adjunctive therapy for angina pectoris. PMID:26615387
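Pooling trial-level odds ratios of the kind reported (an OR with a 95% CI per study) is commonly done on the log scale with inverse-variance weights; the sketch below is a generic fixed-effect pooling and is not necessarily the exact model the authors used:

import math

def pool_odds_ratios(ors, cis):
    # Fixed-effect, inverse-variance pooling of odds ratios given (lower, upper) 95% CIs.
    log_ors, weights = [], []
    for or_i, (lo, hi) in zip(ors, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the CI width
        log_ors.append(math.log(or_i))
        weights.append(1.0 / se ** 2)
    pooled = sum(w * x for w, x in zip(weights, log_ors)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return math.exp(pooled), (math.exp(pooled - 1.96 * pooled_se), math.exp(pooled + 1.96 * pooled_se))

# Three hypothetical trials, for illustration only.
print(pool_odds_ratios([3.2, 4.1, 2.8], [(1.5, 6.8), (2.0, 8.4), (1.2, 6.5)]))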
Digital morphogenesis via Schelling segregation
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2018-04-01
Schelling's model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent based models studied by economists and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied it has largely resisted rigorous analysis, prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012 Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model's behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
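For readers unfamiliar with the dynamics being analysed, a toy simulation of the basic Schelling process is sketched below; the grid size, tolerance and random-relocation rule are illustrative choices, not the specific variant treated in the paper:

import random

def run_schelling(size=30, empty_frac=0.1, tolerance=0.5, steps=20000, seed=0):
    # Toy Schelling model: an agent unhappy with its neighbourhood jumps to a random empty cell.
    rng = random.Random(seed)
    n_cells = size * size
    n_empty = int(n_cells * empty_frac)
    n_agents = n_cells - n_empty
    cells = [None] * n_empty + [0, 1] * (n_agents // 2) + [0] * (n_agents % 2)
    rng.shuffle(cells)
    grid = [cells[i * size:(i + 1) * size] for i in range(size)]

    def like_fraction(r, c):
        me, same, total = grid[r][c], 0, 0
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                nr, nc = (r + dr) % size, (c + dc) % size  # wrap-around neighbourhood
                if grid[nr][nc] is not None:
                    total += 1
                    same += grid[nr][nc] == me
        return same / total if total else 1.0

    for _ in range(steps):
        r, c = rng.randrange(size), rng.randrange(size)
        if grid[r][c] is None or like_fraction(r, c) >= tolerance:
            continue  # empty cell, or agent already content
        empties = [(i, j) for i in range(size) for j in range(size) if grid[i][j] is None]
        er, ec = rng.choice(empties)
        grid[er][ec], grid[r][c] = grid[r][c], None
    return grid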
2014-01-01
Background Cancer detection using sniffer dogs is a potential technology for clinical use and research. Our study sought to determine whether dogs could be trained to discriminate the odour of urine from men with prostate cancer from controls, using rigorous testing procedures and well-defined samples from a major research hospital. Methods We attempted to train ten dogs by initially rewarding them for finding and indicating individual prostate cancer urine samples (Stage 1). If dogs were successful in Stage 1, we then attempted to train them to discriminate prostate cancer samples from controls (Stage 2). The number of samples used to train each dog varied depending on their individual progress. Overall, 50 unique prostate cancer and 67 controls were collected and used during training. Dogs that passed Stage 2 were tested for their ability to discriminate 15 (Test 1) or 16 (Tests 2 and 3) unfamiliar prostate cancer samples from 45 (Test 1) or 48 (Tests 2 and 3) unfamiliar controls under double-blind conditions. Results Three dogs reached training Stage 2 and two of these learnt to discriminate potentially familiar prostate cancer samples from controls. However, during double-blind tests using new samples the two dogs did not indicate prostate cancer samples more frequently than expected by chance (Dog A sensitivity 0.13, specificity 0.71, Dog B sensitivity 0.25, specificity 0.75). The other dogs did not progress past Stage 1 as they did not have optimal temperaments for the sensitive odour discrimination training. Conclusions Although two dogs appeared to have learnt to select prostate cancer samples during training, they did not generalise on a prostate cancer odour during robust double-blind tests involving new samples. Our study illustrates that these rigorous tests are vital to avoid drawing misleading conclusions about the abilities of dogs to indicate certain odours. Dogs may memorise the individual odours of large numbers of training samples rather than generalise on a common odour. The results do not exclude the possibility that dogs could be trained to detect prostate cancer. We recommend that canine olfactory memory is carefully considered in all future studies and rigorous double-blind methods used to avoid confounding effects. PMID:24575737
Why Open-Ended Survey Questions Are Unlikely to Support Rigorous Qualitative Insights.
LaDonna, Kori A; Taylor, Taryn; Lingard, Lorelei
2018-03-01
Health professions education researchers are increasingly relying on a combination of quantitative and qualitative research methods to explore complex questions in the field. This important and necessary development, however, creates new methodological challenges that can affect both the rigor of the research process and the quality of the findings. One example is "qualitatively" analyzing free-text responses to survey or assessment instrument questions. In this Invited Commentary, the authors explain why analysis of such responses rarely meets the bar for rigorous qualitative research. While the authors do not discount the potential for free-text responses to enhance quantitative findings or to inspire new research questions, they caution that these responses rarely produce data rich enough to generate robust, stand-alone insights. The authors consider exemplars from health professions education research and propose strategies for treating free-text responses appropriately.
IMPROVING ALTERNATIVES FOR ENVIRONMENTAL IMPACT ASSESSMENT. (R825758)
Environmental impact assessment (EIA), in the US, requires an objective and rigorous analysis of alternatives. Yet the choice of alternatives for that analysis can be subjective and arbitrary. Alternatives often reflect narrow project objectives, agency agendas, and predilecti...
FORMAL SCENARIO DEVELOPMENT FOR ENVIRONMENTAL IMPACT ASSESSMENT STUDIES
Scenario analysis is a process of evaluating possible future events through the consideration of alternative plausible (though not equally likely) outcomes (scenarios). The analysis is designed to enable improved decision-making and assessment through a more rigorous evaluation o...
Comparative sensitizing potencies of fragrances, preservatives, and hair dyes.
Lidén, Carola; Yazar, Kerem; Johansen, Jeanne D; Karlberg, Ann-Therese; Uter, Wolfgang; White, Ian R
2016-11-01
The local lymph node assay (LLNA) is used for assessing sensitizing potential in hazard identification and risk assessment for regulatory purposes. Sensitizing potency on the basis of the LLNA is categorized into extreme (EC3 value of ≤0.2%), strong (>0.2% to ≤2%), and moderate (>2%). To compare the sensitizing potencies of fragrance substances, preservatives, and hair dye substances, which are skin sensitizers that frequently come into contact with the skin of consumers and workers, LLNA results and EC3 values for 72 fragrance substances, 25 preservatives and 107 hair dye substances were obtained from two published compilations of LLNA data and opinions by the Scientific Committee on Consumer Safety and its predecessors. The median EC3 values of fragrances (n = 61), preservatives (n = 19) and hair dyes (n = 59) were 5.9%, 0.9%, and 1.3%, respectively. The majority of sensitizing preservatives and hair dyes are thus strong or extreme sensitizers (EC3 value of ≤2%), and fragrances are mostly moderate sensitizers. Although fragrances are typically moderate sensitizers, they are among the most frequent causes of contact allergy. This indicates that factors other than potency need to be addressed more rigorously in risk assessment and risk management. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
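The EC3 cut-offs quoted above translate directly into a categorization rule; a trivial sketch:

def llna_potency_category(ec3_percent):
    # LLNA potency bands as quoted above: extreme <= 0.2%, strong > 0.2% to <= 2%, moderate > 2%.
    if ec3_percent <= 0.2:
        return "extreme"
    if ec3_percent <= 2.0:
        return "strong"
    return "moderate"

# Median EC3 values reported for fragrances, preservatives and hair dyes.
print([llna_potency_category(x) for x in (5.9, 0.9, 1.3)])  # ['moderate', 'strong', 'strong']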
Xu, Mei-Mei; Jia, Hong-Yu; Yan, Li-Li; Li, Shan-Shan; Zheng, Yue
2017-01-01
Background: This meta-analysis aimed to provide a pooled analysis of prospective controlled trials comparing the diagnostic accuracy of 22-G and 25-G needles on endoscopic ultrasonography (EUS-FNA) of the solid pancreatic mass. Methods: We established a rigorous study protocol according to Cochrane Collaboration recommendations. We systematically searched the PubMed and Embase databases to identify articles to include in the meta-analysis. Sensitivity, specificity, and corresponding 95% confidence intervals were calculated for 22-G and 25-G needles of individual studies from the contingency tables. Results: Eleven prospective controlled trials included a total of 837 patients (412 with 22-G vs 425 with 25-G). Our outcomes revealed that 25-G needles (92% [95% CI, 89%–95%]) have higher sensitivity than 22-G needles (88% [95% CI, 84%–91%]) on solid pancreatic mass EUS-FNA (P = 0.046). However, there were no significant differences between the 2 groups in overall diagnostic specificity (P = 0.842). The pooled positive likelihood ratio was 12.61 (95% CI, 5.65–28.14), and the negative likelihood ratio was 0.16 (95% CI, 0.12–0.21) for the 22-G needle. The pooled positive likelihood ratio was 8.44 (95% CI, 3.87–18.42), and the negative likelihood ratio was 0.13 (95% CI, 0.09–0.18) for the 25-G needle. The area under the summary receiver operating characteristic curve was 0.97 for the 22-G needle and 0.96 for the 25-G needle. Conclusion: Compared with 22-G EUS-FNA needles, 25-G needles showed superior sensitivity in the evaluation of solid pancreatic lesions by EUS-FNA. PMID:28151856
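The likelihood ratios reported follow directly from pooled sensitivity and specificity; a minimal sketch of the relationship (the specificity value below is hypothetical, since the pooled specificities are not restated in the abstract):

def likelihood_ratios(sensitivity, specificity):
    # LR+ = sens / (1 - spec); LR- = (1 - sens) / spec.
    return sensitivity / (1.0 - specificity), (1.0 - sensitivity) / specificity

# 0.92 is the pooled 25-G sensitivity from the abstract; 0.98 is a hypothetical specificity.
lr_pos, lr_neg = likelihood_ratios(0.92, 0.98)
print(round(lr_pos, 2), round(lr_neg, 2))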
A Glycomics Platform for the Analysis of Permethylated Oligosaccharide Alditols
Costello, Catherine E.; Contado-Miller, Joy May; Cipollo, John F.
2007-01-01
This communication reports the development of an LC/MS platform for the analysis of permethylated oligosaccharide alditols that, for the first time, demonstrates routine online oligosaccharide isomer separation of these compounds prior to introduction into the mass spectrometer. The method leverages a high resolution liquid chromatography system with the superior fragmentation pattern characteristics of permethylated oligosaccharide alditols that are dissociated under low-energy collision conditions using quadrupole orthogonal time-of-flight (QoTOF) instrumentation and up to pseudo MS3 mass spectrometry. Glycoforms, including isomers, are readily identified and their structures assigned. The isomer-specific spectra include highly informative cross-ring and elimination fragments, branch position specific signatures and glycosidic bond fragments, thus facilitating linkage, branch and sequence assignment. The method is sensitive and can be applied using as little as 40 fmol of derivatized oligosaccharide. Because permethylation renders oligosaccharides nearly chemically equivalent in the mass spectrometer, the method is semi-quantitative and, in this regard, is comparable to methods reported using high field NMR and capillary electrophoresis. In this post-genomic age, the importance of glycosylation in biological processes has become clear. The nature of many of the important questions in glycomics is such that sample material is often extremely limited, thus necessitating the development of highly sensitive methods for rigorous structural assignment of the oligosaccharides in complex mixtures. The glycomics platform presented here fulfills these criteria and should lead to more facile glycomics analyses. PMID:17719235
Development of rigor mortis is not affected by muscle volume.
Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H
2001-04-01
There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.
Mokel, Melissa Jennifer; Shellman, Juliette M
2013-01-01
Many instruments in which religious involvement is measured often (a) contain unclear, poorly developed constructs; (b) lack methodological rigor in scale development; and (c) contain language and content culturally incongruent with the religious experiences of diverse ethnic groups. The primary aims of this review were to (a) synthesize the research on instruments designed to measure religious involvement, (b) evaluate the methodological quality of instruments that measure religious involvement, and (c) examine these instruments for conceptual congruency with African American religious involvement. An updated integrative research review method guided the process (Whittemore & Knafl, 2005). 152 articles were reviewed and 23 articles retrieved. Only 3 retained instruments were developed under methodologically rigorous conditions. All 3 instruments were congruent with a conceptual model of African American religious involvement. The Fetzer Multidimensional Measure of Religious Involvement and Spirituality (FMMRS; Idler et al., 2003) was found to have favorable characteristics. Further examination and psychometric testing is warranted to determine its acceptability, readability, and cultural sensitivity in an African American population.
Diffraction-based overlay measurement on dedicated mark using rigorous modeling method
NASA Astrophysics Data System (ADS)
Lu, Hailiang; Wang, Fan; Zhang, Qingyun; Chen, Yonghui; Zhou, Chang
2012-03-01
Diffraction Based Overlay (DBO) has been widely evaluated by numerous authors; results show that DBO can provide better performance than Imaging Based Overlay (IBO). However, DBO has its own problems. As is well known, modeling-based DBO (mDBO) faces challenges of low measurement sensitivity and crosstalk between various structure parameters, which may result in poor accuracy and precision. Meanwhile, the main obstacle encountered by empirical DBO (eDBO) is that a few pads must be employed to gain sufficient information on overlay-induced diffraction signature variations, which consumes more wafer space and measurement time. Also, eDBO may suffer from mark profile asymmetry caused by processes. In this paper, we propose an alternative DBO technology that employs a dedicated overlay mark and takes a rigorous modeling approach. This technology needs only two or three pads for each direction, which is economical and time-saving. While reducing overlay measurement error induced by mark profile asymmetry, this technology is expected to be as accurate and precise as scatterometry technologies.
The human repeated insult patch test in the 21st century: a commentary.
Basketter, David A
2009-01-01
The human repeated insult patch test (HRIPT) is over half a century old, but is still used in several countries as a confirmatory test in the safety evaluation of skin sensitizers. This is despite the criticism it receives from an ethical perspective and regarding the scientific validity of such testing. In this commentary, the HRIPT is reviewed, with emphasis on ethical aspects and where the test can, and cannot, contribute in a scientifically meaningful manner to safety evaluation. It is concluded that where there is a specific rationale for testing, for example, to substantiate a no-effect level for a sensitizing chemical or to ensure that matrix effects are not making an unexpected contribution to sensitizing potency, then rigorous independent review may confirm that an HRIPT is ethical and scientifically justifiable. The possibility that sensitization may be induced in volunteers dictates that HRIPTs should be conducted rarely and in cases where the benefits overwhelmingly outweigh the risk. However, for the very large majority of HRIPTs conducted concerning the risk of skin sensitization, there is neither scientific justification nor any other merit.
Ridenour, Ty A; Pineo, Thomas Z; Maldonado Molina, Mildred M; Hassmiller Lich, Kristen
2013-06-01
Psychosocial prevention research lacks evidence from intensive within-person lines of research to understand idiographic processes related to development and response to intervention. Such data could be used to fill gaps in the literature and expand the study design options for prevention researchers, including lower-cost yet rigorous studies (e.g., for program evaluations), pilot studies, designs to test programs for low prevalence outcomes, selective/indicated/adaptive intervention research, and understanding of differential response to programs. This study compared three competing analytic strategies designed for this type of research: autoregressive moving average, mixed model trajectory analysis, and P-technique. Illustrative time series data were from a pilot study of an intervention for nursing home residents with diabetes (N = 4) designed to improve control of blood glucose. A within-person, intermittent baseline design was used. Intervention effects were detected using each strategy for the aggregated sample and for individual patients. The P-technique model most closely replicated observed glucose levels. ARIMA and P-technique models were most similar in terms of estimated intervention effects and modeled glucose levels. However, ARIMA and P-technique also were more sensitive to missing data, outliers and number of observations. Statistical testing suggested that results generalize both to other persons as well as to idiographic, longitudinal processes. This study demonstrated the potential contributions of idiographic research in prevention science as well as the need for simulation studies to delineate the research circumstances when each analytic approach is optimal for deriving the correct parameter estimates.
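One of the three strategies compared, an autoregressive (ARIMA-type) model with the intervention coded as an exogenous regressor, can be sketched with statsmodels; the simulated single-patient series below is an illustration, not the study's data:

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulated single-patient glucose series: the level drops after the intervention begins.
rng = np.random.default_rng(0)
n, start = 60, 30
intervention = (np.arange(n) >= start).astype(float)
glucose = 180.0 - 25.0 * intervention + rng.normal(0.0, 8.0, n)

fit = ARIMA(glucose, exog=intervention, order=(1, 0, 0)).fit()
print(fit.params)  # the exogenous coefficient estimates the within-person intervention effect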
NASA Astrophysics Data System (ADS)
Craig, S. E.; Lee, Z.; Du, K.; Lin, J.
2016-02-01
An approach based on empirical orthogonal function (EOF) analysis of ocean colour spectra has been shown to accurately derive inherent optical properties (IOPs) and chlorophyll concentration in scenarios, such as optically complex waters, where standard algorithms often perform poorly. The algorithm has been successfully used in a number of regional applications, and has also shown promise in a global implementation based on the NASA NOMAD data set. Additionally, it has demonstrated the unique ability to derive ocean colour products from top of atmosphere (TOA) signals with either no or minimal atmospheric correction applied. Due to its high potential for use over coastal and inland waters, the EOF approach is currently being rigorously characterised as part of a suite of approaches that will be used to support the new NASA ocean colour mission, PACE (Pre-Aerosol, Clouds and ocean Ecosystem). A major component in this model characterisation is the generation of a synthetic TOA data set using a coupled ocean-atmosphere radiative transfer model, which has been run to mimic PACE spectral resolution, and under a wide range of geographical locations, water constituent concentrations, and sea surface and atmospheric conditions. The resulting multidimensional data set will be analysed, and results presented on the sensitivity of the model to various combinations of parameters, and preliminary conclusions made regarding the optimal implementation strategy of this promising approach (e.g. on a global, optical water type or regional basis). This will provide vital guidance for operational implementation of the model for both existing satellite ocean colour sensors and the upcoming PACE mission.
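As an illustration of the EOF step described above, the following sketch (synthetic spectra, not the PACE or NOMAD data) derives EOFs of reflectance spectra by SVD and regresses log-chlorophyll on the leading mode amplitudes; the band set, noise level and toy forward model are assumptions made only for illustration.

# Minimal sketch: EOF (PCA) decomposition of reflectance spectra followed by
# a linear retrieval of log10 chlorophyll from the EOF scores.
import numpy as np

rng = np.random.default_rng(1)
n_spectra, n_bands = 500, 80              # hypothetical hyperspectral band count
wavelengths = np.linspace(400, 700, n_bands)

# Toy forward model: reflectance shape varies with chlorophyll plus noise.
log_chl = rng.uniform(-1.5, 1.5, n_spectra)           # log10 chl (mg m^-3)
base = 0.02 * np.exp(-((wavelengths - 440) / 60.0) ** 2)
chl_mode = (-0.01 * np.exp(-((wavelengths - 443) / 20.0) ** 2)
            + 0.004 * np.exp(-((wavelengths - 550) / 30.0) ** 2))
rrs = base + np.outer(log_chl, chl_mode) + rng.normal(0, 5e-4, (n_spectra, n_bands))

# EOFs = right singular vectors of the mean-centred spectra matrix.
anomalies = rrs - rrs.mean(axis=0)
_, _, vt = np.linalg.svd(anomalies, full_matrices=False)
n_modes = 4
scores = anomalies @ vt[:n_modes].T                    # EOF amplitudes per spectrum

# Linear regression of log-chl on EOF scores (the "EOF algorithm" step).
X = np.column_stack([np.ones(n_spectra), scores])
coef, *_ = np.linalg.lstsq(X, log_chl, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((log_chl - pred) ** 2) / np.sum((log_chl - log_chl.mean()) ** 2)
print(f"R^2 of log10(chl) retrieval from {n_modes} EOF modes: {r2:.3f}")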
Simulations of Scatterometry Down to 22 nm Structure Sizes and Beyond with Special Emphasis on LER
NASA Astrophysics Data System (ADS)
Osten, W.; Ferreras Paz, V.; Frenner, K.; Schuster, T.; Bloess, H.
2009-09-01
In recent years, scatterometry has become one of the most commonly used methods for CD metrology. With decreasing structure size for future technology nodes, the search for optimized scatterometry measurement configurations becomes more important in order to exploit maximum sensitivity. As widespread industrial scatterometry tools still mainly use pre-set measurement configurations, there are free parameters left with which to improve sensitivity. Our current work uses a simulation-based approach to predict and optimize the sensitivity for future technology nodes. Since line edge roughness becomes important for such small structures, these imperfections of the periodic continuation cannot be neglected. Using Fourier methods such as the rigorous coupled wave analysis (RCWA) for the diffraction calculation, non-periodic features are difficult to capture. We show that in this field certain types of field-stitching methods show favourable numerical behaviour and lead to useful results.
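The configuration-search idea can be sketched as follows; a toy analytic signal model stands in for the rigorous RCWA solver, and the wavelength/angle grid, noise level and nominal CD are assumptions chosen only to illustrate a finite-difference sensitivity figure of merit.

# Minimal sketch: choosing a scatterometry measurement configuration by
# maximising the finite-difference sensitivity of the signal to the critical
# dimension (CD), divided by an assumed measurement noise level.
import numpy as np

def toy_signal(cd_nm, wavelength_nm, angle_deg):
    """Hypothetical scalar measurement (stand-in for an RCWA-computed quantity)."""
    phase = 4 * np.pi * cd_nm * np.cos(np.radians(angle_deg)) / wavelength_nm
    return np.cos(phase) * np.exp(-cd_nm / wavelength_nm)

cd0, dcd = 22.0, 0.1                        # nominal CD and perturbation (nm)
wavelengths = np.linspace(193, 800, 50)
angles = np.linspace(45, 75, 31)
noise = 1e-3                                # assumed measurement noise level

best = None
for wl in wavelengths:
    for th in angles:
        # Central finite difference: d(signal)/d(CD) at the nominal structure.
        s = (toy_signal(cd0 + dcd, wl, th) - toy_signal(cd0 - dcd, wl, th)) / (2 * dcd)
        snr = abs(s) / noise                # sensitivity-to-noise figure of merit
        if best is None or snr > best[0]:
            best = (snr, wl, th)

print(f"best configuration: wavelength {best[1]:.0f} nm, angle {best[2]:.0f} deg, "
      f"CD sensitivity/noise {best[0]:.1f} per nm")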
The effects of the photomask on multiphase shift test monitors
NASA Astrophysics Data System (ADS)
McIntyre, Gregory; Neureuther, Andrew
2006-10-01
A series of chromeless multiple-phase shift lithographic test monitors have been previously introduced. This paper investigates various effects that impact the performance of these monitors, focusing primarily on PSM Polarimetry, a technique to monitor illumination polarization. The measurement sensitivities from a variety of scalar and rigorous electromagnetic simulations are compared to experimental results from three industrial quality multi-phase test reticles. This analysis enables the relative importance of the various effects to be identified and offers the industry unique insight into various issues associated with the photomask. First, the unavoidable electromagnetic interaction as light propagates through the multiple phase steps of the mask topography appears to account for about 10 to 20% of the lost sensitivity, when experimental results are compared to an ideal simulated case. The polarization dependence of this effect is analyzed, concluding that the 4-phase topography is more effective at manipulating TM polarization. Second, various difficulties in the fabrication of these complicated mask patterns are described and likely account for an additional 60-80% loss in sensitivity. Smaller effects are also described, associated with the photoresist, mask design and subtle differences in the proximity effect of TE and TM polarization of off-axis light at high numerical aperture. Finally, the question: "How practical is PSM polarimetry?" is considered. It is concluded that, despite many severe limiting factors, an accurately calibrated test reticle promises to monitor polarization in state-of-the-art lithography scanners to within about 2%.
Cottle, D J; Coffey, M P
2013-02-01
The objective of this study was to assess the impact of using different relative economic values (REVs) in selection indices on predicted financial and trait gains from selection of sires of cows and on the choice of leading Holstein bulls available in the UK dairy industry. Breeding objective traits were milk yield, fat yield, protein yield, lifespan, mastitis, non-return rate, calving interval and lameness. The relative importance of a trait, as estimated by a·h², was only moderately related to the rate of financial loss or total economic merit (ΔTEM) per percentage under- or overestimation of REV (r = 0.38 and 0.29, respectively) as a result of the variance-covariance structure of traits. The effects on TEM of under- or overestimating trait REVs were non-symmetrical. TEM was most sensitive to incorrect REVs for protein, fat, milk and lifespan and least sensitive to incorrect calving interval, lameness, non-return and mastitis REVs. A guide to deciding which dairy traits require the most rigorous analysis in the calculation of their REVs is given. Varying the REVs within a fairly wide range resulted in different bulls being selected by index and their differing predicted transmitting abilities would result in the herds moving in different directions in the long term (20 years). It is suggested that customized indices, where the farmer creates rankings of bulls tailored to their specific farm circumstances, can be worthwhile. © 2012 Blackwell Verlag GmbH.
Marlowe, Hannah; McEntaffer, Randall L; Tutt, James H; DeRoo, Casey T; Miles, Drew M; Goray, Leonid I; Soltwisch, Victor; Scholze, Frank; Herrero, Analia Fernandez; Laubis, Christian
2016-07-20
Off-plane reflection gratings were previously predicted to have different efficiencies when the incident light is polarized in the transverse-magnetic (TM) versus transverse-electric (TE) orientations with respect to the grating grooves. However, more recent theoretical calculations which rigorously account for finitely conducting, rather than perfectly conducting, grating materials no longer predict significant polarization sensitivity. We present the first empirical results for radially ruled, laminar groove profile gratings in the off-plane mount, which demonstrate no difference in TM versus TE efficiency across our entire 300-1500 eV bandpass. These measurements together with the recent theoretical results confirm that grazing incidence off-plane reflection gratings using real, not perfectly conducting, materials are not polarization sensitive.
Quality and rigor of the concept mapping methodology: a pooled study analysis.
Rosas, Scott R; Kane, Mary
2012-05-01
The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative characteristics and estimates of quality and rigor that could guide future studies are lacking. To address this gap, we conducted a pooled analysis of 69 concept mapping studies to describe characteristics across study phases, generate specific indicators of validity and reliability, and examine the relationship between select study characteristics and quality indicators. Individual study characteristics and estimates were pooled and quantitatively summarized, describing the distribution, variation and parameters for each. In addition, variation in the concept mapping data collection in relation to characteristics and estimates was examined. Overall, results suggest concept mapping yields strong internal representational validity and very strong sorting and rating reliability estimates. Validity and reliability were consistently high despite variation in participation and task completion percentages across data collection modes. The implications of these findings as a practical reference for assessing the quality and rigor of future concept mapping studies are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.
Phillips, Christine B; Dwan, Kathryn; Hepworth, Julie; Pearce, Christopher; Hall, Sally
2014-11-19
The primary health care sector delivers the majority of health care in western countries through small, community-based organizations. However, research into these healthcare organizations is limited by the time constraints and pressure facing them, and by the concern among staff that research is peripheral to their work. We developed Q-RARA (Qualitative Rapid Appraisal, Rigorous Analysis) to study small, primary health care organizations in a way that is efficient, acceptable to participants and methodologically rigorous. Q-RARA comprises a site visit, semi-structured interviews, structured and unstructured observations, photographs, floor plans, and social scanning data. Data were collected over the course of one day per site and the qualitative analysis was integrated and iterative. We found Q-RARA to be acceptable to participants and effective in collecting data on organizational function in multiple sites without disrupting the practice, while maintaining a balance between speed and trustworthiness. The Q-RARA approach is capable of providing a richly textured, rigorous understanding of the processes of the primary care practice while also allowing researchers to develop an organizational perspective. For these reasons the approach is recommended for use in small-scale organizations both within and outside the primary health care sector.
Jager, Marieke F; Ott, Christian; Kaplan, Christopher J; Kraus, Peter M; Neumark, Daniel M; Leone, Stephen R
2018-01-01
We present an extreme ultraviolet (XUV) transient absorption apparatus tailored to attosecond and femtosecond measurements on bulk solid-state thin-film samples, specifically when the sample dynamics are sensitive to heating effects. The setup combines methodology for stabilizing sub-femtosecond time-resolution measurements over 48 h and techniques for mitigating heat buildup in temperature-dependent samples. Single-point beam stabilization in pump and probe arms and periodic time-zero reference measurements are described for accurate timing and stabilization. A hollow-shaft motor configuration for rapid sample rotation, raster scanning capability, and additional diagnostics are described for heat mitigation. Heat transfer simulations performed using a finite element analysis allow comparison of sample rotation and traditional raster scanning techniques for 100 Hz pulsed laser measurements on vanadium dioxide, a material that undergoes an insulator-to-metal transition at a modest temperature of 340 K. Experimental results are presented confirming that the vanadium dioxide (VO2) sample cannot cool below its phase transition temperature between laser pulses without rapid rotation, in agreement with the simulations. The findings indicate the stringent conditions required to perform rigorous broadband XUV time-resolved absorption measurements on bulk solid-state samples, particularly those with temperature sensitivity, and elucidate a clear methodology to perform them.
NASA Astrophysics Data System (ADS)
Jager, Marieke F.; Ott, Christian; Kaplan, Christopher J.; Kraus, Peter M.; Neumark, Daniel M.; Leone, Stephen R.
2018-01-01
We present an extreme ultraviolet (XUV) transient absorption apparatus tailored to attosecond and femtosecond measurements on bulk solid-state thin-film samples, specifically when the sample dynamics are sensitive to heating effects. The setup combines methodology for stabilizing sub-femtosecond time-resolution measurements over 48 h and techniques for mitigating heat buildup in temperature-dependent samples. Single-point beam stabilization in pump and probe arms and periodic time-zero reference measurements are described for accurate timing and stabilization. A hollow-shaft motor configuration for rapid sample rotation, raster scanning capability, and additional diagnostics are described for heat mitigation. Heat transfer simulations performed using a finite element analysis allow comparison of sample rotation and traditional raster scanning techniques for 100 Hz pulsed laser measurements on vanadium dioxide, a material that undergoes an insulator-to-metal transition at a modest temperature of 340 K. Experimental results are presented confirming that the vanadium dioxide (VO2) sample cannot cool below its phase transition temperature between laser pulses without rapid rotation, in agreement with the simulations. The findings indicate the stringent conditions required to perform rigorous broadband XUV time-resolved absorption measurements on bulk solid-state samples, particularly those with temperature sensitivity, and elucidate a clear methodology to perform them.
Preserving pre-rigor meat functionality for beef patty production.
Claus, J R; Sørheim, O
2006-06-01
Three methods were examined for preserving pre-rigor meat functionality in beef patties. Hot-boned semimembranosus muscles were processed as follows: (1) pre-rigor ground, salted, patties immediately cooked; (2) pre-rigor ground, salted and stored overnight; (3) pre-rigor injected with brine; and (4) post-rigor ground and salted. Raw patties contained 60% lean beef, 19.7% beef fat trim, 1.7% NaCl, 3.6% starch, and 15% water. Pre-rigor processing occurred at 3-3.5h postmortem. Patties made from pre-rigor ground meat had higher pH values; greater protein solubility; firmer, more cohesive, and chewier texture; and substantially lower cooking losses than the other treatments. Addition of salt was sufficient to reduce the rate and extent of glycolysis. Brine injection of intact pre-rigor muscles resulted in some preservation of the functional properties but not as pronounced as with salt addition to pre-rigor ground meat.
Climate Benchmark Missions: CLARREO
NASA Technical Reports Server (NTRS)
Wielicki, Bruce A.; Young, David F.
2010-01-01
CLARREO (Climate Absolute Radiance and Refractivity Observatory) is one of the four Tier 1 missions recommended by the recent NRC decadal survey report on Earth Science and Applications from Space (NRC, 2007). The CLARREO mission addresses the need to rigorously observe climate change on decade time scales and to use decadal change observations as the most critical method to determine the accuracy of climate change projections such as those used in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR4). A rigorously known accuracy of both decadal change observations as well as climate projections is critical in order to enable sound policy decisions. The CLARREO mission accomplishes this critical objective through highly accurate and SI traceable decadal change observations sensitive to many of the key uncertainties in climate radiative forcings, responses, and feedbacks that in turn drive uncertainty in current climate model projections. The same uncertainties also lead to uncertainty in attribution of climate change to anthropogenic forcing. The CLARREO breakthrough in decadal climate change observations is to achieve the required levels of accuracy and traceability to SI standards for a set of observations sensitive to a wide range of key decadal change variables. These accuracy levels are determined both by the projected decadal changes as well as by the background natural variability that such signals must be detected against. The accuracy for decadal change traceability to SI standards includes uncertainties of calibration, sampling, and analysis methods. Unlike most other missions, all of the CLARREO requirements are judged not by instantaneous accuracy, but instead by accuracy in large time/space scale average decadal changes. Given the focus on decadal climate change, the NRC Decadal Survey concluded that the single most critical issue for decadal change observations was their lack of accuracy and low confidence in observing the small but critical climate change signals. CLARREO is the recommended attack on this challenge, and builds on the last decade of climate observation advances in the Earth Observing System as well as metrological advances at NIST (National Institute of Standards and Technology) and other standards laboratories.
Design analysis of doped-silicon surface plasmon resonance immunosensors in mid-infrared range.
DiPippo, William; Lee, Bong Jae; Park, Keunhan
2010-08-30
This paper reports the design analysis of a microfabricatable mid-infrared (mid-IR) surface plasmon resonance (SPR) sensor platform. The proposed platform has periodic heavily doped profiles implanted into intrinsic silicon and a thin gold layer deposited on top, making a physically flat grating SPR coupler. A rigorous coupled-wave analysis was conducted to prove the design feasibility, characterize the sensor's performance, and determine geometric parameters of the heavily doped profiles. Finite element analysis (FEA) was also employed to compute the electromagnetic field distributions at the plasmon resonance. The obtained results reveal that the proposed structure can excite the SPR at normal incidence of mid-IR light, resulting in a large probing depth that will facilitate the study of larger analytes. Furthermore, the whole structure can be microfabricated with well-established batch protocols, providing tunability in the SPR excitation wavelength for specific biosensing needs at a low manufacturing cost. When the SPR sensor is used in a Fourier-transform infrared (FTIR) spectroscopy platform, its detection sensitivity and limit of detection are estimated to be 3022 nm/RIU and ~70 pg/mm², respectively, at a sample layer thickness of 100 nm. The design analysis performed in the present study will allow the fabrication of a tunable, disposable mid-IR SPR sensor that combines the advantages of conventional prism and metallic grating SPR sensors.
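A back-of-envelope check of how a wavelength sensitivity maps onto a surface-mass limit of detection is sketched below using de Feijter's relation; the spectral resolution (0.4 nm) and refractive-index increment (0.182 cm^3/g) are assumed values chosen for illustration, not figures from the paper, although they reproduce the reported ~70 pg/mm² order of magnitude.

# Minimal sketch: converting a bulk wavelength-shift sensitivity into a
# surface mass density LOD via de Feijter's relation (assumed inputs).
sensitivity_nm_per_riu = 3022.0      # reported bulk sensitivity
spectral_resolution_nm = 0.4         # assumed minimum resolvable shift
layer_thickness_m = 100e-9           # 100 nm sample layer, as in the paper
dn_dc_m3_per_kg = 0.182e-3           # 0.182 cm^3/g, typical for proteins (assumed)

delta_n_min = spectral_resolution_nm / sensitivity_nm_per_riu        # RIU
gamma_kg_per_m2 = layer_thickness_m * delta_n_min / dn_dc_m3_per_kg
gamma_pg_per_mm2 = gamma_kg_per_m2 * 1e15 / 1e6                       # kg/m^2 -> pg/mm^2
print(f"LOD ~ {gamma_pg_per_mm2:.0f} pg/mm^2")                        # ~70 pg/mm^2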
The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research
ERIC Educational Resources Information Center
Harwell, Michael
2018-01-01
The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However, the role of this software in facilitating rigorous statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…
ERIC Educational Resources Information Center
Johnson, Lawrence J.; LaMontagne, M. J.
1993-01-01
This paper describes content analysis as a data analysis technique useful for examining written or verbal communication within early intervention. The article outlines the use of referential or thematic recording units derived from interview data, identifies procedural guidelines, and addresses issues of rigor and validity. (Author/JDD)
Cowan, Richard J; Abel, Leah; Candel, Lindsay
2017-05-01
We conducted a meta-analysis of single-subject research studies investigating the effectiveness of antecedent strategies grounded in behavioral momentum for improving compliance and on-task performance for students with autism. First, we assessed the research rigor of those studies meeting our inclusionary criteria. Next, in order to apply a universal metric to help determine the effectiveness of this category of antecedent strategies investigated via single-subject research methods, we calculated effect sizes via omnibus improvement rate differences (IRDs). Outcomes provide additional support for behavioral momentum, especially interventions incorporating the high-probability command sequence. Implications for research and practice are discussed, including the consideration of how single-subject research is systematically reviewed to assess the rigor of studies and assist in determining overall intervention effectiveness.
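A simplified, overlap-based variant of the improvement rate difference can be sketched as follows (hypothetical on-task data, not the reviewed studies; the full IRD procedure used in the meta-analysis differs in how overlapping points are removed).

# Minimal sketch of an improvement-rate-difference (IRD) style effect size for
# single-subject data; a simplified overlap-based variant for illustration.
import numpy as np

def simple_ird(baseline, treatment, higher_is_better=True):
    baseline = np.asarray(baseline, dtype=float)
    treatment = np.asarray(treatment, dtype=float)
    if not higher_is_better:
        baseline, treatment = -baseline, -treatment
    # "Improved" treatment points exceed every baseline point; baseline points
    # already reaching the treatment range count against the difference.
    ir_treat = np.mean(treatment > baseline.max())
    ir_base = np.mean(baseline >= treatment.min())
    return ir_treat - ir_base

# Hypothetical percent-on-task data for one student.
baseline_phase = [35, 40, 38, 42, 37]
high_probability_sequence = [65, 72, 70, 78, 74, 80]
print(f"IRD (simplified) = {simple_ird(baseline_phase, high_probability_sequence):.2f}")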
Yang, Ying; Wang, Congcong; Li, Xinxue; Chai, Qianyun; Fei, Yutong; Xia, Ruyu; Xu, Rongqian; Yang, Li; Liu, Jianping
2015-10-01
Henoch-Schönlein Purpura (HSP) is the most common necrotizing vasculitis affecting children, and traditional Chinese herbal medicine (CHM) has been widely used for it. We aimed to explore the evidence on the effectiveness and safety of CHM for HSP in children without renal damage. Randomized controlled trials (RCTs) comparing CHM with conventional medications were searched in five databases. Eligible data were pooled with a random-effects model in RevMan 5.2. Subgroup analysis for different co-interventions and sensitivity analysis for reducing heterogeneity were implemented, and the GRADE approach was adopted. We included 15 trials with 1112 HSP children (age 1-16 years, disease duration one day to three months). The overall methodological quality of the included trials is relatively low. Adjunctive oral CHM treatments reduced renal damage (6 trials, RR 0.47, 95% CI 0.31-0.72, I² = 0%) and the subsiding time (days) of purpura (5 trials, mean difference (MD) -3.60, 95% CI -4.21 to -2.99, I² = 23%), joint pain (5 trials, MD -1.04, 95% CI -1.33 to -0.74, I² = 1%) and abdominal pain (5 trials, MD -1.69, 95% CI -2.51 to -0.86, I² = 74%). Subgroup and sensitivity analyses did not change the direction of results. No severe adverse events were reported. Adjunctive oral CHM treatments are effective for children with HSP in terms of reducing renal damage and the subsiding time of purpura, and could possibly shorten the subsiding time of joint and abdominal pain. No reliable conclusion regarding safety is possible based on the safety data retrieved. Further rigorous trials are warranted. Copyright © 2015. Published by Elsevier Ltd.
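The pooling step reported above (e.g., RR 0.47, 95% CI 0.31-0.72, I² = 0%) is the kind of computation RevMan performs; a minimal DerSimonian-Laird sketch with made-up trial results is shown below for illustration only.

# Minimal sketch: DerSimonian-Laird random-effects pooling of log risk ratios
# with an I^2 heterogeneity statistic (hypothetical per-trial numbers).
import numpy as np

rr = np.array([0.45, 0.52, 0.40, 0.55, 0.48, 0.43])
ci_low = np.array([0.22, 0.27, 0.18, 0.28, 0.24, 0.20])
ci_high = np.array([0.92, 1.00, 0.89, 1.08, 0.96, 0.93])

y = np.log(rr)                                   # effect sizes on the log scale
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
w = 1 / se**2                                    # fixed-effect weights

# DerSimonian-Laird between-trial variance tau^2 and I^2.
y_fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fixed) ** 2)
df = len(y) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

w_re = 1 / (se**2 + tau2)                        # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
lo, hi = np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)
print(f"pooled RR = {np.exp(y_re):.2f} (95% CI {lo:.2f}-{hi:.2f}), I^2 = {i2:.0f}%")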
Roegner, Amber F.; Schirmer, Macarena Pírez; Puschner, Birgit; Brena, Beatriz; Gonzalez-Sapienza, Gualberto
2014-01-01
The freshwater cyanotoxins, microcystins (MCs), pose a global public health threat as potent hepatotoxins in cyanobacterial blooms; their persistence in drinking and recreational water has been associated with potential chronic effects in addition to acute intoxications. Rapid and accurate detection of the over 80 structural congeners is challenged by the rigorous and time-consuming clean-up required to overcome interference found in raw water samples. MALDI-MS has shown promise for rapid quantification of individual congeners in raw water samples, with very low operating cost, but so far limited sensitivity and the lack of available and versatile internal standards (ISs) have limited its use. Two easily synthesized S-hydroxyethyl-Cys(7)-MC-LR and -RR ISs were used to generate linear standard curves in a reflectron MALDI instrument, reproducible across several orders of magnitude for MC-LR, -RR and -YR. Minimum quantification limits in direct water samples, with no clean-up or concentration step involved, were consistently below 7 μg/L, with recoveries from spiked samples between 80 and 119%. This method improves sensitivity by 30-fold over previous reports of quantitative MALDI-TOF applications to MCs and provides a salient option for rapid-throughput analysis of multiple MC congeners in untreated raw surface water blooms as a means to identify public health threats at their source and target intervention strategies within a watershed. As demonstrated by analysis of a set of samples from Uruguay, by utilizing the reaction of different MC congeners with alternate sulfhydryl compounds, the m/z of the IS can be customized to avoid overlap with interfering compounds in local surface water samples. PMID:24388801
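The internal-standard calibration described above reduces, in its simplest form, to fitting a line to IS-normalised peak ratios; the sketch below uses invented peak areas and concentrations purely to show the arithmetic.

# Minimal sketch: quantifying a microcystin congener from MALDI peak areas by
# normalising to a spiked internal standard (IS) and fitting a linear
# standard curve (synthetic numbers, not the study's calibration data).
import numpy as np

# Calibration standards: known MC-LR concentrations (ug/L) with analyte and
# IS peak areas measured from the same spectra.
conc = np.array([1, 2, 5, 10, 25, 50], dtype=float)
analyte_area = np.array([1.1e3, 2.3e3, 5.4e3, 1.1e4, 2.6e4, 5.3e4])
is_area = np.array([9.8e3, 1.0e4, 1.0e4, 9.9e3, 1.0e4, 1.0e4])

ratio = analyte_area / is_area                 # IS-normalised response
slope, intercept = np.polyfit(conc, ratio, 1)  # linear standard curve

# Quantify an unknown raw-water sample from its analyte/IS area ratio.
sample_ratio = 2.7e3 / 9.9e3
sample_conc = (sample_ratio - intercept) / slope
print(f"estimated MC-LR concentration: {sample_conc:.1f} ug/L")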
Multiscale Mathematics for Biomass Conversion to Renewable Hydrogen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katsoulakis, Markos
2014-08-09
Our two key accomplishments in the first three years were towards the development of (1) a mathematically rigorous and at the same time computationally flexible framework for parallelization of Kinetic Monte Carlo methods, and its implementation on GPUs, and (2) spatial multilevel coarse-graining methods for Monte Carlo sampling and molecular simulation. A common underlying theme in both these lines of our work is the development of numerical methods which are at the same time both computationally efficient and reliable, the latter in the sense that they provide controlled-error approximations for coarse observables of the simulated molecular systems. Finally, our key accomplishment in the last year of the grant is that we started developing (3) pathwise information theory-based and goal-oriented sensitivity analysis and parameter identification methods for complex high-dimensional dynamics, and in particular of nonequilibrium extended (high-dimensional) systems. We discuss these three research directions in some detail below, along with the related publications.
Optical analysis of nanoparticles via enhanced backscattering facilitated by 3-D photonic nanojets
NASA Astrophysics Data System (ADS)
Li, Xu; Chen, Zhigang; Taflove, Allen; Backman, Vadim
2005-01-01
We report the phenomenon of ultra-enhanced backscattering of visible light by nanoparticles facilitated by the 3-D photonic nanojet, a sub-diffraction light beam appearing at the shadow side of a plane-wave-illuminated dielectric microsphere. Our rigorous numerical simulations show that the backscattering intensity of nanoparticles can be enhanced by up to eight orders of magnitude when they are located in the nanojet. As a result, the enhanced backscattering from a nanoparticle with diameter on the order of 10 nm is well above the background signal generated by the dielectric microsphere itself. We also report that nanojet-enhanced backscattering is extremely sensitive to the size of the nanoparticle, permitting in principle resolving sub-nanometer size differences using visible light. Finally, we show how the position of a nanoparticle could be determined with subdiffractional accuracy by recording the angular distribution of the backscattered light. These properties of photonic nanojets promise to make this phenomenon a useful tool for optically detecting, differentiating, and sorting nanoparticles.
NASA Astrophysics Data System (ADS)
Georganos, Stefanos; Grippa, Tais; Vanhuysse, Sabine; Lennert, Moritz; Shimoni, Michal; Wolff, Eléonore
2017-10-01
This study evaluates the impact of three Feature Selection (FS) algorithms in an Object Based Image Analysis (OBIA) framework for Very-High-Resolution (VHR) Land Use-Land Cover (LULC) classification. The three selected FS algorithms, Correlation Based Selection (CFS), Mean Decrease in Accuracy (MDA) and Random Forest (RF) based Recursive Feature Elimination (RFE), were tested on Support Vector Machine (SVM), K-Nearest Neighbor (KNN), and Random Forest (RF) classifiers. The results demonstrate that the accuracy of the SVM and KNN classifiers is the most sensitive to FS. The RF appeared to be more robust to high dimensionality, although a significant increase in accuracy was found by using the RFE method. In terms of classification accuracy, SVM performed the best using FS, followed by RF and KNN. Finally, only a small number of features is needed to achieve the highest performance with each classifier. This study emphasizes the benefits of rigorous FS for maximizing performance, as well as for minimizing model complexity and easing interpretation.
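A minimal sketch of the RF-based RFE workflow is shown below using scikit-learn on synthetic "object features"; the feature counts, classifier settings and accuracy comparison are illustrative assumptions, not the study's setup.

# Minimal sketch: RF-based recursive feature elimination (RFE) followed by an
# SVM accuracy comparison with and without feature selection.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic "object features": many bands/textures, few of them informative.
X, y = make_classification(n_samples=600, n_features=120, n_informative=12,
                           n_redundant=30, n_classes=5, random_state=0)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
acc_all = cross_val_score(svm, X, y, cv=5).mean()

# Rank features with a random forest and keep only the strongest subset.
rfe = RFE(RandomForestClassifier(n_estimators=200, random_state=0),
          n_features_to_select=15, step=10)
X_sel = rfe.fit_transform(X, y)
acc_sel = cross_val_score(svm, X_sel, y, cv=5).mean()

print(f"SVM accuracy, all {X.shape[1]} features: {acc_all:.3f}")
print(f"SVM accuracy, {X_sel.shape[1]} RFE-selected features: {acc_sel:.3f}")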
Laser Assisted Micro Wire GMAW and Droplet Welding
DOE Office of Scientific and Technical Information (OSTI.GOV)
FUERSCHBACH, PHILLIP W.; LUCK, D. L.; BERTRAM, LEE A.
2002-03-01
Laser beam welding is the principal welding process for the joining of Sandia weapon components because it can provide a small fusion zone with low overall heating. Improved process robustness is desired since laser energy absorption is extremely sensitive to joint variation and filler metal is seldom added. This project investigated the experimental and theoretical advantages of combining a fiber optic delivered Nd:YAG laser with a miniaturized GMAW system. Consistent gas metal arc droplet transfer employing a 0.25 mm diameter wire was only obtained at high currents in the spray transfer mode. Excessive heating of the workpiece in this mode was considered an impractical result for most Sandia micro-welding applications. Several additional droplet detachment approaches were investigated and analyzed including pulsed tungsten arc transfer (droplet welding), servo accelerated transfer, servo dip transfer, and electromechanically braked transfer. Experimental observations and rigorous analysis of these approaches indicate that decoupling droplet detachment from the arc melting process is warranted and may someday be practical.
El-Houjeiri, Hassan M; Brandt, Adam R; Duffy, James E
2013-06-04
Existing transportation fuel cycle emissions models are either general and calculate nonspecific values of greenhouse gas (GHG) emissions from crude oil production, or are not available for public review and auditing. We have developed the Oil Production Greenhouse Gas Emissions Estimator (OPGEE) to provide open-source, transparent, rigorous GHG assessments for use in scientific assessment, regulatory processes, and analysis of GHG mitigation options by producers. OPGEE uses petroleum engineering fundamentals to model emissions from oil and gas production operations. We introduce OPGEE and explain the methods and assumptions used in its construction. We run OPGEE on a small set of fictional oil fields and explore model sensitivity to selected input parameters. Results show that upstream emissions from petroleum production operations can vary from 3 gCO2/MJ to over 30 gCO2/MJ using realistic ranges of input parameters. Significant drivers of emissions variation are steam injection rates, water handling requirements, and rates of flaring of associated gas.
An Overview of the Clinical Use of Filter Paper in the Diagnosis of Tropical Diseases
Smit, Pieter W.; Elliott, Ivo; Peeling, Rosanna W.; Mabey, David; Newton, Paul N.
2014-01-01
Tropical infectious diseases diagnosis and surveillance are often hampered by difficulties of sample collection and transportation. Filter paper potentially provides a useful medium to help overcome such problems. We reviewed the literature on the use of filter paper, focusing on the evaluation of nucleic acid and serological assays for diagnosis of infectious diseases using dried blood spots (DBS) compared with recognized gold standards. We reviewed 296 eligible studies and included 101 studies evaluating DBS and 192 studies on other aspects of filter paper use. We also discuss the use of filter paper with other body fluids and for tropical veterinary medicine. In general, DBS perform with sensitivities and specificities similar or only slightly inferior to gold standard sample types. However, important problems were revealed with the uncritical use of DBS, inappropriate statistical analysis, and lack of standardized methodology. DBS have great potential to empower healthcare workers by making laboratory-based diagnostic tests more readily accessible, but additional and more rigorous research is needed. PMID:24366501
Li, Kangkang; Yu, Hai; Tade, Moses; Feron, Paul; Yu, Jingwen; Wang, Shujuan
2014-06-17
An advanced NH3 abatement and recycling process that makes extensive use of the waste heat in flue gas was proposed to solve the problems of ammonia slip, NH3 makeup, and flue gas cooling in the ammonia-based CO2 capture process. The rigorous rate-based model, RateFrac in Aspen Plus, was thermodynamically and kinetically validated against experimental data from the open literature and from CSIRO pilot trials at Munmorah Power Station, Australia, respectively. After a thorough sensitivity analysis and process improvement, the NH3 recycling efficiency reached as high as 99.87%, and the NH3 exhaust concentration was only 15.4 ppmv. Most importantly, the energy consumption of the NH3 abatement and recycling system was only 59.34 kJ/kg CO2 of electricity. The evaluation of mass balance and temperature stability shows that this NH3 recovery process is technically effective and feasible. This process therefore has a promising prospect for industrial application.
Probabilistic risk analysis of building contamination.
Bolster, D T; Tartakovsky, D M
2008-10-01
We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
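The fault-tree arithmetic at the core of such a PRA can be sketched with a few lines of code; the events, probabilities and gate structure below are hypothetical and serve only to show how component failure probabilities combine and how a one-at-a-time sensitivity check might look.

# Minimal sketch: AND/OR fault-tree combination of component failure
# probabilities (assuming independence) for a building-contamination top event.
def or_gate(*p):          # at least one event occurs
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):         # all events occur
    q = 1.0
    for pi in p:
        q *= pi
    return q

p_source_present  = 0.05   # contaminant release inside/near the building (assumed)
p_filter_fails    = 0.02   # HVAC filtration fails to capture it (assumed)
p_purge_fails     = 0.10   # emergency purge ventilation unavailable (assumed)
p_detection_fails = 0.15   # sensors fail to trigger remediation (assumed)

# Contamination of occupied space requires a source AND failed filtration AND
# (failed purge OR failed detection) in this hypothetical tree.
p_top = and_gate(p_source_present, p_filter_fails,
                 or_gate(p_purge_fails, p_detection_fails))
print(f"probability of building contamination: {p_top:.2e}")

# One-at-a-time sensitivity: which component most changes the top event?
for name in ("filter", "purge", "detection"):
    p = {"filter": p_filter_fails, "purge": p_purge_fails,
         "detection": p_detection_fails}
    p[name] *= 2            # double one component's failure probability
    p_new = and_gate(p_source_present, p["filter"],
                     or_gate(p["purge"], p["detection"]))
    print(f"doubling {name} failure probability -> {p_new:.2e}")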
Scientific Data Analysis Toolkit: A Versatile Add-in to Microsoft Excel for Windows
ERIC Educational Resources Information Center
Halpern, Arthur M.; Frye, Stephen L.; Marzzacco, Charles J.
2018-01-01
Scientific Data Analysis Toolkit (SDAT) is a rigorous, versatile, and user-friendly data analysis add-in application for Microsoft Excel for Windows (PC). SDAT uses the familiar Excel environment to carry out most of the analytical tasks used in data analysis. It has been designed for student use in manipulating and analyzing data encountered in…
On the Tracy-Widom β Distribution for β = 6
NASA Astrophysics Data System (ADS)
Grava, Tamara; Its, Alexander; Kapaev, Andrei; Mezzadri, Francesco
2016-11-01
We study the Tracy-Widom distribution function for Dyson's β-ensemble with β = 6. The starting point of our analysis is the recent work of I. Rumanov where he produces a Lax-pair representation for the Bloemendal-Virág equation. The latter is a linear PDE which describes the Tracy-Widom functions corresponding to general values of β. Using his Lax pair, Rumanov derives an explicit formula for the Tracy-Widom β = 6 function in terms of the second Painlevé transcendent and the solution of an auxiliary ODE. Rumanov also shows that this formula allows him to derive formally the asymptotic expansion of the Tracy-Widom function. Our goal is to make Rumanov's approach, and hence the asymptotic analysis it provides, rigorous. In this paper, the first in a series, we show that Rumanov's Lax pair can be interpreted as a certain gauge transformation of the standard Lax pair for the second Painlevé equation. This gauge transformation, though, contains functional parameters which are defined via an auxiliary nonlinear ODE which is equivalent to the auxiliary ODE of Rumanov's formula. The gauge interpretation of Rumanov's Lax pair allows us to highlight the steps of Rumanov's original method which need rigorous justification in order to make the method complete. We provide a rigorous justification of one of these steps. Namely, we prove that the Painlevé function involved in Rumanov's formula is indeed, as suggested by Rumanov, the Hastings-McLeod solution of the second Painlevé equation. The key issue which we also discuss, and which is still open, is the question of integrability of the auxiliary ODE in Rumanov's formula. We note that this question is crucial for the rigorous asymptotic analysis of the Tracy-Widom function. We also note that our work is a partial answer to one of the problems related to the β-ensembles formulated by Percy Deift during the June 2015 Montreal Conference on integrable systems.
Benn, Emma K T; Alexis, Andrew; Mohamed, Nihal; Wang, Yan-Hong; Khan, Ikhlas A; Liu, Bian
2016-12-01
Skin-bleaching practices, such as using skin creams and soaps to achieve a lighter skin tone, are common throughout the world and are triggered by cosmetic reasons that oftentimes have deep historical, economic, sociocultural, and psychosocial roots. Exposure to chemicals in the bleaching products, notably, mercury (Hg), hydroquinone, and steroids, has been associated with a variety of adverse health effects, such as Hg poisoning and exogenous ochronosis. In New York City (NYC), skin care product use has been identified as an important route of Hg exposure, especially among Caribbean-born blacks and Dominicans. However, surprisingly sparse information is available on the epidemiology of the health impacts of skin-bleaching practices among these populations. We highlight the dearth of large-scale, comprehensive, community-based, clinical, and translational research in this area, especially the limited skin-bleaching-related research among non-White populations in the US. We offer five new research directions, including investigating the known and under-studied health consequences among populations for which the skin bleach practice is newly emerging at an alarming rate using innovative laboratory and statistical methods. We call for conducting methodologically rigorous, multidisciplinary, and culturally sensitive research in order to provide insights into the root and the epidemiological status of the practice and provide evidence of exposure-outcome associations, with an ultimate goal of developing potential intervention strategies to reduce the health burdens of skin-bleaching practice.
NASA Astrophysics Data System (ADS)
Nucciotti, V.; Stringari, C.; Sacconi, L.; Vanzi, F.; Linari, M.; Piazzesi, G.; Lombardi, V.; Pavone, F. S.
2009-02-01
Recently, the use of Second Harmonic Generation (SHG) for imaging biological samples has been explored with regard to intrinsic SHG in highly ordered biological samples. As shown by fractional extraction of proteins, myosin is the source of the SHG signal in skeletal muscle. SHG is highly dependent on symmetries and provides selective information on the structural order and orientation of the emitting proteins and the dynamics of the myosin molecules responsible for the mechano-chemical transduction during contraction. We characterise the polarization dependence of SHG intensity in three different physiological states: resting, rigor and isometric tetanic contraction, in a sarcomere length range between 2.0 μm and 4.0 μm. The orientation of the motor domains of the myosin molecules depends on their physiological state and modulates the SHG signal. We can discriminate the orientation of the emitting dipoles in four different molecular conformations of myosin heads in intact fibers during isometric contraction, at rest and in rigor. We estimate the contribution of the myosin motor domain to the total second-order bulk susceptibility from its molecular structure and its functional conformation. We demonstrate that SHG is sensitive to the fraction of ordered myosin heads by disrupting the order of myosin heads in rigor with an ATP analog. We estimate the fraction of myosin motors generating the isometric force in the active muscle fiber from the dependence of the SHG modulation on the degree of overlap between actin and myosin filaments during an isometric contraction.
ERIC Educational Resources Information Center
Hollands, Fiona M.; Kieffer, Michael J.; Shand, Robert; Pan, Yilin; Cheng, Henan; Levin, Henry M.
2016-01-01
We review the value of cost-effectiveness analysis for evaluation and decision making with respect to educational programs and discuss its application to early reading interventions. We describe the conditions for a rigorous cost-effectiveness analysis and illustrate the challenges of applying the method in practice, providing examples of programs…
ERIC Educational Resources Information Center
Sharp, William G.; Berry, Rashelle C.; McCracken, Courtney; Nuhu, Nadrat N.; Marvel, Elizabeth; Saulnier, Celine A.; Klin, Ami; Jones, Warren; Jaquess, David L.
2013-01-01
We conducted a comprehensive review and meta-analysis of research regarding feeding problems and nutrient status among children with autism spectrum disorders (ASD). The systematic search yielded 17 prospective studies involving a comparison group. Using rigorous meta-analysis techniques, we calculated the standardized mean difference (SMD) with…
Can power-law scaling and neuronal avalanches arise from stochastic dynamics?
Touboul, Jonathan; Destexhe, Alain
2010-02-11
The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to change in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically, and show that, indeed, stochastic processes can produce spurious power-law scaling without the presence of underlying self-organized criticality. However, this power-law is only apparent in logarithmic representations, and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal with no ambiguity that the avalanche size is distributed as a power-law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests.
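The recommended check, fitting the exponent by maximum likelihood and testing the fit with a Kolmogorov-Smirnov statistic rather than trusting a log-log regression, can be sketched as follows on surrogate (lognormal, hence non-power-law) event sizes; the threshold and sample size are arbitrary illustrative choices.

# Minimal sketch: continuous power-law MLE (Clauset et al. form) above a
# threshold, followed by a KS test of the sample against the fitted CDF.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
events = rng.lognormal(mean=1.0, sigma=1.0, size=5000)   # not a true power law
xmin = 2.0
x = events[events >= xmin]

# MLE for the exponent: alpha = 1 + n / sum(ln(x / xmin)).
alpha = 1.0 + len(x) / np.sum(np.log(x / xmin))

# KS test of the sample against the fitted power-law CDF above xmin.
powerlaw_cdf = lambda v: 1.0 - (v / xmin) ** (1.0 - alpha)
ks_stat, p_value = stats.kstest(x, powerlaw_cdf)
print(f"fitted exponent alpha = {alpha:.2f}")
print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3g}")  # small p => reject power law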
Toddi A. Steelman; Branda Nowell; Deena Bayoumi; Sarah McCaffrey
2014-01-01
We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis," a term and...
ERIC Educational Resources Information Center
Firmin, Michael W.; Gilson, Krista Merrick
2007-01-01
Using rigorous qualitative research methodology, twenty-four college students receiving their undergraduate degrees in three years were interviewed. Following analysis of the semi-structured interview transcripts and coding, themes emerged, indicating that these students possessed self-discipline, self-motivation, and drive. Overall, the results…
Gender, Discourse, and "Gender and Discourse."
ERIC Educational Resources Information Center
Davis, Hayley
1997-01-01
A critic of Deborah Tannen's book "Gender and Discourse" responds to comments made about her critique, arguing that the book's analysis of the relationship of gender and discourse tends to seek, and perhaps force, explanations only in those terms. Another linguist's analysis of similar phenomena is found to be more rigorous. (MSE)
1985-02-01
…Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems; its present stage of development embodies a... Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical…
Evaluating Computer-Related Incidents on Campus
ERIC Educational Resources Information Center
Rothschild, Daniel; Rezmierski, Virginia
2004-01-01
The Computer Incident Factor Analysis and Categorization (CIFAC) Project at the University of Michigan began in September 2003 with grants from EDUCAUSE and the National Science Foundation (NSF). The project's primary goal is to create a best-practices security framework for colleges and universities based on rigorous quantitative analysis of…
Clarifying Objectives and Results of Equivalent System Mass Analyses for Advanced Life Support
NASA Technical Reports Server (NTRS)
Levri, Julie A.; Drysdale, Alan E.
2003-01-01
This paper discusses some of the analytical decisions that an investigator must make during the course of a life support system trade study. Equivalent System Mass (ESM) is often applied to evaluate trade study options in the Advanced Life Support (ALS) Program. ESM can be used to identify which of several options that meet all requirements is most likely to have the lowest cost. It can also be used to identify which of the many interacting parts of a life support system have the greatest impact and sensitivity to assumptions. This paper summarizes recommendations made in the newly developed ALS ESM Guidelines Document and expands on some of the issues relating to trade studies that involve ESM. In particular, the following three points are expounded: 1) The importance of objectives: Analysis objectives drive the approach to any trade study, including identification of assumptions, selection of characteristics to compare in the analysis, and the most appropriate techniques for reflecting those characteristics. 2) The importance of results interpretation: The accuracy desired in the results depends upon the analysis objectives, whereas the realized accuracy is determined by the data quality and degree of detail in analysis methods. 3) The importance of analysis documentation: Documentation of assumptions and data modifications is critical for effective peer evaluation of any trade study. ESM results are analysis-specific and should always be reported in context, rather than as solitary values. For this reason, results reporting should be done with adequate rigor to allow for verification by other researchers.
NASA Astrophysics Data System (ADS)
Sun, Jiwen; Wei, Ling; Fu, Danying
2002-01-01
…resolution and wide swath. In order to ensure that its high optical precision survives the rigorous dynamic load of launch, the camera should have high structural rigidity; therefore, a careful study of the dynamic features of the camera structure should be performed. …Pro/E. An interference examination is performed on the precise CAD model of the camera to refine the structural design. …for the first time in China, and the structural dynamics analysis of the camera is accomplished with the structural analysis codes PATRAN and NASTRAN. The main research items include: 1) a comparative modal analysis of the critical structure of the camera using 4-node and 10-node tetrahedral elements, respectively, to confirm the most reasonable general model; 2) modal analysis of the camera for several cases, yielding the natural frequencies and mode shapes and confirming the rationality of the structural design; 3) static analysis of the camera under self-gravity and overloads, giving the corresponding deformation and stress distributions; 4) sine-vibration response calculation of the camera, giving the response curves and the maximum acceleration responses at the corresponding frequencies. …software technique is accurate and efficient. …sensitivity, the dynamic design and engineering optimization of the critical structure of the camera are discussed. …fundamental technology in the design of forthcoming space optical instruments.
V Soares Maciel, Edvaldo; de Toffoli, Ana Lúcia; Lanças, Fernando Mauro
2018-04-20
The accelerating rise of the world's population has increased the consumption of food, thus demanding more rigor in the control of residues and contaminants in food-based products marketed for human consumption. In view of the complexity of most food matrices, including fruits, vegetables, different types of meat, beverages, among others, a sample preparation step is important to provide more reliable results when combined with HPLC separations. An adequate sample preparation step before the chromatographic analysis is mandatory for obtaining higher precision and accuracy and for improving the extraction of the target analytes, one of the priorities in analytical chemistry. The recent discovery of new materials such as ionic liquids, graphene-derived materials, molecularly imprinted polymers, restricted access media, magnetic nanoparticles, and carbonaceous nanomaterials has provided high sensitivity and selectivity in an extensive variety of applications. These materials, as well as their several possible combinations, have been demonstrated to be highly appropriate for the extraction of different analytes from complex samples such as food products. The main characteristics and applications of these new materials in food analysis are presented and discussed in this paper. Another topic discussed in this review covers the main advantages and limitations of sample preparation microtechniques, as well as their off-line and on-line combination with HPLC for food analysis. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Krompecher, T
1981-01-01
Objective measurements were carried out to study the evolution of rigor mortis in rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increasing temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.
NASA Astrophysics Data System (ADS)
Beneš, Michal; Pažanin, Igor
2018-03-01
This paper reports an analytical investigation of non-isothermal fluid flow in a thin (or long) vertical pipe filled with porous medium via asymptotic analysis. We assume that the fluid inside the pipe is cooled (or heated) by the surrounding medium and that the flow is governed by the prescribed pressure drop between the pipe's ends. Starting from the dimensionless Darcy-Brinkman-Boussinesq system, we formally derive a macroscopic model describing the effective flow at small Brinkman-Darcy number. The asymptotic approximation is given by explicit formulae for the velocity, pressure and temperature, clearly acknowledging the effects of the cooling (heating) and the porous structure. A theoretical error analysis is carried out to indicate the order of accuracy and to provide a rigorous justification of the effective model.
Refining Estimates of Bird Collision and Electrocution Mortality at Power Lines in the United States
Loss, Scott R.; Will, Tom; Marra, Peter P.
2014-01-01
Collisions and electrocutions at power lines are thought to kill large numbers of birds in the United States annually. However, existing estimates of mortality are either speculative (for electrocution) or based on extrapolation of results from one study to all U.S. power lines (for collision). Because national-scale estimates of mortality and comparisons among threats are likely to be used for prioritizing policy and management strategies and for identifying major research needs, these estimates should be based on systematic and transparent assessment of rigorously collected data. We conducted a quantitative review that incorporated data from 14 studies meeting our inclusion criteria to estimate that between 12 and 64 million birds are killed each year at U.S. power lines, with between 8 and 57 million birds killed by collision and between 0.9 and 11.6 million birds killed by electrocution. Sensitivity analyses indicate that the majority of uncertainty in our estimates arises from variation in mortality rates across studies; this variation is due in part to the small sample of rigorously conducted studies that can be used to estimate mortality. Little information is available to quantify species-specific vulnerability to mortality at power lines; the available literature over-represents particular bird groups and habitats, and most studies only sample and present data for one or a few species. Furthermore, additional research is needed to clarify whether, to what degree, and in what regions populations of different bird species are affected by power line-related mortality. Nonetheless, our data-driven analysis suggests that the amount of bird mortality at U.S. power lines is substantial and that conservation management and policy is necessary to reduce this mortality. PMID:24991997
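The scaling-with-uncertainty logic behind such national estimates can be sketched as a simple bootstrap over per-study rates; the rates and national line length below are fabricated for illustration and are not the review's data.

# Minimal sketch: bootstrap a national collision-mortality range from a small
# set of per-study rates (all numbers hypothetical).
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical collision mortality rates (birds per km of line per year)
# from individual studies, and an assumed national length of assessed line.
study_rates = np.array([4.2, 11.5, 1.8, 25.0, 7.3, 15.9, 3.1, 9.4])
km_of_line = 1.2e6

estimates = []
for _ in range(10000):
    resampled = rng.choice(study_rates, size=study_rates.size, replace=True)
    estimates.append(resampled.mean() * km_of_line)
lo, med, hi = np.percentile(estimates, [2.5, 50, 97.5])
print(f"collision mortality: {lo/1e6:.1f}-{hi/1e6:.1f} million birds/yr "
      f"(median {med/1e6:.1f} million)")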
NASA Astrophysics Data System (ADS)
James, Mike R.; Robson, Stuart; d'Oleire-Oltmanns, Sebastian; Niethammer, Uwe
2016-04-01
Structure-from-motion (SfM) algorithms are greatly facilitating the production of detailed topographic models based on images collected by unmanned aerial vehicles (UAVs). However, SfM-based software does not generally provide the rigorous photogrammetric analysis required to fully understand survey quality. Consequently, error related to problems in control point data or the distribution of control points can remain undiscovered. Even if these errors are not large in magnitude, they can be systematic, and thus have strong implications for the use of products such as digital elevation models (DEMs) and orthophotos. Here, we develop a Monte Carlo approach to (1) improve the accuracy of products when SfM-based processing is used and (2) reduce the associated field effort by identifying suitable lower-density deployments of ground control points (GCPs). The method highlights over-parameterisation during camera self-calibration and provides enhanced insight into control point performance when rigorous error metrics are not available. Processing was implemented using commonly used SfM-based software (Agisoft PhotoScan), which we augment with semi-automated and automated GCP image measurement. We apply the Monte Carlo method to two contrasting case studies - an erosion gully survey (Taroudant, Morocco) carried out with a fixed-wing UAV, and an active landslide survey (Super-Sauze, France) acquired using a manually controlled quadcopter. The results highlight the differences in the control requirements for the two sites, and we explore the implications for future surveys. We illustrate DEM sensitivity to critical processing parameters and show how the use of appropriate parameter values increases DEM repeatability and reduces the spatial variability of error due to processing artefacts.
Treetrimmer: a method for phylogenetic dataset size reduction.
Maruyama, Shinichiro; Eveleigh, Robert J M; Archibald, John M
2013-04-12
With rapid advances in genome sequencing and bioinformatics, it is now possible to generate phylogenetic trees containing thousands of operational taxonomic units (OTUs) from a wide range of organisms. However, use of rigorous tree-building methods on such large datasets is prohibitive and manual 'pruning' of sequence alignments is time consuming and raises concerns over reproducibility. There is a need for bioinformatic tools with which to objectively carry out such pruning procedures. Here we present 'TreeTrimmer', a bioinformatics procedure that removes unnecessary redundancy in large phylogenetic datasets, alleviating the size effect on more rigorous downstream analyses. The method identifies and removes user-defined 'redundant' sequences, e.g., orthologous sequences from closely related organisms and 'recently' evolved lineage-specific paralogs. Representative OTUs are retained for more rigorous re-analysis. TreeTrimmer reduces the OTU density of phylogenetic trees without sacrificing taxonomic diversity while retaining the original tree topology, thereby speeding up downstream computer-intensive analyses, e.g., Bayesian and maximum likelihood tree reconstructions, in a reproducible fashion.
ERIC Educational Resources Information Center
Francis, Clay
2018-01-01
Historic notions of academic rigor usually follow from critiques of the system--we often define our goals for academically rigorous work through the lens of our shortcomings. This chapter discusses how the Truman Commission in 1947 and the Spellings Commission in 2006 shaped the way we think about academic rigor in today's context.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flinn, D.G.; Hall, S.; Morris, J.
This volume describes the background research, the application of the proposed loss evaluation techniques, and the results. The research identified present loss calculation methods as appropriate, provided care was taken to represent the various system elements in sufficient detail. The literature search of past methods and typical data revealed that extreme caution in using typical values (load factor, etc.) should be taken to ensure that all factors were referred to the same time base (daily, weekly, etc.). The performance of the method (and computer program) proposed in this project was determined by comparison of results with a rigorous evaluation of losses on the Salt River Project system. This rigorous evaluation used statistical modeling of the entire system as well as explicit enumeration of all substation and distribution transformers. Further tests were conducted at Public Service Electric and Gas of New Jersey to check the appropriateness of the methods in a northern environment. Finally, sensitivity tests identified the data elements whose inaccuracy would most affect the determination of losses using the method developed in this project.
Rigorous Training of Dogs Leads to High Accuracy in Human Scent Matching-To-Sample Performance
Marchal, Sophie; Bregeras, Olivier; Puaux, Didier; Gervais, Rémi; Ferry, Barbara
2016-01-01
Human scent identification is based on a matching-to-sample task in which trained dogs are required to compare a scent sample collected from an object found at a crime scene to that of a suspect. Based on dogs’ greater olfactory ability to detect and process odours, this method has been used in forensic investigations to identify the odour of a suspect at a crime scene. The excellent reliability and reproducibility of the method largely depend on rigor in dog training. The present study describes the various steps of training that lead to high sensitivity scores, with dogs matching samples with 90% efficiency when the complexity of the scents presented in the sample is similar to that presented in the lineups, and specificity reaching a ceiling, with no false alarms in human scent matching-to-sample tasks. This high level of accuracy ensures reliable results in judicial human scent identification tests. Also, our data should convince law enforcement authorities to use these results as official forensic evidence when dogs are trained appropriately. PMID:26863620
A square-force cohesion model and its extraction from bulk measurements
NASA Astrophysics Data System (ADS)
Liu, Peiyuan; Lamarche, Casey; Kellogg, Kevin; Hrenya, Christine
2017-11-01
Cohesive particles remain poorly understood, with order-of-magnitude differences among prior physical predictions of agglomerate size. A major obstacle lies in the absence of robust models of particle-particle cohesion, thereby precluding accurate prediction of the behavior of cohesive particles. Rigorous cohesion models commonly contain parameters related to surface roughness, to which cohesion shows extreme sensitivity. However, both roughness measurement and its distillation into these model parameters are challenging. Accordingly, we propose a "square-force" model, where the cohesive force remains constant until a cut-off separation. Via DEM simulations, we demonstrate the validity of the square-force model as a surrogate for more rigorous models when its two parameters are selected to match the two key quantities governing dense and dilute granular flows, namely the maximum cohesive force and the critical cohesive energy, respectively. Perhaps more importantly, we establish a method to extract the parameters of the square-force model via defluidization, due to its ability to isolate the effects of the two parameters. Thus, instead of relying on complicated scans of individual grains, determination of particle-particle cohesion from simple bulk measurements becomes feasible. Dow Corning Corporation.
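A hedged sketch of the square-force law and of how its two parameters might be chosen to match a more rigorous cohesion model; the force and energy values are hypothetical, not those of the study.

```python
def square_force(separation, f0, s_cut):
    """Square-force cohesion: constant attractive force f0 up to a cut-off separation."""
    return f0 if separation <= s_cut else 0.0

# Choose the two parameters so the surrogate reproduces the two quantities that
# govern dense and dilute cohesive flows in a more rigorous model:
#   maximum cohesive force  F_max                  ->  f0    = F_max
#   critical cohesive energy E_crit (work to separate) -> s_cut = E_crit / F_max
F_max = 5.0e-7    # N, hypothetical value for ~100 micron particles
E_crit = 2.0e-13  # J, hypothetical
f0, s_cut = F_max, E_crit / F_max
print(f"f0 = {f0:.2e} N, cut-off separation = {s_cut*1e9:.1f} nm")
```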
A rigorous comparison of different planet detection algorithms
NASA Astrophysics Data System (ADS)
Tingley, B.
2003-05-01
The idea of finding extrasolar planets (ESPs) through observations of drops in stellar brightness due to transiting objects has been around for decades. It has only been in the last ten years, however, that any serious attempts to find ESPs became practical. The discovery of a transiting planet around the star HD 209458 (Charbonneau et al.) has led to a veritable explosion of research, because the photometric method is the only way to search a large number of stars for ESPs simultaneously with current technology. To this point, however, there has been limited research into the various techniques used to extract the subtle transit signals from noise, mainly brief summaries in various papers focused on publishing transit-like signatures in observations. The scheduled launches over the next few years of satellites whose primary or secondary science missions will be ESP discovery motivate a review and a comparative study of the various algorithms used to perform the transit identification, to determine rigorously and fairly which one is the most sensitive under which circumstances, to maximize the results of past, current, and future observational campaigns.
NASA Astrophysics Data System (ADS)
Dong, Yang; He, Honghui; He, Chao; Ma, Hui
2016-10-01
Polarized light is sensitive to the microstructures of biological tissues and can be used to detect physiological changes. Meanwhile, spectral features of the scattered light can also provide abundant microstructural information of tissues. In this paper, we take the backscattering polarization Mueller matrix images of bovine skeletal muscle tissues during the 24-hour experimental time, and analyze their multispectral behavior using quantitative Mueller matrix parameters. In the processes of rigor mortis and proteolysis of muscle samples, multispectral frequency distribution histograms (FDHs) of the Mueller matrix elements can reveal rich qualitative structural information. In addition, we analyze the temporal variations of the sample using the multispectral Mueller matrix transformation (MMT) parameters. The experimental results indicate that the different stages of rigor mortis and proteolysis for bovine skeletal muscle samples can be judged by these MMT parameters. The results presented in this work show that combining with the multispectral technique, the FDHs and MMT parameters can characterize the microstructural variation features of skeletal muscle tissues. The techniques have the potential to be used as tools for quantitative assessment of meat qualities in food industry.
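As a rough illustration, a frequency distribution histogram (FDH) is essentially a normalized histogram of a Mueller matrix element image at a given wavelength; the array below is synthetic and the binning is arbitrary, not the acquisition settings of the study.

```python
import numpy as np

# Hypothetical 4x4xHxW backscattering Mueller matrix measured at one wavelength.
rng = np.random.default_rng(3)
m = rng.normal(0, 0.1, size=(4, 4, 128, 128))
m[0, 0] = 1.0                                   # m11 is conventionally normalized to 1

def fdh(element, bins=100, value_range=(-1, 1)):
    """Normalized frequency distribution histogram of one Mueller matrix element image."""
    hist, edges = np.histogram(element.ravel(), bins=bins, range=value_range, density=True)
    return hist, edges

hist_m22, edges = fdh(m[1, 1])
print("FDH peak bin centre for m22:", 0.5 * (edges[np.argmax(hist_m22)] + edges[np.argmax(hist_m22) + 1]))
```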
Systematic Error Mitigation for the PIXIE Instrument
NASA Technical Reports Server (NTRS)
Kogut, Alan; Fixsen, Dale J.; Nagler, Peter; Tucker, Gregory
2016-01-01
The Primordial Inflation Explorer (PIXIE) uses a nulling Fourier Transform Spectrometer to measure the absolute intensity and linear polarization of the cosmic microwave background and diffuse astrophysical foregrounds. PIXIE will search for the signature of primordial inflation and will characterize distortions from a blackbody spectrum, both to a precision of a few parts per billion. Rigorous control of potential instrumental effects is required to take advantage of the raw sensitivity. PIXIE employs a highly symmetric design using multiple differential nulling to reduce the instrumental signature to negligible levels. We discuss the systematic error budget and mitigation strategies for the PIXIE mission.
Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.
Lee, Jae Hee; Jung, Koo Young
2012-07-01
Instantaneous rigor as muscle stiffening occurring in the moment of death (or cardiac arrest) can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of securing a surgical airway quickly on the occurrence of trismus. Copyright © 2012 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Viney, Linda L.; Caputi, Peter
2005-01-01
Content analysis scales apply rigorous measurement to verbal communications and make possible the quantification of text in counseling research. The limitations of the Origin and Pawn Scales (M. T. Westbrook & L. L. Viney, 1980), the Positive Affect Scale (M. T. Westbrook, 1976), the Content Analysis Scales of Psychosocial Maturity (CASPM; L.…
Quantum metrology with a single spin-3/2 defect in silicon carbide
NASA Astrophysics Data System (ADS)
Soykal, Oney O.; Reinecke, Thomas L.
We show that implementations for quantum sensing with exceptional sensitivity and spatial resolution can be made using the novel features of semiconductor high half-spin multiplet defects with easy-to-implement optical detection protocols. To achieve this, we use the spin-3/2 silicon monovacancy deep center in hexagonal silicon carbide based on our rigorous derivation of this defect's ground state and of its electronic and optical properties. For a single V_Si^- defect, we obtain magnetic field sensitivities capable of detecting individual nuclear magnetic moments. We also show that its zero-field splitting has an exceptional strain and temperature sensitivity within the technologically desirable near-infrared window of biological systems. Other point defects, i.e. 3d transition metal or rare-earth impurities in semiconductors, may also provide similar opportunities in quantum sensing due to their similar high-spin (S ≥ 3/2) configurations. This work was supported in part by ONR and by the Office of the Secretary of Defense, Quantum Science and Engineering Program.
pH-sensitive Eudragit nanoparticles for mucosal drug delivery.
Yoo, Jin-Wook; Giri, Namita; Lee, Chi H
2011-01-17
Drug delivery via vaginal epithelium has suffered from lack of stability due to acidic and enzymatic environments. The biocompatible pH-sensitive nanoparticles composed of Eudragit S-100 (ES) were developed to protect loaded compounds from being degraded under the rigorous vaginal conditions and achieve their therapeutically effective concentrations in the mucosal epithelium. ES nanoparticles containing a model compound (sodium fluorescein (FNa) or nile red (NR)) were prepared by the modified quasi-emulsion solvent diffusion method. Loading efficiencies were found to be 26% and 71% for a hydrophilic and a hydrophobic compound, respectively. Both hydrophilic and hydrophobic model drugs remained stable in nanoparticles at acidic pH, whereas they are quickly released from nanoparticles upon exposure at physiological pH. The confocal study revealed that ES nanoparticles were taken up by vaginal cells, followed by pH-responsive drug release, with no cytotoxic activities. The pH-sensitive nanoparticles would be a promising carrier for the vaginal-specific delivery of various therapeutic drugs including microbicides and peptides/proteins. Published by Elsevier B.V.
Birkeland, S; Akse, L
2010-01-01
Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at time of processing on quality traits color, texture, sensory, microbiological, in injection salted, and cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P<0.001) contraction (-7.9%± 0.9%) on the caudal-cranial axis. No significant differences in instrumental color (a*, b*, C*, or h*), texture (hardness), or sensory traits (aroma, color, taste, and texture) were observed between pre- or post-rigor processed fillets; however, post-rigor (1477 ± 38 g) fillets had a significant (P>0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets were significantly (P<0.01) lighter, L*, (39.7 ± 1.0) than post-rigor fillets (37.8 ± 0.8) and had significantly lower (P<0.05) aerobic plate count (APC), 1.4 ± 0.4 log CFU/g against 2.6 ± 0.6 log CFU/g, and psychrotrophic count (PC), 2.1 ± 0.2 log CFU/g against 3.0 ± 0.5 log CFU/g, than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection salting protocols and smoking techniques. © 2010 Institute of Food Technologists®
Assessing climate change and socio-economic uncertainties in long term management of water resources
NASA Astrophysics Data System (ADS)
Jahanshahi, Golnaz; Dawson, Richard; Walsh, Claire; Birkinshaw, Stephen; Glenis, Vassilis
2015-04-01
Long term management of water resources is challenging for decision makers given the range of uncertainties that exist. Such uncertainties are a function of long term drivers of change, such as climate, environmental loadings, demography, land use and other socio-economic drivers. Impacts of climate change on the frequency of extreme events such as drought make it a serious threat to water resources and water security. The release of probabilistic climate information, such as the UKCP09 scenarios, provides improved understanding of some uncertainties in climate models. This has motivated a more rigorous approach to dealing with other uncertainties in order to understand the sensitivity of investment decisions to future uncertainty and identify adaptation options that are as far as possible robust. We have developed and coupled a system of models that includes a weather generator, simulations of catchment hydrology, demand for water and the water resource system. This integrated model has been applied in the Thames catchment, which supplies the city of London, UK. This region is one of the driest in the UK and hence sensitive to water availability. In addition, it is one of the fastest growing parts of the UK and plays an important economic role. Key uncertainties in long term water resources in the Thames catchment, many of which result from earth system processes, are identified and quantified. The implications of these uncertainties are explored using a combination of uncertainty analysis and sensitivity testing. The analysis shows considerable uncertainty in future rainfall, river flow and consequently water resource. For example, results indicate that by the 2050s, low flow (Q95) in the Thames catchment will range from -44 to +9% compared with the control scenario (1970s). Consequently, by the 2050s the average number of drought days is expected to increase 4-6 times relative to the 1970s. Uncertainties associated with urban growth increase these risks further. Adaptation measures, such as new reservoirs, can manage these risks to a certain extent, but our sensitivity testing demonstrates that they are less robust to future uncertainties than measures taken to reduce water demand. Keywords: Climate change, Uncertainty, Decision making, Drought, Risk, Water resources management.
Ballarini, E; Bauer, S; Eberhardt, C; Beyer, C
2012-06-01
Transverse dispersion represents an important mixing process for transport of contaminants in groundwater and constitutes an essential prerequisite for geochemical and biodegradation reactions. Within this context, this work describes the detailed numerical simulation of highly controlled laboratory experiments using uranine, bromide and oxygen depleted water as conservative tracers for the quantification of transverse mixing in porous media. Synthetic numerical experiments reproducing an existing laboratory experimental set-up of quasi two-dimensional flow through tank were performed to assess the applicability of an analytical solution of the 2D advection-dispersion equation for the estimation of transverse dispersivity as fitting parameter. The fitted dispersivities were compared to the "true" values introduced in the numerical simulations and the associated error could be precisely estimated. A sensitivity analysis was performed on the experimental set-up in order to evaluate the sensitivities of the measurements taken at the tank experiment on the individual hydraulic and transport parameters. From the results, an improved experimental set-up as well as a numerical evaluation procedure could be developed, which allow for a precise and reliable determination of dispersivities. The improved tank set-up was used for new laboratory experiments, performed at advective velocities of 4.9 m d(-1) and 10.5 m d(-1). Numerical evaluation of these experiments yielded a unique and reliable parameter set, which closely fits the measured tracer concentration data. For the porous medium with a grain size of 0.25-0.30 mm, the fitted longitudinal and transverse dispersivities were 3.49×10(-4) m and 1.48×10(-5) m, respectively. The procedures developed in this paper for the synthetic and rigorous design and evaluation of the experiments can be generalized and transferred to comparable applications. Copyright © 2012 Elsevier B.V. All rights reserved.
Modeling, Modal Properties, and Mesh Stiffness Variation Instabilities of Planetary Gears
NASA Technical Reports Server (NTRS)
Parker, Robert G.; Lin, Jian; Krantz, Timothy L. (Technical Monitor)
2001-01-01
Planetary gear noise and vibration are primary concerns in their applications in helicopters, automobiles, aircraft engines, heavy machinery and marine vehicles. Dynamic analysis is essential to noise and vibration reduction. This work analytically investigates some critical issues and advances the understanding of planetary gear dynamics. A lumped-parameter model is built for the dynamic analysis of general planetary gears. The unique properties of the natural frequency spectra and vibration modes are rigorously characterized. These special structures apply to general planetary gears with cyclic symmetry and, in the practically important case, to systems with diametrically opposed planets. The special vibration properties are useful for subsequent research. Taking advantage of the derived modal properties, the natural frequency and vibration mode sensitivities to design parameters are investigated. The key parameters include mesh stiffnesses, support/bearing stiffnesses, component masses, moments of inertia, and operating speed. The eigen-sensitivities are expressed in simple, closed-form formulae associated with modal strain and kinetic energies. As disorders (e.g., mesh stiffness variation, manufacturing and assembly errors) disturb the cyclic symmetry of planetary gears, their effects on the free vibration properties are quantitatively examined. Well-defined veering rules are derived to identify dramatic changes of natural frequencies and vibration modes under parameter variations. The knowledge of free vibration properties, eigen-sensitivities, and veering rules provides important information to effectively tune the natural frequencies and optimize structural design to minimize noise and vibration. Parametric instabilities excited by mesh stiffness variations are analytically studied for multi-mesh gear systems. The discrepancies of previous studies on parametric instability of two-stage gear chains are clarified using perturbation and numerical methods. The operating conditions causing parametric instabilities are expressed in closed form suitable for design guidance. Using the well-defined modal properties of planetary gears, the effects of mesh parameters on parametric instability are analytically identified. Simple formulae are obtained to suppress particular instabilities by adjusting contact ratios and mesh phasing.
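The eigen-sensitivity idea can be illustrated with the standard closed-form result for mass-normalized modes, d(lambda_i)/dp = phi_i^T (dK/dp - lambda_i dM/dp) phi_i, here applied to a toy two-degree-of-freedom system rather than a full planetary gear model; the stiffness and mass values are hypothetical.

```python
import numpy as np
from scipy.linalg import eigh

# Toy 2-DOF lumped-parameter system: sensitivity of eigenvalues to a stiffness k1.
M = np.diag([1.0, 2.0])
k1, k2 = 4.0e3, 6.0e3
K = np.array([[k1 + k2, -k2], [-k2, k2]])

lam, phi = eigh(K, M)                 # generalized problem; modes returned M-orthonormal
dK_dk1 = np.array([[1.0, 0.0], [0.0, 0.0]])
dM_dk1 = np.zeros((2, 2))

for i in range(2):
    # Closed-form eigenvalue derivative for mass-normalized mode phi_i.
    dlam = phi[:, i] @ (dK_dk1 - lam[i] * dM_dk1) @ phi[:, i]
    print(f"mode {i}: lambda = {lam[i]:.1f}, d(lambda)/dk1 = {dlam:.3f}")
```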
ERIC Educational Resources Information Center
Oldham, Mary; Kellett, Stephen; Miles, Eleanor; Sheeran, Paschal
2012-01-01
Objective: Rates of nonattendance for psychotherapy hinder the effective delivery of evidence-based treatments. Although many strategies have been developed to increase attendance, the effectiveness of these strategies has not been quantified. Our aim in the present study was to undertake a meta-analysis of rigorously controlled studies to…
A Comparative Study of Definitions on Limit and Continuity of Functions
ERIC Educational Resources Information Center
Shipman, Barbara A.
2012-01-01
Differences in definitions of limit and continuity of functions as treated in courses on calculus and in rigorous undergraduate analysis yield contradictory outcomes and unexpected language. There are results about limits in calculus that are false by the definitions of analysis, functions not continuous by one definition and continuous by…
Tutoring Adolescents in Literacy: A Meta-Analysis
ERIC Educational Resources Information Center
Jun, Seung Won; Ramirez, Gloria; Cumming, Alister
2010-01-01
What does research reveal about tutoring adolescents in literacy? We conducted a meta-analysis, identifying 152 published studies, of which 12 met rigorous inclusion criteria. We analyzed the 12 studies for the effects of tutoring according to the type, focus, and amount of tutoring; the number, age, and language background of students; and the…
Interactive visual analysis promotes exploration of long-term ecological data
T.N. Pham; J.A. Jones; R. Metoyer; F.J. Swanson; R.J. Pabst
2013-01-01
Long-term ecological data are crucial in helping ecologists understand ecosystem function and environmental change. Nevertheless, these kinds of data sets are difficult to analyze because they are usually large, multivariate, and spatiotemporal. Although existing analysis tools such as statistical methods and spreadsheet software permit rigorous tests of pre-conceived...
An International Meta-Analysis of Reading Recovery
ERIC Educational Resources Information Center
D'Agostino, Jerome V.; Harmey, Sinéad J.
2016-01-01
Reading Recovery is one of the most researched literacy programs worldwide. Although there have been at least 4 quantitative reviews of its effectiveness, none have considered all rigorous group-comparison studies from all implementing nations from the late 1970s to 2015. Using a hierarchical linear modeling (HLM) v-known analysis, we examined if…
Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei
2015-01-01
This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to a higher ultimate pH of chicken breast muscles at 24 h post-mortem. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). On the other hand, the increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, NaCl concentrations of 3% and 4% showed no great differences in the physicochemical and textural properties attributable to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is the minimum required to ensure a definite pre-rigor salting effect on chicken breast muscle.
ERIC Educational Resources Information Center
Green, Samuel B.; Levy, Roy; Thompson, Marilyn S.; Lu, Min; Lo, Wen-Juo
2012-01-01
A number of psychometricians have argued for the use of parallel analysis to determine the number of factors. However, parallel analysis must be viewed at best as a heuristic approach rather than a mathematically rigorous one. The authors suggest a revision to parallel analysis that could improve its accuracy. A Monte Carlo study is conducted to…
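For context, a minimal sketch of Horn-style parallel analysis (not the authors' proposed revision): retain factors whose observed eigenvalues exceed a chosen percentile of eigenvalues obtained from random data of the same dimensions. The simulated dataset below is hypothetical.

```python
import numpy as np

def parallel_analysis(data, n_sims=500, percentile=95, seed=0):
    """Horn-style parallel analysis on the correlation matrix of `data`."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eigs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    sim_eigs = np.empty((n_sims, p))
    for s in range(n_sims):
        sim = rng.standard_normal((n, p))
        sim_eigs[s] = np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False))[::-1]
    threshold = np.percentile(sim_eigs, percentile, axis=0)
    return int(np.sum(obs_eigs > threshold))

# Hypothetical data: 300 respondents, 10 items driven by 2 latent factors plus noise.
rng = np.random.default_rng(1)
factors = rng.standard_normal((300, 2))
loadings = rng.uniform(0.5, 0.9, size=(2, 10))
data = factors @ loadings + rng.standard_normal((300, 10))
print("Factors suggested:", parallel_analysis(data))
```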
Stather, Philip W; Ferguson, James; Awopetu, Ayoola; Boyle, Jonathan R
2018-03-03
The effect of suprarenal (SR) as opposed to infrarenal (IR) fixation on renal outcomes post-endovascular aneurysm repair (EVAR) remains controversial. This meta-analysis aims to update current understanding of this issue. A prior meta-analysis was updated through a Preferred Reporting Items for Systematic reviews and Meta-Analyses search for additional studies published in the last 3 years reporting on renal dysfunction or related outcomes post-EVAR. Random effects meta-analysis was undertaken using SPSS. A total of 25 non-randomised studies comparing SR with IR fixation were included, totalling 54,832 patients. In total, 16,634 underwent SR and 38,198 IR. Baseline characteristics, including age, baseline estimated glomerular filtration rate, diabetes, cardiac disease, and smoking, were similar between the groups. There was a small but significant difference in outcomes for renal dysfunction at the study end point (SR 5.98% vs. IR 4.83%; odds ratio [OR] 1.29, 95% confidence interval [CI] 1.18-1.40 [p < .001]); however, at 30 days and 12 months there was no significant difference, and this did not hold up to sensitivity analysis. Incidence of renal infarcts (SR 6.6% vs. IR 2.3%; OR 2.78, 95% CI 1.46-5.29 [p = .002]), renal stenosis (SR 2.4% vs. IR 0.8%; OR 2.89, 95% CI 1.00-8.38 [p = .05]), and renal artery occlusion (SR 2.4% vs. IR 1.2%; OR 2.21, 95% CI 1.15-4.25 [p = 0.02]) favoured IR fixation; however, there was no difference in haemodialysis rates. This meta-analysis has identified small but significantly worse renal outcomes in patients having SR fixation devices compared with IR; however, there was no difference in dialysis rates and a small effect on renal dysfunction, which did not stand up to rigorous sensitivity analysis. This should be taken into consideration during graft selection, and further studies must assess renal outcomes in the longer term, and in those with pre-existing renal dysfunction. Copyright © 2018 European Society for Vascular Surgery. Published by Elsevier B.V. All rights reserved.
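A hedged sketch of random-effects pooling of odds ratios using the DerSimonian-Laird estimator (the general approach, not necessarily the exact software routine used in the review); the study-level values are placeholders, not data from the included studies.

```python
import numpy as np

def dersimonian_laird(log_or, se):
    """Pool log odds ratios with a DerSimonian-Laird random-effects model."""
    w = 1.0 / se**2
    fixed = np.sum(w * log_or) / np.sum(w)
    q = np.sum(w * (log_or - fixed) ** 2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(log_or) - 1)) / c)     # between-study variance
    w_star = 1.0 / (se**2 + tau2)
    pooled = np.sum(w_star * log_or) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * se_pooled),
            np.exp(pooled + 1.96 * se_pooled))

# Hypothetical study-level odds ratios for renal dysfunction (SR vs. IR fixation).
or_i = np.array([1.4, 1.1, 1.6, 0.9, 1.3])
se_i = np.array([0.20, 0.15, 0.30, 0.25, 0.18])
print("Pooled OR (95%% CI): %.2f (%.2f-%.2f)" % dersimonian_laird(np.log(or_i), se_i))
```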
Incorporating uncertainty into medical decision making: an approach to unexpected test results.
Bianchi, Matt T; Alexander, Brian M; Cash, Sydney S
2009-01-01
The utility of diagnostic tests derives from the ability to translate the population concepts of sensitivity and specificity into information that will be useful for the individual patient: the predictive value of the result. As the array of available diagnostic testing broadens, there is a temptation to de-emphasize history and physical findings and defer to the objective rigor of technology. However, diagnostic test interpretation is not always straightforward. One significant barrier to routine use of probability-based test interpretation is the uncertainty inherent in pretest probability estimation, the critical first step of Bayesian reasoning. The context in which this uncertainty presents the greatest challenge is when test results oppose clinical judgment. It is in this situation that decision support would be most helpful. The authors propose a simple graphical approach that incorporates uncertainty in pretest probability and has specific application to the interpretation of unexpected results. This method quantitatively demonstrates how uncertainty in disease probability may be amplified when test results are unexpected (opposing clinical judgment), even for tests with high sensitivity and specificity. The authors provide a simple nomogram for determining whether an unexpected test result suggests that one should "switch diagnostic sides." This graphical framework overcomes the limitation of pretest probability uncertainty in Bayesian analysis and guides decision making when it is most challenging: interpretation of unexpected test results.
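The core calculation can be sketched directly from Bayes' rule; the sensitivity, specificity, and pretest range below are illustrative, not values taken from the paper.

```python
def post_test_probability(pretest, sensitivity, specificity, positive=True):
    """Bayes' rule for a single dichotomous test result."""
    if positive:
        p_pos = sensitivity * pretest + (1 - specificity) * (1 - pretest)
        return sensitivity * pretest / p_pos
    p_neg = (1 - sensitivity) * pretest + specificity * (1 - pretest)
    return (1 - sensitivity) * pretest / p_neg

# Hypothetical scenario: clinical judgment puts the pretest probability somewhere
# between 60% and 90%, but the test (sens 0.95, spec 0.95) comes back negative.
for pretest in (0.60, 0.75, 0.90):
    post = post_test_probability(pretest, 0.95, 0.95, positive=False)
    print(f"pretest {pretest:.0%} -> post-test {post:.0%}")
```

With these illustrative numbers, a 60-90% pretest range maps to roughly a 7-32% post-test probability after a negative result, showing how pretest uncertainty is amplified when the result opposes clinical judgment.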
Mirus, Benjamin B.; Nimmo, J.R.
2013-01-01
The impact of preferential flow on recharge and contaminant transport poses a considerable challenge to water-resources management. Typical hydrologic models require extensive site characterization, but can underestimate fluxes when preferential flow is significant. A recently developed source-responsive model incorporates film-flow theory with conservation of mass to estimate unsaturated-zone preferential fluxes with readily available data. The term source-responsive describes the sensitivity of preferential flow in response to water availability at the source of input. We present the first rigorous tests of a parsimonious formulation for simulating water table fluctuations using two case studies, both in arid regions with thick unsaturated zones of fractured volcanic rock. Diffuse flow theory cannot adequately capture the observed water table responses at both sites; the source-responsive model is a viable alternative. We treat the active area fraction of preferential flow paths as a scaled function of water inputs at the land surface then calibrate the macropore density to fit observed water table rises. Unlike previous applications, we allow the characteristic film-flow velocity to vary, reflecting the lag time between source and deep water table responses. Analysis of model performance and parameter sensitivity for the two case studies underscores the importance of identifying thresholds for initiation of film flow in unsaturated rocks, and suggests that this parsimonious approach is potentially of great practical value.
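A schematic sketch of the source-responsive idea as summarized above (not the published formulation): deep preferential flux is carried by films in macropores whose active fraction scales with water input above a threshold. All parameter values and units below are hypothetical.

```python
import numpy as np

def preferential_flux(input_mm, a_macro=1e-4, v_film=400.0, threshold_mm=5.0):
    """Flux in m/day; a_macro = areal fraction of active macropores (calibrated),
    v_film = characteristic film velocity in m/day. Both are hypothetical values."""
    f_active = np.clip((input_mm - threshold_mm) / 50.0, 0.0, 1.0)  # input-scaled active fraction
    return a_macro * v_film * f_active

rain = np.array([0.0, 2.0, 12.0, 40.0, 80.0])      # daily water input, mm
specific_yield = 0.02                               # fractured volcanic rock, hypothetical
rise = preferential_flux(rain) / specific_yield     # implied water-table rise, m/day
print(np.round(rise, 3))
```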
Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R
1983-07-01
The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to verify the eventual presence of factors that could play a role in the modification of its development.
RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.
Meltzer, S J; Auer, J
1908-01-01
Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions (nearly equimolecular to "physiological" solutions of sodium chloride) are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case, as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.
Attitude stability of spinning satellites
NASA Technical Reports Server (NTRS)
Caughey, T. K.
1980-01-01
Some problems of attitude stability of spinning satellites are treated in a rigorous manner. With certain restrictions, linearized stability analysis correctly predicts the attitude stability of spinning satellites, even in the critical cases of the Liapunov-Poincare stability theory.
A rigorous method was developed to maximize the extraction efficacy for perfluorocarboxylic acids (PFCAs), perfluorosulfonates (PFSAs), fluorotelomer alcohols (FTOHs), fluorotelomer acrylates (FTAc), perfluorosulfonamides (FOSAs), and perfluorosulfonamidoethanols (FOSEs) from was...
McCaig, Chris; Begon, Mike; Norman, Rachel; Shankland, Carron
2011-03-01
Changing scale, for example, the ability to move seamlessly from an individual-based model to a population-based model, is an important problem in many fields. In this paper, we introduce process algebra as a novel solution to this problem in the context of models of infectious disease spread. Process algebra allows us to describe a system in terms of the stochastic behaviour of individuals, and is a technique from computer science. We review the use of process algebra in biological systems, and the variety of quantitative and qualitative analysis techniques available. The analysis illustrated here solves the changing scale problem: from the individual behaviour we can rigorously derive equations to describe the mean behaviour of the system at the level of the population. The biological problem investigated is the transmission of infection, and how this relates to individual interactions.
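As an illustration of the end product of such a derivation, the mean-field equations for a simple susceptible-infectious-recovered process can be integrated directly; the rates below are hypothetical, and the model is the generic SIR system rather than a specific process algebra example from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Population-level mean-field equations of the kind derivable from an
# individual-level description: contacts at rate beta, recovery at rate gamma.
beta, gamma, N = 0.3, 0.1, 1000.0

def sir(t, y):
    s, i, r = y
    return [-beta * s * i / N, beta * s * i / N - gamma * i, gamma * i]

sol = solve_ivp(sir, (0, 160), [N - 1, 1, 0], dense_output=True)
t = np.linspace(0, 160, 5)
print(np.round(sol.sol(t)[1], 1))   # infectious class at a few time points
```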
Climate Change Accuracy: Requirements and Economic Value
NASA Astrophysics Data System (ADS)
Wielicki, B. A.; Cooke, R.; Mlynczak, M. G.; Lukashin, C.; Thome, K. J.; Baize, R. R.
2014-12-01
Higher than normal accuracy is required to rigorously observe decadal climate change. But what level is needed? How can this be quantified? This presentation will summarize a new, more rigorous and quantitative approach to determining the required accuracy for climate change observations (Wielicki et al., 2013, BAMS). Most current global satellite observations cannot meet this accuracy level. A proposed new satellite mission to resolve this challenge is CLARREO (Climate Absolute Radiance and Refractivity Observatory). CLARREO is designed to achieve advances of a factor of 10 for reflected solar spectra and a factor of 3 to 5 for thermal infrared spectra (Wielicki et al., Oct. 2013 BAMS). The CLARREO spectrometers are designed to serve as SI-traceable benchmarks for the Global Space-based Inter-Calibration System (GSICS) and to greatly improve the utility of a wide range of LEO and GEO infrared and reflected solar passive satellite sensors for climate change observations (e.g. CERES, MODIS, VIIRS, CrIS, IASI, Landsat, SPOT, etc.). Providing more accurate decadal change trends can in turn lead to more rapid narrowing of key climate science uncertainties such as cloud feedback and climate sensitivity. A study has been carried out to quantify the economic benefits of such an advance as part of a rigorous and complete climate observing system. The study concludes that the economic value is $12 trillion U.S. dollars in Net Present Value for a nominal discount rate of 3% (Cooke et al. 2013, J. Env. Sys. Dec.). A brief summary of these two studies and their implications for the future of climate science will be presented.
Effect of carbon monoxide on plants. [Mimosa pudica
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zimmerman, P.W.; Crocker, W.; Hitchcock, A.E.
Of 108 species of plants treated with one per cent carbon monoxide, 45 showed epinastic growth of leaves. Several species showed hyponasty which caused upward curling of leaves. Other effects included: retarded stem elongation; abnormally small new leaves; abnormal yellowing of the leaves, beginning with the oldest; abscission of leaves usually associated with yellowing; and hypertrophied tissues on stems and roots. During recovery an abnormally large number of side shoots arose from latent buds of many species. Motion pictures of Mimosa pudica showed a loss of correlation, normal equilibrium position to gravity, and sensitiveness to contact or heat stimuli; however, the leaves moved about more rapidly than those of controls. Since carbon monoxide causes growth rigor and loss of sensitiveness to external stimuli, it is here considered as an anesthetic.
NLTE atomic kinetics modeling in ICF target simulations
NASA Astrophysics Data System (ADS)
Patel, Mehul V.; Mauche, Christopher W.; Scott, Howard A.; Jones, Ogden S.; Shields, Benjamin T.
2017-10-01
Radiation hydrodynamics (HYDRA) simulations using recently developed 1D spherical and 2D cylindrical hohlraum models have enabled a reassessment of the accuracy of energetics modeling across a range of NIF target configurations. Higher-resolution hohlraum calculations generally find that the X-ray drive discrepancies are greater than previously reported. We identify important physics sensitivities in the modeling of the NLTE wall plasma and highlight sensitivity variations between different hohlraum configurations (e.g. hohlraum gas fill). Additionally, 1D capsule only simulations show the importance of applying a similar level of rigor to NLTE capsule ablator modeling. Taken together, these results show how improved target performance predictions can be achieved by performing inline atomic kinetics using more complete models for the underlying atomic structure and transitions. Prepared by LLNL under Contract DE-AC52-07NA27344.
Optical-fiber-based Mueller optical coherence tomography.
Jiao, Shuliang; Yu, Wurong; Stoica, George; Wang, Lihong V
2003-07-15
An optical-fiber-based multichannel polarization-sensitive Mueller optical coherence tomography (OCT) system was built to acquire the Jones or Mueller matrix of a scattering medium, such as biological tissue. For the first time to our knowledge, fiber-based polarization-sensitive OCT was dynamically calibrated to eliminate the polarization distortion caused by the single-mode optical fiber in the sample arm, thereby overcoming a key technical impediment to the application of optical fibers in this technology. The round-trip Jones matrix of the sampling fiber was acquired from the reflecting surface of the sample for each depth scan (A scan) with our OCT system. A new rigorous algorithm was then used to retrieve the calibrated polarization properties of the sample. This algorithm was validated with experimental data. The skin of a rat was imaged with this fiber-based system.
Long persistence of rigor mortis at constant low temperature.
Varetto, Lorenzo; Curto, Ombretta
2005-01-06
We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation and, in one case, rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete to partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers kept under observation "à outrance" the complete resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis that is much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. Therefore, this datum must be considered when a corpse is found in those environmental conditions so that, when estimating the time of death, we are not misled by the long persistence of rigor mortis.
Rigor Made Easy: Getting Started
ERIC Educational Resources Information Center
Blackburn, Barbara R.
2012-01-01
Bestselling author and noted rigor expert Barbara Blackburn shares the secrets to getting started, maintaining momentum, and reaching your goals. Learn what rigor looks like in the classroom, understand what it means for your students, and get the keys to successful implementation. Learn how to use rigor to raise expectations, provide appropriate…
Close Early Learning Gaps with Rigorous DAP
ERIC Educational Resources Information Center
Brown, Christopher P.; Mowry, Brian
2015-01-01
Rigorous DAP (developmentally appropriate practices) is a set of 11 principles of instruction intended to help close early childhood learning gaps. Academically rigorous learning environments create the conditions for children to learn at high levels. While academic rigor focuses on one dimension of education--academic--DAP considers the whole…
Civil Rights Project's Response to "Re-Analysis" of Charter School Study
ERIC Educational Resources Information Center
Civil Rights Project / Proyecto Derechos Civiles, 2010
2010-01-01
The Civil Rights Project (CRP) was founded, in part, to bring rigorous social science inquiry to bear on the most pressing civil rights issues. On-going trends involving public school segregation have been a primary focus of the CRP's research, and the expanding policy emphasis on school choice prompted analysis of the much smaller--but…
ERIC Educational Resources Information Center
Cowan, Richard J.; Abel, Leah; Candel, Lindsay
2017-01-01
We conducted a meta-analysis of single-subject research studies investigating the effectiveness of antecedent strategies grounded in behavioral momentum for improving compliance and on-task performance for students with autism. First, we assessed the research rigor of those studies meeting our inclusionary criteria. Next, in order to apply a…
A Review of the Application of Lifecycle Analysis to Renewable Energy Systems
ERIC Educational Resources Information Center
Lund, Chris; Biswas, Wahidul
2008-01-01
The lifecycle concept is a "cradle to grave" approach to thinking about products, processes, and services, recognizing that all stages have environmental and economic impacts. Any rigorous and meaningful comparison of energy supply options must be done using a lifecycle analysis approach. It has been applied to an increasing number of conventional…
Considerations for the Systematic Analysis and Use of Single-Case Research
ERIC Educational Resources Information Center
Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith
2012-01-01
Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…
Using High Speed Smartphone Cameras and Video Analysis Techniques to Teach Mechanical Wave Physics
ERIC Educational Resources Information Center
Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano
2017-01-01
We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allows one to measure, in a simple yet rigorous way, the speed of pulses…
Conceptualization of Light Refraction
ERIC Educational Resources Information Center
Sokolowski, Andrzej
2013-01-01
There have been a number of papers dealing quantitatively with light refraction. Yet the conceptualization of the phenomenon that sets the foundation for a more rigorous math analysis is minimized. The purpose of this paper is to fill that gap. (Contains 3 figures.)
Using Framework Analysis in nursing research: a worked example.
Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica
2013-11-01
To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.
Augmented assessment as a means to augmented reality.
Bergeron, Bryan
2006-01-01
Rigorous scientific assessment of educational technologies typically lags behind the availability of the technologies by years because of the lack of validated instruments and benchmarks. Even when the appropriate assessment instruments are available, they may not be applied because of time and monetary constraints. Work in augmented reality, instrumented mannequins, serious gaming, and similar promising educational technologies that haven't undergone timely, rigorous evaluation, highlights the need for assessment methodologies that address the limitations of traditional approaches. The most promising augmented assessment solutions incorporate elements of rapid prototyping used in the software industry, simulation-based assessment techniques modeled after methods used in bioinformatics, and object-oriented analysis methods borrowed from object oriented programming.
Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation
NASA Astrophysics Data System (ADS)
Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe
2018-04-01
In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.
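Schematically, and with notation that varies between papers in this literature, the radii polynomial condition used in such computer-assisted proofs takes the following form.

```latex
% Schematic form of the radii polynomial condition (notation differs between papers).
% Y_k bounds the defect of a Newton-like operator T at the numerical approximation
% \bar{x}, and Z_k(r) bounds the action of DT on a ball of radius r about \bar{x}.
\[
  p_k(r) \;=\; Y_k + Z_k(r) - r, \qquad k = 1, \dots, m .
\]
% If some r_0 > 0 satisfies p_k(r_0) < 0 for every k, then T is a contraction on the
% closed ball of radius r_0 about \bar{x} in the weighted \ell^1 space, and a true
% periodic orbit lies within distance r_0 of the numerical approximation.
```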
Krompecher, T; Bergerioux, C
1988-01-01
The influence of electrocution on the evolution of rigor mortis was studied on rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, a complete rigor develops already 1 h post-mortem (p.m.) compared to 5 h p.m. for the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In case of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower as compared with the ante-mortem electrocution cases. The results are completed by two practical observations on human electrocution cases.
Rigorous Schools and Classrooms: Leading the Way
ERIC Educational Resources Information Center
Williamson, Ronald; Blackburn, Barbara R.
2010-01-01
Turn your school into a student-centered learning environment, where rigor is at the heart of instruction in every classroom. From the bestselling author of "Rigor is Not a Four-Letter Word," Barbara Blackburn, and award-winning educator Ronald Williamson, this comprehensive guide to establishing a schoolwide culture of rigor is for principals and…
Rigor Revisited: Scaffolding College Student Learning by Incorporating Their Lived Experiences
ERIC Educational Resources Information Center
Castillo-Montoya, Milagros
2018-01-01
This chapter explores how students' lived experiences contribute to the rigor of their thinking. Insights from research indicate faculty can enhance rigor by accounting for the many ways it may surface in the classroom. However, to see this type of rigor, we must revisit the way we conceptualize it for higher education.
Mungure, Tanyaradzwa E; Bekhit, Alaa El-Din A; Birch, E John; Stewart, Ian
2016-04-01
The effects of rigor temperature (5, 15, 20 and 25°C), ageing (3, 7, 14, and 21 days) and display time on meat quality and lipid oxidative stability of hot boned beef M. Semimembranosus (SM) muscle were investigated. Ultimate pH (pH(u)) was rapidly attained at higher rigor temperatures. Electrical conductivity increased with rigor temperature (p<0.001). Tenderness, purge and cooking losses were not affected by rigor temperature; however purge loss and tenderness increased with ageing (p<0.01). Lightness (L*) and redness (a*) of the SM increased as rigor temperature increased (p<0.01). Lipid oxidation was assessed using (1)H NMR where changes in aliphatic to olefinic (R(ao)) and diallylmethylene (R(ad)) proton ratios can be rapidly monitored. R(ad), R(ao), PUFA and TBARS were not affected by rigor temperature, however ageing and display increased lipid oxidation (p<0.05). This study shows that rigor temperature manipulation of hot boned beef SM muscle does not have adverse effects on lipid oxidation. Copyright © 2016 Elsevier Ltd. All rights reserved.
Goebel, Carsten; Diepgen, Thomas L; Blömeke, Brunhilde; Gaspari, Anthony A; Schnuch, Axel; Fuchs, Anne; Schlotmann, Kordula; Krasteva, Maya; Kimber, Ian
2018-06-01
Occupational exposure of hairdressers to hair dyes has been associated with the development of allergic contact dermatitis (ACD) involving the hands. p-Phenylenediamine (PPD) and toluene-2,5-diamine (PTD) have been implicated as important occupational contact allergens. To conduct a quantitative risk assessment for the induction of contact sensitization to hair dyes in hairdressers, available data from hand rinsing studies following typical occupational exposure conditions to PPD, PTD and resorcinol were assessed. By accounting for wet work, uneven exposure and inter-individual variability for professionals, daily hand exposure concentrations were derived. Secondly, daily hand exposure was compared with the sensitization induction potency of the individual hair dye defined as the No Expected Sensitization Induction Levels (NESIL). For PPD and PTD hairdresser hand exposure levels were 2.7 and 5.9 fold below the individual NESIL. In contrast, hand exposure to resorcinol was 50 fold below the NESIL. Correspondingly, the risk assessment for PPD and PTD indicates that contact sensitization may occur, when skin protection and skin care are not rigorously applied. We conclude that awareness of health risks associated with occupational exposure to hair dyes, and of the importance of adequate protective measures, should be emphasized more fully during hairdresser education and training. Copyright © 2018 Elsevier Inc. All rights reserved.
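The risk characterization step reduces to comparing daily hand exposure against the NESIL. The sketch below uses placeholder absolute values chosen only so the ratios match the reported fold-differences, together with an illustrative acceptance threshold; the true exposure levels, NESILs, and decision criteria are those given in the study.

```python
# Placeholder exposure and NESIL values in consistent units (e.g. ug/cm2 per day);
# chosen only so the margins reproduce the reported 2.7-, 5.9-, and 50-fold ratios.
exposures = {"PPD": 30.0, "PTD": 14.0, "resorcinol": 2.0}
nesil = {"PPD": 81.0, "PTD": 83.0, "resorcinol": 100.0}
ACCEPT_MARGIN = 10.0   # illustrative threshold, not the study's criterion

for dye, exposure in exposures.items():
    margin = nesil[dye] / exposure
    verdict = "adequate" if margin >= ACCEPT_MARGIN else "protective measures essential"
    print(f"{dye}: exposure is {margin:.1f}-fold below the NESIL ({verdict})")
```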
Surface-Sensitive Microwear Texture Analysis of Attrition and Erosion.
Ranjitkar, S; Turan, A; Mann, C; Gully, G A; Marsman, M; Edwards, S; Kaidonis, J A; Hall, C; Lekkas, D; Wetselaar, P; Brook, A H; Lobbezoo, F; Townsend, G C
2017-03-01
Scale-sensitive fractal analysis of high-resolution 3-dimensional surface reconstructions of wear patterns has advanced our knowledge in evolutionary biology, and has opened up opportunities for translatory applications in clinical practice. To elucidate the microwear characteristics of attrition and erosion in worn natural teeth, we scanned 50 extracted human teeth using a confocal profiler at a high optical resolution (X-Y, 0.17 µm; Z < 3 nm). Our hypothesis was that microwear complexity would be greater in erosion and that anisotropy would be greater in attrition. The teeth were divided into 4 groups, including 2 wear types (attrition and erosion) and 2 locations (anterior and posterior teeth; n = 12 for each anterior group, n = 13 for each posterior group) for 2 tissue types (enamel and dentine). The raw 3-dimensional data cloud was subjected to a newly developed rigorous standardization technique to reduce interscanner variability as well as to filter anomalous scanning data. Linear mixed effects (regression) analyses conducted separately for the dependent variables, complexity and anisotropy, showed the following effects of the independent variables: significant interactions between wear type and tissue type ( P = 0.0157 and P = 0.0003, respectively) and significant effects of location ( P < 0.0001 and P = 0.0035, respectively). There were significant associations between complexity and anisotropy when the dependent variable was either complexity ( P = 0.0003) or anisotropy ( P = 0.0014). Our findings of greater complexity in erosion and greater anisotropy in attrition confirm our hypothesis. The greatest geometric means were noted in dentine erosion for complexity and dentine attrition for anisotropy. Dentine also exhibited microwear characteristics that were more consistent with wear types than enamel. Overall, our findings could complement macrowear assessment in dental clinical practice and research and could assist in the early detection and management of pathologic tooth wear.
Facility Concepts for Mars Returned Sample Handling
NASA Technical Reports Server (NTRS)
Cohen, Marc M.; Briggs, Geoff (Technical Monitor)
2001-01-01
Samples returned from Mars must be held in quarantine until their biological safety has been determined. A significant challenge, unique to NASA's needs, is how to contain the samples (to protect the biosphere) while simultaneously protecting their pristine nature. This paper presents a comparative analysis of several quarantine facility concepts for handling and analyzing these samples. The considerations in this design analysis include: modes of manipulation; capability for destructive as well as non-destructive testing; avoidance of cross-contamination; linear versus recursive processing; and sample storage and retrieval within a closed system. The ability to rigorously contain biologically hazardous materials has been amply demonstrated by facilities that meet the specifications of the Centers for Disease Control Biosafety Level 4. The newly defined Planetary Protection Level Alpha must provide comparable containment while assuring that the samples remain pristine; the latter requirement is based on the need to avoid compromising science analyses by instrumentation of the highest possible sensitivity (among other things this will assure that there is no false positive detection of organisms or organic molecules - a situation that would delay or prevent the release of the samples from quarantine). Protection of the samples against contamination by terrestrial organisms and organic molecules makes a considerable impact upon the sample handling facility. The use of glove boxes appears to be impractical because of their tendency to leak and to surge. As a result, a returned sample quarantine facility must consider the use of automation and remote manipulation to carry out the various functions of sample handling and transfer within the system. The problem of maintaining sensitive and bulky instrumentation under the constraints of simultaneous sample containment and contamination protection also places demands on the architectural configuration of the facility that houses it.
Sayasone, Somphou; Utzinger, Jürg; Akkhavong, Kongsap; Odermatt, Peter
2015-01-01
Intestinal parasitic infections are common in Lao People's Democratic Republic (Lao PDR). We investigated the accuracy of the Kato-Katz (KK) technique in relation to varying stool sampling efforts, and determined the effect of the concurrent use of a quantitative formalin-ethyl acetate concentration technique (FECT) for helminth diagnosis and appraisal of concomitant infections. The study was carried out between March and May 2006 in Champasack province, southern Lao PDR. Overall, 485 individuals aged ≥6 months who provided three stool samples were included in the final analysis. All stool samples were subjected to the KK technique. Additionally, one stool sample per individual was processed by FECT. Diagnosis was done under a light microscope by experienced laboratory technicians. Analysis of three stool samples with KK plus a single FECT was considered the diagnostic 'gold' standard and resulted in prevalence estimates of hookworm, Opisthorchis viverrini, Ascaris lumbricoides, Trichuris trichiura and Schistosoma mekongi infection of 77.9%, 65.0%, 33.4%, 26.2% and 24.3%, respectively. As expected, a single KK and a single FECT missed a considerable number of infections. While our diagnostic 'gold' standard produced similar results to those obtained by a mathematical model for most helminth infections, the 'true' prevalence predicted by the model for S. mekongi (28.1%) was somewhat higher than after multiple KK plus a single FECT (24.3%). In the current setting, triplicate KK plus a single FECT diagnosed helminth infections with high sensitivity. Hence, such a diagnostic approach might be utilised for generating high-quality baseline data, assessing anthelminthic drug efficacy and rigorously monitoring community interventions. Copyright © 2014 Elsevier B.V. All rights reserved.
Critical Realism and Empirical Bioethics: A Methodological Exposition.
McKeown, Alex
2017-09-01
This paper shows how critical realism can be used to integrate empirical data and philosophical analysis within 'empirical bioethics'. The term empirical bioethics, whilst appearing oxymoronic, simply refers to an interdisciplinary approach to the resolution of practical ethical issues within the biological and life sciences, integrating social scientific, empirical data with philosophical analysis. It seeks to achieve a balanced form of ethical deliberation that is both logically rigorous and sensitive to context, to generate normative conclusions that are practically applicable to the problem, challenge, or dilemma. Since it incorporates both philosophical and social scientific components, empirical bioethics is a field that is consistent with the use of critical realism as a research methodology. The integration of philosophical and social scientific approaches to ethics has been beset with difficulties, not least because of the irreducibly normative, rather than descriptive, nature of ethical analysis and the contested relation between fact and value. However, given that facts about states of affairs inform potential courses of action and their consequences, there is a need to overcome these difficulties and successfully integrate data with theory. Previous approaches have been formulated to overcome obstacles in combining philosophical and social scientific perspectives in bioethical analysis; however each has shortcomings. As a mature interdisciplinary approach critical realism is well suited to empirical bioethics, although it has hitherto not been widely used. Here I show how it can be applied to this kind of research and explain how it represents an improvement on previous approaches.
Hypothetical Case and Scenario Description for International Transportation of Spent Nuclear Fuel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Adam David; Osborn, Douglas; Jones, Katherine A.
To support more rigorous analysis on global security issues at Sandia National Laboratories (SNL), there is a need to develop realistic data sets without using "real" data or identifying "real" vulnerabilities, hazards or geopolitically embarrassing shortcomings. In response, an interdisciplinary team led by subject matter experts in SNL's Center for Global Security and Cooperation (CGSC) developed a hypothetical case description. This hypothetical case description assigns various attributes related to international SNF transportation that are representative, illustrative and indicative of "real" characteristics of "real" countries. There is no intent to identify any particular country and any similarity with specific real-world events is purely coincidental. To support the goal of this report to provide a case description (and set of scenarios of concern) for international SNF transportation inclusive of as much "real-world" complexity as possible -- without crossing over into politically sensitive or classified information -- this SAND report provides a subject matter expert-validated (and detailed) description of both technical and political influences on the international transportation of spent nuclear fuel.
Antineutrino analysis for continuous monitoring of nuclear reactors: Sensitivity study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Christopher; Erickson, Anna
This paper explores the various contributors to uncertainty on predictions of the antineutrino source term, which is used for reactor antineutrino experiments and is proposed as a safeguard mechanism for future reactor installations. The errors introduced during simulation of the reactor burnup cycle by variation in nuclear reaction cross sections, operating power, and other factors are combined with those from experimental and predicted antineutrino yields resulting from fissions, and then evaluated and compared. The most significant contributor to uncertainty on the reactor antineutrino source term, when the reactor was modeled in 3D fidelity with assembly-level heterogeneity, was found to be the uncertainty on the antineutrino yields. Using the reactor simulation uncertainty data, the dedicated observation of a rigorously modeled small, fast reactor by a few-ton near-field detector was estimated to offer reduction of uncertainty on antineutrino yields in the 3.0–6.5 MeV range to a few percent for the primary power-producing fuel isotopes, even with zero prior knowledge of the yields.
Differentially Private Frequent Subgraph Mining
Xu, Shengzhi; Xiong, Li; Cheng, Xiang; Xiao, Ke
2016-01-01
Mining frequent subgraphs from a collection of input graphs is an important topic in data mining research. However, if the input graphs contain sensitive information, releasing frequent subgraphs may pose considerable threats to individuals' privacy. In this paper, we study the problem of frequent subgraph mining (FGM) under the rigorous differential privacy model. We introduce a novel differentially private FGM algorithm, which is referred to as DFG. In this algorithm, we first privately identify frequent subgraphs from input graphs, and then compute the noisy support of each identified frequent subgraph. In particular, to privately identify frequent subgraphs, we present a frequent subgraph identification approach which can improve the utility of frequent subgraph identifications through candidate pruning. Moreover, to compute the noisy support of each identified frequent subgraph, we devise a lattice-based noisy support derivation approach, in which a series of methods is proposed to improve the accuracy of the noisy supports. Through formal privacy analysis, we prove that our DFG algorithm satisfies ε-differential privacy. Extensive experimental results on real datasets show that the DFG algorithm can privately find frequent subgraphs with high data utility. PMID:27616876
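The basic differential privacy primitive behind releasing support counts is the addition of calibrated noise. The sketch below shows only the generic Laplace mechanism, not the lattice-based noisy-support derivation used by DFG; the support values and ε are invented for illustration.

```python
# Generic Laplace-mechanism sketch for releasing noisy support counts under
# epsilon-differential privacy. Not the DFG lattice-based approach.
import numpy as np

def noisy_supports(true_supports, epsilon, sensitivity=1.0, rng=None):
    """Add Laplace noise of scale sensitivity/epsilon to each support count."""
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return [s + rng.laplace(0.0, scale) for s in true_supports]

# Example: supports of three hypothetical frequent subgraphs, eps = 0.5.
print(noisy_supports([120, 87, 45], epsilon=0.5))
```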
Widely tunable Fabry-Perot filter based MWIR and LWIR microspectrometers
NASA Astrophysics Data System (ADS)
Ebermann, Martin; Neumann, Norbert; Hiller, Karla; Gittler, Elvira; Meinig, Marco; Kurth, Steffen
2012-06-01
As is generally known, miniature infrared spectrometers have great potential, e.g., for process and environmental analytics or in medical applications. Many efforts are being made to shrink conventional spectrometers, such as FTIR or grating based devices. A more rigorous approach for miniaturization is the use of MEMS technologies. Based on an established design for the MWIR, new MEMS Fabry-Perot filters and sensors with expanded spectral ranges in the LWIR have been developed. The range 5.5 - 8 μm is particularly suited for the analysis of liquids. A dual-band sensor, which can be simultaneously tuned from 4 - 5 μm and 8 - 11 μm for the measurement of anesthetics and carbon dioxide, has also been developed. A new material system is used to reduce internal stress in the reflector layer stack. Good results in terms of finesse (≤ 60) and transmittance (≤ 80%) were demonstrated. The hybrid integration of the filter in a pyroelectric detector results in very compact, robust and cost-effective microspectrometers. FP filters with two movable reflectors instead of only one significantly reduce the acceleration sensitivity and actuation voltage.
Parametric assessment of climate change impacts of automotive material substitution.
Geyer, Roland
2008-09-15
Quantifying the net climate change impact of automotive material substitution is not a trivial task. It requires the assessment of the mass reduction potential of automotive materials, the greenhouse gas (GHG) emissions from their production and recycling, and their impact on GHG emissions from vehicle use. The model presented in this paper is based on life cycle assessment (LCA) and completely parameterized, i.e., its computational structure is separated from the required input data, which is not traditionally done in LCAs. The parameterization increases scientific rigor and transparency of the assessment methodology, facilitates sensitivity and uncertainty analysis of the results, and also makes it possible to compare different studies and explain their disparities. The state of the art of the modeling methodology is reviewed and advanced. Assessment of the GHG emission impacts of material recycling through consequential system expansion shows that our understanding of this issue is still incomplete. This is a critical knowledge gap since a case study shows that for materials such as aluminum, the GHG emission impacts of material production and recycling are both of the same size as the use phase savings from vehicle mass reduction.
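The parameterized structure described above can be illustrated with a toy net-GHG balance that separates production, recycling and use-phase terms. This is a minimal sketch under assumed, purely hypothetical parameter values, not the paper's model or data.

```python
# Toy parameterization of the material-substitution trade-off described above.
# All numerical defaults are hypothetical placeholders, not values from the paper.

def net_ghg_change(delta_production_kg: float,
                   delta_recycling_kg: float,
                   mass_saved_kg: float,
                   use_phase_kgCO2_per_kg: float) -> float:
    """Net life-cycle GHG change (kg CO2e) from substituting a lighter material.

    Positive values mean the substitution increases emissions; negative values
    mean use-phase savings outweigh the production/recycling burden.
    """
    use_phase_savings = mass_saved_kg * use_phase_kgCO2_per_kg
    return delta_production_kg + delta_recycling_kg - use_phase_savings

# Hypothetical example: higher production emissions partly offset by recycling
# credits and lifetime fuel savings from the saved mass.
print(net_ghg_change(delta_production_kg=400.0, delta_recycling_kg=-150.0,
                     mass_saved_kg=50.0, use_phase_kgCO2_per_kg=6.0))
```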
Observations of Far-Infrared Molecular Emission Lines from the Orion Molecular Cloud. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Viscuso, P. J.
1986-01-01
The Orion Nebula has been the subject of intensive study for over one hundred years. Recently, several far infrared transitions among the low-lying levels of OH were observed toward IRc2. OH is thought to be abundant, and plays an important role in the chemical evolution of shock and post-shock regions. The OH emission serves as a sensitive probe of the temperature and density of the shock-processed gas. A rigorous treatment of the radiative transfer of these measured transitions was performed using the escape probability formalism. From this analysis, the temperature of the OH-emitting region was determined to be on the order of 40 K. This suggests that the gas is part of the post-shock gas that has cooled sufficiently, most likely by way of radiative cooling by CO. Such cooling from shock temperatures of several degrees can be accomplished in 100 years. A molecular hydrogen density of 3 x 10^6/cubic cm and an OH column density of 1.0 x 10^17/sq cm are found. The beam filling factor is determined to be 36%.
Percy, Andrew J; Yang, Juncong; Hardie, Darryl B; Chambers, Andrew G; Tamura-Wells, Jessica; Borchers, Christoph H
2015-06-15
Spurred on by the growing demand for panels of validated disease biomarkers, increasing efforts have focused on advancing qualitative and quantitative tools for more highly multiplexed and sensitive analyses of a multitude of analytes in various human biofluids. In quantitative proteomics, evolving strategies involve the use of the targeted multiple reaction monitoring (MRM) mode of mass spectrometry (MS) with stable isotope-labeled standards (SIS) used for internal normalization. Using that preferred approach with non-invasive urine samples, we have systematically advanced and rigorously assessed the methodology toward the precise quantitation of the largest multiplexed panel of candidate protein biomarkers in human urine to date. The concentrations of the 136 proteins span >5 orders of magnitude (from 8.6 μg/mL to 25 pg/mL), with average CVs of 8.6% across process triplicates. Detailed here are our quantitative method, the analysis strategy, a feasibility application to prostate cancer samples, and a discussion of the utility of this method in translational studies. Copyright © 2015 Elsevier Inc. All rights reserved.
Passalacqua, G; Compalati, E; Schiappoli, M; Senna, G
2005-03-01
The use of Complementary/Alternative Medicines (CAM) is widespread and constantly increasing, especially in the field of allergic diseases and asthma. Homeopathy, acupuncture and phytotherapy are the most frequently utilised treatments, whereas complementary diagnostic techniques are mainly used in the field of food allergy-intolerance. Looking at the literature, the majority of clinical trials with CAM are of low methodological quality and thus difficult to interpret. Very few studies have been performed in a rigorously controlled fashion, and those studies provided inconclusive results. In asthma, none of the CAM approaches has thus far been proved more effective than placebo or as effective as standard treatments. Some herbal products, containing active principles, have displayed some clinical effect, but the herbal remedies are usually not standardised and not quantified, and thus carry the risk of toxic effects or interactions. None of the alternative diagnostic techniques (electrodermal testing, kinesiology, leukocytotoxic test, iridology, hair analysis) has been proved able to distinguish between healthy and allergic subjects or to diagnose sensitizations. Therefore these tests must not be used, since they can lead to delayed or incorrect diagnosis and therapy.
Exploring Student Perceptions of Rigor Online: Toward a Definition of Rigorous Learning
ERIC Educational Resources Information Center
Duncan, Heather E.; Range, Bret; Hvidston, David
2013-01-01
Technological advances in the last decade have impacted delivery methods of university courses. More and more courses are offered in a variety of formats. While academic rigor is a term often used, its definition is less clear. This mixed-methods study explored graduate student conceptions of rigor in the online learning environment embedded…
Analysis of Perfluorinated Chemicals in Sludge: Method Development and Initial Results
A fast, rigorous method was developed to maximize the extraction efficacy for ten perfluorocarboxylic acids and perfluorooctanesulfonate from wastewater-treatment sludge and to quantitate using liquid chromatography, tandem-mass spectrometry (LC/MS/MS). First, organic solvents w...
DOT National Transportation Integrated Search
1996-04-01
This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prowell, Stacy J; Symons, Christopher T
2015-01-01
Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.
Political Animals: The Paradox of Ecofeminist Politics.
ERIC Educational Resources Information Center
Sandilands, Catriona
1994-01-01
Analyzes the paradox between the careful work of rigorous political analysis and philosophy and a desire for mystery and the experience of awe and wildness that demands putting aside careful reasoning and the sensing of nature in an altogether different way. (LZ)
Methodological rigor and citation frequency in patient compliance literature.
Bruer, J T
1982-01-01
An exhaustive bibliography which assesses the methodological rigor of the patient compliance literature, and citation data from the Science Citation Index (SCI) are combined to determine if methodologically rigorous papers are used with greater frequency than substandard articles by compliance investigators. There are low, but statistically significant, correlations between methodological rigor and citation indicators for 138 patient compliance papers published in SCI source journals during 1975 and 1976. The correlation is not strong enough to warrant use of citation measures as indicators of rigor on a paper-by-paper basis. The data do suggest that citation measures might be developed as crude indicators of methodological rigor. There is no evidence that randomized trials are cited more frequently than studies that employ other experimental designs. PMID:7114334
Hedger, Nicholas; Gray, Katie L H; Garner, Matthew; Adams, Wendy J
2016-09-01
Given capacity limits, only a subset of stimuli give rise to a conscious percept. Neurocognitive models suggest that humans have evolved mechanisms that operate without awareness and prioritize threatening stimuli over neutral stimuli in subsequent perception. In this meta-analysis, we review evidence for this 'standard hypothesis' emanating from 3 widely used but rather different experimental paradigms that have been used to manipulate awareness. We found a small pooled threat-bias effect in the masked visual probe paradigm, a medium effect in the binocular rivalry paradigm and highly inconsistent effects in the breaking continuous flash suppression paradigm. Substantial heterogeneity was explained by the stimulus type: the only threat stimuli that were robustly prioritized across all 3 paradigms were fearful faces. Meta-regression revealed that anxiety may modulate threat-biases, but only under specific presentation conditions. We also found that insufficiently rigorous awareness measures, inadequate control of response biases and low-level confounds may undermine claims of genuine unconscious threat processing. Considering the data together, we suggest that uncritical acceptance of the standard hypothesis is premature: current behavioral evidence for threat-sensitive visual processing that operates without awareness is weak. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
2016-01-01
Long wavelength ultraviolet radiation (UVA, 320–400 nm) interacts with chromophores present in human cells to induce reactive oxygen species (ROS) that damage both DNA and proteins. ROS levels are amplified, and the damaging effects of UVA are exacerbated if the cells are irradiated in the presence of UVA photosensitizers such as 6-thioguanine (6-TG), a strong UVA chromophore that is extensively incorporated into the DNA of dividing cells, or the fluoroquinolone antibiotic ciprofloxacin. Both DNA-embedded 6-TG and ciprofloxacin combine synergistically with UVA to generate high levels of ROS. Importantly, the extensive protein damage induced by these photosensitizer+UVA combinations inhibits DNA repair. DNA is maintained in intimate contact with the proteins that effect its replication, transcription, and repair, and DNA–protein cross-links (DPCs) are a recognized reaction product of ROS. Cross-linking of DNA metabolizing proteins would compromise these processes by introducing physical blocks and by depleting active proteins. We describe a sensitive and statistically rigorous method to analyze DPCs in cultured human cells. Application of this proteomics-based analysis to cells treated with 6-TG+UVA and ciprofloxacin+UVA identified proteins involved in DNA repair, replication, and gene expression among those most vulnerable to cross-linking under oxidative conditions. PMID:27654267
Rosenthal, Eric S; Biswal, Siddharth; Zafar, Sahar F; O'Connor, Kathryn L; Bechek, Sophia; Shenoy, Apeksha V; Boyle, Emily J; Shafi, Mouhsin M; Gilmore, Emily J; Foreman, Brandon P; Gaspard, Nicolas; Leslie-Mazwi, Thabele M; Rosand, Jonathan; Hoch, Daniel B; Ayata, Cenk; Cash, Sydney S; Cole, Andrew J; Patel, Aman B; Westover, M Brandon
2018-04-16
Delayed cerebral ischemia (DCI) is a common, disabling complication of subarachnoid hemorrhage (SAH). Preventing DCI is a key focus of neurocritical care, but interventions carry risk and cannot be applied indiscriminately. Although retrospective studies have identified continuous electroencephalographic (cEEG) measures associated with DCI, no study has characterized the accuracy of cEEG with sufficient rigor to justify using it to triage patients to interventions or clinical trials. We therefore prospectively assessed the accuracy of cEEG for predicting DCI, following the Standards for Reporting Diagnostic Accuracy Studies. We prospectively performed cEEG in nontraumatic, high-grade SAH patients at a single institution. The index test consisted of clinical neurophysiologists prospectively reporting prespecified EEG alarms: (1) decreasing relative alpha variability, (2) decreasing alpha-delta ratio, (3) worsening focal slowing, or (4) late appearing epileptiform abnormalities. The diagnostic reference standard was DCI determined by blinded, adjudicated review. Primary outcome measures were sensitivity and specificity of cEEG for subsequent DCI, determined by multistate survival analysis, adjusted for baseline risk. One hundred three of 227 consecutive patients were eligible and underwent cEEG monitoring (7.7-day mean duration). EEG alarms occurred in 96.2% of patients with and 19.6% without subsequent DCI (1.9-day median latency, interquartile range = 0.9-4.1). Among alarm subtypes, late onset epileptiform abnormalities had the highest predictive value. Prespecified EEG findings predicted DCI among patients with low (91% sensitivity, 83% specificity) and high (95% sensitivity, 77% specificity) baseline risk. cEEG accurately predicts DCI following SAH and may help target therapies to patients at highest risk of secondary brain injury. Ann Neurol 2018. © 2018 American Neurological Association.
Improved mathematical and computational tools for modeling photon propagation in tissue
NASA Astrophysics Data System (ADS)
Calabro, Katherine Weaver
Light interacts with biological tissue through two predominant mechanisms: scattering and absorption, which are sensitive to the size and density of cellular organelles, and to biochemical composition (e.g., hemoglobin), respectively. During the progression of disease, tissues undergo a predictable set of changes in cell morphology and vascularization, which directly affect their scattering and absorption properties. Hence, quantification of these optical property differences can be used to identify the physiological biomarkers of disease, with interest often focused on cancer. Diffuse reflectance spectroscopy is a diagnostic tool, wherein broadband visible light is transmitted through a fiber optic probe into a turbid medium, and after propagating through the sample, a fraction of the light is collected at the surface as reflectance. The measured reflectance spectrum can be analyzed with appropriate mathematical models to extract the optical properties of the tissue, and from these, a set of physiological properties. A number of models have been developed for this purpose using a variety of approaches -- from diffusion theory to computational simulations and empirical observations. However, these models are generally limited to narrow ranges of tissue and probe geometries. In this thesis, reflectance models were developed for a much wider range of measurement parameters, and influences such as the scattering phase function and probe design were investigated rigorously for the first time. The results provide a comprehensive understanding of the factors that influence reflectance, with novel insights that, in some cases, challenge current assumptions in the field. An improved Monte Carlo simulation program, designed to run on a graphics processing unit (GPU), was built to simulate the data used in the development of the reflectance models. Rigorous error analysis was performed to identify how inaccuracies in modeling assumptions can be expected to affect the accuracy of optical property values extracted from experimentally acquired reflectance spectra. From this analysis, probe geometries that offer the best robustness against error in the estimation of physiological properties from tissue are presented. Finally, several in vivo studies demonstrating the use of reflectance spectroscopy for both research and clinical applications are presented.
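For readers unfamiliar with the underlying simulation technique, a single Monte Carlo photon step in a homogeneous turbid medium can be sketched as follows. This is a generic, illustrative step (exponential path-length sampling, implicit absorption weighting, Henyey-Greenstein scattering), not the GPU implementation described in the thesis; all optical property values are assumptions.

```python
# One illustrative Monte Carlo scattering step for photon transport in a
# homogeneous turbid medium. mu_a and mu_s are absorption and scattering
# coefficients (1/mm); g is the Henyey-Greenstein anisotropy factor.
import numpy as np

rng = np.random.default_rng(0)
mu_a, mu_s, g = 0.01, 10.0, 0.9      # assumed tissue-like values
mu_t = mu_a + mu_s

def propagate_one_step(weight: float):
    """Return (step_length, new_weight, cos_theta) for one scattering event."""
    step = -np.log(rng.random()) / mu_t          # exponential free path
    weight *= mu_s / mu_t                        # implicit absorption weighting
    # Henyey-Greenstein sampling of the deflection angle cosine
    if g == 0:
        cos_theta = 2 * rng.random() - 1
    else:
        tmp = (1 - g**2) / (1 - g + 2 * g * rng.random())
        cos_theta = (1 + g**2 - tmp**2) / (2 * g)
    return step, weight, cos_theta

print(propagate_one_step(weight=1.0))
```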
Student-Level Analysis of Year 1 (2003-2004) Achievement Outcomes for Tennessee Charter Schools
ERIC Educational Resources Information Center
Ross, Steven M.; McDonald, Aaron J.; Gallagher, Brenda McSparrin
2005-01-01
This report presents student-level achievement results for the four charter schools that began operation in Tennessee during the 2003-04 academic year. To conduct a rigorous and valid analysis of student achievement outcomes at these schools, we employed a matched program-control design at the student level, whereby each charter school student was…
Chase: Control of Heterogeneous Autonomous Sensors for Situational Awareness
2016-08-03
remained the discovery and analysis of new foundational methodology for information collection and fusion that exercises rigorous feedback control over ... simultaneously achieve quantified information and physical objectives. In the general area of novel stochastic systems analysis it seems appropriate to mention the pioneering work on non-Bayesian distributed learning.
ERIC Educational Resources Information Center
Wall, Kate; Higgins, Steve; Remedios, Richard; Rafferty, Victoria; Tiplady, Lucy
2013-01-01
A key challenge of visual methodology is how to combine large-scale qualitative data sets with epistemologically acceptable and rigorous analysis techniques. The authors argue that a pragmatic approach drawing on ideas from mixed methods is helpful to open up the full potential of visual data. However, before one starts to "mix" the…
Intratheater Airlift Functional Needs Analysis (FNA)
2011-01-01
All RAND monographs undergo rigorous peer review to ensure high standards for research quality and ... all operating environments. The FNA assesses the ability of current assets to
ERIC Educational Resources Information Center
Barton, Erin E.; Pustejovsky, James E.; Maggin, Daniel M.; Reichow, Brian
2017-01-01
The adoption of methods and strategies validated through rigorous, experimentally oriented research is a core professional value of special education. We conducted a systematic review and meta-analysis examining the experimental literature on Technology-Aided Instruction and Intervention (TAII) using research identified as part of the National…
What Does Research Tell Us about Trends in Dissertations on PBL?
ERIC Educational Resources Information Center
Erdogan, Tolga
2017-01-01
The aim of this study is to investigate the research trends in dissertations on PBL from 2002 to 2015 in Turkey. For this purpose, the master's and doctorate dissertations in the National Thesis Database of Council of Higher Education (CoHE) were selected for rigorous content analysis. The analysis was utilized to classify the type of study, the…
ERIC Educational Resources Information Center
Gibbard, Deborah; Coglan, Louisa; MacDonald, John
2004-01-01
Background: Parents and professionals can both play a role in improving children's expressive language development and a number of alternative models of delivery exist that involve different levels of input by these two groups. However, these alternative treatments have not been subject to rigorous comparative analysis in terms of both cost and…
A simple method for plasma total vitamin C analysis suitable for routine clinical laboratory use.
Robitaille, Line; Hoffer, L John
2016-04-21
In-hospital hypovitaminosis C is highly prevalent but almost completely unrecognized. Medical awareness of this potentially important disorder is hindered by the inability of most hospital laboratories to determine plasma vitamin C concentrations. The availability of a simple, reliable method for analyzing plasma vitamin C could increase opportunities for routine plasma vitamin C analysis in clinical medicine. Plasma vitamin C can be analyzed by high performance liquid chromatography (HPLC) with electrochemical (EC) or ultraviolet (UV) light detection. We modified existing UV-HPLC methods for plasma total vitamin C analysis (the sum of ascorbic and dehydroascorbic acid) to develop a simple, constant-low-pH sample reduction procedure followed by isocratic reverse-phase HPLC separation using a purely aqueous low-pH non-buffered mobile phase. Although EC-HPLC is widely recommended over UV-HPLC for plasma total vitamin C analysis, the two methods have never been directly compared. We formally compared the simplified UV-HPLC method with EC-HPLC in 80 consecutive clinical samples. The simplified UV-HPLC method was less expensive, easier to set up, required fewer reagents and no pH adjustments, and demonstrated greater sample stability than many existing methods for plasma vitamin C analysis. When compared with the gold-standard EC-HPLC method in 80 consecutive clinical samples exhibiting a wide range of plasma vitamin C concentrations, it performed equivalently. The easy set up, simplicity and sensitivity of the plasma vitamin C analysis method described here could make it practical in a normally equipped hospital laboratory. Unlike any prior UV-HPLC method for plasma total vitamin C analysis, it was rigorously compared with the gold-standard EC-HPLC method and performed equivalently. Adoption of this method could increase the availability of plasma vitamin C analysis in clinical medicine.
Efficient Integrative Multi-SNP Association Analysis via Deterministic Approximation of Posteriors.
Wen, Xiaoquan; Lee, Yeji; Luca, Francesca; Pique-Regi, Roger
2016-06-02
With the increasing availability of functional genomic data, incorporating genomic annotations into genetic association analysis has become a standard procedure. However, the existing methods often lack rigor and/or computational efficiency and consequently do not maximize the utility of functional annotations. In this paper, we propose a rigorous inference procedure to perform integrative association analysis incorporating genomic annotations for both traditional GWASs and emerging molecular QTL mapping studies. In particular, we propose an algorithm, named deterministic approximation of posteriors (DAP), which enables highly efficient and accurate joint enrichment analysis and identification of multiple causal variants. We use a series of simulation studies to highlight the power and computational efficiency of our proposed approach and further demonstrate it by analyzing the cross-population eQTL data from the GEUVADIS project and the multi-tissue eQTL data from the GTEx project. In particular, we find that genetic variants predicted to disrupt transcription factor binding sites are enriched in cis-eQTLs across all tissues. Moreover, the enrichment estimates obtained across the tissues are correlated with the cell types for which the annotations are derived. Copyright © 2016 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
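The integrative step, combining annotation-informed priors with per-model evidence, can be illustrated with a toy posterior normalization. This generic sketch is not the DAP algorithm itself; the Bayes factors, priors, and the TF-binding annotation example below are invented for illustration.

```python
# Toy illustration of combining annotation-informed priors with Bayes factors
# to rank candidate causal variants. Generic normalization only, not DAP.
import numpy as np

def posterior_probs(log10_bayes_factors, prior_probs):
    """Posterior probability of each candidate model, proportional to prior * BF."""
    log10_bf = np.asarray(log10_bayes_factors, dtype=float)
    prior = np.asarray(prior_probs, dtype=float)
    unnorm = prior * 10.0 ** (log10_bf - log10_bf.max())   # rescaled for stability
    return unnorm / unnorm.sum()

# Hypothetical SNPs: the second carries a TF-binding-disruption annotation,
# so it receives a higher prior probability of being causal.
print(posterior_probs([2.1, 2.4, 0.3], [0.01, 0.05, 0.01]))
```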
The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System
ERIC Educational Resources Information Center
Mixon, Jason; Stuart, Jerry
2009-01-01
In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominant vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…
'Am I being over-sensitive?' Women's experience of sexual harassment during medical training.
Hinze, Susan W
2004-01-01
Despite larger numbers of women in medicine and strong statements against gender discrimination in written policies and the medical literature, sexual harassment persists in medical training. This study examines the everyday lives of women and men resident physicians to understand the context within which harassment unfolds. The narratives explored here reveal how attention is deflected from the problem of sexual harassment through a focus on women's 'sensitivity'. Women resist by refusing to name sexual harassment as problematic, and by defining sexual harassment as 'small stuff' in the context of a rigorous training program. Ultimately, both tactics of resistance fail. Closer examination of the relations shaping everyday actions is key, as is viewing the rigid hierarchy of authority and power in medical training through a gender lens. I conclude with a discussion of how reforms in medical education must tend to the gendered, everyday realities of women and men in training.
Perceived benefits of study abroad programs for nursing students: an integrative review.
Kelleher, Seán
2013-12-01
Study abroad programs that offer health care experiences in another country have become an important method in nursing education to increase students' understanding of cultural competence and intercultural sensitivity and to present them with new ideas and opportunities for personal and career development. Despite the many alleged positive attributes associated with such programs, a gap exists in the overall understanding of the benefits obtained by undergraduate nursing students who study abroad. Using Cooper's framework, 13 studies that explored the benefits of study abroad programs for undergraduate nursing students were reviewed. Findings suggest that participation in a study abroad experience is associated with many benefits for nursing students, including various forms of personal and professional growth, cultural sensitivity and competence, and cognitive development. Although research outcomes are encouraging, the nursing literature regarding this topic is limited, and more rigorous research studies are needed to support this educational practice.
Thermodynamic model effects on the design and optimization of natural gas plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diaz, S.; Zabaloy, M.; Brignole, E.A.
1999-07-01
The design and optimization of natural gas plants is carried out on the basis of process simulators. The physical property package is generally based on cubic equations of state. Phase equilibrium conditions, thermodynamic functions, equilibrium phase separations, work and heat are computed by rigorous thermodynamics. The aim of this work is to analyze the NGL turboexpansion process and identify the process computations that are most sensitive to the accuracy of model predictions. Three equations of state (PR, SRK and the Peneloux modification) are used to study the effect of property predictions on process calculations and plant optimization. It is shown that turboexpander plants have moderate sensitivity with respect to phase equilibrium computations, but higher accuracy is required for the prediction of enthalpy and turboexpansion work. The effect of modeling CO2 solubility is also critical in mixtures with high CO2 content in the feed.
When Assessment Data Are Words: Validity Evidence for Qualitative Educational Assessments.
Cook, David A; Kuper, Ayelet; Hatala, Rose; Ginsburg, Shiphra
2016-10-01
Quantitative scores fail to capture all important features of learner performance. This awareness has led to increased use of qualitative data when assessing health professionals. Yet the use of qualitative assessments is hampered by incomplete understanding of their role in forming judgments, and lack of consensus in how to appraise the rigor of judgments derived therein. The authors articulate the role of qualitative assessment as part of a comprehensive program of assessment, and translate the concept of validity to apply to judgments arising from qualitative assessments. They first identify standards for rigor in qualitative research, and then use two contemporary assessment validity frameworks to reorganize these standards for application to qualitative assessment. Standards for rigor in qualitative research include responsiveness, reflexivity, purposive sampling, thick description, triangulation, transparency, and transferability. These standards can be reframed using Messick's five sources of validity evidence (content, response process, internal structure, relationships with other variables, and consequences) and Kane's four inferences in validation (scoring, generalization, extrapolation, and implications). Evidence can be collected and evaluated for each evidence source or inference. The authors illustrate this approach using published research on learning portfolios. The authors advocate a "methods-neutral" approach to assessment, in which a clearly stated purpose determines the nature of and approach to data collection and analysis. Increased use of qualitative assessments will necessitate more rigorous judgments of the defensibility (validity) of inferences and decisions. Evidence should be strategically sought to inform a coherent validity argument.
NASA Astrophysics Data System (ADS)
Šprlák, M.; Han, S.-C.; Featherstone, W. E.
2017-12-01
Rigorous modelling of the spherical gravitational potential spectra from the volumetric density and geometry of an attracting body is discussed. Firstly, we derive mathematical formulas for the spatial analysis of spherical harmonic coefficients. Secondly, we present a numerically efficient algorithm for rigorous forward modelling. We consider the finite-amplitude topographic modelling methods as special cases, with additional postulates on the volumetric density and geometry. Thirdly, we implement our algorithm in the form of computer programs and test their correctness with respect to the finite-amplitude topography routines. For this purpose, synthetic and realistic numerical experiments, applied to the gravitational field and geometry of the Moon, are performed. We also investigate the optimal choice of input parameters for the finite-amplitude modelling methods. Fourthly, we exploit the rigorous forward modelling for the determination of the spherical gravitational potential spectra inferred by lunar crustal models with uniform, laterally variable, radially variable, and spatially (3D) variable bulk density. Also, we analyse these four different crustal models in terms of their spectral characteristics and band-limited radial gravitation. We demonstrate the applicability of the rigorous forward modelling using currently available computational resources up to degree and order 2519 of the spherical harmonic expansion, which corresponds to a resolution of 2.2 km on the surface of the Moon. Computer codes, a user manual and scripts developed for the purposes of this study are publicly available to potential users.
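For reference, the spherical gravitational potential spectra discussed above correspond to the standard exterior spherical harmonic expansion of the potential, written here in a conventional fully normalized form; the symbols follow common geodetic usage and are not taken from the paper's own notation.

```latex
% Standard exterior spherical harmonic expansion of the gravitational potential;
% R is a reference radius and \bar{C}_{nm}, \bar{S}_{nm} are the fully
% normalized potential coefficients (the "spectra" referred to above).
V(r,\varphi,\lambda) = \frac{GM}{r} \sum_{n=0}^{n_{\max}}
  \left(\frac{R}{r}\right)^{n} \sum_{m=0}^{n}
  \left(\bar{C}_{nm}\cos m\lambda + \bar{S}_{nm}\sin m\lambda\right)
  \bar{P}_{nm}(\sin\varphi)
```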
DESIGN AND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES
Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...
Flexitime's Potential for Management.
ERIC Educational Resources Information Center
Bernard, Keith E.
1979-01-01
Firm size, employee characteristics, and structure and type of product or service generated are all factors that must be considered and analyzed in dealing with any particular employment problem. Unfortunately, this type of rigorous analysis is not evident in the material surveyed in this report. (Author)
Utility of distributed hydrologic and water quality models for watershed management and sustainability studies should be accompanied by rigorous model uncertainty analysis. However, the use of complex watershed models primarily follows the traditional {calibrate/validate/predict}...
NASA Astrophysics Data System (ADS)
McKinney, D. C.; Cuellar, A. D.
2015-12-01
Climate change has accelerated glacial retreat in high-altitude glaciated regions of Nepal, leading to the growth and formation of glacier lakes. Glacial lake outburst floods (GLOFs) are sudden events triggered by an earthquake, moraine failure or other shock that causes a sudden outflow of water. These floods are catastrophic because of their sudden onset, the difficulty of predicting them, and the enormous quantity of water and debris rapidly flooding downstream areas. Imja Lake in the Himalaya of Nepal has experienced accelerated growth since it first appeared in the 1960s. Communities threatened by a flood from Imja Lake have advocated for projects to adapt to the increasing threat of a GLOF. Nonetheless, discussions surrounding projects for Imja have not included a rigorous analysis of the potential consequences of a flood, the probability of an event, or the costs of mitigation projects, in part because this information is unknown or uncertain. This work presents a demonstration of a decision-making methodology developed to rationally analyze the risks posed by Imja Lake and the various proposed adaptation projects using available information. In this work, the authors use decision analysis, data envelopment analysis (DEA), and sensitivity analysis to assess proposed adaptation measures that would mitigate damage in downstream communities from a GLOF. We use an existing hydrodynamic model of the at-risk area to determine how adaptation projects will affect downstream flooding, and estimate fatalities using an empirical method developed for dam failures. The DEA methodology allows us to estimate the value of a statistical life implied by each project, given the cost of the project and the number of lives saved, to determine which project is the most efficient. In contrast, the decision analysis methodology requires fatalities to be assigned a cost but allows the inclusion of uncertainty in the decision-making process. We compare the output of these two methodologies and determine the sensitivity of the conclusions to changes in uncertain input parameters, including project cost, value of a statistical life, and time to a GLOF event.
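At its simplest, the DEA-style comparison of projects reduces to the implied cost per statistical life saved. A minimal sketch of that comparison follows; the project names, costs and lives-saved figures are hypothetical placeholders, not estimates for the Imja Lake projects.

```python
# Simple illustration of the "implied value of a statistical life" comparison
# described above, with made-up project data.

def implied_vsl(project_cost_usd: float, expected_lives_saved: float) -> float:
    """Cost per statistical life saved for a proposed adaptation project."""
    return project_cost_usd / expected_lives_saved

projects = {
    "lake lowering":        (7_500_000, 15.0),   # hypothetical cost, lives saved
    "early-warning system": (1_200_000, 10.0),
}
for name, (cost, lives) in projects.items():
    print(f"{name}: implied VSL ~ ${implied_vsl(cost, lives):,.0f}")
```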
Han, Jingjia; Qian, Ximei; Wu, Qingling; Jha, Rajneesh; Duan, Jinshuai; Yang, Zhou; Maher, Kevin O.; Nie, Shuming; Xu, Chunhui
2017-01-01
Human pluripotent stem cells (hPSCs) are a promising cell source for regenerative medicine, but their derivatives need to be rigorously evaluated for residual stem cells to prevent teratoma formation. Here, we report the development of novel surface-enhanced Raman scattering (SERS)-based assays that can detect trace numbers of undifferentiated hPSCs in mixed cell populations in a highly specific, ultra-sensitive, and time-efficient manner. By targeting stem cell surface markers SSEA-5 and TRA-1-60 individually or simultaneously, these SERS assays were able to identify as few as 1 stem cell in 10^6 cells, a sensitivity (0.0001%) which was ~2,000- to 15,000-fold higher than that of flow cytometry assays. Using the SERS assay, we demonstrate that the aggregation of hPSC-based cardiomyocyte differentiation cultures into 3D spheres significantly reduced SSEA-5+ and TRA-1-60+ cells compared with parallel 2D cultures. Thus, SERS may provide a powerful new technology for quality control of hPSC-derived products for preclinical and clinical applications. PMID:27509304
Rocklin, Gabriel J.; Mobley, David L.; Dill, Ken A.
2013-01-01
Binding free energy calculations offer a thermodynamically rigorous method to compute protein-ligand binding, and they depend on empirical force fields with hundreds of parameters. We examined the sensitivity of computed binding free energies to the ligand’s electrostatic and van der Waals parameters. Dielectric screening and cancellation of effects between ligand-protein and ligand-solvent interactions reduce the parameter sensitivity of binding affinity by 65%, compared with interaction strengths computed in the gas-phase. However, multiple changes to parameters combine additively on average, which can lead to large changes in overall affinity from many small changes to parameters. Using these results, we estimate that random, uncorrelated errors in force field nonbonded parameters must be smaller than 0.02 e per charge, 0.06 Å per radius, and 0.01 kcal/mol per well depth in order to obtain 68% (one standard deviation) confidence that a computed affinity for a moderately-sized lead compound will fall within 1 kcal/mol of the true affinity, if these are the only sources of error considered. PMID:24015114
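The additive combination of many small, uncorrelated parameter errors can be illustrated with a simple quadrature sum. The sketch below uses invented per-parameter effect sizes and is not the paper's analysis; it only shows how many small contributions accumulate into an overall one-sigma uncertainty on the computed affinity.

```python
# Back-of-the-envelope propagation of independent parameter errors into a
# binding free energy, assuming uncorrelated contributions add in quadrature.
# All per-parameter effect sizes below are hypothetical.
import math

def binding_error_sd(per_param_effects_kcal):
    """Std. dev. of the computed affinity given independent per-parameter effects."""
    return math.sqrt(sum(e * e for e in per_param_effects_kcal))

# e.g. 30 partial charges each contributing ~0.15 kcal/mol of random error,
# plus 20 radii at ~0.10 and 20 well depths at ~0.05 kcal/mol
effects = [0.15] * 30 + [0.10] * 20 + [0.05] * 20
print(f"one-sigma uncertainty ~ {binding_error_sd(effects):.2f} kcal/mol")
```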
Hunt, Matthew; Tansey, Catherine M; Anderson, James; Boulanger, Renaud F; Eckenwiler, Lisa; Pringle, John; Schwartz, Lisa
2016-01-01
Research conducted following natural disasters such as earthquakes, floods or hurricanes is crucial for improving relief interventions. Such research, however, poses ethical, methodological and logistical challenges for researchers. Oversight of disaster research also poses challenges for research ethics committees (RECs), in part due to the rapid turnaround needed to initiate research after a disaster. Currently, there is limited knowledge available about how RECs respond to and appraise disaster research. To address this knowledge gap, we investigated the experiences of REC members who had reviewed disaster research conducted in low- or middle-income countries. We used interpretive description methodology and conducted in-depth interviews with 15 respondents. Respondents were chairs, members, advisors, or coordinators from 13 RECs, including RECs affiliated with universities, governments, international organizations, a for-profit REC, and an ad hoc committee established during a disaster. Interviews were analyzed inductively using constant comparative techniques. Through this process, three elements were identified as characterizing effective and high-quality review: timeliness, responsiveness and rigorousness. To ensure timeliness, many RECs rely on adaptations of review procedures for urgent protocols. Respondents emphasized that responsive review requires awareness of and sensitivity to the particularities of disaster settings and disaster research. Rigorous review was linked with providing careful assessment of ethical considerations related to the research, as well as ensuring independence of the review process. Both the frequency of disasters and the conduct of disaster research are on the rise. Ensuring effective and high quality review of disaster research is crucial, yet challenges, including time pressures for urgent protocols, exist for achieving this goal. Adapting standard REC procedures may be necessary. However, steps should be taken to ensure that ethics review of disaster research remains diligent and thorough.
Liu, Tao; Zhu, Guanghu; He, Jianfeng; Song, Tie; Zhang, Meng; Lin, Hualiang; Xiao, Jianpeng; Zeng, Weilin; Li, Xing; Li, Zhihao; Xie, Runsheng; Zhong, Haojie; Wu, Xiaocheng; Hu, Wenbiao; Zhang, Yonghui; Ma, Wenjun
2017-08-02
Dengue fever is a severe public health challenge in south China. A dengue outbreak was reported in Chaozhou city, China in 2015, and intensified interventions were implemented by the government to control the epidemic. However, it remains unknown to what degree the intensified control measures reduced the size of the epidemic, and when such measures should be initiated to reduce the risk of large dengue outbreaks developing. We selected Xiangqiao district as the study setting because the majority of the indigenous cases (90.6%) in Chaozhou city were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI) data were reported by the local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed) model to assess the effectiveness of control interventions and to evaluate the effect of intervention timing on the dengue epidemic. A total of 1250 indigenous dengue cases were reported from Xiangqiao district. The results of SEIR modeling using BI as an indicator of actual control interventions showed a total of 1255 dengue cases, which is close to the reported number (n = 1250). The size and duration of the outbreak were highly sensitive to the intensity and timing of interventions: the more rigorous and the earlier the control interventions, the more effective they were. Even when the interventions were initiated several weeks after the onset of the dengue outbreak, they still had a substantial impact on the prevalence and duration of the outbreak. This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.
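A compartmental model of the general kind used here can be sketched with a textbook single-population SEIR system; the study's actual model additionally tracks the mosquito vector and BI-driven, time-varying control, which are omitted in this illustration. All rates and the population size below are assumptions.

```python
# Textbook SEIR sketch (single host population) to illustrate the compartmental
# approach only; not the vector-host model used in the study.
import numpy as np
from scipy.integrate import odeint

def seir(y, t, beta, sigma, gamma, N):
    S, E, I, R = y
    dS = -beta * S * I / N
    dE = beta * S * I / N - sigma * E
    dI = sigma * E - gamma * I
    dR = gamma * I
    return [dS, dE, dI, dR]

N = 500_000                          # hypothetical district population
y0 = [N - 1, 0, 1, 0]                # one initial infectious case
t = np.linspace(0, 120, 121)         # days
sol = odeint(seir, y0, t, args=(0.5, 1 / 5.9, 1 / 7.0, N))   # assumed rates
print(f"peak infectious count ~ {sol[:, 2].max():.0f}")
```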
Psychometric analysis of the Brisbane Practice Environment Measure (B-PEM).
Flint, Anndrea; Farrugia, Charles; Courtney, Mary; Webster, Joan
2010-03-01
To undertake rigorous psychometric testing of the newly developed contemporary work environment measure (the Brisbane Practice Environment Measure [B-PEM]) using exploratory factor analysis and confirmatory factor analysis. Content validity of the 33-item measure was established by a panel of experts. Initial testing involved 195 nursing staff using principal component factor analysis with varimax rotation (orthogonal) and Cronbach's alpha coefficients. Confirmatory factor analysis was conducted using data from a further 983 nursing staff. Principal component factor analysis yielded a four-factor solution with eigenvalues greater than 1 that explained 52.53% of the variance. These factors were then verified using confirmatory factor analysis. Goodness-of-fit indices showed an acceptable fit overall with the full model, explaining 21% to 73% of the variance. Deletion of items took place throughout the evolution of the instrument, resulting in a 26-item, four-factor measure called the Brisbane Practice Environment Measure-Tested. The B-PEM has undergone rigorous psychometric testing, providing evidence of internal consistency and goodness-of-fit indices within acceptable ranges. The measure can be utilised as a subscale or total score reflective of a contemporary nursing work environment. An up-to-date instrument to measure practice environment may be useful for nursing leaders to monitor the workplace and to assist in identifying areas for improvement, facilitating greater job satisfaction and retention.
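The internal-consistency figures reported above are Cronbach's alpha coefficients, which can be computed directly from an item-response matrix. The sketch below uses random placeholder data rather than actual B-PEM responses.

```python
# Minimal Cronbach's alpha computation of the kind reported above.
# The data matrix is random placeholder data (respondents x items), not B-PEM data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
fake_responses = rng.integers(1, 6, size=(195, 26))   # 26-item, 5-point scale
print(f"alpha = {cronbach_alpha(fake_responses):.2f}")
```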
Chang, Chih-Cheng; Su, Jian-An; Tsai, Ching-Shu; Yen, Cheng-Fang; Liu, Jiun-Horng; Lin, Chung-Ying
2015-06-01
To examine the psychometrics of the Affiliate Stigma Scale using rigorous psychometric analysis: classical test theory (CTT) (traditional) and Rasch analysis (modern). Differential item functioning (DIF) items were also tested using Rasch analysis. Caregivers of relatives with mental illness (n = 453; mean age: 53.29 ± 13.50 years) were recruited from southern Taiwan. Each participant filled out four questionnaires: Affiliate Stigma Scale, Rosenberg Self-Esteem Scale, Beck Anxiety Inventory, and one background information sheet. CTT analyses showed that the Affiliate Stigma Scale had satisfactory internal consistency (α = 0.85-0.94) and concurrent validity (Rosenberg Self-Esteem Scale: r = -0.52 to -0.46; Beck Anxiety Inventory: r = 0.27-0.34). Rasch analyses supported the unidimensionality of three domains in the Affiliate Stigma Scale and indicated four DIF items (affect domain: 1; cognitive domain: 3) across gender. Our findings, based on rigorous statistical analysis, verified the psychometrics of the Affiliate Stigma Scale and reported its DIF items. We conclude that the three domains of the Affiliate Stigma Scale can be separately used and are suitable for measuring the affiliate stigma of caregivers of relatives with mental illness. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Egbert, Gary D.
2001-01-01
A numerical ocean tide model has been developed and tested using highly accurate TOPEX/Poseidon (T/P) tidal solutions. The hydrodynamic model is based on time stepping a finite difference approximation to the non-linear shallow water equations. Two novel features of our implementation are a rigorous treatment of self-attraction and loading (SAL), and a physically based parameterization for internal tide (IT) radiation drag. The model was run for a range of grid resolutions, and with variations in model parameters and bathymetry. For a rational treatment of SAL and IT drag, the model run at high resolution (1/12 degree) fits the T/P solutions to within 5 cm RMS in the open ocean. Both the rigorous SAL treatment and the IT drag parameterization are required to obtain solutions of this quality. The sensitivity of the solution to perturbations in bathymetry suggests that the fit to T/P is probably now limited by errors in this critical input. Since the model is not constrained by any data, we can test the effect of dropping sea-level to match estimated bathymetry from the last glacial maximum (LGM). Our results suggest that the 100 m drop in sea-level in the LGM would have significantly increased tidal amplitudes in the North Atlantic, and increased overall tidal dissipation by about 40%. However, details in tidal solutions for the past 20 ka are sensitive to the assumed stratification. IT drag accounts for a significant fraction of dissipation, especially in the LGM when large areas of present day shallow sea were exposed, and this parameter is poorly constrained at present.
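For readers unfamiliar with the time-stepped shallow-water core mentioned in the abstract, a minimal one-dimensional sketch is given below; it uses a forward-backward update on an idealized closed basin and omits everything that makes the actual model rigorous (self-attraction and loading, internal-tide drag, realistic bathymetry and tidal forcing). All grid and depth values are invented for illustration.

```python
import numpy as np

# 1D linearized shallow-water sketch on a staggered grid (illustrative only).
g, H = 9.81, 4000.0              # gravity (m/s^2), mean ocean depth (m)
nx, dx = 200, 50e3               # grid points, spacing (m)
dt = 0.5 * dx / np.sqrt(g * H)   # CFL-limited time step
eta = np.exp(-((np.arange(nx) - nx / 2) * dx / 3e5) ** 2)  # initial surface bump (m)
u = np.zeros(nx + 1)             # velocity on cell faces; u[0] = u[-1] = 0 (closed basin)

for _ in range(500):
    # Forward-backward update: momentum du/dt = -g d(eta)/dx, then continuity d(eta)/dt = -H du/dx.
    u[1:-1] -= dt * g * (eta[1:] - eta[:-1]) / dx
    eta -= dt * H * (u[1:] - u[:-1]) / dx

print("max surface elevation after integration:", float(eta.max()))
```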
David Crighton, 1942-2000: A commentary on his career and his influence on aeroacoustic theory
NASA Astrophysics Data System (ADS)
Ffowcs Williams, John E.
David Crighton, a greatly admired figure in fluid mechanics, Head of the Department of Applied Mathematics and Theoretical Physics at Cambridge, and Master of Jesus College, Cambridge, died at the peak of his career. He had made important contributions to the theory of waves generated by unsteady flow. Crighton's work was always characterized by the application of rigorous mathematical approximations to fluid mechanical idealizations of practically relevant problems. At the time of his death, he was certainly the most influential British applied mathematical figure, and his former collaborators and students form a strong school that continues his special style of mathematical application. Rigorous analysis of well-posed aeroacoustical problems was transformed by David Crighton.
Engineering education as a complex system
NASA Astrophysics Data System (ADS)
Gattie, David K.; Kellam, Nadia N.; Schramski, John R.; Walther, Joachim
2011-12-01
This paper presents a theoretical basis for cultivating engineering education as a complex system that will prepare students to think critically and make decisions with regard to poorly understood, ill-structured issues. Integral to this theoretical basis is a solution space construct developed and presented as a benchmark for evaluating problem-solving orientations that emerge within students' thinking as they progress through an engineering curriculum. It is proposed that the traditional engineering education model, while analytically rigorous, is characterised by properties that, although necessary, are insufficient for preparing students to address complex issues of the twenty-first century. A Synthesis and Design Studio model for engineering education is proposed, which maintains the necessary rigor of analysis within a uniquely complex yet sufficiently structured learning environment.
Image-Based Macro-Micro Finite Element Models of a Canine Femur with Implant Design Implications
NASA Astrophysics Data System (ADS)
Ghosh, Somnath; Krishnan, Ganapathi; Dyce, Jonathan
2006-06-01
In this paper, a comprehensive model of a bone-cement-implant assembly is developed for a canine cemented femoral prosthesis system. Various steps in this development entail profiling the canine femur contours by computed tomography (CT) scanning, computer aided design (CAD) reconstruction of the canine femur from CT images, CAD modeling of the implant from implant blueprints and CAD modeling of the interface cement. Finite element analysis of the macroscopic assembly is conducted for stress analysis in individual components of the system, accounting for variation in density and material properties in the porous bone material. A sensitivity analysis is conducted with the macroscopic model to investigate the effect of implant design variables on the stress distribution in the assembly. Subsequently, rigorous microstructural analysis of the bone incorporating the morphological intricacies is conducted. Various steps in this development include acquisition of the bone microstructural data from histological serial sectioning, stacking of sections to obtain 3D renderings of void distributions, microstructural characterization and determination of properties and, finally, microstructural stress analysis using a 3D Voronoi cell finite element method. Generation of the simulated microstructure and analysis by the 3D Voronoi cell finite element model provides a new way of modeling complex microstructures and correlating to morphological characteristics. An inverse calculation of the material parameters of bone by combining macroscopic experiments with microstructural characterization and analysis provides a new approach to evaluating properties without having to do experiments at this scale. Finally, the microstructural stresses in the femur are computed using the 3D VCFEM to study the stress distribution at the scale of the bone porosity. A significant difference is observed between the macroscopic stresses and the peak microscopic stresses at different locations.
Krompecher, T; Fryc, O
1978-01-01
The use of new methods and an appropriate apparatus has allowed us to make successive measurements of rigor mortis and a study of its evolution in the rat. By a comparative examination of the front and hind limbs, we have determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase, rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time, and this state is maintained for 2 hours. (4) Resolution of rigor mortis is accelerated in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.
Onset of rigor mortis is earlier in red muscle than in white muscle.
Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H
2000-01-01
Rigor mortis is thought to be related to falling ATP levels in muscles postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles in liquid paraffin kept at 37 degrees C or 25 degrees C--two red muscles, red gastrocnemius (RG) and soleus (SO) and one white muscle, white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG both at 37 degrees C and at 25 degrees C even though RG and WG were portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.
Schmutz, Joel A.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.
2009-01-01
Stochastic variation in survival rates is expected to decrease long-term population growth rates. This expectation influences both life-history theory and the conservation of species. From this expectation, Pfister (1998) developed the important life-history prediction that natural selection will have minimized variability in those elements of the annual life cycle (such as adult survival rate) with high sensitivity. This prediction has not been rigorously evaluated for bird populations, in part due to statistical difficulties related to variance estimation. I here overcome these difficulties, and in an analysis of 62 populations, I confirm her prediction by showing a negative relationship between the proportional sensitivity (elasticity) of adult survival and the proportional variance (CV) of adult survival. However, several species deviated significantly from this expectation, with more process variance in survival than predicted. For instance, projecting the magnitude of process variance in annual survival for American redstarts (Setophaga ruticilla) over 25 years resulted in a 44% decline in abundance without assuming any change in mean survival rate. For most of these species with high process variance, recent changes in harvest, habitats, or climate patterns are the likely sources of the environmental variability causing this variability in survival. Because of climate change, environmental variability is increasing on regional and global scales, which is expected to increase stochasticity in the vital rates of species. Increased stochasticity in survival will depress population growth rates, and this result will magnify the conservation challenges we face.
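The elasticity quantity at the heart of Pfister's prediction can be illustrated with a toy two-stage matrix model; the vital rates below are invented for the sketch and are not estimates from any of the 62 populations.

```python
import numpy as np

# Hypothetical two-stage (juvenile, adult) projection matrix; fecundity and
# survival values are illustrative only.
f, s_juv, s_ad = 1.2, 0.3, 0.8
A = np.array([[f * s_juv, f * s_ad],
              [s_juv,     s_ad    ]])

# Dominant eigenvalue (lambda), right eigenvector w (stable stage structure)
# and left eigenvector v (reproductive values).
evals, W = np.linalg.eig(A)
i = np.argmax(evals.real)
lam, w = evals.real[i], np.abs(W[:, i].real)

evals_l, V = np.linalg.eig(A.T)
j = np.argmax(evals_l.real)
v = np.abs(V[:, j].real)

# Standard matrix-model formulas: sensitivity s_ij = v_i * w_j / <v, w>,
# elasticity e_ij = (a_ij / lambda) * s_ij.
S = np.outer(v, w) / (v @ w)
E = (A / lam) * S
print(f"lambda = {lam:.3f}")
print(f"elasticity of adult survival = {E[1, 1]:.3f}")
```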
Ultrafast NMR diffusion measurements exploiting chirp spin echoes.
Ahola, Susanna; Mankinen, Otto; Telkki, Ville-Veikko
2017-04-01
Standard diffusion NMR measurements require the repetition of the experiment multiple times with varying gradient strength or diffusion delay. This makes the experiment time-consuming and restricts the use of hyperpolarized substances to boost sensitivity. We propose a novel single-scan diffusion experiment, which is based on spatial encoding of two-dimensional data, employing the spin-echoes created by two successive adiabatic frequency-swept chirp π pulses. The experiment is called ultrafast pulsed-field-gradient spin-echo (UF-PGSE). We present a rigorous derivation of the echo amplitude in the UF-PGSE experiment, justifying the theoretical basis of the method. The theory also reveals that the standard analysis of experimental data leads to a diffusion coefficient value overestimated by a few per cent. Although the overestimation is of the order of experimental error and thus insignificant in many practical applications, we propose that it can be compensated by a bipolar gradient version of the experiment, UF-BP-PGSE, or by a corresponding stimulated-echo experiment, the UF-BP pulsed-field-gradient stimulated-echo. The latter also removes the effect of uniform background gradients. The experiments offer significant prospects for monitoring fast processes in real time as well as for increasing the sensitivity of experiments by several orders of magnitude by nuclear spin hyperpolarization. Furthermore, they can be applied as basic blocks in various ultrafast multidimensional Laplace NMR experiments. Copyright © 2016 John Wiley & Sons, Ltd.
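The "standard analysis" that the abstract says slightly overestimates D is the usual Stejskal-Tanner fit; a minimal sketch of that conventional fit is shown below. The gradient settings and noise level are invented, and the ultrafast spatial encoding and bipolar correction proposed in the paper are not modeled.

```python
import numpy as np

# Stejskal-Tanner attenuation: E = exp(-b * D), b = (gamma * g * delta)^2 * (DELTA - delta / 3).
gamma = 2.675e8                 # 1H gyromagnetic ratio, rad s^-1 T^-1
delta, DELTA = 2e-3, 50e-3      # gradient pulse length and diffusion delay, s (illustrative)
g = np.linspace(0.01, 0.5, 10)  # gradient strengths, T/m (illustrative)
b = (gamma * g * delta) ** 2 * (DELTA - delta / 3)

D_true = 2.3e-9                 # m^2/s, roughly water at room temperature
rng = np.random.default_rng(1)
E = np.exp(-b * D_true) * (1 + 0.01 * rng.normal(size=b.size))  # noisy echo attenuation

# Conventional analysis: linear fit of ln(E) against b; the slope is -D.
D_fit = -np.polyfit(b, np.log(E), 1)[0]
print(f"fitted D = {D_fit:.3e} m^2/s")
```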
A Primer on Health Economic Evaluations in Thoracic Oncology.
Whittington, Melanie D; Atherly, Adam J; Bocsi, Gregary T; Camidge, D Ross
2016-08-01
There is growing interest in economic evaluation in oncology to illustrate the value of multiple new diagnostic and therapeutic interventions. As these analyses have started to move from specialist publications into mainstream medical literature, the wider medical audience consuming this information may need additional education to evaluate it appropriately. Here we review standard practices in economic evaluation, illustrating the different methods with thoracic oncology examples where possible. When interpreting and conducting health economic studies, it is important to appraise the method, perspective, time horizon, modeling technique, discount rate, and sensitivity analysis. Guidance on how to do this is provided. To provide a method to evaluate this literature, a literature search was conducted in spring 2015 to identify economic evaluations published in the Journal of Thoracic Oncology. Articles were reviewed for their study design, and areas for improvement were noted. Suggested improvements include using more rigorous sensitivity analyses, adopting a standard approach to reporting results, and conducting complete economic evaluations. Researchers should design high-quality studies to ensure the validity of the results, and consumers of this research should interpret these studies critically on the basis of a full understanding of the methodologies used before considering any of the conclusions. As advancements occur on both the research and consumer sides, this literature can be further developed to promote the best use of resources for this field. Copyright © 2016 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.
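As a hedged illustration of two of the elements listed above (incremental cost-effectiveness and a one-way sensitivity analysis), the sketch below uses entirely hypothetical costs and QALYs; it is not drawn from any study reviewed in the journal.

```python
# Hypothetical two-strategy comparison (all numbers are illustrative only).
cost_standard, qaly_standard = 40_000.0, 1.10
cost_new, qaly_new = 95_000.0, 1.65

def icer(c1, e1, c0, e0):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return (c1 - c0) / (e1 - e0)

print(f"base-case ICER: ${icer(cost_new, qaly_new, cost_standard, qaly_standard):,.0f}/QALY")

# One-way sensitivity analysis: vary the QALY gain of the new therapy over a plausible range.
for qaly in (1.3, 1.5, 1.65, 1.8, 2.0):
    print(f"QALY = {qaly:.2f}  ICER = ${icer(cost_new, qaly, cost_standard, qaly_standard):,.0f}/QALY")
```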
International Seed Testing Association List of stabilized plant names, edition 6
USDA-ARS?s Scientific Manuscript database
Seed-testing laboratories determine the quality of seed lots in national and international seed commerce. Those services most commonly requested include purity analysis, noxious-weed seed detection, and viability tests. Rigorous procedures for performing various tests on specific crops have been est...
USEPA’s Land‐Based Materials Management Exposure and Risk Assessment Tool System
It is recognized that some kinds of 'waste' materials can in fact be reused as input materials for making safe products that benefit society. RIMM (Risk-Informed Materials Management) provides an integrated data gathering and analysis capability to enable scientifically rigorous ...
ERIC Educational Resources Information Center
Schwandt, Thomas A.; Lincoln, Yvonna S.; Guba, Egon G.
2007-01-01
Among the most knotty problems faced by investigators committed to interpretive practices in disciplines and fields such as sociocultural anthropology, jurisprudence, literary criticism, historiography, feminist studies, public administration, policy analysis, planning, educational research, and evaluation are deciding whether an interpretation is…
The Academic Performance of Catholic Schools.
ERIC Educational Resources Information Center
Morris, Andrew B.
1994-01-01
Although the (British) government's "league tables" may be an inappropriate method of comparing schools' relative effectiveness, analysis of the 1992 examination results points to Catholic schools' apparent success. A summary of the limited research evidence on Catholic school effectiveness suggests that a rigorous study of their…
Anticipatory Understanding of Adversary Intent: A Signature-Based Knowledge System
2009-06-01
concept of logical positivism has been applied more recently to all human knowledge and reflected in current data fusion research, information mining...this work has been successfully translated into useful analytical tools that can provide a rigorous and quantitative basis for predictive analysis
Holistic Competence: Putting Judgements First
ERIC Educational Resources Information Center
Beckett, David
2008-01-01
Professional practice can be conceptualised holistically, and in fact during the 1990s the "Australian model" of integrated or holistic competence emerged empirically. This piece outlines that story, and then develops a more rigorous conceptual analysis of what it is to make competent practical judgements, through inferences, in…
A new algorithm for construction of coarse-grained sites of large biomolecules.
Li, Min; Zhang, John Z H; Xia, Fei
2016-04-05
The development of coarse-grained (CG) models for large biomolecules remains a challenge in multiscale simulations, including a rigorous definition of CG representations for them. In this work, we proposed a new stepwise optimization imposed with the boundary-constraint (SOBC) algorithm to construct the CG sites of large biomolecules, based on the scheme of essential dynamics CG. By means of SOBC, we can rigorously derive the CG representations of biomolecules with less computational cost. The SOBC is particularly efficient for the CG definition of large systems with thousands of residues. The resulting CG sites can be parameterized as a CG model using the normal mode analysis based fluctuation matching method. Through normal mode analysis, the obtained modes of the CG model can accurately reflect the functionally related slow motions of biomolecules. The SOBC algorithm can be used for the construction of CG sites of large biomolecules such as F-actin and for the study of mechanical properties of biomaterials. © 2015 Wiley Periodicals, Inc.
Numerical proof of stability of roll waves in the small-amplitude limit for inclined thin film flow
NASA Astrophysics Data System (ADS)
Barker, Blake
2014-10-01
We present a rigorous numerical proof based on interval arithmetic computations categorizing the linearized and nonlinear stability of periodic viscous roll waves of the KdV-KS equation modeling weakly unstable flow of a thin fluid film on an incline in the small-amplitude KdV limit. The argument proceeds by verification of a stability condition derived by Bar-Nepomnyashchy and Johnson-Noble-Rodrigues-Zumbrun involving inner products of various elliptic functions arising through the KdV equation. One key point in the analysis is a bootstrap argument balancing the extremely poor sup norm bounds for these functions against the extremely good convergence properties for analytic interpolation in order to obtain a feasible computation time. Another is the way of handling analytic interpolation in several variables by a two-step process carving up the parameter space into manageable pieces for rigorous evaluation. These and other general aspects of the analysis should serve as blueprints for more general analyses of spectral stability.
Probability bounds analysis for nonlinear population ecology models.
Enszer, Joshua A; Andrei Măceș, D; Stadtherr, Mark A
2015-09-01
Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds through nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals. Copyright © 2015. Published by Elsevier Inc.
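Full probability bounds analysis propagates p-boxes rather than plain intervals, but the flavor of obtaining rigorous bounds without statistical sampling can be sketched with interval endpoint propagation through a monotone logistic growth model; the parameter intervals and step count below are invented for illustration.

```python
# Interval propagation through logistic growth N' = N + r * N * (1 - N / K).
# For the ranges chosen here the update is monotone in (N, r, K), so propagating
# the interval endpoints gives rigorous lower/upper bounds on the trajectory.
r_lo, r_hi = 0.20, 0.35    # uncertain growth rate (illustrative interval)
K_lo, K_hi = 80.0, 120.0   # uncertain carrying capacity (illustrative interval)
n_lo = n_hi = 10.0         # known initial population

for _ in range(25):
    n_lo = n_lo + r_lo * n_lo * (1 - n_lo / K_lo)
    n_hi = n_hi + r_hi * n_hi * (1 - n_hi / K_hi)

print(f"population after 25 steps lies in [{n_lo:.1f}, {n_hi:.1f}]")
```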
Warriss, P D; Brown, S N; Knowles, T G
2003-12-13
The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.
Accuracy and performance of 3D mask models in optical projection lithography
NASA Astrophysics Data System (ADS)
Agudelo, Viviana; Evanschitzky, Peter; Erdmann, Andreas; Fühner, Tim; Shao, Feng; Limmer, Steffen; Fey, Dietmar
2011-04-01
Different mask models have been compared: rigorous electromagnetic field (EMF) modeling, rigorous EMF modeling with decomposition techniques and the thin mask approach (Kirchhoff approach) to simulate optical diffraction from different mask patterns in projection systems for lithography. In addition, each rigorous model was tested for two different formulations for partially coherent imaging: The Hopkins assumption and rigorous simulation of mask diffraction orders for multiple illumination angles. The aim of this work is to closely approximate results of the rigorous EMF method by the thin mask model enhanced with pupil filtering techniques. The validity of this approach for different feature sizes, shapes and illumination conditions is investigated.
ERIC Educational Resources Information Center
Harwell, Michael
2014-01-01
Commercial data analysis software has been a fixture of quantitative analyses in education for more than three decades. Despite its apparent widespread use there is no formal evidence cataloging what software is used in educational research and educational statistics classes, by whom and for what purpose, and whether some programs should be…
ERIC Educational Resources Information Center
Conn, Katharine
2014-01-01
The aim of this dissertation is to identify effective educational interventions in Sub-Saharan Africa with an impact on student learning. This is the first meta-analysis in the field of education conducted for Sub-Saharan Africa. This paper takes an in-depth look at twelve different types of education interventions or programs and attempts to not…
NASA Astrophysics Data System (ADS)
Ngampitipan, Tritos; Boonserm, Petarpa; Chatrabhuti, Auttakit; Visser, Matt
2016-06-01
Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can successfully be generated. The generated black holes are, most commonly, Myers-Perry black holes. In this paper, we derive rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with increasing energy of the emitted particles. However, for high energy, the rigorous bounds decrease with increasing energy of the emitted particles. When the black holes spin faster, the rigorous bounds decrease. For dimension dependence, the rigorous bounds also decrease with the number of extra dimensions. Furthermore, in comparison with the approximate transmission probability, the rigorous bound is shown to be useful.
Learning from Science and Sport - How we, Safety, "Engage with Rigor"
NASA Astrophysics Data System (ADS)
Herd, A.
2012-01-01
As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should then try to remind ourselves to be open to the possibility that data, knowledge or experience from outside of the spaceflight community may provide some constructive alternate perspectives. This paper will assess aspects from two seemingly tangential fields, science and sport, and align these with the world of safety. In doing so, it offers some useful insights into the challenges we face and may suggest solutions relevant to our everyday work in safety engineering. Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two (opposing) teams: professional, accurately timed and positioned interaction for a desired outcome. These interactions, whilst an essential part of the game, are not without their constraints. The rugby scrum has constraints on the formation and engagement of the two teams. The controlled engagement provides for an interaction between the two teams in a safe manner; the constraints arise from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in academic papers presented for publication are valid, legitimate and credible. The need for rigor may be expressed in the example of achieving a statistically relevant sample size, n, in order to assure the validity of analysis of the data pool. A failure to apply rigor could place the entire study at risk of failing to have the respective paper published. This paper considers the merits of these two different aspects, scientific rigor and sporting engagement, and offers a reflective look at how they may provide a "modus operandi" for safety engineers at any level, whether at their desks (creating or reviewing safety assessments) or in a safety review meeting (providing a verbal critique of the presented safety case).
Reframing Rigor: A Modern Look at Challenge and Support in Higher Education
ERIC Educational Resources Information Center
Campbell, Corbin M.; Dortch, Deniece; Burt, Brian A.
2018-01-01
This chapter describes the limitations of the traditional notions of academic rigor in higher education, and brings forth a new form of rigor that has the potential to support student success and equity.
Theories of State Analyzing the Policy Process,
1973-11-01
values and goals -- which is the heart of the rational process -- in reality cannot be separated from the actor's empirical analysis of the situation...rigorous and objective in analysis. How different would our foreign policy actually be? Would it necessarily be better? In fact, would one even need...State, but the fact is that much of the outside research and analysis of policy process is pointed at the... As Robert Rothstein says in his valuable
When is Analysis Sufficient? A Study of how Professional Intelligence Analysts Judge Rigor
2007-05-01
investors, the marketing researcher assembling an analysis of a competitor's new products for a corporate executive, and the military analyst preparing a...previously mentioned. In all instances of analysis, the risk of shallowness is fundamental—for both the middle school student and the marketing researcher...natural gas energy policy to respond to the changing consumption of a limited resource in a dynamic energy market. The next critical facet of the
Experience and Expertise Meet in New Brand of Scholarship.
ERIC Educational Resources Information Center
Heller, Scott
1992-01-01
Increasingly, academic scholars are turning to personal and autobiographical writing as a more fulfilling form of self-expression, as illustrated by the career changes of nine women and one man. One critic finds the personal tone an evasion of politics and lacking in rigorous analysis. (MSE)
Retrospective Analysis of a Classical Biological Control Programme
USDA-ARS?s Scientific Manuscript database
1. Classical biological control has been a key technology in the management of invasive arthropod pests globally for over 120 years, yet rigorous quantitative evaluations of programme success or failure are rare. Here, I used life table and matrix model analyses, and life table response experiments ...
ERIC Educational Resources Information Center
Garcia, Stephan Ramon; Ross, William T.
2017-01-01
We hope to initiate a discussion about various methods for introducing Cauchy's Theorem. Although Cauchy's Theorem is the fundamental theorem upon which complex analysis is based, there is no "standard approach." The appropriate choice depends upon the prerequisites for the course and the level of rigor intended. Common methods include…
Quantitative phosphoproteomic analysis of caprine muscle with high and low meat quality.
Liu, Manshun; Wei, Yanchao; Li, Xin; Quek, Siew Young; Zhao, Jing; Zhong, Huazhen; Zhang, Dequan; Liu, Yongfeng
2018-07-01
During the conversion of muscle to meat, protein phosphorylation can regulate various biological processes that have important effects on meat quality. To investigate the protein phosphorylation pattern at rigor mortis, goat longissimus thoracis and external intercostal muscles were classified into two groups (high quality and low quality), and meat quality was evaluated according to meat quality attributes (Warner-Bratzler shear force, color, pH and drip loss). A quantitative mass spectrometry-based phosphoproteomic study was conducted to analyze the caprine muscle at 12 h postmortem, applying the TiO2-SIMAC-HILIC (TiSH) phosphopeptide enrichment strategy. A total of 2125 phosphopeptides were identified from 750 phosphoproteins. Among them, 96 proteins differed in phosphorylation level. The majority of these proteins are involved in glucose metabolism and muscle contraction. The differential phosphorylation levels of proteins (PFK, MYL2 and HSP27) in the two groups may be crucial factors regulating muscle rigor mortis. This study provides a comprehensive view of the phosphorylation status of caprine muscle at rigor mortis and gives a better understanding of the regulation by protein phosphorylation of various biological processes that affect the final meat quality attributes. Copyright © 2018. Published by Elsevier Ltd.
Rigorous analysis of an electric-field-driven liquid crystal lens for 3D displays
NASA Astrophysics Data System (ADS)
Kim, Bong-Sik; Lee, Seung-Chul; Park, Woo-Sang
2014-08-01
We numerically analyzed the optical performance of an electric-field-driven liquid crystal (ELC) lens adopted for 3-dimensional liquid crystal displays (3D-LCDs) through rigorous ray tracing. For the calculation, we first obtain the director distribution profile of the liquid crystals by using the Ericksen-Leslie equation of motion; then, we calculate the transmission of light through the ELC lens by using the extended Jones matrix method. The simulation was carried out for a nine-view 3D-LCD with a diagonal of 17.1 inches, where the ELC lens was slanted to achieve natural stereoscopic images. The results show that each view exists separately according to the viewing position at an optimum viewing distance of 80 cm. In addition, our simulation results provide a quantitative explanation for the ghost or blurred images between views observed from a 3D-LCD with an ELC lens. The numerical simulations are also shown to be in good agreement with the experimental results. The present simulation method is expected to provide optimum design conditions for obtaining natural 3D images by rigorously analyzing the optical functionalities of an ELC lens.
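A bare-bones, normal-incidence sketch of the Jones-matrix part of such a calculation is given below: the liquid crystal layer is sliced into thin birefringent slabs whose matrices are multiplied together. The director profile and total retardation are made up, and the oblique-incidence machinery of the extended Jones matrix method used in the paper is omitted.

```python
import numpy as np

def waveplate(phi: float, delta: float) -> np.ndarray:
    """Jones matrix of a thin birefringent slab: fast axis at angle phi, retardation delta."""
    c, s = np.cos(phi), np.sin(phi)
    R = np.array([[c, -s], [s, c]])
    W = np.array([[np.exp(-1j * delta / 2), 0], [0, np.exp(1j * delta / 2)]])
    return R @ W @ R.T

# Illustrative director twist through the cell, sliced into thin layers.
n_layers = 100
angles = np.linspace(0.0, np.pi / 2, n_layers)  # director azimuth per slice (made up)
delta_total = 2 * np.pi * 1.5                   # total retardation (made up)

J = np.eye(2, dtype=complex)
for phi in angles:
    J = waveplate(phi, delta_total / n_layers) @ J

# Transmission through crossed polarizers for x-polarized input light.
e_out = J @ np.array([1.0, 0.0])
print(f"crossed-polarizer transmission: {abs(e_out[1]) ** 2:.3f}")
```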
Rigorous Combination of GNSS and VLBI: How it Improves Earth Orientation and Reference Frames
NASA Astrophysics Data System (ADS)
Lambert, S. B.; Richard, J. Y.; Bizouard, C.; Becker, O.
2017-12-01
Current reference series (C04) of the International Earth Rotation and Reference Systems Service (IERS) are produced by a weighted combination of Earth orientation parameter (EOP) time series built up by the combination centers of each technique (VLBI, GNSS, laser ranging, DORIS). In the future, we plan to derive EOP from a rigorous combination of the normal equation systems of the four techniques. We present here the results of a rigorous combination of VLBI and GNSS pre-reduced, constraint-free normal equations with the DYNAMO geodetic analysis software package developed and maintained by the French GRGS (Groupe de Recherche en Géodésie Spatiale). The normal equations used are those produced separately by the IVS and IGS combination centers, to which we apply our own minimal constraints. We address the usefulness of such a method with respect to the classical, a posteriori, combination method, and we show whether EOP determinations are improved. In particular, we implement external validations of the EOP series based on comparison with geophysical excitation and examination of the covariance matrices. Finally, we address the potential of the technique for the next generation of celestial reference frames, which are currently determined by VLBI only.
NASA Astrophysics Data System (ADS)
Nugraheni, Z.; Budiyono, B.; Slamet, I.
2018-03-01
To reach higher-order thinking skills, students need to master conceptual understanding and strategic competence, the two basic components of higher-order thinking skills (HOTS). RMT is a unique realization of the cognitive conceptual construction approach based on Feuerstein's theory of Mediated Learning Experience (MLE) and Vygotsky's sociocultural theory. This was quasi-experimental research comparing an experimental class taught with Rigorous Mathematical Thinking (RMT) as the learning method and a control class taught with Direct Learning (DL), the conventional learning activity. The study examined whether the two learning models had different effects on the conceptual understanding and strategic competence of junior high school students. The data were analyzed using Multivariate Analysis of Variance (MANOVA), which showed a significant difference between the experimental and control classes when mathematical conceptual understanding and strategic competence were considered jointly (Wilks' Λ = 0.84). Further, independent t-tests showed a significant difference between the two classes on both mathematical conceptual understanding and strategic competence. These results indicate that Rigorous Mathematical Thinking (RMT) had a positive impact on mathematical conceptual understanding and strategic competence.
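For readers unfamiliar with the Wilks' Λ statistic quoted above, a small sketch with synthetic two-outcome data (not the study's data; group sizes, means and spreads are invented) shows how Λ is formed from the within- and between-group scatter matrices.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic scores on two outcomes (conceptual understanding, strategic competence)
# for two groups (RMT, DL); all values are placeholders.
rmt = rng.normal(loc=[75, 72], scale=8, size=(40, 2))
dl = rng.normal(loc=[68, 66], scale=8, size=(40, 2))

grand = np.vstack([rmt, dl]).mean(axis=0)

def scatter(group):
    d = group - group.mean(axis=0)
    return d.T @ d

W = scatter(rmt) + scatter(dl)                      # within-group scatter
B = sum(len(g) * np.outer(g.mean(axis=0) - grand, g.mean(axis=0) - grand)
        for g in (rmt, dl))                         # between-group scatter

wilks_lambda = np.linalg.det(W) / np.linalg.det(W + B)
print(f"Wilks' Lambda = {wilks_lambda:.3f}")        # small values indicate group separation
```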
Snodgrass, Melinda R; Chung, Moon Y; Meadan, Hedda; Halle, James W
2018-03-01
Single-case research (SCR) has been a valuable methodology in special education research. Montrose Wolf (1978), an early pioneer in single-case methodology, coined the term "social validity" to refer to the social importance of the goals selected, the acceptability of procedures employed, and the effectiveness of the outcomes produced in applied investigations. Since 1978, many contributors to SCR have included social validity as a feature of their articles and several authors have examined the prevalence and role of social validity in SCR. We systematically reviewed all SCR published in six highly-ranked special education journals from 2005 to 2016 to establish the prevalence of social validity assessments and to evaluate their scientific rigor. We found relatively low, but stable prevalence with only 28 publications addressing all three factors of the social validity construct (i.e., goals, procedures, outcomes). We conducted an in-depth analysis of the scientific rigor of these 28 publications. Social validity remains an understudied construct in SCR, and the scientific rigor of social validity assessments is often lacking. Implications and future directions are discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.
A Randomized Study of How Physicians Interpret Research Funding Disclosures
Kesselheim, Aaron S.; Robertson, Christopher T.; Myers, Jessica A.; Rose, Susannah L.; Gillet, Victoria; Ross, Kathryn M.; Glynn, Robert J.; Joffe, Steven; Avorn, Jerry
2012-01-01
BACKGROUND The effects of clinical-trial funding on the interpretation of trial results are poorly understood. We examined how such support affects physicians’ reactions to trials with a high, medium, or low level of methodologic rigor. METHODS We presented 503 board-certified internists with abstracts that we designed describing clinical trials of three hypothetical drugs. The trials had high, medium, or low methodologic rigor, and each report included one of three support disclosures: funding from a pharmaceutical company, NIH funding, or none. For both factors studied (rigor and funding), one of the three possible variations was randomly selected for inclusion in the abstracts. Follow-up questions assessed the physicians’ impressions of the trials’ rigor, their confidence in the results, and their willingness to prescribe the drugs. RESULTS The 269 respondents (53.5% response rate) perceived the level of study rigor accurately. Physicians reported that they would be less willing to prescribe drugs tested in low-rigor trials than those tested in medium-rigor trials (odds ratio, 0.64; 95% confidence interval [CI], 0.46 to 0.89; P = 0.008) and would be more willing to prescribe drugs tested in high-rigor trials than those tested in medium-rigor trials (odds ratio, 3.07; 95% CI, 2.18 to 4.32; P<0.001). Disclosure of industry funding, as compared with no disclosure of funding, led physicians to downgrade the rigor of a trial (odds ratio, 0.63; 95% CI, 0.46 to 0.87; P = 0.006), their confidence in the results (odds ratio, 0.71; 95% CI, 0.51 to 0.98; P = 0.04), and their willingness to prescribe the hypothetical drugs (odds ratio, 0.68; 95% CI, 0.49 to 0.94; P = 0.02). Physicians were half as willing to prescribe drugs studied in industry-funded trials as they were to prescribe drugs studied in NIH-funded trials (odds ratio, 0.52; 95% CI, 0.37 to 0.71; P<0.001). These effects were consistent across all levels of methodologic rigor. CONCLUSIONS Physicians discriminate among trials of varying degrees of rigor, but industry sponsorship negatively influences their perception of methodologic quality and reduces their willingness to believe and act on trial findings, independently of the trial’s quality. These effects may influence the translation of clinical research into practice. PMID:22992075
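The odds ratios and confidence intervals above come from the study's regression models; the basic arithmetic can be illustrated more simply with an invented 2x2 table of willingness-to-prescribe counts and the standard log-odds-ratio (Woolf) confidence interval.

```python
import math

# Hypothetical 2x2 table (counts are invented, not from the study):
#                  willing   not willing
# industry-funded    a = 90     b = 110
# NIH-funded         c = 130    d = 70
a, b, c, d = 90, 110, 130, 70

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI [{ci_lo:.2f}, {ci_hi:.2f}]")
```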
Nonlinear optical properties of interconnected gold nanoparticles on silicon
NASA Astrophysics Data System (ADS)
Lesuffleur, Antoine; Gogol, Philippe; Beauvillain, Pierre; Guizal, B.; Van Labeke, D.; Georges, P.
2008-12-01
We report second harmonic generation (SHG) measurements in reflectivity from chains of gold nanoparticles interconnected with metallic bridges. We measured an SHG enhancement of more than 30 times when a surface plasmon resonance was excited in the chains of nanoparticles, an enhancement influenced by coupling due to the electrical connectivity of the bridges. This enhancement was confirmed by rigorous coupled wave method calculations and came from the high localization of the electric field at the bridge. The introduction of 10% random defects into the chains of nanoparticles dropped the SHG by a factor of 2, and the response was shown to be very sensitive to the fundamental wavelength.
Effects of space flight factors on Drosophila.
Dubinin, N P; Glembotsky, Y L; Vaulina, E N; Grozdova, T Y; Kamshilova, E M; Ivaschenko, N I; Kholikova, I A; Nechitailo, G S; Mashinsky, A L; Iordanishvili, E K
1973-01-01
Drosophila melanogaster flies of strain D-32 were exposed aboard the Soyuz 10 spaceship. An insert with a nutritional medium and insects was placed in a small on-board thermostat (Biotherm II) providing a constant temperature (24 degrees C +/- 1 degree) for Drosophila development. The frequency of dominant lethals was determined in the females. Dominant, autosomal and sex-linked recessive lethals were estimated in hatching virgin males and females; the time of hatching was rigorously fixed. Sex-linked recessive lethals were related to certain stages of gametogenesis. The 1-5 oocyte stage showed an increased sensitivity to space-flight factors as regards the frequency of both dominant and recessive lethals.
NASA Technical Reports Server (NTRS)
Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Righter, K.; Hanna, R. D.; Ketcham, R. A.
2014-01-01
Providing web-based data of complex and sensitive astromaterials (including meteorites and lunar samples) in novel formats enhances existing preliminary examination data on these samples and supports targeted sample requests and analyses. We have developed and tested a rigorous protocol for collecting highly detailed imagery of meteorites and complex lunar samples in non-contaminating environments. These data are reduced to create interactive 3D models of the samples. We intend to provide these data as they are acquired on NASA's Astromaterials Acquisition and Curation website at http://curator.jsc.nasa.gov/.
Rigor or mortis: best practices for preclinical research in neuroscience.
Steward, Oswald; Balice-Gordon, Rita
2014-11-05
Numerous recent reports document a lack of reproducibility of preclinical studies, raising concerns about potential lack of rigor. Examples of lack of rigor have been extensively documented and proposals for practices to improve rigor are appearing. Here, we discuss some of the details and implications of previously proposed best practices and consider some new ones, focusing on preclinical studies relevant to human neurological and psychiatric disorders. Copyright © 2014 Elsevier Inc. All rights reserved.
Forster, B; Ropohl, D; Raule, P
1977-07-05
The manual examination of rigor mortis as currently used, with its often subjective evaluation, frequently produces highly incorrect deductions. It is therefore desirable that such inaccuracies be replaced by objective measurement of rigor mortis at the extremities. To that purpose, a method is described which can also be applied in on-the-spot investigations, and a new formula for the determination of rigor mortis indices (FRR) is introduced.
ERIC Educational Resources Information Center
Diouf, Boucar; Rioux, Pierre
1999-01-01
Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…
Designing robots for care: care centered value-sensitive design.
van Wynsberghe, Aimee
2013-06-01
The prospective robots in healthcare intended to be included within the conclave of the nurse-patient relationship--what I refer to as care robots--require rigorous ethical reflection to ensure their design and introduction do not impede the promotion of values and the dignity of patients at such a vulnerable and sensitive time in their lives. The ethical evaluation of care robots requires insight into the values at stake in the healthcare tradition. What's more, given the stage of their development and lack of standards provided by the International Organization for Standardization to guide their development, ethics ought to be included into the design process of such robots. The manner in which this may be accomplished, as presented here, uses the blueprint of the Value-sensitive design approach as a means for creating a framework tailored to care contexts. Using care values as the foundational values to be integrated into a technology and using the elements in care, from the care ethics perspective, as the normative criteria, the resulting approach may be referred to as care centered value-sensitive design. The framework proposed here allows for the ethical evaluation of care robots both retrospectively and prospectively. By evaluating care robots in this way, we may ultimately ask what kind of care we, as a society, want to provide in the future.
Causality analysis in business performance measurement system using system dynamics methodology
NASA Astrophysics Data System (ADS)
Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah
2014-07-01
One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map, with its unidirectional causality feature. Despite its apparent popularity, criticisms of this causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using an econometric analysis, the Granger causality test, on 45 data points. However, the causality models were found to be insufficiently well established, as only 40% of the causal linkages were supported by the data. Expert knowledge has been suggested for use in situations where historical data are insufficient. The Delphi method was therefore selected and conducted to obtain consensus on the existence of causality among 15 selected experts, using three rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Both methods found bidirectional causality, demonstrating significant dynamic environmental complexity through interaction among measures. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its ability to mimic reality and its robustness and validity as a platform for causality analysis. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, where very limited work has been done.
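A minimal sketch of the Granger causality step described above, using the statsmodels implementation on synthetic series of the same length as the study's 45 data points; the series, variable names and lag choice are invented for illustration.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(3)
n = 45  # same number of data points as in the study; the data themselves are synthetic
employee_satisfaction = rng.normal(size=n).cumsum()
# Customer satisfaction is built to lag employee satisfaction by one period (synthetic).
customer_satisfaction = 0.6 * np.roll(employee_satisfaction, 1) + rng.normal(scale=0.5, size=n)
customer_satisfaction[0] = customer_satisfaction[1]

# Column order: [effect, candidate cause]; the test asks whether the second column
# Granger-causes the first.
data = np.column_stack([customer_satisfaction, employee_satisfaction])
results = grangercausalitytests(data, maxlag=2)
for lag, res in results.items():
    print(f"lag {lag}: F-test p-value = {res[0]['ssr_ftest'][1]:.4f}")
```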
Rigor, vigor, and the study of health disparities
Adler, Nancy; Bush, Nicole R.; Pantell, Matthew S.
2012-01-01
Health disparities research spans multiple fields and methods and documents strong links between social disadvantage and poor health. Associations between socioeconomic status (SES) and health are often taken as evidence for the causal impact of SES on health, but alternative explanations, including the impact of health on SES, are plausible. Studies showing the influence of parents’ SES on their children’s health provide evidence for a causal pathway from SES to health, but have limitations. Health disparities researchers face tradeoffs between “rigor” and “vigor” in designing studies that demonstrate how social disadvantage becomes biologically embedded and results in poorer health. Rigorous designs aim to maximize precision in the measurement of SES and health outcomes through methods that provide the greatest control over temporal ordering and causal direction. To achieve precision, many studies use a single SES predictor and single disease. However, doing so oversimplifies the multifaceted, entwined nature of social disadvantage and may overestimate the impact of that one variable and underestimate the true impact of social disadvantage on health. In addition, SES effects on overall health and functioning are likely to be greater than effects on any one disease. Vigorous designs aim to capture this complexity and maximize ecological validity through more complete assessment of social disadvantage and health status, but may provide less-compelling evidence of causality. Newer approaches to both measurement and analysis may enable enhanced vigor as well as rigor. Incorporating both rigor and vigor into studies will provide a fuller understanding of the causes of health disparities. PMID:23045672
High blood pressure and visual sensitivity
NASA Astrophysics Data System (ADS)
Eisner, Alvin; Samples, John R.
2003-09-01
The study had two main purposes: (1) to determine whether the foveal visual sensitivities of people treated for high blood pressure (vascular hypertension) differ from the sensitivities of people who have not been diagnosed with high blood pressure and (2) to understand how visual adaptation is related to standard measures of systemic cardiovascular function. Two groups of middle-aged subjects-hypertensive and normotensive-were examined with a series of test/background stimulus combinations. All subjects met rigorous inclusion criteria for excellent ocular health. Although the visual sensitivities of the two subject groups overlapped extensively, the age-related rate of sensitivity loss was, for some measures, greater for the hypertensive subjects, possibly because of adaptation differences between the two groups. Overall, the degree of steady-state sensitivity loss resulting from an increase of background illuminance (for 580-nm backgrounds) was slightly less for the hypertensive subjects. Among normotensive subjects, the ability of a bright (3.8-log-td), long-wavelength (640-nm) adapting background to selectively suppress the flicker response of long-wavelength-sensitive (LWS) cones was related inversely to the ratio of mean arterial blood pressure to heart rate. The degree of selective suppression was also related to heart rate alone, and there was evidence that short-term changes of cardiovascular response were important. The results suggest that (1) vascular hypertension, or possibly its treatment, subtly affects visual function even in the absence of eye disease and (2) changes in blood flow affect retinal light-adaptation processes involved in the selective suppression of the flicker response from LWS cones caused by bright, long-wavelength backgrounds.
Monitoring food pathogens: Novel instrumentation for cassette PCR testing
Hunt, Darin; Figley, Curtis; Lauzon, Jana; Figley, Rachel; Pilarski, Linda M.; McMullen, Lynn M.; Pilarski, Patrick M.
2018-01-01
In this manuscript, we report the design and development of a fast, reliable instrument to run gel-based cassette polymerase chain reactions (PCR). Here termed the GelCycler Mark II, our instrument is a miniaturized molecular testing system that is fast, low cost and sensitive. Cassette PCR utilizes capillary reaction units that carry all reagents needed for PCR, including primers and Taq polymerase, except the sample, which is loaded at the time of testing. Cassette PCR carries out real time quantitative PCR followed by melt curve analysis (MCA) to verify amplicon identity at the expected melt temperature (Tm). The cassette PCR technology is well developed, particularly for detecting pathogens, and has been rigorously validated for detecting pathogenic Escherichia coli in meat samples. However, the work has been hindered by the lack of a robust and stable instrument to carry out the PCR, which requires fast and accurate temperature regulation, improved light delivery and fluorescent recording, and faster PCR reactions that maintain a high sensitivity of detection. Here, we report design and testing of a new instrument to address these shortcomings and to enable standardized testing by cassette PCR and commercial manufacture of a robust and accurate instrument that can be mass produced to deliver consistent performance. As a corollary to our new instrument development, we also report the use of an improved design approach using a machined aluminum cassette to meet the new instrument standards, prevent any light bleed across different trenches in each cassette, and allow testing of a larger number of samples for more targets in a single run. The GelCycler Mark II can detect and report E. coli contamination in 41 minutes. Sample positives are defined as having a melt curve comparable to the internal positive control, with peak height exceeding that of the internal negative control. In a fractional analysis, as little as 1 bacterium per capillary reaction unit is directly detectable, with no enrichment step, in 35 cycles of PCR/MCA, in a total time of 53 minutes, making this instrument and technology among the very best for speed and sensitivity in screening food for pathogenic contamination. PMID:29746561
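The melt curve analysis step described above identifies an amplicon by the temperature at which -dF/dT peaks; a tiny sketch with a synthetic fluorescence-versus-temperature trace (unrelated to the instrument's real data) shows that arithmetic.

```python
import numpy as np

# Synthetic melt curve: fluorescence drops sigmoidally around the amplicon Tm (illustrative).
temps = np.arange(65.0, 95.0, 0.2)
tm_true = 84.5
rng = np.random.default_rng(4)
fluorescence = 1.0 / (1.0 + np.exp((temps - tm_true) / 0.8)) + 0.005 * rng.normal(size=temps.size)

# Melt peak: the temperature at which -dF/dT is largest.
neg_dfdt = -np.gradient(fluorescence, temps)
tm_est = temps[np.argmax(neg_dfdt)]
print(f"estimated Tm = {tm_est:.1f} C")
```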
Professionalization of the Senior Chinese Officer Corps Trends and Implications
1997-01-01
The officers who retired were Ye Jianying, Nie Rongzhen, Xu Xiangqian, Wang Zhen, Song Renqiong, and Li Desheng. Of course, the political impact of...increased education level, functional specialization, and adherence to retirement norms. Li Cheng and Lynn White, in their 1993 Asian Survey article...making rigorous comparative analysis untenable. Second, Li and White do not place their results or analysis in any theoretical context.
Hollow-cylinder waveguide isolators for use at millimeter wavelengths
NASA Technical Reports Server (NTRS)
Kanda, M.; May, W. G.
1974-01-01
A semiconductor waveguide isolator is considered, consisting of a hollow column of semiconductor mounted coaxially in a circular waveguide with a longitudinal dc magnetic field. An elementary, physical analysis based on the excitation of plane waves in the guide and a more rigorous mode-matching analysis are presented. These theoretical predictions are compared with experimental results for an InSb isolator at 94 GHz and 75 K.
Analysis of small crack behavior for airframe applications
NASA Technical Reports Server (NTRS)
Mcclung, R. C.; Chan, K. S.; Hudak, S. J., Jr.; Davidson, D. L.
1994-01-01
The small fatigue crack problem is critically reviewed from the perspective of airframe applications. Different types of small cracks-microstructural, mechanical, and chemical-are carefully defined and relevant mechanisms identified. Appropriate analysis techniques, including both rigorous scientific and practical engineering treatments, are briefly described. Important materials data issues are addressed, including increased scatter in small crack data and recommended small crack test methods. Key problems requiring further study are highlighted.
Phyllis C. Adams; Glenn A. Christensen
2012-01-01
A rigorous quality assurance (QA) process assures that the data and information provided by the Forest Inventory and Analysis (FIA) program meet the highest possible standards of precision, completeness, representativeness, comparability, and accuracy. FIA relies on its analysts to check the final data quality prior to release of a State's data to the national FIA...
[Experimental study of restiffening of the rigor mortis].
Wang, X; Li, M; Liao, Z G; Yi, X F; Peng, X M
2001-11-01
To observe changes in sarcomere length in the rat during restiffening of rigor mortis. We measured the sarcomere length of the quadriceps in 40 rats under different conditions by scanning electron microscopy. The sarcomere length in rigor mortis without disruption is obviously shorter than that after restiffening. Sarcomere length is negatively correlated with the intensity of rigor mortis. Measuring sarcomere length can determine the intensity of rigor mortis and provide evidence for estimating the time since death.
A Phenomenological Analysis of Division III Student-Athletes' Transition out of College
ERIC Educational Resources Information Center
Covington, Sim Jonathan, Jr.
2017-01-01
Intercollegiate athletics is a major segment of numerous college and university communities across America today. Student-athletes participate in strenuous training and competition throughout their college years while managing to balance the rigorous academic curriculum of the higher education environment. This research aims to explore the…
Libya After Qaddafi: Lessons and Implications for the Future
2014-01-01
research findings and objective analysis that address the challenges facing the public and private sectors. All RAND reports undergo rigorous peer...The Need for Far-Reaching Security Sector Reform...World Bank, International Monetary Fund, IHS, International Foundation for Electoral Systems, Dartmouth College, the National Endowment for
Child Forensic Interviewing in Children's Advocacy Centers: Empirical Data on a Practice Model
ERIC Educational Resources Information Center
Cross, Theodore P.; Jones, Lisa M.; Walsh, Wendy A.; Simone, Monique; Kolko, David
2007-01-01
Objective: Children's Advocacy Centers (CACs) aim to improve child forensic interviewing following allegations of child abuse by coordinating multiple investigations, providing child-friendly interviewing locations, and limiting redundant interviewing. This analysis presents one of the first rigorous evaluations of CACs' implementation of these…
Single-Case Designs Technical Documentation
ERIC Educational Resources Information Center
Kratochwill, T. R.; Hitchcock, J.; Horner, R. H.; Levin, J. R.; Odom, S. L.; Rindskopf, D. M; Shadish, W. R.
2010-01-01
In an effort to expand the pool of scientific evidence available for review, the What Works Clearinghouse (WWC) assembled a panel of national experts in single-case design (SCD) and analysis to draft SCD Standards. SCDs are adaptations of interrupted time-series designs and can provide a rigorous experimental evaluation of intervention effects.…
Neoliberalism, Policy Reforms and Higher Education in Bangladesh
ERIC Educational Resources Information Center
Kabir, Ariful Haq
2013-01-01
Bangladesh has introduced neoliberal policies since the 1970s. Military regimes, since the dramatic political changes in 1975, accelerated the process. A succession of military rulers made rigorous changes in policy-making in various sectors. This article uses a critical approach to document analysis and examines the perceptions of key…
An Examination of the (Un)Intended Consequences of Performance Funding in Higher Education
ERIC Educational Resources Information Center
Umbricht, Mark R.; Fernandez, Frank; Ortagus, Justin C.
2017-01-01
Previous studies have shown that state performance funding policies do not increase baccalaureate degree production, but higher education scholarship lacks a rigorous, quantitative analysis of the unintended consequences of performance funding. In this article, we use difference-in-differences estimation with fixed effects to evaluate performance…
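To make the estimation strategy above concrete, the sketch below runs a difference-in-differences regression with state and year fixed effects and state-clustered standard errors on a simulated panel; the variable names, adoption year, and data are illustrative assumptions, not the authors' dataset or specification.

```python
# Illustrative two-way fixed-effects difference-in-differences on a
# hypothetical state-year panel (all values are made up for the example).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
states, years = [f"S{i}" for i in range(20)], list(range(2000, 2015))
rows = []
for s in states:
    adopter = s in {"S0", "S1", "S2", "S3", "S4"}      # states assumed to adopt the policy
    for y in years:
        treat = int(adopter and y >= 2007)             # hypothetical adoption year
        degrees = 1000 + 5 * (y - 2000) + 30 * adopter - 15 * treat + rng.normal(0, 20)
        rows.append({"state": s, "year": y, "treat": treat, "degrees": degrees})
panel = pd.DataFrame(rows)

# DiD coefficient on `treat`, absorbing state and year fixed effects,
# with standard errors clustered by state.
groups = pd.factorize(panel["state"])[0]
model = smf.ols("degrees ~ treat + C(state) + C(year)", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": groups})
print(model.params["treat"], model.bse["treat"])
```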
A Writing-Intensive, Methods-Based Laboratory Course for Undergraduates
ERIC Educational Resources Information Center
Colabroy, Keri L.
2011-01-01
Engaging undergraduate students in designing and executing original research should not only be accompanied by technique training but also intentional instruction in the critical analysis and writing of scientific literature. The course described here takes a rigorous approach to scientific reading and writing using primary literature as the model…
Legal Aspects of Evaluating Teacher Performance.
ERIC Educational Resources Information Center
Beckham, Joseph C.
Chapter 14 in a book on school law concerns the legal aspects of evaluating teacher performance. Careful analysis of recent decisions makes it clear the courts will compel uniform standards and unprecedented rigor in teacher evaluation practices. Particularly in the consideration of equitable standards, state and federal courts are relying on…
Group Practices: A New Way of Viewing CSCL
ERIC Educational Resources Information Center
Stahl, Gerry
2017-01-01
The analysis of "group practices" can make visible the work of novices learning how to inquire in science or mathematics. These ubiquitous practices are invisibly taken for granted by adults, but can be observed and rigorously studied in adequate traces of online collaborative learning. Such an approach contrasts with traditional…
A Practical Guide to Regression Discontinuity
ERIC Educational Resources Information Center
Jacob, Robin; Zhu, Pei; Somers, Marie-Andrée; Bloom, Howard
2012-01-01
Regression discontinuity (RD) analysis is a rigorous nonexperimental approach that can be used to estimate program impacts in situations in which candidates are selected for treatment based on whether their value for a numeric rating exceeds a designated threshold or cut-point. Over the last two decades, the regression discontinuity approach has…
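As an illustration of the approach described above, the following sketch estimates a sharp regression-discontinuity impact by fitting local linear regressions on each side of a cutoff; the rating variable, bandwidth, and simulated effect size are all assumptions.

```python
# Minimal sharp regression-discontinuity sketch on simulated data.
import numpy as np

rng = np.random.default_rng(1)
cutoff, bandwidth = 50.0, 10.0
rating = rng.uniform(0, 100, 2000)
treated = rating >= cutoff
outcome = 0.05 * rating + 2.0 * treated + rng.normal(0, 1, rating.size)  # true effect = 2.0

def fit_at_cutoff(side):
    """Local linear fit within the bandwidth on one side, evaluated at the cutoff."""
    mask = (np.abs(rating - cutoff) <= bandwidth) & (treated if side == "right" else ~treated)
    slope, intercept = np.polyfit(rating[mask] - cutoff, outcome[mask], 1)
    return intercept  # predicted outcome exactly at the cutoff

rd_effect = fit_at_cutoff("right") - fit_at_cutoff("left")
print(f"estimated RD impact: {rd_effect:.2f}")
```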
Motivations for Sex among Low-Income African American Young Women
ERIC Educational Resources Information Center
Deardorff, Julianna; Suleiman, Ahna Ballonoff; Dal Santo, Teresa S.; Flythe, Michelle; Gurdin, J. Barry; Eyre, Stephen L.
2013-01-01
African American young women exhibit higher risk for sexually transmitted infections, including HIV/AIDS, compared with European American women, and this is particularly true for African American women living in low-income contexts. We used rigorous qualitative methods, that is, domain analysis, including free listing ("n" = 20),…
School Uniforms: A Qualitative Analysis of Aims and Accomplishments at Two Christian Schools
ERIC Educational Resources Information Center
Firmin, Michael; Smith, Suzanne; Perry, Lynsey
2006-01-01
Employing rigorous qualitative research methodology, we studied the implementation of two schools' uniform policies. Their primary intents were to eliminate competition, teach young people to dress appropriately, decrease nonacademic distractions, and lower the parental clothing costs. The young people differed with adults regarding whether or not…
Socioeconomic Status and Child Development: A Meta-Analysis
ERIC Educational Resources Information Center
Letourneau, Nicole Lyn; Duffett-Leger, Linda; Levac, Leah; Watson, Barry; Young-Morris, Catherine
2013-01-01
Lower socioeconomic status (SES) is widely accepted to have deleterious effects on the well-being and development of children and adolescents. However, rigorous meta-analytic methods have not been applied to determine the degree to which SES supports or limits children's and adolescents' behavioural, cognitive and language development. While…
Critical Listening in the Ensemble Rehearsal: A Community of Learners
ERIC Educational Resources Information Center
Bell, Cindy L.
2018-01-01
This article explores a strategy for engaging ensemble members in critical listening analysis of performances and presents opportunities for improving ensemble sound through rigorous dialogue, reflection, and attentive rehearsing. Critical listening asks ensemble members to draw on individual playing experience and knowledge to describe what they…
How to Teach Hicksian Compensation and Duality Using a Spreadsheet Optimizer
ERIC Educational Resources Information Center
Ghosh, Satyajit; Ghosh, Sarah
2007-01-01
Principle of duality and numerical calculation of income and substitution effects under Hicksian Compensation are often left out of intermediate microeconomics courses because they require a rigorous calculus based analysis. But these topics are critically important for understanding consumer behavior. In this paper we use excel solver--a…
78 FR 51714 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-21
... instruments, please write to the Commandant of Midshipmen, Operations Office, United States Naval Academy, 101... Fourth Class Midshipmen at the United States Naval Academy. An analysis of the information collection is... with families in the community for a semblance of home away from the rigors of the academy. The...
ARCHITECTURAL PROGRAMMING--STATE OF THE ART.
ERIC Educational Resources Information Center
EVANS, BENJAMIN H.
IN RESPONSE TO A NEED FOR A MORE THOROUGH AND RIGOROUS STUDY AND ANALYSIS PROCESS IN ENVIRONMENTAL FUNCTIONS PRIOR TO THE DESIGN OF NEW BUILDINGS, A STUDY WAS UNDERTAKEN TO IDENTIFY THE EMERGING TECHNIQUES OF ARCHITECTURAL PROGRAMING PRACTICE. THE STUDY INCLUDED CORRESPONDENCE AND REVIEW OF PERIODICALS, QUESTIONNAIRES AND VISITATIONS, AND A…
Dwork, Cynthia; Pottenger, Rebecca
2013-01-01
Private data analysis—the useful analysis of confidential data—requires a rigorous and practicable definition of privacy. Differential privacy, an emerging standard, is the subject of intensive investigation in several diverse research communities. We review the definition, explain its motivation, and discuss some of the challenges to bringing this concept to practice. PMID:23243088
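One standard mechanism behind the definition discussed above is the Laplace mechanism; the sketch below applies it to a counting query, whose sensitivity is 1. The data and the epsilon value are illustrative assumptions.

```python
# Minimal sketch of the Laplace mechanism for epsilon-differential privacy.
import numpy as np

def private_count(values, predicate, epsilon, rng=np.random.default_rng()):
    """Release a count with Laplace noise of scale sensitivity/epsilon.
    Adding or removing one record changes a count by at most 1, so the
    sensitivity of this query is 1."""
    true_count = sum(1 for v in values if predicate(v))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

ages = [34, 29, 41, 58, 62, 37]            # hypothetical confidential records
print(private_count(ages, lambda a: a >= 40, epsilon=0.5))
```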
ERIC Educational Resources Information Center
Ruscio, John; Ruscio, Ayelet Meron; Meron, Mati
2007-01-01
Meehl's taxometric method was developed to distinguish categorical and continuous constructs. However, taxometric output can be difficult to interpret because expected results for realistic data conditions and differing procedural implementations have not been derived analytically or studied through rigorous simulations. By applying bootstrap…
ERIC Educational Resources Information Center
Berleman, William C.
Ten delinquency prevention studies are reviewed that incorporated rigorous evaluative procedures (specifically the classic experimental design) for assessing programmatic outcomes. Following an introduction, the evaluation mechanisms built into each project are described, since they were used for determination of the effectiveness of the…
Cognitive Responses of Students Who Witness Classroom Cheating
ERIC Educational Resources Information Center
Firmin, Michael W.; Burger, Amanda; Blosser, Matthew
2007-01-01
We arranged for 82 General Psychology students (51 females, 31 males) to observe peers in a course cheating situation. Individual, in-depth, qualitative interviews were conducted following the experiment, using rigorous coding and grounded theory methodology for analysis. Results showed students to experience particular cognitive stages as…
The benefits and costs of new fuels and engines for light-duty vehicles in the United States.
Keefe, Ryan; Griffin, James P; Graham, John D
2008-10-01
Rising oil prices and concerns about energy security and climate change are spurring reconsideration of both automobile propulsion systems and the fuels that supply energy to them. In addition to the gasoline internal combustion engine, recent years have seen alternatives develop in the automotive marketplace. Currently, hybrid-electric vehicles, advanced diesels, and flex-fuel vehicles running on a high percentage mixture of ethanol and gasoline (E85) are appearing at auto shows and in driveways. We conduct a rigorous benefit-cost analysis from both the private and societal perspective of the marginal benefits and costs of each technology--using the conventional gasoline engine as a baseline. The private perspective considers only those factors that influence the decisions of individual consumers, while the societal perspective accounts for environmental, energy, and congestion externalities as well. Our analysis illustrates that both hybrids and diesels show promise for particular light-duty applications (sport utility vehicles and pickup trucks), but that vehicles running continuously on E85 consistently have greater costs than benefits. The results for diesels were particularly robust over a wide range of sensitivity analyses. The results from the societal analysis are qualitatively similar to the private analysis, demonstrating that the most relevant factors to the benefit-cost calculations are the factors that drive the individual consumer's decision. We conclude with a brief discussion of marketplace and public policy trends that will both illustrate and influence the relative adoption of these alternative technologies in the United States in the coming decade.
A user-friendly tool to evaluate the effectiveness of no-take marine reserves.
Villaseñor-Derbez, Juan Carlos; Faro, Caio; Wright, Melaina; Martínez, Jael; Fitzgerald, Sean; Fulton, Stuart; Mancha-Cisneros, Maria Del Mar; McDonald, Gavin; Micheli, Fiorenza; Suárez, Alvin; Torre, Jorge; Costello, Christopher
2018-01-01
Marine reserves are implemented to achieve a variety of objectives, but are seldom rigorously evaluated to determine whether those objectives are met. In the rare cases when evaluations do take place, they typically focus on ecological indicators and ignore other relevant objectives such as socioeconomics and governance. And regardless of the objectives, the diversity of locations, monitoring protocols, and analysis approaches hinder the ability to compare results across case studies. Moreover, analysis and evaluation of reserves is generally conducted by outside researchers, not the reserve managers or users, plausibly thereby hindering effective local management and rapid response to change. We present a framework and tool, called "MAREA", to overcome these challenges. Its purpose is to evaluate the extent to which any given reserve has achieved its stated objectives. MAREA provides specific guidance on data collection and formatting, and then conducts rigorous causal inference analysis based on data input by the user, providing real-time outputs about the effectiveness of the reserve. MAREA's ease of use, standardization of state-of-the-art inference methods, and ability to analyze marine reserve effectiveness across ecological, socioeconomic, and governance objectives could dramatically further our understanding and support of effective marine reserve management.
A user-friendly tool to evaluate the effectiveness of no-take marine reserves
Fitzgerald, Sean; Fulton, Stuart; Mancha-Cisneros, Maria del Mar; McDonald, Gavin; Micheli, Fiorenza; Suárez, Alvin; Torre, Jorge
2018-01-01
Marine reserves are implemented to achieve a variety of objectives, but are seldom rigorously evaluated to determine whether those objectives are met. In the rare cases when evaluations do take place, they typically focus on ecological indicators and ignore other relevant objectives such as socioeconomics and governance. And regardless of the objectives, the diversity of locations, monitoring protocols, and analysis approaches hinder the ability to compare results across case studies. Moreover, analysis and evaluation of reserves is generally conducted by outside researchers, not the reserve managers or users, plausibly thereby hindering effective local management and rapid response to change. We present a framework and tool, called “MAREA”, to overcome these challenges. Its purpose is to evaluate the extent to which any given reserve has achieved its stated objectives. MAREA provides specific guidance on data collection and formatting, and then conducts rigorous causal inference analysis based on data input by the user, providing real-time outputs about the effectiveness of the reserve. MAREA’s ease of use, standardization of state-of-the-art inference methods, and ability to analyze marine reserve effectiveness across ecological, socioeconomic, and governance objectives could dramatically further our understanding and support of effective marine reserve management. PMID:29381762
Thorsen, Jonathan; Brejnrod, Asker; Mortensen, Martin; Rasmussen, Morten A; Stokholm, Jakob; Al-Soud, Waleed Abu; Sørensen, Søren; Bisgaard, Hans; Waage, Johannes
2016-11-25
There is an immense scientific interest in the human microbiome and its effects on human physiology, health, and disease. A common approach for examining bacterial communities is high-throughput sequencing of 16S rRNA gene hypervariable regions, aggregating sequence-similar amplicons into operational taxonomic units (OTUs). Strategies for detecting differential relative abundance of OTUs between sample conditions include classical statistical approaches as well as a plethora of newer methods, many borrowing from the related field of RNA-seq analysis. This effort is complicated by unique data characteristics, including sparsity, sequencing depth variation, and nonconformity of read counts to theoretical distributions, which is often exacerbated by exploratory and/or unbalanced study designs. Here, we assess the robustness of available methods for (1) inference in differential relative abundance analysis and (2) beta-diversity-based sample separation, using a rigorous benchmarking framework based on large clinical 16S microbiome datasets from different sources. Running more than 380,000 full differential relative abundance tests on real datasets with permuted case/control assignments and in silico-spiked OTUs, we identify large differences in method performance on a range of parameters, including false positive rates, sensitivity to sparsity and case/control balances, and spike-in retrieval rate. In large datasets, methods with the highest false positive rates also tend to have the best detection power. For beta-diversity-based sample separation, we show that library size normalization has very little effect and that the distance metric is the most important factor in terms of separation power. Our results, generalizable to datasets from different sequencing platforms, demonstrate how the choice of method considerably affects analysis outcome. Here, we give recommendations for tools that exhibit low false positive rates, have good retrieval power across effect sizes and case/control proportions, and have low sparsity bias. Result output from some commonly used methods should be interpreted with caution. We provide an easily extensible framework for benchmarking of new methods and future microbiome datasets.
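The following sketch illustrates, in miniature, the permutation-based false-positive benchmark described above: case/control labels are shuffled so that no true signal exists, a per-OTU test is run, and the fraction of p-values below 0.05 is recorded. The simulated count table and the choice of Mann-Whitney test are assumptions, not the methods actually benchmarked.

```python
# Toy false-positive-rate benchmark under permuted (null) case/control labels.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(2)
n_samples, n_otus = 60, 300
# sparse, overdispersed counts loosely mimicking a 16S OTU table
counts = rng.negative_binomial(n=0.5, p=0.05, size=(n_samples, n_otus))

false_positive_rates = []
for _ in range(50):                              # 50 label permutations
    labels = rng.permutation(np.r_[np.zeros(30, bool), np.ones(30, bool)])
    pvals = []
    for j in range(n_otus):
        a, b = counts[labels, j], counts[~labels, j]
        if a.any() or b.any():                   # skip all-zero OTUs
            pvals.append(mannwhitneyu(a, b).pvalue)
    false_positive_rates.append(np.mean(np.array(pvals) < 0.05))

print("mean FPR under the null:", np.mean(false_positive_rates))
```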
Putrefactive rigor: apparent rigor mortis due to gas distension.
Gill, James R; Landi, Kristen
2011-09-01
Artifacts due to decomposition may cause confusion for the initial death investigator, leading to an incorrect suspicion of foul play. Putrefaction is a microorganism-driven process that results in foul odor, skin discoloration, purge, and bloating. Various decompositional gases including methane, hydrogen sulfide, carbon dioxide, and hydrogen will cause the body to bloat. We describe 3 instances of putrefactive gas distension (bloating) that produced the appearance of inappropriate rigor, so-called putrefactive rigor. These gases may distend the body to an extent that the extremities extend and lose contact with their underlying support surface. The medicolegal investigator must recognize that this is not true rigor mortis and the body was not necessarily moved after death for this gravity-defying position to occur.
Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).
Suzutani, T; Ishibashi, H; Takatori, T
1978-11-01
The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.
NASA Astrophysics Data System (ADS)
Parumasur, N.; Willie, R.
2008-09-01
We consider a simple finite-dimensional HIV/AIDS mathematical model of the interactions among blood cells, the HIV/AIDS virus, and the immune system, and examine the consistency of the equations with the real biomedical situation they model. A better understanding of a cure solution for the illness modeled by the finite-dimensional equations is given. This is accomplished through rigorous mathematical analysis and is reinforced by numerical analysis of models developed for real-life cases.
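The abstract does not reproduce the authors' equations, so the sketch below integrates a standard target-cell/infected-cell/virus system often used as a starting point for finite-dimensional HIV dynamics; the parameter values are illustrative assumptions.

```python
# Illustrative three-compartment HIV dynamics model (not the authors' model).
from scipy.integrate import solve_ivp

lam, d, beta, delta, p, c = 1e4, 0.01, 2e-7, 0.7, 100.0, 13.0  # illustrative rates

def hiv_model(t, y):
    T, I, V = y                        # healthy target cells, infected cells, free virus
    dT = lam - d * T - beta * T * V
    dI = beta * T * V - delta * I
    dV = p * I - c * V
    return [dT, dI, dV]

sol = solve_ivp(hiv_model, (0, 100), [1e6, 0, 1e-3], dense_output=True)
print("viral load after 100 days:", sol.y[2, -1])
```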
Ruiz-Meana, M; Garcia-Dorado, D; Juliá, M; Inserte, J; Siegmund, B; Ladilov, Y; Piper, M; Tritto, F P; González, M A; Soler-Soler, J
2000-01-01
The objective of this study was to investigate the effect of Na+-H+ exchange (NHE) and HCO3--Na+ symport inhibition on the development of rigor contracture. Freshly isolated adult rat cardiomyocytes were subjected to 60 min metabolic inhibition (MI) and 5 min re-energization (Rx). The effects of perfusion of HCO3- or HCO3--free buffer with or without the NHE inhibitor HOE642 (7 microM) were investigated during MI and Rx. In HCO3--free conditions, HOE642 reduced the percentage of cells developing rigor during MI from 79 +/- 1% to 40 +/- 4% (P < 0.001) without modifying the time at which rigor appeared. This resulted in a 30% reduction of hypercontracture during Rx (P < 0.01). The presence of HCO3- abolished the protective effect of HOE642 against rigor. Cells that had developed rigor underwent hypercontracture during Rx independently of treatment allocation. Ratiofluorescence measurement demonstrated that the rise in cytosolic Ca2+ (fura-2) occurred only after the onset of rigor, and was not influenced by HOE642. NHE inhibition did not modify Na+ rise (SBFI) during MI, but exaggerated the initial fall of intracellular pH (BCECF). In conclusion, HOE642 has a protective effect against rigor during energy deprivation, but only when HCO3--dependent transporters are inhibited. This effect is independent of changes in cytosolic Na+ or Ca2+ concentrations.
To Your Health: NLM update transcript - Improving medical research rigor?
... be a well-tailored solution to enhance the quantitative rigor of medical research, suggests a viewpoint recently published in the Journal ... about 96 percent of medical and public health research articles (that report ... more quantitative rigor would attract widespread attention — if not high ...
Evaluating Rigor in Qualitative Methodology and Research Dissemination
ERIC Educational Resources Information Center
Trainor, Audrey A.; Graue, Elizabeth
2014-01-01
Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…
A Behavioral Model of Landscape Change in the Amazon Basin: The Colonist Case
NASA Technical Reports Server (NTRS)
Walker, R. A.; Drzyzga, S. A.; Li, Y. L.; Wi, J. G.; Caldas, M.; Arima, E.; Vergara, D.
2004-01-01
This paper presents the prototype of a predictive model capable of describing both magnitudes of deforestation and its spatial articulation into patterns of forest fragmentation. In a departure from other landscape models, it establishes an explicit behavioral foundation for algorithm development, predicated on notions of the peasant economy and on household production theory. It takes a 'bottom-up' approach, generating the process of land-cover change occurring at lot level together with the geography of a transportation system to describe regional landscape change. In other words, it translates the decentralized decisions of individual households into a collective, spatial impact. In so doing, the model unites the richness of survey research on farm households with the analytical rigor of spatial analysis enabled by geographic information systems (GIS). The paper describes earlier efforts at spatial modeling, provides a critique of the so-called spatially explicit model, and elaborates a behavioral foundation by considering farm practices of colonists in the Amazon basin. It then uses insight from the behavioral statement to motivate a GIS-based model architecture. The model is implemented for a long-standing colonization frontier in the eastern sector of the basin, along the Trans-Amazon Highway in the State of Para, Brazil. Results are subjected to both sensitivity analysis and error assessment, and suggestions are made about how the model could be improved.
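A caricature of such a bottom-up architecture is sketched below: each year, lot-level clearing decisions are drawn with a probability that declines with distance to a road, and the decisions are aggregated into a landscape-level deforestation fraction. The grid, decision rule, and parameters are illustrative assumptions, not the authors' model.

```python
# Toy grid model of lot-level clearing decisions aggregated over a landscape.
import numpy as np

rng = np.random.default_rng(3)
n_rows, n_cols, years = 50, 50, 10
distance_to_road = np.abs(np.arange(n_cols) - n_cols // 2)[None, :] * np.ones((n_rows, 1))
cleared = np.zeros((n_rows, n_cols), bool)

for _ in range(years):
    # each household clears its lot with a probability that falls off with
    # transport cost (distance to the road), a crude stand-in for household theory
    p_clear = 0.15 * np.exp(-distance_to_road / 10.0)
    cleared |= (~cleared) & (rng.random((n_rows, n_cols)) < p_clear)

print("fraction deforested after", years, "years:", cleared.mean())
```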
Orellano-Colón, Elsa M.; Mann, William C.; Rivero, Marta; Torres, Mayra; Jutai, Jeff; Santiago, Angélica; Varas, Nelson
2016-01-01
Assistive technologies (AT) are tools that enhance the independence, safety, and quality of life of older people with functional limitations. While AT may extend independence in ageing, there are racial and ethnic disparities in late-life AT use, with lower rates reported among Hispanic older populations. The aim of this study was to identify barriers experienced by Hispanic community-living older adults for using AT. Sixty Hispanic older adults (70 years and older) with functional limitations participated in this study. A descriptive qualitative research design was used guided by the principles of the Human Activity Assistive Technology Model to gain in-depth understanding of participants’ perspectives regarding barriers to using AT devices. Individual in-depth semi-structured interviews were conducted, using the Assistive Technology Devices Cards (ATDC) assessment as a prompt to facilitate participants’ qualitative responses. Data analysis included descriptive statistics and rigorous thematic content analysis. Lack of AT awareness and information, cost of AT, limited coverage of AT by health care plans, and perceived complexity of AT were the predominant barriers experienced by the participants. A multi-level approach is required for a better understanding of the barriers for using AT devices. The personal, contextual, and activity-based barriers found in this study can be used to develop culturally sensitive AT interventions to reduce existent disparities in independent living disabilities among older Hispanics. PMID:27294762
Abad-Corpa, Eva; Meseguer-Liza, Cristobal; Martínez-Corbalán, José Tomás; Zárate-Riscal, Lourdes; Caravaca-Hernández, Amor; Paredes-Sidrach de Cardona, Antonio; Carrillo-Alcaraz, Andrés; Delgado-Hito, Pilar; Cabrero-García, Julio
2010-08-01
To generate changes in nursing practice introducing an evidence-based clinical practice (EBCP) model through a participatory process. To evaluate the effectiveness of the changes in terms of nurse-sensitive outcome (NSO). For international nursing science, it is necessary to explore the reasons for supporting EBCP and evaluate the real repercussions and effectiveness. A mixed methods study with a sequential transformative design will be conducted in the bone marrow transplant unit of a tertiary-level Spanish hospital, in two time periods >12 months (date of approval of the protocol: 2006). To evaluate the effectiveness of the intervention, we will use a prospective quasi-experimental design with two non-equivalent and non-concurrent groups. NSO and patient health data will be collected: (a) impact of psycho-social adjustment; (b) patient satisfaction; (c) symptom control; (d) adverse effects. All patients admitted during the period of time will be included, and all staff working on the unit during a participatory action research (PAR). The PAR design will be adopted from a constructivist paradigm perspective, following Checkland's "Soft Systems" theoretical model. Qualitative techniques will be used: 2-hour group meetings with nursing professionals, to be recorded and transcribed. Field diaries (participants and researchers) will be drawn up and data analysis will be carried out by content analysis. PAR is a rigorous research method for introducing changes into practice to improve NSO.
Sensitivity analysis for future space missions with segmented telescopes for high-contrast imaging
NASA Astrophysics Data System (ADS)
Leboulleux, Lucie; Pueyo, Laurent; Sauvage, Jean-François; Mazoyer, Johan; Soummer, Remi; Fusco, Thierry; Sivaramakrishnan, Anand
2018-01-01
The detection and analysis of biomarkers on earth-like planets using direct-imaging will require both high-contrast imaging and spectroscopy at very close angular separation (10^10 star to planet flux ratio at a few 0.1”). This goal can only be achieved with large telescopes in space to overcome atmospheric turbulence, often combined with a coronagraphic instrument with wavefront control. Large segmented space telescopes such as studied for the LUVOIR mission will generate segment-level instabilities and cophasing errors in addition to local mirror surface errors and other aberrations of the overall optical system. These effects contribute directly to the degradation of the final image quality and contrast. We present an analytical model that produces coronagraphic images of a segmented pupil telescope in the presence of segment phasing aberrations expressed as Zernike polynomials. This model relies on a pair-based projection of the segmented pupil and provides results that match an end-to-end simulation with an rms error on the final contrast of ~3%. This analytical model can be applied both to static and dynamic modes, and either in monochromatic or broadband light. It eliminates the need for the end-to-end Monte-Carlo simulations that would otherwise be required to build a rigorous error budget, by enabling quasi-instantaneous analytical evaluations. The ability to invert directly the analytical model provides direct constraints and tolerances on all segment-level phasing and aberrations.
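As a toy numerical counterpart to the segment-phasing sensitivity discussed above, the sketch below tiles a circular pupil with square segments, applies random piston errors, and compares the resulting Strehl ratio with the Marechal approximation exp(-sigma^2); it is illustrative only and is not the paper's pair-based analytical model or a coronagraphic simulation.

```python
# Toy check: image-quality loss (Strehl ratio) versus segment piston rms.
import numpy as np

n_pix, seg = 256, 32                              # pupil sampling and segment size (pixels)
yy, xx = np.mgrid[:n_pix, :n_pix] - n_pix / 2
pupil = (np.hypot(xx, yy) < n_pix / 2).astype(float)

def strehl(piston_rms_rad, rng=np.random.default_rng(4)):
    """Peak of the aberrated PSF relative to the unaberrated one."""
    phase = np.zeros((n_pix, n_pix))
    for i in range(0, n_pix, seg):
        for j in range(0, n_pix, seg):
            phase[i:i+seg, j:j+seg] = rng.normal(0.0, piston_rms_rad)
    peak = np.abs(np.fft.fft2(pupil * np.exp(1j * phase))).max() ** 2
    peak_ref = np.abs(np.fft.fft2(pupil)).max() ** 2
    return peak / peak_ref

for rms in (0.05, 0.2, 0.5):                      # piston rms in radians
    print(f"piston rms {rms} rad: Strehl {strehl(rms):.3f}, "
          f"Marechal {np.exp(-rms**2):.3f}")
```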
d'Alessio, M. A.; Williams, C.F.
2007-01-01
A suite of new techniques in thermochronometry allow analysis of the thermal history of a sample over a broad range of temperature sensitivities. New analysis tools must be developed that fully and formally integrate these techniques, allowing a single geologic interpretation of the rate and timing of exhumation and burial events consistent with all data. We integrate a thermal model of burial and exhumation, (U-Th)/He age modeling, and fission track age and length modeling. We then use a genetic algorithm to efficiently explore possible time-exhumation histories of a vertical sample profile (such as a borehole), simultaneously solving for exhumation and burial rates as well as changes in background heat flow. We formally combine all data in a rigorous statistical fashion. By parameterizing the model in terms of exhumation rather than time-temperature paths (as traditionally done in fission track modeling), we can ensure that exhumation histories result in a sedimentary basin whose thickness is consistent with the observed basin, a physically based constraint that eliminates otherwise acceptable thermal histories. We apply the technique to heat flow and thermochronometry data from the 2.1-km-deep San Andreas Fault Observatory at Depth pilot hole near the San Andreas fault, California. We find that the site experienced <1 km of exhumation or burial since the onset of San Andreas fault activity ~30 Ma.
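The sketch below shows the bare bones of a genetic-algorithm search of the kind described above, here fitting a two-parameter exhumation history (rate and onset time) to a placeholder misfit function; the forward thermal and thermochronometric models are stand-ins, and all numbers are assumptions.

```python
# Bare-bones genetic algorithm over a two-parameter exhumation history.
import numpy as np

rng = np.random.default_rng(5)
true_params = np.array([0.5, 10.0])                 # km/Myr exhumation rate, onset (Ma)

def misfit(params):
    """Placeholder for the thermal + (U-Th)/He + fission-track forward models."""
    return np.sum((params - true_params) ** 2)

pop = rng.uniform([0.0, 0.0], [2.0, 30.0], size=(40, 2))
for generation in range(100):
    fitness = -np.array([misfit(p) for p in pop])
    parents = pop[np.argsort(fitness)[-20:]]                      # keep the best half
    idx_a, idx_b = rng.integers(0, 20, 40), rng.integers(0, 20, 40)
    cross = rng.random((40, 2)) < 0.5
    children = np.where(cross, parents[idx_a], parents[idx_b])    # uniform crossover
    children = children + rng.normal(0, 0.05, children.shape)     # mutation
    children[:, 0] = np.clip(children[:, 0], 0.0, 2.0)
    children[:, 1] = np.clip(children[:, 1], 0.0, 30.0)
    pop = children

best = pop[np.argmin([misfit(p) for p in pop])]
print("best exhumation rate, onset:", best)
```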
Flores, Glenn
2010-01-01
Despite an accumulating body of literature addressing racial/ethnic disparities in children’s health and health care, there have been few published studies of interventions that have been successful in eliminating these disparities. The objectives of this article, therefore, are to (1) describe 3 interventions that have been successful in eliminating racial/ethnic disparities in children’s health and health care, (2) highlight tips and pitfalls regarding devising, implementing, and evaluating pediatric disparities interventions, and (3) propose a research agenda for pediatric disparities interventions. Key characteristics of the 3 successful interventions include rigorous study designs; large sample sizes; appropriate comparison groups; community-based interventions that are culturally and linguistically sensitive and involve collaboration with participants; research staff from the same community as the participants; appropriate blinding of outcomes assessors; and statistical adjustment of outcomes for relevant covariates. On the basis of these characteristics, I propose tips, pitfalls, an approach, and a research agenda for devising, implementing, and evaluating successful pediatric disparities interventions. Examination of 3 successful interventions indicates that pediatric health care disparities can be eliminated. Achievement of this goal requires an intervention that is rigorous, evidence-based, and culturally and linguistically appropriate. The intervention must also include community collaboration, minimize attrition, adjust for potential confounders, and incorporate mechanisms for sustainability. PMID:19861473
Flores, Glenn
2009-11-01
Despite an accumulating body of literature addressing racial/ethnic disparities in children's health and health care, there have been few published studies of interventions that have been successful in eliminating these disparities. The objectives of this article, therefore, are to (1) describe 3 interventions that have been successful in eliminating racial/ethnic disparities in children's health and health care, (2) highlight tips and pitfalls regarding devising, implementing, and evaluating pediatric disparities interventions, and (3) propose a research agenda for pediatric disparities interventions. Key characteristics of the 3 successful interventions include rigorous study designs; large sample sizes; appropriate comparison groups; community-based interventions that are culturally and linguistically sensitive and involve collaboration with participants; research staff from the same community as the participants; appropriate blinding of outcomes assessors; and statistical adjustment of outcomes for relevant covariates. On the basis of these characteristics, I propose tips, pitfalls, an approach, and a research agenda for devising, implementing, and evaluating successful pediatric disparities interventions. Examination of 3 successful interventions indicates that pediatric health care disparities can be eliminated. Achievement of this goal requires an intervention that is rigorous, evidence-based, and culturally and linguistically appropriate. The intervention must also include community collaboration, minimize attrition, adjust for potential confounders, and incorporate mechanisms for sustainability.
Academic Rigor in General Education, Introductory Astronomy Courses for Nonscience Majors
ERIC Educational Resources Information Center
Brogt, Erik; Draeger, John D.
2015-01-01
We discuss a model of academic rigor and apply this to a general education introductory astronomy course. We argue that even without one of the central tenets of professional astronomy, the use of mathematics, the course can still be considered academically rigorous when expectations, goals, assessments, and curriculum are properly aligned.
Rigor in Agricultural Education Research Reporting: Implications for the Discipline
ERIC Educational Resources Information Center
Fuhrman, Nicholas E.; Ladewig, Howard
2008-01-01
Agricultural education has been criticized for publishing research lacking many of the rigorous qualities found in publications of other disciplines. A few agricultural education researchers have suggested strategies for improving the rigor with which agricultural education studies report on methods and findings. The purpose of this study was to…
Rigor and Responsiveness in Classroom Activity
ERIC Educational Resources Information Center
Thompson, Jessica; Hagenah, Sara; Kang, Hosun; Stroupe, David; Braaten, Melissa; Colley, Carolyn; Windschitl, Mark
2016-01-01
Background/Context: There are few examples from classrooms or the literature that provide a clear vision of teaching that simultaneously promotes rigorous disciplinary activity and is responsive to all students. Maintaining rigorous and equitable classroom discourse is a worthy goal, yet there is no clear consensus of how this actually works in a…
Classroom Talk for Rigorous Reading Comprehension Instruction
ERIC Educational Resources Information Center
Wolf, Mikyung Kim; Crosson, Amy C.; Resnick, Lauren B.
2004-01-01
This study examined the quality of classroom talk and its relation to academic rigor in reading-comprehension lessons. Additionally, the study aimed to characterize effective questions to support rigorous reading comprehension lessons. The data for this study included 21 reading-comprehension lessons in several elementary and middle schools from…
Nguyen, Tuan A.; Sarkar, Pabak; Veetil, Jithesh V.; Koushik, Srinagesh V.; Vogel, Steven S.
2012-01-01
Förster resonance energy transfer (FRET) microscopy is frequently used to study protein interactions and conformational changes in living cells. The utility of FRET is limited by false positive and negative signals. To overcome these limitations we have developed Fluorescence Polarization and Fluctuation Analysis (FPFA), a hybrid single-molecule based method combining time-resolved fluorescence anisotropy (homo-FRET) and fluorescence correlation spectroscopy. Using FPFA, homo-FRET (a 1–10 nm proximity gauge), brightness (a measure of the number of fluorescent subunits in a complex), and correlation time (an attribute sensitive to the mass and shape of a protein complex) can be simultaneously measured. These measurements together rigorously constrain the interpretation of FRET signals. Venus-based control-constructs were used to validate FPFA. The utility of FPFA was demonstrated by measuring in living cells the number of subunits in the α-isoform of Venus-tagged calcium-calmodulin dependent protein kinase-II (CaMKIIα) holoenzyme. Brightness analysis revealed that the holoenzyme has, on average, 11.9±1.2 subunits, but values ranged from 10–14 in individual cells. Homo-FRET analysis simultaneously detected that catalytic domains were arranged as dimers in the dodecameric holoenzyme, and this paired organization was confirmed by quantitative hetero-FRET analysis. In freshly prepared cell homogenates FPFA detected only 10.2±1.3 subunits in the holoenzyme with values ranging from 9–12. Despite the reduction in subunit number, catalytic domains were still arranged as pairs in homogenates. Thus, FPFA suggests that while the absolute number of subunits in an auto-inhibited holoenzyme might vary from cell to cell, the organization of catalytic domains into pairs is preserved. PMID:22666486
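A simplified version of the brightness-based subunit counting mentioned above is sketched below: the shot-noise-corrected molecular brightness of a complex, divided by that of a monomeric control, estimates the number of fluorescent subunits. This is a generic number-and-brightness style calculation on simulated photon counts, not the authors' FPFA implementation.

```python
# Brightness-based subunit counting on simulated photon-count traces.
import numpy as np

rng = np.random.default_rng(6)

def molecular_brightness(counts):
    """Excess variance over shot noise, per detected photon (counts per bin)."""
    mean, var = counts.mean(), counts.var()
    return (var - mean) / mean

def simulate_trace(n_molecules, brightness, n_bins=100_000):
    """Photon counts per bin for diffusing molecules of a given brightness."""
    occupancy = rng.poisson(n_molecules, n_bins)       # molecules in focus per bin
    return rng.poisson(brightness * occupancy)         # detected photons per bin

monomer = simulate_trace(n_molecules=5, brightness=0.1)
complex_ = simulate_trace(n_molecules=5, brightness=1.2)    # e.g. a 12-mer
subunits = molecular_brightness(complex_) / molecular_brightness(monomer)
print(f"estimated subunits per complex: {subunits:.1f}")
```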
Processing capacity under perceptual and cognitive load: a closer look at load theory.
Fitousi, Daniel; Wenger, Michael J
2011-06-01
Variations in perceptual and cognitive demands (load) play a major role in determining the efficiency of selective attention. According to load theory (Lavie, Hirst, Fockert, & Viding, 2004) these factors (a) improve or hamper selectivity by altering the way resources (e.g., processing capacity) are allocated, and (b) tap resources rather than data limitations (Norman & Bobrow, 1975). Here we provide an extensive and rigorous set of tests of these assumptions. Predictions regarding changes in processing capacity are tested using the hazard function of the response time (RT) distribution (Townsend & Ashby, 1978; Wenger & Gibson, 2004). The assumption that load taps resource rather than data limitations is examined using measures of sensitivity and bias drawn from signal detection theory (Swets, 1964). All analyses were performed at two levels: the individual and the aggregate. Hypotheses regarding changes in processing capacity were confirmed at the level of the aggregate. Hypotheses regarding resource and data limitations were not completely supported at either level of analysis. And in all of the analyses, we observed substantial individual differences. In sum, the results suggest a need to expand the theoretical vocabulary of load theory, rather than a need to discard it.
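The two kinds of measures referred to above can be computed as follows: sensitivity (d') and bias (c) from hit and false-alarm rates, and an empirical hazard function of the response-time distribution; the rates and simulated RTs are made-up values for illustration.

```python
# Signal-detection measures and an empirical RT hazard function (illustrative data).
import numpy as np
from scipy.stats import norm

def d_prime_and_criterion(hit_rate, fa_rate):
    """Sensitivity d' and criterion c from hit and false-alarm rates."""
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return z_hit - z_fa, -0.5 * (z_hit + z_fa)

print(d_prime_and_criterion(hit_rate=0.85, fa_rate=0.20))   # (d', c)

def empirical_hazard(rts, bins=30):
    """h(t) = f(t) / (1 - F(t)) estimated from a histogram of response times."""
    density, edges = np.histogram(rts, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    survivor = 1.0 - np.cumsum(density * np.diff(edges))
    return centers, density / np.clip(survivor, 1e-9, None)

rts = np.random.default_rng(7).gamma(shape=4.0, scale=0.1, size=5000) + 0.2
centers, hazard = empirical_hazard(rts)
print(hazard[:5])   # for gamma-like RTs the hazard rises and then levels off
```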
Narrow groove plasmonic nano-gratings for surface plasmon resonance sensing
Dhawan, Anuj; Canva, Michael; Vo-Dinh, Tuan
2011-01-01
We present a novel surface plasmon resonance (SPR) configuration based on narrow groove (sub-15 nm) plasmonic nano-gratings such that normally incident radiation can be coupled into surface plasmons without the use of prism-coupling based total internal reflection, as in the classical Kretschmann configuration. This eliminates the angular dependence requirements of SPR-based sensing and allows development of robust miniaturized SPR sensors. Simulations based on Rigorous Coupled Wave Analysis (RCWA) were carried out to numerically calculate the reflectance - from different gold and silver nano-grating structures - as a function of the localized refractive index of the media around the SPR nano-gratings as well as the incident radiation wavelength and angle of incidence. Our calculations indicate substantially higher differential reflectance signals, on localized change of refractive index in the narrow groove plasmonic gratings, as compared to those obtained from conventional SPR-based sensing systems. Furthermore, these calculations allow determination of the optimal nano-grating geometric parameters - i. e. nanoline periodicity, spacing between the nanolines, as well as the height of the nanolines in the nano-grating - for highest sensitivity to localized change of refractive index, as would occur due to binding of a biomolecule target to a functionalized nano-grating surface. PMID:21263620
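Once a forward solver for the grating reflectance is available, the differential sensitivity described above can be computed by finite differences in the local refractive index, as sketched below. Here `reflectance_model` is a hypothetical placeholder with a toy resonance, not an RCWA implementation.

```python
# Finite-difference sensitivity of reflectance to the local refractive index.
import numpy as np

def reflectance_model(n_analyte, wavelength_nm, period_nm=400, gap_nm=12):
    """Placeholder standing in for an RCWA calculation of grating reflectance."""
    resonance = 520 + 180 * (n_analyte - 1.33)          # toy resonance shift with index
    return 1.0 - 0.8 * np.exp(-((wavelength_nm - resonance) / 15.0) ** 2)

def sensitivity(n_analyte, wavelength_nm, dn=1e-4):
    """dR/dn by central finite differences at a fixed wavelength."""
    r_plus = reflectance_model(n_analyte + dn, wavelength_nm)
    r_minus = reflectance_model(n_analyte - dn, wavelength_nm)
    return (r_plus - r_minus) / (2 * dn)

wavelengths = np.linspace(480, 600, 200)
best = wavelengths[np.argmax(np.abs(sensitivity(1.33, wavelengths)))]
print(f"most sensitive wavelength for n = 1.33: {best:.1f} nm")
```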
Sexuality Research in Iran: A Focus on Methodological and Ethical Considerations.
Rahmani, Azam; Merghati-Khoei, Effat; Moghaddam-Banaem, Lida; Zarei, Fatemeh; Montazeri, Ali; Hajizadeh, Ebrahim
2015-07-01
Research on sensitive topics, such as sexuality, could raise technical, methodological, ethical, political, and legal challenges. The aim of this paper was to describe the methodological challenges that the authors confronted during sexuality research with a young population in the Iranian culture. This was an exploratory mixed-methods study conducted in 2013-14. We interviewed 63 young women aged 18-34 yr in the qualitative phase and 265 young women in the quantitative phase in (university and non-university) dormitories and in an Adolescent Friendly Center. Data were collected using focus group discussions and individual interviews in the qualitative phase. We employed conventional content analysis to analyze the data. To enhance the rigor of the data, multiple data collection methods, maximum variation sampling, and peer checks were applied. Five main themes emerged from the data: interaction with the opposite sex, sexual risk, sexual protective, sex education, and sexual vulnerability. Challenges while conducting sex research have been discussed. These challenges included assumption of promiscuity, language of silence and privacy concerns, and sex segregation policy. We described the strategies applied in our study and the rationales for each strategy. Strategies applied in the present study can be employed in contexts with similar methodological and moral concerns.
Fourier analysis of the SOR iteration
NASA Technical Reports Server (NTRS)
Leveque, R. J.; Trefethen, L. N.
1986-01-01
The SOR iteration for solving linear systems of equations depends upon an overrelaxation factor omega. It is shown that for the standard model problem of Poisson's equation on a rectangle, the optimal omega and corresponding convergence rate can be rigorously obtained by Fourier analysis. The trick is to tilt the space-time grid so that the SOR stencil becomes symmetrical. The tilted grid also gives insight into the relation between convergence rates of several variants.
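The classical result referred to above can be checked numerically: for the 5-point Poisson problem on an n-by-n grid with spacing h = 1/(n+1), the analysis gives the optimal relaxation factor omega_opt = 2/(1 + sin(pi*h)). The sketch below compares iteration counts for a few omega values on a small test problem.

```python
# SOR for -Laplace(u) = 1 on the unit square with zero boundary values,
# comparing iteration counts against the optimal relaxation factor.
import numpy as np

n = 31
h = 1.0 / (n + 1)
omega_opt = 2.0 / (1.0 + np.sin(np.pi * h))

def sor_iterations(omega, tol=1e-8, max_iter=20000):
    u = np.zeros((n + 2, n + 2))
    for it in range(max_iter):
        diff = 0.0
        for i in range(1, n + 1):
            for j in range(1, n + 1):
                new = (1 - omega) * u[i, j] + omega * 0.25 * (
                    u[i - 1, j] + u[i + 1, j] + u[i, j - 1] + u[i, j + 1] + h * h)
                diff = max(diff, abs(new - u[i, j]))
                u[i, j] = new
        if diff < tol:
            return it + 1
    return max_iter

for omega in (1.0, 1.5, omega_opt):
    print(f"omega = {omega:.3f}: {sor_iterations(omega)} iterations")
```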
Han, Jingjia; Qian, Ximei; Wu, Qingling; Jha, Rajneesh; Duan, Jinshuai; Yang, Zhou; Maher, Kevin O; Nie, Shuming; Xu, Chunhui
2016-10-01
Human pluripotent stem cells (hPSCs) are a promising cell source for regenerative medicine, but their derivatives need to be rigorously evaluated for residual stem cells to prevent teratoma formation. Here, we report the development of novel surface-enhanced Raman scattering (SERS)-based assays that can detect trace numbers of undifferentiated hPSCs in mixed cell populations in a highly specific, ultra-sensitive, and time-efficient manner. By targeting stem cell surface markers SSEA-5 and TRA-1-60 individually or simultaneously, these SERS assays were able to identify as few as 1 stem cell in 10^6 cells, a sensitivity (0.0001%) which was ∼2000 to 15,000-fold higher than that of flow cytometry assays. Using the SERS assay, we demonstrate that the aggregation of hPSC-based cardiomyocyte differentiation cultures into 3D spheres significantly reduced SSEA-5(+) and TRA-1-60(+) cells compared with parallel 2D cultures. Thus, SERS may provide a powerful new technology for quality control of hPSC-derived products for preclinical and clinical applications. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
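A back-of-envelope reading of the quoted comparison: a 1-in-10^6 detection limit is 0.0001%, so the stated ~2000- to 15,000-fold advantage implies flow-cytometry limits of roughly 0.2% to 1.5%; those flow-cytometry figures are inferred here, not given in the abstract.

```python
# Fold-difference implied by the quoted detection limits (assumed flow limits).
sers_limit = 1 / 1e6                        # fraction of undifferentiated cells
for flow_limit in (0.002, 0.015):           # assumed flow-cytometry detection limits
    print(f"flow limit {flow_limit:.1%} -> {flow_limit / sers_limit:,.0f}-fold difference")
```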
Laboratory and field testing of commercial rotational seismometers
Nigbor, R.L.; Evans, J.R.; Hutt, C.R.
2009-01-01
There are a small number of commercially available sensors to measure rotational motion in the frequency and amplitude ranges appropriate for earthquake motions on the ground and in structures. However, the performance of these rotational seismometers has not been rigorously and independently tested and characterized for earthquake monitoring purposes as is done for translational strong- and weak-motion seismometers. Quantities such as sensitivity, frequency response, resolution, and linearity are needed for the understanding of recorded rotational data. To address this need, we, with assistance from colleagues in the United States and Taiwan, have been developing performance test methodologies and equipment for rotational seismometers. In this article the performance testing methodologies are applied to samples of a commonly used commercial rotational seismometer, the eentec model R-1. Several examples were obtained for various test sequences in 2006, 2007, and 2008. Performance testing of these sensors consisted of measuring: (1) sensitivity and frequency response; (2) clip level; (3) self noise and resolution; and (4) cross-axis sensitivity, both rotational and translational. These sensor-specific results will assist in understanding the performance envelope of the R-1 rotational seismometer, and the test methodologies can be applied to other rotational seismometers.
Respect for cultural diversity and the empirical turn in bioethics: a plea for caution.
Mbugua, Karori
2012-01-01
In the last two decades, there have been numerous calls for a culturally sensitive bioethics. At the same time, bioethicists have become increasingly involved in empirical research, which is a sign of dissatisfaction with the analytic methods of traditional bioethics. In this article, I will argue that although these developments have broadened and enriched the field of bioethics, they can easily be construed to be an endorsement of ethical relativism, especially by those not well grounded in academic moral philosophy. I maintain that bioethicists must resist the temptation of moving too quickly from cultural relativism to ethical relativism and from empirical findings to normative conclusions. Indeed, anyone who reasons in this way is guilty of the naturalistic fallacy. I conclude by saying that properly conceptualized, empirical research and sensitivity to cultural diversity should give rise to objective rational discourse and criticism and not indiscriminate tolerance of every possible moral practice. Bioethics must remain a normative discipline that is characterized by rigorous argumentation.
Respect for cultural diversity and the empirical turn in bioethics: a plea for caution
Mbugua, Karori
2012-01-01
In the last two decades, there have been numerous calls for a culturally sensitive bioethics. At the same time, bioethicists have become increasingly involved in empirical research, which is a sign of dissatisfaction with the analytic methods of traditional bioethics. In this article, I will argue that although these developments have broadened and enriched the field of bioethics, they can easily be construed to be an endorsement of ethical relativism, especially by those not well grounded in academic moral philosophy. I maintain that bioethicists must resist the temptation of moving too quickly from cultural relativism to ethical relativism and from empirical findings to normative conclusions. Indeed, anyone who reasons in this way is guilty of the naturalistic fallacy. I conclude by saying that properly conceptualized, empirical research and sensitivity to cultural diversity should give rise to objective rational discourse and criticism and not indiscriminate tolerance of every possible moral practice. Bioethics must remain a normative discipline that is characterized by rigorous argumentation. PMID:23908754
Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem
2018-01-01
This study was conducted to evaluate the effects of pre and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl+0.5 M lactic acid and 2% NaCl+0.5 M sodium lactate. In this study marinade uptake, pH, free water, cooking loss, drip loss and color properties were analyzed. Injection time had significant effect on marinade uptake levels of samples. Regardless of marinade formulation, marinade uptake of pre-rigor samples injected with marinade solutions was higher than that of post-rigor samples. Injection of sodium lactate increased pH values of samples whereas lactic acid injection decreased pH. Marinade treatment and storage period had significant effect on cooking loss. At each evaluation period the interaction between marinade treatment and injection time showed a different effect on free water content. Storage period and marinade application had significant effect on drip loss values. Drip loss in all samples increased during the storage. During all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fade in pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was found statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, and no differences were found in the other samples. At day 6, no significant differences were found in CIE b* values of all samples. PMID:29805282
Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem
2018-04-01
This study was conducted to evaluate the effects of pre and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl+0.5 M lactic acid and 2% NaCl+0.5 M sodium lactate. In this study marinade uptake, pH, free water, cooking loss, drip loss and color properties were analyzed. Injection time had significant effect on marinade uptake levels of samples. Regardless of marinade formulation, marinade uptake of pre-rigor samples injected with marinade solutions was higher than that of post-rigor samples. Injection of sodium lactate increased pH values of samples whereas lactic acid injection decreased pH. Marinade treatment and storage period had significant effect on cooking loss. At each evaluation period the interaction between marinade treatment and injection time showed a different effect on free water content. Storage period and marinade application had significant effect on drip loss values. Drip loss in all samples increased during the storage. During all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fade in pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was found statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, and no differences were found in the other samples. At day 6, no significant differences were found in CIE b* values of all samples.
Focus Group Evidence: Implications for Design and Analysis
ERIC Educational Resources Information Center
Ryan, Katherine E.; Gandha, Tysza; Culbertson, Michael J.; Carlson, Crystal
2014-01-01
In evaluation and applied social research, focus groups may be used to gather different kinds of evidence (e.g., opinion, tacit knowledge). In this article, we argue that making focus group design choices explicitly in relation to the type of evidence required would enhance the empirical value and rigor associated with focus group utilization. We…
Testing Theoretical Models of Magnetic Damping Using an Air Track
ERIC Educational Resources Information Center
Vidaurre, Ana; Riera, Jaime; Monsoriu, Juan A.; Gimenez, Marcos H.
2008-01-01
Magnetic braking is a long-established application of Lenz's law. A rigorous analysis of the laws governing this problem involves solving Maxwell's equations in a time-dependent situation. Approximate models have been developed to describe different experimental results related to this phenomenon. In this paper we present a new method for the…
Quality and Rigor of the Concept Mapping Methodology: A Pooled Study Analysis
ERIC Educational Resources Information Center
Rosas, Scott R.; Kane, Mary
2012-01-01
The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative…
Useful Material Efficiency Green Metrics Problem Set Exercises for Lecture and Laboratory
ERIC Educational Resources Information Center
Andraos, John
2015-01-01
A series of pedagogical problem set exercises are posed that illustrate the principles behind material efficiency green metrics and their application in developing a deeper understanding of reaction and synthesis plan analysis and strategies to optimize them. Rigorous, yet simple, mathematical proofs are given for some of the fundamental concepts,…
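Two commonly used material-efficiency green metrics of the kind such problem sets exercise, atom economy and E-factor, are easy to state in code; the sketch below uses a generic esterification (acetic acid + ethanol -> ethyl acetate + water) and placeholder waste masses as examples.

```python
# Minimal green-metrics calculations: atom economy and E-factor.
def atom_economy(product_mw, reactant_mws):
    """Molecular weight of the desired product over the sum of reactant MWs (%)."""
    return 100.0 * product_mw / sum(reactant_mws)

def e_factor(total_waste_mass_kg, product_mass_kg):
    """Mass of waste generated per mass of product."""
    return total_waste_mass_kg / product_mass_kg

# esterification example: acetic acid (60.05) + ethanol (46.07) -> ethyl acetate (88.11) + water
print(f"atom economy: {atom_economy(88.11, [60.05, 46.07]):.1f}%")
print(f"E-factor:     {e_factor(total_waste_mass_kg=4.5, product_mass_kg=1.0):.1f}")
```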
Sex Differences in the Response of Children with ADHD to Once-Daily Formulations of Methylphenidate
ERIC Educational Resources Information Center
Sonuga-Barke, J. S.; Coghill, David; Markowitz, John S.; Swanson, James M.; Vandenberghe, Mieke; Hatch, Simon J.
2007-01-01
Objectives: Studies of sex differences in methylphenidate response by children with attention-deficit/hyperactivity disorder have lacked methodological rigor and statistical power. This paper reports an examination of sex differences based on further analysis of data from a comparison of two once-daily methylphenidate formulations (the COMACS…
The Reliability of Informal Reading Inventories: What Has Changed?
ERIC Educational Resources Information Center
Nilsson, Nina L.
2013-01-01
Over time, criticisms related to the technical rigor of informal reading inventories (IRIs) have led many to question using these assessment instruments for high- or low-stakes purposes. In this article, I examine reliability evidence reported in 11 new and updated IRIs and make comparisons with Spector's earlier analysis that revealed fewer than…
Further Iterations on Using the Problem-Analysis Framework
ERIC Educational Resources Information Center
Annan, Michael; Chua, Jocelyn; Cole, Rachel; Kennedy, Emma; James, Robert; Markusdottir, Ingibjorg; Monsen, Jeremy; Robertson, Lucy; Shah, Sonia
2013-01-01
A core component of applied educational and child psychology practice is the skilfulness with which practitioners are able to rigorously structure and conceptualise complex real world human problems. This is done in such a way that when they (with others) jointly work on them, there is an increased likelihood of positive outcomes being achieved…
ERIC Educational Resources Information Center
Moss, Leah; Brown, Andy
2014-01-01
Recognition of Acquired Competencies (RAC) as it is known in Quebec, Canada, or Prior Learning Assessment (PLA), requires a learner to engage in retrospective thought about their learning path, their learning style and their experiential knowledge. This process of critical self-reflection and rigorous analysis by the learner of their prior…
ERIC Educational Resources Information Center
Nelson, Jason M.; Canivez, Gary L.; Lindstrom, Will; Hatt, Clifford V.
2007-01-01
The factor structure of the Reynolds Intellectual Assessment Scales (RIAS; [Reynolds, C.R., & Kamphaus, R.W. (2003). "Reynolds Intellectual Assessment Scales". Lutz, FL: Psychological Assessment Resources, Inc.]) was investigated with a large (N=1163) independent sample of referred students (ages 6-18). More rigorous factor extraction criteria…
Uncertainty analysis: an evaluation metric for synthesis science
Mark E. Harmon; Becky Fasth; Charles B. Halpern; James A. Lutz
2015-01-01
The methods for conducting reductionist ecological science are well known and widely used. In contrast, those used in the synthesis of ecological science (i.e., synthesis science) are still being developed, vary widely, and often lack the rigor of reductionist approaches. This is unfortunate because the synthesis of ecological parts into a greater whole is...
ERIC Educational Resources Information Center
Brunori, Maurizio
2012-01-01
Before the outbreak of World War II, Jeffries Wyman postulated that the "Bohr effect" in hemoglobin demanded the oxygen linked dissociation of the imidazole of two histidines of the polypeptide. This proposal emerged from a rigorous analysis of the acid-base titration curves of oxy- and deoxy-hemoglobin, at a time when the information on the…
Shakespeare and the Common Core: An Opportunity to Reboot
ERIC Educational Resources Information Center
Turchi, Laura; Thompson, Ayanna
2013-01-01
The Common Core generally eschews mandating texts in favor of promoting critical analysis and rigor. So it's significant that Shakespeare is the only author invoked in imperatives. His explicit inclusion offers a significant opportunity for educators to rethink how we approach Shakespearean instruction. Rather than the traditional learning of…
Drowning in Data but Thirsty for Analysis
ERIC Educational Resources Information Center
Roderick, Melissa
2012-01-01
This commentary frames the importance of the topic of this special issue by highlighting the changes that have occurred in school systems around data use, particularly in large urban districts, and the need for a more rigorous evidence base. Collectively, the articles in this volume provide a jumping-off point for such a research agenda around…
Science and Mathematics Advanced Placement Exams: Growth and Achievement over Time
ERIC Educational Resources Information Center
Judson, Eugene
2017-01-01
Rapid growth of Advanced Placement (AP) exams in the last 2 decades has been paralleled by national enthusiasm to promote availability and rigor of science, technology, engineering, and mathematics (STEM). Trends were examined in STEM AP to evaluate and compare growth and achievement. Analysis included individual STEM subjects and disaggregation…
An IRT Analysis of Preservice Teacher Self-Efficacy in Technology Integration
ERIC Educational Resources Information Center
Browne, Jeremy
2011-01-01
The need for rigorously developed measures of preservice teacher traits regarding technology integration training has been acknowledged (Kay 2006), but such instruments are still extremely rare. The Technology Integration Confidence Scale (TICS) represents one such measure, but past analyses of its functioning have been limited by sample size and…
Richard Haynes; Darius Adams; Peter Ince; John Mills; Ralph Alig
2006-01-01
The United States has a century of experience with the development of models that describe markets for forest products and trends in resource conditions. In the last four decades, increasing rigor in policy debates has stimulated the development of models to support policy analysis. Increasingly, research has evolved (often relying on computer-based models) to increase...
ERIC Educational Resources Information Center
Dombrowski, Stefan C.; Watkins, Marley W.; Brogan, Michael J.
2009-01-01
This study investigated the factor structure of the Reynolds Intellectual Assessment Scales (RIAS) using rigorous exploratory factor analytic and factor extraction procedures. The results of this study indicate that the RIAS is a single factor test. Despite these results, higher order factor analysis using the Schmid-Leiman procedure indicates…
Integrated Energy Solutions Research | Integrated Energy Solutions | NREL
Decision Science and Informatics: Enabling decision makers with rigorous, technology-neutral, data-backed decision support to maximize the impact of security in energy systems through analysis, decision support, advanced energy technology development, and...
A Study of Best Practices in Training Transfer and Proposed Model of Transfer
ERIC Educational Resources Information Center
Burke, Lisa A.; Hutchins, Holly M.
2008-01-01
Data were gathered from a sample of training professionals of an American Society of Training and Development (ASTD) chapter in the southern United States regarding best practices for supporting training transfer. Content analysis techniques, based on a rigorous methodology proposed by Insch, Moore, & Murphy (1997), were used to analyze the…
ERIC Educational Resources Information Center
Follette, William C.; Bonow, Jordan T.
2009-01-01
Whether explicitly acknowledged or not, behavior-analytic principles are at the heart of most, if not all, empirically supported therapies. However, the change process in psychotherapy is only now being rigorously studied. Functional analytic psychotherapy (FAP; Kohlenberg & Tsai, 1991; Tsai et al., 2009) explicitly identifies behavioral-change…
Teaching the Concept of Breakdown Point in Simple Linear Regression.
ERIC Educational Resources Information Center
Chan, Wai-Sum
2001-01-01
Most introductory textbooks on simple linear regression analysis mention the fact that extreme data points have a great influence on ordinary least-squares regression estimation; however, not many textbooks provide a rigorous mathematical explanation of this phenomenon. Suggests a way to fill this gap by teaching students the concept of breakdown…
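The practical import of a zero breakdown point is easy to demonstrate numerically: moving a single observation arbitrarily far drags the ordinary least-squares fit with it. The sketch below uses synthetic data and is only a classroom-style illustration of the phenomenon discussed in the abstract, not material from the article itself.

    # Illustrative sketch (not from the article): one corrupted point can move
    # the OLS fit arbitrarily far, i.e., OLS has a breakdown point of 0%.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 20)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.5, x.size)   # clean linear data

    def ols_fit(x, y):
        """Ordinary least-squares fit y = a + b*x; returns (intercept, slope)."""
        b, a = np.polyfit(x, y, 1)
        return a, b

    print("clean fit        :", ols_fit(x, y))

    # Corrupt a single observation and watch the fitted line break down.
    for outlier in (50, 500, 5000):
        y_bad = y.copy()
        y_bad[-1] = outlier
        print(f"one point at {outlier:>5}:", ols_fit(x, y_bad))

Robust alternatives such as least-median-of-squares or repeated-median estimators have breakdown points near 50%, which is the contrast the breakdown-point concept is meant to make vivid.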
Fiber facet gratings for high power fiber lasers
NASA Astrophysics Data System (ADS)
Vanek, Martin; Vanis, Jan; Baravets, Yauhen; Todorov, Filip; Ctyroky, Jiri; Honzatko, Pavel
2017-12-01
We numerically investigated the properties of diffraction gratings designed for fabrication on the facet of an optical fiber. The gratings are intended to be used in high-power fiber lasers as mirrors with either low or high reflectivity. The modal reflectance of the low-reflectivity polarizing grating is close to 3% for the TE mode, while it is significantly suppressed for the TM mode. Such a grating can be fabricated on the laser output fiber facet. The polarizing grating with high modal reflectance is designed as a leaky-mode resonant diffraction grating. This grating can be etched in a thin layer of high-index dielectric sputtered on the fiber facet; we used the refractive index of Ta2O5 for such a layer. We found that the modal reflectance can be close to 0.95 for TE polarization and that the polarization extinction ratio reaches 18 dB. Rigorous coupled wave analysis was used for fast optimization of the grating parameters, while aperiodic rigorous coupled wave analysis, the Fourier modal method and the finite-difference time-domain method were compared and used to compute the modal reflectance of the designed gratings.
Uncertainty Analysis of Instrument Calibration and Application
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated; often, calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
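For readers unfamiliar with propagating individual measurement uncertainties through a defining functional expression, the following first-order (Taylor-series) sketch illustrates the idea for an arbitrary function q = x·y/z. It is a generic textbook calculation with made-up numbers, not the instrument-specific analysis developed in the paper.

    # Generic first-order propagation of measurement uncertainty for an
    # illustrative function q = x*y/z (not the paper's instrument model).
    import numpy as np

    def propagate(grad, u, cov=None):
        """Combined standard uncertainty from sensitivities and input uncertainties.

        grad : partial derivatives dq/dx_i evaluated at the measured values
        u    : standard uncertainties u(x_i), ignored if cov is given
        cov  : optional full covariance matrix of the inputs
        """
        grad = np.asarray(grad, dtype=float)
        if cov is None:
            cov = np.diag(np.asarray(u, dtype=float) ** 2)
        return float(np.sqrt(grad @ cov @ grad))

    x, y, z = 10.0, 3.0, 2.0             # measured values (illustrative)
    u_x, u_y, u_z = 0.1, 0.05, 0.02      # standard uncertainties
    grad = [y / z, x / z, -x * y / z**2] # partial derivatives of q = x*y/z

    u_q = propagate(grad, [u_x, u_y, u_z])
    # ~95 % coverage interval assuming an approximately normal distribution
    print("q =", x * y / z, "+/-", round(1.96 * u_q, 3))

Correlated inputs are handled by supplying the full covariance matrix rather than the diagonal of variances, which is the situation the paper treats rigorously.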
Developing a Student Conception of Academic Rigor
ERIC Educational Resources Information Center
Draeger, John; del Prado Hill, Pixita; Mahler, Ronnie
2015-01-01
In this article we describe models of academic rigor from the student point of view. Drawing on a campus-wide survey, focus groups, and interviews with students, we found that students explained academic rigor in terms of workload, grading standards, level of difficulty, level of interest, and perceived relevance to future goals. These findings…
Trends in Methodological Rigor in Intervention Research Published in School Psychology Journals
ERIC Educational Resources Information Center
Burns, Matthew K.; Klingbeil, David A.; Ysseldyke, James E.; Petersen-Brown, Shawna
2012-01-01
Methodological rigor in intervention research is important for documenting evidence-based practices and has been a recent focus in legislation, including the No Child Left Behind Act. The current study examined the methodological rigor of intervention research in four school psychology journals since the 1960s. Intervention research has increased…
"Rigor for What?" Social Studies Teacher Conceptions and Enactments of Instructional Rigor
ERIC Educational Resources Information Center
Gibbs, Brian
2017-01-01
Taken from a larger qualitative study, this article argues that rather than an encompassing uniform definition, rigor, as understood and enacted by social studies teachers, exists on a complicated spectrum. Teacher placement on this spectrum was influenced by teacher life experience, teacher interpretation of student need, pedagogy employed, how…
Another View: In Defense of Vigor over Rigor in Classroom Demonstrations
ERIC Educational Resources Information Center
Dunn, Dana S.
2008-01-01
Scholarship of teaching and learning (SoTL) demands greater empirical rigor on the part of authors and the editorial process than ever before. Although admirable and important, I worry that this increasing rigor will limit opportunities and outlets for a form of pedagogical vigor--the publication of simple, experiential, but empirically…
Characteristics of School Districts That Participate in Rigorous National Educational Evaluations
ERIC Educational Resources Information Center
Stuart, Elizabeth A.; Bell, Stephen H.; Ebnesajjad, Cyrus; Olsen, Robert B.; Orr, Larry L.
2017-01-01
Given increasing interest in evidence-based policy, there is growing attention to how well the results from rigorous program evaluations may inform policy decisions. However, little attention has been paid to documenting the characteristics of schools or districts that participate in rigorous educational evaluations, and how they compare to…
Moving beyond Data Transcription: Rigor as Issue in Representation of Digital Literacies
ERIC Educational Resources Information Center
Hagood, Margaret Carmody; Skinner, Emily Neil
2015-01-01
Rigor in qualitative research has been based upon criteria of credibility, dependability, confirmability, and transferability. Drawing upon articles published during our editorship of the "Journal of Adolescent & Adult Literacy," we illustrate how the use of digital data in research study reporting may enhance these areas of rigor,…
NASA Astrophysics Data System (ADS)
Bahl, Mayank; Zhou, Gui-Rong; Heller, Evan; Cassarly, William; Jiang, Mingming; Scarmozzino, Rob; Gregory, G. Groot
2014-09-01
Over the last two decades there has been extensive research done to improve the design of Organic Light Emitting Diodes (OLEDs) so as to enhance light extraction efficiency, improve beam shaping, and allow color tuning through techniques such as the use of patterned substrates, photonic crystal (PCs) gratings, back reflectors, surface texture, and phosphor down-conversion. Computational simulation has been an important tool for examining these increasingly complex designs. It has provided insights for improving OLED performance as a result of its ability to explore limitations, predict solutions, and demonstrate theoretical results. Depending upon the focus of the design and scale of the problem, simulations are carried out using rigorous electromagnetic (EM) wave optics based techniques, such as finite-difference time-domain (FDTD) and rigorous coupled wave analysis (RCWA), or through ray optics based technique such as Monte Carlo ray-tracing. The former are typically used for modeling nanostructures on the OLED die, and the latter for modeling encapsulating structures, die placement, back-reflection, and phosphor down-conversion. This paper presents the use of a mixed-level simulation approach which unifies the use of EM wave-level and ray-level tools. This approach uses rigorous EM wave based tools to characterize the nanostructured die and generate both a Bidirectional Scattering Distribution function (BSDF) and a far-field angular intensity distribution. These characteristics are then incorporated into the ray-tracing simulator to obtain the overall performance. Such mixed-level approach allows for comprehensive modeling of the optical characteristic of OLEDs and can potentially lead to more accurate performance than that from individual modeling tools alone.
Dhar, Anjan; Close, Helen; Viswanath, Yirupaiahgari K; Rees, Colin J; Hancock, Helen C; Dwarakanath, A Deepak; Maier, Rebecca H; Wilson, Douglas; Mason, James M
2014-12-28
To undertake a randomised pilot study comparing biodegradable stents and endoscopic dilatation in patients with strictures. This British multi-site study recruited seventeen symptomatic adult patients with refractory strictures. Patients were randomised using a multicentre, blinded assessor design, comparing a biodegradable stent (BS) with endoscopic dilatation (ED). The primary endpoint was the average dysphagia score during the first 6 mo. Secondary endpoints included repeat endoscopic procedures, quality of life, and adverse events. Secondary analysis included follow-up to 12 mo. Sensitivity analyses explored alternative estimation methods for dysphagia and multiple imputation of missing values. Nonparametric tests were used. Although both groups improved, the average dysphagia scores for patients receiving stents were higher after 6 mo: BS-ED 1.17 (95%CI: 0.63-1.78) P = 0.029. The finding was robust under different estimation methods. Use of additional endoscopic procedures and quality of life (QALY) estimates were similar for BS and ED patients at 6 and 12 mo. Concomitant use of gastrointestinal prescribed medication was greater in the stent group (BS 5.1, ED 2.0 prescriptions; P < 0.001), as were related adverse events (BS 1.4, ED 0.0 events; P = 0.024). Groups were comparable at baseline and findings were statistically significant but numbers were small due to under-recruitment. The oesophageal tract has somatic sensitivity and the process of the stent dissolving, possibly unevenly, might promote discomfort or reflux. Stenting was associated with greater dysphagia, co-medication and adverse events. Rigorously conducted and adequately powered trials are needed before widespread adoption of this technology.
Yokota, Satoshi; Oshio, Shigeru
2018-04-01
Vitamin A is a vital nutritional substance that regulates biological activities including development, but is also associated with disease onset. The extent of vitamin A intake influences the retinoid content in the liver, the most important organ for the storage of vitamin A. Measurement of endogenous retinoids in biological samples is important to understand retinoid homeostasis. Here we present a reliable, highly sensitive, and robust method for the quantification of retinol and retinyl palmitate using a reverse-phase HPLC/UV isocratic method. We determined the impact of chronic dietary vitamin A on retinoid levels in livers of mice fed an AIN-93G semi-purified diet (4 IU/g) compared with an excess vitamin A diet (1000 IU/g) over a period from birth to 10 weeks of age. Coefficients of variation for intra-assays for both retinoids were less than 5%, suggesting a higher reproducibility than any other HPLC/UV gradient method. Limits of detection and quantification for retinol were 0.08 pmol and 0.27 pmol, respectively, which are remarkably higher than previous results. Supplementation with higher doses of vitamin A over the study period significantly increased liver retinol and retinyl palmitate concentrations in adult mice. The assays described here provide a sensitive and rigorous quantification of endogenous retinol and retinyl palmitate, which can be used to help determine retinoid homeostasis in disease states, such as toxic hepatitis and liver cancer. Copyright © 2017. Published by Elsevier B.V.
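As a generic illustration of how intra-assay precision and detection/quantification limits of this kind are computed, the sketch below applies the coefficient-of-variation calculation and the common 3.3·σ/slope and 10·σ/slope conventions to made-up calibration data; the paper's own validation procedure may differ.

    # Illustrative precision and LOD/LOQ calculation with synthetic numbers;
    # not the paper's data or necessarily its exact validation convention.
    import numpy as np

    replicates = np.array([4.02, 4.10, 3.95, 4.07, 3.99])   # retinol, pmol (synthetic)
    cv_percent = 100 * replicates.std(ddof=1) / replicates.mean()

    conc   = np.array([0.1, 0.25, 0.5, 1.0, 2.0])           # calibration standards, pmol
    signal = np.array([820, 2045, 4110, 8150, 16320])       # peak areas (synthetic)
    slope, intercept = np.polyfit(conc, signal, 1)
    sigma = np.std(signal - (slope * conc + intercept), ddof=2)  # residual SD of the fit

    print(f"intra-assay CV : {cv_percent:.1f} %")
    print(f"LOD ~ {3.3 * sigma / slope:.3f} pmol,  LOQ ~ {10 * sigma / slope:.3f} pmol")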
Ai, Xinghao; Sun, Yingjia; Wang, Haidong; Lu, Shun
2014-07-01
Human epidermal growth factor receptor (EGFR) has become a well-established target for the treatment of patients with non-small cell lung cancer (NSCLC). However, a large number of somatic mutations in this protein have been observed to cause drug resistance or sensitivity during pathological progression, limiting the application of reversible EGFR tyrosine kinase inhibitor therapy in NSCLC. In the current work, we describe an integration of in silico analysis and in vitro assay to profile six representative EGFR inhibitors against a panel of 71 observed somatic mutations in the EGFR tyrosine kinase domain. In this procedure, the changes in interaction free energy of inhibitors with EGFR upon various mutations were calculated one by one using a rigorous computational scheme, which was preoptimized on a set of structure-solved, affinity-known samples to improve its performance in characterizing the EGFR-inhibitor system. This method was then demonstrated to be effective in inferring drug response to the classical L858R and G719S mutations that confer constitutive activation of the EGFR kinase. It is found that Staurosporine, a natural product isolated from the bacterium Streptomyces staurosporeus, exhibits selective inhibitory activity against the T790M and T790M/L858R mutants. This finding was subsequently solidified by an in vitro kinase assay; the inhibitory IC50 values of Staurosporine against wild-type, T790M and T790M/L858R mutant EGFR were measured to be 937, 12 and 3 nM, respectively.
Pictogram Evaluation and Authoring Collaboration Environment
Kim, Hyeoneui; Tamayo, Dorothy; Muhkin, Michael; Kim, Jaemin; Lam, Julius; Ohno-Machado, Lucila; Aronoff-Spencer, Eliah
2012-01-01
Studies have shown benefits of using pictograms in health communication, such as improved recall and comprehension of health instructions. Pictograms are culturally sensitive and thus need to be rigorously validated to ensure they convey the intended meaning correctly to the targeted population. The infeasibility of manually creating pictograms and the lack of robust means to store and validate pictograms are potential barriers to the wider adoption of pictograms in health communication. To address these challenges, we created an open access web-based tool, PEACE (Pictogram Evaluation and Authoring Collaboration Environment), as part of the SHINE (Sustainable Health Informatics and Networking Environment) initiatives. We report the development process and the preliminary evaluation results of PEACE in this paper. PMID:24199088
A Study in HRT Resolution: Seeking Maximum Sensitivity Among Variations in Sensing Element Material
NASA Technical Reports Server (NTRS)
Morales, Jeremy M.
2005-01-01
The EXACT (Experiments Along Coexistence near Tricriticality) project endeavors to perform the most rigorous test to date of Renormalization Group theory. In most cases, the theory gives only approximate solutions, but it offers exact predictions in the case of the He-3-He-4 tricritical point. Currently, the project is focused on maximizing the performance of the low-temperature system's HRT (high resolution thermometer) near the tricritical point. The HRT uses a PdMn sensing element, the qualities of which change based on its Mn concentration and whether or not it is annealed. All sensing element combinations will be catalogued, and through the data, the optimum configuration will be reported.
Papadimitropoulos, Adam; Rovithakis, George A; Parisini, Thomas
2007-07-01
In this paper, the problem of fault detection in mechanical systems performing linear motion under the action of friction phenomena is addressed. The friction effects are modeled through the dynamic LuGre model. The proposed architecture is built upon an online neural network (NN) approximator, which requires only the system's position and velocity. The friction internal state is not assumed to be available for measurement. The neural fault detection methodology is analyzed with respect to its robustness and sensitivity properties. Rigorous fault detectability conditions and upper bounds for the detection time are also derived. Extensive simulation results showing the effectiveness of the proposed methodology are provided, including a real case study on an industrial actuator.
Study Design Rigor in Animal-Experimental Research Published in Anesthesia Journals.
Hoerauf, Janine M; Moss, Angela F; Fernandez-Bustamante, Ana; Bartels, Karsten
2018-01-01
Lack of reproducibility of preclinical studies has been identified as an impediment for translation of basic mechanistic research into effective clinical therapies. Indeed, the National Institutes of Health has revised its grant application process to require more rigorous study design, including sample size calculations, blinding procedures, and randomization steps. We hypothesized that the reporting of such metrics of study design rigor has increased over time for animal-experimental research published in anesthesia journals. PubMed was searched for animal-experimental studies published in 2005, 2010, and 2015 in primarily English-language anesthesia journals. A total of 1466 publications were graded on the performance of sample size estimation, randomization, and blinding. Cochran-Armitage test was used to assess linear trends over time for the primary outcome of whether or not a metric was reported. Interrater agreement for each of the 3 metrics (power, randomization, and blinding) was assessed using the weighted κ coefficient in a 10% random sample of articles rerated by a second investigator blinded to the ratings of the first investigator. A total of 1466 manuscripts were analyzed. Reporting for all 3 metrics of experimental design rigor increased over time (2005 to 2010 to 2015): for power analysis, from 5% (27/516), to 12% (59/485), to 17% (77/465); for randomization, from 41% (213/516), to 50% (243/485), to 54% (253/465); and for blinding, from 26% (135/516), to 38% (186/485), to 47% (217/465). The weighted κ coefficients and 98.3% confidence interval indicate almost perfect agreement between the 2 raters beyond that which occurs by chance alone (power, 0.93 [0.85, 1.0], randomization, 0.91 [0.85, 0.98], and blinding, 0.90 [0.84, 0.96]). Our hypothesis that reported metrics of rigor in animal-experimental studies in anesthesia journals have increased during the past decade was confirmed. More consistent reporting, or explicit justification for absence, of sample size calculations, blinding techniques, and randomization procedures could better enable readers to evaluate potential sources of bias in animal-experimental research manuscripts. Future studies should assess whether such steps lead to improved translation of animal-experimental anesthesia research into successful clinical trials.
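The trend analysis reported above can be reproduced in outline with a plain Cochran-Armitage test. The sketch below implements the textbook form of the test and applies it to the power-analysis counts quoted in the abstract (27/516, 59/485, 77/465); the authors' actual software and any additional adjustments are not stated, so this is illustrative only.

    # Textbook Cochran-Armitage test for a linear trend in proportions across
    # ordered groups (here, publication years); generic implementation, not the
    # authors' code.
    import math

    def cochran_armitage(successes, totals, scores=None):
        if scores is None:
            scores = list(range(len(totals)))
        N = sum(totals)
        p = sum(successes) / N                      # pooled proportion
        t = sum(s * (r - n * p)
                for s, r, n in zip(scores, successes, totals))
        var = p * (1 - p) * (sum(n * s * s for s, n in zip(scores, totals))
                             - sum(n * s for s, n in zip(scores, totals)) ** 2 / N)
        z = t / math.sqrt(var)
        p_two_sided = math.erfc(abs(z) / math.sqrt(2))
        return z, p_two_sided

    # Reported power-analysis counts for 2005, 2010, 2015
    z, p = cochran_armitage(successes=[27, 59, 77], totals=[516, 485, 465])
    print(f"z = {z:.2f}, two-sided p = {p:.2g}")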
The conformation of myosin head domains in rigor muscle determined by X-ray interference.
Reconditi, M; Koubassova, N; Linari, M; Dobbie, I; Narayanan, T; Diat, O; Piazzesi, G; Lombardi, V; Irving, M
2003-08-01
In the absence of adenosine triphosphate, the head domains of myosin cross-bridges in muscle bind to actin filaments in a rigor conformation that is expected to mimic that following the working stroke during active contraction. We used x-ray interference between the two head arrays in opposite halves of each myosin filament to determine the rigor head conformation in single fibers from frog skeletal muscle. During isometric contraction (force T(0)), the interference effect splits the M3 x-ray reflection from the axial repeat of the heads into two peaks with relative intensity (higher angle/lower angle peak) 0.76. In demembranated fibers in rigor at low force (<0.05 T(0)), the relative intensity was 4.0, showing that the center of mass of the heads had moved 4.5 nm closer to the midpoint of the myosin filament. When rigor fibers were stretched, increasing the force to 0.55 T(0), the heads' center of mass moved back by 1.1-1.6 nm. These motions can be explained by tilting of the light chain domain of the head so that the mean angle between the Cys(707)-Lys(843) vector and the filament axis increases by approximately 36 degrees between isometric contraction and low-force rigor, and decreases by 7-10 degrees when the rigor fiber is stretched to 0.55 T(0).
Rigor in Your School: A Toolkit for Leaders
ERIC Educational Resources Information Center
Williamson, Ronald; Blackburn, Barbara R.
2011-01-01
Raise the level of rigor in your school and dramatically improve student learning with the tools in this book. Each illuminating exercise is tailored to educators looking to spread the word on rigor and beat the obstacles to achieving it schoolwide. Formatted for duplication and repeated use, these tools are perfect for those who currently hold a…
34 CFR 691.16 - Rigorous secondary school program of study.
Code of Federal Regulations, 2011 CFR
2011-07-01
...: biology, chemistry, and physics. (iv) Three years of social studies. (v) One year of a language other than... § 691.16 Rigorous secondary school program of study. (a)(1) For each award year commencing with...
34 CFR 691.16 - Rigorous secondary school program of study.
Code of Federal Regulations, 2013 CFR
2013-07-01
...: biology, chemistry, and physics. (iv) Three years of social studies. (v) One year of a language other than... § 691.16 Rigorous secondary school program of study. (a)(1) For each award year commencing with...
34 CFR 691.16 - Rigorous secondary school program of study.
Code of Federal Regulations, 2014 CFR
2014-07-01
...: biology, chemistry, and physics. (iv) Three years of social studies. (v) One year of a language other than... § 691.16 Rigorous secondary school program of study. (a)(1) For each award year commencing with...
34 CFR 691.16 - Rigorous secondary school program of study.
Code of Federal Regulations, 2012 CFR
2012-07-01
...: biology, chemistry, and physics. (iv) Three years of social studies. (v) One year of a language other than... § 691.16 Rigorous secondary school program of study. (a)(1) For each award year commencing with...
ERIC Educational Resources Information Center
Colley, Carolyn; Windschitl, Mark
2016-01-01
Teaching that is responsive to students' ideas can create opportunities for rigorous sense-making talk by young learners. Yet we have few accounts of how thoughtful attempts at responsive teaching unfold across units of instruction in elementary science classrooms and have only begun to understand how responsiveness encourages rigor in…
The Relationship between Project-Based Learning and Rigor in STEM-Focused High Schools
ERIC Educational Resources Information Center
Edmunds, Julie; Arshavsky, Nina; Glennie, Elizabeth; Charles, Karen; Rice, Olivia
2016-01-01
Project-based learning (PjBL) is an approach often favored in STEM classrooms, yet some studies have shown that teachers struggle to implement it with academic rigor. This paper explores the relationship between PjBL and rigor in the classrooms of ten STEM-oriented high schools. Utilizing three different data sources reflecting three different…
Hollow-cylinder waveguide isolators for use at millimeter wavelengths
NASA Technical Reports Server (NTRS)
Kanda, M.; May, W. G.
1974-01-01
The device considered in this study is a semiconductor waveguide isolator consisting of a hollow column of a semiconductor mounted coaxially in a circular waveguide in a longitudinal dc magnetic field. An elementary and physical analysis based on the excitation of plane waves in the guide and a more rigorous mode-matching analysis (MMA) are presented. These theoretical predictions are compared with experimental results for an InSb isolator at 94 GHz and 75 K.
The Researchers' View of Scientific Rigor-Survey on the Conduct and Reporting of In Vivo Research.
Reichlin, Thomas S; Vogt, Lucile; Würbel, Hanno
2016-01-01
Reproducibility in animal research is alarmingly low, and a lack of scientific rigor has been proposed as a major cause. Systematic reviews found low reporting rates of measures against risks of bias (e.g., randomization, blinding), and a correlation between low reporting rates and overstated treatment effects. Reporting rates of measures against bias are thus used as a proxy measure for scientific rigor, and reporting guidelines (e.g., ARRIVE) have become a major weapon in the fight against risks of bias in animal research. Surprisingly, animal scientists have never been asked about their use of measures against risks of bias and how they report these in publications. Whether poor reporting reflects poor use of such measures, and whether reporting guidelines may effectively reduce risks of bias has therefore remained elusive. To address these questions, we asked in vivo researchers about their use and reporting of measures against risks of bias and examined how self-reports relate to reporting rates obtained through systematic reviews. An online survey was sent out to all registered in vivo researchers in Switzerland (N = 1891) and was complemented by personal interviews with five representative in vivo researchers to facilitate interpretation of the survey results. Return rate was 28% (N = 530), of which 302 participants (16%) returned fully completed questionnaires that were used for further analysis. According to the researchers' self-report, they use measures against risks of bias to a much greater extent than suggested by reporting rates obtained through systematic reviews. However, the researchers' self-reports are likely biased to some extent. Thus, although they claimed to be reporting measures against risks of bias at much lower rates than they claimed to be using these measures, the self-reported reporting rates were considerably higher than reporting rates found by systematic reviews. Furthermore, participants performed rather poorly when asked to choose effective over ineffective measures against six different biases. Our results further indicate that knowledge of the ARRIVE guidelines had a positive effect on scientific rigor. However, the ARRIVE guidelines were known by less than half of the participants (43.7%); and among those whose latest paper was published in a journal that had endorsed the ARRIVE guidelines, more than half (51%) had never heard of these guidelines. Our results suggest that whereas reporting rates may underestimate the true use of measures against risks of bias, self-reports may overestimate it. To a large extent, this discrepancy can be explained by the researchers' ignorance and lack of knowledge of risks of bias and measures to prevent them. Our analysis thus adds significant new evidence to the assessment of research integrity in animal research. Our findings further question the confidence that the authorities have in scientific rigor, which is taken for granted in the harm-benefit analyses on which approval of animal experiments is based. Furthermore, they suggest that better education on scientific integrity and good research practice is needed. However, they also question reliance on reporting rates as indicators of scientific rigor and highlight a need for more reliable predictors.
First Monte Carlo analysis of fragmentation functions from single-inclusive e + e - annihilation
Sato, Nobuo; Ethier, J. J.; Melnitchouk, W.; ...
2016-12-02
Here, we perform the first iterative Monte Carlo (IMC) analysis of fragmentation functions constrained by all available data from single-inclusive $e^+ e^-$ annihilation into pions and kaons. The IMC method eliminates the potential bias that traditional analyses based on single fits introduce by fixing parameters not well constrained by the data, and provides a statistically rigorous determination of uncertainties. Our analysis reveals specific features of the fragmentation functions obtained using the new IMC methodology compared with those from previous analyses, especially for light quarks and for strange quark fragmentation to kaons.
Preliminary assessment of aerial photography techniques for canvasback population analysis
Munro, R.E.; Trauger, D.L.
1976-01-01
Recent intensive research on the canvasback has focused attention on the need for more precise estimates of population parameters. During the 1972-75 period, various types of aerial photographing equipment were evaluated to determine the problems and potentials for employing these techniques in appraisals of canvasback populations. The equipment and procedures available for automated analysis of aerial photographic imagery were also investigated. Serious technical problems remain to be resolved, but some promising results were obtained. Final conclusions about the feasibility of operational implementation await a more rigorous analysis of the data collected.
Herbal medicine development: a plea for a rigorous scientific foundation.
Lietman, Paul S
2012-09-01
Science, including rigorous basic scientific research and rigorous clinical research, must underlie both the development and the clinical use of herbal medicines. Yet almost none of the hundreds or thousands of articles published each year on some aspect of herbal medicines adheres to 3 simple but profound scientific principles that must underlie all herbal drug development and clinical use. Three fundamental principles should underlie everyone's thinking about the development and/or clinical use of any herbal medicine. (1) There must be standardization and regulation (rigorously enforced) of the product being studied or being used clinically. (2) There must be scientific proof of a beneficial clinical effect for something of value to the patient, established by rigorous clinical research. (3) There must be scientific proof of safety (acceptable toxicity) for the patient, established by rigorous clinical research. These fundamental principles of science have ramifications for both the scientist and the clinician. It is critically important that both the investigator and the prescriber know exactly what is in the studied or recommended product and how effective and toxic it is. We will find new and useful drugs from natural sources. However, we will have to learn how to study herbal medicines rigorously, and we will have to try to convince the believers in herbal medicines of the wisdom and even the necessity of a rigorous scientific approach to herbal medicine development. Both biomedical science and practicing physicians must enthusiastically accept the responsibility for searching for truth in the discovery and development of new herbal medicines, in the truthful teaching about herbal medicines from a scientific perspective, and in the scientifically proven clinical use of herbal medicines.
Tenderness of pre- and post rigor lamb longissimus muscle.
Geesink, Geert; Sujang, Sadi; Koohmaraie, Mohammad
2011-08-01
Lamb longissimus muscle (n=6) sections were cooked at different times post mortem (prerigor, at rigor, 1 day p.m., and 7 days p.m.) using two cooking methods. Using a boiling water bath, samples were either cooked to a core temperature of 70 °C or boiled for 3 h. The latter method was meant to reflect the traditional cooking method employed in countries where preparation of prerigor meat is practiced. The time postmortem at which the meat was prepared had a large effect on the tenderness (shear force) of the meat (P<0.01). Cooking prerigor and at rigor meat to 70 °C resulted in higher shear force values than their post rigor counterparts at 1 and 7 days p.m. (9.4 and 9.6 vs. 7.2 and 3.7 kg, respectively). The differences in tenderness between the treatment groups could be largely explained by a difference in contraction status of the meat after cooking and the effect of ageing on tenderness. Cooking pre and at rigor meat resulted in severe muscle contraction as evidenced by the differences in sarcomere length of the cooked samples. Mean sarcomere lengths in the pre and at rigor samples ranged from 1.05 to 1.20 μm. The mean sarcomere length in the post rigor samples was 1.44 μm. Cooking for 3 h at 100 °C did improve the tenderness of pre and at rigor prepared meat as compared to cooking to 70 °C, but not to the extent that ageing did. It is concluded that additional intervention methods are needed to improve the tenderness of prerigor cooked meat. Copyright © 2011 Elsevier B.V. All rights reserved.
A modal analysis of lamellar diffraction gratings in conical mountings
NASA Technical Reports Server (NTRS)
Li, Lifeng
1992-01-01
A rigorous modal analysis of lamellar grating, i.e., gratings having rectangular grooves, in conical mountings is presented. It is an extension of the analysis of Botten et al. which considered non-conical mountings. A key step in the extension is a decomposition of the electromagnetic field in the grating region into two orthogonal components. A computer program implementing this extended modal analysis is capable of dealing with plane wave diffraction by dielectric and metallic gratings with deep grooves, at arbitrary angles of incidence, and having arbitrary incident polarizations. Some numerical examples are included.
Rotella, J.J.; Link, W.A.; Chambert, T.; Stauffer, G.E.; Garrott, R.A.
2012-01-01
1. Life-history theory predicts that those vital rates that make larger contributions to population growth rate ought to be more strongly buffered against environmental variability than are those that are less important. Despite the importance of the theory for predicting demographic responses to changes in the environment, it is not yet known how pervasive demographic buffering is in animal populations because the validity of most existing studies has been called into question because of methodological deficiencies. 2. We tested for demographic buffering in the southern-most breeding mammal population in the world using data collected from 5558 known-age female Weddell seals over 30 years. We first estimated all vital rates simultaneously with mark-recapture analysis and then estimated process variance and covariance in those rates using a hierarchical Bayesian approach. We next calculated the population growth rate's sensitivity to changes in each of the vital rates and tested for evidence of demographic buffering by comparing properly scaled values of sensitivity and process variance in vital rates. 3. We found evidence of positive process covariance between vital rates, which indicates that all vital rates are affected in the same direction by changes in annual environment. Despite the positive correlations, we found strong evidence that demographic buffering occurred through reductions in variation in the vital rates to which population growth rate was most sensitive. Process variation in vital rates was inversely related to sensitivity measures such that variation was greatest in breeding probabilities, intermediate for survival rates of young animals and lowest for survival rates of older animals. 4. Our work contributes to a small but growing set of studies that have used rigorous methods on long-term, detailed data to investigate demographic responses to environmental variation. The information from these studies improves our understanding of life-history evolution in stochastic environments and provides useful information for predicting population responses to future environmental change. Our results for an Antarctic apex predator also provide useful baselines from a marine ecosystem when its top- and middle-trophic levels were not substantially impacted by human activity. © 2011 The Authors. Journal of Animal Ecology © 2011 British Ecological Society.
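For readers unfamiliar with how a population growth rate's sensitivity to individual vital rates is obtained, the sketch below shows the standard eigenvector-based sensitivity and elasticity calculation for a small, made-up stage-structured projection matrix. It is not the Weddell seal model, which was fitted with mark-recapture and hierarchical Bayesian methods.

    # Generic sketch: sensitivity and elasticity of the population growth rate
    # (dominant eigenvalue) to the entries of a projection matrix.  The 3x3
    # matrix is invented for illustration only.
    import numpy as np

    A = np.array([[0.00, 0.30, 0.80],    # fecundities
                  [0.70, 0.00, 0.00],    # juvenile survival
                  [0.00, 0.85, 0.92]])   # subadult/adult survival

    eigvals, W = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    lam = eigvals.real[k]
    w = np.abs(W[:, k].real)                 # stable stage distribution

    eigvals_l, V = np.linalg.eig(A.T)        # left eigenvectors of A
    kl = int(np.argmax(eigvals_l.real))
    v = np.abs(V[:, kl].real)                # reproductive values

    S = np.outer(v, w) / (v @ w)             # sensitivities d(lambda)/d(a_ij)
    E = A * S / lam                          # elasticities (proportional sensitivities)

    print("lambda =", round(lam, 3))
    print("elasticities:\n", np.round(E, 3))

Under the buffering hypothesis, the rates with the largest scaled sensitivities are expected to show the smallest process variance, which is the comparison the study makes with its Bayesian variance estimates.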
Single toxin dose-response models revisited
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demidenko, Eugene, E-mail: eugened@dartmouth.edu
The goal of this paper is to offer a rigorous analysis of the sigmoid shape single toxin dose-response relationship. The toxin efficacy function is introduced and four special points, including maximum toxin efficacy and inflection points, on the dose-response curve are defined. The special points define three phases of the toxin effect on mortality: (1) toxin concentrations smaller than the first inflection point or (2) larger than the second inflection point imply a low mortality rate, and (3) concentrations between the first and the second inflection points imply a high mortality rate. Probabilistic interpretation and mathematical analysis for each of the four models, Hill, logit, probit, and Weibull, is provided. Two general model extensions are introduced: (1) the multi-target hit model that accounts for the existence of several vital receptors affected by the toxin, and (2) a model with a nonzero mortality at zero concentration to account for natural mortality. Special attention is given to statistical estimation in the framework of the generalized linear model with the binomial dependent variable as the mortality count in each experiment, contrary to the widespread nonlinear regression treating the mortality rate as a continuous variable. The models are illustrated using standard EPA Daphnia acute (48 h) toxicity tests with mortality as a function of NiCl or CuSO4 toxin. - Highlights: • The paper offers a rigorous study of a sigmoid dose-response relationship. • The concentration with the highest mortality rate is rigorously defined. • A table with four special points for five mortality curves is presented. • Two new sigmoid dose-response models have been introduced. • The generalized linear model is advocated for estimation of the sigmoid dose-response relationship.
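A minimal sketch of the advocated estimation approach, a binomial GLM on mortality counts with logit and probit links, is given below using synthetic dose-mortality data (not the EPA Daphnia tests); the Hill and Weibull variants and the paper's model extensions are omitted.

    # Minimal sketch: dose-response fit as a binomial GLM on mortality counts,
    # as advocated in the abstract, rather than nonlinear regression on rates.
    # Data are synthetic.
    import numpy as np
    import statsmodels.api as sm

    conc   = np.array([0.5, 1, 2, 4, 8, 16, 32], dtype=float)  # toxin concentration
    n      = np.full(conc.size, 20)                            # animals per test
    deaths = np.array([1, 2, 5, 9, 14, 18, 20])                # observed mortality

    X = sm.add_constant(np.log(conc))           # linear predictor in log-dose
    y = np.column_stack([deaths, n - deaths])   # (successes, failures) per group

    for link in (sm.families.links.Logit(), sm.families.links.Probit()):
        fit = sm.GLM(y, X, family=sm.families.Binomial(link=link)).fit()
        b0, b1 = fit.params
        lc50 = np.exp(-b0 / b1)                 # dose giving 50% mortality
        print(type(link).__name__, "LC50 ~", round(lc50, 2))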
Nazir, Jameel; Maman, Khaled; Neine, Mohamed-Elmoctar; Briquet, Benjamin; Odeyemi, Isaac A O; Hakimi, Zalmai; Garnham, Andy; Aballéa, Samuel
2015-09-01
Mirabegron, a first-in-class selective oral β3-adrenoceptor agonist, has similar efficacy to most antimuscarinic agents and a lower incidence of dry mouth in patients with overactive bladder (OAB). To evaluate the cost-effectiveness of mirabegron 50 mg compared with oral antimuscarinic agents in adults with OAB from a UK National Health Service perspective. A Markov model including health states for symptom severity, treatment status, and adverse events was developed. Cycle length was 1 month, and the time horizon was 5 years. Antimuscarinic comparators were tolterodine extended release, solifenacin, fesoterodine, oxybutynin extended release and immediate release (IR), darifenacin, and trospium chloride modified release. Transition probabilities for symptom severity levels and adverse events were estimated from a mirabegron trial and a mixed treatment comparison. Estimates for other inputs were obtained from published literature or expert opinion. Quality-adjusted life-years (QALYs) and total health care costs, including costs of drug acquisition, physician visits, incontinence pad use, and botox injections, were modeled. Deterministic and probabilistic sensitivity analyses were performed. Base-case incremental cost-effectiveness ratios ranged from £367 (vs. solifenacin 10 mg) to £15,593 (vs. oxybutynin IR 10 mg) per QALY gained. Probabilistic sensitivity analyses showed that at a willingness-to-pay threshold of £20,000/QALY gained, the probability of mirabegron 50 mg being cost-effective ranged from 70.2% versus oxybutynin IR 10 mg to 97.8% versus darifenacin 15 mg. A limitation of our analysis is the uncertainty due to the lack of direct comparisons of mirabegron with other agents; a mixed treatment comparison using rigorous methodology provided the data for the analysis, but the studies involved showed heterogeneity. Mirabegron 50 mg appears to be cost-effective compared with standard oral antimuscarinic agents for the treatment of adults with OAB from a UK National Health Service perspective. Copyright © 2015. Published by Elsevier Inc.
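The headline comparisons above rest on incremental cost-effectiveness ratios evaluated against a willingness-to-pay threshold. The sketch below shows those two calculations with made-up per-patient costs and QALYs; it does not reproduce the published Markov model or its inputs.

    # Incremental cost-effectiveness ratio (ICER) and net-monetary-benefit rule.
    # All numbers are hypothetical placeholders, not outputs of the published model.
    def icer(cost_new, qaly_new, cost_old, qaly_old):
        """Incremental cost per QALY gained of 'new' versus 'old'."""
        return (cost_new - cost_old) / (qaly_new - qaly_old)

    def net_monetary_benefit(cost, qaly, wtp=20_000):
        """Net monetary benefit at a willingness-to-pay threshold (GBP per QALY)."""
        return wtp * qaly - cost

    # Hypothetical 5-year discounted totals per patient
    mirabegron     = {"cost": 1_900.0, "qaly": 3.520}
    antimuscarinic = {"cost": 1_750.0, "qaly": 3.505}

    print("ICER = £%.0f per QALY gained" %
          icer(mirabegron["cost"], mirabegron["qaly"],
               antimuscarinic["cost"], antimuscarinic["qaly"]))
    print("incremental NMB = £%.0f at £20,000/QALY" %
          (net_monetary_benefit(**mirabegron) - net_monetary_benefit(**antimuscarinic)))

A treatment is deemed cost-effective at the threshold when its ICER falls below the willingness-to-pay value, equivalently when its incremental net monetary benefit is positive.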
Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.; ...
2016-02-01
We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army's acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
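CPAT's optimization engine is a multiphase mixed-integer linear program; its actual formulation is not given in this abstract, but a toy single-period capital-budgeting MILP conveys the flavour of selecting programs under a budget. Everything below (scores, costs, budget) is invented for illustration.

    # Toy capital-budgeting MILP in the spirit of portfolio tools like CPAT:
    # choose which programs to fund to maximize a capability score under a budget.
    # Illustrative only; CPAT's multiphase formulation is far richer.
    import numpy as np
    from scipy.optimize import milp, LinearConstraint, Bounds

    score = np.array([10.0, 7.0, 5.0, 8.0, 3.0])   # capability value per program
    cost  = np.array([6.0, 4.0, 3.0, 5.0, 2.0])    # cost in $B
    budget = 12.0

    res = milp(
        c=-score,                                        # milp minimizes, so negate
        constraints=LinearConstraint(cost.reshape(1, -1), -np.inf, budget),
        integrality=np.ones_like(score),                 # integer decision variables
        bounds=Bounds(0, 1),                             # binary: fund (1) or not (0)
    )
    chosen = np.flatnonzero(res.x > 0.5)
    print("fund programs:", chosen.tolist(),
          "| total score:", -res.fun,
          "| total cost:", cost[chosen].sum())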
Crack propagation and arrest in CFRP materials with strain softening regions
NASA Astrophysics Data System (ADS)
Dilligan, Matthew Anthony
Understanding the growth and arrest of cracks in composite materials is critical for their effective utilization in fatigue-sensitive and damage susceptible applications such as primary aircraft structures. Local tailoring of the laminate stack to provide crack arrest capacity intermediate to major structural components has been investigated and demonstrated since some of the earliest efforts in composite aerostructural design, but to date no rigorous model of the crack arrest mechanism has been developed to allow effective sizing of these features. To address this shortcoming, the previous work in the field is reviewed, with particular attention to the analysis methodologies proposed for similar arrest features. The damage and arrest processes active in such features are investigated, and various models of these processes are discussed and evaluated. Governing equations are derived based on a proposed mechanistic model of the crack arrest process. The derived governing equations are implemented in a numerical model, and a series of simulations are performed to ascertain the general characteristics of the proposed model and allow qualitative comparison to existing experimental results. The sensitivity of the model and the arrest process to various parameters is investigated, and preliminary conclusions regarding the optimal feature configuration are developed. To address deficiencies in the available material and experimental data, a series of coupon tests are developed and conducted covering a range of arrest zone configurations. Test results are discussed and analyzed, with a particular focus on identification of the proposed failure and arrest mechanisms. Utilizing the experimentally derived material properties, the tests are reproduced with both the developed numerical tool as well as a FEA-based implementation of the arrest model. Correlation between the simulated and experimental results is analyzed, and future avenues of investigation are identified. Utilizing the developed model, a sensitivity study is conducted to assess the current proposed arrest configuration. Optimum distribution and sizing of the arrest zones is investigated, and general design guidelines are developed.
NASA Technical Reports Server (NTRS)
Sandford, Stephen P.
2010-01-01
The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is one of four Tier 1 missions recommended by the recent NRC Decadal Survey report on Earth Science and Applications from Space (NRC, 2007). The CLARREO mission addresses the need to provide accurate, broadly acknowledged climate records that are used to enable validated long-term climate projections that become the foundation for informed decisions on mitigation and adaptation policies that address the effects of climate change on society. The CLARREO mission accomplishes this critical objective through rigorous SI traceable decadal change observations that are sensitive to many of the key uncertainties in climate radiative forcings, responses, and feedbacks that in turn drive uncertainty in current climate model projections. These same uncertainties also lead to uncertainty in attribution of climate change to anthropogenic forcing. For the first time CLARREO will make highly accurate, global, SI-traceable decadal change observations sensitive to the most critical, but least understood, climate forcings, responses, and feedbacks. The CLARREO breakthrough is to achieve the required levels of accuracy and traceability to SI standards for a set of observations sensitive to a wide range of key decadal change variables. The required accuracy levels are determined so that climate trend signals can be detected against a background of naturally occurring variability. Climate system natural variability therefore determines what level of accuracy is overkill, and what level is critical to obtain. In this sense, the CLARREO mission requirements are considered optimal from a science value perspective. The accuracy for decadal change traceability to SI standards includes uncertainties associated with instrument calibration, satellite orbit sampling, and analysis methods. Unlike most space missions, the CLARREO requirements are driven not by the instantaneous accuracy of the measurements, but by accuracy in the large time/space scale averages that are key to understanding decadal changes.
NASA Astrophysics Data System (ADS)
Morris, P. J.; Verhoef, A.; Van der Tol, C.; Macdonald, D.
2011-12-01
Rationale: Floodplain meadows are highly species-rich grassland ecosystems, unique in that their vegetation and soil structures have been shaped and maintained by ~1,000 yrs of traditional, low-intensity agricultural management. Widespread development on floodplains over the last two centuries has left few remaining examples of these once commonplace ecosystems and they are afforded high conservation value by British and European agencies. Increased incidences and severity of summer drought and winter flooding in Britain in recent years have placed floodplain plant communities under stress through altered soil moisture regimes. There is a clear need for improved management strategies if the last remaining British floodplain meadows are to be conserved under changing climates. Aim: As part of the Floodplain Underground Sensors Experiment (FUSE, a 3-year project funded by the Natural Environment Research Council) we aim to understand the environmental controls over soil-vegetation-atmosphere transfers (SVAT) of water, CO2 and energy at Yarnton Mead, a floodplain meadow in southern England. An existing model, SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes; van der Tol et al., 2009), uses remotely-sensed infrared radiance spectra to predict heat and water transfers between a vegetation canopy and the atmosphere. We intend to expand SCOPE by developing a more realistic, physically-based representation of water, gas and energy transfers between soil and vegetation. This improved understanding will eventually take the form of a new submodel within SCOPE, allowing more rigorous estimation of soil-canopy-atmosphere exchanges for the site using predominantly remotely-sensed data. In this context a number of existing SVAT models will be tested and compared to ensure that only reliable and robust underground model components will be coupled to SCOPE. Approach: For this study, we parameterised an existing and widely-used SVAT model (CoupModel; Jansson, 2011) for our study site and analysed the model's sensitivity to a comprehensive set of soil/plant biophysical processes and parameter values. Findings: The sensitivity analysis indicates those processes and parameters most important to soil-vegetation-atmosphere transfers at the site. We use the outcomes of the sensitivity analysis to indicate directly the desired structure of the new SCOPE submodel. In addition, existing soil-moisture, soil matric-potential and meteorological data for the site indicate that evapotranspiration is heavily water-limited during summer months, although soil moisture and soil matric-potential data alone provide very little explanation of the ratio of potential to actual evapotranspiration. A mechanistic representation of stomatal resistance and its response to short-term changes in meteorological conditions - independent of soil moisture status - will also likely improve SCOPE's predictions of heat and water transfers. Ultimately our work will contribute to improved understanding and management of floodplain meadows in Britain and elsewhere.
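As a generic illustration of the kind of parameter sensitivity analysis described (independent of CoupModel or SCOPE), the sketch below perturbs each parameter of a toy model one at a time by ±10% and records the relative change in the output; the toy model and its parameters are invented stand-ins for a SVAT simulator.

    # Generic one-at-a-time (OAT) sensitivity sketch.  The toy "model" below is
    # a stand-in for a SVAT simulator such as CoupModel or SCOPE.
    def toy_model(params):
        """Placeholder output, e.g. cumulative evapotranspiration (arbitrary units)."""
        return (params["lai"] ** 0.7 * params["soil_moisture"]
                / (1.0 + params["stomatal_resistance"] / 100.0))

    baseline = {"lai": 3.0, "soil_moisture": 0.25, "stomatal_resistance": 80.0}
    y0 = toy_model(baseline)

    for name in baseline:
        effects = []
        for factor in (0.9, 1.1):                 # -10% and +10% perturbations
            p = dict(baseline)
            p[name] *= factor
            effects.append((toy_model(p) - y0) / y0)
        print(f"{name:20s}  -10%: {effects[0]:+.3f}   +10%: {effects[1]:+.3f}")

Ranking parameters by the size of these relative effects is the simplest way to identify the processes that most strongly control the modelled soil-vegetation-atmosphere transfers.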
A Case Study to Explore Rigorous Teaching and Testing Practices to Narrow the Achievement Gap
ERIC Educational Resources Information Center
Isler, Tesha
2012-01-01
The problem examined in this study: Does the majority of teachers use rigorous teaching and testing practices? The purpose of this qualitative exploratory case study was to explore the classroom techniques of six effective teachers who use rigorous teaching and testing practices. The hypothesis for this study is that the examination of the…
ERIC Educational Resources Information Center
Whitley, Meredith A.
2014-01-01
While the quality and quantity of research on service-learning has increased considerably over the past 20 years, researchers as well as governmental and funding agencies have called for more rigor in service-learning research. One key variable in improving rigor is using relevant existing theories to improve the research. The purpose of this…
An Empirically-Derived Index of High School Academic Rigor. ACT Working Paper 2017-5
ERIC Educational Resources Information Center
Allen, Jeff; Ndum, Edwin; Mattern, Krista
2017-01-01
We derived an index of high school academic rigor by optimizing the prediction of first-year college GPA based on high school courses taken, grades, and indicators of advanced coursework. Using a large data set (n~108,000) and nominal parameterization of high school course outcomes, the high school academic rigor (HSAR) index capitalizes on…
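The HSAR abstract describes optimising the prediction of first-year college GPA from high school coursework. The sketch below shows the general shape of such an index using ordinary least squares on simulated data; the feature names, coefficients and sample are assumptions and do not reproduce the working paper's nominal parameterization.

# Illustrative sketch of deriving an "academic rigor" index by regressing
# first-year college GPA on indicators of high school coursework.
# The features and the simulated data are assumptions for demonstration.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical binary indicators of advanced coursework plus a mean HS grade.
X = np.column_stack([
    rng.integers(0, 2, n),            # took calculus
    rng.integers(0, 2, n),            # took an AP science course
    rng.integers(0, 2, n),            # took 4+ years of a foreign language
    rng.normal(3.0, 0.4, n),          # high school GPA
])
# Simulated first-year college GPA with an arbitrary "true" relationship.
gpa = 1.0 + X @ np.array([0.15, 0.20, 0.10, 0.45]) + rng.normal(0, 0.3, n)

# Ordinary least squares fit; the fitted linear predictor serves as the index.
design = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(design, gpa, rcond=None)
index = design @ coef

r2 = 1 - np.sum((gpa - index) ** 2) / np.sum((gpa - gpa.mean()) ** 2)
print("coefficients:", np.round(coef, 3))
print("in-sample R^2 of the rigor index:", round(r2, 3))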
ERIC Educational Resources Information Center
Guarino, Heidi; Yoder, Shaun
2015-01-01
"Seizing the Future: How Ohio's Career and Technical Education Programs Fuse Academic Rigor and Real-World Experiences to Prepare Students for College and Work," demonstrates Ohio's progress in developing strong policies for career and technical education (CTE) programs to promote rigor, including college- and career-ready graduation…
Devine, Carrick; Wells, Robyn; Lowe, Tim; Waller, John
2014-01-01
M. longissimus muscles from lambs electrically stimulated at 15 min post-mortem were removed after grading, wrapped in polythene film and held at 4 (n=6), 7 (n=6), 15 (n=6, n=8) and 35°C (n=6) until rigor mortis, then aged at 15°C for 0, 4, 24 and 72 h post-rigor. Centrifuged free water increased exponentially, and bound water, dry matter and shear force decreased exponentially over time. Decreases in shear force and increases in free water were closely related (r(2)=0.52) and were unaffected by pre-rigor temperatures.
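The abstract above reports exponential changes in free water and shear force with ageing time. A minimal curve-fitting sketch of that kind of trend follows; the ageing times mirror the abstract, but the shear-force values and starting guesses are invented for illustration.

# Sketch of fitting an exponential time trend of the kind described
# (e.g. shear force decreasing over post-rigor ageing time).
# The shear-force values are hypothetical, not data from the study.
import numpy as np
from scipy.optimize import curve_fit

def exp_decay(t, y_inf, amplitude, rate):
    """Exponential approach to an asymptote: y(t) = y_inf + A * exp(-k t)."""
    return y_inf + amplitude * np.exp(-rate * t)

ageing_h = np.array([0.0, 4.0, 24.0, 72.0])   # h post-rigor (from the abstract)
shear_n = np.array([95.0, 80.0, 55.0, 42.0])  # N, hypothetical values

popt, _ = curve_fit(exp_decay, ageing_h, shear_n, p0=(40.0, 55.0, 0.05))
y_inf, amplitude, rate = popt
print(f"asymptote {y_inf:.1f} N, amplitude {amplitude:.1f} N, rate {rate:.3f} h^-1")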
Lomiwes, D; Reis, M M; Wiklund, E; Young, O A; North, M
2010-12-01
The potential of near infrared (NIR) spectroscopy as an on-line method to quantify glycogen and predict ultimate pH (pH(u)) of pre rigor beef M. longissimus dorsi (LD) was assessed. NIR spectra (538 to 1677 nm) of pre rigor LD from steers, cows and bulls were collected early post mortem, and measurements were made of pre rigor glycogen concentration and pH(u). Spectral and measured data were combined to develop models to quantify glycogen and predict the pH(u) of pre rigor LD. NIR spectra and pre rigor predicted values obtained from the quantitative models were poorly correlated with measured glycogen and pH(u) (r(2)=0.23 and 0.20, respectively). Qualitative models developed to categorize each muscle according to its pH(u) correctly categorized 42% of high pH(u) samples. The optimum qualitative and quantitative models derived from NIR spectra showed low correlation between predicted values and reference measurements.
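The NIR abstract describes calibrating spectra against measured glycogen and ultimate pH and reporting r(2). The sketch below uses partial least squares regression, a common choice for NIR calibration, on simulated spectra; the abstract does not name the modelling method, so both the method and the data here are assumptions.

# Sketch of an NIR calibration: relate pre-rigor spectra to a measured
# quantity (here, ultimate pH) and report r^2 on held-out samples.
# All data below are simulated placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 120, 300          # e.g. 538-1677 nm resampled

spectra = rng.normal(size=(n_samples, n_wavelengths))
# Simulated ultimate pH with a weak dependence on a few spectral bands.
ph_u = 5.6 + 0.05 * spectra[:, 10] - 0.04 * spectra[:, 150] + rng.normal(0, 0.15, n_samples)

X_train, X_test, y_train, y_test = train_test_split(spectra, ph_u, random_state=0)

model = PLSRegression(n_components=5)
model.fit(X_train, y_train)
predicted = model.predict(X_test).ravel()

print("r^2 on held-out samples:", round(r2_score(y_test, predicted), 2))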
Nevalainen, T J; Gavin, J B; Seelye, R N; Whitehouse, S; Donnell, M
1978-07-01
The effect of normal and artificially induced rigor mortis on the vascular passage of erythrocytes and fluid through isolated dog hearts was studied. Increased rigidity of 6-mm thick transmural sections through the centre of the posterior papillary muscle was used as an indication of rigor. The perfusibility of the myocardium was tested by injecting 10 ml of 1% sodium fluorescein in Hanks solution into the circumflex branch of the left coronary artery. In prerigor hearts (20-minute incubation) fluorescein perfused the myocardium evenly whether or not it was preceded by an injection of 10 ml of heparinized dog blood. Rigor mortis developed in all hearts after 90 minutes of incubation, or within 20 minutes of perfusing the heart with 50 ml of 5 mM iodoacetate in Hanks solution. Fluorescein injected into hearts in rigor did not enter the posterior papillary muscle and adjacent subendocardium whether or not it was preceded by heparinized blood. Thus the vascular occlusion caused by rigor in the dog heart appears to be so effective that it prevents the flow of small soluble ions, such as fluorescein, into the subendocardium.
Characteristics of School Districts That Participate in Rigorous National Educational Evaluations
Stuart, Elizabeth A.; Bell, Stephen H.; Ebnesajjad, Cyrus; Olsen, Robert B.; Orr, Larry L.
2017-01-01
Given increasing interest in evidence-based policy, there is growing attention to how well the results from rigorous program evaluations may inform policy decisions. However, little attention has been paid to documenting the characteristics of schools or districts that participate in rigorous educational evaluations, and how they compare to potential target populations for the interventions that were evaluated. Utilizing a list of the actual districts that participated in 11 large-scale rigorous educational evaluations, we compare those districts to several different target populations of districts that could potentially be affected by policy decisions regarding the interventions under study. We find that school districts that participated in the 11 rigorous educational evaluations differ from the interventions’ target populations in several ways, including size, student performance on state assessments, and location (urban/rural). These findings raise questions about whether, as currently implemented, the results from rigorous impact studies in education are likely to generalize to the larger set of school districts—and thus schools and students—of potential interest to policymakers, and how we can improve our study designs to retain strong internal validity while also enhancing external validity. PMID:29276552
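The district-comparison study contrasts participating districts with target populations on characteristics such as size, assessment performance and urbanicity. A small sketch of that kind of comparison, using standardized mean differences on hypothetical district data, is given below; the characteristics and values are invented and do not come from the 11 evaluations studied.

# Sketch of a sample-vs-population comparison: standardized mean differences
# between districts that joined an evaluation and a target population.
# District characteristics and values are hypothetical.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical characteristics: log enrolment, % proficient, % urban schools.
population = rng.normal(loc=[8.0, 55.0, 30.0], scale=[1.2, 12.0, 25.0], size=(5000, 3))
participants = rng.normal(loc=[9.1, 48.0, 55.0], scale=[1.0, 10.0, 28.0], size=(60, 3))

labels = ["log enrolment", "% proficient on state assessment", "% urban schools"]
for j, label in enumerate(labels):
    diff = participants[:, j].mean() - population[:, j].mean()
    pooled_sd = np.sqrt(0.5 * (participants[:, j].var(ddof=1) + population[:, j].var(ddof=1)))
    print(f"{label:35s} standardized mean difference: {diff / pooled_sd:+.2f}")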
ERIC Educational Resources Information Center
Busacca, Margherita L.; Anderson, Angelika; Moore, Dennis W.
2015-01-01
This review evaluates self-management literature targeting problem behaviors of primary school students in general education settings. Thirty-one single-case design studies met inclusion criteria, of which 16 demonstrated adequate methodological rigor, according to What Works Clearinghouse (WWC) design standards. Visual analysis and WWC…
Economic analysis in support of broad scale land management strategies.
Richard Haynes
2003-01-01
The US has a century of experience with the development of forest policies that have benefited from or been influenced by economic research activities in the forest sector. At the same time, increasing rigor in policy debates stimulated economics research. During the past four decades economic research has evolved to include increased understanding of consumer demands...
Confronting challenges to economic analysis of biological invasions in forests
Thomas P Holmes
2010-01-01
Biological invasions of forests by non-indigenous organisms present a complex, persistent, and largely irreversible threat to forest ecosystems around the globe. Rigorous assessments of the economic impacts of introduced species, at a national scale, are needed to provide credible information to policy makers. It is proposed here that microeconomic models of damage due...
Lichen elements as pollution indicators: evaluation of methods for large monitoring programmes
Susan Will-Wolf; Sarah Jovan; Michael C. Amacher
2017-01-01
Lichen element content is a reliable indicator for relative air pollution load in research and monitoring programmes requiring both efficiency and representation of many sites. We tested the value of costly rigorous field and handling protocols for sample element analysis using five lichen species. No relaxation of rigour was supported; four relaxed protocols generated...
ERIC Educational Resources Information Center
Finch, David; Deephouse, David L.; O'Reilly, Norm; Massie, Tyler; Hillenbrand, Carola
2016-01-01
The debate associated with the qualifications of business school faculty has raged since the 1959 release of the Gordon-Howell and Pierson reports, which encouraged business schools in the USA to enhance their legitimacy by increasing their faculties' doctoral qualifications and scholarly rigor. Today, the legitimacy of specific faculty…
"Edupreneurs": A Survey of For-Profit Education. Policy Analysis, No. 386.
ERIC Educational Resources Information Center
Lips, Carrie
This policy paper examines the products, services, and innovations that a fully competitive marketplace could generate if the government loosened its grip on education, noting that the failure of government-run schools to prepare students for the rigors of the modern economy is a pressing policy problem as well as an opportunity for the private…