Peer Review of EPA's Draft BMDS Document: Exponential ...
BMDS is one of the Agency's premier tools for risk assessment; therefore, the validity and reliability of its statistical models are of paramount importance. This page provides links to peer reviews of the BMDS applications and their models as they were developed and eventually released, documenting the rigorous review process undertaken to provide the best scientific tools available for statistical modeling.
Guilak, Farshid
2017-03-21
We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.
Peer Assessment with Online Tools to Improve Student Modeling
ERIC Educational Resources Information Center
Atkins, Leslie J.
2012-01-01
Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to…
Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.
Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P
2018-03-03
Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bahl, Mayank; Zhou, Gui-Rong; Heller, Evan; Cassarly, William; Jiang, Mingming; Scarmozzino, Rob; Gregory, G. Groot
2014-09-01
Over the last two decades there has been extensive research done to improve the design of Organic Light Emitting Diodes (OLEDs) so as to enhance light extraction efficiency, improve beam shaping, and allow color tuning through techniques such as the use of patterned substrates, photonic crystal (PC) gratings, back reflectors, surface texture, and phosphor down-conversion. Computational simulation has been an important tool for examining these increasingly complex designs. It has provided insights for improving OLED performance as a result of its ability to explore limitations, predict solutions, and demonstrate theoretical results. Depending upon the focus of the design and scale of the problem, simulations are carried out using rigorous electromagnetic (EM) wave-optics-based techniques, such as finite-difference time-domain (FDTD) and rigorous coupled wave analysis (RCWA), or through ray-optics-based techniques such as Monte Carlo ray-tracing. The former are typically used for modeling nanostructures on the OLED die, and the latter for modeling encapsulating structures, die placement, back-reflection, and phosphor down-conversion. This paper presents the use of a mixed-level simulation approach which unifies the use of EM wave-level and ray-level tools. This approach uses rigorous EM wave-based tools to characterize the nanostructured die and generate both a Bidirectional Scattering Distribution Function (BSDF) and a far-field angular intensity distribution. These characteristics are then incorporated into the ray-tracing simulator to obtain the overall performance. Such a mixed-level approach allows for comprehensive modeling of the optical characteristics of OLEDs and can potentially lead to more accurate performance predictions than those from individual modeling tools alone.
Clarity versus complexity: land-use modeling as a practical tool for decision-makers
Sohl, Terry L.; Claggett, Peter
2013-01-01
The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the gap between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.
ERIC Educational Resources Information Center
Cohen, Julie; Schuldt, Lorien Chambers; Brown, Lindsay; Grossman, Pamela
2016-01-01
Background/Context: Current efforts to build rigorous teacher evaluation systems have increased interest in standardized classroom observation tools as reliable measures for assessing teaching. However, many argue these instruments can also be used to effect change in classroom practice. This study investigates a model of professional development…
ERIC Educational Resources Information Center
Diouf, Boucar; Rioux, Pierre
1999-01-01
Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…
Rigor in Your School: A Toolkit for Leaders
ERIC Educational Resources Information Center
Williamson, Ronald; Blackburn, Barbara R.
2011-01-01
Raise the level of rigor in your school and dramatically improve student learning with the tools in this book. Each illuminating exercise is tailored to educators looking to spread the word on rigor and beat the obstacles to achieving it schoolwide. Formatted for duplication and repeated use, these tools are perfect for those who currently hold a…
Increasing the reliability of ecological models using modern software engineering techniques
Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff
2009-01-01
Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...
Peer Review Documents Related to the Evaluation of ...
BMDS is one of the Agency's premier tools for risk assessment; therefore, the validity and reliability of its statistical models are of paramount importance. This page provides links to peer reviews and expert summaries of the BMDS applications and their models as they were developed and eventually released, documenting the rigorous review process undertaken to provide the best science tools available for statistical modeling.
Facilities Stewardship: Measuring the Return on Physical Assets.
ERIC Educational Resources Information Center
Kadamus, David A.
2001-01-01
Asserts that colleges and universities should apply the same analytical rigor to physical assets as they do to financial assets. Presents a management tool, the Return on Physical Assets model, to help guide physical asset allocation decisions. (EV)
Integrated Sensitivity Analysis Workflow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.
2014-08-01
Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.
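The integrated DART Workbench itself is not described in detail above, but the core sensitivity-analysis step it automates can be illustrated with a minimal one-at-a-time (OAT) screening sketch. The toy response function, parameter names, and the 5% perturbation size below are assumptions for illustration only, not part of the Sandia tooling.

```python
import numpy as np

def model(p):
    """Toy engineering response standing in for an expensive simulation."""
    a, b, c = p
    return a ** 2 + 3.0 * b + 0.1 * np.sin(c)

nominal = np.array([1.0, 2.0, 0.5])
base = model(nominal)

# One-at-a-time screening: perturb each input by 5% of its nominal value and
# report the approximate local sensitivity of the response to that input.
for i, name in enumerate(["a", "b", "c"]):
    p = nominal.copy()
    p[i] += 0.05 * nominal[i]
    print(f"d(response)/d({name}) ~ {(model(p) - base) / (0.05 * nominal[i]):.3f}")
```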
Modeling the Cloud to Enhance Capabilities for Crises and Catastrophe Management
2016-11-16
…order for cloud computing infrastructures to be successfully deployed in real-world scenarios as tools for crisis and catastrophe management, where… Statement of the Problem Studied: As cloud computing becomes the dominant computational infrastructure [1] and cloud technologies make a transition to hosting… 1. Formulate rigorous mathematical models representing technological capabilities and resources in cloud computing for performance modeling and…
Skill Assessment for Coupled Biological/Physical Models of Marine Systems.
Stow, Craig A; Jolliff, Jason; McGillicuddy, Dennis J; Doney, Scott C; Allen, J Icarus; Friedrichs, Marjorie A M; Rose, Kenneth A; Wallhead, Philip
2009-02-20
Coupled biological/physical models of marine systems serve many purposes, including the synthesis of information, hypothesis generation, and numerical experimentation. However, marine system models are increasingly used for prediction to support high-stakes decision-making. In such applications it is imperative that a rigorous model skill assessment is conducted so that the model's capabilities are tested and understood. Herein, we review several metrics and approaches useful to evaluate model skill. The definition of skill and the determination of the skill level necessary for a given application are context specific, and no single metric is likely to reveal all aspects of model skill. Thus, we recommend the use of several metrics, in concert, to provide a more thorough appraisal. The routine application and presentation of rigorous skill assessment metrics will also serve the broader interests of the modeling community, ultimately resulting in improved forecasting abilities as well as helping us recognize our limitations.
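As a hedged illustration of the kind of metrics such a review covers, the sketch below computes a few quantities commonly used for model-observation comparison (root-mean-square error, bias, correlation, and modelling efficiency). The particular metric set and the toy data are assumptions, not the ones evaluated in the paper.

```python
import numpy as np

def skill_metrics(obs, mod):
    """A few commonly used skill metrics for model-observation comparison."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    err = mod - obs
    rmse = np.sqrt(np.mean(err ** 2))               # root-mean-square error
    bias = np.mean(err)                             # mean error
    r = np.corrcoef(obs, mod)[0, 1]                 # linear correlation
    mef = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)  # modelling efficiency
    return {"rmse": rmse, "bias": bias, "r": r, "efficiency": mef}

obs = np.array([1.2, 2.3, 3.1, 4.8, 5.0])   # made-up observations
mod = np.array([1.0, 2.6, 2.9, 4.5, 5.4])   # made-up model output
print(skill_metrics(obs, mod))
```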
Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II
DOE Office of Scientific and Technical Information (OSTI.GOV)
George J. Koperna Jr.; Vello A. Kuuskraa; David E. Riestenberg
2009-06-01
This report serves as the final technical report and user's manual for the 'Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II' SBIR project. Advanced Resources International has developed a screening tool by which users can technically screen, assess the storage capacity of, and quantify the costs of CO2 storage in four types of CO2 storage reservoirs. These include CO2-enhanced oil recovery reservoirs, depleted oil and gas fields (non-enhanced oil recovery candidates), deep coal seams that are amenable to CO2-enhanced methane recovery, and saline reservoirs. The screening function assesses whether the reservoir could likely serve as a safe, long-term CO2 storage reservoir. The storage capacity assessment uses rigorous reservoir simulation models to determine the timing, ultimate storage capacity, and potential for enhanced hydrocarbon recovery. Finally, the economic assessment function determines both the field-level and pipeline (transportation) costs for CO2 sequestration in a given reservoir. The screening tool was peer reviewed at an Electric Power Research Institute (EPRI) technical meeting in March 2009. A number of useful observations and recommendations emerged from the workshop on the costs of CO2 transport and storage that could be readily incorporated into a commercial version of the Screening Tool in a Phase III SBIR.
Large eddy simulation of forest canopy flow for wildland fire modeling
Eric Mueller; William Mell; Albert Simeoni
2014-01-01
Large eddy simulation (LES) based computational fluid dynamics (CFD) simulators have received increasing attention in the wildland fire research community, as these tools allow the inclusion of important driving physics. However, due to the complexity of the models, individual aspects must be isolated and tested rigorously to ensure meaningful results. As wind is a...
Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties
NASA Technical Reports Server (NTRS)
Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.
2015-01-01
For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-II measurements will fill a critically important gap in the measurement database. The emergence of AMS-II measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramanathan, Arvind; Steed, Chad A; Pullum, Laura L
Compartmental models in epidemiology are widely used as a means to model disease spread mechanisms and understand how one can best control the disease in case an outbreak of a widespread epidemic occurs. However, a significant challenge within the community is in the development of approaches that can be used to rigorously verify and validate these models. In this paper, we present an approach to rigorously examine and verify the behavioral properties of compartmental epidemiological models under several common modeling scenarios including birth/death rates and multi-host/pathogen species. Using metamorphic testing, a novel visualization tool and model checking, we build a workflow that provides insights into the functionality of compartmental epidemiological models. Our initial results indicate that metamorphic testing can be used to verify the implementation of these models and provide insights into special conditions where these mathematical models may fail. The visualization front-end allows the end-user to scan through a variety of parameters commonly used in these models to elucidate the conditions under which an epidemic can occur. Further, specifying these models using a process algebra allows one to automatically construct behavioral properties that can be rigorously verified using model checking. Taken together, our approach allows for detecting implementation errors as well as handling conditions under which compartmental epidemiological models may fail to provide insights into disease spread dynamics.
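As an illustration of the metamorphic-testing idea applied to a compartmental model, the sketch below checks two metamorphic relations for a standard frequency-dependent SIR model: scaling every compartment by a constant leaves the compartment fractions unchanged, and setting the transmission rate to zero produces no new infections. The SIR formulation and parameter values are assumptions for illustration; the paper's actual models, process-algebra specification, and model-checking workflow are not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma, N):
    S, I, R = y
    new_inf = beta * S * I / N        # frequency-dependent transmission
    return [-new_inf, new_inf - gamma * I, gamma * I]

def run(S0, I0, R0, beta, gamma, t_end=100.0):
    N = S0 + I0 + R0
    sol = solve_ivp(sir, (0, t_end), [S0, I0, R0], args=(beta, gamma, N),
                    dense_output=True, rtol=1e-8, atol=1e-10)
    return sol, N

t = np.linspace(0, 100, 50)

# Metamorphic relation 1: scaling all compartments by c leaves fractions unchanged.
base, N1 = run(990, 10, 0, beta=0.3, gamma=0.1)
scaled, N2 = run(9900, 100, 0, beta=0.3, gamma=0.1)
assert np.allclose(base.sol(t) / N1, scaled.sol(t) / N2, atol=1e-6)

# Metamorphic relation 2: with beta = 0 there are no new infections, so S is constant.
noinf, _ = run(990, 10, 0, beta=0.0, gamma=0.1)
assert np.allclose(noinf.sol(t)[0], 990.0, atol=1e-6)
print("metamorphic checks passed")
```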
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, Zachary; Neuert, Gregor; Department of Pharmacology, School of Medicine, Vanderbilt University, Nashville, Tennessee 37232
2016-08-21
Emerging techniques now allow for precise quantification of distributions of biological molecules in single cells. These rapidly advancing experimental methods have created a need for more rigorous and efficient modeling tools. Here, we derive new bounds on the likelihood that observations of single-cell, single-molecule responses come from a discrete stochastic model, posed in the form of the chemical master equation. These strict upper and lower bounds are based on a finite state projection approach, and they converge monotonically to the exact likelihood value. These bounds allow one to discriminate rigorously between models with a minimum level of computational effort. In practice, these bounds can be incorporated into stochastic model identification and parameter inference routines, which improve the accuracy and efficiency of endeavors to analyze and predict single-cell behavior. We demonstrate the applicability of our approach using simulated data for three example models as well as for experimental measurements of a time-varying stochastic transcriptional response in yeast.
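The likelihood bounds described above build on the finite state projection (FSP) idea that truncating the chemical master equation to a finite state set yields a computable error certificate: the probability mass that leaks out of the projection bounds the truncation error. A minimal sketch for a birth-death (constitutive expression) model is shown below; the rates and truncation size are assumptions, and the paper's specific likelihood bounds are not implemented here.

```python
import numpy as np
from scipy.linalg import expm

k, gamma, t = 10.0, 1.0, 2.0    # birth rate, per-molecule degradation rate, time
N = 40                          # projection {0, ..., N}

# Truncated CME generator A_J (columns = "from" state, rows = "to" state)
A = np.zeros((N + 1, N + 1))
for n in range(N + 1):
    if n < N:
        A[n + 1, n] += k          # birth n -> n+1
    A[n, n] -= k                  # birth always fires; at n = N the mass leaks out
    if n > 0:
        A[n - 1, n] += n * gamma  # degradation n -> n-1
        A[n, n] -= n * gamma

p0 = np.zeros(N + 1); p0[0] = 1.0      # start with zero molecules
pJ = expm(A * t) @ p0                  # FSP solution on the projection
eps = 1.0 - pJ.sum()                   # leaked mass = FSP truncation-error certificate
print(f"FSP error bound (1-norm): {eps:.2e}")
```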
Emerging from the bottleneck: Benefits of the comparative approach to modern neuroscience
Brenowitz, Eliot A.; Zakon, Harold H.
2015-01-01
Neuroscience historically exploited a wide diversity of animal taxa. Recently, however, research focused increasingly on a few model species. This trend accelerated with the genetic revolution, as genomic sequences and genetic tools became available for a few species, which formed a bottleneck. This coalescence on a small set of model species comes with several costs often not considered, especially in the current drive to use mice explicitly as models for human diseases. Comparative studies of strategically chosen non-model species can complement model species research and yield more rigorous studies. As genetic sequences and tools become available for many more species, we are poised to emerge from the bottleneck and once again exploit the rich biological diversity offered by comparative studies. PMID:25800324
Using constraints and their value for optimization of large ODE systems
Domijan, Mirela; Rand, David A.
2015-01-01
We provide analytical tools to facilitate a rigorous assessment of the quality and value of the fit of a complex model to data. We use this to provide approaches to model fitting, parameter estimation, the design of optimization functions and experimental optimization. This is in the context where multiple constraints are used to select or optimize a large model defined by differential equations. We illustrate the approach using models of circadian clocks and the NF-κB signalling system. PMID:25673300
Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...
2014-12-31
Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
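FOQUS, ALAMO, and PSUADE themselves are not reproduced here, but the general pattern of simulation-based derivative-free optimization followed by Monte Carlo uncertainty propagation can be sketched as below. The black-box cost function, its design variables, and the uncertainty distribution are hypothetical placeholders, and scipy's Nelder-Mead stands in for the DFO algorithms mentioned in the abstract.

```python
import numpy as np
from scipy.optimize import minimize

def capture_cost(x, k=1.0):
    """Stand-in black-box process model returning a cost for design variables x.
    The uncertain parameter k plays the role of an uncertain model parameter."""
    solvent_rate, regen_temp = x
    return (solvent_rate - 2.0 * k) ** 2 + 0.5 * (regen_temp - 390.0) ** 2 + 10.0

# Derivative-free optimization of the nominal model (k = 1)
res = minimize(capture_cost, x0=[1.0, 400.0], method="Nelder-Mead")
print("optimal design:", res.x)

# Simple Monte Carlo propagation of parameter uncertainty through the optimum
rng = np.random.default_rng(0)
costs = [capture_cost(res.x, k=rng.normal(1.0, 0.1)) for _ in range(1000)]
print("cost at optimum: mean %.2f, std %.2f" % (np.mean(costs), np.std(costs)))
```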
Multiscale sagebrush rangeland habitat modeling in southwest Wyoming
Homer, Collin G.; Aldridge, Cameron L.; Meyer, Debra K.; Coan, Michael J.; Bowen, Zachary H.
2009-01-01
Sagebrush-steppe ecosystems in North America have experienced dramatic elimination and degradation since European settlement. As a result, sagebrush-steppe dependent species have experienced drastic range contractions and population declines. Coordinated ecosystem-wide research, integrated with monitoring and management activities, would improve the ability to maintain existing sagebrush habitats. However, current data only identify resource availability locally, with rigorous spatial tools and models that accurately model and map sagebrush habitats over large areas still unavailable. Here we report on an effort to produce a rigorous large-area sagebrush-habitat classification and inventory with statistically validated products and estimates of precision in the State of Wyoming. This research employs a combination of significant new tools, including (1) modeling sagebrush rangeland as a series of independent continuous field components that can be combined and customized by any user at multiple spatial scales; (2) collecting ground-measured plot data on 2.4-meter imagery in the same season the satellite imagery is acquired; (3) effective modeling of ground-measured data on 2.4-meter imagery to maximize subsequent extrapolation; (4) acquiring multiple seasons (spring, summer, and fall) of an additional two spatial scales of imagery (30 meter and 56 meter) for optimal large-area modeling; (5) using regression tree classification technology that optimizes data mining of multiple image dates, ratios, and bands with ancillary data to extrapolate ground training data to coarser resolution sensors; and (6) employing rigorous accuracy assessment of model predictions to enable users to understand the inherent uncertainties. First-phase results modeled eight rangeland components (four primary targets and four secondary targets) as continuous field predictions. The primary targets included percent bare ground, percent herbaceousness, percent shrub, and percent litter. The four secondary targets included percent sagebrush (Artemisia spp.), percent big sagebrush (Artemisia tridentata), percent Wyoming sagebrush (Artemisia tridentata wyomingensis), and sagebrush height (centimeters). Results were validated by an independent accuracy assessment with root mean square error (RMSE) values ranging from 6.38 percent for bare ground to 2.99 percent for sagebrush at the QuickBird scale and RMSE values ranging from 12.07 percent for bare ground to 6.34 percent for sagebrush at the full Landsat scale. Subsequent project phases are now in progress, with plans to deliver products that improve accuracies of existing components, model new components, complete models over larger areas, track changes over time (from 1988 to 2007), and ultimately model wildlife population trends against these changes. We believe these results offer significant improvement in sagebrush rangeland quantification at multiple scales and offer users products that have been rigorously validated.
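As a hedged sketch of the regression-tree step described above (extrapolating ground-measured plot data to imagery and validating with RMSE), the example below fits a decision-tree regressor to synthetic "band" predictors and reports hold-out RMSE. The data, predictors, and tree settings are illustrative assumptions; the project's actual regression-tree software and multi-scale imagery workflow are not reproduced.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
# Synthetic stand-in for ground plots: spectral bands/ratios -> percent shrub cover
X = rng.uniform(0, 1, size=(500, 6))                                 # image bands, ratios, ancillary data
y = 40 * X[:, 0] + 20 * X[:, 1] * X[:, 2] + rng.normal(0, 3, 500)    # percent cover

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeRegressor(max_depth=8, min_samples_leaf=10).fit(X_tr, y_tr)

rmse = mean_squared_error(y_te, tree.predict(X_te)) ** 0.5           # accuracy assessment
print(f"hold-out RMSE: {rmse:.2f} percent cover")
```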
Quantifying falsifiability of scientific theories
NASA Astrophysics Data System (ADS)
Nemenman, Ilya
I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.
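The abstract does not spell out the formulation, but a standard way to make this quantitative (assumed here as an illustration) is through the marginal likelihood, or model evidence, and the Bayes factor between competing models:

\[
p(D \mid M) = \int p(D \mid \theta, M)\, p(\theta \mid M)\, d\theta,
\qquad
K_{12} = \frac{p(D \mid M_1)}{p(D \mid M_2)}.
\]

Because \(p(D \mid M)\) must integrate to one over all conceivable data sets, a model that concentrates its prior predictive mass on a narrow range of outcomes (a highly falsifiable model) is rewarded by the Bayes factor when the observed data land in that range; this is the quantitative version of the statistical Occam's razor referred to above.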
A Thermal Management Systems Model for the NASA GTX RBCC Concept
NASA Technical Reports Server (NTRS)
Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)
2002-01-01
The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium and Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.
Monitoring programs to assess reintroduction efforts: A critical component in recovery
Muths, E.; Dreitz, V.
2008-01-01
Reintroduction is a powerful tool in our conservation toolbox. However, the necessary follow-up, i.e. long-term monitoring, is not commonplace and if instituted may lack rigor. We contend that valid monitoring is possible, even with sparse data. We present a means to monitor based on demographic data and a projection model using the Wyoming toad (Bufo baxteri) as an example. Using an iterative process, existing data are built upon gradually such that demographic estimates and subsequent inferences increase in reliability. Reintroduction and defensible monitoring may become increasingly relevant as the outlook for amphibians, especially in tropical regions, continues to deteriorate and emergency collection, captive breeding, and reintroduction become necessary. Rigorous use of appropriate modeling and an adaptive approach can validate the use of reintroduction and substantially increase its value to recovery programs. © 2008 Museu de Ciències Naturals.
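The demographic projection approach mentioned above can be sketched with a simple stage-structured matrix model whose dominant eigenvalue gives the asymptotic population growth rate. The three-stage structure and all vital rates below are hypothetical placeholders for illustration, not estimates for the Wyoming toad.

```python
import numpy as np

# Hypothetical three-stage amphibian projection matrix (tadpole, juvenile, adult);
# all entries are illustrative placeholders, not Wyoming toad estimates.
A = np.array([
    [0.00, 0.00, 30.00],   # fecundity: adults -> tadpoles
    [0.02, 0.25,  0.00],   # survival into / within the juvenile stage
    [0.00, 0.30,  0.45],   # survival into / within the adult stage
])

lam = np.max(np.abs(np.linalg.eigvals(A)))   # dominant eigenvalue = asymptotic growth rate
print(f"projected asymptotic growth rate lambda = {lam:.3f}")  # lambda < 1 implies decline
```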
Simscape Modeling Verification in the Simulink Development Environment
NASA Technical Reports Server (NTRS)
Volle, Christopher E. E.
2014-01-01
The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time consuming and costly due to the rigorous testing and peer reviews required for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced since no custom code would need to be developed. After careful research, the group concluded that it is feasible to use Simscape's blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.
Peer Assessment with Online Tools to Improve Student Modeling
NASA Astrophysics Data System (ADS)
Atkins, Leslie J.
2012-11-01
Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to be an aid to sense-making rather than meeting seemingly arbitrary requirements set by the instructor. By giving students the authority to develop their own models and establish requirements for their diagrams, the sense that these are arbitrary requirements diminishes and students are more likely to see modeling as a sense-making activity. The practice of peer assessment can help students take ownership; however, it can be difficult for instructors to manage. Furthermore, it is not without risk: students can be reluctant to critique their peers, they may view this as the job of the instructor, and there is no guarantee that students will employ greater rigor and precision as a result of peer assessment. In this article, we describe one approach for peer assessment that can establish norms for diagrams in a way that is student driven, where students retain agency and authority in assessing and improving their work. We show that such an approach does indeed improve students' diagrams and abilities to assess their own work, without sacrificing students' authority and agency.
Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet
NASA Technical Reports Server (NTRS)
Muss, J. A.; Johnson, C. W.; Gotchy, M. B.
2000-01-01
The existing RocketWeb™ Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb™ system also contains many features that permit data sharing, which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.
Emerging from the bottleneck: benefits of the comparative approach to modern neuroscience.
Brenowitz, Eliot A; Zakon, Harold H
2015-05-01
Neuroscience has historically exploited a wide diversity of animal taxa. Recently, however, research has focused increasingly on a few model species. This trend has accelerated with the genetic revolution, as genomic sequences and genetic tools became available for a few species, which formed a bottleneck. This coalescence on a small set of model species comes with several costs that are often not considered, especially in the current drive to use mice explicitly as models for human diseases. Comparative studies of strategically chosen non-model species can complement model species research and yield more rigorous studies. As genetic sequences and tools become available for many more species, we are poised to emerge from the bottleneck and once again exploit the rich biological diversity offered by comparative studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
Grading Rigor in Counselor Education: A Specifications Grading Framework
ERIC Educational Resources Information Center
Bonner, Matthew W.
2016-01-01
According to accreditation and professional bodies, evaluation and grading are a high priority in counselor education. Specifications grading, an evaluative tool, can be used to increase grading rigor. This article describes the components of specifications grading and applies the framework of specifications grading to a counseling theories course.
Accuracy and performance of 3D mask models in optical projection lithography
NASA Astrophysics Data System (ADS)
Agudelo, Viviana; Evanschitzky, Peter; Erdmann, Andreas; Fühner, Tim; Shao, Feng; Limmer, Steffen; Fey, Dietmar
2011-04-01
Different mask models have been compared: rigorous electromagnetic field (EMF) modeling, rigorous EMF modeling with decomposition techniques and the thin mask approach (Kirchhoff approach) to simulate optical diffraction from different mask patterns in projection systems for lithography. In addition, each rigorous model was tested for two different formulations for partially coherent imaging: The Hopkins assumption and rigorous simulation of mask diffraction orders for multiple illumination angles. The aim of this work is to closely approximate results of the rigorous EMF method by the thin mask model enhanced with pupil filtering techniques. The validity of this approach for different feature sizes, shapes and illumination conditions is investigated.
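As a minimal illustration of the thin-mask (Kirchhoff) approach enhanced with pupil filtering, the sketch below computes a coherent 1-D aerial image by low-pass filtering the FFT of a binary mask transmission function; a non-trivial pupil filter would be applied at the marked line to better approximate rigorous EMF results. The wavelength, NA, pitch, and CD values are assumptions, and partial coherence, resist, and vector effects are ignored.

```python
import numpy as np

# 1-D Kirchhoff (thin-mask) aerial image under coherent illumination: the mask is a
# binary transmission function and the pupil acts as a low-pass spatial-frequency filter.
wavelength, NA = 193e-9, 1.35
pitch, cd, n = 360e-9, 90e-9, 512
x = np.linspace(0, pitch, n, endpoint=False)
mask = (np.abs(x - pitch / 2) < cd / 2).astype(float)      # single line/space feature

f = np.fft.fftfreq(n, d=pitch / n)                         # spatial frequencies
spectrum = np.fft.fft(mask)
pupil = (np.abs(f) <= NA / wavelength).astype(complex)
# Pupil filtering hook: a complex amplitude/phase correction would be multiplied in
# here to mimic 3-D mask (topography) effects within the thin-mask framework.
aerial = np.abs(np.fft.ifft(spectrum * pupil)) ** 2
print(f"image contrast proxy: max {aerial.max():.3f}, min {aerial.min():.3f}")
```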
A Framework for Daylighting Optimization in Whole Buildings with OpenStudio
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2016-08-12
We present a toolkit and workflow for leveraging the OpenStudio (Guglielmetti et al. 2010) platform to perform daylighting analysis and optimization in a whole building energy modeling (BEM) context. We have re-implemented OpenStudio's integrated Radiance and EnergyPlus functionality as an OpenStudio Measure. The OpenStudio Radiance Measure works within the OpenStudio Application and Parametric Analysis Tool, as well as the OpenStudio Server large scale analysis framework, allowing a rigorous daylighting simulation to be performed on a single building model or potentially an entire population of programmatically generated models. The Radiance simulation results can automatically inform the broader building energy model, and provide dynamic daylight metrics as a basis for decision. Through introduction and example, this paper illustrates the utility of the OpenStudio building energy modeling platform to leverage existing simulation tools for integrated building energy performance simulation, daylighting analysis, and reportage.
NASA Astrophysics Data System (ADS)
Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew
2007-04-01
One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods, to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical analysis on the comparison of the accuracy and performance of hybrid network theory, with pure Bayesian and Fuzzy systems and an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference to other fusion tools.
Benassi, Enrico
2017-01-15
A number of programs and tools that simulate ¹H and ¹³C nuclear magnetic resonance (NMR) chemical shifts using empirical approaches are available. These tools are user-friendly, but they provide a very rough (and sometimes misleading) estimation of the NMR properties, especially for complex systems. Rigorous and reliable ways to predict and interpret NMR properties of simple and complex systems are available in many popular computational program packages. Nevertheless, experimentalists keep relying on these "unreliable" tools in their daily work because, to have a sufficiently high accuracy, these rigorous quantum mechanical methods need high levels of theory. An alternative, efficient, semi-empirical approach has been proposed by Bally, Rablen, Tantillo, and coworkers. This idea consists of creating linear calibration models on the basis of the application of different combinations of functionals and basis sets. Following this approach, the predictive capability of a wider range of popular functionals was systematically investigated and tested. The NMR chemical shifts were computed in the solvated phase at the density functional theory level, using 30 different functionals coupled with three different triple-ζ basis sets. © 2016 Wiley Periodicals, Inc.
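The linear-calibration idea referenced above can be sketched as a least-squares fit of experimental shifts against computed isotropic shieldings for a reference set; the fitted line then converts new computed shieldings into empirically scaled shifts. The numerical values below are hypothetical placeholders, not results from the study.

```python
import numpy as np

# Hypothetical reference set: computed 13C isotropic shieldings (ppm) for one
# functional/basis combination, paired with experimental chemical shifts (ppm).
sigma_calc = np.array([178.1, 160.4, 142.7, 120.3, 95.8, 60.2, 31.5])
delta_exp  = np.array([  7.2,  24.8,  42.1,  63.9, 88.0, 123.5, 151.7])

slope, intercept = np.polyfit(sigma_calc, delta_exp, 1)   # delta = slope*sigma + intercept

def predict_shift(sigma):
    """Empirically scaled chemical shift from a computed shielding."""
    return slope * sigma + intercept

print(f"scaled shift for sigma = 110.0 ppm: {predict_shift(110.0):.1f} ppm")
```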
Validation of Fatigue Modeling Predictions in Aviation Operations
NASA Technical Reports Server (NTRS)
Gregory, Kevin; Martinez, Siera; Flynn-Evans, Erin
2017-01-01
Bio-mathematical fatigue models that predict levels of alertness and performance are one potential tool for use within integrated fatigue risk management approaches. A number of models have been developed that provide predictions based on acute and chronic sleep loss, circadian desynchronization, and sleep inertia. Some are publicly available and gaining traction in settings such as commercial aviation as a means of evaluating flight crew schedules for potential fatigue-related risks. Yet, most models have not been rigorously evaluated and independently validated for the operations to which they are being applied and many users are not fully aware of the limitations in which model results should be interpreted and applied.
Development of a fourth generation predictive capability maturity model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard Guy; Witkowski, Walter R.; Urbina, Angel
2013-09-01
The Predictive Capability Maturity Model (PCMM) is an expert elicitation tool designed to characterize and communicate completeness of the approaches used for computational model definition, verification, validation, and uncertainty quantification associated with an intended application. The primary application of this tool at Sandia National Laboratories (SNL) has been for physics-based computational simulations in support of nuclear weapons applications. The two main goals of a PCMM evaluation are 1) the communication of computational simulation capability, accurately and transparently, and 2) the development of input for effective planning. As a result of the increasing importance of computational simulation to SNL's mission, the PCMM has evolved through multiple generations with the goal to provide more clarity, rigor, and completeness in its application. This report describes the approach used to develop the fourth generation of the PCMM.
Ontology-Driven Information Integration
NASA Technical Reports Server (NTRS)
Tissot, Florence; Menzel, Chris
2005-01-01
Ontology-driven information integration (ODII) is a method of computerized, automated sharing of information among specialists who have expertise in different domains and who are members of subdivisions of a large, complex enterprise (e.g., an engineering project, a government agency, or a business). In ODII, one uses rigorous mathematical techniques to develop computational models of engineering and/or business information and processes. These models are then used to develop software tools that support the reliable processing and exchange of information among the subdivisions of this enterprise or between this enterprise and other enterprises.
Concrete ensemble Kalman filters with rigorous catastrophic filter divergence
Kelly, David; Majda, Andrew J.; Tong, Xin T.
2015-01-01
The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature. PMID:26261335
Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.
Kelly, David; Majda, Andrew J; Tong, Xin T
2015-08-25
The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.
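For readers unfamiliar with the method under discussion, a minimal perturbed-observation ensemble Kalman filter analysis step is sketched below. The toy state dimension, observation operator, and noise levels are assumptions, and the specific forecast model that exhibits catastrophic divergence in the paper is not reproduced here.

```python
import numpy as np

def enkf_analysis(ensemble, y_obs, H, R, rng):
    """One stochastic (perturbed-observation) EnKF analysis step.

    ensemble : (n_state, n_members) forecast ensemble
    y_obs    : (n_obs,) observation vector
    H        : (n_obs, n_state) observation operator
    R        : (n_obs, n_obs) observation error covariance
    """
    n_state, n_mem = ensemble.shape
    X = ensemble - ensemble.mean(axis=1, keepdims=True)       # ensemble anomalies
    P = X @ X.T / (n_mem - 1)                                  # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)               # Kalman gain
    # perturbed observations, one realization per member
    Y = y_obs[:, None] + rng.multivariate_normal(np.zeros(len(y_obs)), R, n_mem).T
    return ensemble + K @ (Y - H @ ensemble)

rng = np.random.default_rng(1)
ens = rng.normal(0.0, 1.0, size=(2, 20))     # toy 2-state, 20-member forecast ensemble
H = np.array([[1.0, 0.0]])                   # observe the first state component only
R = np.array([[0.25]])
updated = enkf_analysis(ens, np.array([1.5]), H, R, rng)
print("analysis mean:", updated.mean(axis=1))
```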
Zabor, Emily C; Coit, Daniel; Gershenwald, Jeffrey E; McMasters, Kelly M; Michaelson, James S; Stromberg, Arnold J; Panageas, Katherine S
2018-02-22
Prognostic models are increasingly being made available online, where they can be publicly accessed by both patients and clinicians. These online tools are an important resource for patients to better understand their prognosis and for clinicians to make informed decisions about treatment and follow-up. The goal of this analysis was to highlight the possible variability in multiple online prognostic tools in a single disease. To demonstrate the variability in survival predictions across online prognostic tools, we applied a single validation dataset to three online melanoma prognostic tools. Data on melanoma patients treated at Memorial Sloan Kettering Cancer Center between 2000 and 2014 were retrospectively collected. Calibration was assessed using calibration plots and discrimination was assessed using the C-index. In this demonstration project, we found important differences across the three models that led to variability in individual patients' predicted survival across the tools, especially in the lower range of predictions. In a validation test using a single-institution data set, calibration and discrimination varied across the three models. This study underscores the potential variability both within and across online tools, and highlights the importance of using methodological rigor when developing a prognostic model that will be made publicly available online. The results also reinforce that careful development and thoughtful interpretation, including understanding a given tool's limitations, are required in order for online prognostic tools that provide survival predictions to be a useful resource for both patients and clinicians.
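Discrimination in this kind of validation is commonly summarized with the C-index, the fraction of comparable patient pairs whose predicted risks are ordered consistently with their observed outcomes. A minimal sketch of Harrell's concordance index on made-up survival data is shown below; the melanoma tools and patient data themselves are not reproduced.

```python
import numpy as np

def concordance_index(time, event, risk):
    """Harrell's C-index: fraction of comparable pairs ordered correctly.

    time  : observed follow-up times
    event : 1 if the event was observed, 0 if censored
    risk  : model risk score (higher = worse predicted prognosis)
    """
    concordant, comparable = 0.0, 0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue                          # a pair is comparable only if the
        for j in range(n):                    # shorter follow-up ends in an event
            if time[i] < time[j]:
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / comparable

time  = np.array([5.0, 8.0, 12.0, 20.0, 25.0])   # made-up follow-up times
event = np.array([1,   1,   0,    1,    0])       # 1 = event, 0 = censored
risk  = np.array([0.9, 0.7, 0.4,  0.5,  0.2])     # made-up predicted risks
print(f"C-index = {concordance_index(time, event, risk):.2f}")
```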
Hybrid Rocket Performance Prediction with Coupling Method of CFD and Thermal Conduction Calculation
NASA Astrophysics Data System (ADS)
Funami, Yuki; Shimada, Toru
The final purpose of this study is to develop a design tool for hybrid rocket engines. This tool is a computer code that will be used to investigate rocket performance characteristics and unsteady phenomena lasting through the burning time, such as fuel regression or combustion oscillation. When describing phenomena inside the combustion chamber, namely boundary-layer combustion, rigorous models are difficult to use because their computational cost may be prohibitive; simple models are therefore required for this calculation. In this study, quasi-one-dimensional compressible Euler equations for flowfields inside the chamber and the equation for thermal conduction inside the solid fuel are numerically solved. The energy balance equation at the solid fuel surface is solved to estimate the fuel regression rate. The heat feedback model is Karabeyoglu's model, which depends on total mass flux. The combustion model is a global single-step reaction model for 4 chemical species or a chemical equilibrium model for 9 chemical species. As a first step, steady-state solutions are reported.
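In a commonly used simplified form (an assumption here, since the paper's exact Karabeyoglu-type heat-feedback correlation is not reproduced), the surface energy balance and the mass-flux-dependent regression law read

\[
\rho_f\,\dot r \left[ h_v + c_s \,(T_s - T_0) \right] = \dot q_w(G),
\qquad
G = \frac{\dot m_{ox} + \dot m_f}{A_{port}},
\qquad
\dot r \approx a\, G^n,
\]

where \(\rho_f\) is the solid-fuel density, \(\dot r\) the regression rate, \(h_v\) the effective heat of gasification, \(c_s\) the fuel specific heat, \(T_s\) and \(T_0\) the surface and initial temperatures, \(\dot q_w\) the convective heat feedback driven by the total (oxidizer plus fuel) mass flux \(G\) through the port of area \(A_{port}\), and \(a\), \(n\) empirical constants.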
Raymond, Nancy C; Wyman, Jean F; Dighe, Satlaj; Harwood, Eileen M; Hang, Mikow
2018-06-01
Process evaluation is an important tool in quality improvement efforts. This article illustrates how a systematic and continuous evaluation process can be used to improve the quality of faculty career development programs by using the University of Minnesota's Building Interdisciplinary Research Careers in Women's Health (BIRCWH) K12 program as an exemplar. Data from a rigorous process evaluation incorporating quantitative and qualitative measurements were analyzed and reviewed by the BIRCWH program leadership on a regular basis. Examples are provided of how this evaluation model and processes were used to improve many aspects of the program, thereby improving scholar, mentor, and advisory committee members' satisfaction and scholar outcomes. A rigorous evaluation plan can increase the effectiveness and impact of a research career development plan.
Wilcox, Rebecca L; Adem, Patricia V; Afshinnekoo, Ebrahim; Atkinson, James B; Burke, Leah W; Cheung, Hoiwan; Dasgupta, Shoumita; DeLaGarza, Julia; Joseph, Loren; LeGallo, Robin; Lew, Madelyn; Lockwood, Christina M; Meiss, Alice; Norman, Jennifer; Markwood, Priscilla; Rizvi, Hasan; Shane-Carson, Kate P; Sobel, Mark E; Suarez, Eric; Tafe, Laura J; Wang, Jason; Haspel, Richard L
2018-05-01
Genomic medicine is transforming patient care. However, the speed of development has left a knowledge gap between discovery and effective implementation into clinical practice. Since 2010, the Training Residents in Genomics (TRIG) Working Group has found success in building a rigorous genomics curriculum with implementation tools aimed at pathology residents in postgraduate training years 1-4. Based on the TRIG model, the interprofessional Undergraduate Training in Genomics (UTRIG) Working Group was formed. Under the aegis of the Undergraduate Medical Educators Section of the Association of Pathology Chairs and representation from nine additional professional societies, UTRIG's collaborative goal is building medical student genomic literacy through development of a ready-to-use genomics curriculum. Key elements to the UTRIG curriculum are expert consensus-driven objectives, active learning methods, rigorous assessment and integration.
Implementing health promotion tools in Australian Indigenous primary health care.
Percival, Nikki A; McCalman, Janya; Armit, Christine; O'Donoghue, Lynette; Bainbridge, Roxanne; Rowley, Kevin; Doyle, Joyce; Tsey, Komla
2018-02-01
In Australia, significant resources have been invested in producing health promotion best practice guidelines, frameworks and tools (herein referred to as health promotion tools) as a strategy to improve Indigenous health promotion programmes. Yet, there has been very little rigorous implementation research about whether or how health promotion tools are implemented. This paper theorizes the complex processes of health promotion tool implementation in Indigenous comprehensive primary healthcare services. Data were derived from published and grey literature about the development and the implementation of four Indigenous health promotion tools. Tools were theoretically sampled to account for the key implementation types described in the literature. Data were analysed using the grounded-theory methods of coding and constant comparison to construct a theoretical implementation model. An Indigenous Health Promotion Tool Implementation Model was developed. Implementation is a social process, whereby researchers, practitioners and community members collectively interacted in creating culturally responsive health promotion toward the common purpose of facilitating empowerment. The implementation of health promotion tools was influenced by the presence of change agents; a commitment to reciprocity; and organizational governance and resourcing. The Indigenous Health Promotion Tool Implementation Model assists in explaining how health promotion tools are implemented and the conditions that influence these actions. Rather than simply developing more health promotion tools, our study suggests that continuous investment in developing conditions that support empowering implementation processes is required to maximize the beneficial impacts and effectiveness of health promotion tools. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
Impact of topographic mask models on scanner matching solutions
NASA Astrophysics Data System (ADS)
Tyminski, Jacek K.; Pomplun, Jan; Renwick, Stephen P.
2014-03-01
Of keen interest to the IC industry are advanced computational lithography applications such as Optical Proximity Correction of IC layouts (OPC), scanner matching by optical proximity effect matching (OPEM), and Source Optimization (SO) and Source-Mask Optimization (SMO) used as advanced reticle enhancement techniques. The success of these tasks is strongly dependent on the integrity of the lithographic simulators used in computational lithography (CL) optimizers. Lithographic mask models used by these simulators are key drivers impacting the accuracy of the image predictions and, as a consequence, determine the validity of these CL solutions. Much of the CL work involves Kirchhoff mask models, a.k.a. the thin-mask approximation, simplifying the treatment of the mask near-field images. On the other hand, imaging models for hyper-NA scanners require that the interactions of the illumination fields with the mask topography be rigorously accounted for, by numerically solving Maxwell's Equations. The simulators used to predict the image formation in the hyper-NA scanners must rigorously treat the mask's topography and its interaction with the scanner illuminators. Such imaging models come at a high computational cost and pose challenging accuracy vs. compute time tradeoffs. Additional complication comes from the fact that the performance metrics used in computational lithography tasks show highly non-linear response to the optimization parameters. Finally, the number of patterns used for tasks such as OPC, OPEM, SO, or SMO ranges from tens to hundreds. These requirements determine the complexity and the workload of the lithography optimization tasks. The tools to build rigorous imaging optimizers based on first-principles governing imaging in scanners are available, but the quantifiable benefits they might provide are not very well understood. To quantify the performance of OPE matching solutions, we have compared the results of various imaging optimization trials obtained with Kirchhoff mask models to those obtained with rigorous models involving solutions of Maxwell's Equations. In both sets of trials, we used sets of large numbers of patterns, with specifications representative of CL tasks commonly encountered in hyper-NA imaging. In this report we present OPEM solutions based on various mask models and discuss the models' impact on hyper-NA scanner matching accuracy. We draw conclusions on the accuracy of results obtained with thin mask models vs. the topographic OPEM solutions. We present various examples of scanner image matching for patterns representative of the current generation of IC designs.
Competency Assessment in Senior Emergency Medicine Residents for Core Ultrasound Skills.
Schmidt, Jessica N; Kendall, John; Smalley, Courtney
2015-11-01
Quality resident education in point-of-care ultrasound (POC US) is becoming increasingly important in emergency medicine (EM); however, the best methods to evaluate competency in graduating residents have not been established. We sought to design and implement a rigorous assessment of image acquisition and interpretation in POC US in a cohort of graduating residents at our institution. We evaluated nine senior residents in both image acquisition and image interpretation for five core US skills (focused assessment with sonography for trauma (FAST), aorta, echocardiogram (ECHO), pelvic, central line placement). Image acquisition was measured using an observed clinical skills exam (OSCE) directed assessment with a standardized patient model. Image interpretation was measured with a multiple-choice exam including normal and pathologic images. Residents performed well on image acquisition, with an average score of 85.7% for core skills and 74% when advanced skills (ovaries, advanced ECHO, advanced aorta) were included. Residents scored well but slightly lower on image interpretation, with an average score of 76%. Senior residents performed well on core POC US skills as evaluated with a rigorous assessment tool. This tool may be developed further for other EM programs to use for graduating resident evaluation.
Principles to Products: Toward Realizing MOS 2.0
NASA Technical Reports Server (NTRS)
Bindschadler, Duane L.; Delp, Christopher L.
2012-01-01
This is a report on the Operations Revitalization Initiative, part of the ongoing NASA-funded Advanced Multi-Mission Operations Systems (AMMOS) program. We are implementing products that significantly improve efficiency and effectiveness of Mission Operations Systems (MOS) for deep-space missions. We take a multi-mission approach, in keeping with our organization's charter to "provide multi-mission tools and services that enable mission customers to operate at a lower total cost to NASA." Focusing first on architectural fundamentals of the MOS, we review the effort's progress. In particular, we note the use of stakeholder interactions and consideration of past lessons learned to motivate a set of Principles that guide the evolution of the AMMOS. Thus guided, we have created essential patterns and connections (detailed in companion papers) that are explicitly modeled and support elaboration at multiple levels of detail (system, sub-system, element...) throughout a MOS. This architecture is realized in design and implementation products that provide lifecycle support to a Mission at the system and subsystem level. The products include adaptable multi-mission engineering documentation that describes essentials such as operational concepts and scenarios, requirements, interfaces and agreements, information models, and mission operations processes. Because we have adopted a model-based system engineering method, these documents and their contents are meaningfully related to one another and to the system model. This means they are both more rigorous and reusable (from mission to mission) than standard system engineering products. The use of models also enables detailed, early (e.g., formulation phase) insight into the impact of changes (e.g., to interfaces or to software) that is rigorous and complete, allowing better decisions on cost or technical trades. Finally, our work provides clear and rigorous specification of operations needs to software developers, further enabling significant gains in productivity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maitra, Neepa
2016-07-14
This project investigates the accuracy of currently-used functionals in time-dependent density functional theory, which is today routinely used to predict and design materials and computationally model processes in solar energy conversion. The rigorously-based electron-ion dynamics method developed here sheds light on traditional methods and overcomes challenges those methods have. The fundamental research undertaken here is important for building reliable and practical methods for materials discovery. The ultimate goal is to use these tools for the computational design of new materials for solar cell devices of high efficiency.
Models in palaeontological functional analysis
Anderson, Philip S. L.; Bright, Jen A.; Gill, Pamela G.; Palmer, Colin; Rayfield, Emily J.
2012-01-01
Models are a principal tool of modern science. By definition, and in practice, models are not literal representations of reality but provide simplifications or substitutes of the events, scenarios or behaviours that are being studied or predicted. All models make assumptions, and palaeontological models in particular require additional assumptions to study unobservable events in deep time. In the case of functional analysis, the degree of missing data associated with reconstructing musculoskeletal anatomy and neuronal control in extinct organisms has, in the eyes of some scientists, rendered detailed functional analysis of fossils intractable. Such a prognosis may indeed be realized if palaeontologists attempt to recreate elaborate biomechanical models based on missing data and loosely justified assumptions. Yet multiple enabling methodologies and techniques now exist: tools for bracketing boundaries of reality; more rigorous consideration of soft tissues and missing data; and methods drawing on physical principles that all organisms must adhere to. As with many aspects of science, the utility of such biomechanical models depends on the questions they seek to address, and the accuracy and validity of the models themselves. PMID:21865242
NASA Astrophysics Data System (ADS)
Harter, T.; Davis, R.; Smart, D. R.; Brown, P. H.; Dzurella, K.; Bell, A.; Kourakos, G.
2017-12-01
Nutrient fluxes to groundwater have been subject to regulatory assessment and control only in a limited number of countries, including those in the European Union, where the Water Framework Directive requires member countries to manage groundwater basins toward achieving "good status", and California, where irrigated lands will be subject to permitting, stringent nutrient monitoring requirements, and development of practices that are protective of groundwater. However, research activities to rigorously assess agricultural practices for their impact on groundwater have been limited and instead focused on surface water protection. For groundwater-related assessment of agricultural practices, a wide range of modeling tools has been employed: vulnerability studies, nitrogen mass balance assessments, crop-soil-system models, and various statistical tools. These tools are predominantly used to identify high risk regions, practices, or crops. Here we present the development of a field site for rigorous in-situ evaluation of water and nutrient management practices in an irrigated agricultural setting. Integrating groundwater monitoring into agricultural practice assessment requires large research plots (on the order of 10s to 100s of hectares) and multi-year research time-frames - much larger than typical agricultural field research plots. Almonds are among the most common crops in California with intensive use of nitrogen fertilizer and were selected for their high water quality improvement potential. Availability of an orchard site with relatively vulnerable groundwater conditions (sandy soils, water table depth less than 10 m) was also important in site selection. Initial results show that shallow groundwater concentrations are commensurate with nitrogen leaching estimates obtained by considering historical, long-term field nitrogen mass balance and groundwater dynamics.
The Health Impact Assessment (HIA) Resource and Tool ...
Health Impact Assessment (HIA) is a relatively new and rapidly emerging field in the U.S. An inventory of available HIA resources and tools was conducted, with a primary focus on resources developed in the U.S. The resources and tools available to HIA practitioners in the conduct of their work were identified through multiple methods and compiled into a comprehensive list. The compilation includes tools and resources related to the HIA process itself and those that can be used to collect and analyze data, establish a baseline profile, assess potential health impacts, and establish benchmarks and indicators for monitoring and evaluation. These resources include literature and evidence bases, data and statistics, guidelines, benchmarks, decision and economic analysis tools, scientific models, methods, frameworks, indices, mapping, and various data collection tools. Understanding the data, tools, models, methods, and other resources available to perform HIAs will help to advance the HIA community of practice in the U.S., improve the quality and rigor of assessments upon which stakeholder and policy decisions are based, and potentially improve the overall effectiveness of HIA to promote healthy and sustainable communities. The Health Impact Assessment (HIA) Resource and Tool Compilation is a comprehensive list of resources and tools that can be utilized by HIA practitioners with all levels of HIA experience to guide them throughout the HIA process. The HIA Resource
NextGen Operational Improvements: Will they Improve Human Performance
NASA Technical Reports Server (NTRS)
Beard, Bettina L.; Johnston, James C.; Holbrook, Jon
2013-01-01
Modernization of the National Airspace System depends critically on the development of advanced technology, including cutting-edge automation, controller decision-support tools and integrated on-demand information. The Next Generation Air Transportation System national plan envisions air traffic control tower automation that proposes solutions for seven problems: 1) departure metering, 2) taxi routing, 3) taxi and runway scheduling, 4) departure runway assignments, 5) departure flow management, 6) integrated arrival and departure scheduling and 7) runway configuration management. Government, academia and industry are simultaneously pursuing the development of these tools. For each tool, the development process typically begins by assessing its potential benefits, and then progresses to designing preliminary versions of the tool, followed by testing the tool's strengths and weaknesses using computational modeling, human-in-the-loop simulation and/or field tests. We compiled the literature, evaluated the methodological rigor of the studies and served as referee for partisan conclusions that were sometimes overly optimistic. Here we provide the results of this review.
Using Dynamic Tools to Develop an Understanding of the Fundamental Ideas of Calculus
ERIC Educational Resources Information Center
Verzosa, Debbie; Guzon, Angela Fatima; De Las Peñas, Ma. Louise Antonette N.
2014-01-01
Although dynamic geometry software has been extensively used for teaching calculus concepts, few studies have documented how these dynamic tools may be used for teaching the rigorous foundations of the calculus. In this paper, we describe lesson sequences utilizing dynamic tools for teaching the epsilon-delta definition of the limit and the…
A Tool for Rethinking Teachers' Questioning
ERIC Educational Resources Information Center
Simpson, Amber; Mokalled, Stefani; Ellenburg, Lou Ann; Che, S. Megan
2014-01-01
In this article, the authors present a tool, the Cognitive Rigor Matrix (CRM; Hess et al. 2009), as a means to analyze and reflect on the type of questions posed by mathematics teachers. This tool is intended to promote and develop higher-order thinking and inquiry through the use of purposeful questions and mathematical tasks. The authors…
Quasi-experimental study designs series-paper 6: risk of bias assessment.
Waddington, Hugh; Aloe, Ariel M; Becker, Betsy Jane; Djimeu, Eric W; Hombrados, Jorge Garcia; Tugwell, Peter; Wells, George; Reeves, Barney
2017-09-01
Rigorous and transparent bias assessment is a core component of high-quality systematic reviews. We assess modifications to existing risk of bias approaches to incorporate rigorous quasi-experimental approaches with selection on unobservables. These are nonrandomized studies using design-based approaches to control for unobservable sources of confounding such as difference studies, instrumental variables, interrupted time series, natural experiments, and regression-discontinuity designs. We review existing risk of bias tools. Drawing on these tools, we present domains of bias and suggest directions for evaluation questions. The review suggests that existing risk of bias tools provide, to different degrees, incomplete transparent criteria to assess the validity of these designs. The paper then presents an approach to evaluating the internal validity of quasi-experiments with selection on unobservables. We conclude that tools for nonrandomized studies of interventions need to be further developed to incorporate evaluation questions for quasi-experiments with selection on unobservables. Copyright © 2017 Elsevier Inc. All rights reserved.
Modelling and interpreting spectral energy distributions of galaxies with BEAGLE
NASA Astrophysics Data System (ADS)
Chevallard, Jacopo; Charlot, Stéphane
2016-10-01
We present a new-generation tool to model and interpret spectral energy distributions (SEDs) of galaxies, which incorporates in a consistent way the production of radiation and its transfer through the interstellar and intergalactic media. This flexible tool, named BEAGLE (for BayEsian Analysis of GaLaxy sEds), allows one to build mock galaxy catalogues as well as to interpret any combination of photometric and spectroscopic galaxy observations in terms of physical parameters. The current version of the tool includes versatile modelling of the emission from stars and photoionized gas, attenuation by dust and accounting for different instrumental effects, such as spectroscopic flux calibration and line spread function. We show a first application of the BEAGLE tool to the interpretation of broad-band SEDs of a published sample of ~10^4 galaxies at redshifts 0.1 ≲ z ≲ 8. We find that the constraints derived on photometric redshifts using this multipurpose tool are comparable to those obtained using public, dedicated photometric-redshift codes and quantify this result in a rigorous statistical way. We also show how the post-processing of BEAGLE output data with the PYTHON extension PYP-BEAGLE allows the characterization of systematic deviations between models and observations, in particular through posterior predictive checks. The modular design of the BEAGLE tool allows easy extensions to incorporate, for example, the absorption by neutral galactic and circumgalactic gas, and the emission from an active galactic nucleus, dust and shock-ionized gas. Information about public releases of the BEAGLE tool will be maintained on http://www.jacopochevallard.org/beagle.
NASA Astrophysics Data System (ADS)
Cenek, Martin; Dahl, Spencer K.
2016-11-01
Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
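The framework described above is not reproduced here. As a minimal sketch of one of its ingredients, the following Python snippet estimates the probability of agents switching between discrete behavior patterns, assuming agents' behaviors have already been clustered into integer pattern labels per time step (the function name and toy data are hypothetical).

```python
import numpy as np

def transition_matrix(labels):
    """Estimate behavior-pattern transition probabilities.

    labels: integer array of shape (n_agents, n_steps); labels[i, t] is the
    behavior pattern of agent i at step t. Returns a row-stochastic matrix P
    with P[a, b] = estimated probability of moving from pattern a to b.
    """
    labels = np.asarray(labels)
    k = labels.max() + 1
    counts = np.zeros((k, k))
    for agent in labels:                          # count observed transitions
        for a, b in zip(agent[:-1], agent[1:]):
            counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0                 # avoid division by zero
    return counts / row_sums

# Toy example: 3 agents, 6 time steps, 2 behavior patterns.
example = np.array([[0, 0, 1, 1, 1, 0],
                    [1, 1, 1, 0, 0, 0],
                    [0, 1, 0, 1, 0, 1]])
print(transition_matrix(example))
```

The resulting row-stochastic matrix is the kind of object that network-based techniques, such as those mentioned in the abstract, could then analyze for common behavioral building blocks.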
Connected Learning Communities: A Toolkit for Reinventing High School.
ERIC Educational Resources Information Center
Almeida, Cheryl, Ed.; Steinberg, Adria, Ed.
This document presents tools and guidelines to help practitioners transform their high schools into institutions facilitating community-connected learning. The approach underpinning the tools and guidelines is based on the following principles: academic rigor and relevance; personalized learning; self-passage to adulthood; and productive learning…
ERIC Educational Resources Information Center
New Teacher Project, 2011
2011-01-01
This "Rating a Teacher Observation Tool" identifies five simple questions and provides an easy-to-use scorecard to help policymakers decide whether an observation framework is likely to produce fair and accurate results. The five questions are: (1) Do the criteria and tools cover the classroom performance areas most connected to student outcomes?…
ERIC Educational Resources Information Center
Harris, Rick
1995-01-01
In a partnership between several tool companies and vocational high schools, students in construction technology classes give new products a fair and rigorous workout at a fraction of the cost of focus groups. The process allows companies to expose their products to students who, in turn, provide critical evaluation of the tools. (JOW)
Modeling and Analysis of the Reverse Water Gas Shift Process for In-Situ Propellant Production
NASA Technical Reports Server (NTRS)
Whitlow, Jonathan E.
2000-01-01
This report focuses on the development of mathematical models and simulation tools developed for the Reverse Water Gas Shift (RWGS) process. This process is a candidate technology for oxygen production on Mars under the In-Situ Propellant Production (ISPP) project. An analysis of the RWGS process was performed using a material balance for the system. The material balance is very complex due to the downstream separations and subsequent recycle inherent with the process. A numerical simulation was developed for the RWGS process to provide a tool for analysis and optimization of experimental hardware, which will be constructed later this year at Kennedy Space Center (KSC). Attempts to solve the material balance for the system, which can be defined by 27 nonlinear equations, initially failed. A convergence scheme was developed which led to successful solution of the material balance, however the simplified equations used for the gas separation membrane were found insufficient. Additional more rigorous models were successfully developed and solved for the membrane separation. Sample results from these models are included in this report, with recommendations for experimental work needed for model validation.
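The 27-equation KSC material balance is not reproduced here. The sketch below sets up a much smaller nonlinear steady-state balance for an RWGS-style reactor with recycle and solves it with SciPy, assuming a fixed single-pass conversion, complete water knockout, and a purge fraction; all numbers are illustrative, not taken from the report.

```python
import numpy as np
from scipy.optimize import fsolve

# Assumed toy flowsheet: fresh CO2 and excess H2, reactor with fixed
# single-pass CO2 conversion X, water condensed out, CO withdrawn,
# unreacted CO2 and H2 recycled with purge fraction p.
F_CO2, F_H2 = 1.0, 3.0     # fresh feed, mol/s
X, p = 0.6, 0.1            # single-pass conversion, purge fraction

def residuals(v):
    nCO2_in, nH2_in, nCO2_out, nH2_out, nCO, nH2O, rCO2, rH2 = v
    extent = X * nCO2_in                      # CO2 + H2 -> CO + H2O
    return [
        nCO2_in - (F_CO2 + rCO2),             # mixer: CO2
        nH2_in - (F_H2 + rH2),                # mixer: H2
        nCO2_out - (1.0 - X) * nCO2_in,       # reactor: unreacted CO2
        nH2_out - (nH2_in - extent),          # reactor: unreacted H2
        nCO - extent,                         # reactor: CO produced
        nH2O - extent,                        # reactor: H2O produced
        rCO2 - (1.0 - p) * nCO2_out,          # recycle after purge: CO2
        rH2 - (1.0 - p) * nH2_out,            # recycle after purge: H2
    ]

solution = fsolve(residuals, np.ones(8))
names = ["CO2 in", "H2 in", "CO2 out", "H2 out", "CO", "H2O", "CO2 recycle", "H2 recycle"]
for name, value in zip(names, solution):
    print(f"{name:12s} {value:8.3f} mol/s")
```

Poor initial guesses can stall such solvers, which mirrors the convergence difficulties noted in the report; a staged or damped convergence scheme is the usual remedy.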
Diehl, Glen; Major, Solomon
2015-01-01
Measuring the effectiveness of military Global Health Engagements (GHEs) has become an area of increasing interest to the military medical field. As a result, there have been efforts to more logically and rigorously evaluate GHE projects and programs; many of these have been based on the Logic and Results Frameworks. However, while these Frameworks are apt and appropriate planning tools, they are not ideally suited to measuring programs' effectiveness. This article introduces military medicine professionals to the Measures of Effectiveness for Defense Engagement and Learning (MODEL) program, which implements a new method of assessment, one that seeks to rigorously use Measures of Effectiveness (vs. Measures of Performance) to gauge programs' and projects' success and fidelity to Theater Campaign goals. While the MODEL method draws on the Logic and Results Frameworks where appropriate, it goes beyond their planning focus by using the latest social scientific and econometric evaluation methodologies to link on-the-ground GHE "lines of effort" to the realization of national and strategic goals and end-states. It is hoped these methods will find use beyond the MODEL project itself, and will catalyze a new body of rigorous, empirically based work, which measures the effectiveness of a broad spectrum of GHE and security cooperation activities. We based our strategies on the principle that it is much more cost-effective to prevent conflicts than it is to stop one once it's started. I cannot overstate the importance of our theater security cooperation programs as the centerpiece to securing our Homeland from the irregular and catastrophic threats of the 21st Century.-GEN James L. Jones, USMC (Ret.). Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
Lomiwes, D; Reis, M M; Wiklund, E; Young, O A; North, M
2010-12-01
The potential of near infrared (NIR) spectroscopy as an on-line method to quantify glycogen and predict ultimate pH (pH(u)) of pre rigor beef M. longissimus dorsi (LD) was assessed. NIR spectra (538 to 1677 nm) of pre rigor LD from steers, cows and bulls were collected early post mortem and measurements were made for pre rigor glycogen concentration and pH(u). Spectral and measured data were combined to develop models to quantify glycogen and predict the pH(u) of pre rigor LD. NIR spectra and pre rigor predicted values obtained from quantitative models were shown to be poorly correlated against glycogen and pH(u) (r(2)=0.23 and 0.20, respectively). Qualitative models developed to categorize each muscle according to their pH(u) were able to correctly categorize 42% of high pH(u) samples. Optimum qualitative and quantitative models derived from NIR spectra found low correlation between predicted values and reference measurements. Copyright © 2010 The American Meat Science Association. Published by Elsevier Ltd.. All rights reserved.
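Partial least squares (PLS) regression is a standard approach for NIR calibrations of this kind. The snippet below is a generic sketch on synthetic data (not the meat spectra from the study) showing how a calibration/validation split and an r^2 check might be set up with scikit-learn.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Synthetic stand-in for NIR spectra: 120 samples x 200 wavelengths,
# with a pH-like target weakly encoded in a few spectral regions plus noise.
n_samples, n_wavelengths = 120, 200
X = rng.normal(size=(n_samples, n_wavelengths))
true_coef = np.zeros(n_wavelengths)
true_coef[40:45] = 0.4
true_coef[150:155] = -0.3
y = 5.6 + X @ true_coef + rng.normal(scale=0.2, size=n_samples)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5)
pls.fit(X_cal, y_cal)
y_pred = pls.predict(X_val).ravel()
print(f"validation r^2 = {r2_score(y_val, y_pred):.2f}")
```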
Automated Design Space Exploration with Aspen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spafford, Kyle L.; Vetter, Jeffrey S.
Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
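Aspen's actual language constructs are not shown here. As a rough illustration of the underlying idea, a modeled runtime minimized subject to resource constraints, the following sketch poses a hypothetical two-parameter design space (tile size and node count, relaxed to continuous values) as a small nonlinear program and solves it with SciPy.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical analytic performance model (not an actual Aspen model):
# x = [tile_size, node_count].
def runtime(x):
    tile, nodes = x
    compute = 1.0e4 / nodes * (1.0 + 32.0 / tile)   # work plus per-tile overhead
    comm = 0.05 * nodes                              # communication grows with node count
    return compute + comm

# Resource constraint: assumed per-node working set (~tile^2) must fit a 256-unit budget.
constraints = [{"type": "ineq", "fun": lambda x: 256.0 - 0.01 * x[0] ** 2}]

result = minimize(runtime, x0=[16.0, 8.0], method="SLSQP",
                  bounds=[(1.0, 512.0), (1.0, 1024.0)], constraints=constraints)
tile_opt, nodes_opt = result.x
print(f"tile ~ {tile_opt:.0f}, nodes ~ {nodes_opt:.0f}, modeled runtime ~ {result.fun:.1f}")
```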
Ridenour, Ty A; Wittenborn, Andrea K; Raiff, Bethany R; Benedict, Neal; Kane-Gill, Sandra
2016-03-01
A critical juncture in translation research involves the preliminary studies of intervention tools, provider training programs, policies, and other mechanisms used to leverage knowledge garnered at one translation stage into another stage. Potentially useful for such studies are rigorous techniques for conducting within-subject clinical trials, which have advanced incrementally over the last decade. However, these methods have largely not been utilized within prevention or translation contexts. The purpose of this manuscript is to demonstrate the flexibility, wide applicability, and rigor of idiographic clinical trials for preliminary testing of intervention mechanisms. Specifically demonstrated are novel uses of state-space modeling for testing intervention mechanisms of short-term outcomes, identifying heterogeneity in and moderation of within-person treatment mechanisms, a horizontal line plot to refine sampling design during the course of a clinic-based experimental study, and the need to test a treatment's efficacy as treatment is administered, in addition to traditional 12-month outcomes.
Consistent Chemical Mechanism from Collaborative Data Processing
Slavinskaya, Nadezda; Starcke, Jan-Hendrik; Abbasi, Mehdi; ...
2016-04-01
The Process Informatics Model (PrIMe) numerical tool is a mathematically rigorous and numerically efficient approach for the analysis and optimization of chemical systems. It handles heterogeneous data and is scalable to a large number of parameters. The Bound-to-Bound Data Collaboration module of the automated data-centric infrastructure of PrIMe was used for systematic uncertainty and data-consistency analyses of the H2/CO reaction model (73/17) and 94 experimental targets (ignition delay times). An empirical rule for evaluation of the shock tube experimental data is proposed. The initial results demonstrate clear benefits of the PrIMe methods for evaluating kinetic data quality and data consistency and for developing predictive kinetic models.
Rigorous derivation of porous-media phase-field equations
NASA Astrophysics Data System (ADS)
Schmuck, Markus; Kalliadasis, Serafim
2017-11-01
The evolution of interfaces in Complex heterogeneous Multiphase Systems (CheMSs) plays a fundamental role in a wide range of scientific fields such as thermodynamic modelling of phase transitions and materials science, and as a computational tool for interfacial flow studies or material design. Here, we focus on phase-field equations in CheMSs such as porous media. To the best of our knowledge, we present the first rigorous derivation of error estimates for fourth-order, upscaled, nonlinear evolution equations. For CheMSs with heterogeneity ε, we obtain the convergence rate ε^{1/4}, which governs the error between the solution of the new upscaled formulation and the solution of the microscopic phase-field problem. This error behaviour has recently been validated computationally. Due to the wide range of applications of phase-field equations, we expect this upscaled formulation to allow for new modelling, analytic, and computational perspectives for interfacial transport and phase transformations in CheMSs. This work was supported by EPSRC, UK, through Grant Nos. EP/H034587/1, EP/L027186/1, EP/L025159/1, EP/L020564/1, EP/K008595/1, and EP/P011713/1 and from ERC via Advanced Grant No. 247031.
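In hedged, generic form, the convergence statement in the abstract amounts to an error estimate of the following type, where u_ε solves the microscopic phase-field problem, u_0 solves the upscaled formulation, and C is independent of the heterogeneity ε (the precise norms and assumptions are those of the paper):

```latex
\[
  \left\| u_{\varepsilon} - u_{0} \right\| \;\le\; C\,\varepsilon^{1/4}.
\]
```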
Alves, Vinicius M.; Muratov, Eugene; Fourches, Denis; Strickland, Judy; Kleinstreuer, Nicole; Andrade, Carolina H.; Tropsha, Alexander
2015-01-01
Repetitive exposure to a chemical agent can induce an immune reaction in inherently susceptible individuals that leads to skin sensitization. Although many chemicals have been reported as skin sensitizers, there have been very few rigorously validated QSAR models with defined applicability domains (AD) that were developed using a large group of chemically diverse compounds. In this study, we have aimed to compile, curate, and integrate the largest publicly available dataset related to chemically-induced skin sensitization, use these data to generate rigorously validated QSAR models for skin sensitization, and employ these models as a virtual screening tool for identifying putative sensitizers among environmental chemicals. We followed best practices for model building and validation implemented with our predictive QSAR workflow using the random forest modeling technique in combination with SiRMS and Dragon descriptors. The Correct Classification Rate (CCR) for QSAR models discriminating sensitizers from non-sensitizers was 71–88% when evaluated on several external validation sets, within a broad AD, with positive (for sensitizers) and negative (for non-sensitizers) predicted rates of 85% and 79%, respectively. When compared to the skin sensitization module included in the OECD QSAR Toolbox as well as to the skin sensitization model in the publicly available VEGA software, our models showed a significantly higher prediction accuracy for the same sets of external compounds as evaluated by Positive Predicted Rate, Negative Predicted Rate, and CCR. These models were applied to identify putative chemical hazards in the ScoreCard database of possible skin or sense organ toxicants as primary candidates for experimental validation. PMID:25560674
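The curated dataset and descriptors are not reproduced here. The sketch below uses synthetic binary data to show the general shape of such a workflow in scikit-learn: a random forest classifier, an external hold-out set, and the Correct Classification Rate computed as balanced accuracy (the mean of sensitivity and specificity).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score

# Synthetic stand-in for curated descriptor data (sensitizer = 1, non-sensitizer = 0).
X, y = make_classification(n_samples=600, n_features=50, n_informative=10,
                           weights=[0.6, 0.4], random_state=0)

# Hold out an "external" validation set, as in rigorous QSAR practice.
X_train, X_ext, y_train, y_ext = train_test_split(X, y, test_size=0.25,
                                                  stratify=y, random_state=0)

model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X_train, y_train)

# CCR as reported in such studies corresponds to balanced accuracy.
ccr = balanced_accuracy_score(y_ext, model.predict(X_ext))
print(f"external CCR ~ {ccr:.2f}")
```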
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef
Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale *sequential* data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
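None of the project's accelerated algorithms are shown here. Purely to fix notation for the Bayesian setting described, the following is a minimal random-walk Metropolis sketch that infers one parameter of a toy forward model from noisy data; the model, prior, and noise level are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forward model y = a * x^2 with unknown parameter a; noisy observations.
x_obs = np.linspace(0.0, 1.0, 20)
a_true = 2.5
y_obs = a_true * x_obs ** 2 + rng.normal(scale=0.1, size=x_obs.size)

def log_posterior(a, sigma=0.1):
    # Gaussian likelihood plus a broad Gaussian prior on a (assumed).
    resid = y_obs - a * x_obs ** 2
    log_like = -0.5 * np.sum((resid / sigma) ** 2)
    log_prior = -0.5 * (a / 10.0) ** 2
    return log_like + log_prior

# Random-walk Metropolis sampler.
samples, a_current = [], 1.0
logp_current = log_posterior(a_current)
for _ in range(20000):
    a_prop = a_current + rng.normal(scale=0.05)
    logp_prop = log_posterior(a_prop)
    if np.log(rng.uniform()) < logp_prop - logp_current:
        a_current, logp_current = a_prop, logp_prop
    samples.append(a_current)

posterior = np.array(samples[5000:])   # discard burn-in
print(f"posterior mean ~ {posterior.mean():.2f} +/- {posterior.std():.2f}")
```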
Tools for observational gait analysis in patients with stroke: a systematic review.
Ferrarello, Francesco; Bianchi, Valeria Anna Maria; Baccini, Marco; Rubbieri, Gaia; Mossello, Enrico; Cavallini, Maria Chiara; Marchionni, Niccolò; Di Bari, Mauro
2013-12-01
Stroke severely affects walking ability, and assessment of gait kinematics is important in defining diagnosis, planning treatment, and evaluating interventions in stroke rehabilitation. Although observational gait analysis is the most common approach to evaluate gait kinematics, tools useful for this purpose have received little attention in the scientific literature and have not been thoroughly reviewed. The aims of this systematic review were to identify tools proposed to conduct observational gait analysis in adults with a stroke, to summarize evidence concerning their quality, and to assess their implementation in rehabilitation research and clinical practice. An extensive search was performed of original articles reporting on visual/observational tools developed to investigate gait kinematics in adults with a stroke. Two reviewers independently selected studies, extracted data, assessed quality of the included studies, and scored the metric properties and clinical utility of each tool. Rigor in reporting metric properties and dissemination of the tools also was evaluated. Five tools were identified, not all of which had been tested adequately for their metric properties. Evaluation of content validity was partially satisfactory. Reliability was poorly investigated in all but one tool. Concurrent validity and sensitivity to change were shown for 3 and 2 tools, respectively. Overall, adequate levels of quality were rarely reached. The dissemination of the tools was poor. Based on critical appraisal, the Gait Assessment and Intervention Tool shows a good level of quality, and its use in stroke rehabilitation is recommended. Rigorous studies are needed for the other tools in order to establish their usefulness.
Nowke, Christian; Diaz-Pier, Sandra; Weyers, Benjamin; Hentschel, Bernd; Morrison, Abigail; Kuhlen, Torsten W.; Peyser, Alexander
2018-01-01
Simulation models in many scientific fields can have non-unique solutions or unique solutions which can be difficult to find. Moreover, in evolving systems, unique final state solutions can be reached by multiple different trajectories. Neuroscience is no exception. Often, neural network models are subject to parameter fitting to obtain desirable output comparable to experimental data. Parameter fitting without sufficient constraints and a systematic exploration of the possible solution space can lead to conclusions valid only around local minima or around non-minima. To address this issue, we have developed an interactive tool for visualizing and steering parameters in neural network simulation models. In this work, we focus particularly on connectivity generation, since finding suitable connectivity configurations for neural network models constitutes a complex parameter search scenario. The development of the tool has been guided by several use cases—the tool allows researchers to steer the parameters of the connectivity generation during the simulation, thus quickly growing networks composed of multiple populations with a targeted mean activity. The flexibility of the software allows scientists to explore other connectivity and neuron variables apart from the ones presented as use cases. With this tool, we enable an interactive exploration of parameter spaces and a better understanding of neural network models and grapple with the crucial problem of non-unique network solutions and trajectories. In addition, we observe a reduction in turn around times for the assessment of these models, due to interactive visualization while the simulation is computed. PMID:29937723
Increasing rigor in NMR-based metabolomics through validated and open source tools
Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L
2016-01-01
The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism’s phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic spectroscopy (NMR), mass spectrometry (MS), and the published literature, as processed by statistical approaches, are driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. PMID:27643760
NASA-STD-7009 Guidance Document for Human Health and Performance Models and Simulations
NASA Technical Reports Server (NTRS)
Walton, Marlei; Mulugeta, Lealem; Nelson, Emily S.; Myers, Jerry G.
2014-01-01
Rigorous verification, validation, and credibility (VVC) processes are imperative to ensure that models and simulations (MS) are sufficiently reliable to address issues within their intended scope. The NASA standard for MS, NASA-STD-7009 (7009) [1], was an outcome of the Columbia Accident Investigation Board (CAIB), intended to ensure that MS are developed, applied, and interpreted appropriately for making decisions that may impact crew or mission safety. Because the focus of 7009 is engineering systems, a NASA-STD-7009 Guidance Document is being developed to augment the standard and provide information, tools, and techniques applicable to the probabilistic and deterministic biological MS more prevalent in human health and performance (HHP) and space biomedical research and operations.
ERIC Educational Resources Information Center
Wendt, Oliver; Miller, Bridget
2012-01-01
Critical appraisal of the research literature is an essential step in informing and implementing evidence-based practice. Quality appraisal tools that assess the methodological quality of experimental studies provide a means to identify the most rigorous research suitable for evidence-based decision-making. In single-subject experimental research,…
ERIC Educational Resources Information Center
Acharya, Sushil; Manohar, Priyadarshan Anant; Wu, Peter; Maxim, Bruce; Hansen, Mary
2018-01-01
Active learning tools are critical in imparting real world experiences to the students within a classroom environment. This is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains with little to no training. However, there is a well-recognized need for the…
ERIC Educational Resources Information Center
Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter
2017-01-01
Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…
Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?
ERIC Educational Resources Information Center
Brondani, Mario; He, Sarah
2013-01-01
Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…
Dynamic Modeling of Yield and Particle Size Distribution in Continuous Bayer Precipitation
NASA Astrophysics Data System (ADS)
Stephenson, Jerry L.; Kapraun, Chris
Process engineers at Alcoa's Point Comfort refinery are using a dynamic model of the Bayer precipitation area to evaluate options in operating strategies. The dynamic model, a joint development effort between Point Comfort and the Alcoa Technical Center, predicts process yields, particle size distributions and occluded soda levels for various flowsheet configurations of the precipitation and classification circuit. In addition to rigorous heat, material and particle population balances, the model includes mechanistic kinetic expressions for particle growth and agglomeration and semi-empirical kinetics for nucleation and attrition. The kinetic parameters have been tuned to Point Comfort's operating data, with excellent matches between the model results and plant data. The model is written for the ACSL dynamic simulation program with specifically developed input/output graphical user interfaces to provide a user-friendly tool. Features such as a seed charge controller enhance the model's usefulness for evaluating operating conditions and process control approaches.
2011-01-01
Background: High-income nations are currently exhibiting increasing ethno-cultural diversity, which may present challenges for nursing practice. We performed an integrative review of literature published in North America and Europe between 1990 and 2007 to map the state of knowledge and to identify nursing assessment tools/models that have an associated research or empirical perspective in relation to ethno-cultural dimensions of nursing care. Methods: Data were retrieved from a wide variety of sources, including key electronic bibliographic databases covering research in biomedical fields, nursing and allied health, and culture, e.g. CINAHL, MEDLINE, PubMed, the Cochrane Library, PsycINFO, Web of Science, and HAPI. We used the Critical Appraisal Skills Programme tools for quality assessment. We applied Torraco's definition and method of an integrative review, which aims to create new knowledge and perspectives on a given phenomenon. To add methodological rigor with respect to the search strategy and other key review components, we also used the principles established by the Centre for Reviews and Dissemination. Results: Thirteen thousand and thirteen articles were retrieved, from which 53 full papers were assessed for inclusion. Eight papers met the inclusion criteria, describing research on a total of eight ethno-cultural assessment tools/models. The tools/models are described and synthesized. Conclusions: While many ethno-cultural assessment tools exist to guide nursing practice, few are informed by research perspectives. An increased focus on the efficiency and effectiveness of health services, patient safety, and risk management means that provision of culturally responsive and competent health services will inevitably become paramount. PMID:21812960
MI-Sim: A MATLAB package for the numerical analysis of microbial ecological interactions.
Wade, Matthew J; Oakley, Jordan; Harbisher, Sophie; Parker, Nicholas G; Dolfing, Jan
2017-01-01
Food-webs and other classes of ecological network motifs, are a means of describing feeding relationships between consumers and producers in an ecosystem. They have application across scales where they differ only in the underlying characteristics of the organisms and substrates describing the system. Mathematical modelling, using mechanistic approaches to describe the dynamic behaviour and properties of the system through sets of ordinary differential equations, has been used extensively in ecology. Models allow simulation of the dynamics of the various motifs and their numerical analysis provides a greater understanding of the interplay between the system components and their intrinsic properties. We have developed the MI-Sim software for use with MATLAB to allow a rigorous and rapid numerical analysis of several common ecological motifs. MI-Sim contains a series of the most commonly used motifs such as cooperation, competition and predation. It does not require detailed knowledge of mathematical analytical techniques and is offered as a single graphical user interface containing all input and output options. The tools available in the current version of MI-Sim include model simulation, steady-state existence and stability analysis, and basin of attraction analysis. The software includes seven ecological interaction motifs and seven growth function models. Unlike other system analysis tools, MI-Sim is designed as a simple and user-friendly tool specific to ecological population type models, allowing for rapid assessment of their dynamical and behavioural properties.
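MI-Sim itself is a MATLAB package. The snippet below is a rough Python analogue of one motif (two competitors on a single substrate with Monod growth in a chemostat), illustrating the kind of simulation plus local stability check via Jacobian eigenvalues that such tools automate; parameter values are arbitrary.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two competitors X1, X2 on one substrate S, Monod kinetics, chemostat dilution D.
D, S_in = 0.1, 10.0
mu_max = np.array([0.5, 0.45])
K_s = np.array([0.6, 0.3])
Y = np.array([0.4, 0.4])

def rhs(t, state):
    S, X1, X2 = state
    mu = mu_max * S / (K_s + S)
    growth = mu * np.array([X1, X2])
    dS = D * (S_in - S) - np.sum(growth / Y)
    dX1 = (mu[0] - D) * X1
    dX2 = (mu[1] - D) * X2
    return [dS, dX1, dX2]

sol = solve_ivp(rhs, (0.0, 500.0), [S_in, 0.1, 0.1], rtol=1e-8, atol=1e-10)
steady = sol.y[:, -1]

# Numerical Jacobian at the final state; negative real parts indicate local stability.
def jacobian(f, x, eps=1e-6):
    n = len(x)
    J = np.zeros((n, n))
    f0 = np.array(f(0.0, x))
    for j in range(n):
        xp = np.array(x, dtype=float)
        xp[j] += eps
        J[:, j] = (np.array(f(0.0, xp)) - f0) / eps
    return J

eigvals = np.linalg.eigvals(jacobian(rhs, steady))
print("state near t=500:", np.round(steady, 4))
print("Jacobian eigenvalues:", np.round(eigvals, 4))
```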
Bennett, Hunter; Davison, Kade; Arnold, John; Slattery, Flynn; Martin, Max; Norton, Kevin
2017-10-01
Multicomponent movement assessment tools have become commonplace to measure movement quality, proposing to indicate injury risk and performance capabilities. Despite popular use, there has been no attempt to compare the components of each tool reported in the literature, the processes in which they were developed, or the underpinning rationale for their included content. As such, the objective of this systematic review was to provide a comprehensive summary of current movement assessment tools and appraise the evidence supporting their development. A systematic literature search was performed using PRISMA guidelines to identify multicomponent movement assessment tools. Commonalities between tools and the evidence provided to support the content of each tool was identified. Each tool underwent critical appraisal to identify the rigor in which it was developed, and its applicability to professional practice. Eleven tools were identified, of which 5 provided evidence to support their content as assessments of movement quality. One assessment tool (Soccer Injury Movement Screen [SIMS]) received an overall score of above 65% on critical appraisal, with a further 2 tools (Movement Competency Screen [MCS] and modified 4 movement screen [M4-MS]) scoring above 60%. Only the MCS provided clear justification for its developmental process. The remaining 8 tools scored between 40 and 60%. On appraisal, the MCS, M4-MS, and SIMS seem to provide the most practical value for assessing movement quality as they provide the strongest reports of developmental rigor and an identifiable evidence base. In addition, considering the evidence provided, these tools may have the strongest potential for identifying performance capabilities and guiding exercise prescription in athletic and sport-specific populations.
NASA Astrophysics Data System (ADS)
Kwon, Young-Sam; Lin, Ying-Chieh; Su, Cheng-Fang
2018-04-01
In this paper, we consider compressible models of magnetohydrodynamic flows, which give rise to a variety of mathematical problems in many areas. We derive a rigorous quasi-geostrophic equation governed by the magnetic field from the rotational compressible magnetohydrodynamic flows with well-prepared initial data. This is the first derivation of a quasi-geostrophic equation governed by the magnetic field, and the tool is based on the relative entropy method. The paper covers two results: the existence of a unique local strong solution of the quasi-geostrophic equation with good regularity, and the derivation of the quasi-geostrophic equation.
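For orientation, a relative entropy (modulated energy) functional of the kind used in such singular-limit arguments for compressible flows typically takes the following generic form; the paper's actual functional additionally carries rotation and magnetic-field terms, so this is only a sketch:

```latex
\[
  \mathcal{E}\big(\varrho, \mathbf{u} \,\big|\, r, \mathbf{U}\big)
  = \int_{\Omega} \Big[ \tfrac{1}{2}\,\varrho\,|\mathbf{u}-\mathbf{U}|^{2}
    + H(\varrho) - H'(r)\,(\varrho - r) - H(r) \Big]\,\mathrm{d}x,
  \qquad
  H(\varrho) = \varrho \int_{1}^{\varrho} \frac{p(z)}{z^{2}}\,\mathrm{d}z .
\]
```

The limit equation is then obtained by inserting a test pair (r, U) built from the expected quasi-geostrophic dynamics and showing that this quantity vanishes as the singular parameters tend to their limits.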
Accuracy Analysis and Validation of the Mars Science Laboratory (MSL) Robotic Arm
NASA Technical Reports Server (NTRS)
Collins, Curtis L.; Robinson, Matthew L.
2013-01-01
The Mars Science Laboratory (MSL) Curiosity Rover is currently exploring the surface of Mars with a suite of tools and instruments mounted to the end of a five degree-of-freedom robotic arm. To verify and meet a set of end-to-end system level accuracy requirements, a detailed positioning uncertainty model of the arm was developed and exercised over the arm operational workspace. Error sources at each link in the arm kinematic chain were estimated and their effects propagated to the tool frames. A rigorous test and measurement program was developed and implemented to collect data to characterize and calibrate the kinematic and stiffness parameters of the arm. Numerous absolute and relative accuracy and repeatability requirements were validated with a combination of analysis and test data extrapolated to the Mars gravity and thermal environment. Initial results of arm accuracy and repeatability on Mars demonstrate the effectiveness of the modeling and test program as the rover continues to explore the foothills of Mount Sharp.
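The MSL arm model itself is not given in this abstract. The following toy sketch illustrates the general approach of propagating joint-angle and link-length uncertainties to tool-frame position error, using a planar three-link chain and Monte Carlo sampling; all dimensions and error magnitudes are assumed.

```python
import numpy as np

rng = np.random.default_rng(2)

# Nominal planar 3-link arm (link lengths in meters, joint angles in degrees).
lengths = np.array([1.0, 0.8, 0.4])
angles = np.radians([30.0, -20.0, 45.0])

def tool_position(lengths, angles):
    """Forward kinematics of a planar serial chain: tool (x, y)."""
    cumulative = np.cumsum(angles)
    x = np.sum(lengths * np.cos(cumulative))
    y = np.sum(lengths * np.sin(cumulative))
    return np.array([x, y])

nominal = tool_position(lengths, angles)

# Assumed error sources: 0.05 deg (1-sigma) joint sensing error, 0.5 mm link-length error.
n_trials = 20000
angle_err = np.radians(rng.normal(scale=0.05, size=(n_trials, 3)))
length_err = rng.normal(scale=5e-4, size=(n_trials, 3))

positions = np.array([tool_position(lengths + dl, angles + da)
                      for da, dl in zip(angle_err, length_err)])
errors_mm = np.linalg.norm(positions - nominal, axis=1) * 1000.0
print(f"tool position error: mean {errors_mm.mean():.2f} mm, "
      f"99th percentile {np.percentile(errors_mm, 99):.2f} mm")
```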
NASA Astrophysics Data System (ADS)
Yan, Zilin; Kim, Yongtae; Hara, Shotaro; Shikazono, Naoki
2017-04-01
The Potts Kinetic Monte Carlo (KMC) model, proven to be a robust tool for studying all stages of the sintering process, is well suited to analyzing the microstructure evolution of electrodes in solid oxide fuel cells (SOFCs). Due to the nature of this model, input parameters of KMC simulations, such as simulation temperatures and attempt frequencies, are difficult to identify. We propose a rigorous and efficient approach to facilitate the input-parameter calibration process using artificial neural networks (ANNs). The trained ANN drastically reduces the number of trial-and-error KMC simulations. The KMC simulation using the calibrated input parameters predicts the microstructures of a La0.6Sr0.4Co0.2Fe0.8O3 cathode material during sintering, showing both qualitative and quantitative congruence with real 3D microstructures obtained by focused ion beam scanning electron microscopy (FIB-SEM) reconstruction.
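The authors' ANN implementation is not reproduced here. The sketch below shows one plausible shape of such a calibration loop using scikit-learn: a neural-network surrogate is trained on synthetic (input parameters, microstructure metric) pairs standing in for KMC runs, and is then inverted over a parameter grid to suggest inputs matching a target value. The generator function, parameter ranges, and target are all hypothetical.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)

# Synthetic stand-in for a batch of KMC runs: inputs are (simulation
# temperature, log10 attempt frequency); output is a microstructure metric.
def fake_kmc(temperature, log_freq):
    return 0.6 + 0.3 * np.tanh(0.8 * temperature + 0.2 * log_freq - 1.0)

T = rng.uniform(0.5, 2.0, 200)
f = rng.uniform(-1.0, 1.0, 200)
X = np.column_stack([T, f])
y = fake_kmc(T, f) + rng.normal(scale=0.005, size=200)

surrogate = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(32, 32),
                                       max_iter=5000, random_state=0))
surrogate.fit(X, y)

# Invert the surrogate by brute force over a parameter grid to find inputs
# whose predicted metric matches a target value.
target = 0.82
grid_T, grid_f = np.meshgrid(np.linspace(0.5, 2.0, 80), np.linspace(-1.0, 1.0, 80))
grid = np.column_stack([grid_T.ravel(), grid_f.ravel()])
pred = surrogate.predict(grid)
best = grid[np.argmin(np.abs(pred - target))]
print(f"candidate KMC inputs for target {target}: T ~ {best[0]:.2f}, "
      f"log10(attempt frequency) ~ {best[1]:.2f}")
```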
NASA Technical Reports Server (NTRS)
Davis, Brian; Turner, Travis L.; Seelecke, Stefan
2008-01-01
An experimental and numerical investigation into the static and dynamic responses of shape memory alloy hybrid composite (SMAHC) beams is performed to provide quantitative validation of a recently commercialized numerical analysis/design tool for SMAHC structures. The SMAHC beam specimens consist of a composite matrix with embedded pre-strained SMA actuators, which act against the mechanical boundaries of the structure when thermally activated to adaptively stiffen the structure. Numerical results are produced from the numerical model as implemented into the commercial finite element code ABAQUS. A rigorous experimental investigation is undertaken to acquire high fidelity measurements including infrared thermography and projection moire interferometry for full-field temperature and displacement measurements, respectively. High fidelity numerical results are also obtained from the numerical model and include measured parameters, such as geometric imperfection and thermal load. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.
The assessment of medical competencies.
Sureda-Demeulemeester, E; Ramis-Palmer, C; Sesé-Abad, A
2017-12-01
To describe the most widely used tools in the assessment of medical competencies, analyse their prevalence of use, their advantages and disadvantages and propose an appropriate model for our context. We conducted a narrative review of articles from MEDLINE, following the PRISM protocol, and analysed a total of 62 articles. The assessment of competencies is heterogeneous, especially in the educational and professional settings. The specific and technical competencies acquired during university education are mainly assessed using the objective structured clinical assessment. In the professional setting, core competencies are assessed using the 360° technique. We need a rigorous empiric comparison of the efficiency of the tools according to the type of competency. We propose a competency management model for the «undergraduate/graduate/active professional» continuum, whose goal is to improve training and professional practice and thereby increase the quality of patient care. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.
A numerical identifiability test for state-space models--application to optimal experimental design.
Hidalgo, M E; Ayesa, E
2001-01-01
This paper describes a mathematical tool for identifiability analysis, easily applicable to high order non-linear systems modelled in state-space and implementable in simulators with a time-discrete approach. This procedure also permits a rigorous analysis of the expected estimation errors (average and maximum) in calibration experiments. The methodology is based on the recursive numerical evaluation of the information matrix during the simulation of a calibration experiment and on the setting-up of a group of information parameters based on geometric interpretations of this matrix. As an example of the utility of the proposed test, the paper presents its application to an optimal experimental design of ASM Model No. 1 calibration, in order to estimate the maximum specific growth rate µH and the concentration of heterotrophic biomass XBH.
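The paper's recursive information-matrix formulation is not reproduced. As a minimal illustration of the underlying object, the sketch below builds a Fisher information matrix from finite-difference output sensitivities of a toy two-parameter discrete-time model and reads off identifiability indicators (eigenvalues, condition number, Cramer-Rao bounds). The model, noise level, and parameter values are assumptions.

```python
import numpy as np

def simulate(theta, n_steps=50, dt=0.1, x0=1.0):
    """Toy discrete-time state model: x[k+1] = x[k] + dt*(-a*x[k] + b)."""
    a, b = theta
    x = np.empty(n_steps)
    x[0] = x0
    for k in range(n_steps - 1):
        x[k + 1] = x[k] + dt * (-a * x[k] + b)
    return x

def fisher_information(theta, sigma=0.05, eps=1e-6):
    """FIM for independent Gaussian measurement noise of std sigma."""
    y0 = simulate(theta)
    S = np.zeros((y0.size, len(theta)))        # output sensitivities dy/dtheta
    for j in range(len(theta)):
        tp = np.array(theta, dtype=float)
        tp[j] += eps
        S[:, j] = (simulate(tp) - y0) / eps
    return S.T @ S / sigma ** 2

theta_nominal = [0.8, 0.3]
fim = fisher_information(theta_nominal)
eigvals = np.linalg.eigvalsh(fim)
print("FIM eigenvalues:", np.round(eigvals, 2))
print(f"condition number ~ {eigvals.max() / eigvals.min():.1f}")

# Expected estimation error bounds follow from the inverse FIM (Cramer-Rao).
crlb = np.sqrt(np.diag(np.linalg.inv(fim)))
print("1-sigma lower bounds on parameter errors:", np.round(crlb, 4))
```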
NASA Astrophysics Data System (ADS)
Bijeljic, Branko; Icardi, Matteo; Prodanović, Maša
2018-05-01
Substantial progress has been made over the last few decades in understanding the physics of multiphase flow and reactive transport phenomena in subsurface porous media. The confluence of advances in experimental techniques (including micromodels, X-ray microtomography, and Nuclear Magnetic Resonance (NMR)) and in computational power has made it possible to observe static and dynamic multi-scale flow, transport and reactive processes, thus stimulating the development of a new generation of modelling tools from pore to field scale. One of the key challenges is to make experiment and models as complementary as possible, with continuously improving experimental methods in order to increase the predictive capabilities of theoretical models across scales. This creates the need to establish rigorous benchmark studies of flow, transport and reaction in porous media which can then serve as the basis for introducing more complex phenomena in future developments.
Improved mathematical and computational tools for modeling photon propagation in tissue
NASA Astrophysics Data System (ADS)
Calabro, Katherine Weaver
Light interacts with biological tissue through two predominant mechanisms: scattering and absorption, which are sensitive to the size and density of cellular organelles, and to biochemical composition (ex. hemoglobin), respectively. During the progression of disease, tissues undergo a predictable set of changes in cell morphology and vascularization, which directly affect their scattering and absorption properties. Hence, quantification of these optical property differences can be used to identify the physiological biomarkers of disease with interest often focused on cancer. Diffuse reflectance spectroscopy is a diagnostic tool, wherein broadband visible light is transmitted through a fiber optic probe into a turbid medium, and after propagating through the sample, a fraction of the light is collected at the surface as reflectance. The measured reflectance spectrum can be analyzed with appropriate mathematical models to extract the optical properties of the tissue, and from these, a set of physiological properties. A number of models have been developed for this purpose using a variety of approaches -- from diffusion theory, to computational simulations, and empirical observations. However, these models are generally limited to narrow ranges of tissue and probe geometries. In this thesis, reflectance models were developed for a much wider range of measurement parameters, and influences such as the scattering phase function and probe design were investigated rigorously for the first time. The results provide a comprehensive understanding of the factors that influence reflectance, with novel insights that, in some cases, challenge current assumptions in the field. An improved Monte Carlo simulation program, designed to run on a graphics processing unit (GPU), was built to simulate the data used in the development of the reflectance models. Rigorous error analysis was performed to identify how inaccuracies in modeling assumptions can be expected to affect the accuracy of extracted optical property values from experimentally-acquired reflectance spectra. From this analysis, probe geometries that offer the best robustness against error in estimation of physiological properties from tissue, are presented. Finally, several in vivo studies demonstrating the use of reflectance spectroscopy for both research and clinical applications are presented.
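The GPU Monte Carlo code developed in the thesis is not shown. The following is a deliberately minimal CPU sketch of the core idea, a photon random walk in a semi-infinite homogeneous medium with isotropic scattering, continuous absorption weighting, an index-matched boundary, and no Russian roulette, tallying total diffuse reflectance; the optical properties are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def diffuse_reflectance(mu_a, mu_s, n_photons=5000):
    """Estimate total diffuse reflectance of a semi-infinite medium.

    Simplifications (assumed, not from the thesis): isotropic scattering,
    index-matched boundary, continuous absorption weighting, and a plain
    weight cutoff instead of Russian roulette.
    """
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    reflected = 0.0
    for _ in range(n_photons):
        pos = np.zeros(3)
        direction = np.array([0.0, 0.0, 1.0])      # +z points into the tissue
        weight = 1.0
        while weight > 1e-3:
            step = -np.log(rng.uniform()) / mu_t    # sample free path length
            pos = pos + step * direction
            if pos[2] < 0.0:                        # crossed the surface: escapes
                reflected += weight
                break
            weight *= albedo                        # deposit (1 - albedo) of the weight
            cos_t = 2.0 * rng.uniform() - 1.0       # isotropic new direction
            phi = 2.0 * np.pi * rng.uniform()
            sin_t = np.sqrt(1.0 - cos_t ** 2)
            direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return reflected / n_photons

print(f"estimated diffuse reflectance ~ {diffuse_reflectance(mu_a=0.5, mu_s=5.0):.3f}")
```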
Matter Gravitates, but Does Gravity Matter?
ERIC Educational Resources Information Center
Groetsch, C. W.
2011-01-01
The interplay of physical intuition, computational evidence, and mathematical rigor in a simple trajectory model is explored. A thought experiment based on the model is used to elicit student conjectures on the influence of a physical parameter; a mathematical model suggests a computational investigation of the conjectures, and rigorous analysis…
Model Selection in the Analysis of Photoproduction Data
NASA Astrophysics Data System (ADS)
Landay, Justin
2017-01-01
Scattering experiments provide one of the most powerful and useful tools for probing matter to better understand its fundamental properties governed by the strong interaction. As the spectroscopy of the excited states of nucleons enters a new era of precision ushered in by improved experiments at Jefferson Lab and other facilities around the world, traditional partial-wave analysis methods must be adjusted accordingly. In this poster, we present a rigorous set of statistical tools and techniques that we implemented; most notably, the LASSO method, which serves to select the simplest model and thereby avoid overfitting. In establishing the spectrum of excited baryons, it avoids overpopulation of the spectrum and thus the occurrence of false positives. This is a prerequisite to reliably compare theories like lattice QCD or quark models to experiments. Here, we demonstrate the principle by simultaneously fitting three observables in neutral pion photoproduction, namely the differential cross section, beam asymmetry and target polarization, across thousands of data points. Other authors include Michael Doring, Bin Hu, and Raquel Molina.
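For readers unfamiliar with the penalized-fit idea, the following sketch shows how a cross-validated LASSO drives the coefficients of unneeded terms to zero in a purely synthetic linear expansion. It is not the authors' photoproduction fit; the design matrix, coefficients, and noise level are all invented for illustration.

```python
# Cross-validated LASSO on a synthetic linear expansion: terms with no support in the
# data are driven to zero, which is the over-fitting / false-positive safeguard the
# abstract describes. This is a stand-in, not the authors' photoproduction fit.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_points, n_terms = 500, 20                    # data points, candidate model terms
X = rng.normal(size=(n_points, n_terms))
true_coeffs = np.zeros(n_terms)
true_coeffs[:4] = [1.5, -2.0, 0.8, 1.1]        # only four terms are "real"
y = X @ true_coeffs + 0.1 * rng.normal(size=n_points)

fit = LassoCV(cv=5).fit(X, y)                  # penalty strength chosen by cross-validation
kept = np.flatnonzero(np.abs(fit.coef_) > 1e-3)
print("terms retained:", kept.tolist())
print("their coefficients:", np.round(fit.coef_[kept], 2))
```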
Methodology for balancing design and process tradeoffs for deep-subwavelength technologies
NASA Astrophysics Data System (ADS)
Graur, Ioana; Wagner, Tina; Ryan, Deborah; Chidambarrao, Dureseti; Kumaraswamy, Anand; Bickford, Jeanne; Styduhar, Mark; Wang, Lee
2011-04-01
For process development of deep-subwavelength technologies, it has become accepted practice to use model-based simulation to predict systematic and parametric failures. Increasingly, these techniques are being used by designers to ensure layout manufacturability, as an alternative to, or complement to, restrictive design rules. The benefit of model-based simulation tools in the design environment is that manufacturability problems are addressed in a design-aware way by making appropriate trade-offs, e.g., between overall chip density and manufacturing cost and yield. The paper shows how library elements and the full ASIC design flow benefit from eliminating hot spots and improving design robustness early in the design cycle. It demonstrates a path to yield optimization and first time right designs implemented in leading edge technologies. The approach described herein identifies those areas in the design that could benefit from being fixed early, leading to design updates and avoiding later design churn by careful selection of design sensitivities. This paper shows how to achieve this goal by using simulation tools incorporating various models from sparse to rigorously physical, pattern detection and pattern matching, checking and validating failure thresholds.
Academic Rigor in General Education, Introductory Astronomy Courses for Nonscience Majors
ERIC Educational Resources Information Center
Brogt, Erik; Draeger, John D.
2015-01-01
We discuss a model of academic rigor and apply it to a general education introductory astronomy course. We argue that even without one of the central tenets of professional astronomy, the use of mathematics, the course can still be considered academically rigorous when expectations, goals, assessments, and curriculum are properly aligned.
Geomorphic Unit Tool (GUT): Applications of Fluvial Mapping
NASA Astrophysics Data System (ADS)
Kramer, N.; Bangen, S. G.; Wheaton, J. M.; Bouwes, N.; Wall, E.; Saunders, C.; Bennett, S.; Fortney, S.
2017-12-01
Geomorphic units are the building blocks of rivers and represent distinct habitat patches for many fluvial organisms. We present the Geomorphic Unit Toolkit (GUT), a flexible GIS geomorphic unit mapping tool, to generate maps of fluvial landforms from topography. GUT applies attributes to landforms based on flow stage (Tier 1), topographic signatures (Tier 2), geomorphic characteristics (Tier 3) and patch characteristics (Tier 4) to derive attributed maps at the level of detail required by analysts. We hypothesize that if more rigorous and consistent geomorphic mapping is conducted, better correlations between physical habitat units and ecohydraulic model results will be obtained compared to past work. Using output from GUT for coarse bed tributary streams in the Columbia River Basin, we explore relationships between salmonid habitat and geomorphic spatial metrics. We also highlight case studies of how GUT can be used to showcase geomorphic impact from large wood restoration efforts. Provided high resolution topography exists, this tool can be used to quickly assess changes in fluvial geomorphology in watersheds impacted by human activities.
NASA Astrophysics Data System (ADS)
Changyong, Dou; Huadong, Guo; Chunming, Han; yuquan, Liu; Xijuan, Yue; Yinghui, Zhao
2014-03-01
Raw signal simulation is a useful tool for the system design, mission planning, processing algorithm testing, and inversion algorithm design of Synthetic Aperture Radar (SAR). Due to the wide and frequent variations of the aircraft's trajectory and attitude, and the low accuracy of the Position and Orientation System (POS) recording data, it is difficult to quantitatively study the sensitivity of the key parameters of an airborne Interferometric SAR (InSAR) system, i.e., the baseline length and inclination, the absolute phase, and the orientation of the antennas, resulting in challenges for its applications. Furthermore, the imprecise estimation of the installation offset between the Global Positioning System (GPS), the Inertial Measurement Unit (IMU) and the InSAR antennas compounds the issue. An airborne InSAR simulation based on a rigorous geometric model and real navigation data is proposed in this paper, providing a way to quantitatively study the key parameters and to evaluate their effect on airborne InSAR applications such as photogrammetric mapping, high-resolution Digital Elevation Model (DEM) generation, and surface deformation measurement by Differential InSAR technology. The simulation can also provide a reference for the optimal design of the InSAR system and the improvement of InSAR data processing technologies such as motion compensation, imaging, image co-registration, and application parameter retrieval.
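A minimal sketch of the geometric ingredient described above is given below: the interferometric phase of a point target as a function of platform altitude, baseline length and inclination, and look angle. The wavelength, the m = 2 interferometric factor, and the flat-terrain assumption are illustrative choices, not parameters of the simulator in the paper, which works from real navigation data and synthesizes full raw signals.

```python
# Interferometric phase of a point target from the acquisition geometry: platform
# altitude, baseline length/inclination, and look angle. Wavelength, the m = 2 factor,
# and flat terrain are assumptions; the paper's simulator uses real navigation data
# and synthesizes full raw signals.
import numpy as np

wavelength = 0.031                      # metres (X-band, assumed)
H = 3000.0                              # platform altitude above the reference plane (m)
B, alpha = 1.0, np.deg2rad(30.0)        # baseline length (m) and inclination
m = 2                                   # interferometric factor (assumed)

def insar_phase(look_angle_deg, target_height=0.0):
    theta = np.deg2rad(look_angle_deg)
    r1 = (H - target_height) / np.cos(theta)                            # slant range, antenna 1
    r2 = np.sqrt(r1**2 + B**2 - 2.0 * B * r1 * np.sin(theta - alpha))   # slant range, antenna 2
    return 2.0 * np.pi * m * (r1 - r2) / wavelength

# Height sensitivity: phase for a ground target versus one 10 m above it, same look angle.
print("phase(h = 0 m) :", round(insar_phase(45.0), 2), "rad")
print("phase(h = 10 m):", round(insar_phase(45.0, 10.0), 2), "rad")
```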
NASA Astrophysics Data System (ADS)
Santos, Jander P.; Sá Barreto, F. C.
2016-01-01
Spin correlation identities for the Blume-Emery-Griffiths model on the Kagomé lattice are derived and combined with rigorous correlation inequalities to obtain upper bounds on the critical temperature. From the spin correlation identities, the mean field approximation and the effective field approximation results for the magnetization, the critical frontiers and the tricritical points are obtained. The rigorous upper bounds on the critical temperature improve over those of effective-field type theories.
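For orientation, the sketch below iterates the textbook mean-field self-consistency equations for the spin-1 Blume-Emery-Griffiths Hamiltonian H = -J Σ S_iS_j - K Σ S_i²S_j² + D Σ S_i². This is only the plain mean-field route mentioned in the abstract, not the correlation-identity and correlation-inequality machinery that yields the rigorous bounds; the coordination number z = 4 and the parameter values are assumptions.

```python
# Textbook mean-field self-consistency for the spin-1 BEG model,
# H = -J sum S_i S_j - K sum S_i^2 S_j^2 + D sum S_i^2, with S in {-1, 0, +1}.
# m = <S> and q = <S^2> are iterated to a fixed point; z = 4 is the Kagome coordination
# number and the couplings are assumed values.
import numpy as np

def mean_field_mq(T, J=1.0, K=0.0, D=0.0, z=4, iters=500):
    beta = 1.0 / T
    m, q = 0.5, 0.8                                   # initial guesses
    for _ in range(iters):
        w = np.exp(-beta * (D - K * z * q))           # weight of the S = +/-1 states
        Z = 1.0 + 2.0 * w * np.cosh(beta * J * z * m)
        m = 2.0 * w * np.sinh(beta * J * z * m) / Z
        q = 2.0 * w * np.cosh(beta * J * z * m) / Z
    return m, q

for T in (1.0, 2.0, 3.0, 4.0):                        # mean-field Tc is 2zJ/3 ~ 2.67 here
    m, q = mean_field_mq(T)
    print(f"T = {T}: m = {m:.3f}, q = {q:.3f}")
```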
NASA Astrophysics Data System (ADS)
Kuznetsov, N. V.; Leonov, G. A.; Yuldashev, M. V.; Yuldashev, R. V.
2017-10-01
During recent years it has been shown that hidden oscillations, whose basin of attraction does not overlap with small neighborhoods of equilibria, may significantly complicate simulation of dynamical models, lead to unreliable results and wrong conclusions, and cause serious damage in drilling systems, aircraft control systems, electromechanical systems, and other applications. This article provides a survey of various phase-locked loop based circuits (used in satellite navigation systems, optical, and digital communication), where such difficulties take place in MATLAB and SPICE. The considered examples can be used for testing other phase-locked loop based circuits and simulation tools, and they motivate the development and application of rigorous analytical methods for the global analysis of phase-locked loop based circuits.
Spacecraft VHF Radio Propagation Analysis in Ocean Environments Including Atmospheric Effects
NASA Technical Reports Server (NTRS)
Hwu, Shian; Moreno, Gerardo; Desilva, Kanishka; Jih, Cindy
2010-01-01
The Communication Systems Simulation Laboratory (CSSL) at the National Aeronautics and Space Administration (NASA)/Johnson Space Center (JSC) is tasked to perform spacecraft and ground network communication system simulations. The CSSL has developed simulation tools that model spacecraft communication systems and the space/ground environment in which they operate. This paper analyzes a spacecraft's very high frequency (VHF) radio signal propagation and the impact on performance when landing in an ocean. Very little research work has been done on VHF radio systems in a maritime environment. Rigorous Radio Frequency (RF) modeling/simulation techniques were employed for various environmental effects. The simulation results illustrate the significance of the environmental effects on the VHF radio system performance.
QSAR modeling: where have you been? Where are you going to?
Cherkasov, Artem; Muratov, Eugene N; Fourches, Denis; Varnek, Alexandre; Baskin, Igor I; Cronin, Mark; Dearden, John; Gramatica, Paola; Martin, Yvonne C; Todeschini, Roberto; Consonni, Viviana; Kuz'min, Victor E; Cramer, Richard; Benigni, Romualdo; Yang, Chihae; Rathman, James; Terfloth, Lothar; Gasteiger, Johann; Richard, Ann; Tropsha, Alexander
2014-06-26
Quantitative structure-activity relationship modeling is one of the major computational tools employed in medicinal chemistry. However, throughout its entire history it has drawn both praise and criticism concerning its reliability, limitations, successes, and failures. In this paper, we discuss (i) the development and evolution of QSAR; (ii) the current trends, unsolved problems, and pressing challenges; and (iii) several novel and emerging applications of QSAR modeling. Throughout this discussion, we provide guidelines for QSAR development, validation, and application, which are summarized in best practices for building rigorously validated and externally predictive QSAR models. We hope that this Perspective will help communications between computational and experimental chemists toward collaborative development and use of QSAR models. We also believe that the guidelines presented here will help journal editors and reviewers apply more stringent scientific standards to manuscripts reporting new QSAR studies, as well as encourage the use of high quality, validated QSARs for regulatory decision making.
Necromechanics: Death-induced changes in the mechanical properties of human tissues.
Martins, Pedro A L S; Ferreira, Francisca; Natal Jorge, Renato; Parente, Marco; Santos, Agostinho
2015-05-01
After death, the development of rigor mortis, characterized by body stiffening, is one of the most evident changes that occur in the body. In this work, the development of rigor mortis was assessed using a skinfold caliper in human cadavers and in live people to measure the deformation in the biceps brachii muscle in response to the force applied by the device. Additionally, to simulate the measurements with the finite element method, a two-dimensional model of an arm section was used. As a result of the experimental procedure, a decrease in deformation with increasing postmortem time was observed, which corresponds to an increase in rigidity. As expected, the deformations for the live subjects were higher. The finite element analysis showed a correlation between the c1 parameter of the neo-Hookean model and postmortem time in the 4- to 8-h interval. This was accomplished by adjusting the c1 material parameter in order to simulate the measured experimental displacement. Despite being a preliminary study, the obtained results show that combining the proposed experimental procedure with a numerical technique can be very useful in the study of the postmortem mechanical modifications of human tissues. Moreover, the use of data from living subjects allows us to estimate the time of death, paving the way to establishing this process as an alternative to the existing techniques. This solution constitutes a portable, non-invasive method of estimating the postmortem interval with direct quantitative measurements using a skinfold caliper. The tools and methods described can be used to investigate the subject and to gain epidemiologic knowledge on the rigor mortis phenomenon. © IMechE 2015.
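The parameter-adjustment step can be illustrated with a much simpler surrogate than the finite element arm model: below, a hypothetical incompressible uniaxial neo-Hookean stress-stretch relation is fit to synthetic caliper-style data to recover c1. All values are invented for illustration and are not the paper's measurements.

```python
# Fit the neo-Hookean parameter c1 to synthetic force-deformation data using the
# incompressible uniaxial relation sigma = 2*c1*(lambda^2 - 1/lambda). The paper adjusts
# c1 inside a 2-D finite element model of an arm section; this surrogate and all numbers
# are purely illustrative.
import numpy as np
from scipy.optimize import curve_fit

def neo_hookean_stress(stretch, c1):
    return 2.0 * c1 * (stretch**2 - 1.0 / stretch)

rng = np.random.default_rng(2)
stretch = np.linspace(0.70, 0.95, 10)                 # compressive stretches under the caliper
c1_true = 8.0                                         # hypothetical stiffness (kPa)
stress = neo_hookean_stress(stretch, c1_true) + rng.normal(0.0, 0.3, stretch.size)

(c1_fit,), _ = curve_fit(neo_hookean_stress, stretch, stress, p0=[1.0])
print(f"fitted c1 = {c1_fit:.2f} kPa (true value {c1_true})")
```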
NASA Astrophysics Data System (ADS)
Ghanate, A. D.; Kothiwale, S.; Singh, S. P.; Bertrand, Dominique; Krishna, C. Murali
2011-02-01
Cancer is now recognized as one of the major causes of morbidity and mortality. Histopathological diagnosis, the gold standard, is shown to be subjective, time consuming, prone to interobserver disagreement, and often fails to predict prognosis. Optical spectroscopic methods are being contemplated as adjuncts or alternatives to conventional cancer diagnostics. The most important aspect of these approaches is their objectivity, and multivariate statistical tools play a major role in realizing it. However, rigorous evaluation of the robustness of spectral models is a prerequisite. The utility of Raman spectroscopy in the diagnosis of cancers has been well established. Until now, the specificity and applicability of spectral models have been evaluated for specific cancer types. In this study, we have evaluated the utility of spectroscopic models representing normal and malignant tissues of the breast, cervix, colon, larynx, and oral cavity in a broader perspective, using different multivariate tests. The limit test, which was used in our earlier study, gave high sensitivity but suffered from poor specificity. The performance of other methods, such as factorial discriminant analysis and partial least squares discriminant analysis, is on par with more complex nonlinear methods such as decision trees, but they provide very little information about the classification model. This comparative study thus demonstrates not just the efficacy of Raman spectroscopic models but also the applicability and limitations of different multivariate tools for discrimination under complex conditions such as the multicancer scenario.
Sequence-based heuristics for faster annotation of non-coding RNA families.
Weinberg, Zasha; Ruzzo, Walter L
2006-01-01
Non-coding RNAs (ncRNAs) are functional RNA molecules that do not code for proteins. Covariance Models (CMs) are a useful statistical tool to find new members of an ncRNA gene family in a large genome database, using both sequence and, importantly, RNA secondary structure information. Unfortunately, CM searches are extremely slow. Previously, we created rigorous filters, which provably sacrifice none of a CM's accuracy, while making searches significantly faster for virtually all ncRNA families. However, these rigorous filters make searches slower than heuristics could be. In this paper we introduce profile HMM-based heuristic filters. We show that their accuracy is usually superior to heuristics based on BLAST. Moreover, we compared our heuristics with those used in tRNAscan-SE, whose heuristics incorporate a significant amount of work specific to tRNAs, whereas our heuristics are generic to any ncRNA. Performance was roughly comparable, so we expect that our heuristics provide a high-quality solution that--unlike family-specific solutions--can scale to hundreds of ncRNA families. The source code is available under the GNU Public License at the supplementary web site.
Cheaib, Alissar; Badeau, Vincent; Boe, Julien; Chuine, Isabelle; Delire, Christine; Dufrêne, Eric; François, Christophe; Gritti, Emmanuel S; Legay, Myriam; Pagé, Christian; Thuiller, Wilfried; Viovy, Nicolas; Leadley, Paul
2012-06-01
Model-based projections of shifts in tree species range due to climate change are becoming an important decision support tool for forest management. However, poorly evaluated sources of uncertainty require more scrutiny before relying heavily on models for decision-making. We evaluated uncertainty arising from differences in model formulations of tree response to climate change based on a rigorous intercomparison of projections of tree distributions in France. We compared eight models ranging from niche-based to process-based models. On average, models project large range contractions of temperate tree species in lowlands due to climate change. There was substantial disagreement between models for temperate broadleaf deciduous tree species, but differences in the capacity of models to account for rising CO(2) impacts explained much of the disagreement. There was good quantitative agreement among models concerning the range contractions for Scots pine. For the dominant Mediterranean tree species, Holm oak, all models foresee substantial range expansion. © 2012 Blackwell Publishing Ltd/CNRS.
Developing a Student Conception of Academic Rigor
ERIC Educational Resources Information Center
Draeger, John; del Prado Hill, Pixita; Mahler, Ronnie
2015-01-01
In this article we describe models of academic rigor from the student point of view. Drawing on a campus-wide survey, focus groups, and interviews with students, we found that students explained academic rigor in terms of workload, grading standards, level of difficulty, level of interest, and perceived relevance to future goals. These findings…
NASA Astrophysics Data System (ADS)
Vintila, Iuliana; Gavrus, Adinel
2017-10-01
The present research paper proposes the validation of a rigorous computational model used as a numerical tool to identify the rheological behavior of complex W/O emulsions. Considering a three-dimensional description of a general viscoplastic flow, the thermo-mechanical equations used to identify the rheological laws of fluids or soft materials from global experimental measurements are detailed. Analyses are conducted for complex W/O emulsions, which generally exhibit Bingham behavior, using a shear stress-strain rate dependency based on a power law and an improved analytical model. Experimental results are investigated for the rheological behavior of crude and refined rapeseed/soybean oils and four corresponding types of W/O emulsions with different physical-chemical compositions. The rheological behavior model was correlated with the thermo-mechanical analysis of a plane-plane rheometer, oil content, chemical composition, particle size and emulsifier concentration. The parameters of the rheological laws describing the behavior of the industrial oils and the concentrated W/O emulsions were computed from estimated shear stresses using a non-linear regression technique and from experimental torques using the inverse analysis tool designed by A. Gavrus (1992-2000).
Gordon, M. J. C.
2015-01-01
Robin Milner's paper, ‘The use of machines to assist in rigorous proof’, introduces methods for automating mathematical reasoning that are a milestone in the development of computer-assisted theorem proving. His ideas, particularly his theory of tactics, revolutionized the architecture of proof assistants. His methodology for automating rigorous proof soundly, particularly his theory of type polymorphism in programming, led to major contributions to the theory and design of programming languages. His citation for the 1991 ACM A.M. Turing award, the most prestigious award in computer science, credits him with, among other achievements, ‘probably the first theoretically based yet practical tool for machine assisted proof construction’. This commentary was written to celebrate the 350th anniversary of the journal Philosophical Transactions of the Royal Society. PMID:25750147
Next Generation of Leaching Tests
A corresponding abstract has been cleared for this presentation. The four methods comprising the Leaching Environmental Assessment Framework are described along with the tools to support implementation of the more rigorous and accurate source terms that are developed using LEAF ...
Music-therapy analyzed through conceptual mapping
NASA Astrophysics Data System (ADS)
Martinez, Rodolfo; de la Fuente, Rebeca
2002-11-01
Conceptual maps have lately been employed as a learning tool, as a modern study technique, and as a new way to understand intelligence, allowing the development of a strong theoretical framework with which to test a research hypothesis. This paper presents a music-therapy analysis based on this tool to produce a conceptual mapping network, which ranges from magic through the rigor of the hard sciences.
The MIXED framework: A novel approach to evaluating mixed-methods rigor.
Eckhardt, Ann L; DeVon, Holli A
2017-10-01
Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.
SNSEDextend: SuperNova Spectral Energy Distributions extrapolation toolkit
NASA Astrophysics Data System (ADS)
Pierel, Justin D. R.; Rodney, Steven A.; Avelino, Arturo; Bianco, Federica; Foley, Ryan J.; Friedman, Andrew; Hicken, Malcolm; Hounsell, Rebekah; Jha, Saurabh W.; Kessler, Richard; Kirshner, Robert; Mandel, Kaisey; Narayan, Gautham; Filippenko, Alexei V.; Scolnic, Daniel; Strolger, Louis-Gregory
2018-05-01
SNSEDextend extrapolates core-collapse and Type Ia Spectral Energy Distributions (SEDs) into the UV and IR for use in simulations and photometric classifications. The user provides a library of existing SED templates (such as those in the authors' SN SED Repository) along with new photometric constraints in the UV and/or NIR wavelength ranges. The software then extends the existing template SEDs so their colors match the input data at all phases. SNSEDextend can also extend the SALT2 spectral time-series model for Type Ia SN for a "first-order" extrapolation of the SALT2 model components, suitable for use in survey simulations and photometric classification tools; as the code does not do a rigorous re-training of the SALT2 model, the results should not be relied on for precision applications such as light curve fitting for cosmology.
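The color-matching idea can be sketched with a toy example: append a simple tail to an optical template and choose its normalization so that a synthetic optical-to-NIR flux ratio matches an observed value. The band edges, the power-law tail, and the target color below are assumptions; SNSEDextend itself operates on phase-dependent SED templates and the SALT2 components.

```python
# Toy color-matching extrapolation: append a power-law NIR tail to an optical template
# and set its normalization so a synthetic optical-to-NIR flux ratio matches an observed
# color. Band edges, the tail shape, and the target color are assumptions; SNSEDextend
# works on full phase-dependent SED templates and the SALT2 components.
import numpy as np

wave_opt = np.linspace(4000.0, 9000.0, 200)                     # existing coverage (Angstrom)
flux_opt = np.exp(-0.5 * ((wave_opt - 6000.0) / 1500.0) ** 2)   # toy optical SED

wave_nir = np.linspace(9000.0, 17000.0, 200)                    # region to extrapolate
def tail(norm, slope=-3.0):
    return norm * (wave_nir / 9000.0) ** slope

def band_flux(wave, flux, lo, hi):
    sel = (wave >= lo) & (wave < hi)
    return np.trapz(flux[sel], wave[sel])

observed_color = 1.2                                            # target F(opt band)/F(NIR band)
opt_band = band_flux(wave_opt, flux_opt, 6000.0, 7000.0)
unit_nir = band_flux(wave_nir, tail(1.0), 10000.0, 12000.0)

norm = opt_band / (observed_color * unit_nir)                   # tail normalization matching the color
flux_full = np.concatenate([flux_opt, tail(norm)])
print("NIR tail normalization:", round(norm, 4))
```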
Jiang, Xiaoye; Yao, Yuan; Liu, Han; Guibas, Leonidas
2014-01-01
Modern data acquisition routinely produces massive amounts of network data. Though many methods and models have been proposed to analyze such data, research on network data is largely disconnected from the classical theory of statistical learning and signal processing. In this paper, we present a new framework for modeling network data, which connects two seemingly different areas: network data analysis and compressed sensing. From a nonparametric perspective, we model an observed network using a large dictionary. In particular, we consider the network clique detection problem and show connections between our formulation and a new algebraic tool, namely Radon basis pursuit in homogeneous spaces. Such a connection allows us to identify rigorous recovery conditions for clique detection problems. Though this paper is mainly conceptual, we also develop practical approximation algorithms for solving empirical problems and demonstrate their usefulness on real-world datasets. PMID:25620806
Waste in health information systems: a systematic review.
Awang Kalong, Nadia; Yusof, Maryati
2017-05-08
Purpose: The purpose of this paper is to discuss a systematic review on waste identification related to health information systems (HIS) in Lean transformation. Design/methodology/approach: A systematic review was conducted on 19 studies to evaluate Lean transformation and the tools used to remove waste related to HIS in clinical settings. Findings: Ten waste categories were identified, along with their relationships and applications of Lean tool types related to HIS. Different Lean tools were used at the early and final stages of Lean transformation; the tool selection depended on the waste characteristics. Nine studies reported a positive impact from Lean transformation in improving daily work processes. The selection of Lean tools should be made based on the timing, purpose and characteristics of the waste to be removed. Research limitations/implications: An overview of waste and its categories within HIS, analyzed from a socio-technical perspective, enabled the identification of root causes in a holistic and rigorous manner. Practical implications: Understanding waste types and their root causes, together with a review of Lean tools, could subsequently lead to the identification of mitigation approaches to prevent future error occurrence. Originality/value: Specific waste models for HIS settings are yet to be developed. Hence, the identification of the waste categories could guide future implementation of Lean transformations in HIS settings.
Investigation of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams
NASA Technical Reports Server (NTRS)
Davis, Brian A.
2005-01-01
Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical model. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. Excellent agreement is achieved between the predicted and measured results, thereby quantitatively validating the numerical tool.
A Rigorous Treatment of Energy Extraction from a Rotating Black Hole
NASA Astrophysics Data System (ADS)
Finster, F.; Kamran, N.; Smoller, J.; Yau, S.-T.
2009-05-01
The Cauchy problem is considered for the scalar wave equation in the Kerr geometry. We prove that by choosing a suitable wave packet as initial data, one can extract energy from the black hole, thereby putting superradiance, the wave analogue of the Penrose process, into a rigorous mathematical framework. We quantify the maximal energy gain. We also compute the infinitesimal change of mass and angular momentum of the black hole, in agreement with Christodoulou’s result for the Penrose process. The main mathematical tool is our previously derived integral representation of the wave propagator.
Contreras, Iván; Kiefer, Stephan; Vehi, Josep
2017-01-01
Diabetes self-management is a crucial element for all people with diabetes and those at risk for developing the disease. Diabetic patients should be empowered to increase their self-management skills in order to prevent or delay the complications of diabetes. This work presents the proposal and first development stages of a smartphone application focused on the empowerment of patients with diabetes. The concept of this interventional tool is based on the personalization of the user experience from an adaptive and dynamic perspective. The segmentation of the population and the dynamic treatment of user profiles across the different experience levels are the main challenges of the implementation. The self-management assistant and remote treatment for diabetes aims to develop a platform that integrates a series of innovative models and tools, rigorously tested and supported by the diabetes research literature, together with a proven engine to manage healthcare workflows.
Recent advances in applying decision science to managing national forests
Marcot, Bruce G.; Thompson, Matthew P.; Runge, Michael C.; Thompson, Frank R.; McNulty, Steven; Cleaves, David; Tomosy, Monica; Fisher, Larry A.; Andrew, Bliss
2012-01-01
Management of federal public forests to meet sustainability goals and multiple use regulations is an immense challenge. To succeed, we suggest use of formal decision science procedures and tools in the context of structured decision making (SDM). SDM entails four stages: problem structuring (framing the problem and defining objectives and evaluation criteria), problem analysis (defining alternatives, evaluating likely consequences, identifying key uncertainties, and analyzing tradeoffs), decision point (identifying the preferred alternative), and implementation and monitoring of the preferred alternative with adaptive management feedbacks. We list a wide array of models, techniques, and tools available for each stage, and provide three case studies of their selected use in National Forest land management and project plans. Successful use of SDM involves participation by decision-makers, analysts, scientists, and stakeholders. We suggest specific areas for training and instituting SDM to foster transparency, rigor, clarity, and inclusiveness in formal decision processes regarding management of national forests.
Multifunctional-layered materials for creating membrane-restricted nanodomains and nanoscale imaging
NASA Astrophysics Data System (ADS)
Srinivasan, P.
2016-01-01
An experimental platform that allows precise spatial positioning of biomolecules with exquisite control at nanometer length scales is a valuable tool to study the molecular mechanisms of membrane-bound signaling. Using micromachined thin-film gold (Au) in a layered architecture, it is possible to add both optical and biochemical functionalities in vitro. Toward this goal, I show here that docking of complementary-DNA-tethered giant phospholiposomes on an Au surface can create membrane-restricted nanodomains. These nanodomains are critical features for dissecting the molecular choreography of membrane signaling complexes. The excited surface plasmon resonance modes of Au allow label-free imaging at diffraction-limited resolution of stably docked DNA-tethered phospholiposomes and lipid-detergent bicelle structures. Such a multifunctional building block enables a rigorously controlled in vitro set-up to model membrane-anchored biological signaling, besides serving as an optical tool for nanoscale imaging.
Inferring subunit stoichiometry from single molecule photobleaching
2013-01-01
Single molecule photobleaching is a powerful tool for determining the stoichiometry of protein complexes. By attaching fluorophores to proteins of interest, the number of associated subunits in a complex can be deduced by imaging single molecules and counting fluorophore photobleaching steps. Because some bleaching steps might be unobserved, the ensemble of steps will be binomially distributed. In this work, it is shown that inferring the true composition of a complex from such data is nontrivial because binomially distributed observations present an ill-posed inference problem. That is, a unique and optimal estimate of the relevant parameters cannot be extracted from the observations. Because of this, a method has not been firmly established to quantify confidence when using this technique. This paper presents a general inference model for interpreting such data and provides methods for accurately estimating parameter confidence. The formalization and methods presented here provide a rigorous analytical basis for this pervasive experimental tool. PMID:23712552
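A minimal sketch of the underlying statistics: if each of n fluorophores is detected independently with probability p, the observed step counts are binomial, and the likelihood can only be profiled over n once p is pinned down by outside knowledge, which is the identifiability problem the paper formalizes. The values of n and p below are hypothetical.

```python
# Binomial model of photobleaching step counts: each of n fluorophores is detected with
# probability p. With p fixed by assumption, the likelihood can be profiled over n; the
# paper's point is that n and p are not jointly identifiable from the counts alone.
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(3)
n_true, p_detect = 4, 0.8                       # hypothetical complex size and detection prob.
counts = rng.binomial(n_true, p_detect, 200)    # observed bleaching steps per molecule

candidate_n = np.arange(1, 9)
log_lik = np.array([binom.logpmf(counts, n, p_detect).sum() for n in candidate_n])
best = candidate_n[np.argmax(log_lik)]
print("log-likelihood by n:", dict(zip(candidate_n.tolist(), np.round(log_lik, 1).tolist())))
print("MLE for subunit count (given the assumed p):", int(best))
```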
Rigorous mathematical modelling for a Fast Corrector Power Supply in TPS
NASA Astrophysics Data System (ADS)
Liu, K.-B.; Liu, C.-Y.; Chien, Y.-C.; Wang, B.-S.; Wong, Y. S.
2017-04-01
To enhance the stability of the beam orbit, a Fast Orbit Feedback System (FOFB) eliminating undesired disturbances was installed and tested in the 3rd generation synchrotron light source of the Taiwan Photon Source (TPS) of the National Synchrotron Radiation Research Center (NSRRC). The effectiveness of the FOFB greatly depends on the output performance of the Fast Corrector Power Supply (FCPS); therefore, the design and implementation of an accurate FCPS is essential. A rigorous mathematical model is very useful to shorten the design time and improve the design performance of a FCPS. A rigorous mathematical model, derived by the state-space averaging method, for a FCPS in the FOFB of TPS composed of a full-bridge topology is therefore proposed in this paper. The MATLAB/SIMULINK software is used to construct the proposed mathematical model and to conduct simulations of the FCPS. The effects of different ADC resolutions on the output accuracy of the FCPS are investigated through simulation. A FCPS prototype is realized to demonstrate the effectiveness of the proposed rigorous mathematical model for the FCPS. Simulation and experimental results show that the proposed mathematical model is helpful for selecting the appropriate components to meet the accuracy requirements of a FCPS.
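As a rough illustration of the state-space averaging step (carried out in MATLAB/SIMULINK in the paper), the sketch below integrates the averaged model of a full-bridge stage with an LC output filter under bipolar PWM. The bus voltage, filter values, resistive load, and open-loop duty cycle are illustrative assumptions, not TPS hardware parameters.

```python
# State-space averaged model of a full-bridge stage with an LC output filter under
# bipolar PWM, integrated open loop. Bus voltage, filter values, load, and duty cycle
# are illustrative assumptions, not TPS hardware parameters.
import numpy as np
from scipy.integrate import solve_ivp

Vdc, L, C, R = 48.0, 1e-3, 100e-6, 2.0       # bus voltage, filter L and C, resistive load
A = np.array([[0.0, -1.0 / L],
              [1.0 / C, -1.0 / (R * C)]])    # states x = [inductor current, capacitor voltage]
B = np.array([1.0 / L, 0.0])

def averaged_model(t, x, duty):
    v_bridge = (2.0 * duty - 1.0) * Vdc      # average bridge voltage under bipolar modulation
    return A @ x + B * v_bridge

duty = 0.6
sol = solve_ivp(averaged_model, (0.0, 0.02), [0.0, 0.0], args=(duty,), max_step=1e-5)
print(f"steady-state output ~ {sol.y[1, -1]:.2f} V (ideal {(2 * duty - 1) * Vdc:.2f} V)")
```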
The effect of temperature on the mechanical aspects of rigor mortis in a liquid paraffin model.
Ozawa, Masayoshi; Iwadate, Kimiharu; Matsumoto, Sari; Asakura, Kumiko; Ochiai, Eriko; Maebashi, Kyoko
2013-11-01
Rigor mortis is an important phenomenon to estimate the postmortem interval in forensic medicine. Rigor mortis is affected by temperature. We measured stiffness of rat muscles using a liquid paraffin model to monitor the mechanical aspects of rigor mortis at five temperatures (37, 25, 10, 5 and 0°C). At 37, 25 and 10°C, the progression of stiffness was slower in cooler conditions. At 5 and 0°C, the muscle stiffness increased immediately after the muscles were soaked in cooled liquid paraffin and then muscles gradually became rigid without going through a relaxed state. This phenomenon suggests that it is important to be careful when estimating the postmortem interval in cold seasons. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Mobile mental health: a challenging research agenda.
Olff, Miranda
2015-01-01
The field of mobile health ("m-Health") is evolving rapidly and there is an explosive growth of psychological tools on the market. Exciting high-tech developments may identify symptoms, help individuals manage their own mental health, encourage help seeking, and provide both preventive and therapeutic interventions. This development has the potential to be an efficient, cost-effective approach, reducing waiting lists and serving a considerable portion of people globally ("g-Health"). However, few of the mobile applications (apps) have been rigorously evaluated. There is little information on how valid screening and assessment tools are, which of the mobile intervention apps are effective, or how well mobile apps compare to face-to-face treatments. But how feasible is rigorous scientific evaluation with the rising demands from policy makers, business partners, and users for their quick release? In this paper, developments in m-Health tools targeting screening, assessment, prevention, and treatment are reviewed, with examples from the field of trauma and posttraumatic stress disorder. The academic challenges in developing and evaluating m-Health tools are addressed. Evidence-based guidance is needed on appropriate research designs that may overcome some of the public and ethical challenges (e.g., equity, availability) and the market-driven wish to have mobile apps in the "App Store" yesterday rather than tomorrow.
Review and Synthesize Completed Research Through Systematic Review.
Hopp, Lisa; Rittenmeyer, Leslie
2015-10-01
The evidence-based health care movement has generated new opportunity for scholars to generate synthesized sources of evidence. Systematic reviews are rigorous forms of synthesized evidence that scholars can conduct if they have requisite skills, time, and access to excellent library resources. Systematic reviews play an important role in synthesizing what is known and unknown about a particular health issue. Thus, they have a synergistic relationship with primary research. They can both inform clinical decisions when the evidence is adequate and identify gaps in knowledge to inform research priorities. Systematic reviews can be conducted of quantitative and qualitative evidence to answer many types of questions. They all share characteristics of rigor that arise from a priori protocol development, transparency, exhaustive searching, dual independent reviewers who critically appraise studies using standardized tools, rigor in synthesis, and peer review at multiple stages in the conduct and reporting of the systematic review. © The Author(s) 2015.
Maximum entropy models as a tool for building precise neural controls.
Savin, Cristina; Tkačik, Gašper
2017-10-01
Neural responses are highly structured, with population activity restricted to a small subset of the astronomical range of possible activity patterns. Characterizing these statistical regularities is important for understanding circuit computation, but challenging in practice. Here we review recent approaches based on the maximum entropy principle used for quantifying collective behavior in neural activity. We highlight recent models that capture population-level statistics of neural data, yielding insights into the organization of the neural code and its biological substrate. Furthermore, the MaxEnt framework provides a general recipe for constructing surrogate ensembles that preserve aspects of the data, but are otherwise maximally unstructured. This idea can be used to generate a hierarchy of controls against which rigorous statistical tests are possible. Copyright © 2017 Elsevier Ltd. All rights reserved.
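The surrogate-control idea can be illustrated with the simplest case: the maximum entropy distribution that preserves only single-neuron firing rates is the independent Bernoulli model, and samples from it provide a null ensemble against which an observed pairwise correlation can be tested. The sketch below uses synthetic data with a planted correlation; pairwise (Ising-like) MaxEnt models, which the review also covers, are not fit here.

```python
# The simplest MaxEnt control: preserving only single-neuron firing rates gives the
# independent Bernoulli model, sampled here as a surrogate ensemble. Synthetic data with
# a planted correlation between neurons 0 and 1; pairwise (Ising-like) models are not fit.
import numpy as np

rng = np.random.default_rng(4)
n_neurons, n_bins = 20, 5000
rates = rng.uniform(0.02, 0.2, n_neurons)

common = rng.random(n_bins) < 0.1                               # shared drive to two neurons
data = (rng.random((n_bins, n_neurons)) < rates).astype(int)
data[:, 0] |= common
data[:, 1] |= common

def pair_corr(x):
    return np.corrcoef(x[:, 0], x[:, 1])[0, 1]

observed = pair_corr(data)
emp_rates = data.mean(axis=0)                                   # constraints to preserve

# Rate-matched surrogates: maximally unstructured given the measured rates.
null = np.array([pair_corr((rng.random((n_bins, n_neurons)) < emp_rates).astype(int))
                 for _ in range(200)])
p_value = (np.sum(null >= observed) + 1) / (null.size + 1)
print(f"observed corr = {observed:.3f}, null 95th pct = {np.quantile(null, 0.95):.3f}, p ~ {p_value:.3f}")
```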
Biomedical text mining for research rigor and integrity: tasks, challenges, directions.
Kilicoglu, Halil
2017-06-13
An estimated quarter of a trillion US dollars is invested in the biomedical research enterprise annually. There is growing alarm that a significant portion of this investment is wasted because of problems in reproducibility of research findings and in the rigor and integrity of research conduct and reporting. Recent years have seen a flurry of activities focusing on standardization and guideline development to enhance the reproducibility and rigor of biomedical research. Research activity is primarily communicated via textual artifacts, ranging from grant applications to journal publications. These artifacts can be both the source and the manifestation of practices leading to research waste. For example, an article may describe a poorly designed experiment, or the authors may reach conclusions not supported by the evidence presented. In this article, we pose the question of whether biomedical text mining techniques can assist the stakeholders in the biomedical research enterprise in doing their part toward enhancing research integrity and rigor. In particular, we identify four key areas in which text mining techniques can make a significant contribution: plagiarism/fraud detection, ensuring adherence to reporting guidelines, managing information overload and accurate citation/enhanced bibliometrics. We review the existing methods and tools for specific tasks, if they exist, or discuss relevant research that can provide guidance for future work. With the exponential increase in biomedical research output and the ability of text mining approaches to perform automatic tasks at large scale, we propose that such approaches can support tools that promote responsible research practices, providing significant benefits for the biomedical research enterprise. Published by Oxford University Press 2017. This work is written by a US Government employee and is in the public domain in the US.
Spatial scaling and multi-model inference in landscape genetics: Martes americana in northern Idaho
Tzeidle N. Wasserman; Samuel A. Cushman; Michael K. Schwartz; David O. Wallin
2010-01-01
Individual-based analyses relating landscape structure to genetic distances across complex landscapes enable rigorous evaluation of multiple alternative hypotheses linking landscape structure to gene flow. We utilize two extensions to increase the rigor of the individual-based causal modeling approach to inferring relationships between landscape patterns and gene flow...
Miller, Brian W.; Morisette, Jeffrey T.
2014-01-01
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
Bayesian networks and information theory for audio-visual perception modeling.
Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis
2010-09-01
Thanks to their different senses, human observers acquire multiple sources of information from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms, but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method by using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
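The mutual-information screening step that guides model elicitation can be sketched as follows: estimate MI between pairs of discretized variables from a contingency table and treat near-zero values as evidence for (conditional) independence, i.e., no edge in the Bayesian network. The discretization and the synthetic audio-visual variables below are assumptions; the published pipeline also uses conditional MI and formal structure learning.

```python
# Plug-in mutual information (in bits) between two discrete variables, the screening
# quantity used to decide whether an edge is warranted. Conditional MI and the actual
# Bayesian-network learning are not shown; the variables below are synthetic.
import numpy as np

def mutual_information(x, y):
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

rng = np.random.default_rng(5)
audio = rng.integers(0, 3, 2000)                                           # discretized audio cue
visual = (audio + (rng.random(2000) < 0.2) * rng.integers(0, 3, 2000)) % 3 # noisy copy of it
unrelated = rng.integers(0, 3, 2000)

print("MI(audio, visual)    =", round(mutual_information(audio, visual), 3))
print("MI(audio, unrelated) =", round(mutual_information(audio, unrelated), 3))
```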
A Practical Approach to Governance and Optimization of Structured Data Elements.
Collins, Sarah A; Gesner, Emily; Morgan, Steven; Mar, Perry; Maviglia, Saverio; Colburn, Doreen; Tierney, Diana; Rocha, Roberto
2015-01-01
Definition and configuration of clinical content in an enterprise-wide electronic health record (EHR) implementation is highly complex. Sharing of data definitions across applications within an EHR implementation project may be constrained by practical limitations, including time, tools, and expertise. However, maintaining rigor in an approach to data governance is important for sustainability and consistency. With this understanding, we have defined a practical approach for governance of structured data elements to optimize data definitions given limited resources. This approach includes a 10-step process: 1) identification of clinical topics, 2) creation of draft reference models for clinical topics, 3) scoring of downstream data needs for clinical topics, 4) prioritization of clinical topics, 5) validation of reference models for clinical topics, 6) calculation of gap analyses of the EHR compared against the reference model, 7) communication of validated reference models across project members, 8) requested revisions to the EHR based on the gap analysis, 9) evaluation of usage of reference models across the project, and 10) monitoring for new evidence requiring revisions to the reference model.
A Novel Decision Support Tool to Develop Link Driving Schedules for Moves.
DOT National Transportation Integrated Search
2015-01-01
A system or user level strategy that aims to reduce emissions from transportation networks requires a rigorous assessment of emissions inventory for the system to justify its effectiveness. It is important to estimate the total emissions for a transp...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alves, Vinicius M.; Laboratory for Molecular Modeling, Division of Chemical Biology and Medicinal Chemistry, Eshelman School of Pharmacy, University of North Carolina, Chapel Hill, NC 27599; Muratov, Eugene
Repetitive exposure to a chemical agent can induce an immune reaction in inherently susceptible individuals that leads to skin sensitization. Although many chemicals have been reported as skin sensitizers, there have been very few rigorously validated QSAR models with defined applicability domains (AD) that were developed using a large group of chemically diverse compounds. In this study, we have aimed to compile, curate, and integrate the largest publicly available dataset related to chemically-induced skin sensitization, use these data to generate rigorously validated QSAR models for skin sensitization, and employ these models as a virtual screening tool for identifying putative sensitizers among environmental chemicals. We followed best practices for model building and validation implemented with our predictive QSAR workflow using the Random Forest modeling technique in combination with SiRMS and Dragon descriptors. The Correct Classification Rate (CCR) for QSAR models discriminating sensitizers from non-sensitizers was 71–88% when evaluated on several external validation sets, within a broad AD, with positive (for sensitizers) and negative (for non-sensitizers) predicted rates of 85% and 79%, respectively. When compared to the skin sensitization module included in the OECD QSAR Toolbox as well as to the skin sensitization model in publicly available VEGA software, our models showed a significantly higher prediction accuracy for the same sets of external compounds as evaluated by Positive Predicted Rate, Negative Predicted Rate, and CCR. These models were applied to identify putative chemical hazards in the Scorecard database of possible skin or sense organ toxicants as primary candidates for experimental validation. - Highlights: • The largest publicly available skin sensitization dataset was compiled. • Predictive QSAR models were developed for skin sensitization. • The developed models have higher prediction accuracy than the OECD QSAR Toolbox. • Putative chemical hazards in the Scorecard database were found using our models.
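A minimal sketch of the modeling recipe described above, with synthetic descriptors standing in for SiRMS/Dragon features and no curation or applicability-domain step: train a Random Forest classifier, hold out an external set, and report the Correct Classification Rate (balanced accuracy) together with sensitivity and specificity.

```python
# Random Forest classifier with an external hold-out set and the Correct Classification
# Rate (balanced accuracy) as the figure of merit. Descriptors are synthetic stand-ins
# for SiRMS/Dragon features; no curation or applicability-domain filtering is applied.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
X = rng.normal(size=(1000, 50))                                           # descriptor matrix
y = (X[:, :5].sum(axis=1) + rng.normal(0.0, 1.0, 1000) > 0).astype(int)   # 1 = sensitizer

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)
pred = model.predict(X_test)

tn, fp, fn, tp = confusion_matrix(y_test, pred).ravel()
print(f"CCR = {balanced_accuracy_score(y_test, pred):.2f}, "
      f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```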
Electronic Design Automation (EDA) Roadmap Taskforce Report, Design of Microprocessors
1999-04-01
Fragmented excerpt: "...through on time. Hence, the study is not a crystal-ball-gazing exercise, but a rigorous, schedulable plan of action to attain the goal." ... "Object Interfaces eliminate these problems: a tool that binds the interface ..." Recoverable section topics include Design and User Interface, Design Tool Communication, EDA System Extension Language, and EDA Standards-Based Software Development Environment.
Integrated simulations for fusion research in the 2030's time frame (white paper outline)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman, Alex; LoDestro, Lynda L.; Parker, Jeffrey B.
This white paper presents the rationale for developing a community-wide capability for whole-device modeling, and advocates for an effort with the expectation of persistence: a long-term programmatic commitment, and support for community efforts. Statement of 2030 goal (two suggestions): (a) Robust integrated simulation tools to aid real-time experimental discharges and reactor designs by employing a hierarchy in fidelity of physics models. (b) To produce by the early 2030s a capability for validated, predictive simulation via integration of a suite of physics models from moderate through high fidelity, to understand and plan full plasma discharges, aid in data interpretation, carry out discovery science, and optimize future machine designs. We can achieve this goal via a focused effort to extend current scientific capabilities and rigorously integrate simulations of disparate physics into a comprehensive set of workflows.
Methods Beyond Methods: A Model for Africana Graduate Methods Training.
Best, Latrica E; Byrd, W Carson
2014-06-01
A holistic graduate education can impart not just tools and knowledge, but critical positioning to fulfill many of the original missions of Africana Studies programs set forth in the 1960s and 1970s. Because Africana Studies is an interdisciplinary field with many approaches to examining the African Diaspora, the methodological training of graduate students can vary across graduate programs. Although qualitative methods courses are often required of graduate students in Africana Studies programs, and these programs offer such courses, graduate students in these programs are rarely if ever required to take quantitative methods courses, let alone have these courses offered in-house. These courses can offer Africana Studies graduate students new tools for their own research, but more importantly, improve their knowledge of quantitative research on diasporic communities. These tools and knowledge can assist with identifying flawed arguments about African-descended communities and their members. This article explores the importance of requiring and offering critical quantitative methods courses in graduate programs in Africana Studies, and discusses the methods requirements of one graduate program in the field as an example of the more rigorous training that other programs could offer graduate students.
Banna, Jinan C; Vera Becerra, Luz E; Kaiser, Lucia L; Townsend, Marilyn S
2010-01-01
Development of outcome measures relevant to health nutrition behaviors requires a rigorous process of testing and revision. Whereas researchers often report performance of quantitative data collection to assess questionnaire validity and reliability, qualitative testing procedures are often overlooked. This report outlines a procedure for assessing face validity of a Spanish-language dietary assessment tool. Reviewing the literature produced no rigorously validated Spanish-language food behavior assessment tools for the US Department of Agriculture's food assistance and education programs. In response to this need, this study evaluated the face validity of a Spanish-language food behavior checklist adapted from a 16-item English version of a food behavior checklist shown to be valid and reliable for limited-resource English speakers. The English version was translated using rigorous methods involving initial translation by one party and creation of five possible versions. Photos were modified based on client input and new photos were taken as necessary. A sample of low-income, Spanish-speaking women completed cognitive interviews (n=20). Spanish translation experts (n=7) fluent in both languages and familiar with both cultures made minor modifications but essentially approved client preferences. The resulting checklist generated a readability score of 93, indicating low reading difficulty. The Spanish-language checklist has adequate face validity in the target population and is ready for further validation using convergent measures. At the conclusion of testing, this instrument may be used to evaluate nutrition education interventions in California. These qualitative procedures provide a framework for designing evaluation tools for low-literate audiences participating in the US Department of Agriculture food assistance and education programs. Copyright 2010 American Dietetic Association. Published by Elsevier Inc. All rights reserved.
Scientific approaches to science policy.
Berg, Jeremy M
2013-11-01
The development of robust science policy depends on use of the best available data, rigorous analysis, and inclusion of a wide range of input. While director of the National Institute of General Medical Sciences (NIGMS), I took advantage of available data and emerging tools to analyze training time distribution by new NIGMS grantees, the distribution of the number of publications as a function of total annual National Institutes of Health support per investigator, and the predictive value of peer-review scores on subsequent scientific productivity. Rigorous data analysis should be used to develop new reforms and initiatives that will help build a more sustainable American biomedical research enterprise.
Near Identifiability of Dynamical Systems
NASA Technical Reports Server (NTRS)
Hadaegh, F. Y.; Bekey, G. A.
1987-01-01
Concepts regarding approximate mathematical models treated rigorously. Paper presents new results in analysis of structural identifiability, equivalence, and near equivalence between mathematical models and physical processes they represent. Helps establish rigorous mathematical basis for concepts related to structural identifiability and equivalence, revealing fundamental requirements, tacit assumptions, and sources of error. "Structural identifiability," as used by workers in this field, loosely translates as the ability to specify a unique mathematical model and set of model parameters that accurately predict the behavior of the corresponding physical system.
Improving the ideal and human observer consistency: a demonstration of principles
NASA Astrophysics Data System (ADS)
He, Xin
2017-03-01
Beyond being rigorous and realistic, the usefulness of ideal observer computational tools may also depend on whether they serve the empirical purpose for which they are created, e.g., to identify desirable imaging systems to be used by human observers. In SPIE 10136-35, I showed that the ideal and the human observers do not necessarily prefer the same system as the optimal or better one, owing to their different objectives in both hardware and software optimization. In this work, I attempt to identify a necessary but insufficient condition under which the human and the ideal observer may rank systems consistently. If corroborated, such a condition would allow a numerical test of ideal/human consistency without routine human observer studies. I reproduced data from Abbey et al. (JOSA, 2001) to verify the proposed condition (this is not a rigorous falsification study, owing to the lack of specificity in the proposed conjecture; a roadmap toward more falsifiable conditions is proposed). Via this work, I would like to emphasize the reality of practical decision making in addition to the realism in mathematical modeling. (Disclaimer: the views expressed in this work do not necessarily represent those of the FDA.)
USEPA’s Land‐Based Materials Management Exposure and Risk Assessment Tool System
It is recognized that some kinds of 'waste' materials can in fact be reused as input materials for making safe products that benefit society. RIMM (Risk-Informed Materials Management) provides an integrated data gathering and analysis capability to enable scientifically rigorous ...
Anticipatory Understanding of Adversary Intent: A Signature-Based Knowledge System
2009-06-01
concept of logical positivism has been applied more recently to all human knowledge and reflected in current data fusion research, information mining...this work has been successfully translated into useful analytical tools that can provide a rigorous and quantitative basis for predictive analysis
A technical guide to tDCS, and related non-invasive brain stimulation tools
Woods, AJ; Antal, A; Bikson, M; Boggio, PS; Brunoni, AR; Celnik, P; Cohen, LG; Fregni, F; Herrmann, CS; Kappenman, ES; Knotkova, H; Liebetanz, D; Miniussi, C; Miranda, PC; Paulus, W; Priori, A; Reato, D; Stagg, C; Wenderoth, N; Nitsche, MA
2015-01-01
Transcranial electrical stimulation (tES), including transcranial direct and alternating current stimulation (tDCS, tACS) are non-invasive brain stimulation techniques increasingly used for modulation of central nervous system excitability in humans. Here we address methodological issues required for tES application. This review covers technical aspects of tES, as well as applications like exploration of brain physiology, modelling approaches, tES in cognitive neurosciences, and interventional approaches. It aims to help the reader to appropriately design and conduct studies involving these brain stimulation techniques, understand limitations and avoid shortcomings, which might hamper the scientific rigor and potential applications in the clinical domain. PMID:26652115
Example of a Bayes network of relations among visual features
NASA Astrophysics Data System (ADS)
Agosta, John M.
1991-10-01
Bayes probability networks, also termed 'influence diagrams,' promise to be a versatile, rigorous, and expressive uncertainty reasoning tool. This paper presents an example of how a Bayes network can express constraints among visual hypotheses. An example is presented of a model composed of cylindric primitives, inferred from a line drawing of a plumbing fixture. Conflict between interpretations of candidate cylinders is expressed by two parameters, one for the presence and one for the absence of visual evidence of their intersection. It is shown how 'partial exclusion' relations are thus generated and how they determine the degree of competition among the set of hypotheses. Solving this network yields the assemblies of cylinders most likely to form an object.
NASA Astrophysics Data System (ADS)
Šprlák, M.; Han, S.-C.; Featherstone, W. E.
2017-12-01
Rigorous modelling of the spherical gravitational potential spectra from the volumetric density and geometry of an attracting body is discussed. Firstly, we derive mathematical formulas for the spatial analysis of spherical harmonic coefficients. Secondly, we present a numerically efficient algorithm for rigorous forward modelling. We consider the finite-amplitude topographic modelling methods as special cases, with additional postulates on the volumetric density and geometry. Thirdly, we implement our algorithm in the form of computer programs and test their correctness with respect to the finite-amplitude topography routines. For this purpose, synthetic and realistic numerical experiments, applied to the gravitational field and geometry of the Moon, are performed. We also investigate the optimal choice of input parameters for the finite-amplitude modelling methods. Fourth, we exploit the rigorous forward modelling for the determination of the spherical gravitational potential spectra inferred by lunar crustal models with uniform, laterally variable, radially variable, and spatially (3D) variable bulk density. Also, we analyse these four different crustal models in terms of their spectral characteristics and band-limited radial gravitation. We demonstrate applicability of the rigorous forward modelling using currently available computational resources up to degree and order 2519 of the spherical harmonic expansion, which corresponds to a resolution of 2.2 km on the surface of the Moon. Computer codes, a user manual and scripts developed for the purposes of this study are publicly available to potential users.
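For orientation, the quantities referred to above follow the standard spherical-harmonic representation of the exterior gravitational potential; written out in the usual textbook form (quoted for context, not reproduced from the paper), the band-limited potential and its radial derivative are

V(r,\theta,\lambda) = \frac{GM}{r} \sum_{n=0}^{N} \left(\frac{R}{r}\right)^{n} \sum_{m=0}^{n} \left(\bar{C}_{nm}\cos m\lambda + \bar{S}_{nm}\sin m\lambda\right) \bar{P}_{nm}(\cos\theta),

g_r(r,\theta,\lambda) = -\frac{\partial V}{\partial r} = \frac{GM}{r^{2}} \sum_{n=0}^{N} (n+1) \left(\frac{R}{r}\right)^{n} \sum_{m=0}^{n} \left(\bar{C}_{nm}\cos m\lambda + \bar{S}_{nm}\sin m\lambda\right) \bar{P}_{nm}(\cos\theta),

where R is the reference radius, \bar{C}_{nm}, \bar{S}_{nm} are the fully normalized Stokes coefficients produced by the forward modelling, and \bar{P}_{nm} are the fully normalized associated Legendre functions; truncating at N = 2519 corresponds to the 2.2 km lunar surface resolution quoted above.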
Hotspot Patterns: The Formal Definition and Automatic Detection of Architecture Smells
2015-01-15
serious question for a project manager or architect: how to determine which parts of the code base should be given higher priority for maintenance and...services framework; Hadoop is a tool for distributed processing of large data sets; HBase is the Hadoop database; Ivy is a dependency management tool...answer this question more rigorously, we conducted Pearson Correlation Analysis to test the dependency between the number of issues a file involves
Accurate estimation of influenza epidemics using Google search data via ARGO.
Yang, Shihao; Santillana, Mauricio; Kou, S C
2015-11-24
Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search-based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people's online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions.
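As a rough illustration of the modeling idea (not the authors' code; the variable names, lag length, and the use of scikit-learn's Lasso are assumptions), an ARGO-style estimator can be sketched as an L1-regularized regression on lagged incidence values plus contemporaneous search-term frequencies:

import numpy as np
from sklearn.linear_model import Lasso

def argo_style_fit(ili, searches, n_lags=52, alpha=0.1):
    """Fit an ARGO-style model: autoregressive lags of incidence plus
    contemporaneous search-term frequencies, with L1 regularization.
    ili: 1-D array of weekly incidence; searches: 2-D array (weeks x terms)."""
    X, y = [], []
    for t in range(n_lags, len(ili)):
        X.append(np.concatenate([ili[t - n_lags:t], searches[t]]))
        y.append(ili[t])
    model = Lasso(alpha=alpha, max_iter=10000)
    model.fit(np.array(X), np.array(y))
    return model

# Example with synthetic data: 200 weeks, 30 hypothetical search terms.
rng = np.random.default_rng(0)
ili = np.abs(np.sin(np.arange(200) * 2 * np.pi / 52)) + 0.1 * rng.standard_normal(200)
searches = ili[:, None] * rng.uniform(0.5, 1.5, (200, 30)) + 0.1 * rng.standard_normal((200, 30))
model = argo_style_fit(ili, searches)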
An empirical propellant response function for combustion stability predictions
NASA Technical Reports Server (NTRS)
Hessler, R. O.
1980-01-01
An empirical response function model was developed for ammonium perchlorate propellants to supplant T-burner testing at the preliminary design stage. The model was developed by fitting a limited T-burner data base, in terms of oxidizer size and concentration, to an analytical two parameter response function expression. Multiple peaks are predicted, but the primary effect is of a single peak for most formulations, with notable bulges for the various AP size fractions. The model was extended to velocity coupling with the assumption that dynamic response was controlled primarily by the solid phase described by the two parameter model. The magnitude of velocity coupling was then scaled using an erosive burning law. Routine use of the model for stability predictions on a number of propulsion units indicates that the model tends to overpredict propellant response. It is concluded that the model represents a generally conservative prediction tool, suited especially for the preliminary design stage when T-burner data may not be readily available. The model work included development of a rigorous summation technique for pseudopropellant properties and of a concept for modeling ordered packing of particulates.
ERIC Educational Resources Information Center
Celeste, Eric
2016-01-01
Communities of practice have become important tools for districts striving to improve teacher quality in a way that improves student outcomes, but scaling the benefits of these communities requires a more rigorous, intentional approach. That's why Learning Forward, with support from the Bill & Melinda Gates Foundation, created the Redesign PD…
Identifying Opportunities in Citizen Science for Academic Libraries
ERIC Educational Resources Information Center
Cohen, Cynthia M.; Cheney, Liz; Duong, Khue; Lea, Ben; Unno, Zoe Pettway
2015-01-01
Citizen science projects continue to grow in popularity, providing opportunities for nonexpert volunteers to contribute to and become personally invested in rigorous scientific research. Academic libraries, aiming to promote and provide tools and resources to master scientific and information literacy, can support these efforts. While few examples…
Simple Climate Model Evaluation Using Impulse Response Tests
NASA Astrophysics Data System (ADS)
Schwarber, A.; Hartin, C.; Smith, S. J.
2017-12-01
Simple climate models (SCMs) are central tools used to incorporate climate responses into human-Earth system modeling. SCMs are computationally inexpensive, making them an ideal tool for a variety of analyses, including consideration of uncertainty. Despite their wide use, many SCMs lack rigorous testing of their fundamental responses to perturbations. Here, following recommendations of a recent National Academy of Sciences report, we compare several SCMs (Hector-deoclim, MAGICC 5.3, MAGICC 6.0, and the IPCC AR5 impulse response function) to diagnose model behavior and understand the fundamental system responses within each model. We conduct stylized perturbations (emissions and forcing/concentration) of three different chemical species: CO2, CH4, and BC. We find that all four models respond similarly in terms of overall shape; however, there are important differences in the timing and magnitude of the responses. For example, the response to a BC pulse differs over the first 20 years after the pulse among the models, a finding that is due to differences in model structure. Such perturbation experiments are difficult to conduct in complex models due to internal model noise, making a direct comparison with simple models challenging. We can, however, compare the simplified model response from a 4xCO2 step experiment to the same stylized experiment carried out by CMIP5 models, thereby testing the ability of SCMs to emulate complex model results. This work allows an assessment of how well current understanding of Earth system responses is incorporated into multi-model frameworks by way of simple climate models.
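A stylized pulse experiment of the kind described can be reproduced in a few lines of code. The sketch below applies a sum-of-exponentials concentration response to a CO2 emission pulse and convolves the resulting forcing with a two-box thermal response; the numerical coefficients are illustrative values approximating the AR5 impulse response function, not taken from the paper.

import numpy as np

# Illustrative coefficients approximating the AR5 impulse response function.
A = [0.217, 0.224, 0.282, 0.276]           # CO2 airborne fractions (first term is permanent)
TAU = [np.inf, 394.4, 36.5, 4.3]           # e-folding times in years
C = [0.631, 0.429]                         # thermal response amplitudes, K per (W m^-2)
D = [8.4, 409.5]                           # thermal response times, years

def co2_irf(t):
    """Fraction of a CO2 emission pulse remaining airborne after t years."""
    return sum(a * (1.0 if np.isinf(tau) else np.exp(-t / tau)) for a, tau in zip(A, TAU))

def temperature_response(forcing, dt=1.0):
    """Convolve a radiative-forcing time series (W m^-2) with a two-box thermal IRF."""
    t = np.arange(len(forcing)) * dt
    irf = sum((c / d) * np.exp(-t / d) for c, d in zip(C, D))   # K per (W m^-2) per year
    return np.convolve(forcing, irf)[: len(forcing)] * dt

years = np.arange(0, 200)
pulse_gtc = 100.0                                      # 100 GtC emitted in year 0
conc_ppm = 278.0 + pulse_gtc * co2_irf(years) / 2.12   # roughly 2.12 GtC per ppm
forcing = 5.35 * np.log(conc_ppm / 278.0)              # simple CO2 forcing expression
print(temperature_response(forcing)[[10, 50, 100]])    # warming 10, 50, 100 years after the pulse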
NASA Astrophysics Data System (ADS)
Podgorney, Robert; Coleman, Justin; Wilkins, Andrew; Huang, Hai; Veeraraghavan, Swetha; Xia, Yidong; Permann, Cody
2017-04-01
Numerical modeling has played an important role in understanding the behavior of coupled subsurface thermal-hydro-mechanical (THM) processes associated with a number of energy and environmental applications since as early as the 1970s. While the ability to rigorously describe all key tightly coupled controlling physics still remains a challenge, there have been significant advances in recent decades. These advances are related primarily to the exponential growth of computational power, the development of more accurate equations of state, improvements in the ability to represent heterogeneity and reservoir geometry, and more robust nonlinear solution schemes. The work described in this paper documents the development and linkage of several fully-coupled and fully-implicit modeling tools. These tools simulate: (1) the dynamics of fluid flow, heat transport, and quasi-static rock mechanics; (2) seismic wave propagation from the sources of energy release through heterogeneous material; and (3) the soil-structural damage resulting from ground acceleration. These tools are developed in Idaho National Laboratory's parallel Multiphysics Object Oriented Simulation Environment, and are integrated together using a global implicit approach. The governing equations are presented, the numerical approach for simultaneously solving and coupling the three physics tools is discussed, and the data input and output methodology is outlined. An example is presented to demonstrate the capabilities of the coupled multiphysics approach. The example involves simulating a system conceptually similar to the geothermal development in Basel, Switzerland, and the resultant induced seismicity, ground motion and structural damage are predicted.
Treetrimmer: a method for phylogenetic dataset size reduction.
Maruyama, Shinichiro; Eveleigh, Robert J M; Archibald, John M
2013-04-12
With rapid advances in genome sequencing and bioinformatics, it is now possible to generate phylogenetic trees containing thousands of operational taxonomic units (OTUs) from a wide range of organisms. However, use of rigorous tree-building methods on such large datasets is prohibitive and manual 'pruning' of sequence alignments is time consuming and raises concerns over reproducibility. There is a need for bioinformatic tools with which to objectively carry out such pruning procedures. Here we present 'TreeTrimmer', a bioinformatics procedure that removes unnecessary redundancy in large phylogenetic datasets, alleviating the size effect on more rigorous downstream analyses. The method identifies and removes user-defined 'redundant' sequences, e.g., orthologous sequences from closely related organisms and 'recently' evolved lineage-specific paralogs. Representative OTUs are retained for more rigorous re-analysis. TreeTrimmer reduces the OTU density of phylogenetic trees without sacrificing taxonomic diversity while retaining the original tree topology, thereby speeding up downstream computer-intensive analyses, e.g., Bayesian and maximum likelihood tree reconstructions, in a reproducible fashion.
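The dereplication logic can be illustrated without any phylogenetics library. In the sketch below (an assumption-laden toy, not the published TreeTrimmer code), a tree is a nested tuple of leaf names, each leaf is mapped to a user-defined group, and any clade whose leaves all share one group is collapsed to a single representative OTU.

def collapse(tree, group_of):
    """Return (pruned_tree, groups_under_node).
    tree: a leaf name (str) or a tuple of subtrees.
    group_of: dict mapping leaf name -> user-defined group (e.g. genus)."""
    if isinstance(tree, str):                      # leaf
        return tree, {group_of[tree]}
    pruned, groups = [], set()
    for child in tree:
        sub, sub_groups = collapse(child, group_of)
        pruned.append(sub)
        groups |= sub_groups
    if len(groups) == 1:                           # clade is 'redundant' for one group:
        return first_leaf(pruned[0]), groups       # keep a single representative OTU
    return tuple(pruned), groups

def first_leaf(tree):
    return tree if isinstance(tree, str) else first_leaf(tree[0])

# Toy example: three closely related 'Chlorella' OTUs collapse to one.
tree = ((("Chlorella_A", "Chlorella_B"), "Chlorella_C"), ("Ostreococcus_A", "Volvox_A"))
group_of = {"Chlorella_A": "Chlorella", "Chlorella_B": "Chlorella", "Chlorella_C": "Chlorella",
            "Ostreococcus_A": "Ostreococcus", "Volvox_A": "Volvox"}
print(collapse(tree, group_of)[0])   # -> ('Chlorella_A', ('Ostreococcus_A', 'Volvox_A'))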
Modeling driver behavior in a cognitive architecture.
Salvucci, Dario D
2006-01-01
This paper explores the development of a rigorous computational model of driver behavior in a cognitive architecture--a computational framework with underlying psychological theories that incorporate basic properties and limitations of the human system. Computational modeling has emerged as a powerful tool for studying the complex task of driving, allowing researchers to simulate driver behavior and explore the parameters and constraints of this behavior. An integrated driver model developed in the ACT-R (Adaptive Control of Thought-Rational) cognitive architecture is described that focuses on the component processes of control, monitoring, and decision making in a multilane highway environment. This model accounts for the steering profiles, lateral position profiles, and gaze distributions of human drivers during lane keeping, curve negotiation, and lane changing. The model demonstrates how cognitive architectures facilitate understanding of driver behavior in the context of general human abilities and constraints and how the driving domain benefits cognitive architectures by pushing model development toward more complex, realistic tasks. The model can also serve as a core computational engine for practical applications that predict and recognize driver behavior and distraction.
Bach, Peter M; McCarthy, David T; Urich, Christian; Sitzenfrei, Robert; Kleidorfer, Manfred; Rauch, Wolfgang; Deletic, Ana
2013-01-01
With global change bringing about greater challenges for the resilient planning and management of urban water infrastructure, research has been invested in the development of a strategic planning tool, DAnCE4Water. The tool models how urban and societal changes impact the development of centralised and decentralised (distributed) water infrastructure. An algorithm for rigorous assessment of suitable decentralised stormwater management options in the model is presented and tested on a local Melbourne catchment. Following detailed spatial representation algorithms (defined by planning rules), the model assesses numerous stormwater options to meet water quality targets at a variety of spatial scales. A multi-criteria assessment algorithm is used to find top-ranking solutions (which meet a specific treatment performance for a user-defined percentage of catchment imperviousness). A toolbox of five stormwater technologies (infiltration systems, surface wetlands, bioretention systems, ponds and swales) is featured. Parameters that set the algorithm's flexibility to develop possible management options are assessed. Results are expressed in terms of 'utilisation', which characterises the frequency of use of different technologies across the top-ranking options (bioretention being the most versatile). Initial results highlight the importance of selecting a suitable spatial resolution and providing the model with enough flexibility to generate different technology combinations. The generic nature of the model enables its application to other urban areas (e.g. different catchments, local municipal regions or entire cities).
McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W
2015-03-27
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
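A minimal sketch of the workflow (emulate the model over its parameter space, sample the posterior, propagate samples to a prediction) is shown below. It uses scikit-learn's Gaussian process regressor and a hand-rolled Metropolis sampler on a toy one-parameter "mass model"; everything here, including the toy data and the observation, is assumed for illustration and is unrelated to the actual Skyrme functional calculations.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

# Toy "model runs": observable y as a function of one theory parameter theta.
theta_train = np.linspace(-2, 2, 15)[:, None]
y_train = np.sin(theta_train).ravel() + 0.3 * theta_train.ravel() ** 2

emulator = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
emulator.fit(theta_train, y_train)

y_obs, sigma_obs = 0.8, 0.1      # a hypothetical new measurement and its uncertainty

def log_post(theta):
    if abs(theta) > 2.0:         # flat prior on [-2, 2]
        return -np.inf
    pred = emulator.predict(np.array([[theta]]))[0]
    return -0.5 * ((pred - y_obs) / sigma_obs) ** 2

# Simple Metropolis sampler over the parameter.
samples, theta = [], 0.0
for _ in range(5000):
    prop = theta + 0.2 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
samples = np.array(samples[1000:])

# Propagate the posterior to a derived prediction through the emulator.
pred_samples = emulator.predict(samples[:, None])
print(pred_samples.mean(), pred_samples.std())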
Gas solubility in dilute solutions: A novel molecular thermodynamic perspective
NASA Astrophysics Data System (ADS)
Chialvo, Ariel A.
2018-05-01
We present an explicit molecular-based interpretation of the thermodynamic phase equilibrium underlying gas solubility in liquids, through rigorous links between the microstructure of the dilute systems and the relevant macroscopic quantities that characterize their solution thermodynamics. We apply the formal analysis to unravel and highlight the molecular-level nature of the approximations behind the widely used Krichevsky-Kasarnovsky [J. Am. Chem. Soc. 57, 2168 (1935)] and Krichevsky-Ilinskaya [Acta Physicochim. 20, 327 (1945)] equations for the modeling of gas solubility. Then, we implement a general molecular-based approach to gas solubility and illustrate it by studying Lennard-Jones binary systems whose microstructure and thermodynamic properties were consistently generated via integral equation calculations. Furthermore, guided by the molecular-based analysis, we propose a novel macroscopic modeling approach to gas solubility, emphasize some usually overlooked modeling subtleties, and identify novel interdependences among relevant solubility quantities that can be used as either handy modeling constraints or tools for consistency tests.
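For context, the two classical relations named above are usually quoted in the following textbook forms (standard notation assumed here; the paper's molecular-based reformulation is not reproduced):

\ln\frac{f_2}{x_2} = \ln H_{2,1}(P_1^{\mathrm{sat}}) + \frac{\bar{v}_2^{\infty}\left(P - P_1^{\mathrm{sat}}\right)}{RT} \quad \text{(Krichevsky--Kasarnovsky)}

\ln\frac{f_2}{x_2} = \ln H_{2,1}(P_1^{\mathrm{sat}}) + \frac{A}{RT}\left(x_1^{2} - 1\right) + \frac{\bar{v}_2^{\infty}\left(P - P_1^{\mathrm{sat}}\right)}{RT} \quad \text{(Krichevsky--Ilinskaya)}

with f_2 and x_2 the solute fugacity and mole fraction, H_{2,1} the Henry's law constant at the solvent saturation pressure P_1^{sat}, \bar{v}_2^{\infty} the solute partial molar volume at infinite dilution, and A a Margules-type solute-solvent interaction parameter.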
Gas solubility in dilute solutions: A novel molecular thermodynamic perspective.
Chialvo, Ariel A
2018-05-07
We present an explicit molecular-based interpretation of the thermodynamic phase equilibrium underlying gas solubility in liquids, through rigorous links between the microstructure of the dilute systems and the relevant macroscopic quantities that characterize their solution thermodynamics. We apply the formal analysis to unravel and highlight the molecular-level nature of the approximations behind the widely used Krichevsky-Kasarnovsky [J. Am. Chem. Soc. 57, 2168 (1935)] and Krichevsky-Ilinskaya [Acta Physicochim. 20, 327 (1945)] equations for the modeling of gas solubility. Then, we implement a general molecular-based approach to gas solubility and illustrate it by studying Lennard-Jones binary systems whose microstructure and thermodynamic properties were consistently generated via integral equation calculations. Furthermore, guided by the molecular-based analysis, we propose a novel macroscopic modeling approach to gas solubility, emphasize some usually overlooked modeling subtleties, and identify novel interdependences among relevant solubility quantities that can be used as either handy modeling constraints or tools for consistency tests.
Rotation and anisotropy of galaxies revisited
NASA Astrophysics Data System (ADS)
Binney, James
2005-11-01
The use of the tensor virial theorem (TVT) as a diagnostic of anisotropic velocity distributions in galaxies is revisited. The TVT provides a rigorous global link between velocity anisotropy, rotation and shape, but the quantities appearing in it are not easily estimated observationally. Traditionally, use has been made of a centrally averaged velocity dispersion and the peak rotation velocity. Although this procedure cannot be rigorously justified, tests on model galaxies show that it works surprisingly well. With the advent of integral-field spectroscopy it is now possible to establish a rigorous connection between the TVT and observations. The TVT is reformulated in terms of sky-averages, and the new formulation is tested on model galaxies.
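As background to the traditional diagnostic mentioned above, the tensor virial theorem applied to an edge-on, oblate, isotropic rotator yields the familiar approximate locus (a standard result quoted for context, not the sky-averaged reformulation derived in the paper)

\left(\frac{v}{\sigma}\right)_{\mathrm{iso}} \simeq \sqrt{\frac{\varepsilon}{1-\varepsilon}},

so galaxies falling well below this curve in the (v/\sigma, \varepsilon) plane are conventionally interpreted as having anisotropic velocity distributions.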
75 FR 71131 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-22
... impacts. To complete this task with scientific rigor, it will be necessary to collect high quality survey... instruments, methodologies, procedures, and analytical techniques for this task. Moreover, they have been pilot tested in 11 States. The tools and techniques were submitted for review, and were approved, by...
Complicating Methodological Transparency
ERIC Educational Resources Information Center
Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.
2016-01-01
A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…
The Temporal Organization of Syllabic Structure
ERIC Educational Resources Information Center
Shaw, Jason A.
2010-01-01
This dissertation develops analytical tools which enable rigorous evaluation of competing syllabic parses on the basis of temporal patterns in speech production data. The data come from the articulographic tracking of fleshpoints on target speech organs, e.g., tongue, lips, jaw, in experiments with native speakers of American English and Moroccan…
The US Environmental Protection Agency (EPA) is revising its strategy to obtain the information needed to answer questions pertinent to water-quality management efficiently and rigorously at national scales. One tool of this revised strategy is use of statistically based surveys ...
The Achiever. Volume 6, Number 5
ERIC Educational Resources Information Center
Ashby, Nicole, Ed.
2007-01-01
"The Achiever" is a monthly newsletter designed expressly for parents and community leaders. Each issue contains news and information about and from public and private organizations about school improvement in the United States. Highlights of this issue include: (1) New Online Tool Simplifies Financial Aid Process; (2) Rigor in K-6:…
Language Supports for Journal Abstract Writing across Disciplines
ERIC Educational Resources Information Center
Liou, H.-C.; Yang, P.-C.; Chang, J. S.
2012-01-01
Various writing assistance tools have been developed through efforts in the areas of natural language processing with different degrees of success of curriculum integration depending on their functional rigor and pedagogical designs. In this paper, we developed a system, WriteAhead, that provides six types of suggestions when non-native graduate…
USDA-ARS?s Scientific Manuscript database
To ensure current land use strategies and management practices are economically, environmentally, and socially sustainable, tools and techniques for assessing and quantifying changes in soil quality/health (SQ) need to be developed through rigorous research and potential use by consultants, and othe...
D-peaks: a visual tool to display ChIP-seq peaks along the genome.
Brohée, Sylvain; Bontempi, Gianluca
2012-01-01
ChIP-sequencing is a method of choice for localizing the positions of protein binding sites on DNA on a whole-genome scale. Deciphering the sequencing data produced by this technique is challenging and requires rigorous interpretation with dedicated tools and adapted visualization programs. Here, we present a bioinformatics tool (D-peaks) that adds several capabilities (including user-friendliness, high-quality display, and peak positions relative to genomic features) to existing visualization browsers and databases. D-peaks is directly available through its web interface http://rsat.ulb.ac.be/dpeaks/ as well as a command line tool.
Rigor of cell fate decision by variable p53 pulses and roles of cooperative gene expression by p53
Murakami, Yohei; Takada, Shoji
2012-01-01
Upon DNA damage, the cell fate decision between survival and apoptosis is largely regulated by p53-related networks. Recent experiments found a series of discrete p53 pulses in individual cells, which led to the hypothesis that the cell fate decision upon DNA damage is controlled by counting the number of p53 pulses. Under this hypothesis, Sun et al. (2009) modeled the Bax activation switch in the apoptosis signal transduction pathway that can rigorously "count" the number of uniform p53 pulses. Based on experimental evidence, here we use variable p53 pulses with Sun et al.'s model to investigate how the variability in p53 pulses affects the rigor of the cell fate decision by the pulse number. Our calculations showed that the experimentally anticipated variability in the pulse sizes reduces the rigor of the cell fate decision. In addition, we tested the roles of the cooperativity in PUMA expression by p53, finding that lower cooperativity is plausible for a more rigorous cell fate decision. This is because the variability in p53 pulse height is amplified more strongly in PUMA expression when cooperativity is higher. PMID:27857606
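The central claim, that pulse-to-pulse variability blurs a decision made by counting pulses, can be illustrated with a toy threshold model (purely illustrative; this is not Sun et al.'s Bax switch model): each pulse delivers a pro-apoptotic increment with some coefficient of variation, and the cell "decides" once the accumulated signal crosses a threshold.

import numpy as np

def apoptosis_fraction(n_pulses, cv, threshold=5.0, trials=20000, rng=None):
    """Fraction of cells whose accumulated signal after n_pulses crosses the threshold.
    Each pulse contributes a positive amount with mean 1 and coefficient of variation cv."""
    rng = rng or np.random.default_rng(0)
    pulses = rng.gamma(shape=1.0 / cv**2, scale=cv**2, size=(trials, n_pulses))
    return np.mean(pulses.sum(axis=1) >= threshold)

for cv in (0.1, 0.3, 0.6):   # increasing pulse variability
    profile = [apoptosis_fraction(n, cv) for n in range(1, 11)]
    print(cv, np.round(profile, 2))   # sharp step at 5 pulses for low cv, smeared for high cv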
Predictive 5-Year Survivorship Model of Cystic Fibrosis
Liou, Theodore G.; Adler, Frederick R.; FitzSimmons, Stacey C.; Cahill, Barbara C.; Hibbs, Jonathan R.; Marshall, Bruce C.
2007-01-01
The objective of this study was to create a 5-year survivorship model to identify key clinical features of cystic fibrosis. Such a model could help researchers and clinicians to evaluate therapies, improve the design of prospective studies, monitor practice patterns, counsel individual patients, and determine the best candidates for lung transplantation. The authors used information from the Cystic Fibrosis Foundation Patient Registry (CFFPR), which has collected longitudinal data on approximately 90% of cystic fibrosis patients diagnosed in the United States since 1986. They developed multivariate logistic regression models by using data on 5,820 patients randomly selected from 11,630 in the CFFPR in 1993. Models were tested for goodness of fit and were validated for the remaining 5,810 patients for 1993. The validated 5-year survivorship model included age, forced expiratory volume in 1 second as a percentage of predicted normal, gender, weight-for-age z score, pancreatic sufficiency, diabetes mellitus, Staphylococcus aureus infection, Burkholderia cepacia infection, and annual number of acute pulmonary exacerbations. The model provides insights into the complex nature of cystic fibrosis and supplies a rigorous tool for clinical practice and research. PMID:11207152
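A model of this type is straightforward to reproduce on one's own registry extract. The sketch below uses synthetic data and illustrative feature names matching those listed in the abstract (it is not the CFFPR data or the authors' fitted coefficients); it fits a 5-year survival logistic regression and reports discrimination on a held-out validation split.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000
df = pd.DataFrame({
    "age": rng.uniform(6, 45, n),
    "fev1_pct_pred": rng.normal(70, 25, n).clip(15, 130),
    "female": rng.integers(0, 2, n),
    "weight_z": rng.normal(-0.5, 1.0, n),
    "pancreatic_sufficient": rng.integers(0, 2, n),
    "diabetes": rng.integers(0, 2, n),
    "s_aureus": rng.integers(0, 2, n),
    "b_cepacia": rng.integers(0, 2, n),
    "exacerbations_per_year": rng.poisson(1.5, n),
})
# Synthetic outcome: survival probability rises with lung function, falls with exacerbations.
logit = -2.0 + 0.05 * df["fev1_pct_pred"] - 0.4 * df["exacerbations_per_year"] - 0.8 * df["b_cepacia"]
df["alive_5yr"] = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="alive_5yr"), df["alive_5yr"], test_size=0.5, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("validation AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))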
Sharing Research Models: Using Software Engineering Practices for Facilitation
Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.
2011-01-01
Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices— the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780
Double Dutch: A Tool for Designing Combinatorial Libraries of Biological Systems.
Roehner, Nicholas; Young, Eric M; Voigt, Christopher A; Gordon, D Benjamin; Densmore, Douglas
2016-06-17
Recently, semirational approaches that rely on combinatorial assembly of characterized DNA components have been used to engineer biosynthetic pathways. In practice, however, it is not practical to assemble and test millions of pathway variants in order to elucidate how different DNA components affect the behavior of a pathway. To address this challenge, we apply a rigorous mathematical approach known as design of experiments (DOE) that can be used to construct empirical models of system behavior without testing all variants. To support this approach, we have developed a tool named Double Dutch, which uses a formal grammar and heuristic algorithms to automate the process of DOE library design. Compared to designing by hand, Double Dutch enables users to more efficiently and scalably design libraries of pathway variants that can be used in a DOE framework and uniquely provides a means to flexibly balance design considerations of statistical analysis, construction cost, and risk of homologous recombination, thereby demonstrating the utility of automating decision making when faced with complex design trade-offs.
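The DOE idea of covering a combinatorial space with far fewer constructs than the full factorial can be sketched with a Latin-square design: for three factors at n levels each (say promoters, RBSs, and gene variants; the level names below are assumed for illustration), n^2 constructs suffice for every pairwise level combination to appear exactly once. This is a generic DOE sketch, not the Double Dutch grammar or its cost and homologous-recombination heuristics.

from itertools import product

def latin_square_library(promoters, rbss, genes):
    """Return constructs covering all pairwise level combinations of three
    equally sized factors with only n^2 designs (instead of n^3)."""
    n = len(promoters)
    assert len(rbss) == n and len(genes) == n, "factors must have equal numbers of levels"
    return [(promoters[i], rbss[j], genes[(i + j) % n]) for i, j in product(range(n), repeat=2)]

library = latin_square_library(
    promoters=["pLow", "pMed", "pHigh"],
    rbss=["rbsA", "rbsB", "rbsC"],
    genes=["crtE_v1", "crtE_v2", "crtE_v3"],
)
print(len(library), "constructs out of", 3 ** 3, "possible")
for construct in library:
    print(construct)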
mHealth for HIV Treatment & Prevention: A Systematic Review of the Literature
Catalani, Caricia; Philbrick, William; Fraser, Hamish; Mechael, Patricia; Israelski, Dennis M.
2013-01-01
This systematic review assesses the published literature to describe the landscape of mobile health technology (mHealth) for HIV/AIDS and the evidence supporting the use of these tools to address the HIV prevention, care, and treatment cascade. The speed of innovation, broad range of initiatives and tools, and heterogeneity in reporting have made it difficult to uncover and synthesize knowledge on how mHealth tools might be effective in addressing the HIV pandemic. To address this gap, a team of reviewers collected literature on the use of mobile technology for HIV/AIDS from health, engineering, and social science literature databases and analyzed a final set of 62 articles. Articles were systematically coded, assessed for scientific rigor, and sorted for HIV programmatic relevance. The review revealed evidence that mHealth tools support HIV programmatic priorities, including: linkage to care, retention in care, and adherence to antiretroviral treatment. In terms of technical features, mHealth tools facilitate alerts and reminders, data collection, direct voice communication, educational messaging, information on demand, and more. Studies were mostly descriptive with a growing number of quasi-experimental and experimental designs. There was a lack of evidence around the use of mHealth tools to address the needs of key populations, including pregnant mothers, sex workers, users of injection drugs, and men who have sex with men. The science and practice of mHealth for HIV are evolving rapidly, but are still in their early stages. Small-scale efforts, pilot projects, and preliminary descriptive studies are advancing, and there is a promising trend toward implementing mHealth innovations that are feasible and acceptable within low-resource settings, with positive program outcomes, operational improvements, and rigorous study designs. PMID:24133558
Fleisher, Linda; Wen, Kuang Yi; Miller, Suzanne M; Diefenbach, Michael; Stanton, Annette L; Ropka, Mary; Morra, Marion; Raich, Peter C
2015-11-01
Cancer patients and survivors are assuming active roles in decision-making and digital patient support tools are widely used to facilitate patient engagement. As part of the Cancer Information Service Research Consortium's randomized controlled trials focused on the efficacy of eHealth interventions to promote informed treatment decision-making for newly diagnosed prostate and breast cancer patients, and post-treatment breast cancer, we conducted a rigorous process evaluation to examine the actual use of and perceived benefits of two complementary communication channels -- print and eHealth interventions. The three Virtual Cancer Information Service (V-CIS) interventions were developed through a rigorous developmental process, guided by self-regulatory theory, informed decision-making frameworks, and health communications best practices. Control arm participants received NCI print materials; experimental arm participants received the additional V-CIS patient support tool. Actual usage data from the web-based V-CIS were also obtained and reported. Print materials were highly used by all groups. About 60% of the experimental group reported using the V-CIS. Those who did use the V-CIS rated it highly on improvements in knowledge, patient-provider communication and decision-making. The findings show that how patients actually use eHealth interventions either singly or within the context of other communication channels is complex. Integrating rigorous best practices and theoretical foundations is essential, and multiple communication approaches should be considered to support patient preferences.
Towards a Credibility Assessment of Models and Simulations
NASA Technical Reports Server (NTRS)
Blattnig, Steve R.; Green, Lawrence L.; Luckring, James M.; Morrison, Joseph H.; Tripathi, Ram K.; Zang, Thomas A.
2008-01-01
A scale is presented to evaluate the rigor of modeling and simulation (M&S) practices for the purpose of supporting a credibility assessment of the M&S results. The scale distinguishes required and achieved levels of rigor for a set of M&S elements that contribute to credibility including both technical and process measures. The work has its origins in an interest within NASA to include a Credibility Assessment Scale in development of a NASA standard for models and simulations.
A Regional Seismic Travel Time Model for North America
2010-09-01
velocity at the Moho, the mantle velocity gradient, and the average crustal velocity. After tomography across Eurasia, rigorous tests find that Pn travel time residuals are reduced...and S-wave velocity in the crustal layers and in the upper mantle. A good prior model is essential because the RSTT tomography inversion is invariably
NASA Astrophysics Data System (ADS)
Akilan, A.; Nagasubramanian, V.; Chaudhry, A.; Reddy, D. Rajesh; Sudheer Reddy, D.; Usha Devi, R.; Tirupati, T.; Radhadevi, P. V.; Varadan, G.
2014-11-01
Block adjustment is a technique for large-area mapping from images obtained by different remote sensing satellites. The challenge in this process is to handle, at the system level, a huge number of satellite images from different sources with different resolutions and accuracies. This paper explains a system with various tools and techniques to effectively handle the end-to-end chain in large-area mapping and production with a good level of automation, and the provisions for intuitive analysis of the final results in 3D and 2D environments. In addition, the interfaces for using open-source ortho and DEM references (e.g., ETM, SRTM) and for displaying ESRI shapes of the image footprints are explained. Rigorous theory, mathematical modelling, workflow automation and sophisticated software engineering tools are included to ensure high photogrammetric accuracy and productivity. Major building blocks, such as the georeferencing, geo-capturing and geo-modelling tools included in the block adjustment solution, are explained in this paper. To provide an optimal bundle block adjustment solution with high-precision results, the system has been optimized at many stages to fully utilize hardware resources. Robustness is ensured by handling failures in the automatic procedure and by saving the process state at every stage so that processing can resume from the point of interruption. The results obtained from various stages of the system are presented in the paper.
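At the core of such a system sits a bundle block adjustment, in which reprojection residuals of tie points observed in multiple images are minimized by nonlinear least squares. The sketch below is a deliberately tiny, generic formulation (simple pinhole cameras, scipy, synthetic tie points, and only the camera poses as unknowns); it is not the rigorous sensor model or production workflow described above, where the ground coordinates would also be adjusted.

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points, rvec, tvec, focal):
    """Project 3-D points into an image with a simple pinhole camera."""
    cam = Rotation.from_rotvec(rvec).apply(points) + tvec
    return focal * cam[:, :2] / cam[:, 2:3]

def residuals(params, points3d, observations, focal):
    """params packs (rvec, tvec) per camera; observations holds the measured image points."""
    res = []
    for k, obs in enumerate(observations):
        rvec = params[6 * k: 6 * k + 3]
        tvec = params[6 * k + 3: 6 * k + 6]
        res.append((project(points3d, rvec, tvec, focal) - obs).ravel())
    return np.concatenate(res)

rng = np.random.default_rng(0)
points3d = rng.uniform(-1, 1, (20, 3)) + np.array([0.0, 0.0, 5.0])   # tie points in front of the cameras
focal = 1000.0
true_poses = [(np.zeros(3), np.zeros(3)), (np.array([0.0, 0.1, 0.0]), np.array([1.0, 0.0, 0.0]))]
observations = [project(points3d, r, t, focal) + rng.normal(0, 0.5, (20, 2)) for r, t in true_poses]

x0 = np.zeros(6 * len(observations))          # start from identity poses
fit = least_squares(residuals, x0, args=(points3d, observations, focal))
print("RMS reprojection error (pixels):", np.sqrt(np.mean(fit.fun ** 2)))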
What Can We Learn from Hadronic and Radiative Decays of Light Mesons?
NASA Astrophysics Data System (ADS)
Kubis, Bastian
2013-04-01
Chiral perturbation theory offers a powerful tool for the investigation of light pseudoscalar mesons. It incorporates the fundamental symmetries of QCD, interrelates various processes, and allows these to be linked to the light quark masses. Its shortcomings lie in a limited energy range: the radius of convergence of the chiral expansion is confined to below resonance scales. Furthermore, the strongest consequences of chiral symmetry are manifest for pseudoscalars (pions, kaons, eta) only: vector mesons, e.g., have a severe impact in particular for reactions involving photons. In this talk, I advocate dispersion relations as another model-independent tool to extend the applicability range of chiral perturbation theory. They even allow the physics of vector mesons to be tackled in a rigorous way. It will be shown how dispersive methods can be used to resum large rescattering effects, and to provide model-independent links between hadronic and radiative decay modes. Examples to be discussed will include decays of the eta meson, giving access to light-quark-mass ratios or allowing tests of the chiral anomaly; and meson transition form factors, which have an important impact on the hadronic light-by-light-scattering contribution to the anomalous magnetic moment of the muon.
Clinical observations from nutrition services in college athletics.
Quatromoni, Paula A
2008-04-01
College athletes are vulnerable to nutritional risks because of the rigorous demands of their sport, and because of the realities of college lifestyles. Athletes often adopt rigid training diets that predispose them to undernutrition, fatigue, and injury. Disordered eating, a common concern for college-aged women, affects a substantial number of female collegiate athletes, and is a growing concern for their male counterparts. Few resources exist to promote nutritional well-being among college athletes, particularly for individuals who suffer from eating pathology that is subclinical and often perceived as benign. This article presents evidence of the need for nutrition services for college athletes and describes nutritional risks that affect individuals across a variety of athletic teams. A multidisciplinary treatment model is depicted, featuring a nutrition practice at the core of a sports medicine wellness program in Division I college athletics. Observations from this practice document a substantial burden of subclinical eating disorders and elucidate characteristics of high-risk individuals. The Female Athlete Screening Tool is advocated as a useful tool for identifying eating pathology and triggering timely interventions. These insights from clinical practice identify opportunities and behavioral targets for intervention, and promote an effective model for health promotion in college athletics.
Moriasi, Daniel N; Gowda, Prasanna H; Arnold, Jeffrey G; Mulla, David J; Ale, Srinivasulu; Steiner, Jean L; Tomer, Mark D
2013-11-01
Subsurface tile drains in agricultural systems of the midwestern United States are a major contributor of nitrate-N (NO3-N) loadings to hypoxic conditions in the Gulf of Mexico. Hydrologic and water quality models, such as the Soil and Water Assessment Tool, are widely used to simulate tile drainage systems. The Hooghoudt and Kirkham tile drain equations in the Soil and Water Assessment Tool have not been rigorously tested for predicting tile flow and the corresponding NO3-N losses. In this study, long-term (1983-1996) monitoring plot data from southern Minnesota were used to evaluate the SWAT version 2009 revision 531 (hereafter referred to as SWAT) model for accurately estimating subsurface tile drain flows and associated NO3-N losses. A retention parameter adjustment factor was incorporated to account for the effects of tile drainage and slope changes on the computation of surface runoff using the curve number method (hereafter referred to as Revised SWAT). The SWAT and Revised SWAT models were calibrated and validated for tile flow and associated NO3-N losses. Results indicated that, on average, Revised SWAT predicted monthly tile flow and associated NO3-N losses better than SWAT by 48 and 28%, respectively. For the calibration period, the Revised SWAT model simulated tile flow and NO3-N losses within 4 and 1% of the observed data, respectively. For the validation period, it simulated tile flow and NO3-N losses within 8 and 2%, respectively, of the observed values. Therefore, the Revised SWAT model is expected to provide more accurate simulation of the effectiveness of tile drainage and NO3-N management practices. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
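For context, the steady-state Hooghoudt equation referred to above is commonly written as q = (8 K_b d_e m + 4 K_a m^2) / L^2, with q the drainage flux, m the midpoint water-table height above drain level, L the drain spacing, d_e the equivalent depth, and K_a, K_b the hydraulic conductivities above and below the drains. The helper below encodes this generic textbook form with illustrative numbers; it is not the SWAT source code or its revised retention-parameter logic.

def hooghoudt_flux(m, spacing, k_above, k_below, d_equiv):
    """Steady-state Hooghoudt drainage flux (m/day).
    m: midpoint water-table height above drain level (m)
    spacing: drain spacing L (m); d_equiv: Hooghoudt equivalent depth (m)
    k_above / k_below: hydraulic conductivity above / below drain level (m/day)."""
    return (8.0 * k_below * d_equiv * m + 4.0 * k_above * m ** 2) / spacing ** 2

# Illustrative numbers only: 30 m spacing, 0.6 m head, loam-like conductivities.
print(hooghoudt_flux(m=0.6, spacing=30.0, k_above=0.5, k_below=0.8, d_equiv=2.0))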
Method for hierarchical modeling of the command of flexible manufacturing systems
NASA Astrophysics Data System (ADS)
Ausfelder, Christian; Castelain, Emmanuel; Gentina, Jean-Claude
1994-04-01
The present paper focuses on the modeling of the command and proposes a hierarchical and modular approach oriented toward the physical structure of FMS. The requirements arising from the monitoring of FMS are discussed and integrated into the proposed model. Its modularity leaves the approach open to extensions concerning both the production resources and the products. As a modeling tool, we have chosen Object Petri nets. The first part of the paper describes desirable features of an FMS command such as safety, robustness, and adaptability. As shown, these features result from the flexibility of the installation. The modeling method presented in the second part of the paper begins with a structural analysis of FMS and defines a natural command hierarchy, in which the coordination of the production process, the synchronization of production resources on products, and the internal coordination are treated separately. The method is rigorous and leads to a structured and modular Petri net model which can be used for FMS simulation or translated into the final command code.
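The kind of executable command model the paper builds can be miniaturized to a place/transition net with a marking and a firing rule. The sketch below is a generic Petri-net interpreter in plain Python (not Object Petri nets and not the authors' tool), exercised on a toy load/process cycle for one machine; place and transition names are assumptions for illustration.

class PetriNet:
    """Minimal place/transition net: transitions consume and produce tokens."""
    def __init__(self, marking, transitions):
        self.marking = dict(marking)                 # place -> token count
        self.transitions = transitions               # name -> (pre: dict, post: dict)

    def enabled(self, name):
        pre, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in pre.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} is not enabled")
        pre, post = self.transitions[name]
        for p, n in pre.items():
            self.marking[p] -= n
        for p, n in post.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Toy FMS cell: a part is loaded onto a free machine, processed, then released.
net = PetriNet(
    marking={"raw_parts": 3, "machine_free": 1},
    transitions={
        "load":    ({"raw_parts": 1, "machine_free": 1}, {"machine_busy": 1}),
        "process": ({"machine_busy": 1},                 {"part_done": 1, "machine_free": 1}),
    },
)
for _ in range(3):
    net.fire("load"); net.fire("process")
print(net.marking)   # all three parts processed, machine free again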
Geodesic-light-cone coordinates and the Bianchi I spacetime
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fleury, Pierre; Nugier, Fabien; Fanizza, Giuseppe, E-mail: pierre.fleury@uct.ac.za, E-mail: fnugier@ntu.edu.tw, E-mail: giuseppe.fanizza@ba.infn.it
The geodesic-light-cone (GLC) coordinates are a useful tool to analyse light propagation and observations in cosmological models. In this article, we propose a detailed, pedagogical, and rigorous introduction to this coordinate system, explore its gauge degrees of freedom, and emphasize its interest when geometric optics is at stake. We then apply the GLC formalism to the homogeneous and anisotropic Bianchi I cosmology. More than a simple illustration, this application (i) allows us to show that the Weinberg conjecture according to which gravitational lensing does not affect the proper area of constant-redshift surfaces is significantly violated in a globally anisotropic universe; and (ii) offers a glimpse into new ways to constrain cosmic isotropy from the Hubble diagram.
A technical guide to tDCS, and related non-invasive brain stimulation tools.
Woods, A J; Antal, A; Bikson, M; Boggio, P S; Brunoni, A R; Celnik, P; Cohen, L G; Fregni, F; Herrmann, C S; Kappenman, E S; Knotkova, H; Liebetanz, D; Miniussi, C; Miranda, P C; Paulus, W; Priori, A; Reato, D; Stagg, C; Wenderoth, N; Nitsche, M A
2016-02-01
Transcranial electrical stimulation (tES), including transcranial direct and alternating current stimulation (tDCS, tACS) are non-invasive brain stimulation techniques increasingly used for modulation of central nervous system excitability in humans. Here we address methodological issues required for tES application. This review covers technical aspects of tES, as well as applications like exploration of brain physiology, modelling approaches, tES in cognitive neurosciences, and interventional approaches. It aims to help the reader to appropriately design and conduct studies involving these brain stimulation techniques, understand limitations and avoid shortcomings, which might hamper the scientific rigor and potential applications in the clinical domain. Copyright © 2015 International Federation of Clinical Neurophysiology. All rights reserved.
Rigorous simulations of a helical core fiber by the use of transformation optics formalism.
Napiorkowski, Maciej; Urbanczyk, Waclaw
2014-09-22
We report for the first time on rigorous numerical simulations of a helical-core fiber by using a full vectorial method based on the transformation optics formalism. We modeled the dependence of circular birefringence of the fundamental mode on the helix pitch and analyzed the effect of a birefringence increase caused by the mode displacement induced by a core twist. Furthermore, we analyzed the complex field evolution versus the helix pitch in the first order modes, including polarization and intensity distribution. Finally, we show that the rigorous vectorial method predicts the confinement loss of the guided modes better than approximate methods based on equivalent in-plane bending models.
Accurate estimation of influenza epidemics using Google search data via ARGO
Yang, Shihao; Santillana, Mauricio; Kou, S. C.
2015-01-01
Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search–based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people’s online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980
NASA Astrophysics Data System (ADS)
Liang, Yunyun; Liu, Sanyang; Zhang, Shengli
2017-02-01
Apoptosis is a fundamental process that controls normal tissue homeostasis by regulating the balance between cell proliferation and death. Predicting the subcellular location of apoptosis proteins is very helpful for understanding the mechanism of programmed cell death. Prediction of apoptosis protein subcellular location remains a challenging and complicated task, and existing methods are mainly based on protein primary sequences. In this paper, we propose a new position-specific scoring matrix (PSSM)-based model using the Geary autocorrelation function and the detrended cross-correlation coefficient (DCCA coefficient). A 270-dimensional (270D) feature vector is constructed on three widely used datasets (ZD98, ZW225 and CL317), and a support vector machine is adopted as the classifier. The overall prediction accuracies, assessed by the rigorous jackknife test, are significantly improved. The results show that our model offers a reliable and effective PSSM-based tool for prediction of apoptosis protein subcellular localization.
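To make the feature construction concrete, the sketch below computes lag-d Geary autocorrelation descriptors from the columns of a PSSM and feeds them to a support vector machine. The descriptor definition is the standard one used in sequence feature extraction; the array shapes, lags, toy PSSMs, and two-class setup are assumptions and not the exact 270-dimensional scheme of the paper (which also uses DCCA coefficients).

import numpy as np
from sklearn.svm import SVC

def geary_features(pssm, max_lag=5):
    """Lag-d Geary autocorrelation for each PSSM column (20 amino-acid scores).
    pssm: array of shape (sequence_length, 20). Returns 20 * max_lag features."""
    L, n_cols = pssm.shape
    feats = []
    for j in range(n_cols):
        x = pssm[:, j]
        denom = np.sum((x - x.mean()) ** 2) / (L - 1)
        for d in range(1, max_lag + 1):
            num = np.sum((x[:-d] - x[d:]) ** 2) / (2 * (L - d))
            feats.append(num / denom if denom > 0 else 0.0)
    return np.array(feats)

def toy_pssm(smooth, rng, length=60):
    """Synthetic PSSM; 'smooth' ones have stronger positional autocorrelation."""
    raw = rng.normal(size=(length, 20))
    if smooth:
        kernel = np.ones(5) / 5.0
        raw = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, raw)
    return raw

rng = np.random.default_rng(0)
X = np.array([geary_features(toy_pssm(smooth, rng)) for smooth in (False, True) for _ in range(30)])
y = np.repeat([0, 1], 30)
clf = SVC(kernel="rbf", C=10.0).fit(X, y)
print("training accuracy:", clf.score(X, y))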
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include an approach for rigorous analysis of operations functions, use of the resources of a large computer network, and provisions for efficient research and access to information.
Hollingsworth, T Déirdre; Adams, Emily R; Anderson, Roy M; Atkins, Katherine; Bartsch, Sarah; Basáñez, María-Gloria; Behrend, Matthew; Blok, David J; Chapman, Lloyd A C; Coffeng, Luc; Courtenay, Orin; Crump, Ron E; de Vlas, Sake J; Dobson, Andy; Dyson, Louise; Farkas, Hajnal; Galvani, Alison P; Gambhir, Manoj; Gurarie, David; Irvine, Michael A; Jervis, Sarah; Keeling, Matt J; Kelly-Hope, Louise; King, Charles; Lee, Bruce Y; Le Rutte, Epke A; Lietman, Thomas M; Ndeffo-Mbah, Martial; Medley, Graham F; Michael, Edwin; Pandey, Abhishek; Peterson, Jennifer K; Pinsent, Amy; Porco, Travis C; Richardus, Jan Hendrik; Reimer, Lisa; Rock, Kat S; Singh, Brajendra K; Stolk, Wilma; Swaminathan, Subramanian; Torr, Steve J; Townsend, Jeffrey; Truscott, James; Walker, Martin; Zoueva, Alexandra
2015-12-09
Quantitative analysis and mathematical models are useful tools in informing strategies to control or eliminate disease. Currently, there is an urgent need to develop these tools to inform policy to achieve the 2020 goals for neglected tropical diseases (NTDs). In this paper we give an overview of a collection of novel model-based analyses which aim to address key questions on the dynamics of transmission and control of nine NTDs: Chagas disease, visceral leishmaniasis, human African trypanosomiasis, leprosy, soil-transmitted helminths, schistosomiasis, lymphatic filariasis, onchocerciasis and trachoma. Several common themes resonate throughout these analyses, including: the importance of epidemiological setting on the success of interventions; targeting groups who are at highest risk of infection or re-infection; and reaching populations who are not accessing interventions and may act as a reservoir for infection. The results also highlight the challenge of maintaining elimination 'as a public health problem' when true elimination is not reached. The models elucidate the factors that may be contributing most to persistence of disease and discuss the requirements for eventually achieving true elimination, if that is possible. Overall this collection presents new analyses to inform current control initiatives. These papers form a base from which further development of the models and more rigorous validation against a variety of datasets can help to give more detailed advice. At the moment, the models' predictions are being considered as the world prepares for a final push towards control or elimination of neglected tropical diseases by 2020.
Single molecule force spectroscopy at high data acquisition: A Bayesian nonparametric analysis
NASA Astrophysics Data System (ADS)
Sgouralis, Ioannis; Whitmore, Miles; Lapidus, Lisa; Comstock, Matthew J.; Pressé, Steve
2018-03-01
Bayesian nonparametrics (BNPs) are poised to have a deep impact in the analysis of single molecule data as they provide posterior probabilities over entire models consistent with the supplied data, not just model parameters of one preferred model. Thus they provide an elegant and rigorous solution to the difficult problem encountered when selecting an appropriate candidate model. Nevertheless, BNPs' flexibility to learn models and their associated parameters from experimental data is a double-edged sword. Most importantly, BNPs are prone to increasing the complexity of the estimated models due to artifactual features present in time traces. Thus, because of experimental challenges unique to single molecule methods, naive application of available BNP tools is not possible. Here we consider traces with time correlations and, as a specific example, we deal with force spectroscopy traces collected at high acquisition rates. While high acquisition rates are required in order to capture dwells in short-lived molecular states, in this setup, a slow response of the optical trap instrumentation (i.e., trapped beads, ambient fluid, and tethering handles) distorts the molecular signals introducing time correlations into the data that may be misinterpreted as true states by naive BNPs. Our adaptation of BNP tools explicitly takes into consideration these response dynamics, in addition to drift and noise, and makes unsupervised time series analysis of correlated single molecule force spectroscopy measurements possible, even at acquisition rates similar to or below the trap's response times.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Yu; Sengupta, Manajit; Dooraghi, Mike
2018-03-20
Development of accurate transposition models to simulate plane-of-array (POA) irradiance from horizontal measurements or simulations is a complex process mainly because of the anisotropic distribution of diffuse solar radiation in the atmosphere. The limited availability of reliable POA measurements at large temporal and spatial scales leads to difficulties in the comprehensive evaluation of transposition models. This paper proposes new algorithms to assess the uncertainty of transposition models using both surface-based observations and modeling tools. We reviewed the analytical derivation of POA irradiance and the approximation of isotropic diffuse radiation that simplifies the computation. Two transposition models are evaluated against the computation by the rigorous analytical solution. We proposed a new algorithm to evaluate transposition models using the clear-sky measurements at the National Renewable Energy Laboratory's (NREL's) Solar Radiation Research Laboratory (SRRL) and a radiative transfer model that integrates diffuse radiances of various sky-viewing angles. We found that the radiative transfer model and a transposition model based on empirical regressions are superior to the isotropic models when compared to measurements. We further compared the radiative transfer model to the transposition models under an extensive range of idealized conditions. Our results suggest that the empirical transposition model has slightly higher cloudy-sky POA irradiance than the radiative transfer model, but performs better than the isotropic models under clear-sky conditions. Significantly smaller POA irradiances computed by the transposition models are observed when the photovoltaics (PV) panel deviates from the azimuthal direction of the sun. The new algorithms developed in the current study have opened the door to a more comprehensive evaluation of transposition models for various atmospheric conditions and solar and PV orientations.
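To make the transposition step concrete, the sketch below implements the textbook isotropic-sky transposition model, one of the isotropic models the study compares against. The function name, the example irradiance values, and the default albedo are illustrative assumptions, not values from the paper.

```python
import numpy as np

def poa_isotropic(dni, dhi, ghi, sun_zenith_deg, sun_azimuth_deg,
                  tilt_deg, surf_azimuth_deg, albedo=0.2):
    """Isotropic-sky transposition of horizontal irradiance to plane-of-array (POA).

    dni, dhi, ghi : direct-normal, diffuse-horizontal, global-horizontal irradiance (W/m^2)
    Angles are in degrees; albedo is the ground reflectance.
    """
    zen, azi = np.radians(sun_zenith_deg), np.radians(sun_azimuth_deg)
    tilt, sazi = np.radians(tilt_deg), np.radians(surf_azimuth_deg)

    # Angle of incidence between the sun vector and the panel normal
    cos_aoi = (np.cos(zen) * np.cos(tilt) +
               np.sin(zen) * np.sin(tilt) * np.cos(azi - sazi))
    beam = dni * np.clip(cos_aoi, 0.0, None)            # beam component on the tilted plane
    sky_diffuse = dhi * (1.0 + np.cos(tilt)) / 2.0      # isotropic sky-diffuse component
    ground = ghi * albedo * (1.0 - np.cos(tilt)) / 2.0  # ground-reflected component
    return beam + sky_diffuse + ground

# Example: clear-sky-like values for a 30-degree, south-facing panel
print(poa_isotropic(dni=800.0, dhi=100.0, ghi=650.0,
                    sun_zenith_deg=40.0, sun_azimuth_deg=180.0,
                    tilt_deg=30.0, surf_azimuth_deg=180.0))
```

The empirical and radiative-transfer models evaluated in the paper replace the isotropic sky-diffuse term with anisotropic treatments of the diffuse radiance.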
Methods Beyond Methods: A Model for Africana Graduate Methods Training
Best, Latrica E.; Byrd, W. Carson
2018-01-01
A holistic graduate education can impart not just tools and knowledge, but critical positioning to fulfill many of the original missions of Africana Studies programs set forth in the 1960s and 1970s. As an interdisciplinary field with many approaches to examining the African Diaspora, the methodological training of graduate students can vary across graduate programs. Although qualitative methods courses are often required of graduate students in Africana Studies programs, and these programs offer such courses, rarely if ever are graduate students in these programs required to take quantitative methods courses, let alone have these courses offered in-house. These courses can offer Africana Studies graduate students new tools for their own research, but more importantly, improve their knowledge of quantitative research of diasporic communities. These tools and knowledge can assist with identifying flawed arguments about African-descended communities and their members. This article explores the importance of requiring and offering critical quantitative methods courses in graduate programs in Africana Studies, and discusses the methods requirements of one graduate program in the field as an example of more rigorous training that other programs could offer graduate students. PMID:29710883
USDA-ARS?s Scientific Manuscript database
Background: A review of the literature produced no rigorously tested and validated Spanish-language physical activity survey or evaluation tools for use by USDA’s food assistance and education programs. The purpose of the current study was to develop and evaluate the face validity of a visually enha...
ERIC Educational Resources Information Center
Thomas, Jason E.; Hornsey, Philip E.
2014-01-01
Formative Classroom Assessment Techniques (CAT) have been well-established instructional tools in higher education since their exposition in the late 1980s (Angelo & Cross, 1993). A large body of literature exists surrounding the strengths and weaknesses of formative CATs. Simpson-Beck (2011) suggested insufficient quantitative evidence exists…
ERIC Educational Resources Information Center
Kinsler, Paul; Favaro, Alberto; McCall, Martin W.
2009-01-01
The Poynting vector is an invaluable tool for analysing electromagnetic problems. However, even a rigorous stress-energy tensor approach can still leave us with the question: is it best defined as E x H or as D x B? Typical electromagnetic treatments provide yet another perspective: they regard E x B as the appropriate definition, because E and B…
An Example-Centric Tool for Context-Driven Design of Biomedical Devices
ERIC Educational Resources Information Center
Dzombak, Rachel; Mehta, Khanjan; Butler, Peter
2015-01-01
Engineering is one of the most global professions, with design teams developing technologies for an increasingly interconnected and borderless world. In order for engineering students to be proficient in creating viable solutions to the challenges faced by diverse populations, they must receive an experiential education in rigorous engineering…
School Leader's Literacy Walkthrough: Kindergarten, First, Second, and Third Grades
ERIC Educational Resources Information Center
Kosanovich, Marcia; Smith, Kevin; Hensley, Trudy; Osborne-Lampkin, La'Tara; Foorman, Barbara
2015-01-01
The "School Leader's Literacy Walkthrough" is designed to assist school leaders in observing specific research-based practices during literacy instruction (or students' independent use or application of those practices). This tool is based on rigorous research that indicates the most effective way to teach reading (see Foorman &…
The Personal Selling Ethics Scale: Revisions and Expansions for Teaching Sales Ethics
ERIC Educational Resources Information Center
Donoho, Casey; Heinze, Timothy
2011-01-01
The field of sales draws a large number of marketing graduates. Sales curricula used within today's marketing programs should include rigorous discussions of sales ethics. The Personal Selling Ethics Scale (PSE) provides an analytical tool for assessing and discussing students' ethical sales sensitivities. However, since the scale fails to address…
Interactive visual analysis promotes exploration of long-term ecological data
T.N. Pham; J.A. Jones; R. Metoyer; F.J. Swanson; R.J. Pabst
2013-01-01
Long-term ecological data are crucial in helping ecologists understand ecosystem function and environmental change. Nevertheless, these kinds of data sets are difficult to analyze because they are usually large, multivariate, and spatiotemporal. Although existing analysis tools such as statistical methods and spreadsheet software permit rigorous tests of pre-conceived...
Layout optimization of DRAM cells using rigorous simulation model for NTD
NASA Astrophysics Data System (ADS)
Jeon, Jinhyuck; Kim, Shinyoung; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Kuechler, Bernd; Zimmermann, Rainer; Muelders, Thomas; Klostermann, Ulrich; Schmoeller, Thomas; Do, Mun-hoe; Choi, Jung-Hoe
2014-03-01
DRAM chip space is mainly determined by the size of the memory cell array patterns, which consist of periodic memory cell features and edges of the periodic array. Resolution Enhancement Techniques (RET) are used to optimize the periodic pattern process performance. Computational lithography techniques, such as source mask optimization (SMO) to find the optimal off-axis illumination and optical proximity correction (OPC) combined with model-based SRAF placement, are applied to print patterns on target. For 20nm memory cell optimization we see challenges that demand additional tool competence for layout optimization. The first challenge is a memory core pattern of brick-wall type with a k1 of 0.28, so it allows only two spectral beams to interfere. We will show how to analytically derive the only valid geometrically limited source. Another consequence of the two-beam interference limitation is a "super stable" core pattern, with the advantage of high depth of focus (DoF) but also low sensitivity to proximity corrections or changes of contact aspect ratio. This makes an array edge correction very difficult. The edge can be the most critical pattern since it forms the transition from the very stable regime of periodic patterns to the non-periodic periphery, so it combines the most critical pitch and the highest susceptibility to defocus. The above challenges make layout correction a complex optimization task, demanding an optimization that finds a solution with optimal process stability, taking into account DoF, exposure dose latitude (EL), mask error enhancement factor (MEEF) and mask manufacturability constraints. This can only be achieved by simultaneously considering all criteria while placing and sizing SRAFs and main mask features. The second challenge is the use of a negative tone development (NTD) type resist, which has a strong resist effect and is difficult to characterize experimentally because negative resist profile taper angles perturb CD-at-bottom characterization by scanning electron microscope (SEM) measurements. The high resist impact and difficult model data acquisition demand a simulation model that is capable of extrapolating reliably beyond its calibration dataset. We use rigorous simulation models to provide that predictive performance. We have discussed the need for a rigorous mask optimization process for DRAM contact cell layout, yielding mask layouts that are optimal in process performance, mask manufacturability and accuracy. In this paper, we have shown the step-by-step process from analytical illumination source derivation, through NTD- and application-tailored model calibration, to layout optimization such as OPC and SRAF placement. Finally, the work has been verified with simulation and experimental results on wafer.
Klein, Michael T; Hou, Gang; Quann, Richard J; Wei, Wei; Liao, Kai H; Yang, Raymond S H; Campain, Julie A; Mazurek, Monica A; Broadbelt, Linda J
2002-01-01
A chemical engineering approach for the rigorous construction, solution, and optimization of detailed kinetic models for biological processes is described. This modeling capability addresses the required technical components of detailed kinetic modeling, namely, the modeling of reactant structure and composition, the building of the reaction network, the organization of model parameters, the solution of the kinetic model, and the optimization of the model. Even though this modeling approach has enjoyed successful application in the petroleum industry, its application to biomedical research has just begun. We propose to expand the horizons on classic pharmacokinetics and physiologically based pharmacokinetics (PBPK), where human or animal bodies were often described by a few compartments, by integrating PBPK with reaction network modeling described in this article. If one draws a parallel between an oil refinery, where the application of this modeling approach has been very successful, and a human body, the individual processing units in the oil refinery may be considered equivalent to the vital organs of the human body. Even though the cell or organ may be much more complicated, the complex biochemical reaction networks in each organ may be similarly modeled and linked in much the same way as the modeling of the entire oil refinery through linkage of the individual processing units. The integrated chemical engineering software package described in this article, BioMOL, denotes the biological application of molecular-oriented lumping. BioMOL can build a detailed model in 1-1,000 CPU sec using standard desktop hardware. The models solve and optimize using standard and widely available hardware and software and can be presented in the context of a user-friendly interface. We believe this is an engineering tool with great promise in its application to complex biological reaction networks. PMID:12634134
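BioMOL itself is not publicly documented here, so the following is only a minimal sketch of the core idea of reaction network modeling: assemble a stoichiometric matrix and rate laws, then solve the resulting kinetic ODEs. The toy A -> B -> C network, rate constants, and initial concentrations are invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy reaction network A -> B -> C with first-order kinetics, in the spirit of
# building a reaction network and solving its kinetic model. Species ordering,
# rate constants, and the initial state are illustrative only.
stoich = np.array([[-1,  0],   # A
                   [ 1, -1],   # B
                   [ 0,  1]])  # C
k = np.array([0.5, 0.1])       # rate constants of the two reactions (1/s)

def rates(c):
    # First-order rate laws: r1 = k1*[A], r2 = k2*[B]
    return np.array([k[0] * c[0], k[1] * c[1]])

def rhs(t, c):
    # dC/dt = S @ r(C)
    return stoich @ rates(c)

sol = solve_ivp(rhs, (0.0, 60.0), y0=[1.0, 0.0, 0.0], dense_output=True)
print(sol.y[:, -1])  # concentrations of A, B, C at t = 60 s
```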
Application of State Analysis and Goal-based Operations to a MER Mission Scenario
NASA Technical Reports Server (NTRS)
Morris, John Richard; Ingham, Michel D.; Mishkin, Andrew H.; Rasmussen, Robert D.; Starbird, Thomas W.
2006-01-01
State Analysis is a model-based systems engineering methodology employing a rigorous discovery process which articulates operations concepts and operability needs as an integrated part of system design. The process produces requirements on system and software design in the form of explicit models which describe the system behavior in terms of state variables and the relationships among them. By applying State Analysis to an actual MER flight mission scenario, this study addresses the specific real world challenges of complex space operations and explores technologies that can be brought to bear on future missions. The paper first describes the tools currently used on a daily basis for MER operations planning and provides an in-depth description of the planning process, in the context of a Martian day's worth of rover engineering activities, resource modeling, flight rules, science observations, and more. It then describes how State Analysis allows for the specification of a corresponding goal-based sequence that accomplishes the same objectives, with several important additional benefits.
Application of State Analysis and Goal-Based Operations to a MER Mission Scenario
NASA Technical Reports Server (NTRS)
Morris, J. Richard; Ingham, Michel D.; Mishkin, Andrew H.; Rasmussen, Robert D.; Starbird, Thomas W.
2006-01-01
State Analysis is a model-based systems engineering methodology employing a rigorous discovery process which articulates operations concepts and operability needs as an integrated part of system design. The process produces requirements on system and software design in the form of explicit models which describe the behavior of states and the relationships among them. By applying State Analysis to an actual MER flight mission scenario, this study addresses the specific real world challenges of complex space operations and explores technologies that can be brought to bear on future missions. The paper describes the tools currently used on a daily basis for MER operations planning and provides an in-depth description of the planning process, in the context of a Martian day's worth of rover engineering activities, resource modeling, flight rules, science observations, and more. It then describes how State Analysis allows for the specification of a corresponding goal-based sequence that accomplishes the same objectives, with several important additional benefits.
Dolev, Danny; Függer, Matthias; Posch, Markus; Schmid, Ulrich; Steininger, Andreas; Lenzen, Christoph
2014-06-01
We present the first implementation of a distributed clock generation scheme for Systems-on-Chip that recovers from an unbounded number of arbitrary transient faults despite a large number of arbitrary permanent faults. We devise self-stabilizing hardware building blocks and a hybrid synchronous/asynchronous state machine enabling metastability-free transitions of the algorithm's states. We provide a comprehensive modeling approach that permits to prove, given correctness of the constructed low-level building blocks, the high-level properties of the synchronization algorithm (which have been established in a more abstract model). We believe this approach to be of interest in its own right, since this is the first technique permitting to mathematically verify, at manageable complexity, high-level properties of a fault-prone system in terms of its very basic components. We evaluate a prototype implementation, which has been designed in VHDL, using the Petrify tool in conjunction with some extensions, and synthesized for an Altera Cyclone FPGA.
Modern Management Principles Come to the Dental School.
Wataha, John C; Mouradian, Wendy E; Slayton, Rebecca L; Sorensen, John A; Berg, Joel H
2016-04-01
The University of Washington School of Dentistry may be the first dental school in the nation to apply lean process management principles as a primary tool to re-engineer its operations and curriculum to produce the dentist of the future. The efficiencies realized through re-engineering will better enable the school to remain competitive and viable as a national leader of dental education. Several task forces conducted rigorous value stream analyses in a highly collaborative environment led by the dean of the school. The four areas undergoing evaluation and re-engineering were organizational infrastructure, organizational processes, curriculum, and clinic operations. The new educational model was derived by thoroughly analyzing the current state of dental education in order to design and achieve the closest possible ideal state. As well, the school's goal was to create a lean, sustainable operational model. This model aims to ensure continued excellence in restorative dental instruction and to serve as a blueprint for other public dental schools seeking financial stability in this era of shrinking state support and rising costs.
Dolev, Danny; Függer, Matthias; Posch, Markus; Schmid, Ulrich; Steininger, Andreas; Lenzen, Christoph
2014-01-01
We present the first implementation of a distributed clock generation scheme for Systems-on-Chip that recovers from an unbounded number of arbitrary transient faults despite a large number of arbitrary permanent faults. We devise self-stabilizing hardware building blocks and a hybrid synchronous/asynchronous state machine enabling metastability-free transitions of the algorithm's states. We provide a comprehensive modeling approach that permits to prove, given correctness of the constructed low-level building blocks, the high-level properties of the synchronization algorithm (which have been established in a more abstract model). We believe this approach to be of interest in its own right, since this is the first technique permitting to mathematically verify, at manageable complexity, high-level properties of a fault-prone system in terms of its very basic components. We evaluate a prototype implementation, which has been designed in VHDL, using the Petrify tool in conjunction with some extensions, and synthesized for an Altera Cyclone FPGA. PMID:26516290
El-Houjeiri, Hassan M; Brandt, Adam R; Duffy, James E
2013-06-04
Existing transportation fuel cycle emissions models are either general and calculate nonspecific values of greenhouse gas (GHG) emissions from crude oil production, or are not available for public review and auditing. We have developed the Oil Production Greenhouse Gas Emissions Estimator (OPGEE) to provide open-source, transparent, rigorous GHG assessments for use in scientific assessment, regulatory processes, and analysis of GHG mitigation options by producers. OPGEE uses petroleum engineering fundamentals to model emissions from oil and gas production operations. We introduce OPGEE and explain the methods and assumptions used in its construction. We run OPGEE on a small set of fictional oil fields and explore model sensitivity to selected input parameters. Results show that upstream emissions from petroleum production operations can vary from 3 gCO2/MJ to over 30 gCO2/MJ using realistic ranges of input parameters. Significant drivers of emissions variation are steam injection rates, water handling requirements, and rates of flaring of associated gas.
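OPGEE is open source, but its actual structure is far richer than can be shown here; the toy calculation below only illustrates the kind of bottom-up, field-level accounting it formalizes (fuel burned for steam injection plus flaring, normalized by the energy content of the produced crude). Every parameter and emission factor in this sketch is a placeholder, not an OPGEE default.

```python
# Toy upstream carbon-intensity calculation (gCO2 per MJ of crude). All field
# parameters and emission factors below are illustrative placeholders.

LHV_CRUDE_MJ_PER_BBL = 6_100   # approximate energy content of a barrel of crude (MJ)
EF_GAS_G_CO2_PER_MJ = 56.1     # combustion emission factor for natural gas (gCO2/MJ)

def upstream_intensity(steam_oil_ratio, mj_fuel_per_bbl_steam,
                       flared_scf_per_bbl, gco2_per_scf_flared=60.0):
    """Return gCO2 per MJ of crude from steam generation and flaring only."""
    steam_fuel_mj = steam_oil_ratio * mj_fuel_per_bbl_steam     # fuel burned raising steam
    steam_emissions = steam_fuel_mj * EF_GAS_G_CO2_PER_MJ       # gCO2 per bbl from steam
    flare_emissions = flared_scf_per_bbl * gco2_per_scf_flared  # gCO2 per bbl from flaring
    return (steam_emissions + flare_emissions) / LHV_CRUDE_MJ_PER_BBL

# A heavy-oil field with steam injection vs. a light-oil field with heavy flaring
print(upstream_intensity(steam_oil_ratio=3.0, mj_fuel_per_bbl_steam=350.0,
                         flared_scf_per_bbl=50.0))
print(upstream_intensity(steam_oil_ratio=0.0, mj_fuel_per_bbl_steam=0.0,
                         flared_scf_per_bbl=500.0))
```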
Porru, Marcella; Özkan, Leyla
2017-08-30
This work investigates the design of alternative monitoring tools based on state estimators for industrial crystallization systems with nucleation, growth, and agglomeration kinetics. The estimation problem is regarded as a structure design problem where the estimation model and the set of innovated states have to be chosen; the estimator is driven by the available measurements of secondary variables. On the basis of Robust Exponential estimability arguments, it is found that the concentration is distinguishable with temperature and solid fraction measurements while the crystal size distribution (CSD) is not. Accordingly, a state estimator structure is selected such that (i) the concentration (and other distinguishable states) are innovated by means of the secondary measurements processed with the geometric estimator (GE), and (ii) the CSD is estimated by means of a rigorous model in open loop mode. The proposed estimator has been tested through simulations showing good performance in the case of mismatch in the initial conditions, parametric plant-model mismatch, and noisy measurements. PMID:28890604
Village Building Identification Based on Ensemble Convolutional Neural Networks
Guo, Zhiling; Chen, Qi; Xu, Yongwei; Shibasaki, Ryosuke; Shao, Xiaowei
2017-01-01
In this study, we present the Ensemble Convolutional Neural Network (ECNN), an elaborate CNN framework built by ensembling state-of-the-art CNN models, to identify village buildings from open high-resolution remote sensing (HRRS) images. First, to optimize and exploit the capability of CNNs for village mapping and to ensure compatibility with our classification targets, a few state-of-the-art models were carefully optimized and enhanced based on a series of rigorous analyses and evaluations. Second, rather than directly implementing building identification by using these models, we exploited most of their advantages by ensembling their feature extractor parts into a stronger model called ECNN based on the multiscale feature learning method. Finally, the generated ECNN was applied to a pixel-level classification framework to implement object identification. The proposed method can serve as a viable tool for village building identification with high accuracy and efficiency. The experimental results obtained from the test area in Savannakhet province, Laos, prove that the proposed ECNN model significantly outperforms existing methods, improving overall accuracy from 96.64% to 99.26%, and kappa from 0.57 to 0.86. PMID:29084154
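The paper's exact backbones and training setup are not reproduced here; the sketch below only illustrates the ensembling idea of concatenating features from several CNN feature extractors and training a joint classifier, using small stand-in backbones written in PyTorch.

```python
import torch
import torch.nn as nn

class SmallBackbone(nn.Module):
    """Stand-in feature extractor; the paper ensembles optimized state-of-the-art CNNs instead."""
    def __init__(self, out_channels):
        super().__init__()
        self.out_channels = out_channels
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, out_channels, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # global average pooling -> (N, C, 1, 1)
        )

    def forward(self, x):
        return self.features(x).flatten(1)

class EnsembleCNN(nn.Module):
    """Concatenate features from several backbones and classify them jointly."""
    def __init__(self, backbones, num_classes=2):
        super().__init__()
        self.backbones = nn.ModuleList(backbones)
        self.classifier = nn.Linear(sum(b.out_channels for b in backbones), num_classes)

    def forward(self, x):
        feats = torch.cat([b(x) for b in self.backbones], dim=1)
        return self.classifier(feats)

model = EnsembleCNN([SmallBackbone(32), SmallBackbone(64)], num_classes=2)
logits = model(torch.randn(4, 3, 64, 64))   # 4 image patches: building vs. non-building
print(logits.shape)                          # torch.Size([4, 2])
```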
Salipur, Zdravko; Bertocci, Gina
2010-01-01
It has been shown that ANSI WC19 transit wheelchairs that are crashworthy in frontal impact exhibit catastrophic failures in rear impact and may not be able to provide stable seating support and thus occupant protection for the wheelchair occupant. Thus far only limited sled test and computer simulation data have been available to study rear impact wheelchair safety. Computer modeling can be used as an economic and comprehensive tool to gain critical knowledge regarding wheelchair integrity and occupant safety. This study describes the development and validation of a computer model simulating an adult wheelchair-seated occupant subjected to a rear impact event. The model was developed in MADYMO and validated rigorously using the results of three similar sled tests conducted to specifications provided in the draft ISO/TC 173 standard. Outcomes from the model can provide critical wheelchair loading information to wheelchair and tiedown manufacturers, resulting in safer wheelchair designs for rear impact conditions. (c) 2009 IPEM. Published by Elsevier Ltd. All rights reserved.
Gupta, Shikha; Basant, Nikita; Mohan, Dinesh; Singh, Kunwar P
2016-07-01
The persistence and the removal of organic chemicals from the atmosphere are largely determined by their reactions with the OH radical and O3. Experimental determinations of the kinetic rate constants of OH and O3 with a large number of chemicals are tedious and resource intensive, and the development of computational approaches has widely been advocated. Recently, ensemble machine learning (EML) methods have emerged as unbiased tools to establish relationships between independent and dependent variables with a nonlinear dependence. In this study, EML-based, temperature-dependent quantitative structure-reactivity relationship (QSRR) models have been developed for predicting the kinetic rate constants for OH (kOH) and O3 (kO3) reactions with diverse chemicals. Structural diversity of chemicals was evaluated using a Tanimoto similarity index. The generalization and prediction abilities of the constructed models were established through rigorous internal and external validation employing statistical checks. In test data, the EML QSRR models yielded a correlation (R²) of ≥0.91 between the measured and the predicted reactivities. The applicability domains of the constructed models were determined using methods based on descriptor range, Euclidean distance, leverage, and standardization approaches. The prediction accuracies for the higher reactivity compounds were relatively better than those of the low reactivity compounds. The proposed EML QSRR models performed well and outperformed previous reports. The proposed QSRR models can make predictions of rate constants at different temperatures. The proposed models can be useful tools in predicting the reactivities of chemicals towards the OH radical and O3 in the atmosphere.
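As a hedged illustration of the EML QSRR workflow (not the authors' descriptors, data, or exact learner), the sketch below fits a gradient-boosted regression model on synthetic descriptors, reports an external R², and applies a leverage-based applicability-domain check of the kind mentioned in the abstract.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for molecular descriptors (plus temperature) and log10(kOH);
# the published models use real descriptors and measured rate constants.
X = rng.normal(size=(400, 6))
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] ** 2 + 0.3 * X[:, 5] + rng.normal(scale=0.2, size=400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
model.fit(X_tr, y_tr)
print("external R^2:", r2_score(y_te, model.predict(X_te)))

# Leverage-based applicability domain: h_i = x_i (X'X)^-1 x_i', warning limit h* = 3p/n
H_inv = np.linalg.pinv(X_tr.T @ X_tr)
leverage = np.einsum("ij,jk,ik->i", X_te, H_inv, X_te)
h_star = 3 * X_tr.shape[1] / X_tr.shape[0]
print("test compounds outside the applicability domain:", int((leverage > h_star).sum()))
```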
A Theoretical Framework for Lagrangian Descriptors
NASA Astrophysics Data System (ADS)
Lopesino, C.; Balibrea-Iniesta, F.; García-Garrido, V. J.; Wiggins, S.; Mancho, A. M.
This paper provides a theoretical background for Lagrangian Descriptors (LDs). The goal of achieving rigorous proofs that justify the ability of LDs to detect invariant manifolds is simplified by introducing an alternative definition for LDs. The definition is stated for n-dimensional systems with general time dependence; however, we rigorously prove that this method reveals the stable and unstable manifolds of hyperbolic points in four particular 2D cases: a hyperbolic saddle point for linear autonomous systems, a hyperbolic saddle point for nonlinear autonomous systems, a hyperbolic saddle point for linear nonautonomous systems and a hyperbolic saddle point for nonlinear nonautonomous systems. We also discuss further rigorous results which show the ability of LDs to highlight additional invariant sets, such as n-tori. These results are a simple extension of ergodic partition theory, which we illustrate by applying this methodology to well-known examples, such as the planar field of the harmonic oscillator and the 3D ABC flow. Finally, we provide a thorough discussion of the objectivity (frame-invariance) requirement for tools designed to reveal phase space structures and its implications for Lagrangian descriptors.
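A minimal numerical illustration of an arc-length-type Lagrangian descriptor is sketched below for the linear autonomous saddle, one of the four 2D cases treated rigorously in the paper; the integration time, exponent p, and sampling line are arbitrary choices, and the code is not the authors' implementation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Linear autonomous saddle: xdot = x, ydot = -y. The stable/unstable manifolds are
# the coordinate axes, which appear as sharp features of the LD field.
def field(t, z):
    x, y = z
    return [x, -y]

def lagrangian_descriptor(x0, y0, tau=3.0, p=0.5):
    """Accumulate |xdot|^p + |ydot|^p along the trajectory, forward and backward in time."""
    def aug(t, w):
        x, y, _ = w
        dx, dy = field(t, [x, y])
        return [dx, dy, abs(dx) ** p + abs(dy) ** p]
    fwd = solve_ivp(aug, (0.0, tau), [x0, y0, 0.0], rtol=1e-8).y[2, -1]
    bwd = solve_ivp(aug, (0.0, -tau), [x0, y0, 0.0], rtol=1e-8).y[2, -1]
    return fwd + abs(bwd)

# Scan a line across the stable manifold x = 0; the LD shows a distinguished feature there.
xs = np.linspace(-1, 1, 101)
ld_along_line = [lagrangian_descriptor(x, 0.3) for x in xs]
print(min(ld_along_line))
```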
Evaluating Emulation-based Models of Distributed Computing Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Stephen T.; Gabert, Kasimir G.; Tarman, Thomas D.
Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
In order to better understand the dynamic processes of a real game system, we need an appropriate dynamics model, so evaluating the validity of a model is not a trivial task. Here, we demonstrate an approach, considering the dynamical macroscopic patterns of angular momentum and speed as the measurement variables, to evaluate the validity of various dynamics models. Using data from real-time Rock-Paper-Scissors (RPS) game experiments, we obtain the experimental dynamic patterns, and then derive the related theoretical dynamic patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, the validity of the models can be evaluated. One result of our case study is that, among all the nonparametric models tested, the best-known Replicator dynamics model performs almost worst, while the Projection dynamics model performs best. Besides providing new empirical macroscopic patterns of social dynamics, we demonstrate that the approach can be an effective and rigorous tool to test game dynamics models. This work was supported by the Fundamental Research Funds for the Central Universities (SSEYI2014Z) and the National Natural Science Foundation of China (Grant No. 61503062).
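As a sketch of the kind of comparison described (not the authors' code or experimental data), the example below integrates Replicator dynamics for standard Rock-Paper-Scissors and computes the mean angular momentum of the social-state trajectory about the mixed equilibrium, one of the macroscopic measures used to confront models with data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Standard Rock-Paper-Scissors payoff matrix (win = 1, lose = -1, tie = 0).
A = np.array([[0, -1, 1],
              [1, 0, -1],
              [-1, 1, 0]], dtype=float)

def replicator(t, x):
    # xdot_i = x_i * ((A x)_i - x' A x)
    f = A @ x
    return x * (f - x @ f)

sol = solve_ivp(replicator, (0.0, 200.0), [0.5, 0.3, 0.2], max_step=0.05)

# Angular momentum of the trajectory about the Nash equilibrium (1/3, 1/3, 1/3),
# using the first two simplex coordinates; a nonzero mean indicates persistent cycling.
eq = np.array([1 / 3, 1 / 3, 1 / 3])
d = sol.y.T - eq
v = np.gradient(d, sol.t, axis=0)
L = d[:, 0] * v[:, 1] - d[:, 1] * v[:, 0]
print("mean angular momentum:", L.mean())
```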
The Planetary Data System Information Model for Geometry Metadata
NASA Astrophysics Data System (ADS)
Guinness, E. A.; Gordon, M. K.
2014-12-01
The NASA Planetary Data System (PDS) has recently developed a new set of archiving standards based on a rigorously defined information model. An important part of the new PDS information model is the model for geometry metadata, which includes, for example, attributes of the lighting and viewing angles of observations, position and velocity vectors of a spacecraft relative to Sun and observing body at the time of observation and the location and orientation of an observation on the target. The PDS geometry model is based on requirements gathered from the planetary research community, data producers, and software engineers who build search tools. A key requirement for the model is that it fully supports the breadth of PDS archives that include a wide range of data types from missions and instruments observing many types of solar system bodies such as planets, ring systems, and smaller bodies (moons, comets, and asteroids). Thus, important design aspects of the geometry model are that it standardizes the definition of the geometry attributes and provides consistency of geometry metadata across planetary science disciplines. The model specification also includes parameters so that the context of values can be unambiguously interpreted. For example, the reference frame used for specifying geographic locations on a planetary body is explicitly included with the other geometry metadata parameters. The structure and content of the new PDS geometry model is designed to enable both science analysis and efficient development of search tools. The geometry model is implemented in XML, as is the main PDS information model, and uses XML schema for validation. The initial version of the geometry model is focused on geometry for remote sensing observations conducted by flyby and orbiting spacecraft. Future releases of the PDS geometry model will be expanded to include metadata for landed and rover spacecraft.
McDonnell, J. D.; Schunck, N.; Higdon, D.; ...
2015-03-24
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
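A minimal sketch of the emulator-based propagation step, assuming a cheap stand-in for the expensive nuclear model: fit a Gaussian-process emulator to a design of model runs, then push (toy) posterior parameter samples through it to obtain a predictive distribution for an observable. None of the functions or numbers below come from the study.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)

# Stand-in for an expensive model: an observable as a function of two "functional" parameters.
def expensive_model(theta):
    return 10.0 + 3.0 * theta[..., 0] - 2.0 * theta[..., 1] + 0.5 * theta[..., 0] * theta[..., 1]

# 1) Build the emulator from a modest design of model runs.
design = rng.uniform(-1, 1, size=(40, 2))
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[1.0, 1.0]),
                              normalize_y=True).fit(design, expensive_model(design))

# 2) Propagate (toy) posterior samples of the parameters through the emulator
#    to obtain a predictive distribution for the observable.
posterior_theta = rng.normal(loc=[0.2, -0.1], scale=0.1, size=(5000, 2))
pred_mean, pred_sd = gp.predict(posterior_theta, return_std=True)
samples = pred_mean + pred_sd * rng.standard_normal(len(pred_mean))
print("predicted observable: %.2f +/- %.2f" % (samples.mean(), samples.std()))
```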
Rigorous ILT optimization for advanced patterning and design-process co-optimization
NASA Astrophysics Data System (ADS)
Selinidis, Kosta; Kuechler, Bernd; Cai, Howard; Braam, Kyle; Hoppe, Wolfgang; Domnenko, Vitaly; Poonawala, Amyn; Xiao, Guangming
2018-03-01
Despite the large difficulties involved in extending 193i multiple patterning and the slow ramp of EUV lithography to full manufacturing readiness, the pace of development for new technology node variations has been accelerating. Multiple new variations of new and existing technology nodes have been introduced for a range of device applications, each variation with at least a few new process integration methods, layout constructs, and/or design rules. This has led to a strong increase in the demand for predictive technology tools which can be used to quickly guide important patterning and design co-optimization decisions. In this paper, we introduce a novel hybrid predictive patterning method combining two patterning technologies which have each individually been widely used for process tuning, mask correction and process-design co-optimization. These technologies are rigorous lithography simulation and inverse lithography technology (ILT). Rigorous lithography simulation has been extensively used for process development/tuning, lithography tool user setup, photoresist hot-spot detection, photoresist-etch interaction analysis, lithography-TCAD interactions/sensitivities, source optimization and basic lithography design rule exploration. ILT has been extensively used in a range of lithographic areas including logic hot-spot fixing, memory layout correction, dense memory cell optimization, assist feature (AF) optimization, source optimization, complex patterning design rules and design-technology co-optimization (DTCO). The combined optimization capability of these two technologies will therefore have a wide range of useful applications. We investigate the benefits of the new functionality for a few of these advanced applications, including correction for photoresist top loss and resist scumming hotspots.
Comparison of rigorous and simple vibrational models for the CO2 gasdynamic laser
NASA Technical Reports Server (NTRS)
Monson, D. J.
1977-01-01
The accuracy of a simple vibrational model for computing the gain in a CO2 gasdynamic laser is assessed by comparing results computed from it with results computed from a rigorous vibrational model. The simple model is that of Anderson et al. (1971), in which the vibrational kinetics are modeled by grouping the nonequilibrium vibrational degrees of freedom into two modes, to each of which there corresponds an equation describing vibrational relaxation. The two models agree fairly well in the computed gain at low temperatures, but the simple model predicts too high a gain at the higher temperatures of current interest. The sources of error contributing to the overestimation given by the simple model are determined by examining the simplified relaxation equations.
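The two-mode model's relaxation equations are of Landau-Teller form; a minimal sketch under assumed (illustrative) characteristic temperatures, relaxation times, and temperature history is given below, showing how the slowly relaxing mode stays "frozen" above its equilibrium energy during an expansion, which is the origin of gain in a gasdynamic laser.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Landau-Teller relaxation of two lumped vibrational modes, in the spirit of the
# simple two-mode model: de_i/dt = (e_i^eq(T(t)) - e_i) / tau_i. The characteristic
# temperatures, relaxation times, and T(t) history below are illustrative only.
THETA = np.array([960.0, 3380.0])  # characteristic vibrational temperatures (K)
TAU = np.array([2e-6, 5e-5])       # relaxation times (s), assumed constant here

def e_eq(T):
    # Equilibrium vibrational energy per mode (harmonic oscillator, units of k*THETA)
    return THETA / (np.exp(THETA / T) - 1.0)

def T_trans(t):
    # Prescribed expansion: translational temperature falls from 1800 K to 300 K
    return 300.0 + 1500.0 * np.exp(-t / 1e-5)

def rhs(t, e):
    return (e_eq(T_trans(t)) - e) / TAU

sol = solve_ivp(rhs, (0.0, 2e-4), e_eq(1800.0), method="LSODA")
print("final vibrational energies:", sol.y[:, -1])   # slow mode remains far above equilibrium
print("equilibrium at 300 K:      ", e_eq(300.0))
```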
NASA Technical Reports Server (NTRS)
Glytsis, Elias N.; Brundrett, David L.; Gaylord, Thomas K.
1993-01-01
A review of the rigorous coupled-wave analysis as applied to the diffraction of electro-magnetic waves by gratings is presented. The analysis is valid for any polarization, angle of incidence, and conical diffraction. Cascaded and/or multiplexed gratings as well as material anisotropy can be incorporated under the same formalism. Small period rectangular groove gratings can also be modeled using approximately equivalent uniaxial homogeneous layers (effective media). The ordinary and extraordinary refractive indices of these layers depend on the gratings filling factor, the refractive indices of the substrate and superstrate, and the ratio of the freespace wavelength to grating period. Comparisons of the homogeneous effective medium approximations with the rigorous coupled-wave analysis are presented. Antireflection designs (single-layer or multilayer) using the effective medium models are presented and compared. These ultra-short period antireflection gratings can also be used to produce soft x-rays. Comparisons of the rigorous coupled-wave analysis with experimental results on soft x-ray generation by gratings are also included.
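For reference, the zeroth-order effective-medium (long-wavelength limit) indices of a lamellar subwavelength grating follow from simple mixing rules; the sketch below computes them for an assumed fused-silica/air example. Higher-order corrections, which depend on the wavelength-to-period ratio as noted in the abstract, are not included.

```python
import numpy as np

def emt_indices(n_ridge, n_groove, fill_factor):
    """Zeroth-order effective-medium indices of a subwavelength lamellar grating.

    n_ridge, n_groove : refractive indices of the grating ridge and groove materials
    fill_factor       : fraction of the period occupied by the ridge material
    Returns (n_ordinary, n_extraordinary) for E parallel / perpendicular to the grooves.
    """
    f = fill_factor
    n_o = np.sqrt(f * n_ridge ** 2 + (1 - f) * n_groove ** 2)
    n_e = np.sqrt(1.0 / (f / n_ridge ** 2 + (1 - f) / n_groove ** 2))
    return n_o, n_e

# A fused-silica grating (n = 1.45) in air with 50% fill factor behaves, in the
# long-wavelength limit, like a uniaxial layer with these two indices:
print(emt_indices(1.45, 1.0, 0.5))
```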
Binding and Scope Dependencies with "Floating Quantifiers" in Japanese
ERIC Educational Resources Information Center
Mukai, Emi
2012-01-01
The primary concern of this thesis is how we can achieve rigorous testability when we set the properties of the Computational System (hypothesized to be at the center of the language faculty) as our object of inquiry and informant judgments as a tool to construct and/or evaluate our hypotheses concerning the properties of the Computational System.…
ERIC Educational Resources Information Center
Carter, Sunshine; Traill, Stacie
2017-01-01
Electronic resource access troubleshooting is familiar work in most libraries. The added complexity introduced when a library implements a web-scale discovery service, however, creates a strong need for well-organized, rigorous training to enable troubleshooting staff to provide the best service possible. This article outlines strategies, tools,…
A Case Study of Resources Management Planning with Multiple Objectives and Projects
David L. Peterson; David G. Silsbee; Daniel L. Schmoldt
1995-01-01
Each National Park Service unit in the United States produces a resources management plan (RMP) every four years or less. The plans commit budgets and personnel to specific projects for four years, but they are prepared with little quantitative and analytical rigor and without formal decisionmaking tools. We have previously described a multiple objective planning...
Validation of the ROMI-RIP rough mill simulator
Edward R. Thomas; Urs Buehlmann
2002-01-01
The USDA Forest Service's ROMI-RIP rough mill rip-first simulation program is a popular tool for analyzing rough mill conditions, determining more efficient rough mill practices, and finding optimal lumber board cut-up patterns. However, until now, the results generated by ROMI-RIP have not been rigorously compared to those of an actual rough mill. Validating the...
ERIC Educational Resources Information Center
McKaveney, Edward W.
2017-01-01
A number of national directives and successful case studies, focus on the need for change in teaching and learning, particularly emphasizing increasingly rigorous STEM learning tied to the use of ICT and digital tools for technological literacy and future workforce development. This action research study investigated the role of instructional…
Are You Ready to Assess Social and Emotional Development? SEL Solutions Tools Index
ERIC Educational Resources Information Center
American Institutes for Research, 2015
2015-01-01
Assessing individuals' social and emotional (SE) knowledge, attitudes, and skills is a complex task. It requires careful consideration of the assessment purpose, rigor, practicality, burden, and ethics. Once you have considered these factors and have determined that you are, in fact, "Ready to Assess," you are ready to act and choose an…
ERIC Educational Resources Information Center
Meyers, Jonathan K.; LeBaron, Tyler W.; Collins, David C.
2014-01-01
Writing assignments are typically incorporated into chemistry courses in an attempt to enhance the learning of chemistry or to teach technical writing to chemistry majors. This work addresses the development of chemistry-major writing skills by focusing on the rigorous guidelines and conventions associated with the preparation of a journal…
Using High Speed Smartphone Cameras and Video Analysis Techniques to Teach Mechanical Wave Physics
ERIC Educational Resources Information Center
Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano
2017-01-01
We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allows one to measure, in a simple yet rigorous way, the speed of pulses…
What Can Graph Theory Tell Us about Word Learning and Lexical Retrieval?
ERIC Educational Resources Information Center
Vitevitch, Michael S.
2008-01-01
Purpose: Graph theory and the new science of networks provide a mathematically rigorous approach to examine the development and organization of complex systems. These tools were applied to the mental lexicon to examine the organization of words in the lexicon and to explore how that structure might influence the acquisition and retrieval of…
ERIC Educational Resources Information Center
Nehm, Ross H.; Schonfeld, Irvin Sam
2008-01-01
Growing recognition of the central importance of fostering an in-depth understanding of natural selection has, surprisingly, failed to stimulate work on the development and rigorous evaluation of instruments that measure knowledge of it. We used three different methodological tools, the Conceptual Inventory of Natural Selection (CINS), a modified…
ERIC Educational Resources Information Center
Reed, Eileen; Scull, Janie; Slicker, Gerilyn; Winkler, Amber M.
2012-01-01
Rigorous standards and aligned assessments are vital tools for boosting education outcomes but they have little traction without strong accountability systems that attach consequences to performance. In this pilot study, Eileen Reed, Janie Scull, Gerilyn Slicker, and Amber Winkler lay out the essential features of such accountability systems,…
A Framework to Manage Information Models
NASA Astrophysics Data System (ADS)
Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.
2008-05-01
The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. The modeling information can also be exported to semantic web languages such as OWL and RDF and written to XML Metadata Interchange (XMI) files for import into UML tools.
Complex Causal Process Diagrams for Analyzing the Health Impacts of Policy Interventions
Joffe, Michael; Mindell, Jennifer
2006-01-01
Causal diagrams are rigorous tools for controlling confounding. They also can be used to describe complex causal systems, which is done routinely in communicable disease epidemiology. The use of change diagrams has advantages over static diagrams, because change diagrams are more tractable, relate better to interventions, and have clearer interpretations. Causal diagrams are a useful basis for modeling. They make assumptions explicit, provide a framework for analysis, generate testable predictions, explore the effects of interventions, and identify data gaps. Causal diagrams can be used to integrate different types of information and to facilitate communication both among public health experts and between public health experts and experts in other fields. Causal diagrams allow the use of instrumental variables, which can help control confounding and reverse causation. PMID:16449586
Structuring Formal Control Systems Specifications for Reuse: Surviving Hardware Changes
NASA Technical Reports Server (NTRS)
Thompson, Jeffrey M.; Heimdahl, Mats P. E.; Erickson, Debra M.
2000-01-01
Formal capture and analysis of the required behavior of control systems have many advantages. For instance, they encourage rigorous requirements analysis, the required behavior is unambiguously defined, and we can assure that various safety properties are satisfied. Formal modeling is, however, a costly and time-consuming process, and if one could reuse the formal models over a family of products, significant cost savings would be realized. In an ongoing project we are investigating how to structure state-based models to achieve a high level of reusability within product families. In this paper we discuss a high-level structure of requirements models that achieves reusability of the desired control behavior across varying hardware platforms in a product family. The structuring approach is demonstrated through a case study in the mobile robotics domain where the desired robot behavior is reused on two diverse platforms: one commercial mobile platform and one built in-house. We use our language RSML (-e) to capture the control behavior for reuse and our tool NIMBUS to demonstrate how the formal specification can be validated and used as a prototype on the two platforms.
Hard Constraints in Optimization Under Uncertainty
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.
2008-01-01
This paper proposes a methodology for the analysis and design of systems subject to parametric uncertainty where design requirements are specified via hard inequality constraints. Hard constraints are those that must be satisfied for all parameter realizations within a given uncertainty model. Uncertainty models given by norm-bounded perturbations from a nominal parameter value, i.e., hyper-spheres, and by sets of independently bounded uncertain variables, i.e., hyper-rectangles, are the focus of this paper. These models, which are also quite practical, allow for a rigorous mathematical treatment within the proposed framework. Hard constraint feasibility is determined by sizing the largest uncertainty set for which the design requirements are satisfied. Analytically verifiable assessments of robustness are attained by comparing this set with the actual uncertainty model. Strategies that enable the comparison of the robustness characteristics of competing design alternatives, the description and approximation of the robust design space, and the systematic search for designs with improved robustness are also proposed. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, this methodology is applicable to a broad range of engineering problems.
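A toy numerical sketch of the hard-constraint idea for a hyper-spherical uncertainty model is given below: bisect on the radius of the largest parameter sphere over which a requirement g(p) <= 0 holds, using an inner worst-case search. The requirement g, the nominal parameter, and the optimizer settings are all illustrative assumptions, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize

# Hard requirement g(p) <= 0 on two uncertain parameters p = (p1, p2); the quadratic
# g below is a toy stand-in for a closed-loop performance requirement.
def g(p):
    return p[0] ** 2 + 0.5 * p[0] * p[1] + p[1] - 1.0

p_nom = np.array([0.0, 0.0])
rng = np.random.default_rng(0)

def worst_case(radius):
    """Approximate max of g over the hyper-sphere ||p - p_nom|| <= radius (multistart SLSQP)."""
    best = g(p_nom)
    for _ in range(20):
        d = rng.normal(size=2)
        x0 = p_nom + 0.9 * radius * d / np.linalg.norm(d)
        res = minimize(lambda p: -g(p), x0, method="SLSQP",
                       constraints=[{"type": "ineq",
                                     "fun": lambda p: radius ** 2 - np.sum((p - p_nom) ** 2)}])
        best = max(best, -res.fun)
    return best

# Bisection on the radius: the robustness margin is the largest sphere radius for
# which the worst-case value of the requirement is still non-positive.
lo, hi = 0.0, 5.0
for _ in range(30):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if worst_case(mid) <= 0.0 else (lo, mid)
print("largest feasible uncertainty radius (approx.):", round(lo, 3))
```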
Qiao, Peng-Fei; Mou, Shin; Chuang, Shun Lien
2012-01-30
The electronic band structures and optical properties of type-II superlattice (T2SL) photodetectors in the mid-infrared (IR) range are investigated. We formulate a rigorous band structure model using the 8-band k · p method to include the conduction and valence band mixing. After solving the 8 × 8 Hamiltonian and deriving explicitly the new momentum matrix elements in terms of envelope functions, optical transition rates are obtained through the Fermi's golden rule under various doping and injection conditions. Optical measurements on T2SL photodetectors are compared with our model and show good agreement. Our modeling results of quantum structures connect directly to the device-level design and simulation. The predicted doping effect is readily applicable to the optimization of photodetectors. We further include interfacial (IF) layers to study the significance of their effect. Optical properties of T2SLs are expected to have a large tunable range by controlling the thickness and material composition of the IF layers. Our model provides an efficient tool for the designs of novel photodetectors.
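For context, the transition-rate step referred to as Fermi's golden rule leads, in its standard textbook form, to an absorption spectrum of the type below (written here generically; the paper's exact expression, with its superlattice envelope-function matrix elements and interfacial terms, is not reproduced):

```latex
% Standard textbook form of the interband absorption spectrum obtained from
% Fermi's golden rule (generic; not the paper's exact superlattice expression).
\alpha(\hbar\omega) = \frac{\pi e^{2}}{n_{r} c\,\varepsilon_{0} m_{0}^{2}\,\omega}
  \sum_{c,v}\int\!\frac{d^{3}k}{(2\pi)^{3}}\,
  \left|\hat{e}\cdot\mathbf{p}_{cv}(\mathbf{k})\right|^{2}
  \left[f_{v}(\mathbf{k})-f_{c}(\mathbf{k})\right]\,
  \delta\!\left(E_{c}(\mathbf{k})-E_{v}(\mathbf{k})-\hbar\omega\right)
```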
Reinventing the High School Government Course: Rigor, Simulations, and Learning from Text
ERIC Educational Resources Information Center
Parker, Walter C.; Lo, Jane C.
2016-01-01
The high school government course is arguably the main site of formal civic education in the country today. This article presents the curriculum that resulted from a multiyear study aimed at improving the course. The pedagogic model, called "Knowledge in Action," centers on a rigorous form of project-based learning where the projects are…
All Rigor and No Play Is No Way to Improve Learning
ERIC Educational Resources Information Center
Wohlwend, Karen; Peppler, Kylie
2015-01-01
The authors propose and discuss their Playshop curricular model, which they developed with teachers. Their studies suggest a playful approach supports even more rigor than the Common Core State Standards require for preschool and early grade children. Children keep their attention longer when learning comes in the form of something they can play…
Scientific rigor through videogames.
Treuille, Adrien; Das, Rhiju
2014-11-01
Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.
Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.; Dalton, Angela C.; Dale, Crystal
2014-06-01
Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
NASA Astrophysics Data System (ADS)
Di, K.; Liu, Y.; Liu, B.; Peng, M.
2012-07-01
Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of the Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of the CE-1 and CE-2 CCD cameras based on the push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinates of a ground point in the lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points are different from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining the EOPs by correcting the attitude angle bias, and 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high precision DEMs (Digital Elevation Models) and DOMs (Digital Ortho Maps) are automatically generated.
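Once each image ray has been reconstructed from the interior and exterior orientation parameters, the space intersection step described above reduces to a small least-squares problem. Below is a minimal sketch of that step only; the ray origins and directions are assumed inputs invented for the example, and this is not the CE-1/CE-2 sensor model itself.

```python
import numpy as np

def space_intersection(p1, d1, p2, d2):
    """Least-squares intersection of two viewing rays p(t) = p + t*d
    reconstructed from conjugate image points in a stereo pair."""
    A = np.column_stack((d1, -d2))            # solve p1 + t1*d1 ~= p2 + t2*d2
    b = p2 - p1
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    q1 = p1 + t[0] * d1                       # closest point on ray 1
    q2 = p2 + t[1] * d2                       # closest point on ray 2
    return 0.5 * (q1 + q2)                    # ground point estimate

# toy usage with made-up orbiter positions (km) and look directions
p1, d1 = np.array([0.0, 0.0, 1900.0]), np.array([0.1, 0.0, -1.0])
p2, d2 = np.array([50.0, 0.0, 1900.0]), np.array([-0.1, 0.0, -1.0])
print(space_intersection(p1, d1, p2, d2))
```

The back-projection residuals quoted in the abstract are the image-space analogue: project the intersected point back through the sensor model and compare with the measured image coordinates.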
Nuclear physics: quantitative single-cell approaches to nuclear organization and gene expression.
Lionnet, T; Wu, B; Grünwald, D; Singer, R H; Larson, D R
2010-01-01
The internal workings of the nucleus remain a mystery. A list of component parts exists, and in many cases their functional roles are known for events such as transcription, RNA processing, or nuclear export. Some of these components exhibit structural features in the nucleus, regions of concentration or bodies that have given rise to the concept of functional compartmentalization--that there are underlying organizational principles to be described. In contrast, a picture is emerging in which transcription appears to drive the assembly of the functional components required for gene expression, drawing from pools of excess factors. Unifying this seemingly dual nature requires a more rigorous approach, one in which components are tracked in time and space and correlated with onset of specific nuclear functions. In this chapter, we anticipate tools that will address these questions and provide the missing kinetics of nuclear function. These tools are based on analyzing the fluctuations inherent in the weak signals of endogenous nuclear processes and determining values for them. In this way, it will be possible eventually to provide a computational model describing the functional relationships of essential components.
NASA Astrophysics Data System (ADS)
Davis, Brian; Turner, Travis L.; Seelecke, Stefan
2005-05-01
Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.
NASA Technical Reports Server (NTRS)
Davis, Brian; Turner, Travis L.; Seelecke, Stefan
2005-01-01
Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.
Digital morphogenesis via Schelling segregation
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2018-04-01
Schelling’s model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent-based models studied by economists and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied, it has largely resisted rigorous analysis, with prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012 Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model’s behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
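For readers unfamiliar with the dynamics analysed here, the sketch below gives a minimal agent-based version of Schelling's model on a torus: unhappy agents relocate to random empty cells. The grid size, vacancy rate, and intolerance threshold are arbitrary illustrative choices, not the parameter regime studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def unhappy(grid, i, j, threshold):
    """An agent is unhappy if fewer than `threshold` of its eight
    neighbours (torus topology) share its type."""
    n = grid.shape[0]
    same = 0
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di or dj:
                same += grid[(i + di) % n, (j + dj) % n] == grid[i, j]
    return same < threshold

def sweep(grid, threshold=3):
    """Each unhappy agent, visited in random order, jumps to a random empty cell."""
    agents = np.argwhere(grid > 0)
    for i, j in agents[rng.permutation(len(agents))]:
        if grid[i, j] and unhappy(grid, i, j, threshold):
            empties = np.argwhere(grid == 0)
            k, l = empties[rng.integers(len(empties))]
            grid[k, l], grid[i, j] = grid[i, j], 0

# toy run: 50x50 torus, 10% vacancies, two equally sized groups (types 1 and 2)
n = 50
cells = rng.permutation([0] * (n * n // 10) + [1, 2] * (9 * n * n // 20))
grid = cells.reshape(n, n)
for _ in range(20):
    sweep(grid)
```

Varying the threshold in this toy model is the analogue of varying the intolerance level whose counterintuitive effect on segregation the paper establishes rigorously.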
Learning What Works in Educational Technology with a Case Study of EDUSTAR. Policy Memo 2016-01
ERIC Educational Resources Information Center
Chatterji, Aaron K.; Jones, Benjamin F.
2016-01-01
Despite much fanfare, new technologies have yet to fundamentally advance student outcomes in K-12 schools or other educational settings. We believe that the system that supports the development and dissemination of educational technology tools is falling short. The key missing ingredient is rigorous evaluation. No one knows what works and for…
ERIC Educational Resources Information Center
McCready, John W.
2010-01-01
The purpose of this study was to examine use of decision-making tools and feedback in strategic planning in order to develop a rigorous process that would promote the efficiency of strategic planning for acquisitions in the United States Coast Guard (USCG). Strategic planning is critical to agencies such as the USCG in order to be effective…
Francisco Rodriguez y Silva; Armando Gonzalez-Caban
2010-01-01
Historically, in Spain and most European countries, forest fire budgets have never been subjected to an objective and rigorous economic analysis indicative of the returns on investments in fire management protection programs. Thus far we have witnessed expansive growth of costs without any investment planning. New economic realities and more focussed oversight by...
ERIC Educational Resources Information Center
National Center for Educational Achievement, 2010
2010-01-01
Many policymakers and education leaders have embraced the Advanced Placement (AP) Program as a tool to strengthen the high school curriculum and prepare students for college. The popularity of the AP program among these policy leaders reflects their belief that the traditional high school curriculum has often failed to provide rigorous courses…
13th Annual Systems Engineering Conference: Tues-Wed
2010-10-28
greater understanding/documentation of lessons learned – Promotes SE within the organization
• Justification for continued funding of SE Infrastructure...educational process – Addresses the development of innovative learning tools, strategies, and teacher training
• Research and Development – Promotes ...technology, and mathematics
• More commitment to engaging young students in science, engineering, technology and mathematics
• More rigor in defining
Statistical rigor in LiDAR-assisted estimation of aboveground forest biomass
Timothy G. Gregoire; Erik Næsset; Ronald E. McRoberts; Göran Ståhl; Hans Andersen; Terje Gobakken; Liviu Ene; Ross Nelson
2016-01-01
For many decades remotely sensed data have been used as a source of auxiliary information when conducting regional or national surveys of forest resources. In the past decade, airborne scanning LiDAR (Light Detection and Ranging) has emerged as a promising tool for sample surveys aimed at improving estimation of aboveground forest biomass. This technology is now...
A user-friendly tool to evaluate the effectiveness of no-take marine reserves.
Villaseñor-Derbez, Juan Carlos; Faro, Caio; Wright, Melaina; Martínez, Jael; Fitzgerald, Sean; Fulton, Stuart; Mancha-Cisneros, Maria Del Mar; McDonald, Gavin; Micheli, Fiorenza; Suárez, Alvin; Torre, Jorge; Costello, Christopher
2018-01-01
Marine reserves are implemented to achieve a variety of objectives, but are seldom rigorously evaluated to determine whether those objectives are met. In the rare cases when evaluations do take place, they typically focus on ecological indicators and ignore other relevant objectives such as socioeconomics and governance. And regardless of the objectives, the diversity of locations, monitoring protocols, and analysis approaches hinder the ability to compare results across case studies. Moreover, analysis and evaluation of reserves is generally conducted by outside researchers, not the reserve managers or users, plausibly thereby hindering effective local management and rapid response to change. We present a framework and tool, called "MAREA", to overcome these challenges. Its purpose is to evaluate the extent to which any given reserve has achieved its stated objectives. MAREA provides specific guidance on data collection and formatting, and then conducts rigorous causal inference analysis based on data input by the user, providing real-time outputs about the effectiveness of the reserve. MAREA's ease of use, standardization of state-of-the-art inference methods, and ability to analyze marine reserve effectiveness across ecological, socioeconomic, and governance objectives could dramatically further our understanding and support of effective marine reserve management.
A user-friendly tool to evaluate the effectiveness of no-take marine reserves
Fitzgerald, Sean; Fulton, Stuart; Mancha-Cisneros, Maria del Mar; McDonald, Gavin; Micheli, Fiorenza; Suárez, Alvin; Torre, Jorge
2018-01-01
Marine reserves are implemented to achieve a variety of objectives, but are seldom rigorously evaluated to determine whether those objectives are met. In the rare cases when evaluations do take place, they typically focus on ecological indicators and ignore other relevant objectives such as socioeconomics and governance. And regardless of the objectives, the diversity of locations, monitoring protocols, and analysis approaches hinder the ability to compare results across case studies. Moreover, analysis and evaluation of reserves is generally conducted by outside researchers, not the reserve managers or users, plausibly thereby hindering effective local management and rapid response to change. We present a framework and tool, called “MAREA”, to overcome these challenges. Its purpose is to evaluate the extent to which any given reserve has achieved its stated objectives. MAREA provides specific guidance on data collection and formatting, and then conducts rigorous causal inference analysis based on data input by the user, providing real-time outputs about the effectiveness of the reserve. MAREA’s ease of use, standardization of state-of-the-art inference methods, and ability to analyze marine reserve effectiveness across ecological, socioeconomic, and governance objectives could dramatically further our understanding and support of effective marine reserve management. PMID:29381762
Karkar, Ravi; Schroeder, Jessica; Epstein, Daniel A; Pina, Laura R; Scofield, Jeffrey; Fogarty, James; Kientz, Julie A; Munson, Sean A; Vilardaga, Roger; Zia, Jasmine
2017-05-02
Diagnostic self-tracking, the recording of personal information to diagnose or manage a health condition, is a common practice, especially for people with chronic conditions. Unfortunately, many who attempt diagnostic self-tracking have trouble accomplishing their goals. People often lack knowledge and skills needed to design and conduct scientifically rigorous experiments, and current tools provide little support. To address these shortcomings and explore opportunities for diagnostic self-tracking, we designed, developed, and evaluated a mobile app that applies a self-experimentation framework to support patients suffering from irritable bowel syndrome (IBS) in identifying their personal food triggers. TummyTrials aids a person in designing, executing, and analyzing self-experiments to evaluate whether a specific food triggers their symptoms. We examined the feasibility of this approach in a field study with 15 IBS patients, finding that participants could use the tool to reliably undergo a self-experiment. However, we also discovered an underlying tension between scientific validity and the lived experience of self-experimentation. We discuss challenges of applying clinical research methods in everyday life, motivating a need for the design of self-experimentation systems to balance rigor with the uncertainties of everyday life.
NASA Astrophysics Data System (ADS)
Kolski, Jeffrey
The linear lattice properties of the Proton Storage Ring (PSR) at the Los Alamos Neutron Science Center (LANSCE) in Los Alamos, NM were measured and applied to determine a better linear accelerator model. We found that the initial model was deficient in predicting the vertical focusing strength. The additional vertical focusing was located through fundamental understanding of the experiment and statistically rigorous analysis. An improved model was constructed and compared against the initial model and measurement at operational set points and set points far away from nominal, and was shown to indeed be an enhanced model. Independent component analysis (ICA) is a tool for data mining in many fields of science. Traditionally, ICA is applied to turn-by-turn beam position data as a means to measure the lattice functions of the real machine. Due to the diagnostic setup for the PSR, this method is not applicable. A new application method for ICA is derived, ICA applied along the length of the bunch. The ICA modes represent motions within the beam pulse. Several of the dominant ICA modes are experimentally identified.
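As a rough illustration of the decomposition idea (not the author's exact along-the-bunch formulation), the sketch below applies FastICA to a synthetic turn-by-position matrix and separates temporal from longitudinal patterns. The synthetic modes, dimensions, and frequencies are invented for the example.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)

# synthetic "profile history": rows = turns, columns = position along the bunch
turns, slices = 500, 128
t = np.arange(turns)[:, None]
s = np.linspace(0, 1, slices)[None, :]
mode1 = np.sin(2 * np.pi * 0.31 * t) * np.sin(np.pi * s)        # dipole-like pattern
mode2 = np.sin(2 * np.pi * 0.12 * t) * np.sin(2 * np.pi * s)    # higher-order pattern
data = mode1 + 0.5 * mode2 + 0.05 * rng.standard_normal((turns, slices))

ica = FastICA(n_components=4, random_state=0)
temporal = ica.fit_transform(data)   # (turns, components): time evolution of each mode
spatial = ica.mixing_.T              # (components, slices): shape of each mode along the bunch
```

Inspecting `temporal` and `spatial` together is what allows individual modes of motion within the pulse to be identified and attributed, in the spirit of the analysis described above.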
The epistemological status of general circulation models
NASA Astrophysics Data System (ADS)
Loehle, Craig
2018-03-01
Forecasts of both likely anthropogenic effects on climate and consequent effects on nature and society are based on large, complex software tools called general circulation models (GCMs). Forecasts generated by GCMs have been used extensively in policy decisions related to climate change. However, the relation between underlying physical theories and results produced by GCMs is unclear. In the case of GCMs, many discretizations and approximations are made, and simulating Earth system processes is far from simple and currently leads to some results with unknown energy balance implications. Statistical testing of GCM forecasts for degree of agreement with data would facilitate assessment of fitness for use. If model results need to be put on an anomaly basis due to model bias, then both visual and quantitative measures of model fit depend strongly on the reference period used for normalization, making testing problematic. Epistemology is here applied to problems of statistical inference during testing, the relationship between the underlying physics and the models, the epistemic meaning of ensemble statistics, problems of spatial and temporal scale, the existence or not of an unforced null for climate fluctuations, the meaning of existing uncertainty estimates, and other issues. Rigorous reasoning entails carefully quantifying levels of uncertainty.
EarthLabs - Investigating Hurricanes: Earth's Meteorological Monsters
NASA Astrophysics Data System (ADS)
McDaris, J. R.; Dahlman, L.; Barstow, D.
2007-12-01
Earth science is one of the most important tools that the global community needs to address the pressing environmental, social, and economic issues of our time. While at times considered a second-rate science at the high school level, it is currently undergoing a major revolution in the depth of content and pedagogical vitality. As part of this revolution, labs in Earth science courses need to shift their focus from cookbook-like activities with known outcomes to open-ended investigations that challenge students to think, explore and apply their learning. We need to establish a new model for Earth science as a rigorous lab science in policy, perception, and reality. As a concerted response to this need, five states, a coalition of scientists and educators, and an experienced curriculum team are creating a national model for a lab-based high school Earth science course named EarthLabs. This lab course will comply with the National Science Education Standards as well as the states' curriculum frameworks. The content will focus on Earth system science and environmental literacy. The lab experiences will feature a combination of field work, classroom experiments, and computer access to data and visualizations, and demonstrate the rigor and depth of a true lab course. The effort is being funded by NOAA's Environmental Literacy program. One of the prototype units of the course is Investigating Hurricanes. Hurricanes are phenomena which have a tremendous impact on humanity and the resources we use. They are also the result of complex interacting Earth systems, making them perfect objects for rigorous investigation of many concepts commonly covered in Earth science courses, such as meteorology, climate, and global wind circulation. Students are able to use the same data sets, analysis tools, and research techniques that scientists employ in their research, yielding truly authentic learning opportunities. This month-long integrated unit uses hurricanes as the story line by which students investigate the different interactions involved in hurricane generation, steering, and intensification. Students analyze a variety of visualization resources, looking for patterns in occurrence and developing an understanding of hurricane structure. They download archived data about past hurricanes and produce temporal and spatial plots to discover patterns in hurricane life cycles. They investigate the relationship between hurricane wind speed and factors such as barometric pressure and sea surface temperature by conducting spreadsheet analyses on archived data. They also conduct hands-on laboratory experiments in order to understand the physical processes that underpin energy transfer in convection, condensation, and latent heat. These activities highlight Earth science as a vital, rich, invigorating course, employing state-of-the-art technologies and in-depth labs with high relevance for our daily lives and the future.
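As a small example of the kind of spreadsheet-style analysis students perform on archived storm data, the sketch below loads a hypothetical table of advisories and examines the wind-pressure relationship. The file name and column names are assumptions made for the example, not part of the EarthLabs materials.

```python
import pandas as pd
import matplotlib.pyplot as plt

# hypothetical archive export, one row per 6-hourly advisory
# assumed columns: storm, time, wind_kt, pressure_mb, sst_c
df = pd.read_csv("hurricane_tracks.csv")

# correlation between intensity, central pressure, and sea surface temperature
print(df[["wind_kt", "pressure_mb", "sst_c"]].corr())

fig, ax = plt.subplots()
ax.scatter(df["pressure_mb"], df["wind_kt"], s=8, alpha=0.5)
ax.set_xlabel("Central pressure (mb)")
ax.set_ylabel("Max sustained wind (kt)")
plt.show()
```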
Short-term earthquake forecasting based on an epidemic clustering model
NASA Astrophysics Data System (ADS)
Console, Rodolfo; Murru, Maura; Falcone, Giuseppe
2016-04-01
The application of rigorous statistical tools, with the aim of verifying any prediction method, requires a univocal definition of the hypothesis, or the model, characterizing the concerned anomaly or precursor, so that it can be objectively recognized in any circumstance and by any observer. This is mandatory in order to improve upon the old-fashioned approach consisting only of the retrospective anecdotal study of past cases. A rigorous definition of an earthquake forecasting hypothesis should lead to the objective identification of particular sub-volumes (usually named alarm volumes) of the total time-space volume within which the probability of occurrence of strong earthquakes is higher than usual. Testing such a hypothesis requires the observation of a sufficient number of past cases upon which a statistical analysis is possible. This analysis should aim to determine the rate at which the precursor has been followed (success rate) or not followed (false alarm rate) by the target seismic event, or the rate at which a target event has been preceded (alarm rate) or not preceded (failure rate) by the precursor. The binary table obtained from this kind of analysis leads to the definition of the parameters of the model that achieve the maximum number of successes and the minimum number of false alarms for a specific class of precursors. The mathematical tools suitable for this purpose may include the definition of Probability Gain or the R-Score, as well as the application of popular plots such as the Molchan error-diagram and the ROC diagram. Another tool for evaluating the validity of a forecasting method is the concept of the likelihood ratio (also named performance factor) of occurrence and non-occurrence of seismic events under different hypotheses. Whatever method is chosen for building up a new hypothesis, usually based on retrospective data, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this step could be problematic for seismicity characterized by long-term recurrence. However, the separation of the database collected in the past into two separate sections (one on which the best fit of the parameters is carried out, and the other on which the hypothesis is tested) can be a viable solution, known as retrospective-forward testing. In this study we show examples of application of the above-mentioned concepts to the analysis of the Italian catalog of instrumental seismicity, making use of an epidemic algorithm developed to model short-term clustering features. This model, for which a precursory anomaly is just the occurrence of seismic activity, doesn't need the retrospective categorization of earthquakes in terms of foreshocks, mainshocks and aftershocks. It was introduced more than 15 years ago and tested so far in a number of real cases. It is now being run by several seismological centers around the world in forward real-time mode for testing purposes.
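The binary alarm-versus-event bookkeeping described above can be made concrete with a short sketch: given boolean arrays over space-time bins, it returns the success, alarm, and miss rates plus a Molchan-style alarm fraction and a simple probability gain. This is a generic illustration under the assumption of equally weighted bins, not the authors' implementation.

```python
import numpy as np

def forecast_scores(alarm, event):
    """alarm, event: boolean arrays over the same space-time bins."""
    alarm, event = np.asarray(alarm, bool), np.asarray(event, bool)
    hits = np.sum(alarm & event)            # alarms followed by a target event
    false_alarms = np.sum(alarm & ~event)   # alarms with no target event
    misses = np.sum(~alarm & event)         # target events with no alarm
    success_rate = hits / max(hits + false_alarms, 1)
    alarm_rate = hits / max(hits + misses, 1)
    miss_rate = misses / max(hits + misses, 1)      # Molchan vertical axis
    tau = alarm.mean()                              # fraction of space-time under alarm
    # probability gain of the alarm volumes relative to the unconditional event rate
    gain = (hits / max(alarm.sum(), 1)) / max(event.mean(), 1e-12)
    return dict(success_rate=success_rate, alarm_rate=alarm_rate,
                miss_rate=miss_rate, tau=tau, probability_gain=gain)
```

Sweeping the alarm-declaration threshold and plotting miss_rate against tau traces out a Molchan-style error diagram of the kind mentioned in the abstract.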
NASA Astrophysics Data System (ADS)
Dong, Yuting; Zhang, Lu; Balz, Timo; Luo, Heng; Liao, Mingsheng
2018-03-01
Radargrammetry is a powerful tool to construct digital surface models (DSMs), especially in heavily vegetated and mountainous areas where SAR interferometry (InSAR) technology suffers from decorrelation problems. In radargrammetry, the most challenging step is to produce an accurate disparity map through massive image matching, from which terrain height information can be derived using a rigorous sensor orientation model. However, precise stereoscopic SAR (StereoSAR) image matching is a very difficult task in mountainous areas due to the presence of speckle noise and dissimilar geometric/radiometric distortions. In this article, an adaptive-window least squares matching (AW-LSM) approach with an enhanced epipolar geometric constraint is proposed to robustly identify homologous points after compensation for radiometric discrepancies and geometric distortions. The matching procedure consists of two stages. In the first stage, the right image is re-projected into the left image space to generate epipolar images using rigorous imaging geometries enhanced with elevation information extracted from prior DEM data, e.g. SRTM DEM, instead of the mean height of the mapped area. Consequently, the dissimilarities in geometric distortions between the left and right images are largely reduced, and the residual disparity corresponds to the height difference between the true ground surface and the prior DEM. In the second stage, massive per-pixel matching between StereoSAR epipolar images identifies the residual disparity. To ensure the reliability and accuracy of the matching results, we develop an iterative matching scheme in which classic cross-correlation matching is used to obtain initial results, followed by least squares matching (LSM) to refine the matching results. An adaptive search-window resizing strategy is adopted during the dense matching step to help find correct matching points. The feasibility and effectiveness of the proposed approach are demonstrated using Stripmap and Spotlight mode TerraSAR-X stereo data pairs covering Mount Song in central China. Experimental results show that the proposed method can provide a robust and effective matching tool for radargrammetry in mountainous areas.
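To make the two-stage matching idea concrete, here is a minimal sketch of the first stage only: normalized cross-correlation matching between epipolar-resampled patches. Window sizes and the search range are arbitrary, and the least squares refinement stage is omitted.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def match_point(left, right, row, col, half=7, search=20):
    """Slide a (2*half+1)^2 template from `left` over a +/- `search` pixel
    window in the epipolar-resampled `right` image; return the offset
    (residual disparity) with the highest correlation score."""
    tpl = left[row - half:row + half + 1, col - half:col + half + 1]
    best_score, best_offset = -1.0, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = row + dr, col + dc
            if r - half < 0 or c - half < 0:
                continue
            win = right[r - half:r + half + 1, c - half:c + half + 1]
            if win.shape != tpl.shape:
                continue
            score = ncc(tpl, win)
            if score > best_score:
                best_score, best_offset = score, (dr, dc)
    return best_offset, best_score
```

In the approach described above, the correlation result seeds a least squares matching step, and the window size would be adapted rather than fixed as it is here.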
NASA Astrophysics Data System (ADS)
Ivkin, N.; Liu, Z.; Yang, L. F.; Kumar, S. S.; Lemson, G.; Neyrinck, M.; Szalay, A. S.; Braverman, V.; Budavari, T.
2018-04-01
Cosmological N-body simulations play a vital role in studying models for the evolution of the Universe. To compare to observations and make a scientific inference, statistical analysis of large simulation datasets, e.g., finding halos or obtaining multi-point correlation functions, is crucial. However, traditional in-memory methods for these tasks do not scale to the datasets that are forbiddingly large in modern simulations. Our prior paper (Liu et al., 2015) proposes memory-efficient streaming algorithms that can find the largest halos in a simulation with up to 10^9 particles on a small server or desktop. However, this approach fails when directly scaling to larger datasets. This paper presents a robust streaming tool that leverages state-of-the-art techniques on GPU boosting, sampling, and parallel I/O, to significantly improve performance and scalability. Our rigorous analysis of the sketch parameters improves the previous results from finding the centers of the 10^3 largest halos (Liu et al., 2015) to ∼10^4-10^5, and reveals the trade-offs between memory, running time and number of halos. Our experiments show that our tool can scale to datasets with up to ∼10^12 particles while using less than an hour of running time on a single Nvidia GTX 1080 GPU.
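The streaming idea, approximating per-cell particle counts in sublinear memory and keeping only the heaviest cells as halo candidates, can be sketched with a basic Count-Min structure. This is a generic illustration, not the GPU-accelerated sketch the paper describes, and the coarse-grid mapping is a made-up example.

```python
import numpy as np

class CountMin:
    """Minimal Count-Min sketch: approximate counts of keys seen in a stream
    using O(width * depth) memory, with one-sided (over-)estimation error."""
    def __init__(self, width=4096, depth=4, seed=0):
        rng = np.random.default_rng(seed)
        self.table = np.zeros((depth, width), dtype=np.int64)
        self.salts = [int(s) for s in rng.integers(1, 2**61 - 1, size=depth)]
        self.width = width

    def _cols(self, key):
        return [hash((salt, key)) % self.width for salt in self.salts]

    def add(self, key, count=1):
        for row, col in enumerate(self._cols(key)):
            self.table[row, col] += count

    def estimate(self, key):
        return int(min(self.table[row, col] for row, col in enumerate(self._cols(key))))

# stream particle positions in chunks, binning into a coarse grid;
# heavily counted cells are candidate halo centers
sketch = CountMin()
rng = np.random.default_rng(1)
for _ in range(10):                       # stand-in for chunks of a much larger stream
    pos = rng.random((10_000, 3))         # hypothetical particle positions in [0, 1)^3
    for cell in map(tuple, (pos * 64).astype(int)):
        sketch.add(cell)
```

The memory/accuracy trade-off the abstract refers to corresponds here to the choice of `width` and `depth` relative to the number of halos one wishes to recover.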
ERIC Educational Resources Information Center
Micceri, Theodore; Brigman, Leellen; Spatig, Robert
2009-01-01
An extensive, internally cross-validated analytical study using nested (within academic disciplines) Multilevel Modeling (MLM) on 4,560 students identified functional criteria for defining high school curriculum rigor and further determined which measures could best be used to help guide decision making for marginal applicants. The key outcome…
Ward W. Carson; Stephen E. Reutebuch
1997-01-01
A procedure for performing a rigorous test of elevational accuracy of DEMs using independent ground coordinate data digitized photogrammetrically from aerial photography is presented. The accuracy of a sample set of 23 DEMs covering National Forests in Oregon and Washington was evaluated. Accuracy varied considerably between eastern and western parts of Oregon and...
Accelerating Biomedical Discoveries through Rigor and Transparency.
Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D
2017-07-01
Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Image synthesis for SAR system, calibration and processor design
NASA Technical Reports Server (NTRS)
Holtzman, J. C.; Abbott, J. L.; Kaupp, V. H.; Frost, V. S.
1978-01-01
The Point Scattering Method of simulating radar imagery rigorously models all aspects of the imaging radar phenomena. Its computational algorithms operate on a symbolic representation of the terrain test site to calculate such parameters as range, angle of incidence, resolution cell size, etc. Empirical backscatter data and elevation data are utilized to model the terrain. Additionally, the important geometrical/propagation effects such as shadow, foreshortening, layover, and local angle of incidence are rigorously treated. Applications of radar image simulation to a proposed calibrated SAR system are highlighted: soil moisture detection and vegetation discrimination.
Mathematical Rigor vs. Conceptual Change: Some Early Results
NASA Astrophysics Data System (ADS)
Alexander, W. R.
2003-05-01
Results from two different pedagogical approaches to teaching introductory astronomy at the college level will be presented. The first of these approaches is a descriptive, conceptually based approach that emphasizes conceptual change. This descriptive class is typically an elective for non-science majors. The other approach is a mathematically rigorous treatment that emphasizes problem solving and is designed to prepare students for further study in astronomy. The mathematically rigorous class is typically taken by science majors. It also fulfills an elective science requirement for these science majors. The Astronomy Diagnostic Test version 2 (ADT 2.0) was used as an assessment instrument since its validity and reliability have been investigated by previous researchers. The ADT 2.0 was administered as both a pre-test and post-test to both groups. Initial results show no significant difference between the two groups in the post-test. However, there is a slightly greater improvement for the descriptive class between the pre- and post-testing compared to the mathematically rigorous course. Great care was taken to account for variables, including selection of text, class format, and instructor differences. Results indicate that the mathematically rigorous model does not improve conceptual understanding any more than the conceptual change model. Additional results indicate that there is a similar gender bias in favor of males that has been measured by previous investigators. This research has been funded by the College of Science and Mathematics at James Madison University.
PCA as a practical indicator of OPLS-DA model reliability.
Worley, Bradley; Powers, Robert
Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
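A stripped-down version of this Monte Carlo check can be written in a few lines. Since scikit-learn does not ship OPLS-DA, the sketch below substitutes a cross-validated PLS regression on class labels as a stand-in for the supervised model; that substitution, and the synthetic data dimensions, are assumptions of the example rather than the authors' procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# synthetic two-group "spectra": 20 samples x 200 variables, small true offset
X = rng.standard_normal((20, 200))
y = np.repeat([0, 1], 10)
X[y == 1, :5] += 1.0                      # group separation confined to 5 variables

for noise in (0.0, 1.0, 3.0):
    Xn = X + noise * rng.standard_normal(X.shape)
    scores = PCA(n_components=2).fit_transform(Xn)
    d = np.linalg.norm(scores[y == 0].mean(0) - scores[y == 1].mean(0))
    # PLS-DA stand-in: cross-validated R^2 of class prediction (a Q^2-like statistic)
    q2 = cross_val_score(PLSRegression(n_components=2), Xn, y.astype(float), cv=5).mean()
    print(f"noise={noise:.1f}  PCA group distance={d:.2f}  CV score={q2:.2f}")
```

The expected pattern mirrors the abstract: the unsupervised scores-space distance shrinks as noise grows while the cross-validation statistic of the supervised model deteriorates, even though the supervised scores plot alone may still look separated.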
Integrated Hydrographical Basin Management. Study Case - Crasna River Basin
NASA Astrophysics Data System (ADS)
Visescu, Mircea; Beilicci, Erika; Beilicci, Robert
2017-10-01
Hydrographical basins are important from hydrological, economic and ecological points of view. They receive and channel the runoff from rainfall and snowmelt which, when adequately managed, can provide fresh water necessary for water supply, irrigation, food industry, animal husbandry, hydrotechnical arrangements and recreation. Hydrographical basin planning and management aims at the efficient use of available water resources in order to satisfy environmental, economic and social necessities and constraints. This can be facilitated by a decision support system that links hydrological, meteorological, engineering, water quality, agriculture, environmental, and other information in an integrated framework. In the last few decades, different modelling tools have been developed for resolving problems regarding water quantity and quality, and for water resources management. Watershed models have been developed to further the understanding of the water cycle and pollution dynamics, and are used to evaluate the impacts of hydrotechnical arrangements and land use management options on water quantity, quality, mitigation measures and possible global changes. Models have been used for planning monitoring networks and for developing intervention plans in case of hydrological disasters: floods, flash floods, drought and pollution. MIKE HYDRO Basin is a multi-purpose, map-centric decision support tool for integrated hydrographical basin analysis, planning and management. MIKE HYDRO Basin is designed for analyzing water sharing issues at international, national and local hydrographical basin levels. MIKE HYDRO Basin uses a simplified mathematical representation of the hydrographical basin, including the configuration of river and reservoir systems, catchment hydrology and existing and potential water user schemes with their various demands, including a rigorous irrigation scheme module. This paper analyzes the importance and principles of integrated hydrographical basin management and develops a case study for the Crasna river basin, using the MIKE HYDRO Basin advanced hydroinformatics tool for integrated hydrographical basin analysis, planning and management.
Experiment for validation of fluid-structure interaction models and algorithms.
Hessenthaler, A; Gaddum, N R; Holub, O; Sinkus, R; Röhrle, O; Nordsletten, D
2017-09-01
In this paper a fluid-structure interaction (FSI) experiment is presented. The aim of this experiment is to provide a challenging yet easy-to-setup FSI test case that addresses the need for rigorous testing of FSI algorithms and modeling frameworks. Steady-state and periodic steady-state test cases with constant and periodic inflow were established. Focus of the experiment is on biomedical engineering applications with flow being in the laminar regime with Reynolds numbers 1283 and 651. Flow and solid domains were defined using computer-aided design (CAD) tools. The experimental design aimed at providing a straightforward boundary condition definition. Material parameters and mechanical response of a moderately viscous Newtonian fluid and a nonlinear incompressible solid were experimentally determined. A comprehensive data set was acquired by using magnetic resonance imaging to record the interaction between the fluid and the solid, quantifying flow and solid motion. Copyright © 2016 The Authors. International Journal for Numerical Methods in Biomedical Engineering published by John Wiley & Sons Ltd.
Lyapustin, Alexei
2002-09-20
Results of an extensive validation study of the new radiative transfer code SHARM-3D are described. The code is designed for modeling of unpolarized monochromatic radiative transfer in the visible and near-IR spectra in the laterally uniform atmosphere over an arbitrarily inhomogeneous anisotropic surface. The surface boundary condition is periodic. The algorithm is based on an exact solution derived with the Green's function method. Several parameterizations were introduced into the algorithm to achieve superior performance. As a result, SHARM-3D is 2-3 orders of magnitude faster than the rigorous code SHDOM. It can model radiances over large surface scenes for a number of incidence-view geometries simultaneously. Extensive comparisons against SHDOM indicate that SHARM-3D has an average accuracy of better than 1%, which along with the high speed of calculations makes it a unique tool for remote-sensing applications in land surface and related atmospheric radiation studies.
NASA Astrophysics Data System (ADS)
Lyapustin, Alexei
2002-09-01
Results of an extensive validation study of the new radiative transfer code SHARM-3D are described. The code is designed for modeling of unpolarized monochromatic radiative transfer in the visible and near-IR spectra in the laterally uniform atmosphere over an arbitrarily inhomogeneous anisotropic surface. The surface boundary condition is periodic. The algorithm is based on an exact solution derived with the Green's function method. Several parameterizations were introduced into the algorithm to achieve superior performance. As a result, SHARM-3D is 2-3 orders of magnitude faster than the rigorous code SHDOM. It can model radiances over large surface scenes for a number of incidence-view geometries simultaneously. Extensive comparisons against SHDOM indicate that SHARM-3D has an average accuracy of better than 1%, which along with the high speed of calculations makes it a unique tool for remote-sensing applications in land surface and related atmospheric radiation studies.
Schlägel, Ulrike E; Lewis, Mark A
2016-12-01
Discrete-time random walks and their extensions are common tools for analyzing animal movement data. In these analyses, the resolution of temporal discretization is a critical feature. Ideally, a model both mirrors the relevant temporal scale of the biological process of interest and matches the data sampling rate. Challenges arise when the resolution of the data is too coarse due to technological constraints, or when we wish to extrapolate results or compare results obtained from data with different resolutions. Drawing loosely on the concept of robustness in statistics, we propose a rigorous mathematical framework for studying movement models' robustness against changes in temporal resolution. In this framework, we define varying levels of robustness as formal model properties, focusing on random walk models with a spatially explicit component. With the new framework, we can investigate whether models can validly be applied to data across varying temporal resolutions and how we can account for these different resolutions in statistical inference results. We apply the new framework to movement-based resource selection models, demonstrating both analytical and numerical calculations, as well as a Monte Carlo simulation approach. While exact robustness is rare, the concept of approximate robustness provides a promising new direction for analyzing movement models.
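A toy illustration of robustness against temporal coarsening: for a simple Gaussian random walk, the step-scale parameter re-fitted from thinned tracks rescales predictably with the thinning factor. This is only a cartoon of the paper's formal framework, with arbitrary simulation settings.

```python
import numpy as np

rng = np.random.default_rng(2)

# simulate a simple (Brownian-like) random walk at fine resolution dt
sigma, n = 1.0, 100_000
steps = rng.normal(0.0, sigma, size=(n, 2))
path = np.cumsum(steps, axis=0)

for k in (1, 2, 5, 10):                 # thin the track to resolution k*dt
    coarse = path[::k]
    est = np.diff(coarse, axis=0).std(ddof=1)
    print(f"thinning k={k:2d}  fitted step sd={est:.3f}  sigma*sqrt(k)={sigma*np.sqrt(k):.3f}")
```

For more structured models, such as the movement-based resource selection models studied here, such a clean rescaling is not guaranteed, which is precisely what the proposed robustness properties are meant to characterize.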
Mind-modelling with corpus stylistics in David Copperfield
Mahlberg, Michaela
2015-01-01
We suggest an innovative approach to literary discourse by using corpus linguistic methods to address research questions from cognitive poetics. In this article, we focus on the way that readers engage in mind-modelling in the process of characterisation. The article sets out our cognitive poetic model of characterisation that emphasises the continuity between literary characterisation and real-life human relationships. The model also aims to deal with the modelling of the author’s mind in line with the modelling of the minds of fictional characters. Crucially, our approach to mind-modelling is text-driven. Therefore we are able to employ corpus linguistic techniques systematically to identify textual patterns that function as cues triggering character information. In this article, we explore our understanding of mind-modelling through the characterisation of Mr. Dick from David Copperfield by Charles Dickens. Using the CLiC tool (Corpus Linguistics in Cheshire) developed for the exploration of 19th-century fiction, we investigate the textual traces in non-quotations around this character, in order to draw out the techniques of characterisation other than speech presentation. We show that Mr. Dick is a thematically and authorially significant character in the novel, and we move towards a rigorous account of the reader’s modelling of authorial intention. PMID:29708113
Mind-modelling with corpus stylistics in David Copperfield.
Stockwell, Peter; Mahlberg, Michaela
2015-05-01
We suggest an innovative approach to literary discourse by using corpus linguistic methods to address research questions from cognitive poetics. In this article, we focus on the way that readers engage in mind-modelling in the process of characterisation. The article sets out our cognitive poetic model of characterisation that emphasises the continuity between literary characterisation and real-life human relationships. The model also aims to deal with the modelling of the author's mind in line with the modelling of the minds of fictional characters. Crucially, our approach to mind-modelling is text-driven. Therefore we are able to employ corpus linguistic techniques systematically to identify textual patterns that function as cues triggering character information. In this article, we explore our understanding of mind-modelling through the characterisation of Mr. Dick from David Copperfield by Charles Dickens. Using the CLiC tool (Corpus Linguistics in Cheshire) developed for the exploration of 19th-century fiction, we investigate the textual traces in non-quotations around this character, in order to draw out the techniques of characterisation other than speech presentation. We show that Mr. Dick is a thematically and authorially significant character in the novel, and we move towards a rigorous account of the reader's modelling of authorial intention.
NASA Astrophysics Data System (ADS)
Lapin, Alexei; Klann, Michael; Reuss, Matthias
Agent-based models are rigorous tools for simulating the interactions of individual entities, such as organisms or molecules within cells and assessing their effects on the dynamic behavior of the system as a whole. In context with bioprocess and biosystems engineering there are several interesting and important applications. This contribution aims at introducing this strategy with the aid of two examples characterized by striking distinctions in the scale of the individual entities and the mode of their interactions. In the first example a structured-segregated model is applied to travel along the lifelines of single cells in the environment of a three-dimensional turbulent field of a stirred bioreactor. The modeling approach is based on an Euler-Lagrange formulation of the system. The strategy permits one to account for the heterogeneity present in real reactors in both the fluid and cellular phases, respectively. The individual response of the cells to local variations in the extracellular concentrations is pictured by a dynamically structured model of the key reactions of the central metabolism. The approach permits analysis of the lifelines of individual cells in space and time.
Methodological challenges of validating a clinical decision-making tool in the practice environment.
Brennan, Caitlin W; Daly, Barbara J
2015-04-01
Validating a measurement tool intended for use in the practice environment poses challenges that may not be present when validating a tool intended solely for research purposes. The aim of this article is to describe the methodological challenges of validating a clinical decision-making tool, the Oncology Acuity Tool, which nurses use to make nurse assignment and staffing decisions prospectively each shift. Data were derived from a larger validation study, during which several methodological challenges arose. Revisions to the tool, including conducting iterative feedback cycles with end users, were necessary before the validation study was initiated. The "true" value of patient acuity is unknown, and thus, two approaches to inter-rater reliability assessment were used. Discordant perspectives existed between experts and end users. Balancing psychometric rigor with clinical relevance may be achieved through establishing research-practice partnerships, seeking active and continuous feedback with end users, and weighing traditional statistical rules of thumb with practical considerations. © The Author(s) 2014.
Rodriguez, A Noel; DeWitt, Peter; Fisher, Jennifer; Broadfoot, Kirsten; Hurt, K Joseph
2016-06-11
To characterize the psychometric properties of a novel Obstetric Communication Assessment Tool (OCAT) in a pilot study of standardized difficult OB communication scenarios appropriate for undergraduate medical evaluation. We developed and piloted four challenging OB Standardized Patient (SP) scenarios in a sample of twenty-one third year OB/GYN clerkship students: Religious Beliefs (RB), Angry Father (AF), Maternal Smoking (MS), and Intimate Partner Violence (IPV). Five trained Standardized Patient Reviewers (SPRs) independently scored twenty-four randomized video-recorded encounters using the OCAT. Cronbach's alpha and Intraclass Correlation Coefficient-2 (ICC-2) were used to estimate internal consistency (IC) and inter-rater reliability (IRR), respectively. Systematic variation in reviewer scoring was assessed using the Stuart-Maxwell test. IC was acceptable to excellent with Cronbach's alpha values (and 95% Confidence Intervals [CI]): RB 0.91 (0.86, 0.95), AF 0.76 (0.62, 0.87), MS 0.91 (0.86, 0.95), and IPV 0.94 (0.91, 0.97). IRR was unacceptable to poor with ICC-2 values: RB 0.46 (0.40, 0.53), AF 0.48 (0.41, 0.54), MS 0.52 (0.45, 0.58), and IPV 0.67 (0.61, 0.72). Stuart-Maxwell analysis indicated systematic differences in reviewer stringency. Our initial characterization of the OCAT demonstrates important issues in communications assessment. We identify scoring inconsistencies due to differences in SPR rigor that require enhanced training to improve assessment reliability. We outline a rational process for initial communication tool validation that may be useful in undergraduate curriculum development, and acknowledge that rigorous validation of OCAT training and implementation is needed to create a valuable OB communication assessment tool.
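For reference, the internal-consistency statistic used here can be computed directly from a respondents-by-items score matrix; the sketch below implements the standard Cronbach's alpha formula (the inter-rater ICCs would additionally require a two-way model across raters). The matrix shape and random scores are hypothetical, so the printed value is meaningless except as a demonstration of the computation.

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, n_items) matrix of item-level ratings."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# e.g. one reviewer's ratings of 24 encounters on a hypothetical 10-item form
ratings = np.random.default_rng(0).integers(1, 6, size=(24, 10))
print(round(cronbach_alpha(ratings), 2))
```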
Conflict: Operational Realism versus Analytical Rigor in Defense Modeling and Simulation
2012-06-14
Campbell, Experimental and Quasi-Experimental Designs for Generalized Causal Inference, Boston: Houghton Mifflin Company, 2002. [7] R. T. Johnson, G...experimentation? In order for an experiment to be considered rigorous, and the results valid, the experiment should be designed using established...addition to the interview, the pilots were administered a written survey, designed to capture their reactions regarding the level of realism present
High-order computer-assisted estimates of topological entropy
NASA Astrophysics Data System (ADS)
Grote, Johannes
The concept of Taylor Models is introduced, which offers highly accurate C^0-estimates for the enclosures of functional dependencies, combining high-order Taylor polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified interval arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly nonlinear dynamical systems. A method to obtain sharp rigorous enclosures of Poincare maps for certain types of flows and surfaces is developed and numerical examples are presented. Differential algebraic techniques allow the efficient and accurate computation of polynomial approximations for invariant curves of certain planar maps around hyperbolic fixed points. Subsequently we introduce a procedure to extend these polynomial curves to verified Taylor Model enclosures of local invariant manifolds with C^0-errors of size 10^-10 to 10^-14, and proceed to generate the global invariant manifold tangle up to comparable accuracy through iteration in Taylor Model arithmetic. Knowledge of the global manifold structure up to finite iterations of the local manifold pieces enables us to find all homoclinic and heteroclinic intersections in the generated manifold tangle. Combined with the mapping properties of the homoclinic points and their ordering we are able to construct a subshift of finite type as a topological factor of the original planar system to obtain rigorous lower bounds for its topological entropy. This construction is fully automatic and yields homoclinic tangles with several hundred homoclinic points. As an example, rigorous lower bounds for the topological entropy of the Henon map are computed, which to the best knowledge of the authors yield the largest such estimates published so far.
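The motivation for carrying a polynomial part exactly can be seen in a tiny example of the dependency problem in plain interval arithmetic, which Taylor Models mitigate by representing most of the functional dependence as a polynomial and enclosing only the truncation error in an interval. The sketch below ignores outward rounding, so it is only an illustration of interval overestimation, not verified arithmetic.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, o): return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o): return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))

def f(x):
    # f(x) = x * (x - 2) = x^2 - 2x
    return x * (x - Interval(2.0, 2.0))

X = Interval(0.0, 1.0)
# naive interval evaluation gives [-2, 0], although the true range over [0, 1] is [-1, 0];
# a Taylor Model keeps the polynomial x^2 - 2x exactly and would not lose this factor of 2
print(f(X))
```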
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
2017-11-01
Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
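The measurement variables mentioned here can be computed directly from an observed frequency trajectory. The sketch below projects Rock-Paper-Scissors frequencies onto a plane and returns the mean angular momentum about the centroid and the mean speed; the projection basis and centering are conventional choices made for the example, not necessarily those of the paper.

```python
import numpy as np

def cycle_observables(freqs):
    """freqs: (T, 3) time series of Rock/Paper/Scissors frequencies (rows sum to 1).
    Returns the trajectory's mean angular momentum about its centroid and its mean
    speed, after projecting the simplex onto a 2-D plane."""
    freqs = np.asarray(freqs, float)
    # orthonormal basis of the plane x1 + x2 + x3 = const
    e1 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
    e2 = np.array([1.0, 1.0, -2.0]) / np.sqrt(6)
    xy = freqs @ np.column_stack((e1, e2))
    r = xy - xy.mean(axis=0)            # position relative to the centroid
    v = np.diff(xy, axis=0)             # per-period displacement
    L = r[:-1, 0] * v[:, 1] - r[:-1, 1] * v[:, 0]   # z-component of r x v
    return L.mean(), np.linalg.norm(v, axis=1).mean()
```

Comparing such empirical values with the corresponding quantities predicted by a candidate dynamics model is the kind of goodness-of-fit test the abstract describes.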
A constructivist connectionist model of transitions on false-belief tasks.
Berthiaume, Vincent G; Shultz, Thomas R; Onishi, Kristine H
2013-03-01
How do children come to understand that others have mental representations, e.g., of an object's location? Preschoolers go through two transitions on verbal false-belief tasks, in which they have to predict where an agent will search for an object that was moved in her absence. First, while three-and-a-half-year-olds usually fail at approach tasks, in which the agent wants to find the object, children just under four succeed. Second, only after four do children succeed at tasks in which the agent wants to avoid the object. We present a constructivist connectionist model that autonomously reproduces the two transitions and suggests that the transitions are due to increases in general processing abilities enabling children to (1) overcome a default true-belief attribution by distinguishing false- from true-belief situations, and to (2) predict search in avoidance situations, where there is often more than one correct, empty search location. Constructivist connectionist models are rigorous, flexible and powerful tools that can be analyzed before and after transitions to uncover novel and emergent mechanisms of cognitive development. Copyright © 2012 Elsevier B.V. All rights reserved.
Agent-based modelling of consumer energy choices
NASA Astrophysics Data System (ADS)
Rai, Varun; Henry, Adam Douglas
2016-06-01
Strategies to mitigate global climate change should be grounded in a rigorous understanding of energy systems, particularly the factors that drive energy demand. Agent-based modelling (ABM) is a powerful tool for representing the complexities of energy demand, such as social interactions and spatial constraints. Unlike other approaches for modelling energy demand, ABM is not limited to studying perfectly rational agents or to abstracting micro details into system-level equations. Instead, ABM provides the ability to represent behaviours of energy consumers -- such as individual households -- using a range of theories, and to examine how the interaction of heterogeneous agents at the micro-level produces macro outcomes of importance to the global climate, such as the adoption of low-carbon behaviours and technologies over space and time. We provide an overview of ABM work in the area of consumer energy choices, with a focus on identifying specific ways in which ABM can improve understanding of both fundamental scientific and applied aspects of the demand side of energy to aid the design of better policies and programmes. Future research needs for improving the practice of ABM to better understand energy demand are also discussed.
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2014-01-01
A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.
Analytical calculation on the determination of steep side wall angles from far field measurements
NASA Astrophysics Data System (ADS)
Cisotto, Luca; Pereira, Silvania F.; Urbach, H. Paul
2018-06-01
In the semiconductor industry, the performance and capabilities of the lithographic process are evaluated by measuring specific structures. These structures are often gratings of which the shape is described by a few parameters such as period, middle critical dimension, height, and side wall angle (SWA). Upon direct measurement or retrieval of these parameters, the determination of the SWA suffers from considerable inaccuracies. Although the scattering effects that steep SWAs have on the illumination can be obtained with rigorous numerical simulations, analytical models constitute a very useful tool to get insights into the problem we are treating. In this paper, we develop an approach based on analytical calculations to describe the scattering of a cliff and a ridge with steep SWAs. We also propose a detection system to determine the SWAs of the structures.
Towards a supported common NEAMS software stack
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cormac Garvey
2012-04-01
The NEAMS IPSCs are developing multidimensional, multiphysics, multiscale simulation codes based on first principles that will be capable of predicting all aspects of current and future nuclear reactor systems. This new breed of simulation codes will include rigorous verification, validation and uncertainty quantification checks to quantify the accuracy and quality of the simulation results. The resulting NEAMS IPSC simulation codes will be an invaluable tool in designing the next generation of nuclear reactors and will also contribute to a speedier process for acquiring licenses from the NRC for new reactor designs. Due to the high resolution of the models, the complexity of the physics and the added computational cost of quantifying the accuracy and quality of the results, the NEAMS IPSC codes will require large HPC resources to carry out the production simulation runs.
Improve processes on healthcare: current issues and future trends.
Chen, Jason C H; Dolan, Matt; Lin, Binshan
2004-01-01
Information Technology (IT) is a critical resource for improving today's business competitiveness. However, many healthcare providers do not proactively manage or improve the efficiency and effectiveness of their services with IT. Survival in a competitive business environment demands continuous improvements in quality and service, while rigorously maintaining core values. Electronic commerce continues its development, gaining ground as the preferred means of business transactions. Embracing e-healthcare and treating IT as a strategic tool to improve patient safety and the quality of care enables healthcare professionals to benefit from technology formerly used only for management purposes. Numerous improvement initiatives, introduced by both the federal government and the private sector, seek to better the status quo in IT. This paper examines the current IT climate using an enhanced "Built to Last" model, and comments on future IT strategies within the healthcare industry.
Systematic and reliable multiscale modelling of lithium batteries
NASA Astrophysics Data System (ADS)
Atalay, Selcuk; Schmuck, Markus
2017-11-01
Motivated by the increasing interest in lithium batteries as energy storage devices (e.g. cars/bicycles/public transport, social robot companions, mobile phones, and tablets), we investigate three basic cells: (i) a single intercalation host; (ii) a periodic arrangement of intercalation hosts; and (iii) a rigorously upscaled formulation of (ii) as initiated in. By systematically accounting for Li transport and interfacial reactions in (i)-(iii), we compute the associated characteristic current-voltage curves and power densities. Finally, we discuss how the arrangement of the intercalation particles influences these results. Our findings are expected to improve the understanding of how microscopic properties affect the battery behaviour observed on the macroscale and, at the same time, the upscaled formulation (iii) serves as an efficient computational tool. This work has been supported by EPSRC, UK, through the Grant No. EP/P011713/1.
Kinetics versus thermodynamics in materials modeling: The case of the di-vacancy in iron
NASA Astrophysics Data System (ADS)
Djurabekova, F.; Malerba, L.; Pasianot, R. C.; Olsson, P.; Nordlund, K.
2010-07-01
Monte Carlo models are widely used for the study of microstructural and microchemical evolution of materials under irradiation. However, they often link explicitly the relevant activation energies to the energy difference between local equilibrium states. We provide a simple example (di-vacancy migration in iron) in which a rigorous activation energy calculation, by means of both empirical interatomic potentials and density functional theory methods, clearly shows that such a link is not granted, revealing a migration mechanism that a thermodynamics-linked activation energy model cannot predict. Such a mechanism is, however, fully consistent with thermodynamics. This example emphasizes the importance of basing Monte Carlo methods on models where the activation energies are rigorously calculated, rather than deduced from widespread heuristic equations.
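The contrast drawn above, between activation energies computed rigorously (for example from a saddle-point search) and those deduced from heuristic relations tied to the end-state energy difference, can be made concrete with a short sketch; the barrier values, prefactor and heuristic form below are illustrative placeholders, not quantities from the paper.

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def arrhenius_rate(prefactor_hz, activation_energy_ev, temperature_k):
    """Harmonic transition-state-theory jump rate used in kinetic Monte Carlo."""
    return prefactor_hz * math.exp(-activation_energy_ev / (K_B * temperature_k))

def heuristic_barrier(reference_barrier_ev, delta_e_ev):
    """One widespread heuristic form: barrier = reference migration energy plus half
    the energy difference between final and initial states. This is the kind of
    thermodynamics-linked shortcut the paper cautions against relying on blindly."""
    return reference_barrier_ev + 0.5 * delta_e_ev

# Illustrative comparison at 600 K (all energies are made-up placeholders).
T = 600.0
nu = 1.0e13  # attempt frequency in Hz
E_rigorous = 0.90        # e.g., from a saddle-point (nudged elastic band / DFT) search
E_heuristic = heuristic_barrier(reference_barrier_ev=0.65, delta_e_ev=0.20)

print(arrhenius_rate(nu, E_rigorous, T))
print(arrhenius_rate(nu, E_heuristic, T))
# Even a ~0.15 eV barrier difference changes the predicted jump rate by roughly
# a factor of exp(0.15 / (K_B * 600)), about 18x at this temperature.
```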
Dynamic Gate Product and Artifact Generation from System Models
NASA Technical Reports Server (NTRS)
Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris
2011-01-01
Model Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("Power-Point Engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "Power-Point Engineering" from model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one-another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "PowerPoint engineering" and production of paper-based documents or their "office-productivity" file equivalents.
A Rigorous Sharp Interface Limit of a Diffuse Interface Model Related to Tumor Growth
NASA Astrophysics Data System (ADS)
Rocca, Elisabetta; Scala, Riccardo
2017-06-01
In this paper, we study the rigorous sharp interface limit of a diffuse interface model related to the dynamics of tumor growth, when a parameter ɛ, representing the interface thickness between the tumorous and non-tumorous cells, tends to zero. In particular, we analyze here a gradient-flow-type model arising from a modification of the recently introduced model for tumor growth dynamics in Hawkins-Daruud et al. (Int J Numer Math Biomed Eng 28:3-24, 2011) (cf. also Hilhorst et al. Math Models Methods Appl Sci 25:1011-1043, 2015). Exploiting the techniques related to both gradient flows and gamma convergence, we recover a condition on the interface Γ relating the chemical and double-well potentials, the mean curvature, and the normal velocity.
A Mathematical Evaluation of the Core Conductor Model
Clark, John; Plonsey, Robert
1966-01-01
This paper is a mathematical evaluation of the core conductor model where its three dimensionality is taken into account. The problem considered is that of a single, active, unmyelinated nerve fiber situated in an extensive, homogeneous, conducting medium. Expressions for the various core conductor parameters have been derived in a mathematically rigorous manner according to the principles of electromagnetic theory. The purpose of employing mathematical rigor in this study is to bring to light the inherent assumptions of the one dimensional core conductor model, providing a method of evaluating the accuracy of this linear model. Based on the use of synthetic squid axon data, the conclusion of this study is that the linear core conductor model is a good approximation for internal but not external parameters. PMID:5903155
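For orientation, the one-dimensional model whose accuracy the paper evaluates reduces to the standard cable (core conductor) equation; the form below is the common textbook version in terms of a length constant and a time constant, shown for context rather than reproduced from the paper.

```latex
% Standard one-dimensional cable equation (textbook form, shown for context):
%   V_m : deviation of the transmembrane potential from rest
%   \lambda = \sqrt{r_m / (r_i + r_e)} : length constant
%           (r_e, the external resistance per unit length, is often neglected
%            for an extensive external medium)
%   \tau = r_m c_m : membrane time constant
\lambda^{2} \frac{\partial^{2} V_m}{\partial x^{2}}
  = \tau \frac{\partial V_m}{\partial t} + V_m
```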
ZY3-02 Laser Altimeter Footprint Geolocation Prediction
Xie, Junfeng; Tang, Xinming; Mo, Fan; Li, Guoyuan; Zhu, Guangbin; Wang, Zhenming; Fu, Xingke; Gao, Xiaoming; Dou, Xianhui
2017-01-01
Successfully launched on 30 May 2016, ZY3-02 is the first Chinese surveying and mapping satellite equipped with a lightweight laser altimeter. Calibration is necessary before the laser altimeter becomes operational. Laser footprint location prediction is the first step in calibration based on ground infrared detectors, and it is difficult because the sample frequency of the ZY3-02 laser altimeter is 2 Hz, and the distance between two adjacent laser footprints is about 3.5 km. In this paper, we build an on-orbit rigorous geometric prediction model with reference to the rigorous geometric model of optical remote sensing satellites. The model includes three kinds of data that must be predicted: pointing angle, orbit parameters, and attitude angles. The proposed method is verified by a ZY3-02 laser altimeter on-orbit geometric calibration test. Five laser footprint prediction experiments are conducted based on the model, and the laser footprint prediction accuracy is better than 150 m on the ground. The effectiveness and accuracy of the on-orbit rigorous geometric prediction model are confirmed by the test results. The geolocation is predicted precisely by the proposed method, and this will serve as a reference for geolocation prediction of land laser detectors in future laser altimeter calibration tests. PMID:28934160
Software engineering methodologies and tools
NASA Technical Reports Server (NTRS)
Wilcox, Lawrence M.
1993-01-01
Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.
Walach, Harald; Falkenberg, Torkel; Fønnebø, Vinjar; Lewith, George; Jonas, Wayne B
2006-01-01
Background The reasoning behind evaluating medical interventions is that a hierarchy of methods exists which successively produce improved and therefore more rigorous evidence based medicine upon which to make clinical decisions. At the foundation of this hierarchy are case studies, retrospective and prospective case series, followed by cohort studies with historical and concomitant non-randomized controls. Open-label randomized controlled studies (RCTs), and finally blinded, placebo-controlled RCTs, which offer most internal validity are considered the most reliable evidence. Rigorous RCTs remove bias. Evidence from RCTs forms the basis of meta-analyses and systematic reviews. This hierarchy, founded on a pharmacological model of therapy, is generalized to other interventions which may be complex and non-pharmacological (healing, acupuncture and surgery). Discussion The hierarchical model is valid for limited questions of efficacy, for instance for regulatory purposes and newly devised products and pharmacological preparations. It is inadequate for the evaluation of complex interventions such as physiotherapy, surgery and complementary and alternative medicine (CAM). This has to do with the essential tension between internal validity (rigor and the removal of bias) and external validity (generalizability). Summary Instead of an Evidence Hierarchy, we propose a Circular Model. This would imply a multiplicity of methods, using different designs, counterbalancing their individual strengths and weaknesses to arrive at pragmatic but equally rigorous evidence which would provide significant assistance in clinical and health systems innovation. Such evidence would better inform national health care technology assessment agencies and promote evidence based health reform. PMID:16796762
The KP Approximation Under a Weak Coriolis Forcing
NASA Astrophysics Data System (ADS)
Melinand, Benjamin
2018-02-01
In this paper, we study the asymptotic behavior of weakly transverse water-waves under a weak Coriolis forcing in the long wave regime. We derive the Boussinesq-Coriolis equations in this setting and we provide a rigorous justification of this model. Then, from these equations, we derive two other asymptotic models. When the Coriolis forcing is weak, we fully justify the rotation-modified Kadomtsev-Petviashvili equation (also called Grimshaw-Melville equation). When the Coriolis forcing is very weak, we rigorously justify the Kadomtsev-Petviashvili equation. This work provides the first mathematical justification of the KP approximation under a Coriolis forcing.
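For reference, the (non-rotating) Kadomtsev-Petviashvili equation that the paper justifies in the very-weak-rotation limit is commonly written in normalized variables as below; this is the standard textbook form (KP-II, the sign relevant to gravity-dominated water waves), not notation taken from the paper.

```latex
% Kadomtsev-Petviashvili equation in normalized variables (KP-II form):
\partial_x \left( \partial_t u + u\,\partial_x u + \partial_x^{3} u \right)
  + \partial_y^{2} u = 0
```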
Imaging 2D optical diffuse reflectance in skeletal muscle
NASA Astrophysics Data System (ADS)
Ranasinghesagara, Janaka; Yao, Gang
2007-04-01
We discovered a unique pattern of optical reflectance from fresh prerigor skeletal muscles, which cannot be described using existing theories. A numerical fitting function was developed to quantify the equiintensity contours of acquired reflectance images. Using this model, we studied the changes in the reflectance profile during stretching and the rigor process. We found that the prominent anisotropic features diminished after rigor completion. These results suggest that muscle sarcomere structures play an important role in modulating light propagation in whole muscle. When we incorporated sarcomere diffraction in a Monte Carlo model, the resulting reflectance profiles quantitatively resembled the experimental observations.
Archibald, Mandy M; Hartling, Lisa; Ali, Samina; Caine, Vera; Scott, Shannon D
2018-06-05
Although it is well established that family-centered education is critical to managing childhood asthma, the information needs of parents of children with asthma are not being met through current educational approaches. Patient-driven educational materials that leverage the power of the storytelling and the arts show promise in communicating health information and assisting in illness self-management. However, such arts-based knowledge translation approaches are in their infancy, and little is known about how to develop such tools for parents. This paper reports on the development of "My Asthma Diary" - an innovative knowledge translation tool based on rigorous research evidence and tailored to parents' asthma-related information needs. We used a multi-stage process to develop four eBook prototypes of "My Asthma Diary." We conducted formative research on parents' information needs and identified high quality research evidence on childhood asthma, and used these data to inform the development of the asthma eBooks. We established interdisciplinary consulting teams with health researchers, practitioners, and artists to help iteratively create the knowledge translation tools. We describe the iterative, transdisciplinary process of developing asthma eBooks which incorporates: (I) parents' preferences and information needs on childhood asthma, (II) quality evidence on childhood asthma and its management, and (III) the engaging and informative powers of storytelling and visual art as methods to communicate complex health information to parents. We identified four dominant methodological and procedural challenges encountered during this process: (I) working within an inter-disciplinary team, (II) quantity and ordering of information, (III) creating a composite narrative, and (IV) balancing actual and ideal management scenarios. We describe a replicable and rigorous multi-staged approach to developing a patient-driven, creative knowledge translation tool, which can be adapted for use with different populations and contexts. We identified specific procedural and methodological challenges that others conducting comparable work should consider, particularly as creative, patient-driven knowledge translation strategies continue to emerge across health disciplines.
Mass balance modelling of contaminants in river basins: a flexible matrix approach.
Warren, Christopher; Mackay, Don; Whelan, Mick; Fox, Kay
2005-12-01
A novel and flexible approach is described for simulating the behaviour of chemicals in river basins. A number (n) of river reaches are defined and their connectivity is described by entries in an n x n matrix. Changes in segmentation can be readily accommodated by altering the matrix entries, without the need for model revision. Two models are described. The simpler QMX-R model considers only advection and an overall loss due to the combined processes of volatilization, net transfer to sediment and degradation. The rate constant for the overall loss is derived from fugacity calculations for a single-segment system. The more rigorous QMX-F model performs fugacity calculations for each segment and explicitly includes the processes of advection, evaporation, water-sediment exchange and degradation in both water and sediment. In this way chemical exposure in all compartments (including equilibrium concentrations in biota) can be estimated. Both models are designed to serve as intermediate-complexity exposure assessment tools for river basins with relatively low data requirements. By considering the spatially explicit nature of emission sources and the changes in concentration which occur with transport in the channel system, the approach offers significant advantages over simple one-segment simulations while being more readily applicable than more sophisticated, highly segmented, GIS-based models.
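A minimal numerical sketch of the simpler, QMX-R-style bookkeeping described above (advection between connected reaches plus a lumped first-order loss, with connectivity held in an n x n matrix) might look like the following; the connectivity, flows, volumes, emissions and rate constant are invented for illustration, and the fugacity calculations of the QMX-F variant are not reproduced here.

```python
import numpy as np

def steady_state_concentrations(connectivity, flow_m3_per_h, volume_m3,
                                emission_g_per_h, k_loss_per_h):
    """Steady-state concentration in each reach for an advection + first-order
    loss model (QMX-R-like). connectivity[i, j] = 1 means reach j discharges
    into reach i; reaches must be numbered so upstream index < downstream index."""
    n = len(flow_m3_per_h)
    conc = np.zeros(n)              # g/m3
    outflux = np.zeros(n)           # g/h leaving each reach by advection
    for i in range(n):              # march downstream
        upstream_load = connectivity[i] @ outflux
        total_input = emission_g_per_h[i] + upstream_load
        conc[i] = total_input / (flow_m3_per_h[i] + k_loss_per_h * volume_m3[i])
        outflux[i] = flow_m3_per_h[i] * conc[i]
    return conc

# Illustrative 4-reach basin: reaches 0 and 1 are headwaters joining into 2, then 3.
A = np.array([[0, 0, 0, 0],
              [0, 0, 0, 0],
              [1, 1, 0, 0],
              [0, 0, 1, 0]])
Q = np.array([3.6e4, 1.8e4, 5.4e4, 5.4e4])     # m3/h
V = np.array([2.0e5, 1.0e5, 4.0e5, 4.0e5])     # m3
E = np.array([100.0, 0.0, 50.0, 0.0])          # g/h emitted directly to each reach
print(steady_state_concentrations(A, Q, V, E, k_loss_per_h=0.01))
```

Re-segmenting the basin then amounts to editing the matrix A and the corresponding flow, volume and emission vectors, which mirrors the flexibility claimed for the matrix approach.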
Space Shuttle Software Development and Certification
NASA Technical Reports Server (NTRS)
Orr, James K.; Henderson, Johnnie A
2000-01-01
Man-rated software, "software which is in control of systems and environments upon which human life is critically dependent," must be highly reliable. The Space Shuttle Primary Avionics Software System is an excellent example of such a software system. Lessons learned from more than 20 years of effort have identified basic elements that must be present to achieve this high degree of reliability. The elements include rigorous application of appropriate software development processes, use of trusted tools to support those processes, quantitative process management, and defect elimination and prevention. This presentation highlights methods used within the Space Shuttle project and raises questions that must be addressed to provide similar success in a cost-effective manner on future long-term projects where key application development tools are COTS rather than internally developed custom application development tools.
Trinh, Cong T.; Wlaschin, Aaron; Srienc, Friedrich
2010-01-01
Elementary Mode Analysis is a useful Metabolic Pathway Analysis tool to identify the structure of a metabolic network that links the cellular phenotype to the corresponding genotype. The analysis can decompose the intricate metabolic network composed of highly interconnected reactions into uniquely organized pathways. These pathways, each consisting of a minimal set of enzymes that can support steady-state operation of cellular metabolism, represent independent cellular physiological states. Such a pathway definition provides a rigorous basis to systematically characterize cellular phenotypes, metabolic network regulation, robustness, and fragility, and thereby facilitates understanding of cell physiology and the implementation of metabolic engineering strategies. This mini-review aims to overview the development and application of elementary mode analysis as a metabolic pathway analysis tool in studying cell physiology and as a basis of metabolic engineering. PMID:19015845
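The steady-state constraint underlying elementary mode analysis can be sketched numerically; note that null_space below only returns a basis of the admissible flux space, while enumerating the elementary modes themselves requires dedicated algorithms (for example the double-description approach used by tools such as efmtool or METATOOL). The toy stoichiometric matrix is invented for illustration.

```python
import numpy as np
from scipy.linalg import null_space

# Toy stoichiometric matrix N (metabolites x reactions); columns are reactions.
# This is an invented 3-metabolite, 5-reaction example, not a published network.
N = np.array([
    #  R1   R2   R3   R4   R5
    [  1,  -1,   0,  -1,   0],   # metabolite A
    [  0,   1,  -1,   0,   0],   # metabolite B
    [  0,   0,   1,   1,  -1],   # metabolite C
])

# Every elementary mode v satisfies the steady-state constraint N @ v = 0
# (together with irreversibility and a non-decomposability condition).
# null_space returns only an orthonormal basis of that flux space; the
# elementary modes themselves are a particular, generally larger, set of
# non-decomposable vectors within it.
basis = null_space(N)
print(basis.shape)               # (5, 2): a 2-dimensional steady-state flux space
print(np.allclose(N @ basis, 0)) # True: every basis vector balances all metabolites
```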
DeGeest, David Scott; Schmidt, Frank
2015-01-01
Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.
Unmet Need: Improving mHealth Evaluation Rigor to Build the Evidence Base.
Mookherji, Sangeeta; Mehl, Garrett; Kaonga, Nadi; Mechael, Patricia
2015-01-01
mHealth, the use of mobile technologies for health, is a growing element of health system activity globally, but evaluation of those activities remains quite scant and constitutes an important knowledge gap for advancing mHealth activities. In 2010, the World Health Organization and Columbia University implemented a small-scale survey to generate preliminary data on evaluation activities used by mHealth initiatives. The authors describe self-reported data from 69 projects in 29 countries. The majority (74%) reported some sort of evaluation activity, primarily nonexperimental in design (62%). The authors developed a 6-point scale of evaluation rigor comprising information on use of comparison groups, sample size calculation, data collection timing, and randomization. The mean score was low (2.4); half (47%) were conducting evaluations meeting a minimum threshold (4+) of rigor, indicating use of a comparison group, while fewer than 20% had randomized the mHealth intervention. The authors were unable to assess whether the rigor score was appropriate for the type of mHealth activity being evaluated. What was clear was that although most data came from mHealth pilot projects aimed at scale-up, few had designed evaluations that would support crucial decisions on whether to scale up and how. Whether the mHealth activity is a strategy to improve health or a tool for achieving intermediate outcomes that should lead to better health, mHealth evaluations must be improved to generate robust evidence for cost-effectiveness assessment and to allow for accurate identification of the contribution of mHealth initiatives to health systems strengthening and their impact on actual health outcomes.
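The abstract names the ingredients of the 6-point rigor scale (comparison groups, sample size calculation, data collection timing, randomization) but not the exact rubric, so the snippet below is a purely hypothetical sketch of how such a checklist-style score and the 4+ threshold might be operationalized; the item weights are invented and this is not the authors' instrument.

```python
def rigor_score(evaluation):
    """Hypothetical checklist-style score (0-6) loosely inspired by the items
    named in the abstract; the real instrument's items and weights are not
    reproduced here. `evaluation` is a dict of booleans describing one design."""
    score = 0
    score += 2 if evaluation.get("comparison_group") else 0          # assumed weight
    score += 1 if evaluation.get("sample_size_calculated") else 0
    score += 1 if evaluation.get("baseline_data_collected") else 0
    score += 1 if evaluation.get("endline_data_collected") else 0
    score += 1 if evaluation.get("randomized") else 0
    return score

example = {"comparison_group": True, "sample_size_calculated": False,
           "baseline_data_collected": True, "endline_data_collected": True,
           "randomized": False}
score = rigor_score(example)
print(score, "meets minimum threshold" if score >= 4 else "below threshold")
```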
Circuit-based versus full-wave modelling of active microwave circuits
NASA Astrophysics Data System (ADS)
Bukvić, Branko; Ilić, Andjelija Ž.; Ilić, Milan M.
2018-03-01
Modern full-wave computational tools enable rigorous simulations of linear parts of complex microwave circuits within minutes, taking into account all physical electromagnetic (EM) phenomena. Non-linear components and other discrete elements of the hybrid microwave circuit are then easily added within the circuit simulator. This combined full-wave and circuit-based analysis is a must in the final stages of the circuit design, although initial designs and optimisations are still faster and more comfortably done completely in the circuit-based environment, which offers real-time solutions at the expense of accuracy. However, due to insufficient information and a general lack of specific case studies, practitioners still struggle when choosing an appropriate analysis method or a component model, because different choices lead to different solutions, often with uncertain accuracy and unexplained discrepancies arising between simulations and measurements. As a case study, we here design a reconfigurable power amplifier using both a circuit-based solver and a full-wave EM solver. We compare numerical simulations with measurements on the manufactured prototypes, discuss the observed differences, point out the importance of de-embedding measured parameters and of appropriately modelling discrete components, and give specific recipes for good modelling practice.
NASA Astrophysics Data System (ADS)
Collins, P. C.; Haden, C. V.; Ghamarian, I.; Hayes, B. J.; Ales, T.; Penso, G.; Dixit, V.; Harlow, G.
2014-07-01
Electron beam direct manufacturing, synonymously known as electron beam additive manufacturing, along with other additive "3-D printing" manufacturing processes, is receiving widespread attention as a means of producing net-shape (or near-net-shape) components, owing to potential manufacturing benefits. Yet materials scientists know that differences in manufacturing processes often significantly influence the microstructure of even widely accepted materials and thus impact the properties and performance of a material in service. It is important to accelerate the understanding of the processing-structure-property relationship of materials produced via these novel approaches in a framework that considers performance in a statistically rigorous way. This article describes the development of a process model, the assessment of key microstructural features to be incorporated into a microstructure simulation model, a novel approach to extract a constitutive equation to predict tensile properties in Ti-6Al-4V (Ti-64), and a probabilistic approach to measure the fidelity of the property model against real data. This integrated approach will provide designers a tool to vary process parameters and understand the influence on performance, enabling design and optimization for these highly visible manufacturing approaches.
Oribe-Garcia, Iraia; Kamara-Esteban, Oihane; Martin, Cristina; Macarulla-Arenaza, Ana M; Alonso-Vicario, Ainhoa
2015-05-01
The planning of waste management strategies needs tools to support decisions at all stages of the process. Accurate quantification of the waste to be generated is essential both for daily management (short-term) and for the proper design of facilities (long-term). Designing without rigorous knowledge may have serious economic and environmental consequences. The present work aims at identifying relevant socio-economic features of municipalities regarding Household Waste (HW) generation by means of factor models. Factor models face two main drawbacks: data collection and the identification of relevant explanatory variables within a heterogeneous group. Grouping observations with similar characteristics may favour the deduction of more robust models. The methodology has been tested with Biscay Province because it stands out for having very different municipalities, ranging from very rural to urban ones. Two main models are developed, one for the overall province and a second one after clustering the municipalities. The results show that grouping municipalities by their specific characteristics improves the models in a very heterogeneous situation. The methodology has identified urban morphology, tourism activity, level of education and economic situation as the most influential characteristics in HW generation. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
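A compact sketch of the two-stage approach described above (first cluster municipalities on socio-economic features, then fit a separate waste-generation factor model per cluster) could look like the following with scikit-learn; the feature names and synthetic data are placeholders, not the Biscay Province dataset.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for the municipal dataset: columns represent socio-economic
# features (e.g. population density, tourism index, education level, income index).
X = rng.normal(size=(120, 4))
# Synthetic household-waste generation (kg per capita per year) with some structure.
y = 300 + 40 * X[:, 0] - 25 * X[:, 1] + rng.normal(scale=15, size=120)

# Stage 1: group municipalities with similar characteristics.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Stage 2: fit one factor model per cluster, which tends to be more robust than a
# single province-wide model when the municipalities are very heterogeneous.
for c in np.unique(clusters):
    mask = clusters == c
    model = LinearRegression().fit(X[mask], y[mask])
    r2 = model.score(X[mask], y[mask])
    print(f"cluster {c}: n={mask.sum()}, R^2={r2:.2f}, coefficients={model.coef_.round(1)}")
```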
Filtering Meteoroid Flights Using Multiple Unscented Kalman Filters
NASA Astrophysics Data System (ADS)
Sansom, E. K.; Bland, P. A.; Rutten, M. G.; Paxman, J.; Towner, M. C.
2016-11-01
Estimator algorithms are immensely versatile and powerful tools that can be applied to any problem where a dynamic system can be modeled by a set of equations and where observations are available. A well designed estimator enables system states to be optimally predicted and errors to be rigorously quantified. Unscented Kalman filters (UKFs) and interactive multiple models can be found in methods from satellite tracking to self-driving cars. The luminous trajectory of the Bunburra Rockhole fireball was observed by the Desert Fireball Network in mid-2007. The recorded data set is used in this paper to examine the application of these two techniques as a viable approach to characterizing fireball dynamics. The nonlinear, single-body system of equations, used to model meteoroid entry through the atmosphere, is challenged by gross fragmentation events that may occur. The incorporation of the UKF within an interactive multiple model smoother provides a likely solution for when fragmentation events may occur as well as providing a statistical analysis of the state uncertainties. In addition to these benefits, another advantage of this approach is its automatability for use within an image processing pipeline to facilitate large fireball data analyses and meteorite recoveries.
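As a sketch of how an unscented Kalman filter might be wired up for a greatly simplified single-body entry model, the snippet below uses the filterpy library's UKF interface; the one-dimensional dynamics, constant air density, noise levels and parameter values are illustrative simplifications and do not reproduce the paper's full state model or its interactive multiple-model smoother.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

DT = 0.03           # frame interval in seconds (illustrative)
RHO = 1.0e-5        # air density treated as constant here (kg/m^3, illustrative)

def fx(x, dt):
    """Greatly simplified single-body deceleration model.
    State: [along-track position (m), speed (m/s), drag parameter (m^2/kg)]."""
    pos, vel, drag = x
    acc = -drag * RHO * vel**2
    return np.array([pos + vel * dt, vel + acc * dt, drag])

def hx(x):
    """The camera network is assumed to observe along-track position only."""
    return np.array([x[0]])

points = MerweScaledSigmaPoints(n=3, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=3, dim_z=1, dt=DT, hx=hx, fx=fx, points=points)
ukf.x = np.array([0.0, 15000.0, 0.008])          # initial guess
ukf.P = np.diag([100.0**2, 500.0**2, 0.01**2])   # initial uncertainty
ukf.R = np.array([[50.0**2]])                    # measurement noise (m^2)
ukf.Q = np.diag([1.0, 10.0, 1e-8])               # process noise (illustrative)

# Feed in synthetic position observations and recover estimates with covariances,
# which is the "rigorously quantified error" benefit highlighted in the abstract.
true_x = np.array([0.0, 15000.0, 0.012])
rng = np.random.default_rng(1)
for _ in range(100):
    true_x = fx(true_x, DT)
    z = true_x[0] + rng.normal(scale=50.0)
    ukf.predict()
    ukf.update(np.array([z]))
print(ukf.x, np.sqrt(np.diag(ukf.P)))
```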
Assessments of species' vulnerability to climate change: From pseudo to science
Wade, Alisa A.; Hand, Brian K.; Kovach, Ryan; Muhlfeld, Clint C.; Waples, Robin S.; Luikart, Gordon
2017-01-01
Climate change vulnerability assessments (CCVAs) are important tools to plan for and mitigate potential impacts of climate change. However, CCVAs often lack scientific rigor, which can ultimately lead to poor conservation prioritization and associated ecological and economic costs. We discuss the need to improve comparability and consistency of CCVAs and either validate their findings or improve assessment of CCVA uncertainty and sensitivity to methodological assumptions.
ERIC Educational Resources Information Center
Scarupa, Harriet J., Ed.
2014-01-01
Mounting research evidence points to social and emotional skills as playing a central role in shaping student achievement, workplace readiness, and adult wellbeing. This report describes the rigorous, collaborative work undertaken by the Tauck Family Foundation and Child Trends, a national leader in measuring children's development and wellbeing,…
Phyllis C. Adams; Glenn A. Christensen
2012-01-01
A rigorous quality assurance (QA) process assures that the data and information provided by the Forest Inventory and Analysis (FIA) program meet the highest possible standards of precision, completeness, representativeness, comparability, and accuracy. FIA relies on its analysts to check the final data quality prior to release of a State's data to the national FIA...
When is good, good enough? Methodological pragmatism for sustainable guideline development.
Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C
2015-03-06
Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.
We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army’s acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
Manuel, Sharrón L; Johnson, Brian W; Frevert, Charles W; Duncan, Francesca E
2018-04-21
Immunohistochemistry (IHC) is a robust scientific tool whereby cellular components are visualized within a tissue, and this method has been and continues to be a mainstay for many reproductive biologists. IHC is highly informative if performed and interpreted correctly, but studies have shown that the general use and reporting of appropriate controls in IHC experiments is low. This omission of the scientific method can result in data that lacks rigor and reproducibility. In this editorial, we highlight key concepts in IHC controls and describe an opportunity for our field to partner with the Histochemical Society to adopt their IHC guidelines broadly as researchers, authors, ad hoc reviewers, editorial board members, and editors-in-chief. Such cross-professional society interactions will ensure that we produce the highest quality data as new technologies emerge that still rely upon the foundations of classic histological and immunohistochemical principles.
NASA Astrophysics Data System (ADS)
Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.
2017-12-01
Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and are likely applied even more widely in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined the impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus improved data and enhanced assumptions, on model outcomes and, ultimately, on study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.
Simple Tools to Facilitate Project Management of a Nursing Research Project.
Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret
2016-07-01
Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained. © The Author(s) 2016.
Comparing 2 National Organization-Level Workplace Health Promotion and Improvement Tools, 2013–2015
Lang, Jason E.; Davis, Whitney D.; Jones-Jack, Nkenge H.; Mukhtar, Qaiser; Lu, Hua; Acharya, Sushama D.; Molloy, Meg E.
2016-01-01
Creating healthy workplaces is becoming more common. Half of employers that have more than 50 employees offer some type of workplace health promotion program. Few employers implement comprehensive evidence-based interventions that reach all employees and achieve desired health and cost outcomes. A few organization-level assessment and benchmarking tools have emerged to help employers evaluate the comprehensiveness and rigor of their health promotion offerings. Even fewer tools exist that combine assessment with technical assistance and guidance to implement evidence-based practices. Our descriptive analysis compares 2 such tools, the Centers for Disease Control and Prevention’s Worksite Health ScoreCard and Prevention Partners’ WorkHealthy America, and presents data from both to describe workplace health promotion practices across the United States. These tools are reaching employers of all types (N = 1,797), and many employers are using a comprehensive approach (85% of those using WorkHealthy America and 45% of those using the ScoreCard), increasing program effectiveness and impact. PMID:27685429
Rigorous high-precision enclosures of fixed points and their invariant manifolds
NASA Astrophysics Data System (ADS)
Wittig, Alexander N.
The well established concept of Taylor Models is introduced, which offer highly accurate C0 enclosures of functional dependencies, combining high-order polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly non-linear dynamical systems. A method is proposed to extend the existing implementation of Taylor Models in COSY INFINITY from double precision coefficients to arbitrary precision coefficients. Great care is taken to maintain the highest efficiency possible by adaptively adjusting the precision of higher order coefficients in the polynomial expansion. High precision operations are based on clever combinations of elementary floating point operations yielding exact values for round-off errors. An experimental high precision interval data type is developed and implemented. Algorithms for the verified computation of intrinsic functions based on the High Precision Interval datatype are developed and described in detail. The application of these operations in the implementation of High Precision Taylor Models is discussed. An application of Taylor Model methods to the verification of fixed points is presented by verifying the existence of a period 15 fixed point in a near standard Henon map. Verification is performed using different verified methods such as double precision Taylor Models, High Precision intervals and High Precision Taylor Models. Results and performance of each method are compared. An automated rigorous fixed point finder is implemented, allowing the fully automated search for all fixed points of a function within a given domain. It returns a list of verified enclosures of each fixed point, optionally verifying uniqueness within these enclosures. An application of the fixed point finder to the rigorous analysis of beam transfer maps in accelerator physics is presented. Previous work done by Johannes Grote is extended to compute very accurate polynomial approximations to invariant manifolds of discrete maps of arbitrary dimension around hyperbolic fixed points. The algorithm presented allows for automatic removal of resonances occurring during construction. A method for the rigorous enclosure of invariant manifolds of continuous systems is introduced. Using methods developed for discrete maps, polynomial approximations of invariant manifolds of hyperbolic fixed points of ODEs are obtained. These approximations are outfit with a sharp error bound which is verified to rigorously contain the manifolds. While we focus on the three dimensional case, verification in higher dimensions is possible using similar techniques. Integrating the resulting enclosures using the verified COSY VI integrator, the initial manifold enclosures are expanded to yield sharp enclosures of large parts of the stable and unstable manifolds. To demonstrate the effectiveness of this method, we construct enclosures of the invariant manifolds of the Lorenz system and show pictures of the resulting manifold enclosures. To the best of our knowledge, these enclosures are the largest verified enclosures of manifolds in the Lorenz system in existence.
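As a very small illustration of the verified-arithmetic idea underpinning Taylor Models (every floating-point operation is enclosed by outward rounding, so the exact real-number result is guaranteed to lie inside the computed interval), here is a toy interval type in Python (3.9+ for math.nextafter); it is a didactic sketch, not the COSY INFINITY implementation and not a full Taylor Model with a polynomial part.

```python
import math

class Interval:
    """Toy verified interval: each operation rounds the lower bound down and the
    upper bound up by one unit in the last place, so the exact real result of the
    underlying operation is guaranteed to be enclosed. This mimics, in miniature,
    the rigorous remainder bookkeeping that Taylor Models attach to a polynomial."""
    def __init__(self, lo, hi=None):
        self.lo, self.hi = lo, lo if hi is None else hi

    def __add__(self, other):
        return Interval(math.nextafter(self.lo + other.lo, -math.inf),
                        math.nextafter(self.hi + other.hi, math.inf))

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(math.nextafter(min(products), -math.inf),
                        math.nextafter(max(products), math.inf))

    def __repr__(self):
        return f"[{self.lo!r}, {self.hi!r}]"

# The real number 0.1 is not representable in binary floating point, so we start
# from an interval one ulp wide that is guaranteed to contain it.
x = Interval(math.nextafter(0.1, -math.inf), math.nextafter(0.1, math.inf))
print(x + x + x)          # a rigorous enclosure of the real number 0.3
print(x * Interval(3.0))  # a rigorous enclosure of 0.3 obtained by multiplication
```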
Bellasio, Chandra; Beerling, David J; Griffiths, Howard
2016-06-01
The higher photosynthetic potential of C4 plants has led to extensive research over the past 50 years, spanning C4-dominated natural biomes, crops such as maize, and the evaluation of transferring C4 traits into C3 lineages. Photosynthetic gas exchange can be measured in air or in a 2% oxygen mixture using readily available commercial gas exchange and modulated PSII fluorescence systems. Interpretation of these data, however, requires an understanding (or the development) of various modelling approaches, which limits their use by non-specialists. In this paper we present an accessible summary of the theory behind the analysis and derivation of C4 photosynthetic parameters, and provide a freely available Excel Fitting Tool (EFT), making rigorous C4 data analysis accessible to a broader audience. Outputs include those defining C4 photochemical and biochemical efficiency, the rate of photorespiration, bundle sheath conductance to CO2 diffusion and the in vivo biochemical constants for PEP carboxylase. The EFT compares several methodological variants proposed by different investigators, allowing users to choose the level of complexity required to interpret data. We provide a complete analysis of gas exchange data on maize (as a model C4 organism and key global crop) to illustrate the approaches, their analysis and interpretation. © 2016 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chhabra, S.R.; Joachimiak, M.P.; Petzold, C.J.
Protein–protein interactions offer an insight into cellular processes beyond what may be obtained by the quantitative functional genomics tools of proteomics and transcriptomics. The aforementioned tools have been extensively applied to study E. coli and other aerobes and more recently to study the stress response behavior of Desulfovibrio vulgaris Hildenborough, a model anaerobe and sulfate reducer. In this paper we present the first attempt to identify protein-protein interactions in an obligate anaerobic bacterium. We used suicide vector-assisted chromosomal modification of 12 open reading frames encoded by this sulfate reducer to append an eight amino acid affinity tag to the carboxy-terminus of the chosen proteins. Three biological replicates of the ‘pulled-down’ proteins were separated and analyzed using liquid chromatography-mass spectrometry. Replicate agreement ranged between 35% and 69%. An interaction network among 12 bait and 90 prey proteins was reconstructed based on 134 bait-prey interactions computationally identified to be of high confidence. We discuss the biological significance of several unique metabolic features of D. vulgaris revealed by this protein-protein interaction data and the protein modifications that were observed. These include the distinct role of the putative carbon monoxide-induced hydrogenase, unique electron transfer routes associated with different oxidoreductases, and the possible role of methylation in regulating sulfate reduction.
Using Computational and Mechanical Models to Study Animal Locomotion
Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas
2012-01-01
Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: “Integrating living and physical systems.” PMID:22988026
Combining correlative and mechanistic habitat suitability models to improve ecological compensation.
Meineri, Eric; Deville, Anne-Sophie; Grémillet, David; Gauthier-Clerc, Michel; Béchet, Arnaud
2015-02-01
Only a few studies have shown positive impacts of ecological compensation on species dynamics affected by human activities. We argue that this is due to inappropriate methods used to forecast required compensation in environmental impact assessments. These assessments are mostly descriptive and only valid at limited spatial and temporal scales. However, habitat suitability models developed to predict the impacts of environmental changes on potential species' distributions should provide rigorous science-based tools for compensation planning. Here we describe the two main classes of predictive models: correlative models and individual-based mechanistic models. We show how these models can be used alone or synoptically to improve compensation planning. While correlative models are easier to implement, they tend to ignore underlying ecological processes and lack accuracy. On the contrary, individual-based mechanistic models can integrate biological interactions, dispersal ability and adaptation. Moreover, among mechanistic models, those considering animal energy balance are particularly efficient at predicting the impact of foraging habitat loss. However, mechanistic models require more field data compared to correlative models. Hence we present two approaches which combine both methods for compensation planning, especially in relation to the spatial scale considered. We show how the availability of biological databases and software enabling fast and accurate population projections could be advantageously used to assess ecological compensation requirement efficiently in environmental impact assessments. © 2014 The Authors. Biological Reviews © 2014 Cambridge Philosophical Society.
NASA Astrophysics Data System (ADS)
Kaskhedikar, Apoorva Prakash
According to the U.S. Energy Information Administration, commercial buildings represent about 40% of the United States' energy consumption, of which office buildings consume a major portion. Gauging the extent to which an individual building consumes energy in excess of its peers is the first step in initiating energy efficiency improvement. Energy benchmarking offers an initial assessment of building energy performance without rigorous evaluation. Energy benchmarking tools based on the Commercial Buildings Energy Consumption Survey (CBECS) database are investigated in this thesis. This study proposes a new benchmarking methodology based on decision trees, in which a relationship between energy use intensities (EUI) and building parameters (continuous and categorical) is developed for different building types. The methodology was applied to the medium office and school building types contained in the CBECS database. The Random Forest technique was used to find the most influential parameters that impact building energy use intensities, and significant correlations between EUIs and CBECS variables were then identified. Other than floor area, some of the important variables were number of workers, location, number of PCs and main cooling equipment. The coefficient of variation was used to evaluate the effectiveness of the new model. The customization technique proposed in this thesis was compared with another benchmarking model widely used by building owners and designers, namely ENERGY STAR's Portfolio Manager. That tool relies on standard linear regression, which can handle only continuous variables. The proposed model uses data mining techniques and was found to perform slightly better than Portfolio Manager. The broader impact of the new benchmarking methodology is that it allows important categorical variables to be identified and then incorporated in a local, as opposed to a global, model framework for EUI pertinent to the building type. The ability to identify and rank the important variables is of great importance for the practical implementation of benchmarking tools, which rely on query-based building and HVAC variable filters specified by the user.
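The variable-screening step described above (using a Random Forest to rank CBECS predictors of EUI, including categorical ones) might be prototyped as follows with scikit-learn; the column names and synthetic records merely stand in for the actual CBECS extract and the thesis's modelling choices.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 400

# Synthetic stand-in for a CBECS extract (column names are illustrative).
df = pd.DataFrame({
    "floor_area_sqft": rng.lognormal(11, 0.6, n),
    "num_workers": rng.integers(5, 500, n),
    "num_pcs": rng.integers(5, 600, n),
    "climate_zone": rng.integers(1, 6, n).astype(str),               # categorical
    "main_cooling": rng.choice(["chiller", "packaged", "none"], n),  # categorical
})
eui = (30 + 0.05 * df["num_pcs"] + 0.04 * df["num_workers"]
       + rng.normal(0, 5, n))                                        # kBtu/sqft/yr, synthetic

# One-hot encode categorical variables so the tree ensemble can use them,
# then rank predictors by impurity-based importance.
X = pd.get_dummies(df, columns=["climate_zone", "main_cooling"])
forest = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, eui)
importance = pd.Series(forest.feature_importances_, index=X.columns)
print(importance.sort_values(ascending=False).head(8))

# A building's benchmark: its actual EUI compared with the model's peer-based prediction.
predicted = forest.predict(X.iloc[[0]])[0]
print(f"building 0: actual={eui.iloc[0]:.1f}, peer-model prediction={predicted:.1f}")
```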
ERIC Educational Resources Information Center
Sworder, Steven C.
2007-01-01
An experimental two-track intermediate algebra course was offered at Saddleback College, Mission Viejo, CA, between the Fall, 2002 and Fall, 2005 semesters. One track was modeled after the existing traditional California community college intermediate algebra course and the other track was a less rigorous intermediate algebra course in which the…
Valdez, Joshua; Rueschman, Michael; Kim, Matthew; Redline, Susan; Sahoo, Satya S
2016-10-01
Extraction of structured information from biomedical literature is a complex and challenging problem due to the complexity of the biomedical domain and the lack of appropriate natural language processing (NLP) techniques. High quality domain ontologies model both data and metadata information at a fine level of granularity, which can be effectively used to accurately extract structured information from biomedical text. Extraction of provenance metadata, which describes the history or source of information, from published articles is an important task to support scientific reproducibility. Reproducibility of results reported by previous research studies is a foundational component of scientific advancement. This is highlighted by the recent initiative by the US National Institutes of Health called "Principles of Rigor and Reproducibility". In this paper, we describe an effective approach to extract provenance metadata from published biomedical research literature using an ontology-enabled NLP platform as part of the Provenance for Clinical and Healthcare Research (ProvCaRe). The ProvCaRe-NLP tool extends the clinical Text Analysis and Knowledge Extraction System (cTAKES) platform using both provenance and biomedical domain ontologies. We demonstrate the effectiveness of the ProvCaRe-NLP tool using a corpus of 20 peer-reviewed publications. The results of our evaluation demonstrate that the ProvCaRe-NLP tool has significantly higher recall in extracting provenance metadata as compared to existing NLP pipelines such as MetaMap.
Delirium diagnosis, screening and management
Lawlor, Peter G.; Bush, Shirley H.
2014-01-01
Purpose of review Our review focuses on recent developments across many settings regarding the diagnosis, screening and management of delirium, so as to inform these aspects in the context of palliative and supportive care. Recent findings Delirium diagnostic criteria have been updated in the long-awaited Diagnostic and Statistical Manual of Mental Disorders, fifth edition. Studies suggest that poor recognition of delirium relates to its clinical characteristics, inadequate interprofessional communication and lack of systematic screening. Validation studies are published for cognitive and observational tools to screen for delirium. Formal guidelines for delirium screening and management have been rigorously developed for intensive care, and may serve as a model for other settings. Given that palliative sedation is often required for the management of refractory delirium at the end of life, a version of the Richmond Agitation-Sedation Scale, modified for palliative care, has undergone preliminary validation. Summary Although formal systematic delirium screening with brief but sensitive tools is strongly advocated for patients in palliative and supportive care, it requires critical evaluation in terms of clinical outcomes, including patient comfort. Randomized controlled trials are needed to inform the development of guidelines for the management of delirium in this setting. PMID:25004177
Gorini, Alessandra; Mazzocco, Ketti; Pravettoni, Gabriella
2015-01-01
Due to the lack of other treatment options, patient candidates for participation in phase I clinical trials are considered the most vulnerable, and many ethical concerns have emerged regarding the informed consent process used in the experimental design of such trials. Starting with these considerations, this nonsystematic review is aimed at analyzing the decision-making processes underlying patients' decision about whether to participate (or not) in phase I trials in order to clarify the cognitive and emotional aspects most strongly implicated in this decision. Considering that there is no uniform decision calculus and that many different variables other than the patient-physician relationship (including demographic, clinical, and personal characteristics) may influence patients' preferences for and processing of information, we conclude that patients' informed decision-making can be facilitated by creating a rigorously developed, calibrated, and validated computer tool modeled on each single patient's knowledge, values, and emotional and cognitive decisional skills. Such a tool will also help oncologists to provide tailored medical information that is useful to improve the shared decision-making process, thereby possibly increasing patient participation in clinical trials. © 2015 S. Karger AG, Basel.
Music Therapy for Posttraumatic Stress in Adults: A Theoretical Review
Landis-Shack, Nora; Heinz, Adrienne J.; Bonn-Miller, Marcel O.
2017-01-01
Music therapy has been employed as a therapeutic intervention to facilitate healing across a variety of clinical populations. There is theoretical and empirical evidence to suggest that individuals with trauma exposure and Posttraumatic Stress Disorder (PTSD), a condition characterized by enduring symptoms of distressing memory intrusions, avoidance, emotional disturbance, and hyperarousal, may derive benefits from music therapy. The current narrative review describes the practice of music therapy and presents a theoretically-informed assessment and model of music therapy as a tool for addressing symptoms of PTSD. The review also presents key empirical studies that support the theoretical assessment. Social, cognitive, and neurobiological mechanisms (e.g., community building, emotion regulation, increased pleasure, anxiety reduction) that promote music therapy’s efficacy as an adjunctive treatment for individuals with posttraumatic stress are discussed. It is concluded that music therapy may be a useful therapeutic tool to reduce symptoms and improve functioning among individuals with trauma exposure and PTSD, though more rigorous empirical study is required. In addition, music therapy may help foster resilience and engage individuals who struggle with stigma associated with seeking professional help. Practical recommendations for incorporating music therapy into clinical practice are offered along with several suggestions for future research. PMID:29290641
Application of Six Sigma/CAP methodology: controlling blood-product utilization and costs.
Neri, Robert A; Mason, Cindy E; Demko, Lisa A
2008-01-01
Blood-product components are a limited commodity whose cost is rising. Many patients benefit from their use, but patients who receive transfusions face an unnecessary increased risk for developing infections; fatal, febrile, or allergic reactions; and circulatory overload. To improve patient care, safety, and resource stewardship, transfusion practices must be evaluated for appropriateness (Wilson et al. 2002). A multihospital health system undertook a rigorous study of blood-product utilization patterns and management processes to address cost-control problems in the organization. The system leveraged two process improvement tools widely implemented outside of the healthcare industry: (1) Six Sigma methodology to identify blood-utilization drivers and to standardize transfusion practice, and (2) the Change Acceleration Process (CAP) model to drive effective change. The initiative resulted in a decreased rate of inappropriate transfusions of packed red blood cells from 16 percent to less than 5 percent, improved clinician use of a blood-component order form, establishment of internal benchmarks, enhanced laboratory-to-clinician communication, and better blood-product expense control. The project further demonstrated how out-of-industry tools and methodologies can be adopted, adapted, and systematically applied to generate positive change (Black and Revere 2006).
NASA Astrophysics Data System (ADS)
Faria, J. M.; Mahomad, S.; Silva, N.
2009-05-01
The deployment of complex safety-critical applications requires rigorous techniques and powerful tools for both the development and V&V stages. Model-based technologies are increasingly being used to develop safety-critical software and, arguably, turning to them can bring significant benefits to such processes, along with new challenges. This paper presents the results of a research project in which we extended current V&V methodologies to UML/SysML models, aiming to address the associated validation demands. Two quite different but complementary approaches were investigated: (i) model checking and (ii) the extraction of robustness test cases from the same models. These two approaches do not overlap and, when combined, provide a wider-reaching model/design validation capability than either one alone, thus offering improved safety assurance. Results are encouraging, even though they fell short of the desired outcome for model checking and are not yet fully mature for robustness test-case extraction. In the case of model checking, it was verified that the automatic model validation process can become fully operational, and even expanded in scope, once tool vendors (inevitably) improve XMI standard interoperability. The robustness test-case extraction methodology produced interesting early results but needs further systematisation and consolidation to yield results more predictably and to reduce its reliance on expert heuristics. Finally, further improvements and follow-on research directions were apparent for both approaches: circumventing current XMI interoperability limitations on the one hand, and bringing test-case specification onto the same graphical level as the models themselves, so that the generation of executable test cases from standard UML notation can be automated, on the other.
Semantically-Rigorous Systems Engineering Modeling Using Sysml and OWL
NASA Technical Reports Server (NTRS)
Jenkins, J. Steven; Rouquette, Nicolas F.
2012-01-01
The Systems Modeling Language (SysML) has found wide acceptance as a standard graphical notation for the domain of systems engineering. SysML subsets and extends the Unified Modeling Language (UML) to define conventions for expressing structural, behavioral, and analytical elements, and relationships among them. SysML-enabled modeling tools are available from multiple providers, and have been used for diverse projects in military aerospace, scientific exploration, and civil engineering. The Web Ontology Language (OWL) has found wide acceptance as a standard notation for knowledge representation. OWL-enabled modeling tools are available from multiple providers, as well as auxiliary assets such as reasoners and application programming interface libraries, etc. OWL has been applied to diverse projects in a wide array of fields. While the emphasis in SysML is on notation, SysML inherits (from UML) a semantic foundation that provides for limited reasoning and analysis. UML's partial formalization (FUML), however, does not cover the full semantics of SysML, which is a substantial impediment to developing high confidence in the soundness of any conclusions drawn therefrom. OWL, by contrast, was developed from the beginning on formal logical principles, and consequently provides strong support for verification of consistency and satisfiability, extraction of entailments, conjunctive query answering, etc. This emphasis on formal logic is counterbalanced by the absence of any graphical notation conventions in the OWL standards. Consequently, OWL has had only limited adoption in systems engineering. The complementary strengths and weaknesses of SysML and OWL motivate an interest in combining them in such a way that we can benefit from the attractive graphical notation of SysML and the formal reasoning of OWL. This paper describes an approach to achieving that combination.
Satellite Re-entry Modeling and Uncertainty Quantification
NASA Astrophysics Data System (ADS)
Horsley, M.
2012-09-01
LEO trajectory modeling is a fundamental aerospace capability and has applications in many areas of aerospace, such as maneuver planning, sensor scheduling, re-entry prediction, collision avoidance, risk analysis, and formation flying. Somewhat surprisingly, modeling the trajectory of an object in low Earth orbit is still a challenging task. This is primarily due to the large uncertainty in the upper atmospheric density, about 15-20% (1-sigma) for most thermosphere models. Other contributions come from our inability to precisely model future solar and geomagnetic activities, the potentially unknown shape, material construction and attitude history of the satellite, and intermittent, noisy tracking data. Current methods to predict a satellite's re-entry trajectory typically involve making a single prediction, with the uncertainty dealt with in an ad-hoc manner, usually based on past experience. However, due to the extreme speed of a LEO satellite, even small uncertainties in the re-entry time translate into a very large uncertainty in the location of the re-entry event. Currently, most methods simply update the re-entry estimate on a regular basis. This results in a wide range of estimates that are literally spread over the entire globe. With no understanding of the underlying distribution of potential impact points, the sequence of impact points predicted by the current methodology is largely useless until just a few hours before re-entry. This paper will discuss the development of a set of High Performance Computing (HPC)-based capabilities to support near real-time quantification of the uncertainty inherent in uncontrolled satellite re-entries. An appropriate management of the uncertainties is essential for a rigorous treatment of the re-entry/LEO trajectory problem. The development of HPC-based tools for re-entry analysis is important as it will allow a rigorous and robust approach to risk assessment by decision makers in an operational setting. Uncertainty quantification results from the recent uncontrolled re-entry of the Phobos-Grunt satellite will be presented and discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
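To illustrate why small re-entry-time uncertainties translate into globe-spanning impact uncertainty, the following toy Monte Carlo sketch (not the HPC capabilities described above) propagates an assumed 15% (1-sigma) atmospheric-density uncertainty into an along-track impact spread; the nominal re-entry time and the inverse scaling with density are simplifying assumptions.

```python
# Toy Monte Carlo: map density uncertainty -> re-entry time spread -> along-track
# impact spread for an object moving at roughly 7.8 km/s.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 10_000

nominal_reentry_hours = 48.0                     # assumed nominal time to re-entry
density_factor = rng.normal(1.0, 0.15, n_samples)  # ~15% 1-sigma density error
reentry_hours = nominal_reentry_hours / density_factor  # crude inverse scaling

orbital_speed_km_s = 7.8
time_spread_s = (reentry_hours - reentry_hours.mean()) * 3600.0
along_track_spread_km = time_spread_s * orbital_speed_km_s

print(f"re-entry time 1-sigma: {reentry_hours.std():.1f} h")
print(f"along-track 1-sigma:   {along_track_spread_km.std():.0f} km")
```

Even a few hours of timing uncertainty corresponds to tens of thousands of kilometers along track, which is why single-point predictions are of little use far from the event.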
Quality in physical therapist clinical education: a systematic review.
McCallum, Christine A; Mosher, Peter D; Jacobson, Peri J; Gallivan, Sean P; Giuffre, Suzanne M
2013-10-01
Many factors affect student learning throughout the clinical education (CE) component of professional (entry-level) physical therapist education curricula. Physical therapist education programs (PTEPs) manage CE, yet the material and human resources required to provide CE are generally overseen by community-based physical therapist practices. The purposes of this systematic review were: (1) to examine how the construct of quality is defined in CE literature and (2) to determine the methodological rigor of the available evidence on quality in physical therapist CE. This study was a systematic review of English-language journals using the American Physical Therapy Association's Open Door Portal to Evidence-Based Practice as the computer search engine. The search was categorized using terms for physical therapy and quality and for CE pedagogy and models or roles. Summary findings were characterized by 5 primary themes and 14 subthemes using a qualitative-directed content analysis. Fifty-four articles were included in the study. The primary quality themes were: CE framework, CE sites, structure of CE, assessment in CE, and CE faculty. The methodological rigor of the studies was critically appraised using a binary system based on the McMaster appraisal tools. Scores ranged from 3 to 14. Publication bias and outcome reporting bias may be inherent limitations to the results. The review found inconclusive evidence about what constitutes quality or best practice for physical therapist CE. Five key constructs of CE were identified that, when aggregated, could construe quality.
Practice guidelines for program evaluation in community-based rehabilitation.
Grandisson, Marie; Hébert, Michèle; Thibeault, Rachel
2017-06-01
This paper proposes practice guidelines to evaluate community-based rehabilitation (CBR) programs. These were developed through a rigorous three-phase research process including a literature review on good practices in CBR program evaluation, a field study during which a South African CBR program was evaluated, and a Delphi study to generate consensus among a highly credible panel of CBR experts from a wide range of backgrounds and geographical areas. The 10 guidelines developed are summarized into a practice model highlighting key features of sound CBR program evaluation. They strongly indicate that sound CBR evaluations are those that give a voice and as much control as possible to the most affected groups, embrace the challenge of diversity, and foster use of evaluation processes and findings through a rigorous, collaborative and empowering approach. The practice guidelines should facilitate CBR evaluation decisions with respect to facilitating an evaluation process, using frameworks and designing methods. Implications for rehabilitation Ten practice guidelines provide guidance to facilitate sound community-based rehabilitation (CBR) program evaluation decisions. Key indications of good practice include: • being as participatory and empowering as possible; • ensuring that all, including the most affected, have a real opportunity to share their thoughts; • highly considering mixed methods and participatory tools; • adapting to fit evaluation context, local culture and language(s); • defining evaluation questions and reporting findings using shared CBR language when possible, which the framework offered may facilitate.
Biological Conversion of Sugars to Hydrocarbons Technology Pathway
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Ryan; Biddy, Mary J.; Tan, Eric
2013-03-31
In support of the Bioenergy Technologies Office, the National Renewable Energy Laboratory (NREL) and the Pacific Northwest National Laboratory (PNNL) are undertaking studies of biomass conversion technologies to identify barriers and target research toward reducing conversion costs. Process designs and preliminary economic estimates for each of these pathway cases were developed using rigorous modeling tools (Aspen Plus and Chemcad). These analyses incorporated the best information available at the time of development, including data from recent pilot and bench-scale demonstrations, collaborative industrial and academic partners, and published literature and patents. This technology pathway case investigates the biological conversion of biomass-derived sugars to hydrocarbon biofuels, utilizing data from recent literature references and information consistent with recent pilot scale demonstrations at NREL. Technical barriers and key research needs have been identified that should be pursued for the pathway to become competitive with petroleum-derived gasoline, diesel and jet range hydrocarbon blendstocks.
An improved method for large-scale preparation of negatively and positively supercoiled plasmid DNA.
Barth, Marita; Dederich, Debra; Dedon, Peter
2009-07-01
A rigorous understanding of the biological function of superhelical tension in cellular DNA requires the development of new tools and model systems for study. To this end, an ethidium bromide-free method has been developed to prepare large quantities of either negatively or positively supercoiled plasmid DNA. The method is based upon the known effects of ionic strength on the direction of binding of DNA to an archaeal histone, rHMfB, with low and high salt concentrations leading to positive and negative DNA supercoiling, respectively. In addition to fully optimized conditions for large-scale (>500 microg) supercoiling reactions, the method is advantageous in that it avoids the use of mutagenic ethidium bromide, is applicable to chemically modified plasmid DNA substrates, and produces both positively and negatively supercoiled DNA using a single set of reagents.
Adaptive Optics Images of the Galactic Center: Using Empirical Noise-maps to Optimize Image Analysis
NASA Astrophysics Data System (ADS)
Albers, Saundra; Witzel, Gunther; Meyer, Leo; Sitarski, Breann; Boehle, Anna; Ghez, Andrea M.
2015-01-01
Adaptive Optics images are one of the most important tools in studying our Galactic Center. In-depth knowledge of the noise characteristics is crucial to optimally analyze this data. Empirical noise estimates - often represented by a constant value for the entire image - can be greatly improved by computing the local detector properties and photon noise contributions pixel by pixel. To comprehensively determine the noise, we create a noise model for each image using the three main contributors—photon noise of stellar sources, sky noise, and dark noise. We propagate the uncertainties through all reduction steps and analyze the resulting map using Starfinder. The estimation of local noise properties helps to eliminate fake detections while improving the detection limit of fainter sources. We predict that a rigorous understanding of noise allows a more robust investigation of the stellar dynamics in the center of our Galaxy.
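A minimal sketch of the kind of per-pixel noise model described above, combining photon, sky, and detector noise in quadrature; the gain, sky level, and dark/read values are assumptions for illustration, not the authors' pipeline parameters.

```python
# Sketch of a per-pixel noise map: photon noise from stellar flux, sky noise,
# and detector dark/read noise added in quadrature (all values illustrative).
import numpy as np

rng = np.random.default_rng(3)
image = rng.poisson(lam=200.0, size=(64, 64)).astype(float)  # counts in e-

gain = 1.0             # e-/ADU, assumed
sky_level = 30.0       # e- per pixel, assumed
dark_plus_read = 5.0   # e- RMS per pixel, assumed

photon_var = np.clip(image, 0, None) / gain   # Poisson variance ~ signal
sky_var = sky_level / gain
detector_var = dark_plus_read ** 2

noise_map = np.sqrt(photon_var + sky_var + detector_var)
snr_map = image / noise_map
print("median per-pixel SNR:", np.median(snr_map))
```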
The Jungle Universe: coupled cosmological models in a Lotka-Volterra framework
NASA Astrophysics Data System (ADS)
Perez, Jérôme; Füzfa, André; Carletti, Timoteo; Mélot, Laurence; Guedezounme, Lazare
2014-06-01
In this paper, we exploit the fact that the dynamics of homogeneous and isotropic Friedmann-Lemaître universes is a special case of a generalized Lotka-Volterra system in which the competing species are the barotropic fluids filling the Universe. Without coupling between those fluids, the Lotka-Volterra formulation offers a pedagogical and simple way to interpret usual Friedmann-Lemaître cosmological dynamics. A natural and physical coupling between cosmological fluids is proposed which preserves the structure of the dynamical equations. Using the standard tools of Lotka-Volterra dynamics, we obtain the general Lyapunov function of the system when one of the fluids is coupled to dark energy. This provides, in a rigorous form, a generic asymptotic behavior for cosmic expansion in the presence of coupled species, beyond the standard de Sitter, Einstein-de Sitter and Milne cosmologies. Finally, we conjecture that chaos can appear for at least four interacting fluids.
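For readers unfamiliar with the formulation, the sketch below integrates a generic generalized Lotka-Volterra system with SciPy; the growth rates and interaction matrix are arbitrary illustrative values and do not reproduce the paper's cosmological coupling.

```python
# Generic generalized Lotka-Volterra integration, dx_i/dt = x_i (r_i + sum_j A_ij x_j),
# as an illustration of the dynamical-systems tools invoked above.
import numpy as np
from scipy.integrate import solve_ivp

r = np.array([1.0, 0.5, -0.2])            # "growth" rates of three fluids (arbitrary)
A = np.array([[-1.0, -0.3, -0.1],         # interaction matrix (arbitrary)
              [-0.2, -1.0, -0.4],
              [-0.1, -0.2, -1.0]])

def glv(t, x):
    return x * (r + A @ x)

sol = solve_ivp(glv, (0.0, 50.0), [0.4, 0.3, 0.3], dense_output=True)
print("late-time densities:", sol.y[:, -1])
```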
Automated quantification of pancreatic β-cell mass
Golson, Maria L.; Bush, William S.
2014-01-01
β-Cell mass is a parameter commonly measured in studies of islet biology and diabetes. However, the rigorous quantification of pancreatic β-cell mass using conventional histological methods is a time-consuming process. Rapidly evolving virtual slide technology with high-resolution slide scanners and newly developed image analysis tools has the potential to transform β-cell mass measurement. To test the effectiveness and accuracy of this new approach, we assessed pancreata from normal C57Bl/6J mice and from mouse models of β-cell ablation (streptozotocin-treated mice) and β-cell hyperplasia (leptin-deficient mice), using a standardized systematic sampling of pancreatic specimens. Our data indicate that automated analysis of virtual pancreatic slides is highly reliable and yields results consistent with those obtained by conventional morphometric analysis. This new methodology will allow investigators to dramatically reduce the time required for β-cell mass measurement by automating high-resolution image capture and analysis of entire pancreatic sections. PMID:24760991
Simic, Vladimir
2016-06-01
As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (e.g., 40 million units were generated in 2010), there is strong motivation to effectively manage this fast-growing waste flow. Intensive work on the management of ELVs is necessary in order to more successfully tackle this important environmental challenge. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations. The proposed model can incorporate various types of uncertainty information in the modeling process. The complex relationships between different ELV management sub-systems are successfully addressed. Particularly, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potentials and applicability of the proposed model. Various constraint-violation probability levels are examined in detail. Influences of parameter uncertainty on model solutions are thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating system constraints. The formulated model is able to tackle a hard ELV management problem under existing uncertainty. The presented model has advantages in providing bases for determining long-term ELV management plans with desired compromises between the economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting the generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.
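As a toy illustration of the chance-constrained idea underlying the model, the sketch below converts a single linear constraint with a normally distributed capacity into its deterministic equivalent and solves it with SciPy; all coefficients (profits, processing times, capacity statistics) are invented for illustration and are unrelated to the paper's interval-parameter formulation.

```python
# Toy single-constraint chance-constrained program: require the processing-capacity
# constraint to hold with probability 1 - alpha, which tightens the nominal RHS.
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

profit = np.array([40.0, 55.0])      # profit per recycled unit (assumed)
use = np.array([[2.0, 3.0]])         # processing hours per unit (assumed)

b_mean, b_std, alpha = 1000.0, 100.0, 0.05
b_det = b_mean - norm.ppf(1 - alpha) * b_std   # deterministic-equivalent capacity

res = linprog(c=-profit, A_ub=use, b_ub=[b_det], bounds=[(0, None)] * 2)
print("plan:", res.x, "expected profit:", -res.fun)
```

Raising the reliability level (lowering alpha) shrinks the usable capacity and hence the attainable profit, which mirrors the trade-off between economic efficiency and system reliability discussed above.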
Derivation of phase functions from multiply scattered sunlight transmitted through a hazy atmosphere
NASA Technical Reports Server (NTRS)
Weinman, J. A.; Twitty, J. T.; Browning, S. R.; Herman, B. M.
1975-01-01
The intensity of sunlight multiply scattered in model atmospheres is derived from the equation of radiative transfer by an analytical small-angle approximation. The approximate analytical solutions are compared to rigorous numerical solutions of the same problem. Results obtained from an aerosol-laden model atmosphere are presented. Agreement between the rigorous and the approximate solutions is found to be within a few per cent. The analytical solution to the problem which considers an aerosol-laden atmosphere is then inverted to yield a phase function which describes a single scattering event at small angles. The effect of noisy data on the derived phase function is discussed.
Fast synthesis of topographic mask effects based on rigorous solutions
NASA Astrophysics Data System (ADS)
Yan, Qiliang; Deng, Zhijie; Shiely, James
2007-10-01
Topographic mask effects can no longer be ignored at technology nodes of 45 nm, 32 nm and beyond. As feature sizes become comparable to the mask topographic dimensions and the exposure wavelength, the popular thin mask model breaks down, because the mask transmission no longer follows the layout. A reliable mask transmission function has to be derived from Maxwell equations. Unfortunately, rigorous solutions of Maxwell equations are only manageable for limited field sizes, but impractical for full-chip optical proximity corrections (OPC) due to the prohibitive runtime. Approximation algorithms are in demand to achieve a balance between acceptable computation time and tolerable errors. In this paper, a fast algorithm is proposed and demonstrated to model topographic mask effects for OPC applications. The ProGen Topographic Mask (POTOMAC) model synthesizes the mask transmission functions out of small-sized Maxwell solutions from a finite-difference time-domain (FDTD) engine, an industry-leading rigorous simulator of topographic mask effects from SOLID-E. The integral framework presents a seamless solution to the end user. Preliminary results indicate the overhead introduced by POTOMAC is contained within the same order of magnitude in comparison to the thin mask approach.
Utility of distributed hydrologic and water quality models for watershed management and sustainability studies should be accompanied by rigorous model uncertainty analysis. However, the use of complex watershed models primarily follows the traditional {calibrate/validate/predict}...
Mechanical properties of frog skeletal muscles in iodoacetic acid rigor.
Mulvany, M J
1975-01-01
1. Methods have been developed for describing the length:tension characteristics of frog skeletal muscles which go into rigor at 4 degrees C following iodoacetic acid poisoning either in the presence of Ca2+ (Ca-rigor) or its absence (Ca-free-rigor). 2. Such rigor muscles showed less resistance to slow stretch (slow rigor resistance) than to fast stretch (fast rigor resistance). The slow and fast rigor resistances of Ca-free-rigor muscles were much lower than those of Ca-rigor muscles. 3. The slow rigor resistance of Ca-rigor muscles was proportional to the amount of overlap between the contractile filaments present when the muscles were put into rigor. 4. Withdrawing Ca2+ from Ca-rigor muscles (induced-Ca-free rigor) reduced their slow and fast rigor resistances. Readdition of Ca2+ (but not Mg2+, Mn2+ or Sr2+) reversed the effect. 5. The slow and fast rigor resistances of Ca-rigor muscles (but not of Ca-free-rigor muscles) decreased with time. 6. The sarcomere structure of Ca-rigor and induced-Ca-free rigor muscles stretched by 0.2 l0 was destroyed in proportion to the amount of stretch, but the lengths of the remaining intact sarcomeres were essentially unchanged. This suggests that there had been a successive yielding of the weakest sarcomeres. 7. The difference between the slow and fast rigor resistances and the effect of calcium on these resistances are discussed in relation to possible variations in the strength of crossbridges between the thick and thin filaments. PMID:1082023
A Rigorous Investigation on the Ground State of the Penson-Kolb Model
NASA Astrophysics Data System (ADS)
Yang, Kai-Hua; Tian, Guang-Shan; Han, Ru-Qi
2003-05-01
By using either numerical calculations or analytical methods, such as the bosonization technique, the ground state of the Penson-Kolb model has been previously studied by several groups. Some physicists argued that, as far as the existence of superconductivity in this model is concerned, it is canonically equivalent to the negative-U Hubbard model. However, others did not agree. In the present paper, we investigate this model by an independent and rigorous approach. We show that the ground state of the Penson-Kolb model is nondegenerate and has a nonvanishing overlap with the ground state of the negative-U Hubbard model. Furthermore, we also show that the ground states of both models have the same good quantum numbers and may have superconducting long-range order at the same momentum q = 0. Our results support the equivalence between these models. The project was partially supported by the Special Funds for Major State Basic Research Projects (G20000365) and the National Natural Science Foundation of China under Grant No. 10174002.
HEMODOSE: A Set of Multi-parameter Biodosimetry Tools
NASA Technical Reports Server (NTRS)
Hu, Shaowen; Blakely, William F.; Cucinotta, Francis A.
2014-01-01
After the events of September 11, 2001 and recent events at the Fukushima reactors in Japan, there is increasing concern about the occurrence of nuclear and radiological terrorism or accidents that may result in large numbers of casualties in densely populated areas. To guide medical personnel in their clinical decisions for effective medical management and treatment of the exposed individuals, biological markers are usually applied to examine radiation-induced changes at different biological levels. Among these, the peripheral blood cell counts are widely used to assess the extent of radiation-induced injury. This is due to the fact that the hematopoietic system is the most vulnerable part of the human body to radiation damage. Particularly, the lymphocyte, granulocyte, and platelet cells are the most radiosensitive of the blood elements, and monitoring their changes after exposure is regarded as the most practical and best laboratory test to estimate radiation dose. The HEMODOSE web tools are built upon solid physiological and pathophysiological understanding of mammalian hematopoietic systems, and rigorous coarse-grained biomathematical modeling and validation. Using single or serial granulocyte, lymphocyte, leukocyte, or platelet counts after exposure, these tools can estimate absorbed doses of adult victims very rapidly and accurately. Some patient data from historical accidents are utilized as examples to demonstrate the capabilities of these tools as a rapid point-of-care diagnostic or centralized high-throughput assay system in a large-scale radiological disaster scenario. Unlike previous dose prediction algorithms, the HEMODOSE web tools establish robust correlations between the absorbed doses and a victim's various types of blood cell counts not only in the early time window (1 or 2 days), but also in the very late phase (up to 4 weeks) after exposure.
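As a rough illustration of dose estimation from blood counts (not the HEMODOSE model itself), the sketch below inverts an assumed exponential lymphocyte dose-response; the baseline count and depletion coefficient are hypothetical values chosen only to show the idea.

```python
# Toy dose inversion from a lymphocyte count, assuming L(D) = L0 * exp(-k * D).
# This is NOT the HEMODOSE model; L0 and k are illustrative assumptions.
import numpy as np

L0 = 2.5   # baseline lymphocyte count, 10^9 cells/L (assumed)
k = 0.5    # per-Gy depletion coefficient (assumed)

def estimate_dose(measured_count):
    """Invert L = L0 * exp(-k * D) for the absorbed dose D in Gy."""
    return np.log(L0 / measured_count) / k

for count in (2.0, 1.0, 0.5):
    print(f"count {count:.1f} -> estimated dose {estimate_dose(count):.2f} Gy")
```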
HEMODOSE: A Set of Multi-parameter Biodosimetry Tools
NASA Technical Reports Server (NTRS)
Hu, Shaowen; Blakely, William F.; Cucinotta, Francis A.
2014-01-01
There continue to be important concerns about the possible occurrence of acute radiation syndromes following nuclear and radiological terrorism or accidents that may result in mass casualties in densely populated areas. To guide medical personnel in their clinical decisions for effective medical management and treatment of the exposed individuals, biological markers are usually applied to examine radiation-induced biological changes to assess the severity of radiation injury to sensitive organ systems. Among these, the peripheral blood cell counts are widely used to assess the extent of radiation-induced bone marrow (BM) injury. This is due to the fact that the hematopoietic system is a vulnerable part of the human body to radiation damage. Particularly, the lymphocyte, granulocyte, and platelet cells are the most radiosensitive of the blood elements, and monitoring their changes after exposure is regarded as a practical and recommended laboratory test to estimate radiation dose and injury. In this work we describe the HEMODOSE web tools, which are built upon solid physiological and pathophysiological understanding of mammalian hematopoietic systems, and rigorous coarse-grained biomathematical modeling and validation. Using single or serial granulocyte, lymphocyte, leukocyte, or platelet counts after exposure, these tools can estimate absorbed doses of adult victims very rapidly and accurately to assess the severity of BM radiation injury. Some patient data from historical accidents are utilized as examples to demonstrate the capabilities of these tools as a rapid point-of-care diagnostic or centralized high-throughput assay system in a large-scale radiological disaster scenario. HEMODOSE web tools establish robust correlations between the absorbed doses and a victim's various types of blood cell counts not only in the early time window (1 or 2 days), but also in the very late phase (up to 4 weeks) after exposure.
Propensity Score Matching: Retrospective Randomization?
Jupiter, Daniel C
Randomized controlled trials are viewed as the optimal study design. In this commentary, we explore the strength of this design and its complexity. We also discuss some situations in which these trials are not possible, or not ethical, or not economical. In such situations, specifically, in retrospective studies, we should make every effort to recapitulate the rigor and strength of the randomized trial. However, we could be faced with an inherent indication bias in such a setting. Thus, we consider the tools available to address that bias. Specifically, we examine matching and introduce and explore a new tool: propensity score matching. This tool allows us to group subjects according to their propensity to be in a particular treatment group and, in so doing, to account for the indication bias. Copyright © 2017 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
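A minimal sketch of the propensity score matching tool introduced in the commentary: estimate each subject's propensity to receive treatment with logistic regression, then pair treated subjects with the nearest-scoring controls. The covariates and data are synthetic placeholders.

```python
# 1:1 nearest-neighbour propensity score matching on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(7)
n = 300
age = rng.normal(60, 10, n)
severity = rng.uniform(0, 1, n)
X = np.column_stack([age, severity])

# Treatment assignment depends on covariates (the indication bias).
p_treat = 1 / (1 + np.exp(-(-5 + 0.05 * age + 2 * severity)))
treated = rng.binomial(1, p_treat).astype(bool)

# Propensity score: estimated probability of receiving treatment.
propensity = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Match each treated subject to the control with the closest propensity score.
nn = NearestNeighbors(n_neighbors=1).fit(propensity[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(propensity[treated].reshape(-1, 1))
matched_controls = np.flatnonzero(~treated)[idx.ravel()]
print("number of matched pairs:", treated.sum())
```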
Modeling of profilometry with laser focus sensors
NASA Astrophysics Data System (ADS)
Bischoff, Jörg; Manske, Eberhard; Baitinger, Henner
2011-05-01
Metrology is of paramount importance in submicron patterning. In particular, line width and overlay have to be measured very accurately. Appropriate metrology techniques are scanning electron microscopy and optical scatterometry. The latter is non-invasive, highly accurate and enables optical cross sections of layer stacks, but it requires periodic patterns. Scanning laser focus sensors are a viable alternative enabling the measurement of non-periodic features. Severe limitations are imposed by the diffraction limit determining the edge location accuracy. It will be shown that the accuracy can be greatly improved by means of rigorous modeling. To this end, a fully vectorial 2.5-dimensional model has been developed based on rigorous Maxwell solvers and combined with models for the scanning and various autofocus principles. The simulations are compared with experimental results. Moreover, the simulations are directly utilized to improve the edge location accuracy.
Applying the scientific method to small catchment studies: Areview of the Panola Mountain experience
Hooper, R.P.
2001-01-01
A hallmark of the scientific method is its iterative application to a problem to increase and refine the understanding of the underlying processes controlling it. A successful iterative application of the scientific method to catchment science (including the fields of hillslope hydrology and biogeochemistry) has been hindered by two factors. First, the scale at which controlled experiments can be performed is much smaller than the scale of the phenomenon of interest. Second, computer simulation models generally have not been used as hypothesis-testing tools as rigorously as they might have been. Model evaluation often has gone only so far as evaluation of goodness of fit, rather than a full structural analysis, which is more useful when treating the model as a hypothesis. An iterative application of a simple mixing model to the Panola Mountain Research Watershed is reviewed to illustrate the increase in understanding gained by this approach and to discern general principles that may be applicable to other studies. The lessons learned include the need for an explicitly stated conceptual model of the catchment, the definition of objective measures of its applicability, and a clear linkage between the scale of observations and the scale of predictions. Published in 2001 by John Wiley & Sons. Ltd.
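As a worked example of the kind of simple mixing model applied at Panola Mountain, the sketch below performs a two-end-member hydrograph separation; the tracer concentrations are placeholders, not catchment data.

```python
# Minimal two-end-member mixing model (hydrograph separation): streamwater tracer
# concentration is a flow-weighted mixture of two sources (placeholder values).
c_groundwater = 120.0   # tracer concentration in the groundwater end-member
c_event_water = 10.0    # tracer concentration in rain/event water
c_stream = 75.0         # observed streamwater concentration

# Solve c_stream = f * c_groundwater + (1 - f) * c_event_water for f.
f_groundwater = (c_stream - c_event_water) / (c_groundwater - c_event_water)
print(f"groundwater fraction of streamflow: {f_groundwater:.2f}")
```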
Naujokaitis-Lewis, Ilona; Curtis, Janelle M R
2016-01-01
Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along with demographic parameters in sensitivity routines. GRIP 2.0 is an important decision-support tool that can be used to prioritize research, identify habitat-based thresholds and management intervention points to improve the probability of species persistence, and evaluate trade-offs of alternative management options.
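To illustrate what a variance-based global sensitivity analysis of this kind computes (without reproducing GRIP 2.0 itself), the sketch below runs a Sobol analysis on a toy extinction-risk function; it assumes the third-party SALib package, and the parameter names, bounds, and response function are invented for illustration.

```python
# Generic variance-based (Sobol) global sensitivity analysis on a toy
# extinction-risk response; parameters and bounds are illustrative only.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["habitat_amount", "adult_survival", "disease_effect"],
    "bounds": [[0.1, 1.0], [0.7, 0.99], [0.0, 0.5]],
}

X = saltelli.sample(problem, 1024)

def extinction_risk(x):
    habitat, survival, disease = x
    return 1.0 - habitat * survival * (1.0 - disease)   # toy response

Y = np.apply_along_axis(extinction_risk, 1, X)
Si = sobol.analyze(problem, Y)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name:15s} first-order S1={s1:.2f}  total-order ST={st:.2f}")
```

Comparing first-order and total-order indices is what reveals the kind of parameter interactions (e.g., habitat amount with survival) highlighted in the study above.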
Mitigating Reptile Road Mortality: Fence Failures Compromise Ecopassage Effectiveness
Baxter-Gilbert, James H.; Riley, Julia L.; Lesbarrères, David; Litzgus, Jacqueline D.
2015-01-01
Roadways pose serious threats to animal populations. The installation of roadway mitigation measures is becoming increasingly common, yet studies that rigorously evaluate the effectiveness of these conservation tools remain rare. A highway expansion project in Ontario, Canada included exclusion fencing and ecopassages as mitigation measures designed to offset detrimental effects to one of the most imperiled groups of vertebrates, reptiles. Taking a multispecies approach, we used a Before-After-Control-Impact study design to compare reptile abundance on the highway before and after mitigation at an Impact site and a Control site from 1 May to 31 August in 2012 and 2013. During this time, radio telemetry, wildlife cameras, and an automated PIT-tag reading system were used to monitor reptile movements and use of ecopassages. Additionally, a willingness-to-utilize experiment was conducted to quantify turtle behavioral responses to ecopassages. We found no difference in abundance of turtles on the road between the un-mitigated and mitigated highways, and an increase in the percentage of both snakes and turtles detected dead on the road post-mitigation, suggesting that the fencing was not effective. Although ecopassages were used by reptiles, the number of crossings through ecopassages was lower than road-surface crossings. Furthermore, turtle willingness to use ecopassages was lower than that reported in previous arena studies, suggesting that effectiveness of ecopassages may be compromised when alternative crossing options are available (e.g., through holes in exclusion structures). Our rigorous evaluation of reptile roadway mitigation demonstrated that when exclusion structures fail, the effectiveness of population connectivity structures is compromised. Our project emphasizes the need to design mitigation measures with the biology and behavior of the target species in mind, to implement mitigation designs in a rigorous fashion, and to quantitatively evaluate road mitigation to allow for adaptive management and optimization of these increasingly important conservation tools. PMID:25806531
Agricultural model intercomparison and improvement project: Overview of model intercomparisons
USDA-ARS?s Scientific Manuscript database
Improvement of crop simulation models to better estimate growth and yield is one of the objectives of the Agricultural Model Intercomparison and Improvement Project (AgMIP). The overall goal of AgMIP is to provide an assessment of crop models through rigorous intercomparisons and evaluate future clim...
Emerson, Mitchell R; Gallagher, Ryan J; Marquis, Janet G; LeVine, Steven M
2009-01-01
Advancing the understanding of the mechanisms involved in the pathogenesis of multiple sclerosis (MS) likely will lead to new and better therapeutics. Although important information about the disease process has been obtained from research on pathologic specimens, peripheral blood lymphocytes and MRI studies, the elucidation of detailed mechanisms has progressed largely through investigations using animal models of MS. In addition, animal models serve as an important tool for the testing of putative interventions. The most commonly studied model of MS is experimental autoimmune encephalomyelitis (EAE). This model can be induced in a variety of species and by various means, but there has been concern that the model may not accurately reflect the disease process, and more importantly, it may give rise to erroneous findings when it is used to test possible therapeutics. Several reasons have been given to explain the shortcomings of this model as a useful testing platform, but one idea provides a framework for improving the value of this model, and thus, it deserves careful consideration. In particular, the idea asserts that EAE studies are inadequately designed to enable appropriate evaluation of putative therapeutics. Here we discuss problem areas within EAE study designs and provide suggestions for their improvement. This paper is principally directed at investigators new to the field of EAE, although experienced investigators may find useful suggestions herein. PMID:19389303
Rousseau, Marjolaine; Beauchamp, Guy; Nichols, Sylvain
The effectiveness of teaching aids in veterinary medical education is not often assessed rigorously. The objective in the present study was to evaluate the effectiveness of a commercially available jugular venipuncture alpaca model as a complementary tool to teach veterinary students how to perform venipuncture in adult alpacas. We hypothesized that practicing on the model would allow veterinary students to draw blood in alpacas more rapidly with fewer attempts than students without previous practice on the model. Thirty-six third-year veterinary students were enrolled and randomly allocated to the model (group M; n=18) or the control group (group C; n=18). The venipuncture technique was taught to all students on day 0. Students in group M practiced on the model on day 2. On day 5, an evaluator blinded to group allocation evaluated the students' venipuncture skills during a practical examination using live alpacas. Success was defined as the aspiration of a 6-ml sample of blood. Measured outcomes included number of attempts required to achieve success (success score), total procedural time, and overall qualitative score. Success scores, total procedural time, and overall scores did not differ between groups. Use of restless alpacas reduced performance. The jugular venipuncture alpaca model failed to improve jugular venipuncture skills in this student population. Lack of movement represents a significant weakness of this training model.
NASA Astrophysics Data System (ADS)
Yu, Hesheng; Thé, Jesse
2016-11-01
The prediction of the dispersion of air pollutants in urban areas is of great importance to public health, homeland security, and environmental protection. Computational Fluid Dynamics (CFD) emerges as an effective tool for pollutant dispersion modelling. This paper reports and quantitatively validates the shear stress transport (SST) k-ω turbulence closure model and its transitional variant for pollutant dispersion in a complex urban environment for the first time. Sensitivity analysis is performed to establish recommendations for the proper use of turbulence models in urban settings. The current SST k-ω simulation is validated rigorously against extensive experimental data using the hit rate for velocity components, and the "factor of two" of observations (FAC2) and fractional bias (FB) for the concentration field. The simulation results show that the current SST k-ω model can predict the flow field well, with an overall hit rate of 0.870, and concentration dispersion with FAC2 = 0.721 and FB = 0.045. The flow simulation of the current SST k-ω model is slightly inferior to that of a detached eddy simulation (DES), but better than that of the standard k-ε model. However, the current study is the best among these three model approaches when validated against measurements of pollutant dispersion in the atmosphere. This work aims to provide recommendations for the proper use of CFD to predict pollutant dispersion in urban environments.
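The two concentration-validation metrics named above can be computed directly; the sketch below evaluates FAC2 and FB on placeholder arrays, using the observed-minus-predicted sign convention for FB (conventions vary in the literature).

```python
# FAC2: fraction of predictions within a factor of two of the observations.
# FB: fractional bias, here 2*(mean_obs - mean_pred)/(mean_obs + mean_pred).
import numpy as np

observed = np.array([1.2, 0.8, 2.5, 0.4, 1.9, 3.1])   # placeholder concentrations
predicted = np.array([1.0, 1.1, 2.0, 0.5, 2.4, 2.8])  # placeholder predictions

ratio = predicted / observed
fac2 = np.mean((ratio >= 0.5) & (ratio <= 2.0))

fb = 2.0 * (observed.mean() - predicted.mean()) / (observed.mean() + predicted.mean())

print(f"FAC2 = {fac2:.3f}, FB = {fb:.3f}")
```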
The Usability of Online Geographic Virtual Reality for Urban Planning
NASA Astrophysics Data System (ADS)
Zhang, S.; Moore, A. B.
2013-08-01
Virtual reality (VR) technology is starting to become widely and freely available (for example the online OpenSimulator tool), with potential for use in 3D urban planning and design tasks but still needing rigorous assessment to establish this. A previous study consulted with a small group of urban professionals, who concluded in a satisfaction usability test that online VR had potential value as a usable 3D communication and remote marketing tool but acknowledged that visual quality and geographic accuracy were obstacles to overcome. This research takes the investigation a significant step further to also examine the usability aspects of efficiency (how quickly tasks are completed) and effectiveness (how successfully tasks are completed), relating to OpenSimulator in an urban planning situation. The comparative study pits a three-dimensional VR model (with increased graphic fidelity and geographic content to address the feedback of the previous study) of a subdivision design (in a Dunedin suburb) against 3D models built with GIS (ArcGIS) and CAD (BricsCAD) tools, two types of software environment well established in urban professional practice. Urban professionals participated in the study by attempting to perform timed tasks correctly in each of the environments before being asked questions about the technologies involved and their perceived importance to their professional work. The results reinforce the positive feedback for VR of the previous study, with the graphical and geographic data issues being somewhat addressed (though participants stressed the need for accurate and precise object and terrain modification capabilities in VR). Ease-of-use and the fastest associated task-completion speed were significant positive outcomes to emerge from the comparison with GIS and CAD, pointing to a strong future for VR in an urban planning context.
Geographic profiling applied to testing models of bumble-bee foraging.
Raine, Nigel E; Rossmo, D Kim; Le Comber, Steven C
2009-03-06
Geographic profiling (GP) was originally developed as a statistical tool to help police forces prioritize lists of suspects in investigations of serial crimes. GP uses the location of related crime sites to make inferences about where the offender is most likely to live, and has been extremely successful in criminology. Here, we show how GP is applicable to experimental studies of animal foraging, using the bumble-bee Bombus terrestris. GP techniques enable us to simplify complex patterns of spatial data down to a small number of parameters (2-3) for rigorous hypothesis testing. Combining computer model simulations and experimental observation of foraging bumble-bees, we demonstrate that GP can be used to discriminate between foraging patterns resulting from (i) different hypothetical foraging algorithms and (ii) different food item (flower) densities. We also demonstrate that combining experimental and simulated data can be used to elucidate animal foraging strategies: specifically that the foraging patterns of real bumble-bees can be reliably discriminated from three out of nine hypothetical foraging algorithms. We suggest that experimental systems, like foraging bees, could be used to test and refine GP model predictions, and that GP offers a useful technique to analyse spatial animal behaviour data in both the laboratory and field.
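A greatly simplified, illustrative version of a geographic-profiling score surface: each grid cell is scored by a distance-decay kernel summed over observed foraging sites. The site coordinates and decay length are assumptions, and this is only in the spirit of GP rather than Rossmo's full criminal geographic targeting formula.

```python
# Simplified GP-style score surface: sum a distance-decay kernel over observed
# foraging sites and report the highest-scoring cell as the likely origin.
import numpy as np

sites = np.array([[2.0, 3.0], [4.5, 4.0], [3.0, 5.5], [5.0, 2.5]])  # x, y (assumed)

xs = np.linspace(0, 7, 71)
ys = np.linspace(0, 7, 71)
gx, gy = np.meshgrid(xs, ys)

score = np.zeros_like(gx)
for sx, sy in sites:
    d = np.hypot(gx - sx, gy - sy)
    score += np.exp(-d / 1.5)          # assumed decay length of 1.5 units

best = np.unravel_index(np.argmax(score), score.shape)
print("most likely origin (x, y):", xs[best[1]], ys[best[0]])
```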
Three-dimensional printing in cardiology: Current applications and future challenges.
Luo, Hongxing; Meyer-Szary, Jarosław; Wang, Zhongmin; Sabiniewicz, Robert; Liu, Yuhao
2017-01-01
Three-dimensional (3D) printing has attracted huge interest in recent years. Broadly speaking, it refers to technology that converts a predesigned virtual model into a touchable object. In clinical medicine, it usually converts a series of two-dimensional medical images acquired through computed tomography, magnetic resonance imaging or 3D echocardiography into a physical model. Medical 3D printing consists of three main steps: image acquisition, virtual reconstruction and 3D manufacturing. It is a promising tool for preoperative evaluation, medical device design, hemodynamic simulation and medical education, and it is also likely to reduce operative risk and increase operative success. However, most of the relevant studies are case reports or series, which are underpowered for testing its actual effect on patient outcomes. The decision to make a 3D cardiac model may seem arbitrary, since it is mostly based on a cardiologist's perceived difficulty in performing an interventional procedure. A uniform consensus is urgently needed to standardize the key steps of 3D printing from image acquisition to final production. In the future, more clinical trials of rigorous design are needed to further validate the effect of 3D printing on the treatment of cardiovascular diseases. (Cardiol J 2017; 24, 4: 436-444).
Line-source excitation of realistic conformal metasurface cloaks
NASA Astrophysics Data System (ADS)
Padooru, Yashwanth R.; Yakovlev, Alexander B.; Chen, Pai-Yen; Alù, Andrea
2012-11-01
Following our recently introduced analytical tools to model and design conformal mantle cloaks based on metasurfaces [Padooru et al., J. Appl. Phys. 112, 034907 (2012)], we investigate their performance and physical properties when excited by an electric line source placed in their close proximity. We consider metasurfaces formed by 2-D arrays of slotted (meshes and Jerusalem cross slots) and printed (patches and Jerusalem crosses) sub-wavelength elements. The electromagnetic scattering analysis is carried out using a rigorous analytical model, which utilizes the two-sided impedance boundary conditions at the interface of the sub-wavelength elements. It is shown that the homogenized grid-impedance expressions, originally derived for planar arrays of sub-wavelength elements and plane-wave excitation, may be successfully used to model and tailor the surface reactance of cylindrical conformal mantle cloaks illuminated by near-field sources. Our closed-form analytical results are in good agreement with full-wave numerical simulations, up to sub-wavelength distances from the metasurface, confirming that mantle cloaks may be very effective to suppress the scattering of moderately sized objects, independent of the type of excitation and point of observation. We also discuss the dual functionality of these metasurfaces to boost radiation efficiency and directivity from confined near-field sources.
Semiclassical Wigner theory of photodissociation in three dimensions: Shedding light on its basis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arbelo-González, W.; CNRS, Institut des Sciences Moléculaires, UMR 5255, 33405 Talence; Université Bordeaux, Institut des Sciences Moléculaires, UMR 5255, 33405 Talence
2015-04-07
The semiclassical Wigner theory (SCWT) of photodissociation dynamics, initially proposed by Brown and Heller [J. Chem. Phys. 75, 186 (1981)] in order to describe state distributions in the products of direct collinear photodissociations, was recently extended to realistic three-dimensional triatomic processes of the same type [Arbelo-González et al., Phys. Chem. Chem. Phys. 15, 9994 (2013)]. The resulting approach, which takes into account rotational motions in addition to vibrational and translational ones, was applied to a triatomic-like model of methyl iodide photodissociation and its predictions were found to be in nearly quantitative agreement with rigorous quantum results, but at a much lower computational cost, thereby making SCWT a potential tool for the study of polyatomic reaction dynamics. Here, we analyse the main reasons for this agreement by means of an elementary model of fragmentation explicitly dealing with the rotational motion only. We show that our formulation of SCWT makes it a semiclassical approximation to an approximate planar quantum treatment of the dynamics, both of sufficient quality for the whole treatment to be satisfying.
NASA Astrophysics Data System (ADS)
Liu, Chi-Ping; Zhou, Fei; Ozolins, Vidvuds
2014-03-01
Molybdenum disulfide (MoS2) is a good candidate electrode material for high capacity energy storage applications, such as lithium ion batteries and supercapacitors. In this work, we investigate lithium intercalation and diffusion kinetics in MoS2 by using first-principles density-functional theory (DFT) calculations. Two different lithium intercalation sites (1-H and 2-T) in MoS2 are found to be stable for lithium intercalation at different van der Waals (vdW) gap distances. It is found that both thermodynamic and kinetic properties are highly related to the interlayer vdW gap distance, and that the optimal gap distance leads to effective solid-state diffusion in MoS2. Additionally, through the use of compressive sensing, we build accurate cluster expansion models to study the thermodynamic properties of MoS2 at high lithium content by truncating the higher-order effective clusters to those with significant contributions. The results show that compressive sensing cluster expansion is a rigorous and powerful tool for model construction for advanced electrochemical applications in the future.
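The core of a compressive-sensing cluster expansion is an L1-regularized fit that selects a sparse set of effective cluster interactions (ECIs). The sketch below, assuming synthetic random correlation functions rather than actual MoS2 DFT data, shows that structure with scikit-learn's Lasso; all names and sizes are illustrative.

    import numpy as np
    from sklearn.linear_model import Lasso

    # X: correlation matrix (rows = training configurations, columns = cluster
    #    correlation functions); y: formation energies for each configuration.
    rng = np.random.default_rng(0)
    n_configs, n_clusters = 40, 200
    X = rng.choice([-1.0, 1.0], size=(n_configs, n_clusters))
    true_eci = np.zeros(n_clusters)
    true_eci[:8] = rng.normal(scale=0.1, size=8)   # only a few clusters matter
    y = X @ true_eci + rng.normal(scale=1e-3, size=n_configs)

    # L1-regularized fit recovers a sparse set of effective cluster
    # interactions, the essence of the compressive-sensing approach.
    fit = Lasso(alpha=1e-3, max_iter=50000).fit(X, y)
    eci = fit.coef_
    print("non-zero ECIs:", np.count_nonzero(np.abs(eci) > 1e-6))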
Chang, Li-Chun; Chen, Yu-Chi; Liao, Li-Ling; Wu, Fei Ling; Hsieh, Pei-Lin; Chen, Hsiao-Jung
2017-01-01
The study aimed to illustrate the constructs and test the psychometric properties of an instrument of health literacy competencies (IOHLC) for health professionals. A multi-phase questionnaire development method was used to develop the scale. The categorization of the knowledge and practice domains achieved consensus through a modified Delphi process. To reduce the number of items, the 92-item IOHLC was psychometrically evaluated through internal consistency, Rasch modeling, and two-stage factor analysis. In total, 736 practitioners, including nurses, nurse practitioners, health educators, case managers, and dieticians, completed the 92-item IOHLC online from May 2012 to January 2013. The final version of the IOHLC covered 9 knowledge items and 40 skill items across 9 dimensions, with good model fit, explaining 72% of total variance. All domains had acceptable internal consistency and discriminant validity. The tool in this study is the first to verify health literacy competencies rigorously. Moreover, through psychometric testing, the 49-item IOHLC demonstrates adequate reliability and validity. The IOHLC may serve as a reference for the theoretical and in-service training of Chinese-speaking individuals' health literacy competencies.
Probability bounds analysis for nonlinear population ecology models.
Enszer, Joshua A; Andrei Măceș, D; Stadtherr, Mark A
2015-09-01
Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds through nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals. Copyright © 2015. Published by Elsevier Inc.
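To give a flavour of propagating probability bounds, the toy Python sketch below bounds an exceedance probability by pushing lower and upper envelopes of a parameter's quantile function through a monotone logistic-growth output. This is only valid for a monotone map and is not the rigorous interval-based method of the paper; all names, distributions, and thresholds are invented for illustration.

    import numpy as np

    p = np.linspace(0.001, 0.999, 999)
    r_lo = 0.8 + 0.2 * p      # lower envelope of the quantile function of r
    r_hi = 1.0 + 0.3 * p      # upper envelope

    def logistic_final(r, n0=10.0, K=100.0, t=5.0):
        # final population size; monotonically increasing in r
        return K / (1.0 + (K / n0 - 1.0) * np.exp(-r * t))

    out_lo = logistic_final(r_lo)   # bounds on the output quantile function
    out_hi = logistic_final(r_hi)

    # Bounds on P(population at time t exceeds 90): the lower envelope gives
    # the minimum exceedance probability, the upper envelope the maximum.
    idx_lo = min(np.searchsorted(out_lo, 90.0), len(p) - 1)
    idx_hi = min(np.searchsorted(out_hi, 90.0), len(p) - 1)
    p_exceed_min = 1.0 - p[idx_lo]
    p_exceed_max = 1.0 - p[idx_hi]
    print(p_exceed_min, p_exceed_max)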
Update on simulation-based surgical training and assessment in ophthalmology: a systematic review.
Thomsen, Ann Sofia S; Subhi, Yousif; Kiilgaard, Jens Folke; la Cour, Morten; Konge, Lars
2015-06-01
This study reviews the evidence behind simulation-based surgical training of ophthalmologists to determine (1) the validity of the reported models and (2) the ability to transfer skills to the operating room. Simulation-based training is established widely within ophthalmology, although it often lacks a scientific basis for implementation. We conducted a systematic review of trials involving simulation-based training or assessment of ophthalmic surgical skills among health professionals. The search included 5 databases (PubMed, EMBASE, PsycINFO, Cochrane Library, and Web of Science) and was completed on March 1, 2014. Overall, the included trials were divided into animal, cadaver, inanimate, and virtual-reality models. Risk of bias was assessed using the Cochrane Collaboration's tool. Validity evidence was evaluated using a modern validity framework (Messick's). We screened 1368 reports for eligibility and included 118 trials. The most common surgery simulated was cataract surgery. Most validity trials investigated only 1 or 2 of 5 sources of validity (87%). Only 2 trials (48 participants) investigated transfer of skills to the operating room; 4 trials (65 participants) evaluated the effect of simulation-based training on patient-related outcomes. Because of heterogeneity of the studies, it was not possible to conduct a quantitative analysis. The methodologic rigor of trials investigating simulation-based surgical training in ophthalmology is inadequate. To ensure effective implementation of training models, evidence-based knowledge of validity and efficacy is needed. We provide a useful tool for implementation and evaluation of research in simulation-based training. Copyright © 2015 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Ultrasound scatter in heterogeneous 3D microstructures: Parameters affecting multiple scattering
NASA Astrophysics Data System (ADS)
Engle, B. J.; Roberts, R. A.; Grandin, R. J.
2018-04-01
This paper reports on a computational study of ultrasound propagation in heterogeneous metal microstructures. Random spatial fluctuations in elastic properties over a range of length scales relative to ultrasound wavelength can give rise to scatter-induced attenuation, backscatter noise, and phase front aberration. It is of interest to quantify the dependence of these phenomena on the microstructure parameters, for the purpose of quantifying deleterious consequences on flaw detectability, and for the purpose of material characterization. Valuable tools for estimation of microstructure parameters (e.g. grain size) through analysis of ultrasound backscatter have been developed based on approximate weak-scattering models. While useful, it is understood that these tools display inherent inaccuracy when multiple scattering phenomena significantly contribute to the measurement. It is the goal of this work to supplement weak scattering model predictions with corrections derived through application of an exact computational scattering model to explicitly prescribed microstructures. The scattering problem is formulated as a volume integral equation (VIE) displaying a convolutional Green-function-derived kernel. The VIE is solved iteratively employing FFT-based convolution. Realizations of random microstructures are specified on the micron scale using statistical property descriptions (e.g. grain size and orientation distributions), which are then spatially filtered to provide rigorously equivalent scattering media on a length scale relevant to ultrasound propagation. Scattering responses from ensembles of media representations are averaged to obtain mean and variance of quantities such as attenuation and backscatter noise levels, as a function of microstructure descriptors. The computational approach will be summarized, and examples of application will be presented.
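The structure of an iteratively solved volume integral equation with FFT-based convolution can be sketched in a few lines. The Python toy below is a scalar 2-D Born-type fixed-point iteration for a weak circular inclusion; the paper's model is elastic, 3-D, and uses a more robust iteration, and grid-quadrature weights and periodic wraparound are glossed over here. All names and parameters are illustrative assumptions.

    import numpy as np

    n, dx, k0 = 256, 0.05, 2.0 * np.pi / 1.0        # grid, spacing, wavenumber
    x = (np.arange(n) - n // 2) * dx
    xx, yy = np.meshgrid(x, x)

    chi = np.zeros((n, n))                          # scattering contrast
    chi[(xx**2 + yy**2) < 1.0] = 0.05               # weak circular inclusion

    u_inc = np.exp(1j * k0 * xx)                    # incident plane wave

    kx = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
    kxx, kyy = np.meshgrid(kx, kx)
    G_hat = 1.0 / (kxx**2 + kyy**2 - k0**2 - 1e-2j) # Green's function (Fourier)

    u = u_inc.copy()
    for _ in range(20):                             # fixed-point (Born) iteration
        scat = np.fft.ifft2(G_hat * np.fft.fft2(k0**2 * chi * u))
        u = u_inc + scat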
NASA Astrophysics Data System (ADS)
Morway, E. D.; Niswonger, R. G.; Triana, E.
2016-12-01
In irrigated agricultural regions supplied by both surface-water and groundwater, increased reliance on groundwater during sustained drought leads to long-term water table drawdown and subsequent surface-water losses. This, in turn, may threaten the sustainability of the irrigation project. To help offset groundwater resource losses and restore water supply reliability, an alternative management strategy commonly referred to as managed aquifer recharge (MAR) in agricultural regions helps mitigate long-term aquifer drawdown and provides additional water for subsequent withdrawal. Sources of MAR in this investigation are limited to late winter runoff in years with above average precipitation (i.e., above average snowpack). However, where winter MAR results in an elevated water table, non-beneficial consumptive use may increase from evapotranspiration in adjacent and down-gradient fallow and naturally vegetated lands. To rigorously explore this trade-off, the recently published MODSIM-MODFLOW model was applied to quantify both the benefits and unintended consequences of MAR. MODSIM-MODFLOW is a generalized modeling tool capable of exploring the effects of altered river operations within an integrated groundwater and surface-water (GW-SW) model. Thus, the MODSIM-MODFLOW model provides a modeling platform capable of simulating MAR in amounts and duration consistent with other senior water rights in the river system (e.g., minimum in-stream flow requirements). Increases in non-beneficial consumptive use resulting from winter MAR are evaluated for a hypothetical model patterned after alluvial aquifers common in arid and semi-arid areas of the western United States. Study results highlight (1) the benefit of an implicitly-coupled river operations and hydrologic modeling tool, (2) the balance between winter MAR and the potential increase in non-beneficial consumptive use, and (3) conditions where MAR may or may not be an appropriate management option, such as the availability of surface-water storage.
Advanced EUV mask and imaging modeling
NASA Astrophysics Data System (ADS)
Evanschitzky, Peter; Erdmann, Andreas
2017-10-01
The exploration and optimization of image formation in partially coherent EUV projection systems with complex source shapes requires flexible, accurate, and efficient simulation models. This paper reviews advanced mask diffraction and imaging models for the highly accurate and fast simulation of EUV lithography systems, addressing important aspects of the current technical developments. The simulation of light diffraction from the mask employs an extended rigorous coupled wave analysis (RCWA) approach, which is optimized for EUV applications. In order to be able to deal with current EUV simulation requirements, several additional models are included in the extended RCWA approach: a field decomposition and a field stitching technique enable the simulation of larger complex structured mask areas. An EUV multilayer defect model including a database approach makes the fast and fully rigorous defect simulation and defect repair simulation possible. A hybrid mask simulation approach combining real and ideal mask parts allows the detailed investigation of the origin of different mask 3-D effects. The image computation is done with a fully vectorial Abbe-based approach. Arbitrary illumination and polarization schemes and adapted rigorous mask simulations guarantee a high accuracy. A fully vectorial sampling-free description of the pupil with Zernikes and Jones pupils and an optimized representation of the diffraction spectrum enable the computation of high-resolution images with high accuracy and short simulation times. A new pellicle model supports the simulation of arbitrary membrane stacks, pellicle distortions, and particles/defects on top of the pellicle. Finally, an extension for highly accurate anamorphic imaging simulations is included. The application of the models is demonstrated by typical use cases.
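The Abbe imaging sum described above can be illustrated in a greatly simplified scalar form: for each source point, the mask spectrum is filtered by a shifted pupil, coherently imaged, and the intensities are summed incoherently. The Python sketch below ignores the vectorial, Zernike/Jones-pupil, and rigorous mask-diffraction aspects of the actual model; every parameter is an illustrative assumption.

    import numpy as np

    n = 128
    mask = np.ones((n, n)); mask[:, n//2 - 8 : n//2 + 8] = 0.0   # single line
    spectrum = np.fft.fftshift(np.fft.fft2(mask))

    fx = np.fft.fftshift(np.fft.fftfreq(n))
    FX, FY = np.meshgrid(fx, fx)
    na_cut = 0.25                                                # pupil cutoff

    source_points = [(0.0, 0.0), (0.1, 0.0), (-0.1, 0.0)]        # toy source
    image = np.zeros((n, n))
    for sx, sy in source_points:
        pupil = ((FX - sx)**2 + (FY - sy)**2) <= na_cut**2       # shifted pupil
        field = np.fft.ifft2(np.fft.ifftshift(spectrum * pupil)) # coherent image
        image += np.abs(field)**2 / len(source_points)           # incoherent sum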
Machine Learning to Discover and Optimize Materials
NASA Astrophysics Data System (ADS)
Rosenbrock, Conrad Waldhar
For centuries, scientists have dreamed of creating materials by design. Rather than discovery by accident, bespoke materials could be tailored to fulfill specific technological needs. Quantum theory and computational methods are essentially equal to the task, and computational power is the new bottleneck. Machine learning has the potential to solve that problem by approximating material behavior at multiple length scales. A full end-to-end solution must allow us to approximate the quantum mechanics, microstructure and engineering tasks well enough to be predictive in the real world. In this dissertation, I present algorithms and methodology to address some of these problems at various length scales. In the realm of enumeration, systems with many degrees of freedom such as high-entropy alloys may contain prohibitively many unique possibilities so that enumerating all of them would exhaust available compute memory. One possible way to address this problem is to know in advance how many possibilities there are so that the user can reduce their search space by restricting the occupation of certain lattice sites. Although tools to calculate this number were available, none performed well for very large systems and none could easily be integrated into low-level languages for use in existing scientific codes. I present an algorithm to solve these problems. Testing the robustness of machine-learned models is an essential component in any materials discovery or optimization application. While it is customary to perform a small number of system-specific tests to validate an approach, this may be insufficient in many cases. In particular, for Cluster Expansion models, the expansion may not converge quickly enough to be useful and reliable. Although the method has been used for decades, a rigorous investigation across many systems to determine when CE "breaks" was still lacking. This dissertation includes this investigation along with heuristics that use only a small training database to predict whether a model is worth pursuing in detail. To be useful, computational materials discovery must lead to experimental validation. However, experiments are difficult due to sample purity, environmental effects and a host of other considerations. In many cases, it is difficult to connect theory to experiment because computation is deterministic. By combining advanced group theory with machine learning, we created a new tool that bridges the gap between experiment and theory so that experimental and computed phase diagrams can be harmonized. Grain boundaries in real materials control many important material properties such as corrosion, thermal conductivity, and creep. Because of their high dimensionality, learning the underlying physics needed to optimize grain boundaries is extremely complex. By leveraging a mathematically rigorous representation for local atomic environments, machine learning becomes a powerful tool to approximate properties for grain boundaries. But it also goes beyond predicting properties by highlighting those atomic environments that are most important for influencing the boundary properties. This provides an immense dimensionality reduction that empowers grain boundary scientists to know where to look for deeper physical insights.
Multiagent Task Coordination Using a Distributed Optimization Approach
2015-09-01
positive-definite symmetric inertia matrix, C(q, q̇) ∈ ℝ^(n×n) is the centripetal and Coriolis matrix, G(q) ∈ ℝ^n is the gravitational force vector, B(q) ∈ ℝ^n ... artificial intelligence research are effectively integrated with the rigorous control systems analysis tools, and produced novel approximate dynamic ... results are given to illustrate the effectiveness of the proposed designs. Section 7.0 concludes the report. 3.0 METHODS, ASSUMPTIONS, AND PROCEDURES
Prostate Cancer Genetics in African Americans
2013-09-01
...tions the research is trying to answer, the study is important to the community and ... State's Institutional Review Boards because men with... for providing a balanced educational experience, the University offers a rigorous academic agenda with a broad range of disciplines, providing...
Prevention for Pediatric and Adolescent Migraine.
Hickman, Carolyn; Lewis, Kara Stuart; Little, Robert; Rastogi, Reena Gogia; Yonker, Marcy
2015-01-01
Children and adolescents can experience significant disability from frequent migraine. A number of tools have been developed to help quantify the impact of migraine in this population. Many preventative medications used in adults are routinely used to prevent migraines in children, although there has been less rigorous study. This article reviews the indications and evidence for the use of migraine preventatives, such as antidepressants, antihypertensives, anticonvulsants, antihistamines, and botulinum toxin, in this population. © 2015 American Headache Society.
Minimal intervention dentistry II: part 6. Microscope and microsurgical techniques in periodontics.
Sitbon, Y; Attathom, T
2014-05-01
Different aspects of treatment for periodontal diseases or gingival problems require rigorous diagnostics. Magnification tools and microsurgical instruments, combined with minimally invasive techniques can provide the best solutions in such cases. Relevance of treatments, duration of healing, reduction of pain and post-operative scarring have the potential to be improved for patients through such techniques. This article presents an overview of the use of microscopy in periodontics, still in the early stages of development.
Research strategies that result in optimal data collection from the patient medical record
Gregory, Katherine E.; Radovinsky, Lucy
2010-01-01
Data obtained from the patient medical record are often a component of clinical research led by nurse investigators. The rigor of the data collection methods correlates to the reliability of the data and, ultimately, the analytical outcome of the study. Research strategies for reliable data collection from the patient medical record include the development of a precise data collection tool, the use of a coding manual, and ongoing communication with research staff. PMID:20974093
Single toxin dose-response models revisited
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demidenko, Eugene, E-mail: eugened@dartmouth.edu
The goal of this paper is to offer a rigorous analysis of the sigmoid shape single toxin dose-response relationship. The toxin efficacy function is introduced and four special points, including maximum toxin efficacy and inflection points, on the dose-response curve are defined. The special points define three phases of the toxin effect on mortality: (1) toxin concentrations smaller than the first inflection point or (2) larger than the second inflection point imply low mortality rate, and (3) concentrations between the first and the second inflection points imply high mortality rate. Probabilistic interpretation and mathematical analysis for each of the four models, Hill, logit, probit, and Weibull, is provided. Two general model extensions are introduced: (1) the multi-target hit model that accounts for the existence of several vital receptors affected by the toxin, and (2) a model with a nonzero mortality at zero concentration to account for natural mortality. Special attention is given to statistical estimation in the framework of the generalized linear model with the binomial dependent variable as the mortality count in each experiment, contrary to the widespread nonlinear regression treating the mortality rate as a continuous variable. The models are illustrated using standard EPA Daphnia acute (48 h) toxicity tests with mortality as a function of NiCl or CuSO₄ toxin. - Highlights: • The paper offers a rigorous study of a sigmoid dose-response relationship. • The concentration with highest mortality rate is rigorously defined. • A table with four special points for five mortality curves is presented. • Two new sigmoid dose-response models have been introduced. • The generalized linear model is advocated for estimation of sigmoid dose-response relationship.
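The generalized-linear-model estimation advocated above can be sketched in a few lines of Python with statsmodels: mortality counts are modeled as binomial in log-concentration. The concentration and mortality values below are invented for illustration and are not the paper's Daphnia data; a probit or complementary log-log link can be substituted for the default logit link.

    import numpy as np
    import statsmodels.api as sm

    conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])     # toxin concentration
    n_exposed = np.array([20, 20, 20, 20, 20, 20])
    n_dead = np.array([1, 3, 7, 12, 17, 19])

    X = sm.add_constant(np.log(conc))                    # intercept + log-dose
    y = np.column_stack([n_dead, n_exposed - n_dead])    # successes, failures

    # Binomial GLM on mortality counts (default logit link); this is the
    # count-based estimation, not nonlinear regression on mortality rates.
    fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
    print(fit.params)                                    # intercept and slope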
Balboni, Tracy A; Fitchett, George; Handzo, George F; Johnson, Kimberly S; Koenig, Harold G; Pargament, Kenneth I; Puchalski, Christina M; Sinclair, Shane; Taylor, Elizabeth J; Steinhauser, Karen E
2017-09-01
The State of the Science in Spirituality and Palliative Care was convened to address the current landscape of research at the intersection of spirituality and palliative care and to identify critical next steps to advance this field of inquiry. Part II of the SOS-SPC report addresses the state of extant research and identifies critical research priorities pertaining to the following questions: 1) How do we assess spirituality? 2) How do we intervene on spirituality in palliative care? And 3) How do we train health professionals to address spirituality in palliative care? Findings from this report point to the need for screening and assessment tools that are rigorously developed, clinically relevant, and adapted to a diversity of clinical and cultural settings. Chaplaincy research is needed to inform professional spiritual care provision in a variety of settings, and outcomes assessed to ascertain impact on key patient, family, and clinical staff outcomes. Intervention research requires rigorous conceptualization and assessments. Intervention development must be attentive to clinical feasibility, incorporate perspectives and needs of patients, families, and clinicians, and be targeted to diverse populations with spiritual needs. Finally, spiritual care competencies for various clinical care team members should be refined. Reflecting those competencies, training curricula and evaluation tools should be developed, and the impact of education on patient, family, and clinician outcomes should be systematically assessed. Published by Elsevier Inc.
Vanquelef, Enguerran; Simon, Sabrina; Marquant, Gaelle; Garcia, Elodie; Klimerak, Geoffroy; Delepine, Jean Charles; Cieplak, Piotr; Dupradeau, François-Yves
2011-07-01
R.E.D. Server is a unique, open web service designed to derive non-polarizable RESP and ESP charges and to build force field libraries for new molecules/molecular fragments. It provides computational biologists with the means to rigorously derive molecular electrostatic potential-based charges embedded in force field libraries that are ready to be used in force field development, charge validation and molecular dynamics simulations. R.E.D. Server interfaces quantum mechanics programs, the RESP program and the latest version of the R.E.D. tools. A two-step approach has been developed. The first step consists of preparing P2N file(s) to rigorously define key elements such as atom names, topology and chemical equivalencing needed when building a force field library. Then, P2N files are used to derive RESP or ESP charges embedded in force field libraries in the Tripos mol2 format. In complex cases an entire set of force field libraries or a force field topology database is generated. Other features developed in R.E.D. Server include help services, a demonstration, tutorials, frequently asked questions, Jmol-based tools useful to construct PDB input files and parse R.E.D. Server outputs, as well as a graphical queuing system allowing any user to check the status of R.E.D. Server jobs.
Bayesian Reconstruction of Disease Outbreaks by Combining Epidemiologic and Genomic Data
Jombart, Thibaut; Cori, Anne; Didelot, Xavier; Cauchemez, Simon; Fraser, Christophe; Ferguson, Neil
2014-01-01
Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees (“who infected whom”) from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments. PMID:24465202
Selecting a measure for assessing secondary trauma in nurses.
Watts, Jenny; Robertson, Noelle
2015-11-01
To summarise the usefulness of available psychometric tools in assessing secondary trauma in nursing staff and examine their limitations, as well as their strengths, to enable researchers to select the most suitable measures. Secondary trauma is an extreme persistent reaction that can be experienced by nursing staff following exposure to a potentially life-threatening situation. This relatively new concept is increasingly used to explore staff distress, but is complicated by various definitions. In this growing and popular field, few rigorously tested measures are used. Therefore, it is timely to examine the measures available and their robustness. In March 2014 the following databases were used: BNI, CINAHL, EMBASE, PILOTS, Medline, PsycINFO and the Cochrane Library. A systematic search of nurse and health research databases was conducted from 1980 to 2014 using the terms nurs* AND PTSD OR Posttraumatic Stress Disorder OR secondary trauma OR secondary traumatic stress OR STS OR compassion fatigue. To strengthen confidence in research findings and make the most useful contribution to practice, researchers should use the most rigorous measures available. Of the assessment tools used, the only one subject to robust peer review is the Secondary Traumatic Stress Scale (STSS). The scale most frequently used to assess secondary traumatic stress is the Professional Quality of Life Scale (ProQOL); its lack of psychometric evaluation is a potential weakness. The STSS is the only validated tool reported in the peer-reviewed, published literature, and the authors suggest its greater application when secondary trauma is a suspected consequence of nursing work. Validated tools such as the HADS and GHQ-28 are more useful in assessing broader-based psychological morbidity, and researchers interested in assessing more than trauma responses are advised to use them.
Separating intrinsic from extrinsic fluctuations in dynamic biological systems.
Hilfinger, Andreas; Paulsson, Johan
2011-07-19
From molecules in cells to organisms in ecosystems, biological populations fluctuate due to the intrinsic randomness of individual events and the extrinsic influence of changing environments. The combined effect is often too complex for effective analysis, and many studies therefore make simplifying assumptions, for example ignoring either intrinsic or extrinsic effects to reduce the number of model assumptions. Here we mathematically demonstrate how two identical and independent reporters embedded in a shared fluctuating environment can be used to identify intrinsic and extrinsic noise terms, but also how these contributions are qualitatively and quantitatively different from what has been previously reported. Furthermore, we show for which classes of biological systems the noise contributions identified by dual-reporter methods correspond to the noise contributions predicted by correct stochastic models of either intrinsic or extrinsic mechanisms. We find that for broad classes of systems, the extrinsic noise from the dual-reporter method can be rigorously analyzed using models that ignore intrinsic stochasticity. In contrast, the intrinsic noise can be rigorously analyzed using models that ignore extrinsic stochasticity only under very special conditions that rarely hold in biology. Testing whether the conditions are met is rarely possible and the dual-reporter method may thus produce flawed conclusions about the properties of the system, particularly about the intrinsic noise. Our results contribute toward establishing a rigorous framework to analyze dynamically fluctuating biological systems.
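The dual-reporter decomposition discussed above can be illustrated with a short Python sketch: two Poisson reporters share a fluctuating extrinsic signal, and intrinsic noise is estimated from their difference while extrinsic noise comes from their covariance. This is the classic dual-reporter estimator; the paper above analyzes when interpreting these quantities as intrinsic and extrinsic noise is and is not rigorous. The distributions and parameters below are toy assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    extrinsic = rng.gamma(shape=20.0, scale=1.0, size=n)   # shared environment
    x = rng.poisson(extrinsic)                             # reporter 1
    y = rng.poisson(extrinsic)                             # reporter 2

    # squared intrinsic noise from the reporter difference
    eta_int2 = np.mean((x - y) ** 2) / (2.0 * x.mean() * y.mean())
    # squared extrinsic noise from the reporter covariance
    eta_ext2 = (np.mean(x * y) - x.mean() * y.mean()) / (x.mean() * y.mean())
    print(eta_int2, eta_ext2)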
A review of cultural adaptations of screening tools for autism spectrum disorders.
Soto, Sandra; Linas, Keri; Jacobstein, Diane; Biel, Matthew; Migdal, Talia; Anthony, Bruno J
2015-08-01
Screening children to determine risk for Autism Spectrum Disorders has become more common, although some question the advisability of such a strategy. The purpose of this systematic review is to identify autism screening tools that have been adapted for use in cultures different from that in which they were developed, evaluate the cultural adaptation process, report on the psychometric properties of the adapted instruments, and describe the implications for further research and clinical practice. A total of 21 articles met criteria for inclusion, reporting on the cultural adaptation of autism screening in 19 countries and in 10 languages. The cultural adaptation process was not always clearly outlined and often did not include the recommended guidelines. Cultural/linguistic modifications to the translated tools tended to increase with the rigor of the adaptation process. Differences between the psychometric properties of the original and adapted versions were common, indicating the need to obtain normative data on populations to increase the utility of the translated tool. © The Author(s) 2014.
Swanson, R Chad; Cattaneo, Adriano; Bradley, Elizabeth; Chunharas, Somsak; Atun, Rifat; Abbas, Kaja M; Katsaliaki, Korina; Mustafee, Navonil; Mason Meier, Benjamin; Best, Allan
2012-01-01
While reaching consensus on future plans to address current global health challenges is far from easy, there is broad agreement that reductionist approaches that suggest a limited set of targeted interventions to improve health around the world are inadequate. We argue that a comprehensive systems perspective should guide health practice, education, research and policy. We propose key ‘systems thinking’ tools and strategies that have the potential for transformational change in health systems. Three overarching themes span these tools and strategies: collaboration across disciplines, sectors and organizations; ongoing, iterative learning; and transformational leadership. The proposed tools and strategies in this paper can be applied, in varying degrees, to every organization within health systems, from families and communities to national ministries of health. While our categorization is necessarily incomplete, this initial effort will provide a valuable contribution to the health systems strengthening debate, as the need for a more systemic, rigorous perspective in health has never been greater. PMID:23014154
Hoben, Matthias; Estabrooks, Carole A.; Squires, Janet E.; Behrens, Johann
2016-01-01
We translated the Canadian residential long term care versions of the Alberta Context Tool (ACT) and the Conceptual Research Utilization (CRU) Scale into German, to study the association between organizational context factors and research utilization in German nursing homes. The rigorous translation process was based on best practice guidelines for tool translation, and we previously published methods and results of this process in two papers. Both instruments are self-report questionnaires used with care providers working in nursing homes. The aim of this study was to assess the factor structure, reliability, and measurement invariance (MI) between care provider groups responding to these instruments. In a stratified random sample of 38 nursing homes in one German region (Metropolregion Rhein-Neckar), we collected questionnaires from 273 care aides, 196 regulated nurses, 152 allied health providers, 6 quality improvement specialists, 129 clinical leaders, and 65 nursing students. The factor structure was assessed using confirmatory factor models. The first model included all 10 ACT concepts. We also decided a priori to run two separate models for the scale-based and the count-based ACT concepts as suggested by the instrument developers. The fourth model included the five CRU Scale items. Reliability scores were calculated based on the parameters of the best-fitting factor models. Multiple-group confirmatory factor models were used to assess MI between provider groups. Rather than the hypothesized ten-factor structure of the ACT, confirmatory factor models suggested 13 factors. The one-factor solution of the CRU Scale was confirmed. The reliability was acceptable (>0.7 in the entire sample and in all provider groups) for 10 of 13 ACT concepts, and high (0.90–0.96) for the CRU Scale. We could demonstrate partial strong MI for both ACT models and partial strict MI for the CRU Scale. Our results suggest that the scores of the German ACT and the CRU Scale for nursing homes are acceptably reliable and valid. However, as the ACT lacked strict MI, observed variables (or scale scores based on them) cannot be compared between provider groups. Rather, group comparisons should be based on latent variable models, which consider the different residual variances of each group. PMID:27656156
Comparing an annual and daily time-step model for predicting field-scale P loss
USDA-ARS?s Scientific Manuscript database
Several models with varying degrees of complexity are available for describing P movement through the landscape. The complexity of these models is dependent on the amount of data required by the model, the number of model parameters needed to be estimated, the theoretical rigor of the governing equa...
McCaig, Chris; Begon, Mike; Norman, Rachel; Shankland, Carron
2011-03-01
Changing scale, for example, the ability to move seamlessly from an individual-based model to a population-based model, is an important problem in many fields. In this paper, we introduce process algebra as a novel solution to this problem in the context of models of infectious disease spread. Process algebra allows us to describe a system in terms of the stochastic behaviour of individuals, and is a technique from computer science. We review the use of process algebra in biological systems, and the variety of quantitative and qualitative analysis techniques available. The analysis illustrated here solves the changing scale problem: from the individual behaviour we can rigorously derive equations to describe the mean behaviour of the system at the level of the population. The biological problem investigated is the transmission of infection, and how this relates to individual interactions.
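The individual-to-population scale change described above can be made concrete by comparing a stochastic individual-level transmission simulation with its mean-field population equations. The Python sketch below uses a plain Gillespie SIR simulation rather than process algebra, so it only illustrates the two scales, not the paper's derivation technique; all parameter values are illustrative.

    import numpy as np

    rng = np.random.default_rng(2)
    N, beta, gamma = 1000, 0.3, 0.1

    def gillespie_sir(s, i, t_end=100.0):
        # individual-level stochastic simulation of infection/recovery events
        t, traj = 0.0, []
        while t < t_end and i > 0:
            rate_inf, rate_rec = beta * s * i / N, gamma * i
            total = rate_inf + rate_rec
            t += rng.exponential(1.0 / total)
            if rng.random() < rate_inf / total:
                s, i = s - 1, i + 1
            else:
                i -= 1
            traj.append((t, s, i))
        return traj

    def mean_field(s, i, dt=0.01, t_end=100.0):
        # population-level ODE limit, integrated with forward Euler
        traj = []
        for step in range(int(t_end / dt)):
            ds = -beta * s * i / N
            di = beta * s * i / N - gamma * i
            s, i = s + ds * dt, i + di * dt
            traj.append((step * dt, s, i))
        return traj

    stochastic = gillespie_sir(N - 5, 5)
    deterministic = mean_field(N - 5.0, 5.0)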
Hanning, Brian; Predl, Nicolle
2015-09-01
Traditional overnight rehabilitation payment models in the private sector are not based on a rigorous classification system and vary greatly between contracts with no consideration of patient complexity. The payment rates are not based on relative cost and the length-of-stay (LOS) point at which a reduced rate applies (step downs) varies markedly. The rehabilitation Australian National Sub-Acute and Non-Acute Patient (AN-SNAP) model (RAM), which has been in place for over 2 years in some private hospitals, bases payment on a rigorous classification system, relative cost and industry LOS. RAM is in the process of being rolled out more widely. This paper compares and contrasts RAM with traditional overnight rehabilitation payment models. It considers the advantages of RAM for hospitals and Australian Health Service Alliance. It also considers payment model changes in the context of maintaining industry consistency with Electronic Claims Lodgement and Information Processing System Environment (ECLIPSE) and health reform generally.
Random Matrix Theory and the Anderson Model
NASA Astrophysics Data System (ADS)
Bellissard, Jean
2004-08-01
This paper is devoted to a discussion of possible strategies to prove rigorously the existence of a metal-insulator Anderson transition for the Anderson model in dimension d≥3. The possible criteria used to define such a transition are presented. It is argued that at low disorder the lowest order in perturbation theory is described by a random matrix model. Various simplified versions for which rigorous results have been obtained in the past are discussed. These include a free probability approach, the Wegner n-orbital model and a class of models proposed by Disertori, Pinson, and Spencer, Comm. Math. Phys. 232:83-124 (2002). Finally, a recent work by Magnen, Rivasseau, and the author, Markov Process and Related Fields 9:261-278 (2003) is summarized: it gives a toy model describing the lowest order approximation of the Anderson model and it is proved that, for d=2, its density of states is given by the semicircle distribution. A short discussion of its extension to d≥3 follows.
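The semicircle distribution mentioned above is easy to check numerically: the eigenvalues of a large random symmetric (GOE-like) matrix, suitably scaled, follow the Wigner semicircle density on [-2, 2]. The short Python sketch below is a generic numerical illustration, not a reproduction of the paper's toy model.

    import numpy as np

    rng = np.random.default_rng(3)
    N = 2000
    A = rng.normal(size=(N, N))
    H = (A + A.T) / np.sqrt(2.0 * N)      # symmetrize; entries have variance 1/N
    eigs = np.linalg.eigvalsh(H)

    hist, edges = np.histogram(eigs, bins=60, range=(-2.2, 2.2), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Wigner semicircle density, for comparison with the histogram
    semicircle = np.where(np.abs(centers) <= 2.0,
                          np.sqrt(np.maximum(4.0 - centers**2, 0.0)) / (2.0 * np.pi),
                          0.0)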
NASA Astrophysics Data System (ADS)
Druhan, Jennifer L.; Steefel, Carl I.; Conrad, Mark E.; DePaolo, Donald J.
2014-01-01
This study demonstrates a mechanistic incorporation of the stable isotopes of sulfur within the CrunchFlow reactive transport code to model the range of microbially-mediated redox processes affecting kinetic isotope fractionation. Previous numerical models of microbially mediated sulfate reduction using Monod-type rate expressions have lacked rigorous coupling of individual sulfur isotopologue rates, with the result that they cannot accurately simulate sulfur isotope fractionation over a wide range of substrate concentrations using a constant fractionation factor. Here, we derive a modified version of the dual-Monod or Michaelis-Menten formulation (Maggi and Riley, 2009, 2010) that successfully captures the behavior of the 32S and 34S isotopes over a broad range from high sulfate and organic carbon availability to substrate limitation using a constant fractionation factor. The new model developments are used to simulate a large-scale column study designed to replicate field scale conditions of an organic carbon (acetate) amended biostimulation experiment at the Old Rifle site in western Colorado. Results demonstrate an initial period of iron reduction that transitions to sulfate reduction, in agreement with field-scale behavior observed at the Old Rifle site. At the height of sulfate reduction, effluent sulfate concentrations decreased to 0.5 mM from an influent value of 8.8 mM over the 100 cm flow path, and thus were enriched in sulfate δ34S from 6.3‰ to 39.5‰. The reactive transport model accurately reproduced the measured enrichment in δ34S of both the reactant (sulfate) and product (sulfide) species of the reduction reaction using a single fractionation factor of 0.987 obtained independently from field-scale measurements. The model also accurately simulated the accumulation and δ34S signature of solid phase elemental sulfur over the duration of the experiment, providing a new tool to predict the isotopic signatures associated with reduced mineral pools. To our knowledge, this is the first rigorous treatment of sulfur isotope fractionation subject to Monod kinetics in a mechanistic reactive transport model that considers the isotopic spatial distribution of both dissolved and solid phase sulfur species during microbially-mediated sulfate reduction. Specifically, we (1) describe the design and results of the large-scale column experiment; (2) demonstrate incorporation of the stable isotopes of sulfur in a dual-Monod kinetic expression such that fractionation is accurately modeled at both high and low substrate availability; (3) verify accurate simulation of the chemical and isotopic gradients in reactant and product sulfur species using a kinetic fractionation factor obtained from field-scale analysis (Druhan et al., 2012); and (4) utilize the model to predict the final δ34S values of secondary sulfur minerals accumulated in the sediment over the course of the experiment. The development of rigorous isotope-specific Monod-type rate expressions is presented here in application to sulfur cycling during amended biostimulation, but is readily applicable to a variety of stable isotope systems associated with both steady state and transient biogenic redox environments. In other words, the association of this model with a uranium remediation experiment does not limit its applicability to more general redox systems.
Furthermore, the ability of this model treatment to predict the isotopic composition of secondary minerals accumulated as a result of fractionating processes (item 4) offers an important means of interpreting solid phase isotopic compositions and tracking long-term stability of precipitates.
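A stripped-down illustration of isotopologue-specific dual-Monod kinetics with a constant fractionation factor is sketched below in Python: the 34S-sulfate rate is scaled by alpha relative to 32S-sulfate, and the remaining sulfate becomes progressively enriched in 34S. The rate constants, stoichiometry, and isotope split are illustrative assumptions, not the CrunchFlow formulation or the calibrated Rifle parameters.

    # zero-dimensional toy, forward Euler in time
    alpha = 0.987                        # kinetic fractionation factor (34S/32S)
    k_max, K_s, K_a = 1e-9, 1e-4, 1e-4   # max rate (mol/L/s), half-saturations (M)
    dt, n_steps = 3600.0, 2400           # one-hour steps, 100 days

    s32, s34 = 8.8e-3 * 0.9578, 8.8e-3 * 0.0422   # approximate 32S/34S split of sulfate
    acetate = 5.0e-3
    for _ in range(n_steps):
        s_tot = s32 + s34
        monod = (s_tot / (K_s + s_tot)) * (acetate / (K_a + acetate))
        r32 = k_max * monod * (s32 / s_tot)           # 32S-sulfate reduction rate
        r34 = alpha * k_max * monod * (s34 / s_tot)   # 34S rate, scaled by alpha
        s32 -= r32 * dt
        s34 -= r34 * dt
        acetate -= (r32 + r34) * dt                   # simplified 1:1 stoichiometry

    # delta-34S of the remaining sulfate, per mil relative to VCDT
    delta34S = ((s34 / s32) / 0.0441626 - 1.0) * 1000.0
    print(delta34S)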
Decision support frameworks and tools for conservation
Schwartz, Mark W.; Cook, Carly N.; Pressey, Robert L.; Pullin, Andrew S.; Runge, Michael C.; Salafsky, Nick; Sutherland, William J.; Williamson, Matthew A.
2018-01-01
The practice of conservation occurs within complex socioecological systems fraught with challenges that require transparent, defensible, and often socially engaged project planning and management. Planning and decision support frameworks are designed to help conservation practitioners increase planning rigor, project accountability, stakeholder participation, transparency in decisions, and learning. We describe and contrast five common frameworks within the context of six fundamental questions (why, who, what, where, when, how) at each of three planning stages of adaptive management (project scoping, operational planning, learning). We demonstrate that decision support frameworks provide varied and extensive tools for conservation planning and management. However, using any framework in isolation risks diminishing potential benefits since no one framework covers the full spectrum of potential conservation planning and decision challenges. We describe two case studies that have effectively deployed tools from across conservation frameworks to improve conservation actions and outcomes. Attention to the critical questions for conservation project planning should allow practitioners to operate within any framework and adapt tools to suit their specific management context. We call on conservation researchers and practitioners to regularly use decision support tools as standard practice for framing both practice and research.
Evaluation of tools used to measure calcium and/or dairy consumption in children and adolescents.
Magarey, Anthea; Yaxley, Alison; Markow, Kylie; Baulderstone, Lauren; Miller, Michelle
2014-08-01
To identify and critique tools that assess Ca and/or dairy intake in children to ascertain the most accurate and reliable tools available. A systematic review of the literature was conducted using defined inclusion and exclusion criteria. Articles were included on the basis that they reported on a tool measuring Ca and/or dairy intake in children in Western countries and reported on originally developed tools or tested the validity or reliability of existing tools. Defined criteria for reporting reliability and validity properties were applied. Studies in Western countries. Children. Eighteen papers reporting on two tools that assessed dairy intake, ten that assessed Ca intake and five that assessed both dairy and Ca were identified. An examination of tool testing revealed high reliance on lower-order tests such as correlation and failure to differentiate between statistical and clinically meaningful significance. Only half of the tools were tested for reliability and results indicated that only one Ca tool and one dairy tool were reliable. Validation studies showed acceptable levels of agreement (<100 mg difference) and/or sensitivity (62-83 %) and specificity (55-77 %) in three Ca tools. With reference to the testing methodology and results, no tools were considered both valid and reliable for the assessment of dairy intake and only one tool proved valid and reliable for the assessment of Ca intake. These results clearly indicate the need for development and rigorous testing of tools to assess Ca and/or dairy intake in children and adolescents.
Parent Management Training-Oregon Model: Adapting Intervention with Rigorous Research.
Forgatch, Marion S; Kjøbli, John
2016-09-01
Parent Management Training-Oregon Model (PMTO®) is a set of theory-based parenting programs with status as evidence-based treatments. PMTO has been rigorously tested in efficacy and effectiveness trials in different contexts, cultures, and formats. Parents, the presumed agents of change, learn core parenting practices, specifically skill encouragement, limit setting, monitoring/supervision, interpersonal problem solving, and positive involvement. The intervention effectively prevents and ameliorates children's behavior problems by replacing coercive interactions with positive parenting practices. Delivery format includes sessions with individual families in agencies or families' homes, parent groups, and web-based and telehealth communication. Mediational models have tested parenting practices as mechanisms of change for children's behavior and found support for the theory underlying PMTO programs. Moderating effects include children's age, maternal depression, and social disadvantage. The Norwegian PMTO implementation is presented as an example of how PMTO has been tailored to reach diverse populations as delivered by multiple systems of care throughout the nation. An implementation and research center in Oslo provides infrastructure and promotes collaboration between practitioners and researchers to conduct rigorous intervention research. Although evidence-based and tested within a wide array of contexts and populations, PMTO must continue to adapt to an ever-changing world. © 2016 Family Process Institute.
PACE Continuous Innovation Indicators—a novel tool to measure progress in cancer treatments
Paddock, Silvia; Brum, Lauren; Sorrow, Kathleen; Thomas, Samuel; Spence, Susan; Maulbecker-Armstrong, Catharina; Goodman, Clifford; Peake, Michael; McVie, Gordon; Geipel, Gary; Li, Rose
2015-01-01
Concerns about rising health care costs and the often incremental nature of improvements in health outcomes continue to fuel intense debates about ‘progress’ and ‘value’ in cancer research. In times of tightening fiscal constraints, it is increasingly important for patients and their representatives to define what constitutes ’value’ to them. It is clear that diverse stakeholders have different priorities. Harmonisation of values may be neither possible nor desirable. Stakeholders lack tools to visualise or otherwise express these differences and to track progress in cancer treatments based on variable sets of values. The Patient Access to Cancer care Excellence (PACE) Continuous Innovation Indicators are novel, scientifically rigorous progress trackers that employ a three-step process to quantify progress in cancer treatments: 1) mine the literature to determine the strength of the evidence supporting each treatment; 2) allow users to weight the analysis according to their priorities and values; and 3) calculate Evidence Scores (E-Scores), a novel measure to track progress, based on the strength of the evidence weighted by the assigned value. We herein introduce a novel, flexible value model, show how the values from the model can be used to weight the evidence from the scientific literature to obtain E-Scores, and illustrate how assigning different values to new treatments influences the E-Scores. The Indicators allow users to learn how differing values lead to differing assessments of progress in cancer research and to check whether current incentives for innovation are aligned with their value model. By comparing E-Scores generated by this tool, users are able to visualise the relative pace of innovation across areas of cancer research and how stepwise innovation can contribute to substantial progress against cancer over time. Learning from experience and mapping current unmet needs will help to support a broad audience of stakeholders in their efforts to accelerate and maximise progress against cancer. PMID:25624879
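The three-step logic described above (evidence strength, user-assigned values, Evidence Score) can be sketched as a tiny weighted sum. The categories, weights, and numbers in the Python snippet below are invented purely to show the structure; they are not the PACE scoring rules or data.

    # toy E-Score: evidence strength per outcome category, weighted by values
    evidence = {
        "overall_survival": 3.0,
        "progression_free_survival": 2.0,
        "quality_of_life": 1.0,
    }

    def e_score(evidence_by_outcome, value_weights):
        # evidence strength weighted by the user's value for each outcome
        return sum(strength * value_weights.get(outcome, 0.0)
                   for outcome, strength in evidence_by_outcome.items())

    patient_values = {"overall_survival": 1.0, "quality_of_life": 0.8,
                      "progression_free_survival": 0.3}
    payer_values = {"overall_survival": 1.0, "progression_free_survival": 0.6,
                    "quality_of_life": 0.4}

    # different value models yield different E-Scores for the same evidence
    print(e_score(evidence, patient_values), e_score(evidence, payer_values))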
NASA Astrophysics Data System (ADS)
Rooney-varga, J. N.; Sterman, J.; Fracassi, E. P.; Franck, T.; Kapmeier, F.; Kurker, V.; Jones, A.; Rath, K.
2017-12-01
The strong scientific consensus about the reality and risks of anthropogenic climate change stands in stark contrast to widespread confusion and complacency among the public. Many efforts to close that gap, grounded in the information deficit model of risk communication, provide scientific information on climate change through reports and presentations. However, research shows that showing people research does not work: the gap between scientific and public understanding of climate change remains wide. Tools that are rigorously grounded in the science and motivate action on climate change are urgently needed. Here we assess the impact of one such tool, an interactive, role-play simulation, World Climate. Participants take the roles of delegates to the UN climate negotiations and are challenged to create an agreement limiting warming to no more than 2°C. The C-ROADS climate simulation model then provides participants with immediate feedback about the expected impacts of their decisions. Participants use C-ROADS to explore the climate system and use the results to refine their negotiating positions, learning about climate change while experiencing the social dynamics of negotiations and decision-making. Pre- and post-survey results from 21 sessions in eight nations showed significant gains in participants' climate change knowledge, affective engagement, intent to take action, and desire to learn. Contrary to the deficit model, gains in participants' desire to learn more and intention to act were associated with gains in affective engagement, particularly feelings of urgency and hope, but not climate knowledge. Gains were just as strong among participants who oppose government regulation, suggesting the simulation's potential to reach across political divides. Results indicate that simulations like World Climate offer a climate change communication tool that enables people to learn and feel for themselves, which together have the potential to motivate action informed by science.
González-Ferrer, Arturo; Valcárcel, María Ángel
2018-04-01
The Cohesion and Quality Act of the National Health System promotes the use of new technologies to enable health professionals to put scientific evidence into practice. Technological tools known as computer-interpretable guidelines can help achieve this goal from an innovation perspective. They can be adopted through an iterative process and have great initial potential as tools for education, patient quality and safety, and decision making; optionally, once they are rigorously validated, they can be integrated with the electronic medical record. This article presents updates on these tools, reviews international projects and personal experiences in which they have demonstrated their value, and highlights the advantages, risks, and limitations they present from a clinical point of view. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
Kreps, Gary L
2002-01-01
The modern health care system is being irrevocably changed by the development and introduction of new health information technologies (such as health information systems, decision-support tools, specialized websites, and innovative communication devices). While many of these new technologies hold the promise of revolutionizing the modern health system and facilitating improvements in health care delivery, health education, and health promotion, it is imperative to carefully examine and assess the effectiveness of these technological tools to determine which products are most useful to apply in specific contexts, as well as to learn how to best utilize these products and processes. Without good evaluative information about new technologies, we are unlikely to reap the greatest benefits from these powerful new tools. This chapter examines the demand for evaluating health information technologies and suggests several strategies for conducting rigorous and relevant evaluation research.
Saliva as a diagnostic tool for oral and systemic diseases
Javaid, Mohammad A.; Ahmed, Ahad S.; Durand, Robert; Tran, Simon D.
2015-01-01
Early disease detection is not only vital to reduce disease severity and prevent complications, but also critical to increase success rate of therapy. Saliva has been studied extensively as a potential diagnostic tool over the last decade due to its ease and non-invasive accessibility along with its abundance of biomarkers, such as genetic material and proteins. This review will update the clinician on recent advances in salivary biomarkers to diagnose autoimmune diseases (Sjogren's syndrome, cystic fibrosis), cardiovascular diseases, diabetes, HIV, oral cancer, caries and periodontal diseases. Considering their accuracy, efficacy, ease of use and cost effectiveness, salivary diagnostic tests will be available in dental offices. It is expected that the advent of sensitive and specific salivary diagnostic tools and the establishment of defined guidelines and results following rigorous testing will allow salivary diagnostics to be used as chair-side tests for several oral and systemic diseases in the near future. PMID:26937373
Mishra, Bud; Daruwala, Raoul-Sam; Zhou, Yi; Ugel, Nadia; Policriti, Alberto; Antoniotti, Marco; Paxia, Salvatore; Rejali, Marc; Rudra, Archisman; Cherepinsky, Vera; Silver, Naomi; Casey, William; Piazza, Carla; Simeoni, Marta; Barbano, Paolo; Spivak, Marina; Feng, Jiawu; Gill, Ofer; Venkatesh, Mysore; Cheng, Fang; Sun, Bing; Ioniata, Iuliana; Anantharaman, Thomas; Hubbard, E Jane Albert; Pnueli, Amir; Harel, David; Chandru, Vijay; Hariharan, Ramesh; Wigler, Michael; Park, Frank; Lin, Shih-Chieh; Lazebnik, Yuri; Winkler, Franz; Cantor, Charles R; Carbone, Alessandra; Gromov, Mikhael
2003-01-01
We collaborate in a research program aimed at creating a rigorous framework, experimental infrastructure, and computational environment for understanding, experimenting with, manipulating, and modifying a diverse set of fundamental biological processes at multiple scales and spatio-temporal modes. The novelty of our research is based on an approach that (i) requires coevolution of experimental science and theoretical techniques and (ii) exploits a certain universality in biology guided by a parsimonious model of evolutionary mechanisms operating at the genomic level and manifesting at the proteomic, transcriptomic, phylogenic, and other higher levels. Our current program in "systems biology" endeavors to marry large-scale biological experiments with the tools to ponder and reason about large, complex, and subtle natural systems. To achieve this ambitious goal, ideas and concepts are combined from many different fields: biological experimentation, applied mathematical modeling, computational reasoning schemes, and large-scale numerical and symbolic simulations. From a biological viewpoint, the basic issues are many: (i) understanding common and shared structural motifs among biological processes; (ii) modeling biological noise due to interactions among a small number of key molecules or loss of synchrony; (iii) explaining the robustness of these systems in spite of such noise; and (iv) cataloging multistatic behavior and adaptation exhibited by many biological processes.
Investigations into phase effects from diffracted Gaussian beams for high-precision interferometry
NASA Astrophysics Data System (ADS)
Lodhia, Deepali
Gravitational wave detectors are a new class of observatories aiming to detect gravitational waves from cosmic sources. All-reflective interferometer configurations have been proposed for future detectors, replacing transmissive optics with diffractive elements, thereby reducing thermal issues associated with power absorption. However, diffraction gratings introduce additional phase noise, creating more stringent conditions for alignment stability, and further investigations are required into all-reflective interferometers. A suitable mathematical framework using Gaussian modes is required for analysing the alignment stability using diffraction gratings. Such a framework was created, whereby small beam displacements are modelled using a modal technique. It was confirmed that the original modal-based model does not contain the phase changes associated with grating displacements. Experimental tests verified that the phase of a diffracted Gaussian beam is independent of the beam shape. Phase effects were further examined using a rigorous time-domain simulation tool. These findings show that the perceived phase difference is based on an intrinsic change of coordinate system within the modal-based model, and that the extra phase can be added manually to the modal expansion. This thesis provides a well-tested and detailed mathematical framework that can be used to develop simulation codes to model more complex layouts of all-reflective interferometers.
Cooperative interactions in dense thermal Rb vapour confined in nm-scale cells
NASA Astrophysics Data System (ADS)
Keaveney, James
NASA Technical Reports Server (NTRS)
Cornford, Steven L.; Feather, Martin S.
2016-01-01
This report explores the current state of the art of Safety and Mission Assurance (S&MA) in projects that have shifted towards Model Based Systems Engineering (MBSE). Its goal is to provide insight into how NASA's Office of Safety and Mission Assurance (OSMA) should respond to this shift. In MBSE, systems engineering information is organized and represented in models: rigorous computer-based representations, which collectively make many activities easier to perform, less error prone, and scalable. S&MA practices must shift accordingly. The "Objective Structure Hierarchies" recently developed by OSMA provide the framework for understanding this shift. Although the objectives themselves will remain constant, S&MA practices (activities, processes, tools) to achieve them are subject to change. This report presents insights derived from literature studies and interviews. The literature studies gleaned assurance implications from reports of space-related applications of MBSE. The interviews with knowledgeable S&MA and MBSE personnel discovered concerns and ideas for how assurance may adapt. Preliminary findings and observations are presented on the state of practice of S&MA with respect to MBSE, how it is already changing, and how it is likely to change further. Finally, recommendations are provided on how to foster the evolution of S&MA to best fit with MBSE.
NASA Astrophysics Data System (ADS)
Hidayat, D.; Nurlaelah, E.; Dahlan, J. A.
2017-09-01
Mathematical creative thinking and critical thinking are two abilities that need to be developed in the learning of mathematics. Therefore, efforts need to be made to design learning that is capable of developing both capabilities. The purpose of this research is to examine the mathematical creative and critical thinking abilities of students taught with a rigorous mathematical thinking (RMT) approach and of students taught with an expository approach. This research was a quasi-experiment with a control group pretest-posttest design. The population comprised all 11th-grade students in one senior high school in Bandung. The results showed that the achievement of mathematical creative and critical thinking abilities was better among students who received RMT than among students who received the expository approach. The use of psychological tools and mediation, with the criteria of intentionality, reciprocity, and mediation of meaning, in RMT helps students develop the conditions for critical and creative processes. This achievement contributes to the development of integrated learning designs for students' critical and creative thinking processes.
Pediatric Issues in Sports Concussions
Giza, Christopher C.
2014-01-01
Purpose of Review: Sports-related concussions are receiving increasing attention in both the lay press and medical literature. While most media attention has been on high-profile collegiate or professional athletes, the vast majority of individuals participating in contact and collision sports are adolescents and children. This review provides a practical approach toward youth sports-related concussion with a foundation in the recent guidelines, but including specific considerations when applying these management principles to children and adolescents. Recent Findings: Objective measurement of early signs and symptoms is challenging in younger patients, and many commonly used assessment tools await rigorous validation for younger patients. Excellent evidence-based guidelines exist for CT evaluation of mild traumatic brain injury presenting to the emergency department. Evidence suggests that recovery from sports-related concussion takes longer in high school athletes compared with collegiate or professionals; however, rigorous studies below high school age are still lacking. Summary: Proper care for concussion in youth requires a delicate balance of clinical skills, age-appropriate assessment, and individualized management to achieve optimal outcomes. PMID:25470161
Multi-template polymerase chain reaction.
Kalle, Elena; Kubista, Mikael; Rensing, Christopher
2014-12-01
PCR is a formidable and potent technology that serves as an indispensable tool in a wide range of biological disciplines. However, due to the ease of use and often lack of rigorous standards many PCR applications can lead to highly variable, inaccurate, and ultimately meaningless results. Thus, rigorous method validation must precede its broad adoption to any new application. Multi-template samples possess particular features, which make their PCR analysis prone to artifacts and biases: multiple homologous templates present in copy numbers that vary within several orders of magnitude. Such conditions are a breeding ground for chimeras and heteroduplexes. Differences in template amplification efficiencies and template competition for reaction compounds undermine correct preservation of the original template ratio. In addition, the presence of inhibitors aggravates all of the above-mentioned problems. Inhibitors might also have ambivalent effects on the different templates within the same sample. Yet, no standard approaches exist for monitoring inhibitory effects in multitemplate PCR, which is crucial for establishing compatibility between samples.
Using GIS to generate spatially balanced random survey designs for natural resource applications.
Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B
2007-07-01
Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. Development of long-term monitoring designs demands survey designs that achieve statistical rigor and are efficient but remain flexible to inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers with a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design.
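To give a rough feel for the quadrant-recursive idea behind spatially balanced designs, the sketch below assigns each cell of a small raster a hierarchically randomized quadrant address, reverses the digits, and orders cells by that key; this is a simplified illustration of the concept, not the published RRQRR algorithm, and the grid size, seed, and sample size are arbitrary.

```python
import random

def quadrant_address(ix, iy, levels):
    """Quadrant-recursive address of cell (ix, iy) in a 2**levels x 2**levels raster."""
    digits = []
    for level in reversed(range(levels)):          # most significant quadrant first
        qx, qy = (ix >> level) & 1, (iy >> level) & 1
        digits.append(qy * 2 + qx)                 # quadrant index 0..3
    return digits

def spatially_balanced_order(levels, seed=0):
    """Order raster cells by randomized, reversed quadrant addresses (simplified RRQRR-like idea)."""
    rng = random.Random(seed)
    perms = [rng.sample(range(4), 4) for _ in range(levels)]   # random quadrant relabeling per level
    n = 2 ** levels
    cells = []
    for ix in range(n):
        for iy in range(n):
            digits = quadrant_address(ix, iy, levels)
            randomized = [perms[k][d] for k, d in enumerate(digits)]
            key = tuple(reversed(randomized))      # reversal spreads early picks across the domain
            cells.append((key, (ix, iy)))
    return [cell for _, cell in sorted(cells)]

# Any prefix of this ordering is a spatially well-spread sample of raster cells.
order = spatially_balanced_order(levels=4, seed=42)
print(order[:10])
```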
Considering Research Outcomes as Essential Tools for Medical Education Decision Making.
Miller, Karen Hughes; Miller, Bonnie M; Karani, Reena
2015-11-01
As medical educators face the challenge of incorporating new content, learning methods, and assessment techniques into the curriculum, the need for rigorous medical education research to guide efficient and effective instructional planning increases. When done properly, well-designed education research can provide guidance for complex education decision making. In this Commentary, the authors consider the 2015 Research in Medical Education (RIME) research and review articles in terms of the critical areas in teaching and learning that they address. The broad categories include (1) assessment (the largest collection of RIME articles, including both feedback from learners and instructors and the reliability of learner assessment), (2) the institution's impact on the learning environment, (3) what can be learned from program evaluation, and (4) emerging issues in faculty development. While the articles in this issue are broad in scope and potential impact, the RIME committee noted few studies of sufficient rigor focusing on areas of diversity and diverse learners. Although challenging to investigate, the authors encourage continuing innovation in research focused on these important areas.
Determinants of the Rigor of State Protection Policies for Persons With Dementia in Assisted Living.
Nattinger, Matthew C; Kaskie, Brian
2017-01-01
Continued growth in the number of individuals with dementia residing in assisted living (AL) facilities raises concerns about their safety and protection. However, unlike federally regulated nursing facilities, AL facilities are state-regulated and there is a high degree of variation among policies designed to protect persons with dementia. Despite the important role these protection policies have in shaping the quality of life of persons with dementia residing in AL facilities, little is known about their formation. In this research, we examined the adoption of AL protection policies pertaining to staffing, the physical environment, and the use of chemical restraints. For each protection policy type, we modeled policy rigor using an innovative point-in-time approach, incorporating variables associated with state contextual, institutional, political, and external factors. We found that the rate of state AL protection policy adoptions remained steady over the study period, with staffing policies becoming less rigorous over time. Variables reflecting institutional policy making, including legislative professionalism and bureaucratic oversight, were associated with the rigor of state AL dementia protection policies. As we continue to evaluate the mechanisms contributing to the rigor of AL protection policies, it seems that organized advocacy efforts might expand their role in educating state policy makers about the importance of protecting persons with dementia residing in AL facilities and moving to advance appropriate policies.
Qian, Ma; Ma, Jie
2009-06-07
Fletcher's spherical substrate model [J. Chem. Phys. 29, 572 (1958)] is a basic model for understanding the heterogeneous nucleation phenomena in nature. However, a rigorous thermodynamic formulation of the model has been missing due to the significant complexities involved. This has not only left the classical model deficient but also likely obscured its other important features, which would otherwise have helped to better understand and control heterogeneous nucleation on spherical substrates. This work presents a rigorous thermodynamic formulation of Fletcher's model using a novel analytical approach and discusses the new perspectives derived. In particular, it is shown that the use of an intermediate variable, a selected geometrical angle or pseudocontact angle between the embryo and spherical substrate, revealed extraordinary similarities between the first derivatives of the free energy change with respect to embryo radius for nucleation on spherical and flat substrates. Enlightened by the discovery, it was found that there exists a local maximum in the difference between the equivalent contact angles for nucleation on spherical and flat substrates due to the existence of a local maximum in the difference between the shape factors for nucleation on spherical and flat substrate surfaces. This helps to understand the complexity of the heterogeneous nucleation phenomena in a practical system. Also, it was found that the unfavorable size effect occurs primarily when R < 5r* (R: radius of substrate and r*: critical embryo radius) and diminishes rapidly with increasing value of R/r* beyond R/r* = 5. This finding provides a baseline for controlling the size effects in heterogeneous nucleation.
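For readers who want to reproduce the geometry behind these size effects, the sketch below evaluates the commonly cited textbook form of Fletcher's shape factor f(m, x), with m = cos(theta) and x = R/r*; it is background to the abstract rather than the new formulation derived in the paper, and numpy is assumed to be available.

```python
import numpy as np

def fletcher_shape_factor(m, x):
    """Fletcher (1958) shape factor f(m, x) for heterogeneous nucleation on a sphere.

    m = cos(theta), the contact-angle cosine; x = R / r*, substrate radius over
    critical embryo radius. f multiplies the homogeneous nucleation barrier.
    Standard textbook form; check against the original paper before relying on it.
    """
    g = np.sqrt(1.0 + x**2 - 2.0 * m * x)
    return 0.5 * (
        1.0
        + ((1.0 - m * x) / g) ** 3
        + x**3 * (2.0 - 3.0 * (x - m) / g + ((x - m) / g) ** 3)
        + 3.0 * m * x**2 * ((x - m) / g - 1.0)
    )

m = np.cos(np.radians(60.0))          # example contact angle of 60 degrees
for x in (0.5, 1.0, 5.0, 50.0):       # R/r* ratios spanning the R < 5r* regime
    print(f"R/r* = {x:5.1f}  ->  f = {fletcher_shape_factor(m, x):.4f}")

# The large-x values should approach the flat-substrate factor (2 - 3m + m**3) / 4.
print("flat-substrate limit:", (2 - 3 * m + m**3) / 4)
```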
A Systematic Review of Physician Leadership and Emotional Intelligence
Mintz, Laura Janine; Stoller, James K.
2014-01-01
Objective This review evaluates the current understanding of emotional intelligence (EI) and physician leadership, exploring key themes and areas for future research. Literature Search We searched the literature using PubMed, Google Scholar, and Business Source Complete for articles published between 1990 and 2012. Search terms included physician and leadership, emotional intelligence, organizational behavior, and organizational development. All abstracts were reviewed. Full articles were evaluated if they addressed the connection between EI and physician leadership. Articles were included if they focused on physicians or physicians-in-training and discussed interventions or recommendations. Appraisal and Synthesis We assessed articles for conceptual rigor, study design, and measurement quality. A thematic analysis categorized the main themes and findings of the articles. Results The search produced 3713 abstracts, of which 437 full articles were read and 144 were included in this review. Three themes were identified: (1) EI is broadly endorsed as a leadership development strategy across providers and settings; (2) models of EI and leadership development practices vary widely; and (3) EI is considered relevant throughout medical education and practice. Limitations of the literature were that most reports were expert opinion or observational and studies used several different tools for measuring EI. Conclusions EI is widely endorsed as a component of curricula for developing physician leaders. Research comparing practice models and measurement tools will critically advance understanding about how to develop and nurture EI to enhance leadership skills in physicians throughout their careers. PMID:24701306
Chapman, Benjamin P.; Weiss, Alexander; Duberstein, Paul
2016-01-01
Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in “big data” problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different than maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how three common SLT algorithms–Supervised Principal Components, Regularization, and Boosting—can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach—or perhaps because of them–SLT methods may hold value as a statistically rigorous approach to exploratory regression. PMID:27454257
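To make the expected-prediction-error logic above concrete, here is a minimal sketch of one of the three approaches mentioned (regularization with cross-validation) applied to criterion-keyed item selection on a synthetic item pool; scikit-learn is assumed to be installed, the data are simulated, and nothing here reproduces the personality-cohort analysis in the article.

```python
# Minimal sketch: criterion-keyed scale construction via penalized regression,
# with the penalty strength chosen by cross-validated (out-of-sample) prediction error.
# Synthetic data only; assumes scikit-learn is available.
import numpy as np
from sklearn.linear_model import LogisticRegressionCV

rng = np.random.default_rng(0)
n_people, n_items = 1000, 200
items = rng.integers(1, 6, size=(n_people, n_items)).astype(float)   # Likert 1-5 item pool

# Simulate a binary outcome driven by a small subset of items (unknown to the analyst).
true_items = rng.choice(n_items, size=10, replace=False)
logit = (items[:, true_items] - 3).sum(axis=1) * 0.4
outcome = rng.random(n_people) < 1 / (1 + np.exp(-logit))

# L1-penalized logistic regression; cross-validation picks the penalty that
# minimizes estimated out-of-sample log loss rather than the in-sample likelihood.
model = LogisticRegressionCV(
    Cs=10, cv=5, penalty="l1", solver="saga", scoring="neg_log_loss", max_iter=5000
).fit(items, outcome)

selected = np.flatnonzero(model.coef_[0])          # items retained in the keyed scale
print(f"{selected.size} items selected; "
      f"{np.isin(true_items, selected).sum()} of 10 true items recovered")
```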
Magalhaes, Sandra; Banwell, Brenda; Bar-Or, Amit; Fortier, Isabel; Hanwell, Heather E; Lim, Ming; Matt, Georg E; Neuteboom, Rinze F; O'Riordan, David L; Schneider, Paul K; Pugliatti, Maura; Shatenstein, Bryna; Tansey, Catherine M; Wassmer, Evangeline; Wolfson, Christina
2018-06-01
While studying the etiology of multiple sclerosis (MS) in children has several methodological advantages over studying etiology in adults, studies are limited by small sample sizes. Using a rigorous methodological process, we developed the Pediatric MS Tool-Kit, a measurement framework that includes a minimal set of core variables to assess etiological risk factors. We solicited input from the International Pediatric MS Study Group to select three risk factors: environmental tobacco smoke (ETS) exposure, sun exposure, and vitamin D intake. To develop the Tool-Kit, we used a Delphi study involving a working group of epidemiologists, neurologists, and content experts from North America and Europe. The Tool-Kit includes six core variables to measure ETS, six to measure sun exposure, and six to measure vitamin D intake. The Tool-Kit can be accessed online (www.maelstrom-research.org/mica/network/tool-kit). The goals of the Tool-Kit are to enhance exposure measurement in newly designed pediatric MS studies and comparability of results across studies, and in the longer term to facilitate harmonization of studies, a methodological approach that can be used to circumvent issues of small sample sizes. We believe the Tool-Kit will prove to be a valuable resource to guide pediatric MS researchers in developing study-specific questionnaires.
Evaluation of tools used to measure calcium and/or dairy consumption in adults.
Magarey, Anthea; Baulderstone, Lauren; Yaxley, Alison; Markow, Kylie; Miller, Michelle
2015-05-01
To identify and critique tools for the assessment of Ca and/or dairy intake in adults, in order to ascertain the most accurate and reliable tools available. A systematic review of the literature was conducted using defined inclusion and exclusion criteria. Articles reporting on originally developed tools or testing the reliability or validity of existing tools that measure Ca and/or dairy intake in adults were included. Author-defined criteria for reporting reliability and validity properties were applied. Studies conducted in Western countries. Adults. Thirty papers, utilising thirty-six tools assessing intake of dairy, Ca or both, were identified. Reliability testing was conducted on only two dairy and five Ca tools, with results indicating that only one dairy and two Ca tools were reliable. Validity testing was conducted for all but four Ca-only tools. There was high reliance in validity testing on lower-order tests such as correlation and failure to differentiate between statistical and clinically meaningful differences. Results of the validity testing suggest one dairy and five Ca tools are valid. Thus one tool was considered both reliable and valid for the assessment of dairy intake and only two tools proved reliable and valid for the assessment of Ca intake. While several tools are reliable and valid, their application across adult populations is limited by the populations in which they were tested. These results indicate a need for tools that assess Ca and/or dairy intake in adults to be rigorously tested for reliability and validity.
Operational skill assessment of the IBI-MFC Ocean Forecasting System within the frame of the CMEMS.
NASA Astrophysics Data System (ADS)
Lorente Jimenez, Pablo; Garcia-Sotillo, Marcos; Amo-Balandron, Arancha; Aznar Lecocq, Roland; Perez Gomez, Begoña; Levier, Bruno; Alvarez-Fanjul, Enrique
2016-04-01
Since operational ocean forecasting systems (OOFSs) are increasingly used as tools to support high-stakes decision-making for coastal management, a rigorous skill assessment of model performance becomes essential. In this context, the IBI-MFC (Iberia-Biscay-Ireland Monitoring & Forecasting Centre) has been providing daily ocean model estimates and forecasts for the IBI regional seas since 2011, first in the frame of MyOcean projects and later as part of the Copernicus Marine Environment Monitoring Service (CMEMS). A comprehensive web validation tool named NARVAL (North Atlantic Regional VALidation) has been developed to routinely monitor IBI performance and to evaluate the model's veracity and prognostic capabilities. Three-dimensional comparisons are carried out on a different time basis ('online mode' for daily verifications and 'delayed mode' for longer time periods) using a broad variety of in-situ (buoys, tide-gauges, ARGO-floats, drifters and gliders) and remote-sensing (satellite and HF radars) observational sources as reference fields to validate against the NEMO model solution. Product quality indicators and meaningful skill metrics are automatically computed not only averaged over the entire IBI domain but also over specific sub-regions of particular interest from a user perspective (i.e. coastal or shelf areas) in order to determine IBI spatial and temporal uncertainty levels. A complementary aspect of the NARVAL web tool is the intercomparison of different CMEMS forecast model solutions in overlapping areas. Noticeable efforts are in progress in order to quantitatively assess the quality and consistency of nested system outputs by setting up specific intercomparison exercises on different temporal and spatial scales, encompassing global configurations (CMEMS Global system), regional applications (NWS and MED ones) and local high-resolution coastal models (i.e. the PdE SAMPA system in the Gibraltar Strait). NARVAL constitutes a powerful approach to increase our knowledge on the IBI-MFC forecast system and helps us inform CMEMS end users about the confidence level of the provided ocean forecasting products by routinely delivering QUality Information Documents (QUIDs). It allows the detection of strengths and weaknesses in the modeling of several key physical processes and the understanding of potential sources of discrepancies in IBI predictions. Once the numerical model shortcomings are identified, potential improvements can be achieved thanks to reliable upgrades, helping the IBI OOFS evolve towards more refined and advanced versions.
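The skill metrics mentioned above reduce, at their simplest, to a handful of standard model-versus-observation statistics. The sketch below computes bias, RMSE, and correlation for a pair of synthetic series; it is a generic illustration, not the NARVAL code or its metric definitions.

```python
import numpy as np

def skill_metrics(model, obs):
    """Basic verification statistics for co-located model/observation series."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    diff = model - obs
    return {
        "bias": diff.mean(),
        "rmse": np.sqrt((diff ** 2).mean()),
        "corr": np.corrcoef(model, obs)[0, 1],
    }

# Synthetic example: hourly sea-surface temperature at one buoy location.
rng = np.random.default_rng(1)
obs = 15 + np.sin(np.linspace(0, 4 * np.pi, 240))           # "observed" signal
model = obs + 0.3 + rng.normal(0, 0.2, obs.size)            # model with a warm bias and noise
print(skill_metrics(model, obs))
```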
Stability analysis of an implicitly defined labor market model
NASA Astrophysics Data System (ADS)
Mendes, Diana A.; Mendes, Vivaldo M.
2008-06-01
Until very recently, the pervasive existence of models exhibiting well-defined backward dynamics but ill-defined forward dynamics in economics and finance has apparently posed no serious obstacles to the analysis of their dynamics and stability, despite the problems that may arise from possible erroneous conclusions regarding theoretical considerations and policy prescriptions from such models. A large number of papers have dealt with this problem in the past by assuming the existence of symmetry between forward and backward dynamics, even in the case when the map cannot be invertible either forward or backwards. However, this procedure has been seriously questioned over the last few years in a series of papers dealing with implicit difference equations and inverse limit spaces. This paper explores the search and matching labor market model developed by Bhattacharya and Bunzel [J. Bhattacharya, H. Bunzel, Chaotic Planning Solution in the Textbook Model of Equilibrium Labor Market Search and Matching, Mimeo, Iowa State University, 2002; J. Bhattacharya, H. Bunzel, Economics Bulletin 5 (19) (2003) 1-10], with the following objectives in mind: (i) to show that chaotic dynamics may still be present in the model for acceptable parameter values, (ii) to clarify some open questions related with the admissible dynamics in the forward looking setting, by providing a rigorous proof of the existence of cyclic and chaotic dynamics through the application of tools from symbolic dynamics and inverse limit theory.
Relativistic space-charge-limited current for massive Dirac fermions
NASA Astrophysics Data System (ADS)
Ang, Y. S.; Zubair, M.; Ang, L. K.
2017-04-01
A theory of relativistic space-charge-limited current (SCLC) is formulated to determine the SCLC scaling, J ∝ V^α/L^β, for a finite band-gap Dirac material of length L biased under a voltage V. In one-dimensional (1D) bulk geometry, our model allows (α, β) to vary from (2, 3) for the nonrelativistic model in traditional solids to (3/2, 2) for the ultrarelativistic model of massless Dirac fermions. For 2D thin-film geometry we obtain α = β, which varies between 2 and 3/2, respectively, at the nonrelativistic and ultrarelativistic limits. We further provide rigorous proof based on a Green's-function approach that for a uniform SCLC model described by carrier-density-dependent mobility, the scaling relations of the 1D bulk model can be directly mapped into the case of 2D thin film for any contact geometries. Our simplified approach provides a convenient tool to obtain the 2D thin-film SCLC scaling relations without the need of explicitly solving the complicated 2D problems. Finally, this work clarifies the inconsistency in using the traditional SCLC models to explain the experimental measurement of a 2D Dirac semiconductor. We conclude that the voltage scaling 3/2 < α < 2 is a distinct signature of massive Dirac fermions in a Dirac semiconductor and is in agreement with experimental SCLC measurements in MoS2.
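Since the voltage exponent α is presented above as the experimental signature of massive Dirac fermions, a short sketch of how one might extract α from current-voltage data may be useful; it is a simple log-log least-squares fit on synthetic data, not the Green's-function derivation in the paper.

```python
import numpy as np

# Extract the SCLC voltage exponent alpha by fitting log J = alpha * log V + const.
# Synthetic data generated with alpha = 1.8, inside the 3/2 < alpha < 2 window quoted above.
rng = np.random.default_rng(2)
voltage = np.linspace(1.0, 20.0, 40)                     # arbitrary units
true_alpha = 1.8
current = voltage ** true_alpha * np.exp(rng.normal(0, 0.02, voltage.size))  # small multiplicative noise

alpha_fit, log_prefactor = np.polyfit(np.log(voltage), np.log(current), 1)
print(f"fitted alpha = {alpha_fit:.3f} (nonrelativistic limit 2, ultrarelativistic limit 3/2)")
```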
Extreme Response Style: Which Model Is Best?
ERIC Educational Resources Information Center
Leventhal, Brian
2017-01-01
More robust and rigorous psychometric models, such as multidimensional Item Response Theory models, have been advocated for survey applications. However, item responses may be influenced by construct-irrelevant variance factors such as preferences for extreme response options. Through empirical and simulation methods, this study evaluates the use…
MixGF: spectral probabilities for mixture spectra from more than one peptide.
Wang, Jian; Bourne, Philip E; Bandeira, Nuno
2014-12-01
In large-scale proteomic experiments, multiple peptide precursors are often cofragmented simultaneously in the same mixture tandem mass (MS/MS) spectrum. These spectra tend to elude current computational tools because of the ubiquitous assumption that each spectrum is generated from only one peptide. Therefore, tools that consider multiple peptide matches to each MS/MS spectrum can potentially improve the relatively low spectrum identification rate often observed in proteomics experiments. More importantly, data independent acquisition protocols promoting the cofragmentation of multiple precursors are emerging as alternative methods that can greatly improve the throughput of peptide identifications but their success also depends on the availability of algorithms to identify multiple peptides from each MS/MS spectrum. Here we address a fundamental question in the identification of mixture MS/MS spectra: determining the statistical significance of multiple peptides matched to a given MS/MS spectrum. We propose the MixGF generating function model to rigorously compute the statistical significance of peptide identifications for mixture spectra and show that this approach improves the sensitivity of current mixture spectra database search tools by ≈30-390%. Analysis of multiple data sets with MixGF reveals that in complex biological samples the number of identified mixture spectra can be as high as 20% of all the identified spectra and the number of unique peptides identified only in mixture spectra can be up to 35.4% of those identified in single-peptide spectra. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.
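The generating-function idea underlying spectral probabilities can be sketched for the single-peptide case as a small dynamic program over integer masses; the residue masses, toy "spectrum", and uniform residue probabilities below are made-up simplifications, and the mixture-spectrum extension that MixGF actually introduces is not shown.

```python
from collections import defaultdict

# Toy generating-function computation of a spectral probability (single-peptide case).
# Integer residue masses and a tiny scored-prefix "spectrum" are invented for illustration.
residue_masses = {"G": 57, "A": 71, "S": 87, "P": 97, "V": 99}
residue_prob = 1.0 / len(residue_masses)                   # uniform residue model
spectrum_score = defaultdict(int, {57: 1, 128: 1, 227: 1, 324: 1})  # +1 if a prefix mass is "observed"
precursor_mass = 324

# dp[m][s] = probability that a random peptide prefix has mass m and cumulative score s.
dp = defaultdict(lambda: defaultdict(float))
dp[0][0] = 1.0
for m in range(1, precursor_mass + 1):
    gain = spectrum_score[m]
    for aa_mass in residue_masses.values():
        prev = m - aa_mass
        if prev < 0:
            continue
        for s, p in dp[prev].items():
            dp[m][s + gain] += residue_prob * p

observed_score = 3
spectral_prob = sum(p for s, p in dp[precursor_mass].items() if s >= observed_score)
print(f"P(random peptide of mass {precursor_mass} scores >= {observed_score}) = {spectral_prob:.3g}")
```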
Mamlin, Burke W; Tierney, William M
2016-01-01
Healthcare is an information business with expanding use of information and communication technologies (ICTs). Current ICT tools are immature, but a brighter future looms. We examine 7 areas of ICT in healthcare: electronic health records (EHRs), health information exchange (HIE), patient portals, telemedicine, social media, mobile devices and wearable sensors and monitors, and privacy and security. In each of these areas, we examine the current status and future promise, highlighting how each might reach its promise. Steps to better EHRs include a universal programming interface, universal patient identifiers, improved documentation and improved data analysis. HIEs require federal subsidies for sustainability and support from EHR vendors, targeting seamless sharing of EHR data. Patient portals must bring patients into the EHR with better design and training, greater provider engagement and leveraging HIEs. Telemedicine needs sustainable payment models, clear rules of engagement, quality measures and monitoring. Social media needs consensus on rules of engagement for providers, better data mining tools and approaches to counter disinformation. Mobile and wearable devices benefit from a universal programming interface, improved infrastructure, more rigorous research and integration with EHRs and HIEs. Laws for privacy and security need updating to match current technologies, and data stewards should share information on breaches and standardize best practices. ICT tools are evolving quickly in healthcare and require a rational and well-funded national agenda for development, use and assessment. Copyright © 2016 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Beneš, Michal; Pažanin, Igor
2018-03-01
This paper reports an analytical investigation of non-isothermal fluid flow in a thin (or long) vertical pipe filled with porous medium via asymptotic analysis. We assume that the fluid inside the pipe is cooled (or heated) by the surrounding medium and that the flow is governed by the prescribed pressure drop between pipe's ends. Starting from the dimensionless Darcy-Brinkman-Boussinesq system, we formally derive a macroscopic model describing the effective flow at small Brinkman-Darcy number. The asymptotic approximation is given by the explicit formulae for the velocity, pressure and temperature clearly acknowledging the effects of the cooling (heating) and porous structure. The theoretical error analysis is carried out to indicate the order of accuracy and to provide a rigorous justification of the effective model.
Development of rigor mortis is not affected by muscle volume.
Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H
2001-04-01
There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Debono, Josephine C, E-mail: josephine.debono@bci.org.au; Poulos, Ann E; Westmead Breast Cancer Institute, Westmead, New South Wales
The aim of this study was to first evaluate the quality of studies investigating the diagnostic accuracy of radiographers as mammogram screen-readers and then to develop an adapted tool for determining the quality of screen-reading studies. A literature search was used to identify relevant studies and a quality evaluation tool constructed by combining the criteria for quality of Whiting, Rutjes, Dinnes et al. and Brealey and Westwood. This constructed tool was then applied to the studies and subsequently adapted specifically for use in evaluating quality in studies investigating diagnostic accuracy of screen-readers. Eleven studies were identified and the constructed tool applied to evaluate quality. This evaluation resulted in the identification of quality issues with the studies such as potential for bias, applicability of results, study conduct, reporting of the study and observer characteristics. An assessment of the applicability and relevance of the tool for this area of research resulted in adaptations to the criteria and the development of a tool specifically for evaluating diagnostic accuracy in screen-reading. This tool, with further refinement and rigorous validation, can make a significant contribution to promoting well-designed studies in this important area of research and practice.
Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I
2015-01-01
High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®
Richard Haynes; Darius Adams; Peter Ince; John Mills; Ralph Alig
2006-01-01
The United States has a century of experience with the development of models that describe markets for forest products and trends in resource conditions. In the last four decades, increasing rigor in policy debates has stimulated the development of models to support policy analysis. Increasingly, research has evolved (often relying on computer-based models) to increase...
A square-force cohesion model and its extraction from bulk measurements
NASA Astrophysics Data System (ADS)
Liu, Peiyuan; Lamarche, Casey; Kellogg, Kevin; Hrenya, Christine
2017-11-01
Cohesive particles remain poorly understood, with order of magnitude differences exhibited for prior, physical predictions of agglomerate size. A major obstacle lies in the absence of robust models of particle-particle cohesion, thereby precluding accurate prediction of the behavior of cohesive particles. Rigorous cohesion models commonly contain parameters related to surface roughness, to which cohesion shows extreme sensitivity. However, both roughness measurement and its distillation into these model parameters are challenging. Accordingly, we propose a "square-force" model, where cohesive force remains constant until a cut-off separation. Via DEM simulations, we demonstrate the validity of the square-force model as a surrogate for more rigorous models, when its two parameters are selected to match the two key quantities governing dense and dilute granular flows, namely maximum cohesive force and critical cohesive energy, respectively. Perhaps more importantly, we establish a method to extract the parameters in the square-force model via defluidization, due to its ability to isolate the effects of the two parameters. Thus, instead of relying on complicated scans of individual grains, determination of particle-particle cohesion from simple bulk measurements becomes feasible. Dow Corning Corporation.
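The parameter matching described above (choose the square-force magnitude and cutoff so that the maximum cohesive force and the cohesive energy of a more rigorous model are both preserved) can be written down directly. In the sketch below the van der Waals sphere-sphere force serves as a stand-in "rigorous" reference with made-up material constants; it illustrates the matching idea only, not the authors' defluidization-based calibration, and assumes scipy is available.

```python
import numpy as np
from scipy.integrate import quad

# Stand-in "rigorous" model: van der Waals force between two equal spheres
# (Derjaguin form), F(d) = A * R_eff / (6 d^2) with R_eff = R/2.
# Hamaker constant, radius, and minimum separation are illustrative values only.
hamaker = 1e-19          # J
radius = 50e-6           # m
r_eff = radius / 2.0
d_min = 4e-10            # m, minimum separation (contact / roughness cutoff)

def vdw_force(d):
    return hamaker * r_eff / (6.0 * d ** 2)

# The two bulk-relevant quantities to preserve:
f_max = vdw_force(d_min)                                  # maximum cohesive force
energy, _ = quad(vdw_force, d_min, 100 * d_min)           # approximate cohesive energy

# Square-force surrogate: constant force f_max up to a cutoff separation delta,
# chosen so that the surrogate stores the same cohesive energy.
delta = energy / f_max
print(f"F_max = {f_max:.3e} N, E_coh = {energy:.3e} J, square-force cutoff = {delta:.3e} m")
```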
Stabilization and Reconstruction Operations: A New Paradigm, Analysis Tool, and US Air Force Role
2007-03-08
Maslow in his 1943 treatise, A Theory of Human Motivation, from which his Hierarchy of Needs Pyramid is derived.15 An interesting observation can be...or program budgets, but out of a necessity to retain relevancy in today’s conflict environment. It has taken over a decade to reach this point, but...than rigorous scientific explanation. The New Paradigm The new paradigm is not really new at all but is derived from noted psychologist Abraham
The environment and human health; USGS science for solutions
2001-01-01
Emerging infectious diseases, ground-water contamination, trace-metal poisoning...environmental threats to public health the world over require new solutions. Because of an increased awareness of the issues, greater cooperation among scientific and policy agencies, and powerful new tools and techniques to conduct research, there is new hope that complex ecological health problems can be solved. U.S. Geological Survey scientists are forming partnerships with experts in the public health and biomedical research communities to conduct rigorous scientific inquiries into the health effects of ecological processes.
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
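To give a flavor of what formally verifying a requirement can look like in practice (not the tools used in the work described above), the sketch below uses the z3 SMT solver to prove one phone-book property, that looking up a name immediately after adding it returns the stored number, with names and numbers simplified to integers; the z3-solver package is assumed to be installed.

```python
# Tiny formal-verification flavor with the z3 SMT solver (pip install z3-solver).
# Illustrative only: it proves the requirement "lookup after insert returns the
# inserted number" as a theorem over all possible phone books.
from z3 import Array, IntSort, Ints, Store, Select, Solver, Not, unsat

book = Array("book", IntSort(), IntSort())   # phone book modeled as a map name -> number
name, number = Ints("name number")

# Claim: for every book, name, and number, Select(Store(book, name, number), name) == number.
claim = Select(Store(book, name, number), name) == number

solver = Solver()
solver.add(Not(claim))                       # search for a counterexample
assert solver.check() == unsat               # unsat => no counterexample => claim is proved
print("lookup-after-insert property proved for all phone books")
```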
Fractional Stochastic Differential Equations Satisfying Fluctuation-Dissipation Theorem
NASA Astrophysics Data System (ADS)
Li, Lei; Liu, Jian-Guo; Lu, Jianfeng
2017-10-01
We propose in this work a fractional stochastic differential equation (FSDE) model consistent with the over-damped limit of the generalized Langevin equation model. As a result of the 'fluctuation-dissipation theorem', the differential equations driven by fractional Brownian noise to model memory effects should be paired with Caputo derivatives, and this FSDE model should be understood in an integral form. We establish the existence of strong solutions for such equations and discuss the ergodicity and convergence to Gibbs measure. In the linear forcing regime, we show rigorously the algebraic convergence to Gibbs measure when the 'fluctuation-dissipation theorem' is satisfied, and this verifies that satisfying the 'fluctuation-dissipation theorem' indeed leads to the correct physical behavior. We further discuss possible approaches to analyze the ergodicity and convergence to Gibbs measure in the nonlinear forcing regime, while leaving the rigorous analysis for future works. The FSDE model proposed is suitable for systems in contact with a heat bath with power-law kernel and subdiffusion behaviors.
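One ingredient of such models, fractional Brownian noise with Hurst exponent H, can be sampled exactly on a grid from its covariance function. The sketch below does this by Cholesky factorization; it is a generic sampling method, not the analysis in the paper, and the Hurst exponent, grid, and seed are arbitrary.

```python
import numpy as np

def fractional_brownian_motion(n_steps, hurst, t_max=1.0, seed=0):
    """Sample a fractional Brownian motion path on a uniform grid via Cholesky.

    Uses the exact covariance Cov[B_H(s), B_H(t)] = 0.5*(s^2H + t^2H - |t-s|^2H).
    O(n^3) cost; fine for illustration, too slow for long paths.
    """
    rng = np.random.default_rng(seed)
    t = np.linspace(t_max / n_steps, t_max, n_steps)
    s, u = np.meshgrid(t, t, indexing="ij")
    cov = 0.5 * (s ** (2 * hurst) + u ** (2 * hurst) - np.abs(s - u) ** (2 * hurst))
    path = np.linalg.cholesky(cov) @ rng.standard_normal(n_steps)
    return np.concatenate(([0.0], path))      # B_H(0) = 0

# One realization; with t_max = 1 the expectation E[B_H(1)^2] equals 1 for any H.
b_h = fractional_brownian_motion(n_steps=256, hurst=0.3)
print("one sample of B_H(1)^2:", b_h[-1] ** 2)
```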
Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Booth, Andrew; Harden, Angela; Hannes, Karin; Thomas, James; Flemming, Kate; Garside, Ruth; Noyes, Jane
2018-05-01
This article provides reviewers with guidance on methods for identifying and processing evidence to understand intervention implementation. Strategies, tools, and methods are applied to the systematic review process to illustrate how process and implementation can be addressed using quantitative, qualitative, and other sources of evidence (i.e., descriptive textual and nonempirical). Reviewers can take steps to navigate the heterogeneity and level of uncertainty present in the concepts, measures, and methods used to assess implementation. Activities can be undertaken in advance of a Cochrane quantitative review to develop program theory and logic models that situate implementation in the causal chain. Four search strategies are offered to retrieve process and implementation evidence. Recommendations are made for addressing rigor or risk of bias in process evaluation or implementation evidence. Strategies are recommended for locating and extracting data from primary studies. The basic logic is presented to assist reviewers to make initial review-level judgments about implementation failure and theory failure. Although strategies, tools, and methods can assist reviewers to address process and implementation using quantitative, qualitative, and other forms of evidence, few exemplar reviews exist. There is a need for further methodological development and trialing of proposed approaches. Copyright © 2017 Elsevier Inc. All rights reserved.
Rigorous description of holograms of particles illuminated by an astigmatic elliptical Gaussian beam
NASA Astrophysics Data System (ADS)
Yuan, Y. J.; Ren, K. F.; Coëtmellec, S.; Lebrun, D.
2009-02-01
Digital holography is a non-intrusive optical metrology technique well adapted to measuring the size and velocity field of particles in a fluid spray. The simplified model of an opaque disk is often used in the treatment of the diagrams, so the refraction and the third-dimension diffraction of the particle are not taken into account. We present in this paper a rigorous description of the holographic diagrams and evaluate the effects of the refraction and the third-dimension diffraction by comparison with the opaque disk model. It is found that the effects are important when the real part of the refractive index is near unity or the imaginary part is nonzero but small.
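For comparison with the rigorous treatment described above, the simplified opaque-disk model itself is straightforward to simulate: the sketch below propagates a plane wave past an opaque disk with a standard angular-spectrum method. This is generic Fourier-optics code with arbitrary wavelength, disk radius, and recording distance, not the authors' model; refraction and the particle's full 3D diffraction (the point of the paper) are deliberately ignored.

```python
import numpy as np

# In-line hologram of an opaque disk via angular-spectrum propagation (opaque-disk model only).
wavelength = 532e-9        # m
n_pix, pitch = 1024, 2e-6  # grid size and pixel pitch (m)
disk_radius = 50e-6        # m
z = 0.05                   # recording distance (m)

x = (np.arange(n_pix) - n_pix // 2) * pitch
X, Y = np.meshgrid(x, x)
field0 = np.where(X**2 + Y**2 <= disk_radius**2, 0.0, 1.0)    # plane wave blocked by the disk

fx = np.fft.fftfreq(n_pix, d=pitch)
FX, FY = np.meshgrid(fx, fx)
arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
transfer = np.exp(1j * kz * z) * (arg > 0)                    # drop evanescent components

field_z = np.fft.ifft2(np.fft.fft2(field0) * transfer)
hologram = np.abs(field_z) ** 2                               # recorded intensity
print("fringe contrast (max - min intensity):", hologram.max() - hologram.min())
```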
Engineering education as a complex system
NASA Astrophysics Data System (ADS)
Gattie, David K.; Kellam, Nadia N.; Schramski, John R.; Walther, Joachim
2011-12-01
This paper presents a theoretical basis for cultivating engineering education as a complex system that will prepare students to think critically and make decisions with regard to poorly understood, ill-structured issues. Integral to this theoretical basis is a solution space construct developed and presented as a benchmark for evaluating problem-solving orientations that emerge within students' thinking as they progress through an engineering curriculum. It is proposed that the traditional engineering education model, while analytically rigorous, is characterised by properties that, although necessary, are insufficient for preparing students to address complex issues of the twenty-first century. A Synthesis and Design Studio model for engineering education is proposed, which maintains the necessary rigor of analysis within a uniquely complex yet sufficiently structured learning environment.
A Prospective Test of Cognitive Vulnerability Models of Depression with Adolescent Girls
ERIC Educational Resources Information Center
Bohon, Cara; Stice, Eric; Burton, Emily; Fudell, Molly; Nolen-Hoeksema, Susan
2008-01-01
This study sought to provide a more rigorous prospective test of two cognitive vulnerability models of depression with longitudinal data from 496 adolescent girls. Results supported the cognitive vulnerability model in that stressors predicted future increases in depressive symptoms and onset of clinically significant major depression for…
Learning, Judgment, and the Rooted Particular
ERIC Educational Resources Information Center
McCabe, David
2012-01-01
This article begins by acknowledging the general worry that scholarship in the humanities lacks the rigor and objectivity of other scholarly fields. In considering the validity of that criticism, I distinguish two models of learning: the covering law model exemplified by the natural sciences, and the model of rooted particularity that…
Quantitative comparison between crowd models for evacuation planning and evaluation
NASA Astrophysics Data System (ADS)
Viswanathan, Vaisagh; Lee, Chong Eu; Lees, Michael Harold; Cheong, Siew Ann; Sloot, Peter M. A.
2014-02-01
Crowd simulation is rapidly becoming a standard tool for evacuation planning and evaluation. However, the many crowd models in the literature are structurally different, and few have been rigorously calibrated against real-world egress data, especially in emergency situations. In this paper we describe a procedure to quantitatively compare different crowd models, or to compare models against real-world data. We simulated three models: (1) the lattice gas model, (2) the social force model, and (3) the RVO2 model, and obtained the distributions of six observables: (1) evacuation time, (2) zoned evacuation time, (3) passage density, (4) total distance traveled, (5) inconvenience, and (6) flow rate. We then used the DISTATIS procedure to compute the compromise matrix of statistical distances between the three models. Projecting the three models onto the first two principal components of the compromise matrix, we find the lattice gas and RVO2 models are similar in terms of the evacuation time, passage density, and flow rates, whereas the social force and RVO2 models are similar in terms of the total distance traveled. Most importantly, we find the zoned evacuation times of the three models to be very different from each other. Thus we propose to use this variable, if it can be measured, as the key test between different models, and also between models and the real world. Finally, we compared the model flow rates against the flow rate of an emergency evacuation during the May 2008 Sichuan earthquake, and found that the social force model agrees best with the real data.
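A minimal sketch of the underlying idea (not the authors' DISTATIS implementation): compute pairwise statistical distances between the evacuation-time samples produced by different crowd models, then embed the models in a low-dimensional space for comparison. The model names and sample arrays are synthetic placeholders.

```python
# Sketch: compare crowd models by statistical distance between observable
# distributions, then embed with classical multidimensional scaling (MDS).
# The evacuation-time samples below are synthetic placeholders.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
samples = {
    "lattice_gas": rng.normal(120, 15, 500),   # evacuation times (s), hypothetical
    "social_force": rng.normal(135, 20, 500),
    "rvo2": rng.normal(122, 14, 500),
}
names = list(samples)
n = len(names)

# Pairwise Kolmogorov-Smirnov distances between model outputs.
D = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        D[i, j] = ks_2samp(samples[names[i]], samples[names[j]]).statistic

# Classical MDS: double-center the squared distances and eigendecompose.
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1]
coords = vecs[:, order[:2]] * np.sqrt(np.maximum(vals[order[:2]], 0))

for name, xy in zip(names, coords):
    print(f"{name}: ({xy[0]:+.3f}, {xy[1]:+.3f})")
```

Models that cluster together in this projection produce statistically similar distributions for the chosen observable, which is the spirit of the comparison described above.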
The immune epitope database: a historical retrospective of the first decade.
Salimi, Nima; Fleri, Ward; Peters, Bjoern; Sette, Alessandro
2012-10-01
As the amount of biomedical information available in the literature continues to increase, databases that aggregate this information continue to grow in importance and scope. The population of databases can occur either through fully automated text mining approaches or through manual curation by human subject experts. We here report our experiences in populating the National Institute of Allergy and Infectious Diseases sponsored Immune Epitope Database and Analysis Resource (IEDB, http://iedb.org), which was created in 2003, and as of 2012 captures the epitope information from approximately 99% of all papers published to date that describe immune epitopes (with the exception of cancer and HIV data). This was achieved using a hybrid model based on automated document categorization and extensive human expert involvement. This task required automated scanning of over 22 million PubMed abstracts followed by classification and curation of over 13 000 references, including over 7000 infectious disease-related manuscripts, over 1000 allergy-related manuscripts, roughly 4000 related to autoimmunity, and 1000 transplant/alloantigen-related manuscripts. The IEDB curation involves an unprecedented level of detail, capturing for each paper the actual experiments performed for each different epitope structure. Key to enabling this process was the extensive use of ontologies to ensure rigorous and consistent data representation as well as interoperability with other bioinformatics resources, including the Protein Data Bank, Chemical Entities of Biological Interest, and the NIAID Bioinformatics Resource Centers. A growing fraction of the IEDB data derives from direct submissions by research groups engaged in epitope discovery, and is being facilitated by the implementation of novel data submission tools. The present explosion of information contained in biological databases demands effective query and display capabilities to optimize the user experience. Accordingly, the development of original ways to query the database, on the basis of ontologically driven hierarchical trees, and display of epitope data in aggregate in a biologically intuitive yet rigorous fashion is now at the forefront of the IEDB efforts. We also highlight advances made in the realm of epitope analysis and predictive tools available in the IEDB. © 2012 The Authors. Immunology © 2012 Blackwell Publishing Ltd.
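The hybrid curation pipeline described above begins with automated document categorization. Below is a minimal, hypothetical sketch of that first step using generic scikit-learn text classification (not the IEDB's actual classifier); the abstracts and labels are tiny placeholders.

```python
# Sketch: flag abstracts as likely epitope-relevant before manual curation.
# Training texts/labels are placeholders; a real system would train on
# thousands of labelled PubMed abstracts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "CD8 T cell epitope mapped in influenza nucleoprotein",
    "HLA-DR restricted peptide recognized in allergy patients",
    "Crystal structure of a bacterial transport protein",
    "Genome-wide association study of plant height",
]
train_labels = [1, 1, 0, 0]  # 1 = epitope-relevant, 0 = not relevant

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(train_texts, train_labels)

new_abstracts = ["Mapping of a linear B cell epitope on dengue envelope protein"]
print(clf.predict_proba(new_abstracts))  # probability of curation relevance
```

Abstracts scoring above a chosen threshold would then be routed to human subject experts, mirroring the division of labor between automated categorization and expert curation described above.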
Preserving pre-rigor meat functionality for beef patty production.
Claus, J R; Sørheim, O
2006-06-01
Three methods were examined for preserving pre-rigor meat functionality in beef patties. Hot-boned semimembranosus muscles were processed as follows: (1) pre-rigor ground, salted, patties immediately cooked; (2) pre-rigor ground, salted and stored overnight; (3) pre-rigor injected with brine; and (4) post-rigor ground and salted. Raw patties contained 60% lean beef, 19.7% beef fat trim, 1.7% NaCl, 3.6% starch, and 15% water. Pre-rigor processing occurred at 3-3.5h postmortem. Patties made from pre-rigor ground meat had higher pH values; greater protein solubility; firmer, more cohesive, and chewier texture; and substantially lower cooking losses than the other treatments. Addition of salt was sufficient to reduce the rate and extent of glycolysis. Brine injection of intact pre-rigor muscles resulted in some preservation of the functional properties but not as pronounced as with salt addition to pre-rigor ground meat.
Estimation of the time since death--reconsidering the re-establishment of rigor mortis.
Anders, Sven; Kunz, Michaela; Gehl, Axel; Sehner, Susanne; Raupach, Tobias; Beck-Bornholdt, Hans-Peter
2013-01-01
In forensic medicine, there is an undefined data background for the phenomenon of re-establishment of rigor mortis after mechanical loosening, a method used in establishing time since death in forensic casework that is thought to occur up to 8 h post-mortem. Nevertheless, the method is widely described in textbooks on forensic medicine. We examined 314 joints (elbow and knee) of 79 deceased at defined time points up to 21 h post-mortem (hpm). Data were analysed using a random intercept model. Here, we show that re-establishment occurred in 38.5% of joints at 7.5 to 19 hpm. Therefore, the maximum time span for the re-establishment of rigor mortis appears to be 2.5-fold longer than thought so far. These findings have major impact on the estimation of time since death in forensic casework.
Hitzig, Sander L.; Balioussis, Christina; Nussbaum, Ethne; McGillivray, Colleen F.; Catharine Craven, B.; Noreau, Luc
2013-01-01
Context: Although pressure ulcers may negatively influence quality of life (QoL) post-spinal cord injury (SCI), our understanding of how to assess their impact is confounded by conceptual and measurement issues. To ensure that descriptions of pressure ulcer impact are appropriately characterized, measures should be selected according to the domains that they evaluate and the population and pathologies for which they are designed. Objective: To conduct a systematic literature review to identify and classify outcome measures used to assess the impact of pressure ulcers on QoL after SCI. Methods: Electronic databases (Medline/PubMed, CINAHL, and PsycInfo) were searched for studies published between 1975 and 2011. Identified outcome measures were classified as being either subjective or objective using a QoL model. Results: Fourteen studies were identified. The majority of tools identified in these studies did not have psychometric evidence supporting their use in the SCI population, with the exception of two objective measures, the Short-Form 36 and the Craig Handicap Assessment and Reporting Technique, and two subjective measures, the Life Situation Questionnaire-Revised and the Ferrans and Powers Quality of Life Index SCI-Version. Conclusion: Many QoL outcome tools showed promise in being sensitive to the presence of pressure ulcers, but few of them have been validated for use with SCI. Prospective studies should employ more rigorous methods for collecting data on pressure ulcer severity and location to improve the quality of findings with regard to their impact on QoL. The Cardiff Wound Impact Schedule is a potential tool for assessing the impact of pressure ulcers post-SCI. PMID:24090238
Silicon nanoporous membranes as a rigorous platform for validation of biomolecular transport models
Feinberg, Benjamin J.; Hsiao, Jeff C.; Park, Jaehyun; Zydney, Andrew L.; Fissell, William H.; Roy, Shuvo
2017-01-01
Microelectromechanical systems (MEMS), a technology that resulted from significant innovation in semiconductor fabrication, have recently been applied to the development of silicon nanopore membranes (SNM). In contrast to membranes fabricated from polymeric materials, SNM exhibit slit-shaped pores, monodisperse pore size, constant surface porosity, zero pore overlap, and sub-micron thickness. This development in membrane fabrication is applied herein for the validation of the XDLVO (extended Derjaguin, Landau, Verwey, and Overbeek) theory of membrane transport within the context of hemofiltration. In this work, the XDLVO model has been derived for the unique slit pore structure of SNM. Beta-2-microglobulin (B2M), a clinically relevant “middle molecular weight” solute in kidney disease, is highlighted in this study as the solute of interest. In order to determine interaction parameters within the XDLVO model for B2M and SNM, goniometric measurements were conducted, yielding a Hamaker constant of 4.61 × 10⁻²¹ J and an acid-base Gibbs free energy at contact of 41 mJ/m². The XDLVO model was combined with existing models for membrane sieving, with predictions of the refined model in good agreement with experimental data. Furthermore, the results show a significant difference between the XDLVO model and the simpler steric predictions typically applied in membrane transport. The refined model can be used as a tool to tailor membrane chemistry and maximize sieving or rejection of different biomolecules. PMID:28936029
Modeling near-road air quality using a computational fluid dynamics model, CFD-VIT-RIT.
Wang, Y Jason; Zhang, K Max
2009-10-15
It is well recognized that dilution is an important mechanism governing near-road air pollutant concentrations. In this paper, we aim to advance our understanding of turbulent mixing mechanisms on and near roadways using computational fluid dynamics. Turbulent mixing mechanisms can be classified into three categories according to their origins: vehicle-induced turbulence (VIT), road-induced turbulence (RIT), and atmospheric boundary layer turbulence. RIT includes the turbulence generated by road embankment, road surface thermal effects, and roadside structures. Both VIT and RIT are affected by roadway design. We incorporate the detailed treatment of VIT and RIT into a CFD model (namely, CFD-VIT-RIT) and apply the model in simulating the spatial gradients of carbon monoxide near two major highways with different traffic mix and roadway configurations. The modeling results are compared to field measurements and to those from CALINE4 and from CFD without considering VIT and RIT. We demonstrate that the incorporation of VIT and RIT considerably improves the modeling predictions, especially for vertical gradients and seasonal variations of carbon monoxide. Our study implies that roadway design can significantly influence near-road air pollution. Thus we recommend that mitigating near-road air pollution through roadway design be considered in air quality and transportation management. In addition, thanks to the rigorous representation of turbulent mixing mechanisms, CFD-VIT-RIT can become a valuable tool in the roadway design process.
d'Alessio, M. A.; Williams, C.F.
2007-01-01
A suite of new techniques in thermochronometry allows analysis of the thermal history of a sample over a broad range of temperature sensitivities. New analysis tools must be developed that fully and formally integrate these techniques, allowing a single geologic interpretation of the rate and timing of exhumation and burial events consistent with all data. We integrate a thermal model of burial and exhumation, (U-Th)/He age modeling, and fission track age and length modeling. We then use a genetic algorithm to efficiently explore possible time-exhumation histories of a vertical sample profile (such as a borehole), simultaneously solving for exhumation and burial rates as well as changes in background heat flow. We formally combine all data in a rigorous statistical fashion. By parameterizing the model in terms of exhumation rather than time-temperature paths (as traditionally done in fission track modeling), we can ensure that exhumation histories result in a sedimentary basin whose thickness is consistent with the observed basin, a physically based constraint that eliminates otherwise acceptable thermal histories. We apply the technique to heat flow and thermochronometry data from the 2.1-km-deep San Andreas Fault Observatory at Depth pilot hole near the San Andreas fault, California. We find that the site experienced <1 km of exhumation or burial since the onset of San Andreas fault activity ~30 Ma.
Inferring the nature of anthropogenic threats from long-term abundance records.
Shoemaker, Kevin T; Akçakaya, H Resit
2015-02-01
Diagnosing the processes that threaten species persistence is critical for recovery planning and risk forecasting. Dominant threats are typically inferred by experts on the basis of a patchwork of informal methods. Transparent, quantitative diagnostic tools would contribute much-needed consistency, objectivity, and rigor to the process of diagnosing anthropogenic threats. Long-term census records, available for an increasingly large and diverse set of taxa, may exhibit characteristic signatures of specific threatening processes and thereby provide information for threat diagnosis. We developed a flexible Bayesian framework for diagnosing threats on the basis of long-term census records and diverse ancillary sources of information. We tested this framework with simulated data from artificial populations subjected to varying degrees of exploitation and habitat loss and several real-world abundance time series for which threatening processes are relatively well understood: bluefin tuna (Thunnus maccoyii) and Atlantic cod (Gadus morhua) (exploitation) and Red Grouse (Lagopus lagopus scotica) and Eurasian Skylark (Alauda arvensis) (habitat loss). Our method correctly identified the process driving population decline for over 90% of time series simulated under moderate to severe threat scenarios. Successful identification of threats approached 100% for severe exploitation and habitat loss scenarios. Our method identified threats less successfully when threatening processes were weak and when populations were simultaneously affected by multiple threats. Our method selected the presumed true threat model for all real-world case studies, although results were somewhat ambiguous in the case of the Eurasian Skylark. In the latter case, incorporation of an ancillary source of information (records of land-use change) increased the weight assigned to the presumed true model from 70% to 92%, illustrating the value of the proposed framework in bringing diverse sources of information into a common rigorous framework. Ultimately, our framework may greatly assist conservation organizations in documenting threatening processes and planning species recovery. © 2014 Society for Conservation Biology.
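A minimal numerical sketch in the spirit of the framework above (not the authors' Bayesian implementation): two candidate threat models, proportional exploitation and eroding carrying capacity, are fit to an abundance series by averaging the likelihood over a flat grid of threat intensities, and the resulting marginal likelihoods are converted into posterior model weights. The growth model, priors, noise level, and the "observed" series are all hypothetical placeholders.

```python
# Sketch: crude Bayesian comparison of two threat models (exploitation vs.
# habitat loss) against an abundance time series.
import numpy as np

rng = np.random.default_rng(1)

def simulate(threat, intensity, n0=1000.0, r=0.3, k0=1000.0, years=30):
    """Logistic growth with either proportional harvest or a shrinking K."""
    n, traj = n0, []
    for t in range(years):
        k = k0 * (1.0 - intensity) ** t if threat == "habitat_loss" else k0
        n = max(n + r * n * (1.0 - n / k), 1e-6)   # clamp to keep n positive
        if threat == "exploitation":
            n *= (1.0 - intensity)                 # proportional harvest
        traj.append(n)
    return np.array(traj)

# "Observed" counts: exploitation at 10% per year plus log-normal noise.
obs = simulate("exploitation", 0.10) * rng.lognormal(0.0, 0.1, 30)

def log_marginal(threat, sigma=0.1, grid=np.linspace(0.0, 0.3, 61)):
    """Average the likelihood over a flat prior grid on threat intensity."""
    logls = []
    for q in grid:
        resid = np.log(obs) - np.log(simulate(threat, q))
        logls.append(-0.5 * np.sum((resid / sigma) ** 2))
    logls = np.array(logls)
    return logls.max() + np.log(np.mean(np.exp(logls - logls.max())))

lm = {m: log_marginal(m) for m in ("exploitation", "habitat_loss")}
w = np.exp(np.array(list(lm.values())) - max(lm.values()))
for m, wi in zip(lm, w / w.sum()):
    print(f"{m}: posterior weight {wi:.2f}")
```

The weights play the same diagnostic role as the model probabilities discussed above: the threat model that explains the decline best receives most of the posterior mass.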
2014-10-27
a phase-averaged spectral wind-wave generation and transformation model and its interface in the Surface-water Modeling System (SMS). Ambrose... applications of the Boussinesq (BOUSS-2D) wave model that provides more rigorous calculations for design and performance optimization of integrated... navigation systems. Together these wave models provide reliable predictions on regional and local spatial domains and cost-effective engineering solutions.
NASA Astrophysics Data System (ADS)
Toman, Blaza; Nelson, Michael A.; Bedner, Mary
2017-06-01
Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess accuracy of the measurement procedure and repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random effects meta analysis yields similar results to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography mass spectrometric detection using isotope dilution (LC-IDMS).
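A minimal sketch of the bottom-up, GUM Supplement 1 style Monte Carlo step mentioned above (not the authors' full Bayesian hierarchical analysis): assumed input distributions are propagated through a simple single-point calibration equation. All quantities, values, and standard uncertainties are hypothetical.

```python
# Sketch: Monte Carlo (GUM Supplement 1 style) propagation of uncertainty
# through a single-point calibration, c_sample = c_std * (A_sample / A_std).
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

c_std = rng.normal(25.0, 0.10, n)       # calibrant concentration, ug/mL
a_std = rng.normal(1.000, 0.008, n)     # calibrant peak area (relative)
a_sample = rng.normal(0.962, 0.008, n)  # sample peak area (relative)

c_sample = c_std * a_sample / a_std     # measurement equation

mean = c_sample.mean()
u = c_sample.std(ddof=1)                # standard uncertainty
lo, hi = np.percentile(c_sample, [2.5, 97.5])
print(f"c_sample = {mean:.3f} ug/mL, u = {u:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```

The hierarchical Bayesian treatment described in the abstract goes further by modeling between-run and between-sample effects explicitly; the sketch only illustrates the measurement-equation (bottom-up) component.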
Chiu, Grace S; Wu, Margaret A; Lu, Lin
2013-01-01
The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content-all regarded a priori as qualitatively important abiotic drivers-towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general.
Fullen, Daniel J.; Murray, Bryan; Mori, Julie; Catchpole, Andrew; Borley, Daryl W.; Murray, Edward J.; Balaratnam, Ganesh; Gilbert, Anthony; Mann, Alex; Hughes, Fiona; Lambkin-Williams, Rob
2016-01-01
Background: Human Rhinovirus infection is an important precursor to asthma and chronic obstructive pulmonary disease exacerbations, and the Human Viral Challenge model may provide a powerful tool in studying these and other chronic respiratory diseases. In this study we have reported the production and human characterisation of a new Wild-Type HRV-16 challenge virus produced specifically for this purpose. Methods and Stock Development: An HRV-16 isolate from an 18-year-old experimentally infected healthy female volunteer (University of Virginia Children’s Hospital, USA) was obtained with appropriate medical history and consent. We manufactured a new HRV-16 stock by minimal passage in a WI-38 cell line under Good Manufacturing Practice conditions. Having first subjected the stock to rigorous adventitious agent testing and determined the virus's suitability for human use, we conducted an initial safety and pathogenicity clinical study in adult volunteers in our dedicated clinical quarantine facility in London. Human Challenge and Conclusions: In this study we have demonstrated the new Wild-Type HRV-16 Challenge Virus to be both safe and pathogenic, causing an appropriate level of disease in experimentally inoculated healthy adult volunteers. Furthermore, by inoculating volunteers with a range of different inoculum titres, we have established the minimum inoculum titre required to achieve reproducible disease. We have demonstrated that although inoculation titres as low as 1 TCID50 can produce relatively high infection rates, the optimal titre for progression with future HRV challenge model development with this virus stock was 10 TCID50. Studies currently underway are evaluating the use of this virus as a challenge agent in asthmatics. Trial Registration: ClinicalTrials.gov NCT02522832 PMID:27936016
Gradient Models in Molecular Biophysics: Progress, Challenges, Opportunities
Bardhan, Jaydeep P.
2014-01-01
In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g. molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding features such as nonlocal dielectric response, and nonlinearities resulting from dielectric saturation. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost forty years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The paper concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics. PMID:25505358
Gradient models in molecular biophysics: progress, challenges, opportunities
NASA Astrophysics Data System (ADS)
Bardhan, Jaydeep P.
2013-12-01
In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g., molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding nonlocal dielectric response. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain, and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost 40 years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The review concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics.
Sociomateriality: a theoretical framework for studying distributed medical education.
MacLeod, Anna; Kits, Olga; Whelan, Emma; Fournier, Cathy; Wilson, Keith; Power, Gregory; Mann, Karen; Tummons, Jonathan; Brown, Peggy Alexiadis
2015-11-01
Distributed medical education (DME) is a type of distance learning in which students participate in medical education from diverse geographic locations using Web conferencing, videoconferencing, e-learning, and similar tools. DME is becoming increasingly widespread in North America and around the world.Although relatively new to medical education, distance learning has a long history in the broader field of education and a related body of literature that speaks to the importance of engaging in rigorous and theoretically informed studies of distance learning. The existing DME literature is helpful, but it has been largely descriptive and lacks a critical "lens"-that is, a theoretical perspective from which to rigorously conceptualize and interrogate DME's social (relationships, people) and material (technologies, tools) aspects.The authors describe DME and theories about distance learning and show that such theories focus on social, pedagogical, and cognitive considerations without adequately taking into account material factors. They address this gap by proposing sociomateriality as a theoretical framework allowing researchers and educators to study DME and (1) understand and consider previously obscured actors, infrastructure, and other factors that, on the surface, seem unrelated and even unimportant; (2) see clearly how the social and material components of learning are intertwined in fluid, messy, and often uncertain ways; and (3) perhaps think differently, even in ways that disrupt traditional approaches, as they explore DME. The authors conclude that DME brings with it substantial investments of social and material resources, and therefore needs careful study, using approaches that embrace its complexity.
NASA Astrophysics Data System (ADS)
Bilki, Burak
2018-03-01
The Particle Flow Algorithms attempt to measure each particle in a hadronic jet individually, using the detector providing the best energy/momentum resolution. Therefore, the spatial segmentation of the calorimeter plays a crucial role. In this context, the CALICE Collaboration developed the Digital Hadron Calorimeter. The Digital Hadron Calorimeter uses Resistive Plate Chambers as active media and has a 1-bit resolution (digital) readout of 1 × 1 cm2 pads. The calorimeter was tested with steel and tungsten absorber structures, as well as with no absorber structure, at the Fermilab and CERN test beam facilities over several years. In addition to conventional calorimetric measurements, the Digital Hadron Calorimeter offers detailed measurements of event shapes, rigorous tests of simulation models and various tools for improved performance due to its very high spatial granularity. Here we report on the results from the analysis of pion and positron events. Results of comparisons with the Monte Carlo simulations are also discussed. The analysis demonstrates the unique utilization of detailed event topologies.
Benson, Neil; van der Graaf, Piet H; Peletier, Lambertus A
2017-11-15
A key element of the drug discovery process is target selection. Although the topic is subject to much discussion and experimental effort, there are no defined quantitative rules around optimal selection. Often 'rules of thumb' that have not been subjected to rigorous exploration are used. In this paper we explore the 'rule of thumb' notion that the molecule that initiates a pathway signal is the optimal target. Given the multi-factorial and complex nature of this question, we have simplified an example pathway to its logical minimum of two steps and used a mathematical model of it to explore the different options in the context of typical small and large molecule drugs. We report the conclusions of our analysis and describe the analysis tool and methods used. These provide a platform to enable a more extensive enquiry into this important topic. Copyright © 2017 Elsevier B.V. All rights reserved.
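A minimal sketch of the kind of two-step pathway comparison described above (not the authors' model): a linear signaling cascade S -> X1 -> X2, with a drug inhibiting either the first or the second step, integrated with SciPy. All rate constants and inhibition levels are arbitrary placeholders.

```python
# Sketch: two-step signaling cascade
#   dX1/dt = k1*S*(1 - i1) - d1*X1
#   dX2/dt = k2*X1*(1 - i2) - d2*X2
# Compare the steady-state output X2 when the inhibitor acts at step 1 vs step 2.
from scipy.integrate import solve_ivp

def cascade(t, y, i1, i2, S=1.0, k1=1.0, k2=1.0, d1=0.1, d2=0.1):
    x1, x2 = y
    dx1 = k1 * S * (1.0 - i1) - d1 * x1
    dx2 = k2 * x1 * (1.0 - i2) - d2 * x2
    return [dx1, dx2]

def output(i1, i2):
    sol = solve_ivp(cascade, (0.0, 200.0), [0.0, 0.0], args=(i1, i2))
    return sol.y[1, -1]   # X2 near steady state

baseline = output(0.0, 0.0)
print("50% inhibition at step 1:", output(0.5, 0.0) / baseline)
print("50% inhibition at step 2:", output(0.0, 0.5) / baseline)
```

In this purely linear sketch the two target choices give the same fractional knock-down at steady state; it is precisely nonlinearities, dynamics, and drug-specific constraints of the kind analysed above that can break this symmetry and make one target preferable.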
Simulation of Watts Bar Unit 1 Initial Startup Tests with Continuous Energy Monte Carlo Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Godfrey, Andrew T; Gehin, Jess C; Bekar, Kursat B
2014-01-01
The Consortium for Advanced Simulation of Light Water Reactors is developing a collection of methods and software products known as VERA, the Virtual Environment for Reactor Applications. One component of the testing and validation plan for VERA is comparison of neutronics results to a set of continuous energy Monte Carlo solutions for a range of pressurized water reactor geometries using the SCALE component KENO-VI developed by Oak Ridge National Laboratory. Recent improvements in data, methods, and parallelism have enabled KENO, previously utilized predominantly as a criticality safety code, to demonstrate excellent capability and performance for reactor physics applications. The highly detailed and rigorous KENO solutions provide a reliable numeric reference for VERA neutronics and also demonstrate the most accurate predictions achievable by modeling and simulation tools for comparison to operating plant data. This paper demonstrates the performance of KENO-VI for the Watts Bar Unit 1 Cycle 1 zero power physics tests, including reactor criticality, control rod worths, and isothermal temperature coefficients.
Hindrances to bistable front propagation: application to Wolbachia invasion.
Nadin, Grégoire; Strugarek, Martin; Vauchelet, Nicolas
2018-05-01
We study the biological situation when an invading population propagates and replaces an existing population with different characteristics. For instance, this may occur in the presence of a vertically transmitted infection causing a cytoplasmic effect similar to the Allee effect (e.g. Wolbachia in Aedes mosquitoes): the invading dynamics we model is bistable. We aim at quantifying the propagules (what does it take for an invasion to start?) and the invasive power (how far can an invading front go, and what can stop it?). We rigorously show that a heterogeneous environment inducing a strong enough population gradient can stop an invading front, which will converge in this case to a stable front. We characterize the critical population jump, and also prove the existence of unstable fronts above the stable (blocking) fronts. Being above the maximal unstable front enables an invading front to clear the obstacle and propagate further. We are particularly interested in the case of artificial Wolbachia infection, used as a tool to fight arboviruses.
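A minimal numerical sketch of the blocking phenomenon analysed above (not the authors' proof): a 1D bistable reaction-diffusion equation u_t = D u_xx + u(1-u)(u - theta(x)), where the Allee-type threshold theta(x) is raised on an interval to mimic an unfavourable region; an invading front launched on the left either crosses the obstacle or is stopped, depending on its strength and the obstacle's width. All parameter values are illustrative placeholders.

```python
# Sketch: explicit finite-difference integration of a bistable front
#   u_t = D u_xx + u (1 - u) (u - theta(x)),
# with theta(x) raised on an interval to act as an obstacle to invasion.
import numpy as np

D, L, nx, dt, steps = 1.0, 200.0, 800, 0.01, 100_000
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]

theta = np.full(nx, 0.3)                 # invasion favoured (theta < 0.5)
theta[(x > 120) & (x < 140)] = 0.7       # "obstacle": invasion disfavoured

u = np.where(x < 40, 1.0, 0.0)           # invading population on the left
for _ in range(steps):
    lap = np.zeros_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    lap[0], lap[-1] = lap[1], lap[-2]    # zero-flux boundaries (approximate)
    u += dt * (D * lap + u * (1 - u) * (u - theta))

# Report where the population first drops below one half (roughly the front).
front = x[np.argmax(u < 0.5)]
print(f"front position after integration: x ~ {front:.1f}")
```

With the settings above the front advances until it meets the unfavourable region and stalls there, the discrete analogue of convergence to a stable blocking front; shrinking the obstacle or lowering theta inside it lets the front clear the obstacle and propagate further.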
Modeling biochemical pathways in the gene ontology
Hill, David P.; D’Eustachio, Peter; Berardini, Tanya Z.; ...
2016-09-01
The concept of a biological pathway, an ordered sequence of molecular transformations, is used to collect and represent molecular knowledge for a broad span of organismal biology. Representations of biomedical pathways typically are rich but idiosyncratic presentations of organized knowledge about individual pathways. Meanwhile, biomedical ontologies and associated annotation files are powerful tools that organize molecular information in a logically rigorous form to support computational analysis. The Gene Ontology (GO), representing Molecular Functions, Biological Processes and Cellular Components, incorporates many aspects of biological pathways within its ontological representations. Here we present a methodology for extending and refining the classes in the GO for more comprehensive, consistent and integrated representation of pathways, leveraging knowledge embedded in current pathway representations such as those in the Reactome Knowledgebase and MetaCyc. With carbohydrate metabolic pathways as a use case, we discuss how our representation supports the integration of variant pathway classes into a unified ontological structure that can be used for data comparison and analysis.
A Test of the DSP Sexing Method on CT Images from a Modern French Sample.
Mestekova, Sarka; Bruzek, Jaroslav; Veleminska, Jana; Chaumoitre, Kathia
2015-09-01
The hip bone is considered to be one of the most reliable indicators in sex determination. The aim of this study was to test the reliability of the DSP method for the hip bone proposed by Murail et al. (Bull Mem Soc Anthropol Paris, 17, 2005, 167) on a sample from a present-day population in France (52 males and 54 females). Ten linear measurements were collected from three-dimensional models derived from computed tomography images (CTI). To quantify the proportions of correct sex determinations, a more rigorous posterior probability threshold of 0.95 was applied. Using all 10 measurements, 92.3% of males and 97.2% of females were sexed correctly. The percentage of undetermined specimens varied depending on the combination of measurements used; however, all determined sexes were assigned with 100% accuracy. This study shows that DSP is an appropriate and reliable tool for sex determination based on dimensions obtained from CTI. © 2015 American Academy of Forensic Sciences.
Bioengineering Solutions for Manufacturing Challenges in CAR T Cells
Piscopo, Nicole J.; Mueller, Katherine P.; Das, Amritava; Hematti, Peiman; Murphy, William L.; Palecek, Sean P.; Capitini, Christian M.
2017-01-01
The next generation of therapeutic products to be approved for the clinic is anticipated to be cell therapies, termed “living drugs” for their capacity to dynamically and temporally respond to changes during their production ex vivo and after their administration in vivo. Genetically engineered chimeric antigen receptor (CAR) T cells have rapidly developed into powerful tools to harness the power of immune system manipulation against cancer. Regulatory agencies are beginning to approve CAR T cell therapies due to their striking efficacy in treating some hematological malignancies. However, the engineering and manufacturing of such cells remains a challenge for widespread adoption of this technology. Bioengineering approaches including biomaterials, synthetic biology, metabolic engineering, process control and automation, and in vitro disease modeling could offer promising methods to overcome some of these challenges. Here, we describe the manufacturing process of CAR T cells, highlighting potential roles for bioengineers to partner with biologists and clinicians to advance the manufacture of these complex cellular products under rigorous regulatory and quality control. PMID:28840981
Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory
ERIC Educational Resources Information Center
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…
Evaluating habitat suitability models for nesting white-headed woodpeckers in unburned forest
Quresh S. Latif; Victoria A. Saab; Kim Mellen-Mclean; Jonathan G. Dudley
2015-01-01
Habitat suitability models can provide guidelines for species conservation by predicting where species of interest are likely to occur. Presence-only models are widely used but typically provide only relative indices of habitat suitability (HSIs), necessitating rigorous evaluation often using independently collected presence-absence data. We refined and evaluated...
Conservatoire Students' Experiences and Perceptions of Instrument-Specific Master Classes
ERIC Educational Resources Information Center
Long, Marion; Creech, Andrea; Gaunt, Helena; Hallam, Susan
2014-01-01
Historically, in the professional training of musicians, the master-apprentice model has played a central role in instilling the methods and values of the discipline, contributing to the rigorous formation of talent. Expert professional musicians advocate that certain thinking skills can be modelled through the master-apprentice model, yet its…
NASA Astrophysics Data System (ADS)
Mc Namara, Hugh A.; Pokrovskii, Alexei V.
2006-02-01
The Kaldor model-one of the first nonlinear models of macroeconomics-is modified to incorporate a Preisach nonlinearity. The new dynamical system thus created shows highly complicated behaviour. This paper presents a rigorous (computer aided) proof of chaos in this new model, and of the existence of unstable periodic orbits of all minimal periods p>57.
Designing an Educational Game with Ten Steps to Complex Learning
ERIC Educational Resources Information Center
Enfield, Jacob
2012-01-01
Few instructional design (ID) models exist which are specific for developing educational games. Moreover, those extant ID models have not been rigorously evaluated. No ID models were found which focus on educational games with complex learning objectives. "Ten Steps to Complex Learning" (TSCL) is based on the four component instructional…
Vaporization and Zonal Mixing in Performance Modeling of Advanced LOX-Methane Rockets
NASA Technical Reports Server (NTRS)
Williams, George J., Jr.; Stiegemeier, Benjamin R.
2013-01-01
Initial modeling of LOX-Methane reaction control engine (RCE) 100 lbf thrusters and larger, 5500 lbf thrusters with the TDK/VIPER code has shown good agreement with sea-level and altitude test data. However, the vaporization and zonal mixing upstream of the compressible flow stage of the models leveraged empirical trends to match the sea-level data. This was necessary in part because the codes are designed primarily to handle the compressible part of the flow (i.e. contraction through expansion) and in part because there was limited data on the thrusters themselves on which to base a rigorous model. A more rigorous model has been developed which includes detailed vaporization trends based on element type and geometry, radial variations in mixture ratio within each of the "zones" associated with elements and not just between zones of different element types, and, to the extent possible, updated kinetic rates. The Spray Combustion Analysis Program (SCAP) was leveraged to support assumptions in the vaporization trends. Data from both thrusters are revisited, and the model maintains a good predictive capability while addressing some of the major limitations of the previous version.
NASA Astrophysics Data System (ADS)
Kim, Seongryong; Tkalčić, Hrvoje; Mustać, Marija; Rhie, Junkee; Ford, Sean
2016-04-01
A framework is presented within which we provide rigorous estimates of seismic sources and structures in Northeast Asia. We use Bayesian inversion methods, which enable statistical estimations of models and their uncertainties based on data information. Ambiguities in error statistics and model parameterizations are addressed by hierarchical and trans-dimensional (trans-D) techniques, which can be inherently implemented in the Bayesian inversions. Hence reliable estimation of model parameters and their uncertainties is possible, thus avoiding arbitrary regularizations and parameterizations. Hierarchical and trans-D inversions are performed to develop a three-dimensional velocity model using ambient noise data. To further improve the model, we perform joint inversions with receiver function data using a newly developed Bayesian method. For the source estimation, a novel moment tensor inversion method is presented and applied to regional waveform data of the North Korean nuclear explosion tests. By the combination of new Bayesian techniques and the structural model, coupled with meaningful uncertainties related to each of the processes, more quantitative monitoring and discrimination of seismic events are possible.
Predictive QSAR modeling workflow, model applicability domains, and virtual screening.
Tropsha, Alexander; Golbraikh, Alexander
2007-01-01
Quantitative Structure Activity Relationship (QSAR) modeling has been traditionally applied as an evaluative approach, i.e., with the focus on developing retrospective and explanatory models of existing data. Model extrapolation was considered if only in hypothetical sense in terms of potential modifications of known biologically active chemicals that could improve compounds' activity. This critical review re-examines the strategy and the output of the modern QSAR modeling approaches. We provide examples and arguments suggesting that current methodologies may afford robust and validated models capable of accurate prediction of compound properties for molecules not included in the training sets. We discuss a data-analytical modeling workflow developed in our laboratory that incorporates modules for combinatorial QSAR model development (i.e., using all possible binary combinations of available descriptor sets and statistical data modeling techniques), rigorous model validation, and virtual screening of available chemical databases to identify novel biologically active compounds. Our approach places particular emphasis on model validation as well as the need to define model applicability domains in the chemistry space. We present examples of studies where the application of rigorously validated QSAR models to virtual screening identified computational hits that were confirmed by subsequent experimental investigations. The emerging focus of QSAR modeling on target property forecasting brings it forward as predictive, as opposed to evaluative, modeling approach.
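A minimal sketch of the general workflow outlined above (not the authors' combinatorial implementation): fit a model on a training set, validate it by cross-validation and on an external test set, and apply a simple distance-based applicability domain before predicting for new (virtual-screening) compounds. The descriptor matrix and activities are random placeholders standing in for computed chemical descriptors.

```python
# Sketch: QSAR-style workflow with external validation and a simple
# distance-to-training-set applicability domain.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score, train_test_split

rng = np.random.default_rng(7)
X = rng.normal(size=(300, 20))                          # descriptors (placeholder)
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.3, 300)   # "activity" (placeholder)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
print("5-fold CV R^2:", cross_val_score(model, X_train, y_train, cv=5).mean())
model.fit(X_train, y_train)
print("external test R^2:", model.score(X_test, y_test))

# Applicability domain: flag screening compounds far from the training data.
centroid = X_train.mean(axis=0)
radius = np.percentile(np.linalg.norm(X_train - centroid, axis=1), 95)
X_screen = rng.normal(size=(1000, 20))                  # virtual library placeholder
inside = np.linalg.norm(X_screen - centroid, axis=1) <= radius
hits = model.predict(X_screen[inside])
print(f"{inside.sum()} of {len(X_screen)} screening compounds inside the domain")
```

Only compounds inside the applicability domain receive predictions, mirroring the emphasis above on restricting virtual screening to the region of chemistry space where the model has been validated.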
Nielsen, Joseph; Tokuhiro, Akira; Hiromoto, Robert; ...
2015-11-13
Evaluation of the impacts of uncertainty and sensitivity in modeling presents a significant set of challenges, in particular for high fidelity modeling. Computational costs and validation of models create a need for cost-effective decision making with regard to experiment design. Experiments designed to validate computational models can be used to reduce uncertainty in the physical model. In some cases, large uncertainty in a particular aspect of the model may or may not have a large impact on the final results. For example, modeling of a relief valve may result in large uncertainty; however, the actual effects on final peak clad temperature in a reactor transient may be small, and the large uncertainty with respect to valve modeling may be considered acceptable. Additionally, the ability to determine the adequacy of a model and the validation supporting it should be considered within a risk-informed framework. Low fidelity modeling with large uncertainty may be considered adequate if the uncertainty is considered acceptable with respect to risk. In other words, models that are used to evaluate the probability of failure should be evaluated more rigorously with the intent of increasing safety margin. Probabilistic risk assessment (PRA) techniques have traditionally been used to identify accident conditions and transients. Traditional classical event tree methods utilize analysts’ knowledge and experience to identify the important timing of events in coordination with thermal-hydraulic modeling. These methods lack the capability to evaluate complex dynamic systems. In these systems, time and energy scales associated with transient events may vary as a function of transition times and energies to arrive at a different physical state. Dynamic PRA (DPRA) methods provide a more rigorous analysis of complex dynamic systems. Unfortunately DPRA methods introduce issues associated with combinatorial explosion of states. This study presents a methodology to address combinatorial explosion using a Branch-and-Bound algorithm applied to Dynamic Event Trees (DET), which utilize LENDIT (L – Length, E – Energy, N – Number, D – Distribution, I – Information, and T – Time) as well as a set theory to describe system, state, resource, and response (S2R2) sets to create bounding functions for the DET. The optimization of the DET in identifying high probability failure branches is extended to create a Phenomenological Identification and Ranking Table (PIRT) methodology to evaluate modeling parameters important to safety of those failure branches that have a high probability of failure. The PIRT can then be used as a tool to identify and evaluate the need for experimental validation of models that have the potential to reduce risk. Finally, in order to demonstrate this methodology, a Boiling Water Reactor (BWR) Station Blackout (SBO) case study is presented.
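A minimal, hypothetical sketch of the branch-and-bound idea applied to an event tree (not the authors' LENDIT/S2R2 formulation): branches are expanded depth-first, and any branch whose path probability, an upper bound on the probability of every sequence below it, falls under a cutoff is pruned rather than expanded further. The branching events, probabilities, and cutoff are placeholders.

```python
# Sketch: prune an event tree with a probability bound. Each node carries the
# probability of reaching that branch point; because probability can only
# shrink with depth, it bounds every sequence in the subtree below.
from dataclasses import dataclass, field

@dataclass
class Node:
    time: float
    prob: float                 # probability of reaching this branch point
    events: list = field(default_factory=list)

BRANCHINGS = [                  # (event name, failure probability) placeholders
    ("relief_valve_sticks", 0.05),
    ("diesel_generator_fails", 0.10),
    ("operator_action_late", 0.20),
]
CUTOFF = 1e-3                   # prune branches bounded below this probability

def expand(node, depth=0):
    if depth == len(BRANCHINGS):
        if node.events:         # at least one failure occurred on this path
            print(f"candidate failure sequence p={node.prob:.2e}: "
                  + ", ".join(node.events))
        return
    name, p_fail = BRANCHINGS[depth]
    for happened, p in ((True, p_fail), (False, 1.0 - p_fail)):
        child_prob = node.prob * p
        if child_prob < CUTOFF:  # bound: path probability only shrinks deeper
            continue             # prune this subtree without exploring it
        child = Node(node.time + 1.0, child_prob,
                     node.events + ([name] if happened else []))
        expand(child, depth + 1)

expand(Node(time=0.0, prob=1.0))
```

The surviving high-probability failure sequences are the ones that would feed a PIRT-style ranking of which modeling parameters most deserve experimental validation, in the spirit of the methodology described above.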
NASA Technical Reports Server (NTRS)
Tanveer, S.; Foster, M. R.
2002-01-01
We report progress in three areas of investigation related to dendritic crystal growth. Those items include: 1) selection of tip features in dendritic crystal growth; 2) investigation of nonlinear evolution for the two-sided model; and 3) rigorous mathematical justification.
Accurate Biomass Estimation via Bayesian Adaptive Sampling
NASA Technical Reports Server (NTRS)
Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay
2005-01-01
The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.
NASA Astrophysics Data System (ADS)
Wolff, Schuyler G.; Perrin, Marshall D.; Stapelfeldt, Karl; Duchêne, Gaspard; Ménard, Francois; Padgett, Deborah; Pinte, Christophe; Pueyo, Laurent; Fischer, William J.
2017-12-01
We present new Hubble Space Telescope (HST) Advanced Camera for Surveys observations and detailed models for a recently discovered edge-on protoplanetary disk around ESO-Hα 569 (a low-mass T Tauri star in the Cha I star-forming region). Using radiative transfer models, we probe the distribution of the grains and overall shape of the disk (inclination, scale height, dust mass, flaring exponent, and surface/volume density exponent) by model fitting to multiwavelength (F606W and F814W) HST observations together with a literature-compiled spectral energy distribution. A new tool set was developed for finding optimal fits of MCFOST radiative transfer models using the MCMC code emcee to efficiently explore the high-dimensional parameter space. It is able to self-consistently and simultaneously fit a wide variety of observables in order to place constraints on the physical properties of a given disk, while also rigorously assessing the uncertainties in those derived properties. We confirm that ESO-Hα 569 is an optically thick nearly edge-on protoplanetary disk. The shape of the disk is well-described by a flared disk model with an exponentially tapered outer edge, consistent with models previously advocated on theoretical grounds and supported by millimeter interferometry. The scattered-light images and spectral energy distribution are best fit by an unusually high total disk mass (gas+dust assuming a ratio of 100:1) with a disk-to-star mass ratio of 0.16.
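A minimal sketch of the emcee-driven fitting loop described above, with a cheap analytic stand-in (a straight line plus Gaussian noise) replacing the expensive MCFOST radiative transfer model; the data, likelihood, priors, and sampler settings are all placeholder assumptions, and emcee must be installed.

```python
# Sketch: posterior sampling with emcee, using a trivial analytic model as a
# stand-in for an expensive radiative-transfer calculation.
import numpy as np
import emcee

rng = np.random.default_rng(3)
x = np.linspace(0.0, 10.0, 50)
y_obs = 1.5 * x + 2.0 + rng.normal(0.0, 0.5, x.size)
y_err = 0.5

def log_prob(theta):
    slope, intercept = theta
    if not (-10 < slope < 10 and -10 < intercept < 10):   # flat priors
        return -np.inf
    resid = y_obs - (slope * x + intercept)
    return -0.5 * np.sum((resid / y_err) ** 2)

nwalkers, ndim = 32, 2
p0 = rng.normal([1.0, 1.0], 0.1, size=(nwalkers, ndim))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 2000)

chain = sampler.get_chain(discard=500, flat=True)       # drop burn-in, flatten
print("slope     = %.3f +/- %.3f" % (chain[:, 0].mean(), chain[:, 0].std()))
print("intercept = %.3f +/- %.3f" % (chain[:, 1].mean(), chain[:, 1].std()))
```

In the disk-fitting context described above, log_prob would instead call the radiative transfer code and compare synthetic images and photometry to the observations, with the posterior samples providing the parameter uncertainties quoted for the disk model.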
NASA Technical Reports Server (NTRS)
Patterson, Jonathan D.; Breckenridge, Jonathan T.; Johnson, Stephen B.
2013-01-01
Building upon the purpose, theoretical approach, and use of a Goal-Function Tree (GFT) being presented by Dr. Stephen B. Johnson, described in a related Infotech 2013 ISHM abstract titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management", this paper will describe the core framework used to implement the GFT-based systems engineering process using the Systems Modeling Language (SysML). These two papers are ideally accepted and presented together in the same Infotech session. Statement of problem: SysML, as a tool, is currently not capable of implementing the theoretical approach described within the "Goal-Function Tree Modeling for Systems Engineering and Fault Management" paper cited above. More generally, SysML's current capabilities to model functional decompositions in the rigorous manner required in the GFT approach are limited. The GFT is a new Model-Based Systems Engineering (MBSE) approach to the development of goals and requirements, functions, and their linkage to design. Because SysML is a growing standard for systems engineering, it is important to develop methods to implement the GFT in SysML. Proposed Method of Solution: Many of the central concepts of the SysML language are needed to implement a GFT for large complex systems. In the implementation of those central concepts, the following will be described in detail: changes to the nominal SysML process, model view definitions and examples, diagram definitions and examples, and detailed SysML construct and stereotype definitions.
Use of Temperature to Improve West Nile Virus Forecasts
NASA Astrophysics Data System (ADS)
Shaman, J. L.; DeFelice, N.; Schneider, Z.; Little, E.; Barker, C.; Caillouet, K.; Campbell, S.; Damian, D.; Irwin, P.; Jones, H.; Townsend, J.
2017-12-01
Ecological and laboratory studies have demonstrated that temperature modulates West Nile virus (WNV) transmission dynamics and spillover infection to humans. Here we explore whether the inclusion of temperature forcing in a model depicting WNV transmission improves WNV forecast accuracy relative to a baseline model depicting WNV transmission without temperature forcing. Both models are optimized using a data assimilation method and two observed data streams: mosquito infection rates and reported human WNV cases. Each coupled model-inference framework is then used to generate retrospective ensemble forecasts of WNV for 110 outbreak years from among 12 geographically diverse United States counties. The temperature-forced model improves forecast accuracy for much of the outbreak season. From the end of July until the beginning of October, a timespan during which 70% of human cases are reported, the temperature-forced model generated forecasts of the total number of human cases over the next 3 weeks, total number of human cases over the season, the week with the highest percentage of infectious mosquitoes, and the peak percentage of infectious mosquitoes that were on average 5%, 10%, 12%, and 6% more accurate, respectively, than the baseline model. These results indicate that use of temperature forcing improves WNV forecast accuracy and provide further evidence that temperatures influence rates of WNV transmission. The findings help build a foundation for implementation of a statistically rigorous system for real-time forecast of seasonal WNV outbreaks and their use as a quantitative decision support tool for public health officials and mosquito control programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhardwaj, Shubhendu; Sensale-Rodriguez, Berardi; Xing, Huili Grace
A rigorous theoretical and computational model is developed for plasma-wave propagation in high electron mobility transistor structures with electron injection from a resonant tunneling diode at the gate. We discuss the conditions in which low-loss and sustainable plasmon modes can be supported in such structures. The developed analytical model is used to derive the dispersion relation for these plasmon modes. A non-linear full-wave-hydrodynamic numerical solver is also developed using a finite difference time domain algorithm. The developed analytical solutions are validated via the numerical solution. We also verify previous observations that were based on a simplified transmission line model. It is shown that at high levels of negative differential conductance, plasmon amplification is indeed possible. The proposed rigorous models can enable accurate design and optimization of practical resonant tunnel diode-based plasma-wave devices for terahertz sources, mixers, and detectors, by allowing a precise representation of their coupling when integrated with other electromagnetic structures.
Including Magnetostriction in Micromagnetic Models
NASA Astrophysics Data System (ADS)
Conbhuí, Pádraig Ó.; Williams, Wyn; Fabian, Karl; Nagy, Lesleis
2016-04-01
The magnetic anomalies that identify crustal spreading are predominantly recorded by basalts formed at the mid-ocean ridges, whose magnetic signals are dominated by iron-titanium oxides (Fe3-xTixO4), so-called "titanomagnetites", of which the Fe2.4Ti0.6O4 (TM60) phase is the most common. With sufficient quantities of titanium present, these minerals exhibit strong magnetostriction. To date, models of these grains in the pseudo-single domain (PSD) range have failed to accurately account for this effect. In particular, a popular analytic treatment provided by Kittel (1949), which describes the magnetostrictive energy as an effective increase of the anisotropy constant, can produce unphysical strains for non-uniform magnetizations. I will present a rigorous approach based on work by Brown (1966) and by Kroner (1958) for including magnetostriction in micromagnetic codes which is suitable for modelling hysteresis loops and finding remanent states in the PSD regime. Preliminary results suggest that the more rigorously defined micromagnetic models exhibit higher coercivities and extended single domain ranges when compared to more simplistic approaches.
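For context, the Kittel-style shortcut criticized here is usually quoted in the isotropic, uniform-magnetization form below (standard micromagnetics textbook material, stated here for orientation rather than taken from the abstract), in which a uniaxial stress simply augments the anisotropy constant:

```latex
E_{\sigma} = \tfrac{3}{2}\,\lambda_{s}\,\sigma\,\sin^{2}\theta,
\qquad
K_{\mathrm{eff}} = K_{1} + \tfrac{3}{2}\,\lambda_{s}\,\sigma,
```

where $\lambda_{s}$ is the isotropic magnetostriction constant and $\theta$ the angle between the magnetization and the stress axis. The abstract's point is that this effective-anisotropy treatment can yield unphysical strains once the magnetization is non-uniform, motivating the more rigorous Brown/Kroner formulation.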
Elf, Johan
2016-04-27
A new, game-changing approach makes it possible to rigorously disprove models without making assumptions about the unknown parts of the biological system. Copyright © 2016 Elsevier Inc. All rights reserved.
Realistic wave-optics simulation of X-ray phase-contrast imaging at a human scale
Sung, Yongjin; Segars, W. Paul; Pan, Adam; Ando, Masami; Sheppard, Colin J. R.; Gupta, Rajiv
2015-01-01
X-ray phase-contrast imaging (XPCI) can dramatically improve soft tissue contrast in X-ray medical imaging. Despite worldwide efforts to develop novel XPCI systems, a numerical framework to rigorously predict the performance of a clinical XPCI system at a human scale is not yet available. We have developed such a tool by combining a numerical anthropomorphic phantom defined with non-uniform rational B-splines (NURBS) and a wave optics-based simulator that can accurately capture the phase-contrast signal from a human-scaled numerical phantom. Using a synchrotron-based, high-performance XPCI system, we provide qualitative comparison between simulated and experimental images. Our tool can be used to simulate the performance of XPCI on various disease entities and compare proposed XPCI systems in an unbiased manner. PMID:26169570
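To make the wave-optics step concrete, here is a minimal, self-contained sketch of propagation-based phase contrast via the angular spectrum (Fresnel transfer function) method. The phantom is a single hypothetical cylinder rather than the NURBS-defined anthropomorphic phantom, and the geometry and material values are invented; none of this reflects the authors' actual simulator.

```python
# Minimal sketch (not the authors' simulator): Fresnel propagation of a weak phase
# object with the angular spectrum method, the basic wave-optics step behind
# propagation-based X-ray phase contrast. Geometry and material values are hypothetical.
import numpy as np

wavelength = 6.2e-11          # roughly 20 keV X-rays, in meters
n_pix, pixel = 1024, 1e-6     # detector grid
z = 1.0                       # propagation distance in meters

# Phase object: a cylinder of hypothetical soft tissue (refractive index decrement ~4e-7).
x = (np.arange(n_pix) - n_pix / 2) * pixel
X, Y = np.meshgrid(x, x)
radius = 100e-6
thickness = 2 * np.sqrt(np.clip(radius**2 - X**2, 0, None))   # projected path length
delta = 4e-7
field = np.exp(-1j * 2 * np.pi / wavelength * delta * thickness)

# Angular spectrum propagation to distance z.
fx = np.fft.fftfreq(n_pix, d=pixel)
FX, FY = np.meshgrid(fx, fx)
H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))     # Fresnel transfer function
intensity = np.abs(np.fft.ifft2(np.fft.fft2(field) * H))**2

print(intensity.min(), intensity.max())   # edge enhancement appears as intensity fringes
```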
The NGEE Arctic Data Archive -- Portal for Archiving and Distributing Data and Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boden, Thomas A; Palanisamy, Giri; Devarakonda, Ranjeet
2014-01-01
The Next-Generation Ecosystem Experiments (NGEE Arctic) project is committed to implementing a rigorous and high-quality data management program. The goal is to implement innovative and cost-effective guidelines and tools for collecting, archiving, and sharing data within the project, the larger scientific community, and the public. The NGEE Arctic web site is the framework for implementing these data management and data sharing tools. The open sharing of NGEE Arctic data among project researchers, the broader scientific community, and the public is critical to meeting the scientific goals and objectives of the NGEE Arctic project and critical to advancing the mission of the Department of Energy (DOE), Office of Science, Biological and Environmental Research (BER) Terrestrial Ecosystem Science (TES) program.
Constructing Optimal Coarse-Grained Sites of Huge Biomolecules by Fluctuation Maximization.
Li, Min; Zhang, John Zenghui; Xia, Fei
2016-04-12
Coarse-grained (CG) models are valuable tools for the study of functions of large biomolecules on large length and time scales. The definition of CG representations for huge biomolecules is always a formidable challenge. In this work, we propose a new method called fluctuation maximization coarse-graining (FM-CG) to construct the CG sites of biomolecules. The defined residual in FM-CG converges to a maximal value as the number of CG sites increases, allowing an optimal CG model to be rigorously defined on the basis of the maximum. More importantly, we developed a robust algorithm called stepwise local iterative optimization (SLIO) to accelerate the process of coarse-graining large biomolecules. By means of the efficient SLIO algorithm, the computational cost of coarse-graining large biomolecules is reduced to within the time scale of seconds, which is far lower than that of conventional simulated annealing. The coarse-graining of two huge systems, chaperonin GroEL and lengsin, indicates that our new methods can coarse-grain huge biomolecular systems with up to 10,000 residues within the time scale of minutes. The further parametrization of CG sites derived from FM-CG allows us to construct the corresponding CG models for studies of the functions of huge biomolecular systems.
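As a rough analogue of the site-selection idea (not FM-CG or SLIO themselves), the toy sketch below greedily picks coarse-grained sites that maximize a fluctuation-based score computed from a covariance matrix. The matrix here is synthetic, and the log-determinant score is only a stand-in for the residual defined in the paper.

```python
# Toy analogue only (not FM-CG or SLIO): greedily select coarse-grained sites that
# capture the most non-redundant residue fluctuation, using a synthetic covariance
# matrix in place of one derived from an elastic-network or MD model.
import numpy as np

rng = np.random.default_rng(0)
n_res, n_sites = 200, 12
A = rng.standard_normal((n_res, n_res))
cov = A @ A.T / n_res                      # synthetic positive-definite fluctuation matrix

chosen = []
for _ in range(n_sites):
    best, best_gain = None, -np.inf
    for i in range(n_res):
        if i in chosen:
            continue
        idx = chosen + [i]
        # Score: log-determinant of the covariance block over the candidate sites,
        # which rewards large fluctuations while penalizing redundant (correlated) picks.
        sign, logdet = np.linalg.slogdet(cov[np.ix_(idx, idx)])
        gain = logdet if sign > 0 else -np.inf
        if gain > best_gain:
            best, best_gain = i, gain
    chosen.append(best)

sign, final = np.linalg.slogdet(cov[np.ix_(chosen, chosen)])
print("chosen sites:", sorted(chosen), "log-det score:", round(final, 3))
```

The exhaustive inner loop above is exactly the kind of cost that a stepwise local optimization such as SLIO is designed to avoid for systems with thousands of residues.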
Safety Verification of the Small Aircraft Transportation System Concept of Operations
NASA Technical Reports Server (NTRS)
Carreno, Victor; Munoz, Cesar
2005-01-01
A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is achieved through a rigorous process that involves human factors, low- and high-fidelity simulations, and flight experiments. Because this process is usually performed on final products or functional prototypes, concept modifications resulting from it are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state-exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models, which are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), a computer-based specification language and theorem-proving assistant.
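The contrast with testing is easiest to see on a toy example. The sketch below exhaustively enumerates every reachable state of a small, invented transition system and checks a safety property on each one; it is purely illustrative and has nothing to do with the actual SATS ConOps models or the PVS proofs.

```python
# Illustrative contrast with simulation (not the SATS models or PVS): exhaustive
# breadth-first exploration of a toy discrete transition system, checking a safety
# property on every reachable state rather than on sampled runs.
from collections import deque

# Toy model: two aircraft tokens moving through {approach, final, missed} slots,
# with the safety property "never two aircraft on final simultaneously".
def successors(state):
    a, b = state
    moves = {"approach": ("final",), "final": ("missed",), "missed": ("approach",)}
    for na in moves[a] + (a,):
        for nb in moves[b] + (b,):
            if (na, nb) != (a, b):
                yield (na, nb)

def safe(state):
    return state != ("final", "final")

def check(initial):
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        if not safe(s):
            return False, s            # counterexample state found
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return True, None                   # property holds on every reachable state

print(check(("approach", "missed")))    # this toy protocol is unsafe; BFS finds the violation
```

A theorem prover such as PVS goes further than this finite enumeration: it can establish such properties symbolically, including for hybrid models with infinite state spaces.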
NASA Technical Reports Server (NTRS)
Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Myers, Jerry
2015-01-01
Human missions beyond low Earth orbit to destinations such as Mars and asteroids will expose astronauts to novel operational conditions that may pose health risks that are currently not well understood and perhaps unanticipated. In addition, there are limited clinical and research data to inform development and implementation of health risk countermeasures for these missions. Consequently, NASA's Digital Astronaut Project (DAP) is working to develop and implement computational models and simulations (M&S) to help predict and assess spaceflight health and performance risks, and to enhance countermeasure development. In order to accomplish these goals effectively, the DAP evaluates its models and simulations via a rigorous verification, validation, and credibility assessment process to ensure that the computational tools are sufficiently reliable both to inform research intended to mitigate potential risk and to guide countermeasure development. In doing so, DAP works closely with end-users, such as space life science researchers, to establish appropriate M&S credibility thresholds. We will present and demonstrate the process the DAP uses to vet computational M&S for space biomedical analysis using real M&S examples. We will also provide recommendations on how the larger space biomedical community can employ these concepts to enhance the credibility of their M&S codes.
Na, Hyuntae; Lee, Seung-Yub; Üstündag, Ersan; ...
2013-01-01
This paper introduces a recent development and application of a noncommercial artificial neural network (ANN) simulator with a graphical user interface (GUI) to assist in rapid data modeling and analysis in the engineering diffraction field. The real-time network training/simulation monitoring tool has been customized for the study of constitutive behavior of engineering materials, and it has improved the data mining and forecasting capabilities of neural networks. This software has been used to train on and simulate finite element modeling (FEM) data for a fiber composite system, both forward and inverse. The forward neural network simulation precisely replicates FEM results several orders of magnitude faster than the slow original FEM. The inverse simulation is more challenging; yet, material parameters can be meaningfully determined with the aid of parameter sensitivity information. The simulator GUI also reveals that the output node size for material parameters and the input normalization method for strain data are critical training conditions for the inverse network. The successful use of ANN modeling and the simulator GUI has been validated with engineering neutron diffraction experimental data by determining constitutive laws of real fiber composite materials via a mathematically rigorous and physically meaningful parameter search process, once the networks are successfully trained from the FEM database.
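A heavily simplified sketch of the forward-surrogate idea (not the authors' simulator or its GUI) is shown below: a small multilayer perceptron is trained to map two hypothetical material parameters to a synthetic strain response that stands in for FEM output.

```python
# Hedged sketch of the general idea (not the authors' tool): train a small neural
# network as a fast forward surrogate mapping material parameters to a strain
# response, with a synthetic function standing in for the FEM database.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
params = rng.uniform([50.0, 0.1], [400.0, 0.6], size=(2000, 2))   # e.g. stiffness and a hardening-like parameter
stress = np.linspace(0.1, 1.0, 5)                                 # 5 applied stress levels
# Synthetic "FEM" strain response (purely illustrative functional form).
strain = stress[None, :] / params[:, [0]] + params[:, [1]] * stress[None, :] ** 2

X_train, X_test, y_train, y_test = train_test_split(params, strain, random_state=0)
net = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000, random_state=0),
)
net.fit(X_train, y_train)
print("surrogate R^2 on held-out data:", round(net.score(X_test, y_test), 3))
```

The inverse problem discussed in the abstract would flip the mapping (strain data in, material parameters out), which is where normalization choices and parameter sensitivity become critical.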
Global surgery: current evidence for improving surgical care.
Fuller, Jennifer C; Shaye, David A
2017-08-01
The field of global surgery is undergoing rapid transformation, owing to several recent prominent reports positioning it as a cost-effective means of relieving global disease burden. The purpose of this article is to review recent advances in the field of global surgery. Efforts to grow the global surgical workforce and procedural capacity have focused on innovative methods to increase surgeon training, enhance international collaboration, leverage technology, optimize existing health systems, and safely implement task-sharing. Computer modeling offers a novel means of informing policy to optimize timely access to care, equitably promote health and financial protection, and efficiently grow infrastructure. Tools and checklists have recently been developed to enhance data collection and ensure methodologically rigorous publications to inform planning, benchmark surgical systems, promote accurate modeling, track key health indicators, and promote safety. Creation of institutional partnerships and trainee exchanges can enrich training, stimulate commitment to humanitarian work, and promote the equal exchange of ideas and expertise. The recent body of work creates a strong foundation upon which work toward the goal of universal access to safe, affordable surgical care can be built; however, further collection and analysis of country-specific data is necessary for accurate modeling, and outcomes research into the efficacy of policies such as task-sharing is greatly needed.
New statistical potential for quality assessment of protein models and a survey of energy functions
2010-01-01
Background: Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results: The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility and torsion angles, and accounting for secondary structure preferences and side chain orientation. Partially based on these observations, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions: Among the most influential terms we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanics potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality. PMID:20226048
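For orientation, knowledge-based potentials of this kind are typically built from the generic inverse-Boltzmann relation below (a textbook form, not the paper's specific formulation); the shuffled reference state and side-chain orientation terms described above refine how the reference distribution is defined.

```latex
E_{ij}(r) \;=\; -\,k_{B}T\,\ln\!\frac{P^{\,ij}_{\mathrm{obs}}(r)}{P_{\mathrm{ref}}(r)}
```

Here $P^{\,ij}_{\mathrm{obs}}(r)$ is the observed distance distribution for interaction centers of types $i$ and $j$ in a database of native structures, and $P_{\mathrm{ref}}(r)$ is the reference-state distribution whose definition, as the Conclusions note, strongly influences accuracy.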
A quantitative model for oxygen uptake and release in a family of hemeproteins.
Bustamante, Juan P; Szretter, María E; Sued, Mariela; Martí, Marcelo A; Estrin, Darío A; Boechi, Leonardo
2016-06-15
Hemeproteins have many diverse functions that largely depend on the rate at which they take up or release small ligands, like oxygen. These proteins have been extensively studied using either simulations or experiments, albeit only qualitatively and one or two proteins at a time. We present a physical-chemical model, which uses data obtained exclusively from computer simulations, to describe the uptake and release of oxygen in a family of hemeproteins called truncated hemoglobins (trHbs). Through a rigorous statistical analysis we demonstrate that our model successfully recaptures all the reported experimental oxygen association and dissociation kinetic rate constants, thus allowing us to establish the key factors that determine the rates at which these hemeproteins take up and release oxygen. We found that internal tunnels as well as the distal site water molecules control ligand uptake, whereas oxygen stabilization by distal site residues controls ligand release. Because these rates largely determine the functions of these hemeproteins, these approaches will also be important tools for characterizing trHb family members with unknown functions. lboechi@ic.fcen.uba.ar. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Integrated Medical Model Verification, Validation, and Credibility
NASA Technical Reports Server (NTRS)
Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry
2014-01-01
The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. RESULTS: IMM VVC updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM VV Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and simulation tool, and completion of service requests from a broad end user consortium including Operations, Science and Technology Planning, and Exploration Planning. CONCLUSIONS: The VVC approach established by the IMM project of combining the IMM VV Plan with 7009 requirements is comprehensive and includes the involvement of end users at every stage in IMM evolution. Methods and techniques used to quantify the VVC status of the IMM have not only received approval from the local NASA community but have also garnered recognition by other federal agencies seeking to develop similar guidelines in the medical modeling community.
DIMM-SC: a Dirichlet mixture model for clustering droplet-based single cell transcriptomic data.
Sun, Zhe; Wang, Ting; Deng, Ke; Wang, Xiao-Feng; Lafyatis, Robert; Ding, Ying; Hu, Ming; Chen, Wei
2018-01-01
Single cell transcriptome sequencing (scRNA-Seq) has become a revolutionary tool to study cellular and molecular processes at single cell resolution. Among existing technologies, the recently developed droplet-based platform enables efficient parallel processing of thousands of single cells with direct counting of transcript copies using Unique Molecular Identifier (UMI). Despite the technology advances, statistical methods and computational tools are still lacking for analyzing droplet-based scRNA-Seq data. Particularly, model-based approaches for clustering large-scale single cell transcriptomic data are still under-explored. We developed DIMM-SC, a Dirichlet Mixture Model for clustering droplet-based Single Cell transcriptomic data. This approach explicitly models UMI count data from scRNA-Seq experiments and characterizes variations across different cell clusters via a Dirichlet mixture prior. We performed comprehensive simulations to evaluate DIMM-SC and compared it with existing clustering methods such as K-means, CellTree and Seurat. In addition, we analyzed public scRNA-Seq datasets with known cluster labels and in-house scRNA-Seq datasets from a study of systemic sclerosis with prior biological knowledge to benchmark and validate DIMM-SC. Both simulation studies and real data applications demonstrated that overall, DIMM-SC achieves substantially improved clustering accuracy and much lower clustering variability compared to other existing clustering methods. More importantly, as a model-based approach, DIMM-SC is able to quantify the clustering uncertainty for each single cell, facilitating rigorous statistical inference and biological interpretations, which are typically unavailable from existing clustering methods. DIMM-SC has been implemented in a user-friendly R package with a detailed tutorial available on www.pitt.edu/∼wec47/singlecell.html. wei.chen@chp.edu or hum@ccf.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
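To illustrate the model-based clustering idea in miniature (this is a simplified multinomial mixture fitted by EM, not DIMM-SC's Dirichlet mixture or its inference machinery), the sketch below clusters synthetic UMI count vectors and reports a crude per-cell uncertainty proxy of the kind that model-based methods make available.

```python
# Simplified sketch (not DIMM-SC itself): EM for a multinomial mixture over UMI
# count vectors, the basic model-based clustering idea that DIMM-SC extends with a
# Dirichlet prior and principled uncertainty quantification. Data are synthetic.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)
n_cells, n_genes, K = 300, 50, 3
true_profiles = rng.dirichlet(np.ones(n_genes) * 0.5, size=K)
labels = rng.integers(K, size=n_cells)
counts = np.vstack([rng.multinomial(1000, true_profiles[z]) for z in labels])

pi = np.full(K, 1.0 / K)                          # mixing weights
profiles = rng.dirichlet(np.ones(n_genes), size=K)  # per-cluster gene expression profiles
for _ in range(100):
    # E-step: posterior cluster responsibilities, computed in log-space for stability.
    log_r = np.log(pi) + counts @ np.log(profiles).T
    log_r -= logsumexp(log_r, axis=1, keepdims=True)
    r = np.exp(log_r)
    # M-step: update mixing weights and profiles (small pseudocount avoids zeros).
    pi = r.mean(axis=0)
    profiles = (r.T @ counts) + 1e-6
    profiles /= profiles.sum(axis=1, keepdims=True)

assignments = log_r.argmax(axis=1)
print("cluster sizes:", np.bincount(assignments, minlength=K))
print("mean max responsibility (uncertainty proxy):", round(np.exp(log_r.max(axis=1)).mean(), 3))
```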
Sadr, S M K; Saroj, D P; Kouchaki, S; Ilemobade, A A; Ouki, S K
2015-06-01
A global challenge of increasing concern is diminishing fresh water resources. A growing practice in many communities to supplement diminishing fresh water availability has been the reuse of water. Novel methods of treating polluted waters, such as membrane-assisted technologies, have recently been developed and successfully implemented in many places. Given the diversity of membrane-assisted technologies available, the current challenge is how to select a reliable alternative among numerous technologies for appropriate water reuse. In this research, a fuzzy-logic-based multi-criteria group decision making tool has been developed. This tool has been employed in the selection of appropriate membrane treatment technologies for several non-potable and potable reuse scenarios. Robust criteria covering technical, environmental, economic, and socio-cultural aspects were selected, and 10 different membrane-assisted technologies were assessed in the tool. The results show that this approach is capable of facilitating systematic and rigorous analysis in the comparison and selection of membrane-assisted technologies for advanced wastewater treatment and reuse. Copyright © 2015 Elsevier Ltd. All rights reserved.
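As a much-reduced illustration of fuzzy multi-criteria aggregation (not the authors' tool, criteria, weights, or technology list, all of which are invented here), the sketch below scores a few hypothetical membrane alternatives using triangular fuzzy numbers and centroid defuzzification.

```python
# Much-reduced illustration (not the authors' tool): aggregate expert ratings of
# hypothetical membrane alternatives, expressed as triangular fuzzy numbers, into
# a crisp ranking via weighted addition and centroid defuzzification.
import numpy as np

# Linguistic terms mapped to triangular fuzzy numbers (low, mode, high).
TERMS = {"poor": (0, 1, 3), "fair": (2, 4, 6), "good": (5, 7, 9), "excellent": (7, 9, 10)}

criteria_weights = {"technical": 0.35, "environmental": 0.25, "economic": 0.25, "social": 0.15}

ratings = {
    "MBR":   {"technical": "excellent", "environmental": "good", "economic": "fair", "social": "good"},
    "NF+RO": {"technical": "good", "environmental": "fair", "economic": "good", "social": "fair"},
    "UF+UV": {"technical": "fair", "environmental": "good", "economic": "excellent", "social": "good"},
}

def crisp_score(alt_ratings):
    # Weighted sum of triangular fuzzy numbers, then centroid defuzzification.
    agg = np.zeros(3)
    for crit, w in criteria_weights.items():
        agg += w * np.array(TERMS[alt_ratings[crit]])
    return agg.mean()   # centroid of a triangular fuzzy number = (l + m + u) / 3

for alt, r in sorted(ratings.items(), key=lambda kv: -crisp_score(kv[1])):
    print(f"{alt}: {crisp_score(r):.2f}")
```

A group decision making setting like the one described above would additionally aggregate ratings from multiple experts before defuzzification.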
NASA Technical Reports Server (NTRS)
Cole, Bjorn; Chung, Seung
2012-01-01
One of the challenges of systems engineering is working multidisciplinary problems in a cohesive manner. When planning analysis of these problems, systems engineers must trade time and cost against analysis quality and quantity. The quality often correlates with greater run time in multidisciplinary models, and the quantity is associated with the number of alternatives that can be analyzed. The trade-off is due to the resource-intensive process of creating a cohesive multidisciplinary systems model and analysis. Furthermore, reuse or extension of the models used in one stage of a product life cycle for another is a major challenge. Recent developments have enabled a much less resource-intensive and more rigorous approach than hand-written translation scripts between multidisciplinary models and their analyses. The key is to work from a core systems model defined in a MOF-based language such as SysML and to leverage the emerging tool ecosystem, such as Query/View/Transformation (QVT), from the OMG community. SysML was designed to model multidisciplinary systems. The QVT standard was designed to transform SysML models into other models, including those leveraged by engineering analyses. The Europa Habitability Mission (EHM) team has begun to exploit these capabilities. In one case, a Matlab/Simulink model is generated on the fly from a system description for power analysis written in SysML. In a more general case, symbolic analysis (supported by Wolfram Mathematica) is coordinated by data objects transformed from the systems model, enabling extremely flexible and powerful design exploration and analytical investigations of expected system performance.