Risk Assessment: Evidence Base
NASA Technical Reports Server (NTRS)
Johnson-Throop, Kathy A.
2007-01-01
Human systems PRA (Probabilistic Risk Assessment): a) provides quantitative measures of probability, consequence, and uncertainty; and b) communicates risk and informs decision-making. The human health risks rated highest in the ISS PRA are based on a 1997 assessment of clinical events in analog operational settings. Much work remains to analyze the remaining human health risks identified in the Bioastronautics Roadmap.
A Comparison of Risk Sensitive Path Planning Methods for Aircraft Emergency Landing
NASA Technical Reports Server (NTRS)
Meuleau, Nicolas; Plaunt, Christian; Smith, David E.; Smith, Tristan
2009-01-01
Determining the best site to land a damaged aircraft presents some interesting challenges for standard path planning techniques. There are multiple possible locations to consider, the space is 3-dimensional with dynamics, the criterion for a good path is overall risk rather than distance or time, and optimization really matters, since an improved path corresponds to a greater expected survival rate. We have investigated a number of different path planning methods for solving this problem, including cell decomposition, visibility graphs, probabilistic roadmaps (PRMs), and local search techniques. In their pure form, none of these techniques has proven entirely satisfactory - some are too slow or unpredictable, some produce highly non-optimal paths or do not find certain types of paths, and some do not cope well with the dynamic constraints when controllability is limited. In the end, we are converging towards a hybrid technique that involves seeding a roadmap with a layered visibility graph, using PRM to extend that roadmap, and using local search to further optimize the resulting paths. We describe the techniques we have investigated, report on our experiments with these techniques, and discuss when and why various techniques were unsatisfactory.
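To make the abstract's roadmap idea concrete, here is a minimal sketch of a probabilistic roadmap searched for a minimum-risk rather than shortest path. The obstacle check, the risk function, and all numbers are invented placeholders, not the authors' aircraft model or dynamics.

```python
# Toy probabilistic roadmap (PRM) searched for a minimum-risk path.
# The obstacle, the risk model, and all numbers are invented placeholders.
import heapq
import math
import random

def collision_free(p, q):
    # Hypothetical check: sample the segment against one circular no-fly zone.
    for i in range(11):
        t = i / 10.0
        x = p[0] + t * (q[0] - p[0])
        y = p[1] + t * (q[1] - p[1])
        if math.hypot(x - 5.0, y - 5.0) < 2.0:
            return False
    return True

def edge_risk(p, q):
    # Hypothetical risk: path length, inflated near the hazard center.
    mid = ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)
    return math.dist(p, q) * (1.0 + 5.0 / (1.0 + math.dist(mid, (5.0, 5.0))))

random.seed(0)
nodes = [(0.0, 0.0), (10.0, 10.0)]  # start and candidate landing site
nodes += [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(200)]
graph = {i: [] for i in range(len(nodes))}
for i, p in enumerate(nodes):
    # Connect each node to its 8 nearest collision-free neighbors.
    near = sorted(range(len(nodes)), key=lambda j: math.dist(p, nodes[j]))[1:9]
    graph[i] = [(j, edge_risk(p, nodes[j])) for j in near
                if collision_free(p, nodes[j])]

# Dijkstra over cumulative risk rather than distance or time.
best = {0: 0.0}
queue = [(0.0, 0)]
while queue:
    d, u = heapq.heappop(queue)
    if u == 1:
        print(f"minimum-risk path cost: {d:.2f}")
        break
    if d > best.get(u, float("inf")):
        continue
    for v, w in graph[u]:
        if d + w < best.get(v, float("inf")):
            best[v] = d + w
            heapq.heappush(queue, (d + w, v))
```

Seeding the node list with a visibility graph and refining edges by local search, as the abstract describes, would slot in before and after the Dijkstra step respectively.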
Structure-guided Protein Transition Modeling with a Probabilistic Roadmap Algorithm.
Maximova, Tatiana; Plaku, Erion; Shehu, Amarda
2016-07-07
Proteins are macromolecules in perpetual motion, switching between structural states to modulate their function. A detailed characterization of the precise yet complex relationship between protein structure, dynamics, and function requires elucidating transitions between functionally-relevant states. Doing so challenges both wet and dry laboratories, as protein dynamics involves disparate temporal scales. In this paper we present a novel, sampling-based algorithm to compute transition paths. The algorithm exploits two main ideas. First, it leverages known structures to initialize its search and define a reduced conformation space for rapid sampling. This is key to addressing the insufficient sampling issue suffered by sampling-based algorithms. Second, the algorithm embeds samples in a nearest-neighbor graph where transition paths can be efficiently computed via queries. The algorithm adapts the probabilistic roadmap framework that is popular in robot motion planning. In addition to efficiently computing lowest-cost paths between any given structures, the algorithm allows investigating hypotheses regarding the order of experimentally-known structures in a transition event. This novel contribution is likely to open up new avenues of research. Detailed analysis is presented on multiple-basin proteins of relevance to human disease. Multiscaling and the AMBER ff14SB force field are used to obtain energetically-credible paths at atomistic detail.
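A hedged sketch of the roadmap-query step described above: samples embedded in a k-nearest-neighbor graph, with the lowest-cost path between two "known structures" answered by graph search. The 2-D points and two-basin energy function are illustrative stand-ins, not the paper's reduced conformation space or the AMBER ff14SB force field.

```python
# Sketch: embed samples in a k-nearest-neighbor graph and answer a
# lowest-cost transition-path query between two "known structures".
# The 2-D points and two-basin energy are stand-ins for real conformations.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import dijkstra
from scipy.spatial import cKDTree

def energy(x):
    # Placeholder two-basin surface (not a molecular force field).
    return float(np.sum((x - 0.5) ** 2) * np.sin(6.0 * x[0]) ** 2)

rng = np.random.default_rng(1)
start, goal = np.array([0.0, 0.0]), np.array([1.0, 1.0])  # known structures
samples = np.vstack([start, goal, rng.uniform(-0.2, 1.2, size=(300, 2))])

tree = cKDTree(samples)
n = len(samples)
weights = lil_matrix((n, n))
for i, x in enumerate(samples):
    _, nbrs = tree.query(x, k=8)
    for j in nbrs[1:]:
        # Edge cost: hop length penalized by the higher endpoint energy.
        weights[i, j] = np.linalg.norm(x - samples[j]) * \
            (1.0 + max(energy(x), energy(samples[j])))

dist, pred = dijkstra(weights.tocsr(), directed=False, indices=0,
                      return_predecessors=True)
path, node = [], 1            # walk predecessors back from the goal
while node >= 0:
    path.append(node)
    node = pred[node]
print(f"lowest-cost path: cost {dist[1]:.3f}, {len(path)} nodes")
```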
Peressutti, Devis; Penney, Graeme P; Housden, R James; Kolbitsch, Christoph; Gomez, Alberto; Rijkhorst, Erik-Jan; Barratt, Dean C; Rhode, Kawal S; King, Andrew P
2013-05-01
In image-guided cardiac interventions, respiratory motion causes misalignments between the pre-procedure roadmap of the heart used for guidance and the intra-procedure position of the heart, reducing the accuracy of the guidance information and leading to potentially dangerous consequences. We propose a novel technique for motion-correcting the pre-procedural information that combines a probabilistic MRI-derived affine motion model with intra-procedure real-time 3D echocardiography (echo) images in a Bayesian framework. The probabilistic model incorporates a measure of confidence in its motion estimates which enables resolution of the potentially conflicting information supplied by the model and the echo data. Unlike models proposed so far, our method allows the final motion estimate to deviate from the model-produced estimate according to the information provided by the echo images, so adapting to the complex variability of respiratory motion. The proposed method is evaluated using gold-standard MRI-derived motion fields and simulated 3D echo data for nine volunteers and real 3D live echo images for four volunteers. The Bayesian method is compared to 5 other motion estimation techniques and results show mean/max improvements in estimation accuracy of 10.6%/18.9% for simulated echo images and 20.8%/41.5% for real 3D live echo data, over the best comparative estimation method.
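The confidence-weighted resolution of model and echo information described above can be illustrated with a one-dimensional Gaussian fusion sketch; the precision-weighted combination below is a textbook simplification, not the paper's affine motion model.

```python
# One-dimensional Gaussian sketch of confidence-weighted Bayesian fusion:
# a motion-model prediction and an echo-derived measurement are combined by
# precision weighting. Numbers are illustrative, not the paper's model.
def fuse(model_mu, model_var, echo_mu, echo_var):
    w_model, w_echo = 1.0 / model_var, 1.0 / echo_var
    mu = (w_model * model_mu + w_echo * echo_mu) / (w_model + w_echo)
    return mu, 1.0 / (w_model + w_echo)

# Confident model, noisy echo: the fused estimate stays near the model.
print(fuse(model_mu=2.0, model_var=0.5, echo_mu=4.0, echo_var=4.0))
# Confident echo: the estimate is allowed to deviate from the model.
print(fuse(model_mu=2.0, model_var=4.0, echo_mu=4.0, echo_var=0.5))
```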
Pharmaceuticals Exposed to the Space Environment: Problems and Prospects
NASA Technical Reports Server (NTRS)
Jaworske, Donald A.; Myers, Jerry G.
2016-01-01
The NASA Human Research Program (HRP) Health Countermeasures Element maintains ongoing efforts to inform detailed risks, gaps, and further questions associated with the use of pharmaceuticals in space. Most recently, the Pharmacology Risk Report, released in 2010, illustrates the problems associated with maintaining pharmaceutical efficacy. Since the report, one key publication includes evaluation of pharmaceutical products stored on the International Space Station (ISS). This study shows that selected pharmaceuticals on ISS have a shorter shelf-life in space than corresponding terrestrial controls. The HRP Human Research Roadmap for planetary exploration identifies the risk of ineffective or toxic medications due to long-term storage during missions to Mars. The roadmap also identifies the need to understand and predict how pharmaceuticals will behave when exposed to radiation for long durations. Terrestrial studies of returned samples offer a start for predictive modeling. This paper shows that pharmaceuticals returned to Earth for post-flight analyses are amenable to a Weibull distribution analysis in order to support probabilistic risk assessment modeling. The paper also considers the prospect of passive payloads of key pharmaceuticals on sample return missions outside of Earth's magnetic field to gather additional statistics. Ongoing work in radiation chemistry suggests possible mitigation strategies where future work could be done at cryogenic temperatures to explore methods for preserving the strength of pharmaceuticals in the space radiation environment, perhaps one day leading to an architecture where pharmaceuticals are cached on the Martian surface and preserved cryogenically.
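A minimal sketch of the Weibull analysis mentioned above, assuming hypothetical "months until potency falls below specification" data for returned samples; the data values and the 24-month query are invented for illustration.

```python
# Hedged sketch of a Weibull analysis: fit a two-parameter Weibull to
# hypothetical potency-loss times from returned pharmaceutical samples.
import numpy as np
from scipy import stats

months_to_failure = np.array([14.0, 18.5, 21.0, 24.5, 27.0, 30.5, 33.0, 39.0])
shape, _, scale = stats.weibull_min.fit(months_to_failure, floc=0)
print(f"Weibull shape k = {shape:.2f}, scale = {scale:.1f} months")

# Survival probability at 24 months under the fitted model, a quantity a
# probabilistic risk assessment could consume directly.
print(f"P(potent at 24 mo) = {stats.weibull_min.sf(24.0, shape, 0, scale):.3f}")
```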
Consensus report on the future of animal-free systemic toxicity testing.
Leist, Marcel; Hasiwa, Nina; Rovida, Costanza; Daneshian, Mardas; Basketter, David; Kimber, Ian; Clewell, Harvey; Gocht, Tilman; Goldberg, Alan; Busquet, Francois; Rossi, Anna-Maria; Schwarz, Michael; Stephens, Martin; Taalman, Rob; Knudsen, Thomas B; McKim, James; Harris, Georgina; Pamies, David; Hartung, Thomas
2014-01-01
Since March 2013, animal use for cosmetics testing for the European market has been banned. This requires a renewed view on risk assessment in this field. However, in other fields as well, traditional animal experimentation does not always satisfy requirements in safety testing, as the need for human-relevant information is ever increasing. A general strategy for animal-free test approaches was outlined by the US National Research Council's vision document for Toxicity Testing in the 21st Century in 2007. It is now possible to provide a more defined roadmap on how to implement this vision for the four principal areas of systemic toxicity evaluation: repeat dose organ toxicity, carcinogenicity, reproductive toxicity and allergy induction (skin sensitization), as well as for the evaluation of toxicant metabolism (toxicokinetics). CAAT-Europe assembled experts from Europe, America and Asia to design a scientific roadmap for future risk assessment approaches, and the outcome was then further discussed and refined in two consensus meetings with over 200 stakeholders. The key recommendations include: focusing on improving existing methods rather than favoring de novo design; combining hazard testing with toxicokinetics predictions; developing integrated test strategies; incorporating new high-content endpoints into classical assays; evolving test validation procedures; promoting collaboration and data-sharing among different industrial sectors; integrating new disciplines, such as systems biology and high-throughput screening; and involving regulators early on in the test development process. A focus on data quality, combined with increased attention to the scientific background of a test method, will be important drivers. Information from each test system should be mapped along adverse outcome pathways. Finally, quantitative information on all factors and key events will be fed into systems biology models that allow a probabilistic risk assessment with flexible adaptation to exposure scenarios and individual risk factors.
Lee, K-E; Lee, E-J; Park, H-S
2016-08-30
Recent advances in computational epigenetics have provided new opportunities to evaluate n-gram probabilistic language models. In this paper, we describe a systematic genome-wide approach for predicting functional roles in inactive chromatin regions by using a sequence-based Markovian chromatin map of the human genome. We demonstrate that Markov chains of sequences can be used as a precursor to predict functional roles in heterochromatin regions and provide an example comparing two publicly available chromatin annotations of large-scale epigenomics projects: ENCODE project consortium and Roadmap Epigenomics consortium.
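A toy illustration of the sequence-based Markov chain idea described above: a k-th order (n-gram) model trained on a short DNA string and used to score another. The sequences and the order k are placeholders, not the genome-scale chromatin map of the paper.

```python
# Toy k-th order Markov (n-gram) model over a DNA string, in the spirit of a
# sequence-based Markovian map. Sequences and the order k are placeholders.
import math
from collections import defaultdict

def train_markov(seq, k=2):
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(seq) - k):
        counts[seq[i:i + k]][seq[i + k]] += 1
    return {ctx: {base: n / sum(nxt.values()) for base, n in nxt.items()}
            for ctx, nxt in counts.items()}

def log_likelihood(model, seq, k=2):
    ll = 0.0
    for i in range(len(seq) - k):
        # Tiny floor stands in for proper smoothing of unseen n-grams.
        ll += math.log(model.get(seq[i:i + k], {}).get(seq[i + k], 1e-9))
    return ll

model = train_markov("ACGTACGGTACGTTACGATCGATCGTAGCTAGCTGATCG", k=2)
print(f"log-likelihood: {log_likelihood(model, 'ACGTACGT', k=2):.2f}")
```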
Multiple external hazards compound level 3 PSA methods research of nuclear power plant
NASA Astrophysics Data System (ADS)
Wang, Handing; Liang, Xiaoyu; Zhang, Xiaoming; Yang, Jianfeng; Liu, Weidong; Lei, Dina
2017-01-01
The 2011 Fukushima nuclear power plant severe accident was caused by both an earthquake and a tsunami, and it resulted in a large release of radioactive nuclides that contaminated the surrounding environment. Although the probability of such an accident is extremely small, once it happens it is likely to release a large amount of radioactive material into the environment and cause radiation contamination. Therefore, studying accident consequences is important and essential to improving nuclear power plant design and management. Level 3 PSA methods for nuclear power plants can be used to analyze radiological consequences and quantify the risk to public health around nuclear power plants. Based on studies of multiple external hazards compound Level 3 PSA methods for nuclear power plants, and on a description of the multiple external hazards compound Level 3 PSA technology roadmap and its important technical elements, we took a coastal nuclear power plant as the reference site and analyzed the off-site consequences of severe accidents caused by multiple external hazards. Finally, we discuss off-site consequence probabilistic risk studies and their applications under compound multiple external hazard conditions, and explain the feasibility and reasonableness of implementing emergency plans.
Hasse, J U; Weingaertner, D E
2016-01-01
As the central product of the BMBF-KLIMZUG-funded Joint Network and Research Project (JNRP) 'dynaklim - Dynamic adaptation of regional planning and development processes to the effects of climate change in the Emscher-Lippe region (North Rhine-Westphalia, Germany)', the Roadmap 2020 'Regional Climate Adaptation' has been developed by the various regional stakeholders and institutions, and it contains specific regional scenarios, strategies and adaptation measures applicable throughout the region. This paper presents the method, elements and main results of this regional roadmap process using the example of the thematic sub-roadmap 'Water Sensitive Urban Design 2020'. With a focus on the process support tool 'KlimaFLEX', one of the main adaptation measures of the WSUD 2020 roadmap, typical challenges for integrated climate change adaptation such as scattered knowledge, knowledge gaps and divided responsibilities, but also potential solutions and promising opportunities for urban development and urban water management, are discussed. With the roadmap and the related tool, the relevant stakeholders of the Emscher-Lippe region have jointly developed important prerequisites to integrate their knowledge, to clarify vulnerabilities, adaptation goals, responsibilities and interests, and to coordinate with foresight the measures, resources, priorities and schedules needed for efficient joint urban planning, well-grounded decision-making in times of continued uncertainty, and step-by-step implementation of adaptation measures from now on.
Roadmap to Long-Term Monitoring Optimization
This roadmap focuses on optimization of established long-term monitoring programs for groundwater. Tools and techniques discussed concentrate on methods for optimizing the monitoring frequency and spatial (three-dimensional) distribution of wells ...
The Soils and Groundwater – EM-20 S&T Roadmap Quality Assurance Project Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fix, N. J.
The Soils and Groundwater – EM-20 Science and Technology Roadmap Project is a U.S. Department of Energy, Office of Environmental Management-funded initiative designed to develop new methods, strategies and technology for characterizing, modeling, remediating, and monitoring soils and groundwater contaminated with metals, radionuclides, and chlorinated organics. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by EM-20 Roadmap Project staff.
A roadmap for acute care training of frontline Healthcare workers in LMICs.
Shah, Nirupa; Bhagwanjee, Satish; Diaz, Janet; Gopalan, P D; Appiah, John Adabie
2017-10-01
This 10-step roadmap outlines explicit procedures for developing, implementing and evaluating short focused training programs for acute care in low and middle income countries (LMICs). A roadmap is necessary to develop resilient training programs that achieve equivalent outcomes despite regional variability in human capacity and infrastructure. Programs based on the roadmap should address shortfalls in human capacity and access to care in the short term and establish the ground work for health systems strengthening in the long term. The primary targets for acute care training are frontline healthcare workers at the clinic level. The programs will differ from others currently available with respect to the timelines, triage method, therapeutic interventions and potential for secondary prevention. The roadmap encompasses multiple iterative cycles of the Plan-Do-Study-Act framework. Core features are integration of frontline trainees with the referral system while promoting research, quality improvement and evaluation from the bottom-up. Training programs must be evidence based, developed along action timelines and use adaptive training methods. A systems approach is essential because training programs that take cognizance of all factors that influence health care delivery have the potential to produce health systems strengthening (HSS).
Technology Roadmaps for Compound Semiconductors
Bennett, Herbert S.
2000-01-01
The roles cited for compound semiconductors in public versions of existing technology roadmaps from the National Electronics Manufacturing Initiative, Inc., Optoelectronics Industry Development Association, Microelectronics Advanced Research Initiative on Optoelectronic Interconnects, and Optoelectronics Industry and Technology Development Association (OITDA) are discussed and compared within the context of trends in the Si CMOS industry. In particular, the extent to which these technology roadmaps treat compound semiconductors at the materials processing and device levels will be presented for specific applications. For example, OITDA’s Optical Communications Technology Roadmap directly connects the information demand of delivering 100 Mbit/s to the home to the requirement of producing 200 GHz heterojunction bipolar transistors with 30 nm bases and InP high electron mobility transistors with 100 nm gates. Some general actions for progress towards the proposed International Technology Roadmap for Compound Semiconductors (ITRCS) and methods for determining the value of an ITRCS will be suggested. But, in the final analysis, the value added by an ITRCS will depend on how industry leaders respond. The technical challenges and economic opportunities of delivering high quality digital video to consumers provide concrete examples of where the above actions and methods could be applied. PMID:27551615
System Synthesis in Preliminary Aircraft Design using Statistical Methods
NASA Technical Reports Server (NTRS)
DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.
1996-01-01
This paper documents an approach to conceptual and preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically design of experiments (DOE) and response surface methodology (RSM). These methods are employed in order to more efficiently search the design space for optimum configurations. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, which are more sophisticated than those generally available in early design stages. Next, a regression equation for an overall evaluation criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it yields a point design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions which are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a high speed civil transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses to the conceptual aircraft synthesis and provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).
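A small sketch of the DOE/RSM step described above: a full-factorial design drives a stand-in "expensive analysis", and a quadratic response surface is fit by least squares. The two design variables and the response function are notional, not the paper's aero-propulsion model.

```python
# Minimal DOE/RSM sketch: run a full-factorial design through a stand-in
# "expensive analysis" and fit a quadratic response surface by least squares.
import numpy as np

levels = [-1.0, 0.0, 1.0]  # normalized design-variable levels
X = np.array([(a, b) for a in levels for b in levels])

def expensive_analysis(x):
    # Stand-in for a higher-fidelity aero-propulsion analysis (notional).
    return 3.0 + 1.5 * x[0] - 2.0 * x[1] + 0.8 * x[0] * x[1] + 1.2 * x[1] ** 2

y = np.array([expensive_analysis(x) for x in X])

# Basis: 1, x1, x2, x1*x2, x1^2, x2^2 (a standard quadratic response surface).
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("response surface coefficients:", np.round(coef, 3))
```

Once fitted, the polynomial can be optimized or sampled thousands of times at negligible cost, which is what enables the move from point designs to probabilistic ones.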
Accurate and reproducible functional maps in 127 human cell types via 2D genome segmentation
Hardison, Ross C.
2017-01-01
The Roadmap Epigenomics Consortium has published whole-genome functional annotation maps in 127 human cell types by integrating data from studies of multiple epigenetic marks. These maps have been widely used for studying gene regulation in cell type-specific contexts and predicting the functional impact of DNA mutations on disease. Here, we present a new map of functional elements produced by applying a method called IDEAS on the same data. The method has several unique advantages and outperforms existing methods, including that used by the Roadmap Epigenomics Consortium. Using five categories of independent experimental datasets, we compared the IDEAS and Roadmap Epigenomics maps. While the overall concordance between the two maps is high, the maps differ substantially in the prediction details and in their consistency of annotation of a given genomic position across cell types. The annotation from IDEAS is uniformly more accurate than the Roadmap Epigenomics annotation and the improvement is substantial based on several criteria. We further introduce a pipeline that improves the reproducibility of functional annotation maps. Thus, we provide a high-quality map of candidate functional regions across 127 human cell types and compare the quality of different annotation methods in order to facilitate biomedical research in epigenomics. PMID:28973456
Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering
NASA Astrophysics Data System (ADS)
Hynes-Griffin, M. E.; Buege, L. L.
1983-09-01
Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor, providing an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national/international standing, the mission of the G-11's Probabilistic Methods Committee is to enable/facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable and reliable product development.
Comparative Effectiveness Research: A Roadmap for Physical Activity and Lifestyle
Jakicic, John M.; Sox, Harold; Blair, Steven N.; Bensink, Mark; Johnson, William G.; King, Abby C.; Lee, I-Min; Nahum-Shani, Inbal; Sallis, James F.; Sallis, Robert E.; Craft, Lynette; Whitehead, James R.; Ainsworth, Barbara E.
2017-01-01
Purpose: Comparative Effectiveness Research (CER) is designed to support informed decision making at the individual, population, and policy levels. The American College of Sports Medicine and partners convened a conference with the focus of building an agenda for CER within the context of physical activity and non-pharmacological lifestyle approaches in the prevention and treatment of chronic disease. This report summarizes the conference content and consensus recommendations that culminated in a CER Roadmap for Physical Activity and Lifestyle approaches to reducing the risk of chronic disease. Methods: This conference focused on presentations and discussion around the following topic areas: 1) defining CER, 2) identifying the current funding climate to support CER, 3) summarizing methods for conducting CER, and 4) identifying CER opportunities for physical activity. Results: This conference resulted in consensus recommendations to adopt a CER Roadmap for Physical Activity and Lifestyle approaches to reducing the risk of chronic disease. In general, this roadmap provides a systematic framework by which CER for physical activity can move from a planning phase, to a phase of engagement in CER related to lifestyle factors with particular emphasis on physical activity, to a societal change phase that results in changes in policy, practice, and health. Conclusions: It is recommended that physical activity researchers and healthcare providers use the roadmap developed from this conference as a method to systematically engage in and apply CER to the promotion of physical activity as a key lifestyle behavior that can be effective at impacting a variety of health-related outcomes. PMID:25426735
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
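For orientation, a direct Monte Carlo estimate of a failure probability with a stress-strength limit state g = R - S; the distributions here are illustrative, not taken from the report, and the FPI and response surface methods it surveys exist precisely to avoid sample counts this large.

```python
# Baseline Monte Carlo estimate of a failure probability with a
# stress-strength limit state g = R - S; distributions are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
strength = rng.normal(500.0, 40.0, n)   # R: strength (MPa)
stress = rng.normal(380.0, 50.0, n)     # S: applied stress (MPa)
pf = np.mean(strength - stress <= 0.0)  # failure when g = R - S <= 0
print(f"P_f ~ {pf:.2e} (std. error ~ {np.sqrt(pf * (1.0 - pf) / n):.1e})")
```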
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components
NASA Technical Reports Server (NTRS)
1991-01-01
The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material and geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms, and Monte Carlo simulation is added as an alternative. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.
Capabilities Roadmap Briefings to the National Research Council
NASA Technical Reports Server (NTRS)
2005-01-01
High energy power and propulsion capability roadmap - general background and introduction. Advanced telescopes and observatories and scientific instruments and sensors capability roadmaps - general background and introduction. Space communications capability roadmap interim review. Robotic access to planetary surface capability roadmap. Human health and support systems capability roadmap progress review.
Recent developments of the NESSUS probabilistic structural analysis computer program
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.
1992-01-01
The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
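A hedged sketch of the importance-sampling idea named above: sampling is shifted toward the failure region in standard normal space and reweighted by the likelihood ratio. The linear limit state and shift point are textbook examples, not NESSUS internals.

```python
# Textbook importance-sampling sketch: shift sampling toward the failure
# region in standard normal space and reweight by the likelihood ratio.
import numpy as np

rng = np.random.default_rng(7)

def g(u):
    # Failure when g <= 0; reliability index beta = 3 for this linear state.
    return 3.0 - (u[:, 0] + u[:, 1]) / np.sqrt(2.0)

n = 100_000
shift = np.array([3.0, 3.0]) / np.sqrt(2.0)   # the most probable failure point
u = rng.standard_normal((n, 2)) + shift
ratio = np.exp(-u @ shift + 0.5 * shift @ shift)  # N(0,I) over N(shift,I)
pf = np.mean((g(u) <= 0.0) * ratio)
print(f"importance-sampling P_f ~ {pf:.2e} (exact: 1.35e-03)")
```

With the samples concentrated where failure happens, a hundred thousand draws resolve a probability that plain Monte Carlo would estimate poorly at the same cost.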
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sailer, Anna M., E-mail: anni.sailer@mumc.nl; Haan, Michiel W. de, E-mail: m.de.haan@mumc.nl; Graaf, Rick de, E-mail: r.de.graaf@mumc.nl
Purpose: This study was designed to evaluate the feasibility of endovascular guidance by means of live fluoroscopy fusion with magnetic resonance angiography (MRA) and computed tomography angiography (CTA). Methods: Fusion guidance was evaluated in 20 endovascular peripheral artery interventions in 17 patients. Fifteen patients had received preinterventional diagnostic MRA and two patients had undergone CTA. Time for fluoroscopy with MRA/CTA coregistration was recorded. Feasibility of fusion guidance was evaluated according to the following criteria: for every procedure the executing interventional radiologists recorded whether 3D road-mapping provided added value (yes vs. no) and whether PTA and/or stenting could be performed relying on the fusion road-map without need for diagnostic contrast-enhanced angiogram series (CEAS) (yes vs. no). Precision of the fusion road-map was evaluated by recording maximum differences between the position of the vasculature on the virtual CTA/MRA images and conventional angiography. Results: Average time needed for image coregistration was 5 ± 2 min. Three-dimensional road-map added value was experienced in 15 procedures in 12 patients. In half of the patients (8/17), intervention was performed relying on the fusion road-map only, without diagnostic CEAS. In two patients, the MRA road-map showed a false-positive lesion. Excluding three patients with inordinate movements, the mean difference in position of the vasculature on angiography and the MRA/CTA road-map was 1.86 ± 0.95 mm, implying that approximately 95% of differences were between 0 and 3.72 mm (mean ± 1.96 standard deviations). Conclusions: Fluoroscopy with MRA/CTA fusion guidance for peripheral artery interventions is feasible. By reducing the number of CEAS, this technology may contribute to enhanced procedural safety.
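The reported precision statistic can be reproduced arithmetically; a short check of the mean ± 1.96 SD interval quoted above:

```python
# Arithmetic check of the reported precision: with mean 1.86 mm and SD 0.95 mm,
# roughly 95% of registration differences lie within mean +/- 1.96 SD.
mean_mm, sd_mm = 1.86, 0.95
low, high = max(0.0, mean_mm - 1.96 * sd_mm), mean_mm + 1.96 * sd_mm
print(f"~95% of differences in [{low:.2f}, {high:.2f}] mm")  # ~[0, 3.72]
```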
NASA Technical Reports Server (NTRS)
Inman, Thomas
2005-01-01
General Background and Introduction of Capability Roadmaps: Agency Objective. Strategic Planning Transformation. Advanced Planning Organizational Roles. Public Involvement in Strategic Planning. Strategic Roadmaps and Schedule. Capability Roadmaps and Schedule. Technology and Capability Readiness Levels. Relationships Between Roadmaps. Purpose of NRC Review. Capability Roadmap Development (Team Progress to Date).
NASA Technical Reports Server (NTRS)
Cruse, T. A.
1987-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.
1988-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
Probabilistic structural analysis methods for space propulsion system components
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.
2007 Precision Strike PEO Summer Forum - Joint Perspectives on Precision Engagement
2007-07-11
“Status,” Colonel Richard Justice, USAF, Commander of the Miniature Munitions Systems Group (MMSG), Eglin Air Force Base; “Unmanned Systems (UAS) Roadmap ... Role in the Roadmap Implementation,” Methods & Processes Working Group. Issues delineated in the Implementation Plan form the basis for the JTEM methodology. ... Test and Evaluation; JMETC – Joint Mission Environment Test Capability; WG – Working Group. Background: JTEM Problem.
NASA Technical Reports Server (NTRS)
Mueller, Rob
2005-01-01
General Background and Introduction of Capability Roadmaps Agency Objective. Strategic Planning Transformation. Advanced Planning Organizational Roles. Public Involvement in Strategic Planning. Strategic Roadmaps and Schedule. Capability Roadmaps and Schedule. Purpose of NRC Review. Capability Roadmap Development (Progress to Date)
Probabilistic Structural Analysis Theory Development
NASA Technical Reports Server (NTRS)
Burnside, O. H.
1985-01-01
The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.
Probabilistic Structural Analysis Program
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
Weiss, Brian A.; Vogl, Gregory; Helu, Moneer; Qiao, Guixiu; Pellegrino, Joan; Justiniano, Mauricio; Raghunathan, Anand
2017-01-01
The National Institute of Standards and Technology (NIST) hosted the Roadmapping Workshop – Measurement Science for Prognostics and Health Management for Smart Manufacturing Systems (PHM4SMS) in Fall 2014 to discuss the needs and priorities of stakeholders in the PHM4SMS technology area. The workshop brought together over 70 members of the PHM community. The attendees included representatives from small, medium, and large manufacturers; technology developers and integrators; academic researchers; government organizations; trade associations; and standards bodies. The attendees discussed the current and anticipated measurement science challenges to advance PHM methods and techniques for smart manufacturing systems; the associated research and development needed to implement condition monitoring, diagnostic, and prognostic technologies within manufacturing environments; and the priorities to meet the needs of PHM in manufacturing. This paper will summarize the key findings of this workshop, and present some of the critical measurement science challenges and corresponding roadmaps, i.e., suggested courses of action, to advance PHM for manufacturing. Milestones and targeted capabilities will be presented for each roadmap across three areas: PHM Manufacturing Process Techniques; PHM Performance Assessment; and PHM Infrastructure – Hardware, Software, and Integration. An analysis of these roadmaps and crosscutting themes seen across the breakout sessions is also discussed. PMID:28664163
Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components
NASA Technical Reports Server (NTRS)
1999-01-01
Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occur in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.
Power Systems for Future Missions: Appendices A-L
NASA Technical Reports Server (NTRS)
Gill, S. P.; Frye, P. E.; Littman, Franklin D.; Meisl, C. J.
1994-01-01
Selection of power system technology for space applications is typically based on mass, readiness of a particular technology to meet specific mission requirements, and life cycle costs (LCC). The LCC is typically used as a discriminator between competing technologies for a single mission application. All other future applications for a given technology are usually ignored. As a result, development cost of a technology becomes a dominant factor in the LCC comparison. Therefore, it is common for technologies such as DIPS and LMR-CBC to be potentially applicable to a wide range of missions and still lose out in the initial LCC comparison due to high development costs. This collection of appendices (A through L) contains the following power systems technology plans: CBC DIPS Technology Roadmap; PEM PFC Technology Roadmap; NAS Battery Technology Roadmap; PV/RFC Power System Technology Roadmap; PV/NAS Battery Technology Roadmap; Thermionic Reactor Power System Technology Roadmap; SP-100 Power System Technology Roadmap; Dynamic SP-100 Power System Technology Roadmap; Near-Term Solar Dynamic Power System Technology Roadmap; Advanced Solar Dynamic Power System Technology Roadmap; Advanced Stirling Cycle Dynamic Isotope Power System Technology Roadmap; and the ESPPRS (Evolutionary Space Power and Propulsion Requirements System) User's Guide.
NASA Technical Reports Server (NTRS)
Poniatowski, Karen
2005-01-01
Contents include the following: Overview/Introduction. Roadmap Approach/Considerations. Roadmap Timeline/Spirals. Requirements Development. Spaceport/Range Capabilities. Mixed Range Architecture. User Requirements/Customer Considerations. Manifest Considerations. Emerging Launch User Requirements. Capability Breakdown Structure/Assessment. Roadmap Team Observations. Transformational Range Test Concept. Roadmap Team Conclusions. Next Steps.
NASA Technical Reports Server (NTRS)
Crooke, Julie A.
2005-01-01
Contents include the following: General Background and Introduction of Capability Roadmaps "Title." Agency Objective. Strategic Planning Transformation. Advanced Planning Organizational Roles. Public Involvement in Strategic Planning. Strategic Roadmaps and Schedule. Capability Roadmaps and Schedule. Purpose of NRC Review. Capability Roadmap Development (Progress to Date).
NASA Technical Reports Server (NTRS)
Aikins, Jan
2005-01-01
Contents include the following: General Background and Introduction of Capability Roadmaps. Agency Objective. Strategic Planning Transformation. Advanced Planning Organizational Roles. Public Involvement in Strategic Planning. Strategic Roadmaps and Schedule. Capability Roadmaps and Schedule. Purpose of NRC Review. Capability Roadmap Development (Progress to Date).
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting during October 6-8 at the Best Western Sterling Inn, Sterling Heights (Detroit), Michigan, is co-sponsored by the US Army Tank-automotive & Armaments Command (TACOM). The meeting will provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national/international standing, the mission of the G-11's Probabilistic Methods Committee is to "enable/facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries by better, faster, greener, smarter, affordable and reliable product development."
NASA's New Thermal Management Systems Roadmap: What's in It, What It Means
NASA Technical Reports Server (NTRS)
Swanson, Ted
2016-01-01
In July of 2015 NASA publicly released a new set of Technology Area Roadmaps that will be used to help guide future NASA-funded technology development efforts. One of these was the Thermal Management Systems Roadmap, often identified as TA14. This Roadmap identifies the time sequencing and interdependencies of high-priority, advanced thermal control technology for the next 5 to 20 years. Available funding limits the development of new technology. The Roadmaps are the first step in the process of prioritizing HQ-supported technology funding. The 2015 Roadmaps are focused on planned mission architectures and needs, as identified in the NRC-led science Decadals and HEOMD's Design Reference Missions. Additionally, the 2015 Roadmaps focus on "applied" R&D as opposed to more basic research. The NASA Mission Directorates were all closely involved in the development of the 2015 Roadmaps, and an extensive external review was also conducted. This talk will discuss the Technology Roadmaps in general, and then focus on the specific technologies identified for TA14, Thermal Management Systems.
NASA Astrophysics Data System (ADS)
Yilmaz, Zeynep
Typically, the vertical component of the ground motion is not considered explicitly in seismic design of bridges, but in some cases the vertical component can have a significant effect on the structural response. The key question of when the vertical component should be incorporated in design is answered by a probabilistic seismic hazard assessment study that incorporates probabilistic seismic demand models and ground motion models. Nonlinear simulation models with varying configurations of an existing bridge in California were considered in the analytical study. The simulation models were subjected to the set of selected ground motions in two stages: at first, only horizontal components of the motion were applied, while in the second stage the structures were subjected to both horizontal and vertical components applied simultaneously, and the ground motions that produced the largest adverse effects on the bridge system were identified. Moment demand at the mid-span and at the support of the longitudinal girder and the axial force demand in the column are found to be significantly affected by the vertical excitations. These response parameters can be modeled using simple ground motion parameters such as horizontal spectral acceleration and vertical spectral acceleration within a 5% to 30% error margin, depending on the type of the parameter and the period of the structure. For a complete hazard assessment, both of these ground motion parameters explaining the structural behavior should also be modeled. For the horizontal spectral acceleration, the Abrahamson and Silva (2008) model was used from among the many available standard models. A new NGA vertical ground motion model consistent with the horizontal model was constructed. These models are combined in a vector probabilistic seismic hazard analysis. A series of hazard curves was developed and is presented for different locations in the Bay Area under soil site conditions to provide a roadmap for the prediction of these features in future earthquakes. Findings from this study will contribute to the development of revised guidelines to address vertical ground motion effects, particularly in near-fault regions, in the seismic design of highway bridges.
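A compact sketch of the hazard-curve computation underlying such a study: annual exceedance rates for a set of spectral-acceleration levels, combining scenario event rates with a lognormal ground-motion model. The scenario rates, medians, and sigmas below are invented, not the Abrahamson and Silva (2008) model or the new vertical model.

```python
# Hazard-curve sketch: annual rate of exceeding each spectral-acceleration
# level, summing hypothetical scenario rates against a lognormal
# ground-motion model. All numbers are notional.
import numpy as np
from scipy.stats import norm

# Hypothetical scenarios: (annual rate, median Sa in g, ln-space sigma).
scenarios = [(0.10, 0.05, 0.6), (0.02, 0.15, 0.6), (0.002, 0.50, 0.7)]

sa_levels = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
rate = np.zeros_like(sa_levels)
for nu, median, sigma in scenarios:
    # P(Sa > level | event) under the lognormal ground-motion model.
    rate += nu * norm.sf(np.log(sa_levels / median) / sigma)

for sa, r in zip(sa_levels, rate):
    print(f"annual rate of Sa > {sa:.2f} g: {r:.2e}")
```

A vector hazard analysis extends this to the joint exceedance of horizontal and vertical spectral accelerations, which requires a model of their correlation.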
High Energy Power and Propulsion Capability Roadmap: General Background and Introduction
NASA Technical Reports Server (NTRS)
Bankston, Perry
2005-01-01
Agency objectives are: Strategic Planning Transformation. Advanced Planning Organizational Roles. Public Involvement in Strategic Planning. Strategic Roadmaps and Schedule. Capability Roadmaps and Schedule. Purpose of NRC Review. Capability Roadmap Development (Progress to Date).
NASA Technical Reports Server (NTRS)
Skelly, Darin M.
2005-01-01
Viewgraphs on the National Research Council's dialog to assess progress on NASA's transformational spaceport and range technologies capability roadmap development are presented. The topics include: 1) Agency Goals and Objectives; 2) Strategic Planning Transformation; 3) Advanced Planning Organizational Roles; 4) Public Involvement in Strategic Planning; 5) Strategic Roadmaps; 6) Strategic Roadmaps Schedule; 7) Capability Roadmaps; 8) Capability Charter; 9) Process for Team Selection; 10) Capability Roadmap Development Schedule Overview; 11) Purpose of NRC Review; 12) Technology Readiness Levels; 13) Capability Readiness Levels; 14) Crosswalk Matrix Trans Spaceport & Range; 15) Example linkage to other roadmaps; 16) Capability Readiness Levels Defined; and 17) Crosswalk Matrix Ratings Work In-progress.
Probabilistic finite elements for fracture mechanics
NASA Technical Reports Server (NTRS)
Besterfield, Glen
1988-01-01
The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic measures, such as the expectation, covariance, and correlation of the stress intensity factors, are calculated for random load, random material properties, and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.
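A minimal Monte Carlo counterpart to the quantities named above: random load, crack length, and toughness, with fracture when the stress intensity factor exceeds toughness. The edge-crack factor 1.12 and all distributions are illustrative choices, not the paper's embedded-singularity element.

```python
# Monte Carlo probabilistic fracture sketch: failure when the stress
# intensity factor K exceeds the toughness K_IC. Values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 500_000
sigma = rng.normal(150.0, 25.0, n)            # random load (MPa)
a = rng.lognormal(np.log(0.02), 0.4, n)       # random crack length (m)
k_ic = rng.normal(60.0, 6.0, n)               # random toughness (MPa*sqrt(m))

k = 1.12 * sigma * np.sqrt(np.pi * a)         # edge-crack stress intensity
print(f"probability of fracture ~ {np.mean(k >= k_ic):.2e}")
```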
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components
NASA Technical Reports Server (NTRS)
1991-01-01
Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.
NASA Technical Reports Server (NTRS)
Coulter, Dan; Bankston, Perry
2005-01-01
Agency objectives are: Strategic Planning Transformation. Advanced Planning Organizational Roles. Public Involvement in Strategic Planning. Strategic Roadmaps and Schedule. Capability Roadmaps and Schedule. Purpose of NRC Review. Capability Roadmap Development (Progress to Date).
A Suggested Approach for Producing VAMS Air Transportation System Technology Roadmaps
NASA Technical Reports Server (NTRS)
Weathers, Del
2002-01-01
This viewgraph presentation provides an overview on the use of technology 'roadmaps' in order to facilitate the research and development of VAMS (Virtual Airspace Modeling and Simulation). These roadmaps are to be produced by each concept team, updated annually, discussed at the technical interchange meetings (TIMs), shared among all VAMS participants, and made available electronically. These concept-specific technology roadmaps will subsequently be blended into an integrated catalog of roadmaps, technical discussions, and research recommendations. A historical example of ATM (Air Traffic Management) research and technology from 1940 to 1999, as shown in a series of 'roadmaps', is also included.
Research and Development Roadmaps for Liquid Metal Cooled Fast Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, T. K.; Grandy, C.; Natesan, K.
The United States Department of Energy (DOE) commissioned the development of technology roadmaps for advanced (non-light water reactor) reactor concepts to help focus research and development funding over the next five years. The roadmaps show the research and development needed to support demonstration of an advanced (non-LWR) concept by the early 2030s, consistent with DOE's Vision and Strategy for the Development and Deployment of Advanced Reactors. The intent is only to convey the technical steps that would be required to achieve such a goal; the means by which DOE will determine whether to invest in specific tasks will be treated separately. The starting point for the roadmaps is the Technical Readiness Assessment performed as part of an Advanced Test and Demonstration Reactor study released in 2016. The roadmaps were developed based upon a review of technical reports and vendor literature summarizing the technical maturity of each concept and the outstanding research and development needs. Critical path tasks for specific systems were highlighted on the basis of the time and resources needed to complete the tasks and the importance of the system to the performance of the reactor concept. The roadmaps are generic, i.e., not specific to a particular vendor's design, but vendor design information may have been used as representative of the concept family. In the event that both near-term and more advanced versions of a concept are being developed, either a single roadmap with multiple branches or separate roadmaps for each version were developed. In each case, roadmaps point to a demonstration reactor (engineering or commercial) and show the activities that must be completed in parallel to support that demonstration in the 2030-2035 window. This report provides the roadmaps for two fast reactor concepts, the Sodium-cooled Fast Reactor (SFR) and the Lead-cooled Fast Reactor (LFR). The SFR technology is mature enough for commercial demonstration by the early 2030s, and the remaining critical paths and R&D needs are generally related to the completion of qualification of fuel and structural materials, validation of reactor design codes and methods, and support of the licensing frameworks. The LFR technology is less mature than the SFR's and will be at the engineering demonstration stage by the early 2030s. Key LFR technology development activities will focus on resolving remaining design challenges and demonstrating the viability of systems and components in the integral system, which will be done in parallel with addressing the gaps shared with SFR technology. The approach and timeline presented here assume that, for the first module demonstration, vendors would pursue a two-step licensing process based on 10CFR Part 50.
NASA Astrophysics Data System (ADS)
Fei, Cheng-Wei; Bai, Guang-Chen
2014-12-01
To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of a gas turbine, a distribution collaborative probabilistic design method based on support vector regression (SR), called DCSRM, is proposed by integrating the distribution collaborative response surface method with a support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is carried out to verify the proposed DCSRM. The analysis results reveal that the optimal static blade-tip clearance of the HPT is obtained for designing the BTRRC and for improving the performance and reliability of the aeroengine. The comparison of methods shows that DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.
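A hedged sketch of the surrogate-plus-sampling idea behind methods like DCSRM: train a support vector regression model on a few runs of a stand-in clearance analysis, then run Monte Carlo on the cheap surrogate. The one-variable response function and the 0.85 limit are invented for illustration.

```python
# Surrogate-based probabilistic sketch: fit SVR to a stand-in clearance
# analysis, then estimate a limit-exceedance probability by Monte Carlo.
import numpy as np
from sklearn.svm import SVR

def expensive_clearance_analysis(t):
    # Stand-in for a thermal/structural running-clearance computation.
    return 1.2 - 0.3 * t + 0.05 * np.sin(4.0 * t)

t_train = np.linspace(0.0, 2.0, 25).reshape(-1, 1)
y_train = expensive_clearance_analysis(t_train).ravel()
surrogate = SVR(kernel="rbf", C=100.0, epsilon=0.001).fit(t_train, y_train)

# Monte Carlo on the surrogate: chance the clearance drops below a limit.
rng = np.random.default_rng(3)
t_samples = rng.normal(1.0, 0.3, size=(20_000, 1))
clearance = surrogate.predict(t_samples)
print(f"P(clearance < 0.85) ~ {np.mean(clearance < 0.85):.3f}")
```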
NASA Astrophysics Data System (ADS)
Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen
2018-05-01
To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on fuzzy neural network regression (DCFRM) is proposed by integrating the distributed collaborative response surface method with a fuzzy neural network regression model. The mathematical model of DCFRM is established and its probabilistic design concept is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure, and strain failure) was investigated with the proposed method, accounting for fluid-structure interaction. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and of the overall failure mode of the turbine blisk are obtained, providing a useful reference for improving the performance and reliability of aeroengines. A comparison of methods shows that DCFRM reshapes probabilistic analysis for multi-failure structures and improves computing efficiency while maintaining acceptable computational precision. The proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby enriches the theory and methods of mechanical reliability design.
Patterning roadmap: 2017 prospects
NASA Astrophysics Data System (ADS)
Neisser, Mark
2017-06-01
Road mapping of semiconductor chips has been underway for over 20 years, first with the International Technology Roadmap for Semiconductors (ITRS) roadmap and now with the International Roadmap for Devices and Systems (IRDS) roadmap. The original roadmap was mostly driven bottom up and was developed to ensure that the large numbers of semiconductor producers and suppliers had good information to base their research and development on. The current roadmap is generated more top-down, where the customers of semiconductor chips anticipate what will be needed in the future and the roadmap projects what will be needed to fulfill that demand. The More Moore section of the roadmap projects that advanced logic will drive higher-resolution patterning, rather than memory chips. Potential solutions for patterning future logic nodes can be derived as extensions of `next-generation' patterning technologies currently under development. Advanced patterning has made great progress, and two `next-generation' patterning technologies, EUV and nanoimprint lithography, have potential to be in production as early as 2018. The potential adoption of two different next-generation patterning technologies suggests that patterning technology is becoming more specialized. This is good for the industry in that it lowers overall costs, but may lead to slower progress in extending any one patterning technology in the future.
NASA Technical Reports Server (NTRS)
Regenie, Victoria
2005-01-01
Contents include the following: General Background and Introduction of Capability. Roadmaps for Systems Engineering Cost/Risk Analysis. Agency Objectives. Strategic Planning Transformation. Review Capability Roadmaps and Schedule. Review Purpose of NRC Review. Capability Roadmap Development (Progress to Date).
NASA Technical Reports Server (NTRS)
Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.
1990-01-01
An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of using a probabilistic approximate analysis in determining efficient solution strategies.
Probabilistic classifiers with high-dimensional data
Kim, Kyung In; Simon, Richard
2011-01-01
For medical classification problems, it is often desirable to have a probability associated with each class. Probabilistic classifiers have received relatively little attention for small n, large p classification problems despite their importance in medical decision making. In this paper, we introduce 2 criteria for assessment of probabilistic classifiers, well-calibratedness and refinement, and develop corresponding evaluation measures. We evaluated several published high-dimensional probabilistic classifiers and developed 2 extensions of the Bayesian compound covariate classifier. Based on simulation studies and analysis of gene expression microarray data, we found that proper probabilistic classification is more difficult than deterministic classification. It is important to ensure that a probabilistic classifier is well calibrated, or at least not "anticonservative", using the methods developed here. We provide this evaluation for several probabilistic classifiers and also evaluate their refinement as a function of sample size under weak and strong signal conditions. We also present a cross-validation method for evaluating the calibration and refinement of any probabilistic classifier on any data set. PMID:21087946
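The well-calibratedness criterion described above lends itself to a simple check: bin the predicted class probabilities and compare each bin's mean prediction with the observed class frequency. The sketch below illustrates this idea on synthetic data; it is not the authors' evaluation measure, and the bin count and data are assumptions for illustration.

```python
import numpy as np

def calibration_by_bins(pred_prob, true_label, n_bins=10):
    """Compare mean predicted probability with observed frequency per bin.

    A well-calibrated classifier has observed frequencies close to its
    predicted probabilities; observed > predicted in a bin suggests an
    anticonservative (overconfident) classifier there.
    """
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(pred_prob, bins) - 1, 0, n_bins - 1)
    rows = []
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            rows.append((pred_prob[mask].mean(), true_label[mask].mean(), mask.sum()))
    return rows

rng = np.random.default_rng(0)
p = rng.uniform(size=2000)                       # hypothetical predicted probabilities
y = (rng.uniform(size=2000) < p).astype(float)   # labels drawn consistently with p
for mean_p, obs_freq, n in calibration_by_bins(p, y):
    print(f"predicted {mean_p:.2f}  observed {obs_freq:.2f}  (n={n})")
```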
Failed rib region prediction in a human body model during crash events with precrash braking.
Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S
2018-02-28
The objective of this study is 2-fold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method for rib fracture predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but 1 case, where they were equal. The failed region patterns between models are similar; however, differences arise because element elimination redistributes stress, causing probabilistic failed regions to continue to accumulate after the deterministic method would predict no further failed regions. Both the probabilistic and deterministic methods indicate similar trends with regard to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and is more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.
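The contrast between the two failure rules can be made concrete. The sketch below applies the fixed 1.8% effective plastic strain threshold quoted above and, for comparison, an illustrative lognormal fragility standing in for the age-based strain and failure functions; the strain field and fragility parameters are invented for illustration, not taken from the study.

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(1)

# Hypothetical effective plastic strains for 1000 rib cortical bone elements.
strain = rng.lognormal(mean=np.log(0.012), sigma=0.5, size=1000)

# Deterministic rule: an element fails once strain exceeds the fixed
# 1.8% effective plastic strain threshold.
det_failed = strain > 0.018

# Probabilistic rule (illustrative stand-in for the age-based functions):
# a lognormal fragility assigns each element a failure probability.
theta, beta = 0.018, 0.3             # hypothetical median and log-std of failure strain
p_fail = lognorm.cdf(strain, s=beta, scale=theta)
prob_failed = rng.uniform(size=strain.size) < p_fail

print(f"deterministic failures:  {det_failed.sum()}")
print(f"probabilistic failures:  {prob_failed.sum()} (one random realization)")
```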
Human Planetary Landing System (HPLS) Capability Roadmap NRC Progress Review
NASA Technical Reports Server (NTRS)
Manning, Rob; Schmitt, Harrison H.; Graves, Claude
2005-01-01
Capability Roadmap Team. Capability Description, Scope and Capability Breakdown Structure. Benefits of the HPLS. Roadmap Process and Approach. Current State-of-the-Art, Assumptions and Key Requirements. Top Level HPLS Roadmap. Capability Presentations by Leads. Mission Drivers Requirements. "AEDL" System Engineering. Communication & Navigation Systems. Hypersonic Systems. Super to Subsonic Decelerator Systems. Terminal Descent and Landing Systems. A Priori In-Situ Mars Observations. AEDL Analysis, Test and Validation Infrastructure. Capability Technical Challenges. Capability Connection Points to other Roadmaps/Crosswalks. Summary of Top Level Capability. Forward Work.
Dominating Scale-Free Networks Using Generalized Probabilistic Methods
Molnár, F.; Derzsy, N.; Czabarka, É.; Székely, L.; Szymanski, B. K.; Korniss, G.
2014-01-01
We study ensemble-based graph-theoretical methods aiming to approximate the size of the minimum dominating set (MDS) in scale-free networks. We analyze both analytical upper bounds of dominating sets and numerical realizations for applications. We propose two novel probabilistic dominating set selection strategies that are applicable to heterogeneous networks. One of them obtains the smallest probabilistic dominating set and also outperforms the deterministic degree-ranked method. We show that a degree-dependent probabilistic selection method becomes optimal in its deterministic limit. In addition, we also find the precise limit where selecting high-degree nodes exclusively becomes inefficient for network domination. We validate our results on several real-world networks, and provide highly accurate analytical estimates for our methods. PMID:25200937
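A minimal sketch of the general idea of degree-dependent probabilistic selection for domination, with a repair pass to guarantee coverage. The selection rule, parameters, and example graph below are hypothetical illustrations, not the strategies proposed in the paper.

```python
import random

def probabilistic_dominating_set(adj, alpha=1.0, seed=0):
    """Degree-dependent random selection, then patching so every node is dominated.

    Each node joins the set with a probability that increases with its degree
    (a simple illustrative rule); any node left undominated afterwards adds
    its highest-degree neighbour (or itself, if isolated) to repair coverage.
    """
    rng = random.Random(seed)
    dom = set()
    for v, nbrs in adj.items():
        k = len(nbrs)
        p = min(1.0, alpha * k / (k + 10.0))   # hypothetical degree-dependent rule
        if rng.random() < p:
            dom.add(v)
    # Repair pass: guarantee every node is in the set or adjacent to it.
    for v, nbrs in adj.items():
        if v not in dom and not any(u in dom for u in nbrs):
            dom.add(max(nbrs, key=lambda u: len(adj[u]), default=v))
    return dom

# Tiny star-plus-path example graph as an adjacency dict.
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0, 5], 5: [4]}
print(sorted(probabilistic_dominating_set(adj)))
```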
Is probabilistic bias analysis approximately Bayesian?
MacLehose, Richard F.; Gustafson, Paul
2011-01-01
Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
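The iterative sampling method mentioned above can be illustrated in a few lines: draw sensitivity and specificity from priors, back-correct the observed 2x2 table with the standard matrix-method formulas, and accumulate the corrected odds ratios. The counts and prior ranges below are invented for illustration.

```python
import numpy as np

# Observed 2x2 table from a hypothetical case-control study:
a, b = 215, 285    # cases:    exposed, unexposed
c, d = 120, 380    # controls: exposed, unexposed

rng = np.random.default_rng(42)
ors = []
for _ in range(50_000):
    # Draw classification sensitivity/specificity from uniform priors;
    # cases are allowed different values (differential misclassification).
    se_case, sp_case = rng.uniform(0.80, 0.95), rng.uniform(0.90, 0.99)
    se_ctrl, sp_ctrl = rng.uniform(0.70, 0.90), rng.uniform(0.90, 0.99)
    # Back-correct observed counts to expected true counts.
    A = (a - (1 - sp_case) * (a + b)) / (se_case + sp_case - 1)
    C = (c - (1 - sp_ctrl) * (c + d)) / (se_ctrl + sp_ctrl - 1)
    B, D = (a + b) - A, (c + d) - C
    if min(A, B, C, D) > 0:                     # discard impossible corrections
        ors.append((A * D) / (B * C))

ors = np.array(ors)
print(f"median corrected OR: {np.median(ors):.2f}  "
      f"95% simulation interval: ({np.percentile(ors, 2.5):.2f}, "
      f"{np.percentile(ors, 97.5):.2f})")
```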
Global/local methods for probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Wu, Y.-T.
1993-01-01
A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a local, more refined model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computational savings with minimal loss in accuracy.
Book of Knowledge (BOK) for NASA Electronic Packaging Roadmap
NASA Technical Reports Server (NTRS)
Ghaffarian, Reza
2015-01-01
The objective of this document is to update the NASA roadmap on packaging technologies (initially released in 2007) and to present the current trends toward further reducing size and increasing functionality. Due to the breadth of work being performed in the area of microelectronics packaging, this report presents only a number of key packaging technologies detailed in three industry roadmaps for conventional microelectronics and a more recently introduced roadmap for organic and printed electronics applications. The topics for each category were down-selected by reviewing the 2012 reports of the International Technology Roadmap for Semiconductors (ITRS), the 2013 roadmap reports of the International Electronics Manufacturing Initiative (iNEMI), the 2013 roadmap of IPC (Association Connecting Electronics Industries), and the roadmap of the Organic and Printed Electronics Association (OE-A). The report also summarizes the results of numerous articles and websites specifically discussing the trends in microelectronics packaging technologies.
Probabilistic drug connectivity mapping
2014-01-01
Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351
Comprehensive Smart Grid Planning in a Regulated Utility Environment
NASA Astrophysics Data System (ADS)
Turner, Matthew; Liao, Yuan; Du, Yan
2015-06-01
This paper presents the tools and exercises used during the Kentucky Smart Grid Roadmap Initiative in a collaborative electric grid planning process involving state regulators, public utilities, academic institutions, and private interest groups. The mandate of the initiative was to assess the existing condition of smart grid deployments in Kentucky, to enhance understanding of smart grid concepts by stakeholders, and to develop a roadmap for the deployment of smart grid technologies by the jurisdictional utilities of Kentucky. Through involvement of many important stakeholder groups, the resultant Smart Grid Deployment Roadmap proposes an aggressive yet achievable strategy and timetable designed to promote enhanced availability, security, efficiency, reliability, affordability, sustainability and safety of the electricity supply throughout the state while maintaining Kentucky's nationally competitive electricity rates. The models and methods developed for this exercise can be utilized as a systematic process for the planning of coordinated smart grid deployments.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
Notice announcing the availability of the draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft), for public review and comment.
Probabilistic Learning in Junior High School: Investigation of Student Probabilistic Thinking Levels
NASA Astrophysics Data System (ADS)
Kurniasih, R.; Sujadi, I.
2017-09-01
This paper investigates students' levels of probabilistic thinking, that is, their reasoning about probabilistic or uncertain matters in probability material. The subjects were 8th-grade junior high school students. The main instrument was the researcher, supported by a probabilistic thinking skills test and interview guidelines. Data were analyzed using the triangulation method. The results showed that before instruction the students' probabilistic thinking was at the subjective and transitional levels; after instruction, their levels changed, and some 8th-grade students reached the highest, numerical level of probabilistic thinking. Students' probabilistic thinking levels can be used as a reference for designing learning materials and strategies.
Global industry status report and roadmap for high performance displays
NASA Astrophysics Data System (ADS)
Bardsley, J. Norman; Pinnel, M. Robert
2003-09-01
A summary is provided of a comprehensive industry status report and roadmap available from www.usdc.org. Continued improvements in LCD technology are being driven by home entertainment applications, leading to better color and video response. Competing technologies, such as PDP, OLED, and electronic paper, must either exploit inherent advantages for such applications or focus on other market niches that are not being addressed well by mainline LCD technology. Flexible displays provide an opportunity for innovative technologies and manufacturing methods, but appear to bring no killer applications.
Comparison of probabilistic and deterministic fiber tracking of cranial nerves.
Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H
2017-09-01
OBJECTIVE The depiction of cranial nerves (CNs) using diffusion tensor imaging (DTI) is of great interest in skull base tumor surgery and DTI used with deterministic tracking methods has been reported previously. However, there are still no good methods usable for the elimination of noise from the resulting depictions. The authors have hypothesized that probabilistic tracking could lead to more accurate results, because it more efficiently extracts information from the underlying data. Moreover, the authors have adapted a previously described technique for noise elimination using gradual threshold increases to probabilistic tracking. To evaluate the utility of this new approach, a comparison is provided with this work between the gradual threshold increase method in probabilistic and deterministic tracking of CNs. METHODS Both tracking methods were used to depict CNs II, III, V, and the VII+VIII bundle. Depiction of 240 CNs was attempted with each of the above methods in 30 healthy subjects, which were obtained from 2 public databases: the Kirby repository (KR) and Human Connectome Project (HCP). Elimination of erroneous fibers was attempted by gradually increasing the respective thresholds (fractional anisotropy [FA] and probabilistic index of connectivity [PICo]). The results were compared with predefined ground truth images based on corresponding anatomical scans. Two label overlap measures (false-positive error and Dice similarity coefficient) were used to evaluate the success of both methods in depicting the CN. Moreover, the differences between these parameters obtained from the KR and HCP (with higher angular resolution) databases were evaluated. Additionally, visualization of 10 CNs in 5 clinical cases was attempted with both methods and evaluated by comparing the depictions with intraoperative findings. RESULTS Maximum Dice similarity coefficients were significantly higher with probabilistic tracking (p < 0.001; Wilcoxon signed-rank test). The false-positive error of the last obtained depiction was also significantly lower in probabilistic than in deterministic tracking (p < 0.001). The HCP data yielded significantly better results in terms of the Dice coefficient in probabilistic tracking (p < 0.001, Mann-Whitney U-test) and in deterministic tracking (p = 0.02). The false-positive errors were smaller in HCP data in deterministic tracking (p < 0.001) and showed a strong trend toward significance in probabilistic tracking (p = 0.06). In the clinical cases, the probabilistic method visualized 7 of 10 attempted CNs accurately, compared with 3 correct depictions with deterministic tracking. CONCLUSIONS High angular resolution DTI scans are preferable for the DTI-based depiction of the cranial nerves. Probabilistic tracking with a gradual PICo threshold increase is more effective for this task than the previously described deterministic tracking with a gradual FA threshold increase and might represent a method that is useful for depicting cranial nerves with DTI since it eliminates the erroneous fibers without manual intervention.
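The gradual threshold increase and the two overlap measures used above are straightforward to express in code. A toy sketch on 2D binary masks with a synthetic PICo map; the masks, thresholds, and stopping rule are simplified stand-ins for the study's procedure, not its implementation.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def false_positive_error(tract, truth):
    """Fraction of depicted voxels falling outside the ground-truth mask."""
    return np.logical_and(tract, ~truth).sum() / max(tract.sum(), 1)

def best_threshold(value_map, truth, thresholds):
    """Gradually raise the threshold (FA or PICo) and keep the best depiction."""
    best = None
    for t in thresholds:
        mask = value_map > t
        if mask.sum() == 0:          # threshold so high that nothing remains
            break
        score = dice(mask, truth)
        if best is None or score > best[1]:
            best = (t, score, false_positive_error(mask, truth))
    return best

rng = np.random.default_rng(3)
truth = np.zeros((32, 32), bool)
truth[10:20, 14:18] = True                       # toy "nerve" ground-truth mask
pico = rng.uniform(0.0, 0.4, truth.shape)        # synthetic connectivity map
pico[truth] += rng.uniform(0.3, 0.6, truth.sum())
t, d, fpe = best_threshold(pico, truth, np.linspace(0.05, 0.9, 18))
print(f"threshold={t:.2f}  Dice={d:.2f}  false-positive error={fpe:.2f}")
```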
Scientific Assessment of NASA's Solar System Exploration Roadmap
NASA Technical Reports Server (NTRS)
1996-01-01
At its June 24-28, 1996, meeting, the Space Studies Board's Committee on Planetary and Lunar Exploration (COMPLEX), chaired by Ronald Greeley of Arizona State University, conducted an assessment of NASA's Mission to the Solar System Roadmap report. This assessment was made at the specific request of Dr. Jurgen Rahe, NASA's science program director for solar system exploration. The assessment includes consideration of the process by which the Roadmap was developed, comparison of the goals and objectives of the Roadmap with published National Research Council (NRC) recommendations, and suggestions for improving the Roadmap.
Flight Avionics Hardware Roadmap
NASA Technical Reports Server (NTRS)
Hodson, Robert; McCabe, Mary; Paulick, Paul; Ruffner, Tim; Some, Rafi; Chen, Yuan; Vitalpur, Sharada; Hughes, Mark; Ling, Kuok; Redifer, Matt, et al.
2013-01-01
As part of NASA's Avionics Steering Committee's stated goal to advance the avionics discipline ahead of program and project needs, the committee initiated a multi-Center technology roadmapping activity to create a comprehensive avionics roadmap. The roadmap is intended to strategically guide avionics technology development to effectively meet future NASA missions needs. The scope of the roadmap aligns with the twelve avionics elements defined in the ASC charter, but is subdivided into the following five areas: Foundational Technology (including devices and components), Command and Data Handling, Spaceflight Instrumentation, Communication and Tracking, and Human Interfaces.
Probabilistic Geoacoustic Inversion in Complex Environments
2015-09-30
Jan Dettmer, School of Earth and Ocean Sciences, University of Victoria, Victoria, BC. Long-range inversion methods can fail to provide sufficient resolution, and for proper quantitative examination of variability, parameter uncertainty must be considered. This project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types.
NASA Technical Reports Server (NTRS)
Townsend, John S.; Peck, Jeff; Ayala, Samuel
2000-01-01
NASA has funded several major programs (the Probabilistic Structural Analysis Methods Project is an example) to develop probabilistic structural analysis methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element software code, known as Numerical Evaluation of Stochastic Structures Under Stress (NESSUS), is used to determine the reliability of a critical weld of the Space Shuttle solid rocket booster aft skirt. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process. Also, analysis findings are compared with measured Space Shuttle flight data.
Application of the Probabilistic Dynamic Synthesis Method to the Analysis of a Realistic Structure
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ferri, Aldo A.
1998-01-01
The Probabilistic Dynamic Synthesis method is a new technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. A previous work verified the feasibility of the PDS method on a simple seven degree-of-freedom spring-mass system. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.
Application of the Probabilistic Dynamic Synthesis Method to Realistic Structures
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ferri, Aldo A.
1998-01-01
The Probabilistic Dynamic Synthesis method is a technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. In previous work, the feasibility of the PDS method applied to a simple seven degree-of-freedom spring-mass system was verified. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.
Use of adjoint methods in the probabilistic finite element approach to fracture mechanics
NASA Technical Reports Server (NTRS)
Liu, Wing Kam; Besterfield, Glen; Lawrence, Mark; Belytschko, Ted
1988-01-01
The adjoint method approach to probabilistic finite element methods (PFEM) is presented. When the number of objective functions is small compared to the number of random variables, the adjoint method is far superior to the direct method in evaluating the objective function derivatives with respect to the random variables. The PFEM is extended to probabilistic fracture mechanics (PFM) using an element which has the near crack-tip singular strain field embedded. Since only two objective functions (i.e., mode I and II stress intensity factors) are needed for PFM, the adjoint method is well suited.
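The computational argument for the adjoint method can be seen on a small linear system K(θ)u = f with a linear objective g = qᵀu (a stand-in for a stress intensity factor extracted from the solution): the direct method needs one extra solve per random variable, while the adjoint method needs one solve per objective, reused across all variables. A minimal sketch with random stand-in matrices; none of the quantities are from the paper.

```python
import numpy as np

n = 5
rng = np.random.default_rng(7)
K0 = rng.random((n, n)); K0 = K0 @ K0.T + n * np.eye(n)   # SPD base "stiffness"
dK = rng.random((n, n)); dK = 0.5 * (dK + dK.T)           # dK/dtheta (illustrative)
f = rng.random(n)                                          # load vector
q = rng.random(n)                                          # objective: g(u) = q @ u

u = np.linalg.solve(K0, f)

# Direct method: differentiate K u = f, one extra solve per random variable.
du = np.linalg.solve(K0, -(dK @ u))
dg_direct = q @ du

# Adjoint method: one solve per objective, reused for every random variable.
lam = np.linalg.solve(K0.T, q)
dg_adjoint = -lam @ (dK @ u)

print(f"direct: {dg_direct:.6f}   adjoint: {dg_adjoint:.6f}")  # identical values
```

With two objectives (the mode I and II stress intensity factors) and many random variables, the adjoint route costs two extra solves total, which is exactly the economy the abstract describes.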
Web-based Academic Roadmaps for Careers in the Geosciences
NASA Astrophysics Data System (ADS)
Murray, D. P.; Veeger, A. I.; Grossman-Garber, D.
2007-12-01
To a greater extent than most science programs, geology is underrepresented in K-12 curricula and the media. Thus potential majors have scant knowledge of academic requirements and career trajectories, and their idea of what geologists do--if they have one at all--is outdated. We have addressed these concerns by developing a dynamic, web-based academic roadmap for current and prospective students, their families, and others who are contemplating careers in the geosciences. The goals of this visually attractive "educational pathway" are not only to improve student recruitment and retention, but also to empower student learning by creating better communication and advising tools that can render our undergraduate program transparent for learners and their families. Although we have developed academic roadmaps for four environmental and life science programs at the University of Rhode Island, we focus here on the roadmap for the geosciences, which illustrates educational pathways along the academic and early-career continuum for current and potential (i.e., high school) students who are considering the earth sciences. In essence, the Geosciences Academic Roadmap is a "one-stop" portal to the discipline. It includes user-friendly information about our curriculum, outcomes (which at URI are tightly linked to performance in courses and the major), extracurricular activities (e.g., field camp, internships), careers, graduate programs, and training. In the presentation of this material, extensive use is made of streaming video, interviews with students and earth scientists, and links to other relevant sites. Moreover, through the use of "Hot Topics", particular attention is paid to ensure that examples of geoscience activities are not only of relevance to today's students, but show geologists using the modern methods of the discipline in exciting ways. Although this is a "work-in-progress", evaluation of the sites, by high school through graduate students, has been strongly positive. Our presentation will include a demonstration of the Academic Roadmap, and a template that can be used by other geoscience departments to easily design websites.
Fleischhacker, Sheila E; Ballard, Rachel M; Starke-Reed, Pamela E; Galuska, Deborah A; Neuhouser, Marian L
2017-10-01
The Interagency Committee on Human Nutrition Research (ICHNR) is charged with improving the planning, coordination, and communication among federal agencies engaged in nutrition research and with facilitating the development and updating of plans for federal research programs to meet current and future domestic and international needs for nutrition. The ICHNR is co-chaired by the USDA Under Secretary for Research, Education, and Economics and Chief Scientist and the US Department of Health and Human Services Assistant Secretary for Health and is made up of >10 departments and agencies. Once the ICHNR was reassembled after a 10-y hiatus, the ICHNR recognized a need for a written roadmap to identify critical human nutrition research gaps and opportunities. This commentary provides an overview of the process the ICHNR undertook to develop a first-of-its-kind National Nutrition Research Roadmap, which was publicly released on 4 March 2016. The primary audience for the Roadmap is federal science agency leaders, along with relevant program and policy staff who rely on federally supported human nutrition research, in addition to the broader scientific community. The Roadmap is framed around the following 3 questions: 1) How can we better understand and define eating patterns to improve and sustain health? 2) What can be done to help people choose healthy eating patterns? 3) How can we develop and engage innovative methods and systems to accelerate discoveries in human nutrition? Within these 3 questions, 11 topical areas were identified on the basis of the following criteria: population impact, feasibility given current technological capacities, and emerging scientific opportunities. This commentary highlights initial federal and some professional research society efforts to address the Roadmap's research and resource priorities. We conclude by noting examples of early collaborations and partnerships to move human nutrition research forward in the 21st century. © 2017 American Society for Nutrition.
2017-01-01
Background The Right Size Roadmap was developed by the Association of Public Health Laboratories and the Centers for Disease Control and Prevention to improve influenza virologic surveillance efficiency. Guidelines were provided to state health departments regarding representativeness and statistical estimates of specimen numbers needed for seasonal influenza situational awareness, rare or novel influenza virus detection, and rare or novel influenza virus investigation. Objective The aim of this study was to compare Roadmap sampling recommendations with Idaho’s influenza virologic surveillance to determine implementation feasibility. Methods We calculated the proportion of medically attended influenza-like illness (MA-ILI) from Idaho’s influenza-like illness surveillance among outpatients during October 2008 to May 2014, applied data to Roadmap-provided sample size calculators, and compared calculations with actual numbers of specimens tested for influenza by the Idaho Bureau of Laboratories (IBL). We assessed representativeness among patients’ tested specimens to census estimates by age, sex, and health district residence. Results Among outpatients surveilled, Idaho’s mean annual proportion of MA-ILI was 2.30% (20,834/905,818) during a 5-year period. Thus, according to Roadmap recommendations, Idaho needs to collect 128 specimens from MA-ILI patients/week for situational awareness, 1496 influenza-positive specimens/week for detection of a rare or novel influenza virus at 0.2% prevalence, and after detection, 478 specimens/week to confirm true prevalence is ≤2% of influenza-positive samples. The mean number of respiratory specimens Idaho tested for influenza/week, excluding the 2009-2010 influenza season, ranged from 6 to 24. Various influenza virus types and subtypes were collected and specimen submission sources were representative in terms of geographic distribution, patient age range and sex, and disease severity. Conclusions Insufficient numbers of respiratory specimens are submitted to IBL for influenza laboratory testing. Increased specimen submission would facilitate meeting Roadmap sample size recommendations. PMID:28838883
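The 1496-specimens figure quoted above appears consistent with the standard sample-size formula for detecting at least one positive at a given prevalence, n = ln(1 − confidence)/ln(1 − prevalence). A sketch, assuming 95% confidence and independent sampling; the Roadmap calculators may round differently or apply finite-population corrections.

```python
import math

def specimens_for_detection(prevalence, confidence=0.95):
    """Smallest n such that P(at least one positive) >= confidence,
    assuming positives occur independently at the given prevalence."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

# 0.2% prevalence at 95% confidence gives ~1497, close to the 1496
# influenza-positive specimens/week quoted in the abstract.
print(specimens_for_detection(0.002))
```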
A probabilistic Hu-Washizu variational principle
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Besterfield, G. H.
1987-01-01
A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
Dynamic Probabilistic Instability of Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2009-01-01
A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element analysis, composite mechanics, and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, fiber volume ratio, fiber longitudinal modulus, dynamic load, and loading rate are the dominant uncertainties, in that order.
Probabilistic boundary element method
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Raveendra, S. T.
1989-01-01
The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM) which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results to a set of random variables. As such, PBEM performs analogous to other structural analysis codes such as finite elements in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only, thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.
NASA Technical Reports Server (NTRS)
Price, J. M.; Ortega, R.
1998-01-01
Probabilistic methods are not a universally accepted approach for the design and analysis of aerospace structures. The validity of this approach must be demonstrated to encourage its acceptance as a viable design and analysis tool for estimating structural reliability. The objective of this study is to develop a well-characterized finite population of similar aerospace structures that can be used to (1) validate probabilistic codes, (2) demonstrate the basic principles behind probabilistic methods, (3) formulate general guidelines for characterization of material drivers (such as elastic modulus) when limited data are available, and (4) investigate how the drivers affect the results of sensitivity analysis at the component/failure-mode level.
NASA Technical Reports Server (NTRS)
Ryan, Robert S.; Townsend, John S.
1993-01-01
The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.
The USET Tribal-FERST Roadmap was developed by the United South and Eastern Tribes (USET), in collaboration with the EPA, as a general roadmap for other tribes to follow and modify as needed for their unique applications.
Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona
2016-01-01
Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
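The matched-weight and Bayes-theorem steps described above can be sketched in a few lines of Fellegi-Sunter-style arithmetic. The fields, m- and u-probabilities, and prior odds below are invented for illustration.

```python
import math

# Illustrative m- and u-probabilities for three matching fields:
# m = P(field agrees | records truly match), u = P(agrees | non-match).
FIELDS = {"surname": (0.95, 0.01), "birth_year": (0.98, 0.05), "sex": (0.99, 0.50)}

def match_weight(agreement):
    """Sum of log2 likelihood ratios over fields (the matched weight)."""
    w = 0.0
    for field, (m, u) in FIELDS.items():
        if agreement[field]:
            w += math.log2(m / u)          # agreement weight
        else:
            w += math.log2((1 - m) / (1 - u))  # disagreement weight (negative)
    return w

def posterior_match_prob(weight, prior_match_odds):
    """Bayes theorem on the odds scale: posterior odds = likelihood ratio * prior odds."""
    posterior_odds = (2.0 ** weight) * prior_match_odds
    return posterior_odds / (1.0 + posterior_odds)

agree = {"surname": True, "birth_year": True, "sex": False}
w = match_weight(agree)
# Prior odds, e.g., expected true matches divided by candidate pairs compared.
print(f"weight = {w:.2f}, posterior P(match) = {posterior_match_prob(w, 1/1000):.3f}")
```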
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
A probabilistic and continuous model of protein conformational space for template-free modeling.
Zhao, Feng; Peng, Jian; Debartolo, Joe; Freed, Karl F; Sosnick, Tobin R; Xu, Jinbo
2010-06-01
One of the major challenges with protein template-free modeling is an efficient sampling algorithm that can explore a huge conformation space quickly. The popular fragment assembly method constructs a conformation by stringing together short fragments extracted from the Protein Data Bank (PDB). The discrete nature of this method may limit generated conformations to a subspace in which the native fold does not belong. Another worry is that a protein with a truly novel fold may contain some fragments not in the PDB. This article presents a probabilistic model of protein conformational space to overcome the above two limitations. This probabilistic model employs directional statistics to model the distribution of backbone angles and second-order Conditional Random Fields (CRFs) to describe the sequence-angle relationship. Using this probabilistic model, we can sample protein conformations in a continuous space, as opposed to the widely used fragment assembly and lattice model methods that work in a discrete space. We show that when coupled with a simple energy function, this probabilistic method compares favorably with the fragment assembly method in the blind CASP8 evaluation, especially on alpha or small beta proteins. To our knowledge, this is the first probabilistic method that can search conformations in a continuous space and achieves favorable performance. Our method also generated three-dimensional (3D) models better than template-based methods for a couple of CASP8 hard targets. The method described in this article can also be applied to protein loop modeling, model refinement, and even RNA tertiary structure prediction.
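The use of directional statistics for backbone angles can be illustrated with a toy mixture of von Mises distributions over (phi, psi). The modes, concentrations, and weights below are invented and far simpler than the paper's sequence-conditioned CRF model; they merely show continuous angle sampling.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical two-component directional model for (phi, psi), in radians:
# one mode near the alpha-helical region, one near the beta-sheet region.
MODES = [
    {"mu_phi": np.radians(-60.0),  "mu_psi": np.radians(-45.0), "kappa": 8.0, "w": 0.6},
    {"mu_phi": np.radians(-120.0), "mu_psi": np.radians(135.0), "kappa": 6.0, "w": 0.4},
]

def sample_residue_angles(n):
    """Sample (phi, psi) pairs from the mixture of von Mises distributions."""
    choice = rng.choice(len(MODES), size=n, p=[m["w"] for m in MODES])
    phi = np.empty(n); psi = np.empty(n)
    for i, c in enumerate(choice):
        m = MODES[c]
        phi[i] = rng.vonmises(m["mu_phi"], m["kappa"])
        psi[i] = rng.vonmises(m["mu_psi"], m["kappa"])
    return np.degrees(phi), np.degrees(psi)

phi, psi = sample_residue_angles(5)
for a, b in zip(phi, psi):
    print(f"phi = {a:7.1f}, psi = {b:7.1f}")
```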
River Protection Project Technology and Innovation Roadmap.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reid, D. S.; Wooley, T. A.; Kelly, S. E.
The Technology and Innovation Roadmap is a planning tool for WRPS management, DOE ORP, DOE EM, and others to understand the risks and technology gaps associated with the RPP mission. The roadmap identifies and prioritizes technical areas that require technology solutions and underscores where timely and appropriate technology development can have the greatest impact to reduce those risks and uncertainties. The roadmap also serves as a tool for determining allocation of resources.
Research & Development Roadmap for Next-Generation Appliances
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goetzler, William; Sutherland, Timothy; Foley, Kevin
2012-03-01
Appliances present an attractive opportunity for near-term energy savings in existing buildings, because they are less expensive and replaced more regularly than heating, ventilation, and air-conditioning (HVAC) systems or building envelope components. This roadmap targets high-priority research and development (R&D), demonstration, and commercialization activities that could significantly reduce residential appliance energy consumption. The main objective of the roadmap is to seek activities that accelerate the commercialization of high-efficiency appliance technologies while maintaining the competitiveness of American industry. The roadmap identified and evaluated potential technical innovations, defined research needs, created preliminary research and development roadmaps, and obtained stakeholder feedback on the proposed initiatives.
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division activities include identification and fulfillment of joint industry, government, and academia needs for development and implementation of RMSL technologies. Four projects in the Probabilistic Methods area and two in the RMSL area have been identified. These are: (1) Evaluation of Probabilistic Technology - progress has been made toward the selection of probabilistic application cases. Future effort will focus on assessment of multiple probabilistic software packages in solving selected engineering problems using probabilistic methods. Relevance to Industry & Government - case studies of typical problems involving uncertainties, results of solutions to these problems run by different codes, and recommendations on which code is applicable for which problems; (2) Probabilistic Input Preparation - progress has been made in identifying problem cases such as those with no data, little data, and sufficient data. Future effort will focus on developing guidelines for preparing input for probabilistic analysis, especially with no or little data. Relevance to Industry & Government - too often, we get bogged down thinking we need a lot of data before we can quantify uncertainties. Not true; there are ways to do credible probabilistic analysis with little data; (3) Probabilistic Reliability - a probabilistic reliability literature search has been completed, along with a description of what differentiates it from statistical reliability. Work on computation of reliability based on quantification of uncertainties in primitive variables is in progress. Relevance to Industry & Government - correct reliability computations at both the component and system level are needed so an item can be designed based on its expected usage and life span; (4) Real-World Applications of Probabilistic Methods (PM) - a draft of Volume 1, comprising aerospace applications, has been released. Volume 2, a compilation of real-world applications of probabilistic methods with essential information demonstrating application type and time/cost savings from the use of probabilistic methods for generic applications, is in progress. Relevance to Industry & Government - too often, we say "the proof is in the pudding"; with help from many contributors, we hope to produce such a document. The problem is that few contributors come forward, owing to the proprietary nature of the data, so we ask them to document only minimal information: problem description, the method used, whether it resulted in any savings, and how much; (5) Software Reliability - software reliability concepts, programs, implementation, guidelines, and standards are being documented. Relevance to Industry & Government - software reliability is a complex issue that must be understood and addressed in all facets of business in industry, government, and other institutions. We address issues, concepts, ways to implement solutions, and guidelines for maximizing software reliability; (6) Maintainability Standards - maintainability/serviceability industry standards/guidelines and industry best practices and methodologies used in performing maintainability/serviceability tasks are being documented. Relevance to Industry & Government - any industry or government process, project, or tool must be maintained and serviced to realize the life and performance it was designed for. We address issues and develop guidelines for optimum performance and life.
Contains basic information on the role and origins of the Selected Analytical Methods, including the formation of the Homeland Security Laboratory Capacity Work Group and the Environmental Evaluation Analytical Process Roadmap for Homeland Security Events.
Challenges for Product Roadmapping in Inter-company Collaboration
NASA Astrophysics Data System (ADS)
Suomalainen, Tanja; Tihinen, Maarit; Parviainen, Päivi
Product roadmapping is a critical activity in product development, as it provides a link between business aspects and requirements engineering and thus helps to manage a high-level view of the company’s products. Nowadays, inter-company collaboration, such as outsourcing, is a common way of developing software products, as through collaboration organisations gain advantages such as flexibility with in-house resources, savings in product development costs, and a physical presence in important markets. The role of product roadmapping becomes even more critical in collaborative settings, since different companies need to align strategies and work together to create products. In order to support companies in improving their own product roadmapping processes, this paper first gives an overview of product roadmapping and then discusses in detail an empirical study of the current practices in industry. The presented results particularly focus on the most challenging and important activities of product roadmapping in collaboration.
Probabilistic simulation of stress concentration in composite laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, L.
1993-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties while probabilistic finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities and by initial stress fields.
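The coupling of uncertain inputs to a stress concentration factor can be illustrated with plain Monte Carlo. The sketch below propagates hypothetical geometry scatter through a textbook curve fit for a finite-width plate with a central circular hole under tension, standing in for the probabilistic finite element step; the distributions and dimensions are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Uncertain geometry: hole diameter d and plate width w (hypothetical stats).
d = rng.normal(10.0, 0.15, n)    # mm
w = rng.normal(50.0, 0.50, n)    # mm
r = d / w

# Textbook polynomial curve fit for Kt of a finite-width plate with a
# central circular hole under tension; a stand-in for an FE model.
kt = 3.0 - 3.14 * r + 3.667 * r**2 - 1.527 * r**3

kt.sort()                        # sorted samples give the empirical CDF
print(f"mean Kt = {kt.mean():.3f}, std = {kt.std():.4f}")
print(f"99th percentile Kt = {kt[int(0.99 * n)]:.3f}")
```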
Reliability and risk assessment of structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1991-01-01
Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.
A technology roadmap of smart biosensors from conventional glucose monitoring systems.
Shende, Pravin; Sahu, Pratiksha; Gaud, Ram
2017-06-01
The objective of this review article is to trace the technology roadmap from conventional glucose monitoring systems to smart biosensors. Glucose estimation with commercially available devices involves analysis of blood samples obtained by pricking the finger or extracting blood from the forearm. Since pain and discomfort are associated with invasive methods, non-invasive measurement techniques have been investigated. Non-invasive methods avoid exposure to sharp objects such as needles and syringes, which increases testing frequency, improves control of glucose concentration, and eliminates pain and biohazardous materials. This review describes recent invasive techniques and the major non-invasive techniques, viz. biosensors, optical techniques, and sensor-embedded contact lenses, for glucose estimation.
Rapid Cost Assessment of Space Mission Concepts Through Application of Complexity-Based Cost Indices
NASA Technical Reports Server (NTRS)
Peterson, Craig E.; Cutts, James; Balint, Tibor; Hall, James B.
2008-01-01
This slide presentation reviews the development of rapid cost assessment models for evaluating exploration missions through the application of complexity-based cost indices. In the fall of 2004, NASA began developing 13 documents, known as "strategic roadmaps," intended to outline a strategy for space exploration over the next 30 years. The Third Strategic Roadmap, the Strategic Roadmap for Solar System Exploration, focused on strategy for robotic exploration of the Solar System. Development of the Strategic Roadmap for Solar System Exploration led to the investigation of a large variety of missions. However, the necessity of planning around scientific inquiry and budgetary constraints made it necessary for the roadmap development team to evaluate potential missions not only for scientific return but also for cost. Performing detailed cost studies for each of the large number of missions was impractical given the time constraints involved and the lack of detailed mission studies, so we developed a method of rapid cost assessment to allow preliminary analysis. It has been noted that there is a strong correlation between complexity and the cost and schedule of planetary missions. While these correlations were made after missions had been built and flown (successfully or otherwise), it seemed likely that a similar approach could provide at least some relative cost ranking. Cost estimation relationships (CERs) have been developed based on subsystem design choices. These CERs required more detailed information than was available, forcing the team to adopt a higher-level approach. Costing by analogy has been developed for small satellites; however, planetary exploration missions have such varied spacecraft requirements that there is a lack of adequately comparable missions to use for analogy.
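A complexity-based cost index of the kind described can be sketched as a weighted roll-up of subsystem complexity scores fed into a power-law cost-estimating relationship. The subsystems, weights, scores, and exponent below are hypothetical, not the values used for the roadmap study.

```python
# Hypothetical subsystem weights for a complexity index (sum to 1.0).
WEIGHTS = {"power": 0.15, "thermal": 0.10, "telecom": 0.15,
           "propulsion": 0.20, "payload": 0.25, "entry_descent": 0.15}

def complexity_index(scores):
    """Weighted average of subsystem complexity scores (each 0-10)."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

def relative_cost(index, a=1.0, b=1.8):
    """Illustrative power-law CER: cost grows faster than linearly with
    complexity; a and b would be fit to historical mission data."""
    return a * index ** b

orbiter = {"power": 4, "thermal": 3, "telecom": 5, "propulsion": 5,
           "payload": 6, "entry_descent": 0}
lander = {"power": 5, "thermal": 6, "telecom": 5, "propulsion": 6,
          "payload": 7, "entry_descent": 9}

for name, scores in [("orbiter", orbiter), ("lander", lander)]:
    idx = complexity_index(scores)
    print(f"{name}: index = {idx:.2f}, relative cost = {relative_cost(idx):.1f}")
```

Because only relative rankings are needed for roadmap triage, the absolute calibration of a and b matters less than the consistency of the scoring rubric across missions.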
Roadmap to achieve 25% hypertension control in Africa by 2025
Dzudie, Anastase; Kingue, Samuel; Sliwa, Karen; Mayosi, Bongani; Rayner, Brian; Ojji, Dike; Schutte, Aletta E; Twagirumukiza, Marc; Damasceno, Albertino; Ba, Seringe Abdou; Kane, Abdoul; Kramoh, Euloge; Kacou, Jean Baptiste Anzouan; Onwubere, Basden; Cornick, Ruth; Anisiuba, Benedict; Mocumbi, Ana Olga; Ogola, Elijah; Awad, Mohamed; Nel, George; Otieno, Harun; Toure, Ali Ibrahim; Kengne, Andre Pascal; Perel, Pablo; Adler, Alm; Poulter, Neil
2017-01-01
Summary Background and aim: The Pan-African Society of Cardiology (PASCAR) has identified hypertension as the highest priority area for action to reduce heart disease and stroke on the continent. The aim of this PASCAR roadmap on hypertension was to develop practical guidance on how to implement strategies that translate existing knowledge into effective action and improve detection, treatment and control of hypertension and cardiovascular health in sub-Saharan Africa (SSA) by the year 2025. Methods: Development of this roadmap started with the creation of a consortium of experts with leadership skills in hypertension. In 2014, experts in different fields, including physicians and non-physicians, were invited to join. Via face-to-face meetings and teleconferences, the consortium made a situation analysis, set a goal, identified roadblocks and solutions to the management of hypertension and customised the World Heart Federation roadmap to Africa. Results: Hypertension is a major crisis on the continent, but very few randomised controlled trials have been conducted on its management. Also, only 25.8% of the countries have developed or adopted guidelines for the management of hypertension. Other major roadblocks are either government and health-system related or healthcare professional or patient related. The PASCAR hypertension task force identified a 10-point action plan to be implemented by African ministries of health to achieve 25% control of hypertension in Africa by 2025. Conclusions: Hypertension affects millions of people in SSA and, if left untreated, is a major cause of heart disease and stroke. Very few SSA countries have a clear hypertension policy. This PASCAR roadmap identifies practical and effective solutions that would improve detection, treatment and control of hypertension on the continent and could be implemented as is or adapted to specific national settings.
An advanced probabilistic structural analysis method for implicit performance functions
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.
1989-01-01
In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
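The AMV correction described above can be shown on a toy problem: linearize the response at the mean to locate the approximate most probable point (MPP) for a chosen probability level, then re-evaluate the exact function there. A minimal sketch, assuming independent normal inputs and a simple explicit function standing in for the implicit finite element response; all numbers are illustrative.

```python
import numpy as np

def g(x):
    # Stand-in for an implicit structural response (e.g., a finite element
    # result); this explicit form is an illustrative assumption.
    return x[0] ** 2 + 2.0 * x[1]

mu = np.array([2.0, 3.0])     # means of the input random variables
sig = np.array([0.2, 0.3])    # standard deviations (independent normals)

# Mean-value (MV) step: linearize g about the mean.
eps = 1e-6
grad = np.array([(g(mu + eps * np.eye(2)[i]) - g(mu)) / eps for i in range(2)])
a = grad * sig                          # response sensitivities in u-space
sd_mv = np.linalg.norm(a)               # first-order std. dev. of the response

# MV estimate of the 1% lower-tail response (z = -2.326 for p = 0.01).
z = -2.326
z_mv = g(mu) + z * sd_mv

# AMV correction: re-evaluate the exact g at the MV most probable point,
# which lies along the sensitivity direction in standard normal space.
u_mpp = z * a / sd_mv                   # MPP for this probability level
x_mpp = mu + sig * u_mpp                # map back to physical space
z_amv = g(x_mpp)                        # corrected 1%-level response

print(f"MV 1% response: {z_mv:.3f}, AMV 1% response: {z_amv:.3f}")
```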
Probabilistic safety assessment of the design of tall buildings under extreme load
DOE Office of Scientific and Technical Information (OSTI.GOV)
Králik, Juraj, E-mail: juraj.kralik@stuba.sk
2016-06-08
The paper describes experience from deterministic and probabilistic analyses of the safety of a tall building structure. The methods and requirements of Eurocode EN 1990, standard ISO 2394 and the JCSS are presented. The uncertainties of the model and of the resistance of the structures are considered using simulation methods. The Monte Carlo, LHS and RSM probabilistic methods are compared with the deterministic results. The effectiveness of probability-based structural design using finite element methods is demonstrated on an example probabilistic safety analysis of tall buildings.
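Of the simulation methods compared in this abstract, plain Monte Carlo and Latin hypercube sampling (LHS) differ only in how the input samples are drawn; the sketch below contrasts them on a toy resistance-versus-load limit state. The distributions and limit state are illustrative assumptions, not the building model from the paper.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)
n = 1000

def limit_state(R, E):
    # Toy limit state: failure when the load effect E exceeds resistance R.
    # Illustrative stand-in for the FEM-based response in the paper.
    return R - E

# Plain Monte Carlo: independent random draws.
R_mc = rng.normal(300.0, 30.0, n)
E_mc = rng.normal(200.0, 40.0, n)
pf_mc = np.mean(limit_state(R_mc, E_mc) < 0.0)

# Latin hypercube sampling: one draw per equal-probability stratum,
# then a random pairing between variables.
def lhs_normal(mu, sd, n, rng):
    u = (np.arange(n) + rng.uniform(1e-9, 1.0, size=n)) / n  # stratified uniforms
    rng.shuffle(u)
    dist = NormalDist(mu, sd)
    return np.array([dist.inv_cdf(p) for p in u])

R_lhs = lhs_normal(300.0, 30.0, n, rng)
E_lhs = lhs_normal(200.0, 40.0, n, rng)
pf_lhs = np.mean(limit_state(R_lhs, E_lhs) < 0.0)

print(f"pf (Monte Carlo) = {pf_mc:.4f}, pf (LHS) = {pf_lhs:.4f}")
```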
Runaas, Lyndsey; Hanauer, David; Maher, Molly; Bischoff, Evan; Fauer, Alex; Hoang, Tiffany; Munaco, Anna; Sankaran, Roshun; Gupta, Rahael; Seyedsalehi, Sajjad; Cohn, Amy; An, Larry; Tewari, Muneesh; Choi, Sung Won
2017-05-01
Health information technology (HIT) has great potential for increasing patient engagement. Pediatric hematopoietic cell transplantation (HCT) is a setting ripe for using HIT but in which little research exists. "BMT Roadmap" is a web-based application that integrates patient-specific information and includes several domains: laboratory results, medications, clinical trial details, photos of the healthcare team, trajectory of transplant process, and discharge checklist. BMT Roadmap was provided to 10 caregivers of patients undergoing first-time HCT. Research assistants performed weekly qualitative interviews throughout the patient's hospitalization and at discharge and day 100 to assess the impact of BMT Roadmap. Rigorous thematic analysis revealed 5 recurrent themes: emotional impact of the HCT process itself; critical importance of communication among patients, caregivers, and healthcare providers; ways in which BMT Roadmap was helpful during inpatient setting; suggestions for improving BMT Roadmap; and other strategies for organization and management of complex healthcare needs that could be incorporated into BMT Roadmap. Caregivers found the tool useful and easy to use, leading them to want even greater access to information. BMT Roadmap was feasible, with no disruption to inpatient care. Although this initial study is limited by the small sample size and single-institution experience, these initial findings are encouraging and support further investigation. Copyright © 2017 The American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.
U.S. Army unmanned aircraft systems roadmap 2010-2035
DOT National Transportation Integrated Search
2010-01-01
The Unmanned Aircraft System (UAS) Roadmap outlines how the U.S. Army will develop, organize, and employ UAS from 2010 to 2035 across full spectrum operations. The Army UAS Roadmap is nested with the Unmanned Systems (UMS) Initial Capabilities Docume...
Probabilistic dual heuristic programming-based adaptive critic
NASA Astrophysics Data System (ADS)
Herzallah, Randa
2010-02-01
Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct from current approaches, the proposed probabilistic DHP AC method takes uncertainties of the forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. A full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
Over a two-day period, February 2–3, 2016, the Office of High Energy Physics convened a workshop in Gaithersburg, MD to seek community input on development of an Advanced Accelerator Concepts (AAC) research roadmap. The workshop was in response to a recommendation by the HEPAP Accelerator R&D Subpanel [1] [2] to "convene the university and laboratory proponents of advanced acceleration concepts to develop R&D roadmaps with a series of milestones and common down selection criteria towards the goal for constructing a multi-TeV e+e– collider" (the charge to the workshop can be found in Appendix A). During the workshop, proponents of laser-driven plasma wakefield acceleration (LWFA), particle-beam-driven plasma wakefield acceleration (PWFA), and dielectric wakefield acceleration (DWFA), along with a limited number of invited university and laboratory experts, presented and critically discussed individual concept roadmaps. The roadmap workshop was preceded by several preparatory workshops. The first day of the workshop featured presentation of three initial individual roadmaps with ample time for discussion. The individual roadmaps covered a time period extending until roughly 2040, with the end date assumed to be roughly appropriate for initial operation of a multi-TeV e+e– collider. The second day of the workshop comprised talks on synergies between the roadmaps and with global efforts, potential early applications, diagnostics needs, simulation needs, and beam issues and challenges related to a collider. During the last half of the day the roadmaps were revisited, but with emphasis on the next five to ten years (as specifically requested in the charge) and on common challenges. The workshop concluded with critical and unanimous endorsement of the individual roadmaps and an extended discussion on the characteristics of the common challenges. (For the agenda and list of participants see Appendix B.)
Flight Avionics Hardware Roadmap
NASA Technical Reports Server (NTRS)
Some, Raphael; Goforth, Monte; Chen, Yuan; Powell, Wes; Paulick, Paul; Vitalpur, Sharada; Buscher, Deborah; Wade, Ray; West, John; Redifer, Matt;
2014-01-01
The Avionics Technology Roadmap takes an 80% approach to technology investment in spacecraft avionics. It delineates a suite of technologies covering foundational, component, and subsystem levels, which directly support 80% of future NASA space mission needs. The roadmap eschews high-cost, limited-utility technologies in favor of lower-cost, broadly applicable technologies with high return on investment. The roadmap is also phased to support future NASA mission needs and desires, with a view towards creating an optimized investment portfolio that matures specific, high-impact technologies on a schedule that matches optimum insertion points of these technologies into NASA missions. The roadmap looks out over 15+ years and covers some 114 technologies, 58 of which are targeted for TRL6 within 5 years, with 23 additional technologies to be at TRL6 by 2020. Of that number, only a few are recommended for near-term investment: 1. Rad Hard High Performance Computing; 2. Extreme-temperature-capable electronics and packaging; 3. RFID/SAW-based spacecraft sensors and instruments; 4. Lightweight, low-power 2D displays suitable for crewed missions; 5. Radiation-tolerant Graphics Processing Unit to drive crew displays; 6. Distributed/reconfigurable, extreme-temperature- and radiation-tolerant spacecraft sensor controller and sensor modules; 7. Spacecraft-to-spacecraft long-link data communication protocols; 8. High-performance and extreme-temperature-capable C&DH subsystem. In addition, the roadmap team recommends several other activities that it believes are necessary to advance avionics technology across NASA: (1) engage the OCT roadmap teams to coordinate avionics technology advances and infusion into these roadmaps and their mission set; (2) charter a team to develop a set of use cases for future avionics capabilities in order to decouple this roadmap from specific missions; (3) partner with the Software Steering Committee to coordinate computing hardware and software technology roadmaps and investment recommendations; and (4) continue monitoring foundational technologies upon which future avionics technologies will be dependent, e.g., RHBD and COTS semiconductor technologies.
Probabilistic Aeroelastic Analysis of Turbomachinery Components
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.
2004-01-01
A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.
Probabilistic numerics and uncertainty in computations
Hennig, Philipp; Osborne, Michael A.; Girolami, Mark
2015-01-01
We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
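The core claim, that numerical routines should return uncertainties with their answers, is easy to make concrete in the simplest setting, Monte Carlo integration, where the estimator carries its own standard error. A minimal illustration; genuine probabilistic numerical methods such as Bayesian quadrature go further and place a prior on the integrand itself.

```python
import numpy as np

def probabilistic_integrate(f, a, b, n=10_000, rng=None):
    """Estimate the integral of f on [a, b], returning (estimate, standard error).

    A minimal illustration of a numerical routine that reports its own
    uncertainty, in the spirit of probabilistic numerics.
    """
    rng = rng or np.random.default_rng(0)
    x = rng.uniform(a, b, n)
    y = (b - a) * f(x)
    return y.mean(), y.std(ddof=1) / np.sqrt(n)

est, err = probabilistic_integrate(np.sin, 0.0, np.pi)
print(f"integral ≈ {est:.4f} ± {err:.4f} (exact value: 2)")
```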
NASA Astrophysics Data System (ADS)
Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang
2017-05-01
Integrating distributed generation (DG) into the network causes harmonic pollution, which can damage electrical devices and affect the normal operation of the power system. Moreover, due to the randomness of wind and solar irradiation, the output of DG is also random, which makes the harmonics it generates uncertain. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after integration of a distributed photovoltaic (DPV) system under different weather conditions: sunny, cloudy, rainy and snowy days. The probabilistic distribution function of the DPV output power in each typical weather condition was obtained via maximum likelihood parameter estimation. The Monte Carlo simulation method was adopted to calculate the probabilistic distribution of harmonic voltage content at different harmonic orders as well as the total harmonic distortion (THD) in typical weather conditions. The case study was based on the IEEE 33-bus system, and the resulting harmonic voltage distributions and THD in the typical weather conditions were compared.
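The workflow in this abstract (fit an output distribution per weather type, then Monte Carlo the harmonic levels) can be sketched as follows. All distributions and harmonic injection coefficients below are illustrative assumptions, not the fitted values or network model from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 5000

# Assumed Beta-distributed per-unit PV output for each weather type; the
# shape parameters stand in for those fitted by maximum likelihood in the
# paper (illustrative values only).
weather_params = {"sunny": (5.0, 2.0), "cloudy": (2.0, 2.0),
                  "rainy": (1.5, 4.0), "snowy": (1.2, 6.0)}

# Assumed harmonic voltage injection per unit of PV output at selected
# odd harmonic orders (again illustrative).
h_orders = np.array([3, 5, 7, 11, 13])
h_coeff = np.array([0.030, 0.025, 0.015, 0.008, 0.005])

for weather, (a, b) in weather_params.items():
    p_out = rng.beta(a, b, n_trials)          # random PV output, per unit
    v_h = np.outer(p_out, h_coeff)            # harmonic voltages (p.u.)
    thd = np.sqrt((v_h ** 2).sum(axis=1))     # THD vs a 1 p.u. fundamental
    print(f"{weather:6s}: mean THD = {100 * thd.mean():.2f}%, "
          f"95th pct = {100 * np.percentile(thd, 95):.2f}%")
```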
NESSUS/EXPERT - An expert system for probabilistic structural analysis methods
NASA Technical Reports Server (NTRS)
Millwater, H.; Palmer, K.; Fink, P.
1988-01-01
An expert system (NESSUS/EXPERT) is presented which provides assistance in using probabilistic structural analysis methods. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator. NESSUS/EXPERT was developed with a combination of FORTRAN and CLIPS, a C language expert system tool, to exploit the strengths of each language.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bacvarov, D.C.
1981-01-01
A new method for probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes is presented. The approach of applying the finite element method to probabilistic risk assessment is demonstrated to be very powerful, for two reasons. First, the finite element method is inherently suitable for analysis of three-dimensional spaces in which the parameters, such as trivariate probability densities of the lightning currents, are non-uniformly distributed. Second, the finite element method permits non-uniform discretization of the three-dimensional probability spaces, thus yielding high accuracy in critical regions, such as the area of low-probability events, while at the same time maintaining coarse discretization in the non-critical areas to keep the number of grid points and the size of the problem to a manageable low level. The finite element probabilistic risk assessment method presented here is based on a new multidimensional search algorithm. It utilizes an efficient iterative technique for finite element interpolation of the transmission line insulation flashover criteria computed with an electromagnetic transients program. Compared to other available methods, the new finite element probabilistic risk assessment method is significantly more accurate and approximately two orders of magnitude more computationally efficient. The method is especially suited for accurate assessment of rare, very low probability events.
DOT National Transportation Integrated Search
2009-10-13
This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...
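The record above is truncated, but the stated approach, Monte Carlo estimation of the conditional probability of release given an accident, can be sketched with an assumed damage model; every distribution and threshold below is an illustrative placeholder, not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Assumed accident model (illustrative only): derailment speed drives a
# crude impact-energy proxy; a tank car releases if the impact exceeds a
# lognormally distributed shell resistance.
speed = rng.triangular(0.0, 25.0, 60.0, n)               # mph at derailment
impact = 0.5 * speed ** 2 * rng.uniform(0.2, 1.0, n)     # energy proxy
resistance = rng.lognormal(mean=6.0, sigma=0.5, size=n)

release = impact > resistance
p_release = release.mean()                     # P(release | accident)
se = release.std(ddof=1) / np.sqrt(n)          # Monte Carlo standard error
print(f"P(release | accident) ≈ {p_release:.4f} ± {se:.4f}")
```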
Leveraging Federal Funding for Longitudinal Data Systems: A Roadmap for States. Fiscal Year 2011
ERIC Educational Resources Information Center
Data Quality Campaign, 2011
2011-01-01
States should use this roadmap to identify and leverage federal funding sources for data-related activities. This roadmap presents such opportunities for FY 2011, and provides guidance on some of the ways the funds may be used.
Idaho National Engineering Laboratory Waste Management Operations Roadmap Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bullock, M.
1992-04-01
At the direction of the Department of Energy-Headquarters (DOE-HQ), the DOE Idaho Field Office (DOE-ID) is developing roadmaps for Environmental Restoration and Waste Management (ER&WM) activities at Idaho National Engineering Laboratory (INEL). DOE-ID has convened a select group of contractor personnel from EG&G Idaho, Inc. to assist DOE-ID personnel with the roadmapping project. This document is a report on the initial stages of the first phase of the INEL's roadmapping efforts.
Structural reliability methods: Code development status
NASA Astrophysics Data System (ADS)
Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.
1991-05-01
The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
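The NESSUS/FPI module itself is not reproduced here, but the kind of fast probability integration the abstract refers to can be illustrated with a generic first-order reliability method (FORM): iterate the Hasofer-Lind/Rackwitz-Fiessler update to the most probable point and read off the reliability index. A sketch assuming independent normal variables, with an illustrative limit state standing in for the finite element response.

```python
import numpy as np

def g(x):
    # Stand-in limit state (failure when g < 0); in NESSUS this response
    # would be supplied by the finite element module. Illustrative only.
    return x[0] * x[1] - 1500.0

mu = np.array([40.0, 50.0])
sig = np.array([5.0, 2.5])

def grad_g(x, eps=1e-6):
    g0 = g(x)
    return np.array([(g(x + eps * np.eye(2)[i]) - g0) / eps for i in range(2)])

# Hasofer-Lind / Rackwitz-Fiessler iteration in standard normal u-space.
u = np.zeros(2)
for _ in range(20):
    x = mu + sig * u
    gv, gr = g(x), grad_g(x) * sig      # limit state value, u-space gradient
    u = (gr @ u - gv) * gr / (gr @ gr)  # HL-RF update toward the MPP
beta = np.linalg.norm(u)                # reliability index
print(f"beta = {beta:.3f}  (pf ≈ Phi(-beta))")
```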
Probabilistic structural analysis methods for space transportation propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Moore, N.; Anis, C.; Newell, J.; Nagpal, V.; Singhal, S.
1991-01-01
Information on probabilistic structural analysis methods for space propulsion systems is given in viewgraph form. Information is given on deterministic certification methods, probability of failure, component response analysis, stress responses for 2nd-stage turbine blades, Space Shuttle Main Engine (SSME) structural durability, and program plans.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olson, Jarrod; Barr, Jonathan L.; Burtner, Edwin R.
A key challenge for research roadmapping in the crisis response and management domain is articulation of a shared vision that describes what the future can and should include. Visioning allows for far-reaching stakeholder engagement that can properly align research with stakeholders' needs. Engagement includes feedback from researchers, policy makers, the general public, and end-users on technical and non-technical factors. This work articulates a process and framework for the construction and maintenance of a stakeholder-centric research vision and roadmap in the emergency management domain. This novel roadmapping process integrates three pieces: analysis of the research and technology landscape, visioning, and stakeholder engagement. Our structured engagement process elicits research foci for the roadmap based on relevance to stakeholder mission, identifies collaborators, and builds consensus around the roadmap priorities. We find that the vision process and vision storyboard help SMEs conceptualize and discuss a technology's strengths, weaknesses, and alignment with needs.
Mission to the Solar System: Exploration and Discovery. A Mission and Technology Roadmap
NASA Technical Reports Server (NTRS)
Gulkis, S. (Editor); Stetson, D. S. (Editor); Stofan, E. R. (Editor)
1998-01-01
Solar System exploration addresses some of humanity's most fundamental questions: How and when did life form on Earth? Does life exist elsewhere in the Solar System or in the Universe? How did the Solar System form and evolve in time? What can the other planets teach us about the Earth? This document describes a Mission and Technology Roadmap for addressing these and other fundamental Solar System questions. A Roadmap Development Team of scientists, engineers, educators, and technologists worked to define the next evolutionary steps in in situ exploration, sample return, and completion of the overall Solar System survey. The guidelines were to "develop a visionary, but affordable, mission and technology development Roadmap for the exploration of the Solar System in the 2000 to 2012 timeframe." The Roadmap provides a catalog of potential flight missions. (Supporting research and technology, ground-based observations, and laboratory research, which are no less important than flight missions, are not included in this Roadmap.)
The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering
NASA Technical Reports Server (NTRS)
Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen
2006-01-01
This paper summarizes a subset of the Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered "to identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The engineering roadmap element contains five sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) Advanced Uncertainty Models, (4) Virtual Testing Models, and (5) Space-Based Robotics Manufacture and Servicing Models.
Integrity and security in an Ada runtime environment
NASA Technical Reports Server (NTRS)
Bown, Rodney L.
1991-01-01
A review is provided of the Formal Methods group discussions. It was stated that integrity is not a pure mathematical dual of security, and that the input data is part of the integrity domain. The group provided a roadmap for research. One item of the roadmap and the final position statement are closely related to the space shuttle and space station. The group's position is to use a safe subset of Ada. Examples of safe subsets include the Army Secure Operating System and the Penelope Ada verification tool. A conservative attitude is recommended when writing Ada code for life- and property-critical systems.
In-Situ Resource Utilization (ISRU) Capability Roadmap Progress Review
NASA Technical Reports Server (NTRS)
Sanders, Gerald B.; Duke, Michael
2005-01-01
A progress review on In-Situ Resource Utilization (ISRU) capability is presented. The topics include: 1) In-Situ Resource Utilization (ISRU) Capability Roadmap: Level 1; 2) ISRU Emphasized Architecture Overview; 3) ISRU Capability Elements: Level 2 and below; and 4) ISRU Capability Roadmap Wrap-up.
Probabilistic structural analysis methods for improving Space Shuttle engine reliability
NASA Technical Reports Server (NTRS)
Boyce, L.
1989-01-01
Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.
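An importance ranking like the one reported (modulus of elasticity significant, Poisson's ratio not) can be reproduced in miniature with a first-order variance decomposition over a surrogate response; the surrogate function and coefficients of variation below are illustrative assumptions, not the NESSUS blade model.

```python
import numpy as np

def tip_displacement(thickness, modulus, poisson, density):
    # Illustrative surrogate for the blade response; the real analysis uses
    # a finite element model. Poisson's ratio enters only weakly here by
    # construction, echoing the finding in the abstract.
    return 1e4 / (modulus * thickness ** 3) * (1.0 + 0.01 * poisson) * density

means = {"thickness": 0.5, "modulus": 200e3, "poisson": 0.3, "density": 8.2}
covs = {"thickness": 0.01, "modulus": 0.05, "poisson": 0.03, "density": 0.01}

# First-order variance contribution of each variable: (dZ/dx_i * sigma_i)^2.
base = tip_displacement(**means)
contrib = {}
for name in means:
    x = dict(means)
    x[name] += 1e-6 * means[name]
    dzdx = (tip_displacement(**x) - base) / (1e-6 * means[name])
    contrib[name] = (dzdx * covs[name] * means[name]) ** 2

total = sum(contrib.values())
for name, c in sorted(contrib.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s}: {100 * c / total:5.1f}% of response variance")
```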
Cost-Reduction Roadmap for Residential Solar Photovoltaics (PV), 2017-2030
NREL Solar Research
This report addresses the Solar Energy Technologies Office (SETO) residential 2030 photovoltaics (PV) cost target of $0.05 per kilowatt-hour by identifying ...
NASA Strategic Roadmap Committees Final Roadmaps. Volumes 1 and 2
NASA Technical Reports Server (NTRS)
2005-01-01
Volume 1 contains NASA strategic roadmaps for the following Advanced Planning and Integration Office (APIO) committees: Earth Science and Applications from Space; Sun - Solar System Connection. Volume 2 contains NASA strategic roadmaps for the following APIO committees: Robotic and Human Exploration of Mars; Solar System Exploration; Search for Earth-like Planets; Universe Exploration, as well as membership rosters and charters for all APIO committees, including those above and the following: Exploration Transportation System; Nuclear Systems; Robotic and Human Lunar Exploration; Aeronautical Technologies; Space Shuttle; International Space Station; Education.
NASA Technical Reports Server (NTRS)
Duffy, S. F.; Hu, J.; Hopkins, D. A.
1995-01-01
The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored, and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials), the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed-form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
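The closed-form case the authors single out can be stated directly: for a two-parameter Weibull material under a uniform uniaxial stress, the probability of failure is Pf = 1 - exp(-V (sigma/sigma_0)^m). A minimal sketch with assumed ceramic parameters.

```python
import numpy as np

def weibull_pf(sigma, sigma_0, m, volume=1.0):
    """Two-parameter Weibull probability of failure for a uniformly stressed
    volume: Pf = 1 - exp(-V * (sigma / sigma_0)**m). This is the rare
    closed-form probabilistic failure analysis noted in the text."""
    return 1.0 - np.exp(-volume * (sigma / sigma_0) ** m)

# Illustrative ceramic parameters: Weibull modulus m and scale sigma_0.
m, sigma_0 = 10.0, 400.0            # sigma_0 in MPa (assumed values)
for s in (200.0, 300.0, 350.0):
    print(f"sigma = {s:5.1f} MPa -> Pf = {weibull_pf(s, sigma_0, m):.2e}")
```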
An approximate methods approach to probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.
1989-01-01
A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
Fuzzy-probabilistic model for risk assessment of radioactive material railway transportation.
Avramenko, M; Bolyatko, V; Kosterev, V
2005-01-01
Transportation of radioactive materials is obviously accompanied by a certain risk. A model for risk assessment of emergency situations and terrorist attacks may be useful for choosing possible routes and for comparing the various defence strategies. In particular, risk assessment is crucial for safe transportation of excess weapons-grade plutonium arising from the removal of plutonium from military employment. A fuzzy-probabilistic model for risk assessment of railway transportation has been developed taking into account the different natures of risk-affecting parameters (probabilistic and not probabilistic but fuzzy). Fuzzy set theory methods as well as standard methods of probability theory have been used for quantitative risk assessment. Information-preserving transformations are applied to realise the correct aggregation of probabilistic and fuzzy parameters. Estimations have also been made of the inhalation doses resulting from possible accidents during plutonium transportation. The obtained data show the scale of possible consequences that may arise from plutonium transportation accidents.
Li, Zhixi; Peck, Kyung K.; Brennan, Nicole P.; Jenabi, Mehrnaz; Hsu, Meier; Zhang, Zhigang; Holodny, Andrei I.; Young, Robert J.
2014-01-01
Purpose: The purpose of this study was to compare the deterministic and probabilistic tracking methods of diffusion tensor white matter fiber tractography in patients with brain tumors. Materials and Methods: We identified 29 patients with left brain tumors <2 cm from the arcuate fasciculus who underwent pre-operative language fMRI and DTI. The arcuate fasciculus was reconstructed using a deterministic Fiber Assignment by Continuous Tracking (FACT) algorithm and a probabilistic method based on an extended Monte Carlo Random Walk algorithm. Tracking was controlled using two ROIs corresponding to Broca's and Wernicke's areas. Tracts in tumor-affected hemispheres were examined for extension between Broca's and Wernicke's areas, anterior-posterior length and volume, and compared with the normal contralateral tracts. Results: Probabilistic tracts displayed more complete anterior extension to Broca's area than did FACT tracts on the tumor-affected and normal sides (p < 0.0001). The median length ratio for tumor:normal sides was greater for probabilistic tracts than FACT tracts (p < 0.0001). The median tract volume ratio for tumor:normal sides was also greater for probabilistic tracts than FACT tracts (p = 0.01). Conclusion: Probabilistic tractography reconstructs the arcuate fasciculus more completely and performs better through areas of tumor and/or edema. The FACT algorithm tends to underestimate the anterior-most fibers of the arcuate fasciculus, which are crossed by primary motor fibers.
Memory Indexing: A Novel Method for Tracing Memory Processes in Complex Cognitive Tasks
ERIC Educational Resources Information Center
Renkewitz, Frank; Jahn, Georg
2012-01-01
We validate an eye-tracking method applicable for studying memory processes in complex cognitive tasks. The method is tested with a task on probabilistic inferences from memory. It provides valuable data on the time course of processing, thus clarifying previous results on heuristic probabilistic inference. Participants learned cue values of…
Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems
NASA Technical Reports Server (NTRS)
Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.
2005-01-01
The current standards for handling uncertainty in control systems use interval bounds to define the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods of μ-analysis can lead to overly conservative controller design. With these methods, worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean and variance of response cumulative distribution functions are shown. Results of the probabilistic analysis of a missile pitch control system, and of a non-collocated mass-spring system, show the added information provided by this hybrid analysis.
Fully probabilistic control for stochastic nonlinear control systems with input dependent noise.
Herzallah, Randa
2015-03-01
Robust controllers for nonlinear stochastic systems with functional uncertainties can be consistently designed using probabilistic control methods. In this paper a generalised probabilistic controller design for the minimisation of the Kullback-Leibler divergence between the actual joint probability density function (pdf) of the closed loop control system, and an ideal joint pdf is presented emphasising how the uncertainty can be systematically incorporated in the absence of reliable systems models. To achieve this objective all probabilistic models of the system are estimated from process data using mixture density networks (MDNs) where all the parameters of the estimated pdfs are taken to be state and control input dependent. Based on this dependency of the density parameters on the input values, explicit formulations to the construction of optimal generalised probabilistic controllers are obtained through the techniques of dynamic programming and adaptive critic methods. Using the proposed generalised probabilistic controller, the conditional joint pdfs can be made to follow the ideal ones. A simulation example is used to demonstrate the implementation of the algorithm and encouraging results are obtained. Copyright © 2014 Elsevier Ltd. All rights reserved.
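When both the actual and the ideal closed-loop pdfs are Gaussian, the Kullback-Leibler divergence being minimised has a closed form, which makes the control objective concrete. A sketch with illustrative means and covariances; the MDN-estimated pdfs in the paper are more general.

```python
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """KL divergence D(N0 || N1) between multivariate Gaussians: the quantity
    a probabilistic controller of this kind drives toward zero when the
    actual and ideal closed-loop pdfs are Gaussian (illustrative sketch)."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d
                  - k + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Actual closed-loop state pdf vs an ideal pdf tightly centred on zero.
mu_actual, S_actual = np.array([0.4, -0.2]), np.diag([0.30, 0.50])
mu_ideal, S_ideal = np.zeros(2), np.diag([0.10, 0.10])
kl = kl_gaussian(mu_actual, S_actual, mu_ideal, S_ideal)
print(f"D_KL(actual || ideal) = {kl:.3f}")
```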
NASA Technical Reports Server (NTRS)
LaBel, Kenneth A.; Sampson, Michael J.
2015-01-01
This presentation is a NASA Electronic Parts and Packaging (NEPP) Program: Roadmap for FY15 and Beyond. This roadmap provides a snapshot for current plans and collaborations on testing and evaluation of electronics as well as a discussion of the technology selection approach.
Lopatina, Elena; Damani, Zaheed; Bohm, Eric; Noseworthy, Tom W; Conner-Spady, Barbara; MacKean, Gail; Simpson, Chris S; Marshall, Deborah A
2017-09-01
Long waiting times for elective services continue to be a challenging issue. Single-entry models (SEMs) are used to increase access to and flow through the healthcare system. This paper provides a roadmap for healthcare decision-makers, managers, physicians, and researchers to guide implementation and management of successful and sustainable SEMs. The roadmap was informed by an inductive qualitative synthesis of the findings from a deliberative process (a symposium on SEMs, with clinicians, researchers, senior policy-makers, healthcare managers, and patient representatives) and focus groups with the symposium participants. SEMs are a promising strategy to improve the management of referrals and represent one approach to reduce waiting times. The SEMs roadmap outlines current knowledge about SEMs and critical success factors for SEMs' implementation and management. This SEM roadmap is intended to help clinicians, decision-makers, managers, and researchers interested in developing new or strengthening existing SEMs. We consider this roadmap to be a living document that will continue to evolve as we learn more about implementing and managing sustainable SEMs. Copyright © 2017 Elsevier B.V. All rights reserved.
The 2017 Plasma Roadmap: Low temperature plasma science and technology
NASA Astrophysics Data System (ADS)
Adamovich, I.; Baalrud, S. D.; Bogaerts, A.; Bruggeman, P. J.; Cappelli, M.; Colombo, V.; Czarnetzki, U.; Ebert, U.; Eden, J. G.; Favia, P.; Graves, D. B.; Hamaguchi, S.; Hieftje, G.; Hori, M.; Kaganovich, I. D.; Kortshagen, U.; Kushner, M. J.; Mason, N. J.; Mazouffre, S.; Mededovic Thagard, S.; Metelmann, H.-R.; Mizuno, A.; Moreau, E.; Murphy, A. B.; Niemira, B. A.; Oehrlein, G. S.; Petrovic, Z. Lj; Pitchford, L. C.; Pu, Y.-K.; Rauf, S.; Sakai, O.; Samukawa, S.; Starikovskaia, S.; Tennyson, J.; Terashima, K.; Turner, M. M.; van de Sanden, M. C. M.; Vardelle, A.
2017-08-01
Journal of Physics D: Applied Physics published the first Plasma Roadmap in 2012 consisting of the individual perspectives of 16 leading experts in the various sub-fields of low temperature plasma science and technology. The 2017 Plasma Roadmap is the first update of a planned series of periodic updates of the Plasma Roadmap. The continuously growing interdisciplinary nature of the low temperature plasma field and its equally broad range of applications are making it increasingly difficult to identify major challenges that encompass all of the many sub-fields and applications. This intellectual diversity is ultimately a strength of the field. The current state of the art for the 19 sub-fields addressed in this roadmap demonstrates the enviable track record of the low temperature plasma field in the development of plasmas as an enabling technology for a vast range of technologies that underpin our modern society. At the same time, the many important scientific and technological challenges shared in this roadmap show that the path forward is not only scientifically rich but has the potential to make wide and far reaching contributions to many societal challenges.
Proceedings of the 2003 NASA/JPL Workshop on Fundamental Physics in Space
NASA Technical Reports Server (NTRS)
Strayer, Don (Editor)
2003-01-01
The 2003 Fundamental Physics workshop included presentations ranging from forces acting on RNA, to properties of clouds of degenerate Fermi atoms, to techniques to probe for added space-time dimensions, to flight hardware for low-temperature experiments, among others. Mark Lee from NASA Headquarters described the new strategic plan that NASA has developed under Administrator Sean O'Keefe's leadership. Mark explained that the Fundamental Physics community now needs to align its research program and the roadmap describing the long-term goals of the program with the NASA plan. Ulf Israelsson of JPL discussed how the rewrite of the roadmap will be implemented under the leadership of the Fundamental Physics Discipline Working Group (DWG). Nick Bigelow, chair of the DWG, outlined how investigators can contribute to the writing of the roadmap. Results of measurements on very cold clouds of Fermi atoms near a Feshbach resonance were described by three investigators. Also, new measurements relating to tests of Einstein equivalence were discussed. Investigators also described methods to test other aspects of Einstein's relativity theories.
A Research Roadmap for Computation-Based Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald; Mandelli, Diego; Joe, Jeffrey
2015-08-01
The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full-scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing the modeling uncertainty found in current plant risk models.
National Algal Biofuels Technology Roadmap
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrell, John; Sarisky-Reed, Valerie
The framework for the National Algal Biofuels Technology Roadmap was constructed at the Algal Biofuels Technology Roadmap Workshop, held December 9-10, 2008, at the University of Maryland-College Park. The Workshop was organized by the Biomass Program to discuss and identify the critical challenges currently hindering the development of a domestic, commercial-scale algal biofuels industry. This Roadmap presents information from scientific, economic, and policy perspectives that can support and guide RD&D investment in algal biofuels. While addressing the potential economic and environmental benefits of using algal biomass for the production of liquid transportation fuels, the Roadmap describes the current status of algae RD&D. In doing so, it lays the groundwork for identifying challenges that likely need to be overcome for algal biomass to be used in the production of economically viable biofuels.
National roadmap for research infrastructure
NASA Astrophysics Data System (ADS)
Bonev, Tanyu
In 2010 the Council of Ministers of the Republic of Bulgaria passed a National Roadmap for Research Infrastructure (Decision No. 692 of 21.09.2010). Part of the roadmap is the project called Regional Astronomical Center for Research and Education (RACIO). A distinctive feature of this project is the integration of the country's existing research and educational organizations in the field of astronomy. The project is a substantial part of the strategy for the development of astronomy in Bulgaria over the next decade. What is the content of this strategic project? How was it possible to include RACIO in the roadmap? Does the national roadmap harmonize with the strategic plans for the development of astronomy in Europe elaborated by Astronet (http://www.astronet-eu.org/)? These are some of the questions to which I try to give answers in this paper.
Gottlieb, Sami L; Deal, Carolyn D; Giersing, Birgitte; Rees, Helen; Bolan, Gail; Johnston, Christine; Timms, Peter; Gray-Owen, Scott D; Jerse, Ann E; Cameron, Caroline E; Moorthy, Vasee S; Kiarie, James; Broutet, Nathalie
2016-06-03
In 2014, the World Health Organization, the US National Institutes of Health, and global technical partners published a comprehensive roadmap for development of new vaccines against sexually transmitted infections (STIs). Since its publication, progress has been made in several roadmap activities: obtaining better epidemiologic data to establish the public health rationale for STI vaccines, modeling the theoretical impact of future vaccines, advancing basic science research, defining preferred product characteristics for first-generation vaccines, and encouraging investment in STI vaccine development. This article reviews these overarching roadmap activities, provides updates on research and development of individual vaccines against herpes simplex virus, Chlamydia trachomatis, Neisseria gonorrhoeae, and Treponema pallidum, and discusses important next steps to advance the global roadmap for STI vaccine development. Copyright © 2016 World Health Organization. Published by Elsevier Ltd.. All rights reserved.
Probabilistic Simulation of Stress Concentration in Composite Laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.
1994-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors (SCFs) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCFs, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCFs in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCFs are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
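The coupling of uncertain laminate properties to an SCF can be mimicked cheaply by Monte Carlo sampling through a closed-form SCF, here the Lekhnitskii open-hole formula for an infinite orthotropic plate as a stand-in for the paper's coupled micromechanics/finite element simulation; the property statistics are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

# Lekhnitskii open-hole stress concentration factor for an infinite
# orthotropic plate, used as a cheap surrogate for the coupled simulation
# in the paper (illustrative assumption):
#   Kt = 1 + sqrt( 2*(sqrt(Ex/Ey) - nu_xy) + Ex/Gxy )
Ex = rng.normal(140e9, 7e9, n)        # laminate moduli with assumed scatter
Ey = rng.normal(10e9, 0.8e9, n)
Gxy = rng.normal(5e9, 0.4e9, n)
nu = rng.normal(0.30, 0.02, n)

kt = 1.0 + np.sqrt(2.0 * (np.sqrt(Ex / Ey) - nu) + Ex / Gxy)

print(f"mean Kt = {kt.mean():.2f}, std = {kt.std():.2f}, "
      f"95th percentile = {np.percentile(kt, 95):.2f}")
```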
NASA capabilities roadmap: advanced telescopes and observatories
NASA Technical Reports Server (NTRS)
Feinberg, Lee D.
2005-01-01
The NASA Advanced Telescopes and Observatories (ATO) Capability Roadmap addresses technologies necessary for NASA to enable future space telescopes and observatories covering all electromagnetic bands, ranging from X-rays to millimeter waves, as well as gravitational waves. It derives capability priorities from current and developing Science Mission Directorate (SMD) strategic roadmaps and, where appropriate, ensures their consistency with other NASA strategic and capability roadmaps. Technology topics include optics; wavefront sensing, control and interferometry; distributed and advanced spacecraft systems; cryogenic and thermal control systems; large precision structures for observatories; and the infrastructure essential to future space telescopes and observatories.
Non-Deterministic Dynamic Instability of Composite Shells
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2004-01-01
A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics, and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load and the loading rate are the dominant uncertainties, in that order.
Commercialization of NESSUS: Status
NASA Technical Reports Server (NTRS)
Thacker, Ben H.; Millwater, Harry R.
1991-01-01
A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the ongoing commercialization effort is to begin the transfer of technology developed under the Probabilistic Structural Analysis Methods (PSAM) program into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general-purpose probabilistic finite element computer program using state-of-the-art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost-effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next-generation Space Shuttle Main Engine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Werling, Eric
This report presents the Building America Research-to-Market Plan (Plan), including the integrated Building America Technology-to-Market Roadmaps (Roadmaps) that will guide Building America’s research, development, and deployment (RD&D) activities over the coming years. The Plan and Roadmaps will be updated as necessary to adapt to research findings and evolving stakeholder needs, and they will reflect input from DOE and stakeholders.
Students’ difficulties in probabilistic problem-solving
NASA Astrophysics Data System (ADS)
Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.
2018-03-01
Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. This study investigates students' difficulties in probabilistic problem-solving, focusing on analyzing and describing students' errors while solving such problems. The research used a qualitative method with a case-study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise the students' probabilistic problem-solving results and recorded interviews about their difficulties, and were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties fall into three categories: first, difficulties in understanding the probabilistic problem; second, difficulties in choosing and using appropriate solution strategies; and third, difficulties with the computational process. These findings indicate that students are not yet able to apply their knowledge and skills to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic instruction that optimizes students' probabilistic thinking ability.
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Wing, Kam Liu
1987-01-01
In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probabilistic density functions possess decaying tails. By incorporating the computational techniques developed over the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainty. Sample results are given for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings, with the yield stress modeled as a random field.
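To make the perturbation idea concrete, here is a minimal first-order, second-moment sketch in Python. It is not the authors' PFEM implementation: the bar response function, step sizes, and statistics are all assumptions, and it simply propagates input means and variances through finite-difference sensitivities.

```python
import numpy as np

def perturbation_moments(f, mu, sigma):
    """Approximate the mean and variance of f(x) from the means and
    standard deviations of independent input variables (FOSM)."""
    mu = np.asarray(mu, dtype=float)
    mean_resp = f(mu)
    var_resp = 0.0
    for i in range(len(mu)):
        h = 1e-6 * max(abs(mu[i]), 1.0)   # relative finite-difference step
        x = mu.copy()
        x[i] += h
        grad_i = (f(x) - mean_resp) / h   # sensitivity of response to input i
        var_resp += (grad_i * sigma[i]) ** 2
    return mean_resp, var_resp

# Hypothetical response: tip displacement of a bar, u = P*L/(E*A), L = 2 m.
resp = lambda x: x[0] * 2.0 / (x[1] * x[2])       # x = [P, E, A]
mu = [1e4, 2e11, 1e-3]      # mean load (N), Young's modulus (Pa), area (m^2)
sigma = [1e3, 1e10, 5e-5]   # standard deviations
m, v = perturbation_moments(resp, mu, sigma)
print(f"mean response = {m:.3e} m, std = {np.sqrt(v):.3e} m")
```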
P. B. Woodbury; D. A. Weinstein
2010-01-01
We reviewed probabilistic regional risk assessment methodologies to identify the methods that are currently in use and are capable of estimating threats to ecosystems from fire and fuels, invasive species, and their interactions with stressors. In a companion chapter, we highlight methods useful for evaluating risks from fire. In this chapter, we highlight methods...
Non-unitary probabilistic quantum computing
NASA Technical Reports Server (NTRS)
Gingrich, Robert M.; Williams, Colin P.
2004-01-01
We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.
A molecular fragment cheminformatics roadmap for mesoscopic simulation.
Truszkowski, Andreas; Daniel, Mirco; Kuhn, Hubert; Neumann, Stefan; Steinbeck, Christoph; Zielesny, Achim; Epple, Matthias
2014-12-01
Mesoscopic simulation studies the structure, dynamics and properties of large molecular ensembles with millions of atoms: Its basic interacting units (beads) are no longer the nuclei and electrons of quantum chemical ab-initio calculations or the atom types of molecular mechanics but molecular fragments, molecules or even larger molecular entities. For simulation setup and output, a mesoscopic simulation kernel uses abstract matrix (array) representations for bead topology and connectivity. Therefore a purely kernel-based mesoscopic simulation task is a tedious, time-consuming and error-prone venture that limits its practical use and application. A systematic cheminformatics approach tackles these problems and provides solutions for considerably enhanced accessibility. This study aims at outlining a complete cheminformatics roadmap that frames a mesoscopic Molecular Fragment Dynamics (MFD) simulation kernel to allow its efficient use and practical application. The molecular fragment cheminformatics roadmap consists of four consecutive building blocks: an adequate fragment structure representation (1), defined operations on these fragment structures (2), the description of compartments with defined compositions and structural alignments (3), and the graphical setup and analysis of a whole simulation box (4). The basis of the cheminformatics approach (i.e. building block 1) is a SMILES-like line notation (denoted fSMILES) with connected molecular fragments to represent a molecular structure. The fSMILES notation and the following concepts and methods for building blocks 2-4 are outlined with examples and practical usage scenarios. It is shown that the requirements of the roadmap may be partly covered by already existing open-source cheminformatics software. Mesoscopic simulation techniques like MFD may be considerably alleviated and broadened for practical use with a cheminformatics layer that successfully tackles their setup subtleties and conceptual usage hurdles. Molecular Fragment Cheminformatics may be regarded as a crucial accelerator to propagate MFD and similar mesoscopic simulation techniques in the molecular sciences.
Zhang, Qiu; Kong, De-yu; Li, Chun-jian; Chen, Bo; Jia, En-zhi; Chen, Lei-Lei; Jia, Qing-zhe; Dai, Zhen-hua; Zhu, Tian-tian; Chen, Jun; Liu, Jie; Zhu, Tie-bing; Yang, Zhi-jian; Cao, Ke-jiang
2013-02-01
To evaluate the feasibility, efficacy, and safety of percutaneous coronary intervention (PCI) guided by a computed tomography (CT) coronary angiography derived roadmap and a magnetic navigation system (MNS). Between June 2011 and May 2012, thirty consecutive patients receiving elective PCI were enrolled; coronary artery disease was primarily diagnosed by dual-source CT coronary angiography (DSCT-CA) at the outpatient clinic and subsequently confirmed by coronary angiography in the hospital. Target vessels from pre-procedure DSCT-CA were transferred to the magnetic navigation system and then edited, reconstructed, and projected onto the live fluoroscopic screen as a roadmap. Parameters were recorded, including characteristics of the target lesions; time, contrast volume, and radiation dosage for guidewire crossing; and procedural complications. Thirty patients with 36 lesions were recruited and treated by PCI. Among the target lesions, sixteen were classified as type A, 11 as type B1, 8 as type B2, and 1 as type C. The average length of the target lesions was (22.0 ± 9.8) mm, and the average stenosis was (81.3 ± 10.3)%. Under the guidance of the CT roadmap and MNS, all 36 target lesions were crossed by the magnetic guidewires, a lesion crossing ratio of 100%. The time for placement of the magnetic guidewires was 92.5 (56.6-131.3) seconds. The contrast volume and radiation dosage for guidewire placement were 0.0 (0.0-3.0) ml and 235.0 (123.5-395.1) µGy·m²/36.5 (21.3-67.8) mGy, respectively. Guidewires were successfully placed in 21 (58.3%) lesions without contrast agent. All enrolled vessels were successfully treated, and there were no MNS-associated complications. It is feasible, effective, and safe to perform PCI under the guidance of a CT-derived roadmap and MNS. This method might be helpful for guidewire placement in the treatment of total occlusions.
TA-13: Ground and Launch Systems, 2015 NASA Technology Roadmaps
NASA Technical Reports Server (NTRS)
Fox, Jack J.
2015-01-01
This presentation is a summary of new content contained in the 2015 update of the Technology Area 13 (Ground and Launch Systems) technology roadmap beyond the content contained in the 2010 version. Also included are brief assessments of benefits, alignments, challenges, technical risk and reasonableness, sequencing and timing, and time and effort to achieve goals. This presentation is part of the overall presentations of new content only for the 2015 update of the 15 NASA Technology Roadmaps that will be conducted in a public forum managed by the National Research Council on September 28-29, 2015. The 15 roadmaps have already been publicly released via the STI process.
The 2017 Plasma Roadmap: Low temperature plasma science and technology
Adamovich, I.; Baalrud, S. D.; Bogaerts, A.; ...
2017-07-14
Journal of Physics D: Applied Physics published the first Plasma Roadmap in 2012 consisting of the individual perspectives of 16 leading experts in the various sub-fields of low temperature plasma science and technology. The 2017 Plasma Roadmap is the first update of a planned series of periodic updates of the Plasma Roadmap. The continuously growing interdisciplinary nature of the low temperature plasma field and its equally broad range of applications are making it increasingly difficult to identify major challenges that encompass all of the many sub-fields and applications. This intellectual diversity is ultimately a strength of the field. The current state of the art for the 19 sub-fields addressed in this roadmap demonstrates the enviable track record of the low temperature plasma field in the development of plasmas as an enabling technology for a vast range of technologies that underpin our modern society. At the same time, the many important scientific and technological challenges shared in this roadmap show that the path forward is not only scientifically rich but has the potential to make wide and far reaching contributions to many societal challenges.
Probabilistic Methods for Structural Reliability and Risk
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2010-01-01
A probabilistic method is used to evaluate the structural reliability and risk of selected metallic and composite structures. The method is multiscale and multifunctional, and is based at the most elemental material level. A multifactor interaction model describes the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), with the properties of each ply given a multifunctional representation. The structural component is modeled by finite elements. Structural responses are obtained by an updated simulation scheme. The results show that the risk is about 0.0001 for both the two-rotor engine and the composite built-up structure.
Metrology needs for the semiconductor industry over the next decade
NASA Astrophysics Data System (ADS)
Melliar-Smith, Mark; Diebold, Alain C.
1998-11-01
Metrology will continue to be a key enabler for the development and manufacture of future generations of integrated circuits. During 1997, the Semiconductor Industry Association renewed the National Technology Roadmap for Semiconductors (NTRS) through the 50 nm technology generation and for the first time included a Metrology Roadmap (1). Meeting the needs described in the Metrology Roadmap will be both a technological and a financial challenge. In an ideal world, metrology capability would be available at the start of process and tool development, and silicon suppliers would have 450 mm wafer capable metrology tools in time for development of that wafer size. Unfortunately, a majority of the metrology suppliers are small companies that typically cannot afford the additional two-to-three-year wait for return on R&D investment. Therefore, the success of the semiconductor industry demands that we expand cooperation between NIST, SEMATECH, the National Labs, SRC, and the entire community. In this paper, we discuss several critical metrology topics, including the role of sensor-based process control, in-line microscopy, focused measurements for transistor and interconnect fabrication, and development needs. Improvements in in-line microscopy must extend existing critical dimension measurements to the 100 nm generation, and new methods may be required for sub-100 nm generations. Through development, existing metrology methods for dielectric thickness and for dopant dose and junctions can be extended to 100 nm, but new and possibly in-situ methods are needed beyond 100 nm. Interconnect process control will undergo change before 100 nm due to the introduction of copper metallization, low dielectric constant interlevel dielectrics, and Damascene process flows.
Probabilistic brain tissue segmentation in neonatal magnetic resonance imaging.
Anbeek, Petronella; Vincken, Koen L; Groenendaal, Floris; Koeman, Annemieke; van Osch, Matthias J P; van der Grond, Jeroen
2008-02-01
A fully automated method has been developed for segmentation of four different structures in the neonatal brain: white matter (WM), central gray matter (CEGM), cortical gray matter (COGM), and cerebrospinal fluid (CSF). The segmentation algorithm is based on information from T2-weighted (T2-w) and inversion recovery (IR) scans. The method uses a K nearest neighbor (KNN) classification technique with features derived from spatial information and voxel intensities. Probabilistic segmentations of each tissue type were generated. By applying thresholds on these probability maps, binary segmentations were obtained. These final segmentations were evaluated by comparison with a gold standard. The sensitivity, specificity, and Dice similarity index (SI) were calculated for quantitative validation of the results. High sensitivity and specificity with respect to the gold standard were reached: sensitivity >0.82 and specificity >0.9 for all tissue types. Tissue volumes were calculated from the binary and probabilistic segmentations. The probabilistic segmentation volumes of all tissue types accurately estimated the gold standard volumes. The KNN approach offers valuable ways for neonatal brain segmentation. The probabilistic outcomes provide a useful tool for accurate volume measurements. The described method is based on routine diagnostic magnetic resonance imaging (MRI) and is suitable for large population studies.
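As a rough sketch of the probabilistic KNN step (not the paper's trained pipeline), the code below applies scikit-learn's KNeighborsClassifier to synthetic voxel features and thresholds the resulting probability maps; the feature layout, labels, and threshold are assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Synthetic training voxels: columns stand in for [T2-w, IR, x, y, z];
# labels 0=WM, 1=CEGM, 2=COGM, 3=CSF. All values are illustrative.
X_train = rng.normal(size=(400, 5)) + np.repeat(np.arange(4), 100)[:, None]
y_train = np.repeat(np.arange(4), 100)

knn = KNeighborsClassifier(n_neighbors=15).fit(X_train, y_train)

X_new = rng.normal(size=(3, 5)) + 2.0      # three unseen voxels
prob = knn.predict_proba(X_new)            # per-voxel tissue probabilities
binary_wm = prob[:, 0] > 0.5               # threshold -> binary WM segmentation
print(prob.round(2), binary_wm)
```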
NASA Strategic Roadmap Summary Report
NASA Technical Reports Server (NTRS)
Wilson, Scott; Bauer, Frank; Stetson, Doug; Robey, Judee; Smith, Eric P.; Capps, Rich; Gould, Dana; Tanner, Mike; Guerra, Lisa; Johnston, Gordon
2005-01-01
In response to the Vision, NASA commissioned strategic and capability roadmap teams to develop the pathways for turning the Vision into a reality. The strategic roadmaps were derived from the Vision for Space Exploration and the Aldrich Commission Report dated June 2004. NASA identified 12 strategic areas for roadmapping and added a thirteenth area on nuclear systems because the topic affects the entire program portfolio. To ensure long-term public visibility and engagement, NASA established a committee for each of the 13 areas. These committees - made up of prominent members of the scientific and aerospace industry communities and senior government personnel - worked under the Federal Advisory Committee Act. A committee was formed for each of the following program areas: 1) Robotic and Human Lunar Exploration; 2) Robotic and Human Exploration of Mars; 3) Solar System Exploration; 4) Search for Earth-Like Planets; 5) Exploration Transportation System; 6) International Space Station; 7) Space Shuttle; 8) Universe Exploration; 9) Earth Science and Applications from Space; 10) Sun-Solar System Connection; 11) Aeronautical Technologies; 12) Education; 13) Nuclear Systems. This document contains roadmap summaries for 10 of these 13 program areas; the International Space Station, Space Shuttle, and Education areas are excluded. The completed roadmaps of six committees (Robotic and Human Exploration of Mars; Solar System Exploration; Search for Earth-Like Planets; Universe Exploration; Earth Science and Applications from Space; and Sun-Solar System Connection) are collected in a separate Strategic Roadmaps volume. This document also contains membership rosters and charters for all 13 committees.
NASA Astrophysics Data System (ADS)
Donovan, Amy; Oppenheimer, Clive; Bravo, Michael
2012-12-01
This paper constitutes a philosophical and social scientific study of expert elicitation in the assessment and management of volcanic risk on Montserrat during the 1995-present volcanic activity. It outlines the broader context of subjective probabilistic methods and then uses a mixed-method approach to analyse the use of these methods in volcanic crises. Data from a global survey of volcanologists regarding the use of statistical methods in hazard assessment are presented. Detailed qualitative data from Montserrat are then discussed, particularly concerning the expert elicitation procedure that was pioneered during the eruptions. These data are analysed and conclusions about the use of these methods in volcanology are drawn. The paper finds that while many volcanologists are open to the use of these methods, there are still some concerns, which are similar to the concerns encountered in the literature on probabilistic and determinist approaches to seismic hazard analysis.
Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification
NASA Technical Reports Server (NTRS)
Townsend, John S.; Peck, J.; Ayala, S.
1999-01-01
NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.
Probabilistic finite elements for fracture and fatigue analysis
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.
1989-01-01
The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The criterion for failure, or performance function, is stated as: the fatigue life of a component must exceed the service life of the component; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in it is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack tip. Performance and accuracy of the method are demonstrated on a classical mode I fatigue problem.
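A minimal Monte Carlo sketch of the performance-function idea follows: failure occurs when the sampled fatigue life falls short of the required service life. It assumes a simple Paris-law life with a unit geometry factor; the distributions and constants are hypothetical stand-ins for the paper's enriched-element and reliability machinery.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical Paris-law fatigue life (unit geometry factor):
# N = (a_c^(1-m/2) - a_0^(1-m/2)) / (C * (dS*sqrt(pi))^m * (1-m/2)), m != 2.
m_exp = 3.0
C = 10 ** rng.normal(-12.0, 0.2, n)      # lognormal Paris coefficient
dS = rng.normal(100.0, 10.0, n)          # stress range (MPa)
a0 = rng.normal(1e-3, 1e-4, n)           # initial crack size (m)
a_c = 0.02                               # critical crack size (m)

expo = 1.0 - m_exp / 2.0
N_fatigue = (a_c**expo - a0**expo) / (C * (dS * np.sqrt(np.pi))**m_exp * expo)
N_service = 1e6                          # required service life (cycles)

# Performance function g = N_fatigue - N_service; failure when g < 0.
p_f = np.mean(N_fatigue < N_service)
print(f"estimated probability of fatigue failure: {p_f:.5f}")
```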
Cost-Reduction Roadmap Outlines Two Pathways to Meet DOE Residential Solar Cost Target for 2030
NREL News Release
Installing photovoltaics at the time of roof replacement or as part of
Promising roadmap alternatives for the SpaceLiner
NASA Astrophysics Data System (ADS)
Sippel, Martin
2010-06-01
The paper describes the vision and potential roadmap alternatives for an ultrafast intercontinental passenger transport based on a rocket-powered two-stage reusable vehicle. An operational scenario and the latest technical layout of the configuration's preliminary design, including flight performance, are described. The question of how this revolutionary ultrafast transport can be realized is addressed by an assessment of the different technological and programmatic roadmap alternatives.
Probabilistic methods for rotordynamics analysis
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.
1991-01-01
This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
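The eigenvalue criterion can be illustrated with plain Monte Carlo on a toy two-degree-of-freedom rotor model with cross-coupled stiffness, a classic source of rotordynamic instability. This sketch stands in for the paper's fast probability integration and adaptive importance sampling; the matrices and distributions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def unstable(c, k, q=0.04):
    """Instability test: does any eigenvalue of the first-order state
    matrix of M x'' + C x' + K x = 0 have a positive real part?"""
    M = np.eye(2)
    Cm = c * np.eye(2)                    # proportional damping
    Km = np.array([[k, q], [-q, k]])      # cross-coupled (circulatory) stiffness
    A = np.block([[np.zeros((2, 2)), np.eye(2)],
                  [-np.linalg.solve(M, Km), -np.linalg.solve(M, Cm)]])
    return np.linalg.eigvals(A).real.max() > 1e-12

n = 20_000
c_s = rng.normal(0.05, 0.02, n)           # random damping coefficient
k_s = rng.normal(1.0, 0.1, n)             # random direct stiffness
p_inst = np.mean([unstable(c, k) for c, k in zip(c_s, k_s)])
print(f"estimated probability of instability: {p_inst:.4f}")
```

For this toy model the analytic stability boundary is |q| = c*sqrt(k), so the printed estimate can be checked against P(c*sqrt(k) < 0.04), roughly 0.3 for the distributions above.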
Process for computing geometric perturbations for probabilistic analysis
Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX
2012-04-10
A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
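A toy sketch of the idea under stated assumptions: a flat-plate region of interest whose nodes move along surface-normal displacement vectors with one shared random amplitude per realization. The mean-value coordinate calculation of the patented method is omitted, and all geometry and statistics are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical node coordinates of a region of interest (a flat plate, m).
nodes = np.array([[x, y, 0.0]
                  for x in np.linspace(0.0, 1.0, 5)
                  for y in np.linspace(0.0, 0.2, 3)])

# Unit displacement vectors for each node (here the surface normal) and a
# single random amplitude so the perturbed geometry stays smooth.
normals = np.tile([0.0, 0.0, 1.0], (len(nodes), 1))
amplitude = rng.normal(0.0, 1e-4)         # random thickness deviation (m)

perturbed = nodes + amplitude * normals   # one geometric realization
print(perturbed[:3])
```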
Probabilistic Exposure Analysis for Chemical Risk Characterization
Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.
2009-01-01
This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
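As a generic illustration of PEA (not a method from the paper), the following Monte Carlo sketch propagates population variability through the standard average-daily-dose equation; every distribution and constant is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Average daily dose: ADD = C*IR*EF*ED / (BW*AT); all values illustrative.
C  = rng.lognormal(np.log(2.0), 0.5, n)            # concentration, mg/L
IR = rng.normal(1.4, 0.3, n).clip(min=0.1)         # water intake, L/day
EF = rng.uniform(180.0, 365.0, n)                  # exposure frequency, d/yr
ED = 30.0                                          # exposure duration, yr
BW = rng.normal(70.0, 12.0, n).clip(min=30.0)      # body weight, kg
AT = ED * 365.0                                    # averaging time, days

add = C * IR * EF * ED / (BW * AT)                 # mg/(kg*day)
print(f"median ADD = {np.median(add):.3e} mg/kg-day, "
      f"95th percentile = {np.percentile(add, 95):.3e}")
```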
Probabilistic simulation of uncertainties in thermal structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Shiao, Michael
1990-01-01
Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.
Fracture mechanics analysis of cracked structures using weight function and neural network method
NASA Astrophysics Data System (ADS)
Chen, J. G.; Zang, F. G.; Yang, Y.; Shi, K. K.; Fu, X. L.
2018-06-01
Stress intensity factors (SIFs) due to thermal-mechanical loads have been established using the weight function method. Two reference stress states were used to determine the coefficients in the weight function. Results were evaluated against data from the literature and show good agreement. The SIFs for cracks subjected to arbitrary loads can therefore be determined quickly with the weight function obtained, and the presented method can be used for probabilistic fracture mechanics analysis. A probabilistic methodology combining Monte Carlo simulation with a neural network (MCNN) has been developed. The results indicate that an accurate probabilistic characterization of K_I can be obtained using the developed method. The probability of failure increases with increasing load, and the relationship between them is nonlinear.
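For concreteness, here is a small numerical sketch of the weight-function integral K_I = ∫ σ(x) m(x, a) dx. It uses the classical center-crack weight function rather than the two-reference-state function fitted in the paper, and all values are illustrative.

```python
import numpy as np

def sif_weight_function(stress, a, n=200_000):
    """K_I = integral_0^a stress(x)*m(x,a) dx with the classical
    center-crack weight function m(x,a) = 2a/(sqrt(pi*a)*sqrt(a^2-x^2))."""
    x = (np.arange(n) + 0.5) * a / n          # midpoints avoid x = a
    m = 2.0 * a / (np.sqrt(np.pi * a) * np.sqrt(a**2 - x**2))
    return np.sum(stress(x) * m) * a / n

a = 0.01                                       # crack half-length (m)
uniform = lambda x: 100.0 * np.ones_like(x)    # 100 MPa remote tension
K = sif_weight_function(uniform, a)
print(f"K_I = {K:.2f} MPa*sqrt(m), exact = {100*np.sqrt(np.pi*a):.2f}")
```

For uniform remote tension the integral reduces to the textbook result K_I = σ√(πa), which the printed comparison checks.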
Best Merge Region Growing with Integrated Probabilistic Classification for Hyperspectral Imagery
NASA Technical Reports Server (NTRS)
Tarabalka, Yuliya; Tilton, James C.
2011-01-01
A new method for spectral-spatial classification of hyperspectral images is proposed. The method is based on the integration of probabilistic classification within the hierarchical best merge region growing algorithm. For this purpose, a preliminary probabilistic support vector machine classification is performed. Then, a hierarchical step-wise optimization algorithm is applied, iteratively merging regions with the smallest Dissimilarity Criterion (DC). The main novelty of this method consists in defining a DC between regions as a function of region statistical and geometrical features along with classification probabilities. Experimental results are presented on a 200-band AVIRIS image of the Northwestern Indiana vegetation area and compared with those obtained by recently proposed spectral-spatial classification techniques. The proposed method improves classification accuracies when compared to other classification approaches.
Probabilistic Methods for Structural Design and Reliability
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)
2002-01-01
This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
Guided SAR image despeckling with probabilistic non local weights
NASA Astrophysics Data System (ADS)
Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny
2017-12-01
SAR images are generally corrupted by granular disturbances called speckle, which make visual analysis and detail extraction a difficult task. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non Local Weights) replaces parametric constants based on heuristics in the GGF-BNLM method with values derived dynamically from image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, achieve significant improvement in performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.
NASA Technical Reports Server (NTRS)
Sawin, Charles F.
1999-01-01
The product of the critical path roadmap project is an integrated strategy for mitigating the risks associated with human exploration-class missions. It is an evolving process that will ensure the ability to communicate the integrated critical path roadmap. Unlike previous reports, this one will not sit on a shelf; it has the full support of the JSC Space and Life Sciences Directorate (SA) and is already being used as a decision-making tool (e.g., budget and investigation planning for Shuttle and Space Station missions). The utility of this product depends on many efforts, namely providing the required information (completed risk data sheets, critical question information, technology data). It is essential to communicate the results of the critical path roadmap to the scientific community; this meeting is a good opportunity to do so. The web site envisioned for the critical path roadmap will provide the capability to communicate with a broader community and to track and update the system routinely.
The OPTICON technology roadmap for optical and infrared astronomy
NASA Astrophysics Data System (ADS)
Cunningham, Colin; Melotte, David; Molster, Frank
2010-07-01
The Key Technology Network (KTN) within the OPTICON programme has been developing a roadmap for the technology needed to meet the challenges of optical and infrared astronomy over the next few years, with particular emphasis on the requirements of Extremely Large Telescopes. The process and methodology so far will be described, along with the most recent roadmap. The roadmap shows the expected progression of ground-based astronomy facilities and the technological developments which will be required to realise these new facilities. The roadmap highlights the key stages in the development of these technologies. In some areas, such as conventional optics, gradual developments in areas such as light-weighting of optics will slowly be adopted into future instruments. In other areas, such as large area IR detectors, more rapid progress can be expected as new processing techniques allow larger and faster arrays. Finally, other areas such as integrated photonics have the potential to revolutionise astronomical instrumentation. Future plans are outlined, in particular our intention to look at longer term development and disruptive technologies.
Collaboration process for integrated social and health care strategy implementation.
Korpela, Jukka; Elfvengren, Kalle; Kaarna, Tanja; Tepponen, Merja; Tuominen, Markku
2012-01-01
To present a collaboration process for creating a roadmap for the implementation of a strategy for integrated health and social care. The developed collaboration process includes multiple phases and uses electronic group decision support system technology (GDSS). A case study done in the South Karelia District of Social and Health Services in Finland during 2010-2011. An expert panel of 13 participants was used in the planning process of the strategy implementation. The participants were interviewed and observed during the case study. As a practical result, a roadmap for integrated health and social care strategy implementation has been developed. The strategic roadmap includes detailed plans of several projects which are needed for successful integration strategy implementation. As an academic result, a collaboration process to create such a roadmap has been developed. The collaboration process and technology seem to suit the planning process well. The participants of the meetings were satisfied with the collaboration process and the GDSS technology. The strategic roadmap was accepted by the participants, which indicates satisfaction with the developed process.
NASA Strategic Roadmap: Origin, Evolution, Structure, and Destiny of the Universe
NASA Technical Reports Server (NTRS)
White, Nicholas E.
2005-01-01
The NASA strategic roadmap on the Origin, Evolution, Structure and Destiny of the Universe is one of 13 roadmaps that outline NASA's approach to implementing the Vision for Space Exploration. The roadmap outlines a program to address the questions: What powered the Big Bang? What happens close to a black hole? What is dark energy? How did the infant universe grow into the galaxies, stars and planets, and set the stage for life? The roadmap builds upon the currently operating and successful missions such as HST, Chandra and Spitzer. The program contains two elements, Beyond Einstein and Pathways to Life, performed in three phases (2005-2015, 2015-2025 and >2025), with priorities set by inputs received from reviews undertaken by the National Academy of Sciences and by technology readiness. The program includes the following missions: 2005-2015, GLAST, JWST and LISA; 2015-2025, Constellation-X and a series of Einstein Probes; and >2025, a number of ambitious vision missions which will be prioritized by results from the previous two phases.
Bayesian probabilistic population projections for all countries.
Raftery, Adrian E; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K
2012-08-28
Projections of countries' future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out-of-sample experiment in which data from 1950-1990 are used for estimation and 1990-2010 is predicted. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20-64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades.
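The flavor of probabilistic projection can be conveyed with a toy simulation, not the authors' hierarchical MCMC cohort-component model: each trajectory draws its own fertility-decline path, total population is advanced by a crude growth rule, and the ensemble yields a predictive interval. Every number here is illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n_traj, n_steps = 5000, 8              # 8 five-year steps -> 40 years

pop0, tfr0 = 50.0, 4.0                 # millions; children per woman
decline = rng.normal(0.12, 0.04, n_traj)   # per-step TFR decline, per trajectory
pop = np.full(n_traj, pop0)
tfr = np.full(n_traj, tfr0)
for _ in range(n_steps):
    tfr = np.maximum(tfr - decline + rng.normal(0, 0.05, n_traj), 0.8)
    growth = 0.05 * (tfr - 2.1)        # crude growth rate around replacement
    pop = pop * np.exp(growth)

lo, med, hi = np.percentile(pop, [2.5, 50.0, 97.5])
print(f"population after 40 yr: median {med:.1f}M, 95% PI [{lo:.1f}, {hi:.1f}]M")
```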
a Probabilistic Embedding Clustering Method for Urban Structure Detection
NASA Astrophysics Data System (ADS)
Lin, X.; Li, H.; Zhang, Y.; Gao, L.; Zhao, L.; Deng, M.
2017-09-01
Urban structure detection is a basic task in urban geography, and clustering is a core technology for detecting patterns of urban spatial structure, urban functional regions, and so on. In the big data era, diverse urban sensing datasets recording information such as human behaviour and social activity suffer from high dimensionality and high noise, and unfortunately the state-of-the-art clustering methods do not handle both issues concurrently. In this paper, a probabilistic embedding clustering method is proposed. First, we develop a Probabilistic Embedding Model (PEM) to find latent features in high-dimensional urban sensing data by learning a probabilistic model: the latent features capture essential patterns hidden in the high-dimensional data, while the probabilistic model reduces the uncertainty caused by high noise. Second, by tuning the parameters, the model can discover two kinds of urban structure, homophily and structural equivalence, that is, communities with intensive interaction or communities playing the same roles in the urban structure. We evaluated the performance of the model through experiments on real-world data from Shanghai (China), which confirmed that the method can discover both kinds of structure.
Adoption of Electronic Health Records: A Roadmap for India
2016-01-01
Objectives The objective of the study was to create a roadmap for the adoption of Electronic Health Records (EHR) in India based on an analysis of the strategies of other countries and the national scenario of ICT use in India. Methods The strategies for adoption of EHR in other countries were analyzed to identify the crucial steps taken. Apart from reports collected from stakeholders in the country, the study relied on the experience of the author in handling several e-health projects. Results It was found that there are four major areas where the countries considered have made substantial efforts: ICT infrastructure; policy and regulations; standards and interoperability; and research, development, and education. A set of crucial activities was identified in each area. Based on the analysis, a roadmap is suggested. It includes the creation of a secure health network; health information exchange; and the use of open-source software, a national health policy, privacy laws, an agency for health IT standards, R&D, human resource development, etc. Conclusions Although some steps have been initiated, several new steps need to be taken for the successful adoption of EHR. It requires a coordinated effort from all the stakeholders. PMID:27895957
COMMUNICATING PROBABILISTIC RISK OUTCOMES TO RISK MANAGERS
Increasingly, risk assessors are moving away from simple deterministic assessments to probabilistic approaches that explicitly incorporate ecological variability, measurement imprecision, and lack of knowledge (collectively termed "uncertainty"). While the new methods provide an...
A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network
NASA Astrophysics Data System (ADS)
Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.
2018-02-01
Oil pipeline networks are among the most important energy transportation facilities, but accidents in them may result in serious disasters. Analysis models for these accidents have been established mainly by three methods: event trees, accident simulation, and Bayesian networks. Among these, the Bayesian network is suitable for probabilistic analysis, but not all the important influencing factors have been considered, and a deployment rule for the factors has not been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including key environmental conditions and emergency response, are considered in this model, and a deployment rule for these factors is introduced. The model can be used for probabilistic and sensitivity analysis of oil pipeline network accidents.
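A minimal hand-rolled Bayesian network sketch follows, with a hypothetical structure (corrosion and third-party damage cause leaks; a leak plus slow emergency response can escalate to a disaster) and illustrative conditional probabilities; inference is by brute-force enumeration, not the paper's model.

```python
import itertools

# Priors and conditional probability tables (all values illustrative).
p_root = {"C": 0.10, "T": 0.05, "R": 0.20}   # corrosion, third-party damage,
                                             # slow emergency response
p_L = {(0, 0): 0.001, (0, 1): 0.30, (1, 0): 0.15, (1, 1): 0.40}  # P(L=1|C,T)
p_D = {(0, 0): 0.0,   (0, 1): 0.0,  (1, 0): 0.05, (1, 1): 0.50}  # P(D=1|L,R)

def bern(p1, val):
    """Probability of a binary variable taking value val given P(var=1)=p1."""
    return p1 if val else 1.0 - p1

p_disaster = 0.0
for c, t, r, l, d in itertools.product((0, 1), repeat=5):
    joint = (bern(p_root["C"], c) * bern(p_root["T"], t) *
             bern(p_root["R"], r) * bern(p_L[(c, t)], l) *
             bern(p_D[(l, r)], d))
    p_disaster += joint * d                  # accumulate P(D=1)
print(f"P(disaster) = {p_disaster:.5f}")
```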
Development of probabilistic design method for annular fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ozawa, Takayuki
2007-07-01
The increase of linear power and burn-up during reactor operation is considered one measure to ensure the utility of fast reactors in the future; to this end, the application of annular oxide fuels is under consideration. The annular fuel design code CEPTAR was developed at the Japan Atomic Energy Agency (JAEA) and verified using many irradiation experiences with oxide fuels. In addition, the probabilistic fuel design code BORNFREE was developed to provide a safe and reasonable fuel design and to evaluate design margins quantitatively. This study aimed at the development of a probabilistic design method for annular oxide fuels; this was implemented in the developed BORNFREE-CEPTAR code, and the code was used to make a probabilistic evaluation with regard to the permissible linear power. (author)
Rivas, Elena; Lang, Raymond; Eddy, Sean R
2012-02-01
The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.
Roadmap on semiconductor-cell biointerfaces
NASA Astrophysics Data System (ADS)
Tian, Bozhi; Xu, Shuai; Rogers, John A.; Cestellos-Blanco, Stefano; Yang, Peidong; Carvalho-de-Souza, João L.; Bezanilla, Francisco; Liu, Jia; Bao, Zhenan; Hjort, Martin; Cao, Yuhong; Melosh, Nicholas; Lanzani, Guglielmo; Benfenati, Fabio; Galli, Giulia; Gygi, Francois; Kautz, Rylan; Gorodetsky, Alon A.; Kim, Samuel S.; Lu, Timothy K.; Anikeeva, Polina; Cifra, Michal; Krivosudský, Ondrej; Havelka, Daniel; Jiang, Yuanwen
2018-05-01
This roadmap outlines the role semiconductor-based materials play in understanding the complex biophysical dynamics at multiple length scales, as well as the design and implementation of next-generation electronic, optoelectronic, and mechanical devices for biointerfaces. The roadmap emphasizes the advantages of semiconductor building blocks in interfacing, monitoring, and manipulating the activity of biological components, and discusses the possibility of using active semiconductor-cell interfaces for discovering new signaling processes in the biological world.
Composite load spectra for select space propulsion structural components
NASA Technical Reports Server (NTRS)
Newell, J. F.; Ho, H. W.; Kurth, R. E.
1991-01-01
This report describes the work performed to develop composite load spectra (CLS) for the Space Shuttle Main Engine (SSME) using probabilistic methods. Three probabilistic methods were implemented for the engine system influence model. RASCAL was chosen as the principal method, as most component load models were implemented with it. Validation of RASCAL was performed; accuracy comparable to the Monte Carlo method can be obtained if a large enough bin size is used. Generic probabilistic models were developed and implemented for load calculations using the probabilistic methods discussed above. Each engine mission, whether a real flight or a test, has three phases: the engine start transient, the steady state phase, and the engine cutoff transient. Power level and engine operating inlet conditions change during a mission. The load calculation module provides steady-state and quasi-steady-state calculation procedures with a duty-cycle-data option; the quasi-steady-state procedure is used for engine transient phase calculations. In addition, a few generic probabilistic load models were developed for specific conditions, including the fixed transient spike model, the Poisson arrival transient spike model, and the rare event model. These generic probabilistic load models provide sufficient latitude for simulating loads under specific conditions. For the SSME, turbine blades, transfer ducts, the LOX post, and the high pressure oxidizer turbopump (HPOTP) discharge duct were selected for application of the CLS program. The loads include static and dynamic pressure loads for all four components, centrifugal force for the turbine blades, temperatures for thermal loads for all four components, and structural vibration loads for the ducts and LOX post.
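As a sketch of the Poisson arrival transient spike idea (not the CLS implementation), the code below superimposes randomly arriving short pulses on a quasi-steady start-transient load profile; the rate, amplitudes, and pulse shape are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

t = np.linspace(0.0, 5.0, 501)                   # 5 s start transient
base = 1.0 + 0.8 * (1.0 - np.exp(-t))            # quasi-steady ramp (MPa)

lam = 2.0                                        # spike arrivals per second
n_spikes = rng.poisson(lam * t[-1])              # Poisson number of spikes
arrivals = rng.uniform(0.0, t[-1], n_spikes)     # uniform arrival times
amps = rng.lognormal(np.log(0.2), 0.5, n_spikes) # spike amplitudes (MPa)

load = base.copy()
for t0, a in zip(arrivals, amps):
    load += a * np.exp(-((t - t0) / 0.02) ** 2)  # short Gaussian pulse
print(f"{n_spikes} spikes, peak load = {load.max():.2f} MPa")
```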
A Roadmap for Thermal Metrology
NASA Astrophysics Data System (ADS)
Bojkovski, J.; Fischer, J.; Machin, G.; Pavese, F.; Peruzzi, A.; Renaot, E.; Tegeler, E.
2009-02-01
A provisional roadmap for thermal metrology was developed in Spring 2006 as part of the EUROMET iMERA activity toward increasing impact from national investment in European metrology R&D. This consisted of two parts: one addressing the influence of thermal metrology on society, industry, and science, and the other specifying the requirements of enabling thermal metrology to serve future needs. The roadmap represents the shared vision of the EUROMET TC Therm committee as to how thermal metrology should develop to meet future requirements over the next 15 years. It is important to stress that these documents are a first attempt to roadmap the whole of thermal metrology and will certainly need regular review and revision to remain relevant and useful to the community they seek to serve. The first part of the roadmap, “Thermal metrology for society, industry, and science,” identifies the main social and economic triggers driving developments in thermal metrology—notably citizen safety and security, new production technologies, environment and global climate change, energy, and health. Stemming from these triggers, key targets are identified that require improved thermal measurements. The second part of the roadmap, “Enabling thermal metrology to serve future needs” identifies another set of triggers, like global trade and interoperability, future needs in transport, and the earth radiation budget. Stemming from these triggers, key targets are identified, such as improved realizations and dissemination of the SI unit the kelvin, anchoring the kelvin to the Boltzmann constant, k B, and calculating thermal properties from first principles. To facilitate these outcomes, the roadmap identifies the technical advances required in thermal measurement standards.
Evaluation of Roadmap to Achieve Energy Delivery Systems Cybersecurity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavez, Adrian R.
The Department of Energy/Office of Electricity Delivery and Energy Reliability (DOE/OE) Cybersecurity for Energy Delivery Systems (CEDS) program is currently evaluating the Roadmap to Achieve Energy Delivery Systems Cybersecurity document that sets a vision and outlines a set of milestones. The milestones are divided into five strategic focus areas that include: 1. Build a Culture of Security; 2. Assess and Monitor Risk; 3. Develop and Implement New Protective Measures to Reduce Risk; 4. Manage Incidents; and 5. Sustain Security Improvements. The most current version of the roadmap was last updated in September of 2016. Sandia National Laboratories (SNL) has been tasked with revisiting the roadmap to update the current state of energy delivery systems cybersecurity protections. SNL is currently working with previous and current partners to provide feedback on which of the roadmap milestones have been met and to identify any preexisting or new gaps that are not addressed by the roadmap. The specific focus areas SNL was asked to evaluate are: 1. Develop and Implement New Protective Measures to Reduce Risk and 2. Sustain Security Improvements. SNL has formed an Industry Advisory Board (IAB) to assist in answering these questions. The IAB consists of previous partners on past CEDS funded efforts as well as new collaborators that have unique insights into the current state of cybersecurity within energy delivery systems. The IAB includes asset owners, utilities and vendors of control systems. SNL will continue to maintain regular communications with the IAB to provide various perspectives on potential future updates to further improve the breadth of cybersecurity coverage of the roadmap.
NASA Technical Reports Server (NTRS)
2003-01-01
Contents include the following: About the roadmap. Summary of key elements. Science objectives. Mission roadmap. Technology. Research and analysis. Education and public outreach. Appendix - Road map framework.
Overview of current capabilities and research and technology developments for planetary protection
NASA Astrophysics Data System (ADS)
Frick, Andreas; Mogul, Rakesh; Stabekis, Pericles; Conley, Catharine A.; Ehrenfreund, Pascale
2014-07-01
The pace of scientific exploration of our solar system provides ever-increasing insights into potentially habitable environments, and associated concerns for their contamination by Earth organisms. Biological and organic-chemical contamination has been extensively considered by the COSPAR Panel on Planetary Protection (PPP) and has resulted in the internationally recognized regulations to which spacefaring nations adhere, and which have been in place for 40 years. The only successful Mars lander missions with system-level “sterilization” were the Viking landers in the 1970s. Since then different cleanliness requirements have been applied to spacecraft based on their destination, mission type, and scientific objectives. The Planetary Protection Subcommittee of the NASA Advisory Council has noted that a strategic Research & Technology Development (R&TD) roadmap would be very beneficial to encourage the timely availability of effective tools and methodologies to implement planetary protection requirements. New research avenues in planetary protection for ambitious future exploration missions can best be served by developing an over-arching program that integrates capability-driven developments with mission-driven implementation efforts. This paper analyzes the current status concerning microbial reduction and cleaning methods, recontamination control and bio-barriers, operational analysis methods, and addresses concepts for human exploration. Crosscutting research and support activities are discussed and a rationale for a Strategic Planetary Protection R&TD Roadmap is outlined. Such a roadmap for planetary protection provides a forum for strategic planning and will help to enable the next phases of solar system exploration.
Probabilistic models of cognition: conceptual foundations.
Chater, Nick; Tenenbaum, Joshua B; Yuille, Alan
2006-07-01
Remarkable progress in the mathematics and computer science of probability has led to a revolution in the scope of probabilistic models. In particular, 'sophisticated' probabilistic methods apply to structured relational systems such as graphs and grammars, of immediate relevance to the cognitive sciences. This Special Issue outlines progress in this rapidly developing field, which provides a potentially unifying perspective across a wide range of domains and levels of explanation. Here, we introduce the historical and conceptual foundations of the approach, explore how the approach relates to studies of explicit probabilistic reasoning, and give a brief overview of the field as it stands today.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... Role of Risk Analysis in Decision-Making AGENCY: Environmental Protection Agency (EPA). ACTION: Notice... documents entitled "Using Probabilistic Methods to Enhance the Role of Risk Analysis in Decision-Making... Probabilistic Methods to Enhance the Role of Risk Analysis in Decision-Making, with Case Study Examples" and...
Denis Valle; Benjamin Baiser; Christopher W. Woodall; Robin Chazdon; Jerome Chave
2014-01-01
We propose a novel multivariate method to analyse biodiversity data based on the Latent Dirichlet Allocation (LDA) model. LDA, a probabilistic model, reduces assemblages to sets of distinct component communities. It produces easily interpretable results, can represent abrupt and gradual changes in composition, accommodates missing data and allows for coherent estimates...
Probabilistic composite micromechanics
NASA Technical Reports Server (NTRS)
Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.
1988-01-01
Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
Probabilistic BPRRC: Robust Change Detection against Illumination Changes and Background Movements
NASA Astrophysics Data System (ADS)
Yokoi, Kentaro
This paper presents Probabilistic Bi-polar Radial Reach Correlation (PrBPRRC), a change detection method that is robust against illumination changes and background movements. Most of the traditional change detection methods are robust against either illumination changes or background movements; BPRRC is one of the illumination-robust change detection methods. We introduce a probabilistic background texture model into BPRRC and add the robustness against background movements including foreground invasions such as moving cars, walking people, swaying trees, and falling snow. We show the superiority of PrBPRRC in the environment with illumination changes and background movements by using three public datasets and one private dataset: ATON Highway data, Karlsruhe traffic sequence data, PETS 2007 data, and Walking-in-a-room data.
Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo
NASA Astrophysics Data System (ADS)
Schön, Thomas B.; Svensson, Andreas; Murray, Lawrence; Lindsten, Fredrik
2018-05-01
Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we will provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We will also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
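To make the core idea concrete, the following is a minimal sketch of particle Metropolis-Hastings for a toy scalar state-space model. The model, noise levels, proposal scale, particle count, and flat prior are illustrative assumptions, not the tutorial's own example.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_likelihood(theta, y, n_particles=200):
    """Bootstrap particle filter estimate of log p(y | theta) for the toy
    model x_t = theta*x_{t-1} + N(0, 0.5^2), y_t = x_t + N(0, 0.3^2)."""
    x = rng.normal(0.0, 1.0, n_particles)
    ll = 0.0
    for yt in y:
        x = theta * x + rng.normal(0.0, 0.5, n_particles)  # propagate particles
        logw = -0.5 * ((yt - x) / 0.3) ** 2                # observation log-weights
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())       # log-likelihood increment (up to a constant)
        x = rng.choice(x, n_particles, p=w / w.sum())      # resample
    return ll

def pmh(y, n_iters=500):
    """Particle Metropolis-Hastings with a random-walk proposal and a flat
    prior on theta; the noisy likelihood estimate still targets the exact
    posterior (the pseudo-marginal property the abstract alludes to)."""
    theta, ll = 0.0, log_likelihood(0.0, y)
    chain = []
    for _ in range(n_iters):
        prop = theta + rng.normal(0.0, 0.1)
        ll_prop = log_likelihood(prop, y)
        if np.log(rng.uniform()) < ll_prop - ll:           # accept/reject step
            theta, ll = prop, ll_prop
        chain.append(theta)
    return np.array(chain)

# Synthetic data generated with theta = 0.7, then run the sampler.
x, ys = 0.0, []
for _ in range(50):
    x = 0.7 * x + rng.normal(0.0, 0.5)
    ys.append(x + rng.normal(0.0, 0.3))
chain = pmh(np.array(ys))
```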
Alternate Methods in Refining the SLS Nozzle Plug Loads
NASA Technical Reports Server (NTRS)
Burbank, Scott; Allen, Andrew
2013-01-01
Numerical analysis has shown that the SLS nozzle environmental barrier (nozzle plug) design is inadequate for the prelaunch condition, which consists of two dominant loads: 1) the main engines startup pressure and 2) an environmentally induced pressure. Efforts to reduce load conservatisms included a dynamic analysis which showed a 31% higher safety factor compared to the standard static analysis. The environmental load is typically approached with a deterministic method using the worst possible combinations of pressures and temperatures. An alternate probabilistic approach, utilizing the distributions of pressures and temperatures, resulted in a 54% reduction in the environmental pressure load. A Monte Carlo simulation of environmental load that used five years of historical pressure and temperature data supported the results of the probabilistic analysis, indicating the probabilistic load is reflective of a 3-sigma condition (1 in 370 probability). Utilizing the probabilistic load analysis eliminated excessive conservatisms and will prevent a future overdesign of the nozzle plug. Employing a similar probabilistic approach to other design and analysis activities can result in realistic yet adequately conservative solutions.
Methodology for Constructing a Modernization Roadmap for Air Force Automatic Test Systems
2012-01-01
Constructing a Modernization Roadmap for Air Force Automatic Test Systems / Lionel A. Galway, Rachel Rue, James M. Masters, Ben D. Van Roo, Manuel... [et al.]. Includes bibliographical references. ISBN 978-0-8330-5899-7 (pbk.: alk. paper). United States. Air Force—Weapons systems—Testing. UG633.M3445
NASA Astrophysics Data System (ADS)
Mayr, G. J.; Kneringer, P.; Dietz, S. J.; Zeileis, A.
2016-12-01
Low visibility or a low cloud ceiling reduces the capacity of airports by requiring special low visibility procedures (LVP) for incoming and departing aircraft. Probabilistic forecasts of when such procedures will become necessary help to mitigate delays and economic losses. We compare the performance of probabilistic nowcasts made with two statistical methods: ordered logistic regression, and trees and random forests. These models harness historic and current meteorological measurements in the vicinity of the airport and LVP states, and incorporate diurnal and seasonal climatological information via generalized additive models (GAM). The methods are applied at Vienna International Airport (Austria). The performance is benchmarked against climatology, persistence, and human forecasters.
NASA Technical Reports Server (NTRS)
Canfield, R. C.; Ricchiazzi, P. J.
1980-01-01
An approximate probabilistic radiative transfer equation and the statistical equilibrium equations are simultaneously solved for a model hydrogen atom consisting of three bound levels and an ionization continuum. The transfer equation for L-alpha, L-beta, H-alpha, and the Lyman continuum is explicitly solved assuming complete redistribution. The accuracy of this approach is tested by comparing source functions and radiative loss rates to values obtained with a method that solves the exact transfer equation. Two recent model solar-flare chromospheres are used for this test. It is shown that for the test atmospheres the probabilistic method gives values of the radiative loss rate that are characteristically good to a factor of 2. The advantage of this probabilistic approach is that it retains a description of the dominant physical processes of radiative transfer in the complete redistribution case, yet it achieves a major reduction in computational requirements.
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
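As a concrete illustration of the approach, here is a minimal sketch, in Python rather than the standard statistical software the authors describe, of a bootstrap-based probabilistic sensitivity analysis for a toy cost-effectiveness comparison; all data, comparator values, and distributions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative patient-level data: per-patient costs and eradication outcomes.
costs = rng.gamma(2.0, 500.0, size=300)
cured = rng.uniform(size=300) < 0.8

n_sims = 10_000
icers = np.empty(n_sims)
for i in range(n_sims):
    idx = rng.integers(0, 300, 300)          # bootstrap resample of patients
    d_cost = costs[idx].mean() - 800.0       # vs. assumed comparator cost
    d_eff = cured[idx].mean() - 0.7          # vs. assumed comparator cure rate
    icers[i] = d_cost / d_eff                # incremental cost-effectiveness ratio

# The spread of the resampled ICERs summarizes parameter uncertainty,
# going beyond a single base-case or one-way deterministic analysis.
print(np.percentile(icers, [2.5, 50, 97.5]))
```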
Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis
2013-09-01
During FY13, the INL developed an advanced SMR PRA framework, described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). The framework considers the following areas: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues such as co-located modules and passive safety features; use of modern open-source and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.
Forest Products Industry Technology Roadmap
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
2010-04-01
This document describes the forest products industry's research and development priorities. The original technology roadmap was published by the industry in 1999 and most recently updated in April 2010.
NASA Astrophysics Data System (ADS)
Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor
2017-04-01
Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. We have therefore included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. Its epicenter was located about 5 km from the southern boundary of Budapest. The quake caused serious damage in the epicentral area and in the southern districts of the capital. The epicentral area of the earthquake is located along the Danube River. Sand boils were observed in some locations, indicating the occurrence of liquefaction. Because their exact locations were recorded at the time of the earthquake, in situ geotechnical measurements (CPT and SPT) could be performed at two sites (Dunaharaszti and Taksony). The different types of measurements enabled probabilistic liquefaction hazard computations at the two studied sites. We have compared the return periods of liquefaction computed using the different built-in simplified stress-based methods.
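The core of the performance-based calculation is a summation of the conditional probability of liquefaction over the seismic hazard disaggregation. A minimal sketch of that step follows; the inputs and the conditional-probability model are placeholders (the actual Cetin et al. or Boulanger-Idriss relationships are far more involved).

```python
import numpy as np

def liquefaction_return_period(pga_levels, d_lambda, mags, weights, p_liq):
    """Annual rate of liquefaction in the Kramer-Mayfield PBEE framework:
    lambda_liq = sum_i sum_j P[liq | pga_i, m_j] * w_ij * dlambda_i,
    where dlambda_i is the annual rate of the i-th PGA increment from the
    hazard curve and w_ij the magnitude disaggregation weight at that level."""
    rate = 0.0
    for i, pga in enumerate(pga_levels):
        for j, m in enumerate(mags):
            rate += p_liq(pga, m) * weights[i, j] * d_lambda[i]
    return 1.0 / rate                       # return period in years

# Toy logistic conditional-probability model (illustrative, not Cetin's):
toy_p_liq = lambda pga, m: 1.0 / (1.0 + np.exp(-(3.0 * pga + 0.5 * (m - 6.0) - 2.0)))

pga = np.array([0.1, 0.2, 0.4])             # discrete PGA levels (g)
dlam = np.array([1e-2, 3e-3, 5e-4])         # hazard-curve rate increments (1/yr)
mags = np.array([5.5, 6.5])
w = np.array([[0.7, 0.3], [0.6, 0.4], [0.5, 0.5]])  # disaggregation weights
T = liquefaction_return_period(pga, dlam, mags, w, toy_p_liq)
```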
An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 1. Theory
Yen, Chung-Cheng; Guymon, Gary L.
1990-01-01
An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which only requires prior knowledge of the uncertain variables' means and coefficients of variation. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is only generally valid for linear problems where the coefficients of variation of uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method provided the number of uncertain variables is less than eight.
An Efficient Deterministic-Probabilistic Approach to Modeling Regional Groundwater Flow: 1. Theory
NASA Astrophysics Data System (ADS)
Yen, Chung-Cheng; Guymon, Gary L.
1990-07-01
An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which only requires prior knowledge of the uncertain variables' means and coefficients of variation. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is only generally valid for linear problems where the coefficients of variation of uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method provided the number of uncertain variables is less than eight.
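For readers unfamiliar with the two-point estimate method, the following is a minimal sketch of Rosenblueth's classic formulation for uncorrelated, symmetric inputs; the toy response function and input statistics are illustrative only, not the groundwater model of the paper.

```python
import itertools
import numpy as np

def two_point_estimate(model, means, cvs):
    """Rosenblueth's two-point estimate of the mean and variance of
    model(x) for uncorrelated inputs with given means and coefficients
    of variation. Evaluates the model at all 2**n sign combinations of
    mean +/- one standard deviation, each with weight 1/2**n."""
    means = np.asarray(means, dtype=float)
    sigmas = means * np.asarray(cvs, dtype=float)
    n = len(means)
    vals = []
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        vals.append(model(means + np.array(signs) * sigmas))
    vals = np.array(vals)
    return vals.mean(), vals.var()

# Toy head-response model with two uncertain parameters (hypothetical
# storage coefficient and hydraulic conductivity), small CVs as the
# paper's validity conditions require.
head = lambda x: x[1] / x[0]
m, v = two_point_estimate(head, means=[1e-4, 1e-2], cvs=[0.1, 0.1])
```

Note the cost comparison the abstract draws: 2**n model runs for n uncertain variables, versus thousands for Monte Carlo, which is why the method wins only while n stays below roughly eight.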
Variational approach to probabilistic finite elements
NASA Technical Reports Server (NTRS)
Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.
1991-01-01
Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
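A minimal sketch of the second-moment idea behind PFEM is given below for a linear system K(theta) u = f, using first-order perturbation of the solution about the parameter means. This is a generic first-order second-moment propagation under assumed inputs, not the authors' variational formulation.

```python
import numpy as np

def second_moment_response(K_mean, dK_dtheta, f, theta_cov):
    """First-order second-moment (FOSM) estimate of response statistics
    for K(theta) u = f: expand u about the mean parameters and propagate
    their covariance. dK_dtheta holds the stiffness sensitivities
    dK/dtheta_i evaluated at the mean."""
    u_mean = np.linalg.solve(K_mean, f)
    # First-order perturbation: du/dtheta_i = -K^{-1} (dK/dtheta_i) u_mean
    sens = np.column_stack(
        [-np.linalg.solve(K_mean, dKi @ u_mean) for dKi in dK_dtheta])
    u_cov = sens @ theta_cov @ sens.T       # second moment of the response
    return u_mean, u_cov

# Two-spring toy problem with uncertain stiffnesses k1, k2.
K = np.diag([2.0, 3.0])
f = np.array([1.0, 1.0])
dK = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]   # dK/dk1, dK/dk2
cov = np.diag([0.04, 0.09])                        # input covariance
u_m, u_c = second_moment_response(K, dK, f, cov)
```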
Variational approach to probabilistic finite elements
NASA Astrophysics Data System (ADS)
Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.
1991-08-01
Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
Variational approach to probabilistic finite elements
NASA Technical Reports Server (NTRS)
Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.
1987-01-01
Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties, and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
Roadmap for the international, accelerator-based neutrino programme
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, J.; de Gouvêa, A.; Duchesneau, D.
In line with its terms of reference the ICFA Neutrino Panel has developed a roadmap for the international, accelerator-based neutrino programme. A "roadmap discussion document" was presented in May 2016, taking into account the peer-group consultation described in the Panel's initial report. The "roadmap discussion document" was used to solicit feedback from the neutrino community, and more broadly the particle- and astroparticle-physics communities, and the various stakeholders in the programme. The roadmap and the conclusions and recommendations presented in this document take into account the comments received following the publication of the roadmap discussion document. With its roadmap the Panel documents the approved objectives and milestones of the experiments that are presently in operation or under construction. Approval, construction, and exploitation milestones are presented for experiments that are being considered for approval. The timetable proposed by the proponents is presented for experiments that are not yet being considered formally for approval. Based on this information, the evolution of the precision with which the critical parameters governing the neutrino are known has been evaluated. Branch or decision points have been identified based on the anticipated evolution in precision. The branch or decision points have in turn been used to identify desirable timelines for the neutrino-nucleus cross section and hadro-production measurements that are required to maximise the integrated scientific output of the programme. The branch points have also been used to identify the timeline for the R&D required to take the programme beyond the horizon of the next generation of experiments. The theory and phenomenology programme, including nuclear theory, required to ensure that maximum benefit is derived from the experimental programme is also discussed.
2011-01-01
ERDC TR-06-10, Supplement 2 (January 2011). Building Information Modeling (BIM) Roadmap, Supplement 2 – BIM Implementation Plan for Military... release; distribution is unlimited. Abstract: Building Information Modeling (BIM) technology provides the communities of practice in
A CFD validation roadmap for hypersonic flows
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.
1992-01-01
A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation database are given, and gaps are identified where future experiments would provide the needed validation data.
A CFD validation roadmap for hypersonic flows
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.
1993-01-01
A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation database are given, and gaps are identified where future experiments would provide the needed validation data.
Biogas Opportunities Roadmap Progress Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
In support of the Obama Administration's Climate Action Plan, the U.S. Department of Energy, the U.S. Environmental Protection Agency, and U.S. Department of Agriculture jointly released the Biogas Opportunities Roadmap Progress Report, updating the federal government's progress to reduce methane emissions through biogas systems since the Biogas Opportunities Roadmap was completed by the three agencies in July 2014. The report highlights actions taken, outlines challenges and opportunities, and identifies next steps to the growth of a robust biogas industry.
A European Roadmap for Thermophysical Properties Metrology
NASA Astrophysics Data System (ADS)
Filtz, J.-R.; Wu, J.; Stacey, C.; Hollandt, J.; Monte, C.; Hay, B.; Hameury, J.; Villamañan, M. A.; Thurzo-Andras, E.; Sarge, S.
2015-03-01
A roadmap for thermophysical properties metrology was developed in spring 2011 by the Thermophysical Properties Working Group in the EURAMET Technical Committee in charge of Thermometry, Humidity and Moisture, and Thermophysical Properties metrology. This roadmapping process is part of the EURAMET (European Association of National Metrology Institutes) activities aiming to increase impact from national investment in European metrology R&D. The roadmap shows a shared vision of how the development of thermophysical properties metrology should be oriented over the next 15 years to meet future social and economic needs. Since thermophysical properties metrology is a very broad and varied field, the authors have limited this roadmap to the following families of properties: thermal transport properties (thermal conductivity, thermal diffusivity, etc.), radiative properties (emissivity, absorbance, reflectance, and transmittance), caloric quantities (specific heat, enthalpy, etc.), thermodynamic properties (PVT and phase equilibria properties), and temperature-dependent quantities (thermal expansion, compressibility, etc.). This roadmap identifies the main societal and economical triggers that drive developments in thermophysical properties metrology. The key topics considered are energy, environment, advanced manufacturing and processing, public safety, security, and health. Key targets that require improved thermophysical properties measurements are identified in order to address these triggers. Ways are also proposed for defining the necessary skills and the main useful means to be implemented. These proposals will have to be revised as needs and technologies evolve in the future.
Fully probabilistic control design in an adaptive critic framework.
Herzallah, Randa; Kárný, Miroslav
2011-12-01
An optimal stochastic controller pushes the closed-loop behavior as close as possible to the desired one. The fully probabilistic design (FPD) uses a probabilistic description of the desired closed loop and minimizes the Kullback-Leibler divergence of the closed-loop description from the desired one. Practical exploitation of fully probabilistic design control theory continues to be hindered by the computational complexities involved in numerically solving the associated stochastic dynamic programming problem; in particular, very hard multivariate integration and approximate interpolation of the involved multivariate functions. This paper proposes a new fully probabilistic control algorithm that uses adaptive critic methods to circumvent the need for explicitly evaluating the optimal value function, thereby dramatically reducing computational requirements. This is the main contribution of this paper.
Development of a probabilistic analysis methodology for structural reliability estimation
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.
1991-01-01
The probabilistic analysis method presented for assessing structural reliability combines fast convolution with an efficient structural reliability analysis. After identifying the most important point of a limit state, the method establishes a quadratic performance function, transforms it into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.
Bayesian probabilistic population projections for all countries
Raftery, Adrian E.; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K.
2012-01-01
Projections of countries’ future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out of sample experiment in which data from 1950–1990 are used for estimation, and applied to predict 1990–2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20–64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades. PMID:22908249
A probabilistic approach to composite micromechanics
NASA Technical Reports Server (NTRS)
Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.
1988-01-01
Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Florita, Anthony R; Krishnan, Venkat K
Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
NASA Astrophysics Data System (ADS)
Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.
2017-08-01
While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function to truly represent the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.
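Independent of the response-surface coupling, the FORM step itself reduces to the Hasofer-Lind/Rackwitz-Fiessler iteration. A minimal generic sketch follows, with the limit state function and its gradient supplied by the caller; in the paper's methodology these would come from the fitted response surface, whereas here they are simple illustrative closures.

```python
import numpy as np
from scipy.stats import norm

def form_reliability(g, grad_g, n_vars, tol=1e-6, max_iter=100):
    """Hasofer-Lind/Rackwitz-Fiessler iteration for the reliability index
    of a limit state g(u) = 0 expressed in standard normal space."""
    u = np.zeros(n_vars)
    for _ in range(max_iter):
        grad = grad_g(u)
        # HL-RF update: project onto the linearized limit state surface.
        u_new = grad * (grad @ u - g(u)) / (grad @ grad)
        if np.linalg.norm(u_new - u) < tol:
            break
        u = u_new
    beta = np.linalg.norm(u)                # reliability index
    return beta, norm.cdf(-beta)            # probability of failure

# Example with a linear limit state g(u) = 3 - u1 - u2 (beta = 3/sqrt(2)).
beta, pf = form_reliability(lambda u: 3.0 - u[0] - u[1],
                            lambda u: np.array([-1.0, -1.0]), n_vars=2)
```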
Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Florita, Anthony R; Krishnan, Venkat K
2017-08-31
Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
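The scenario-generation step described in both records above can be illustrated compactly. The sketch below fits a Gaussian mixture to placeholder forecasting errors and draws error scenarios from it; the data, component count, and forecast series are assumptions, and drawing from the fitted mixture is equivalent to inverse-transform sampling of its analytic CDF.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Placeholder history of forecasting errors; in practice these come from
# the ensemble learner's historical forecasts versus observations.
errors = rng.normal(0.0, 0.05, size=5000)
gmm = GaussianMixture(n_components=3, random_state=0).fit(errors.reshape(-1, 1))

# Generate a large batch of error scenarios from the fitted mixture.
n_scenarios, horizon = 1000, 24
draws, _ = gmm.sample(n_scenarios * horizon)
err_scenarios = draws.reshape(n_scenarios, horizon)

point_forecast = np.full(horizon, 0.6)          # placeholder forecast (p.u.)
scenarios = np.clip(point_forecast + err_scenarios, 0.0, 1.0)
# A swinging door algorithm would then extract ramps from each scenario,
# and ramp duration/start-time statistics would be aggregated over all of them.
```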
5.0 Aerodynamic and Propulsive Decelerator Systems
NASA Technical Reports Server (NTRS)
Cruz, Juan R.; Powell, Richard; Masciarelli, James; Brown, Glenn; Witkowski, Al; Guernsey, Carl
2005-01-01
Contents include the following: Introduction. Capability Breakdown Structure. Decelerator Functions. Candidate Solutions. Performance and Technology. Capability State-of-the-Art. Performance Needs. Candidate Configurations. Possible Technology Roadmaps. Capability Roadmaps.
Probabilistic liver atlas construction.
Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E
2017-01-13
Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability of being covered by the liver. Furthermore, all methods to build an atlas involve prior coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration on the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, the results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.
NASA Technical Reports Server (NTRS)
Warner, James E.; Zubair, Mohammad; Ranjan, Desh
2017-01-01
This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.
Probabilistic Composite Design
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1997-01-01
Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials and fabrication process, through composite mechanics, to structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits widespread scatter at 90% cyclic-stress to static-strength ratios.
Probabilistic population projections with migration uncertainty
Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.
2016-01-01
We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
Application of Probabilistic Analysis to Aircraft Impact Dynamics
NASA Technical Reports Server (NTRS)
Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.
2003-01-01
Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as geometrically accurate models, human occupant models, and advanced material models that include nonlinear stress-strain behaviors, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.
Roadmap to a Tobacco Epidemic: Transnational Tobacco Companies Invade Indonesia
Hurt, Richard D.; Ebbert, Jon O.; Achadi, Anhari; Croghan, Ivana T.
2014-01-01
Background: Indonesia is the world's fifth largest cigarette market, but for decades transnational tobacco companies (TTCs) had limited success infiltrating this market due to their inability to compete in the kretek market. Kreteks are clove/tobacco cigarettes that most Indonesians smoke. Objective: To determine how Philip Morris International (PMI) and British American Tobacco (BAT) have now successfully achieved a substantial market presence in Indonesia. Methods: We analyzed previously secret tobacco industry documents, corporate reports on Indonesia operations, the tobacco trade press, Indonesian media, and "The Roadmap." Results: Internal corporate documents from BAT and PMI demonstrate that they had known for decades that kreteks are highly carcinogenic. Despite that knowledge, BAT and PMI now own and heavily market these products, as well as new, more Westernized versions of kreteks. BAT and PMI maintained the basic strategy of keeping cigarettes affordable by maintaining the social acceptability of smoking and opposing smoke-free workplace laws, but in the 21st century they added the acquisition and Westernization of domestic kretek manufacturers as an additional strategy. These acquisitions allowed them to assert influence on health policy in Indonesia and to grow their business under current government policy embodied in the 2007-2020 Roadmap of Tobacco Products Industry and Excise Policy, which calls for a 12% increase in cigarette production over the next 15 years. Conclusion: PMI and BAT have successfully entered and are expanding their share of the Indonesian cigarette market. Despite the obvious and pervasive influence of the tobacco industry on policy decisions, the Indonesian government should ratify the FCTC, implement effective legislation to reduce tobacco consumption and exposure to tobacco smoke, and revise the Roadmap to protect future generations of Indonesians. PMID:21852413
Idaho National Engineering Laboratory High-Level Waste Roadmap. Revision 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-08-01
The Idaho National Engineering Laboratory (INEL) High-Level Waste (HLW) Roadmap takes a strategic look at the entire HLW life cycle, starting with generation and continuing through interim storage, treatment and processing, transportation, and final disposal. The roadmap is an issue-based planning approach that compares "where we are now" to "where we want and need to be." The INEL has been effectively managing HLW for the last 30 years. Calcining operations are continuing to turn liquid HLW into a more manageable form. Although this document recognizes problems concerning HLW at the INEL, there is no imminent risk to the public or environment. By analyzing the INEL's current business operations, pertinent laws and regulations, and committed milestones, the INEL HLW Roadmap has identified eight key issues existing at the INEL that must be resolved in order to reach long-term objectives. These issues are as follows: A. The US Department of Energy (DOE) needs a consistent policy for HLW generation, handling, treatment, storage, and disposal. B. The capability for final disposal of HLW does not exist. C. Adequate processes have not been developed or implemented for immobilization and disposal of INEL HLW. D. HLW storage at the INEL is not adequate in terms of capacity and regulatory requirements. E. Waste streams are generated with limited consideration for waste minimization. F. HLW is not adequately characterized for disposal nor, in some cases, for storage. G. Research and development of all process options for INEL HLW treatment and disposal are not being adequately pursued due to resource limitations. H. HLW transportation methods are not selected or implemented. A root-cause analysis uncovered the underlying causes of each of these issues.
Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.
Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F
Insight into risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems with which the risk assessment of nanoparticles is faced is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Due to the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a better understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
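A minimal sketch of the two-dimensional (nested) Monte Carlo idea behind integrated probabilistic risk assessment follows: the outer loop resamples uncertain model parameters (uncertainty), the inner loop samples individuals (variability). All distributions and quantities below are invented placeholders, not the paper's nanosilica data.

```python
import numpy as np

rng = np.random.default_rng(1)

def ipra(n_outer=200, n_inner=5000):
    """Nested Monte Carlo separating uncertainty (outer) from
    variability (inner), in the spirit of integrated probabilistic
    risk assessment."""
    margins = []
    for _ in range(n_outer):
        # Outer draw: uncertain parameters (e.g., from bootstrap samples).
        mean_log_intake = rng.normal(0.0, 0.1)   # uncertainty in mean exposure
        haz_dose = rng.lognormal(3.0, 0.3)       # uncertain hazard level
        # Inner draws: inter-individual variability in exposure.
        intake = rng.lognormal(mean_log_intake, 0.5, size=n_inner)
        moe = haz_dose / intake                  # margin of exposure
        margins.append(np.percentile(moe, 1))    # highly exposed individuals
    lo, hi = np.percentile(margins, [2.5, 97.5]) # uncertainty interval
    return lo, hi
```

Reporting the outer-loop interval around an inner-loop percentile is what lets the assessment say which part of the spread is reducible by more data (uncertainty) and which is not (variability).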
NASA Astrophysics Data System (ADS)
Rubinsztein-Dunlop, Halina; Forbes, Andrew; Berry, M. V.; Dennis, M. R.; Andrews, David L.; Mansuripur, Masud; Denz, Cornelia; Alpmann, Christina; Banzer, Peter; Bauer, Thomas; Karimi, Ebrahim; Marrucci, Lorenzo; Padgett, Miles; Ritsch-Marte, Monika; Litchinitser, Natalia M.; Bigelow, Nicholas P.; Rosales-Guzmán, C.; Belmonte, A.; Torres, J. P.; Neely, Tyler W.; Baker, Mark; Gordon, Reuven; Stilgoe, Alexander B.; Romero, Jacquiline; White, Andrew G.; Fickler, Robert; Willner, Alan E.; Xie, Guodong; McMorran, Benjamin; Weiner, Andrew M.
2017-01-01
Structured light refers to the generation and application of custom light fields. As the tools and technology to create and detect structured light have evolved, steadily the applications have begun to emerge. This roadmap touches on the key fields within structured light from the perspective of experts in those areas, providing insight into the current state and the challenges their respective fields face. Collectively the roadmap outlines the venerable nature of structured light research and the exciting prospects for the future that are yet to be realized.
Summary of NASA Advanced Telescope and Observatory Capability Roadmap
NASA Technical Reports Server (NTRS)
Stahl, H. Phil; Feinberg, Lee
2006-01-01
The NASA Advanced Telescope and Observatory (ATO) Capability Roadmap addresses technologies necessary for NASA to enable future space telescopes and observatories operating in all electromagnetic bands, from x-rays to millimeter waves, and including gravity-waves. It lists capability priorities derived from current and developing Space Missions Directorate (SMD) strategic roadmaps. Technology topics include optics; wavefront sensing and control and interferometry; distributed and advanced spacecraft systems; cryogenic and thermal control systems; large precision structure for observatories; and the infrastructure essential to future space telescopes and observatories.
ILEWG technology roadmap for Moon exploration
NASA Astrophysics Data System (ADS)
Foing, Bernard H.
2008-04-01
We discuss the charter and activities of the International Lunar Exploration Working Group (ILEWG) and give an update from the related ILEWG task groups. We discuss the rationale and technology roadmap for Moon exploration, as debated in previous ILEWG conferences. The technology rationale includes: 1) the advancement of instrumentation; 2) technologies in robotic and human exploration; and 3) the potential for Moon-Mars exploration to inspire solutions to global Earth sustained development. We finally discuss a possible roadmap for development of technologies necessary for Moon and Mars exploration.
Summary of NASA Advanced Telescope and Observatory Capability Roadmap
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Feinberg, Lee
2007-01-01
The NASA Advanced Telescope and Observatory (ATO) Capability Roadmap addresses technologies necessary for NASA to enable future space telescopes and observatories operating in all electromagnetic bands, from x-rays to millimeter waves, and including gravity-waves. It lists capability priorities derived from current and developing Space Missions Directorate (SMD) strategic roadmaps. Technology topics include optics; wavefront sensing and control and interferometry; distributed and advanced spacecraft systems; cryogenic and thermal control systems; large precision structure for observatories; and the infrastructure essential to future space telescopes and observatories.
These Roadmaps identify scientific gaps that inform the National Research Programs in the development of their Strategic Research Action Plans. EPA expects to use this approach to integrate existing research efforts and to identify needed work.
Influence Diagrams as Decision-Making Tools for Pesticide Risk Management
The pesticide policy arena is filled with discussion of probabilistic approaches to assess ecological risk; however, similar discussions about implementing formal probabilistic methods in pesticide risk decision making are less common. An influence diagram approach is proposed f...
Environmental probabilistic quantitative assessment methodologies
Crovelli, R.A.
1995-01-01
In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacKinnon, Robert J.; Kuhlman, Kristopher L
2016-05-01
We present a method of control variates for calculating improved estimates for mean performance quantities of interest, E(PQI), computed from Monte Carlo probabilistic simulations. An example of a PQI is the concentration of a contaminant at a particular location in a problem domain computed from simulations of transport in porous media. To simplify the presentation, the method is described in the setting of a one-dimensional elliptical model problem involving a single uncertain parameter represented by a probability distribution. The approach can be easily implemented for more complex problems involving multiple uncertain parameters and in particular for application to probabilistic performance assessment of deep geologic nuclear waste repository systems. Numerical results indicate the method can produce estimates of E(PQI) having superior accuracy on coarser meshes and reduce the required number of simulations needed to achieve an acceptable estimate.
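The variance-reduction mechanism itself is simple to illustrate. The sketch below is a generic control-variates example, not the report's elliptic solver: it improves a Monte Carlo estimate of E[f(X)] using a correlated control g(X) with known mean, where f stands in for an expensive fine-mesh PQI and g for a cheap surrogate (both chosen purely for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.uniform(0.0, 1.0, 10_000)
f = np.exp(x)                   # "expensive" quantity of interest
g = 1.0 + x                     # control variate with E[g] = 1.5 exactly

c = np.cov(f, g)[0, 1] / np.var(g)       # near-optimal coefficient
est_plain = f.mean()
est_cv = f.mean() - c * (g.mean() - 1.5) # control-variates estimator

# est_cv has the same expectation as est_plain but far lower variance,
# because f and g are strongly correlated; this is how fewer simulations
# can reach the same accuracy for E(PQI).
```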
A Roadmap for the Development of Alternative (Non-Animal) Methods for Systemic Toxicity Testing
Systemic toxicity testing forms the cornerstone for the safety evaluation of substances. Pressures to move from traditional animal models to novel technologies arise from various concerns, including: the need to evaluate large numbers of previously untested chemicals and new prod...
A roadmap for natural product discovery based on large-scale genomics and metabolomics
USDA-ARS?s Scientific Manuscript database
Actinobacteria encode a wealth of natural product biosynthetic gene clusters, whose systematic study is complicated by numerous repetitive motifs. By combining several metrics we developed a method for global classification of these gene clusters into families (GCFs) and analyzed the biosynthetic ca...
An Interim Report on NASA's Draft Space Technology Roadmaps
NASA Technical Reports Server (NTRS)
2011-01-01
NASA has developed a set of 14 draft roadmaps to guide the development of space technologies under the leadership of the NASA Office of the Chief Technologist (OCT). Each of these roadmaps focuses on a particular technology area (TA). The roadmaps are intended to foster the development of advanced technologies and concepts that address NASA's needs and contribute to other aerospace and national needs. OCT requested that the National Research Council conduct a study to review the draft roadmaps, gather and assess relevant community input, and make recommendations and suggest priorities to inform NASA's decisions as it finalizes its roadmaps. The statement of task states that "based on the results of the community input and its own deliberations, the steering committee will prepare a brief interim report that addresses high-level issues associated with the roadmaps, such as the advisability of modifying the number or technical focus of the draft NASA roadmaps." This interim report, which does not include formal recommendations, addresses that one element of the study charge. NASA requested this interim report so that it would have the opportunity to make an early start in modifying the draft roadmaps based on feedback from the panels and steering committee. The final report will address all other tasks in the statement of task. In particular, the final report will include a prioritization of technologies, will describe in detail the prioritization process and criteria, and will include specific recommendations on a variety of topics, including many of the topics mentioned in this interim report. In developing both this interim report and the final report to come, the steering committee draws on the work of six study panels organized by technical area, loosely following the organization of the 14 roadmaps, as follows: Panel 1, Propulsion and Power (TA01 Launch Propulsion Systems; TA02 In-Space Propulsion Technologies; TA03 Space Power and Energy Storage Systems; TA13 Ground and Launch Systems Processing); Panel 2, Robotics, Communications, and Navigation (TA04 Robotics, TeleRobotics, and Autonomous Systems; TA05 Communication and Navigation Systems); Panel 3, Instruments and Computing (TA08 Science Instruments, Observatories, and Sensor Systems; TA11 Modeling, Simulation, Information Technology, and Data Processing); Panel 4, Human Health and Surface Exploration (TA06 Human Health, Life Support, and Habitation Systems; TA07 Human Exploration Destination Systems); Panel 5, Materials (TA10 Nanotechnology; TA12 Materials, Structures, Mechanical Systems, and Manufacturing; TA14 Thermal Management Systems); and Panel 6, Entry, Descent, and Landing (TA09 Entry, Descent, and Landing Systems). In addition to drawing on the expertise represented on the steering committee and panels, the committee obtained input from 14 public workshops, one held on each of the 14 draft roadmaps. At these one-day workshops, invited speakers, guests, and members of the public engaged in discussions on the different technology areas and their value to NASA. Broad community input was also solicited via a public website, where more than 240 public comments were received on the draft roadmaps in response to criteria (such as benefit, risk and reasonableness, and alignment with NASA and national goals) that the steering committee established. This interim report reflects the results of deliberations by the steering committee in light of these public inputs as well as additional inputs from the six panels.
The steering committee's final report will be completed early in 2012. That report will prioritize the technologies that span the entire scope of the 14 roadmaps and provide additional guidance on crosscutting themes and other relevant topics.
Perez-Cruz, Pedro E.; dos Santos, Renata; Silva, Thiago Buosi; Crovador, Camila Souza; Nascimento, Maria Salete de Angelis; Hall, Stacy; Fajardo, Julieta; Bruera, Eduardo; Hui, David
2014-01-01
Context: Survival prognostication is important during end-of-life care. The accuracy of clinician prediction of survival (CPS) over time has not been well characterized. Objectives: To examine changes in prognostication accuracy during the last 14 days of life in a cohort of patients with advanced cancer admitted to two acute palliative care units, and to compare the accuracy of the temporal and probabilistic approaches. Methods: Physicians and nurses prognosticated survival daily for cancer patients in two hospitals until death or discharge using two prognostic approaches: temporal and probabilistic. We assessed accuracy for each method daily during the last 14 days of life, comparing accuracy at day −14 (baseline) with accuracy at each subsequent time point using a test of proportions. Results: Physicians and nurses provided 6718 temporal and 6621 probabilistic estimations for 311 patients. Median (interquartile range) survival was 8 (4, 20) days. Temporal CPS had low accuracy (10-40%) that did not change over time. In contrast, probabilistic CPS was significantly more accurate (p<.05 at each time point), but its accuracy decreased close to death. Conclusion: Probabilistic CPS was consistently more accurate than temporal CPS over the last 14 days of life; however, its accuracy decreased as patients approached death. Our findings suggest that better tools to predict impending death are necessary. PMID:24746583
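As a rough illustration of the test of proportions used above, the sketch below compares hypothetical prediction-accuracy counts at day −14 and a later day. All counts are invented, and statsmodels' `proportions_ztest` stands in for whatever implementation the authors used.

```python
# Illustrative two-proportion test, in the spirit of the paper's comparison of
# accuracy at day -14 (baseline) with a later day. All counts are invented.
from statsmodels.stats.proportion import proportions_ztest

correct = [62, 48]   # correct predictions at day -14 and day -3 (hypothetical)
total = [240, 180]   # predictions made on each of those days (hypothetical)

stat, pval = proportions_ztest(count=correct, nobs=total)
print(f"z = {stat:.2f}, p = {pval:.3f}")
```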
NASA Technical Reports Server (NTRS)
Fayssal, Safie; Weldon, Danny
2008-01-01
The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program called Constellation to send crew and cargo to the International Space Station, to the Moon, and beyond. As part of the Constellation program, a new launch vehicle, Ares I, is being developed by NASA Marshall Space Flight Center. Designing a launch vehicle with high reliability and increased safety requires a significant effort in understanding design variability and design uncertainty at the various levels of the design (system, element, subsystem, component, etc.) and throughout the various design phases (conceptual, preliminary design, etc.). In a previous paper [1] we discussed a probabilistic functional failure analysis approach intended mainly to support system requirements definition, system design, and element design during the early design phases. This paper provides an overview of the application of probabilistic engineering methods to support the detailed subsystem/component design and development as part of the "Design for Reliability and Safety" approach for the new Ares I launch vehicle. Specifically, the paper discusses probabilistic engineering design analysis cases that had a major impact on the design and manufacturing of Space Shuttle hardware. The cases represent important lessons learned from the Space Shuttle Program and clearly demonstrate the significance of probabilistic engineering analysis in better understanding design deficiencies and identifying potential design improvements for Ares I. The paper also discusses the probabilistic functional failure analysis approach applied during the early design phases of Ares I and the forward plans for probabilistic design analysis in the detailed design and development phases.
Martin, Sébastien; Troccaz, Jocelyne; Daanen, Vincent
2010-04-01
The authors present a fully automatic algorithm for the segmentation of the prostate in three-dimensional magnetic resonance (MR) images. The approach requires the use of an anatomical atlas, which is built by computing transformation fields mapping a set of manually segmented images to a common reference. These transformation fields are then applied to the manually segmented structures of the training set in order to get a probabilistic map on the atlas. The segmentation is then realized through a two-stage procedure. In the first stage, the processed image is registered to the probabilistic atlas. Subsequently, a probabilistic segmentation is obtained by mapping the probabilistic map of the atlas to the patient's anatomy. In the second stage, a deformable surface evolves toward the prostate boundaries by merging information coming from the probabilistic segmentation, an image feature model, and a statistical shape model. During the evolution of the surface, the probabilistic segmentation introduces a spatial constraint that prevents the deformable surface from leaking into an unlikely configuration. The proposed method is evaluated on 36 exams that were manually segmented by a single expert. A median Dice similarity coefficient of 0.86 and an average surface error of 2.41 mm are achieved. By merging prior knowledge, the presented method achieves a robust and completely automatic segmentation of the prostate in MR images. Results show that the use of a spatial constraint is useful to increase the robustness of the deformable model compared with a deformable surface driven only by an image appearance model.
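A minimal sketch of the atlas-construction step described above, with noisy spheres standing in for registered manual segmentations; the shapes, subject count, and threshold are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy probabilistic atlas: manual segmentations already warped to a common
# reference. Here, noisy spheres stand in for registered prostate labels.
shape = (32, 32, 32)
zz, yy, xx = np.mgrid[:32, :32, :32]
dist = np.sqrt((xx - 16) ** 2 + (yy - 16) ** 2 + (zz - 16) ** 2)

n_subjects = 12
masks = np.stack([dist < rng.normal(8.0, 1.0) for _ in range(n_subjects)])

# Voxel-wise probability of belonging to the structure.
prob_map = masks.mean(axis=0)

# Mapping this map through the patient-to-atlas registration yields the
# paper's probabilistic segmentation; a hard label needs only a threshold.
segmentation = prob_map > 0.5
print("voxels labelled prostate:", int(segmentation.sum()))
```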
General Purpose Probabilistic Programming Platform with Effective Stochastic Inference
2018-04-01
Excerpt of report contents: 2.2 Venture; 2.3 BayesDB; 2.4 Picture; 2.5 MetaProb; 3.0 Methods, Assumptions, and Procedures; 4.0 Results and Discussion. The methods section outlines the research approach. The results and discussion section gives representative quantitative and qualitative results ... modeling via CrossCat, a probabilistic method that emulates many of the judgment calls ordinarily made by a human data analyst. This AI assistance ...
Superposition-Based Analysis of First-Order Probabilistic Timed Automata
NASA Astrophysics Data System (ADS)
Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph
This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants, and nuclear power plants. For many applications besides the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantage of the method in comparison with traditional PSHA lies in (1) its flexibility, allowing different probabilistic models of earthquake occurrence to be used and advanced physical models to be incorporated into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied to evaluate the risk of production interruption losses of a nuclear power plant during its residual lifetime.
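A minimal numeric sketch of steps (1)-(4) above, assuming invented magnitude classes, occurrence frequencies, and conditional exceedance probabilities, and Poisson occurrences over the structure's residual lifetime.

```python
import numpy as np

# Hypothetical magnitude classes, annual occurrence frequencies (1/yr),
# and conditional probabilities that the bounding scenario of each class
# exceeds a critical design parameter (from a vulnerability analysis).
classes = ["M6.0-6.5", "M6.5-7.0", "M7.0-7.5"]
freq = np.array([1e-3, 3e-4, 5e-5])
p_exceed = np.array([0.01, 0.10, 0.45])

lifetime = 20.0  # residual lifetime of the structure, years (assumed)

# Annual frequency of exceedance, summed over initiating-event classes,
# then converted to a lifetime probability assuming Poisson occurrences.
annual_rate = np.sum(freq * p_exceed)
p_lifetime = 1.0 - np.exp(-annual_rate * lifetime)
print(f"annual exceedance rate: {annual_rate:.2e} /yr")
print(f"P(exceedance within {lifetime:.0f} yr): {p_lifetime:.2e}")
```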
A probabilistic approach to aircraft design emphasizing stability and control uncertainties
NASA Astrophysics Data System (ADS)
Delaurentis, Daniel Andrew
In order to address identified deficiencies in current approaches to aerospace systems design, a new method has been developed. This new method for design is based on the premise that design is a decision-making activity, and that deterministic analysis and synthesis can lead to poor or misguided decision making. This is due to a lack of disciplinary knowledge of sufficient fidelity about the product, to the presence of uncertainty at multiple levels of the aircraft design hierarchy, and to a failure to focus on overall affordability metrics as measures of goodness. Design solutions are desired which are robust to uncertainty and are based on the maximum knowledge possible. The new method represents advances in the two following general areas. 1. Design models and uncertainty. The research performed completes a transition from a deterministic design representation to a probabilistic one through a modeling of design uncertainty at multiple levels of the aircraft design hierarchy, including: (1) consistent, traceable uncertainty classification and representation; (2) a concise mathematical statement of the Probabilistic Robust Design problem; (3) variants of the Cumulative Distribution Functions (CDFs) as decision functions for Robust Design; (4) probabilistic sensitivities which identify the most influential sources of variability. 2. Multidisciplinary analysis and design. Embedded in the probabilistic methodology is a new approach for multidisciplinary design analysis and optimization (MDA/O), employing disciplinary analysis approximations formed through statistical experimentation and regression. These approximation models are a function of design variables common to the system level as well as other disciplines. For aircraft, it is proposed that synthesis/sizing is the proper avenue for integrating multiple disciplines. Research hypotheses are translated into a structured method, which is subsequently tested for validity. Specifically, the implementation involves the study of the relaxed static stability technology for a supersonic commercial transport aircraft. The probabilistic robust design method is exercised, resulting in a series of robust design solutions based on different interpretations of "robustness". Insightful results are obtained, and the ability of the method to expose trends in the design space is noted as a key advantage.
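The sketch below illustrates the CDF-as-decision-function idea under a toy surrogate model; the response function, variable names, and input distributions are all invented, not the dissertation's actual synthesis/sizing models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy surrogate for an affordability metric as a function of two design
# variables with uncertainty (names and model are invented for illustration).
def takeoff_gross_weight(static_margin, engine_sfc):
    return 300.0 + 40.0 * static_margin**2 + 500.0 * engine_sfc

n = 10_000
static_margin = rng.normal(-0.02, 0.01, n)   # relaxed static stability, uncertain
engine_sfc = rng.normal(1.20, 0.05, n)       # specific fuel consumption, uncertain

metric = takeoff_gross_weight(static_margin, engine_sfc)

# Empirical CDF evaluated at a requirement; a robust design maximizes
# P(metric <= requirement) rather than optimizing a deterministic point value.
requirement = 905.0
print("P(TOGW <= requirement):", np.mean(metric <= requirement))
print("95th percentile:", np.percentile(metric, 95))
```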
Schmitt, Jochen; Apfelbacher, Christian; Spuls, Phyllis I; Thomas, Kim S; Simpson, Eric L; Furue, Masutaka; Chalmers, Joanne; Williams, Hywel C
2015-01-01
Core outcome sets (COSs) are consensus-derived minimum sets of outcomes to be assessed in a specific situation. COSs are being increasingly developed to limit outcome-reporting bias, allow comparisons across trials, and strengthen clinical decision making. Despite the increasing interest in outcomes research, methods to develop COSs have not yet been standardized. The aim of this paper is to present the Harmonizing Outcomes Measures for Eczema (HOME) roadmap for the development and implementation of COSs, which was developed on the basis of our experience in the standardization of outcome measurements for atopic eczema. Following the establishment of a panel representing all relevant stakeholders and a research team experienced in outcomes research, the scope and setting of the core set should be defined. The next steps are the definition of a core set of outcome domains such as symptoms or quality of life, followed by the identification or development and validation of appropriate outcome measurement instruments to measure these core domains. Finally, the consented COS needs to be disseminated, implemented, and reviewed. We believe that the HOME roadmap is a useful methodological framework to develop COSs in dermatology, with the ultimate goal of better decision making and promoting patient-centered health care.
A novel probabilistic framework for event-based speech recognition
NASA Astrophysics Data System (ADS)
Juneja, Amit; Espy-Wilson, Carol
2003-10-01
One of the reasons for the unsatisfactory performance of state-of-the-art automatic speech recognition (ASR) systems is the inferior acoustic modeling of low-level acoustic-phonetic information in the speech signal. An acoustic-phonetic approach to ASR, on the other hand, explicitly targets linguistic information in the speech signal, but no such system for continuous speech recognition (CSR) is known to exist. A probabilistic and statistical framework for CSR based on the representation of speech sounds by bundles of binary-valued articulatory phonetic features is proposed. Multiple probabilistic sequences of linguistically motivated landmarks are obtained using binary classifiers of manner phonetic features (syllabic, sonorant, and continuant) and the knowledge-based acoustic parameters (APs) that are acoustic correlates of those features. The landmarks are then used for the extraction of knowledge-based APs for source and place phonetic features and their binary classification. Probabilistic landmark sequences are constrained using manner-class language models for isolated or connected word recognition. The proposed method could overcome the disadvantages encountered by the early acoustic-phonetic knowledge-based systems that led the ASR community to switch to systems highly dependent on statistical pattern analysis methods and probabilistic language or grammar models.
NASA Technical Reports Server (NTRS)
Van Dalsem, William; Krishnakumar, Kalmanje Srinivas
2016-01-01
This is a PowerPoint presentation that highlights autonomy across the 15 NASA technology roadmaps, including specific examples of projects (past and present) at NASA Ames Research Center. The NASA technology roadmaps are located here: http://www.nasa.gov/offices/oct/home/roadmaps/index.html
NASA Astrophysics Data System (ADS)
Kramer, G. Y.; Lawrence, D. J.; Neal, C. R.; Clark, P. E.; Green, R. O.; Horanyi, M.; Johnson, M. D.; Kelso, R. M.; Sultana, M.; Thompson, D. R.
2016-11-01
A Lunar Capabilities Roadmap (LCR) is required to highlight capabilities critical for science and exploration of the Moon as well as beyond. The LCR will focus mainly on capabilities with examples of specific technologies to satisfy those needs.
Development of the INEEL Site Wide Vadose Zone Roadmap
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yonk, Alan Keith
2001-09-01
The INEEL Vadose Zone Roadmap was developed to identify inadequacies in current knowledge, to assist in contaminant management capabilities relative to the INEEL vadose zone, and to ensure that ongoing and planned Science and Technology (S&T) developments will meet the risk management challenges facing the INEEL in coming years. The primary objective of the Roadmap is to determine the S&T needs that will facilitate the monitoring, characterization, prediction, and assessment activities necessary to support INEEL risk management decisions and to ensure that long-term stewardship of contaminated sites at the INEEL is achieved. The mission of the Roadmap is to ensure that the long-term S&T strategy is aligned with site programs, that it takes advantage of progress made to date, and that it can assist in meeting the milestones and budgets of operations.
Joint Probabilistic Projection of Female and Male Life Expectancy
Raftery, Adrian E.; Lalic, Nevena; Gerland, Patrick
2014-01-01
BACKGROUND The United Nations (UN) produces population projections for all countries every two years. These are used by international organizations, governments, the private sector and researchers for policy planning, for monitoring development goals, as inputs to economic and environmental models, and for social and health research. The UN is considering producing fully probabilistic population projections, for which joint probabilistic projections of future female and male life expectancy at birth are needed. OBJECTIVE We propose a methodology for obtaining joint probabilistic projections of female and male life expectancy at birth. METHODS We first project female life expectancy using a one-sex method for probabilistic projection of life expectancy. We then project the gap between female and male life expectancy. We propose an autoregressive model for the gap in a future time period for a particular country, which is a function of female life expectancy and a t-distributed random perturbation. This method takes into account mortality data limitations, is comparable across countries, and accounts for shocks. We estimate all parameters based on life expectancy estimates for 1950–2010. The methods are implemented in the bayesLife and bayesPop R packages. RESULTS We evaluated our model using out-of-sample projections for the period 1995–2010, and found that our method performed better than several possible alternatives. CONCLUSIONS We find that the average gap between female and male life expectancy has been increasing for female life expectancy below 75, and decreasing for female life expectancy above 75. Our projections of the gap are lower than the UN’s 2008 projections for most countries and so lead to higher projections of male life expectancy. PMID:25580082
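A rough simulation sketch of the gap model class described above (an AR-style mean reversion plus a t-distributed shock, with a drift change around a female life expectancy of 75); every coefficient here is invented for illustration, not the fitted bayesLife values.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_gap(g0, female_le, phi=0.9, sigma=0.35, df=4, n_traj=1000):
    """Simulate trajectories of the female-male life expectancy gap.

    AR(1)-style update with a t-distributed shock, loosely following the
    paper's model class; all coefficients are invented for illustration.
    """
    n_periods = len(female_le)
    gaps = np.empty((n_traj, n_periods))
    g = np.full(n_traj, g0)
    for t in range(n_periods):
        # Mean gap drifts down once female life expectancy exceeds 75,
        # mimicking the widening/narrowing pattern reported in the paper.
        target = 5.0 if female_le[t] < 75 else 5.0 - 0.1 * (female_le[t] - 75)
        g = target + phi * (g - target) + sigma * rng.standard_t(df, n_traj)
        gaps[:, t] = g
    return gaps

female_le = np.linspace(78, 86, 18)          # hypothetical 5-year-period values
gaps = simulate_gap(g0=5.5, female_le=female_le)
print("median gap, final period:", np.median(gaps[:, -1]).round(2))
print("80% interval:", np.percentile(gaps[:, -1], [10, 90]).round(2))
```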
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr
In this study we examined and compared three different probabilistic distribution methods to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distribution methods, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of earthquake occurrence for different elapsed times using these three distribution methods. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the three distribution methods in this region.
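A small sketch of the same model comparison using scipy in place of Easyfit/Matlab: fit the two- and three-parameter Weibull and the Frechet (inverse Weibull) distributions, then score each fit with the K-S test. The inter-event data here are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical inter-event times (years) between M >= 6.0 earthquakes.
times = rng.weibull(1.3, 80) * 12.0

candidates = {
    "Weibull (2-p)": stats.weibull_min,
    "Frechet": stats.invweibull,
    "Weibull (3-p)": stats.weibull_min,   # location left free -> 3 parameters
}

for name, dist in candidates.items():
    if name == "Weibull (2-p)":
        params = dist.fit(times, floc=0)  # fix location at zero
    else:
        params = dist.fit(times)
    ks = stats.kstest(times, dist.cdf, args=params)
    print(f"{name:14s} K-S D = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")
```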
NASA Technical Reports Server (NTRS)
1992-01-01
The technical effort and computer code developed during the first year are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis.
Lunar Surface Systems Supportability Technology Development Roadmap
NASA Technical Reports Server (NTRS)
Oeftering, Richard C.; Struk, Peter M.; Green, Jennifer L.; Chau, Savio N.; Curell, Philip C.; Dempsey, Cathy A.; Patterson, Linda P.; Robbins, William; Steele, Michael A.; D'Annunzio, Anthony;
2011-01-01
The Lunar Surface Systems Supportability Technology Development Roadmap is a guide for developing the technologies needed to enable the supportable, sustainable, and affordable exploration of the Moon and other destinations beyond Earth. Supportability is defined in terms of space maintenance, repair, and related logistics. This report considers the supportability lessons learned from NASA and the Department of Defense. Lunar Outpost supportability needs are summarized, and a supportability technology strategy is established to make the transition from high logistics dependence to logistics independence. This strategy will enable flight crews to act effectively to respond to problems and exploit opportunities in an environment of extreme resource scarcity and isolation. The supportability roadmap defines the general technology selection criteria. Technologies are organized into three categories: diagnostics, test, and verification; maintenance and repair; and scavenge and recycle. Furthermore, "embedded technologies" and "process technologies" are used to designate distinct technology types with different development cycles. The roadmap examines the current technology readiness level and lays out a four-phase incremental development schedule with selection decision gates. The supportability technology roadmap is intended to develop technologies with the widest possible capability and utility while minimizing the impact on crew time and training and remaining within the time and cost constraints of the program.
Incremental dynamical downscaling for probabilistic analysis based on multiple GCM projections
NASA Astrophysics Data System (ADS)
Wakazuki, Y.
2015-12-01
A dynamical downscaling method for probabilistic regional-scale climate change projections was developed to cover the uncertainty of multiple general circulation model (GCM) climate simulations. The climatological increments (future minus present climate states) estimated from GCM simulation results were statistically analyzed using singular value decomposition. Both positive and negative perturbations from the ensemble mean, with magnitudes equal to their standard deviations, were extracted and added to the ensemble mean of the climatological increments. The resulting multiple modal increments were used to create multiple modal lateral boundary conditions for future-climate regional climate model (RCM) simulations by adding them to an objective analysis data set. This data handling can be regarded as an advanced form of the pseudo-global-warming (PGW) method previously developed by Kimura and Kitoh (2007). The incremental handling of GCM simulations realized approximate probabilistic climate change projections with a smaller number of RCM simulations. Three values of a climatological variable simulated by RCMs for a mode were used to estimate the response to the perturbation of that mode. For the probabilistic analysis, climatological variables of RCMs were assumed to respond linearly to the multiple modal perturbations, although non-linearity was seen for local-scale rainfall. The probability of temperature could be estimated with two-mode perturbation simulations, where the number of RCM simulations for the future climate is five. On the other hand, local-scale rainfall needed four-mode simulations, where the number of RCM simulations is nine. The probabilistic method is expected to be used for regional-scale climate change impact assessment in the future.
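A compact sketch of the mode-extraction step, assuming synthetic GCM increments: SVD of the ensemble anomalies gives the leading modes, and the ensemble mean plus and minus one standard deviation of each mode amplitude yields the five boundary-condition increments quoted above for the two-mode case.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical climatological increments (future minus present) from
# 20 GCMs on a flattened 500-point grid (stand-ins for real model output).
n_gcm, n_grid = 20, 500
increments = rng.normal(2.0, 0.5, (n_gcm, n_grid))
mean_inc = increments.mean(axis=0)
anomalies = increments - mean_inc

# Leading modes of inter-GCM spread via singular value decomposition.
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)

boundary_increments = [mean_inc]            # ensemble-mean PGW increment
for k in range(2):                          # two leading modes
    std_k = s[k] / np.sqrt(n_gcm - 1)       # std dev of the mode amplitude
    for sign in (+1.0, -1.0):
        boundary_increments.append(mean_inc + sign * std_k * vt[k])

# Each increment field would be added to an objective analysis to build one
# PGW-style lateral boundary condition; two modes -> five RCM simulations.
print("RCM simulations required:", len(boundary_increments))
```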
Probabilistic structural analysis using a general purpose finite element program
NASA Astrophysics Data System (ADS)
Riha, D. S.; Millwater, H. R.; Thacker, B. H.
1992-07-01
This paper presents an accurate and efficient method to predict the probabilistic response for structural response quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis and fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation with excellent accuracy.
Structural system reliability calculation using a probabilistic fault tree analysis method
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.
1992-01-01
The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computation-intensive calculations. A computer program has been developed to implement the PFTA.
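A toy version of the fault-tree evaluation, using plain Monte Carlo where the paper uses approximation functions and adaptive importance sampling; the limit states and the tree structure below are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Hypothetical limit-state margins for three bottom events (failure if < 0).
g1 = rng.normal(3.0, 1.0, n)   # component A margin
g2 = rng.normal(2.5, 1.0, n)   # component B margin
g3 = rng.normal(4.0, 1.5, n)   # component C margin

e1, e2, e3 = g1 < 0, g2 < 0, g3 < 0

# Fault tree: TOP = e3 OR (e1 AND e2). Plain Monte Carlo is shown here;
# the paper instead uses approximation functions plus adaptive importance
# sampling to make this affordable for expensive structural models.
top = e3 | (e1 & e2)
print("estimated system failure probability:", top.mean())
```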
OBPR Free Flyer draft roadmap overview
NASA Technical Reports Server (NTRS)
Israelsson, Ulf
2005-01-01
The purpose of the OBPR Free Flyer Roadmap is to describe the OBPR research enabled by a free-flying spacecraft capability and to illustrate how research performed on free-flying spacecraft complements current and planned OBPR ISS activities.
Cyber S&T Priority Steering Council Research Roadmap
2011-11-08
Priority Steering Council Research Roadmap, presented at the National Defense Industrial Association (NDIA) Disruptive Technologies Conference, 8 November 2011. Approved for public release; distribution unlimited.
INTEGRATED ENVIRONMENTAL STRATEGIES HANDBOOK
Chapter 1: Introduction, Background, Roadmap: History and motivation behind IES, historical background, where the program is going, roadmap (brief paragraphs explaining content of each chapter and possibly the audience sector who will benefit from reading the chapter). Chapt...
Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review
NASA Technical Reports Server (NTRS)
Antonsson, Erik; Gombosi, Tamas
2005-01-01
Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M&S environments and infrastructure.
PCEMCAN - Probabilistic Ceramic Matrix Composites Analyzer: User's Guide, Version 1.0
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Mital, Subodh K.; Murthy, Pappu L. N.
1998-01-01
PCEMCAN (Probabilistic CEramic Matrix Composites ANalyzer) is an integrated computer code developed at NASA Lewis Research Center that simulates uncertainties associated with the constituent properties, manufacturing process, and geometric parameters of fiber-reinforced ceramic matrix composites and quantifies their random thermomechanical behavior. The PCEMCAN code can perform deterministic as well as probabilistic analyses to predict thermomechanical properties. This user's guide details the step-by-step procedure to create an input file and to update or modify the material properties database required to run the PCEMCAN computer code. An overview of the geometric conventions, micromechanical unit cell, nonlinear constitutive relationship, and probabilistic simulation methodology is also provided in the manual. Fast probability integration as well as Monte Carlo simulation methods are available for the uncertainty simulation. The various options available in the code to simulate probabilistic material properties and to quantify the sensitivity of the primitive random variables are described. Deterministic and probabilistic results are illustrated using demonstration problems. For a detailed theoretical description of the deterministic and probabilistic analyses, the user is referred to the companion documents "Computational Simulation of Continuous Fiber-Reinforced Ceramic Matrix Composite Behavior," NASA TP-3602, 1996, and "Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites," NASA TM-4766, June 1997.
Proposal of a method for evaluating tsunami risk using response-surface methodology
NASA Astrophysics Data System (ADS)
Fukutani, Y.
2017-12-01
Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g., Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are inefficient and their calculation cost is high, since they require multiple tsunami numerical simulations, and they therefore lack versatility. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the object variable. Tsunami risk could then be evaluated by conducting a Monte Carlo simulation, assuming that the generation probability of an earthquake follows a Poisson distribution, that the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and that the damage probability of a target follows a lognormal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a*x1 + b*x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probabilistic distributions for earthquake generation, inundation height, and vulnerability. Based on these distributions, we conducted Monte Carlo simulations of 1,000,000 years. We found that the expected damage probability of the studied wood building is 22.5%, given that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface and Monte Carlo simulation, without conducting multiple additional tsunami numerical simulations.
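The Monte Carlo step can be sketched directly from the published response surface. The recurrence rate, input distributions, and fragility parameters below are invented, so the resulting damage probability will not reproduce the paper's 22.5%.

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(5)
a, b, c = 0.2615, 3.1763, -1.1802      # response-surface coefficients from the paper

years = 1_000_000                      # simulated period, as in the paper
rate = 1.0 / 500.0                     # assumed mean earthquake recurrence (1/yr)
n_quakes = rng.poisson(rate * years)   # Poisson occurrence of earthquakes

depth = rng.uniform(1.0, 10.0, n_quakes)    # fault depth x1; assumed range
slip = rng.lognormal(-0.7, 0.6, n_quakes)   # fault slip x2; assumed distribution

inundation = np.clip(a * depth + b * slip + c, 0.0, None)   # y from the surface

# Lognormal fragility for the wood building (median 2.0 m, log-std 0.5; assumed).
p_damage = lognorm.cdf(inundation, s=0.5, scale=2.0)
print("expected damage probability given an earthquake:", round(p_damage.mean(), 3))
```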
76 FR 28102 - Notice of Issuance of Regulatory Guide
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-13
..., Probabilistic Risk Assessment Branch, Division of Risk Analysis, Office of Nuclear Regulatory Research, U.S... approaches and methods (whether quantitative or qualitative, deterministic or probabilistic), data, and... uses in evaluating specific problems or postulated accidents, and data that the staff needs in its...
Campbell, Kieran R; Yau, Christopher
2017-03-15
Modeling bifurcations in single-cell transcriptomics data has become an increasingly popular field of research. Several methods have been proposed to infer bifurcation structure from such data, but all rely on heuristic non-probabilistic inference. Here we propose the first generative, fully probabilistic model for such inference, based on a Bayesian hierarchical mixture of factor analyzers. Our model exhibits competitive performance on large datasets despite implementing full Markov chain Monte Carlo sampling, and its unique hierarchical prior structure enables automatic determination of genes driving the bifurcation process. We additionally propose an Empirical-Bayes-like extension that deals with the high levels of zero-inflation in single-cell RNA-seq data and quantify when such models are useful. We apply our model to both real and simulated single-cell gene expression data and compare the results to existing pseudotime methods. Finally, we discuss both the merits and weaknesses of such a unified, probabilistic approach in the context of practical bioinformatics analyses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ardani, K.; Seif, D.; Margolis, R.
2013-08-01
The objective of this analysis is to roadmap the cost reductions and innovations necessary to achieve the U.S. Department of Energy (DOE) SunShot Initiative's total soft-cost targets by 2020. The roadmap focuses on advances in four soft-cost areas: (1) customer acquisition; (2) permitting, inspection, and interconnection (PII); (3) installation labor; and (4) financing. Financing cost reductions are in terms of the weighted average cost of capital (WACC) for financing PV system installations, with real-percent targets of 3.0% (residential) and 3.4% (commercial).
Strategic Directions in Heliophysics Research Related to Weakly Ionized Plasmas
NASA Technical Reports Server (NTRS)
Spann, James F.
2010-01-01
In 2009, the Heliophysics Division of NASA published its triennial roadmap entitled "Heliophysics: the solar and space physics of a new era." This document contains recommended science priorities that will serve as input into the recently initiated NRC Heliophysics Decadal Survey. The 2009 roadmap includes several recommended science targets that are directly related to weakly ionized plasmas, including one entitled "Ion-Neutral Coupling in the Atmosphere." This talk will be a brief overview of the roadmap, with particular focus on the science targets relevant to weakly ionized plasmas.
Probabilistic Analysis and Density Parameter Estimation Within Nessus
NASA Astrophysics Data System (ADS)
Godines, Cody R.; Manteufel, Randall D.
2002-12-01
This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable new enhancement of the program.
Probabilistic Analysis and Density Parameter Estimation Within Nessus
NASA Technical Reports Server (NTRS)
Godines, Cody R.; Manteufel, Randall D.; Chamis, Christos C. (Technical Monitor)
2002-01-01
This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable new enhancement of the program.
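The LHS-versus-MC comparison described above can be reproduced in miniature with scipy's qmc module: repeat both estimators many times and compare the spread of the 99th-percentile estimates. The toy response function below stands in for a NESSUS structural model.

```python
import numpy as np
from scipy.stats import norm, qmc

rng = np.random.default_rng(6)

def response(x):
    # Toy response standing in for a NESSUS structural model.
    return x[:, 0] ** 2 + 2.0 * x[:, 1]

n, reps = 200, 500
est_mc, est_lhs = [], []
for _ in range(reps):
    # Plain Monte Carlo: independent standard-normal inputs.
    x_mc = rng.standard_normal((n, 2))
    # LHS: stratified uniforms mapped through the normal inverse CDF.
    sampler = qmc.LatinHypercube(d=2, seed=rng)
    x_lhs = norm.ppf(sampler.random(n))
    est_mc.append(np.percentile(response(x_mc), 99))
    est_lhs.append(np.percentile(response(x_lhs), 99))

print("99th-percentile estimator std, MC :", np.std(est_mc).round(3))
print("99th-percentile estimator std, LHS:", np.std(est_lhs).round(3))
```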
New decoding methods of interleaved burst error-correcting codes
NASA Astrophysics Data System (ADS)
Nakano, Y.; Kasahara, M.; Namekawa, T.
1983-04-01
A probabilistic method of single burst error correction, using the syndrome correlation of the subcodes that constitute the interleaved code, is presented. This method makes it possible to realize a high burst-error-correction capability with less decoding delay. By generalizing this method, a probabilistic method of multiple (m-fold) burst error correction is obtained. After estimating the burst error positions using the syndrome correlation of the subcodes, which are interleaved m-fold burst-error-detecting codes, this second method corrects erasure errors in each subcode and m-fold burst errors. The performance of these two methods is analyzed via computer simulation, and their effectiveness is demonstrated.
Weickert, Thomas W.; Goldberg, Terry E.; Egan, Michael F.; Apud, Jose A.; Meeter, Martijn; Myers, Catherine E.; Gluck, Mark A; Weinberger, Daniel R.
2010-01-01
Background: While patients with schizophrenia display an overall probabilistic category learning performance deficit, the extent to which this deficit occurs in unaffected siblings of patients with schizophrenia is unknown. There are also discrepant findings regarding probabilistic category learning acquisition rate and performance in patients with schizophrenia. Methods: A probabilistic category learning test was administered to 108 patients with schizophrenia, 82 unaffected siblings, and 121 healthy participants. Results: Patients with schizophrenia displayed significant differences from their unaffected siblings and healthy participants with respect to probabilistic category learning acquisition rates. Although siblings on the whole failed to differ from healthy participants on strategy and quantitative indices of overall performance and learning acquisition, application of a revised learning criterion enabling classification into good and poor learners based on individual learning curves revealed significant differences between the percentages of sibling and healthy poor learners: healthy (13.2%), siblings (34.1%), patients (48.1%), yielding a moderate relative risk. Conclusions: These results clarify previous discrepant findings pertaining to probabilistic category learning acquisition rate in schizophrenia and provide the first evidence for the relative risk of probabilistic category learning abnormalities in unaffected siblings of patients with schizophrenia, supporting genetic underpinnings of probabilistic category learning deficits in schizophrenia. These findings also raise questions regarding the contribution of antipsychotic medication to the probabilistic category learning deficit in schizophrenia. The distinction between good and poor learning may be used to inform genetic studies designed to detect schizophrenia risk alleles. PMID:20172502
Probabilistic liquefaction triggering based on the cone penetration test
Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.
2005-01-01
Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper summarizes the application portions of the probabilistic liquefaction triggering correlations recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude-correlated duration weighting factor, and the non-linear shear mass participation factor used are also discussed.
Probabilistic thinking and death anxiety: a terror management based study.
Hayslip, Bert; Schuler, Eric R; Page, Kyle S; Carver, Kellye S
2014-01-01
Terror Management Theory has been utilized to understand how death can change behavioral outcomes and social dynamics. One area that is not well researched is why individuals willingly engage in risky behavior that could accelerate their mortality. One method of distancing a potential life threatening outcome when engaging in risky behaviors is through stacking probability in favor of the event not occurring, termed probabilistic thinking. The present study examines the creation and psychometric properties of the Probabilistic Thinking scale in a sample of young, middle aged, and older adults (n = 472). The scale demonstrated adequate internal consistency reliability for each of the four subscales, excellent overall internal consistency, and good construct validity regarding relationships with measures of death anxiety. Reliable age and gender effects in probabilistic thinking were also observed. The relationship of probabilistic thinking as part of a cultural buffer against death anxiety is discussed, as well as its implications for Terror Management research.
EURO-CARES as Roadmap for a European Sample Curation Facility
NASA Astrophysics Data System (ADS)
Brucato, J. R.; Russell, S.; Smith, C.; Hutzler, A.; Meneghin, A.; Aléon, J.; Bennett, A.; Berthoud, L.; Bridges, J.; Debaille, V.; Ferrière, L.; Folco, L.; Foucher, F.; Franchi, I.; Gounelle, M.; Grady, M.; Leuko, S.; Longobardo, A.; Palomba, E.; Pottage, T.; Rettberg, P.; Vrublevskis, J.; Westall, F.; Zipfel, J.; Euro-Cares Team
2018-04-01
EURO-CARES is a three-year multinational project funded under the European Commission Horizon2020 research program to develop a roadmap for a European Extraterrestrial Sample Curation Facility for samples returned from solar system missions.
Implementation Plan for Chemical Industry R&D Roadmap for Nanomaterials by Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
2006-04-01
The purpose of this effort is to develop an implementation plan to realize the vision and goals identified in the Chemical Industry R&D Roadmap for Nanomaterials By Design: From Fundamentals to Function.
Unmanned Aircraft Systems Roadmap 2005-2030
DOT National Transportation Integrated Search
2005-01-01
This document presents the Department of Defense's (DoD) roadmap for developing and employing unmanned aircraft systems over the next 25 years (2005 to 2030). It describes the missions identified by theater warfighters to which systems could be appli...
MAPSIT and a Roadmap for Lunar and Planetary Spatial Data Infrastructure
NASA Astrophysics Data System (ADS)
Radebaugh, J.; Archinal, B.; Beyer, R.; DellaGiustina, D.; Fassett, C.; Gaddis, L.; Hagerty, J.; Hare, T.; Laura, J.; Lawrence, S. J.; Mazarico, E.; Naß, A.; Patthoff, A.; Skinner, J.; Sutton, S.; Thomson, B. J.; Williams, D.
2017-10-01
We describe MAPSIT, and the development of a roadmap for lunar and planetary SDI, based on previous relevant documents and community input, and consider how to best advance lunar science, exploration, and commercial development.
EV Charging Infrastructure Roadmap
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karner, Donald; Garetson, Thomas; Francfort, Jim
2016-08-01
As highlighted in the U.S. Department of Energy's EV Everywhere Grand Challenge, vehicle technology is advancing toward an objective to "… produce plug-in electric vehicles that are as affordable and convenient for the average American family as today's gasoline-powered vehicles …" [1] by developing more efficient drivetrains, greater battery energy storage per dollar, and lighter-weight vehicle components and construction. With this technology advancement and improved vehicle performance, the objective for charging infrastructure is to promote vehicle adoption and maximize the number of electric miles driven. The EV Everywhere Charging Infrastructure Roadmap (hereafter referred to as Roadmap) looks forward and assumes that the technical challenges and vehicle performance improvements set forth in the EV Everywhere Grand Challenge will be met. The Roadmap identifies and prioritizes deployment of charging infrastructure in support of this charging infrastructure objective for the EV Everywhere Grand Challenge.
NASA Astrophysics Data System (ADS)
Stockman, Mark I.; Kneipp, Katrin; Bozhevolnyi, Sergey I.; Saha, Soham; Dutta, Aveek; Ndukaife, Justus; Kinsey, Nathaniel; Reddy, Harsha; Guler, Urcan; Shalaev, Vladimir M.; Boltasseva, Alexandra; Gholipour, Behrad; Krishnamoorthy, Harish N. S.; MacDonald, Kevin F.; Soci, Cesare; Zheludev, Nikolay I.; Savinov, Vassili; Singh, Ranjan; Groß, Petra; Lienau, Christoph; Vadai, Michal; Solomon, Michelle L.; Barton, David R., III; Lawrence, Mark; Dionne, Jennifer A.; Boriskina, Svetlana V.; Esteban, Ruben; Aizpurua, Javier; Zhang, Xiang; Yang, Sui; Wang, Danqing; Wang, Weijia; Odom, Teri W.; Accanto, Nicolò; de Roque, Pablo M.; Hancu, Ion M.; Piatkowski, Lukasz; van Hulst, Niek F.; Kling, Matthias F.
2018-04-01
Plasmonics is a rapidly developing field at the boundary of physical optics and condensed matter physics. It studies phenomena induced by and associated with surface plasmons—elementary polar excitations bound to surfaces and interfaces of good nanostructured metals. This Roadmap is written collectively by prominent researchers in the field of plasmonics. It encompasses selected aspects of nanoplasmonics. Among them are fundamental aspects, such as quantum plasmonics based on the quantum-mechanical properties of both the underlying materials and the plasmons themselves (such as their quantum generator, spaser), plasmonics in novel materials, ultrafast (attosecond) nanoplasmonics, etc. Selected applications of nanoplasmonics are also reflected in this Roadmap, in particular, plasmonic waveguiding, practical applications of plasmonics enabled by novel materials, thermo-plasmonics, plasmonic-induced photochemistry and photo-catalysis. This Roadmap is a concise but authoritative overview of modern plasmonics. It will be of interest to a wide audience of both fundamental physicists and chemists, as well as applied scientists and engineers.
The WHF Roadmap for Reducing CV Morbidity and Mortality Through Prevention and Control of RHD.
Palafox, Benjamin; Mocumbi, Ana Olga; Kumar, R Krishna; Ali, Sulafa K M; Kennedy, Elizabeth; Haileamlak, Abraham; Watkins, David; Petricca, Kadia; Wyber, Rosemary; Timeon, Patrick; Mwangi, Jeremiah
2017-03-01
Rheumatic heart disease (RHD) is a preventable non-communicable condition that disproportionately affects the world's poorest and most vulnerable. The World Heart Federation Roadmap for improved RHD control is a resource designed to help a variety of stakeholders raise the profile of RHD nationally and globally, and provide a framework to guide and support the strengthening of national, regional and global RHD control efforts. The Roadmap identifies the barriers that limit access to and uptake of proven interventions for the prevention and control of RHD. It also highlights a variety of established and promising solutions that may be used to overcome these barriers. As a general guide, the Roadmap is meant to serve as the foundation for the development of tailored plans of action to improve RHD control in specific contexts. Copyright © 2016 World Heart Federation (Geneva). Published by Elsevier B.V. All rights reserved.
NASA's Launch Propulsion Systems Technology Roadmap
NASA Technical Reports Server (NTRS)
McConnaughey, Paul K.; Femminineo, Mark G.; Koelfgen, Syri J.; Lepsch, Roger A; Ryan, Richard M.; Taylor, Steven A.
2012-01-01
Safe, reliable, and affordable access to low-Earth orbit (LEO) is necessary for all of the United States (US) space endeavors. In 2010, NASA's Office of the Chief Technologist commissioned 14 teams to develop technology roadmaps that could be used to guide the Agency's and US technology investment decisions for the next few decades. The Launch Propulsion Systems Technology Area (LPSTA) team was tasked to address the propulsion technology challenges for access to LEO. The developed LPSTA roadmap addresses technologies that enhance existing solid or liquid propulsion technologies and their related ancillary systems, or significantly advance the technology readiness level (TRL) of less mature systems like airbreathing, unconventional, and other launch technologies. In developing this roadmap, the LPSTA team consulted previous NASA, military, and industry studies as well as subject matter experts to develop their assessment of this field, which has fundamental technological and strategic impacts for US space capabilities.
Development of probabilistic emission inventories of air toxics for Jacksonville, Florida, USA.
Zhao, Yuchao; Frey, H Christopher
2004-11-01
Probabilistic emission inventories were developed for 1,3-butadiene, mercury (Hg), arsenic (As), benzene, formaldehyde, and lead for Jacksonville, FL. To quantify inter-unit variability in empirical emission factor data, the Maximum Likelihood Estimation (MLE) method or the Method of Matching Moments was used to fit parametric distributions. For data sets that contain nondetected measurements, a method based upon MLE was used for parameter estimation. To quantify the uncertainty in urban air toxic emission factors, parametric bootstrap simulation and empirical bootstrap simulation were applied to uncensored and censored data, respectively. The probabilistic emission inventories were developed based on the product of the uncertainties in the emission factors and in the activity factors. The uncertainties in the urban air toxics emission inventories range from as small as -25 to +30% for Hg to as large as -83 to +243% for As. The key sources of uncertainty in the emission inventory for each toxic are identified based upon sensitivity analysis. Typically, uncertainty in the inventory of a given pollutant can be attributed primarily to a small number of source categories. Priorities for improving the inventories and for refining the probabilistic analysis are discussed.
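A minimal sketch of the bootstrap step described above: an empirical bootstrap over a small, invented emission-factor sample combined with an assumed lognormal activity factor, summarized as asymmetric percent uncertainty bounds like those quoted for Hg and As.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical benzene emission factors (g per unit activity) from a small
# set of source tests, and an uncertain annual activity level.
ef_data = np.array([1.2, 0.8, 1.5, 2.1, 0.9, 1.1, 1.7])

n_boot = 10_000
inventory = np.empty(n_boot)
for i in range(n_boot):
    # Empirical bootstrap of the emission factor mean...
    ef = rng.choice(ef_data, size=ef_data.size, replace=True).mean()
    # ...multiplied by an uncertain activity factor (assumed lognormal).
    activity = rng.lognormal(np.log(5.0e4), 0.1)
    inventory[i] = ef * activity

lo, med, hi = np.percentile(inventory, [2.5, 50, 97.5])
print(f"inventory: {med:.3e} (-{100*(1-lo/med):.0f}% / +{100*(hi/med-1):.0f}%)")
```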
Bayesian Probabilistic Projections of Life Expectancy for All Countries
Raftery, Adrian E.; Chunn, Jennifer L.; Gerland, Patrick; Ševčíková, Hana
2014-01-01
We propose a Bayesian hierarchical model for producing probabilistic forecasts of male period life expectancy at birth for all the countries of the world from the present to 2100. Such forecasts would be an input to the production of probabilistic population projections for all countries, which is currently being considered by the United Nations. To evaluate the method, we did an out-of-sample cross-validation experiment, fitting the model to the data from 1950–1995, and using the estimated model to forecast for the subsequent ten years. The ten-year predictions had a mean absolute error of about 1 year, about 40% less than the current UN methodology. The probabilistic forecasts were calibrated, in the sense that (for example) the 80% prediction intervals contained the truth about 80% of the time. We illustrate our method with results from Madagascar (a typical country with steadily improving life expectancy), Latvia (a country that has had a mortality crisis), and Japan (a leading country). We also show aggregated results for South Asia, a region with eight countries. Free publicly available R software packages called bayesLife and bayesDem are available to implement the method. PMID:23494599
Offerman, Theo; Palley, Asa B
2016-01-01
Strictly proper scoring rules are designed to truthfully elicit subjective probabilistic beliefs from risk-neutral agents. Previous experimental studies have identified two problems with this method: (i) risk aversion causes agents to bias their reports toward the probability of 1/2, and (ii) for moderate beliefs agents simply report 1/2. Applying a prospect theory model of risk preferences, we show that loss aversion can explain both of these behavioral phenomena. Using the insights of this model, we develop a simple off-the-shelf probability assessment mechanism that encourages loss-averse agents to report true beliefs. In an experiment, we demonstrate the effectiveness of this modification in both eliminating uninformative reports and eliciting true probabilistic beliefs.
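A small worked example of why a strictly proper rule is truthful for a risk-neutral agent and how a loss-averse one drifts toward 1/2. The quadratic rule is standard, but the kinked value function and reference point below are toy choices, not the paper's elicitation mechanism.

```python
import numpy as np

def quadratic_score(report, outcome):
    """Strictly proper quadratic (Brier-type) scoring rule, scaled to [0, 1]."""
    return 1.0 - (outcome - report) ** 2

# A risk-neutral agent with true belief p maximizes expected score by
# reporting p itself; any other report does strictly worse.
p = 0.7
reports = np.linspace(0.0, 1.0, 101)
expected = p * quadratic_score(reports, 1.0) + (1 - p) * quadratic_score(reports, 0.0)
print("belief:", p, "optimal report:", round(reports[np.argmax(expected)], 2))

# A loss-averse agent who weights losses relative to a reference score more
# heavily than gains (toy kink, coefficient 2.25) is pushed toward 1/2.
ref = p * quadratic_score(p, 1.0) + (1 - p) * quadratic_score(p, 0.0)
def value(s):
    return np.where(s >= ref, s - ref, 2.25 * (s - ref))
expected_la = p * value(quadratic_score(reports, 1.0)) + \
              (1 - p) * value(quadratic_score(reports, 0.0))
print("loss-averse optimal report:", round(reports[np.argmax(expected_la)], 2))
```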
NASA Astrophysics Data System (ADS)
Chen, Tzikang J.; Shiao, Michael
2016-04-01
This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept integrates damage tolerance methodology, simulation methods, and a probabilistic algorithm, recursive probability integration (RPI), that accounts for maintenance in damage-tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties, such as variability in material properties (including crack growth rate), initial flaw size, repair quality, random-process modeling of flight loads, and inspection reliability represented by the probability of detection (POD). Unlike traditional Monte Carlo simulation (MCS), which requires a rerun whenever the maintenance plan changes, RPI can repeatedly reuse a small set of baseline random crack-growth histories, free of maintenance-related parameters, from a single MCS across various maintenance plans. To verify the RPI method, Monte Carlo simulations on the order of several hundred billion trials were conducted for various flight conditions, material properties, inspection schedules, PODs, and repair/replacement strategies. Because these simulations are time-consuming, they were run in parallel on DoD High Performance Computing (HPC) systems using a random number generator specialized for parallel computing. The study shows that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulation.
An Evolved International Lunar Decade Global Exploration Roadmap
NASA Astrophysics Data System (ADS)
Dunlop, D.; Holder, K.
2015-10-01
An Evolved Global Exploration Roadmap (GER) reflecting a proposed International Lunar Decade is presented by an NSS chapter to address many of the omissions and new prospective commercial mission developments since the 2013 edition of the ISECG GER.
Extending the Fellegi-Sunter probabilistic record linkage method for approximate field comparators.
DuVall, Scott L; Kerber, Richard A; Thomas, Alun
2010-02-01
Probabilistic record linkage is a method commonly used to determine whether demographic records refer to the same person. The Fellegi-Sunter method is a probabilistic approach that uses field weights based on log likelihood ratios to determine record similarity. This paper introduces an extension of the Fellegi-Sunter method that incorporates approximate field comparators in the calculation of field weights. The data warehouse of a large academic medical center was used as a case study. The approximate comparator extension was compared with the Fellegi-Sunter method in its ability to find duplicate records previously identified in the data warehouse using different demographic fields and matching cutoffs. The approximate comparator extension misclassified 25% fewer pairs and had a larger Welch's T statistic than the Fellegi-Sunter method for all field sets and matching cutoffs. The accuracy gain provided by the approximate comparator extension grew as less information was provided and as the matching cutoff increased. Given the ubiquity of linkage in both clinical and research settings, the incremental improvement of the extension has the potential to make a considerable impact.
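One way to read the extension: a classical Fellegi-Sunter field weight interpolated by an approximate string comparator. The sketch below uses Python's difflib ratio as a stand-in comparator and invented m/u probabilities; the paper's exact comparator and interpolation may differ.

```python
import math
from difflib import SequenceMatcher

def field_weight(a, b, m, u):
    """Fellegi-Sunter field weight scaled by an approximate comparator.

    m = P(agreement | records match), u = P(agreement | records differ).
    Similarity 1 yields the full agreement weight log2(m/u); similarity 0
    yields the disagreement weight log2((1-m)/(1-u)); values in between
    interpolate linearly (an illustrative choice).
    """
    sim = SequenceMatcher(None, a, b).ratio()  # stand-in comparator
    agree = math.log2(m / u)
    disagree = math.log2((1 - m) / (1 - u))
    return disagree + sim * (agree - disagree)

print(field_weight("Jonathon", "Jonathan", m=0.95, u=0.02))  # near-agreement
print(field_weight("Smith", "Garcia", m=0.95, u=0.02))       # clear disagreement
```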
Probabilistic framework for product design optimization and risk management
NASA Astrophysics Data System (ADS)
Keski-Rahkonen, J. K.
2018-05-01
Probabilistic methods have gradually gained ground within engineering practice, but it is still industry standard to use deterministic safety-margin approaches to dimension components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on predicting failure probabilities for mechanical components and on optimizing reliability through life cycle cost analysis. This paper reviews the literature for existing methods and seeks to harness their best features and simplify the process so that it is applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. It adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes, owing to the well-developed methods for predicting these types of failures; however, the same framework can be applied to any failure mode for which predictive models can be developed.
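A minimal sketch of the recommended Monte Carlo load-resistance step: sample resistance and load, count failures, and fold the failure probability into an expected cost. Distributions, parameters, and cost figures are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical load-resistance model: failure occurs when resistance R
# (material strength) falls below the applied load effect S.
R = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)  # MPa
S = rng.normal(loc=220.0, scale=25.0, size=n)              # MPa

pf = np.mean(R < S)              # Monte Carlo failure probability
se = np.sqrt(pf * (1 - pf) / n)  # sampling standard error
print(f"P(failure) = {pf:.2e} +/- {se:.1e}")

# A reliability-based cost can then weigh pf against a failure cost:
cost_failure, cost_unit = 1e6, 1e3  # hypothetical figures
print("expected life cycle cost:", cost_unit + pf * cost_failure)
```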
Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Separate abstracts are included for each of the papers presented concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non-LOCA and small-break LOCA transients; safety goals; pressurized thermal shock; applications of reliability and risk methods to probabilistic risk assessment; human factors and the man-machine interface; and databases and special applications.
Efficient Sensitivity Methods for Probabilistic Lifing and Engine Prognostics
2010-09-01
Millwater, Harry; Bagley, Ronald; Garza, Jose; Wagner, D.; Bates, Andrew; Voorhees, Andy (report AFRL-RX-WP-TR-2010-4297)
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
Krejsa, Martin; Janas, Petr; Yilmaz, Işık; Marschalko, Marian; Bouchal, Tomas
2013-01-01
The load-carrying system of each construction should fulfill several conditions that represent reliability criteria in the assessment procedure. The theory of structural reliability determines the probability that a structure retains its required properties. Using this theory, it is possible to perform probabilistic computations based on probability theory and mathematical statistics. These methods have become increasingly popular; they are used, in particular, in designs of load-carrying structures with a required level of reliability when at least some input variables in the design are random. The objective of this paper is to indicate the current scope that can be covered by a new method, Direct Optimized Probabilistic Calculation (DOProC), in assessing the reliability of load-carrying structures. DOProC uses a purely numerical approach without any simulation techniques, which provides more accurate solutions to probabilistic tasks and, in some cases, considerably faster computations. DOProC can efficiently solve a number of probabilistic computations. A very good area of application for DOProC is the assessment of bolt reinforcement in underground and mining workings, for which a special software application, "Anchor", has been developed. PMID:23935412
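To illustrate the simulation-free idea (though not the DOProC algorithm itself), the failure probability of a load-resistance pair can be obtained by direct numerical quadrature on discretized distributions; all numbers below are invented:

```python
import numpy as np

# Discretize load S and resistance R on a common grid; no sampling involved.
x = np.linspace(0, 600, 2401)
dx = x[1] - x[0]

def normal_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

fR = normal_pdf(x, 300, 30)  # resistance density (hypothetical)
fS = normal_pdf(x, 220, 25)  # load density (hypothetical)

# P(failure) = integral of fS(s) * P(R < s) ds, by numerical quadrature.
FR = np.cumsum(fR) * dx      # CDF of R on the grid
pf = np.sum(fS * FR) * dx
print(f"P(R < S) = {pf:.3e}")  # analytic check: Phi(-80/sqrt(30^2+25^2)) ~ 2.0e-2
```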
Constructing Sample Space with Combinatorial Reasoning: A Mixed Methods Study
ERIC Educational Resources Information Center
McGalliard, William A., III.
2012-01-01
Recent curricular developments suggest that students at all levels need to be statistically literate and able to efficiently and accurately make probabilistic decisions. Furthermore, statistical literacy is a requirement to being a well-informed citizen of society. Research also recognizes that the ability to reason probabilistically is supported…
One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....
Three key areas of scientific inquiry in the study of human exposure to environmental contaminants are 1) assessment of aggregate (i.e., multi-pathway, multi-route) exposures, 2) application of probabilistic methods to exposure prediction, and 3) the interpretation of biomarker m...
EXPERIENCES WITH USING PROBABILISTIC EXPOSURE ANALYSIS METHODS IN THE U.S. EPA
Over the past decade various Offices and Programs within the U.S. EPA have either initiated or increased the development and application of probabilistic exposure analysis models. These models have been applied to a broad range of research or regulatory problems in EPA, such as e...
NASA Astrophysics Data System (ADS)
Lowe, R.; Ballester, J.; Robine, J.; Herrmann, F. R.; Jupp, T. E.; Stephenson, D.; Rodó, X.
2013-12-01
Users of climate information often require probabilistic information on which to base their decisions. However, communicating the information contained within a probabilistic forecast presents a challenge. In this paper we demonstrate a novel visualisation technique to display ternary probabilistic forecasts on a map in order to inform decision making. In this method, ternary probabilistic forecasts, which assign probabilities to a set of three outcomes (e.g. low, medium, and high risk), are considered as a point in a triangle of barycentric coordinates. This allows a unique colour to be assigned to each forecast from a continuum of colours defined on the triangle. Colour saturation increases with information gain relative to the reference forecast (i.e. the long-term average). This provides additional information to decision makers compared with conventional methods used in seasonal climate forecasting, where one colour is used to represent one forecast category on a forecast map (e.g. red = 'dry'). We use the tool to present climate-related mortality projections across Europe. Temperature and humidity are related to human mortality via location-specific transfer functions, calculated using historical data. Daily mortality data at the NUTS2 level for 16 countries in Europe were obtained for 1998-2005. Transfer functions were calculated for 54 aggregations in Europe, defined using criteria related to population and climatological similarities. Aggregations are restricted to fall within political boundaries to avoid problems related to varying adaptation policies between countries. A statistical model is fit to the cold and warm tails to estimate future mortality using forecast temperatures, in a Bayesian probabilistic framework. Using predefined categories of temperature-related mortality risk, we present maps of probabilistic projections for human mortality at seasonal to decadal time scales. We demonstrate the information gained from using this technique compared to more traditional methods of displaying ternary probabilistic forecasts. This technique allows decision makers to identify areas where the model predicts area-specific heat waves or cold snaps with high certainty, in order to effectively target resources to the areas most at risk for a given season or year. It is hoped that this visualisation tool will facilitate the interpretation of probabilistic forecasts not only for public health decision makers but also within a multi-sectoral climate service framework.
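A minimal sketch of the colour-assignment idea: barycentric blending of three corner colours, with saturation scaled by the information gain of the forecast against a uniform climatological reference. The corner colours and the exact saturation rule are assumptions for illustration, not the paper's specification.

```python
import numpy as np

CORNERS = np.array([[0.2, 0.4, 1.0],   # low risk  -> blue-ish (assumed)
                    [0.8, 0.8, 0.8],   # medium    -> grey     (assumed)
                    [1.0, 0.3, 0.2]])  # high risk -> red-ish  (assumed)

def ternary_colour(p, ref=(1/3, 1/3, 1/3)):
    """Map a ternary forecast p = (p_low, p_med, p_high) to one RGB colour.

    Hue: barycentric blend of the corner colours; saturation: relative
    entropy of p against the climatological reference, so uninformative
    forecasts wash out towards white.
    """
    p, ref = np.asarray(p, float), np.asarray(ref, float)
    rgb = p @ CORNERS                    # barycentric blend of corner colours
    with np.errstate(divide="ignore", invalid="ignore"):
        gain = np.where(p > 0, p * np.log(p / ref), 0.0).sum()
    sat = min(gain / np.log(3), 1.0)     # 0 = climatology, 1 = certain outcome
    return tuple((1 - sat) * np.ones(3) + sat * rgb)

print(ternary_colour((0.1, 0.2, 0.7)))     # confident high risk: saturated
print(ternary_colour((0.34, 0.33, 0.33)))  # near climatology: washed out
```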
Bayesian-information-gap decision theory with an application to CO2 sequestration
O'Malley, D.; Vesselinov, V. V.
2015-09-04
Decisions related to subsurface engineering problems such as groundwater management, fossil fuel production, and geologic carbon sequestration are frequently challenging because of an overabundance of uncertainties (related to conceptualizations, parameters, observations, etc.). Because of the importance of these problems to agriculture, energy, and the climate (respectively), good decisions that are scientifically defensible must be made despite the uncertainties. We describe a general approach to making decisions for challenging problems such as these in the presence of severe uncertainties that combines probabilistic and non-probabilistic methods. The approach uses Bayesian sampling to assess parametric uncertainty and Information-Gap Decision Theory (IGDT) to address model inadequacy. The combined approach also resolves an issue that frequently arises when applying Bayesian methods to real-world engineering problems related to the enumeration of possible outcomes. In the case of zero non-probabilistic uncertainty, the method reduces to a Bayesian method. Lastly, to illustrate the approach, we apply it to a site-selection decision for geologic CO2 sequestration.
Learning Analytics in Higher Education Development: A Roadmap
ERIC Educational Resources Information Center
Adejo, Olugbenga; Connolly, Thomas
2017-01-01
The increase in education data and advance in technology are bringing about enhanced teaching and learning methodology. The emerging field of Learning Analytics (LA) continues to seek ways to improve the different methods of gathering, analysing, managing and presenting learners' data with the sole aim of using it to improve the student learning…
USDA-ARS?s Scientific Manuscript database
An interactome is the genome-wide roadmap of protein-protein interactions that occur within an organism. Interactomes for humans, the fruit fly, and now plants such as Arabidopsis thaliana and Oryza sativa have been generated using high throughput experimental methods. It is possible to use these ...
Methods to Account for Accelerated Semi-Conductor Device Wearout in Longlife Aerospace Applications
2003-01-01
Pest risk maps for invasive alien species: a roadmap for improvement
Robert C. Venette; Darren J. Kriticos; Roger D. Magarey; Frank H. Koch; Richard H.A. Baker; Susan P. Worner; Nadilia N. Gomez Raboteaux; Daniel W. McKenney; Erhard J. Dobesberger; Denys Yemshanov; Paul J. De Barro; William D. Hutchison; Glenn Fowler; Tom M. Kalaris; John Pedlar
2010-01-01
Pest risk maps are powerful visual communication tools to describe where invasive alien species might arrive, establish, spread, or cause harmful impacts. These maps inform strategic and tactical pest management decisions, such as potential restrictions on international trade or the design of pest surveys and domestic quarantines. Diverse methods are available to...
An ontology of and roadmap for mHealth research.
Cameron, Joshua D; Ramaprasad, Arkalgud; Syn, Thant
2017-04-01
Mobile health (mHealth) research has been growing exponentially in recent years. However, research on mHealth has been ad hoc and selective, without a clear definition of the mHealth domain. Without a roadmap for research we may not realize the full potential of mHealth. In this paper, we present an ontological framework to define the mHealth domain and illuminate a roadmap. We present an ontology of mHealth, developed by systematically deconstructing the domain into its primary dimensions and elements. We map the extant research on mHealth in 2014 onto the ontology and highlight the bright, light, and blind/blank spots that represent the emphases of mHealth research. The emphases of mHealth research in 2014 are very uneven. There are a few bright spots and many light spots. The research predominantly focuses on individuals' use of mobile devices and applications to capture or obtain health-related data, mostly to improve quality of care through mobile intervention. We argue that the emphases can be balanced in the roadmap for mHealth research. The ontological mapping plays an integral role in developing and maintaining the roadmap, which can be updated periodically to continuously assess and guide mHealth research.
Probabilistic finite elements for fatigue and fracture analysis
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Liu, Wing Kam
1992-01-01
Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and on its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of means, variances, and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.
Probabilistic numerical methods for PDE-constrained Bayesian inverse problems
NASA Astrophysics Data System (ADS)
Cockayne, Jon; Oates, Chris; Sullivan, Tim; Girolami, Mark
2017-06-01
This paper develops meshless methods for probabilistically describing discretisation error in the numerical solution of partial differential equations. This construction enables the solution of Bayesian inverse problems while accounting for the impact of the discretisation of the forward problem. In particular, this drives statistical inferences to be more conservative in the presence of significant solver error. Theoretical results are presented describing rates of convergence for the posteriors in both the forward and inverse problems. This method is tested on a challenging inverse problem with a nonlinear forward model.
EPA Nitrogen and Co-Pollutant Roadmap
Cross-media, integrated, multi-disciplinary approach to sustainably manage reactive nitrogen and co-pollutant loadings to air and water to reduce adverse impacts on the environment and human health. The goal of the Roadmap is to develop a common understanding of the Agency's rese...
Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue
NASA Astrophysics Data System (ADS)
Kree, P.; Soize, C.
The random modeling of physical phenomena, together with probabilistic methods for the numerical calculation of random mechanical forces, are analytically explored. Attention is given to theoretical examinations such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and to the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators undergoing the square of Gaussian processes are investigated.
Seismic, high wind, tornado, and probabilistic risk assessments of the High Flux Isotope Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, S.P.; Stover, R.L.; Hashimoto, P.S.
1989-01-01
Natural phenomena analyses were performed on the High Flux Isotope Reactor (HFIR). Deterministic and probabilistic evaluations were made to determine the risks resulting from earthquakes, high winds, and tornadoes. Analytic methods, in conjunction with field evaluations and earthquake experience database evaluation methods, were used to provide more realistic results in a shorter amount of time. Plant modifications completed in preparation for HFIR restart and potential future enhancements are discussed.
Bayesian Probabilistic Projection of International Migration.
Azose, Jonathan J; Raftery, Adrian E
2015-10-01
We propose a method for obtaining joint probabilistic projections of migration for all countries, broken down by age and sex. Joint trajectories for all countries are constrained to satisfy the requirement of zero global net migration. We evaluate our model using out-of-sample validation and compare point projections to the projected migration rates from a persistence model similar to the method used in the United Nations' World Population Prospects, and also to a state-of-the-art gravity model.
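A toy illustration of the zero-global-net-migration constraint: country trajectories are adjusted so world net migration vanishes in every period. The proportional-to-population adjustment below is an illustrative choice, not the authors' estimator, and all figures are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical joint trajectories: net migrants (thousands) for 5 countries
# over 3 future periods, drawn from some country-level probabilistic model.
raw = rng.normal(loc=[50, -20, 5, -30, 10], scale=15, size=(3, 5))

# Enforce the accounting identity: world net migration must be zero.
# Here the residual is shared in proportion to population (illustrative).
pop = np.array([330, 85, 10, 1400, 45], dtype=float)
adjusted = raw - np.outer(raw.sum(axis=1), pop / pop.sum())

print(adjusted.sum(axis=1))  # ~[0, 0, 0] up to floating-point error
```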
Probabilistic structural analysis methods and applications
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.
1988-01-01
An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
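To make the review's two families concrete, the sketch below contrasts a sampling-based (Monte Carlo) treatment of a toy risk quotient with a pure interval-arithmetic bound on the same factors; all distributions, bounds, and units are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Toy risk quotient: risk = concentration x intake x potency (all invented).
conc   = rng.lognormal(np.log(4e-5), 0.3, n)  # mg/L
intake = rng.normal(0.03, 0.005, n)           # L/(kg*day), scaled
slope  = rng.lognormal(np.log(0.06), 0.4, n)  # per mg/(kg*day)
risk = conc * intake * slope
print("probabilistic 95th percentile risk:", np.quantile(risk, 0.95))

# Non-probabilistic counterpart: interval arithmetic on the same factors,
# giving guaranteed but typically much wider bounds.
lo = 2e-5 * 0.02 * 0.02
hi = 8e-5 * 0.04 * 0.15
print(f"interval bound on risk: [{lo:.2e}, {hi:.2e}]")
```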
Constructing probabilistic scenarios for wide-area solar power generation
Woodruff, David L.; Deride, Julio; Staid, Andrea; ...
2017-12-22
Optimizing thermal generation commitments and dispatch in the presence of high penetrations of renewable resources such as solar energy requires a characterization of their stochastic properties. In this study, we describe novel methods designed to create day-ahead, wide-area probabilistic solar power scenarios based only on historical forecasts and associated observations of solar power production. Each scenario represents a possible trajectory for solar power in next-day operations with an associated probability computed by algorithms that use historical forecast errors. Scenarios are created by segmentation of historic data, fitting non-parametric error distributions using epi-splines, and then computing specific quantiles from these distributions. Additionally, we address the challenge of establishing an upper bound on solar power output. Our specific application driver is for use in stochastic variants of core power systems operations optimization problems, e.g., unit commitment and economic dispatch. These problems require as input a range of possible future realizations of renewables production. However, the utility of such probabilistic scenarios extends to other contexts, e.g., operator and trader situational awareness. Finally, we compare the performance of our approach to a recently proposed method based on quantile regression, and demonstrate that our method performs comparably to this approach in terms of two widely used methods for assessing the quality of probabilistic scenarios: the Energy score and the Variogram score.
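The core scenario construction can be sketched as follows, substituting empirical error quantiles for the paper's epi-spline density fits; the forecast history and probability levels are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical day-ahead solar forecasts and actuals (MW).
forecast_hist = rng.uniform(50, 400, 500)
actual_hist = np.clip(forecast_hist + rng.normal(0, 30, 500), 0, None)
errors = actual_hist - forecast_hist

# Wrap error quantiles around a new forecast to form probabilistic scenarios;
# the paper fits smooth (epi-spline) error densities, here we use the
# empirical error distribution directly.
new_forecast = 260.0
probs = [0.1, 0.25, 0.5, 0.75, 0.9]
scenarios = {p: max(new_forecast + np.quantile(errors, p), 0.0) for p in probs}
for p, s in scenarios.items():
    print(f"scenario at quantile {p}: {s:.1f} MW")
```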
A note on probabilistic models over strings: the linear algebra approach.
Bouchard-Côté, Alexandre
2013-12-01
Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our method of proof is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, and opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low-rank matrix approximation methods, to string-valued inference problems.
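The flavour of the linear algebra view: the total mass a string-emitting automaton assigns to all finite strings is a matrix geometric series, sum_k M^k = (I - M)^{-1}, whenever the spectral radius of M is below one. A toy two-state automaton (not the TKF91 model itself):

```python
import numpy as np

# Tiny weighted automaton over states {0, 1}: M[i, j] is the probability of
# emitting one more symbol while moving i -> j; `stop` is the probability of
# terminating in each state. Toy numbers for illustration.
M = np.array([[0.5, 0.2],
              [0.1, 0.6]])
stop = np.array([0.3, 0.3])
start = np.array([1.0, 0.0])

# Total mass over all finite strings:
#   start . (sum_k M^k) . stop = start . (I - M)^{-1} . stop
Z = start @ np.linalg.solve(np.eye(2) - M, stop)
print(Z)  # 1.0 here, since each row's emit + stop probabilities sum to 1
```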
NASA Technical Reports Server (NTRS)
2012-01-01
Success in executing future NASA space missions will depend on advanced technology developments that should already be underway. It has been years since NASA has had a vigorous, broad-based program in advanced space technology development, and NASA's technology base is largely depleted. As noted in a recent National Research Council report on the U.S. civil space program: Future U.S. leadership in space requires a foundation of sustained technology advances that can enable the development of more capable, reliable, and lower-cost spacecraft and launch vehicles to achieve space program goals. A strong advanced technology development foundation is needed also to enhance technology readiness of new missions, mitigate their technological risks, improve the quality of cost estimates, and thereby contribute to better overall mission cost management. Yet financial support for this technology base has eroded over the years. The United States is now living on the innovation funded in the past and has an obligation to replenish this foundational element. NASA has developed a draft set of technology roadmaps to guide the development of space technologies under the leadership of the NASA Office of the Chief Technologist. The NRC appointed the Steering Committee for NASA Technology Roadmaps and six panels to evaluate the draft roadmaps, recommend improvements, and prioritize the technologies within each and among all of the technology areas as NASA finalizes the roadmaps. The steering committee is encouraged by the initiative NASA has taken through the Office of the Chief Technologist (OCT) to develop technology roadmaps and to seek input from the aerospace technical community with this study.
Development priorities for in-space propulsion technologies
NASA Astrophysics Data System (ADS)
Johnson, Les; Meyer, Michael; Palaszewski, Bryan; Coote, David; Goebel, Dan; White, Harold
2013-02-01
During the summer of 2010, NASA's Office of Chief Technologist assembled 15 civil service teams to support the creation of a NASA integrated technology roadmap. The Aero-Space Technology Area Roadmap is an integrated set of technology area roadmaps recommending the overall technology investment strategy and prioritization for NASA's technology programs. The integrated set of roadmaps will provide technology paths needed to meet NASA's strategic goals. The roadmaps have been reviewed by senior NASA management and the National Research Council. With the exception of electric propulsion systems used for commercial communications satellite station-keeping and a handful of deep space science missions, almost all of the rocket engines in use today are chemical rockets; that is, they obtain the energy needed to generate thrust by combining reactive chemicals to create a hot gas that is expanded to produce thrust. A significant limitation of chemical propulsion is that it has a relatively low specific impulse. Numerous concepts for advanced propulsion technologies with significantly higher values of specific impulse have been developed over the past 50 years. Advanced in-space propulsion technologies will enable much more effective exploration of our solar system, near and far, and will permit mission designers to plan missions to "fly anytime, anywhere, and complete a host of science objectives at the destinations" with greater reliability and safety. With a wide range of possible missions and candidate propulsion technologies with very diverse characteristics, the question of which technologies are 'best' for future missions is a difficult one. A portfolio of technologies to allow optimum propulsion solutions for a diverse set of missions and destinations is described in the roadmap and herein.
A Lunar Surface System Supportability Technology Development Roadmap
NASA Technical Reports Server (NTRS)
Oeftering, Richard C.; Struk, Peter M.; Taleghani, Barmac K.
2009-01-01
This paper discusses the establishment of a Supportability Technology Development Roadmap as a guide for developing capabilities intended to allow NASA's Constellation program to enable a supportable, sustainable and affordable exploration of the Moon and Mars. Presented is a discussion of "supportability", in terms of space facility maintenance, repair and related logistics, and a comparison of how lunar outpost supportability differs from the International Space Station. Supportability lessons learned from NASA and Department of Defense experience and their impact on a future lunar outpost are discussed. A supportability concept for future missions to the Moon and Mars that involves a transition from a highly logistics-dependent to a logistically independent operation is discussed. Lunar outpost supportability capability needs are summarized and a supportability technology development strategy is established. The resulting Lunar Surface Systems Supportability Strategy defines general criteria that will be used to select technologies that will enable future flight crews to act effectively to respond to problems and exploit opportunities in an environment of extreme resource scarcity and isolation. This strategy also introduces the concept of exploiting flight hardware as a supportability resource. The technology roadmap involves development of three mutually supporting technology categories: Diagnostics, Test & Verification; Maintenance & Repair; and Scavenging & Recycling. The technology roadmap establishes two distinct technology types, "Embedded" and "Process" technologies, with different implementations and thus different criteria and development approaches. The supportability technology roadmap addresses the technology readiness level and estimated development schedule for technology groups, and includes down-selection decision gates that correlate with the lunar program milestones. The resulting supportability technology roadmap is intended to develop a set of technologies with the widest possible capability and utility, with a minimum impact on crew time and training, within the time and cost constraints of the Constellation program.
ERIC Educational Resources Information Center
Vahabi, Mandana
2010-01-01
Objective: To test whether the format in which women receive probabilistic information about breast cancer and mammography affects their comprehension. Methods: A convenience sample of 180 women received pre-assembled randomized packages containing a breast health information brochure, with probabilities presented in either verbal or numeric…
On the Measurement and Properties of Ambiguity in Probabilistic Expectations
ERIC Educational Resources Information Center
Pickett, Justin T.; Loughran, Thomas A.; Bushway, Shawn
2015-01-01
Survey respondents' probabilistic expectations are now widely used in many fields to study risk perceptions, decision-making processes, and behavior. Researchers have developed several methods to account for the fact that the probability of an event may be more ambiguous for some respondents than others, but few prior studies have empirically…
Reliability, Risk and Cost Trade-Offs for Composite Designs
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.
1996-01-01
Risk and cost trade-offs have been simulated using a probabilistic method. The probabilistic method accounts for all naturally occurring uncertainties, including those in constituent material properties, fabrication variables, structure geometry, and loading conditions. The probability density function of the first buckling load for a set of uncertain variables is computed. The probabilistic sensitivity factors of the uncertain variables with respect to the first buckling load are calculated. The reliability-based cost for a composite fuselage panel is defined and minimized with respect to the requisite design parameters. The optimization is achieved by solving a system of nonlinear algebraic equations whose coefficients are functions of the probabilistic sensitivity factors. With optimum design parameters such as the mean and coefficient of variation (representing the range of scatter) of uncertain variables, the most efficient and economical manufacturing procedure can be selected. In this paper, optimum values of the requisite design parameters for a predetermined cost due to failure occurrence are computationally determined. The results for the fuselage panel analysis show that the higher the cost due to failure occurrence, the smaller the optimum coefficient of variation of the fiber modulus (a design parameter) in the longitudinal direction.
Development of probabilistic regional climate scenario in East Asia
NASA Astrophysics Data System (ADS)
Dairaku, K.; Ueno, G.; Ishizaki, N. N.
2015-12-01
Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in East Asia (CORDEX-EA and Japan), the probability distribution of 2 m air temperature was estimated using a regression model developed for this purpose. The method is easily applicable to other regions and other physical quantities, and can also be used to downscale to finer scales, depending on the availability of observation datasets. Probabilistic climate information for the present (1969-1998) and future (2069-2098) climate was developed using the 21 models of the CMIP3 SRES A1b scenarios and observation data (CRU_TS3.22 and University of Delaware in CORDEX-EA; NIAES AMeDAS mesh data in Japan). The prototype of probabilistic information in CORDEX-EA and Japan represents the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Appropriate combinations of statistical methods and optimization of climate ensemble experiments using multi-General Circulation Model (GCM) and multi-Regional Climate Model (RCM) ensemble downscaling experiments are investigated.
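A minimal stand-in for the regression step: fit a Normal to a multi-model ensemble at one grid cell and read off exceedance probabilities. The ensemble values below are invented, and the published method's regression model is more elaborate.

```python
import numpy as np
from math import erf, sqrt

# Hypothetical end-of-century warming (K) at one grid cell from a
# multi-model ensemble (stand-in for the CMIP3 SRES A1b members).
dT = np.array([2.1, 2.8, 3.4, 2.5, 3.9, 2.2, 3.1, 2.7, 3.6, 2.9])

mu, sd = dT.mean(), dT.std(ddof=1)

def normal_cdf(x):
    """Standard Normal CDF via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

# Probability that warming exceeds 3 K under the fitted Normal.
print(f"P(dT > 3 K) = {1 - normal_cdf((3.0 - mu) / sd):.2f}")
```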
Synthesis-Spectroscopy Roadmap Problems: Discovering Organic Chemistry
ERIC Educational Resources Information Center
Kurth, Laurie L.; Kurth, Mark J.
2014-01-01
Organic chemistry problems that interrelate and integrate synthesis with spectroscopy are presented. These synthesis-spectroscopy roadmap (SSR) problems uniquely engage second-year undergraduate organic chemistry students in the personal discovery of organic chemistry. SSR problems counter the memorize-or-bust strategy that many students tend to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lantz, Eric J.; Mone, Christopher D.; DeMeo, Edgar
In March 2015, the U.S. Department of Energy (DOE) released Wind Vision: A New Era for Wind Power in the United States (DOE 2015), which explores a scenario in which wind provides 10 percent of U.S. electricity in 2020, 20 percent in 2030, and 35 percent in 2050. The Wind Vision report also includes a roadmap of recommended actions aimed at pursuit of the vision and its underlying wind-deployment scenario. The roadmap was compiled by the Wind Vision project team, which included representatives from the industrial, electric-power, government-laboratory, academic, environmental-stewardship, regulatory, and permitting stakeholder groups. The roadmap describes high-level activities suitable for all sectors with a stake in wind power and energy development. It is intended to be a 'living document,' and DOE expects to engage the wind community from time to time to track progress.
NASA Technical Reports Server (NTRS)
Crouch, Roger
2004-01-01
Viewgraphs on NASA's transition to its vision for space exploration are presented. The topics include: 1) Strategic Directives Guiding the Human Support Technology Program; 2) Progressive Capabilities; 3) A Journey to Inspire, Innovate, and Discover; 4) Risk Mitigation Status: Technology Readiness Level (TRL) and Countermeasures Readiness Level (CRL); 5) Biological And Physical Research Enterprise Aligning With The Vision For U.S. Space Exploration; 6) Critical Path Roadmap Reference Missions; 7) Rating Risks; 8) Current Critical Path Roadmap (Draft) Rating Risks: Human Health; 9) Current Critical Path Roadmap (Draft) Rating Risks: System Performance/Efficiency; 10) Biological And Physical Research Enterprise Efforts to Align With Vision For U.S. Space Exploration; 11) Aligning with the Vision: Exploration Research Areas of Emphasis; 12) Code U Efforts To Align With The Vision For U.S. Space Exploration; 13) Types of Critical Path Roadmap Risks; and 14) ISS Human Support Systems Research, Development, and Demonstration. A summary discussing the vision for U.S. space exploration is also provided.
Probabilistic Design Storm Method for Improved Flood Estimation in Ungauged Catchments
NASA Astrophysics Data System (ADS)
Berk, Mario; Špačková, Olga; Straub, Daniel
2017-12-01
The design storm approach with event-based rainfall-runoff models is a standard method for design flood estimation in ungauged catchments. The approach is conceptually simple and computationally inexpensive, but the underlying assumptions can lead to flawed design flood estimations. In particular, the implied average recurrence interval (ARI) neutrality between rainfall and runoff neglects uncertainty in other important parameters, leading to an underestimation of design floods. The selection of a single representative critical rainfall duration in the analysis leads to an additional underestimation of design floods. One way to overcome these nonconservative approximations is the use of a continuous rainfall-runoff model, which is associated with significant computational cost and requires rainfall input data that are often not readily available. As an alternative, we propose a novel Probabilistic Design Storm method that combines event-based flood modeling with basic probabilistic models and concepts from reliability analysis, in particular the First-Order Reliability Method (FORM). The proposed methodology overcomes the limitations of the standard design storm approach, while utilizing the same input information and models without excessive computational effort. Additionally, the Probabilistic Design Storm method allows deriving so-called design charts, which summarize representative design storm events (combinations of rainfall intensity and other relevant parameters) for floods with different return periods. These can be used to study the relationship between rainfall and runoff return periods. We demonstrate, investigate, and validate the method by means of an example catchment located in the Bavarian Pre-Alps, in combination with a simple hydrological model commonly used in practice.
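FORM, as used in the Probabilistic Design Storm method, can be sketched in its simplest special case: a linear limit state with independent Normal variables, where the Hasofer-Lind reliability index gives the failure probability exactly. The capacity and discharge figures below are hypothetical.

```python
from math import erf, sqrt

def form_linear_normal(mu_g, sigma_g):
    """FORM for a linear limit state g with independent Normal inputs.

    beta is the Hasofer-Lind reliability index; P_f = Phi(-beta).
    For nonlinear limit states FORM requires an iterative search for the
    design point; this closed form covers only the linear-Normal case.
    """
    beta = mu_g / sigma_g
    pf = 0.5 * (1 + erf(-beta / sqrt(2)))
    return beta, pf

# Hypothetical margin g = channel capacity - peak discharge, both Normal:
mu_g = 120.0 - 80.0                # mean margin (m^3/s)
sigma_g = sqrt(20.0**2 + 25.0**2)  # combined standard deviation
beta, pf = form_linear_normal(mu_g, sigma_g)
print(f"beta = {beta:.2f}, P_f = {pf:.3e}")
```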
Evidence-based risk communication: a systematic review.
Zipkin, Daniella A; Umscheid, Craig A; Keating, Nancy L; Allen, Elizabeth; Aung, KoKo; Beyth, Rebecca; Kaatz, Scott; Mann, Devin M; Sussman, Jeremy B; Korenstein, Deborah; Schardt, Connie; Nagi, Avishek; Sloane, Richard; Feldstein, David A
2014-08-19
Effective communication of risks and benefits to patients is critical for shared decision making. This review assesses the comparative effectiveness of methods of communicating probabilistic information to patients that maximize their cognitive and behavioral outcomes. Sources were PubMed (1966 to March 2014) and CINAHL, EMBASE, and the Cochrane Central Register of Controlled Trials (1966 to December 2011), searched using several keywords and structured terms. Eligible studies were prospective or cross-sectional studies that recruited patients or healthy volunteers and compared any method of communicating probabilistic information with another method. Two independent reviewers extracted study characteristics and assessed risk of bias. Eighty-four articles, representing 91 unique studies, evaluated various methods of numerical and visual risk display across several risk scenarios and with diverse outcome measures. Studies showed that visual aids (icon arrays and bar graphs) improved patients' understanding and satisfaction. Presentations including absolute risk reductions were better than those including relative risk reductions for maximizing accuracy and seemed less likely than presentations with relative risk reductions to influence decisions to accept therapy. The presentation of numbers needed to treat reduced understanding. Comparative effects of presentations of frequencies (such as 1 in 5) versus event rates (percentages, such as 20%) were inconclusive. Most studies were small and highly variable in terms of setting, context, and methods of administering interventions. Visual aids and absolute risk formats can improve patients' understanding of probabilistic information, whereas numbers needed to treat can lessen their understanding. Due to study heterogeneity, the superiority of any single method for conveying probabilistic information is not established, but there are several good options to help clinicians communicate with patients.
Scalable DB+IR Technology: Processing Probabilistic Datalog with HySpirit.
Frommholz, Ingo; Roelleke, Thomas
2016-01-01
Probabilistic Datalog (PDatalog, proposed in 1995) is a probabilistic variant of Datalog and a nice conceptual idea to model Information Retrieval in a logical, rule-based programming paradigm. Making PDatalog work in real-world applications requires more than probabilistic facts and rules, and the semantics associated with the evaluation of the programs. We report in this paper some of the key features of the HySpirit system required to scale the execution of PDatalog programs. Firstly, there is the requirement to express probability estimation in PDatalog. Secondly, fuzzy-like predicates are required to model vague predicates (e.g. vague match of attributes such as age or price). Thirdly, to handle large data sets there are scalability issues to be addressed, and therefore, HySpirit provides probabilistic relational indexes and parallel and distributed processing. The main contribution of this paper is a consolidated view on the methods of the HySpirit system to make PDatalog applicable in real-scale applications that involve a wide range of requirements typical for data (information) management and analysis.
Eddy, Sean R.
2008-01-01
Sequence database searches require accurate estimation of the statistical significance of scores. Optimal local sequence alignment scores follow Gumbel distributions, but determining an important parameter of the distribution (λ) requires time-consuming computational simulation. Moreover, optimal alignment scores are less powerful than probabilistic scores that integrate over alignment uncertainty (“Forward” scores), but the expected distribution of Forward scores remains unknown. Here, I conjecture that both expected score distributions have simple, predictable forms when full probabilistic modeling methods are used. For a probabilistic model of local sequence alignment, optimal alignment bit scores (“Viterbi” scores) are Gumbel-distributed with constant λ = log 2, and the high scoring tail of Forward scores is exponential with the same constant λ. Simulation studies support these conjectures over a wide range of profile/sequence comparisons, using 9,318 profile-hidden Markov models from the Pfam database. This enables efficient and accurate determination of expectation values (E-values) for both Viterbi and Forward scores for probabilistic local alignments. PMID:18516236
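Under the conjecture, an E-value for a Viterbi bit score needs only the Gumbel location mu, since lambda is fixed at log 2. A sketch with hypothetical mu and database size:

```python
from math import exp, log

def gumbel_evalue(score_bits, mu, n_targets, lam=log(2)):
    """E-value for an optimal-alignment bit score under a Gumbel tail.

    P(S > s) = 1 - exp(-exp(-lam * (s - mu))); E = n_targets * P(S > s).
    The conjecture fixes lam = log 2 for full probabilistic models, leaving
    only mu to determine; mu and n_targets here are hypothetical.
    """
    p = 1.0 - exp(-exp(-lam * (score_bits - mu)))
    return n_targets * p

print(gumbel_evalue(score_bits=35.0, mu=-8.0, n_targets=20_000))
```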
The COSPAR roadmap on Space-based observation and Integrated Earth System Science for 2016-2025
NASA Astrophysics Data System (ADS)
Fellous, Jean-Louis
2016-07-01
The Committee on Space Research of the International Council for Science recently commissioned a study group to prepare a roadmap on observation and integrated Earth-system science for the coming ten years. Its focus is on the combined use of observations and modelling to address the functioning, predictability and projected evolution of the Earth system on timescales out to a century or so. It discusses how observations support integrated Earth-system science and its applications, and identifies planned enhancements to the contributing observing systems and other requirements for observations and their processing. The paper will provide an overview of the content of the roadmap. All types of observation are considered in the roadmap, but emphasis is placed on those made from space. The origins and development of the integrated view of the Earth system are outlined, noting the interactions between the main components that lead to requirements for integrated science and modelling, and for the observations that guide and support them. What constitutes an Earth-system model is discussed. Summaries are given of key cycles within the Earth system. The nature of Earth observation and the arrangements for international coordination essential for effective operation of global observing systems are introduced in the roadmap. Instances are given of present types of observation, what is already on the roadmap for 2016-2025 and some of the issues to be faced. The current status and prospects for Earth-system modelling are summarized. Data assimilation is discussed not only because it uses observations and models to generate datasets for monitoring the Earth system and for initiating and evaluating predictions, in particular through reanalysis, but also because of the feedback it provides on the quality of both the observations and the models employed. Finally the roadmap offers a set of concluding discussions covering general developmental needs, requirements for continuity of space-based observing systems, further long-term requirements for observations and other data, technological advances and data challenges, and the importance of enhanced international cooperation.
NASA Astrophysics Data System (ADS)
Yu, Bo; Ning, Chao-lie; Li, Bing
2017-03-01
A probabilistic framework for durability assessment of concrete structures in marine environments was proposed in terms of reliability and sensitivity analysis, which takes into account the uncertainties under the environmental, material, structural and executional conditions. A time-dependent probabilistic model of chloride ingress was established first to consider the variations in various governing parameters, such as the chloride concentration, chloride diffusion coefficient, and age factor. Then the Nataf transformation was adopted to transform the non-normal random variables from the original physical space into the independent standard Normal space. After that the durability limit state function and its gradient vector with respect to the original physical parameters were derived analytically, based on which the first-order reliability method was adopted to analyze the time-dependent reliability and parametric sensitivity of concrete structures in marine environments. The accuracy of the proposed method was verified by comparing with the second-order reliability method and the Monte Carlo simulation. Finally, the influences of environmental conditions, material properties, structural parameters and execution conditions on the time-dependent reliability of concrete structures in marine environments were also investigated. The proposed probabilistic framework can be implemented in the decision-making algorithm for the maintenance and repair of deteriorating concrete structures in marine environments.
76 FR 11308 - Aviation Noise Impacts Roadmap Annual Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-01
... impacts. The purpose of the meeting is to update and advance our collective scientific knowledge of the... Aviation Administration (FAA), National Aeronautics and Space Administration (NASA), Department of Defense... knowledge gaps and future research activities. The intent of the Roadmap is to define systematic, focused...
The Risk Assessment in the 21st Century (RISK21): Roadmap and Matrix
The RISK21 integrated evaluation strategy is a problem formulation-based exposure-driven risk assessment roadmap that takes advantage of existing information to graphically represent the intersection of exposure and toxicity data on a highly visual matrix. This paper describes i...
Materials Technical Team Roadmap
DOE Office of Scientific and Technical Information (OSTI.GOV)
2013-08-01
Roadmap identifying the efforts of the Materials Technical Team (MTT), which focus primarily on reducing the mass of structural systems such as the body and chassis in light-duty vehicles (including passenger cars and light trucks); reducing mass improves vehicle efficiency regardless of the vehicle size or propulsion system employed.
NASA's Deep Space Telecommunications Roadmap
NASA Technical Reports Server (NTRS)
Edwards, C., Jr.; Stelzried, C.; Deutsch, L.; Swanson, L.
1998-01-01
This paper will present this roadmap, describe how it will support an increasing mission set while also providing significantly increased science data return, summarize the current state of key Ka-band and optical communications technologies, and identify critical path items in terms of technology developments, demonstrations, and mission users.
NASA Technical Reports Server (NTRS)
Chiaramonte, Fran
2003-01-01
This viewgraph presentation discusses the status and goals for the NASA OBPR Physical Science Research Program. The following text was used to summarize the presentation. The OBPR Physical Sciences Research program has been comprehensively reviewed and endorsed by the National Research Council. The value of and need for the research have been re-affirmed. The research program has been prioritized and resource re-allocations have been carried out through an OBPR-wide process. An increasing emphasis on strategic, mission-oriented research is planned. The program will strive to maintain a balance between strategic and fundamental research. A feasible ISS flight research program fitting within the budgetary and ISS resource envelopes has been formulated for the near term (2003-2007). The current ISS research program will be significantly strengthened starting in 2005 by using discipline-dedicated research facility racks. A research re-planning effort has been initiated and will include active participation from the research community in the next few months. The research re-planning effort will poise PSR to increase ISS research utilization for a potential enhancement beyond ISS IP Core Complete. The Physical Sciences research program readily integrates the cross-disciplinary requirements of the NASA and OBPR strategic objectives. Each fundamental research thrust will develop a roadmap through technical workshops and Discipline Working Groups (DWGs). Most fundamental research thrusts will involve cross-disciplinary efforts. A Technology Roadmap will guide the Strategic Research for Exploration thrust. The Research Plan will integrate and coordinate the fundamental Research Thrust Roadmaps with the Technology Roadmap. The Technology Roadmap will be developed in coordination with other OBPR programs as well as other Enterprises (R, S, M, N). International Partners will contribute to the roadmaps and through research coordination. The research plan will be vetted with the discipline working groups, the BPRAC subcommittees, and the BPRAC. Recommendations from past and current NRC committees will be implemented whenever appropriate. Proposed theme element content will be "missionized" around planned content and potential new projects (facilities, modules, initiatives) on approximately a five-year horizon, with the approval of PSRD management. Center/science working group teams will develop descriptions of "mission" objectives, value, and requirements. The purpose is to create a competitive environment for concept development and to stimulate community ownership/advocacy. Proposed theme elements are reviewed and approved by PSRD management. Strawman roadmaps for themes are developed. Program budget and technology requirements are verified. Theme elements are prioritized with the input of advisory groups. Integration into program themes (questions) and required technology investments are defined by science and technology roadmaps. Review and assessment are performed by OBPR management.
NASA Technical Reports Server (NTRS)
Johnson, Kenneth L.; White, K, Preston, Jr.
2012-01-01
The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques. This recommended procedure would be used as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. This document contains the outcome of the assessment.
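A hedged sketch of one standard one-sided acceptance-sampling-by-variables plan (lower specification limit, normally distributed measurements). This is the textbook noncentral-t construction, not necessarily the procedure the NESC document recommends, and all numbers are invented.

import numpy as np
from scipy import stats

def k_factor(n, p_reject, beta):
    # k such that a lot whose true fraction nonconforming equals p_reject
    # is accepted with probability at most beta.
    z = stats.norm.ppf(1.0 - p_reject)
    return stats.nct.ppf(1.0 - beta, df=n - 1, nc=np.sqrt(n) * z) / np.sqrt(n)

def accept_lot(sample, lsl, k):
    # Variables criterion: accept when (mean - LSL) / s >= k.
    m, s = np.mean(sample), np.std(sample, ddof=1)
    return (m - lsl) / s >= k

rng = np.random.default_rng(0)
sample = rng.normal(10.0, 1.0, size=30)          # made-up measurements
k = k_factor(n=30, p_reject=0.05, beta=0.10)
print(accept_lot(sample, lsl=7.0, k=k))

The appeal over attributes sampling is visible here: the test uses the measured mean and standard deviation directly, so far fewer units are needed than when counting defectives.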
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-25
... purpose and need, the alternatives to be studied, the impacts to be evaluated, and the evaluation methods... clear roadmap for concise development of the environmental document. In the interest of producing a... unincorporated Los Angeles County which includes east Los Angeles and west Whittier-Los Nietos. A diverse mix of...
Pest risk maps for invasive alien species: a roadmap for improvement
Robert C. Venette; Darren J. Kriticos; Roger D. Magarey; Frank H. Koch; Richard H. A. Baker; Susan P. Worner; Nadila N. Gomez Raboteaux; Daniel W. McKenney; Erhard J. Dobesberger; Denys Yemshanov; Paul J. De Barro; William D. Hutchinson; Glenn Fowler; Tom M. Kalaris; John Pedlar
2010-01-01
Pest risk maps are powerful visual communication tools to describe where invasive alien species might arrive, establish, spread, or cause harmful impacts. These maps inform strategic and tactical pest management decisions, such as potential restrictions on international trade or the design of pest surveys and domestic quarantines. Diverse methods are available to...
Sensors for process control Focus Team report
NASA Astrophysics Data System (ADS)
At the Semiconductor Technology Workshop, held in November 1992, the Semiconductor Industry Association (SIA) convened 179 semiconductor technology experts to assess the 15-year outlook for the semiconductor manufacturing industry. The output of the Workshop, a document entitled 'Semiconductor Technology: Workshop Working Group Reports,' contained an overall roadmap for the technology characteristics envisioned in integrated circuits (IC's) for the period 1992-2007. In addition, the document contained individual roadmaps for numerous key areas in IC manufacturing, such as film deposition, thermal processing, manufacturing systems, exposure technology, etc. The SIA Report did not contain a separate roadmap for contamination free manufacturing (CFM). A key component of CFM for the next 15 years is the use of sensors for (1) defect reduction, (2) improved product quality, (3) improved yield, (4) improved tool utilization through contamination reduction, and (5) real time process control in semiconductor fabrication. The objective of this Focus Team is to generate a Sensors for Process Control Roadmap. Implicit in this objective is the identification of gaps in current sensor technology so that research and development activity in the sensor industry can be stimulated to develop sensor systems capable of meeting the projected roadmap needs. Sensor performance features of interest include detection limit, specificity, sensitivity, ease of installation and maintenance, range, response time, accuracy, precision, ease and frequency of calibration, degree of automation, and adaptability to in-line process control applications.
Roadmap for In-Space Propulsion Technology
NASA Technical Reports Server (NTRS)
Meyer, Michael; Johnson, Les; Palaszewski, Bryan; Coote, David; Goebel, Dan; White, Harold
2012-01-01
NASA has created a roadmap for the development of advanced in-space propulsion technologies for the NASA Office of the Chief Technologist (OCT). This roadmap was drafted by a team of subject matter experts from within the Agency and then independently evaluated, integrated and prioritized by a National Research Council (NRC) panel. The roadmap describes a portfolio of in-space propulsion technologies that could meet future space science and exploration needs, and shows their traceability to potential future missions. Mission applications range from small satellites and robotic deep space exploration to space stations and human missions to Mars. Development of technologies within the area of in-space propulsion will result in technical solutions with improvements in thrust, specific impulse (Isp), power, specific mass (or specific power), volume, system mass, system complexity, operational complexity, commonality with other spacecraft systems, manufacturability, durability, and of course, cost. These types of improvements will yield decreased transit times, increased payload mass, safer spacecraft, and decreased costs. In some instances, development of technologies within this area will result in mission-enabling breakthroughs that will revolutionize space exploration. There is no single propulsion technology that will benefit all missions or mission types. The requirements for in-space propulsion vary widely according to their intended application. This paper provides an updated summary of the In-Space Propulsion Systems technology area roadmap incorporating the recommendations of the NRC.
Risk assessment for construction projects of transport infrastructure objects
NASA Astrophysics Data System (ADS)
Titarenko, Boris
2017-10-01
The paper analyzes and compares different methods of risk assessment for construction projects of transport objects. Managing such projects demands special probabilistic methods because of the high level of uncertainty in their implementation, so risk management in these projects requires probabilistic and statistical techniques. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that yield reliable risk assessments. The robust approach is based on the principle of maximum likelihood and allows the researcher to obtain reliable risk estimates in situations of great uncertainty. Robust procedures make it possible to quantify the main risk indicators of projects when managing innovation-investment projects. Any competent specialist can calculate the damage from the occurrence of a risky event; assessing the probability of such an event, however, requires special probabilistic methods based on the proposed robust approaches. Practice shows that the results are effective and reliable. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.
A new discriminative kernel from probabilistic models.
Tsuda, Koji; Kawanabe, Motoaki; Rätsch, Gunnar; Sonnenburg, Sören; Müller, Klaus-Robert
2002-10-01
Recently, Jaakkola and Haussler (1999) proposed a method for constructing kernel functions from probabilistic models. Their so-called Fisher kernel has been combined with discriminative classifiers such as support vector machines and applied successfully in, for example, DNA and protein analysis. Whereas the Fisher kernel is calculated from the marginal log-likelihood, we propose the TOP kernel, derived from tangent vectors of posterior log-odds. Furthermore, we develop a theoretical framework on feature extractors from probabilistic models and use it for analyzing the TOP kernel. In experiments, our new discriminative TOP kernel compares favorably to the Fisher kernel.
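To make the kernel-from-model idea concrete, here is a toy Fisher-kernel computation for a multinomial sequence model, K(x, y) = U_x^T I^{-1} U_y with U_x the score vector. The diagonal Fisher information and the unconstrained score are simplifications for brevity, not the paper's TOP construction.

import numpy as np

theta = np.array([0.5, 0.3, 0.2])        # fitted multinomial parameters (hypothetical)

def score(counts):
    # Gradient of log p(x|theta) = sum_c n_c log theta_c with respect to theta
    # (the simplex constraint is ignored here for brevity).
    return counts / theta

def fisher_kernel(cx, cy, info):
    ux, uy = score(cx), score(cy)
    return ux @ np.linalg.solve(info, uy)

info = np.diag(4.0 / theta)              # crude diagonal Fisher information, 4 draws
k = fisher_kernel(np.array([3., 1., 0.]), np.array([2., 1., 1.]), info)

The TOP kernel replaces the marginal log-likelihood score with tangent vectors of the posterior log-odds, but the plumbing is the same.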
NASA Technical Reports Server (NTRS)
Rajagopal, K. R.
1992-01-01
The technical effort and computer code development are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis. Volume 2 is a summary of critical SSME components.
Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.
2003-01-01
Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and in industry. However, these analyses at present are used for turbomachinery design with uncertainties accounted for by safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe, reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are probability density functions for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Of the total 11 design parameters, six are considered to have probabilistic variation: space-to-chord ratio (SBYC), stagger angle (GAMA), elastic axis (ELAXS), Mach number (MACH), mass ratio (MASSR), and frequency ratio (WHWB). The cascade is considered to be in subsonic flow at Mach 0.7. The results of the probabilistic aeroelastic analysis are the probability density functions of the predicted aerodynamic damping and frequency for flutter and of the response amplitudes for forced response.
Probabilistic topic modeling for the analysis and classification of genomic sequences
2015-01-01
Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies are focusing on the so-called barcode genes, representing a well-defined region of the whole genome. Recently, alignment-free techniques have been gaining importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper, a new alignment-free method for DNA sequence clustering and classification is proposed. The method is based on a k-mer representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied to DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from the Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and the Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method achieves results very similar to those of the RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM on ultra-short sequences and exhibits a smooth decrease of performance, at every taxonomic level, when the sequence length is decreased. PMID:25916734
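A rough sketch of the pipeline as described, with k-mers playing the role of words and LDA topic mixtures serving as features, using scikit-learn. The sequences and parameter choices are toy values, and the paper's exact preprocessing may differ.

import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def to_kmer_doc(seq, k=4):
    # Turn a DNA sequence into a "document" of overlapping k-mers.
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

seqs = ["ACGTACGTGGCA", "TTGGCCAACGTT", "ACGTTTGGCAAC"]   # toy sequences
docs = [to_kmer_doc(s) for s in seqs]

counts = CountVectorizer().fit_transform(docs)            # k-mer frequency matrix
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
features = lda.transform(counts)   # per-sequence topic mixtures, usable by a classifier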
Probabilistic Simulation of Multi-Scale Composite Behavior
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2012-01-01
A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process, and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate, and structural responses. By-products of the methodology are the probabilistic sensitivities of the composite primitive variables. The methodology has been implemented in the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and the probabilistic design of a composite radome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the radome reliability by several orders of magnitude without increasing the laminate thickness, a unique feature of structural composites. The early date of the underlying reference indicates that nothing fundamental has been done in this area since that time.
NASA Astrophysics Data System (ADS)
Sanchez, J.
2018-06-01
In this paper, the asymptotic approximation method is applied to, and analyzed for, single degree-of-freedom systems. The original concepts are summarized, and the necessary probabilistic concepts are developed and applied to single degree-of-freedom systems. These concepts are then united, and the theoretical and computational models are developed. To determine the viability of the proposed method in a probabilistic context, numerical experiments are conducted, consisting of a frequency analysis, an analysis of the effects of measurement noise, and a statistical analysis. In addition, two examples are presented and discussed.
Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions.
Nantha, Yogarabindranath Swarna
2017-11-01
A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy.
Composite Load Spectra for Select Space Propulsion Structural Components
NASA Technical Reports Server (NTRS)
Ho, Hing W.; Newell, James F.
1994-01-01
Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components, representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, and liquid oxygen (LOX) posts. These generic (coupled) models combine deterministic models for simulating dynamic, acoustic, high-pressure, high-rotational-speed, and other loads, using statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods, with and without strategically selected experimental data. The entire simulation process is included in a CLS computer code. Applications of the computer code to various components, in conjunction with the PSAM (Probabilistic Structural Analysis Method), to perform probabilistic load evaluation and life prediction are also described to illustrate the effectiveness of the coupled-model approach.
1983-07-01
be a useful tool for assessing knowledge, but there are several problems with this item format. These problems include the possibility of an examinee... 1959. Kane, M. T., & Moloney, J. M. The effect of SSM grading on reliability when residual items have no discriminating power. Paper presented at
Advanced Telescopes and Observatories Capability Roadmap Presentation to the NRC
NASA Technical Reports Server (NTRS)
2005-01-01
This viewgraph presentation provides an overview of the NASA Advanced Planning and Integration Office (APIO) roadmap for developing technological capabilities for telescopes and observatories in the following areas: Optics; Wavefront Sensing and Control and Interferometry; Distributed and Advanced Spacecraft; Large Precision Structures; Cryogenic and Thermal Control Systems; Infrastructure.
Roadmap for Navy Family Research.
1980-08-01
of methodological limitations, including: small, often non-representative or narrowly defined samples; inadequate statistical controls, inadequate... Overview of the Research Roadmap... Methodology... the Office of Naval Research by the Westinghouse Public Applied Systems Division, and is designed to provide the Navy with a systematic framework for
DOT National Transportation Integrated Search
2001-08-01
This roadmap explains how your community can join forces with the nationwide network of Clean Cities to increase the use of alternative fuels and alternative fuel vehicles (AFVs). You will learn how the U.S. Department of Energy (DOE) can help your c...
Leveraging Our Expertise To Inform International RE Roadmaps
...energy targets to support Mexico's renewable energy goal. NREL and its Mexico partners developed... the actions institutions need to take to determine how the electricity infrastructure and systems must change to accommodate high levels of renewables. The roadmap focuses on analysis methodologies, including grid expansion...
Human Health and Support Systems Capability Roadmap Progress Review
NASA Technical Reports Server (NTRS)
Grounds, Dennis; Boehm, Al
2005-01-01
The Human Health and Support Systems Capability Roadmap focuses on research and technology development and demonstration required to ensure the health, habitation, safety, and effectiveness of crews in and beyond low Earth orbit. It contains three distinct sub-capabilities: Human Health and Performance; Life Support and Habitats; and Extra-Vehicular Activity.
Roadmap to Measuring Distance Education Instructional Design Competencies
ERIC Educational Resources Information Center
Dooley, Kim E.; Lindner, James R.; Telg, Ricky W.; Irani, Tracy; Moore, Lori; Lundy, Lisa
2007-01-01
This study was designed to measure instructional design competencies as a result of participation in a 9-month Web-based training program called "Roadmap to Effective Distance Education Instructional Design." The researchers used a self-assessment pre- and posttest to determine participant initial and final competence in 12 areas: adult…
Roadmapping towards Sustainability Proficiency in Engineering Education
ERIC Educational Resources Information Center
Rodriguez-Andara, Alejandro; Río-Belver, Rosa María; Rodríguez-Salvador, Marisela; Lezama-Nicolás, René
2018-01-01
Purpose: The purpose of this paper is to deliver a roadmap that displays pathways to develop sustainability skills in the engineering curricula. Design/methodology/approach: The selected approach to enrich engineering students with sustainability skills was active learning methodologies. First, a survey was carried out on a sample of 189 students…
Six Tips for Successful IEP Meetings
ERIC Educational Resources Information Center
Diliberto, Jennifer A.; Brewer, Denise
2012-01-01
Individuals with Disabilities Education Improvement Act (IDEIA, 2004) mandates that each student with a disability has an individualized education program (IEP). The IEP serves as the curriculum roadmap for special education services. In order to generate a clear roadmap, full team communication is necessary. The purpose of this paper is to…
An Imaging Roadmap for Biology Education: From Nanoparticles to Whole Organisms
ERIC Educational Resources Information Center
Kelley, Daniel J.; Davidson, Richard J.; Nelson, David L.
2008-01-01
Imaging techniques provide ways of knowing structure and function in biology at different scales. The multidisciplinary nature and rapid advancement of imaging sciences requires imaging education to begin early in the biology curriculum. Guided by the National Institutes of Health (NIH) Roadmap initiatives, we incorporated a nanoimaging, molecular…
Science Instruments and Sensors Capability Roadmap: NRC Dialogue
NASA Technical Reports Server (NTRS)
Barney, Rich; Zuber, Maria
2005-01-01
The Science Instruments and Sensors roadmaps include capabilities associated with the collection, detection, conversion, and processing of scientific data required to answer compelling science questions driven by the Vision for Space Exploration and The New Age of Exploration (NASA's Direction for 2005 & Beyond). Viewgraphs on these instruments and sensors are presented.
The Roadmap presents critical issues and research questions for each theme. For Theme 1, the issues for limiting the harm from materials and processes in the electronics industry include identifying the chemicals in products, in the production process, and in the extraction of virgin materials, i...
Review of the Semiconductor Industry and Technology Roadmap.
ERIC Educational Resources Information Center
Kumar, Sameer; Krenner, Nicole
2002-01-01
Points out that the semiconductor industry is extremely competitive, requiring ongoing technological advances that improve performance while reducing costs, and stresses how essential it is to gain an understanding of important facets of the industry. Provides an overview of the initial and current semiconductor technology roadmap that…
NASA Astrophysics Data System (ADS)
Sander, D.; Valenzuela, S. O.; Makarov, D.; Marrows, C. H.; Fullerton, E. E.; Fischer, P.; McCord, J.; Vavassori, P.; Mangin, S.; Pirro, P.; Hillebrands, B.; Kent, A. D.; Jungwirth, T.; Gutfleisch, O.; Kim, C. G.; Berger, A.
2017-09-01
Building upon the success and relevance of the 2014 Magnetism Roadmap, this 2017 Magnetism Roadmap edition follows a similar general layout, even if its focus is naturally shifted, and a different group of experts and, thus, viewpoints are being collected and presented. More importantly, key developments have changed the research landscape in very relevant ways, so that a novel view onto some of the most crucial developments is warranted, and thus, this 2017 Magnetism Roadmap article is a timely endeavour. The change in landscape is hereby not exclusively scientific, but also reflects the magnetism related industrial application portfolio. Specifically, Hard Disk Drive technology, which still dominates digital storage and will continue to do so for many years, if not decades, has now limited its footprint in the scientific and research community, whereas significantly growing interest in magnetism and magnetic materials in relation to energy applications is noticeable, and other technological fields are emerging as well. Also, more and more work is occurring in which complex topologies of magnetically ordered states are being explored, hereby aiming at a technological utilization of the very theoretical concepts that were recognised by the 2016 Nobel Prize in Physics. Given this somewhat shifted scenario, it seemed appropriate to select topics for this Roadmap article that represent the three core pillars of magnetism, namely magnetic materials, magnetic phenomena and associated characterization techniques, as well as applications of magnetism. While many of the contributions in this Roadmap have clearly overlapping relevance in all three fields, their relative focus is mostly associated to one of the three pillars. In this way, the interconnecting roles of having suitable magnetic materials, understanding (and being able to characterize) the underlying physics of their behaviour and utilizing them for applications and devices is well illustrated, thus giving an accurate snapshot of the world of magnetism in 2017. The article consists of 14 sections, each written by an expert in the field and addressing a specific subject on two pages. Evidently, the depth at which each contribution can describe the subject matter is limited and a full review of their statuses, advances, challenges and perspectives cannot be fully accomplished. Also, magnetism, as a vibrant research field, is too diverse, so that a number of areas will not be adequately represented here, leaving space for further Roadmap editions in the future. However, this 2017 Magnetism Roadmap article can provide a frame that will enable the reader to judge where each subject and magnetism research field stands overall today and which directions it might take in the foreseeable future. The first material focused pillar of the 2017 Magnetism Roadmap contains five articles, which address the questions of atomic scale confinement, 2D, curved and topological magnetic materials, as well as materials exhibiting unconventional magnetic phase transitions. The second pillar also has five contributions, which are devoted to advances in magnetic characterization, magneto-optics and magneto-plasmonics, ultrafast magnetization dynamics and magnonic transport. The final and application focused pillar has four contributions, which present non-volatile memory technology, antiferromagnetic spintronics, as well as magnet technology for energy and bio-related applications. 
As a whole, the 2017 Magnetism Roadmap article, just as with its 2014 predecessor, is intended to act as a reference point and guideline for emerging research directions in modern magnetism.
Methods for estimating the amount of vernal pool habitat in the northeastern United States
Van Meter, R.; Bailey, L.L.; Grant, E.H.C.
2008-01-01
The loss of small, seasonal wetlands is a major concern for a variety of state, local, and federal organizations in the northeastern U.S. Identifying and estimating the number of vernal pools within a given region is critical to developing long-term conservation and management strategies for these unique habitats and their faunal communities. We use three probabilistic sampling methods (simple random sampling, adaptive cluster sampling, and the dual frame method) to estimate the number of vernal pools on protected, forested lands. Overall, these methods yielded similar values of vernal pool abundance for each study area, and suggest that photographic interpretation alone may grossly underestimate the number of vernal pools in forested habitats. We compare the relative efficiency of each method and discuss ways of improving precision. Acknowledging that the objectives of a study or monitoring program ultimately determine which sampling designs are most appropriate, we recommend that some type of probabilistic sampling method be applied. We view the dual-frame method as an especially useful way of combining incomplete remote sensing methods, such as aerial photograph interpretation, with a probabilistic sample of the entire area of interest to provide more robust estimates of the number of vernal pools and a more representative sample of existing vernal pool habitats.
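A minimal sketch of the dual-frame idea under simplifying assumptions (an equal-probability plot sample and perfect detection within surveyed plots); all numbers are invented and the authors' estimator may differ in detail.

import numpy as np

def dual_frame_total(list_count, N_plots, pools_found, already_listed):
    # Total = pools on the (incomplete) photo-interpreted list, plus a simple
    # expansion estimate of unlisted pools from the sampled plots.
    unlisted = np.asarray(pools_found) - np.asarray(already_listed)
    return list_count + N_plots * unlisted.mean()

est = dual_frame_total(list_count=120, N_plots=1000,
                       pools_found=[0, 1, 0, 2, 1], already_listed=[0, 1, 0, 1, 0])

The list frame contributes its known pools exactly; only the pools missed by photo interpretation need a probabilistic expansion, which is what tightens the estimate relative to an area sample alone.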
Probabilistic Assessment of Fracture Progression in Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Minnetyan, Levon; Mauget, Bertrand; Huang, Dade; Addi, Frank
1999-01-01
This report describes methods and corresponding computer codes that are used to evaluate progressive damage and fracture and to perform probabilistic assessment in built-up composite structures. Structural response is assessed probabilistically, during progressive fracture. The effects of design variable uncertainties on structural fracture progression are quantified. The fast probability integrator (FPI) is used to assess the response scatter in the composite structure at damage initiation. The sensitivity of the damage response to design variables is computed. The methods are general purpose and are applicable to stitched and unstitched composites in all types of structures and fracture processes starting from damage initiation to unstable propagation and to global structure collapse. The methods are demonstrated for a polymer matrix composite stiffened panel subjected to pressure. The results indicated that composite constituent properties, fabrication parameters, and respective uncertainties have a significant effect on structural durability and reliability. Design implications with regard to damage progression, damage tolerance, and reliability of composite structures are examined.
Probabilistic segmentation and intensity estimation for microarray images.
Gottardo, Raphael; Besag, Julian; Stephens, Matthew; Murua, Alejandro
2006-01-01
We describe a probabilistic approach to simultaneous image segmentation and intensity estimation for complementary DNA microarray experiments. The approach overcomes several limitations of existing methods. In particular, it (a) uses a flexible Markov random field approach to segmentation that allows for a wider range of spot shapes than existing methods, including relatively common 'doughnut-shaped' spots; (b) models the image directly as background plus hybridization intensity, and estimates the two quantities simultaneously, avoiding the common logical error that estimates of foreground may be less than those of the corresponding background if the two are estimated separately; and (c) uses a probabilistic modeling approach to simultaneously perform segmentation and intensity estimation, and to compute spot quality measures. We describe two approaches to parameter estimation: a fast algorithm, based on the expectation-maximization and the iterated conditional modes algorithms, and a fully Bayesian framework. These approaches produce comparable results, and both appear to offer some advantages over other methods. We use an HIV experiment to compare our approach to two commercial software products: Spot and Arrayvision.
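As a sketch of the segmentation machinery, here is a small iterated-conditional-modes (ICM) loop for a two-class Gaussian-likelihood-plus-Potts-prior model; this is a generic textbook version, not the authors' spot-shape-aware Markov random field.

import numpy as np

def icm_segment(img, mu, sigma, beta=1.0, sweeps=5):
    # Each pixel label minimizes a Gaussian data term plus a Potts smoothing term.
    lab = (img > img.mean()).astype(int)      # crude initialization
    H, W = img.shape
    for _ in range(sweeps):
        for y in range(H):
            for x in range(W):
                energies = []
                for c in (0, 1):
                    data = 0.5 * ((img[y, x] - mu[c]) / sigma[c]) ** 2 + np.log(sigma[c])
                    nb = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
                    disagree = sum(1 for v, u in nb
                                   if 0 <= v < H and 0 <= u < W and lab[v, u] != c)
                    energies.append(data + beta * disagree)
                lab[y, x] = int(np.argmin(energies))
    return lab

rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (16, 16))
img[4:12, 4:12] += 3.0                        # synthetic bright "spot"
labels = icm_segment(img, mu=(0.0, 3.0), sigma=(1.0, 1.0))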
Probabilistic biological network alignment.
Todor, Andrei; Dobra, Alin; Kahveci, Tamer
2013-01-01
Interactions between molecules are probabilistic events. An interaction may or may not happen with some probability, depending on a variety of factors such as the size, abundance, or proximity of the interacting molecules. In this paper, we consider the problem of aligning two biological networks. Unlike existing methods, we allow one of the two networks to contain probabilistic interactions. Allowing interaction probabilities makes the alignment more biologically relevant, at the expense of explosive growth in the number of alternative topologies that may arise from different subsets of interactions that take place. We develop a novel method that efficiently and precisely characterizes this massive search space. We represent the topological similarity between pairs of aligned molecules (i.e., proteins) with the help of random variables and compute their expected values. We validate our method by showing that, without sacrificing running time performance, it can produce novel alignments. Our results also demonstrate that our method identifies biologically meaningful mappings under a comprehensive set of criteria used in the literature, as well as under the statistical coherence measure that we developed to analyze the statistical significance of the similarity of the functions of the aligned protein pairs.
Hiraishi, Kunihiko
2014-01-01
One significant topic in systems biology is the development of control theory for gene regulatory networks (GRNs). In typical control of GRNs, expression of some genes is inhibited (activated) by manipulating external stimuli and the expression of other genes. Control theory of GRNs is expected to be applied to gene therapy technologies in the future. In this paper, a control method using a Boolean network (BN) is studied. A BN is widely used as a model of GRNs, and gene expression is represented by a binary value (ON or OFF). In particular, a context-sensitive probabilistic Boolean network (CS-PBN), one of the extended models of BNs, is used. For CS-PBNs, the verification problem and the optimal control problem are considered. For the verification problem, a solution method using the probabilistic model checker PRISM is proposed. For the optimal control problem, a solution method using polynomial optimization is proposed. Finally, a numerical example on the WNT5A network, which is related to melanoma, is presented. The proposed methods provide useful tools for the control theory of GRNs. PMID:24587766
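A toy probabilistic Boolean network step, to fix ideas: at each update one of several candidate rule sets ("contexts") is drawn with fixed probability and applied synchronously. The rules and probabilities below are invented stand-ins, not the WNT5A model.

import numpy as np

rng = np.random.default_rng(0)

contexts = [   # two alternative Boolean rule sets for three genes
    [lambda s: s[1], lambda s: s[0] and not s[2], lambda s: not s[0]],
    [lambda s: not s[2], lambda s: s[0], lambda s: s[1] or s[2]],
]
probs = [0.7, 0.3]   # context selection probabilities

def step(state):
    rules = contexts[rng.choice(len(contexts), p=probs)]
    return tuple(int(f(state)) for f in rules)

state = (1, 0, 1)
for _ in range(5):
    state = step(state)   # one stochastic trajectory of the network

Verification and control then amount to reasoning over the Markov chain these stochastic updates induce on the 2^n state space, which is where tools like PRISM come in.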
Optimal Collision Avoidance Trajectories for Unmanned/Remotely Piloted Aircraft
2014-12-26
projected operational tempos (OPTEMPOs)” [15]. The Office of the Secretary of Defense (OSD) Unmanned Systems Roadmap [15] goes on to say that the airspace... methods [63]. In an indirect method, the researcher derives the first-order necessary conditions for optimality “via the calculus of variations and... region around the ownship using a variation of a superquadric. From [116], the standard equation for a superellipsoid appears as: (x/a1)^(2/ε2)...
Probabilistic Component Mode Synthesis of Nondeterministic Substructures
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ferri, Aldo A.
1996-01-01
Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems with these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. We present a method where the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented to illustrate the theory.
A probabilistic method for testing and estimating selection differences between populations
He, Yungang; Wang, Minxian; Huang, Xin; Li, Ran; Xu, Hongyang; Xu, Shuhua; Jin, Li
2015-01-01
Human populations around the world encounter various environmental challenges and, consequently, develop genetic adaptations to different selection forces. Identifying the differences in natural selection between populations is critical for understanding the roles of specific genetic variants in evolutionary adaptation. Although numerous methods have been developed to detect genetic loci under recent directional selection, a probabilistic solution for testing and quantifying selection differences between populations is lacking. Here we report the development of a probabilistic method for testing and estimating selection differences between populations. By use of a probabilistic model of genetic drift and selection, we showed that logarithm odds ratios of allele frequencies provide estimates of the differences in selection coefficients between populations. The estimates approximate a normal distribution, and variance can be estimated using genome-wide variants. This allows us to quantify differences in selection coefficients and to determine the confidence intervals of the estimate. Our work also revealed the link between genetic association testing and hypothesis testing of selection differences. It therefore supplies a solution for hypothesis testing of selection differences. This method was applied to a genome-wide data analysis of Han and Tibetan populations. The results confirmed that both the EPAS1 and EGLN1 genes are under statistically different selection in Han and Tibetan populations. We further estimated differences in the selection coefficients for genetic variants involved in melanin formation and determined their confidence intervals between continental population groups. Application of the method to empirical data demonstrated the outstanding capability of this novel approach for testing and quantifying differences in natural selection. PMID:26463656
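A hedged sketch of the statistic itself: per-variant log odds ratios of allele frequencies, standardized against a genome-wide empirical null in the spirit of the abstract's normal approximation. The frequencies below are simulated, not real Han or Tibetan data.

import numpy as np
from scipy import stats

def log_odds_ratio(p1, p2, eps=1e-9):
    p1 = np.clip(p1, eps, 1 - eps)
    p2 = np.clip(p2, eps, 1 - eps)
    return np.log(p1 / (1 - p1)) - np.log(p2 / (1 - p2))

rng = np.random.default_rng(1)
p_a = rng.uniform(0.05, 0.95, 10000)                          # population A frequencies
p_b = np.clip(p_a + rng.normal(0, 0.05, 10000), 0.01, 0.99)   # population B frequencies

lor = log_odds_ratio(p_b, p_a)
z = (lor - lor.mean()) / lor.std()       # genome-wide variants supply the null variance
pvals = 2 * stats.norm.sf(np.abs(z))     # two-sided test of selection difference per variant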
Nagarajan, Mahesh B; Raman, Steven S; Lo, Pechin; Lin, Wei-Chan; Khoshnoodi, Pooria; Sayre, James W; Ramakrishna, Bharath; Ahuja, Preeti; Huang, Jiaoti; Margolis, Daniel J A; Lu, David S K; Reiter, Robert E; Goldin, Jonathan G; Brown, Matthew S; Enzmann, Dieter R
2018-02-19
We present a method for generating a T2 MR-based probabilistic model of tumor occurrence in the prostate to guide the selection of anatomical sites for targeted biopsies and to serve as a diagnostic tool aiding radiological evaluation of prostate cancer. In our study, the prostate and any radiological findings within were segmented retrospectively on 3D T2-weighted MR images of 266 subjects who underwent radical prostatectomy. Subsequent histopathological analysis determined both the ground truth and the Gleason grade of the tumors. A randomly chosen subset of 19 subjects was used to generate a multi-subject-derived prostate template. Subsequently, a cascading registration algorithm involving both affine and non-rigid B-spline transforms was used to register the prostate of every subject to the template. Corresponding transformation of radiological findings yielded a population-based probabilistic model of tumor occurrence. The quality of our probabilistic model-building approach was statistically evaluated by measuring the proportion of correct placements of tumors in the prostate template, i.e., the number of tumors that maintained their anatomical location within the prostate after their transformation into the prostate template space. The probabilistic model built with tumors deemed clinically significant demonstrated a heterogeneous distribution of tumors, with a higher likelihood of tumor occurrence at the mid-gland anterior transition zone and the base-to-mid-gland posterior peripheral zones. Of 250 MR lesions analyzed, 248 maintained their original anatomical location with respect to the prostate zones after transformation into the prostate template space. We present a robust method for generating a probabilistic model of tumor occurrence in the prostate that could aid clinical decision making, such as selection of anatomical sites for MR-guided prostate biopsies.
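Once the masks are in template space (the registration step is omitted here), the population model reduces to a voxelwise average; a minimal sketch with random stand-in masks:

import numpy as np

def occurrence_map(masks):
    # masks: array (n_subjects, z, y, x) of 0/1 tumor labels already warped
    # into the template space; the mean is the per-voxel occurrence fraction.
    return np.asarray(masks, dtype=float).mean(axis=0)

rng = np.random.default_rng(0)
masks = rng.integers(0, 2, size=(5, 8, 8, 8))   # stand-in registered masks
prob = occurrence_map(masks)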
Fifth Annual Workshop on the Application of Probabilistic Methods for Gas Turbine Engines
NASA Technical Reports Server (NTRS)
Briscoe, Victoria (Compiler)
2002-01-01
These are the proceedings of the 5th Annual FAA/Air Force/NASA/Navy Workshop on Probabilistic Methods for Gas Turbine Engines, hosted by NASA Glenn Research Center and held at the Holiday Inn Cleveland West. The history of this series of workshops stems from the recognition that both military and commercial aircraft engines are inevitably subjected to similar design and manufacturing principles. As such, it was eminently logical to combine knowledge bases on how some of these overlapping principles and methodologies are being applied. We have started the process by creating synergy and cooperation between the FAA, Air Force, Navy, and NASA in these workshops. The recent 3-day workshop was specifically designed to benefit the development of probabilistic methods for gas turbine engines by addressing recent technical accomplishments and forging new ideas. We accomplished our goals of minimizing duplication, maximizing the dissemination of information, and improving program planning for all concerned. These proceedings include the final agenda, abstracts, presentations, and panel notes, plus valuable contact information for our presenters and attendees. We hope that these proceedings will be a tool to enhance understanding among the developers and users of probabilistic methods. The fifth workshop doubled its attendance and benefited from collaboration among the many diverse groups represented, including government, industry, academia, and our international partners. So, "Start your engines!" and use these proceedings toward creating safer and more reliable gas turbine engines for our commercial and military partners.
Evaluation of Lithofacies Up-Scaling Methods for Probabilistic Prediction of Carbon Dioxide Behavior
NASA Astrophysics Data System (ADS)
Park, J. Y.; Lee, S.; Lee, Y. I.; Kihm, J. H.; Kim, J. M.
2017-12-01
Behavior of carbon dioxide injected into target reservoir (storage) formations is highly dependent on heterogeneities in geologic lithofacies and properties. These heterogeneous lithofacies and properties are fundamentally probabilistic in character, so probabilistic evaluation has to be incorporated properly into predictions of the behavior of injected carbon dioxide in heterogeneous storage formations. In this study, a series of three-dimensional geologic modeling is first performed using SKUA-GOCAD (ASGA and Paradigm) to establish lithofacies models of the Janggi Conglomerate in the Janggi Basin, Korea within a modeling domain. The Janggi Conglomerate is composed of mudstone, sandstone, and conglomerate, and it has been identified as a potential reservoir rock (clastic saline formation) for geologic carbon dioxide storage. Its lithofacies information is obtained from four boreholes and used in lithofacies modeling. Three different up-scaling methods (i.e., nearest to cell center, largest proportion, and random) are applied, and lithofacies modeling is performed 100 times for each up-scaling method. The lithofacies models are then compared and analyzed against the borehole data to evaluate the relative suitability of the three up-scaling methods. Finally, the lithofacies models are converted into coarser lithofacies models within the same modeling domain with larger grid blocks using the three up-scaling methods, and a series of multiphase thermo-hydrological numerical simulation is performed using TOUGH2-MP (Zhang et al., 2008) to probabilistically predict the behavior of injected carbon dioxide. The coarser lithofacies models are also compared and analyzed against the borehole data and finer lithofacies models to evaluate the relative suitability of the three up-scaling methods. Three-dimensional geologic modeling, up-scaling, and multiphase thermo-hydrological numerical simulation as linked methodologies presented in this study can be utilized as a practical probabilistic evaluation tool to predict the behavior of injected carbon dioxide and even to analyze its leakage risk. This work was supported by the Korea CCS 2020 Project of the Korea Carbon Capture and Sequestration R&D Center (KCRC) funded by the National Research Foundation (NRF), Ministry of Science and ICT (MSIT), Korea.
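A small sketch of the "largest proportion" rule, one of the three up-scaling methods compared: each coarse block takes the lithofacies code occupying the most fine cells inside it. Factor-of-two blocks and integer facies codes are assumptions for illustration.

import numpy as np

def upscale_largest_proportion(fine, f=2):
    nz, ny, nx = (s // f for s in fine.shape)
    coarse = np.empty((nz, ny, nx), dtype=fine.dtype)
    for k in range(nz):
        for j in range(ny):
            for i in range(nx):
                block = fine[k*f:(k+1)*f, j*f:(j+1)*f, i*f:(i+1)*f].ravel()
                vals, cnt = np.unique(block, return_counts=True)
                coarse[k, j, i] = vals[np.argmax(cnt)]   # modal facies wins
    return coarse

rng = np.random.default_rng(0)
fine = rng.integers(0, 3, size=(4, 4, 4))   # 0=mudstone, 1=sandstone, 2=conglomerate
coarse = upscale_largest_proportion(fine)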
VERAM, for a sustainable and competitive future for EU Raw Materials
NASA Astrophysics Data System (ADS)
Mobili, A.; Tittarelli, F.; Revel, G. M.; Wall, P.
2018-03-01
The project VERAM, "Vision and Roadmap for European Raw Materials", aims to deliver a mapping of on-going initiatives on non-food, non-energy raw materials (including metals, industrial minerals, aggregates, and wood) at the European, Member State, and regional levels, from the Research and Innovation (R&I), industry, and policy perspectives alike. Moreover, based on a comprehensive gap analysis, VERAM will propose a common long-term 2050 Vision and Roadmap in coordination and cooperation with all stakeholders across the value chain. For the first time, two European Technology Platforms (ETPs), together with their corresponding European Research Area Networks (ERA-NETs), are joining forces to develop a common roadmap.
Progress along the E-ELT instrumentation roadmap
NASA Astrophysics Data System (ADS)
Ramsay, Suzanne; Casali, Mark; Cirasuolo, Michele; Egner, Sebastian; Gray, Peter; Gonzáles Herrera, Juan Carlos; Hammersley, Peter; Haupt, Christoph; Ives, Derek; Jochum, Lieselotte; Kasper, Markus; Kerber, Florian; Lewis, Steffan; Mainieri, Vincenzo; Manescau, Antonio; Marchetti, Enrico; Oberti, Sylvain; Padovani, Paolo; Schmid, Christian; Schimpelsberger, Johannes; Siebenmorgen, Ralf; Szecsenyi, Orsolya; Tamai, Roberto; Vernet, Joël.
2016-08-01
A suite of seven instruments and associated AO systems have been planned as the "E-ELT Instrumentation Roadmap". Following the E-ELT project approval in December 2014, rapid progress has been made in organising and signing the agreements for construction with European universities and institutes. Three instruments (HARMONI, MICADO and METIS) and one MCAO module (MAORY) have now been approved for construction. In addition, Phase-A studies have begun for the next two instruments - a multi-object spectrograph and high-resolution spectrograph. Technology development is also ongoing in preparation for the final instrument in the roadmap, the planetary camera and spectrograph. We present a summary of the status and capabilities of this first set of instruments for the E-ELT.
Fundamental Physics Changes in Response to Evolving NASA Needs
NASA Technical Reports Server (NTRS)
Israelsson, Ulf
2003-01-01
To continue growing as a discipline, we need to establish a new vision of where we are going that is consistent with today's physics, NASA's strategic plan, and the new OBPR direction. The 1998 Roadmap focused exclusively on physics and did not worry about boundaries between OBPR and OSS. Updated Roadmap: Must incorporate some strategic research activities to be fully responsive to the current OBPR direction. Must capture the imagination of OBPR leadership, OMB, and Congress. Must delineate OBPR from the "beyond Einstein" program in OSS. Must address relevancy to society explicitly. The status of the Roadmap development will be discussed after lunch today. Seeking community inputs and endorsement. Draft update targeted for June, final in August.
Fast, Nonlinear, Fully Probabilistic Inversion of Large Geophysical Problems
NASA Astrophysics Data System (ADS)
Curtis, A.; Shahraeeni, M.; Trampert, J.; Meier, U.; Cho, G.
2010-12-01
Almost all Geophysical inverse problems are in reality nonlinear. Fully nonlinear inversion including non-approximated physics, and solving for probability distribution functions (pdfs) that describe the solution uncertainty, generally requires sampling-based Monte-Carlo style methods that are computationally intractable in most large problems. In order to solve such problems, physical relationships are usually linearized leading to efficiently-solved (possibly iterated) linear inverse problems. However, it is well known that linearization can lead to erroneous solutions, and in particular to overly optimistic uncertainty estimates. What is needed across many Geophysical disciplines is a method to invert large inverse problems (or potentially tens of thousands of small inverse problems) fully probabilistically and without linearization. This talk shows how very large nonlinear inverse problems can be solved fully probabilistically and incorporating any available prior information using mixture density networks (driven by neural network banks), provided the problem can be decomposed into many small inverse problems. In this talk I will explain the methodology, compare multi-dimensional pdf inversion results to full Monte Carlo solutions, and illustrate the method with two applications: first, inverting surface wave group and phase velocities for a fully-probabilistic global tomography model of the Earth's crust and mantle, and second inverting industrial 3D seismic data for petrophysical properties throughout and around a subsurface hydrocarbon reservoir. The latter problem is typically decomposed into 10^4 to 10^5 individual inverse problems, each solved fully probabilistically and without linearization. The results in both cases are sufficiently close to the Monte Carlo solution to exhibit realistic uncertainty, multimodality and bias. This provides far greater confidence in the results, and in decisions made on their basis.
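The pay-off of the mixture-density-network route is that the posterior pdf arrives in closed form for every datum; evaluating it and spotting multimodality costs almost nothing. A sketch, with invented Gaussian-mixture parameters standing in for a trained network's outputs:

import numpy as np

def mdn_pdf(m, weights, means, sigmas):
    # p(m|d) for a 1-D Gaussian mixture whose parameters an MDN would emit.
    m = np.atleast_1d(m)[:, None]
    comp = np.exp(-0.5 * ((m - means) / sigmas) ** 2) / (sigmas * np.sqrt(2 * np.pi))
    return comp @ weights

grid = np.linspace(0.0, 10.0, 500)
pdf = mdn_pdf(grid, weights=np.array([0.6, 0.4]),
              means=np.array([3.0, 7.0]), sigmas=np.array([0.5, 1.0]))
# A bimodal posterior like this is exactly what a linearized inversion would miss.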
Sarma-based key-group method for rock slope reliability analyses
NASA Astrophysics Data System (ADS)
Yarahmadi Bafghi, A. R.; Verdel, T.
2005-08-01
The methods used in conducting static stability analyses have remained pertinent to this day for reasons of both simplicity and speed of execution. The most well-known of these methods for stability analysis of fractured rock masses is the key-block method (KBM). This paper proposes an extension to the KBM, called the key-group method (KGM), which combines not only individual key-blocks but also groups of collapsible blocks into an iterative and progressive analysis of the stability of discontinuous rock slopes. To take intra-group forces into account, the Sarma method has been implemented within the KGM in order to generate a Sarma-based KGM, abbreviated SKGM. We discuss herein the hypothesis behind this new method, details regarding its implementation, and validation through comparison with results obtained from the distinct element method. Furthermore, as an alternative to deterministic methods, reliability analyses or probabilistic analyses have been proposed to take account of the uncertainty in analytical parameters and models. The FOSM and ASM probabilistic methods could be implemented within the KGM and SKGM framework in order to account for the uncertainty due to physical and mechanical data (density, cohesion, and angle of friction). We then show how such reliability analyses can be introduced into the SKGM to give rise to the probabilistic SKGM (PSKGM) and how it can be used for rock slope reliability analyses.
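A minimal FOSM sketch of the kind the abstract alludes to: the reliability index from first-order moments of a safety margin. The sliding-block limit state and all numbers are hypothetical, and the variables are treated as independent.

import numpy as np
from scipy import stats

def fosm_beta(g, mu, sd, h=1e-6):
    # First-order mean and variance of g from the gradient at the mean point.
    mu = np.asarray(mu, dtype=float)
    grad = np.array([(g(mu + h * np.eye(len(mu))[i]) - g(mu)) / h
                     for i in range(len(mu))])
    var = np.sum((grad * np.asarray(sd)) ** 2)   # independence assumed
    return g(mu) / np.sqrt(var)

# Toy margin: cohesion*area + W*cos(t)*tan(phi) - W*sin(t) on a 20 m^2 plane.
g = lambda x: x[0] * 20.0 + 500.0 * np.cos(0.5) * np.tan(x[1]) - 500.0 * np.sin(0.5)
beta = fosm_beta(g, mu=[25.0, 0.6], sd=[5.0, 0.05])   # cohesion (kPa), friction (rad)
p_f = stats.norm.cdf(-beta)                           # first-order failure probability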
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-01
...-1659-01] Request for Comments on NIST Special Publication 500-293, US Government Cloud Computing Technology Roadmap, Release 1.0 (Draft). This document is intended to support US Government (USG) agencies in accelerating their adoption of cloud computing. The roadmap has been developed through...
Space Communications Capability Roadmap Interim Review
NASA Technical Reports Server (NTRS)
Spearing, Robert; Regan, Michael
2005-01-01
Contents include the following: Identify the need for a robust communications and navigation architecture for the success of exploration and science missions. Describe an approach for specifying architecture alternatives and analyzing them. Establish a top level architecture based on a network of networks. Identify key enabling technologies. Synthesize capability, architecture and technology into an initial capability roadmap.
FY2009-2034 Unmanned Systems Integrated Roadmap
2009-04-20
The Idaho National Engineering & Environmental Lab (INEEL) was charged by DOE EM to develop a complex-wide science and technology roadmap for the characterization, modeling and simulation of the fate and transport of contamination in the vadose zone. Various types of hazardous, r...
Virtual Learning and Instructional Tools: Perfecting the Weekly Roadmap
ERIC Educational Resources Information Center
Cicco, Gina
2015-01-01
This article will provide details on the importance of providing structure within an online graduate counseling course in the form of a weekly roadmap tool. There are various instructional tools that may be useful in providing students with differing levels of structure, to meet their learning style preferences for structural stimuli (Cicco,…
Occurrence, Genotoxicity, and Carcinogenicity of Emerging Disinfection By-products in Drinking Water: A Review and Roadmap for Research
Summary: This is the first review of the 30 years' research effort on the occurrence, genotoxicity,...
Doe, John E.; Lander, Deborah R.; Doerrer, Nancy G.; Heard, Nina; Hines, Ronald N.; Lowit, Anna B.; Pastoor, Timothy; Phillips, Richard D.; Sargent, Dana; Sherman, James H.; Young Tanir, Jennifer; Embry, Michelle R.
2016-01-01
The HESI-coordinated RISK21 roadmap and matrix are tools that provide a transparent method to compare exposure and toxicity information and assess whether additional refinement is required to obtain the necessary precision level for a decision regarding safety. A case study of the use of a pyrethroid, “pseudomethrin,” in bed netting to control malaria is presented to demonstrate the application of the roadmap and matrix. The evaluation began with a problem formulation step. The first assessment utilized existing information pertaining to the use and the class of chemistry. At each stage of the step-wise approach, the precision of the toxicity and exposure estimates was refined as necessary by obtaining key data, which enabled a decision on safety to be made efficiently and with confidence. The evaluation demonstrated the concept of using existing information within the RISK21 matrix to drive the generation of additional data using a value-of-information approach. The use of the matrix highlighted whether exposure or toxicity required further investigation and emphasized the need to address the default uncertainty factor of 100 at the highest tier of the evaluation. It also showed how new methodology such as the use of in vitro studies and assays could be used to answer the specific questions which arise through the use of the matrix. The matrix also serves as a useful means to communicate progress to stakeholders during an assessment of chemical use. PMID:26517449
The NASA Astrobiology Roadmap.
Des Marais, David J; Allamandola, Louis J; Benner, Steven A; Boss, Alan P; Deamer, David; Falkowski, Paul G; Farmer, Jack D; Hedges, S Blair; Jakosky, Bruce M; Knoll, Andrew H; Liskowsky, David R; Meadows, Victoria S; Meyer, Michael A; Pilcher, Carl B; Nealson, Kenneth H; Spormann, Alfred M; Trent, Jonathan D; Turner, William W; Woolf, Neville J; Yorke, Harold W
2003-01-01
The NASA Astrobiology Roadmap provides guidance for research and technology development across the NASA enterprises that encompass the space, Earth, and biological sciences. The ongoing development of astrobiology roadmaps embodies the contributions of diverse scientists and technologists from government, universities, and private institutions. The Roadmap addresses three basic questions: How does life begin and evolve, does life exist elsewhere in the universe, and what is the future of life on Earth and beyond? Seven Science Goals outline the following key domains of investigation: understanding the nature and distribution of habitable environments in the universe, exploring for habitable environments and life in our own solar system, understanding the emergence of life, determining how early life on Earth interacted and evolved with its changing environment, understanding the evolutionary mechanisms and environmental limits of life, determining the principles that will shape life in the future, and recognizing signatures of life on other worlds and on early Earth. For each of these goals, Science Objectives outline more specific high-priority efforts for the next 3-5 years. These 18 objectives are being integrated with NASA strategic planning.
Carbon Dioxide Utilization (CO2U) ICEF Roadmap 2.0. Draft October 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandalow, David; Aines, Roger; Friedmann, Julio
Last year, experts from CO2 Sciences, Columbia University and Valence Strategic came together to develop a roadmap. That document, Carbon Dioxide Utilization ICEF Roadmap 1.0, released at the UNFCCC Marrakesh Climate Change Conference in 2016, surveyed the commercial and technical landscape of CO2 conversion and use. The document provided extensive background and analysis and has helped to provide a foundation for additional studies, including this one. This roadmap is meant to complement and expand upon the work of its predecessor. Based in part on a workshop at Columbia University's Center on Global Energy Policy in July 2017, it explores three distinct categories of CO2-based products, the technologies that can be harnessed to convert CO2 to these products, and the associated research and development needs. It also explores the complicated topic of life cycle analysis (critically important when considering the climate impacts of CO2 conversion and use) as well as policy tools that could be used to promote CO2-based products.
The NASA Astrobiology Roadmap.
Des Marais, David J; Nuth, Joseph A; Allamandola, Louis J; Boss, Alan P; Farmer, Jack D; Hoehler, Tori M; Jakosky, Bruce M; Meadows, Victoria S; Pohorille, Andrew; Runnegar, Bruce; Spormann, Alfred M
2008-08-01
The NASA Astrobiology Roadmap provides guidance for research and technology development across the NASA enterprises that encompass the space, Earth, and biological sciences. The ongoing development of astrobiology roadmaps embodies the contributions of diverse scientists and technologists from government, universities, and private institutions. The Roadmap addresses three basic questions: how does life begin and evolve, does life exist elsewhere in the universe, and what is the future of life on Earth and beyond? Seven Science Goals outline the following key domains of investigation: understanding the nature and distribution of habitable environments in the universe, exploring for habitable environments and life in our own Solar System, understanding the emergence of life, determining how early life on Earth interacted and evolved with its changing environment, understanding the evolutionary mechanisms and environmental limits of life, determining the principles that will shape life in the future, and recognizing signatures of life on other worlds and on early Earth. For each of these goals, Science Objectives outline more specific high priority efforts for the next three to five years. These eighteen objectives are being integrated with NASA strategic planning.
Scalable Quantum Networks for Distributed Computing and Sensing
2016-04-01
AFRL-AFOSR-UK-TR-2016-0007, Ian Walmsley, The University of Oxford, Final Report. ...probabilistic measurement, so we developed quantum memories and guided-wave implementations of same, demonstrating controlled delay of a heralded single... Second, fundamental scalability requires a method to synchronize protocols based on quantum measurements, which are inherently probabilistic. To meet...
Effects of delay and probability combinations on discounting in humans
Cox, David J.; Dallery, Jesse
2017-01-01
To determine discount rates, researchers typically adjust the amount of an immediate or certain option relative to a delayed or uncertain option. Because this adjusting amount method can be relatively time consuming, researchers have developed more efficient procedures. One such procedure is a 5-trial adjusting delay procedure, which measures the delay at which an amount of money loses half of its value (e.g., $1000 is valued at $500 with a 10-year delay to its receipt). Experiment 1 (n = 212) used 5-trial adjusting delay or probability tasks to measure delay discounting of losses, probabilistic gains, and probabilistic losses. Experiment 2 (n = 98) assessed combined probabilistic and delayed alternatives. In both experiments, we compared results from 5-trial adjusting delay or probability tasks to traditional adjusting amount procedures. Results suggest both procedures produced similar rates of probability and delay discounting in six out of seven comparisons. A magnitude effect consistent with previous research was observed for probabilistic gains and losses, but not for delayed losses. Results also suggest that delay and probability interact to determine the value of money. Five-trial methods may allow researchers to assess discounting more efficiently as well as study more complex choice scenarios. PMID:27498073
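For concreteness, the abstract's half-value example maps to a discount rate under the commonly assumed hyperbolic model V = A/(1 + kD); the model choice here is an assumption for illustration, not necessarily the analysis used in the paper.

```python
# Sketch under the common hyperbolic model V = A / (1 + k * D): at the
# half-value delay D_half, A / (1 + k * D_half) = A / 2, so k = 1 / D_half.
# The $1000-at-10-years example from the abstract gives k = 0.1 per year.
def discount_rate_from_half_life(d_half_years: float) -> float:
    return 1.0 / d_half_years

def subjective_value(amount: float, delay: float, k: float) -> float:
    return amount / (1.0 + k * delay)

k = discount_rate_from_half_life(10.0)    # 0.1 per year
print(subjective_value(1000.0, 10.0, k))  # 500.0, the indifference point
print(subjective_value(1000.0, 30.0, k))  # 250.0 at a 30-year delay
```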
Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.
Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong
2015-11-01
Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.
Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modeling: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yao, Yin; Gao, Wenzhong; Momoh, James
In this paper, an economic dispatch model with probabilistic modeling is developed for a microgrid. The electric power supply in a microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Because of the fluctuation in the output of solar and wind power plants, an empirical probabilistic model is developed to predict their hourly output. According to the different characteristics of wind and solar power plants, the parameters of the probabilistic distribution are further adjusted individually for both. On the other hand, with the growing trend in plug-in electric vehicles (PHEVs), an integrated microgrid system must also consider the impact of PHEVs. The charging loads from PHEVs as well as the discharging output via the vehicle-to-grid (V2G) method can greatly affect the economic dispatch for all of the micro energy sources in a microgrid. This paper presents an optimization method for economic dispatch in a microgrid considering conventional power plants, renewable power plants, and PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in a modern microgrid.
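A minimal sketch of the probabilistic-modeling step is below: it samples hourly renewable output and PHEV charging/discharging, then computes the net load left for the conventional units. The distributions (Weibull wind speed, Beta solar output) and all numbers are illustrative assumptions; the paper fits empirical hourly models instead.

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo sample of one hour of microgrid operation. Distributions and
# parameters are illustrative stand-ins, not fitted empirical models.
n = 10_000
wind_speed = rng.weibull(2.0, n) * 8.0        # m/s, shape 2, scale 8
wind_mw = np.clip((wind_speed - 3.0) / (12.0 - 3.0), 0, 1) * 50.0  # 50 MW farm
solar_mw = rng.beta(2.0, 2.0, n) * 30.0       # 30 MW plant, Beta-shaped output

load_mw = rng.normal(120.0, 8.0, n)           # hourly demand
phev_mw = rng.normal(5.0, 4.0, n)             # >0 charging, <0 discharging (V2G)

net_load = load_mw + phev_mw - wind_mw - solar_mw  # left for conventional units
print(f"net load mean {net_load.mean():.1f} MW, "
      f"95th percentile {np.percentile(net_load, 95):.1f} MW")
```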
A Probabilistic Design Method Applied to Smart Composite Structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1995-01-01
A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.
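The sketch below illustrates the flavor of such an analysis: Monte Carlo estimation of a failure probability for a toy limit state, with each random variable's correlation with the limit state used as a crude sensitivity proxy (the paper computes formal probabilistic sensitivity factors). Variables and numbers are hypothetical stand-ins for the ply and actuator parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy limit state g = capacity - demand; failure when g < 0. Reducing the
# scatter of the variable with the largest |sensitivity| lowers Pf most.
n = 200_000
stiffness = rng.normal(70.0, 5.0, n)       # e.g. laminate stiffness, GPa
thickness = rng.normal(2.0, 0.10, n)       # mm
load      = rng.normal(45.0, 6.0, n)       # applied load, kN

capacity = 0.4 * stiffness * thickness     # toy capacity model
g = capacity - load
print(f"P(failure) ~ {np.mean(g < 0):.4f}")

for name, x in [("stiffness", stiffness), ("thickness", thickness), ("load", load)]:
    r = np.corrcoef(x, g)[0, 1]            # correlation as a sensitivity proxy
    print(f"sensitivity proxy for {name}: {r:+.2f}")
```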
Sjöberg, C; Ahnesjö, A
2013-06-01
Label fusion multi-atlas approaches for image segmentation can give better segmentation results than single atlas methods. We present a multi-atlas label fusion strategy based on probabilistic weighting of distance maps. Relationships between image similarities and segmentation similarities are estimated in a learning phase and used to derive fusion weights that are proportional to the probability for each atlas to improve the segmentation result. The method was tested using a leave-one-out strategy on a database of 21 pre-segmented prostate patients for different image registrations combined with different image similarity scorings. The probabilistic weighting yields results that are equal or better compared to both fusion with equal weights and results using the STAPLE algorithm. Results from the experiments demonstrate that label fusion by weighted distance maps is feasible, and that probabilistic weighted fusion improves segmentation quality more the stronger the individual atlas segmentation quality depends on the corresponding registered image similarity. The regions used for evaluation of the image similarity measures were found to be more important than the choice of similarity measure. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
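A minimal sketch of fusion by weighted distance maps, assuming signed distance maps (negative inside the structure) and weights already learned from image similarity; the maps here are random stand-ins for registered atlas contributions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Each registered atlas contributes a signed distance map (negative inside
# the structure). Weights reflect the estimated probability that each atlas
# improves the result; here they are illustrative.
n_atlases, shape = 3, (64, 64)
distance_maps = rng.normal(0.0, 1.0, (n_atlases, *shape))  # stand-in maps

weights = np.array([0.5, 0.3, 0.2])        # learned from image similarity
fused = np.tensordot(weights, distance_maps, axes=1)
segmentation = fused < 0.0                 # inside where fused distance < 0
print(segmentation.mean())                 # fraction of voxels labeled inside
```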
Basketter, David A; Clewell, Harvey; Kimber, Ian; Rossi, Annamaria; Blaauboer, Bas; Burrier, Robert; Daneshian, Mardas; Eskes, Chantra; Goldberg, Alan; Hasiwa, Nina; Hoffmann, Sebastian; Jaworska, Joanna; Knudsen, Thomas B; Landsiedel, Robert; Leist, Marcel; Locke, Paul; Maxwell, Gavin; McKim, James; McVey, Emily A; Ouédraogo, Gladys; Patlewicz, Grace; Pelkonen, Olavi; Roggen, Erwin; Rovida, Costanza; Ruhdel, Irmela; Schwarz, Michael; Schepky, Andreas; Schoeters, Greet; Skinner, Nigel; Trentz, Kerstin; Turner, Marian; Vanparys, Philippe; Yager, James; Zurlo, Joanne; Hartung, Thomas
2012-01-01
Systemic toxicity testing forms the cornerstone for the safety evaluation of substances. Pressures to move from traditional animal models to novel technologies arise from various concerns, including: the need to evaluate large numbers of previously untested chemicals and new products (such as nanoparticles or cell therapies), the limited predictivity of traditional tests for human health effects, duration and costs of current approaches, and animal welfare considerations. The latter holds especially true in the context of the scheduled 2013 marketing ban on cosmetic ingredients tested for systemic toxicity. Based on a major analysis of the status of alternative methods (Adler et al., 2011) and its independent review (Hartung et al., 2011), the present report proposes a roadmap for how to overcome the acknowledged scientific gaps for the full replacement of systemic toxicity testing using animals. Five whitepapers were commissioned addressing toxicokinetics, skin sensitization, repeated-dose toxicity, carcinogenicity, and reproductive toxicity testing. An expert workshop of 35 participants from Europe and the US discussed and refined these whitepapers, which were subsequently compiled to form the present report. By prioritizing the many options to move the field forward, the expert group hopes to advance regulatory science.
NASA Technical Reports Server (NTRS)
Merchant, D. H.
1976-01-01
Methods are presented for calculating design limit loads compatible with probabilistic structural design criteria. The approach is based on the concept that the desired limit load, defined as the largest load occurring in a mission, is a random variable having a specific probability distribution which may be determined from extreme-value theory. The design limit load, defined as a particular value of this random limit load, is the value conventionally used in structural design. Methods are presented for determining the limit load probability distributions from both time-domain and frequency-domain dynamic load simulations. Numerical demonstrations of the method are also presented.
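A small sketch of the extreme-value idea, assuming Gumbel-distributed mission maxima: per-mission peak loads are generated from simulated time histories, a Gumbel model is fitted, and a high percentile is read off as the design limit load. All numbers are illustrative.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(2)

# Fake per-mission maxima by taking the max of simulated load histories,
# fit a Gumbel (extreme-value) model, and take a high percentile as the
# design limit load. Numbers are illustrative, not from the report.
missions = 500
samples_per_mission = 2_000
loads = rng.normal(100.0, 15.0, (missions, samples_per_mission))  # kN
mission_maxima = loads.max(axis=1)

loc, scale = gumbel_r.fit(mission_maxima)
design_limit = gumbel_r.ppf(0.99, loc, scale)   # 99th-percentile limit load
print(f"Gumbel loc {loc:.1f}, scale {scale:.1f}, design limit {design_limit:.1f} kN")
```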
CPT-based probabilistic and deterministic assessment of in situ seismic soil liquefaction potential
Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Der Kiureghian, A.; Cetin, K.O.
2006-01-01
This paper presents a complete methodology for both probabilistic and deterministic assessment of seismic soil liquefaction triggering potential based on the cone penetration test (CPT). A comprehensive worldwide set of CPT-based liquefaction field case histories was compiled and back-analyzed, and the data were then used to develop probabilistic triggering correlations. Issues investigated in this study include improved normalization of CPT resistance measurements for the influence of effective overburden stress, and adjustment of CPT tip resistance for the potential influence of "thin" liquefiable layers. The effects of soil type and soil character (i.e., "fines" adjustment) for the new correlations are based on a combination of CPT tip and sleeve resistance. To quantify probability for performance-based engineering applications, Bayesian "regression" methods were used, and the uncertainties of all variables comprising both the seismic demand and the liquefaction resistance were estimated and included in the analysis. The resulting correlations were developed using a Bayesian framework and are presented in both probabilistic and deterministic formats. The results are compared to previous probabilistic and deterministic correlations. © 2006 ASCE.
NASA Astrophysics Data System (ADS)
Pappenberger, F.; Stephens, E. M.; Thielen, J.; Salomon, P.; Demeritt, D.; van Andel, S.; Wetterhall, F.; Alfieri, L.
2011-12-01
The aim of this paper is to understand and contribute to improved communication of the probabilistic flood forecasts generated by Hydrological Ensemble Prediction Systems (HEPS), with a particular focus on inter-expert communication. Different users are likely to require different kinds of information from HEPS and thus different visualizations. The perceptions of this expert group are important because they are the designers and primary users of existing HEPS. Nevertheless, they have sometimes resisted the release of uncertainty information to the general public because of doubts about whether it can be successfully communicated in ways that would be readily understood by non-experts. In this paper we explore the strengths and weaknesses of existing HEPS visualization methods and thereby formulate some wider recommendations about best practice for HEPS visualization and communication. We suggest that specific training on probabilistic forecasting would foster the use of probabilistic forecasts in a wider range of applications. The results of a case study exercise showed that there is no overarching agreement between experts on how to display probabilistic forecasts and on what they consider essential information that should accompany plots and diagrams. In this paper we propose a list of minimum properties that, if consistently displayed with probabilistic forecasts, would make the products more easily understandable.
Probabilistic Design and Analysis Framework
NASA Technical Reports Server (NTRS)
Strack, William C.; Nagpal, Vinod K.
2010-01-01
PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
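A toy version of the response-surface step is sketched below: a quadratic surrogate is fitted to a handful of deterministic evaluations (an analytic stress function standing in for the FEA call), and Monte Carlo is run on the cheap surrogate. Inputs, model form, and thresholds are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in for the expensive deterministic FEA call: stress as a function
# of a blade chord length and a load factor (hypothetical model).
def fea_stress(chord, load):
    return 200.0 + 35.0 * load - 35.0 * chord + 4.0 * chord ** 2

# Small design of experiments, then a quadratic response surface by
# least squares.
chords = np.linspace(1.0, 3.0, 5)
loads = np.linspace(0.8, 1.2, 5)
C, L = np.meshgrid(chords, loads)
X = np.column_stack([np.ones(C.size), C.ravel(), L.ravel(),
                     C.ravel() ** 2, L.ravel() ** 2, (C * L).ravel()])
y = fea_stress(C.ravel(), L.ravel())
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Monte Carlo on the cheap surrogate instead of the expensive FEA model.
c = rng.normal(2.0, 0.1, 100_000)
l = rng.normal(1.0, 0.05, 100_000)
Xmc = np.column_stack([np.ones_like(c), c, l, c ** 2, l ** 2, c * l])
stress = Xmc @ coef
print(f"P(stress > 185) ~ {np.mean(stress > 185.0):.4f}")
```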
Probabilistic combination of static and dynamic gait features for verification
NASA Astrophysics Data System (ADS)
Bazin, Alex I.; Nixon, Mark S.
2005-03-01
This paper describes a novel probabilistic framework for biometric identification and data fusion. Based on intra- and inter-class variation extracted from training data, posterior probabilities describing the similarity between two feature vectors may be calculated directly from the data using the logistic function and Bayes rule. Using a large publicly available database, we show that the two imbalanced gait modalities may be fused using this framework. All fusion methods tested provide an improvement over the best single modality, with the weighted sum rule giving the best performance, hence showing that highly imbalanced classifiers may be fused in a probabilistic setting, improving not only the performance but also the generalized application capability.
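A minimal sketch of the fusion idea, assuming each modality's match score has already been mapped to a posterior through a fitted logistic function; the logistic coefficients, scores, and fusion weight below are hypothetical.

```python
import numpy as np

# Each modality maps a match score to a posterior P(same subject) via a
# logistic function fitted on training data; posteriors are then fused
# with a weighted sum. Coefficients and weights are illustrative.
def logistic_posterior(score, w, b):
    return 1.0 / (1.0 + np.exp(-(w * score + b)))

static_score, dynamic_score = 0.82, 0.55      # scores from the two gait modalities
p_static  = logistic_posterior(static_score,  w=6.0, b=-3.0)
p_dynamic = logistic_posterior(dynamic_score, w=4.0, b=-1.5)

# Weighted-sum rule; the weight would be chosen from validation performance.
alpha = 0.7
p_fused = alpha * p_static + (1 - alpha) * p_dynamic
print(f"static {p_static:.2f}, dynamic {p_dynamic:.2f}, fused {p_fused:.2f}")
```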
A generative, probabilistic model of local protein structure.
Boomsma, Wouter; Mardia, Kanti V; Taylor, Charles C; Ferkinghoff-Borg, Jesper; Krogh, Anders; Hamelryck, Thomas
2008-07-01
Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state. Our method represents a significant theoretical and practical improvement over the widely used fragment assembly technique by avoiding the drawbacks associated with a discrete and nonprobabilistic approach.
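As a loose illustration of the generative idea (not the paper's fitted model), the sketch below samples backbone dihedral angles from a two-component mixture of von Mises distributions, one component loosely helix-like and one loosely sheet-like; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Sample (phi, psi) dihedrals from a mixture of per-angle von Mises
# distributions. Component parameters loosely evoke helix- and sheet-like
# basins and are purely illustrative.
components = [
    {"w": 0.6, "mu": (-1.1, -0.8), "kappa": (8.0, 8.0)},   # helix-like
    {"w": 0.4, "mu": (-2.3, 2.4),  "kappa": (6.0, 6.0)},   # sheet-like
]

def sample_dihedrals(n):
    ws = np.array([c["w"] for c in components])
    choice = rng.choice(len(components), size=n, p=ws)
    phi = np.empty(n); psi = np.empty(n)
    for i, c in enumerate(components):
        m = choice == i
        phi[m] = rng.vonmises(c["mu"][0], c["kappa"][0], m.sum())
        psi[m] = rng.vonmises(c["mu"][1], c["kappa"][1], m.sum())
    return np.degrees(phi), np.degrees(psi)

phi, psi = sample_dihedrals(5)
print(np.round(phi), np.round(psi))          # degrees, Ramachandran-style
```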
Campbell, Kieran R.
2016-01-01
Single cell gene expression profiling can be used to quantify transcriptional dynamics in temporal processes, such as cell differentiation, using computational methods to label each cell with a ‘pseudotime’ where true time series experimentation is too difficult to perform. However, owing to the high variability in gene expression between individual cells, there is an inherent uncertainty in the precise temporal ordering of the cells. Pre-existing methods for pseudotime estimation have predominantly given point estimates precluding a rigorous analysis of the implications of uncertainty. We use probabilistic modelling techniques to quantify pseudotime uncertainty and propagate this into downstream differential expression analysis. We demonstrate that reliance on a point estimate of pseudotime can lead to inflated false discovery rates and that probabilistic approaches provide greater robustness and measures of the temporal resolution that can be obtained from pseudotime inference. PMID:27870852
Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations
NASA Astrophysics Data System (ADS)
Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.
2014-02-01
The study presents a method and application of risk assessment methodology for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic modeling for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely the ultrasonic inspection with an identified flaw indication and the ultrasonic inspection without flaw indication, are considered in the derivation. To perform estimations for fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic application of steam turbine rotor, and the risk analysis under given safety criteria is provided to support maintenance planning.
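The sketch below gives the flavor of the flaw-sizing derivation for the identified-indication scenario: a posterior pdf of the true flaw size is formed from a prior, a probability-of-detection (POD) curve, and a lognormal sizing likelihood. The functional forms and parameters are illustrative assumptions, not the paper's model.

```python
import numpy as np
from scipy.stats import norm

# Posterior pdf of the true flaw size a given an ultrasonic indication:
# posterior(a) is proportional to prior(a) * POD(a) * likelihood(a_hat | a).
a = np.linspace(0.1, 10.0, 1_000)            # candidate flaw sizes, mm

def pod(a):                                   # log-odds-style POD model
    return norm.cdf((np.log(a) - np.log(1.5)) / 0.6)

a_measured = 2.0                              # indicated size, mm
sizing_sigma = 0.3                            # lognormal sizing scatter
likelihood = norm.pdf(np.log(a_measured), np.log(a), sizing_sigma)

prior = np.exp(-a / 3.0)                      # exponential prior on flaw size
posterior = prior * pod(a) * likelihood       # inspection found the flaw
posterior /= np.trapz(posterior, a)
print(f"posterior mean flaw size {np.trapz(a * posterior, a):.2f} mm")
```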
Rosenthal, Mariana; Anderson, Katey; Tengelsen, Leslie; Carter, Kris; Hahn, Christine; Ball, Christopher
2017-08-24
The Right Size Roadmap was developed by the Association of Public Health Laboratories and the Centers for Disease Control and Prevention to improve influenza virologic surveillance efficiency. Guidelines were provided to state health departments regarding representativeness and statistical estimates of specimen numbers needed for seasonal influenza situational awareness, rare or novel influenza virus detection, and rare or novel influenza virus investigation. The aim of this study was to compare Roadmap sampling recommendations with Idaho's influenza virologic surveillance to determine implementation feasibility. We calculated the proportion of medically attended influenza-like illness (MA-ILI) from Idaho's influenza-like illness surveillance among outpatients during October 2008 to May 2014, applied data to Roadmap-provided sample size calculators, and compared calculations with actual numbers of specimens tested for influenza by the Idaho Bureau of Laboratories (IBL). We assessed representativeness among patients' tested specimens to census estimates by age, sex, and health district residence. Among outpatients surveilled, Idaho's mean annual proportion of MA-ILI was 2.30% (20,834/905,818) during a 5-year period. Thus, according to Roadmap recommendations, Idaho needs to collect 128 specimens from MA-ILI patients/week for situational awareness, 1496 influenza-positive specimens/week for detection of a rare or novel influenza virus at 0.2% prevalence, and after detection, 478 specimens/week to confirm true prevalence is ≤2% of influenza-positive samples. The mean number of respiratory specimens Idaho tested for influenza/week, excluding the 2009-2010 influenza season, ranged from 6 to 24. Various influenza virus types and subtypes were collected and specimen submission sources were representative in terms of geographic distribution, patient age range and sex, and disease severity. Insufficient numbers of respiratory specimens are submitted to IBL for influenza laboratory testing. Increased specimen submission would facilitate meeting Roadmap sample size recommendations. ©Mariana Rosenthal, Katey Anderson, Leslie Tengelsen, Kris Carter, Christine Hahn, Christopher Ball. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 24.08.2017.
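The detection sample size quoted in the abstract follows from a standard rare-event calculation: the number of influenza-positive specimens needed to see at least one rare or novel virus present at prevalence p with confidence c, assuming independent sampling. The sketch below reproduces the ~1496/week figure at p = 0.2% (the exact rounding convention is an assumption).

```python
import math

# Specimens needed so that P(at least one novel-virus positive) >= c when
# the novel virus has prevalence p among influenza positives:
# (1 - p)^n <= 1 - c  =>  n >= ln(1 - c) / ln(1 - p).
def detection_sample_size(p: float, c: float = 0.95) -> int:
    return math.ceil(math.log(1.0 - c) / math.log(1.0 - p))

# At 0.2% prevalence and 95% confidence this gives 1497 with ceiling;
# the Roadmap cites 1496, so its rounding convention differs slightly.
print(detection_sample_size(0.002))
```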
Evolutionistic or revolutionary paths? A PACS maturity model for strategic situational planning.
van de Wetering, Rogier; Batenburg, Ronald; Lederman, Reeva
2010-07-01
While many hospitals are re-evaluating their current Picture Archiving and Communication System (PACS), few have a mature strategy for PACS deployment. Furthermore, strategies for implementation, and strategic and situational planning methods for the evolution of PACS maturity, are scarce in the scientific literature. Consequently, in this paper we propose a strategic planning method for PACS deployment. This method builds upon a PACS maturity model (PMM), based on the elaboration of the strategic alignment concept and the maturity growth path concept previously developed in the PACS domain. First, we review the literature on strategic planning for information systems and information technology and on PACS maturity. Secondly, the PMM is extended by applying four different strategic perspectives of the Strategic Alignment Framework, whereupon two types of growth paths (evolutionistic and revolutionary) are applied that focus on a roadmap for the PMM. This roadmap builds a path from one level of maturity to the next. An extended method for PACS strategic planning is developed. This method defines eight distinctive strategies for PACS strategic situational planning that allow decision-makers in hospitals to decide which approach best suits their hospital's current situation and future ambition and what in principle is needed to evolve through the different maturity levels. The proposed method allows hospitals to strategically plan for PACS maturation. It is situational in that the required investments and activities depend on the alignment between the hospital strategy and the selected growth path. The inclusion of both the strategic alignment and maturity growth path concepts makes the planning method rigorous and provides a framework for further empirical research and clinical practice.
NASA Technical Reports Server (NTRS)
Johnson, Kenneth L.; White, K. Preston, Jr.
2012-01-01
The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
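A minimal sketch of a single-sided acceptance-sampling-by-variables decision is below: accept the lot when the sample mean sits at least k sample standard deviations inside the upper specification limit. The acceptability constant k would come from a chosen (n, k) sampling plan; the value and data here are illustrative.

```python
import statistics

# Single-sided variables plan: accept if (USL - xbar) / s >= k, where k is
# the acceptability constant of the sampling plan. k = 1.72 and the data
# below are illustrative, not from a specific published plan.
def accept_by_variables(measurements, usl, k):
    xbar = statistics.mean(measurements)
    s = statistics.stdev(measurements)
    return (usl - xbar) / s >= k

sample = [9.2, 9.5, 9.1, 9.8, 9.4, 9.3, 9.6, 9.2, 9.5, 9.4]
print(accept_by_variables(sample, usl=10.5, k=1.72))   # True for this sample
```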
Feature selection using probabilistic prediction of support vector regression.
Yang, Jian-Bo; Ong, Chong-Jin
2011-06-01
This paper presents a new wrapper-based feature selection method for support vector regression (SVR) using its probabilistic predictions. The method computes the importance of a feature by aggregating the difference, over the feature space, of the conditional density functions of the SVR prediction with and without the feature. As the exact computation of this importance measure is expensive, two approximations are proposed. The effectiveness of the measure using these approximations, in comparison to several other existing feature selection methods for SVR, is evaluated on both artificial and real-world problems. The results of the experiments show that the proposed method generally performs better than, or at least as well as, the existing methods, with a notable advantage when the dataset is sparse.
Probabilistic Sizing and Verification of Space Ceramic Structures
NASA Astrophysics Data System (ADS)
Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit
2012-07-01
Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
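The core of the Weibull probabilistic approach can be sketched in a few lines: the weakest-link failure probability P_f = 1 - exp(-(V/V_0)(sigma/sigma_0)^m) is evaluated as a function of applied stress, and the allowable is the stress at which P_f meets the reliability target. The modulus, scale, and volumes below are illustrative, not qualified material data.

```python
import math

# Weakest-link Weibull failure probability for a stressed volume V,
# referenced to a unit volume V_0. Parameters are illustrative only.
def weibull_pf(stress_mpa, volume, m=10.0, sigma_0=300.0, v_0=1.0):
    return 1.0 - math.exp(-(volume / v_0) * (stress_mpa / sigma_0) ** m)

for stress in (100.0, 150.0, 200.0):
    print(f"sigma = {stress:5.1f} MPa -> Pf = {weibull_pf(stress, volume=2.0):.3e}")
# The allowable is the stress at which Pf meets the reliability target,
# rather than a single deterministic allowable as for a metallic part.
```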
Probabilistic simulation of multi-scale composite behavior
NASA Technical Reports Server (NTRS)
Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.
1993-01-01
A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.
Wang, Yue; Adalý, Tülay; Kung, Sun-Yuan; Szabo, Zsolt
2007-01-01
This paper presents a probabilistic neural network based technique for unsupervised quantification and segmentation of brain tissues from magnetic resonance images. It is shown that this problem can be solved by distribution learning and relaxation labeling, resulting in an efficient method that may be particularly useful in quantifying and segmenting abnormal brain tissues where the number of tissue types is unknown and the distributions of tissue types heavily overlap. The new technique uses suitable statistical models for both the pixel and context images and formulates the problem in terms of model-histogram fitting and global consistency labeling. The quantification is achieved by probabilistic self-organizing mixtures and the segmentation by a probabilistic constraint relaxation network. The experimental results show the efficient and robust performance of the new algorithm and that it outperforms the conventional classification based approaches. PMID:18172510
Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten
2017-08-01
Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can directly be compared using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
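A minimal sketch of a probabilistic take-the-best rule of the kind described, assuming cues ordered by validity and an error probability attached to each cue position (higher for later cues, as in a rank order of error probabilities); all numbers are illustrative.

```python
import random

def probabilistic_ttb(cues_a, cues_b, error_probs, rng=random.Random(0)):
    """Choose 'A' or 'B' from paired cue vectors ordered by cue validity."""
    for cue_a, cue_b, err in zip(cues_a, cues_b, error_probs):
        if cue_a != cue_b:                       # first discriminating cue decides
            better = "A" if cue_a > cue_b else "B"
            if rng.random() < err:               # lapse: respond against the cue
                return "B" if better == "A" else "A"
            return better
    return rng.choice("AB")                      # guess if nothing discriminates

# Second cue discriminates here; its error probability is 0.10.
choice = probabilistic_ttb([1, 0, 1], [1, 1, 0], error_probs=[0.05, 0.10, 0.20])
print(choice)
```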
Probabilistic DHP adaptive critic for nonlinear stochastic control systems.
Herzallah, Randa
2013-06-01
Following the recently developed algorithms for fully probabilistic control design for general dynamic stochastic systems (Herzallah & Kárný, 2011; Kárný, 1996), this paper presents the solution to the probabilistic dual heuristic programming (DHP) adaptive critic method (Herzallah & Kárný, 2011) and randomized control algorithm for stochastic nonlinear dynamical systems. The purpose of the randomized control input design is to make the joint probability density function of the closed loop system as close as possible to a predetermined ideal joint probability density function. This paper completes the previous work (Herzallah & Kárný, 2011; Kárný, 1996) by formulating and solving the fully probabilistic control design problem for the more general case of nonlinear stochastic discrete time systems. A simulated example is used to demonstrate the use of the algorithm and encouraging results have been obtained. Copyright © 2013 Elsevier Ltd. All rights reserved.
Probabilistic double guarantee kidnapping detection in SLAM.
Tian, Yang; Ma, Shugen
2016-01-01
To determine whether kidnapping has happened, and which type of kidnapping it is, while a robot performs autonomous tasks in an unknown environment, a double guarantee kidnapping detection (DGKD) method has been proposed. DGKD has shown good performance in relatively small environments. However, our recent work found a limitation of DGKD in large-scale environments. In order to increase the adaptability of DGKD to large-scale environments, an improved method called probabilistic double guarantee kidnapping detection is proposed in this paper, combining the probability of features' positions and the robot's posture. Simulation results demonstrate the validity and accuracy of the proposed method.
Teacher Quality Roadmap: Improving Policies and Practices in the Miami-Dade County Public Schools
ERIC Educational Resources Information Center
National Council on Teacher Quality, 2012
2012-01-01
In partnership with the Urban League of Greater Miami, the National Council on Teacher Quality (NCTQ) released "Teacher Quality Roadmap: Improving Policies and Practices in Miami," an in-depth study of the work rules for Miami-Dade teachers. This look at the state of teacher policies in Miami-Dade County Public Schools explores the…
Roadmap to Guide U.S. Photovoltaics Industry in 21st Century
Golden, Colo., Jan. 20, 2000 - Americans want clean solar electricity, and the U.S. photovoltaics industry wants them to have it. Solar-cell manufacturers and suppliers see photovoltaics (PV) producing...
ERIC Educational Resources Information Center
Veliyath, Rajaram; Adams, Janet S.
2005-01-01
The course syllabus is a contract between instructor and students, a schedule of course assignments and activities, and a roadmap delineating objectives and checkpoints in the course. It is also a planning and reference tool for both students and instructor, and it models professors' expectations for their students. This study investigated whether…
Going Further: A Roadmap to the Works of the ACCLAIM Research Initiative. Working Paper No. 42
ERIC Educational Resources Information Center
Wilson, Zach; Howley, Craig
2012-01-01
"Going Further" presents a roadmap to the works of the ACCLAIM (Appalachian Collaborative Center for Learning, Assessment, and Instruction in Mathematics) Research Initiative, the research effort of one the Centers for Learning and Teaching (CLTs) created with a grant (2001-2005) from the National Science Foundation. The Center began…
Reducing Energy Burden with Solar: Colorado's Strategy and Roadmap for
Low-income residents suffer from a high energy burden, which can force these residents to choose between... The report concludes with a roadmap other states might consider when developing their own low-income solar programs, and with what other states might learn from the state's experience when they design their own programs.
NASA Technology Area 1: Launch Propulsion Systems
NASA Technical Reports Server (NTRS)
McConnaughey, Paul; Femminineo, Mark; Koelfgen, Syri; Lepsch, Roger; Ryan, Richard M.; Taylor, Steven A.
2011-01-01
This slide presentation reviews the technology advancement plans for the NASA Technology Area 1, Launch Propulsion Systems Technology Area (LPSTA). The draft roadmap reviews various propulsion system technologies that will be developed during the next 25+ years. This roadmap will be reviewed by the National Research Council, which will issue a final report that will include findings and recommendations.
ERIC Educational Resources Information Center
Fox, Lise; Veguilla, Myrna; Perez Binder, Denise
2014-01-01
The Technical Assistance Center on Social Emotional Intervention for Young Children (TACSEI) Roadmap on "Data Decision-Making and Program-Wide Implementation of the Pyramid Model" provides programs with guidance on how to collect and use data to ensure the implementation of the Pyramid Model with fidelity and decision-making that…
U.S. Department of Energy Office of Indian Energy Policy and Programs: Strategic Roadmap 2025
DOE Office of Scientific and Technical Information (OSTI.GOV)
The U.S. Department of Energy Office of Indian Energy Policy and Programs Strategic Roadmap 2025 outlines strategic target areas and tactical actions to ensure the Office remains aligned with its congressional mandates and DOE goals, and that it can be responsive to changing conditions in Indian Country and the nation.
Defining the role of silvicultural research in the Northeastern Forest Experiment Station
Chris Nowak; Susan Stout; John Brissette; Laura Kenefic; Gary Miller; Bill Leak; Dan Yaussy; Tom Schuler; Kurt Gottschalk
1997-01-01
Research planning in the Northeastern Forest Experiment Station has followed a grass-roots model for more than two years: ROADMAP, a research and development management plan. The goals for research within ROADMAP include understanding, protecting, managing, and utilizing forest ecosystems. There are nine research themes set to help achieve these goals, each with a set...
The technology roadmap for plant/crop-based renewable resources 2020
DOE Office of Scientific and Technical Information (OSTI.GOV)
McLaren, J.
1999-02-22
The long-term well-being of the nation and maintenance of a sustainable leadership position in agriculture, forestry, and manufacturing clearly depend on current and near-term support of multidisciplinary research for the development of a reliable renewable resource base. This document sets a roadmap and priorities for that research. America needs leadership that will continue to recognize, support, and move rapidly to meet the need to expand the use of sustainable renewable resources. This roadmap has highlighted potential ways for progress and has identified goals in specific components of the system. Achieving success with these goals will provide the opportunity to hit the vision target of a fivefold increase in renewable resource use by 2020.
NASA Technical Reports Server (NTRS)
McNeal, Curtis I., Jr.; Anderson, William
1999-01-01
NASA's current focus on technology roadmaps as a tool for guiding investment decisions leads naturally to a discussion of NASA's roadmap for peroxide propulsion system development. NASA's new Second Generation Space Transportation System roadmap calls for an integrated Reusable Upper-Stage (RUS) engine technology demonstration in the FY03/FY04 time period. Preceding this integrated demonstration are several years of component developments and subsystem technology demonstrations. NASA and the Air Force took the first steps at developing focused upper stage technologies with the initiation of the Upper Stage Flight Experiment with Orbital Sciences in December 1997. A review of this program's peroxide propulsion development is a useful first step in establishing the peroxide propulsion pathway that could lead to a RUS demonstration in 2004.
A roadmap to effective urban climate change adaptation
NASA Astrophysics Data System (ADS)
Setiadi, R.
2018-03-01
This paper outlines a roadmap to effective urban climate change adaptation built from our practical understanding of the evidence and effects of climate change and from the preparation of climate change adaptation strategies and plans. The roadmap aims to drive research toward fruitful knowledge and achievable, solution-based recommendations for adapting to climate change in urban areas in an effective and systematic manner. The paper underscores the importance of the interplay between local government initiatives and national government for effective adaptation to climate change, taking into account the policy process and politics. It argues that effective urban climate change adaptation contributes to building urban resilience and helps achieve national government goals and targets in climate change adaptation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stukel, Laura; Hoen, Ben; Adomatis, Sandra
Capturing the Sun: A Roadmap for Navigating Data-Access Challenges and Auto-Populating Solar Home Sales Listings supports a vision of solar photovoltaic (PV) advocates and real estate advocates evolving together to make information about solar homes more accessible to home buyers and sellers and to simplify the process when these homes are resold. The Roadmap is based on a concept in the real estate industry known as automatic population of fields. Auto-population (also called auto-pop in the industry) is the technology that allows data aggregated by an outside industry to be matched automatically with home sale listings in a multiple listing service (MLS).
Exploration of Advanced Probabilistic and Stochastic Design Methods
NASA Technical Reports Server (NTRS)
Mavris, Dimitri N.
2003-01-01
The primary objective of the three-year research effort was to explore advanced, non-deterministic aerospace system design methods that may have relevance to designers and analysts. The research pursued emerging areas in design methodology and leveraged current fundamental research in the areas of design decision-making, probabilistic modeling, and optimization. The specific focus of the three-year investigation was oriented toward methods to identify and analyze emerging aircraft technologies in a consistent and complete manner, and to explore means to make optimal decisions based on this knowledge in a probabilistic environment. The research efforts were classified into two main areas. First, Task A of the grant had the objective of conducting research into the relative merits of possible approaches that account for both multiple criteria and uncertainty in design decision-making. In particular, in the final year of research, the focus was on comparing and contrasting the three methods researched: the Joint Probabilistic Decision-Making (JPDM) technique, Physical Programming, and Dempster-Shafer (D-S) theory. The next element of the research, as contained in Task B, was focused upon exploration of the Technology Identification, Evaluation, and Selection (TIES) methodology developed at ASDL, especially with regard to identification of research needs in the baseline method through implementation exercises. The end result of Task B was the documentation of the evolution of the method with time and a technology transfer to the sponsor regarding the method, such that an initial capability for execution could be obtained by the sponsor. Specifically, the result of year 3 efforts was the creation of a detailed tutorial for implementing the TIES method. Within the tutorial package, templates and detailed examples were created for learning and understanding the details of each step. For both research tasks, sample files and tutorials are attached in electronic form on the enclosed CD.
Craniopharyngioma: a roadmap for scientific translation.
Gupta, Saksham; Bi, Wenya Linda; Giantini Larsen, Alexandra; Al-Abdulmohsen, Sally; Abedalthagafi, Malak; Dunn, Ian F
2018-06-01
OBJECTIVE Craniopharyngiomas are among the most challenging of intracranial tumors to manage because of their pattern of growth, associated morbidities, and high recurrence rate. Complete resection on initial encounter can be curative, but it may be impeded by the risks posed by the involved neurovascular structures. Recurrent craniopharyngiomas, in turn, are frequently refractory to additional surgery and adjuvant radiation or chemotherapy. METHODS The authors conducted a review of primary literature. RESULTS Recent advances in the understanding of craniopharyngioma biology have illuminated potential oncogenic targets for pharmacotherapy. Specifically, distinct molecular profiles define two histological subtypes of craniopharyngioma: adamantinomatous and papillary. The discovery of overactive B-Raf signaling in the adult papillary subtype has led to reports of targeted inhibitors, with a growing acceptance for refractory cases. An expanding knowledge of the biological underpinnings of craniopharyngioma will continue to drive development of targeted therapies and immunotherapies that are personalized to the molecular signature of each individual tumor. CONCLUSIONS The rapid translation of genomic findings to medical therapies for recurrent craniopharyngiomas serves as a roadmap for other challenging neurooncological diseases.
A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects
Slob, Wout
2015-01-01
Background When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives We developed a unified framework for probabilistic dose–response assessment. Methods We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results Probabilistically derived exposure limits are based on estimating a “target human dose” (HD_M^I), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HD_M^I values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions. Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063
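In the spirit of the framework (with illustrative distributions, not the paper's calibrated values), a target human dose can be derived by Monte Carlo: sample an uncertain point of departure and uncertain or variable adjustment factors as lognormals, then report a lower confidence bound on HD_M^I.

```python
import numpy as np

rng = np.random.default_rng(3)

# Monte Carlo sketch: divide an uncertain animal point of departure (POD)
# by lognormal interspecies and intraspecies adjustment factors, then take
# a lower confidence bound. All distributions are illustrative assumptions.
n = 100_000
pod = rng.lognormal(np.log(10.0), 0.3, n)          # animal POD, mg/kg-day
interspecies = rng.lognormal(np.log(4.0), 0.4, n)  # animal-to-human factor
intraspecies = rng.lognormal(np.log(3.0), 0.5, n)  # human variability, covers incidence I

hd = pod / (interspecies * intraspecies)           # samples of the target human dose
print(f"median HD = {np.median(hd):.3f} mg/kg-day, "
      f"90% lower bound = {np.percentile(hd, 5):.3f} mg/kg-day")
```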
A Guide to the Literature on Learning Graphical Models
NASA Technical Reports Server (NTRS)
Buntine, Wray L.; Friedland, Peter (Technical Monitor)
1994-01-01
This literature review discusses different methods under the general rubric of learning Bayesian networks from data, and more generally, learning probabilistic graphical models. Because many problems in artificial intelligence, statistics and neural networks can be represented as a probabilistic graphical model, this area provides a unifying perspective on learning. This paper organizes the research in this area along methodological lines of increasing complexity.
PREDICT: Privacy and Security Enhancing Dynamic Information Monitoring
2015-08-03
...consisting of global server-side probabilistic assignment by an untrusted server using cloaked locations, followed by feedback-loop guided local... [12]. These methods achieve high sensing coverage with low cost using cloaked locations [3]. In follow-on work, the issue of mobility is addressed.
The purpose of this SOP is to describe the procedures undertaken to calculate the ingestion exposure using composite food chemical residue values from the day of direct measurements. The calculation is based on the probabilistic approach. This SOP uses data that have been proper...
Development of Advanced Life Cycle Costing Methods for Technology Benefit/Cost/Risk Assessment
NASA Technical Reports Server (NTRS)
Yackovetsky, Robert (Technical Monitor)
2002-01-01
The overall objective of this three-year grant is to provide NASA Langley's System Analysis Branch with improved affordability tools and methods based on probabilistic cost assessment techniques. In order to accomplish this objective, the Aerospace Systems Design Laboratory (ASDL) needs to pursue more detailed affordability, technology impact, and risk prediction methods and to demonstrate them on a variety of advanced commercial transports. The affordability assessment, which is a cornerstone of ASDL methods, relies on the Aircraft Life Cycle Cost Analysis (ALCCA) program originally developed by NASA Ames Research Center and enhanced by ASDL. This grant proposed to improve ALCCA in support of the project objective by updating the research, design, test, and evaluation cost module, as well as the engine development cost module. Investigations into enhancements to ALCCA include improved engine development cost, process-based costing, supportability cost, and system reliability with airline loss of revenue for system downtime. A probabilistic, stand-alone version of ALCCA/FLOPS will also be developed under this grant in order to capture the uncertainty involved in technology assessments. FLOPS (FLight OPtimization System) is an aircraft synthesis and sizing code developed by NASA Langley Research Center. This probabilistic version of the coupled program will be used within a Technology Impact Forecasting (TIF) method to determine what types of technologies would have to be infused in a system in order to meet customer requirements. A probabilistic analysis of the CERs (cost estimating relationships) within ALCCA will also be carried out under this grant in order to gain insight into the most influential costs and the impact that code fidelity could have on future RDS (Robust Design Simulation) studies.
A probabilistic method for testing and estimating selection differences between populations.
He, Yungang; Wang, Minxian; Huang, Xin; Li, Ran; Xu, Hongyang; Xu, Shuhua; Jin, Li
2015-12-01
Human populations around the world encounter various environmental challenges and, consequently, develop genetic adaptations to different selection forces. Identifying the differences in natural selection between populations is critical for understanding the roles of specific genetic variants in evolutionary adaptation. Although numerous methods have been developed to detect genetic loci under recent directional selection, a probabilistic solution for testing and quantifying selection differences between populations is lacking. Here we report the development of a probabilistic method for testing and estimating selection differences between populations. By use of a probabilistic model of genetic drift and selection, we showed that logarithm odds ratios of allele frequencies provide estimates of the differences in selection coefficients between populations. The estimates approximate a normal distribution, and variance can be estimated using genome-wide variants. This allows us to quantify differences in selection coefficients and to determine the confidence intervals of the estimate. Our work also revealed the link between genetic association testing and hypothesis testing of selection differences. It therefore supplies a solution for hypothesis testing of selection differences. This method was applied to a genome-wide data analysis of Han and Tibetan populations. The results confirmed that both the EPAS1 and EGLN1 genes are under statistically different selection in Han and Tibetan populations. We further estimated differences in the selection coefficients for genetic variants involved in melanin formation and determined their confidence intervals between continental population groups. Application of the method to empirical data demonstrated the outstanding capability of this novel approach for testing and quantifying differences in natural selection. © 2015 He et al.; Published by Cold Spring Harbor Laboratory Press.
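The core statistic described above is compact enough to sketch directly: the log odds ratio of allele frequencies between two populations, standardized against a genome-wide empirical null to give a z-test. The data below are simulated and the function is a hypothetical helper; a real analysis would also account for sampling noise in the allele-frequency estimates:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated allele frequencies at many loci in two populations (illustrative).
n_loci = 10_000
p1 = rng.uniform(0.05, 0.95, n_loci)
p2 = np.clip(p1 + rng.normal(0, 0.03, n_loci), 0.01, 0.99)

# Log odds ratio of allele frequencies: an estimator of the difference in
# selection coefficients between the populations (up to a time scaling).
d = np.log(p1 / (1 - p1)) - np.log(p2 / (1 - p2))

# The genome-wide spread serves as the drift-only (null) standard deviation.
sigma = np.std(d)

def selection_difference_test(p1_locus, p2_locus):
    """z-test of a candidate locus against the genome-wide drift null."""
    d_locus = np.log(p1_locus / (1 - p1_locus)) - np.log(p2_locus / (1 - p2_locus))
    z = d_locus / sigma
    pval = 2 * stats.norm.sf(abs(z))
    ci = (d_locus - 1.96 * sigma, d_locus + 1.96 * sigma)
    return d_locus, z, pval, ci

print(selection_difference_test(0.85, 0.30))  # an EPAS1-like candidate locus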
Middlebrooks, E H; Tuna, I S; Grewal, S S; Almeida, L; Heckman, M G; Lesser, E R; Foote, K D; Okun, M S; Holanda, V M
2018-06-01
Although globus pallidus internus deep brain stimulation is a widely accepted treatment for Parkinson disease, there is persistent variability in outcomes that is not yet fully understood. In this pilot study, we aimed to investigate the potential role of globus pallidus internus segmentation using probabilistic tractography as a supplement to traditional targeting methods. Eleven patients undergoing globus pallidus internus deep brain stimulation were included in this retrospective analysis. Using multidirection diffusion-weighted MR imaging, we performed probabilistic tractography at all individual globus pallidus internus voxels. Each globus pallidus internus voxel was then assigned to the 1 ROI with the greatest number of propagated paths. On the basis of deep brain stimulation programming settings, the volume of tissue activated was generated for each patient using a finite element method solution. For each patient, the volume of tissue activated within each of the 10 segmented globus pallidus internus regions was calculated and examined for association with a change in the Unified Parkinson Disease Rating Scale, Part III score before and after treatment. Increasing volume of tissue activated was most strongly correlated with a change in the Unified Parkinson Disease Rating Scale, Part III score for the primary motor region (Spearman r = 0.74, P = .010), followed by the supplementary motor area/premotor cortex (Spearman r = 0.47, P = .15). In this pilot study, we assessed a novel method of segmentation of the globus pallidus internus based on probabilistic tractography as a supplement to traditional targeting methods. Our results suggest that our method may be an independent predictor of deep brain stimulation outcome, and evaluation of a larger cohort or prospective study is warranted to validate these findings. © 2018 by American Journal of Neuroradiology.
NASA Astrophysics Data System (ADS)
Ren, Weiwei; Yang, Tao; Shi, Pengfei; Xu, Chong-yu; Zhang, Ke; Zhou, Xudong; Shao, Quanxi; Ciais, Philippe
2018-06-01
Climate change exerts a profound influence on the regional hydrological cycle and water security in many alpine regions worldwide. Investigating regional climate impacts using watershed-scale hydrological models requires a large amount of input data, such as topography and meteorological and hydrological records. However, data scarcity in alpine regions seriously restricts evaluation of climate change impacts on the water cycle using conventional approaches based on global or regional climate models, statistical downscaling methods, and hydrological models. Therefore, this study is dedicated to the development of a probabilistic model to replace the conventional approaches for streamflow projection. The probabilistic model was built upon an advanced Bayesian Neural Network (BNN) approach directly fed by large-scale climate predictor variables and tested in a typical data-sparse alpine region, the Kaidu River basin in Central Asia. Results show that the BNN model performs better than the general methods across a number of statistical measures. The BNN method, with flexible model structures provided by active indicator functions that reduce the dependence on the initial specification of the input variables and the number of hidden units, can work well in a data-limited region. Moreover, it can provide more reliable streamflow projections with a robust generalization ability. Forced by the latest bias-corrected GCM scenarios, streamflow projections for the 21st century under three RCP emission pathways were constructed and analyzed. Briefly, the proposed probabilistic projection approach could improve runoff predictive ability over conventional methods, provide better support to water resources planning and management under data-limited conditions, and facilitate climate change impact analysis on runoff and water resources in alpine regions worldwide.
NASA Astrophysics Data System (ADS)
Ishizaki, N. N.; Dairaku, K.; Ueno, G.
2016-12-01
We have developed a statistical downscaling method for estimating probabilistic climate projections using CMIP5 multiple general circulation models (GCMs). A regression model was established so that the combination of weights of GCMs reflects the characteristics of the variation of observations at each grid point. Cross validations were conducted to select GCMs and to evaluate the regression model while avoiding multicollinearity. Using a spatially high-resolution observation system, we constructed statistically downscaled probabilistic climate projections with 20-km horizontal grid spacing. Root mean squared errors for monthly mean surface air temperature and precipitation estimated by the regression method were the smallest compared with the results derived from a simple ensemble mean of GCMs and a cumulative distribution function based bias correction method. Projected changes in mean temperature and precipitation were basically similar to those of the simple ensemble mean of GCMs. Mean precipitation was generally projected to increase, associated with increased temperature and consequently increased moisture content in the air. Weakening of the winter monsoon may contribute to precipitation decreases in some areas. A temperature increase in excess of 4 K is expected in most areas of Japan by the end of the 21st century under the RCP8.5 scenario. The estimated probability of monthly precipitation exceeding 300 mm would increase around the Pacific side during the summer and the Japan Sea side during the winter season. This probabilistic climate projection based on the statistical method can be expected to bring useful information to impact studies and risk assessments.
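A minimal sketch of the per-grid-point regression idea, with synthetic data and plain least squares standing in for the full procedure (the GCM selection step is not reproduced); the leave-one-out loop illustrates the cross-validation used to guard against overfitting and multicollinearity:

import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly series at one grid point: 30 years of observations and
# five GCM simulations of the same variable (illustrative stand-ins).
n_months, n_gcms = 360, 5
gcms = rng.normal(0, 1, (n_months, n_gcms)).cumsum(axis=0) * 0.05 + 15.0
obs = gcms @ np.array([0.5, 0.3, 0.1, 0.05, 0.05]) + rng.normal(0, 0.3, n_months)

# Design matrix with an intercept; GCM weights fitted by least squares.
X = np.column_stack([np.ones(n_months), gcms])
beta, *_ = np.linalg.lstsq(X, obs, rcond=None)

# Leave-one-out cross-validation error of the weighted combination.
errs = []
for i in range(n_months):
    keep = np.arange(n_months) != i
    b, *_ = np.linalg.lstsq(X[keep], obs[keep], rcond=None)
    errs.append(obs[i] - X[i] @ b)
print("GCM weights:", np.round(beta[1:], 3), " LOO-RMSE:", np.round(np.std(errs), 3))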
Zulkifley, Mohd Asyraf; Rawlinson, David; Moran, Bill
2012-01-01
In video analytics, robust observation detection is very important as the content of the videos varies a lot, especially for tracking implementations. Contrary to the image processing field, the problems of blurring, moderate deformation, low-illumination surroundings, illumination change, and homogeneous texture are normally encountered in video analytics. Patch-Based Observation Detection (PBOD) is developed to improve detection robustness to complex scenes by fusing both feature- and template-based recognition methods. While feature-based detectors are more distinctive, matching between frames is best achieved by a collection of points, as in template-based detectors. Two methods of PBOD, the deterministic and probabilistic approaches, have been tested to find the best mode of detection. Both algorithms start by building comparison vectors at each detected point of interest. The vectors are matched to build candidate patches based on their respective coordinates. For the deterministic method, patch matching is done in a 2-level test where threshold-based position and size smoothing are applied to the patch with the highest correlation value. For the second approach, patch matching is done probabilistically by modelling the histograms of the patches by Poisson distributions for both RGB and HSV colour models. Then, maximum likelihood is applied for position smoothing while a Bayesian approach is applied for size smoothing. The results showed that probabilistic PBOD outperforms the deterministic approach, with an average distance error of 10.03% compared with 21.03%. This algorithm is best implemented as a complement to other, simpler detection methods due to its heavy processing requirements. PMID:23202226
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ferri, Aldo A.
1995-01-01
Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. This paper presents a method where the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented that illustrates the theory.
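The residual-flexibility machinery is beyond a short example, but the end product, a cumulative distribution function for a natural frequency when measured substructure properties are treated as random variables, can be sketched on a two-degree-of-freedom stand-in. Note the sketch uses brute-force Monte Carlo, exactly what the paper's analytic probabilistic methods are designed to avoid; all statistics are illustrative:

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(3)
n = 20_000

# Measured-property statistics for two substructures (illustrative):
# stiffnesses (N/m) and masses (kg) with a few percent scatter.
k1 = rng.normal(1.0e4, 3e2, n)
k2 = rng.normal(2.0e4, 6e2, n)
m1 = rng.normal(1.0, 0.02, n)
m2 = rng.normal(1.5, 0.03, n)

f1 = np.empty(n)
for i in range(n):
    K = np.array([[k1[i] + k2[i], -k2[i]], [-k2[i], k2[i]]])
    M = np.diag([m1[i], m2[i]])
    lam = eigh(K, M, eigvals_only=True)    # generalized eigenvalues K v = lam M v
    f1[i] = np.sqrt(lam[0]) / (2 * np.pi)  # first natural frequency, Hz

# Empirical CDF of the first natural frequency.
x = np.sort(f1)
print(f"5th percentile {x[int(0.05 * n)]:.2f} Hz, median {np.median(f1):.2f} Hz")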
Zhang, Lei; Zeng, Zhi; Ji, Qiang
2011-09-01
Chain graph (CG) is a hybrid probabilistic graphical model (PGM) capable of modeling heterogeneous relationships among random variables. So far, however, its application in image and video analysis has been very limited due to the lack of principled learning and inference methods for a CG of general topology. To overcome this limitation, we introduce methods to extend the conventional chain-like CG model to a CG model with more general topology, along with the associated methods for learning and inference in such a general CG model. Specifically, we propose techniques to systematically construct a generally structured CG, to parameterize this model, to derive its joint probability distribution, to perform joint parameter learning, and to perform probabilistic inference in this model. To demonstrate the utility of such an extended CG, we apply it to two challenging image and video analysis problems: human activity recognition and image segmentation. The experimental results show improved performance of the extended CG model over the conventional directed or undirected PGMs. This study demonstrates the promise of the extended CG for effective modeling and inference of complex real-world problems.
Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William
2009-01-01
This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods used to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment that is sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
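In the spirit of the guidebook described above, the canonical first example of Bayesian data analysis for reliability is the conjugate Beta-Binomial update of a per-demand failure probability. The prior and the data below are illustrative, not taken from the document:

from scipy import stats

# Jeffreys Beta(0.5, 0.5) prior on a per-demand failure probability,
# updated with illustrative field data: 2 failures in 150 demands.
a0, b0 = 0.5, 0.5
failures, demands = 2, 150
posterior = stats.beta(a0 + failures, b0 + demands - failures)

print(f"posterior mean p_fail = {posterior.mean():.4f}")
print(f"90% credible interval = ({posterior.ppf(0.05):.4f}, {posterior.ppf(0.95):.4f})")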
NASA Astrophysics Data System (ADS)
Nawaz, Muhammad Atif; Curtis, Andrew
2018-04-01
We introduce a new Bayesian inversion method that estimates the spatial distribution of geological facies from attributes of seismic data, showing how the usual probabilistic inverse problem can be solved within an optimization framework while still providing full probabilistic results. Our mathematical model treats seismic attributes as observed data, which are assumed to have been generated by the geological facies. The method infers the post-inversion (posterior) probability density of the facies, plus some other unknown model parameters, from the seismic attributes and geological prior information. Most previous research in this domain is based on the localized-likelihoods assumption, whereby the seismic attributes at a location are assumed to depend on the facies only at that location. Such an assumption is unrealistic because of imperfect seismic data acquisition and processing, and fundamental limitations of seismic imaging methods. In this paper, we relax this assumption: we allow probabilistic dependence between seismic attributes at a location and the facies in any neighbourhood of that location through a spatial filter. We term such likelihoods quasi-localized.
Carbon stocks on forestland of the United States, with emphasis on USDA Forest Service ownership
Linda S. Heath; James E. Smith; Christopher W. Woodall; David L. Azuma; Karen L. Waddell
2011-01-01
The U.S. Department of Agriculture Forest Service (USFS) manages one-fifth of the area of forestland in the United States. The Forest Service Roadmap for responding to climate change identified assessing and managing carbon stocks and change as a major element of its plan. This study presents methods and results of estimating current forest carbon stocks and change in...
The European Hematology Association Roadmap for European Hematology Research: a consensus document.
Engert, Andreas; Balduini, Carlo; Brand, Anneke; Coiffier, Bertrand; Cordonnier, Catherine; Döhner, Hartmut; de Wit, Thom Duyvené; Eichinger, Sabine; Fibbe, Willem; Green, Tony; de Haas, Fleur; Iolascon, Achille; Jaffredo, Thierry; Rodeghiero, Francesco; Salles, Gilles; Schuringa, Jan Jacob
2016-02-01
The European Hematology Association (EHA) Roadmap for European Hematology Research highlights major achievements in diagnosis and treatment of blood disorders and identifies the greatest unmet clinical and scientific needs in those areas to enable better funded, more focused European hematology research. Initiated by the EHA, around 300 experts contributed to the consensus document, which will help European policy makers, research funders, research organizations, researchers, and patient groups make better informed decisions on hematology research. It also aims to raise public awareness of the burden of blood disorders on European society, which purely in economic terms is estimated at €23 billion per year, a level of cost that is not matched in current European hematology research funding. In recent decades, hematology research has improved our fundamental understanding of the biology of blood disorders, and has improved diagnostics and treatments, sometimes in revolutionary ways. This progress highlights the potential of focused basic research programs such as this EHA Roadmap. The EHA Roadmap identifies nine 'sections' in hematology: normal hematopoiesis, malignant lymphoid and myeloid diseases, anemias and related diseases, platelet disorders, blood coagulation and hemostatic disorders, transfusion medicine, infections in hematology, and hematopoietic stem cell transplantation. These sections span 60 smaller groups of diseases or disorders. The EHA Roadmap identifies priorities and needs across the field of hematology, including those to develop targeted therapies based on genomic profiling and chemical biology, to eradicate minimal residual malignant disease, and to develop cellular immunotherapies, combination treatments, gene therapies, hematopoietic stem cell treatments, and treatments that are better tolerated by elderly patients. Copyright © Ferrata Storti Foundation.
Linking Six Sigma to simulation: a new roadmap to improve the quality of patient care.
Celano, Giovanni; Costa, Antonio; Fichera, Sergio; Tringali, Giuseppe
2012-01-01
Improving the quality of patient care is a challenge that calls for a multidisciplinary approach, embedding a broad spectrum of knowledge and involving healthcare professionals from diverse backgrounds. The purpose of this paper is to present an innovative approach that implements discrete-event simulation (DES) as a decision-supporting tool in the management of Six Sigma quality improvement projects. A roadmap is designed to assist quality practitioners and health care professionals in the design and successful implementation of simulation models within the define-measure-analyse-design-verify (DMADV) or define-measure-analyse-improve-control (DMAIC) Six Sigma procedures. A case regarding the reorganisation of the flow of emergency patients affected by vertigo symptoms was developed in a large town hospital as a preliminary test of the roadmap. The positive feedback from professionals carrying out the project looks promising and encourages further roadmap testing in other clinical settings. The roadmap is a structured procedure that people involved in quality improvement can implement to manage projects based on the analysis and comparison of alternative scenarios. The role of Six Sigma philosophy in improvement of the quality of healthcare services is recognised both by researchers and by quality practitioners; discrete-event simulation models are commonly used to improve the key performance measures of patient care delivery. The two approaches are seldom referenced and implemented together; however, they could be successfully integrated to carry out quality improvement programs. This paper proposes an innovative approach to bridge the gap and enrich the Six Sigma toolbox of quality improvement procedures with DES.
Durability reliability analysis for corroding concrete structures under uncertainty
NASA Astrophysics Data System (ADS)
Zhang, Hao
2018-02-01
This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
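The purely probabilistic treatment described above can be sketched as a double-loop Monte Carlo: an outer loop samples the epistemic parameters, an inner loop samples aleatory variability, and the result is a distribution over the failure probability itself. The toy limit state below stands in for the chloride-ingress corrosion model, and every distribution is an illustrative assumption:

import numpy as np

rng = np.random.default_rng(4)
n_epi, n_ale = 200, 5_000

# Toy limit state: failure when chloride load C exceeds resistance R.
# Epistemic: the mean of C is known only imprecisely (outer loop).
# Aleatory: C and R vary randomly across nominally identical members (inner loop).
pf = np.empty(n_epi)
for j in range(n_epi):
    mu_C = max(rng.normal(1.0, 0.15), 1e-6)        # epistemic draw
    C = rng.lognormal(np.log(mu_C), 0.25, n_ale)   # aleatory draws
    R = rng.lognormal(np.log(1.8), 0.20, n_ale)
    pf[j] = np.mean(C > R)

print(f"failure probability: median {np.median(pf):.4f}, "
      f"epistemic 90% band [{np.percentile(pf, 5):.4f}, {np.percentile(pf, 95):.4f}]")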
NASA Technical Reports Server (NTRS)
McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.
2012-01-01
This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and cg location and requirements on adaptor stiffnesses while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adaptor parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling-based techniques. For contrast, some MPP (most probable point) based approaches are also examined.
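A minimal sketch of the sampling-based flavor of PSA, with a made-up algebraic load model standing in for the integrated launch vehicle analysis; rank correlations between the sampled payload parameters and the response give a crude global sensitivity ranking. All parameter ranges are illustrative:

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 5_000

# Random payload parameters (illustrative ranges, not vehicle data).
mass = rng.normal(5_000, 250, n)                   # payload mass, kg
cg = rng.uniform(1.0, 2.0, n)                      # cg offset from interface, m
k_adapter = rng.lognormal(np.log(2e7), 0.2, n)     # adaptor stiffness, N/m

# Stand-in response: a load metric rising with mass and cg offset and
# falling with adaptor stiffness (a real model would replace this line).
load = mass * cg * (1 + 5e6 / k_adapter) + rng.normal(0, 200, n)

for name, x in [("mass", mass), ("cg", cg), ("k_adapter", k_adapter)]:
    rho, _ = stats.spearmanr(x, load)
    print(f"Spearman rank correlation of load with {name}: {rho:+.2f}")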
Alomari, Yazan M.; MdZin, Reena Rahayu
2015-01-01
Analysis of whole-slide tissue in digital pathology images has been clinically approved to provide a second opinion to pathologists. Localization of focus points from Ki-67-stained histopathology whole-slide tissue microscopic images is considered the first step in the process of proliferation rate estimation. Pathologists use eye pooling or eagle-view techniques to localize the highly stained, cell-concentrated regions from the whole slide under the microscope; these are called focus-point regions. This procedure leads to high interobserver variability, is time consuming and tedious, and can cause inaccurate findings. The localization of focus-point regions can be addressed as a clustering problem. This paper aims to automate the localization of focus-point regions from whole-slide images using the random patch probabilistic density (RPPD) method. Unlike other clustering methods, the random patch probabilistic density method can adaptively localize focus-point regions without predetermining the number of clusters. The proposed method was compared with the k-means and fuzzy c-means clustering methods. Our proposed method achieves good performance when the results are evaluated by three expert pathologists, with an average false-positive rate of 0.84% for the focus-point region localization error. Moreover, when RPPD was used to localize tissue from whole-slide images, 228 whole-slide images were tested and 97.3% localization accuracy was achieved. PMID:25793010
NASA Astrophysics Data System (ADS)
Králik, Juraj
2017-07-01
The paper presents the probabilistic and sensitivity analysis of the efficiency of the damping devices of the nuclear power plant cover under the impact of a dropped nuclear fuel container of type TK C30. The finite element idealization of the nuclear power plant structure is formulated in three dimensions. A steel pipe damper system is proposed for dissipation of the kinetic energy of the container free fall. Experimental results on the behavior of the basic shock-damper element under impact loads are presented. The Newmark integration method is used for the solution of the dynamic equations. The sensitivity and probabilistic analysis of the damping devices was realized in the AntHILL and ANSYS software.
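The Newmark integration method named above is standard; a minimal constant-average-acceleration implementation for a damped single-degree-of-freedom stand-in under a short impact pulse (all parameters illustrative, not the plant model):

import numpy as np

# Newmark-beta integration (constant average acceleration: beta = 1/4,
# gamma = 1/2) for m*u'' + c*u' + k*u = f(t).
m, c, k = 1.0, 0.5, 1.0e4
beta, gamma = 0.25, 0.5
dt, n = 1.0e-3, 5000

t = np.arange(n) * dt
f = np.where(t < 0.01, 1.0e3, 0.0)   # short impact pulse (illustrative)

u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
a[0] = (f[0] - c * v[0] - k * u[0]) / m

keff = m / (beta * dt**2) + gamma * c / (beta * dt) + k
for i in range(n - 1):
    rhs = (f[i + 1]
           + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                  + (0.5 / beta - 1.0) * a[i])
           + c * (gamma * u[i] / (beta * dt)
                  + (gamma / beta - 1.0) * v[i]
                  + dt * (0.5 * gamma / beta - 1.0) * a[i]))
    u[i + 1] = rhs / keff
    a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                - v[i] / (beta * dt) - (0.5 / beta - 1.0) * a[i])
    v[i + 1] = v[i] + dt * ((1.0 - gamma) * a[i] + gamma * a[i + 1])

print(f"peak displacement {u.max():.4e} m at t = {t[u.argmax()]:.3f} s")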
The application of probabilistic design theory to high temperature low cycle fatigue
NASA Technical Reports Server (NTRS)
Wirsching, P. H.
1981-01-01
Metal fatigue under stress and thermal cycling is a principal mode of failure in gas turbine engine hot section components such as turbine blades and disks and combustor liners. Designing for fatigue is subject to considerable uncertainty, e.g., scatter in cycles to failure, available fatigue test data and operating environment data, uncertainties in the models used to predict stresses, etc. Methods of analyzing fatigue test data for probabilistic design purposes are summarized. The general strain-life model, as well as homoscedastic and heteroscedastic models, is considered. Modern probabilistic design theory is reviewed and examples are presented which illustrate application to reliability analysis of gas turbine engine components.
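One ingredient of such an analysis, turning scattered fatigue test data into a probabilistic design curve, can be sketched as a strain-life regression in log space with a lower design bound. The data are simulated and only the homoscedastic case is shown:

import numpy as np

rng = np.random.default_rng(6)

# Simulated strain-life data (illustrative): log10(life) linear in
# log10(strain amplitude), with homoscedastic scatter in log-life.
strain = rng.uniform(0.002, 0.02, 40)
logN = 3.0 - 2.5 * np.log10(strain / 0.01) + rng.normal(0, 0.15, 40)

# Least-squares fit in log space; the residual SD estimates the scatter.
X = np.column_stack([np.ones_like(strain), np.log10(strain)])
coef, res, *_ = np.linalg.lstsq(X, logN, rcond=None)
sigma = np.sqrt(res[0] / (len(logN) - 2))

# Design life at a given strain: median minus 3 sigma (illustrative margin).
eps = 0.005
logN_med = coef[0] + coef[1] * np.log10(eps)
print(f"median life {10**logN_med:.0f} cycles, "
      f"-3 sigma design life {10**(logN_med - 3 * sigma):.0f} cycles")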
NASA Astrophysics Data System (ADS)
Javidi, Bahram; Carnicer, Artur; Yamaguchi, Masahiro; Nomura, Takanori; Pérez-Cabré, Elisabet; Millán, María S.; Nishchal, Naveen K.; Torroba, Roberto; Fredy Barrera, John; He, Wenqi; Peng, Xiang; Stern, Adrian; Rivenson, Yair; Alfalou, A.; Brosseau, C.; Guo, Changliang; Sheridan, John T.; Situ, Guohai; Naruse, Makoto; Matsumoto, Tsutomu; Juvells, Ignasi; Tajahuerce, Enrique; Lancis, Jesús; Chen, Wen; Chen, Xudong; Pinkse, Pepijn W. H.; Mosk, Allard P.; Markman, Adam
2016-08-01
Information security and authentication are important challenges facing society. Recent attacks by hackers on the databases of large commercial and financial companies have demonstrated that more research and development of advanced approaches are necessary to deny unauthorized access to critical data. Free space optical technology has been investigated by many researchers in information security, encryption, and authentication. The main motivation for using optics and photonics for information security is that optical waveforms possess many complex degrees of freedom such as amplitude, phase, polarization, large bandwidth, nonlinear transformations, quantum properties of photons, and multiplexing that can be combined in many ways to make information encryption more secure and more difficult to attack. This roadmap article presents an overview of the potential, recent advances, and challenges of optical security and encryption using free space optics. The roadmap on optical security is comprised of six categories that together include 16 short sections written by authors who have made relevant contributions in this field. The first category of this roadmap describes novel encryption approaches, including secure optical sensing which summarizes double random phase encryption applications and flaws [Yamaguchi], the digital holographic encryption in free space optical technique which describes encryption using multidimensional digital holography [Nomura], simultaneous encryption of multiple signals [Pérez-Cabré], asymmetric methods based on information truncation [Nishchal], and dynamic encryption of video sequences [Torroba]. Asymmetric and one-way cryptosystems are analyzed by Peng. The second category is on compression for encryption. In their respective contributions, Alfalou and Stern propose similar goals involving compressed data and compressive sensing encryption. The very important area of cryptanalysis is the topic of the third category with two sections: Sheridan reviews phase retrieval algorithms to perform different attacks, whereas Situ discusses nonlinear optical encryption techniques and the development of a rigorous optical information security theory. The fourth category with two contributions reports how encryption could be implemented at the nano- or micro-scale. Naruse discusses the use of nanostructures in security applications and Carnicer proposes encoding information in a tightly focused beam. In the fifth category, encryption based on ghost imaging using single-pixel detectors is also considered. In particular, the authors [Chen, Tajahuerce] emphasize the need for more specialized hardware and image processing algorithms. Finally, in the sixth category, Mosk and Javidi analyze in their corresponding papers how quantum imaging can benefit optical encryption systems. Sources that use few photons make encryption systems much more difficult to attack, providing a secure method for authentication.
Leading from the Front of the Classroom: A Roadmap to Teacher Leadership That Works
ERIC Educational Resources Information Center
Aspen Institute, 2014
2014-01-01
In this paper, Leading Educators and the Aspen Institute propose a roadmap to empower teachers to lead from the front of the classroom. This paper outlines key phases that system administrators will need to consider as they build teacher leadership systems that address their highest priorities. For each phase, the Aspen Institute offers a…
ERIC Educational Resources Information Center
Castro, Helio; Putnik, Goran D.; Shah, Vaibhav
2012-01-01
Purpose: The aim of this paper is to analyze international and national research and development (R&D) programs and roadmaps for the manufacturing sector, presenting how agile and lean manufacturing models are addressed in these programs. Design/methodology/approach: In this review, several manufacturing research and development programs and…
ERIC Educational Resources Information Center
Data Quality Campaign, 2014
2014-01-01
High school feedback reports let school and district leaders know where their students go after graduation and how well they are prepared for college and beyond. This roadmap discusses the seven key focus areas the Data Quality Campaign (DQC) recommends states work on to ensure quality implementation of high school feedback reports.
Reducing Human Radiation Risks on Deep Space Missions
2017-09-01
...Figure 53, "Risk Assessment for Acute Radiation Syndrome Due to SPEs" (source: NASA Human Research Roadmap, 2016), highlights the fact that acute radiation syndrome due to solar particle events is a short-term risk... acceptable for long-term missions.
ERIC Educational Resources Information Center
Data Quality Campaign, 2014
2014-01-01
State licensure polices are meant to provide teacher preparation programs with direction about the skills teachers need to be qualified to teach, including skills to use data. This roadmap discusses the 10 key data use skills that states can include in a licensure policy with a quality focus on effective data use.
ERIC Educational Resources Information Center
Data Quality Campaign, 2016
2016-01-01
Every state can create secure, robust linkages between early childhood and K-12 data systems, and effectively use the information from these linkages to implement initiatives to support programs and children, answer key policy questions, and be transparent about how the state's early childhood investments prepare students for success in school and…
Unmanned Systems Integrated Roadmap FY2011-2036
2011-10-01
...neuroscience, and cognition science may lead to the implementation of some of the most critical functionalities of heterogeneous, sensor net... Unmanned systems' incorporation of data encryption includes National Security Agency (NSA) Type 1 (for... see DODI 4660). Numerous other policies and initiatives are under development within the NSA to significantly streamline the certification processes.
NASA Astrophysics Data System (ADS)
Corbisier, Christopher
2005-09-01
Research in Europe, as documented by an FHWA/AASHTO European Scan Tour held in May 2004, and recent activity in Arizona and California, has fostered much interest in "quiet pavements." On September 14-16, 2004, an FHWA-sponsored Roadmap to Quieter Highways workshop was held at Purdue University. Participants were from the disciplines of pavement, safety, and noise from FHWA, State departments of transportation, industry (paving associations, general contractors, tire, and vehicle manufacturers), and academia. After several breakout sessions in the areas of policy, construction, maintenance, analysis (measurement and prediction), research, and design, the group had identified the knowledge gaps and developed a plan to fill those gaps. Several activities have been implemented based on the Roadmap to Quieter Highways. An Expert Task Group was formed to provide a draft provisional standard for the measurement methodologies, e.g., source, wayside, pavement absorption. A Tire/Pavement 101 workshop is being developed to educate pavement practitioners in noise concepts and noise practitioners in pavement concepts. A Tire/Pavement Noise clearinghouse is being developed as a one-stop location for all current tire/pavement noise or quiet pavement activities. Several research studies have been started and a second workshop will be held in 2006 to assess progress of the Roadmap.
Integration of NASA-Developed Lifing Technology for PM Alloys into DARWIN (registered trademark)
NASA Technical Reports Server (NTRS)
McClung, R. Craig; Enright, Michael P.; Liang, Wuwei
2011-01-01
In recent years, Southwest Research Institute (SwRI) and NASA Glenn Research Center (GRC) have worked independently on the development of probabilistic life prediction methods for materials used in gas turbine engine rotors. The two organizations have addressed different but complementary technical challenges. This report summarizes a brief investigation into the current status of the relevant technology at SwRI and GRC with a view towards a future integration of methods and models developed by GRC for probabilistic lifing of powder metallurgy (P/M) nickel turbine rotor alloys into the DARWIN (Darwin Corporation) software developed by SwRI.
Speech processing using maximum likelihood continuity mapping
Hogden, John E.
2000-01-01
Speech processing is obtained that, given a probabilistic mapping between static speech sounds and pseudo-articulator positions, allows sequences of speech sounds to be mapped to smooth sequences of pseudo-articulator positions. In addition, a method for learning a probabilistic mapping between static speech sounds and pseudo-articulator position is described. The method for learning the mapping between static speech sounds and pseudo-articulator position uses a set of training data composed only of speech sounds. The said speech processing can be applied to various speech analysis tasks, including speech recognition, speaker recognition, speech coding, speech synthesis, and voice mimicry.
ERIC Educational Resources Information Center
Wang, Yinying; Bowers, Alex J.; Fikis, David J.
2017-01-01
Purpose: The purpose of this study is to describe the underlying topics and the topic evolution in the 50-year history of educational leadership research literature. Method: We used automated text data mining with probabilistic latent topic models to examine the full text of the entire publication history of all 1,539 articles published in…
Method and system for dynamic probabilistic risk assessment
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)
2013-01-01
The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems, by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
A Multiatlas Segmentation Using Graph Cuts with Applications to Liver Segmentation in CT Scans
2014-01-01
An atlas-based segmentation approach is presented that combines low-level operations, an affine probabilistic atlas, and a multiatlas-based segmentation. The proposed combination provides highly accurate segmentation due to registrations and atlas selections based on the regions of interest (ROIs) and coarse segmentations. Our approach shares the following common elements between the probabilistic atlas and multiatlas segmentation: (a) the spatial normalisation and (b) the segmentation method, which is based on minimising a discrete energy function using graph cuts. The method is evaluated for the segmentation of the liver in computed tomography (CT) images. Low-level operations define a ROI around the liver from an abdominal CT. We generate a probabilistic atlas using an affine registration based on geometry moments from manually labelled data. Next, a coarse segmentation of the liver is obtained from the probabilistic atlas with low computational effort. Then, a multiatlas segmentation approach improves the accuracy of the segmentation. Both the atlas selections and the nonrigid registrations of the multiatlas approach use a binary mask defined by coarse segmentation. We experimentally demonstrate that this approach performs better than atlas selections and nonrigid registrations in the entire ROI. The segmentation results are comparable to those obtained by human experts and to other recently published results. PMID:25276219
A Probabilistic Approach to Predict Thermal Fatigue Life for Ball Grid Array Solder Joints
NASA Astrophysics Data System (ADS)
Wei, Helin; Wang, Kuisheng
2011-11-01
Numerous studies of the reliability of solder joints have been performed. Most life prediction models are limited to a deterministic approach. However, manufacturing induces uncertainty in the geometry parameters of solder joints, and the environmental temperature varies widely due to end-user diversity, creating uncertainties in the reliability of solder joints. In this study, a methodology for accounting for variation in the lifetime prediction for lead-free solder joints of plastic ball grid array (PBGA) packages is demonstrated. The key solder joint parameters and the cyclic temperature range related to reliability are involved. Probabilistic solutions of the inelastic strain range and thermal fatigue life based on the Engelmaier model are developed to determine the probability of solder joint failure. The results indicate that the standard deviation increases significantly when more random variations are involved. Using the probabilistic method, the influence of each variable on the thermal fatigue life is quantified. This information can be used to optimize product design and process validation acceptance criteria. The probabilistic approach creates the opportunity to identify the root causes of failed samples from product fatigue tests and field returns. The method can be applied to better understand how variation affects parameters of interest in an electronic package design with area array interconnections.
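A minimal Monte Carlo sketch around an Engelmaier-type strain-range model: joint geometry, CTE mismatch, and temperature swing are sampled, and the probability of failing before a target life is read off. The constants and distributions below are illustrative assumptions, not the paper's values:

import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Random inputs (illustrative means and scatter): joint height h, distance
# from the neutral point L, CTE mismatch dalpha, cyclic temperature swing dT.
h = rng.normal(0.5e-3, 0.02e-3, n)     # m
L = rng.normal(10e-3, 0.3e-3, n)       # m
dalpha = rng.normal(14e-6, 1e-6, n)    # 1/K
dT = rng.normal(80.0, 8.0, n)          # K

# Engelmaier-type model: cyclic shear strain range, then fatigue life.
# eps_f (fatigue ductility coefficient) and c_exp are illustrative constants.
eps_f, c_exp = 0.325, -0.442
dgamma = (L / h) * dalpha * dT
Nf = 0.5 * (dgamma / (2.0 * eps_f)) ** (1.0 / c_exp)

print(f"median life {np.median(Nf):.0f} cycles, "
      f"1st percentile {np.percentile(Nf, 1):.0f} cycles")
print(f"P(Nf < 1000 cycles) = {np.mean(Nf < 1000):.4f}")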
Effects of delay and probability combinations on discounting in humans.
Cox, David J; Dallery, Jesse
2016-10-01
To determine discount rates, researchers typically adjust the amount of an immediate or certain option relative to a delayed or uncertain option. Because this adjusting amount method can be relatively time consuming, researchers have developed more efficient procedures. One such procedure is a 5-trial adjusting delay procedure, which measures the delay at which an amount of money loses half of its value (e.g., $1000 is valued at $500 with a 10-year delay to its receipt). Experiment 1 (n=212) used 5-trial adjusting delay or probability tasks to measure delay discounting of losses, probabilistic gains, and probabilistic losses. Experiment 2 (n=98) assessed combined probabilistic and delayed alternatives. In both experiments, we compared results from 5-trial adjusting delay or probability tasks to traditional adjusting amount procedures. Results suggest both procedures produced similar rates of probability and delay discounting in six out of seven comparisons. A magnitude effect consistent with previous research was observed for probabilistic gains and losses, but not for delayed losses. Results also suggest that delay and probability interact to determine the value of money. Five-trial methods may allow researchers to assess discounting more efficiently as well as study more complex choice scenarios. Copyright © 2016 Elsevier B.V. All rights reserved.
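Procedurally, a 5-trial adjusting-delay task is a bisection over an ordered ladder of delays: each choice halves the remaining range, and five trials pin down the delay at which the delayed amount loses half its value. The ladder and wording below are illustrative assumptions (the published task uses a fixed set of delays):

# Minimal sketch of a 5-trial adjusting-delay task.
DELAYS = ["1 hour", "1 day", "1 week", "1 month", "6 months", "1 year",
          "3 years", "10 years", "30 years"]   # illustrative ladder

def five_trial_adjusting_delay(choose):
    """choose(delay) -> True if the subject prefers $500 now over $1000
    after `delay`; returns the estimated half-value delay."""
    lo, hi = 0, len(DELAYS) - 1
    idx = (lo + hi) // 2
    for _ in range(5):
        if choose(DELAYS[idx]):
            hi = idx    # immediate preferred: half-value delay is shorter
        else:
            lo = idx    # delayed preferred: half-value delay is longer
        idx = (lo + hi) // 2
    return DELAYS[idx]

# Simulated subject whose $1000 loses half its value at roughly one year.
prefers_now = {"1 hour": False, "1 day": False, "1 week": False,
               "1 month": False, "6 months": False, "1 year": True,
               "3 years": True, "10 years": True, "30 years": True}
print(five_trial_adjusting_delay(lambda d: prefers_now[d]))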
Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng
2010-10-01
Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs together, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables according to their nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology can use the original site information directly as much as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.
NASA Capability Roadmaps Executive Summary
NASA Technical Reports Server (NTRS)
Willcoxon, Rita; Thronson, Harley; Varsi, Guilio; Mueller, Robert; Regenie, Victoria; Inman, Tom; Crooke, Julie; Coulter, Dan
2005-01-01
This document is the result of eight months of hard work and dedication from NASA, industry, other government agencies, and academic experts from across the nation. It provides a summary of the capabilities necessary to execute the Vision for Space Exploration and the key architecture decisions that drive the direction for those capabilities. This report is being provided to the Exploration Systems Architecture Study (ESAS) team for consideration in the development of an architecture approach and investment strategy to support NASA's future missions, programs, and budget requests. In addition, it will be an excellent reference for NASA's strategic planning. A more detailed set of roadmaps at the technology and sub-capability levels is available on CD. These detailed products include key driving assumptions, capability maturation assessments, and technology and capability development roadmaps.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yun; Cui, Wan-Zhao; Wang, Hong-Guang
2015-05-15
Effects of the secondary electron emission (SEE) phenomenon of metal surfaces on the multipactor analysis of microwave components are investigated numerically and experimentally in this paper. Both secondary electron yield (SEY) and emitted-energy-spectrum measurements are performed on silver-plated samples for accurate description of the SEE phenomenon. A phenomenological probabilistic model based on SEE physics is utilized and fitted accurately to the measured SEY and emitted energy spectrum of the conditioned surface material of microwave components. Specifically, the phenomenological probabilistic model is extended mathematically to the low primary-energy end below 20 eV, since no accurate measurement data can be obtained there. Embedding the phenomenological probabilistic model into the Electromagnetic Particle-In-Cell (EM-PIC) method, the electronic resonant multipacting in microwave components can be tracked and hence the multipactor threshold can be predicted. The threshold prediction error for the transformer and the coaxial filter is 0.12 dB and 1.5 dB, respectively. Simulation results demonstrate that the discharge threshold is strongly dependent on the SEYs and the energy spectrum at the low-energy end (below 50 eV). Multipacting simulation results agree quite well with experiments on practical components, and the phenomenological probabilistic model fits both the SEY and the emission energy spectrum better than the traditionally used model and distribution. The EM-PIC simulation method with the phenomenological probabilistic model for surface-collision simulation has been demonstrated for predicting the multipactor threshold in metal components for space applications.
Staid, Andrea; Watson, Jean-Paul; Wets, Roger J.-B.; ...
2017-07-11
Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or “point” forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power time series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control for the degree to which extreme errors are captured. We compare the performance of our approach to the current state-of-the-art considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Here, our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.
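A sketch of the scenario-generation pipeline, with a Gaussian kernel density estimate deliberately substituted for the paper's epi-spline basis functions; the error history is simulated, errors are sampled independently per hour (no temporal correlation), and all numbers are illustrative:

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(8)

# Historical forecast errors (observed minus forecast, MW), simulated here
# with a skew of the kind seen in real wind data.
hist_err = rng.gamma(2.0, 15.0, 2000) - 45.0

# Non-parametric error density; a KDE stands in for the epi-spline fit.
density = gaussian_kde(hist_err)

# Equal-probability scenarios around a 24-hour point forecast.
point_forecast = 100 + 30 * np.sin(np.linspace(0, np.pi, 24))   # MW
n_scen = 10
errors = density.resample(24 * n_scen, seed=42).reshape(n_scen, 24)
scenarios = np.clip(point_forecast + errors, 0.0, None)
probs = np.full(n_scen, 1.0 / n_scen)

print(scenarios.round(1)[0], probs[0])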
Probabilistic power flow using improved Monte Carlo simulation method with correlated wind sources
NASA Astrophysics Data System (ADS)
Bie, Pei; Zhang, Buhan; Li, Hang; Deng, Weisi; Wu, Jiasi
2017-01-01
Probabilistic Power Flow (PPF) is a very useful tool for power system steady-state analysis. However, correlation among different random power injections (such as wind power) makes PPF difficult to calculate. Monte Carlo simulation (MCS) and analytical methods are two commonly used approaches to solve PPF. MCS has high accuracy but is very time consuming. Analytical methods such as the cumulants method (CM) have high computing efficiency, but calculating the cumulants is not convenient when wind power output does not obey any typical distribution, especially when correlated wind sources are considered. In this paper, an Improved Monte Carlo Simulation (IMCS) method is proposed. A joint empirical distribution is applied to model the output of different wind sources. This method combines the advantages of both MCS and analytical methods: it not only has high computing efficiency, but also provides solutions with sufficient accuracy, which makes it very suitable for on-line analysis.
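The correlation-handling step can be sketched with a Gaussian copula over empirical marginals, which is one simple way to draw correlated wind-farm outputs consistent with a joint empirical description; the histories, the 0.7 rank correlation, and the placeholder for the deterministic power-flow solve are all illustrative:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)

# Historical output samples for two correlated wind farms (MW, illustrative).
hist = rng.weibull(2.0, (2000, 2)) * 50.0

# Target correlation between the farms, factored for copula sampling.
R = np.array([[1.0, 0.7], [0.7, 1.0]])
L = np.linalg.cholesky(R)

# Gaussian copula: correlated normals -> uniforms -> empirical inverse CDFs.
n = 10_000
z = rng.standard_normal((n, 2)) @ L.T
u = norm.cdf(z)
samples = np.column_stack([np.quantile(hist[:, j], u[:, j]) for j in range(2)])

# Placeholder for the per-sample deterministic power-flow solve; a real
# study would run an AC power flow with these injections and collect the
# output distributions of bus voltages and line flows.
total_injection = samples.sum(axis=1)
print(f"sampled output correlation: {np.corrcoef(samples.T)[0, 1]:.2f}")
print(f"95th percentile of combined injection: {np.percentile(total_injection, 95):.1f} MW")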
A Probabilistic Feature Map-Based Localization System Using a Monocular Camera.
Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun
2015-08-31
Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets have become available through the Internet, many studies have estimated location using a pre-built image-based 3D map, and most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization when images and features are insufficient. A more accurate localization method is proposed, based on a probabilistic map using 3D-to-2D matching correspondences between the map and a query image. The probabilistic feature map is generated in advance by probabilistically modeling the sensor system as well as the uncertainties of the camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm then optimizes from this initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in both simulated and real environments. PMID:26404284
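The refinement step described above reduces to a weighted least-squares problem: starting from the PnP pose, minimize reprojection residuals whitened by each map feature's covariance, so the objective is a sum of squared Mahalanobis distances. The covariance source, pose parameterization, and helper names below are illustrative assumptions, not the authors' implementation.

    # Sketch: Mahalanobis-weighted pose refinement from a PnP initialization.
    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation

    def project(K, rvec, t, X):
        """Pinhole projection of 3D points X (N, 3) into pixel coordinates."""
        Xc = Rotation.from_rotvec(rvec).apply(X) + t
        x = Xc[:, :2] / Xc[:, 2:3]
        return (K[:2, :2] @ x.T).T + K[:2, 2]

    def refine_pose(K, rvec0, t0, X, uv_obs, covs):
        """covs: (N, 2, 2) per-feature covariances from the probabilistic map."""
        # Whitening: with Sigma^-1 = L L^T, ||L^T r||^2 = r^T Sigma^-1 r,
        # i.e., the squared Mahalanobis distance of residual r.
        L = np.linalg.cholesky(np.linalg.inv(covs))
        W = L.transpose(0, 2, 1)

        def residuals(p):
            r = uv_obs - project(K, p[:3], p[3:], X)     # (N, 2) pixel errors
            return np.einsum('nij,nj->ni', W, r).ravel()

        sol = least_squares(residuals, np.concatenate([rvec0, t0]))
        return sol.x[:3], sol.x[3:]                      # refined rvec, t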
Development of Probabilistic Flood Inundation Mapping for Flooding Induced by Dam Failure
NASA Astrophysics Data System (ADS)
Tsai, C.; Yeh, J. J. J.
2017-12-01
A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments, and the major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan, where a dam-failure-induced flood provides the upstream boundary condition for flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are the uncertainty in the dam-break model and the uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam-break model and the hydrosystem model to develop probabilistic flood inundation maps: various numbers of uncertain variables can be considered in these models, and the variability of their outputs can be quantified. Probabilistic flood inundation maps for dam-break-induced floods can thus be developed, with the variability of the output quantified, using the widely used HEC-RAS model. Different probabilistic flood inundation maps are discussed and compared, and are expected to provide new physical insight in support of evaluating the reservoir's potential flooded areas.
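In outline, such a map can be produced by Monte Carlo sampling over exactly the two uncertainty sources named above; the sketch below counts, per grid cell, how often a sampled (breach parameter, roughness) pair floods that cell. route_flood is a placeholder for a hydraulic run (e.g., a HEC-RAS simulation driven externally), and all distribution parameters are illustrative assumptions, not values from the study.

    # Sketch: per-cell inundation probability by Monte Carlo over the
    # dam-break parameters and Manning's roughness coefficient.
    import numpy as np

    rng = np.random.default_rng(42)
    n_runs, grid_shape = 500, (200, 300)
    wet_count = np.zeros(grid_shape)

    for _ in range(n_runs):
        manning_n = rng.lognormal(mean=np.log(0.035), sigma=0.25)
        breach_width = rng.normal(loc=120.0, scale=20.0)     # meters, assumed
        depth = route_flood(manning_n, breach_width)         # placeholder run
        wet_count += depth > 0.1             # cells deeper than 10 cm count

    prob_map = wet_count / n_runs            # probabilistic inundation map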
Cryogenic Fluid Management Technology Development Roadmaps
NASA Technical Reports Server (NTRS)
Stephens, J. R.; Johnson, W. L.
2017-01-01
Advancement in Cryogenic Fluid Management (CFM) technologies is essential for achieving NASA's future long-duration missions, for which propulsion systems utilizing cryogens are necessary for mission success. Current State-of-the-Art (SOA) CFM technologies enable cryogenic propellants to be stored for several hours; however, some envisioned mission architectures require cryogens to be stored for two years or longer. The fundamental roles of CFM technologies are long-term storage of cryogens, propellant tank pressure control, and propellant delivery. In the presence of heat, the cryogens "boil off" over time, resulting in excessive pressure buildup, off-nominal propellant conditions, and propellant loss. To achieve long-term storage and tank pressure control, the CFM elements must intercept and/or remove heat from the propulsion system. All functions must perform both with and without a gravitational field. Which CFM technologies are required depends on the cryogens used, the mission architecture, the vehicle design, and the propellant tank size. To enable NASA's crewed mission to the Martian surface, a total of seventeen CFM technologies have been identified to support an In-Space Stage and a Lander/Ascent Vehicle. Recognizing that FY2020 includes a Decision Point regarding the In-Space Stage architecture, a set of CFM Technology Development Roadmaps has been created identifying the current Technology Readiness Level (TRL) of each element, current technology "gaps", and existing technology development efforts. The roadmaps include a methodical approach and schedule to achieve a flight demonstration in FY2023, hence maturing CFM technologies to TRL 7 for infusion into the In-Space Stage Preliminary Design.
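For scale, boil-off reduces to simple arithmetic once the heat leak is known: a steady heat load Q vaporizes propellant at the rate Q divided by the latent heat of vaporization. The numbers below are illustrative assumptions only and are not drawn from the roadmaps.

    # Back-of-envelope boil-off estimate for a steady tank heat leak.
    Q = 15.0            # W, assumed heat leak into the propellant tank
    h_fg = 446e3        # J/kg, approximate latent heat of liquid hydrogen
    mdot = Q / h_fg     # kg/s vaporized
    print(f"boil-off: {mdot * 86400 * 30:.0f} kg/month")  # ~87 kg/month here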
NASA Astrophysics Data System (ADS)
Eversole, K.
2016-12-01
To meet the demands of a global human population expected to exceed 9.6 billion by 2055, crop productivity in sustainable agricultural systems must improve considerably in the face of a steadily changing climate and increased biotic and abiotic stressors. Traditional agricultural sciences have relied mostly on research within individual disciplines and on linear, reductionist approaches to crop improvement. While significant advances have been made in developing and characterizing genetic and genomic resources for crops, we still have a very limited understanding of the genotype × environment × management (GxExM) interactions that determine productivity, sustainability, quality, and the ability to withstand biotic and abiotic stressors. Embracing complexity and the non-linear organization and regulation of biological systems would enable a paradigm shift in breeding and crop production, allowing us to move toward a holistic, systems-level approach that integrates a wide range of disciplines and tools (e.g., geophysics, biology, agronomy, physiology, genomics, genetics, breeding, physics, pattern recognition, feedback loops, modeling, and engineering) and knowledge about crop phytobiomes (i.e., plants, their associated macro- and micro-organisms, and the geophysical environment of distinct geographical sites). By focusing on the phytobiome, we will be able to elucidate, quantify, model, and predict the behavior of cropping systems, to intervene and manage accordingly, and ultimately to prescribe the cropping systems, methods, and management practices best suited to a particular farm, grassland, or forest. The recently released multidisciplinary roadmap entitled Phytobiomes: A Roadmap for Research and Translation and the new International Alliance for Phytobiomes Research, an industry-academic consortium, will be presented.
Donnarumma, Francesco; Maisto, Domenico; Pezzulo, Giovanni
2016-01-01
How do humans and other animals face novel problems for which predefined solutions are not available? Human problem solving links to flexible reasoning and inference rather than to slow trial-and-error learning. It has received considerable attention since the early days of cognitive science, giving rise to well known cognitive architectures such as SOAR and ACT-R, but its computational and brain mechanisms remain incompletely known. Furthermore, it is still unclear whether problem solving is a “specialized” domain or module of cognition, in the sense that it requires computations that are fundamentally different from those supporting perception and action systems. Here we advance a novel view of human problem solving as probabilistic inference with subgoaling. In this perspective, key insights from cognitive architectures are retained, such as the importance of using subgoals to split problems into subproblems. However, here the underlying computations use probabilistic inference methods analogous to those that are increasingly popular in the study of perception and action systems. To test our model we focus on the widely used Tower of Hanoi (ToH) task, and show that our proposed method can reproduce characteristic idiosyncrasies of human problem solvers: their sensitivity to the “community structure” of the ToH and their difficulties in executing so-called “counterintuitive” movements. Our analysis reveals that subgoals have two key roles in probabilistic inference and problem solving. First, prior beliefs on (likely) useful subgoals carve the problem space and define an implicit metric for the problem at hand, a metric to which humans are sensitive. Second, subgoals are used as waypoints in the probabilistic problem solving inference and make it possible to find effective solutions that, when unavailable, lead to problem solving deficits. Our study thus suggests that a probabilistic inference scheme enhanced with subgoals provides a comprehensive framework to study problem solving and its deficits. PMID:27074140
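The structural role of subgoals as waypoints can be illustrated on the Tower of Hanoi itself. The sketch below replaces the paper's probabilistic inference with plain breadth-first search, chaining shortest paths through an assumed subgoal state; it captures only the waypoint idea, not the full model.

    # Sketch: subgoals as waypoints on the Tower of Hanoi state graph.
    # A state is a tuple giving each disk's peg, smallest disk first.
    from collections import deque
    from itertools import product

    def neighbors(state):
        for src, dst in product(range(3), repeat=2):
            if src == dst:
                continue
            on_src = [d for d, peg in enumerate(state) if peg == src]
            if not on_src:
                continue
            top = min(on_src)                  # only the smallest disk moves
            if all(state[d] != dst for d in range(top)):  # no smaller disk on dst
                yield state[:top] + (dst,) + state[top + 1:]

    def bfs(start, goal):
        prev, frontier = {start: None}, deque([start])
        while frontier:
            s = frontier.popleft()
            if s == goal:
                path = []
                while s is not None:
                    path.append(s)
                    s = prev[s]
                return path[::-1]
            for nxt in neighbors(s):
                if nxt not in prev:
                    prev[nxt] = s
                    frontier.append(nxt)

    def plan_via_subgoals(start, subgoals, goal):
        path, here = [start], start
        for waypoint in list(subgoals) + [goal]:
            path += bfs(here, waypoint)[1:]    # chain shortest segments
            here = waypoint
        return path

    # 3-disk example: the assumed subgoal (1, 1, 2) puts the largest disk on
    # the target peg first; the chained plan recovers the 7-move solution.
    moves = plan_via_subgoals((0, 0, 0), [(1, 1, 2)], (2, 2, 2))
    print(len(moves) - 1, "moves")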
The 2016 oxide electronic materials and oxide interfaces roadmap
NASA Astrophysics Data System (ADS)
Lorenz, M.; Ramachandra Rao, M. S.; Venkatesan, T.; Fortunato, E.; Barquinha, P.; Branquinho, R.; Salgueiro, D.; Martins, R.; Carlos, E.; Liu, A.; Shan, F. K.; Grundmann, M.; Boschker, H.; Mukherjee, J.; Priyadarshini, M.; DasGupta, N.; Rogers, D. J.; Teherani, F. H.; Sandana, E. V.; Bove, P.; Rietwyk, K.; Zaban, A.; Veziridis, A.; Weidenkaff, A.; Muralidhar, M.; Murakami, M.; Abel, S.; Fompeyrine, J.; Zuniga-Perez, J.; Ramesh, R.; Spaldin, N. A.; Ostanin, S.; Borisov, V.; Mertig, I.; Lazenka, V.; Srinivasan, G.; Prellier, W.; Uchida, M.; Kawasaki, M.; Pentcheva, R.; Gegenwart, P.; Miletto Granozio, F.; Fontcuberta, J.; Pryds, N.
2016-11-01
Oxide electronic materials provide a plethora of possible applications and offer ample opportunity for scientists to probe into some of the exciting and intriguing phenomena exhibited by oxide systems and oxide interfaces. In addition to the already diverse spectrum of properties, the nanoscale form of oxides provides a new dimension of hitherto unknown phenomena due to the increased surface-to-volume ratio. Oxide electronic materials are becoming increasingly important in a wide range of applications including transparent electronics, optoelectronics, magnetoelectronics, photonics, spintronics, thermoelectrics, piezoelectrics, power harvesting, hydrogen storage and environmental waste management. Synthesis and fabrication of these materials, as well as processing into particular device structures to suit a specific application, is still a challenge. Further, characterization of these materials to understand the tunability of their properties, and the novel properties that evolve due to their nanostructured nature, is another facet of the challenge. Research in the oxide electronics field is at a formative stage, and this has motivated us to contribute a roadmap on ‘oxide electronic materials and oxide interfaces’. This roadmap envisages the potential applications of oxide materials in cutting edge technologies and focuses on the necessary advances required to implement these materials, including both conventional and novel techniques for the synthesis, characterization, processing and fabrication of nanostructured oxides and oxide-based devices. The contents of this roadmap highlight the functional and correlated properties of oxides in bulk, nano, thin film, multilayer and heterostructure forms, as well as the theoretical considerations behind both present and future applications in many technologically important areas, as pointed out by Venkatesan. The contributions in this roadmap span several thematic groups which are represented by the following authors: novel field effect transistors and bipolar devices by Fortunato, Grundmann, Boschker, Rao, and Rogers; energy conversion and saving by Zaban, Weidenkaff, and Murakami; new opportunities of photonics by Fompeyrine and Zuniga-Perez; multiferroic materials including novel phenomena by Ramesh, Spaldin, Mertig, Lorenz, Srinivasan, and Prellier; and concepts for topological oxide electronics by Kawasaki, Pentcheva, and Gegenwart. Finally, Miletto Granozio presents the European action ‘towards oxide-based electronics’, which develops an oxide electronics roadmap with emphasis on future nonvolatile memories and the required technologies. In summary, we hope that this oxide roadmap serves as an interesting, up-to-date snapshot of one of the most exciting and active areas of solid state physics, materials science, and chemistry, a field which, even after many years of very successful development, continues to produce novel insights and achievements at short intervals. Guest editors: M S Ramachandra Rao and Michael Lorenz
Backenroth, Daniel; He, Zihuai; Kiryluk, Krzysztof; Boeva, Valentina; Petukhova, Lynn; Khurana, Ekta; Christiano, Angela; Buxbaum, Joseph D; Ionita-Laza, Iuliana
2018-05-03
We describe a method based on a latent Dirichlet allocation model for predicting functional effects of noncoding genetic variants in a cell-type- and/or tissue-specific way (FUN-LDA). Using this unsupervised approach, we predict tissue-specific functional effects for every position in the human genome in 127 different tissues and cell types. We demonstrate the usefulness of our predictions with several validation experiments. Using eQTL data from several sources, including the GTEx project, the Geuvadis project, and the TwinsUK cohort, we show that eQTLs in specific tissues tend to be most enriched among the predicted functional variants in the relevant Roadmap tissues. We further show how these integrated functional scores can be used for (1) deriving the cell or tissue type most likely to be causally implicated in a complex trait, using summary statistics from genome-wide association studies, and (2) estimating a tissue-based correlation matrix of various complex traits. We found large enrichment of heritability in functional components of relevant tissues for various complex traits, and FUN-LDA yielded higher enrichment estimates than existing methods. Finally, using experimentally validated functional variants from the literature and variants possibly implicated in disease by previous studies, we rigorously compare FUN-LDA with state-of-the-art functional annotation methods and show that FUN-LDA has better prediction accuracy and higher resolution than these methods. In particular, our results suggest that tissue- and cell-type-specific functional prediction methods tend to have substantially better prediction accuracy than organism-level prediction methods. Scores for each position in the human genome and for each ENCODE and Roadmap tissue are available online (see Web Resources).
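The core modeling idea, stripped of the genomics pipeline, is ordinary LDA: each genomic position acts as a "document" whose "words" are the epigenetic marks observed there in a given tissue, and the latent class enriched for active marks supplies a per-position functional score. The sketch below uses scikit-learn on synthetic data; the data, class count, and choice of functional class are placeholder assumptions, not the FUN-LDA pipeline.

    # Sketch: LDA over positions x epigenetic marks, with one latent class
    # treated as "functional" to score positions. Synthetic data throughout.
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    rng = np.random.default_rng(0)
    # Rows: genomic positions; columns: presence of marks in one tissue
    # (e.g., DNase, H3K4me1, H3K4me3, H3K27ac, ...).
    X = rng.integers(0, 2, size=(1000, 8))

    lda = LatentDirichletAllocation(n_components=4, random_state=0).fit(X)
    theta = lda.transform(X)             # (positions, classes) memberships
    functional_class = 0                 # assumed: class enriched in active marks
    scores = theta[:, functional_class]  # tissue-specific functional scores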
Probabilistic analysis of tsunami hazards
Geist, E.L.; Parsons, T.
2006-01-01
Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), the main difference being that PTHA must account for far-field sources. In place of the empirical attenuation relationships PSHA uses to determine ground motions, the computational methods rely on numerical tsunami propagation models. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where sufficient tsunami runup data are available, hazard curves can be derived primarily from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and for sites with little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary means of establishing tsunami hazards. Two case studies describing how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
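The computational branch can be caricatured in a few lines: sample sources from a truncated Gutenberg-Richter law, push each through a propagation model to a site runup, and tabulate annual exceedance rates into a hazard curve. runup_model is a placeholder for the numerical propagation step, and every parameter below is an illustrative assumption.

    # Sketch: Monte Carlo tsunami hazard curve from a synthetic source catalog.
    import numpy as np

    rng = np.random.default_rng(7)
    n_events, years = 20000, 1.0e5          # synthetic catalog and its span
    b, m_min, m_max = 1.0, 7.0, 9.0         # truncated Gutenberg-Richter

    # Inverse-CDF sampling of magnitudes from the truncated G-R distribution.
    u = rng.random(n_events)
    m = m_min - np.log10(1.0 - u * (1.0 - 10.0**(-b * (m_max - m_min)))) / b

    runup = runup_model(m)                  # placeholder propagation model
    levels = np.linspace(0.5, 10.0, 20)     # runup thresholds in meters
    rates = [(runup > h).sum() / years for h in levels]  # annual exceedance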
A model-based test for treatment effects with probabilistic classifications.
Cavagnaro, Daniel R; Davis-Stober, Clintin P
2018-05-21
Within modern psychology, computational and statistical models play an important role in describing a wide variety of human behavior. Model selection analyses are typically used to classify individuals according to the model(s) that best describe their behavior. These classifications are inherently probabilistic, which presents challenges for performing group-level analyses, such as quantifying the effect of an experimental manipulation. We answer this challenge by presenting a method for quantifying treatment effects in terms of distributional changes in model-based (i.e., probabilistic) classifications across treatment conditions. The method uses hierarchical Bayesian mixture modeling to incorporate classification uncertainty at the individual level into the test for a treatment effect at the group level. We illustrate the method with several worked examples, including a reanalysis of the data from Kellen, Mata, and Davis-Stober (2017), and analyze its performance more generally through simulation studies. Our simulations show that the method is both more powerful and less prone to Type I errors than Fisher's exact test when classifications are uncertain. In the special case where classifications are deterministic, we find a near-perfect power-law relationship between the Bayes factor derived from our method and the p value obtained from Fisher's exact test. We provide code in an online supplement that allows researchers to apply the method to their own data.
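The core idea, propagating classification uncertainty into a group-level Bayes factor, can be sketched as follows: each subject carries a probability vector over K model classes; class assignments are integrated out by Monte Carlo and scored with Dirichlet-multinomial marginal likelihoods under H0 (one shared class distribution) versus H1 (one distribution per condition). This is a simplified stand-in for the paper's hierarchical Bayesian mixture model, with all priors and names assumed.

    # Sketch: group-level Bayes factor under probabilistic classifications.
    import numpy as np
    from scipy.special import gammaln

    def dirmult_logml(counts, alpha=1.0):
        """Log marginal likelihood of an exchangeable class sequence under a
        symmetric Dirichlet(alpha) prior on the class distribution."""
        k, n = len(counts), counts.sum()
        return (gammaln(k * alpha) - gammaln(n + k * alpha)
                + np.sum(gammaln(counts + alpha)) - k * gammaln(alpha))

    def bayes_factor(probs_by_cond, n_draws=4000, seed=0):
        """probs_by_cond: list of (n_subjects, K) classification probabilities,
        one array per treatment condition. Returns BF10 (H1 over H0)."""
        rng = np.random.default_rng(seed)
        k = probs_by_cond[0].shape[1]
        log_l0, log_l1 = [], []
        for _ in range(n_draws):
            # Draw each subject's latent class from their classification probs.
            counts = [np.bincount([rng.choice(k, p=p) for p in cond],
                                  minlength=k) for cond in probs_by_cond]
            log_l1.append(sum(dirmult_logml(c) for c in counts))
            log_l0.append(dirmult_logml(sum(counts)))
        # Average likelihoods (not log-likelihoods) via log-sum-exp.
        lse = lambda v: np.logaddexp.reduce(np.array(v)) - np.log(len(v))
        return np.exp(lse(log_l1) - lse(log_l0))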
A New Security Paradigm for Anti-Counterfeiting: Guidelines and an Implementation Roadmap
NASA Astrophysics Data System (ADS)
Lehtonen, Mikko
Product counterfeiting and piracy continue to plague brand and trademark owners across industry sectors. This chapter analyses the reasons for the ineffectiveness of past technical anti-counterfeiting strategies and formulates managerial guidelines for effective use of RFID in anti-counterfeiting. An implementation roadmap toward secure authentication of products tagged with EPC Gen-2 tags is proposed, and possible supply chain locations for product checks are discussed.
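One concrete form of the "secure authentication" end state is a keyed challenge-response exchange between reader and tag. The sketch below shows only the protocol shape, using HMAC-SHA256 with an invented EPC and key store; standard EPC Gen-2 tags do not ship such cryptography, so an authentication-capable tag is assumed.

    # Sketch: challenge-response product authentication (assumed scheme).
    import hmac, hashlib, os, secrets

    # Verifier-side key store, indexed by EPC (example identifier invented).
    tag_keys = {"urn:epc:id:sgtin:0614141.812345.6789": os.urandom(16)}

    def tag_response(key: bytes, challenge: bytes) -> bytes:
        """What an authentication-capable tag would compute on-chip."""
        return hmac.new(key, challenge, hashlib.sha256).digest()

    def verify(epc: str, challenge: bytes, response: bytes) -> bool:
        key = tag_keys.get(epc)
        return key is not None and hmac.compare_digest(
            tag_response(key, challenge), response)

    challenge = secrets.token_bytes(16)        # fresh nonce per check
    epc = "urn:epc:id:sgtin:0614141.812345.6789"
    assert verify(epc, challenge, tag_response(tag_keys[epc], challenge))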