Sample records for deterministic safety technology

  1. Structural Deterministic Safety Factors Selection Criteria and Verification

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1992-01-01

    Though current deterministic safety factors are arbitrarily and unaccountably specified, their ratios are rooted in the probability distributions of resistive and applied stresses. This study approached the deterministic method from a probabilistic concept, leading to a more systematic and coherent philosophy and criterion for designing more uniform and reliable high-performance structures. The deterministic method was noted to consist of three safety factors: a standard deviation multiplier of the applied stress distribution; a K-factor for the A- or B-basis material ultimate stress; and the conventional safety factor to ensure that the applied stress does not operate in the inelastic zone of metallic materials. The conventional safety factor is specifically defined as the ratio of ultimate-to-yield stresses. A deterministic safety index of the combined safety factors was derived, from which the corresponding reliability proved that the deterministic method is not reliability sensitive. The bases for selecting safety factors are presented and verification requirements are discussed. The suggested deterministic approach is applicable to all NASA, DOD, and commercial high-performance structures under static stresses.
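
    A minimal numeric sketch of how the three factors described above combine, and of the reliability implied by the resulting safety index. All stress and strength statistics are illustrative assumptions, not values from the paper:

      # Illustrative sketch (assumed numbers): combine the three deterministic
      # safety factors, then recover the reliability implied by the safety index.
      from statistics import NormalDist

      mu_s, sigma_s = 60.0, 4.0    # assumed applied-stress mean / std dev, ksi
      k_applied = 3.0              # standard-deviation multiplier on applied stress
      mu_u, sigma_u = 100.0, 5.0   # assumed ultimate-stress mean / std dev, ksi
      K_basis = 2.0                # assumed K-factor for an A-/B-basis allowable
      fu_over_fy = 1.25            # conventional factor: ultimate/yield ratio (assumed)

      design_stress = mu_s + k_applied * sigma_s           # 72.0 ksi
      allowable = (mu_u - K_basis * sigma_u) / fu_over_fy  # 72.0 ksi

      # First-order safety index for normal stress S and strength R:
      # beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)
      beta = (mu_u - mu_s) / (sigma_u**2 + sigma_s**2) ** 0.5
      print("design stress:", design_stress, "allowable:", allowable)
      print("beta:", round(beta, 2), "reliability:", NormalDist().cdf(beta))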

  2. Quasi-Static Probabilistic Structural Analyses Process and Criteria

    NASA Technical Reports Server (NTRS)

    Goldberg, B.; Verderaime, V.

    1999-01-01

    Current deterministic structural methods are easily applied to substructures and components, and analysts have built great design insight and confidence in them over the years. However, deterministic methods cannot support systems risk analyses, and it was recently reported that deterministic treatment of statistical data is inconsistent with error propagation laws, which can result in unevenly conservative structural predictions. Assuming normal distributions and using statistical data formats throughout prevailing deterministic stress processes leads to a safety factor in statistical format which, integrated into the safety index, provides a relationship between the safety factor and first-order reliability. The embedded safety factor in the safety index expression allows a historically based risk to be determined and verified over a variety of quasi-static metallic substructures, consistent with traditional safety factor methods and NASA Std. 5001 criteria.
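
    The safety-factor / first-order-reliability relationship the abstract refers to can be sketched as below. The 1.4 ultimate factor is the familiar NASA-STD-5001 value; both coefficients of variation are assumptions chosen only for illustration:

      # Sketch: historically based risk implied by an embedded safety factor.
      from statistics import NormalDist

      SF = 1.4     # deterministic ultimate safety factor (NASA-STD-5001 value)
      V_R = 0.05   # assumed coefficient of variation of strength
      V_S = 0.08   # assumed coefficient of variation of applied stress

      # For normal strength R and stress S with mean ratio SF = mu_R / mu_S:
      # beta = (SF - 1) / sqrt((SF * V_R)**2 + V_S**2)
      beta = (SF - 1.0) / ((SF * V_R) ** 2 + V_S ** 2) ** 0.5
      print(f"beta = {beta:.2f}, P(failure) = {NormalDist().cdf(-beta):.1e}")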

  3. Integrated deterministic and probabilistic safety analysis for safety assessment of nuclear power plants

    DOE PAGES

    Di Maio, Francesco; Zio, Enrico; Smith, Curtis; ...

    2015-07-06

    The present special issue contains an overview of the research in the field of Integrated Deterministic and Probabilistic Safety Assessment (IDPSA) of Nuclear Power Plants (NPPs). Traditionally, safety regulation for NPPs design and operation has been based on Deterministic Safety Assessment (DSA) methods to verify criteria that assure plant safety in a number of postulated Design Basis Accident (DBA) scenarios. Referring to such criteria, it is also possible to identify those plant Structures, Systems, and Components (SSCs) and activities that are most important for safety within those postulated scenarios. Then, the design, operation, and maintenance of these “safety-related” SSCs and activities are controlled through regulatory requirements and supported by Probabilistic Safety Assessment (PSA).

  4. Coupling Legacy and Contemporary Deterministic Codes to Goldsim for Probabilistic Assessments of Potential Low-Level Waste Repository Sites

    NASA Astrophysics Data System (ADS)

    Mattie, P. D.; Knowlton, R. G.; Arnold, B. W.; Tien, N.; Kuo, M.

    2006-12-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years' experience in radioactive waste disposal and is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. International technology transfer efforts are often hampered by small budgets, time schedule constraints, and a lack of experienced personnel in countries with small radioactive waste disposal programs. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available codes and existing legacy codes for probabilistic safety assessment modeling, which facilitates technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission, and codes developed and maintained by the United States Department of Energy, are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software that meets the rigors of both domestic regulatory requirements and international peer review. Therefore, re-vitalization of deterministic legacy codes, as well as adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. External model linkage capabilities in Goldsim, and the techniques applied to facilitate this process, are presented using example applications, including Breach, Leach, and Transport-Multiple Species (BLT-MS), a U.S. NRC-sponsored code simulating release and transport of contaminants from a subsurface low-level waste disposal facility, used in a cooperative technology transfer project between Sandia National Laboratories and Taiwan's Institute of Nuclear Energy Research (INER) for the preliminary assessment of several candidate low-level waste repository sites. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
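
    The coupling pattern described above can be pictured as a probabilistic driver that samples inputs, runs a legacy deterministic code as a black box, and collects the outputs. In this sketch the executable name, input-deck keyword, and output format are hypothetical placeholders, and the infiltration-rate PDF is an assumption:

      import random
      import subprocess

      def run_legacy_code(infiltration_rate: float) -> float:
          """Write an input deck, run the deterministic code, parse the dose."""
          with open("blt_ms.inp", "w") as f:
              f.write(f"INFILTRATION {infiltration_rate:.6e}\n")
          subprocess.run(["./blt_ms", "blt_ms.inp"], check=True)  # hypothetical CLI
          with open("blt_ms.out") as f:
              return float(f.read().split()[-1])                  # hypothetical format

      random.seed(1)
      doses = sorted(
          run_legacy_code(random.lognormvariate(-7.0, 0.5))  # assumed PDF, m/yr
          for _ in range(100)                                # one run per realization
      )
      print("median dose:", doses[50], "95th percentile:", doses[95])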

  5. First Order Reliability Application and Verification Methods for Semistatic Structures

    NASA Technical Reports Server (NTRS)

    Verderaime, Vincent

    1994-01-01

    Escalating risks of aerostructures stimulated by increasing size, complexity, and cost should no longer be ignored by conventional deterministic safety design methods. The deterministic pass-fail concept is incompatible with probability and risk assessments, its stress audits are shown to be arbitrary and incomplete, and it compromises the performance of high-strength materials. A reliability method is proposed which combines first-order reliability principles with deterministic design variables and conventional test techniques to surmount current deterministic stress design and audit deficiencies. Accumulative and propagation design uncertainty errors are defined and appropriately implemented into the classical safety index expression. The application is reduced to solving for a factor that satisfies the specified reliability and compensates for uncertainty errors, and then using this factor as, and instead of, the conventional safety factor in stress analyses. The resulting method is consistent with current analytical skills and verification practices, the culture of most designers, and the pace of semistatic structural designs.
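
    A sketch of the central step: solve for a design factor that satisfies a specified reliability, then use it in place of the conventional safety factor. The coefficients of variation and the reliability target are assumptions for illustration:

      from statistics import NormalDist

      V_R, V_S = 0.05, 0.08                       # assumed coefficients of variation
      beta_req = NormalDist().inv_cdf(0.999999)   # specified reliability (assumed)

      def beta_of(sf: float) -> float:
          # First-order safety index for a mean strength/stress ratio sf
          return (sf - 1.0) / ((sf * V_R) ** 2 + V_S ** 2) ** 0.5

      # Bisection for the factor whose safety index meets the requirement
      lo, hi = 1.0, 10.0
      while hi - lo > 1e-9:
          mid = 0.5 * (lo + hi)
          lo, hi = (mid, hi) if beta_of(mid) < beta_req else (lo, mid)

      print(f"required beta = {beta_req:.2f}, design factor = {lo:.3f}")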

  6. First-order reliability application and verification methods for semistatic structures

    NASA Astrophysics Data System (ADS)

    Verderaime, V.

    1994-11-01

    Escalating risks of aerostructures stimulated by increasing size, complexity, and cost should no longer be ignored in conventional deterministic safety design methods. The deterministic pass-fail concept is incompatible with probability and risk assessments; stress audits are shown to be arbitrary and incomplete, and the concept compromises the performance of high-strength materials. A reliability method is proposed that combines first-order reliability principles with deterministic design variables and conventional test techniques to surmount current deterministic stress design and audit deficiencies. Accumulative and propagation design uncertainty errors are defined and appropriately implemented into the classical safety-index expression. The application is reduced to solving for a design factor that satisfies the specified reliability and compensates for uncertainty errors, and then using this design factor as, and instead of, the conventional safety factor in stress analyses. The resulting method is consistent with current analytical skills and verification practices, the culture of most designers, and the development of semistatic structural designs.

  7. Inherent Conservatism in Deterministic Quasi-Static Structural Analysis

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1997-01-01

    The cause of the long-suspected excessive conservatism in the prevailing structural deterministic safety factor has been identified as an inherent violation of the error propagation laws when reducing statistical data to deterministic values and then combining them algebraically through successive structural computational processes. These errors are restricted to the applied stress computations, and because the means and variations of the tolerance limit format are added, the errors are positive, serially cumulative, and excessively conservative. Reliability methods circumvent these errors and provide more efficient and uniformly safe structures. The document is a tutorial on the deficiencies and nature of the current safety factor and on its improvement and transition to absolute reliability.
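
    The conservatism mechanism the abstract identifies can be shown in a few lines: carrying mean-plus-k-sigma tolerance limits through successive computations stacks the k-sigma margins algebraically, while the error-propagation law combines independent uncertainties in quadrature. The numbers are illustrative assumptions:

      means = [10.0, 25.0, 40.0]   # assumed stress contributions at each step
      sigmas = [1.0, 2.0, 2.5]     # their assumed standard deviations
      k = 3.0                      # tolerance-limit multiplier

      # Deterministic practice: add mean + k*sigma at every step
      stacked = sum(m + k * s for m, s in zip(means, sigmas))    # 91.5

      # Error-propagation law: combine sigmas in quadrature, add k*sigma once
      rss = sum(means) + k * sum(s * s for s in sigmas) ** 0.5   # 85.1

      print("serially stacked limit:", stacked)
      print("RSS-consistent limit:  ", round(rss, 1))
      print("excess conservatism:   ", round(stacked - rss, 1))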

  8. Solving Autonomy Technology Gaps through Wireless Technology and Orion Avionics Architectural Principles

    NASA Astrophysics Data System (ADS)

    Black, Randy; Bai, Haowei; Michalicek, Andrew; Shelton, Blaine; Villela, Mark

    2008-01-01

    Currently, autonomy in space applications is limited by a variety of technology gaps. Innovative application of wireless technology and avionics architectural principles drawn from the Orion crew exploration vehicle provides solutions for several of these gaps. The Vision for Space Exploration envisions extensive use of autonomous systems. Economic realities preclude continuing the level of operator support currently required of autonomous systems in space. In order to decrease the number of operators, more autonomy must be afforded to automated systems. However, certification authorities have been notoriously reluctant to certify autonomous software in the presence of humans or when costly missions may be jeopardized. The Orion avionics architecture, drawn from advanced commercial aircraft avionics, is based upon several architectural principles including partitioning in software. Robust software partitioning provides "brick wall" separation between software applications executing on a single processor, along with controlled data movement between applications. Taking advantage of these attributes, non-deterministic applications can be placed in one partition and a "Safety" application created in a separate partition. This "Safety" partition can track the position of astronauts or critical equipment and prevent any unsafe command from executing. Only the Safety partition need be certified to a human-rated level. As a proof-of-concept demonstration, Honeywell has teamed with the Ultra WideBand (UWB) Working Group at NASA Johnson Space Center to provide tracking of humans, autonomous systems, and critical equipment. Using UWB, the NASA team can determine positioning to a resolution of less than one inch, allowing a Safety partition to halt operation of autonomous systems in the event that an unplanned collision is imminent. Another challenge facing autonomous systems is the coordination of multiple autonomous agents. Current approaches address the issue as one of networking and coordination of multiple independent units, each with its own mission. As a proof of concept, Honeywell is developing and testing various algorithms that lead to a deterministic, fault-tolerant, reliable wireless backplane. Just as advanced avionics systems control several subsystems (actuators, sensors, displays, etc.), a single "master" autonomous agent (or base station computer) could control multiple autonomous systems. The problem is simplified to controlling a flexible body consisting of several sensors and actuators, rather than one of coordinating multiple independent units. By filling technology gaps associated with space-based autonomous systems, wireless technology and Orion architectural principles provide the means for decreasing operational costs and simplifying problems associated with the collaboration of multiple autonomous systems.
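
    A conceptual sketch of the "Safety" partition idea: a small, separately certifiable monitor that tracks positions (e.g., from UWB) and vetoes commands when a collision is imminent. The class, names, and threshold below are hypothetical, not from the Honeywell/NASA demonstration:

      from math import dist

      HALT_DISTANCE_M = 0.5   # assumed keep-out radius

      class SafetyPartition:
          """Only this logic would need certification to a human-rated level."""
          def __init__(self):
              self.tracks = {}                  # object id -> (x, y, z) position

          def update(self, obj_id, position):
              self.tracks[obj_id] = position

          def permit(self, command) -> bool:
              # Veto any command while two tracked objects are too close.
              points = list(self.tracks.values())
              for i in range(len(points)):
                  for j in range(i + 1, len(points)):
                      if dist(points[i], points[j]) < HALT_DISTANCE_M:
                          return False          # halt the autonomous system
              return True

      monitor = SafetyPartition()
      monitor.update("astronaut", (0.0, 0.0, 0.0))
      monitor.update("rover", (0.3, 0.2, 0.0))
      print(monitor.permit("rover.move_forward"))   # False: inside keep-out radius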

  9. Probabilistic safety assessment of the design of a tall building under extreme load

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Králik, Juraj, E-mail: juraj.kralik@stuba.sk

    2016-06-08

    The paper describes some experiences from the deterministic and probabilistic analysis of the safety of a tall building structure. The methods and requirements of Eurocode EN 1990, standard ISO 2394, and the JCSS are presented. The uncertainties of the model and of the resistance of the structures are considered using simulation methods. The Monte Carlo, LHS, and RSM probabilistic methods are compared with the deterministic results. The effectiveness of probabilistic structural design using finite element methods is demonstrated on an example probability analysis of the safety of a tall building.
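
    A small sketch of the kind of comparison described, for a toy limit state g = R - S: plain Monte Carlo versus Latin hypercube sampling (LHS), with the exact normal-theory answer for reference. The distributions are illustrative assumptions, not the paper's finite element model:

      import numpy as np
      from scipy.stats import norm, qmc

      rng = np.random.default_rng(0)
      n = 10_000
      mu_R, sd_R, mu_S, sd_S = 500.0, 40.0, 350.0, 35.0   # assumed, MPa

      # Plain Monte Carlo
      R = rng.normal(mu_R, sd_R, n)
      S = rng.normal(mu_S, sd_S, n)
      pf_mc = np.mean(R - S < 0.0)

      # Latin hypercube: stratified uniforms mapped through the inverse CDFs
      u = qmc.LatinHypercube(d=2, seed=0).random(n)
      pf_lhs = np.mean(norm.ppf(u[:, 0], mu_R, sd_R)
                       - norm.ppf(u[:, 1], mu_S, sd_S) < 0.0)

      # Exact failure probability for independent normals, for reference
      pf_exact = norm.cdf(-(mu_R - mu_S) / np.hypot(sd_R, sd_S))
      print(pf_mc, pf_lhs, pf_exact)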

  10. Probabilistic safety assessment of the design of a tall building under extreme load

    NASA Astrophysics Data System (ADS)

    Králik, Juraj

    2016-06-01

    The paper describes some experiences from the deterministic and probabilistic analysis of the safety of a tall building structure. The methods and requirements of Eurocode EN 1990, standard ISO 2394, and the JCSS are presented. The uncertainties of the model and of the resistance of the structures are considered using simulation methods. The Monte Carlo, LHS, and RSM probabilistic methods are compared with the deterministic results. The effectiveness of probabilistic structural design using finite element methods is demonstrated on an example probability analysis of the safety of a tall building.

  11. Soils: man-caused radioactivity and radiation forecast

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gablin, Vassily

    2007-07-01

    Available in abstract form only. Full text of publication follows: One of the main tasks in guaranteeing radiation safety is preventing critical radiation levels from being exceeded. In Russia, these are man-caused radiation levels. Meanwhile, any radiation measurement represents total radioactivity, which is why it is hard to assess the natural and man-caused contributions to total radioactivity. It is shown that soil radioactivity depends on natural factors, including the radioactivity of rocks and cosmic radiation, as well as man-caused factors, including nuclear and non-nuclear technologies. The whole totality of these factors includes unpredictable (non-deterministic) factors - nuclear explosions and radiation accidents - and predictable (deterministic) ones - all the rest. Deterministic factors represent background radioactivity, whose trend is the basis of the radiation forecast. Non-deterministic factors represent the man-caused radiation treatment contribution, which is to be controlled. This contribution is equal to the difference between measured radioactivity and the radiation background. A way of calculating background radioactivity is proposed. Contemporary soils are complicated, technologically influenced systems with multi-leveled spatial and temporal inhomogeneity of radionuclide distribution. Generally, an analysis area can be characterized by any set of soil radioactivity factors, including natural and man-caused factors. Natural factors are cosmic radiation and the radioactivity of rocks. Man-caused factors are shown in Fig. 1. It is obvious that man-caused radioactivity is due to both artificial and natural emitters. Any result of a radiation measurement represents total radioactivity, i.e., the sum of activities resulting from natural and man-caused emitters. There is no gauge that can separately measure natural and man-caused radioactivity, which is why it is so hard to assess the natural and man-caused contributions to soil radioactivity. It would have been possible if human activity had led to contamination of soil only by artificial radionuclides. But we can view the totality of soil radioactivity factors in the following way. (author)

  12. Universal first-order reliability concept applied to semistatic structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1994-01-01

    A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and a reliability selection criterion. The method provides a reliability design factor, derived from the reliability criterion, which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable to the semistatic structures of air and surface vehicles.

  13. Universal first-order reliability concept applied to semistatic structures

    NASA Astrophysics Data System (ADS)

    Verderaime, V.

    1994-07-01

    A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and a reliability selection criterion. The method provides a reliability design factor, derived from the reliability criterion, which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable to the semistatic structures of air and surface vehicles.

  14. Technological Utopia, Dystopia and Ambivalence: Teaching with Social Media at a South African University

    ERIC Educational Resources Information Center

    Rambe, Patient; Nel, Liezel

    2015-01-01

    The discourse of social media adoption in higher education has often been funnelled through utopian and dystopian perspectives, which are polarised but determinist theorisations of human engagement with educational technologies. Consequently, these determinist approaches have obscured a broadened grasp of the situated, socially constructed nature…

  15. Interrelation Between Safety Factors and Reliability

    NASA Technical Reports Server (NTRS)

    Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    An evaluation was performed to establish the relationship between safety factors and reliability. Results obtained show that the use of safety factors is not contradictory to the employment of probabilistic methods. In many cases the safety factors can be directly expressed by the required reliability levels. However, there is a major difference that must be emphasized: whereas safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. The establishment of the interrelation between the concepts opens an avenue to specify safety factors based on reliability. In cases where there are several forms of failure, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that by the probabilistic methods the existing over-design or under-design can be eliminated. The report includes three parts: Part 1 - Random Actual Stress and Deterministic Yield Stress; Part 2 - Deterministic Actual Stress and Random Yield Stress; Part 3 - Both Actual Stress and Yield Stress Are Random.
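
    The report's three parts can be reproduced as a worked example for normal distributions; the stress and yield statistics below are assumptions chosen only to illustrate how the failure probability P(S > Y) changes between the cases:

      from statistics import NormalDist

      mu_S, sd_S = 300.0, 30.0   # actual stress statistics (assumed)
      mu_Y, sd_Y = 420.0, 25.0   # yield stress statistics (assumed)

      # Part 1: random actual stress, deterministic yield stress
      p1 = 1.0 - NormalDist(mu_S, sd_S).cdf(mu_Y)

      # Part 2: deterministic actual stress, random yield stress
      p2 = NormalDist(mu_Y, sd_Y).cdf(mu_S)

      # Part 3: both random -> the margin M = Y - S is also normal
      p3 = NormalDist(mu_Y - mu_S, (sd_S**2 + sd_Y**2) ** 0.5).cdf(0.0)

      for label, p in [("part 1", p1), ("part 2", p2), ("part 3", p3)]:
          print(label, f"P(failure) = {p:.1e}")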

  16. Development of Deterministic Ethernet Building Blocks for Space Applications

    NASA Astrophysics Data System (ADS)

    Fidi, C.; Jakovljevic, Mirko

    2015-09-01

    The benefits of using commercially based networking standards and protocols have been widely discussed and are expected to include reductions in overall mission cost, shortened integration and test (I&T) schedules, increased operations flexibility, and hardware and software upgradeability/scalability with developments ongoing in the commercial world. The deterministic Ethernet technology TTEthernet [1] deployed on the NASA Orion spacecraft demonstrated the use of TTEthernet for a safety-critical human spaceflight application during Exploration Flight Test 1 (EFT-1). The TTEthernet technology used within the NASA Orion program was matured for that mission but did not lead to broader use in space applications or an international space standard. TTTech has therefore developed a new version that allows the technology to be scaled for different applications, not only high-end missions, decreasing the size of the building blocks and reducing size, weight, and power to enable use in smaller applications. TTTech is currently developing a full space-product offering for its TTEthernet technology to allow use in space applications beyond launchers and human spaceflight. A broad space market assessment and the current ESA TRP7594 led to the development of a space-grade TTEthernet controller ASIC based on the ESA-qualified Atmel AT1C8RHA95 process [2]. In this paper we describe our current development of the TTEthernet controller towards a space-qualified network component that allows future spacecraft to operate in significant radiation environments while using a single onboard network for reliable commanding and data transfer.

  17. Study on the evaluation method for fault displacement based on characterized source model

    NASA Astrophysics Data System (ADS)

    Tonagi, M.; Takahama, T.; Matsumoto, Y.; Inoue, N.; Irikura, K.; Dalguer, L. A.

    2016-12-01

    IAEA Specific Safety Guide (SSG) 9 describes that probabilistic methods for evaluating fault displacement should be used if no sufficient basis is provided to decide conclusively, using the deterministic methodology, that the fault is not capable. In addition, the International Seismic Safety Centre has compiled an annex to SSG-9 on realizing seismic hazard assessment for nuclear facilities and shows the utility of deterministic and probabilistic evaluation methods for fault displacement. In Japan, it is required that important nuclear facilities be established on ground where fault displacement will not arise when earthquakes occur in the future. Under these circumstances, and based on these requirements, we need to develop evaluation methods for fault displacement to enhance safety in nuclear facilities. We are studying deterministic and probabilistic methods with tentative analyses using observed records, such as surface fault displacements and near-fault strong ground motions, of inland crustal earthquakes in which fault displacements arose. In this study, we introduce the concept of evaluation methods for fault displacement. We then show some tentative analysis results for the deterministic method, as follows: (1) For the 1999 Chi-Chi earthquake, referring to the slip distribution estimated by waveform inversion, we construct a characterized source model (Miyake et al., 2003, BSSA) that can explain the observed near-fault broadband strong ground motions. (2) Referring to the characterized source model constructed in (1), we study an evaluation method for surface fault displacement using a hybrid method that combines the particle method and the distinct element method. Finally, we suggest a deterministic method to evaluate fault displacement based on a characterized source model. This research was part of the 2015 research project 'Development of evaluating method for fault displacement' by the Secretariat of the Nuclear Regulation Authority (S/NRA), Japan.

  18. Safety and integrity of pipeline systems - philosophy and experience in Germany

    DOT National Transportation Integrated Search

    1997-01-01

    The design, construction and operation of gas pipeline systems in Germany are subject to the Energy Act and associated regulations. This legal structure is based on a deterministic rather than a probabilistic safety philosophy, consisting of technica...

  19. PROBABILISTIC SAFETY ASSESSMENT OF OPERATIONAL ACCIDENTS AT THE WASTE ISOLATION PILOT PLANT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rucker, D.F.

    2000-09-01

    This report presents a probabilistic safety assessment of radioactive doses as consequences of accident scenarios, to complement the deterministic assessment presented in the Waste Isolation Pilot Plant (WIPP) Safety Analysis Report (SAR). The International Commission on Radiological Protection (ICRP) recommends that both assessments be conducted to ensure that "an adequate level of safety has been achieved and that no major contributors to risk are overlooked" (ICRP 1993). To that end, the probabilistic assessment for the WIPP accident scenarios addresses the wide range of assumptions, e.g., the range of values representing the radioactive source of an accident, that could possibly have been overlooked by the SAR. Routine releases of radionuclides from the WIPP repository to the environment during waste emplacement operations are expected to be essentially zero. In contrast, potential accidental releases from postulated accident scenarios during waste handling and emplacement could be substantial, which necessitates radiological air monitoring and confinement barriers (DOE 1999). The WIPP Safety Analysis Report (SAR) calculated doses from accidental releases to the on-site (at 100 m from the source) and off-site (at the Exclusive Use Boundary and Site Boundary) public by a deterministic approach. This approach, as demonstrated in the SAR, uses single-point values of key parameters to assess the 50-year, whole-body committed effective dose equivalent (CEDE). The basic assumptions used in the SAR to formulate the CEDE are retained for this report's probabilistic assessment. However, for the probabilistic assessment, single-point parameter values were replaced with probability density functions (PDF) and were sampled over an expected range. Monte Carlo simulations were run, in which 10,000 iterations were performed by randomly selecting one value for each parameter and calculating the dose. Statistical information was then derived from the 10,000-iteration batch, including the 5%, 50%, and 95% dose likelihoods and the sensitivity of each assumption to the calculated doses. As one would intuitively expect, the doses from the probabilistic assessment for most scenarios were found to be much less than those from the deterministic assessment. The lower dose of the probabilistic assessment can be attributed to a "smearing" of values from the high and low ends of the PDF spectrum of the various input parameters. The analysis also found a potential weakness in the deterministic analysis used in the SAR: a detail of drum loading was not taken into consideration. Waste emplacement operations thus far have handled drums from each shipment as a single unit, i.e., drums from each shipment are kept together. Shipments typically come from a single waste stream, and therefore the curie loading of each drum can be considered nearly identical to that of its neighbor. Calculations show that if large numbers of drums are used in the accident scenario assessment, e.g., 28 drums in the waste hoist failure scenario (CH5), then the probabilistic dose assessment calculations will diverge from the deterministically determined doses. As currently calculated, the deterministic dose assessment assumes one drum loaded to the maximum allowable (80 PE-Ci) and the remainder at 10% of the maximum. The effective average drum curie content is therefore less in the deterministic assessment than in the probabilistic assessment for a large number of drums. EEG recommends that the WIPP SAR calculations be revisited and updated to include a probabilistic safety assessment.
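
    The drum-loading divergence discussed above is easy to reproduce in outline: replace the single-point source term with a sampled one and compare percentiles against the deterministic value. The drum-content PDF below is an assumed placeholder, not the WIPP SAR parameter set:

      import random

      random.seed(42)
      N_ITER, N_DRUMS, MAX_PE_CI = 10_000, 28, 80.0  # waste hoist scenario (CH5)

      # Deterministic SAR-style source: one drum at the maximum, the rest at 10%
      deterministic = MAX_PE_CI + (N_DRUMS - 1) * 0.1 * MAX_PE_CI

      def sampled_source() -> float:
          # Assumed PDF: each drum uniform between 5% and 100% of the maximum
          return sum(random.uniform(0.05, 1.0) * MAX_PE_CI for _ in range(N_DRUMS))

      sources = sorted(sampled_source() for _ in range(N_ITER))
      print("deterministic source:", deterministic)
      print("5% / 50% / 95% probabilistic source:",
            round(sources[500]), round(sources[5000]), round(sources[9500]))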

  20. Health safety nets can break cycles of poverty and disease: a stochastic ecological model.

    PubMed

    Plucinski, Mateusz M; Ngonghala, Calistus N; Bonds, Matthew H

    2011-12-07

    The persistence of extreme poverty is increasingly attributed to dynamic interactions between biophysical processes and economics, though there remains a dearth of integrated theoretical frameworks that can inform policy. Here, we present a stochastic model of disease-driven poverty traps. Whereas deterministic models can result in poverty traps that can only be broken by substantial external changes to the initial conditions, in the stochastic model there is always some probability that a population will leave or enter a poverty trap. We show that a 'safety net', defined as an externally enforced minimum level of health or economic conditions, can guarantee ultimate escape from a poverty trap, even if the safety net is set within the basin of attraction of the poverty trap, and even if the safety net is only in the form of a public health measure. Whereas the deterministic model implies that small improvements in initial conditions near the poverty-trap equilibrium are futile, the stochastic model suggests that the impact of changes in the location of the safety net on the rate of development may be strongest near the poverty-trap equilibrium.
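
    The qualitative contrast the authors draw can be illustrated with a toy bistable map (not their model): the deterministic dynamics stay in the trap's basin forever, while the same dynamics with noise and an enforced floor eventually escape. Every number below is an assumption:

      import random

      def step(x, noise=0.0, floor=0.0):
          # Toy bistable dynamics: stable states near 0.2 (trap) and 0.8 (healthy)
          drift = 0.5 * (x - 0.2) * (x - 0.5) * (0.8 - x)
          return max(floor, min(1.0, x + drift + noise))

      random.seed(0)
      x_det = x_sto = 0.3               # both start inside the trap's basin
      for _ in range(20_000):
          x_det = step(x_det)                                        # deterministic
          x_sto = step(x_sto, random.gauss(0.0, 0.02), floor=0.35)   # noise + net

      print(f"deterministic: {x_det:.2f} (stays at the 0.2 trap)")
      print(f"with noise and safety net: {x_sto:.2f} (typically escapes toward 0.8)")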

  1. Integrated Arrival and Departure Schedule Optimization Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Xue, Min; Zelinski, Shannon

    2014-01-01

    In terminal airspace, integrating arrivals and departures with shared waypoints provides the potential of improving operational efficiency by allowing direct routes when possible. Incorporating stochastic evaluation as a post-analysis process of deterministic optimization, and imposing a safety buffer in deterministic optimization, are two ways to learn and alleviate the impact of uncertainty and to avoid unexpected outcomes. This work presents a third and direct way to take uncertainty into consideration during the optimization. The impact of uncertainty was incorporated into cost evaluations when searching for the optimal solutions. The controller intervention count was computed using a heuristic model and served as another stochastic cost besides total delay. Costs under uncertainty were evaluated using Monte Carlo simulations. The Pareto fronts that contain a set of solutions were identified and the trade-off between delays and controller intervention count was shown. Solutions that shared similar delays but had different intervention counts were investigated. The results showed that optimization under uncertainty could identify compromise solutions on Pareto fronts, which is better than deterministic optimization with extra safety buffers. It helps decision-makers reduce controller intervention while achieving low delays.
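
    The trade-off described can be sketched by scoring candidate schedules with Monte Carlo draws of arrival-time error: delay is deterministic in the schedule, while the controller-intervention count is a stochastic cost. The separation requirement, noise level, and schedules are all assumed values:

      import random

      random.seed(7)
      SEP = 90.0   # assumed required separation at the shared waypoint, s

      def intervention_rate(sched, n_draws=5000, sigma=20.0):
          count = 0
          for _ in range(n_draws):
              actual = sorted(t + random.gauss(0.0, sigma) for t in sched)
              count += sum(1 for a, b in zip(actual, actual[1:]) if b - a < SEP)
          return count / n_draws   # mean interventions per realization

      earliest = [0.0, 90.0, 180.0]   # minimum-delay reference schedule
      candidates = {"tight": [0.0, 95.0, 190.0], "buffered": [0.0, 130.0, 260.0]}
      for name, sched in candidates.items():
          delay = sum(s - e for s, e in zip(sched, earliest))
          print(f"{name}: delay {delay:.0f} s, "
                f"interventions {intervention_rate(sched):.2f}")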

  2. Efficient room-temperature source of polarized single photons

    DOEpatents

    Lukishova, Svetlana G.; Boyd, Robert W.; Stroud, Carlos R.

    2007-08-07

    An efficient technique for producing deterministically polarized single photons uses liquid-crystal hosts, of either monomeric or oligomeric/polymeric form, to preferentially align the single emitters for maximum excitation efficiency. Deterministic molecular alignment also provides deterministically polarized output photons. Planar-aligned cholesteric liquid crystal hosts serve as 1-D photonic-band-gap microcavities, tunable to the emitter fluorescence band, to increase source efficiency, and liquid crystal technology prevents emitter bleaching. Emitters comprise soluble dyes, inorganic nanocrystals, or trivalent rare-earth chelates.

  3. Health safety nets can break cycles of poverty and disease: a stochastic ecological model

    PubMed Central

    Pluciński, Mateusz M.; Ngonghala, Calistus N.; Bonds, Matthew H.

    2011-01-01

    The persistence of extreme poverty is increasingly attributed to dynamic interactions between biophysical processes and economics, though there remains a dearth of integrated theoretical frameworks that can inform policy. Here, we present a stochastic model of disease-driven poverty traps. Whereas deterministic models can result in poverty traps that can only be broken by substantial external changes to the initial conditions, in the stochastic model there is always some probability that a population will leave or enter a poverty trap. We show that a ‘safety net’, defined as an externally enforced minimum level of health or economic conditions, can guarantee ultimate escape from a poverty trap, even if the safety net is set within the basin of attraction of the poverty trap, and even if the safety net is only in the form of a public health measure. Whereas the deterministic model implies that small improvements in initial conditions near the poverty-trap equilibrium are futile, the stochastic model suggests that the impact of changes in the location of the safety net on the rate of development may be strongest near the poverty-trap equilibrium. PMID:21593026

  4. The Role of Probabilistic Design Analysis Methods in Safety and Affordability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2016-01-01

    For the last several years, NASA and its contractors have been working together to build space launch systems to commercialize space. Developing commercial, affordable, and safe launch systems becomes very important and requires a paradigm shift. This paradigm shift reinforces the need for an integrated systems engineering environment where cost, safety, reliability, and performance are considered together to optimize the launch system design. In such an environment, rule-based and deterministic engineering design practices alone may not be sufficient to optimize margins and fault tolerance to reduce cost. As a result, introducing Probabilistic Design Analysis (PDA) methods to support current deterministic engineering design practices becomes a necessity to reduce cost without compromising reliability and safety. This paper discusses the importance of PDA methods in NASA's new commercial environment, their applications, and the key role they can play in designing reliable, safe, and affordable launch systems. More specifically, this paper discusses: (1) the involvement of NASA in PDA; (2) why PDA is needed; (3) a PDA model structure; (4) a PDA example application; and (5) PDA's link to safety and affordability.

  5. Philosophy of Technology Assumptions in Educational Technology Leadership: Questioning Technological Determinism

    ERIC Educational Resources Information Center

    Webster, Mark David

    2013-01-01

    Scholars have emphasized that decisions about technology can be influenced by philosophy of technology assumptions, and have argued for research that critically questions technological determinist assumptions. Empirical studies of technology management in fields other than K-12 education provided evidence that philosophy of technology assumptions,…

  6. Food safety objective approach for controlling Clostridium botulinum growth and toxin production in commercially sterile foods.

    PubMed

    Anderson, N M; Larkin, J W; Cole, M B; Skinner, G E; Whiting, R C; Gorris, L G M; Rodriguez, A; Buchanan, R; Stewart, C M; Hanlin, J H; Keener, L; Hall, P A

    2011-11-01

    As existing technologies are refined and novel microbial inactivation technologies are developed, there is a growing need for a metric that can be used to judge equivalent levels of hazard control stringency to ensure food safety of commercially sterile foods. A food safety objective (FSO) is an output-oriented metric that designates the maximum level of a hazard (e.g., the pathogenic microorganism or toxin) tolerated in a food at the end of the food supply chain at the moment of consumption, without specifying by which measures the hazard level is controlled. Using a risk-based approach, when the total outcome of controlling initial levels (H(0)), reducing levels (ΣR), and preventing an increase in levels (ΣI) is less than or equal to the target FSO, the product is considered safe. A cross-disciplinary international consortium of specialists from industry, academia, and government was organized with the objective of developing a document to illustrate the FSO approach for controlling Clostridium botulinum toxin in commercially sterile foods. This article outlines the general principles of an FSO risk management framework for controlling C. botulinum growth and toxin production in commercially sterile foods. Topics include historical approaches to establishing commercial sterility; a perspective on the establishment of an appropriate target FSO; a discussion of control of initial levels, reduction of levels, and prevention of an increase in levels of the hazard; and deterministic and stochastic examples that illustrate the impact that various control measure combinations have on the safety of well-established commercially sterile products, and the ways in which variability at all levels of control can heavily influence estimates in the FSO risk management framework. This risk-based framework should encourage development of innovative technologies that result in microbial safety levels equivalent to those achieved with traditional processing methods.
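
    The FSO accounting reduces to a one-line inequality in log10 units, H0 - sum(R) + sum(I) <= FSO; the sketch below uses assumed illustrative values, not a validated process design:

      H0 = 2.0      # initial hazard level, log10 spores/g (assumed)
      R = [12.0]    # reductions, e.g. a 12D thermal process (assumed)
      I = [1.0]     # potential increase during distribution, log10 (assumed)
      FSO = -6.0    # target maximum level at consumption, log10 (assumed)

      outcome = H0 - sum(R) + sum(I)
      verdict = "meets" if outcome <= FSO else "fails"
      print(f"H0 - sum(R) + sum(I) = {outcome:.1f} log10/g; "
            f"FSO = {FSO:.1f}; product {verdict} the objective")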

  7. The past, present and future of cyber-physical systems: a focus on models.

    PubMed

    Lee, Edward A

    2015-02-26

    This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical.

  8. The Past, Present and Future of Cyber-Physical Systems: A Focus on Models

    PubMed Central

    Lee, Edward A.

    2015-01-01

    This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical. PMID:25730486

  9. Experimental demonstration on the deterministic quantum key distribution based on entangled photons.

    PubMed

    Chen, Hua; Zhou, Zhi-Yuan; Zangana, Alaa Jabbar Jumaah; Yin, Zhen-Qiang; Wu, Juan; Han, Yun-Guang; Wang, Shuang; Li, Hong-Wei; He, De-Yong; Tawfeeq, Shelan Khasro; Shi, Bao-Sen; Guo, Guang-Can; Chen, Wei; Han, Zheng-Fu

    2016-02-10

    As an important resource, entangled light sources have been used in developing quantum information technologies such as quantum key distribution (QKD). There are few experiments implementing entanglement-based deterministic QKD protocols, since the security of existing protocols may be compromised in lossy channels. In this work, we report on a loss-tolerant deterministic QKD experiment that follows a modified "Ping-Pong" (PP) protocol. The experimental results demonstrate for the first time that a secure deterministic QKD session can be fulfilled in a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entangled light sources in real-life fiber-based quantum communications.

  10. Experimental demonstration on the deterministic quantum key distribution based on entangled photons

    PubMed Central

    Chen, Hua; Zhou, Zhi-Yuan; Zangana, Alaa Jabbar Jumaah; Yin, Zhen-Qiang; Wu, Juan; Han, Yun-Guang; Wang, Shuang; Li, Hong-Wei; He, De-Yong; Tawfeeq, Shelan Khasro; Shi, Bao-Sen; Guo, Guang-Can; Chen, Wei; Han, Zheng-Fu

    2016-01-01

    As an important resource, entangled light sources have been used in developing quantum information technologies such as quantum key distribution (QKD). There are few experiments implementing entanglement-based deterministic QKD protocols, since the security of existing protocols may be compromised in lossy channels. In this work, we report on a loss-tolerant deterministic QKD experiment that follows a modified “Ping-Pong” (PP) protocol. The experimental results demonstrate for the first time that a secure deterministic QKD session can be fulfilled in a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entangled light sources in real-life fiber-based quantum communications. PMID:26860582

  11. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is therefore inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed in all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor-of-safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. It may be most useful in situations where the design structures are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, the possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the Use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.

  12. Philosophy of Technology Assumptions in Educational Technology Leadership

    ERIC Educational Resources Information Center

    Webster, Mark David

    2017-01-01

    A qualitative study using grounded theory methods was conducted to (a) examine what philosophy of technology assumptions are present in the thinking of K-12 technology leaders, (b) investigate how the assumptions may influence technology decision making, and (c) explore whether technological determinist assumptions are present. Subjects involved…

  13. Illustrated structural application of universal first-order reliability method

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1994-01-01

    The general application of the proposed first-order reliability method was achieved through the universal normalization of engineering probability distribution data. The method superimposes prevailing deterministic techniques and practices on the first-order reliability method to surmount deficiencies of the deterministic method and provide benefits of reliability techniques and predictions. A reliability design factor is derived from the reliability criterion to satisfy a specified reliability and is analogous to the deterministic safety factor. Its application is numerically illustrated on several practical structural design and verification cases with interesting results and insights. Two concepts of reliability selection criteria are suggested. Though the method was developed to support affordable structures for access to space, the method should also be applicable for most high-performance air and surface transportation systems.

  14. Technological Change and HRD. Symposium.

    ERIC Educational Resources Information Center

    2002

    This document contains three papers from a symposium on technological change and human resource development. "New Technologies, Cognitive Demands, and the Implications for Learning Theory" (Richard J. Torraco) identifies four specific characteristics of the tasks involved in using new technologies (contingent versus deterministic tasks,…

  15. Nuclear power and probabilistic safety assessment (PSA): past through future applications

    NASA Astrophysics Data System (ADS)

    Stamatelatos, M. G.; Moieni, P.; Everline, C. J.

    1995-03-01

    Nuclear power reactor safety in the United States is about to enter a new era -- an era of risk-based management and risk-based regulation. First, there was the age of 'prescribed safety assessment,' during which a series of design-basis accidents in eight categories of severity, or classes, were postulated and analyzed. Toward the end of that era, it was recognized that 'Class 9,' or 'beyond design basis,' accidents would need special attention because of the potentially severe health and financial consequences of these accidents. The accident at Three Mile Island showed that sequences of low-consequence, high-frequency events and human errors can be much more risk dominant than the Class 9 accidents. A different form of safety assessment, PSA, emerged and began to gain ground against the deterministic safety establishment. Eventually, this led to the current regulatory requirements for individual plant examinations (IPEs). The IPEs can serve as a basis for risk-based regulation and management, a concept that may ultimately transform the U.S. regulatory process from its traditional deterministic foundations to a process predicated upon PSA. Beyond the possibility of a regulatory environment predicated upon PSA lies the possibility of using PSA as the foundation for managing daily nuclear power plant operations.

  16. Deterministic and efficient quantum cryptography based on Bell's theorem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen Zengbing; Pan Jianwei; Physikalisches Institut, Universitaet Heidelberg, Philosophenweg 12, 69120 Heidelberg

    2006-05-15

    We propose a double-entanglement-based quantum cryptography protocol that is both efficient and deterministic. The proposal uses photon pairs with entanglement both in polarization and in time degrees of freedom; each measurement in which both of the two communicating parties register a photon can establish one and only one perfect correlation, and thus deterministically create a key bit. Eavesdropping can be detected by violation of local realism. A variation of the protocol shows a higher security, similar to the six-state protocol, under individual attacks. Our scheme allows a robust implementation under the current technology.

  17. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    NASA Technical Reports Server (NTRS)

    Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis; ...

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment in the early 1960s originated in U.S. aerospace and missile programs. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. That was, however, not to happen. Early in the Apollo program, estimates of the probability for a successful roundtrip human mission to the moon yielded disappointingly low (and suspect) values and NASA became discouraged from further performing quantitative risk analyses until some two decades later when the methods were more refined, rigorous, and repeatable. Instead, NASA decided to rely primarily on the Hazard Analysis (HA) and Failure Modes and Effects Analysis (FMEA) methods for system safety assessment.

  18. Deterministic ion beam material adding technology for high-precision optical surfaces.

    PubMed

    Liao, Wenlin; Dai, Yifan; Xie, Xuhui; Zhou, Lin

    2013-02-20

    Although ion beam figuring (IBF) provides a highly deterministic method for the precision figuring of optical components, several problems still need to be addressed, such as the limited correcting capability for mid-to-high spatial frequency surface errors and low machining efficiency for pit defects on surfaces. We propose a figuring method named deterministic ion beam material adding (IBA) technology to solve those problems in IBF. The current deterministic optical figuring mechanism, which is dedicated to removing local protuberances on optical surfaces, is enriched and developed by the IBA technology. Compared with IBF, this method can realize the uniform convergence of surface errors, where the particle transferring effect generated in the IBA process can effectively correct the mid-to-high spatial frequency errors. In addition, IBA can rapidly correct the pit defects on the surface and greatly improve the machining efficiency of the figuring process. The verification experiments are accomplished on our experimental installation to validate the feasibility of the IBA method. First, a fused silica sample with a rectangular pit defect is figured by using IBA. Through two iterations within only 47.5 min, this highly steep pit is effectively corrected, and the surface error is improved from the original 24.69 nm root mean square (RMS) to the final 3.68 nm RMS. Then another experiment is carried out to demonstrate the correcting capability of IBA for mid-to-high spatial frequency surface errors, and the final results indicate that the surface accuracy and surface quality can be simultaneously improved.

  19. The Coevolution of Society and Multimedia Technology: Issues in Predicting the Future Innovation and Use of a Ubiquitous Technology.

    ERIC Educational Resources Information Center

    Stewart, James; Williams, Robin

    1998-01-01

    Criticizes "technologically deterministic" approaches, which seek to extrapolate social change from technological potential. Shows how a three-layer model of component, system, and application technologies can be used to integrate findings from the use and development of technology in specific sectors. Examines three cases of…

  20. Monte Carlo capabilities of the SCALE code system

    DOE PAGES

    Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; ...

    2014-09-12

    SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. Finally, an overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.

  1. Safety design approach for external events in Japan sodium-cooled fast reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamano, H.; Kubo, S.; Tani, A.

    2012-07-01

    This paper describes a safety design approach for external events in the design study of the Japan sodium-cooled fast reactor. An emphasis is the introduction of a design extension external condition (DEEC). In addition to seismic design, other external events such as tsunami, strong wind, abnormal temperature, etc. were addressed in this study. From a wide variety of external events consisting of natural hazards and human-induced ones, a screening method was developed in terms of siting, consequence, and frequency to select representative events. Design approaches for these events were categorized on a probabilistic, statistical, or deterministic basis. External hazard conditions were considered mainly for DEECs. In the probabilistic approach, the DEECs of earthquake, tsunami, and strong wind were defined at 1/10 of the exceedance probability of the external design bases. The other representative DEECs were also defined based on statistical or deterministic approaches. (authors)
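
    A tiny sketch of the probabilistic DEEC definition above, with assumed example frequencies:

      # DEEC = 1/10 of the exceedance probability of the external design basis
      design_basis = {"earthquake": 1e-4, "tsunami": 1e-4, "strong wind": 1e-3}
      for event, p in design_basis.items():
          print(f"{event}: DEEC exceedance probability = {p / 10.0:.0e} per year")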

  2. Making Sense of Young People, Education and Digital Technology: The Role of Sociological Theory

    ERIC Educational Resources Information Center

    Selwyn, Neil

    2012-01-01

    This paper considers the contribution of sociological theory to the academic study of young people, education and digital technology. First it discusses the shortcomings of the technological and socially determinist views of technology and education that prevail in current academic and policy discussions. Against this background the paper outlines…

  3. Development of a framework for the assessment of capacity and throughput technologies within the National Airspace System

    NASA Astrophysics Data System (ADS)

    Garcia, Elena

    The demand for air travel is expanding beyond the capacity of the existing National Airspace System. Excess traffic results in delays and compromised safety. Thus, a number of initiatives to improve airspace capacity have been proposed. To assess the impact of these technologies on air traffic, one must move beyond the vehicle to a system-of-systems point of view. This top-level perspective must include consideration of the aircraft, airports, air traffic control, and airlines that make up the airspace system. In addition to these components and their interactions, economics, safety, and government regulations must also be considered. Furthermore, the air transportation system is inherently variable, with changes in everything from fuel prices to the weather. The development of a modeling environment that enables a comprehensive probabilistic evaluation of technological impacts was the subject of this thesis. The final modeling environment used economics as the thread tying the airspace components together. Airport capacities and delays were calculated explicitly with due consideration of the impacts of air traffic control. The delay costs were then calculated for an entire fleet, and an airline economic analysis, considering the impact of these costs, was carried out. Airline return on investment was considered the metric of choice since it brings together all costs and revenues, including the cost of delays, landing fees for airport use, and aircraft financing costs. Safety was found to require a level of detail unsuitable for a system-of-systems approach and was relegated to future airspace studies. Environmental concerns were considered to be incorporated into airport regulations and procedures and were not explicitly modeled. A deterministic case study was developed to test this modeling environment. The Atlanta airport operations for the year 2000 were used for validation purposes. A 2005 baseline was used as the basis for comparing the four technologies considered: a very large aircraft, Terminal Area Productivity air traffic control technologies, smoothing of an airline schedule, and the addition of a runway. A case including all four technologies simultaneously was also considered. Unfortunately, the complexity of the system prevented full exploration of the probabilistic aspects of the National Airspace System.

  4. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.
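
    A toy version of the probabilistic strategic analysis described above (launch probability, mission count, and the retry rule are invented for illustration): each planned mission either flies or slips, and the resulting distribution of campaign durations is compared with the deterministic plan.

      import numpy as np

      rng = np.random.default_rng(1)
      n_missions, p_launch_ok = 6, 0.9   # assumed campaign size and launch success probability
      nominal_gap, slip = 12.0, 6.0      # months between missions; delay per failed attempt

      def campaign_duration():
          t = 0.0
          for _ in range(n_missions):
              while rng.random() > p_launch_ok:  # contingency rule: retry after a slip
                  t += slip
              t += nominal_gap
          return t

      runs = np.array([campaign_duration() for _ in range(20_000)])
      print("deterministic plan:", n_missions * nominal_gap, "months")
      print("mean:", round(runs.mean(), 1), " 95th percentile:", round(np.percentile(runs, 95), 1))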

  5. NASA's Human Research Program at The Glenn Research Center: Progress and Opportunities

    NASA Technical Reports Server (NTRS)

    Nall, Marsha; Griffin, DeVon; Myers, Jerry; Perusek, Gail

    2008-01-01

    The NASA Human Research Program is aimed at correcting problems in critical areas that place NASA human spaceflight missions at risk due to shortfalls in astronaut health, safety and performance. The Glenn Research Center (GRC) and partners from Ohio are significant contributors to this effort. This presentation describes several areas of GRC emphasis, the first being NASA's path to creating exercise hardware requirements and protocols that mitigate the effects of long duration spaceflight. Computational simulations will be a second area that is discussed. This includes deterministic models that simulate the effects of spaceflight on the human body, as well as probabilistic models that bound and quantify the probability that adverse medical incidents will happen during an exploration mission. Medical technology development for exploration will be the final area to be discussed.

  6. RDS - A systematic approach towards system thermal hydraulics input code development for a comprehensive deterministic safety analysis

    NASA Astrophysics Data System (ADS)

    Salim, Mohd Faiz; Roslan, Ridha; Ibrahim, Mohd Rizal Mamat @

    2014-02-01

    Deterministic Safety Analysis (DSA) is one of the mandatory requirements in the Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements are applied for the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analyses under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED) developed by the University of Pisa. Based on this methodology, the use of a Reference Data Set (RDS) as a prerequisite reference document for developing input nodalization was proposed. This paper describes the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed for the LOBI-MOD2 Integral Test Facility and the associated Test A1-83. Data and information from various reports and drawings were referenced in preparing the RDS. The results showed that developing an RDS makes it possible to consolidate all relevant information in a single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes the resolution of contradictions, and assists in developing the thermal-hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, language barriers in accessing foreign information, and limited resources. Some possible improvements are suggested to overcome these challenges.

  7. Effect of quantum noise on deterministic remote state preparation of an arbitrary two-particle state via various quantum entangled channels

    NASA Astrophysics Data System (ADS)

    Qu, Zhiguo; Wu, Shengyao; Wang, Mingming; Sun, Le; Wang, Xiaojun

    2017-12-01

    As one of the important research branches of quantum communication, deterministic remote state preparation (DRSP) plays a significant role in quantum networks. Quantum noises are prevalent in quantum communication, and they can seriously affect the safety and reliability of a quantum communication system. In this paper, we study the effect of quantum noise on the deterministic remote state preparation of an arbitrary two-particle state via different quantum channels, including the χ state, Brown state, and GHZ state. Firstly, the output states and fidelities of three DRSP algorithms via different quantum entangled channels in four noisy environments, including amplitude-damping, phase-damping, bit-flip, and depolarizing noise, are presented, respectively. Then, the effects of noises on the three kinds of preparation algorithms in the same noisy environment are discussed. Finally, the theoretical analysis proves that the effect of noise in the process of quantum state preparation is related only to the noise type and the size of the noise factor, and is independent of the different entangled quantum channels. Furthermore, another important conclusion is given: the effect of noise is also independent of how the intermediate particles are distributed for implementing DRSP through quantum measurement during the concrete preparation process. These conclusions will be very helpful for improving the efficiency and safety of quantum communication in a noisy environment.
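
    The amplitude-damping case mentioned above acts through Kraus operators; a single-qubit sketch (the record concerns two-particle states, so this shows only the basic ingredient, with an assumed noise factor) computes the output fidelity of the |+⟩ state after the channel:

      import numpy as np

      gamma = 0.3  # assumed noise factor
      K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - gamma)]])
      K1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])

      psi = np.array([1.0, 1.0]) / np.sqrt(2.0)       # |+> input state
      rho = np.outer(psi, psi.conj())
      rho_out = K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

      fidelity = np.real(psi.conj() @ rho_out @ psi)  # <psi|rho_out|psi> for a pure input
      print(fidelity)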

  8. Characterizing Uncertainty and Variability in PBPK Models ...

    EPA Pesticide Factsheets

    Mode-of-action based risk and safety assessments can rely upon tissue dosimetry estimates in animals and humans obtained from physiologically-based pharmacokinetic (PBPK) modeling. However, risk assessment also increasingly requires characterization of uncertainty and variability; such characterization for PBPK model predictions represents a continuing challenge to both modelers and users. Current practices show significant progress in specifying deterministic biological models and the non-deterministic (often statistical) models, estimating their parameters using diverse data sets from multiple sources, and using them to make predictions and characterize uncertainty and variability. The International Workshop on Uncertainty and Variability in PBPK Models, held Oct 31-Nov 2, 2006, sought to identify the state of the science in this area and recommend priorities for research and changes in practice and implementation. For the short term, these include: (1) multidisciplinary teams to integrate deterministic and non-deterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through more complete documentation of the model structure(s) and parameter values, the results of sensitivity and other analyses, and supporting, discrepant, or excluded data. Longer-term needs include: (1) theoretic and practical methodological improvements…

  9. An overview of engineering concepts and current design algorithms for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Duffy, S. F.; Hu, J.; Hopkins, D. A.

    1995-01-01

    The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored, and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials), the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed-form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
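
    The closed-form Weibull result mentioned above can be written down and cross-checked by Monte Carlo in a few lines (the Weibull modulus and characteristic strength are assumed): at applied stress σ, the failure probability is P_f = 1 - exp[-(σ/σ₀)^m].

      import numpy as np

      m, sigma0 = 10.0, 300.0   # assumed Weibull modulus and characteristic strength (MPa)
      applied = 200.0           # applied stress (MPa)

      pf_closed = 1.0 - np.exp(-(applied / sigma0) ** m)

      rng = np.random.default_rng(7)
      strengths = sigma0 * rng.weibull(m, size=1_000_000)  # sampled component strengths
      pf_mc = np.mean(strengths < applied)
      print(pf_closed, pf_mc)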

  10. Real-time logic modelling on SpaceWire

    NASA Astrophysics Data System (ADS)

    Zhou, Qiang; Ma, Yunpeng; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. However, it cannot meet the deterministic requirements of safety/time-critical applications in spacecraft, where the delay of real-time (RT) message streams must be guaranteed. Therefore, SpaceWire-D has been developed to provide deterministic delivery over a SpaceWire network. Formal analysis and verification of real-time systems is critical to their development and safe implementation, and is a prerequisite for obtaining their safety certification. Failure to meet specified timing constraints such as deadlines in hard real-time systems may lead to catastrophic results. In this paper, a formal verification method, Real-Time Logic (RTL), is proposed to specify and verify the timing properties of a SpaceWire-D network. Based on the principles of the SpaceWire-D protocol, we first analyze the timing properties of fundamental transactions, such as RMAP WRITE and RMAP READ. After that, the RMAP WRITE transaction structure is modeled in RTL and Presburger Arithmetic representations, and the associated constraint graph and safety analysis are provided. Finally, it is suggested that the RTL method can be useful for protocol evaluation and for providing recommendations for further protocol evolution.
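
    In RTL, timing properties are stated over event-occurrence times; schematically, a deadline requirement of the kind checked for an RMAP WRITE transaction can be written as follows (the event names and the deadline D are placeholders, with @(e, i) denoting the time of the i-th occurrence of event e):

      \forall i:\quad @(\mathrm{WriteReply}, i) \;\le\; @(\mathrm{WriteCmd}, i) + D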

  11. Positive dwell time algorithm with minimum equal extra material removal in deterministic optical surfacing technology.

    PubMed

    Li, Longxiang; Xue, Donglin; Deng, Weijie; Wang, Xu; Bai, Yang; Zhang, Feng; Zhang, Xuejun

    2017-11-10

    In deterministic computer-controlled optical surfacing, accurate dwell time execution by computer numeric control machines is crucial in guaranteeing a high convergence ratio for the optical surface error. It is necessary to consider machine dynamics limitations in numerical dwell time algorithms. In this paper, these constraints on the dwell time distribution are analyzed, and a model of equal extra material removal is established. A positive dwell time algorithm with minimum equal extra material removal is developed. Results of simulations based on deterministic magnetorheological finishing demonstrate the necessity of considering machine dynamics performance and illustrate the validity of the proposed algorithm. Indeed, the algorithm effectively improves the determinism of sub-aperture optical surfacing processes.
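
    The non-negativity constraint at the heart of such algorithms can be sketched with plain non-negative least squares (this is not the paper's minimum-equal-extra-removal algorithm; the Gaussian tool influence function and target removal are toys): dwell times t ≥ 0 are solved from R t ≈ e.

      import numpy as np
      from scipy.optimize import nnls

      n = 100
      x = np.arange(n)
      # Toy removal matrix: a Gaussian tool influence function centered at each dwell position.
      R = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 3.0) ** 2)
      error = 1.0 + 0.5 * np.sin(2 * np.pi * x / 25.0)  # desired removal depth (arbitrary units)

      t, residual = nnls(R, error)  # dwell times constrained to be non-negative
      print(t.min() >= 0.0, residual)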

  12. The Aeronautical Data Link: Decision Framework for Architecture Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Goode, Plesent W.

    2003-01-01

    A decision analytic approach that develops optimal data link architecture configuration and behavior to meet multiple conflicting objectives of concurrent and different airspace operations functions has previously been developed. The approach, premised on a formal taxonomic classification that correlates data link performance with operations requirements, information requirements, and implementing technologies, provides a coherent methodology for data link architectural analysis from top-down and bottom-up perspectives. This paper follows the previous research by providing more specific approaches for mapping and transitioning between the lower levels of the decision framework. The goal of the architectural analysis methodology is to assess the impact of specific architecture configurations and behaviors on the efficiency, capacity, and safety of operations. This necessarily involves understanding the various capabilities, system-level performance issues, and performance and interface concepts related to the conceptual purpose of the architecture and to the underlying data link technologies. Efficient and goal-directed data link architectural network configuration is conditioned on quantifying the risks and uncertainties associated with complex structural interface decisions. Deterministic and stochastic optimal design approaches that maximize the effectiveness of architectural designs will be discussed.

  13. The Aeronautical Data Link: Taxonomy, Architectural Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Goode, Plesent W.

    2002-01-01

    The future Communication, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) System will rely on global satellite navigation and on ground-based and satellite-based communications via Multi-Protocol Networks (e.g., the combined Aeronautical Telecommunications Network (ATN)/Internet Protocol (IP)) to bring about needed improvements in the efficiency and safety of operations to meet increasing levels of air traffic. This paper discusses the development of an approach that completely describes optimal data link architecture configuration and behavior to meet the multiple conflicting objectives of concurrent and different operations functions. The practical application of the approach enables the design and assessment of configurations relative to airspace operations phases. The approach includes a formal taxonomic classification, an architectural analysis methodology, and optimization techniques. The formal taxonomic classification provides a multidimensional correlation of data link performance with data link service, information protocol, spectrum, and technology mode, and with flight operations phase and environment. The architectural analysis methodology assesses the impact of a specific architecture configuration and behavior on local ATM system performance. Deterministic and stochastic optimization techniques maximize architectural design effectiveness while addressing operational, technology, and policy constraints.

  14. Active temporal multiplexing of indistinguishable heralded single photons

    PubMed Central

    Xiong, C.; Zhang, X.; Liu, Z.; Collins, M. J.; Mahendra, A.; Helt, L. G.; Steel, M. J.; Choi, D. -Y.; Chae, C. J.; Leong, P. H. W.; Eggleton, B. J.

    2016-01-01

    It is a fundamental challenge in quantum optics to deterministically generate indistinguishable single photons through non-deterministic nonlinear optical processes, due to the intrinsic coupling of single- and multi-photon-generation probabilities in these processes. Actively multiplexing photons generated in many temporal modes can decouple these probabilities, but key issues are to minimize resource requirements to allow scalability, and to ensure indistinguishability of the generated photons. Here we demonstrate the multiplexing of photons from four temporal modes solely using fibre-integrated optics and off-the-shelf electronic components. We show a 100% enhancement to the single-photon output probability without introducing additional multi-photon noise. Photon indistinguishability is confirmed by a fourfold Hong–Ou–Mandel quantum interference with a 91±16% visibility after subtracting multi-photon noise due to high pump power. Our demonstration paves the way for scalable multiplexing of many non-deterministic photon sources to a single near-deterministic source, which will be of benefit to future quantum photonic technologies. PMID:26996317
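
    The decoupling logic is a one-line calculation (idealized lossless switching; the numbers are assumed): with the per-mode heralding probability p kept low to suppress multi-photon events, multiplexing N temporal modes raises the single-photon output probability toward 1 - (1 - p)^N.

      p, N = 0.05, 4           # assumed per-mode heralding probability; 4 temporal modes
      p_mux = 1 - (1 - p) ** N
      print(p_mux, p_mux / p)  # ideal enhancement; switching losses reduce this in practice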

  15. Recent progress in the assembly of nanodevices and van der Waals heterostructures by deterministic placement of 2D materials.

    PubMed

    Frisenda, Riccardo; Navarro-Moratalla, Efrén; Gant, Patricia; Pérez De Lara, David; Jarillo-Herrero, Pablo; Gorbachev, Roman V; Castellanos-Gomez, Andres

    2018-01-02

    Designer heterostructures can now be assembled layer-by-layer with unmatched precision thanks to the recently developed deterministic placement methods to transfer two-dimensional (2D) materials. This possibility constitutes the birth of a very active research field on the so-called van der Waals heterostructures. Moreover, these deterministic placement methods also open the door to fabricate complex devices, which would be otherwise very difficult to achieve by conventional bottom-up nanofabrication approaches, and to fabricate fully-encapsulated devices with exquisite electronic properties. The integration of 2D materials with existing technologies such as photonic and superconducting waveguides and fiber optics is another exciting possibility. Here, we review the state-of-the-art of the deterministic placement methods, describing and comparing the different alternative methods available in the literature, and we illustrate their potential to fabricate van der Waals heterostructures, to integrate 2D materials into complex devices and to fabricate artificial bilayer structures where the layers present a user-defined rotational twisting angle.

  16. Deterministic secure quantum communication using a single d-level system.

    PubMed

    Jiang, Dong; Chen, Yuanyuan; Gu, Xuemei; Xie, Ling; Chen, Lijun

    2017-03-22

    Deterministic secure quantum communication (DSQC) can transmit secret messages between two parties without first generating a shared secret key. Compared with quantum key distribution (QKD), DSQC avoids the waste of qubits arising from basis reconciliation and thus reaches higher efficiency. In this paper, based on data block transmission and order rearrangement technologies, we propose a DSQC protocol. It utilizes a set of single d-level systems as message carriers, which are used to directly encode the secret message in one communication process. Theoretical analysis shows that these employed technologies guarantee the security, and the use of a higher dimensional quantum system makes our protocol achieve higher security and efficiency. Since only quantum memory is required for implementation, our protocol is feasible with current technologies. Furthermore, Trojan horse attack (THA) is taken into account in our protocol. We give a THA model and show that THA significantly increases the multi-photon rate and can thus be detected.

  17. Deterministic implementation of a bright, on-demand single photon source with near-unity indistinguishability via quantum dot imaging.

    PubMed

    He, Yu-Ming; Liu, Jin; Maier, Sebastian; Emmerling, Monika; Gerhardt, Stefan; Davanço, Marcelo; Srinivasan, Kartik; Schneider, Christian; Höfling, Sven

    2017-07-20

    Deterministic techniques enabling the implementation and engineering of bright and coherent solid-state quantum light sources are key for the reliable realization of a next generation of quantum devices. Such a technology, at best, should allow one to significantly scale up the number of implemented devices within a given processing time. In this work, we discuss a possible technology platform for such a scaling procedure, relying on the application of nanoscale quantum dot imaging to the pillar microcavity architecture, which promises to combine very high photon extraction efficiency and indistinguishability. We discuss the alignment technology in detail, and present the optical characterization of a selected device which features a strongly Purcell-enhanced emission output. This device, which yields an extraction efficiency of η = (49 ± 4) %, facilitates the emission of photons with (94 ± 2.7) % indistinguishability.
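
    The Purcell enhancement invoked above follows the standard cavity-QED expression (quoted for orientation, not taken from the record), for an emitter ideally positioned and resonant with a cavity of quality factor Q and mode volume V:

      F_P = \frac{3}{4\pi^{2}} \left(\frac{\lambda}{n}\right)^{3} \frac{Q}{V}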

  18. Scientific and Technological Progress, Political Beliefs and Environmental Sustainability

    ERIC Educational Resources Information Center

    Makrakis, Vassilios

    2012-01-01

    With the development of science and technology, a basically optimistic ideology of progress has emerged. This deterministic attitude has been challenged in recent decades as a result of harmful side-effects generated by the way technology and science have been approached and used. The study presented here is a part of a larger international and…

  19. Damage prognosis: the future of structural health monitoring.

    PubMed

    Farrar, Charles R; Lieven, Nick A J

    2007-02-15

    This paper concludes the theme issue on structural health monitoring (SHM) by discussing the concept of damage prognosis (DP). DP attempts to forecast system performance by assessing the current damage state of the system (i.e. SHM), estimating the future loading environments for that system, and predicting through simulation and past experience the remaining useful life of the system. The successful development of a DP capability will require the further development and integration of many technology areas including both measurement/processing/telemetry hardware and a variety of deterministic and probabilistic predictive modelling capabilities, as well as the ability to quantify the uncertainty in these predictions. The multidisciplinary and challenging nature of the DP problem, its current embryonic state of development, and its tremendous potential for life-safety and economic benefits qualify DP as a 'grand challenge' problem for engineers in the twenty-first century.

  20. Impact of Passive Safety on FHR Instrumentation Systems Design and Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holcomb, David Eugene

    2015-01-01

    Fluoride salt-cooled high-temperature reactors (FHRs) will rely more extensively on passive safety than earlier reactor classes. 10CFR50 Appendix A, General Design Criteria for Nuclear Power Plants, establishes minimum design requirements to provide reasonable assurance of adequate safety. 10CFR50.69, Risk-Informed Categorization and Treatment of Structures, Systems and Components for Nuclear Power Reactors, provides guidance on how the safety significance of systems, structures, and components (SSCs) should be reflected in their regulatory treatment. The Nuclear Energy Institute (NEI) has provided 10 CFR 50.69 SSC Categorization Guideline (NEI-00-04) that factors in probabilistic risk assessment (PRA) model insights, as well as deterministic insights, through an integrated decision-making panel. Employing the PRA to inform deterministic requirements enables an appropriately balanced, technically sound categorization to be established. No FHR currently has an adequate PRA or set of design basis accidents to enable establishing the safety classification of its SSCs. While all SSCs used to comply with the general design criteria (GDCs) will be safety related, the intent is to limit the instrumentation risk significance through effective design and reliance on inherent passive safety characteristics. For example, FHRs have no safety-significant temperature threshold phenomena, thus enabling the primary and reserve reactivity control systems required by GDC 26 to be passively, thermally triggered at temperatures well below those for which core or primary coolant boundary damage would occur. Moreover, the passive thermal triggering of the primary and reserve shutdown systems may relegate the control rod drive motors to the control system, substantially decreasing the amount of safety-significant wiring needed. Similarly, FHR decay heat removal systems are intended to be running continuously to minimize the amount of safety-significant instrumentation needed to initiate operation of systems and components important to safety as required in GDC 20. This paper provides an overview of the design process employed to develop a pre-conceptual FHR instrumentation architecture intended to lower plant capital and operational costs by minimizing reliance on expensive, safety related, safety-significant instrumentation through the use of inherent passive features of FHRs.

  1. A hybrid multi-objective imperialist competitive algorithm and Monte Carlo method for robust safety design of a rail vehicle

    NASA Astrophysics Data System (ADS)

    Nejlaoui, Mohamed; Houidi, Ajmi; Affi, Zouhaier; Romdhane, Lotfi

    2017-10-01

    This paper deals with the robust safety design optimization of a rail vehicle system moving on short-radius curved tracks. A combined multi-objective imperialist competitive algorithm and Monte Carlo method is developed and used for the robust multi-objective optimization of the rail vehicle system. This robust optimization of rail vehicle safety simultaneously considers the derailment angle and its standard deviation, where uncertainties in the design parameters are taken into account. The results showed that the robust design significantly reduces the sensitivity of rail vehicle safety to design parameter uncertainties compared to the deterministic design and to literature results.
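
    A sketch of the robust objective used in such formulations (the response function below is a stand-in; the actual study evaluates derailment through a rail-vehicle dynamics model): each candidate design is scored by both the mean and the standard deviation of the safety metric under Monte Carlo sampling of the uncertain parameters.

      import numpy as np

      rng = np.random.default_rng(3)

      def derailment_angle(design, params):
          # Stand-in response model, not the paper's vehicle dynamics.
          return (1.0 + design) / (1.0 + params)

      def robust_score(design, n=10_000):
          params = rng.normal(1.0, 0.1, size=n)  # uncertain design parameters (assumed spread)
          angles = derailment_angle(design, params)
          return angles.mean(), angles.std()     # both objectives are minimized jointly

      for d in (0.2, 0.5, 0.8):
          print(d, robust_score(d))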

  2. Challenges Ahead for Nuclear Facility Site-Specific Seismic Hazard Assessment in France: The Alternative Energies and the Atomic Energy Commission (CEA) Vision

    NASA Astrophysics Data System (ADS)

    Berge-Thierry, C.; Hollender, F.; Guyonnet-Benaize, C.; Baumont, D.; Ameri, G.; Bollinger, L.

    2017-09-01

    Seismic analysis in the context of nuclear safety in France is currently guided by a purely deterministic approach based on Basic Safety Rule (Règle Fondamentale de Sûreté) RFS 2001-01 for seismic hazard assessment, and on the ASN/2/01 Guide that provides design rules for nuclear civil engineering structures. After the 2011 Tohoku earthquake, nuclear operators worldwide were asked to estimate the ability of their facilities to sustain extreme seismic loads. The French licensees then defined the 'hard core seismic levels', which are higher than those considered for the design or re-assessment of the safety of a facility. These were initially established on a deterministic basis, and they have been finally justified through state-of-the-art probabilistic seismic hazard assessments. The appreciation and propagation of uncertainties when assessing seismic hazard in France have changed considerably over the past 15 years. This evolution provided the motivation for the present article, the objectives of which are threefold: (1) to provide a description of the current practices in France for assessing seismic hazard in terms of nuclear safety; (2) to discuss and highlight the sources of uncertainties and their treatment; and (3) to use a specific case study to illustrate how extended source modeling can help to constrain the key assumptions or parameters that impact seismic hazard assessment. This article discusses in particular seismic source characterization, strong ground motion prediction, and maximum magnitude constraints, according to the practice of the French Atomic Energy Commission. Owing to the growth of strong-motion databases in terms of the number and quality of records, their metadata, and uncertainty characterization, several recently published empirical ground motion prediction models are eligible for seismic hazard assessment in France. We show that the propagation of epistemic and aleatory uncertainties is feasible in a deterministic approach as well as in a probabilistic one. Assessment of seismic hazard in France in the framework of the safety of nuclear facilities should consider these recent advances. In this sense, the opening of discussions with all of the stakeholders in France to update the reference documents (i.e., RFS 2001-01 and the ASN/2/01 Guide) appears appropriate in the short term.
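
    The probabilistic assessments referred to above rest on the standard hazard integral (textbook form), which combines the activity rate ν_i of each source with magnitude and distance distributions and a ground-motion prediction model:

      \lambda(IM > x) \;=\; \sum_{i} \nu_{i} \iint P(IM > x \mid m, r)\, f_{M_{i}}(m)\, f_{R_{i}}(r)\, \mathrm{d}m\, \mathrm{d}r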

  3. Deterministic transfection drives efficient nonviral reprogramming and uncovers reprogramming barriers.

    PubMed

    Gallego-Perez, Daniel; Otero, Jose J; Czeisler, Catherine; Ma, Junyu; Ortiz, Cristina; Gygli, Patrick; Catacutan, Fay Patsy; Gokozan, Hamza Numan; Cowgill, Aaron; Sherwood, Thomas; Ghatak, Subhadip; Malkoc, Veysi; Zhao, Xi; Liao, Wei-Ching; Gnyawali, Surya; Wang, Xinmei; Adler, Andrew F; Leong, Kam; Wulff, Brian; Wilgus, Traci A; Askwith, Candice; Khanna, Savita; Rink, Cameron; Sen, Chandan K; Lee, L James

    2016-02-01

    Safety concerns and/or the stochastic nature of current transduction approaches have hampered nuclear reprogramming's clinical translation. We report a novel non-viral nanotechnology-based platform permitting deterministic large-scale transfection with single-cell resolution. The superior capabilities of our technology are demonstrated by modification of the well-established direct neuronal reprogramming paradigm using overexpression of the transcription factors Brn2, Ascl1, and Myt1l (BAM). Reprogramming efficiencies were comparable to viral methodologies (up to ~9-12%) without the constraints of capsid size and with the ability to control plasmid dosage, in addition to showing superior performance relative to existing non-viral methods. Furthermore, increased neuronal complexity could be tailored by varying the BAM ratio and by adding further proneural genes to the BAM cocktail. High-throughput NEP (nanochannel electroporation) also allowed easy interrogation of the reprogramming process. We discovered that BAM-mediated reprogramming is regulated by Ascl1 dosage and by the S-phase cyclin CCNA2, and that some induced neurons passed through a nestin-positive cell stage. In the field of regenerative medicine, the ability to direct cell fate by nuclear reprogramming is an important facet of clinical application. In this article, the authors describe their novel technique of cell reprogramming through overexpression of the transcription factors Brn2, Ascl1, and Myt1l (BAM) by in situ electroporation through nanochannels. This new technique could provide a platform for future designs.

  4. Deterministic bead-in-droplet ejection utilizing an integrated plug-in bead dispenser for single bead-based applications

    NASA Astrophysics Data System (ADS)

    Kim, Hojin; Choi, In Ho; Lee, Sanghyun; Won, Dong-Joon; Oh, Yong Suk; Kwon, Donghoon; Sung, Hyung Jin; Jeon, Sangmin; Kim, Joonwon

    2017-04-01

    This paper presents a deterministic bead-in-droplet ejection (BIDE) technique that regulates the precise distribution of microbeads in an ejected droplet. The deterministic BIDE was realized through the effective integration of a microfluidic single-particle handling technique with a liquid dispensing system. The integrated bead dispenser facilitates the transfer of the desired number of beads into a dispensing volume and the on-demand ejection of bead-encapsulated droplets. Single bead-encapsulated droplets were ejected every 3 s without any failure. Multiple-bead dispensing with deterministic control of the number of beads was demonstrated to emphasize the originality and quality of the proposed dispensing technique. The dispenser was mounted using a plug-socket type connection, and the dispensing process was completely automated using a programmed sequence without any microscopic observation. To demonstrate a potential application of the technique, a bead-based streptavidin-biotin binding assay in an evaporating droplet was conducted using ultralow numbers of beads. The results showed that the number of beads in the droplet crucially influences the reliability of the assay. Therefore, the proposed deterministic bead-in-droplet technology can be utilized to deliver desired beads onto a reaction site, particularly to reliably and efficiently enrich and detect target biomolecules.

  7. Current issues and actions in radiation protection of patients.

    PubMed

    Holmberg, Ola; Malone, Jim; Rehani, Madan; McLean, Donald; Czarwinski, Renate

    2010-10-01

    Medical application of ionizing radiation is a massive and increasing activity globally. While the use of ionizing radiation in medicine brings tremendous benefits to the global population, the associated risks due to stochastic and deterministic effects make it necessary to protect patients from potential harm. Current issues in radiation protection of patients include not only the rapidly increasing collective dose to the global population from medical exposure, but also that a substantial percentage of diagnostic imaging examinations are unnecessary, and the cumulative dose to individuals from medical exposure is growing. In addition to this, continued reports of deterministic injuries from safety-related events in the medical use of ionizing radiation are raising awareness of the necessity for accident prevention measures. The International Atomic Energy Agency is engaged in several activities to reverse the negative trends of these current issues, including improvement of the justification process, the tracking of the radiation history of individual patients, shared learning from safety-significant events, and the use of comprehensive quality audits in the clinical environment.

  8. On-line crack prognosis in attachment lug using Lamb wave-deterministic resampling particle filter-based method

    NASA Astrophysics Data System (ADS)

    Yuan, Shenfang; Chen, Jian; Yang, Weibo; Qiu, Lei

    2017-08-01

    Fatigue crack growth prognosis is important for prolonging service time, improving safety, and reducing maintenance cost in many safety-critical systems, such as aircraft, wind turbines, bridges, and nuclear plants. Combining fatigue crack growth models with the particle filter (PF) method has proved promising for dealing with the uncertainties during fatigue crack growth and reaching a more accurate prognosis. However, research on prognosis methods integrating on-line crack monitoring with the PF method is still lacking, as are experimental verifications. Besides, the PF methods adopted so far are almost all sequential importance resampling-based PFs, which usually encounter sample impoverishment problems and hence perform poorly. To solve these problems, in this paper, the piezoelectric transducer (PZT)-based active Lamb wave method is adopted for on-line crack monitoring. The deterministic resampling PF (DRPF), which can overcome the sample impoverishment problem, is proposed for fatigue crack growth prognosis. The proposed method is verified through fatigue tests of attachment lugs, an important kind of joint component in aerospace systems.
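
    A compact sketch of the idea (Paris-law growth propagated by a particle filter; systematic resampling stands in here for the paper's deterministic resampling step, and every constant is invented):

      import numpy as np

      rng = np.random.default_rng(5)
      n_p, m, C, cycles = 500, 3.0, 1e-10, 350  # particles; assumed Paris-law parameters

      def growth(a):
          # Toy Paris law da/dN = C * dK^m with dK proportional to sqrt(a).
          return C * (100.0 * np.sqrt(np.pi * a)) ** m

      a = 1.0 + rng.normal(0.0, 0.05, n_p)  # initial crack-length particles (mm)
      for a_meas in (1.2, 1.5, 1.9):        # toy Lamb-wave crack estimates
          a += growth(a) * cycles + rng.normal(0.0, 0.02, n_p)  # predict one block of cycles
          w = np.exp(-0.5 * ((a_meas - a) / 0.1) ** 2)          # measurement update
          w /= w.sum()
          edges = (rng.random() + np.arange(n_p)) / n_p         # systematic resampling
          a = a[np.searchsorted(np.cumsum(w), edges)]
          print(f"measured {a_meas:.2f} mm -> filtered estimate {a.mean():.2f} mm")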

  9. Development of a First-of-a-Kind Deterministic Decision-Making Tool for Supervisory Control System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Sacit M; Kisner, Roger A; Muhlheim, Michael David

    2015-07-01

    Decision-making is the process of identifying and choosing alternatives where each alternative offers a different approach or path to move from a given state or condition to a desired state or condition. The generation of consistent decisions requires that a structured, coherent process be defined, immediately leading to a decision-making framework. The overall objective of the generalized framework is for it to be adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks reliably. Risk-informed decision making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The implementation of the probabilistic portion of the decision-making engine of the proposed supervisory control system was detailed in previous milestone reports. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic multi-attribute decision-making framework uses variable sensor data (e.g., outlet temperature) and calculates where it is within the challenge state, its trajectory, and margin within the controllable domain using utility functions to evaluate current and projected plant state space for different control decisions. Metrics to be evaluated include stability, cost, time to complete (action), power level, etc. The integration of deterministic calculations using multi-physics analyses (i.e., neutronics, thermal, and thermal-hydraulics) and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies. This also provides validation of the control options identified from the probabilistic assessment. Thus, the thermal-hydraulics analyses are used to validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies.
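
    A sketch of the multi-attribute step (the three candidate actions, attributes, and weights are hypothetical): each control option that survives the probabilistic ranking is re-scored with utility functions over plant-state attributes such as stability margin, time to complete, and cost.

      # Hypothetical options: (name, stability margin [0-1], hours to complete, relative cost [0-1])
      options = [("reduce power 10%", 0.8, 1.0, 0.3),
                 ("start standby pump", 0.9, 0.5, 0.5),
                 ("controlled shutdown", 1.0, 6.0, 1.0)]
      weights = (0.5, 0.3, 0.2)  # assumed importance of margin, speed, and cost

      def utility(margin, hours, cost):
          u_margin = margin              # larger stability margin is better
          u_speed = 1.0 / (1.0 + hours)  # faster actions preferred
          u_cost = 1.0 - cost            # cheaper actions preferred
          return weights[0] * u_margin + weights[1] * u_speed + weights[2] * u_cost

      for name, m, h, c in sorted(options, key=lambda o: -utility(*o[1:])):
          print(f"{name}: {utility(m, h, c):.3f}")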

  10. Scattering effects of machined optical surfaces

    NASA Astrophysics Data System (ADS)

    Thompson, Anita Kotha

    1998-09-01

    Optical fabrication is one of the most labor-intensive industries in existence. Lensmakers use pitch to affix glass blanks to metal chucks that hold the glass as they grind it with tools that have not changed much in fifty years. Recent demands placed on traditional optical fabrication processes in terms of surface accuracy, smoothness, and cost effectiveness have resulted in the exploitation of precision machining technology to develop a new generation of computer numerically controlled (CNC) optical fabrication equipment. This new kind of precision machining process is called deterministic microgrinding. The most conspicuous feature of optical surfaces manufactured by precision machining processes (such as single-point diamond turning or deterministic microgrinding) is the presence of residual cutting tool marks. These residual tool marks exhibit a highly structured topography of periodic azimuthal or radial deterministic marks in addition to random microroughness. These distinct topographic features give rise to surface scattering effects that can significantly degrade optical performance. In this dissertation project we investigate the scattering behavior of machined optical surfaces and their imaging characteristics. In particular, we will characterize the residual optical fabrication errors and relate the resulting scattering behavior to the tool and machine parameters in order to evaluate and improve the deterministic microgrinding process. Other desired information derived from the investigation of scattering behavior is the optical fabrication tolerances necessary to satisfy specific image quality requirements. Optical fabrication tolerances are a major cost driver for any precision optical manufacturing technology. The derivation and control of the optical fabrication tolerances necessary for different applications and operating wavelength regimes will play a unique and central role in establishing deterministic microgrinding as a preferred and cost-effective optical fabrication process. Other well-understood optical fabrication processes will also be reviewed, and a performance comparison with the conventional grinding and polishing technique will be made to determine any inherent advantages in the optical quality of surfaces produced by other techniques.
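
    For the random-roughness contribution of such surfaces, scattering loss is commonly summarized by the smooth-surface total integrated scatter relation (standard result, stated here for context), with σ the RMS roughness, θ_i the angle of incidence, and λ the wavelength:

      \mathrm{TIS} \;\approx\; \left(\frac{4\pi\sigma\cos\theta_{i}}{\lambda}\right)^{2}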

  12. The concerted calculation of the BN-600 reactor for the deterministic and stochastic codes

    NASA Astrophysics Data System (ADS)

    Bogdanova, E. V.; Kuznetsov, A. N.

    2017-01-01

    The solution of the problem of increasing the safety of nuclear power plants implies the existence of complete and reliable information about the processes occurring in the core of a working reactor. Nowadays, the Monte Carlo method is the most general-purpose method used to calculate the neutron-physical characteristics of a reactor, but it is characterized by long calculation times. Therefore, it may be useful to carry out coupled calculations with stochastic and deterministic codes. This article presents the results of research on the possibility of combining stochastic and deterministic algorithms in calculations of the BN-600 reactor. This is only one part of the work, which was carried out in the framework of a graduation project at the NRC “Kurchatov Institute” in cooperation with S. S. Gorodkov and M. A. Kalugin. It considers a 2-D layer of the BN-600 reactor core from the international benchmark test published in the report IAEA-TECDOC-1623. Calculations of the reactor were performed with the MCU code and then with a standard operative diffusion algorithm with constants taken from the Monte Carlo computation. Macroscopic cross-sections, diffusion coefficients, the effective multiplication factor, and the distributions of neutron flux and power were obtained in 15 energy groups. Reasonable agreement between the stochastic and deterministic calculations of the BN-600 is observed.
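
    A minimal sketch of the deterministic half of such a coupled scheme (one-group, 1-D slab diffusion solved by power iteration; the constants are placeholders for group data that would come from the Monte Carlo step):

      import numpy as np

      D, sig_a, nu_sig_f = 1.2, 0.03, 0.035  # assumed one-group constants (cm, 1/cm, 1/cm)
      n, h = 100, 1.0                        # mesh cells and cell width (cm)

      # Finite-difference diffusion operator with zero-flux boundary conditions.
      A = np.zeros((n, n))
      for i in range(n):
          A[i, i] = 2.0 * D / h**2 + sig_a
          if i > 0:
              A[i, i - 1] = -D / h**2
          if i < n - 1:
              A[i, i + 1] = -D / h**2

      phi, k = np.ones(n), 1.0
      for _ in range(300):                   # power iteration on the fission source
          phi_new = np.linalg.solve(A, nu_sig_f * phi / k)
          k *= (nu_sig_f * phi_new).sum() / (nu_sig_f * phi).sum()
          phi = phi_new / phi_new.max()      # renormalize the flux shape
      print(f"k_eff = {k:.4f}")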

  13. Impact of refining the assessment of dietary exposure to cadmium in the European adult population.

    PubMed

    Ferrari, Pietro; Arcella, Davide; Heraud, Fanny; Cappé, Stefano; Fabiansson, Stefan

    2013-01-01

    Exposure assessment constitutes an important step in any risk assessment of potentially harmful substances present in food. The European Food Safety Authority (EFSA) first assessed dietary exposure to cadmium in Europe using a deterministic framework, resulting in mean values of exposure in the range of health-based guidance values. Since then, the characterisation of foods has been refined to better match occurrence and consumption data, and a new strategy to handle left-censoring in occurrence data was devised. A probabilistic assessment was performed and compared with deterministic estimates, using occurrence values at the European level and consumption data from 14 national dietary surveys. Mean estimates in the probabilistic assessment ranged from 1.38 (95% CI = 1.35-1.44) to 2.08 (1.99-2.23) µg kg⁻¹ bodyweight (bw) week⁻¹ across the different surveys, which were less than 10% lower than deterministic (middle bound) mean values that ranged from 1.50 to 2.20 µg kg⁻¹ bw week⁻¹. Probabilistic 95th percentile estimates of dietary exposure ranged from 2.65 (2.57-2.72) to 4.99 (4.62-5.38) µg kg⁻¹ bw week⁻¹, which were, with the exception of one survey, between 3% and 17% higher than middle-bound deterministic estimates. Overall, the proportion of subjects exceeding the tolerable weekly intake of 2.5 µg kg⁻¹ bw ranged from 14.8% (13.6-16.0%) to 31.2% (29.7-32.5%) according to the probabilistic assessment. The results of this work indicate that mean values of dietary exposure to cadmium in the European population were of similar magnitude using deterministic or probabilistic assessments. For higher exposure levels, probabilistic estimates were almost consistently larger than their deterministic counterparts, thus reflecting the impact of using the full distribution of occurrence values to determine exposure levels. It is considered prudent to use probabilistic methodology should exposure estimates be close to or exceeding health-based guidance values.
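
    A toy version of the probabilistic calculation (all three distributions are invented; the real assessment combines survey consumption data with European occurrence data food by food): weekly exposure per kilogram of bodyweight is simulated and compared against the tolerable weekly intake of 2.5 µg kg⁻¹ bw.

      import numpy as np

      rng = np.random.default_rng(11)
      n = 100_000
      consumption = rng.lognormal(np.log(2.0), 0.5, n)      # kg food per week (toy)
      occurrence = rng.lognormal(np.log(20.0), 0.6, n)      # µg Cd per kg food (toy)
      bodyweight = rng.normal(70.0, 12.0, n).clip(40, 120)  # kg

      exposure = consumption * occurrence / bodyweight      # µg per kg bw per week
      print("mean:", round(exposure.mean(), 2),
            "95th pct:", round(np.percentile(exposure, 95), 2),
            "% above TWI:", round(100.0 * np.mean(exposure > 2.5), 1))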

  14. Digital Practices and Literacy Identities in English Education: From Deterministic Discourses to a Dialectic Framework

    ERIC Educational Resources Information Center

    Ortega, Leticia E.

    2008-01-01

    My research explores the challenges and questions that pre-service teachers in two English Education programs confronted with respect to the role of technology in their professional practices and identities. It is evident from the data that the decision to incorporate different technologies in their professional practices implied much more than…

  15. Computers in the Undergraduate Curriculum: An Aspect of the Many Section Problem.

    ERIC Educational Resources Information Center

    Churchill, Geoffrey

    A brief case study of the resistance to technological change is presented using DOG, a small-scale deterministic business game, as the example of technology. DOG, a decision mathematics game for the purpose of providing an environment for the application of mathematical concepts, consists of assignments mostly utilizing matrix algebra but also some…

  16. USING A RISK-BASED METHODOLOGY FOR THE TRANSFER OF RADIOACTIVE MATERIAL WITHIN THE SAVANNAH RIVER SITE BOUNDARY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loftin, B; Watkins, R; Loibl, M

    2010-06-03

    Shipment of radioactive materials (RAM) is discussed in the Code of Federal Regulations in parts of both 49 CFR and 10 CFR. The regulations provide the requirements and rules necessary for the safe shipment of RAM across public highways, railways, waterways, and through the air. These shipments are sometimes referred to as in-commerce shipments. Shipments of RAM entirely within the boundaries of Department of Energy sites, such as the Savannah River Site (SRS), can be made using methodology allowing provisions to maintain equivalent safety while deviating from the regulations for in-commerce shipments. These onsite shipments are known as transfers at the SRS. These transfers must follow the requirements approved in a site-specific Transportation Safety Document (TSD). The TSD defines how the site will transfer materials so that they have equivalence to the regulations. These equivalences are documented in an Onsite Safety Assessment (OSA). The OSA can show how a particular packaging used onsite is equivalent to that which would be used for an in-commerce shipment. This is known as a deterministic approach. However, when a deterministic approach is not viable, the TSD allows for a risk-based OSA to be written. These risk-based assessments show that if a packaging does not provide the necessary safety to ensure that materials are not released (during normal or accident conditions), then the worst-case release of materials does not result in a dose consequence worse than that defined for the SRS. This paper will discuss recent challenges and successes using this methodology at the SRS.

  17. Deterministic generation of remote entanglement with active quantum feedback

    DOE PAGES

    Martin, Leigh; Motzoi, Felix; Li, Hanhan; ...

    2015-12-10

    We develop and study protocols for deterministic remote entanglement generation using quantum feedback, without relying on an entangling Hamiltonian. In order to formulate the most effective experimentally feasible protocol, we introduce the notion of average-sense locally optimal feedback protocols, which do not require real-time quantum state estimation, a difficult component of real-time quantum feedback control. We use this notion of optimality to construct two protocols that can deterministically create maximal entanglement: a semiclassical feedback protocol for low-efficiency measurements and a quantum feedback protocol for high-efficiency measurements. The latter reduces to direct feedback in the continuous-time limit, whose dynamics can be modeled by a Wiseman-Milburn feedback master equation, which yields an analytic solution in the limit of unit measurement efficiency. Our formalism can smoothly interpolate between continuous-time and discrete-time descriptions of feedback dynamics and we exploit this feature to derive a superior hybrid protocol for arbitrary nonunit measurement efficiency that switches between quantum and semiclassical protocols. Lastly, we show using simulations incorporating experimental imperfections that deterministic entanglement of remote superconducting qubits may be achieved with current technology using the continuous-time feedback protocol alone.
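
    For reference, the feedback master equation alluded to above is commonly written, for homodyne-mediated Markovian feedback at unit measurement efficiency with measurement operator c and feedback Hamiltonian F (standard Wiseman-Milburn form, with D[A]ρ = AρA† - ½{A†A, ρ}):

      \dot{\rho} = -i\left[H + \tfrac{1}{2}\left(c^{\dagger}F + F c\right),\, \rho\right] + \mathcal{D}[c - iF]\,\rho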

  18. Design of Critical Components

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Zaretsky, Erwin V.

    2001-01-01

    Critical component design is based on minimizing product failures that result in loss of life. Potential catastrophic failures are reduced to secondary failures when components are removed for cause or after a set operating time in the system. Issues of liability and cost of component removal become of paramount importance. Deterministic design with factors of safety and probabilistic design each address, but individually lack, the essential characteristics for the design of critical components. In deterministic design and fabrication there are heuristic rules and safety factors developed over time for large sets of structural/material components. These factors did not come without cost. Many designs failed, and many rules (codes) have standing committees to oversee their proper usage and enforcement. In probabilistic design, not only are failures a given, the failures are calculated; an element of risk is assumed based on empirical failure data for large classes of component operations. Failure of a class of components can be predicted, yet one cannot predict when a specific component will fail. The analogy is to the life insurance industry, where careful statistics are kept on classes of individuals. For a specific class, life span can be predicted within statistical limits, yet the life span of a specific member of that class cannot be predicted.

  19. Integrated Risk-Informed Decision-Making for an ALMR PRISM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muhlheim, Michael David; Belles, Randy; Denning, Richard S.

    Decision-making is the process of identifying decision alternatives, assessing those alternatives based on predefined metrics, selecting an alternative (i.e., making a decision), and then implementing that alternative. The generation of decisions requires a structured, coherent decision-making process. The overall objective of this work is for the generalized framework to be adopted into an autonomous decision-making framework and tailored to the specific requirements of various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks rationally, consistently, and reliably. Risk-informed decision-making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The probabilistic portion of the decision-making engine of the supervisory control system is based on the control actions associated with an ALMR PRISM. Newly incorporated into the probabilistic models are the prognostic/diagnostic models developed by Pacific Northwest National Laboratory. These allow decisions to incorporate the health of components into the decision-making process. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic portion of the decision-making engine uses thermal-hydraulic modeling and components for an advanced liquid-metal reactor Power Reactor Inherently Safe Module. The deterministic multi-attribute decision-making framework uses various sensor data (e.g., reactor outlet temperature, steam generator drum level) and calculates its position within the challenge state, its trajectory, and its margin within the controllable domain using utility functions to evaluate current and projected plant state space for different control decisions. The metrics that are evaluated are based on reactor trip set points. The integration of the deterministic calculations using multi-physics analyses and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies; the thermal-hydraulic analyses thus also validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies, and developing a user interface to mimic display panels at a modern nuclear power plant.

  20. Research on Intelligent Synthesis Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Lobeck, William E.

    2002-01-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  1. Research on Intelligent Synthesis Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.; Loftin, R. Bowen

    2002-12-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  2. Chaos emerging in soil failure patterns observed during tillage: Normalized deterministic nonlinear prediction (NDNP) and its application.

    PubMed

    Sakai, Kenshi; Upadhyaya, Shrinivasa K; Andrade-Sanchez, Pedro; Sviridova, Nina V

    2017-03-01

    Real-world processes are often combinations of deterministic and stochastic processes. Soil failure observed during farm tillage is one example of this phenomenon. In this paper, we investigated the nonlinear features of soil failure patterns in a farm tillage process. We demonstrate emerging determinism in soil failure patterns from stochastic processes under specific soil conditions. We normalize the deterministic nonlinear prediction considering autocorrelation and propose it as a robust way of extracting a nonlinear dynamical system from noise-contaminated motion. Soil is a typical granular material. The results obtained here are expected to be applicable to granular materials in general. From the global scale to the nano scale, granular materials feature in seismology, geotechnology, soil mechanics, and particle technology. The results and discussions presented here are applicable in these wide research areas. The proposed method and our findings are useful with respect to the application of nonlinear dynamics to investigate complex motions generated from granular materials.
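
    A minimal sketch of the generic deterministic nonlinear prediction step underlying methods like the one above (a Sugihara-May style nearest-neighbour predictor; the paper's specific NDNP normalization by autocorrelation is only indicated in a comment, and all names here are illustrative):

    ```python
    import numpy as np

    def delay_embed(x, dim, tau=1):
        """Time-delay embedding of a scalar series into dim-dimensional vectors."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

    def nonlinear_prediction_skill(x, dim=3, horizon=1, k=4):
        """Predict each point 'horizon' steps ahead from the futures of its k
        nearest neighbours in embedding space; return the correlation between
        predicted and actual values (high skill suggests determinism)."""
        x = np.asarray(x, dtype=float)
        emb = delay_embed(x, dim)
        n = len(emb) - horizon
        preds, actual = [], []
        for i in range(n):
            d = np.linalg.norm(emb[:n] - emb[i], axis=1)
            d[i] = np.inf                                  # exclude the point itself
            nbrs = np.argsort(d)[:k]
            preds.append(x[nbrs + (dim - 1) + horizon].mean())
            actual.append(x[i + (dim - 1) + horizon])
        return np.corrcoef(preds, actual)[0, 1]

    # NDNP-style normalization (assumption, not the paper's exact formula):
    # divide this skill by that of a linear surrogate sharing the series'
    # autocorrelation, so purely linear-stochastic motion scores near 1.
    ```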

  3. Electric field control of deterministic current-induced magnetization switching in a hybrid ferromagnetic/ferroelectric structure

    NASA Astrophysics Data System (ADS)

    Cai, Kaiming; Yang, Meiyin; Ju, Hailang; Wang, Sumei; Ji, Yang; Li, Baohe; Edmonds, Kevin William; Sheng, Yu; Zhang, Bao; Zhang, Nan; Liu, Shuai; Zheng, Houzhi; Wang, Kaiyou

    2017-07-01

    All-electrical and programmable manipulations of ferromagnetic bits are highly pursued for the aim of high integration and low energy consumption in modern information technology. Methods based on the spin-orbit torque switching in heavy metal/ferromagnet structures have been proposed with magnetic field, and are heading toward deterministic switching without external magnetic field. Here we demonstrate that an in-plane effective magnetic field can be induced by an electric field without breaking the symmetry of the structure of the thin film, and realize the deterministic magnetization switching in a hybrid ferromagnetic/ferroelectric structure with Pt/Co/Ni/Co/Pt layers on PMN-PT substrate. The effective magnetic field can be reversed by changing the direction of the applied electric field on the PMN-PT substrate, which fully replaces the controllability function of the external magnetic field. The electric field is found to generate an additional spin-orbit torque on the CoNiCo magnets, which is confirmed by macrospin calculations and micromagnetic simulations.

  4. Deterministic Creation of Macroscopic Cat States

    PubMed Central

    Lombardo, Daniel; Twamley, Jason

    2015-01-01

    Despite current technological advances, observing quantum mechanical effects outside of the nanoscopic realm is extremely challenging. For this reason, the observation of such effects on larger scale systems is currently one of the most attractive goals in quantum science. Many experimental protocols have been proposed for both the creation and observation of quantum states on macroscopic scales, in particular, in the field of optomechanics. The majority of these proposals, however, rely on performing measurements, making them probabilistic. In this work we develop a completely deterministic method of macroscopic quantum state creation. We study the prototypical optomechanical Membrane In The Middle model and show that by controlling the membrane’s opacity, and through careful choice of the optical cavity initial state, we can deterministically create and grow the spatial extent of the membrane’s position into a large cat state. It is found that by using a Bose-Einstein condensate as a membrane, high-fidelity cat states with spatial separations of up to ∼300 nm can be achieved. PMID:26345157
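
    For context, the 'cat states' referred to here are the standard even/odd superpositions of two coherent states (a textbook definition, quoted for orientation):

    ```latex
    |\mathrm{cat}_{\pm}\rangle \;=\;
    \frac{|\alpha\rangle \pm |{-\alpha}\rangle}
         {\sqrt{2\left(1 \pm e^{-2|\alpha|^{2}}\right)}}
    ```

    In the scheme above, the two branches correspond to spatially separated wavepackets of the membrane's position, and the quoted ∼300 nm is the separation between them.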

  5. Ordinal optimization and its application to complex deterministic problems

    NASA Astrophysics Data System (ADS)

    Yang, Mike Shang-Yu

    1998-10-01

    We present in this thesis a new perspective to approach a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve high levels of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the burden of high complexity remains, taxing the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model to a stochastic pseudo-model composed of a simple deterministic component and a white-noise-like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study in the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in the computing cost.
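
    A minimal sketch of the selection step this approach relies on (illustrative names and noise level; goal softening keeps a top fraction rather than the single best, and ordinal comparison needs only a roughly correct ordering, not accurate values):

    ```python
    import random

    def ordinal_select(designs, crude_eval, exact_eval, keep_fraction=0.05):
        """Ordinal optimization: order designs by a cheap noisy estimate
        (ordinal comparison), soften the goal by keeping a top fraction
        rather than the single best, then refine only that subset."""
        ranked = sorted(designs, key=crude_eval)          # noisy but cheap ordering
        selected = ranked[: max(1, int(keep_fraction * len(ranked)))]
        return min(selected, key=exact_eval)              # expensive model on few designs

    # Toy usage: the crude model is the exact cost plus white noise, mimicking
    # the stochastic pseudo-model described in the abstract.
    designs = [random.uniform(-3, 3) for _ in range(1000)]
    exact = lambda x: (x - 1.0) ** 2
    crude = lambda x: exact(x) + random.gauss(0.0, 0.5)
    best = ordinal_select(designs, crude, exact)
    ```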

  6. A Comparison of Monte Carlo and Deterministic Solvers for keff and Sensitivity Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haeck, Wim; Parsons, Donald Kent; White, Morgan Curtis

    Verification and validation of our solutions for calculating the neutron reactivity for nuclear materials is a key issue to address for many applications, including criticality safety, research reactors, power reactors, and nuclear security. Neutronics codes solve variations of the Boltzmann transport equation. The two main variants are Monte Carlo versus deterministic solutions, e.g. the MCNP [1] versus PARTISN [2] codes, respectively. There have been many studies over the decades that examined the accuracy of such solvers and the general conclusion is that when the problems are well-posed, either solver can produce accurate results. However, the devil is always in the details. The current study examines the issue of self-shielding and the stress it puts on deterministic solvers. Most Monte Carlo neutronics codes use continuous-energy descriptions of the neutron interaction data that are not subject to this effect. The issue of self-shielding occurs because of the discretisation of data used by the deterministic solutions. Multigroup data used in these solvers are the average cross section and scattering parameters over an energy range. Resonances in cross sections can occur that change the likelihood of interaction by one to three orders of magnitude over a small energy range. Self-shielding is the numerical effect that the average cross section in groups with strong resonances can be strongly affected as neutrons within that material are preferentially absorbed or scattered out of the resonance energies. This affects both the average cross section and the scattering matrix.
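
    The flux-weighted multigroup average at issue is the standard definition below; self-shielding arises because the weighting flux φ(E) dips inside a strong resonance, roughly as in the narrow-resonance approximation, pulling the group-average cross section down (standard reactor-physics relations, quoted for orientation):

    ```latex
    \sigma_{g} \;=\; \frac{\int_{E_{g}}^{E_{g-1}} \sigma(E)\,\phi(E)\,dE}
                          {\int_{E_{g}}^{E_{g-1}} \phi(E)\,dE},
    \qquad
    \phi(E) \;\propto\; \frac{1}{E\,\sigma_{t}(E)}\;\;\text{(narrow-resonance approximation)}.
    ```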

  7. ACCELERATING FUSION REACTOR NEUTRONICS MODELING BY AUTOMATIC COUPLING OF HYBRID MONTE CARLO/DETERMINISTIC TRANSPORT ON CAD GEOMETRY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biondo, Elliott D; Ibrahim, Ahmad M; Mosher, Scott W

    2015-01-01

    Detailed radiation transport calculations are necessary for many aspects of the design of fusion energy systems (FES) such as ensuring occupational safety, assessing the activation of system components for waste disposal, and maintaining cryogenic temperatures within superconducting magnets. Hybrid Monte Carlo (MC)/deterministic techniques are necessary for this analysis because FES are large, heavily shielded, and contain streaming paths that can only be resolved with MC. The tremendous complexity of FES necessitates the use of CAD geometry for design and analysis. Previous ITER analysis has required the translation of CAD geometry to MCNP5 form in order to use the AutomateD VAriaNce reducTion Generator (ADVANTG) for hybrid MC/deterministic transport. In this work, ADVANTG was modified to support CAD geometry, allowing hybrid (MC)/deterministic transport to be done automatically and eliminating the need for this translation step. This was done by adding a new ray tracing routine to ADVANTG for CAD geometries using the Direct Accelerated Geometry Monte Carlo (DAGMC) software library. This new capability is demonstrated with a prompt dose rate calculation for an ITER computational benchmark problem using both the Consistent Adjoint Driven Importance Sampling (CADIS) method and the Forward Weighted (FW)-CADIS method. The variance reduction parameters produced by ADVANTG are shown to be the same using CAD geometry and standard MCNP5 geometry. Significant speedups were observed for both neutrons (as high as a factor of 7.1) and photons (as high as a factor of 59.6).

  8. Health risk assessment of inorganic arsenic intake of Ronphibun residents via duplicate diet study.

    PubMed

    Saipan, Piyawat; Ruangwises, Suthep

    2009-06-01

    To assess health risk from exposure to inorganic arsenic via the duplicate portion sampling method in Ronphibun residents. One hundred and forty samples (140 subject-days) were collected from participants in Ronphibun sub-district. Inorganic arsenic in duplicate diet samples was determined by acid digestion and hydride generation-atomic absorption spectrometry. Deterministic risk assessment is referenced throughout the present paper using United States Environmental Protection Agency (U.S. EPA) guidelines. The average daily dose and lifetime average daily dose of inorganic arsenic via duplicate diet were 0.0021 mg/kg/d and 0.00084 mg/kg/d, respectively. The risk estimates were a hazard quotient of 6.98 and a cancer risk of 1.26 × 10^-3. Both the hazard quotient and the cancer risk from deterministic risk characterization of inorganic arsenic exposure in duplicate diets were greater than the corresponding safety levels (hazard quotient of 1 and cancer risk of 1 × 10^-4).
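
    The reported numbers are consistent with the standard U.S. EPA deterministic formulas if one assumes the usual IRIS toxicity values for inorganic arsenic; the RfD and slope factor below are that assumption, not figures stated in the abstract:

    ```python
    # Deterministic risk characterization, U.S. EPA style (sketch).
    add  = 0.0021    # average daily dose, mg/kg/d (reported)
    ladd = 0.00084   # lifetime average daily dose, mg/kg/d (reported)

    rfd = 3e-4       # assumed oral reference dose for inorganic As, mg/kg/d
    csf = 1.5        # assumed oral cancer slope factor, (mg/kg/d)^-1

    hazard_quotient = add / rfd    # ~7.0, matching the reported 6.98 within rounding
    cancer_risk     = ladd * csf   # 1.26e-3, matching the reported value
    ```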

  9. Probabilistic Analysis of a Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10^-11 due to the conservative nature of the factors of safety on the deterministic loads.

  10. ICT and Pedagogy: Opportunities Missed?

    ERIC Educational Resources Information Center

    Adams, Paul

    2011-01-01

    The pace of Information and Communications Technology (ICT) development necessitates radical and rapid change for education. Given the English prevalence for an economically determinist orientation for educational outcomes, it seems pertinent to ask how learning in relation to ICT is to be conceptualised. Accepting the view that education needs to…

  11. Recent Developments in the UltraForm Finishing and UltraSurf Measuring of Axisymmetric IR Domes

    DTIC Science & Technology

    2010-06-08

    Approved for public release; distribution is unlimited. Presented at Mirror Technology Days, Boulder, Colorado, USA. The paper describes a deterministic fabrication solution for a wide range of newly developed windows, domes, and mirrors, and the commercialization of UltraForm Finishing (UFF)…

  12. Reducing the critical particle diameter in (highly) asymmetric sieve-based lateral displacement devices.

    PubMed

    Dijkshoorn, J P; Schutyser, M A I; Sebris, M; Boom, R M; Wagterveld, R M

    2017-10-26

    Deterministic lateral displacement technology was originally developed in the realm of microfluidics, but has potential for larger scale separation as well. In our previous studies, we proposed a sieve-based lateral displacement device inspired by the principle of deterministic lateral displacement. The advantages of this new device are a lower pressure drop, a lower risk of particle accumulation, a higher throughput, and simpler manufacture. However, until now this device has only been investigated for its separation of large particles of around 785 µm diameter. To separate smaller particles, we investigate several design parameters for their influence on the critical particle diameter. In a dimensionless evaluation, device designs with different geometries and dimensions were compared. It was found that sieve-based lateral displacement devices are able to displace particles, despite their unusual and asymmetric design, owing to the crucial role of the flow profile. These results demonstrate the possibility to actively steer the velocity profile in order to reduce the critical diameter in deterministic lateral displacement devices, which makes this separation principle more accessible for large-scale, high-throughput applications.
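
    As a point of reference for the critical-diameter discussion, conventional post-array DLD devices are commonly designed with Davis's empirical correlation (an external rule of thumb, not derived in this paper), where g is the gap between posts and ε the row-shift fraction:

    ```latex
    D_{c} \;\approx\; 1.4\,g\,\varepsilon^{0.48}
    ```

    The paper's point is that in sieve-based geometries the velocity profile itself can be steered, pushing the critical diameter below what this fixed-geometry scaling would suggest.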

  13. Hybrid quantum teleportation: A theoretical model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeda, Shuntaro; Mizuta, Takahiro; Fuwa, Maria

    2014-12-04

    Hybrid quantum teleportation – continuous-variable teleportation of qubits – is a promising approach for deterministically teleporting photonic qubits. We propose how to implement it with current technology. Our theoretical model shows that faithful qubit transfer can be achieved for this teleportation by choosing an optimal gain for the teleporter’s classical channel.

  14. An Equilibrium Flow Model of a University Campus.

    ERIC Educational Resources Information Center

    Oliver, Robert M.; Hopkins, David S. P.

    This paper develops a simple deterministic model that relates student admissions and enrollments to the final demand for educated students. It includes the effects of dropout rates and student-teacher ratios on student enrollments and faculty staffing levels. Certain technological requirements are assumed known and given. These, as well as the…

  15. Churchill: an ultra-fast, deterministic, highly scalable and balanced parallelization strategy for the discovery of human genetic variation in clinical and population-scale genomics.

    PubMed

    Kelly, Benjamin J; Fitch, James R; Hu, Yangqiu; Corsmeier, Donald J; Zhong, Huachun; Wetzel, Amy N; Nordquist, Russell D; Newsom, David L; White, Peter

    2015-01-20

    While advances in genome sequencing technology make population-scale genomics a possibility, current approaches for analysis of these data rely upon parallelization strategies that have limited scalability, complex implementation and lack reproducibility. Churchill, a balanced regional parallelization strategy, overcomes these challenges, fully automating the multiple steps required to go from raw sequencing reads to variant discovery. Through implementation of novel deterministic parallelization techniques, Churchill allows computationally efficient analysis of a high-depth whole genome sample in less than two hours. The method is highly scalable, enabling full analysis of the 1000 Genomes raw sequence dataset in a week using cloud resources. http://churchill.nchri.org/.
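
    A minimal sketch of the balanced regional parallelization idea (illustrative only; the real Churchill pipeline also handles read pairs spanning region boundaries and the deterministic merge of per-region results, which are omitted here):

    ```python
    def balanced_regions(chrom_lengths, n_workers):
        """Split a genome into similar-sized regions so each worker gets a
        comparable share of bases, independent of chromosome boundaries."""
        total = sum(chrom_lengths.values())
        target = total // n_workers + 1
        regions = []
        for chrom, length in chrom_lengths.items():
            start = 0
            while start < length:
                end = min(start + target, length)
                regions.append((chrom, start, end))   # each processed independently
                start = end
        return regions

    # Each region is analysed independently (deterministic given its reads) and
    # the per-region variant calls are concatenated in genomic order, which is
    # what makes the end-to-end result reproducible for a fixed region layout.
    regions = balanced_regions({"chr1": 248_956_422, "chr2": 242_193_529}, n_workers=16)
    ```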

  16. Deterministic transfer of an unknown qutrit state assisted by the low-Q microwave resonators

    NASA Astrophysics Data System (ADS)

    Liu, Tong; Zhang, Yang; Yu, Chang-Shui; Zhang, Wei-Ning

    2017-05-01

    Qutrits (i.e., three-level quantum systems) can be used to achieve many quantum information and communication tasks due to their large Hilbert spaces. In this work, we propose a scheme to transfer an unknown quantum state between two flux qutrits coupled to two superconducting coplanar waveguide resonators. The quantum state transfer can be deterministically achieved without measurements. Because resonator photons are virtually excited during the operation time, the decoherences caused by the resonator decay and the unwanted inter-resonator crosstalk are greatly suppressed. Moreover, our approach can be adapted to other solid-state qutrits coupled to circuit resonators. Numerical simulations show that the high-fidelity transfer of quantum state between the two qutrits is feasible with current circuit QED technology.

  17. Low technology systems for wastewater treatment: perspectives.

    PubMed

    Brissaud, F

    2007-01-01

    Low technology systems for the treatment of wastewater are sometimes presented as remnants of the past, nowadays supposedly only meant to serve developing countries and remote rural areas. However, considering their advantages and disadvantages together with enhanced treatment requirements and recent research and technological developments, the future of these systems still appears promising. Successful applications of low technology systems require that more care is taken of their design and operation than often observed. Correlatively, more efforts should be made to decipher the treatment mechanisms and determine the related reaction parameters, so as to provide more deterministic approaches of the natural wastewater treatment systems and better predict their performance.

  18. How to Stop Disagreeing and Start Cooperating in the Presence of Asymmetric Packet Loss.

    PubMed

    Morales-Ponce, Oscar; Schiller, Elad M; Falcone, Paolo

    2018-04-22

    We consider the design of a disagreement correction protocol in multi-vehicle systems. Vehicles broadcast in real time vital information such as position, direction, speed, acceleration, intention, etc. This information is then used to identify risks and adapt trajectories to maintain the highest performance without compromising safety. To minimize the risk due to the use of inconsistent information, all cooperating vehicles must agree whether to use the exchanged information to operate in a cooperative mode or to use only local information to operate in an autonomous mode. However, since wireless communications are prone to failures, it is impossible to deterministically reach an agreement. Therefore, any protocol will exhibit necessary disagreement periods. In this paper, we investigate whether vehicles can still cooperate despite communication failures, even in the scenario where communication is suddenly not available. We present a deterministic protocol that allows all participants either to operate in a cooperative mode, when vehicles can exchange all the information in a timely manner, or to operate in an autonomous mode, when messages are lost. We show formally that the disagreement time is bounded by the time that the communication channel requires to deliver messages, and we validate our protocol using NS-3 simulations. We explain how the proposed solution can be used in vehicular platooning to attain high performance and still guarantee high safety standards despite communication failures.
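
    A minimal sketch of the fallback rule such a protocol induces (hypothetical names and timing constant; the paper additionally proves a bound on how long vehicles can disagree about the mode): a vehicle stays cooperative only while every peer's broadcast is fresher than the channel's assumed delivery deadline.

    ```python
    import time

    DELIVERY_DEADLINE = 0.2   # seconds; assumed bound on message delivery time

    class ModeController:
        """Fall back to autonomous mode when any peer's broadcast is stale."""
        def __init__(self, peer_ids):
            self.last_heard = {p: 0.0 for p in peer_ids}

        def on_message(self, peer_id, state):
            self.last_heard[peer_id] = time.monotonic()

        def mode(self):
            now = time.monotonic()
            fresh = all(now - t <= DELIVERY_DEADLINE for t in self.last_heard.values())
            return "cooperative" if fresh else "autonomous"
    ```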

  19. How to Stop Disagreeing and Start Cooperating in the Presence of Asymmetric Packet Loss

    PubMed Central

    2018-01-01

    We consider the design of a disagreement correction protocol in multi-vehicle systems. Vehicles broadcast in real time vital information such as position, direction, speed, acceleration, intention, etc. This information is then used to identify risks and adapt trajectories to maintain the highest performance without compromising safety. To minimize the risk due to the use of inconsistent information, all cooperating vehicles must agree whether to use the exchanged information to operate in a cooperative mode or to use only local information to operate in an autonomous mode. However, since wireless communications are prone to failures, it is impossible to deterministically reach an agreement. Therefore, any protocol will exhibit necessary disagreement periods. In this paper, we investigate whether vehicles can still cooperate despite communication failures, even in the scenario where communication is suddenly not available. We present a deterministic protocol that allows all participants either to operate in a cooperative mode, when vehicles can exchange all the information in a timely manner, or to operate in an autonomous mode, when messages are lost. We show formally that the disagreement time is bounded by the time that the communication channel requires to deliver messages, and we validate our protocol using NS-3 simulations. We explain how the proposed solution can be used in vehicular platooning to attain high performance and still guarantee high safety standards despite communication failures. PMID:29690572

  20. Landslide prediction using combined deterministic and probabilistic methods in hilly area of Mt. Medvednica in Zagreb City, Croatia

    NASA Astrophysics Data System (ADS)

    Wang, Chunxiang; Watanabe, Naoki; Marui, Hideaki

    2013-04-01

    The hilly slopes of Mt. Medvednica stretch across the northwestern part of Zagreb City, Croatia, and cover approximately 180 km². In this area, landslides, e.g. the Kostanjek landslide and the Črešnjevec landslide, have damaged many houses, roads, farmlands, and grasslands. Therefore, it is necessary to predict potential landslides and to enhance the landslide inventory for hazard mitigation and the safety management of the local community in this area. We combined deterministic and probabilistic methods to assess potential landslides, including their locations, sizes and sliding surfaces. First, the study area is divided into several slope units that have similar topographic and geological characteristics using the hydrology analysis tool in ArcGIS. Next, a GIS-based modified three-dimensional Hovland's method for slope stability analysis is developed to identify the sliding surface and the corresponding three-dimensional safety factor for each slope unit. Each sliding surface is assumed to be the lower part of an ellipsoid. The direction of inclination of the ellipsoid is considered to be the same as the main dip direction of the slope unit. The center point of the ellipsoid is randomly set to the center point of a grid cell in the slope unit. The minimum three-dimensional safety factor and the corresponding critical sliding surface are also obtained for each slope unit. Finally, since a single value of the safety factor is insufficient to evaluate the slope stability of a slope unit, the ratio of the number of calculation cases in which the three-dimensional safety factor is less than 1.0 to the total number of trial calculations is defined as the failure probability of the slope unit. If the failure probability is more than 80%, the slope unit is distinguished as 'unstable' from other slope units, and the landslide hazard can be mapped for the whole study area.
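
    The failure probability defined above is a plain Monte Carlo ratio; the sketch below illustrates it with a simple infinite-slope safety factor standing in for the paper's three-dimensional Hovland analysis (mechanics, parameter statistics, and names are placeholders):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def infinite_slope_fs(c, phi, gamma=18.0, depth=3.0, beta=np.radians(30)):
        """Placeholder safety factor (infinite slope, dry): the study itself
        uses a 3D Hovland-type analysis on ellipsoidal sliding surfaces."""
        tau = gamma * depth * np.sin(beta) * np.cos(beta)    # driving stress, kPa
        sigma_n = gamma * depth * np.cos(beta) ** 2          # normal stress, kPa
        return (c + sigma_n * np.tan(phi)) / tau

    # Trial calculations with random strength parameters for one slope unit.
    n = 10_000
    c   = rng.normal(10.0, 2.0, n)                # cohesion, kPa (assumed stats)
    phi = np.radians(rng.normal(30.0, 3.0, n))    # friction angle (assumed stats)
    fs  = infinite_slope_fs(c, phi)

    failure_probability = np.mean(fs < 1.0)       # ratio of failing cases
    unstable = failure_probability > 0.80         # threshold used in the study
    ```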

  1. A Hierarchy Fuzzy MCDM Method for Studying Electronic Marketing Strategies in the Information Service Industry.

    ERIC Educational Resources Information Center

    Tang, Michael T.; Tzeng, Gwo-Hshiung

    In this paper, the impacts of Electronic Commerce (EC) on the international marketing strategies of information service industries are studied. In seeking to blend humanistic concerns in this research with technological development by addressing challenges for deterministic attitudes, the paper examines critical environmental factors relevant to…

  2. Nanoscale lateral displacement arrays for the separation of exosomes and colloids down to 20 nm

    NASA Astrophysics Data System (ADS)

    Austin, Robert; Wunsch, Benjamin; Smith, Joshua; Gifford, Stacey; Wang, Chao; Brink, Markus; Bruce, Robert; Stolovitzky, Gustavo; Astier, Yann

    Deterministic lateral displacement (DLD) pillar arrays are an efficient technology to sort, separate and enrich micrometre-scale particles, which include parasites, bacteria, blood cells and circulating tumour cells in blood. However, this technology has not been translated to the true nanoscale, where it could function on biocolloids, such as exosomes. Exosomes, a key target of liquid biopsies, are secreted by cells and contain nucleic acid and protein information about their originating tissue. One challenge in the study of exosome biology is to sort exosomes by size and surface markers. We use manufacturable silicon processes to produce nanoscale DLD (nano-DLD) arrays of uniform gap sizes ranging from 25 to 235 nm. We show that at low Péclet (Pe) numbers, at which diffusion and deterministic displacement compete, nano-DLD arrays separate particles between 20 to 110 nm based on size with sharp resolution. Further, we demonstrate the size-based displacement of exosomes, and so open up the potential for on-chip sorting and quantification of these important biocolloids.

  3. Nanoscale lateral displacement arrays for the separation of exosomes and colloids down to 20 nm

    NASA Astrophysics Data System (ADS)

    Wunsch, Benjamin H.; Smith, Joshua T.; Gifford, Stacey M.; Wang, Chao; Brink, Markus; Bruce, Robert L.; Austin, Robert H.; Stolovitzky, Gustavo; Astier, Yann

    2016-11-01

    Deterministic lateral displacement (DLD) pillar arrays are an efficient technology to sort, separate and enrich micrometre-scale particles, which include parasites, bacteria, blood cells and circulating tumour cells in blood. However, this technology has not been translated to the true nanoscale, where it could function on biocolloids, such as exosomes. Exosomes, a key target of 'liquid biopsies', are secreted by cells and contain nucleic acid and protein information about their originating tissue. One challenge in the study of exosome biology is to sort exosomes by size and surface markers. We use manufacturable silicon processes to produce nanoscale DLD (nano-DLD) arrays of uniform gap sizes ranging from 25 to 235 nm. We show that at low Péclet (Pe) numbers, at which diffusion and deterministic displacement compete, nano-DLD arrays separate particles between 20 to 110 nm based on size with sharp resolution. Further, we demonstrate the size-based displacement of exosomes, and so open up the potential for on-chip sorting and quantification of these important biocolloids.

  4. Transient deterministic shallow landslide modeling: Requirements for susceptibility and hazard assessments in a GIS framework

    USGS Publications Warehouse

    Godt, J.W.; Baum, R.L.; Savage, W.Z.; Salciarini, D.; Schulz, W.H.; Harp, E.L.

    2008-01-01

    Application of transient deterministic shallow landslide models over broad regions for hazard and susceptibility assessments requires information on rainfall, topography and the distribution and properties of hillside materials. We survey techniques for generating the spatial and temporal input data for such models and present an example using a transient deterministic model that combines an analytic solution to assess the pore-pressure response to rainfall infiltration with an infinite-slope stability calculation. Pore-pressures and factors of safety are computed on a cell-by-cell basis and can be displayed or manipulated in a grid-based GIS. Input data are high-resolution (1.8 m) topographic information derived from LiDAR data and simple descriptions of initial pore-pressure distribution and boundary conditions for a study area north of Seattle, Washington. Rainfall information is taken from a previously defined empirical rainfall intensity-duration threshold and material strength and hydraulic properties were measured both in the field and laboratory. Results are tested by comparison with a shallow landslide inventory. Comparison of results with those from static infinite-slope stability analyses assuming fixed water-table heights shows that the spatial prediction of shallow landslide susceptibility is improved using the transient analyses; moreover, results can be depicted in terms of the rainfall intensity and duration known to trigger shallow landslides in the study area.
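
    In this model family the cell-by-cell calculation couples an Iverson-type transient pressure head ψ(Z, t) to the infinite-slope factor of safety; a standard form (quoted for orientation, with slope angle β, depth Z, effective cohesion c′, friction angle φ′, and unit weights γs, γw) is:

    ```latex
    F_{S}(Z,t) \;=\; \frac{\tan\varphi'}{\tan\beta}
    \;+\; \frac{c' \;-\; \psi(Z,t)\,\gamma_{w}\tan\varphi'}{\gamma_{s}\,Z\,\sin\beta\,\cos\beta}
    ```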

  5. An investigation of air transportation technology at the Massachusetts Institute of Technology, 1990-1991

    NASA Technical Reports Server (NTRS)

    Simpson, Robert W.

    1991-01-01

    Brief summaries are given of research activities at the Massachusetts Institute of Technology (MIT) under the sponsorship of the FAA/NASA Joint University Program. Topics covered include hazard assessment and cockpit presentation issues for microburst alerting systems; the situational awareness effect of automated air traffic control (ATC) datalink clearance amendments; a graphical simulation system for adaptive, automated approach spacing; an expert system for temporal planning with application to runway configuration management; deterministic multi-zone ice accretion modeling; alert generation and cockpit presentation for an integrated microburst alerting system; and passive infrared ice detection for helicopter applications.

  6. A TTC upgrade proposal using bidirectional 10G-PON FTTH technology

    NASA Astrophysics Data System (ADS)

    Kolotouros, D. M.; Baron, S.; Soos, C.; Vasey, F.

    2015-04-01

    A new generation FPGA-based Timing-Trigger and Control (TTC) system based on emerging Passive Optical Network (PON) technology is being proposed to replace the existing off-detector TTC system used by the LHC experiments. High split ratio, dynamic software partitioning, low and deterministic latency, as well as low jitter are required. Exploiting the latest available technologies allows delivering higher capacity together with bidirectionality, a feature absent from the legacy TTC system. This article focuses on the features and capabilities of the latest TTC-PON prototype based on 10G-PON FTTH components along with some metrics characterizing its performance.

  7. Robust planning of dynamic wireless charging infrastructure for battery electric buses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Zhaocai; Song, Ziqi

    Battery electric buses with zero tailpipe emissions have great potential in improving environmental sustainability and livability of urban areas. However, the problems of high cost and limited range associated with on-board batteries have substantially limited the popularity of battery electric buses. The technology of dynamic wireless power transfer (DWPT), which provides bus operators with the ability to charge buses while in motion, may be able to effectively alleviate the drawbacks of electric buses. In this paper, we address the problem of simultaneously selecting the optimal location of the DWPT facilities and designing the optimal battery sizes of electric buses for a DWPT electric bus system. The problem is first constructed as a deterministic model in which the uncertainty of energy consumption and travel time of electric buses is neglected. The methodology of robust optimization (RO) is then adopted to address the uncertainty of energy consumption and travel time. The affinely adjustable robust counterpart (AARC) of the deterministic model is developed, and its equivalent tractable mathematical programming is derived. Both the deterministic model and the robust model are demonstrated with a real-world bus system. The results of our study demonstrate that the proposed deterministic model can effectively determine the allocation of DWPT facilities and the battery sizes of electric buses for a DWPT electric bus system; and the robust model can further provide optimal designs that are robust against the uncertainty of energy consumption and travel time for electric buses.

  8. Robust planning of dynamic wireless charging infrastructure for battery electric buses

    DOE PAGES

    Liu, Zhaocai; Song, Ziqi

    2017-10-01

    Battery electric buses with zero tailpipe emissions have great potential in improving environmental sustainability and livability of urban areas. However, the problems of high cost and limited range associated with on-board batteries have substantially limited the popularity of battery electric buses. The technology of dynamic wireless power transfer (DWPT), which provides bus operators with the ability to charge buses while in motion, may be able to effectively alleviate the drawbacks of electric buses. In this paper, we address the problem of simultaneously selecting the optimal location of the DWPT facilities and designing the optimal battery sizes of electric buses for a DWPT electric bus system. The problem is first constructed as a deterministic model in which the uncertainty of energy consumption and travel time of electric buses is neglected. The methodology of robust optimization (RO) is then adopted to address the uncertainty of energy consumption and travel time. The affinely adjustable robust counterpart (AARC) of the deterministic model is developed, and its equivalent tractable mathematical programming is derived. Both the deterministic model and the robust model are demonstrated with a real-world bus system. The results of our study demonstrate that the proposed deterministic model can effectively determine the allocation of DWPT facilities and the battery sizes of electric buses for a DWPT electric bus system; and the robust model can further provide optimal designs that are robust against the uncertainty of energy consumption and travel time for electric buses.
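
    Schematically, in generic robust-optimization notation (not the paper's exact model), the robust counterpart immunizes the design x against every realization u of energy consumption and travel time in an uncertainty set U, and the affinely adjustable variant lets recourse decisions y respond affinely to u:

    ```latex
    \min_{x}\;\max_{u\in\mathcal{U}} f(x,u)
    \quad\text{s.t.}\quad g(x,u)\le 0\;\;\forall\,u\in\mathcal{U},
    \qquad
    y(u) \;=\; y^{0} + Y u \;\;\text{(affine decision rule in the AARC)}.
    ```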

  9. Reliability Analysis of Retaining Walls Subjected to Blast Loading by Finite Element Approach

    NASA Astrophysics Data System (ADS)

    GuhaRay, Anasua; Mondal, Stuti; Mohiuddin, Hisham Hasan

    2018-02-01

    Conventional design methods adopt factors of safety based on practice and experience, and are deterministic in nature. The limit state method, though not completely deterministic, does not account for the variability of inherently random soil parameters such as cohesion and angle of internal friction. Reliability analysis provides a means of incorporating these variations into the analysis and hence results in a more realistic design. Several studies have been carried out on the reliability of reinforced concrete walls and masonry walls under explosions. Also, reliability analysis of retaining structures against various kinds of failure has been done. However, very few research works are available on reliability analysis of retaining walls subjected to blast loading. Thus, the present paper considers the effect of variation of geotechnical parameters when a retaining wall is subjected to blast loading. However, it is found that the variation of geotechnical random variables does not have a significant effect on the stability of retaining walls subjected to blast loading.
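
    For the simplest case of a linear limit state g = R − S with independent normal resistance R and load effect S, the measure that replaces a single factor of safety is the first-order reliability index (a standard relation, included for orientation):

    ```latex
    \beta \;=\; \frac{\mu_{R}-\mu_{S}}{\sqrt{\sigma_{R}^{2}+\sigma_{S}^{2}}},
    \qquad
    P_{f} \;=\; \Phi(-\beta)
    ```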

  10. Application of Deterministic and Probabilistic System Design Methods and Enhancements of Conceptual Design Tools for ERA Project

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.; Schutte, Jeff S.

    2016-01-01

    This report documents work done by the Aerospace Systems Design Lab (ASDL) at the Georgia Institute of Technology, Daniel Guggenheim School of Aerospace Engineering for the National Aeronautics and Space Administration, Aeronautics Research Mission Directorate, Integrated System Research Program, Environmentally Responsible Aviation (ERA) Project. This report was prepared under contract NNL12AA12C, "Application of Deterministic and Probabilistic System Design Methods and Enhancement of Conceptual Design Tools for ERA Project". The research within this report addressed the Environmentally Responsible Aviation (ERA) project goal stated in the NRA solicitation "to advance vehicle concepts and technologies that can simultaneously reduce fuel burn, noise, and emissions." To identify technology and vehicle solutions that simultaneously meet these three metrics requires the use of system-level analysis with the appropriate level of fidelity to quantify feasibility, benefits and degradations, and associated risk. In order to perform the system level analysis, the Environmental Design Space (EDS) [Kirby 2008, Schutte 2012a] environment developed by ASDL was used to model both conventional and unconventional configurations as well as to assess technologies from the ERA and N+2 timeframe portfolios. A well-established system design approach was used to perform aircraft conceptual design studies, including technology trade studies to identify technology portfolios capable of accomplishing the ERA project goal and to obtain accurate tradeoffs between performance, noise, and emissions. The ERA goal, shown in Figure 1, is to simultaneously achieve the N+2 benefits of a cumulative noise margin of 42 EPNdB relative to stage 4, a 75 percent reduction in LTO NOx emissions relative to CAEP 6 and a 50 percent reduction in fuel burn relative to the 2005 best in class aircraft. There were five research tasks associated with this effort: 1) identify technology collectors, 2) model technology collectors in EDS, 3) model and assess ERA technologies, 4) LTO and cruise emission prediction, and 5) probabilistic analysis of technology collectors and portfolios.

  11. Multi-dimensional photonic states from a quantum dot

    NASA Astrophysics Data System (ADS)

    Lee, J. P.; Bennett, A. J.; Stevenson, R. M.; Ellis, D. J. P.; Farrer, I.; Ritchie, D. A.; Shields, A. J.

    2018-04-01

    Quantum states superposed across multiple particles or degrees of freedom offer an advantage in the development of quantum technologies. Creating these states deterministically and with high efficiency is an ongoing challenge. A promising approach is the repeated excitation of multi-level quantum emitters, which have been shown to naturally generate light with quantum statistics. Here we describe how to create one class of higher dimensional quantum state, a so called W-state, which is superposed across multiple time bins. We do this by repeated Raman scattering of photons from a charged quantum dot in a pillar microcavity. We show this method can be scaled to larger dimensions with no reduction in coherence or single-photon character. We explain how to extend this work to enable the deterministic creation of arbitrary time-bin encoded qudits.
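
    The time-bin W-state referred to above has the standard form, with a single excitation coherently shared across N time bins:

    ```latex
    |W_{N}\rangle \;=\; \frac{1}{\sqrt{N}}
    \left(|10\cdots 0\rangle + |01\cdots 0\rangle + \cdots + |00\cdots 1\rangle\right)
    ```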

  12. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications such as the design of critical and general (non-critical) civil infrastructures, technical and financial risk analysis. A set of criteria is developed for and applied to an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and insufficient understanding of modern concepts of risk analysis, as have been revealed in some recent large scale studies. These deficiencies result in an inability to correctly treat dependencies between physical parameters and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario-earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, current methods are based on a probabilistic approach with its unsolved deficiencies. Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors. These factors are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially if applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.

  13. Risk-Based Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2002-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine Engines) with the fast probability integration technique (FPI). FPI was developed by Southwest Research Institute under contract with the NASA Glenn Research Center. The results were plotted in the form of cumulative distribution functions and sensitivity analyses and were compared with results from the traditional deterministic approach. The comparison showed that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system. The current work addressed the application of the probabilistic approach to assess specific fuel consumption, engine thrust, and weight. Similarly, the approach can be used to assess other aspects of aeropropulsion system performance, such as cost, acoustic noise, and emissions. Additional information is included in the original extended abstract.

  14. History of magnetorheological finishing

    NASA Astrophysics Data System (ADS)

    Harris, Daniel C.

    2011-06-01

    Magnetorheological finishing (MRF) is a deterministic method for producing complex optics with figure accuracy <50 nm and surface roughness <1 nm. MRF was invented at the Luikov Institute of Heat and Mass Transfer in Minsk, Belarus in the late 1980s by a team led by William Kordonski. When the Soviet Union opened up, New York businessman Lowell Mintz was invited to Minsk in 1990 to explore possibilities for technology transfer. Mintz was told of the potential for MRF, but did not understand whether it had value. Mintz was referred to Harvey Pollicove at the Center for Optics Manufacturing of the University of Rochester. As a result of their conversation, they sent Prof. Steve Jacobs to visit Minsk and evaluate MRF. From Jacobs' positive findings, and with support from Lowell Mintz, Kordonski and his colleagues were invited in 1993 to work at the Center for Optics Manufacturing with Jacobs and Don Golini to refine MRF technology. A "preprototype" finishing machine was operating by 1994. Prof. Greg Forbes and doctoral student Paul Dumas developed algorithms for deterministic control of MRF. In 1996, Golini recognized the commercial potential of MRF, secured investment capital from Lowell Mintz, and founded QED Technologies. The first commercial MRF machine was unveiled in 1998. It was followed by more advanced models and by groundbreaking subaperture stitching interferometers for metrology. In 2006, QED was acquired by and became a division of Cabot Microelectronics. This paper recounts the history of the development of MRF and the founding of QED Technologies.

  15. A Technological Determinist Viewpoint of the Stanton-Staggers Conflict over "The Selling of the Pentagon": Print Man Versus Electronic Man.

    ERIC Educational Resources Information Center

    Breen, Myles P.

    Media, specifically documentary films on television, profoundly affect both social structure and man's psychological percepts. The clash of views depicted is between "print man" (using U.S. Representative Harley Staggers as an example) and "electronic man" (portrayed as Frank Stanton of CBS) centering on Stagger's objections to…

  16. Development of Methodologies for IV and V of Neural Networks

    NASA Technical Reports Server (NTRS)

    Taylor, Brian; Darrah, Marjorie

    2003-01-01

    Non-deterministic systems often rely upon neural network (NN) technology to "learn" to manage flight systems under controlled conditions using carefully chosen training sets. How can these adaptive systems be certified to ensure that they will become increasingly efficient and behave appropriately in real-time situations? The bulk of Independent Verification and Validation (IV&V) research of non-deterministic software control systems such as Adaptive Flight Controllers (AFCs) addresses NNs in well-behaved and constrained environments such as simulations and strict process control. However, neither substantive research nor effective IV&V techniques have been found to address AFCs learning in real time and adapting to live flight conditions. Adaptive flight control systems offer good extensibility into commercial aviation as well as military aviation and transportation. Consequently, this area of IV&V represents an area of growing interest and urgency. ISR proposes to further the current body of knowledge to meet two objectives: research the current IV&V methods and assess where these methods may be applied toward a methodology for the V&V of neural networks; and identify effective methods for IV&V of NNs that learn in real time, including developing a prototype test bed for IV&V of AFCs. Currently, no practical method exists. ISR will meet these objectives through the tasks identified and described below. First, ISR will conduct a literature review of current IV&V technology. To do this, ISR will collect the existing body of research on IV&V of non-deterministic systems and neural networks. ISR will also develop the framework for disseminating this information through specialized training. This effort will focus on developing NASA's capability to conduct IV&V of neural network systems and to provide training to meet the increasing need for IV&V expertise in such systems.

  17. Efficient quantum computing using coherent photon conversion.

    PubMed

    Langford, N K; Ramelow, S; Prevedel, R; Munro, W J; Milburn, G J; Zeilinger, A

    2011-10-12

    Single photons are excellent quantum information carriers: they were used in the earliest demonstrations of entanglement and in the production of the highest-quality entanglement reported so far. However, current schemes for preparing, processing and measuring them are inefficient. For example, down-conversion provides heralded, but randomly timed, single photons, and linear optics gates are inherently probabilistic. Here we introduce a deterministic process--coherent photon conversion (CPC)--that provides a new way to generate and process complex, multiquanta states for photonic quantum information applications. The technique uses classically pumped nonlinearities to induce coherent oscillations between orthogonal states of multiple quantum excitations. One example of CPC, based on a pumped four-wave-mixing interaction, is shown to yield a single, versatile process that provides a full set of photonic quantum processing tools. This set satisfies the DiVincenzo criteria for a scalable quantum computing architecture, including deterministic multiqubit entanglement gates (based on a novel form of photon-photon interaction), high-quality heralded single- and multiphoton states free from higher-order imperfections, and robust, high-efficiency detection. It can also be used to produce heralded multiphoton entanglement, create optically switchable quantum circuits and implement an improved form of down-conversion with reduced higher-order effects. Such tools are valuable building blocks for many quantum-enabled technologies. Finally, using photonic crystal fibres we experimentally demonstrate quantum correlations arising from a four-colour nonlinear process suitable for CPC and use these measurements to study the feasibility of reaching the deterministic regime with current technology. Our scheme, which is based on interacting bosonic fields, is not restricted to optical systems but could also be implemented in optomechanical, electromechanical and superconducting systems with extremely strong intrinsic nonlinearities. Furthermore, exploiting higher-order nonlinearities with multiple pump fields yields a mechanism for multiparty mediation of the complex, coherent dynamics.

  18. 75 FR 20038 - Railroad Safety Technology Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-16

    ...] Railroad Safety Technology Grant Program AGENCY: Federal Railroad Administration, Department of Transportation. ACTION: Notice of Funds Availability, Railroad Safety Technology Program-Correction of Grant... Railroad Safety Technology Program, in the section, ``Requirements and Conditions for Grant Applications...

  19. Safety Evaluation of an Automated Remote Monitoring System for Heart Failure in an Urban, Indigent Population.

    PubMed

    Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Hertz, Crystal Coyazo; Guterman, Jeffrey J

    2017-12-01

    Heart Failure (HF) is the most expensive preventable condition, regardless of patient ethnicity, race, socioeconomic status, sex, and insurance status. Remote telemonitoring with timely outpatient care can significantly reduce avoidable HF hospitalizations. Human outreach, the traditional method used for remote monitoring, is effective but costly. Automated systems can potentially provide positive clinical, fiscal, and satisfaction outcomes in chronic disease monitoring. The authors implemented a telephonic HF automated remote monitoring system that utilizes deterministic decision tree logic to identify patients who are at risk of clinical decompensation. This safety study evaluated the degree of clinical concordance between the automated system and traditional human monitoring. This study focused on a broad underserved population and demonstrated a safe, reliable, and inexpensive method of monitoring patients with HF.
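
    A minimal sketch of the kind of deterministic decision tree logic such a telemonitoring system could apply; the questions, thresholds, and escalation tiers below are hypothetical illustrations, not the study's actual protocol.

        # Hypothetical triage rules for an automated heart-failure monitoring call.
        # Thresholds and escalation tiers are invented for illustration only.

        def triage(weight_gain_lb: float, dyspnea: bool, orthopnea: bool,
                   edema_worse: bool) -> str:
            """Map patient-reported answers to a deterministic escalation tier."""
            if dyspnea and orthopnea:
                return "urgent: same-day clinician callback"
            if weight_gain_lb >= 5.0 or edema_worse:
                return "elevated: nurse review within 24 hours"
            if weight_gain_lb >= 2.0:
                return "watch: repeat automated call tomorrow"
            return "routine: continue scheduled monitoring"

        print(triage(weight_gain_lb=5.5, dyspnea=False, orthopnea=False,
                     edema_worse=False))   # -> elevated: nurse review within 24 hours

    Because the rules are deterministic, identical answers always map to the same escalation tier, which is what makes concordance with human monitors directly checkable.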

  20. Plenary: Progress in Regional Landslide Hazard Assessment—Examples from the USA

    USGS Publications Warehouse

    Baum, Rex L.; Schulz, William; Brien, Dianne L.; Burns, William J.; Reid, Mark E.; Godt, Jonathan W.

    2014-01-01

    Landslide hazard assessment at local and regional scales contributes to mitigation of landslides in developing and densely populated areas by providing information for (1) land development and redevelopment plans and regulations, (2) emergency preparedness plans, and (3) economic analysis to (a) set priorities for engineered mitigation projects and (b) define areas of similar levels of hazard for insurance purposes. US Geological Survey (USGS) research on landslide hazard assessment has explored a range of methods that can be used to estimate temporal and spatial landslide potential and probability for various scales and purposes. Cases taken primarily from our work in the U.S. Pacific Northwest illustrate and compare a sampling of methods, approaches, and progress. For example, landform mapping using high-resolution topographic data resulted in identification of about four times more landslides in Seattle, Washington, than previous efforts using aerial photography. Susceptibility classes based on the landforms captured 93% of all historical landslides (all types) throughout the city. A deterministic model for rainfall infiltration and shallow landslide initiation, TRIGRS, was able to identify locations of 92% of historical shallow landslides in southwest Seattle. The potentially unstable areas identified by TRIGRS occupied only 26% of the slope areas steeper than 20°. Addition of an unsaturated infiltration model to TRIGRS expands the applicability of the model to areas of highly permeable soils. Replacement of the single-cell, 1D factor of safety with a simple 3D method of columns improves accuracy of factor of safety predictions for both saturated and unsaturated infiltration models. A 3D deterministic model for large, deep landslides, SCOOPS, combined with a three-dimensional model for groundwater flow, successfully predicted instability in steep areas of permeable outwash sand and topographic reentrants. These locations are consistent with locations of large, deep, historically active landslides. For an area in Seattle, a composite of the three maps illustrates how maps produced by different approaches might be combined to assess overall landslide potential. Examples from Oregon, USA, illustrate how landform mapping and deterministic analysis for shallow landslide potential have been adapted into standardized methods for efficiently producing detailed landslide inventory and shallow landslide susceptibility maps that have consistent content and format statewide.
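
    TRIGRS-type models rest on an infinite-slope factor-of-safety computation at depth; the sketch below assumes the commonly cited form in which cohesion and friction resist sliding while pore pressure destabilizes (the parameter values are illustrative, not Seattle calibration data).

        import math

        def factor_of_safety(slope_deg, z, psi, c=4000.0, phi_deg=32.0,
                             gamma_s=20000.0, gamma_w=9810.0):
            """Infinite-slope factor of safety at depth z (m) with pressure
            head psi (m); c in Pa, unit weights in N/m^3, angles in degrees.
            FS < 1 indicates predicted instability."""
            d, phi = math.radians(slope_deg), math.radians(phi_deg)
            return (math.tan(phi) / math.tan(d)
                    + (c - psi * gamma_w * math.tan(phi))
                    / (gamma_s * z * math.sin(d) * math.cos(d)))

        # Rainfall infiltration raises the pressure head and drives FS toward 1:
        for psi in (0.0, 0.5, 1.0):
            print(f"psi = {psi} m: FS = {factor_of_safety(30.0, 1.5, psi):.2f}")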

  1. FY2017 Updates to the SAS4A/SASSYS-1 Safety Analysis Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fanning, T. H.

    The SAS4A/SASSYS-1 safety analysis software is used to perform deterministic analysis of anticipated events as well as design-basis and beyond-design-basis accidents for advanced fast reactors. It plays a central role in the analysis of U.S. DOE conceptual designs, proposed test and demonstration reactors, and in domestic and international collaborations. This report summarizes the code development activities that have taken place during FY2017. Extensions to the void and cladding reactivity feedback models have been implemented, and Control System capabilities have been improved through a new virtual data acquisition system for plant state variables and an additional Block Signal for a variable lag compensator to represent reactivity feedback for novel shutdown devices. Current code development and maintenance needs are also summarized in three key areas: software quality assurance, modeling improvements, and maintenance of related tools. With ongoing support, SAS4A/SASSYS-1 can continue to fulfill its growing role in fast reactor safety analysis and help solidify DOE's leadership role in fast reactor safety both domestically and in international collaborations.
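
    A first-order lag of the kind a lag-compensator block represents can be sketched in a few lines; this is a generic discrete-time illustration, not the SAS4A/SASSYS-1 implementation, and the time constant and driver loop are invented.

        def lag_step(y, x, tau, dt):
            """One explicit-Euler step of the first-order lag dy/dt = (x - y) / tau:
            the output y trails the input x with time constant tau (seconds)."""
            return y + dt * (x - y) / tau

        # Illustrative: a lagged feedback signal responding to a unit step input.
        y, tau, dt = 0.0, 5.0, 0.1
        for _ in range(100):                 # simulate 10 s = 2 * tau
            y = lag_step(y, x=1.0, tau=tau, dt=dt)
        print(round(y, 3))                   # ~0.87, approaching 1 - exp(-2)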

  2. Deterministic composite nanophotonic lattices in large area for broadband applications

    NASA Astrophysics Data System (ADS)

    Xavier, Jolly; Probst, Jürgen; Becker, Christiane

    2016-12-01

    Exotic manipulation of the flow of photons in nanoengineered materials with an aperiodic distribution of nanostructures plays a key role in efficiency-enhanced broadband photonic and plasmonic technologies for spectrally tailorable integrated biosensing, nanostructured thin film solar cells, white light emitting diodes, novel plasmonic ensembles, etc. Through a generic deterministic nanotechnological route here we show subwavelength-scale silicon (Si) nanostructures on nanoimprinted glass substrate in large area (4 cm²) with advanced functional features of aperiodic composite nanophotonic lattices. These nanophotonic aperiodic lattices have easily tailorable supercell tiles with well-defined and discrete lattice basis elements and they show rich Fourier spectra. The presented nanophotonic lattices are designed functionally akin to two-dimensional aperiodic composite lattices with unconventional flexibility, comprising periodic photonic crystals and/or in-plane photonic quasicrystals as pattern design subsystems. The fabricated composite lattice-structured Si nanostructures are comparatively analyzed with a range of nanophotonic structures with conventional lattice geometries of periodic, disordered random as well as in-plane quasicrystalline photonic lattices with comparable lattice parameters. As a proof of concept of compatibility with advanced bottom-up liquid phase crystallized (LPC) Si thin film fabrication, the experimental structural analysis is further extended to double-side-textured deterministic aperiodic lattice-structured 10 μm thick large area LPC Si film on nanoimprinted substrates.

  3. Chaotic dynamics and control of deterministic ratchets.

    PubMed

    Family, Fereydoon; Larrondo, H A; Zarlenga, D G; Arizmendi, C M

    2005-11-30

    Deterministic ratchets, in both the inertial and the overdamped limit, have very complex dynamics, including chaotic motion. This deterministically induced chaos mimics, to some extent, the role of noise, while changing, on the other hand, some of the basic properties of thermal ratchets; for example, inertial ratchets can exhibit multiple reversals in the current direction. The direction depends on the amount of friction and inertia, which makes it especially interesting for technological applications such as biological particle separation. In this work we review different strategies to control the current of inertial ratchets. The control parameters analysed are the strength and frequency of the periodic external force, the strength of the quenched noise that models a non-perfectly-periodic potential, and the mass of the particles. Control mechanisms are associated with the fractal nature of the basins of attraction of the mean-velocity attractors. The control of the overdamped motion of noninteracting particles in a rocking periodic asymmetric potential is also reviewed. The analysis is focused on synchronization of the motion of the particles with the external sinusoidal driving force. Two cases are considered: a perfect lattice without disorder and a lattice with noncorrelated quenched noise. The amplitude of the driving force and the strength of the quenched noise are used as control parameters.
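
    A minimal sketch of the overdamped deterministic ratchet dynamics the review covers: a particle in a rocked asymmetric periodic potential, integrated with explicit Euler. The potential shape and parameter values are illustrative choices, not taken from the paper.

        import math

        def force(x):
            """Minus the derivative of an asymmetric ratchet potential
            V(x) = -[sin(2*pi*x) + 0.25*sin(4*pi*x)] / (2*pi)."""
            return math.cos(2 * math.pi * x) + 0.5 * math.cos(4 * math.pi * x)

        # Overdamped rocked ratchet: dx/dt = force(x) + A*cos(w*t).
        x, dt, A, w = 0.0, 1e-3, 1.2, 0.5
        steps = int(50 * 2 * math.pi / (w * dt))       # 50 driving periods
        for n in range(steps):
            x += dt * (force(x) + A * math.cos(w * n * dt))
        print("mean velocity ~", x / (steps * dt))     # nonzero: directed transport

    The asymmetry of the potential lets the slow rocking force unlock motion in one direction more easily than the other, so a nonzero mean velocity appears without any noise at all.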

  4. Deterministic composite nanophotonic lattices in large area for broadband applications

    PubMed Central

    Xavier, Jolly; Probst, Jürgen; Becker, Christiane

    2016-01-01

    Exotic manipulation of the flow of photons in nanoengineered materials with an aperiodic distribution of nanostructures plays a key role in efficiency-enhanced broadband photonic and plasmonic technologies for spectrally tailorable integrated biosensing, nanostructured thin film solarcells, white light emitting diodes, novel plasmonic ensembles etc. Through a generic deterministic nanotechnological route here we show subwavelength-scale silicon (Si) nanostructures on nanoimprinted glass substrate in large area (4 cm2) with advanced functional features of aperiodic composite nanophotonic lattices. These nanophotonic aperiodic lattices have easily tailorable supercell tiles with well-defined and discrete lattice basis elements and they show rich Fourier spectra. The presented nanophotonic lattices are designed functionally akin to two-dimensional aperiodic composite lattices with unconventional flexibility- comprising periodic photonic crystals and/or in-plane photonic quasicrystals as pattern design subsystems. The fabricated composite lattice-structured Si nanostructures are comparatively analyzed with a range of nanophotonic structures with conventional lattice geometries of periodic, disordered random as well as in-plane quasicrystalline photonic lattices with comparable lattice parameters. As a proof of concept of compatibility with advanced bottom-up liquid phase crystallized (LPC) Si thin film fabrication, the experimental structural analysis is further extended to double-side-textured deterministic aperiodic lattice-structured 10 μm thick large area LPC Si film on nanoimprinted substrates. PMID:27941869

  5. Comprehensive European dietary exposure model (CEDEM) for food additives.

    PubMed

    Tennant, David R

    2016-05-01

    European methods for assessing dietary exposures to nutrients, additives and other substances in food are limited by the availability of detailed food consumption data for all member states. A proposed comprehensive European dietary exposure model (CEDEM) applies summary data published by the European Food Safety Authority (EFSA) in a deterministic model based on an algorithm from the EFSA intake method for food additives. The proposed approach can predict estimates of food additive exposure provided in previous EFSA scientific opinions that were based on the full European food consumption database.
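
    The deterministic core of such an additive-exposure model is simple arithmetic: sum consumption times use level over food categories and divide by body weight. The sketch below illustrates that calculation; the categories, levels, and intakes are invented, not EFSA data, and CEDEM's actual algorithm is described in the paper.

        # exposure (mg/kg bw/day) =
        #     sum_i consumption_i (kg/day) * use_level_i (mg/kg food) / body_weight (kg)
        foods = {                              # illustrative categories only
            "soft drinks": (0.40, 150.0),      # (kg consumed/day, additive mg/kg)
            "desserts":    (0.10, 300.0),
            "sauces":      (0.05, 500.0),
        }
        body_weight_kg = 70.0
        exposure = sum(cons * level for cons, level in foods.values()) / body_weight_kg
        print(f"estimated exposure: {exposure:.2f} mg/kg bw/day")   # compare to the ADI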

  6. On Transform Domain Communication Systems under Spectrum Sensing Mismatch: A Deterministic Analysis.

    PubMed

    Jin, Chuanxue; Hu, Su; Huang, Yixuan; Luo, Qu; Huang, Dan; Li, Yi; Gao, Yuan; Cheng, Shaochi

    2017-07-08

    Towards the era of mobile Internet and the Internet of Things (IoT), numerous sensors and devices are being introduced and interconnected. To support this volume of data traffic, traditional wireless communication technologies face challenges both in the increasing shortage of spectrum resources and in massive multiple access. The transform-domain communication system (TDCS) is considered an alternative multiple access system, with 5G and mobile IoT as the main targets. However, previous studies of TDCS assume that the transceiver has global spectrum information, without considering spectrum sensing mismatch (SSM). In this paper, we present a deterministic analysis of TDCS systems under arbitrary given spectrum sensing scenarios, especially the influence of the SSM pattern on signal-to-noise ratio (SNR) performance. Simulation results show that an arbitrary SSM pattern can lead to inferior bit error rate (BER) performance.

  7. Interactive Reliability Model for Whisker-toughened Ceramics

    NASA Technical Reports Server (NTRS)

    Palko, Joseph L.

    1993-01-01

    Wider use of ceramic matrix composites (CMC) will require the development of advanced structural analysis technologies. The use of an interactive model to predict the time-independent reliability of a component subjected to multiaxial loads is discussed. The deterministic, three-parameter Willam-Warnke failure criterion serves as the theoretical basis for the reliability model. The strength parameters defining the model are assumed to be random variables, thereby transforming the deterministic failure criterion into a probabilistic criterion. The ability of the model to account for multiaxial stress states with the same unified theory is an improvement over existing models. The new model was coupled with a public-domain finite element program through an integrated design program. This allows a design engineer to predict the probability of failure of a component. A simple structural problem is analyzed using the new model, and the results are compared to existing models.

  8. Deterministic entanglement distillation for secure double-server blind quantum computation.

    PubMed

    Sheng, Yu-Bo; Zhou, Lan

    2015-01-15

    Blind quantum computation (BQC) provides an efficient method for a client who does not have enough sophisticated technology and knowledge to perform universal quantum computation. The single-server BQC protocol requires the client to have some minimum quantum ability, while the double-server BQC protocol makes the client's device completely classical, resorting to the pure and clean Bell state shared by two servers. Here, we provide a deterministic entanglement distillation protocol in a practical noisy environment for the double-server BQC protocol. This protocol can obtain the pure maximally entangled Bell state, and the success probability can reach 100% in principle. The distilled maximally entangled states can be retained to perform the BQC protocol subsequently. The parties who perform the distillation protocol do not need to exchange classical information, and they learn nothing from the client. This makes the protocol unconditionally secure and suitable for future BQC protocols.

  9. Deterministic entanglement distillation for secure double-server blind quantum computation

    PubMed Central

    Sheng, Yu-Bo; Zhou, Lan

    2015-01-01

    Blind quantum computation (BQC) provides an efficient method for a client who does not have enough sophisticated technology and knowledge to perform universal quantum computation. The single-server BQC protocol requires the client to have some minimum quantum ability, while the double-server BQC protocol makes the client's device completely classical, resorting to the pure and clean Bell state shared by two servers. Here, we provide a deterministic entanglement distillation protocol in a practical noisy environment for the double-server BQC protocol. This protocol can obtain the pure maximally entangled Bell state, and the success probability can reach 100% in principle. The distilled maximally entangled states can be retained to perform the BQC protocol subsequently. The parties who perform the distillation protocol do not need to exchange classical information, and they learn nothing from the client. This makes the protocol unconditionally secure and suitable for future BQC protocols. PMID:25588565

  10. Photonic Quantum Networks formed from NV− centers

    PubMed Central

    Nemoto, Kae; Trupke, Michael; Devitt, Simon J.; Scharfenberger, Burkhard; Buczak, Kathrin; Schmiedmayer, Jörg; Munro, William J.

    2016-01-01

    In this article we present a simple repeater scheme based on the negatively-charged nitrogen vacancy centre in diamond. Each repeater node is built from modules comprising an optical cavity containing a single NV−, with one nuclear spin from 15N as quantum memory. The module uses only deterministic processes and interactions to achieve high fidelity operations (>99%), and modules are connected by optical fiber. In the repeater node architecture, the photon-mediated processes between modules can in principle be deterministic; however, current limitations of optical components make these processes probabilistic, though heralded. Our resource-modest repeater architecture contains two modules at each node, and the repeater nodes are then connected by entangled photon pairs. We discuss the performance of such a quantum repeater network with modest resources and then incorporate more resource-intensive strategies step by step. Our architecture should allow large-scale quantum information networks with existing or near-future technology. PMID:27215433

  11. Data-driven gradient algorithm for high-precision quantum control

    NASA Astrophysics Data System (ADS)

    Wu, Re-Bing; Chu, Bing; Owens, David H.; Rabitz, Herschel

    2018-04-01

    In the quest to achieve scalable quantum information processing technologies, gradient-based optimal control algorithms (e.g., grape) are broadly used for implementing high-precision quantum gates, but their performance is often hindered by deterministic or random errors in the system model and the control electronics. In this paper, we show that grape can be taught to be more effective by jointly learning from the design model and the experimental data obtained from process tomography. The resulting data-driven gradient optimization algorithm (d-grape) can in principle correct all deterministic gate errors, with a mild efficiency loss. The d-grape algorithm may become more powerful with broadband controls that involve a large number of control parameters, while other algorithms usually slow down due to the increased size of the search space. These advantages are demonstrated by simulating the implementation of a two-qubit controlled-not gate.
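
    A toy illustration of the gradient-ascent idea behind GRAPE-type pulse design: climb a gate-fidelity landscape by finite differences for a single-qubit X rotation. This is a deliberately minimal stand-in under invented parameters, not the grape or d-grape algorithm itself.

        import math

        def gate_fidelity(theta, target=math.pi):
            """|Tr(U_target^dagger U(theta))| / 2 for rotations U = exp(-i*theta*X/2);
            analytically this reduces to |cos((theta - target) / 2)|."""
            return abs(math.cos((theta - target) / 2))

        # Finite-difference gradient ascent on the fidelity landscape.
        theta, lr, eps = 0.3, 0.5, 1e-6
        for _ in range(200):
            grad = (gate_fidelity(theta + eps) - gate_fidelity(theta - eps)) / (2 * eps)
            theta += lr * grad
        print(round(theta, 4), round(gate_fidelity(theta), 6))   # -> ~3.1416, ~1.0

    In d-grape the same climb would be informed by process-tomography data rather than the design model alone, which is how deterministic model errors get corrected.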

  12. Single-photon non-linear optics with a quantum dot in a waveguide

    NASA Astrophysics Data System (ADS)

    Javadi, A.; Söllner, I.; Arcari, M.; Hansen, S. Lindskov; Midolo, L.; Mahmoodian, S.; Kiršanskė, G.; Pregnolato, T.; Lee, E. H.; Song, J. D.; Stobbe, S.; Lodahl, P.

    2015-10-01

    Strong non-linear interactions between photons enable logic operations for both classical and quantum-information technology. Unfortunately, non-linear interactions are usually feeble and therefore all-optical logic gates tend to be inefficient. A quantum emitter deterministically coupled to a propagating mode fundamentally changes the situation, since each photon inevitably interacts with the emitter, and highly correlated many-photon states may be created. Here we show that a single quantum dot in a photonic-crystal waveguide can be used as a giant non-linearity sensitive at the single-photon level. The non-linear response is revealed from the intensity and quantum statistics of the scattered photons, and contains contributions from an entangled photon-photon bound state. The quantum non-linearity will find immediate applications for deterministic Bell-state measurements and single-photon transistors and paves the way to scalable waveguide-based photonic quantum-computing architectures.

  13. Photonic Quantum Networks formed from NV(-) centers.

    PubMed

    Nemoto, Kae; Trupke, Michael; Devitt, Simon J; Scharfenberger, Burkhard; Buczak, Kathrin; Schmiedmayer, Jörg; Munro, William J

    2016-05-24

    In this article we present a simple repeater scheme based on the negatively-charged nitrogen vacancy centre in diamond. Each repeater node is built from modules comprising an optical cavity containing a single NV(-), with one nuclear spin from (15)N as quantum memory. The module uses only deterministic processes and interactions to achieve high fidelity operations (>99%), and modules are connected by optical fiber. In the repeater node architecture, the photon-mediated processes between modules can in principle be deterministic; however, current limitations of optical components make these processes probabilistic, though heralded. Our resource-modest repeater architecture contains two modules at each node, and the repeater nodes are then connected by entangled photon pairs. We discuss the performance of such a quantum repeater network with modest resources and then incorporate more resource-intensive strategies step by step. Our architecture should allow large-scale quantum information networks with existing or near-future technology.

  14. Deterministic Joint Remote Preparation of an Arbitrary Seven-qubit Cluster-type State

    NASA Astrophysics Data System (ADS)

    Ding, MengXiao; Jiang, Min

    2017-06-01

    In this paper, we propose a scheme for jointly remotely preparing an arbitrary seven-qubit cluster-type state by using several GHZ entangled states as the quantum channel. The coefficients of the prepared states can be not only real but also complex. First, Alice performs a three-qubit projective measurement according to the amplitude coefficients of the target state, and then Bob carries out another three-qubit projective measurement based on its phase coefficients. Next, a three-qubit state containing all information of the target state is prepared with suitable operations. Finally, the target seven-qubit cluster-type state can be prepared deterministically by introducing four auxiliary qubits and performing appropriate local unitary operations on the prepared three-qubit state. All of the receiver's recovery operations are summarized in a concise formula. Furthermore, our scheme is more feasible with present technologies than most previous schemes.

  15. On Transform Domain Communication Systems under Spectrum Sensing Mismatch: A Deterministic Analysis

    PubMed Central

    Jin, Chuanxue; Hu, Su; Huang, Yixuan; Luo, Qu; Huang, Dan; Li, Yi; Cheng, Shaochi

    2017-01-01

    Towards the era of mobile Internet and the Internet of Things (IoT), numerous sensors and devices are being introduced and interconnected. To support this volume of data traffic, traditional wireless communication technologies face challenges both in the increasing shortage of spectrum resources and in massive multiple access. The transform-domain communication system (TDCS) is considered an alternative multiple access system, with 5G and mobile IoT as the main targets. However, previous studies of TDCS assume that the transceiver has global spectrum information, without considering spectrum sensing mismatch (SSM). In this paper, we present a deterministic analysis of TDCS systems under arbitrary given spectrum sensing scenarios, especially the influence of the SSM pattern on signal-to-noise ratio (SNR) performance. Simulation results show that an arbitrary SSM pattern can lead to inferior bit error rate (BER) performance. PMID:28698477

  16. FMCSA’s advanced system testing utilizing a data acquisition system on the highways (FAST DASH) safety technology evaluation project #2 : technology brief.

    DOT National Transportation Integrated Search

    2016-11-01

    The Federal Motor Carrier Safety Administration (FMCSA) established the FAST DASH program to perform efficient independent evaluations of promising safety technologies aimed at commercial vehicle operations. In this second FAST DASH safety technology...

  17. The impact of health information technology on patient safety.

    PubMed

    Alotaibi, Yasser K; Federico, Frank

    2017-12-01

    Since the original Institute of Medicine (IOM) report was published, there has been accelerated development and adoption of health information technology, with varying degrees of evidence about its impact on patient safety. This article reviews the currently available scientific evidence on the impact of different health information technologies on improving patient safety outcomes. We conclude that health information technology improves patient safety by reducing medication errors, reducing adverse drug reactions, and improving compliance with practice guidelines. There should be no doubt that health information technology is an important tool for improving healthcare quality and safety. Healthcare organizations need to be selective in which technology to invest in, as the literature shows that some technologies have limited evidence of improving patient safety outcomes.

  18. The impact of health information technology on patient safety

    PubMed Central

    Alotaibi, Yasser K.; Federico, Frank

    2017-01-01

    Since the original Institute of Medicine (IOM) report was published, there has been accelerated development and adoption of health information technology, with varying degrees of evidence about its impact on patient safety. This article reviews the currently available scientific evidence on the impact of different health information technologies on improving patient safety outcomes. We conclude that health information technology improves patient safety by reducing medication errors, reducing adverse drug reactions, and improving compliance with practice guidelines. There should be no doubt that health information technology is an important tool for improving healthcare quality and safety. Healthcare organizations need to be selective in which technology to invest in, as the literature shows that some technologies have limited evidence of improving patient safety outcomes. PMID:29209664

  19. FMCSA’s advanced system testing utilizing a data acquisition system on the highways (FAST DASH) safety technology evaluation project #3 : novel convex mirrors : technology brief.

    DOT National Transportation Integrated Search

    2016-11-01

    The Federal Motor Carrier Safety Administration (FMCSA) established the FAST DASH program to perform efficient independent evaluations of promising safety technologies aimed at commercial vehicle operations. In this third FAST DASH safety technology ...

  20. Deterministic chaotic dynamics of Raba River flow (Polish Carpathian Mountains)

    NASA Astrophysics Data System (ADS)

    Kędra, Mariola

    2014-02-01

    Is the underlying dynamics of river flow random or deterministic? If it is deterministic, is it deterministic chaotic? This issue is still controversial. The application of several independent methods, techniques and tools for studying daily river flow data gives consistent, reliable and clear-cut answers to the question. The outcomes indicate that the investigated discharge dynamics is not random but deterministic. Moreover, the results completely confirm the nonlinear deterministic chaotic nature of the studied process. The research was conducted on daily discharge from two selected gauging stations of a mountain river in southern Poland, the Raba River.

  1. Safety System Design for Technology Education. A Safety Guide for Technology Education Courses K-12.

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh. Div. of Vocational Education.

    This manual is designed to involve both teachers and students in planning and controlling a safety system for technology education classrooms. The safety program involves students in the design and maintenance of the system by including them in the analysis of the classroom environment, job safety analysis, safety inspection, and machine safety…

  2. Applications of aerospace technology in industry, a technology transfer profile: Fire safety

    NASA Technical Reports Server (NTRS)

    Kottenstette, J. P.; Freeman, J. E.; Heins, C. R.; Hildred, W. M.; Johnson, F. D.; Staskin, E. R.

    1971-01-01

    The fire safety field is considered as being composed of three parts: an industry, a technology base, and a user base. An overview of the field is presented, including a perspective on the magnitude of the national fire safety problem. Selected NASA contributions to the technology of fire safety are considered. Communication mechanisms, particularly conferences and publications, used by NASA to alert the community to new developments in the fire safety field, are reviewed. Several examples of nonaerospace applications of NASA-generated fire safety technology are also presented. Issues associated with attempts to transfer this technology from the space program to other sectors of the American economy are outlined.

  3. Hydrogen and Fuel Cell Technology | Transportation Research | NREL

    Science.gov Websites

    Outlines safety considerations for hydrogen technologies. While safety requirements exist for industrial uses of hydrogen, uses in vehicles have created the need for additional safety requirements. The new Hydrogen Technologies Safety guide puts hydrogen safety in context, for example for code officials reviewing permit applications for hydrogen projects.

  4. Improvement of driving safety in road traffic system

    NASA Astrophysics Data System (ADS)

    Li, Ke-Ping; Gao, Zi-You

    2005-05-01

    A road traffic system is a complex system in which humans participate directly. In this system, human factors play a very important role. In this paper, a control signal is placed at a given site (i.e., a signal point) on the road. Under the effect of the control signal, drivers decrease their velocities when their vehicles pass the signal point. Our aim is to shift the traffic flow from a disordered to an ordered state and thereby improve traffic safety. We have tested this technique on a two-lane traffic model based on the deterministic Nagel-Schreckenberg (NaSch) traffic model. The simulation results indicate that the traffic flow can indeed be shifted from disorder to order. Different ordered states can be observed in the system, and these states are safer.
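
    A single-lane sketch of the deterministic Nagel-Schreckenberg update underlying such a model (randomization probability set to zero), with a signal point where vehicles cap their speed. The control rule and parameters here are illustrative; the paper's model is two-lane.

        # Deterministic NaSch on a ring road with a speed-limiting signal point.
        L, VMAX, SIGNAL_CELL, V_SIGNAL = 100, 5, 50, 2
        pos = list(range(0, 40, 2))            # 20 vehicles, initially bunched
        vel = [0] * len(pos)

        for _ in range(200):
            order = sorted(range(len(pos)), key=lambda i: pos[i])
            for k, i in enumerate(order):
                leader = pos[order[(k + 1) % len(order)]]
                gap = (leader - pos[i] - 1) % L
                v = min(vel[i] + 1, VMAX, gap)         # accelerate, never collide
                if pos[i] == SIGNAL_CELL:              # control signal slows cars
                    v = min(v, V_SIGNAL)
                vel[i] = v
            pos = [(p + v) % L for p, v in zip(pos, vel)]

        print("mean speed:", sum(vel) / len(vel))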

  5. Current and anticipated uses of thermalhydraulic and neutronic codes at PSI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aksan, S.N.; Zimmermann, M.A.; Yadigaroglu, G.

    1997-07-01

    The thermalhydraulic and/or neutronic codes in use at PSI mainly provide the capability to perform deterministic safety analysis for Swiss NPPs and also serve as analysis tools for experimental facilities for LWR and ALWR simulations. In relation to these applications, physical model development and improvements, and assessment of the codes are also essential components of the activities. In this paper, a brief overview is provided on the thermalhydraulic and/or neutronic codes used for safety analysis of LWRs at PSI, and also of some experiences and applications with these codes. Based on these experiences, additional assessment needs are indicated, together with some model improvement needs. The future needs that could be used to specify both the development of a new code and also improvement of available codes are summarized.

  6. Probabilistic Design Methodology and its Application to the Design of an Umbilical Retract Mechanism

    NASA Technical Reports Server (NTRS)

    Onyebueke, Landon; Ameye, Olusesan

    2002-01-01

    A lot has been learned from past experience with structural and machine element failures. The understanding of failure modes and the application of an appropriate design analysis method can lead to improved structural and machine element safety as well as serviceability. To apply Probabilistic Design Methodology (PDM), all uncertainties are modeled as random variables with selected distribution types, means, and standard deviations. It is quite difficult to achieve a robust design without considering the randomness of the design parameters, which is neglected in the Deterministic Design Approach. The US Navy has a fleet of submarine-launched ballistic missiles. An umbilical plug joins the missile to the submarine in order to provide electrical and cooling water connections. As the missile leaves the submarine, an umbilical retract mechanism retracts the umbilical plug clear of the advancing missile after disengagement during launch and restrains the plug in the retracted position. The design of the retract mechanism currently in use was based on the deterministic approach, which puts emphasis on factor of safety. A new umbilical retract mechanism that is simpler in design, lighter in weight, more reliable, easier to adjust, and more cost effective has become desirable, since this will increase the performance and efficiency of the system. This paper reports on a recent project performed at Tennessee State University for the US Navy that involved the application of PDM to the design of an umbilical retract mechanism. This paper demonstrates how the use of PDM led to the minimization of weight and cost and the maximization of reliability and performance.
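
    A minimal sketch of the contrast the paper draws, assuming normally distributed strength and load with invented numbers: the deterministic approach reports a single factor of safety from the means, while PDM estimates a failure probability from the overlap of the two distributions.

        import random

        random.seed(1)
        N = 100_000
        strength = [random.gauss(120.0, 8.0) for _ in range(N)]   # resistance R
        load     = [random.gauss(90.0, 10.0) for _ in range(N)]   # demand S

        fs = 120.0 / 90.0                                  # mean strength / mean load
        p_fail = sum(r < s for r, s in zip(strength, load)) / N
        print(f"deterministic factor of safety: {fs:.2f}")
        print(f"probabilistic P(failure):       {p_fail:.4f}")    # ~1e-2 here

    A factor of safety of 1.33 looks comfortable, yet with these spreads roughly one trial in a hundred fails, which is exactly the kind of information the deterministic approach cannot express.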

  7. On-chip single photon filtering and multiplexing in hybrid quantum photonic circuits.

    PubMed

    Elshaari, Ali W; Zadeh, Iman Esmaeil; Fognini, Andreas; Reimer, Michael E; Dalacu, Dan; Poole, Philip J; Zwiller, Val; Jöns, Klaus D

    2017-08-30

    Quantum light plays a pivotal role in modern science and future photonic applications. Since the advent of integrated quantum nanophotonics, different material platforms based on III-V nanostructures, colour centers, and nonlinear waveguides as on-chip light sources have been investigated. Each platform has unique advantages and limitations; however, all implementations face major challenges with filtering of individual quantum states, scalable integration, deterministic multiplexing of selected quantum emitters, and on-chip excitation suppression. Here we overcome all of these challenges with a hybrid and scalable approach, where single III-V quantum emitters are positioned and deterministically integrated in a complementary metal-oxide-semiconductor-compatible photonic circuit. We demonstrate reconfigurable on-chip single-photon filtering and wavelength division multiplexing with a footprint one million times smaller than similar table-top approaches, while offering excitation suppression of more than 95 dB and efficient routing of single photons over a bandwidth of 40 nm. Our work marks an important step to harvest quantum optical technologies' full potential. Combining different integration platforms on the same chip is currently one of the main challenges for quantum technologies. Here, Elshaari et al. show III-V quantum dots embedded in nanowires operating in a CMOS-compatible circuit, with controlled on-chip filtering and tunable routing.

  8. Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia

    2015-04-26

    The Risk-Informed Safety Margin Characterization (RISMC), developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and a capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, or in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact on statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals related to the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative method to gauge the increase in statistical accuracy due to performing additional simulations.
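
    A minimal sketch of the RISMC-style load/capacity comparison with a classical confidence interval on the estimated failure probability. The input distributions are invented, and the Wilson score interval below stands in for whichever interval constructions the report develops.

        import math, random

        random.seed(7)

        def estimate_failures(n):
            """Monte Carlo count of trials with load >= capacity (illustrative inputs)."""
            return sum(random.lognormvariate(0.0, 0.3) >=     # load sample
                       random.lognormvariate(0.8, 0.2)        # capacity sample
                       for _ in range(n))

        def wilson_95(fails, n, z=1.96):
            """95% Wilson score interval for a binomial proportion."""
            p, d = fails / n, 1 + z * z / n
            center = (p + z * z / (2 * n)) / d
            half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / d
            return center - half, center + half

        for n in (1_000, 10_000, 100_000):     # more simulations, tighter interval
            f = estimate_failures(n)
            lo, hi = wilson_95(f, n)
            print(f"n = {n:>6}: p_f ~ {f / n:.4f}, 95% CI [{lo:.4f}, {hi:.4f}]")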

  9. GUINEVERE experiment: Kinetic analysis of some reactivity measurement methods by deterministic and Monte Carlo codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bianchini, G.; Burgio, N.; Carta, M.

    The GUINEVERE experiment (Generation of Uninterrupted Intense Neutrons at the lead Venus Reactor) is an experimental program in support of ADS technology presently carried out at SCK-CEN in Mol (Belgium). In the experiment a modified lay-out of the original thermal VENUS critical facility is coupled to an accelerator, built by the French body CNRS in Grenoble, working in both continuous and pulsed mode and delivering 14 MeV neutrons by bombardment of deuterons on a tritium target. The modified lay-out of the facility consists of a fast subcritical core made of 30% U-235 enriched metallic uranium in a lead matrix. Several off-line and on-line reactivity measurement techniques will be investigated during the experimental campaign. This report is focused on the simulation by deterministic (ERANOS French code) and Monte Carlo (MCNPX US code) calculations of three reactivity measurement techniques, Slope (α-fitting), Area-ratio and Source-jerk, applied to a GUINEVERE subcritical configuration (namely SC1). The inferred reactivity, in dollar units, by the Area-ratio method shows an overall agreement between the two deterministic and Monte Carlo computational approaches, whereas the MCNPX Source-jerk results are affected by large uncertainties and allow only partial conclusions about the comparison. Finally, no particular spatial dependence of the results is observed in the case of the GUINEVERE SC1 subcritical configuration. (authors)

  10. A statistical approach to nuclear fuel design and performance

    NASA Astrophysics Data System (ADS)

    Cunning, Travis Andrew

    As CANDU fuel failures can have significant economic and operational consequences on the Canadian nuclear power industry, it is essential that factors impacting fuel performance are adequately understood. Current industrial practice relies on deterministic safety analysis and the highly conservative "limit of operating envelope" approach, where all parameters are assumed to be at their limits simultaneously. This results in a conservative prediction of event consequences with little consideration given to the high quality and precision of current manufacturing processes. This study employs a novel approach to the prediction of CANDU fuel reliability. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to form input for two industry-standard fuel performance codes: ELESTRES for the steady-state case and ELOCA for the transient case, a hypothesized 80% reactor outlet header break loss-of-coolant accident. Using a Monte Carlo technique for input generation, 10⁵ independent trials are conducted and probability distributions are fitted to key model output quantities. Comparing model output against recognized industrial acceptance criteria, no fuel failures are predicted for either case. Output distributions are well removed from failure limit values, implying that margin exists in current fuel manufacturing and design. To validate the results and attempt to reduce the simulation burden of the methodology, two dimensional-reduction methods are assessed. Using just 36 trials, both methods are able to produce output distributions that agree strongly with those obtained via the brute-force Monte Carlo method, often to a relative discrepancy of less than 0.3% when predicting the first statistical moment, and a relative discrepancy of less than 5% when predicting the second statistical moment. In terms of global sensitivity, pellet density proves to have the greatest impact on fuel performance, with an average sensitivity index of 48.93% on key output quantities. Pellet grain size and dish depth are also significant contributors, at 31.53% and 13.46%, respectively. A traditional limit of operating envelope case is also evaluated. This case produces output values that exceed the maximum values observed during the 10⁵ Monte Carlo trials for all output quantities of interest. In many cases the difference between the predictions of the two methods is very prominent, and the highly conservative nature of the deterministic approach is demonstrated. A reliability analysis of CANDU fuel manufacturing parametric data, specifically pertaining to the quantification of fuel performance margins, has not been conducted previously. Key Words: CANDU, nuclear fuel, Cameco, fuel manufacturing, fuel modelling, fuel performance, fuel reliability, ELESTRES, ELOCA, dimensional reduction methods, global sensitivity analysis, deterministic safety analysis, probabilistic safety analysis.

  11. Exploring the sociotechnical intersection of patient safety and electronic health record implementation

    PubMed Central

    Meeks, Derek W; Takian, Amirhossein; Sittig, Dean F; Singh, Hardeep; Barber, Nick

    2014-01-01

    Objective: The intersection of electronic health records (EHR) and patient safety is complex. To examine the applicability of two previously developed conceptual models comprehensively to understand safety implications of EHR implementation in the English National Health Service (NHS). Methods: We conducted a secondary analysis of interview data from a 30-month longitudinal, prospective, case study-based evaluation of EHR implementation in 12 NHS hospitals. We used a framework analysis approach to apply conceptual models developed by Sittig and Singh to better understand EHR implementation and use: an eight-dimension sociotechnical model and a three-phase patient safety model (safe technology, safe use of technology, and use of technology to improve safety). Results: The intersection of patient safety and EHR implementation and use was characterized by risks involving technology (hardware and software, clinical content, and human–computer interfaces), the interaction of technology with non-technological factors, and improper or unsafe use of technology. Our data support that patient safety improvement activities as well as patient safety hazards change as an organization evolves from concerns about safe EHR functionality, to ensuring safe and appropriate EHR use, to using the EHR itself to provide ongoing surveillance and monitoring of patient safety. Discussion: We demonstrate the face validity of two models for understanding the sociotechnical aspects of safe EHR implementation and the complex interactions of technology within a healthcare system evolving from paper to integrated EHR. Conclusions: Using sociotechnical models, including those presented in this paper, may be beneficial to help stakeholders understand, synthesize, and anticipate risks at the intersection of patient safety and health information technology. PMID:24052536

  12. Exploring the sociotechnical intersection of patient safety and electronic health record implementation.

    PubMed

    Meeks, Derek W; Takian, Amirhossein; Sittig, Dean F; Singh, Hardeep; Barber, Nick

    2014-02-01

    The intersection of electronic health records (EHR) and patient safety is complex. To examine the applicability of two previously developed conceptual models comprehensively to understand safety implications of EHR implementation in the English National Health Service (NHS). We conducted a secondary analysis of interview data from a 30-month longitudinal, prospective, case study-based evaluation of EHR implementation in 12 NHS hospitals. We used a framework analysis approach to apply conceptual models developed by Sittig and Singh to better understand EHR implementation and use: an eight-dimension sociotechnical model and a three-phase patient safety model (safe technology, safe use of technology, and use of technology to improve safety). The intersection of patient safety and EHR implementation and use was characterized by risks involving technology (hardware and software, clinical content, and human-computer interfaces), the interaction of technology with non-technological factors, and improper or unsafe use of technology. Our data support that patient safety improvement activities as well as patient safety hazards change as an organization evolves from concerns about safe EHR functionality, to ensuring safe and appropriate EHR use, to using the EHR itself to provide ongoing surveillance and monitoring of patient safety. We demonstrate the face validity of two models for understanding the sociotechnical aspects of safe EHR implementation and the complex interactions of technology within a healthcare system evolving from paper to integrated EHR. Using sociotechnical models, including those presented in this paper, may be beneficial to help stakeholders understand, synthesize, and anticipate risks at the intersection of patient safety and health information technology.

  13. Sustainability likelihood of remediation options for metal-contaminated soil/sediment.

    PubMed

    Chen, Season S; Taylor, Jessica S; Baek, Kitae; Khan, Eakalak; Tsang, Daniel C W; Ok, Yong Sik

    2017-05-01

    Multi-criteria analysis and detailed impact analysis were carried out to assess the sustainability of four remedial alternatives for metal-contaminated soil/sediment at former timber treatment sites and harbour sediment with different scales. Sustainability was evaluated in the aspects of human health and safety, environment, stakeholder concern, and land use, under four different scenarios with varying weighting factors. Monte Carlo simulation was performed to reveal the likelihood of accomplishing sustainable remediation with different treatment options at different sites. The results showed that in-situ remedial technologies were more sustainable than ex-situ ones, with in-situ containment demonstrating both the most sustainable result and the highest probability of achieving sustainability amongst the four remedial alternatives in this study, reflecting the lesser extent of off-site and on-site impacts. Concerns associated with ex-situ options were adverse impacts tied to all four aspects and caused by excavation, extraction, and off-site disposal. The results of this study suggest the importance of considering the uncertainties resulting from the remedial options (i.e., stochastic analysis) in addition to the overall sustainability scores (i.e., deterministic analysis). The developed framework and model simulation could serve as an assessment of the sustainability likelihood of remedial options to ensure sustainable remediation of contaminated sites.
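
    A minimal sketch of the stochastic layer on top of a weighted multi-criteria score: perturb each criterion score, recompute the weighted sum, and read off the likelihood of clearing a sustainability threshold. Weights, scores, spreads, and the threshold are all invented for illustration, not the paper's values.

        import random

        random.seed(3)
        weights = {"health": 0.3, "environment": 0.3,
                   "stakeholder": 0.2, "land use": 0.2}
        options = {          # (mean, spread) per criterion on a 0-10 scale
            "in-situ containment": {"health": (8, 1), "environment": (7, 1),
                                    "stakeholder": (7, 2), "land use": (8, 1)},
            "excavate and dispose": {"health": (5, 2), "environment": (4, 2),
                                     "stakeholder": (6, 2), "land use": (7, 1)},
        }
        THRESHOLD, N = 7.0, 20_000
        for name, crit in options.items():
            hits = sum(sum(w * random.gauss(*crit[c]) for c, w in weights.items())
                       >= THRESHOLD for _ in range(N))
            print(f"{name}: P(score >= {THRESHOLD}) ~ {hits / N:.2f}")

    The deterministic scores alone would rank the options the same way; the simulated probabilities add how robust that ranking is to uncertainty in the inputs.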

  14. Performance assessment of deterministic and probabilistic weather predictions for the short-term optimization of a tropical hydropower reservoir

    NASA Astrophysics Data System (ADS)

    Mainardi Fan, Fernando; Schwanenberg, Dirk; Alvarado, Rodolfo; Assis dos Reis, Alberto; Naumann, Steffi; Collischonn, Walter

    2016-04-01

    Hydropower is the most important electricity source in Brazil. During recent years, it accounted for 60% to 70% of the total electric power supply. Marginal costs of hydropower are lower than those of thermal power plants; therefore, there is a strong economic motivation to maximize its share. On the other hand, hydropower depends on the availability of water, which has a natural variability. Its extremes lead to the risks of power production deficits during droughts and safety issues in the reservoir and downstream river reaches during flood events. One building block of the proper management of hydropower assets is the short-term forecast of reservoir inflows as input for an online, event-based optimization of the release strategy. While deterministic forecasts and optimization schemes are the established techniques for short-term reservoir management, the use of probabilistic ensemble forecasts and stochastic optimization techniques is receiving growing attention, and a number of studies have shown their benefit. The present work shows one of the first hindcasting and closed-loop control experiments for a multi-purpose hydropower reservoir in a tropical region of Brazil. The case study is the hydropower project (HPP) Três Marias, located in southeast Brazil. The HPP reservoir is operated with two main objectives: (i) hydroelectricity generation and (ii) flood control at Pirapora City, located 120 km downstream of the dam. In the experiments, precipitation forecasts based on observed data, as well as deterministic and probabilistic forecasts with 50 ensemble members from the ECMWF, are used to force the MGB-IPH hydrological model and generate streamflow forecasts over a period of 2 years. The online optimization relies on deterministic and multi-stage stochastic versions of a model predictive control scheme. Results for the perfect forecasts show the potential benefit of the online optimization and indicate a desired forecast lead time of 30 days. In comparison, the use of actual forecasts with shorter lead times of up to 15 days shows the practical benefit of actual operational data. It appears that the use of stochastic optimization combined with ensemble forecasts leads to a significantly higher level of flood protection without compromising the HPP's energy production.

  15. PAM-4 Signaling over VCSELs with 0.13µm CMOS Chip Technology

    NASA Astrophysics Data System (ADS)

    Cunningham, J. E.; Beckman, D.; Zheng, Xuezhe; Huang, Dawei; Sze, T.; Krishnamoorthy, A. V.

    2006-12-01

    We present results for VCSEL-based links operating with PAM-4 signaling using a commercial 0.13 µm CMOS technology. We perform a complete link analysis of the bit error rate, Q factor, and random and deterministic jitter by measuring waterfall curves versus margins in time and amplitude. We demonstrate that VCSEL-based PAM-4 can match or even exceed the performance of binary signaling under the conditions of a bandwidth-limited, 100-meter multi-mode optical link at 5 Gbps. We present the first sensitivity measurements for optical PAM-4 and compare them with binary signaling. Measured benefits are reconciled with information theory predictions.
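
    Under the usual Gaussian-noise assumption, a measured Q factor maps to BER through BER = 0.5 * erfc(Q / sqrt(2)); a minimal sketch of that conversion for a single decision threshold (PAM-4's three eyes and any coding are ignored, and the Q values are illustrative).

        import math

        def ber_from_q(q):
            """Gaussian-noise BER at a single decision threshold."""
            return 0.5 * math.erfc(q / math.sqrt(2))

        for q in (6.0, 7.0):
            print(f"Q = {q}: BER ~ {ber_from_q(q):.2e}")
        # Q ~ 7 corresponds to BER ~ 1e-12, a common optical-link target.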

  16. PAM-4 Signaling over VCSELs with 0.13microm CMOS Chip Technology.

    PubMed

    Cunningham, J E; Beckman, D; Zheng, Xuezhe; Huang, Dawei; Sze, T; Krishnamoorthy, A V

    2006-12-11

    We present results for VCSEL-based links operating with PAM-4 signaling using a commercial 0.13 µm CMOS technology. We perform a complete link analysis of the bit error rate, Q factor, and random and deterministic jitter by measuring waterfall curves versus margins in time and amplitude. We demonstrate that VCSEL-based PAM-4 can match or even exceed the performance of binary signaling under the conditions of a bandwidth-limited, 100-meter multi-mode optical link at 5 Gbps. We present the first sensitivity measurements for optical PAM-4 and compare them with binary signaling. Measured benefits are reconciled with information theory predictions.

  17. Introduction of Virtualization Technology to Multi-Process Model Checking

    NASA Technical Reports Server (NTRS)

    Leungwattanakit, Watcharin; Artho, Cyrille; Hagiya, Masami; Tanabe, Yoshinori; Yamamoto, Mitsuharu

    2009-01-01

    Model checkers find failures in software by exploring every possible execution schedule. Java PathFinder (JPF), a Java model checker, has been extended recently to cover networked applications by caching data transferred in a communication channel. A target process is executed by JPF, whereas its peer process runs on a regular virtual machine outside. However, non-deterministic target programs may produce different output data in each schedule, causing the cache to restart the peer process to handle the different set of data. Virtualization tools could help us restore previous states of peers, eliminating peer restart. This paper proposes the application of virtualization technology to networked model checking, concentrating on JPF.

  18. Criticality Calculations with MCNP6 - Practical Lectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2016-11-29

    These slides are used to teach MCNP (Monte Carlo N-Particle) usage to nuclear criticality safety analysts. The lecture topics are: course information, introduction, MCNP basics, criticality calculations, advanced geometry, tallies, adjoint-weighted tallies and sensitivities, physics and nuclear data, parameter studies, NCS validation I, NCS validation II, NCS validation III, case study 1 - solution tanks, case study 2 - fuel vault, case study 3 - B&W core, case study 4 - simple TRIGA, case study 5 - fissile mat. vault, criticality accident alarm systems. After completion of this course, you should be able to: develop an input model for MCNP; describe how cross section data impact Monte Carlo and deterministic codes; describe the importance of validation of computer codes and how it is accomplished; describe the methodology supporting Monte Carlo codes and deterministic codes; describe pitfalls of Monte Carlo calculations; and discuss the strengths and weaknesses of Monte Carlo and discrete ordinates codes. The diffusion theory model is not strictly valid for treating fissile systems in which neutron absorption, voids, and/or material boundaries are present; in the context of these limitations, you should be able to identify a fissile system for which a diffusion theory solution would be adequate.

  19. Laser targets compensate for limitations in inertial confinement fusion drivers

    NASA Astrophysics Data System (ADS)

    Kilkenny, J. D.; Alexander, N. B.; Nikroo, A.; Steinman, D. A.; Nobile, A.; Bernat, T.; Cook, R.; Letts, S.; Takagi, M.; Harding, D.

    2005-10-01

    Success in inertial confinement fusion (ICF) requires sophisticated, well-characterized targets. The increasing fidelity of three-dimensional (3D) radiation-hydrodynamic computer codes has made it possible to design ICF targets that can compensate for limitations in the existing single-shot laser and Z-pinch ICF drivers. Developments in ICF target fabrication technology allow more esoteric target designs to be fabricated. Present requirements call for new deterministic nano-material fabrication on the micro scale.

  20. Considering inventory distributions in a stochastic periodic inventory routing system

    NASA Astrophysics Data System (ADS)

    Yadollahi, Ehsan; Aghezzaf, El-Houssaine

    2017-07-01

    Dealing with the stochasticity of parameters is one of the critical issues in business and industry nowadays. Supply chain planners have difficulty forecasting the stochastic parameters of a distribution system, among them the demand rates of customers during their lead time. In addition, holding a large inventory at the retailers is costly and inefficient. To cover the uncertainty in forecast demand rates, researchers have proposed the use of safety stock to avoid stock-outs. However, setting the precise level of safety stock depends on forecasting the statistical distribution of demand rates and their variation in different settings across the planning horizon. In this paper the demand rate distributions and their parameters are taken into account for each time period in a stochastic periodic inventory routing problem (IRP). An analysis of the resulting statistical distribution of the inventory and safety stock levels is provided to measure the effects of input parameters on the output indicators. Different values of the coefficient of variation are applied to the customers' demand rates in the optimization model. The outcome of the deterministic equivalent model of the stochastic periodic IRP (SPIRP) is simulated in the form of an illustrative case.
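
    A minimal sketch of the textbook safety-stock rule alluded to above, assuming normally distributed demand over the lead time (SS = z·σ_d·√L); the service level, demand deviation, and lead time below are illustrative:

      # Textbook safety-stock rule for normally distributed lead-time demand:
      # SS = z * sigma_d * sqrt(L). All values below are hypothetical.
      from statistics import NormalDist

      def safety_stock(service_level, sigma_d, lead_time):
          z = NormalDist().inv_cdf(service_level)   # safety factor for target service level
          return z * sigma_d * lead_time ** 0.5

      # e.g. 95% cycle service level, demand std 20 units/period, 4-period lead time
      print(f"safety stock ~ {safety_stock(0.95, sigma_d=20.0, lead_time=4):.1f} units")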

  1. Processes of technology assessment: The National Transportation Safety Board

    NASA Technical Reports Server (NTRS)

    Weiss, E.

    1972-01-01

    The functions and operations of the Safety Board as related to technology assessment are described, and a brief history of the Safety Board is given. Recommendations made for safety in all areas of transportation and the actions taken are listed. Although accident investigation is an important aspect of NTSB's activity, it is felt that the greatest contribution is in pressing for development of better accident prevention programs. Efforts of the Safety Board in changing transportation technology to improve safety and prevent accidents are illustrated.

  2. Deterministic and robust generation of single photons from a single quantum dot with 99.5% indistinguishability using adiabatic rapid passage.

    PubMed

    Wei, Yu-Jia; He, Yu-Ming; Chen, Ming-Cheng; Hu, Yi-Nan; He, Yu; Wu, Dian; Schneider, Christian; Kamp, Martin; Höfling, Sven; Lu, Chao-Yang; Pan, Jian-Wei

    2014-11-12

    Single photons are attractive candidates of quantum bits (qubits) for quantum computation and are the best messengers in quantum networks. Future scalable, fault-tolerant photonic quantum technologies demand both stringently high levels of photon indistinguishability and generation efficiency. Here, we demonstrate deterministic and robust generation of pulsed resonance fluorescence single photons from a single semiconductor quantum dot using adiabatic rapid passage, a method robust against fluctuation of driving pulse area and dipole moments of solid-state emitters. The emitted photons are background-free, have a vanishing two-photon emission probability of 0.3% and a raw (corrected) two-photon Hong-Ou-Mandel interference visibility of 97.9% (99.5%), reaching a precision that places single photons at the threshold for fault-tolerant surface-code quantum computing. This single-photon source can be readily scaled up to multiphoton entanglement and used for quantum metrology, boson sampling, and linear optical quantum computing.

  3. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing

    PubMed Central

    Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier

    2009-01-01

    The increasing capability of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is toward the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach that minimizes an energy function. Its main contribution is its ability to avoid local minima during the optimization process, thanks to the annealing scheme. It outperforms the simple classifiers used in the combination and several combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989
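
    To give a flavor of the annealing idea (a hedged sketch, not the authors' DSA energy function): a soft combination of two classifiers' class probabilities through a temperature-controlled softmax, which hardens toward the product-rule decision as the temperature is lowered:

      # Annealed soft combination of two classifiers' outputs: responsibilities
      # follow a softmax whose temperature is lowered on a schedule, hardening
      # toward the product-rule argmax. All probabilities are hypothetical.
      import numpy as np

      p_bayes = np.array([0.50, 0.30, 0.20])   # assumed Bayesian classifier output
      p_fuzzy = np.array([0.40, 0.45, 0.15])   # assumed fuzzy-clustering memberships

      log_joint = np.log(p_bayes) + np.log(p_fuzzy)
      for T in (2.0, 1.0, 0.1, 0.01):          # annealing schedule
          resp = np.exp(log_joint / T)
          resp /= resp.sum()                   # normalized responsibilities at temperature T
          print(f"T={T:<5} responsibilities={np.round(resp, 3)}")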

  4. Advanced Information Technology in Simulation Based Life Cycle Design

    NASA Technical Reports Server (NTRS)

    Renaud, John E.

    2003-01-01

    In this research a Collaborative Optimization (CO) approach for multidisciplinary systems design is used to develop a decision based design framework for non-deterministic optimization. To date CO strategies have been developed for use in application to deterministic systems design problems. In this research the decision based design (DBD) framework proposed by Hazelrigg is modified for use in a collaborative optimization framework. The Hazelrigg framework as originally proposed provides a single level optimization strategy that combines engineering decisions with business decisions in a single level optimization. By transforming this framework for use in collaborative optimization one can decompose the business and engineering decision making processes. In the new multilevel framework of Decision Based Collaborative Optimization (DBCO) the business decisions are made at the system level. These business decisions result in a set of engineering performance targets that disciplinary engineering design teams seek to satisfy as part of subspace optimizations. The Decision Based Collaborative Optimization framework more accurately models the existing relationship between business and engineering in multidisciplinary systems design.

  5. Origins of Chaos in Autonomous Boolean Networks

    NASA Astrophysics Data System (ADS)

    Socolar, Joshua; Cavalcante, Hugo; Gauthier, Daniel; Zhang, Rui

    2010-03-01

    Networks with nodes consisting of ideal Boolean logic gates are known to display either steady states, periodic behavior, or an ultraviolet catastrophe in which the number of logic-transition events circulating in the network per unit time grows as a power law. In an experiment, non-ideal behavior of the logic gates prevents the ultraviolet catastrophe and may lead to deterministic chaos. We identify certain non-ideal features of real logic gates that enable chaos in experimental networks. We find that short-pulse rejection and the asymmetry between the logic states tend to engender periodic behavior, while a memory effect termed "degradation" can generate chaos. Our results strongly suggest that deterministic chaos can be expected in a large class of experimental Boolean-like networks. Such devices may find application in a variety of technologies requiring fast complex waveforms or flat power spectra. The non-ideal effects identified here also have implications for the statistics of attractors in large complex networks.

  6. Experimental realization of real-time feedback-control of single-atom arrays

    NASA Astrophysics Data System (ADS)

    Kim, Hyosub; Lee, Woojun; Ahn, Jaewook

    2016-05-01

    Deterministic loading of neutral atoms at particular locations has remained a challenging problem. Here we show, in a proof-of-principle experimental demonstration, that such deterministic loading can be achieved by the rearrangement of atoms. In the experiment, cold rubidium atoms were trapped by optical tweezers, which are hologram images made by a liquid-crystal spatial light modulator (LC-SLM). After the initial occupancy was identified, the hologram was actively controlled to rearrange the captured atoms onto unfilled sites. For this, we developed a new flicker-free hologram algorithm that enables holographic atom translation. Our demonstration shows that up to N=9 atoms were simultaneously moved in the 2D plane, with 2N=18 movable degrees of freedom and a fidelity of 99% for a single-atom 5-μm translation. It is hoped that our in situ atom rearrangement will prove useful in scaling quantum computers. Samsung Science and Technology Foundation [SSTF-BA1301-12].
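
    The rearrangement step can be cast as an assignment problem: match loaded traps to target sites so that the total travel is minimized. A hedged sketch using the Hungarian algorithm, with hypothetical coordinates (this is not the authors' flicker-free hologram algorithm):

      # Atom rearrangement viewed as an assignment problem: match initially
      # occupied trap positions to target sites, minimizing total move distance
      # via the Hungarian algorithm. Positions are hypothetical.
      import numpy as np
      from scipy.optimize import linear_sum_assignment

      occupied = np.array([[0, 0], [3, 1], [5, 4], [1, 5]])   # loaded traps (um)
      targets  = np.array([[1, 1], [2, 2], [3, 3]])           # desired sites (um)

      cost = np.linalg.norm(occupied[:, None, :] - targets[None, :, :], axis=-1)
      rows, cols = linear_sum_assignment(cost)                # minimal total travel
      for r, c in zip(rows, cols):
          print(f"move atom at {occupied[r]} -> site {targets[c]}")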

  7. An approach to model reactor core nodalization for deterministic safety analysis

    NASA Astrophysics Data System (ADS)

    Salim, Mohd Faiz; Samsudin, Mohd Rafie; Mamat @ Ibrahim, Mohd Rizal; Roslan, Ridha; Sadri, Abd Aziz; Farid, Mohd Fairus Abd

    2016-01-01

    Adopting a good nodalization strategy is essential to produce an accurate, high-quality input model for Deterministic Safety Analysis (DSA) using a System Thermal-Hydraulic (SYS-TH) computer code. The purpose of such analysis is to demonstrate compliance with regulatory requirements and to verify that the reactor behaves during normal and accident conditions as originally designed. Numerous studies in the past have been devoted to the development of nodalization strategies, from small research reactors (e.g., 250 kW) up to larger ones (e.g., 30 MW). This paper therefore discusses the state-of-the-art thermal-hydraulic channels to be employed in the nodalization of the RTP-TRIGA Research Reactor, specifically for the reactor core. At present, the required thermal-hydraulic parameters for the reactor core, such as core geometrical data (length, coolant flow area, hydraulic diameters, and axial power profile) and material properties (including the UZrH1.6 fuel, stainless steel cladding, and graphite reflector), have been collected, analyzed and consolidated in the Reference Database of RTP using a standardized methodology, mainly derived from the available technical documentation. Based on the information in this database, the assumptions made in the nodalization approach and the calculations performed will be discussed and presented. The development and identification of the thermal-hydraulic channels for the reactor core will be implemented in the SYS-TH calculation using the RELAP5-3D® computer code. The activity presented in this paper is part of the development of the overall nodalization description for the RTP-TRIGA Research Reactor under the IAEA Norwegian Extra-Budgetary Programme (NOKEBP) mentoring project on Expertise Development through the Analysis of Reactor Thermal-Hydraulics for Malaysia, denoted EARTH-M.

  8. An approach to model reactor core nodalization for deterministic safety analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salim, Mohd Faiz, E-mail: mohdfaizs@tnb.com.my; Samsudin, Mohd Rafie, E-mail: rafies@tnb.com.my; Mamat Ibrahim, Mohd Rizal, E-mail: m-rizal@nuclearmalaysia.gov.my

    Adopting a good nodalization strategy is essential to produce an accurate, high-quality input model for Deterministic Safety Analysis (DSA) using a System Thermal-Hydraulic (SYS-TH) computer code. The purpose of such analysis is to demonstrate compliance with regulatory requirements and to verify that the reactor behaves during normal and accident conditions as originally designed. Numerous studies in the past have been devoted to the development of nodalization strategies, from small research reactors (e.g., 250 kW) up to larger ones (e.g., 30 MW). This paper therefore discusses the state-of-the-art thermal-hydraulic channels to be employed in the nodalization of the RTP-TRIGA Research Reactor, specifically for the reactor core. At present, the required thermal-hydraulic parameters for the reactor core, such as core geometrical data (length, coolant flow area, hydraulic diameters, and axial power profile) and material properties (including the UZrH1.6 fuel, stainless steel cladding, and graphite reflector), have been collected, analyzed and consolidated in the Reference Database of RTP using a standardized methodology, mainly derived from the available technical documentation. Based on the information in this database, the assumptions made in the nodalization approach and the calculations performed will be discussed and presented. The development and identification of the thermal-hydraulic channels for the reactor core will be implemented in the SYS-TH calculation using the RELAP5-3D® computer code. The activity presented in this paper is part of the development of the overall nodalization description for the RTP-TRIGA Research Reactor under the IAEA Norwegian Extra-Budgetary Programme (NOKEBP) mentoring project on Expertise Development through the Analysis of Reactor Thermal-Hydraulics for Malaysia, denoted EARTH-M.

  9. Technology applications for traffic safety programs : a primer

    DOT National Transportation Integrated Search

    2008-09-01

    This document explores how emerging digital and communications technology can advance safety on the Nation's highways. The range of technology described in this report is available or will be available in the near future to improve traffic safety. ...

  10. Probabilistic vs. deterministic fiber tracking and the influence of different seed regions to delineate cerebellar-thalamic fibers in deep brain stimulation.

    PubMed

    Schlaier, Juergen R; Beer, Anton L; Faltermeier, Rupert; Fellner, Claudia; Steib, Kathrin; Lange, Max; Greenlee, Mark W; Brawanski, Alexander T; Anthofer, Judith M

    2017-06-01

    This study compared tractography approaches for identifying cerebellar-thalamic fiber bundles relevant to planning target sites for deep brain stimulation (DBS). In particular, probabilistic and deterministic tracking of the dentate-rubro-thalamic tract (DRTT) and differences between the spatial courses of the DRTT and the cerebello-thalamo-cortical (CTC) tract were compared. Six patients with movement disorders were examined by magnetic resonance imaging (MRI), including two sets of diffusion-weighted images (12 and 64 directions). Probabilistic and deterministic tractography was applied on each diffusion-weighted dataset to delineate the DRTT. Results were compared with regard to their sensitivity in revealing the DRTT and additional fiber tracts and processing time. Two sets of regions-of-interests (ROIs) guided deterministic tractography of the DRTT or the CTC, respectively. Tract distances to an atlas-based reference target were compared. Probabilistic fiber tracking with 64 orientations detected the DRTT in all twelve hemispheres. Deterministic tracking detected the DRTT in nine (12 directions) and in only two (64 directions) hemispheres. Probabilistic tracking was more sensitive in detecting additional fibers (e.g. ansa lenticularis and medial forebrain bundle) than deterministic tracking. Probabilistic tracking lasted substantially longer than deterministic. Deterministic tracking was more sensitive in detecting the CTC than the DRTT. CTC tracts were located adjacent but consistently more posterior to DRTT tracts. These results suggest that probabilistic tracking is more sensitive and robust in detecting the DRTT but harder to implement than deterministic approaches. Although sensitivity of deterministic tracking is higher for the CTC than the DRTT, targets for DBS based on these tracts likely differ. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  11. 49 CFR 1.94 - The National Highway Traffic Safety Administration.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... information system improvements, motorcyclist safety, and child safety restraints; administering a nationwide... concerning motor vehicle safety, including vehicle to vehicle and vehicle to infrastructure technologies and other new or advanced vehicle technologies; and investigating safety-related defects and non-compliance...

  12. 49 CFR 1.94 - The National Highway Traffic Safety Administration.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... information system improvements, motorcyclist safety, and child safety restraints; administering a nationwide... concerning motor vehicle safety, including vehicle to vehicle and vehicle to infrastructure technologies and other new or advanced vehicle technologies; and investigating safety-related defects and non-compliance...

  13. 49 CFR 1.94 - The National Highway Traffic Safety Administration.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... information system improvements, motorcyclist safety, and child safety restraints; administering a nationwide... concerning motor vehicle safety, including vehicle to vehicle and vehicle to infrastructure technologies and other new or advanced vehicle technologies; and investigating safety-related defects and non-compliance...

  14. Printed droplet microfluidics for on demand dispensing of picoliter droplets and cells

    PubMed Central

    Cole, Russell H.; Tang, Shi-Yang; Siltanen, Christian A.; Shahi, Payam; Zhang, Jesse Q.; Poust, Sean; Gartner, Zev J.; Abate, Adam R.

    2017-01-01

    Although the elementary unit of biology is the cell, high-throughput methods for the microscale manipulation of cells and reagents are limited. The existing options either are slow, lack single-cell specificity, or use fluid volumes out of scale with those of cells. Here we present printed droplet microfluidics, a technology to dispense picoliter droplets and cells with deterministic control. The core technology is a fluorescence-activated droplet sorter coupled to a specialized substrate that together act as a picoliter droplet and single-cell printer, enabling high-throughput generation of intricate arrays of droplets, cells, and microparticles. Printed droplet microfluidics provides a programmable and robust technology to construct arrays of defined cell and reagent combinations and to integrate multiple measurement modalities together in a single assay. PMID:28760972

  15. Printed droplet microfluidics for on demand dispensing of picoliter droplets and cells.

    PubMed

    Cole, Russell H; Tang, Shi-Yang; Siltanen, Christian A; Shahi, Payam; Zhang, Jesse Q; Poust, Sean; Gartner, Zev J; Abate, Adam R

    2017-08-15

    Although the elementary unit of biology is the cell, high-throughput methods for the microscale manipulation of cells and reagents are limited. The existing options either are slow, lack single-cell specificity, or use fluid volumes out of scale with those of cells. Here we present printed droplet microfluidics, a technology to dispense picoliter droplets and cells with deterministic control. The core technology is a fluorescence-activated droplet sorter coupled to a specialized substrate that together act as a picoliter droplet and single-cell printer, enabling high-throughput generation of intricate arrays of droplets, cells, and microparticles. Printed droplet microfluidics provides a programmable and robust technology to construct arrays of defined cell and reagent combinations and to integrate multiple measurement modalities together in a single assay.

  16. Printed droplet microfluidics for on demand dispensing of picoliter droplets and cells

    NASA Astrophysics Data System (ADS)

    Cole, Russell H.; Tang, Shi-Yang; Siltanen, Christian A.; Shahi, Payam; Zhang, Jesse Q.; Poust, Sean; Gartner, Zev J.; Abate, Adam R.

    2017-08-01

    Although the elementary unit of biology is the cell, high-throughput methods for the microscale manipulation of cells and reagents are limited. The existing options either are slow, lack single-cell specificity, or use fluid volumes out of scale with those of cells. Here we present printed droplet microfluidics, a technology to dispense picoliter droplets and cells with deterministic control. The core technology is a fluorescence-activated droplet sorter coupled to a specialized substrate that together act as a picoliter droplet and single-cell printer, enabling high-throughput generation of intricate arrays of droplets, cells, and microparticles. Printed droplet microfluidics provides a programmable and robust technology to construct arrays of defined cell and reagent combinations and to integrate multiple measurement modalities together in a single assay.

  17. Controlled deterministic implantation by nanostencil lithography at the limit of ion-aperture straggling

    NASA Astrophysics Data System (ADS)

    Alves, A. D. C.; Newnham, J.; van Donkelaar, J. A.; Rubanov, S.; McCallum, J. C.; Jamieson, D. N.

    2013-04-01

    Solid state electronic devices fabricated in silicon employ many ion implantation steps in their fabrication. In nanoscale devices deterministic implants of dopant atoms with high spatial precision will be needed to overcome problems with statistical variations in device characteristics and to open new functionalities based on controlled quantum states of single atoms. However, to deterministically place a dopant atom with the required precision is a significant technological challenge. Here we address this challenge with a strategy based on stepped nanostencil lithography for the construction of arrays of single implanted atoms. We address the limit on spatial precision imposed by ion straggling in the nanostencil—fabricated with the readily available focused ion beam milling technique followed by Pt deposition. Two nanostencils have been fabricated; a 60 nm wide aperture in a 3 μm thick Si cantilever and a 30 nm wide aperture in a 200 nm thick Si3N4 membrane. The 30 nm wide aperture demonstrates the fabricating process for sub-50 nm apertures while the 60 nm aperture was characterized with 500 keV He+ ion forward scattering to measure the effect of ion straggling in the collimator and deduce a model for its internal structure using the GEANT4 ion transport code. This model is then applied to simulate collimation of a 14 keV P+ ion beam in a 200 nm thick Si3N4 membrane nanostencil suitable for the implantation of donors in silicon. We simulate collimating apertures with widths in the range of 10-50 nm because we expect the onset of J-coupling in a device with 30 nm donor spacing. We find that straggling in the nanostencil produces mis-located implanted ions with a probability between 0.001 and 0.08 depending on the internal collimator profile and the alignment with the beam direction. This result is favourable for the rapid prototyping of a proof-of-principle device containing multiple deterministically implanted dopants.
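
    As a rough stand-in for the full GEANT4 transport model, the reported mis-location probability can be illustrated with a toy Gaussian lateral-straggle model that counts ions landing outside the intended window; the straggle width and window size below are assumed, not taken from the paper:

      # Toy mis-location estimate: lateral straggle of collimated ions modeled
      # as a Gaussian, counting the fraction landing outside the intended
      # window. All parameters are illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(2)
      APERTURE_HALF_WIDTH = 15.0   # nm, half of an assumed 30 nm aperture
      STRAGGLE_SIGMA = 6.0         # nm, assumed lateral straggle

      hits = rng.normal(0.0, STRAGGLE_SIGMA, size=1_000_000)
      p_mislocated = np.mean(np.abs(hits) > APERTURE_HALF_WIDTH)
      print(f"mis-location probability ~ {p_mislocated:.4f}")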

  18. The NYU System for MUC-6 or Where’s the Syntax?

    DTIC Science & Technology

    1995-01-01

    ...and only in the face of compelling syntactic or semantic evidence, in a (nearly) deterministic manner. Speed was particularly an issue for MUC-6... thank BBN Systems and Technologies for providing us with this tagger. Name Recognition: The input stage is followed by several stages of pattern... Group Recognition: The third stage of pattern matching recognizes verb groups: simple tensed verbs ("sleeps") and verbs with auxiliaries ("will sleep...

  19. Safety huddles to proactively identify and address electronic health record safety

    PubMed Central

    Menon, Shailaja; Singh, Hardeep; Giardina, Traber D; Rayburn, William L; Davis, Brenda P; Russo, Elise M

    2017-01-01

    Objective: Methods to identify and study safety risks of electronic health records (EHRs) are underdeveloped and largely depend on limited end-user reports. “Safety huddles” have been found useful in creating a sense of collective situational awareness that increases an organization’s capacity to respond to safety concerns. We explored the use of safety huddles for identifying and learning about EHR-related safety concerns. Design: Data were obtained from daily safety huddle briefing notes recorded at a single midsized tertiary-care hospital in the United States over 1 year. Huddles were attended by key administrative, clinical, and information technology staff. We conducted a content analysis of huddle notes to identify what EHR-related safety concerns were discussed. We expanded a previously developed EHR-related error taxonomy to categorize types of EHR-related safety concerns recorded in the notes. Results: On review of daily huddle notes spanning 249 days, we identified 245 EHR-related safety concerns. For our analysis, we defined EHR technology to include a specific EHR functionality, an entire clinical software application, or the hardware system. Most concerns (41.6%) involved “EHR technology working incorrectly,” followed by 25.7% involving “EHR technology not working at all.” Concerns related to “EHR technology missing or absent” accounted for 16.7%, whereas 15.9% were linked to “user errors.” Conclusions: Safety huddles promoted discussion of several technology-related issues at the organization level and can serve as a promising technique to identify and address EHR-related safety concerns. Based on our findings, we recommend that health care organizations consider huddles as a strategy to promote understanding and improvement of EHR safety. PMID:28031286

  20. Deterministic quantum dense coding networks

    NASA Astrophysics Data System (ADS)

    Roy, Saptarshi; Chanda, Titas; Das, Tamoghna; Sen(De), Aditi; Sen, Ujjwal

    2018-07-01

    We consider the scenario of deterministic classical information transmission between multiple senders and a single receiver, when they a priori share a multipartite quantum state - an attempt towards building a deterministic dense coding network. Specifically, we prove that in the case of two or three senders and a single receiver, generalized Greenberger-Horne-Zeilinger (gGHZ) states are not beneficial for sending classical information deterministically beyond the classical limit, except when the shared state is the GHZ state itself. On the other hand, three- and four-qubit generalized W (gW) states with specific parameters as well as the four-qubit Dicke states can provide a quantum advantage of sending the information in deterministic dense coding. Interestingly however, numerical simulations in the three-qubit scenario reveal that the percentage of states from the GHZ-class that are deterministic dense codeable is higher than that of states from the W-class.

  1. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    NASA Astrophysics Data System (ADS)

    Akram, Muhammad Farooq Bin

    The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject-matter-expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory; often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio: when multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear, whereas current methods assume that technologies are either incompatible or linearly independent. It is observed that, in the case of lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, where experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques; in the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited to the quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems encountered when using deterministic or traditional probabilistic approaches for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSMs). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher-order technology interactions. A test case for the quantification of epistemic uncertainty on a large-scale problem, a combined-cycle power generation system, was selected, and a detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that the evidence-theory-based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge than deterministic or probability-theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is given. Various combination methods are also proposed for higher-order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled capturing higher-order technology interactions and improved the predicted system performance.
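
    A minimal sketch of Dempster's rule of combination, the evidence-theory operation underlying the proposed approach, for two hypothetical expert mass functions over a small frame of technology-impact levels:

      # Dempster's rule of combination for two mass functions over subsets of
      # the frame {"low", "high"} of impact levels (masses are hypothetical).
      def combine(m1, m2):
          raw, conflict = {}, 0.0
          for a, w1 in m1.items():
              for b, w2 in m2.items():
                  inter = a & b
                  if inter:
                      raw[inter] = raw.get(inter, 0.0) + w1 * w2
                  else:
                      conflict += w1 * w2          # mass falling on the empty set
          return {k: v / (1.0 - conflict) for k, v in raw.items()}

      m_expert1 = {frozenset({"low"}): 0.6, frozenset({"low", "high"}): 0.4}
      m_expert2 = {frozenset({"high"}): 0.3, frozenset({"low", "high"}): 0.7}
      print(combine(m_expert1, m_expert2))   # combined masses, conflict renormalized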

  2. Severe Accident Sequence Analysis Program: Anticipated transient without scram simulations for Browns Ferry Nuclear Plant Unit 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dallman, R J; Gottula, R C; Holcomb, E E

    1987-05-01

    An analysis of five anticipated transients without scram (ATWS) was conducted at the Idaho National Engineering Laboratory (INEL). The five detailed deterministic simulations of postulated ATWS sequences were initiated from a main steamline isolation valve (MSIV) closure. The subject of the analysis was Browns Ferry Nuclear Plant Unit 1, a boiling water reactor (BWR) of the BWR/4 product line with a Mark I containment. The simulations yielded insights into the possible consequences of an MSIV-closure ATWS. An evaluation of the effects of plant safety systems and operator actions on accident progression and mitigation is presented.

  3. Health, safety and environmental unit performance assessment model under uncertainty (case study: steel industry).

    PubMed

    Shamaii, Azin; Omidvari, Manouchehr; Lotfi, Farhad Hosseinzadeh

    2017-01-01

    Performance assessment is a critical objective of management systems. As a result of the non-deterministic and qualitative nature of performance indicators, assessments are likely to be influenced by evaluators' personal judgments. Furthermore, in developing countries, performance assessments by the Health, Safety and Environment (HSE) department are often based solely on the number of accidents. A questionnaire was used to conduct the study in one of the largest steel production companies in Iran. With respect to health, safety, and environment, the results revealed that control of disease, fire hazards, and air pollution are of paramount importance, with coefficients of 0.057, 0.062, and 0.054, respectively. Furthermore, health and environment indicators were found to be the most common causes of poor performance. Finally, it was shown that HSE management systems can affect the majority of safety performance indicators in the short run, whereas health and environment indicators require longer periods of time. The objective of this study is to present an HSE-MS unit performance assessment model for the steel industry. Moreover, we seek to answer the following question: what are the factors that affect the HSE unit system in the steel industry? For each factor, the extent of its impact on the performance of the organization's HSE management system is also determined.

  4. Promoting the safe and strategic use of technology for victims of intimate partner violence: evaluation of the technology safety project.

    PubMed

    Finn, Jerry; Atkinson, Teresa

    2009-11-01

    The Technology Safety Project of the Washington State Coalition Against Domestic Violence was designed to increase awareness and knowledge of technology safety issues for domestic violence victims, survivors, and advocacy staff. The project used a "train-the-trainer" model and provided computer and Internet resources to domestic violence service providers to (a) increase safe computer and Internet access for domestic violence survivors in Washington, (b) reduce the risk posed by abusers by educating survivors about technology safety and privacy, and (c) increase the ability of survivors to help themselves and their children through information technology. Evaluation of the project suggests that the program is needed, useful, and effective. Consumer satisfaction was high, and there was perceived improvement in computer confidence and knowledge of computer safety. Areas for future program development and further research are discussed.

  5. Maker Cultures and the Prospects for Technological Action.

    PubMed

    Nascimento, Susana; Pólvora, Alexandre

    2018-06-01

    Supported by easier and cheaper access to tools and by expanding communities, maker cultures point towards the idea of (almost) everyone designing, creating, producing and distributing renewed, new and improved products, machines, things or artefacts. A careful analysis of the assumptions and challenges of maker cultures emphasizes the relevance of what may be called technological action, that is, active and critical intervention regarding the purposes and applications of technologies within ordinary lives, thus countering the deterministic trends of current directions of technology. To explore this transformative potential, we examine what technological action is and could be through snapshots of maker cultures, based on empirical research conducted in three particular contexts: the Fab Lab Network, Maker Media core outputs and initiatives such as Maker Faires, and the Open Source Hardware Association (OSHWA). Elements such as control and empowerment through material engagement; openness and sharing; and the social, cultural, political and ethical values of the common good in topics such as diversity, sustainability and transparency are critically analysed.

  6. The relationship between stochastic and deterministic quasi-steady state approximations.

    PubMed

    Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R

    2015-11-23

    The quasi steady-state approximation (QSSA) is frequently used to reduce deterministic models of biochemical networks. The resulting equations provide a simplified description of the network in terms of non-elementary reaction functions (e.g. Hill functions). Such deterministic reductions are frequently a basis for heuristic stochastic models in which non-elementary reaction functions are used to define reaction propensities. Despite their popularity, it remains unclear when such stochastic reductions are valid. It is frequently assumed that the stochastic reduction can be trusted whenever its deterministic counterpart is accurate. However, a number of recent examples show that this is not necessarily the case. Here we explain the origin of these discrepancies, and demonstrate a clear relationship between the accuracy of the deterministic and the stochastic QSSA for examples widely used in biological systems. With an analysis of a two-state promoter model, and numerical simulations for a variety of other models, we find that the stochastic QSSA is accurate whenever its deterministic counterpart provides an accurate approximation over a range of initial conditions which cover the likely fluctuations from the quasi steady-state (QSS). We conjecture that this relationship provides a simple and computationally inexpensive way to test the accuracy of reduced stochastic models using deterministic simulations. The stochastic QSSA is one of the most popular multi-scale stochastic simulation methods. While the use of QSSA, and the resulting non-elementary functions has been justified in the deterministic case, it is not clear when their stochastic counterparts are accurate. In this study, we show how the accuracy of the stochastic QSSA can be tested using their deterministic counterparts providing a concrete method to test when non-elementary rate functions can be used in stochastic simulations.
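
    A hedged sketch of the kind of reduced stochastic model at issue: Gillespie's direct method with a non-elementary Hill function defining the production propensity (parameters are illustrative, not from the paper):

      # Reduced stochastic model in which a non-elementary Hill function defines
      # the production propensity (Gillespie's direct method). Rate constants
      # are illustrative assumptions.
      import math, random

      V_MAX, K_M, HILL_N, GAMMA = 50.0, 20.0, 2.0, 1.0   # assumed constants
      x, t, T_END = 0, 0.0, 50.0
      while t < T_END:
          a_prod = V_MAX * K_M**HILL_N / (K_M**HILL_N + x**HILL_N)  # Hill propensity
          a_deg = GAMMA * x
          a_tot = a_prod + a_deg
          t += -math.log(1.0 - random.random()) / a_tot   # exponential waiting time
          if random.random() * a_tot < a_prod:
              x += 1                                      # production event
          else:
              x -= 1                                      # degradation event
      print(f"molecule count at t >= {T_END}: {x}")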

  7. Dopant-controlled single-electron pumping through a metallic island

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wenz, Tobias, E-mail: tobias.wenz@ptb.de; Hohls, Frank, E-mail: frank.hohls@ptb.de; Jehl, Xavier

    We investigate a hybrid metallic island/single dopant electron pump based on fully depleted silicon-on-insulator technology. Electron transfer between the central metallic island and the leads is controlled by resonant tunneling through single phosphorus dopants in the barriers. Top gates above the barriers are used to control the resonance conditions. Applying radio frequency signals to the gates, non-adiabatic quantized electron pumping is achieved. A simple deterministic model is presented and confirmed by comparing measurements with simulations.

  8. Heralded ions via ionization coincidence

    NASA Astrophysics Data System (ADS)

    McCulloch, A. J.; Speirs, R. W.; Wissenberg, S. H.; Tielen, R. P. M.; Sparkes, B. M.; Scholten, R. E.

    2018-04-01

    We demonstrate a method for the deterministic production of single ions by exploiting the correlation between an electron and associated ion following ionization. Coincident detection and feedback in combination with Coulomb-driven particle selection allows for high-fidelity heralding of ions at a high repetition rate. Extension of the scheme beyond time-correlated feedback to position- and momentum-correlated feedback will provide a general and powerful means to optimize the ion beam brightness for the development of next-generation focused ion beam technologies.

  9. Fire safety: A case study of technology transfer

    NASA Technical Reports Server (NTRS)

    Heins, C. F.

    1975-01-01

    Two basic ways in which NASA-generated technology is being used by the fire safety community are described. First, improved products and systems that embody NASA technical advances are entering the marketplace. Second, NASA test data and technical information related to fire safety are being used by persons concerned with reducing the hazards of fire through improved design information and standards. The development of commercial fire safety products and systems typically requires adaptation and integration of aerospace technologies that may not have been originated for NASA fire safety applications.

  10. Developments in Test Facility and Data Networking for the Altitude Test Stand at the John C. Stennis Space Center, MS - A General Overview

    NASA Technical Reports Server (NTRS)

    Hebert, Phillip W., Sr.

    2008-01-01

    In May 2007, NASA's Constellation Program selected John C. Stennis Space Center (SSC) near Waveland, Mississippi, as the site at which to construct an altitude test facility for the developmental and qualification testing of the Ares I upper stage (US) engine. Test requirements arising from the Ares I US propulsion system design necessitate exceptional Data Acquisition System (DAS) design solutions that support facility and propellant systems conditioning, test operations control, and test data analysis. This paper reviews the new A3 Altitude Test Facility's DAS design requirements for real-time deterministic digital data, DAS technology enhancements, system trades, technology validation activities, and the current status of this system's new architecture. Current network technologies to improve data transfer are also discussed.

  11. Advanced structures technology and aircraft safety

    NASA Technical Reports Server (NTRS)

    Mccomb, H. G., Jr.

    1983-01-01

    NASA research and development on advanced aeronautical structures technology related to flight safety is reviewed. The effort is categorized as research in the technology base and projects sponsored by the Aircraft Energy Efficiency (ACEE) Project Office. Base technology research includes mechanics of composite structures, crash dynamics, and landing dynamics. The ACEE projects involve development and fabrication of selected composite structural components for existing commercial transport aircraft. Technology emanating from this research is intended to result in airframe structures with improved efficiency and safety.

  12. Analyzing simulation-based PRA data through traditional and topological clustering: A BWR station blackout case study

    DOE PAGES

    Maljovec, D.; Liu, S.; Wang, B.; ...

    2015-07-14

    Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for an enhanced structural understanding of the data.
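
    As a hedged sketch of the traditional side of such clustering (not the RISMC tool itself), k-means applied to two synthetic per-scenario features standing in for DPRA outputs:

      # Traditional clustering of simulated accident scenarios: k-means on two
      # synthetic features per scenario. Feature names and data are illustrative
      # stand-ins for DPRA outputs, not RISMC results.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      # columns: time to battery depletion (h), peak clad temperature (K)
      scenarios = np.vstack([
          rng.normal([4.0, 1200.0], [0.5, 60.0], size=(100, 2)),   # benign group
          rng.normal([2.0, 1500.0], [0.4, 80.0], size=(100, 2)),   # challenging group
      ])
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scenarios)
      print(np.bincount(labels))   # scenarios per cluster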

  13. Aviation safety and operation problems research and technology

    NASA Technical Reports Server (NTRS)

    Enders, J. H.; Strickle, J. W.

    1977-01-01

    Aircraft operating problems bearing on aviation safety are described. It is shown that as aircraft technology improves, knowledge and understanding of operating problems must also improve to sustain economics, reliability and safety.

  14. Deterministic Walks with Choice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beeler, Katy E.; Berenhaut, Kenneth S.; Cooper, Joshua N.

    2014-01-10

    This paper studies deterministic movement over toroidal grids, integrating local information, bounded memory and choice at individual nodes. The research is motivated by recent work on deterministic random walks, and applications in multi-agent systems. Several results regarding passing tokens through toroidal grids are discussed, as well as some open questions.

  15. 78 FR 11902 - Review of Gun Safety Technologies

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-20

    ... Technologies AGENCY: National Institute of Justice, JPO, DOJ. ACTION: Notice. SUMMARY: Following the President... emerging gun safety technologies and plans to issue a report on the availability and use of those technologies. NIJ seeks input from all interested stakeholders to help inform its technology assessment and...

  16. Using microsimulation to evaluate the effects of advanced vehicle technologies on congestion.

    DOT National Transportation Integrated Search

    2011-06-30

    Advanced driver assistance technologies are continuously being developed to enhance traffic safety. Evaluations of such technologies typically focus on safety and there has been limited research on the impacts of such technologies on traffic oper...

  17. Using Microsimulation to Evaluate the Effects of Advanced Vehicle Technologies on Congestion

    DOT National Transportation Integrated Search

    2011-06-30

    Advanced driver assistance technologies are continuously being developed to enhance traffic safety. Evaluations of such technologies typically focus on safety and there has been limited research on the impacts of such technologies on traffic operatio...

  18. Potential applications of video technology for traffic management and safety in Alabama

    DOT National Transportation Integrated Search

    2002-11-25

    Video technology applications for traffic management and safety are being implemented by state and local government agencies in Alabama. This technology offers both tangible and intangible benefits. Although video technology provides many benefits, i...

  19. How Safe is Vehicle Safety? The Contribution of Vehicle Technologies to the Reduction in Road Casualties in France from 2000 to 2010

    PubMed Central

    Page, Yves; Hermitte, Thierry; Cuny, Sophie

    2011-01-01

    In France, over the last 10 years, road fatalities have decreased dramatically, by 48%. This reduction is close to the target fixed by the European Commission in 2001 for the whole of Europe (−50%). According to the French government, 75% of this reduction was due to the implementation of automatic speed cameras on the roadsides from 2003 onwards. Yet during this period there was also a significant increase in safety technology, new regulations in front and side impacts, and developments in Euro NCAP to improve passive safety in vehicles. This paper sets out to estimate the extent to which vehicle safety technologies contributed to the road safety benefits over this decade. Using a combination of databases and fitment rates, the number of fatalities and hospitalized injuries saved in passenger car crashes was estimated for a number of safety technologies, individually and as a package including a 5-star Euro NCAP rating. The additional benefits from other public safety measures were estimated in the same way. The results showed that safety measures during this decade saved 240,676 fatalities plus serious injuries overall, of which 173,663 were car occupants. Of these, savings of 27,365 car occupants and 1,083 pedestrians could be attributed directly to vehicle safety improvements (11% overall). It was concluded that while public safety measures were responsible for the majority of the savings, enhanced vehicle safety technologies also made a significant improvement to the road toll in France during the last decade. As the take-up rate for these technologies improves, they are expected to provide even more benefits in the next 10-year period. PMID:22105388

  20. Nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates

    DOEpatents

    Melechko, Anatoli V [Oak Ridge, TN; McKnight, Timothy E. , Guillorn, Michael A.; Ilic, Bojan [Ithaca, NY; Merkulov, Vladimir I [Knoxville, TN; Doktycz, Mitchel J [Knoxville, TN; Lowndes, Douglas H [Knoxville, TN; Simpson, Michael L [Knoxville, TN

    2011-05-17

    Methods, manufactures, machines and compositions are described for nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates. A method includes depositing a catalyst particle on a surface of a substrate to define a deterministically located position; growing an aligned elongated nanostructure on the substrate, an end of the aligned elongated nanostructure coupled to the substrate at the deterministically located position; coating the aligned elongated nanostructure with a conduit material; removing a portion of the conduit material to expose the catalyst particle; removing the catalyst particle; and removing the elongated nanostructure to define a nanoconduit.

  1. Human brain detects short-time nonlinear predictability in the temporal fine structure of deterministic chaotic sounds

    NASA Astrophysics Data System (ADS)

    Itoh, Kosuke; Nakada, Tsutomu

    2013-04-01

    Deterministic nonlinear dynamical processes are ubiquitous in nature. Chaotic sounds generated by such processes may appear irregular and random in waveform, but these sounds are mathematically distinguished from random stochastic sounds in that they contain deterministic short-time predictability in their temporal fine structures. We show that the human brain distinguishes deterministic chaotic sounds from spectrally matched stochastic sounds in neural processing and perception. Deterministic chaotic sounds, even without being attended to, elicited greater cerebral cortical responses than the surrogate control sounds after about 150 ms in latency after sound onset. Listeners also clearly discriminated these sounds in perception. The results support the hypothesis that the human auditory system is sensitive to the subtle short-time predictability embedded in the temporal fine structure of sounds.
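
    The notion of deterministic short-time predictability can be illustrated with a hedged sketch: one-step nearest-neighbor prediction on a chaotic logistic-map series is far more accurate than on a shuffled surrogate with the identical sample distribution:

      # Short-time nonlinear predictability: one-step nearest-neighbor prediction
      # error for a chaotic logistic-map series versus a shuffled surrogate with
      # the same sample distribution (toy illustration, not the paper's stimuli).
      import random

      def logistic_series(n, x=0.3):
          out = []
          for _ in range(n):
              x = 4.0 * x * (1.0 - x)   # fully chaotic logistic map
              out.append(x)
          return out

      def one_step_error(series):
          err, m = 0.0, len(series) - 1
          for i in range(m):
              # predict series[i+1] from the successor of the nearest past value
              j = min(range(m), key=lambda k: abs(series[k] - series[i]) if k != i else 1e9)
              err += abs(series[j + 1] - series[i + 1])
          return err / m

      chaos = logistic_series(500)
      surrogate = random.sample(chaos, len(chaos))   # destroys temporal structure
      print("chaotic error:  ", one_step_error(chaos))      # small: deterministic
      print("surrogate error:", one_step_error(surrogate))  # large: no predictability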

  2. A deterministic particle method for one-dimensional reaction-diffusion equations

    NASA Technical Reports Server (NTRS)

    Mascagni, Michael

    1995-01-01

    We derive a deterministic particle method for the solution of nonlinear reaction-diffusion equations in one spatial dimension. This deterministic method is an analog of a Monte Carlo method for the solution of these problems that has been previously investigated by the author. The deterministic method leads to a system of ordinary differential equations for the positions of suitably defined particles. We then consider explicit and implicit time discretizations of this system of ordinary differential equations, and we study Picard and Newton iterations for the solution of the implicit system. Next we solve this system numerically and study the discretization error both analytically and numerically. Numerical computation shows that this deterministic method is automatically adaptive to large gradients in the solution.
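
    A hedged generic sketch of the implicit time discretization with Picard iteration mentioned above (a scalar logistic reaction term stands in for the paper's particle system):

      # Generic sketch, not the paper's particle system: one implicit Euler step
      # y_{n+1} = y_n + dt * f(y_{n+1}) solved by Picard (fixed-point) iteration,
      # here for the scalar reaction term f(y) = y * (1 - y).
      def implicit_euler_picard(y_n, dt, f, sweeps=50, tol=1e-12):
          y = y_n                         # initial Picard guess
          for _ in range(sweeps):
              y_new = y_n + dt * f(y)     # fixed-point update
              if abs(y_new - y) < tol:
                  break
              y = y_new
          return y_new

      f = lambda y: y * (1.0 - y)         # logistic reaction term
      y, dt = 0.1, 0.1
      for step in range(5):
          y = implicit_euler_picard(y, dt, f)
      print(f"y after 5 implicit steps: {y:.6f}")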

  3. Diesel Technology: Safety Skills. [Teacher and Student Editions.

    ERIC Educational Resources Information Center

    Kellum, Mary

    Competency-based teacher and student materials are provided for three units on safety skills as part of a diesel technology curriculum. The units cover the following topics: general safety; workplace safety; and first aid. The materials are based on the curriculum-alignment concept of first stating the objectives, then developing instructional…

  4. Factors shaping effective utilization of health information technology in urban safety-net clinics.

    PubMed

    George, Sheba; Garth, Belinda; Fish, Allison; Baker, Richard

    2013-09-01

    Urban safety-net clinics are considered prime targets for the adoption of health information technology innovations; however, little is known about its utilization in such safety-net settings. Current scholarship provides limited guidance on the implementation of health information technology in safety-net settings, as it typically assumes that adopting institutions have sufficient basic resources. This study addresses this gap by exploring the unique challenges urban resource-poor safety-net clinics must consider when adopting and utilizing health information technology. In-depth interviews (N = 15) were conducted with key stakeholders (clinic chief executive officers, medical directors, nursing directors, chief financial officers, and information technology directors) at four clinics to explore (a) clinic needs not related to health information technology, (b) how health information technology may provide solutions, and (c) perceptions of and experiences with health information technology. Participants identified several challenges, some of which appear amenable to health information technology solutions. Also identified were requirements for effective utilization of health information technology, including physical infrastructural improvements, funding for equipment/training, the creation of user groups to share health information technology knowledge/experiences, and specially tailored electronic billing guidelines. We found that despite the potential benefits of health information technologies, the unplanned and uninformed introduction of these tools into such settings might actually create more problems than it solves. From these data, we identified a set of factors that should be considered when integrating health information technology into the existing workflows of low-resourced urban safety-net clinics in order to maximize utilization and enhance the quality of health care in such settings.

  5. Improving Student Concern for Safety in a Production Technology Lab through the Use of Teambuilding.

    ERIC Educational Resources Information Center

    Lacina, Dale Robert

    The effectiveness of team building as a strategy for improving students' concern for safety in a production technology laboratory was examined in a study involving a group of grade 9 and 10 production technology students from an urban, lower-middle-class community in western Illinois. Students' safety test scores, teacher checklists, and…

  6. 49 CFR 533.6 - Measurement and calculation procedures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... the technology is related to crash-avoidance technologies, safety critical systems or systems affecting safety-critical functions, or technologies designed for the purpose of reducing the frequency of... improvements related to air conditioning efficiency, off-cycle technologies, and hybridization and other...

  7. Nurse-Technology Interactions and Patient Safety.

    PubMed

    Ruppel, Halley; Funk, Marjorie

    2018-06-01

    Nurses are the end-users of most technology in intensive care units, and the ways in which they interact with technology affect quality of care and patient safety. Nurses' interactions include the processes of ensuring proper input of data into the technology as well as extracting and interpreting the output (clinical data, technical data, alarms). Current challenges in nurse-technology interactions for physiologic monitoring include issues regarding alarm management, workflow interruptions, and monitor surveillance. Patient safety concepts, like high reliability organizations and human factors, can advance efforts to enhance nurse-technology interactions. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Computational analysis of the roles of biochemical reactions in anomalous diffusion dynamics

    NASA Astrophysics Data System (ADS)

    Naruemon, Rueangkham; Charin, Modchang

    2016-04-01

    Most biochemical processes in cells are modeled by reaction-diffusion (RD) equations, in which the diffusive process is assumed to be Gaussian. However, a growing number of studies have noted that intracellular diffusion is anomalous at some or all times, which may result from a crowded environment and from chemical kinetics. This work aims to study computationally the effects of chemical reactions on the diffusive dynamics of RD systems by using both stochastic and deterministic algorithms. A numerical method to estimate the mean-square displacement (MSD) from a deterministic algorithm is also investigated. Our computational results show that anomalous diffusion can be due solely to chemical reactions: the chemical reactions alone can cause anomalous sub-diffusion in the RD system at some or all times. The time-dependent anomalous diffusion exponent is found to depend on many parameters, including chemical reaction rates, reaction orders, and chemical concentrations. Project supported by the Thailand Research Fund and Mahidol University (Grant No. TRG5880157), the Thailand Center of Excellence in Physics (ThEP), CHE, Thailand, and the Development Promotion of Science and Technology.
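
    A minimal sketch of the MSD-based diagnostic used for anomalous diffusion: estimate the exponent α in MSD(t) ∝ t^α by a log-log fit; on the synthetic Brownian trajectory below, α should come out near 1:

      # Estimating the anomalous-diffusion exponent alpha from MSD(t) ~ t^alpha
      # by a log-log least-squares fit. The trajectory here is synthetic Brownian
      # motion, so alpha should be close to 1 (sub-diffusion gives alpha < 1).
      import numpy as np

      rng = np.random.default_rng(1)
      x = np.cumsum(rng.normal(0.0, 1.0, size=10_000))      # 1D random walk

      lags = np.arange(1, 100)
      msd = np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])
      alpha, _ = np.polyfit(np.log(lags), np.log(msd), 1)   # slope = alpha
      print(f"estimated alpha = {alpha:.2f}")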

  9. Deterministic reshaping of single-photon spectra using cross-phase modulation.

    PubMed

    Matsuda, Nobuyuki

    2016-03-01

    The frequency conversion of light has proved to be a crucial technology for communication, spectroscopy, imaging, and signal processing. In the quantum regime, it also offers great potential for realizing quantum networks incorporating disparate physical systems and quantum-enhanced information processing over a large computational space. The frequency conversion of quantum light, such as single photons, has been extensively investigated for the last two decades using all-optical frequency mixing, with the ultimate goal of realizing lossless and noiseless conversion. I demonstrate another route to this target using frequency conversion induced by cross-phase modulation in a dispersion-managed photonic crystal fiber. Owing to the deterministic and all-optical nature of the process, the lossless and low-noise spectral reshaping of a single-photon wave packet in the telecommunication band has been readily achieved with a modulation bandwidth as large as 0.4 THz. I further demonstrate that the scheme is applicable to manipulations of a nonclassical frequency correlation, wave packet interference, and entanglement between two photons. This approach presents a new coherent frequency interface for photons for quantum information processing.

  10. Deterministic reshaping of single-photon spectra using cross-phase modulation

    PubMed Central

    Matsuda, Nobuyuki

    2016-01-01

    The frequency conversion of light has proved to be a crucial technology for communication, spectroscopy, imaging, and signal processing. In the quantum regime, it also offers great potential for realizing quantum networks incorporating disparate physical systems and quantum-enhanced information processing over a large computational space. The frequency conversion of quantum light, such as single photons, has been extensively investigated for the last two decades using all-optical frequency mixing, with the ultimate goal of realizing lossless and noiseless conversion. I demonstrate another route to this target using frequency conversion induced by cross-phase modulation in a dispersion-managed photonic crystal fiber. Owing to the deterministic and all-optical nature of the process, the lossless and low-noise spectral reshaping of a single-photon wave packet in the telecommunication band has been readily achieved with a modulation bandwidth as large as 0.4 THz. I further demonstrate that the scheme is applicable to manipulations of a nonclassical frequency correlation, wave packet interference, and entanglement between two photons. This approach presents a new coherent frequency interface for photons for quantum information processing. PMID:27051862

  11. Deterministic Line-Shape Programming of Silicon Nanowires for Extremely Stretchable Springs and Electronics.

    PubMed

    Xue, Zhaoguo; Sun, Mei; Dong, Taige; Tang, Zhiqiang; Zhao, Yaolong; Wang, Junzhuan; Wei, Xianlong; Yu, Linwei; Chen, Qing; Xu, Jun; Shi, Yi; Chen, Kunji; Roca I Cabarrocas, Pere

    2017-12-13

    Line-shape engineering is a key strategy to endow extra stretchability to 1D silicon nanowires (SiNWs) grown with self-assembly processes. We here demonstrate deterministic line-shape programming of in-plane SiNWs into extremely stretchable springs or arbitrary 2D patterns with the aid of indium droplets that absorb an amorphous Si precursor thin film to produce ultralong c-SiNWs along programmed step edges. A reliable and faithful single-run growth of c-SiNWs over turning tracks with different local curvatures has been established, while high resolution transmission electron microscopy analysis reveals high-quality monolike crystallinity in the line-shape-engineered SiNW springs. In situ scanning electron microscopy stretching and current-voltage characterizations also demonstrate superelastic and robust electric transport carried by the SiNW springs, even under large stretching of more than 200%. We suggest that this highly reliable line-shape programming approach holds strong promise to extend the mature c-Si technology into the development of a new generation of high-performance, biofriendly, and stretchable electronics.

  12. Faithful deterministic secure quantum communication and authentication protocol based on hyperentanglement against collective noise

    NASA Astrophysics Data System (ADS)

    Chang, Yan; Zhang, Shi-Bin; Yan, Li-Li; Han, Gui-Hua

    2015-08-01

    Higher channel capacity and security are difficult to reach in a noisy channel. The loss of photons and the distortion of the qubit state are caused by noise. To solve these problems, in our study, a hyperentangled Bell state is used to design a faithful deterministic secure quantum communication and authentication protocol over collective-rotation and collective-dephasing noisy channels, which doubles the channel capacity compared with using an ordinary Bell state as a carrier; a logical hyperentangled Bell state immune to collective-rotation and collective-dephasing noise is constructed. The secret message is divided into several parts for transmission; however, the identity strings of Alice and Bob are reused. Unitary operations are not used. Project supported by the National Natural Science Foundation of China (Grant No. 61402058), the Science and Technology Support Project of Sichuan Province, China (Grant No. 2013GZX0137), the Fund for Young Persons Project of Sichuan Province, China (Grant No. 12ZB017), and the Foundation of Cyberspace Security Key Laboratory of Sichuan Higher Education Institutions, China (Grant No. szjj2014-074).

  13. Multiple vehicle tracking in aerial video sequence using driver behavior analysis and improved deterministic data association

    NASA Astrophysics Data System (ADS)

    Zhang, Xunxun; Xu, Hongke; Fang, Jianwu

    2018-01-01

    Along with the rapid development of unmanned aerial vehicle technology, multiple vehicle tracking (MVT) in aerial video sequences has received widespread interest for providing required traffic information. Due to camera motion and complex backgrounds, MVT in aerial video sequences poses unique challenges. We propose an efficient MVT algorithm via a driver behavior-based Kalman filter (DBKF) and an improved deterministic data association (IDDA) method. First, a hierarchical image registration method is put forward to compensate for the camera motion. Afterward, to improve the accuracy of the state estimation, we propose the DBKF module by incorporating driver behavior into the Kalman filter, where an artificial potential field is introduced to reflect the driver behavior. Then, to implement the data association, a local optimization method is designed instead of global optimization. By introducing an adaptive operating strategy, the proposed IDDA method can also deal with situations in which vehicles suddenly appear or disappear. Finally, comprehensive experiments on the DARPA VIVID data set and KIT AIS data set demonstrate that the proposed algorithm generates satisfactory and superior results.
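
    A minimal Python sketch of the baseline constant-velocity Kalman filter that the DBKF described above extends; the driver-behavior (artificial potential field) term and the data-association step are omitted, and all matrices and measurements below are assumptions.

      import numpy as np

      dt = 0.1
      F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
      H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)   # observe position only
      Q = 0.01 * np.eye(4)                                 # process noise (assumed)
      R = 0.5 * np.eye(2)                                  # measurement noise (assumed)

      x = np.zeros(4)                                      # state: [px, py, vx, vy]
      P = np.eye(4)

      def kf_step(x, P, z):
          # Predict
          x = F @ x
          P = F @ P @ F.T + Q
          # Update with position measurement z = [px, py]
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)
          x = x + K @ (z - H @ x)
          P = (np.eye(4) - K @ H) @ P
          return x, P

      for z in [np.array([1.0, 0.5]), np.array([1.1, 0.6]), np.array([1.2, 0.7])]:
          x, P = kf_step(x, P, z)
      print("estimated state:", x.round(3))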

  14. A nonlinear dynamic age-structured model of e-commerce in spain: Stability analysis of the equilibrium by delay and stochastic perturbations

    NASA Astrophysics Data System (ADS)

    Burgos, C.; Cortés, J.-C.; Shaikhet, L.; Villanueva, R.-J.

    2018-11-01

    First, we propose a deterministic age-structured epidemiological model to study the diffusion of e-commerce in Spain. Afterwards, we determine the parameters (death, birth and growth rates) of the underlying demographic model as well as the parameters (rates of transmission of e-commerce use) of the proposed epidemiological model that best fit real data retrieved from the Spanish National Statistical Institute. Motivated by two facts, first, that the dynamics of acquiring the use of a new technology such as e-commerce is mainly driven by feedback after interacting with our peers (family, friends, mates, mass media, etc.), hence having a certain delay, and second, the inherent uncertainty of sampled real data and the social complexity of the phenomena under analysis, we introduce aftereffect and stochastic perturbations in the initial deterministic model. This leads to a delayed stochastic model for e-commerce. We then investigate sufficient conditions that guarantee the stability in probability of the equilibrium point of the dynamic e-commerce delayed stochastic model. Our theoretical findings are numerically illustrated using real data.
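
    A minimal Python sketch of the modeling idea, assuming a single-cohort toy reduction of the paper's age-structured model: epidemic-style adoption dynamics a'(t) = alpha * a(t - tau) * (1 - a(t)), where peer feedback acts with delay tau. All values are assumptions, not the fitted Spanish data.

      import numpy as np

      alpha, tau, dt, T = 0.8, 1.0, 0.01, 20.0   # adoption rate, delay, step, horizon (assumed)
      lag = int(tau / dt)
      n = int(T / dt)
      a = np.zeros(n)
      a[0] = 0.01                                # initial adopter fraction (assumed)

      for k in range(n - 1):
          a_delayed = a[k - lag] if k >= lag else a[0]   # constant pre-history
          a[k + 1] = a[k] + dt * alpha * a_delayed * (1.0 - a[k])

      print(f"adoption fraction at T = {T:.0f}: {a[-1]:.3f}")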

  15. cDNA Microarray Screening in Food Safety

    PubMed Central

    ROY, SASHWATI; SEN, CHANDAN K

    2009-01-01

    The cDNA microarray technology and related bioinformatics tools present a wide range of novel application opportunities. The technology may be productively applied to address food safety. In this mini-review article, we present an update highlighting the late-breaking discoveries that demonstrate the vitality of cDNA microarray technology as a tool to analyze food safety with reference to microbial pathogens and genetically modified foods. In order to bring microarray technology into mainstream food safety, it is important to develop robust, user-friendly tools that may be applied in a field setting. In addition, there needs to be a standardized process for regulatory agencies to interpret and act upon microarray-based data. The cDNA microarray approach is an emergent technology in diagnostics. Its value lies in being able to provide complementary molecular insight when employed in addition to traditional tests for food safety, as part of a more comprehensive battery of tests. PMID:16466843

  16. 49 CFR 533.6 - Measurement and calculation procedures.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... technology is related to crash-avoidance technologies, safety critical systems or systems affecting safety-critical functions, or technologies designed for the purpose of reducing the frequency of vehicle crashes... improvements related to air conditioning efficiency, off-cycle technologies, and hybridization and other...

  17. Tracking the use of onboard safety technologies across the truck fleet.

    DOT National Transportation Integrated Search

    2009-03-01

    The Transportation Safety Analysis and the Automotive Analysis Divisions at the University of Michigan Transportation Research Institute (UMTRI) initiated the Onboard Safety Technologies project in 2007, supported by FMCSA, to collect detailed informa...

  18. Applying Sensor-Based Technology to Improve Construction Safety Management.

    PubMed

    Zhang, Mingyuan; Cao, Tianzhuo; Zhao, Xuefeng

    2017-08-11

    Construction sites are dynamic and complicated systems. The movement and interaction of people, goods and energy make construction safety management extremely difficult. Due to the ever-increasing amount of information, traditional construction safety management has operated under difficult circumstances. As an effective way to collect, identify and process information, sensor-based technology is deemed to provide a new generation of methods for advancing construction safety management. It makes real-time construction safety management with high efficiency and accuracy a reality and provides a solid foundation for its modernization and informatization. Nowadays, various sensor-based technologies have been adopted for construction safety management, including locating sensor-based technology, vision-based sensing and wireless sensor networks. This paper provides a systematic and comprehensive review of previous studies in this field to acknowledge useful findings, identify the research gaps and point out future research directions.

  19. Applying Sensor-Based Technology to Improve Construction Safety Management

    PubMed Central

    Zhang, Mingyuan; Cao, Tianzhuo; Zhao, Xuefeng

    2017-01-01

    Construction sites are dynamic and complicated systems. The movement and interaction of people, goods and energy make construction safety management extremely difficult. Due to the ever-increasing amount of information, traditional construction safety management has operated under difficult circumstances. As an effective way to collect, identify and process information, sensor-based technology is deemed to provide a new generation of methods for advancing construction safety management. It makes real-time construction safety management with high efficiency and accuracy a reality and provides a solid foundation for its modernization and informatization. Nowadays, various sensor-based technologies have been adopted for construction safety management, including locating sensor-based technology, vision-based sensing and wireless sensor networks. This paper provides a systematic and comprehensive review of previous studies in this field to acknowledge useful findings, identify the research gaps and point out future research directions. PMID:28800061

  20. VASP-4096: a very high performance programmable device for digital media processing applications

    NASA Astrophysics Data System (ADS)

    Krikelis, Argy

    2001-03-01

    Over the past few years, technology drivers for microprocessors have changed significantly. Media data delivery and processing--such as telecommunications, networking, video processing, speech recognition and 3D graphics--is increasing in importance and will soon dominate the processing cycles consumed in computer-based systems. This paper presents the architecture of the VASP-4096 processor. VASP-4096 provides high media performance with low energy consumption by integrating associative SIMD parallel processing with embedded microprocessor technology. The major innovation in VASP-4096 is the integration of thousands of processing units in a single chip that are capable of supporting software-programmable high-performance mathematical functions as well as abstract data processing. In addition to 4096 processing units, VASP-4096 integrates on a single chip a RISC controller that is an implementation of the SPARC architecture, 128 Kbytes of data memory, and I/O interfaces. The SIMD processing in VASP-4096 implements the ASProCore architecture, a proprietary implementation of SIMD processing, and operates at 266 MHz with program instructions issued by the RISC controller. The device also integrates a 64-bit synchronous main memory interface operating at 133 MHz (double data rate) and a 64-bit 66 MHz PCI interface. Compared with other processor architectures that support media processing, VASP-4096 offers true performance scalability, support for deterministic and non-deterministic data processing on a single device, and software programmability that can be reused in future chip generations.

  1. Deterministic and Stochastic Analysis of a Prey-Dependent Predator-Prey System

    ERIC Educational Resources Information Center

    Maiti, Alakes; Samanta, G. P.

    2005-01-01

    This paper reports on studies of the deterministic and stochastic behaviours of a predator-prey system with prey-dependent response function. The first part of the paper deals with the deterministic analysis of uniform boundedness, permanence, stability and bifurcation. In the second part the reproductive and mortality factors of the prey and…

  2. ShinyGPAS: interactive genomic prediction accuracy simulator based on deterministic formulas.

    PubMed

    Morota, Gota

    2017-12-20

    Deterministic formulas for the accuracy of genomic predictions highlight the relationships between prediction accuracy and the potential factors influencing it, prior to performing computationally intensive cross-validation. Visualizing such deterministic formulas in an interactive manner may lead to a better understanding of how genetic factors control prediction accuracy. The software to simulate deterministic formulas for genomic prediction accuracy was implemented in R and encapsulated as a web-based Shiny application. The Shiny genomic prediction accuracy simulator (ShinyGPAS) simulates various deterministic formulas and delivers dynamic scatter plots of prediction accuracy versus the genetic factors impacting it, while requiring only mouse navigation in a web browser. ShinyGPAS is available at: https://chikudaisei.shinyapps.io/shinygpas/. ShinyGPAS is a Shiny-based interactive genomic prediction accuracy simulator using deterministic formulas. It can be used for interactively exploring potential factors that influence prediction accuracy in genome-enabled prediction, simulating achievable prediction accuracy prior to genotyping individuals, or supporting in-class teaching. ShinyGPAS is open source software, and it is hosted online as a freely available web-based resource with an intuitive graphical user interface.
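
    As an example of the kind of deterministic formula such a simulator visualizes, the following Python sketch evaluates the well-known Daetwyler et al. (2008) expected-accuracy formula r = sqrt(N h^2 / (N h^2 + Me)); treating this as one of the formulas in ShinyGPAS is an assumption based on the genomic-prediction literature, not a claim about the package's exact contents.

      import numpy as np

      def daetwyler_accuracy(n, h2, me):
          """Expected genomic prediction accuracy for n training individuals,
          heritability h2, and Me independent chromosome segments."""
          return np.sqrt(n * h2 / (n * h2 + me))

      # Accuracy grows with training size but saturates below 1.
      for n in (500, 2000, 10000):
          print(n, round(daetwyler_accuracy(n, h2=0.5, me=1000), 3))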

  3. A Synthetic Vision Preliminary Integrated Safety Analysis

    NASA Technical Reports Server (NTRS)

    Hemm, Robert; Houser, Scott

    2001-01-01

    This report documents efforts to analyze a sample of aviation safety programs, using the LMI-developed integrated safety analysis tool to determine the change in system risk resulting from Aviation Safety Program (AvSP) technology implementation. Specifically, we have worked to modify existing system safety tools to address the safety impact of synthetic vision (SV) technology. Safety metrics include reliability, availability, and resultant hazard. This analysis of SV technology is intended to be part of a larger effort to develop a model that is capable of "providing further support to the product design and development team as additional information becomes available". The reliability analysis portion of the effort is complete and is fully documented in this report. The simulation analysis is still underway; it will be documented in a subsequent report. The specific goal of this effort is to apply the integrated safety analysis to SV technology. This report also contains a brief discussion of data necessary to expand the human performance capability of the model, as well as a discussion of human behavior and its implications for system risk assessment in this modeling environment.

  4. Advanced reactors and associated fuel cycle facilities: safety and environmental impacts.

    PubMed

    Hill, R N; Nutt, W M; Laidler, J J

    2011-01-01

    The safety and environmental impacts of new technology and fuel cycle approaches being considered in current U.S. nuclear research programs are contrasted to conventional technology options in this paper. Two advanced reactor technologies, the sodium-cooled fast reactor (SFR) and the very high temperature gas-cooled reactor (VHTR), are being developed. In general, the new reactor technologies exploit inherent features for enhanced safety performance. A key distinction of advanced fuel cycles is spent fuel recycle facilities and new waste forms. In this paper, the performance of existing fuel cycle facilities and applicable regulatory limits are reviewed. Technology options to improve recycle efficiency, restrict emissions, and/or improve safety are identified. For a closed fuel cycle, potential benefits in waste management are significant, and key waste form technology alternatives are described. Copyright © 2010 Health Physics Society

  5. Evaluate error correction ability of magnetorheological finishing by smoothing spectral function

    NASA Astrophysics Data System (ADS)

    Wang, Jia; Fan, Bin; Wan, Yongjian; Shi, Chunyan; Zhuo, Bin

    2014-08-01

    Power Spectral Density (PSD) is well established in optics design and manufacturing as a characterization of mid-to-high spatial frequency (MHSF) errors. The Smoothing Spectral Function (SSF) is a newly proposed parameter, based on the PSD, for evaluating the error correction ability of computer controlled optical surfacing (CCOS) technologies. As a typical deterministic, sub-aperture finishing technology based on CCOS, magnetorheological finishing (MRF) inevitably introduces MHSF errors. SSF is employed to study the ability of the MRF process to correct errors at different spatial frequencies. The surface figures and PSD curves of a work-piece machined by MRF are presented. By calculating the SSF curve, the correction ability of MRF for errors at different spatial frequencies is expressed as a normalized numerical value.
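
    A minimal Python sketch of the PSD machinery underlying the SSF, assuming the SSF can be illustrated as a band-by-band ratio of post- to pre-polish PSDs (the authors' exact definition may differ); the profile, sampling interval, and smoothing factor below are synthetic assumptions.

      import numpy as np

      def psd_1d(profile, dx):
          """One-sided PSD of a 1-D height profile sampled every dx
          (window-normalization factor omitted for brevity)."""
          n = len(profile)
          spec = np.fft.rfft(profile * np.hanning(n))
          return np.fft.rfftfreq(n, d=dx), (np.abs(spec) ** 2) * dx / n

      rng = np.random.default_rng(1)
      dx = 1e-3                                    # 1 mm sampling interval (assumed)
      x = np.arange(4096) * dx
      before = 50e-9 * np.sin(2 * np.pi * 5 * x) + 5e-9 * rng.normal(size=x.size)
      after = 0.2 * before                         # stand-in for the post-polish surface

      f, p_before = psd_1d(before, dx)
      _, p_after = psd_1d(after, dx)
      ssf = p_after / np.maximum(p_before, 1e-30)  # <1 where errors in that band were reduced
      print(f"PSD reduction factor (all bands): {ssf.mean():.3f}")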

  6. Comparing the performance of residential fire sprinklers with other life-safety technologies.

    PubMed

    Butry, David T

    2012-09-01

    Residential fire sprinklers have long proven themselves as life-safety technologies to the fire service community. Yet, about 1% of all one- and two-family dwelling fires occur in homes protected by sprinklers. It has been argued that measured sprinkler performance has ignored factors confounding the relationship between sprinkler use and performance. In this analysis, sprinkler performance is measured by comparing 'like' structure fires, while conditioning on smoke detection technology and neighborhood housing and socioeconomic conditions, using propensity score matching. Results show that residential fire sprinklers protect occupant and firefighter health and safety, and are comparable to other life-safety technologies. Published by Elsevier Ltd.

  7. NASA-STD-7009 Guidance Document for Human Health and Performance Models and Simulations

    NASA Technical Reports Server (NTRS)

    Walton, Marlei; Mulugeta, Lealem; Nelson, Emily S.; Myers, Jerry G.

    2014-01-01

    Rigorous verification, validation, and credibility (VVC) processes are imperative to ensure that models and simulations (MS) are sufficiently reliable to address issues within their intended scope. The NASA standard for MS, NASA-STD-7009 (7009) [1], was an outcome of the Columbia Accident Investigation Board (CAIB), intended to ensure that MS are developed, applied, and interpreted appropriately for making decisions that may impact crew or mission safety. Because 7009 focuses on engineering systems, a NASA-STD-7009 Guidance Document is being developed to augment the 7009 and provide information, tools, and techniques applicable to the probabilistic and deterministic biological MS more prevalent in human health and performance (HHP) and space biomedical research and operations.

  8. Research and technology in the Federal Motor Carrier Safety Administration

    DOT National Transportation Integrated Search

    2002-01-01

    As the Federal Government's chief commercial vehicle safety agency, the Federal Motor Carrier Safety Administration's (FMCSA), Office of Research and Technology (R&T) focuses on saving lives and reducing injuries by helping to prevent crashes involvi...

  9. Safety inspection of plant products

    USDA-ARS?s Scientific Manuscript database

    Advances in hyperspectral imaging technology have provided enormous opportunity for the food industry and research community to develop rapid and non-invasive inspection methods for food safety inspection. This chapter reviews and discusses different aspects of using this technology in safety inspec...

  10. What next after determinism in the ontology of technology? Distributing responsibility in the biofuel debate.

    PubMed

    Boucher, Philip

    2011-09-01

    This article builds upon previous discussion of social and technical determinisms as implicit positions in the biofuel debate. To ensure these debates are balanced, it has been suggested that they should be designed to contain a variety of deterministic positions. Whilst it is agreed that determinism does not feature strongly in contemporary academic literatures, it is found that deterministic positions have generally been superseded by an absence of any substantive conceptualisation of how the social shaping of technology may be related to, or occur alongside, an objective or autonomous reality. The problem of determinism emerges at an ontological level and must be resolved in situ. A critical realist approach to technology is presented which may provide a more appropriate framework for debate. In dialogue with previous discussion, the distribution of responsibility is revisited with reference to the role of scientists and engineers.

  11. Electromagnetic sensing for deterministic finishing gridded domes

    NASA Astrophysics Data System (ADS)

    Galbraith, Stephen L.

    2013-06-01

    Electromagnetic sensing is a promising technology for precisely locating conductive grid structures that are buried in optical ceramic domes. Burying grid structures directly in the ceramic makes gridded dome construction easier, but a practical sensing technology is required to locate the grid relative to the dome surfaces. This paper presents a novel approach being developed for locating mesh grids that are physically thin, on the order of a mil, curved, and 75% to 90% open space. Non-contact location sensing takes place over a distance of 1/2 inch. A non-contact approach was required because the presence of the ceramic material precludes touching the grid with a measurement tool. Furthermore, the ceramic which may be opaque or transparent is invisible to the sensing technology which is advantageous for calibration. The paper first details the physical principles being exploited. Next, sensor impedance response is discussed for thin, open mesh, grids versus thick, solid, metal conductors. Finally, the technology approach is incorporated into a practical field tool for use in inspecting gridded domes.

  12. Next generation safety performance monitoring at signalized intersections using connected vehicle technology.

    DOT National Transportation Integrated Search

    2014-08-01

    Crash-based safety evaluation is often hampered by randomness, lack of timeliness, and rarity of crash : occurrences. This is particularly the case for technology-driven safety improvement projects that are : frequently updated or replaced by newer o...

  13. Integration of safety technologies into rheumatology and orthopedics practices: a randomized, controlled trial.

    PubMed

    Moorjani, Gautam R; Bedrick, Edward J; Michael, Adrian A; Peisajovich, Andres; Sibbitt, Wilmer L; Bankhurst, Arthur D

    2008-07-01

    To identify and integrate new safety technologies into outpatient musculoskeletal procedures and measure the effect on outcome, including pain. Using national resources for patient safety and literature review, the following safety technologies were identified: a safety needle to reduce inadvertent needlesticks to health care workers, and the reciprocating procedure device (RPD) to improve patient safety and reduce pain. Five hundred sixty-six musculoskeletal procedures involving syringes and needles were randomized to either an RPD group or a conventional syringe group, and pain, quality, safety, and physician acceptance were measured. During 566 procedures, no accidental needlesticks occurred with safety needles. Use of the RPD resulted in a 35.4% reduction (95% confidence interval [95% CI] 24-46%) in patient-assessed pain (mean +/- SD scores on a visual analog pain scale [VAPS] 3.12 +/- 2.23 for the RPD and 4.83 +/- 3.22 for the conventional syringe; P < 0.001) and a 49.5% reduction (95% CI 34-64%) in patient-assessed significant pain (VAPS score > or =5) (P < 0.001). Physician acceptance of the RPD combined with a safety needle was excellent. As mandated by the Joint Commission and the Occupational Safety and Health Administration, safety technologies and the use of pain scales can be successfully integrated into rheumatologic and orthopedic procedures. The combination of a safety needle to reduce needlestick injuries to health care workers and the RPD to improve safety and outcome of patients is effective and well accepted by physicians.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Y M; Bush, K; Han, B

    Purpose: Accurate and fast dose calculation is a prerequisite of precision radiation therapy in modern photon and particle therapy. While Monte Carlo (MC) dose calculation provides high dosimetric accuracy, the drastically increased computational time hinders its routine use. Deterministic dose calculation methods are fast, but problematic in the presence of tissue density inhomogeneity. We leverage the useful features of deterministic methods and MC to develop a hybrid dose calculation platform with autonomous utilization of MC and deterministic calculation depending on the local geometry, for optimal accuracy and speed. Methods: Our platform utilizes a Geant4 based "localized Monte Carlo" (LMC) method that isolates MC dose calculations only to volumes that have potential for dosimetric inaccuracy. In our approach, additional structures are created encompassing heterogeneous volumes. Deterministic methods calculate dose and energy fluence up to the volume surfaces, where the energy fluence distribution is sampled into discrete histories and transported using MC. Histories exiting the volume are converted back into energy fluence and transported deterministically. By matching boundary conditions at both interfaces, the deterministic dose calculation accounts for dose perturbations "downstream" of localized heterogeneities. Hybrid dose calculation was performed for water and anthropomorphic phantoms. Results: We achieved <1% agreement between deterministic and MC calculations in the water benchmark for photon and proton beams, and dose differences of 2%-15% could be observed in heterogeneous phantoms. The saving in computational time (a factor of ~4-7 compared to a full Monte Carlo dose calculation) was found to be approximately proportional to the volume of the heterogeneous region. Conclusion: Our hybrid dose calculation approach takes advantage of the computational efficiency of deterministic methods and the accuracy of MC, providing a practical tool for high performance dose calculation in modern RT. The approach is generalizable to all modalities where heterogeneities play a large role, notably particle therapy.

  15. New technology for food safety: role of the new technology staff in FSIS

    NASA Astrophysics Data System (ADS)

    Early, Howard

    2004-03-01

    The Food Safety and Inspection Service (FSIS) has implemented new procedures for meat and poultry establishments, egg products plants, and companies that manufacture and sell technology to official establishments to notify the Agency of new technology that they propose to use in meat and poultry establishments or egg products plants. If the new technology could affect FSIS regulations, product safety, inspection procedures, or the safety of Federal inspection program personnel, then the establishment or plant would need to submit a written protocol to the Agency. As part of this process, the submitter will be expected to conduct in-plant trials of the new technology. The submitter will need to provide data to FSIS throughout the duration of the in-plant trial for the Agency to examine. Data may take several forms: laboratory results, weekly or monthly summary production reports, and evaluations from inspection program personnel.

  16. Probabilistic margin evaluation on accidental transients for the ASTRID reactor project

    NASA Astrophysics Data System (ADS)

    Marquès, Michel

    2014-06-01

    ASTRID is a technological demonstrator of the Sodium-cooled Fast Reactor (SFR) under development. The conceptual design studies are being conducted in accordance with the Generation IV reactor objectives, particularly in terms of improving safety. For the hypothetical events belonging to the accidental category "severe accident prevention situations", which have a very low frequency of occurrence, the safety demonstration is no longer based on a deterministic demonstration with conservative assumptions on models and parameters, but on a "Best-Estimate Plus Uncertainty" (BEPU) approach. This BEPU approach is presented in this paper for an Unprotected Loss-of-Flow (ULOF) event. The Best-Estimate (BE) analysis of this ULOF transient is performed with the CATHARE2 code, which is the French reference system code for SFR applications. The objective of the BEPU analysis is twofold: first, to evaluate the safety margin to sodium boiling, taking into account the uncertainties on the input parameters of the CATHARE2 code (twenty-two uncertain input parameters have been identified, which can be classified into five groups: reactor power, accident management, pump characteristics, reactivity coefficients, and thermal parameters and head losses); second, to quantify the contribution of each input uncertainty to the overall uncertainty of the safety margins, in order to refocus R&D efforts on the most influential factors. This paper focuses on the methodological aspects of the evaluation of the safety margin. At least for the preliminary phase of the project (conceptual design), a probabilistic criterion has been fixed in the context of this BEPU analysis; this criterion is the value of the margin to sodium boiling which has a 95% probability of being exceeded, obtained with a confidence level of 95% (i.e., the M5,95 percentile of the margin distribution). This paper presents two methods used to assess this percentile: the Wilks method and the Bootstrap method; the effectiveness of the two methods is compared on the basis of 500 simulations performed with the CATHARE2 code. We conclude that, with only 100 simulations performed with the CATHARE2 code, which is a number of simulations workable in the conceptual design phase of the ASTRID project where the models and the hypotheses are often modified, it is best to use the bootstrap method to evaluate the M5,95 percentile of the margin to sodium boiling, which will provide a slightly conservative result. On the other hand, in order to obtain an accurate estimation of the M5,95 percentile, for the safety report for example, it will be necessary to perform at least 300 simulations with the CATHARE2 code. In this case, both methods (Wilks and Bootstrap) would give equivalent results.
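
    A minimal Python sketch of the two percentile estimators compared above, run on synthetic margins rather than CATHARE2 output (the margin distribution and run count are assumptions): first-order Wilks takes the minimum of 59 runs as a 95/95 lower bound on the 5th percentile, second-order Wilks takes the second-smallest of 100 runs, and the bootstrap resamples the empirical 5th percentile.

      import numpy as np

      rng = np.random.default_rng(42)
      margins = rng.normal(loc=50.0, scale=10.0, size=100)   # stand-in for 100 code runs

      # Wilks (one-sided): with n >= 59 runs the sample minimum is a 95%-confidence
      # lower bound on the 5th percentile (1 - 0.95**59 >= 0.95); with 100 runs the
      # second-smallest value can be used (second-order Wilks).
      wilks_bound = np.sort(margins)[1]

      # Bootstrap: resample the runs, take the empirical 5th percentile each time,
      # and report a conservative (lower) quantile of the bootstrap distribution.
      boot = np.array([np.percentile(rng.choice(margins, size=margins.size, replace=True), 5)
                       for _ in range(5000)])
      bootstrap_bound = np.percentile(boot, 5)

      print(f"Wilks M5,95 estimate:     {wilks_bound:.2f}")
      print(f"Bootstrap M5,95 estimate: {bootstrap_bound:.2f}")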

  17. Food Safety and Intervention Technologies Research: Cold Plasma as a Nonthermal food processing technology

    USDA-ARS?s Scientific Manuscript database

    Contamination of meats, seafood, poultry, eggs, and fresh and fresh-cut fruits and vegetables is an ongoing concern. The Food Safety and Intervention Technologies Research Unit develops and validates innovative approaches and new technologies that control pathogenic bacteria and viruses while preser...

  18. Food Safety and Intervention Technologies research: cold plasma as a nonthermal food processing technology

    USDA-ARS?s Scientific Manuscript database

    Contamination of meats, seafood, poultry, eggs, and fresh and fresh-cut fruits and vegetables is an ongoing concern. The Food Safety and Intervention Technologies Research Unit develops and validates innovative approaches and new technologies that control pathogenic bacteria and viruses while preser...

  19. Validation of Safety-Critical Systems for Aircraft Loss-of-Control Prevention and Recovery

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.

    2012-01-01

    Validation of technologies developed for loss of control (LOC) prevention and recovery poses significant challenges. Aircraft LOC can result from a wide spectrum of hazards, often occurring in combination, which cannot be fully replicated during evaluation. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of hazardous and uncertain conditions, and the validation framework must provide some measure of assurance that the new vehicle safety technologies do no harm (i.e., that they themselves do not introduce new safety risks). This paper summarizes a proposed validation framework for safety-critical systems, provides an overview of validation methods and tools developed by NASA to date within the Vehicle Systems Safety Project, and develops a preliminary set of test scenarios for the validation of technologies for LOC prevention and recovery.

  20. Nearly deterministic quantum Fredkin gate based on weak cross-Kerr nonlinearity

    NASA Astrophysics Data System (ADS)

    Wu, Yun-xiang; Zhu, Chang-hua; Pei, Chang-xing

    2016-09-01

    A scheme of an optical quantum Fredkin gate is presented based on weak cross-Kerr nonlinearity. By an auxiliary coherent state with the cross-Kerr nonlinearity effect, photons can interact with each other indirectly, and a non-demolition measurement for photons can be implemented. Combined with homodyne detection, classical feedforward, polarization beam splitters and Pauli-X operations, a controlled-path gate is constructed. Furthermore, a quantum Fredkin gate is built based on the controlled-path gate. The proposed Fredkin gate is simple in structure and feasible with current experimental technology.

  1. Deterministic implementations of single-photon multi-qubit Deutsch-Jozsa algorithms with linear optics

    NASA Astrophysics Data System (ADS)

    Wei, Hai-Rui; Liu, Ji-Zhen

    2017-02-01

    It is very important to seek an efficient and robust quantum algorithm demanding fewer quantum resources. We propose one-photon three-qubit original and refined Deutsch-Jozsa algorithms with polarization and two linear-momentum degrees of freedom (DOFs). Our schemes are constructed solely using linear optics. Compared to the traditional ones with one DOF, our schemes are more economical and robust because the necessary photons are reduced from three to one. Our linear-optic schemes work in a deterministic way, and they are feasible with current experimental technology.
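
    The following minimal Python sketch simulates the standard three-qubit Deutsch-Jozsa circuit in its phase-oracle form, which is the algorithm the scheme above implements optically; it is a generic qubit-circuit simulation, not a model of the linear-optics setup.

      import numpy as np
      from functools import reduce

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
      H3 = reduce(np.kron, [H, H, H])              # Hadamard on all three qubits

      def dj_is_constant(f):
          """f maps 3-bit integers to {0,1}; returns True iff DJ reports 'constant'."""
          state = np.zeros(8); state[0] = 1.0      # |000>
          state = H3 @ state
          oracle = np.diag([(-1.0) ** f(x) for x in range(8)])   # phase oracle
          state = H3 @ oracle @ state
          return np.isclose(abs(state[0]) ** 2, 1.0)  # all-zero outcome <=> constant

      print(dj_is_constant(lambda x: 0))           # constant -> True
      print(dj_is_constant(lambda x: x & 1))       # balanced -> False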

  2. Deterministic implementations of single-photon multi-qubit Deutsch–Jozsa algorithms with linear optics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Hai-Rui, E-mail: hrwei@ustb.edu.cn; Liu, Ji-Zhen

    2017-02-15

    It is very important to seek an efficient and robust quantum algorithm demanding fewer quantum resources. We propose one-photon three-qubit original and refined Deutsch–Jozsa algorithms with polarization and two linear-momentum degrees of freedom (DOFs). Our schemes are constructed solely using linear optics. Compared to the traditional ones with one DOF, our schemes are more economical and robust because the necessary photons are reduced from three to one. Our linear-optic schemes work in a deterministic way, and they are feasible with current experimental technology.

  3. Analysis of whisker-toughened CMC structural components using an interactive reliability model

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Palko, Joseph L.

    1992-01-01

    Realizing wider utilization of ceramic matrix composites (CMC) requires the development of advanced structural analysis technologies. This article focuses on the use of interactive reliability models to predict component probability of failure. The deterministic Willam-Warnke failure criterion serves as the theoretical basis for the reliability model presented here. The model has been implemented into a test-bed software program. This computer program has been coupled to a general-purpose finite element program. A simple structural problem is presented to illustrate the reliability model and the computer algorithm.

  4. Application of technology developed for flight simulation at NASA. Langley Research Center

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1991-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations including mathematical model computation and data input/output to the simulators must be deterministic and be completed in as short a time as possible. Personnel at NASA's Langley Research Center are currently developing the use of supercomputers for simulation mathematical model computation for real-time simulation. This, coupled with the use of an open systems software architecture, will advance the state-of-the-art in real-time flight simulation.

  5. Bidirectional Controlled Joint Remote State Preparation via a Seven-Qubit Entangled State

    NASA Astrophysics Data System (ADS)

    Wang, Xiao-yu; Mo, Zhi-wen

    2017-04-01

    A new protocol for implementing five-party bidirectional controlled joint remote state preparation is proposed by using a seven-qubit entangled state as the quantum channel. It is shown that the two distant senders can simultaneously and deterministically exchange their states with the other senders under the control of the supervisor, and that the protocol cannot succeed without the permission of the controller. Only Pauli operations and single-qubit measurements are used in our scheme, so the five-party scheme is feasible with current technologies.

  6. Stability analysis of multi-group deterministic and stochastic epidemic models with vaccination rate

    NASA Astrophysics Data System (ADS)

    Wang, Zhi-Gang; Gao, Rui-Mei; Fan, Xiao-Ming; Han, Qi-Xing

    2014-09-01

    We discuss in this paper a deterministic multi-group MSIR epidemic model with a vaccination rate; the basic reproduction number ℛ0, a key parameter in epidemiology, is a threshold which determines the persistence or extinction of the disease. By using Lyapunov function techniques, we show that if ℛ0 is greater than 1 and the deterministic model obeys some conditions, then the disease will prevail, the infective persists, and the endemic state is asymptotically stable in a feasible region. If ℛ0 is less than or equal to 1, then the infective disappears and the disease dies out. In addition, stochastic noises around the endemic equilibrium are added to the deterministic MSIR model, so that the deterministic model is extended to a system of stochastic ordinary differential equations. In the stochastic version, we carry out a detailed analysis of the asymptotic behavior of the stochastic model. Regarding the value of ℛ0, when the stochastic system obeys some conditions and ℛ0 is greater than 1, we deduce that the stochastic system is stochastically asymptotically stable. Finally, the deterministic and stochastic model dynamics are illustrated through computer simulations.
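
    A minimal Python sketch, reduced from the multi-group MSIR to a single-group SIR with vaccination, comparing one deterministic path with one stochastically perturbed path (Euler-Maruyama); the parameter values, the noise form, and the vaccinated reproduction number Rv are all assumptions.

      import numpy as np

      beta, gamma, mu, v = 0.4, 0.1, 0.01, 0.005   # transmission, recovery, birth/death, vaccination (assumed)
      sigma = 0.02                                 # noise intensity around the endemic state (assumed)
      R0 = beta / (gamma + mu)
      Rv = R0 * mu / (mu + v)                      # reproduction number with vaccination (assumed form)
      print(f"R0 = {R0:.2f}, with vaccination Rv = {Rv:.2f} (> 1: endemic)")

      dt, n_steps = 0.1, 2000
      s, i = 0.99, 0.01                            # deterministic state
      s_sto, i_sto = s, i                          # stochastic state
      rng = np.random.default_rng(7)

      for _ in range(n_steps):
          # Deterministic SIR-with-vaccination step (forward Euler)
          ds = mu - beta * s * i - (mu + v) * s
          di = beta * s * i - (gamma + mu) * i
          s, i = s + ds * dt, i + di * dt
          # Same drift plus multiplicative noise on I (Euler-Maruyama)
          dw = rng.normal(0.0, np.sqrt(dt))
          ds_ = (mu - beta * s_sto * i_sto - (mu + v) * s_sto) * dt
          di_ = (beta * s_sto * i_sto - (gamma + mu) * i_sto) * dt + sigma * i_sto * dw
          s_sto, i_sto = s_sto + ds_, i_sto + di_

      print(f"deterministic I = {i:.4f}, stochastic I = {i_sto:.4f}")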

  7. Factors Associated With Barcode Medication Administration Technology That Contribute to Patient Safety: An Integrative Review.

    PubMed

    Strudwick, Gillian; Reisdorfer, Emilene; Warnock, Caroline; Kalia, Kamini; Sulkers, Heather; Clark, Carrie; Booth, Richard

    In an effort to prevent medication errors, barcode medication administration technology has been implemented in many health care organizations. An integrative review was conducted to understand the effect of barcode medication administration technology on medication errors and the characteristics of use, demonstrated by nurses, that contribute to medication safety. Addressing poor system use may support improved patient safety through the reduction of medication administration errors.

  8. Technology and teen drivers.

    PubMed

    Lee, John D

    2007-01-01

    The rapid evolution of computing, communication, and sensor technology is likely to affect young drivers more than others. The distraction potential of infotainment technology stresses the same vulnerabilities that already lead young drivers to crash more frequently than other drivers. Cell phones, text messaging, MP3 players, and other nomadic devices all present a threat because young drivers may lack the spare attentional capacity for vehicle control and the ability to anticipate and manage hazards. Moreover, young drivers are likely to be the first and most aggressive users of new technology. Fortunately, emerging technology can also support safe driving. Electronic stability control, collision avoidance systems, intelligent speed adaptation, and vehicle tracking systems can all help mitigate the threats to young drivers. However, technology alone is unlikely to make young drivers safer. One promising approach to tailoring technology to teen drivers is to extend proven methods for enhancing young driver safety. The success of graduated drivers license programs (GDL) and the impressive safety benefit of supervised driving suggest ways of tailoring technology to the needs of young drivers. To anticipate the effects of technology on teen driving it may be useful to draw an analogy between the effects of passengers and the effects of technology. Technology can act as a teen passenger and undermine safety or it can act as an adult passenger and enhance safety. Rapidly developing technology may have particularly large effects on teen drivers. To maximize the positive effects and minimize the negative effects will require a broad range of industries to work together. Ideally, vehicle manufacturers would work with infotainment providers, insurance companies, and policy makers to craft new technologies so that they accommodate the needs of young drivers. Without such collaboration young drivers will face even greater challenges to their safety as new technologies emerge.

  9. New technologies and worker safety in western agriculture.

    PubMed

    Fenske, Richard A

    2009-01-01

    The New Paths: Health and Safety in Western Agriculture conference, November 11-13, 2008, highlighted the role of technological innovation in agricultural production. The tree fruit industry in the Pacific Northwest has adopted a "technology road map" to reduce production costs and improve efficiency. An agricultural tour provided field demonstrations and discussions on such topics as mobile work platforms in orchards, traumatic and musculoskeletal injuries, and new pest control technologies. Occupational safety and health research will need to adapt to and keep pace with rapid changes in agricultural production processes.

  10. [Innovative technology and blood safety].

    PubMed

    Begue, S; Morel, P; Djoudi, R

    2016-11-01

    Although technological innovations alone are not enough to improve blood safety, their contributions to blood transfusion over several decades have been major. The improvement of blood donation (new apheresis devices, RFID), of blood components (additive solutions, pathogen reduction technology, automated processing of platelet concentrates), and of the manufacturing process of these products (automated processing of whole blood)--all steps where technological innovations were implemented--has led to better traceability, more efficient processes, quality improvement of blood products, and therefore increased blood safety for blood donors and patients. While we are on the threshold of a great change with the progress of pathogen reduction technology (for whole blood and red blood cells), we hope to see the production of ex vivo red blood cells or platelets become a reality and open new conceptual paths for blood safety. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  11. 75 FR 27734 - Agency Information Collection Activities; Proposed Collection; Comment Request; Safety Standard...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-18

    ... Glatz, Division of Policy and Planning, Office of Information Technology, Consumer Product Safety... appropriate, and other forms of information technology. Title: Safety Standard for Bicycle Helmets--16 CFR... and process for Commission acceptance of accreditation of third party conformity assessment bodies for...

  12. Qualitative Research for Patient Safety Using ICTs: Methodological Considerations in the Technological Age.

    PubMed

    Yee, Kwang Chien; Wong, Ming Chao; Turner, Paul

    2017-01-01

    Considerable effort and resources have been dedicated to improving the quality and safety of patient care through health information systems, but there is still significant scope for improvement. One contributing factor to the lack of progress in patient safety improvement, especially where technology has been deployed, relates to an over-reliance on purely objective, quantitative, positivist research paradigms as the basis for generating and validating evidence of improvement. This paper argues the need for greater recognition and accommodation of evidence of improvement generated through more subjective, qualitative and pragmatic research paradigms to aid patient safety, especially where technology is deployed. This paper discusses how acknowledging the role and value of more subjective ontologies and pragmatist epistemologies can support improvement science research, and it illustrates some of the challenges and benefits of adopting qualitative research methods in patient safety improvement projects, focusing particularly on challenges in the technological era. While adopting methods that can more readily capture, analyse and interpret direct user experiences, attitudes, insights and behaviours in their contextual settings can enhance patient safety 'on the ground' and reduce and/or mitigate errors, the challenges of using these methods with younger, "technologically-centred" healthcare professionals and patients need to be recognised.

  13. Harnessing hospital purchase power to design safe care delivery.

    PubMed

    Ebben, Steven F; Gieras, Izabella A; Gosbee, Laura Lin

    2008-01-01

    Since the Institute of Medicine's well-publicized 1999 report To Err is Human, the healthcare patient safety movement has grown at an exponential pace. However, much more can be done to advance patient safety from a care process design vantage point: improving safety through effective care processes and technology integration. While progress is being made, the chasm between technology developers and caregivers remains profound. Why hasn't more been done to expand our view of patient safety to include technology design? Healthcare organizations have not consolidated their purchasing power to expect improved designs. This article will (1) provide an assessment of the present state of healthcare technology management and (2) provide recommendations for collaborative design of safe healthcare delivery systems.

  14. Deterministic or Probabilistic - Robustness or Resilience: How to Respond to Climate Change?

    NASA Astrophysics Data System (ADS)

    Plag, H.; Earnest, D.; Jules-Plag, S.

    2013-12-01

    Our response to climate change is dominated by a deterministic approach that emphasizes the interaction between only the natural and the built environment. But in the non-ergodic world of unprecedented climate change, social factors drive recovery from unforeseen Black Swans much more than natural or built ones. Particularly the sea level rise discussion focuses on deterministic predictions, accounting for uncertainties in major driving processes with a set of forcing scenarios and public deliberations on which of the plausible trajectories is most likely. Science focuses on the prediction of future climate change, and policies focus on mitigation of both climate change itself and its impacts. The deterministic approach is based on two basic assumptions: 1) Climate change is an ergodic process; 2) The urban coast is a robust system. Evidence suggests that these assumptions may not hold. Anthropogenic changes are pushing key parameters of the climate system outside of the natural range of variability from the last 1 Million years, creating the potential for environmental Black Swans. A probabilistic approach allows for non-ergodic processes and focuses more on resilience, hence does not depend on the two assumptions. Recent experience with hurricanes revealed threshold limitations of the built environment of the urban coast, which, once exceeded, brought to the forefront the importance of the social fabric and social networking in evaluating resilience. Resilience strongly depends on social capital, and building social capital that can create resilience must be a key element in our response to climate change. Although social capital cannot mitigate hazards, social scientists have found that communities rich in strong norms of cooperation recover more quickly than communities without social capital. There is growing evidence that the built environment can affect the social capital of a community, for example public health and perceptions of public safety. This suggests an intriguing hypothesis: disaster risk reduction programs need to account for whether they also facilitate the public trust, cooperation, and communication needed to recover from a disaster. Our work in the Hampton Roads area, where the probability of hazardous flooding and inundation events exceeding the thresholds of the infrastructure is high, suggests that to facilitate the paradigm shift from the deterministic to a probabilistic approach, natural sciences have to focus on hazard probabilities, while engineering and social sciences have to work together to understand how interactions of the built and social environments impact robustness and resilience. The current science-policy relationship needs to be augmented by social structures that can learn from previous unexpected events. In this response to climate change, science does not have the primary goal to reduce uncertainties and prediction errors, but rather to develop processes that can utilize uncertainties and surprises to increase robustness, strengthen resilience, and reduce fragility of the social systems during times when infrastructure fails.

  15. Evaluating and Predicting Patient Safety for Medical Devices With Integral Information Technology

    DTIC Science & Technology

    2005-01-01

    ... have the potential to become solid tools for manufacturers, purchasers, and consumers to evaluate patient safety issues in various health related... Jiajie Zhang, Vimla L. Patel, Todd R... errors are due to inappropriate designs for user interactions, rather than mechanical failures. Evaluating and predicting patient safety in medical...

  16. Conflict Detection and Resolution for Future Air Transportation Management

    NASA Technical Reports Server (NTRS)

    Krozel, Jimmy; Peters, Mark E.; Hunter, George

    1997-01-01

    With a Free Flight policy, the emphasis for air traffic control is shifting from active control to passive air traffic management with a policy of intervention by exception. Aircraft will be allowed to fly user preferred routes, as long as safety Alert Zones are not violated. If there is a potential conflict, two (or more) aircraft must be able to arrive at a solution for conflict resolution without controller intervention. Thus, decision aid tools are needed in Free Flight to detect and resolve conflicts, and several problems must be solved to develop such tools. In this report, we analyze and solve problems of proximity management, conflict detection, and conflict resolution under a Free Flight policy. For proximity management, we establish a system based on Delaunay Triangulations of aircraft at constant flight levels. Such a system provides a means for analyzing the neighbor relationships between aircraft and the nearby free space around air traffic which can be utilized later in conflict resolution. For conflict detection, we perform both 2-dimensional and 3-dimensional analyses based on the penetration of the Protected Airspace Zone. Both deterministic and non-deterministic analyses are performed. We investigate several types of conflict warnings including tactical warnings prior to penetrating the Protected Airspace Zone, methods based on the reachability overlap of both aircraft, and conflict probability maps to establish strategic Alert Zones around aircraft.
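
    A minimal Python sketch of the proximity-management idea described above, assuming aircraft at a single flight level can be treated as points in the plane: a Delaunay triangulation supplies the neighbor relationships, and neighbor pairs closer than an assumed protected-zone radius are flagged for conflict detection. Positions, units, and the radius are illustrative assumptions.

      import numpy as np
      from scipy.spatial import Delaunay

      rng = np.random.default_rng(3)
      pts = rng.uniform(0, 100, size=(20, 2))      # aircraft positions, nmi (assumed)
      PROTECTED_RADIUS = 5.0                       # alert threshold, nmi (assumed)

      tri = Delaunay(pts)
      edges = set()
      for simplex in tri.simplices:                # collect unique triangulation edges
          for a in range(3):
              i, j = sorted((simplex[a], simplex[(a + 1) % 3]))
              edges.add((i, j))

      conflicts = [(i, j, np.linalg.norm(pts[i] - pts[j]))
                   for i, j in sorted(edges)
                   if np.linalg.norm(pts[i] - pts[j]) < PROTECTED_RADIUS]
      for i, j, d in conflicts:
          print(f"potential conflict: aircraft {i} and {j}, separation {d:.1f} nmi")
      print(f"{len(edges)} neighbor pairs checked, {len(conflicts)} potential conflicts")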

  17. Deterministic quantum teleportation with feed-forward in a solid state system.

    PubMed

    Steffen, L; Salathe, Y; Oppliger, M; Kurpiers, P; Baur, M; Lang, C; Eichler, C; Puebla-Hellmann, G; Fedorov, A; Wallraff, A

    2013-08-15

    Engineered macroscopic quantum systems based on superconducting electronic circuits are attractive for experimentally exploring diverse questions in quantum information science. At the current state of the art, quantum bits (qubits) are fabricated, initialized, controlled, read out and coupled to each other in simple circuits. This enables the realization of basic logic gates, the creation of complex entangled states and the demonstration of algorithms or error correction. Using different variants of low-noise parametric amplifiers, dispersive quantum non-demolition single-shot readout of single-qubit states with high fidelity has enabled continuous and discrete feedback control of single qubits. Here we realize full deterministic quantum teleportation with feed-forward in a chip-based superconducting circuit architecture. We use a set of two parametric amplifiers for both joint two-qubit and individual qubit single-shot readout, combined with flexible real-time digital electronics. Our device uses a crossed quantum bus technology that allows us to create complex networks with arbitrary connecting topology in a planar architecture. The deterministic teleportation process succeeds with order unit probability for any input state, as we prepare maximally entangled two-qubit states as a resource and distinguish all Bell states in a single two-qubit measurement with high efficiency and high fidelity. We teleport quantum states between two macroscopic systems separated by 6 mm at a rate of 10^4 s^-1, exceeding other reported implementations. The low transmission loss of superconducting waveguides is likely to enable the range of this and other schemes to be extended to significantly larger distances, enabling tests of non-locality and the realization of elements for quantum communication at microwave frequencies. The demonstrated feed-forward may also find application in error correction schemes.

  18. Developing safety performance functions incorporating reliability-based risk measures.

    PubMed

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
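
    The reliability computation described here can be approximated with a toy Monte Carlo in place of FORM. In the sketch below, the limit state g = SSD_supply - SSD_demand follows the paper's definition, but every distribution and parameter value is an invented placeholder.

      # A hedged illustration of the reliability idea: the limit state is
      # g = SSD_supply - SSD_demand as in the paper, but P(nc) = P(g < 0) is
      # estimated by plain Monte Carlo rather than FORM, and every distribution
      # and parameter value below is an invented placeholder.
      import numpy as np

      rng = np.random.default_rng(1)
      N = 200_000

      # Demand: stopping sight distance from speed, reaction time, deceleration.
      v = rng.normal(90.0, 10.0, N) / 3.6              # operating speed, m/s
      t_r = rng.lognormal(np.log(1.5), 0.3, N)         # perception-reaction time, s
      a = np.clip(rng.normal(3.4, 0.6, N), 1.0, None)  # deceleration, m/s^2
      ssd_demand = v * t_r + v**2 / (2.0 * a)

      # Supply: available sight distance on the curve (assumed distribution).
      ssd_supply = rng.normal(160.0, 15.0, N)

      g = ssd_supply - ssd_demand                      # limit state function
      print(f"P(nc) ~= {np.mean(g < 0.0):.4f}")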

  19. 76 FR 31350 - Cruise Vessel Safety and Security Act of 2010, Available Technology

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-31

    ... DEPARTMENT OF HOMELAND SECURITY Coast Guard [Docket No. USCG-2011-0357] Cruise Vessel Safety and Security Act of 2010, Available Technology AGENCY: Coast Guard, DHS. ACTION: Notice of request for comments... Security and Safety Act of 2010 (CVSSA), specifically related to video recording and overboard detection...

  20. 78 FR 17140 - Upholstered Furniture Fire Safety Technology; Meeting and Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-20

    ... Furniture Fire Safety Technology; Meeting and Request for Comments AGENCY: Consumer Product Safety... Commission (CPSC, Commission, or we) is announcing its intent to hold a meeting on upholstered furniture fire... http://www.cpsc.gov/meetingsignup.html and click on the link titled, ``Upholstered Furniture Fire...

  1. Deterministic and stochastic CTMC models from Zika disease transmission

    NASA Astrophysics Data System (ADS)

    Zevika, Mona; Soewono, Edy

    2018-03-01

    Zika infection is one of the most important mosquito-borne diseases in the world. Zika virus (ZIKV) is transmitted by many Aedes-type mosquitoes including Aedes aegypti. Pregnant women with the Zika virus are at risk of having a fetus or infant with a congenital defect and suffering from microcephaly. Here, we formulate a Zika disease transmission model using two approaches, a deterministic model and a continuous-time Markov chain stochastic model. The basic reproduction ratio is constructed from a deterministic model. Meanwhile, the CTMC stochastic model yields an estimate of the probability of extinction and outbreaks of Zika disease. Dynamical simulations and analysis of the disease transmission are shown for the deterministic and stochastic models.
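
    The two modeling approaches can be contrasted on a deliberately simplified SIR caricature (the actual Zika model couples human and mosquito compartments). The sketch below integrates the deterministic ODE and runs a Gillespie-type CTMC to estimate the extinction probability; all rates and the population size are illustrative.

      # A compact caricature of the paper's two approaches on a plain SIR model
      # (the actual Zika model couples human and mosquito compartments): the
      # deterministic ODE gives the mean dynamics and R0, while a Gillespie
      # CTMC yields an extinction probability. Rates and population size are
      # illustrative assumptions.
      import numpy as np
      from scipy.integrate import solve_ivp

      beta, gamma, N = 0.5, 0.25, 500        # illustrative; R0 = beta/gamma = 2

      def sir(t, y):
          s, i = y
          return [-beta * s * i / N, beta * s * i / N - gamma * i]

      det = solve_ivp(sir, (0.0, 100.0), [N - 1.0, 1.0], dense_output=True)
      ts = np.linspace(0.0, 100.0, 1001)
      print(f"ODE peak infected: {det.sol(ts)[1].max():.0f} of {N}")

      def goes_extinct(rng):
          s, i, t = N - 1, 1, 0.0
          while i > 0 and t < 100.0:
              r_inf, r_rec = beta * s * i / N, gamma * i
              t += rng.exponential(1.0 / (r_inf + r_rec))
              if rng.random() < r_inf / (r_inf + r_rec):
                  s, i = s - 1, i + 1
              else:
                  i -= 1
          return i == 0

      rng = np.random.default_rng(2)
      p_ext = np.mean([goes_extinct(rng) for _ in range(2000)])
      print(f"CTMC extinction estimate: {p_ext:.3f} "
            f"(branching-process value 1/R0 = {gamma/beta:.2f})")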

  2. Distinguishing between stochasticity and determinism: Examples from cell cycle duration variability.

    PubMed

    Pearl Mizrahi, Sivan; Sandler, Oded; Lande-Diner, Laura; Balaban, Nathalie Q; Simon, Itamar

    2016-01-01

    We describe a recent approach for distinguishing between stochastic and deterministic sources of variability, focusing on the mammalian cell cycle. Variability between cells is often attributed to stochastic noise, although it may be generated by deterministic components. Interestingly, lineage information can be used to distinguish between variability and determinism. Analysis of correlations within a lineage of the mammalian cell cycle duration revealed its deterministic nature. Here, we discuss the sources of such variability and the possibility that the underlying deterministic process is due to the circadian clock. Finally, we discuss the "kicked cell cycle" model and its implication on the study of the cell cycle in healthy and cancerous tissues. © 2015 WILEY Periodicals, Inc.

  3. The impact of assay technology as applied to safety assessment in reducing compound attrition in drug discovery.

    PubMed

    Thomas, Craig E; Will, Yvonne

    2012-02-01

    Attrition in the drug industry due to safety findings remains high and requires a shift in the current safety testing paradigm. Many companies are now positioning safety assessment at each stage of the drug development process, including discovery, where an early perspective on potential safety issues is sought, often at chemical scaffold level, using a variety of emerging technologies. Given the lengthy development time frames of drugs in the pharmaceutical industry, the authors believe that the impact of new technologies on attrition is best measured as a function of the quality and timeliness of candidate compounds entering development. The authors provide an overview of in silico and in vitro models, as well as more complex approaches such as 'omics,' and where they are best positioned within the drug discovery process. An important take-away is that not all technologies should be applied to all projects. Technologies vary widely in their validation state, throughput and cost. A thoughtful combination of validated and emerging technologies is crucial in identifying the most promising candidates to move to proof-of-concept testing in humans. In spite of the challenges inherent in applying new technologies to drug discovery, the successes and the recognition that we cannot continue to rely on safety assessment practices used for decades have led to rather dramatic strategy shifts and fostered partnerships across government agencies and industry. We are optimistic that these efforts will ultimately benefit patients by delivering effective and safe medications in a timely fashion.

  4. A new method to evaluate future impact of vehicle safety technology in Sweden.

    PubMed

    Strandroth, Johan; Sternlund, Simon; Tingvall, Claes; Johansson, Roger; Rizzi, Matteo; Kullgren, Anders

    2012-10-01

    In the design of a safe road transport system there is a need to better understand the safety challenges lying ahead. One way of doing that is to evaluate safety technology with retrospective analysis of crashes. However, by using retrospective data there is the risk of adapting safety innovations to scenarios irrelevant in the future. Also, challenges arise as safety interventions do not act alone but are rather interacting components in a complex road transport system. The objective of this study was therefore to facilitate the prioritizing of road safety measures by developing and applying a new method to consider the possible impact of future vehicle safety technology. The key point was to project the chain of events leading to a crash today into the crashes for a given time in the future. Assumptions on the implementation of safety technologies were made and applied to the crashes of today. It was estimated which crashes would be prevented, and the residual was analyzed to identify the characteristics of future crashes. The Swedish Transport Administration's in-depth studies of fatal crashes from 2010 involving car passengers (n=156) were used. This study estimated that the number of killed car occupants would be reduced by 53 percent from the year 2010 to 2020. Through this new method, valuable information regarding the characteristics of future crashes was found. The results of this study showed that it is possible to evaluate the future impact of vehicle safety technology if detailed and representative crash data are available.

  5. 6 CFR 25.6 - Procedures for designation of qualified anti-terrorism technologies.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....safetyact.gov and by mail upon request sent to: Directorate of Science and Technology, SAFETY Act/room 4320...://www.safetyact.gov and by mail by written request sent to: Directorate of Science and Technology....safetyact.gov and by mail upon request sent to: Directorate of Science and Technology, SAFETY Act/room 4320...

  6. Optimal Control of Hybrid Systems in Air Traffic Applications

    NASA Astrophysics Data System (ADS)

    Kamgarpour, Maryam

    Growing concerns over the scalability of air traffic operations, air transportation fuel emissions and prices, as well as the advent of communication and sensing technologies motivate improvements to the air traffic management system. To address such improvements, in this thesis a hybrid dynamical model as an abstraction of the air traffic system is considered. Wind and hazardous weather impacts are included using a stochastic model. This thesis focuses on the design of algorithms for verification and control of hybrid and stochastic dynamical systems and the application of these algorithms to air traffic management problems. In the deterministic setting, a numerically efficient algorithm for optimal control of hybrid systems is proposed based on extensions of classical optimal control techniques. This algorithm is applied to optimize the trajectory of an Airbus 320 aircraft in the presence of wind and storms. In the stochastic setting, the verification problem of reaching a target set while avoiding obstacles (reach-avoid) is formulated as a two-player game to account for external agents' influence on system dynamics. The solution approach is applied to air traffic conflict prediction in the presence of stochastic wind. Due to the uncertainty in forecasts of the hazardous weather, and hence the unsafe regions of airspace for aircraft flight, the reach-avoid framework is extended to account for stochastic target and safe sets. This methodology is used to maximize the probability of the safety of aircraft paths through hazardous weather. Finally, the problem of modeling and optimization of arrival air traffic and runway configuration in dense airspace subject to stochastic weather data is addressed. This problem is formulated as a hybrid optimal control problem and is solved with a hierarchical approach that decouples safety and performance. As illustrated with this problem, the large scale of air traffic operations motivates future work on the efficient implementation of the proposed algorithms.
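
    The reach-avoid verification mentioned above reduces, in its simplest discrete form, to dynamic programming over a Markov chain. The toy below (far simpler than the thesis's continuous-state formulation) maximizes the probability of reaching a target cell while avoiding an obstacle cell on a 1-D grid; all numbers are invented.

      # A toy illustration (much simpler than the thesis's continuous-state
      # formulation) of stochastic reach-avoid: on a 1-D grid a controlled
      # random walk must reach a target cell without touching an obstacle cell;
      # finite-horizon dynamic programming gives the maximal success probability.
      # All numbers are invented.
      import numpy as np

      n, T = 21, 30
      start, avoid, target = 12, 10, 18    # cell indices (assumed)
      p_slip = 0.2                         # chance "wind" pushes one cell left

      V = np.zeros(n)
      V[target] = 1.0
      for _ in range(T):                   # backward induction over the horizon
          Vn = V.copy()
          for x in range(n):
              if x in (target, avoid):
                  continue                 # absorbing: success or failure
              best = 0.0
              for u in (-1, +1):           # intended step direction
                  xi = min(max(x + u, 0), n - 1)   # intended next cell
                  xs = min(max(x - 1, 0), n - 1)   # slipped next cell
                  best = max(best, (1 - p_slip) * V[xi] + p_slip * V[xs])
              Vn[x] = best
          V = Vn

      print(f"max reach-avoid probability from cell {start}: {V[start]:.3f}")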

  7. Medication safety.

    PubMed

    Keohane, Carol A; Bates, David W

    2008-03-01

    Patient safety is a state of mind, not a technology. The technologies used in the medical setting represent tools that must be properly designed, used well, and assessed on an on-going basis. Moreover, in all settings, building a culture of safety is pivotal for improving safety, and many nontechnologic approaches, such as medication reconciliation and teaching patients about their medications, are also essential. This article addresses the topic of medication safety and examines specific strategies being used to decrease the incidence of medication errors across various clinical settings.

  8. Disentangling Mechanisms That Mediate the Balance Between Stochastic and Deterministic Processes in Microbial Succession

    DOE PAGES

    Dini-Andreote, Francisco; Stegen, James C.; van Elsas, Jan D.; ...

    2015-03-17

    Despite growing recognition that deterministic and stochastic factors simultaneously influence bacterial communities, little is known about mechanisms shifting their relative importance. To better understand underlying mechanisms, we developed a conceptual model linking ecosystem development during primary succession to shifts in the stochastic/deterministic balance. To evaluate the conceptual model we coupled spatiotemporal data on soil bacterial communities with environmental conditions spanning 105 years of salt marsh development. At the local scale there was a progression from stochasticity to determinism due to Na accumulation with increasing ecosystem age, supporting a main element of the conceptual model. At the regional scale, soil organic matter (SOM) governed the relative influence of stochasticity and the type of deterministic ecological selection, suggesting scale-dependency in how deterministic ecological selection is imposed. Analysis of a new ecological simulation model supported these conceptual inferences. Looking forward, we propose an extended conceptual model that integrates primary and secondary succession in microbial systems.

  9. An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 2. Application to Owens Valley, California

    USGS Publications Warehouse

    Guymon, Gary L.; Yen, Chung-Cheng

    1990-01-01

    The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis, reducing the total number of uncertain variables to three: hydraulic conductivity, storage coefficient or specific yield, and the source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position are small in most of the basin, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are overestimated by the two-point probability method.

  10. An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 2. Application to Owens Valley, California

    NASA Astrophysics Data System (ADS)

    Guymon, Gary L.; Yen, Chung-Cheng

    1990-07-01

    The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis, reducing the total number of uncertain variables to three: hydraulic conductivity, storage coefficient or specific yield, and the source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position are small in most of the basin, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are overestimated by the two-point probability method.

  11. Statistics of Delta v magnitude for a trajectory correction maneuver containing deterministic and random components

    NASA Technical Reports Server (NTRS)

    Bollman, W. E.; Chadwick, C.

    1982-01-01

    A number of interplanetary missions now being planned involve placing deterministic maneuvers along the flight path to alter the trajectory. Lee and Boain (1973) examined the statistics of trajectory correction maneuver (TCM) magnitude with no deterministic ('bias') component. The Delta v vector magnitude statistics were generated for several values of random Delta v standard deviations using expansions in terms of infinite hypergeometric series. The present investigation uses a different technique (Monte Carlo simulation) to generate Delta v magnitude statistics for a wider selection of random Delta v standard deviations and also extends the analysis to the case of nonzero deterministic Delta v's. These Delta v magnitude statistics are plotted parametrically. The plots are useful in assisting the analyst in quickly answering questions about the statistics of Delta v magnitude for single TCMs consisting of both a deterministic and a random component. The plots provide quick insight into the nature of the Delta v magnitude distribution for the TCM.
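
    The Monte Carlo technique described is easy to reproduce in outline: draw per-axis random execution errors around a fixed deterministic Delta v and tabulate the magnitude statistics. The values below are illustrative, not those of the study.

      # A hedged re-creation of the paper's Monte Carlo idea: the executed TCM
      # is a fixed deterministic Delta v plus a zero-mean random execution
      # error, and the distribution of |Delta v| is tabulated. All numbers are
      # illustrative, not the study's.
      import numpy as np

      rng = np.random.default_rng(3)
      N = 100_000
      dv_det = np.array([2.0, 0.0, 0.0])              # deterministic part, m/s
      sigma = 0.5                                     # per-axis random std, m/s

      dv = dv_det + rng.normal(0.0, sigma, size=(N, 3))
      mag = np.linalg.norm(dv, axis=1)
      for q in (0.01, 0.50, 0.99):
          print(f"{q:4.0%} quantile of |dv|: {np.quantile(mag, q):.3f} m/s")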

  12. Safety of foods treated with novel process intervention technologies

    USDA-ARS?s Scientific Manuscript database

    Many consumers are familiar with traditional food safety and preservation technologies such as thermal processing (cooking), salting, and pickling to inactivate common foodborne pathogens such as Salmonella spp. and Escherichia coli O157:H7. Many consumers are less familiar with other technologies s...

  13. Evaluating driver reactions to new vehicle technologies intended to increase safety and mobility across the lifespan.

    DOT National Transportation Integrated Search

    2013-05-01

    Personal vehicle manufacturers are introducing a wide range of new technologies that are intended to increase the safety, comfort, and mobility of drivers of all ages. Examples range from semi-autonomous technologies such as adaptive cruise contro...

  14. 75 FR 21602 - Online Safety and Technology Working Group Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-26

    ... OSTWG is tasked with evaluating industry efforts to promote a safe online environment for children. The... and Technology Working Group Meeting AGENCY: National Telecommunications and Information... public meeting of the Online Safety and Technology Working Group (OSTWG). DATES: The meeting will be held...

  15. Methodology for safety optimization of highway cross-sections for horizontal curves with restricted sight distance.

    PubMed

    Ibrahim, Shewkar E; Sayed, Tarek; Ismail, Karim

    2012-11-01

    Several earlier studies have noted the shortcomings with existing geometric design guides which provide deterministic standards. In these standards the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from the standards. To mitigate these shortcomings, probabilistic geometric design has been advocated where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a mechanism for risk measurement to evaluate the safety impact of deviations from design standards. This paper applies reliability analysis for optimizing the safety of highway cross-sections. The paper presents an original methodology to select a suitable combination of cross-section elements with restricted sight distance to result in reduced collisions and consistent risk levels. The purpose of this optimization method is to provide designers with a proactive approach to the design of cross-section elements in order to (i) minimize the risk associated with restricted sight distance, (ii) balance the risk across the two carriageways of the highway, and (iii) reduce the expected collision frequency. A case study involving nine cross-sections that are parts of two major highway developments in British Columbia, Canada, was presented. The results showed that an additional reduction in collisions can be realized by incorporating the reliability component, P(nc) (denoting the probability of non-compliance), in the optimization process. The proposed approach results in reduced and consistent risk levels for both travel directions in addition to further collision reductions. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Adequacy of the default values for skin surface area used for risk assessment and French anthropometric data by a probabilistic approach.

    PubMed

    Dornic, N; Ficheux, A S; Bernard, A; Roudot, A C

    2017-08-01

    The notes of guidance for the testing of cosmetic ingredients and their safety evaluation by the Scientific Committee on Consumer Safety (SCCS) is a document dedicated to ensuring the safety of European consumers. It contains useful data for risk assessment, such as default values for Skin Surface Area (SSA). A more in-depth study of anthropometric data across Europe reveals considerable variations. The default SSA value was derived from a study on the Dutch population, which is known to be one of the tallest nations in the world. This value could be inadequate for shorter populations of Europe. Data were collected in a survey on cosmetic consumption in France. Probabilistic treatment of these data, and analysis of the case of methylisothiazolinone, a sensitizer recently evaluated by a deterministic approach submitted to the SCCS, suggest that the default value for SSA used in quantitative risk assessment might not be relevant for a significant share of the French female population. Other female populations of Southern Europe may also be excluded. This is of importance given that some studies show an increasing risk of developing skin sensitization among women. The disparities in anthropometric data across Europe should be taken into consideration. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.

  18. A Framework for Assessment of Aviation Safety Technology Portfolios

    NASA Technical Reports Server (NTRS)

    Jones, Sharon M.; Reveley, Mary S.

    2014-01-01

    The programs within NASA's Aeronautics Research Mission Directorate (ARMD) conduct research and development to improve the national air transportation system so that Americans can travel as safely as possible. NASA aviation safety systems analysis personnel support various levels of ARMD management in their fulfillment of system analysis and technology prioritization as defined in the agency's program and project requirements. This paper provides a framework for the assessment of aviation safety research and technology portfolios that includes metrics such as projected impact on current and future safety, technical development risk and implementation risk. The paper also contains methods for presenting portfolio analysis and aviation safety Bayesian Belief Network (BBN) output results to management using bubble charts and quantitative decision analysis techniques.

  19. Simultaneous estimation of deterministic and fractal stochastic components in non-stationary time series

    NASA Astrophysics Data System (ADS)

    García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.

    2018-07-01

    In the past few decades, it has been recognized that 1/f fluctuations are ubiquitous in nature. The most widely used mathematical models to capture the long-term memory properties of 1/f fluctuations have been stochastic fractal models. However, physical systems do not usually consist of just stochastic fractal dynamics, but they often also show some degree of deterministic behavior. The present paper proposes a model based on fractal stochastic and deterministic components that can provide a valuable basis for the study of complex systems with long-term correlations. The fractal stochastic component is assumed to be a fractional Brownian motion process and the deterministic component is assumed to be a band-limited signal. We also provide a method that, under the assumptions of this model, is able to characterize the fractal stochastic component and to provide an estimate of the deterministic components present in a given time series. The method is based on a Bayesian wavelet shrinkage procedure that exploits the self-similar properties of the fractal processes in the wavelet domain. This method has been validated over simulated signals and over real signals of economic and biological origin. Real examples illustrate how our model may be useful for exploring the deterministic-stochastic duality of complex systems, and uncovering interesting patterns present in time series.
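
    A minimal simulation of this model class is straightforward. The sketch below superposes a band-limited sinusoid with ordinary Brownian motion (the H = 0.5 special case of fractional Brownian motion) and inspects wavelet detail variances with PyWavelets; this diagnostic is a crude stand-in for the paper's Bayesian wavelet shrinkage.

      # A minimal sketch of the model class in the paper: observed series =
      # band-limited deterministic signal + fractal noise. For simplicity the
      # noise is ordinary Brownian motion (fBm with H = 0.5); wavelet detail
      # variances (PyWavelets) act as a crude stand-in for the paper's Bayesian
      # wavelet shrinkage.
      import numpy as np
      import pywt

      rng = np.random.default_rng(4)
      n = 4096
      t = np.arange(n) / n
      deterministic = np.sin(2 * np.pi * 4 * t)        # band-limited component
      fractal = np.cumsum(rng.normal(0, 0.02, n))      # Brownian motion, H = 0.5
      y = deterministic + fractal

      # For fBm, detail variance grows like 2^(j(2H+1)) toward coarser levels;
      # the low-frequency deterministic part concentrates in the coarsest levels.
      coeffs = pywt.wavedec(y, "db4", level=8)         # [cA8, cD8, ..., cD1]
      for lvl, d in zip(range(8, 0, -1), coeffs[1:]):
          print(f"detail level {lvl}: log2 variance = {np.log2(np.var(d)):6.2f}")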

  20. Inferring Fitness Effects from Time-Resolved Sequence Data with a Delay-Deterministic Model

    PubMed Central

    Nené, Nuno R.; Dunham, Alistair S.; Illingworth, Christopher J. R.

    2018-01-01

    A common challenge arising from the observation of an evolutionary system over time is to infer the magnitude of selection acting upon a specific genetic variant, or variants, within the population. The inference of selection may be confounded by the effects of genetic drift in a system, leading to the development of inference procedures to account for these effects. However, recent work has suggested that deterministic models of evolution may be effective in capturing the effects of selection even under complex models of demography, suggesting the more general application of deterministic approaches to inference. Responding to this literature, we here note a case in which a deterministic model of evolution may give highly misleading inferences, resulting from the nondeterministic properties of mutation in a finite population. We propose an alternative approach that acts to correct for this error, and which we denote the delay-deterministic model. Applying our model to a simple evolutionary system, we demonstrate its performance in quantifying the extent of selection acting within that system. We further consider the application of our model to sequence data from an evolutionary experiment. We outline scenarios in which our model may produce improved results for the inference of selection, noting that such situations can be easily identified via the use of a regular deterministic model. PMID:29500183

  1. Development of a category 2 approach system model

    NASA Technical Reports Server (NTRS)

    Johnson, W. A.; Mcruer, D. T.

    1972-01-01

    An analytical model is presented which provides, as its primary output, the probability of a successful Category II approach. Typical applications are included using several example systems (manual and automatic) which are subjected to random gusts and deterministic wind shear. The primary purpose of the approach system model is to establish a structure containing the system elements, command inputs, disturbances, and their interactions in an analytical framework so that the relative effects of changes in the various system elements on precision of control and available margins of safety can be estimated. The model is intended to provide insight for the design and integration of suitable autopilot, display, and navigation elements; and to assess the interaction of such elements with the pilot/copilot.
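
    The model's primary output, the probability of a successful Category II approach, can be mimicked by a Monte Carlo check that deviations at the decision window stay inside limits. The dispersions, the shear-induced mean offsets, and the window limits below are all assumed values, not those of the report.

      # An illustrative stand-in for the model's primary output: Monte Carlo
      # probability that deviations at the Category II decision window stay
      # inside limits. Dispersions (gusts), mean offsets (wind shear), and the
      # window limits are all invented values.
      import numpy as np

      rng = np.random.default_rng(5)
      N = 100_000
      lateral = rng.normal(1.5, 4.0, N)   # m; shear shifts the mean, gusts spread
      glide = rng.normal(-0.8, 1.8, N)    # m; offset from the glide path
      speed = rng.normal(0.0, 2.5, N)     # kt; airspeed error

      ok = (np.abs(lateral) < 10.0) & (np.abs(glide) < 3.6) & (np.abs(speed) < 5.0)
      print(f"P(successful approach) ~= {ok.mean():.3f}")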

  2. Automatic mesh adaptivity for hybrid Monte Carlo/deterministic neutronics modeling of difficult shielding problems

    DOE PAGES

    Ibrahim, Ahmad M.; Wilson, Paul P.H.; Sawan, Mohamed E.; ...

    2015-06-30

    The CADIS and FW-CADIS hybrid Monte Carlo/deterministic techniques dramatically increase the efficiency of neutronics modeling, but their use in the accurate design analysis of very large and geometrically complex nuclear systems has been limited by the large number of processors and memory requirements for their preliminary deterministic calculations and final Monte Carlo calculation. Three mesh adaptivity algorithms were developed to reduce the memory requirements of CADIS and FW-CADIS without sacrificing their efficiency improvement. First, a macromaterial approach enhances the fidelity of the deterministic models without changing the mesh. Second, a deterministic mesh refinement algorithm generates meshes that capture as much geometric detail as possible without exceeding a specified maximum number of mesh elements. Finally, a weight window coarsening algorithm decouples the weight window mesh and energy bins from the mesh and energy group structure of the deterministic calculations in order to remove the memory constraint of the weight window map from the deterministic mesh resolution. The three algorithms were used to enhance an FW-CADIS calculation of the prompt dose rate throughout the ITER experimental facility. Using these algorithms resulted in a 23.3% increase in the number of mesh tally elements in which the dose rates were calculated in a 10-day Monte Carlo calculation and, additionally, increased the efficiency of the Monte Carlo simulation by a factor of at least 3.4. The three algorithms enabled this difficult calculation to be accurately solved using an FW-CADIS simulation on a regular computer cluster, eliminating the need for a world-class super computer.

  3. Improving ground-penetrating radar data in sedimentary rocks using deterministic deconvolution

    USGS Publications Warehouse

    Xia, J.; Franseen, E.K.; Miller, R.D.; Weis, T.V.; Byrnes, A.P.

    2003-01-01

    Resolution is key to confidently identifying unique geologic features using ground-penetrating radar (GPR) data. Source wavelet "ringing" (related to bandwidth) in a GPR section limits resolution because of wavelet interference, and can smear reflections in time and/or space. The resultant potential for misinterpretation limits the usefulness of GPR. Deconvolution offers the ability to compress the source wavelet and improve temporal resolution. Unlike statistical deconvolution, deterministic deconvolution is mathematically simple and stable while providing the highest possible resolution because it uses the source wavelet unique to the specific radar equipment. Source wavelets generated in, transmitted through and acquired from air allow successful application of deterministic approaches to wavelet suppression. We demonstrate the validity of using a source wavelet acquired in air as the operator for deterministic deconvolution in a field application using "400-MHz" antennas at a quarry site characterized by interbedded carbonates with shale partings. We collected GPR data on a bench adjacent to cleanly exposed quarry faces in which we placed conductive rods to provide conclusive groundtruth for this approach to deconvolution. The best deconvolution results, which are confirmed by the conductive rods for the 400-MHz antenna tests, were observed for wavelets acquired when the transmitter and receiver were separated by 0.3 m. Applying deterministic deconvolution to GPR data collected in sedimentary strata at our study site resulted in an improvement in resolution (50%) and improved spatial location (0.10-0.15 m) of geologic features compared to the same data processed without deterministic deconvolution. The effectiveness of deterministic deconvolution for increased resolution and spatial accuracy of specific geologic features is further demonstrated by comparing results of deconvolved data with nondeconvolved data acquired along a 30-m transect immediately adjacent to a fresh quarry face. The results at this site support using deterministic deconvolution, which incorporates the GPR instrument's unique source wavelet, as a standard part of routine GPR data processing. © 2003 Elsevier B.V. All rights reserved.
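
    Deterministic deconvolution with a known source wavelet is commonly implemented as stabilized spectral division. The sketch below applies a water-level regularized division to a synthetic trace; the wavelet, reflectivity, and water-level parameter are stand-ins, not the study's data or exact processing flow.

      # A hedged sketch of deterministic deconvolution: with the source wavelet
      # known (e.g., measured in air), divide it out in the frequency domain,
      # stabilized by a "water level" so near-zero spectral values do not blow
      # up. Trace and wavelet are synthetic stand-ins for real GPR data.
      import numpy as np

      rng = np.random.default_rng(6)
      n = 512
      s = np.arange(64)
      wavelet = np.exp(-0.5 * ((s - 16) / 4.0) ** 2) * np.sin(2 * np.pi * 0.2 * s)

      reflectivity = np.zeros(n)
      reflectivity[[100, 140, 300]] = [1.0, -0.6, 0.8]  # sparse reflectors
      trace = np.convolve(reflectivity, wavelet, mode="full")[:n]
      trace += rng.normal(0, 0.01, n)                   # measurement noise

      W = np.fft.rfft(wavelet, n)
      T = np.fft.rfft(trace, n)
      water = 0.05 * np.max(np.abs(W))                  # water-level parameter
      estimate = np.fft.irfft(T * np.conj(W) / (np.abs(W) ** 2 + water ** 2), n)

      print("largest recovered spikes at samples:",
            np.argsort(np.abs(estimate))[-3:][::-1])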

  4. Expansion or extinction: deterministic and stochastic two-patch models with Allee effects.

    PubMed

    Kang, Yun; Lanchier, Nicolas

    2011-06-01

    We investigate the impact of Allee effect and dispersal on the long-term evolution of a population in a patchy environment. Our main focus is on whether a population already established in one patch either successfully invades an adjacent empty patch or undergoes a global extinction. Our study is based on the combination of analytical and numerical results for both a deterministic two-patch model and a stochastic counterpart. The deterministic model has either two, three or four attractors. The existence of a regime with exactly three attractors only appears when patches have distinct Allee thresholds. In the presence of weak dispersal, the analysis of the deterministic model shows that a high-density and a low-density populations can coexist at equilibrium in nearby patches, whereas the analysis of the stochastic model indicates that this equilibrium is metastable, thus leading after a large random time to either a global expansion or a global extinction. Up to some critical dispersal, increasing the intensity of the interactions leads to an increase of both the basin of attraction of the global extinction and the basin of attraction of the global expansion. Above this threshold, for both the deterministic and the stochastic models, the patches tend to synchronize as the intensity of the dispersal increases. This results in either a global expansion or a global extinction. For the deterministic model, there are only two attractors, while the stochastic model no longer exhibits a metastable behavior. In the presence of strong dispersal, the limiting behavior is entirely determined by the value of the Allee thresholds as the global population size in the deterministic and the stochastic models evolves as dictated by their single-patch counterparts. For all values of the dispersal parameter, Allee effects promote global extinction in terms of an expansion of the basin of attraction of the extinction equilibrium for the deterministic model and an increase of the probability of extinction for the stochastic model.
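
    A deterministic two-patch skeleton of this kind of model is shown below: logistic growth with a strong Allee effect in each patch plus symmetric dispersal. The cubic growth form and all parameter values, including the distinct Allee thresholds, are illustrative choices, not the paper's exact equations.

      # A minimal two-patch deterministic model in the spirit of the paper:
      # cubic growth with a strong Allee effect in each patch plus symmetric
      # dispersal. Parameter values (and the distinct Allee thresholds) are
      # illustrative only.
      import numpy as np
      from scipy.integrate import solve_ivp

      r, K = 1.0, 1.0
      a1, a2 = 0.2, 0.35        # distinct Allee thresholds for the two patches
      D = 0.05                  # dispersal intensity

      def two_patch(t, u):
          u1, u2 = u
          f1 = r * u1 * (u1 / a1 - 1.0) * (1.0 - u1 / K)
          f2 = r * u2 * (u2 / a2 - 1.0) * (1.0 - u2 / K)
          return [f1 + D * (u2 - u1), f2 + D * (u1 - u2)]

      # Patch 1 established at carrying capacity, patch 2 initially empty:
      # does the population expand into patch 2 or collapse?
      sol = solve_ivp(two_patch, (0.0, 200.0), [K, 0.0], rtol=1e-8)
      print(f"final state: patch 1 = {sol.y[0, -1]:.3f}, "
            f"patch 2 = {sol.y[1, -1]:.3f}")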

  5. Accurate measurement of RF exposure from emerging wireless communication systems

    NASA Astrophysics Data System (ADS)

    Letertre, Thierry; Monebhurrun, Vikass; Toffano, Zeno

    2013-04-01

    Isotropic broadband probes or spectrum analyzers (SAs) may be used for the measurement of rapidly varying electromagnetic fields generated by emerging wireless communication systems. In this paper, this problem is investigated by comparing the responses measured by two different isotropic broadband probes typically used to perform electric field (E-field) evaluations. The broadband probes are submitted to signals with variable duty cycles (DC) and crest factors (CF), either with or without Orthogonal Frequency Division Multiplexing (OFDM) modulation but with the same root-mean-square (RMS) power. The two probes do not provide sufficiently accurate results for deterministic signals such as Worldwide Interoperability for Microwave Access (WiMAX) or Long Term Evolution (LTE), or for non-deterministic signals such as Wireless Fidelity (WiFi). The legacy measurement protocols should be adapted to cope with the emerging wireless communication technologies based on the OFDM modulation scheme. This is not easily achieved except when the statistics of the RF emission are well known. In this case the measurement errors are shown to be systematic, and a correction factor or calibration can be applied to obtain a good approximation of the total RMS power.

  6. Deterministic quantum nonlinear optics with single atoms and virtual photons

    NASA Astrophysics Data System (ADS)

    Kockum, Anton Frisk; Miranowicz, Adam; Macrì, Vincenzo; Savasta, Salvatore; Nori, Franco

    2017-06-01

    We show how analogs of a large number of well-known nonlinear-optics phenomena can be realized with one or more two-level atoms coupled to one or more resonator modes. Through higher-order processes, where virtual photons are created and annihilated, an effective deterministic coupling between two states of such a system can be created. In this way, analogs of three-wave mixing, four-wave mixing, higher-harmonic and -subharmonic generation (i.e., up- and down-conversion), multiphoton absorption, parametric amplification, Raman and hyper-Raman scattering, the Kerr effect, and other nonlinear processes can be realized. In contrast to most conventional implementations of nonlinear optics, these analogs can reach unit efficiency, only use a minimal number of photons (they do not require any strong external drive), and do not require more than two atomic levels. The strength of the effective coupling in our proposed setups becomes weaker the more intermediate transition steps are needed. However, given the recent experimental progress in ultrastrong light-matter coupling and improvement of coherence times for engineered quantum systems, especially in the field of circuit quantum electrodynamics, we estimate that many of these nonlinear-optics analogs can be realized with currently available technology.

  7. A methodology for the stochastic generation of hourly synthetic direct normal irradiation time series

    NASA Astrophysics Data System (ADS)

    Larrañeta, M.; Moreno-Tejera, S.; Lillo-Bravo, I.; Silva-Pérez, M. A.

    2018-02-01

    Many of the available solar radiation databases only provide global horizontal irradiance (GHI), while there is a growing need for extensive databases of direct normal irradiance (DNI), mainly for the development of concentrated solar power and concentrated photovoltaic technologies. In the present work, we propose a methodology for the generation of synthetic hourly DNI data from hourly average GHI values by dividing the irradiance into a deterministic and a stochastic component, intending to emulate the dynamics of solar radiation. The deterministic component is modeled through a simple classical model. The stochastic component is fitted to measured data in order to maintain the consistency of the synthetic data with the state of the sky, generating statistically significant DNI data with a cumulative frequency distribution very similar to that of the measured data. The adaptation and application of the model to the location of Seville shows significant improvements in terms of frequency distribution over the classical models. The proposed methodology, applied to other locations with different climatological characteristics, yields better results than the classical models in terms of frequency distribution, reaching a reduction of 50% in the Finkelstein-Schafer (FS) and Kolmogorov-Smirnov test integral (KSI) statistics.
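
    The deterministic/stochastic split can be illustrated with a toy generator: a clear-sky envelope (deterministic) modulated by an autoregressive clearness process (stochastic). Both components below are invented stand-ins for the paper's classical model and fitted stochastic component.

      # A hedged sketch of the paper's split: hourly DNI = a deterministic
      # clear-sky envelope modulated by a stochastic clearness process. The
      # cosine-of-hour envelope and the AR(1) clearness index are toy stand-ins
      # for the paper's classical model and data-fitted stochastic component.
      import numpy as np

      rng = np.random.default_rng(7)
      hours = np.arange(24 * 30)                       # one synthetic month
      h = hours % 24

      # Deterministic component: crude day/night clear-sky envelope (toy).
      elev = np.clip(np.cos((h - 12.0) * np.pi / 12.0), 0.0, None)  # 0 at night
      dni_clear = 900.0 * elev                         # W/m^2, illustrative

      # Stochastic component: AR(1) clearness index with mean 0.7, kept in [0, 1].
      k = np.empty(len(hours))
      k[0] = 0.7
      for i in range(1, len(hours)):
          k[i] = np.clip(0.9 * k[i - 1] + 0.1 * 0.7 + rng.normal(0, 0.08), 0.0, 1.0)

      dni = dni_clear * k
      print(f"mean synthetic DNI over daylight hours: {dni[elev > 0].mean():.0f} W/m^2")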

  8. Nonparametric estimates of drift and diffusion profiles via Fokker-Planck algebra.

    PubMed

    Lund, Steven P; Hubbard, Joseph B; Halter, Michael

    2014-11-06

    Diffusion processes superimposed upon deterministic motion play a key role in understanding and controlling the transport of matter, energy, momentum, and even information in physics, chemistry, material science, biology, and communications technology. Given functions defining these random and deterministic components, the Fokker-Planck (FP) equation is often used to model these diffusive systems. Many methods exist for estimating the drift and diffusion profiles from one or more identifiable diffusive trajectories; however, when many identical entities diffuse simultaneously, it may not be possible to identify individual trajectories. Here we present a method capable of simultaneously providing nonparametric estimates for both drift and diffusion profiles from evolving density profiles, requiring only the validity of Langevin/FP dynamics. This algebraic FP manipulation provides a flexible and robust framework for estimating stationary drift and diffusion coefficient profiles, is not based on fluctuation theory or solved diffusion equations, and may facilitate predictions for many experimental systems. We illustrate this approach on experimental data obtained from a model lipid bilayer system exhibiting free diffusion and electric field induced drift. The wide range over which this approach provides accurate estimates for drift and diffusion profiles is demonstrated through simulation.
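
    For contrast with the density-based method of the paper, the sketch below shows the trajectory-based route it generalizes: when individual paths are identifiable, drift and diffusion profiles follow from binned Kramers-Moyal estimates, D1(x) ~ mean(dx|x)/dt and D2(x) ~ var(dx|x)/(2 dt), here tested on a synthetic Ornstein-Uhlenbeck process.

      # For contrast with the paper's density-based method: when individual
      # trajectories ARE identifiable, drift and diffusion profiles follow from
      # binned Kramers-Moyal estimates. Tested on a synthetic Ornstein-Uhlenbeck
      # process with known drift -theta*x and diffusion D (values assumed).
      import numpy as np

      rng = np.random.default_rng(8)
      dt, n = 1e-3, 500_000
      theta, D = 1.0, 0.5

      x = np.empty(n)
      x[0] = 0.0
      for i in range(1, n):                # Euler-Maruyama simulation
          x[i] = x[i-1] - theta * x[i-1] * dt + np.sqrt(2 * D * dt) * rng.normal()

      dx = np.diff(x)
      bins = np.linspace(-1.5, 1.5, 31)
      idx = np.digitize(x[:-1], bins)
      for b in (5, 15, 25):                # a few sample bins
          sel = idx == b
          xm = 0.5 * (bins[b-1] + bins[b])
          print(f"x ~ {xm:+.2f}: drift {dx[sel].mean()/dt:+.2f} "
                f"(true {-theta*xm:+.2f}), "
                f"diffusion {dx[sel].var()/(2*dt):.2f} (true {D:.2f})")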

  9. Research on Occupational Safety, Health Management and Risk Control Technology in Coal Mines.

    PubMed

    Zhou, Lu-Jie; Cao, Qing-Gui; Yu, Kai; Wang, Lin-Lin; Wang, Hai-Bin

    2018-04-26

    This paper studies the occupational safety and health management methods as well as risk control technology associated with the coal mining industry, including daily management of occupational safety and health, identification and assessment of risks, early warning and dynamic monitoring of risks, etc.; also, a B/S-mode software package (Geting Coal Mine, Jining, Shandong, China), i.e., the Coal Mine Occupational Safety and Health Management and Risk Control System, is developed to attain the aforementioned objectives, namely promoting coal mine occupational safety and health management based on early warning and dynamic monitoring of risks. Furthermore, the practical effectiveness and the associated pattern for applying this software package to coal mining are analyzed. The study indicates that the presently developed coal mine occupational safety and health management and risk control technology and the associated software can support the occupational safety and health management efforts in coal mines in a standardized and effective manner. It can also control accident risks scientifically and effectively; its effective implementation can further improve the coal mine occupational safety and health management mechanism and further enhance the risk management approaches. Besides, its implementation indicates that the occupational safety and health management and risk control technology has been established based on a benign cycle involving dynamic feedback and scientific development, which can provide reliable assurance for the safe operation of coal mines.

  10. Research on Occupational Safety, Health Management and Risk Control Technology in Coal Mines

    PubMed Central

    Zhou, Lu-jie; Cao, Qing-gui; Yu, Kai; Wang, Lin-lin; Wang, Hai-bin

    2018-01-01

    This paper studies the occupational safety and health management methods as well as risk control technology associated with the coal mining industry, including daily management of occupational safety and health, identification and assessment of risks, early warning and dynamic monitoring of risks, etc.; also, a B/S-mode software package (Geting Coal Mine, Jining, Shandong, China), i.e., the Coal Mine Occupational Safety and Health Management and Risk Control System, is developed to attain the aforementioned objectives, namely promoting coal mine occupational safety and health management based on early warning and dynamic monitoring of risks. Furthermore, the practical effectiveness and the associated pattern for applying this software package to coal mining are analyzed. The study indicates that the presently developed coal mine occupational safety and health management and risk control technology and the associated software can support the occupational safety and health management efforts in coal mines in a standardized and effective manner. It can also control accident risks scientifically and effectively; its effective implementation can further improve the coal mine occupational safety and health management mechanism and further enhance the risk management approaches. Besides, its implementation indicates that the occupational safety and health management and risk control technology has been established based on a benign cycle involving dynamic feedback and scientific development, which can provide reliable assurance for the safe operation of coal mines. PMID:29701715

  11. 48 CFR 952.223 - Clauses related to environment, energy and water efficiency, renewable energy technologies...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... environment, energy and water efficiency, renewable energy technologies, occupational safety, and drug-free workplace. 952.223 Section 952.223 Federal Acquisition Regulations System DEPARTMENT OF ENERGY CLAUSES AND... related to environment, energy and water efficiency, renewable energy technologies, occupational safety...

  12. 48 CFR 952.223 - Clauses related to environment, energy and water efficiency, renewable energy technologies...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... environment, energy and water efficiency, renewable energy technologies, occupational safety, and drug-free workplace. 952.223 Section 952.223 Federal Acquisition Regulations System DEPARTMENT OF ENERGY CLAUSES AND... related to environment, energy and water efficiency, renewable energy technologies, occupational safety...

  13. 48 CFR 952.223 - Clauses related to environment, energy and water efficiency, renewable energy technologies...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... environment, energy and water efficiency, renewable energy technologies, occupational safety, and drug-free workplace. 952.223 Section 952.223 Federal Acquisition Regulations System DEPARTMENT OF ENERGY CLAUSES AND... related to environment, energy and water efficiency, renewable energy technologies, occupational safety...

  14. 48 CFR 952.223 - Clauses related to environment, energy and water efficiency, renewable energy technologies...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... environment, energy and water efficiency, renewable energy technologies, occupational safety, and drug-free workplace. 952.223 Section 952.223 Federal Acquisition Regulations System DEPARTMENT OF ENERGY CLAUSES AND... related to environment, energy and water efficiency, renewable energy technologies, occupational safety...

  15. 48 CFR 952.223 - Clauses related to environment, energy and water efficiency, renewable energy technologies...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... environment, energy and water efficiency, renewable energy technologies, occupational safety, and drug-free workplace. 952.223 Section 952.223 Federal Acquisition Regulations System DEPARTMENT OF ENERGY CLAUSES AND... related to environment, energy and water efficiency, renewable energy technologies, occupational safety...

  16. 75 FR 1338 - Online Safety and Technology Working Group Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-11

    ... promote a safe online environment for children. The Act requires the OSTWG to report its findings and... and Technology Working Group Meeting AGENCY: National Telecommunications and Information... public meeting of the Online Safety and Technology Working Group (OSTWG). DATES: The meeting will be held...

  17. Tools and Equipment in Nontraditional Spaces: Safety and Liability Issues. Safety Spotlight

    ERIC Educational Resources Information Center

    Love, Tyler S.; Roy, Ken R.

    2017-01-01

    "Safety Spotlight" encourages the submission of questions from Technology and Engineering (T&E) Educators, and this month's question involves the risks of placing hazardous equipment (e.g., 3D printer, laser cutter, CNC router, etc.) in a non-technology & engineering lab under the supervision of teachers not certified to teach…

  18. Patient safety goals for the proposed Federal Health Information Technology Safety Center.

    PubMed

    Sittig, Dean F; Classen, David C; Singh, Hardeep

    2015-03-01

    The Office of the National Coordinator for Health Information Technology is expected to oversee creation of a Health Information Technology (HIT) Safety Center. While its functions are still being defined, the center is envisioned as a public-private entity focusing on promotion of HIT related patient safety. We propose that the HIT Safety Center leverages its unique position to work with key administrative and policy stakeholders, healthcare organizations (HCOs), and HIT vendors to achieve four goals: (1) facilitate creation of a nationwide 'post-marketing' surveillance system to monitor HIT related safety events; (2) develop methods and governance structures to support investigation of major HIT related safety events; (3) create the infrastructure and methods needed to carry out random assessments of HIT related safety in complex HCOs; and (4) advocate for HIT safety with government and private entities. The convening ability of a federally supported HIT Safety Center could be critically important to our transformation to a safe and effective HIT enabled healthcare system. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. Estimating the epidemic threshold on networks by deterministic connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Kezan, E-mail: lkzzr@sohu.com; Zhu, Guanghu; Fu, Xinchu

    2014-12-15

    For many epidemic networks some connections between nodes are treated as deterministic, while the remainder are random and have different connection probabilities. By applying spectral analysis to several constructed models, we find that one can estimate the epidemic thresholds of these networks by investigating information from only the deterministic connections. Nonetheless, in these models, generic nonuniform stochastic connections and heterogeneous community structure are also considered. The estimation of epidemic thresholds is achieved via inequalities with upper and lower bounds, which are found to be in very good agreement with numerical simulations. Since these deterministic connections are easier to detect than the stochastic connections, this work provides a feasible and effective method to estimate the epidemic thresholds in real epidemic networks.
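
    The spectral idea has a simple numerical core: for an SIS epidemic with recovery rate gamma, the threshold is beta_c = gamma / lambda_max of the (expected) adjacency matrix, and since adding nonnegative stochastic links can only raise lambda_max, the deterministic subgraph alone yields an upper bound on the threshold. The random graph below is purely illustrative.

      # A small numerical companion to the spectral idea: SIS threshold
      # beta_c = gamma / lambda_max(A). Keeping only the deterministic links
      # (a subgraph with nonnegative entries removed from A) lowers lambda_max,
      # hence bounds the threshold from above. Graph and probabilities invented.
      import numpy as np

      rng = np.random.default_rng(9)
      n = 60
      A_det = (rng.random((n, n)) < 0.05).astype(float)  # deterministic links
      A_det = np.triu(A_det, 1)
      A_det += A_det.T                                   # symmetric, no self-loops

      P_sto = 0.02 * (rng.random((n, n)) < 0.5)          # stochastic link probs
      P_sto = np.triu(P_sto, 1)
      P_sto += P_sto.T
      A_mean = A_det + P_sto                             # expected adjacency matrix

      gamma = 1.0
      lam_det = np.linalg.eigvalsh(A_det)[-1]            # largest eigenvalue
      lam_all = np.linalg.eigvalsh(A_mean)[-1]
      print(f"upper bound from deterministic part: beta_c <= {gamma/lam_det:.3f}")
      print(f"threshold from full expected network: beta_c  = {gamma/lam_all:.3f}")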

  20. National Institute of Occupational Safety and Health (NIOSH) Partnered Development of Cryogenic Life Support Technologies

    NASA Technical Reports Server (NTRS)

    Bush, David R.

    2014-01-01

    Partnering with the National Institute of Occupational Safety and Health (NIOSH) to develop several cryogenically based life support technologies to be used in mine escape and rescue scenarios. Technologies developed for mine rescue directly benefit future NASA rescue and ground operation missions.

  1. System of systems design: Evaluating aircraft in a fleet context using reliability and non-deterministic approaches

    NASA Astrophysics Data System (ADS)

    Frommer, Joshua B.

    This work develops and implements a solution framework that allows for an integrated solution to a resource allocation system-of-systems problem associated with designing vehicles for integration into an existing fleet to extend that fleet's capability while improving efficiency. Typically, aircraft design focuses on using a specific design mission while a fleet perspective would provide a broader capability. Aspects of design for both the vehicles and missions may be, for simplicity, deterministic in nature or, in a model that reflects actual conditions, uncertain. Toward this end, the set of tasks or goals for the to-be-planned system-of-systems will be modeled more accurately with non-deterministic values, and the designed platforms will be evaluated using reliability analysis. The reliability, defined as the probability of a platform or set of platforms to complete possible missions, will contribute to the fitness of the overall system. The framework includes building surrogate models for metrics such as capability and cost, and includes the ideas of reliability in the overall system-level design space. The concurrent design and allocation system-of-systems problem is a multi-objective mixed integer nonlinear programming (MINLP) problem. This study considered two system-of-systems problems that seek to simultaneously design new aircraft and allocate these aircraft into a fleet to provide a desired capability. The Coast Guard's Integrated Deepwater System program inspired the first problem, which consists of a suite of search-and-find missions for aircraft based on descriptions from the National Search and Rescue Manual. The second represents suppression of enemy air defense operations similar to those carried out by the U.S. Air Force, proposed as part of the Department of Defense Network Centric Warfare structure, and depicted in MILSTD-3013. The two problems seem similar, with long surveillance segments, but because of the complex nature of aircraft design, the analysis of the vehicle for high-speed attack combined with a long loiter period is considerably different from that for quick cruise to an area combined with a low speed search. However, the framework developed to solve this class of system-of-systems problem handles both scenarios and leads to a solution type for this kind of problem. On the vehicle-level of the problem, different technology can have an impact on the fleet-level. One such technology is Morphing, the ability to change shape, which is an ideal candidate technology for missions with dissimilar segments, such as the aforementioned two. A framework, using surrogate models based on optimally-sized aircraft, and using probabilistic parameters to define a concept of operations, is investigated; this has provided insight into the setup of the optimization problem, the use of the reliability metric, and the measurement of fleet level impacts of morphing aircraft. The research consisted of four phases. The two initial phases built and defined the framework to solve system-of-systems problem; these investigations used the search-and-find scenario as the example application. The first phase included the design of fixed-geometry and morphing aircraft for a range of missions and evaluated the aircraft capability using non-deterministic mission parameters. The second phase introduced the idea of multiple aircraft in a fleet, but only considered a fleet consisting of one aircraft type. 
    The third phase incorporated the simultaneous design of a new vehicle and allocation into a fleet for the search-and-find scenario; in this phase, multiple types of aircraft are considered. The fourth phase repeated the simultaneous new aircraft design and fleet allocation for the SEAD scenario to show that the approach is not specific to the search-and-find scenario. The framework presented in this work appears to be a viable approach for concurrently designing and allocating constituents in a system, specifically aircraft in a fleet. The research also shows that new technology impact can be assessed at the fleet level using conceptual design principles.

  2. Does the concept of safety culture help or hinder systems thinking in safety?

    PubMed

    Reiman, Teemu; Rollenhagen, Carl

    2014-07-01

    The concept of safety culture has become established in safety management applications in all major safety-critical domains. The idea that safety culture somehow represents a "systemic view" on safety is seldom explicitly spoken out, but nevertheless seems to linger behind many safety culture discourses. However, in this paper we argue that the "new" contribution to safety management from safety culture never really became integrated with classical engineering principles and concepts. This integration would have been necessary for the development of a more genuine systems-oriented view on safety; e.g. a conception of safety in which human, technological, organisational and cultural factors are understood as mutually interacting elements. Without this integration, researchers and the users of the various tools and methods associated with safety culture have sometimes fostered a belief that "safety culture" in fact represents such a systemic view about safety. This belief is, however, not backed up by theoretical or empirical evidence. It is true that safety culture, at least in some sense, represents a holistic term, a totality of factors that include human, organisational and technological aspects. However, the point of departure for such safety culture models is still human and organisational factors rather than technology (or safety) itself. The aim of this paper is to critically review the various uses of the concept of safety culture as representing a systemic view on safety. The article will take a look at the concepts of culture and safety culture based on previous studies, and outlines in more detail the theoretical challenges in safety culture as a systems concept. The paper also presents recommendations on how to make safety culture more systemic. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Application Agreement and Integration Services

    NASA Technical Reports Server (NTRS)

    Driscoll, Kevin R.; Hall, Brendan; Schweiker, Kevin

    2013-01-01

    Application agreement and integration services are required by distributed, fault-tolerant, safety-critical systems to assure required performance. An analysis of distributed and hierarchical agreement strategies is developed against the backdrop of observed agreement failures in fielded systems. The documented work was performed under NASA Task Order NNL10AB32T, Validation And Verification of Safety-Critical Integrated Distributed Systems Area 2. This document is intended to satisfy the requirements for deliverable 5.2.11 under Task 4.2.2.3. This report discusses the challenges of maintaining application agreement and integration services. A literature search is presented that documents previous work in the area of replica determinism. Sources of non-deterministic behavior are identified, and examples are presented where system-level agreement failed to be achieved. We then explore how TTEthernet services can be extended to supply some interesting application agreement frameworks. This document assumes that the reader is familiar with the TTEthernet protocol; the reader is advised to read the TTEthernet protocol standard [1] before reading this document, as its content is not reiterated here.
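
    To make the replica-determinism issue concrete, the sketch below shows the simplest exact-match agreement mechanism, a majority vote over replica outputs. It is an illustrative fragment, not part of the report or of TTEthernet; exact-match voting only works when replicas behave deterministically, which is precisely why sources of non-determinism cause agreement failures.

        from collections import Counter

        def majority_vote(replica_values):
            # Exact-match majority vote over replica outputs. Returns the
            # agreed value, or None when no strict majority exists; that is
            # the condition an agreement service must detect and handle.
            value, count = Counter(replica_values).most_common(1)[0]
            return value if count > len(replica_values) // 2 else None

        print(majority_vote([42, 42, 41]))  # 42: one faulty replica outvoted
        print(majority_vote([42, 41, 40]))  # None: agreement failure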

  4. An expert system for wind shear avoidance

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.; Stratton, D. Alexander

    1990-01-01

    A study of intelligent guidance and control concepts for protecting against the adverse effects of wind shear during aircraft takeoffs and landings is being conducted, with current emphasis on developing an expert system for wind shear avoidance. The principal objectives are to develop methods for assessing the likelihood of wind shear encounter (based on real-time information in the cockpit), for deciding what flight path to pursue (e.g., takeoff abort, landing go-around, or normal climbout or glide slope), and for using the aircraft's full potential for combating wind shear. This study requires the definition of both deterministic and statistical techniques for fusing internal and external information, for making go/no-go decisions, and for generating commands for manually controlled flight. The program has begun with the development of the WindShear Safety Advisor, an expert system for pilot aiding based on the FAA Windshear Training Aid; that two-volume manual, which presents an overview, pilot guide, training program, and substantiating data, provides guidelines for this initial development. The WindShear Safety Advisor expert system currently contains over 200 rules and is coded in the LISP programming language.
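
    The fragment below gives a flavor of what a small rule base for such an advisor might look like. It is a toy sketch in Python rather than LISP; the thresholds, fact names, and advisories are invented for illustration and are not taken from the WindShear Safety Advisor.

        def windshear_advisory(facts):
            # Toy forward-chained rules for pilot aiding; every threshold
            # and fact key here is hypothetical.
            advisories = []
            if facts.get("microburst_alert"):
                advisories.append("AVOID: microburst reported on flight path")
            if facts.get("phase") == "approach" and facts.get("airspeed_loss_kt", 0) >= 15:
                advisories.append("GO AROUND: performance-decreasing shear on approach")
            if facts.get("phase") == "takeoff" and facts.get("llwas_alert"):
                advisories.append("DELAY TAKEOFF: low-level wind shear alert active")
            return advisories or ["CONTINUE: no wind shear evidence"]

        print(windshear_advisory({"phase": "approach", "airspeed_loss_kt": 20,
                                  "microburst_alert": False, "llwas_alert": False}))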

  5. Using game technologies to improve the safety of construction plant operations.

    PubMed

    Guo, Hongling; Li, Heng; Chan, Greg; Skitmore, Martin

    2012-09-01

    Many accidents occur world-wide in the use of construction plant and equipment, and safety training is considered by many to be one of the best approaches to their prevention. However, current safety training methods/tools are unable to provide trainees with the hands-on practice needed. Game technology-based safety training platforms have the potential to overcome this problem in a virtual environment. One such platform is described in this paper: its characteristics are analysed and its possible contribution to safety training identified. This is developed and tested by means of a case study involving three major pieces of construction plant, which successfully demonstrates that the platform can improve the process and performance of the safety training involved in their operation. This research not only presents a new and useful solution to the safety training of construction operations, but illustrates the potential use of advanced technologies in solving construction industry problems in general. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Proceedings of the Nuclear Criticality Technology Safety Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rene G. Sanchez

    1998-04-01

    This document contains summaries of most of the papers presented at the 1995 Nuclear Criticality Technology Safety Project (NCTSP) meeting, which was held May 16 and 17 in San Diego, CA. The meeting was broken up into seven sessions, which covered the following topics: (1) Criticality Safety of Project Sapphire; (2) Relevant Experiments For Criticality Safety; (3) Interactions with the Former Soviet Union; (4) Misapplications and Limitations of Monte Carlo Methods Directed Toward Criticality Safety Analyses; (5) Monte Carlo Vulnerabilities of Execution and Interpretation; (6) Monte Carlo Vulnerabilities of Representation; and (7) Benchmark Comparisons.

  7. Characterization of normality of chaotic systems including prediction and detection of anomalies

    NASA Astrophysics Data System (ADS)

    Engler, Joseph John

    Accurate prediction and control pervade domains such as engineering, physics, chemistry, and biology. Often, it is discovered that the systems under consideration cannot be well represented by linear, periodic, or random models. It has been shown that such systems exhibit deterministic chaos. Deterministic chaos describes systems that are governed by deterministic rules but whose data appear random or quasi-periodic. Deterministically chaotic systems characteristically exhibit sensitive dependence on initial conditions, manifested through the rapid divergence of states that are initially close to one another. Because of this property, it has been deemed impossible to accurately predict future states of these systems over longer time scales. Fortunately, the deterministic nature of these systems allows accurate short-term predictions, provided the dynamics of the system are well understood. This fact has been exploited in the research community and has resulted in various algorithms for short-term prediction. Detecting normality in deterministically chaotic systems is critical to understanding a system well enough to predict its future states. Due to the sensitivity to initial conditions, the detection of normal operational states for a deterministically chaotic system can be challenging. The addition of small perturbations to the system, which may result in bifurcation of the normal states, further complicates the problem. The detection of anomalies and prediction of future states of a chaotic system allow for greater understanding of these systems. The goal of this research is to produce methodologies for determining states of normality for deterministically chaotic systems, detecting anomalous behavior, and more accurately predicting future states of the system. Additionally, the ability to detect subtle system state changes is discussed. The dissertation addresses these goals by proposing new representational techniques and novel prediction methodologies. The value and efficiency of these methods are explored in various case studies. An overview of chaotic systems is presented, with examples taken from the real world. A representation schema for rapid understanding of the various states of deterministically chaotic systems is presented. This schema is then used to detect anomalies and system state changes. Additionally, a novel prediction methodology that utilizes Lyapunov exponents to facilitate longer-term prediction accuracy is presented and compared with other nonlinear prediction methodologies. These methodologies are then demonstrated on applications such as wind energy, cyber security, and the classification of social networks.
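
    The hallmark quantity behind the sensitivity described above is the largest Lyapunov exponent. As an illustrative sketch (not the dissertation's methodology), the snippet below estimates it for the logistic map x -> r*x*(1-x) by averaging log|f'(x)| along an orbit; a positive value signals deterministic chaos.

        import math

        def logistic_lyapunov(r, x0=0.2, n_transient=1000, n_iter=100_000):
            # Largest Lyapunov exponent of the logistic map, estimated by
            # averaging log|f'(x)| = log|r*(1 - 2x)| along a long orbit.
            x = x0
            for _ in range(n_transient):        # discard the transient
                x = r * x * (1.0 - x)
            total = 0.0
            for _ in range(n_iter):
                x = r * x * (1.0 - x)
                total += math.log(abs(r * (1.0 - 2.0 * x)))
            return total / n_iter

        print(round(logistic_lyapunov(3.5), 3))  # negative: periodic regime
        print(round(logistic_lyapunov(4.0), 3))  # ~ln 2 > 0: chaotic regime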

  8. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  9. "Seeing is believing": perspectives of applying imaging technology in discovery toxicology.

    PubMed

    Xu, Jinghai James; Dunn, Margaret Condon; Smith, Arthur Russell

    2009-11-01

    Efficiency and accuracy in addressing drug safety issues proactively are critical in minimizing late-stage drug attritions. Discovery toxicology has become a specialty subdivision of toxicology seeking to effectively provide early predictions and safety assessment in the drug discovery process. Among the many technologies utilized to select safer compounds for further development, in vitro imaging technology is one of the best characterized and validated to provide translatable biomarkers towards clinically-relevant outcomes of drug safety. By carefully applying imaging technologies in genetic, hepatic, and cardiac toxicology, and integrating them with the rest of the drug discovery processes, it was possible to demonstrate significant impact of imaging technology on drug research and development and substantial returns on investment.

  10. Inferring Fitness Effects from Time-Resolved Sequence Data with a Delay-Deterministic Model.

    PubMed

    Nené, Nuno R; Dunham, Alistair S; Illingworth, Christopher J R

    2018-05-01

    A common challenge arising from the observation of an evolutionary system over time is to infer the magnitude of selection acting upon a specific genetic variant, or variants, within the population. The inference of selection may be confounded by the effects of genetic drift in a system, leading to the development of inference procedures to account for these effects. However, recent work has suggested that deterministic models of evolution may be effective in capturing the effects of selection even under complex models of demography, suggesting the more general application of deterministic approaches to inference. Responding to this literature, we here note a case in which a deterministic model of evolution may give highly misleading inferences, resulting from the nondeterministic properties of mutation in a finite population. We propose an alternative approach that acts to correct for this error, and which we denote the delay-deterministic model. Applying our model to a simple evolutionary system, we demonstrate its performance in quantifying the extent of selection acting within that system. We further consider the application of our model to sequence data from an evolutionary experiment. We outline scenarios in which our model may produce improved results for the inference of selection, noting that such situations can be easily identified via the use of a regular deterministic model. Copyright © 2018 Nené et al.
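
    For orientation, the sketch below implements the textbook deterministic selection model that work of this kind builds on: in a haploid population, a variant with selection coefficient s changes frequency as p' = p(1 + s)/(1 + p*s). This is a generic illustration of a deterministic inference baseline, not the authors' delay-deterministic model.

        def deterministic_trajectory(p0, s, generations):
            # Classic deterministic update for the frequency p of a variant
            # with selection coefficient s (haploid model).
            traj = [p0]
            p = p0
            for _ in range(generations):
                p = p * (1.0 + s) / (1.0 + p * s)
                traj.append(p)
            return traj

        # A beneficial variant (s = 0.05) starting at 1% frequency:
        print(round(deterministic_trajectory(0.01, 0.05, 200)[-1], 3))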

  11. Sensors and Rotordynamics Health Management Research for Aircraft Turbine Engines

    NASA Technical Reports Server (NTRS)

    Lekki, J.; Abdul-Aziz, A.; Adamovsky, G.; Berger, D.; Fralick, G.; Gyekenyesi, A.; Hunter, G.; Tokars, R.; Venti, M.; Woike, M.; hide

    2011-01-01

    The objective is to develop advanced sensor technology and rotordynamic structural diagnostics to address existing Aviation Safety Propulsion Health Management needs, and to proactively begin addressing anticipated safety issues for new technologies.

  12. High-Speed Maglev Trains; German Safety Requirements

    DOT National Transportation Integrated Search

    1991-12-31

    This document is a translation of technology-specific safety requirements developed : for the German Transrapid Maglev technology. These requirements were developed by a : working group composed of representatives of German Federal Railways (DB), Tes...

  13. Scalable implementation of boson sampling with trapped ions.

    PubMed

    Shen, C; Zhang, Z; Duan, L-M

    2014-02-07

    Boson sampling solves a classically intractable problem by sampling from a probability distribution given by matrix permanents. We propose a scalable implementation of boson sampling that uses local transverse phonon modes of trapped ions to encode the bosons. The proposed scheme allows deterministic preparation and high-efficiency readout of the bosons in Fock states, together with universal mode mixing. With state-of-the-art trapped-ion technology, it is feasible to realize boson sampling with tens of bosons by this scheme, which would outperform the most powerful classical computers and constitute an effective disproof of the famous extended Church-Turing thesis.
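
    The classical hardness referenced above comes from the permanent itself: each boson sampling outcome probability is proportional to |perm(A)|^2 for a submatrix A of the mode-mixing unitary, and the best known exact algorithms, such as Ryser's formula sketched below, take time exponential in the matrix size.

        from itertools import combinations

        def permanent(a):
            # Matrix permanent via Ryser's formula, O(2^n * n^2):
            # perm(A) = (-1)^n * sum over nonempty column subsets S of
            #           (-1)^|S| * prod_i (sum_{j in S} a[i][j])
            n = len(a)
            total = 0.0
            for k in range(1, n + 1):
                for cols in combinations(range(n), k):
                    prod = 1.0
                    for row in a:
                        prod *= sum(row[j] for j in cols)
                    total += (-1) ** k * prod
            return (-1) ** n * total

        print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10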

  14. Scale-invariance underlying the logistic equation and its social applications

    NASA Astrophysics Data System (ADS)

    Hernando, A.; Plastino, A.

    2013-01-01

    On the basis of dynamical principles we i) advance a derivation of the Logistic Equation (LE), widely employed (among multiple applications) in the simulation of population growth, and ii) demonstrate that scale-invariance and a mean-value constraint are sufficient and necessary conditions for obtaining it. We also generalize the LE to multi-component systems and show that the above dynamical mechanisms underlie a large number of scale-free processes. Examples are presented regarding city populations, diffusion in complex networks, and the popularity of technological products, all of them obeying the multi-component logistic equation in either a stochastic or a deterministic way.
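
    For reference, the snippet below integrates the single-component logistic equation dN/dt = r*N*(1 - N/K) with an explicit Euler step, showing the saturation behavior that the paper derives from scale invariance; the step size and parameters are arbitrary illustrative choices.

        def logistic_step(n, r, k, dt):
            # One explicit-Euler step of dN/dt = r * N * (1 - N/K).
            return n + dt * r * n * (1.0 - n / k)

        n, r, k, dt = 10.0, 0.5, 1000.0, 0.1
        for _ in range(2000):          # integrate to t = 200
            n = logistic_step(n, r, k, dt)
        print(round(n, 1))             # approaches the carrying capacity K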

  15. Contribution to solving the energy crisis - Simulating the prospects for low cost energy through silicon solar cells

    NASA Technical Reports Server (NTRS)

    Kran, A.

    1978-01-01

    PECAN (Photovoltaic Energy Conversion Analysis) is a highly interactive decision analysis and support system. It simulates the prospects for widespread use of solar cells for the generation of electrical power. PECAN consists of a set of integrated APL functions for evaluating the potential of terrestrial photovoltaics. Specifically, the system is a deterministic simulator, which translates present and future manufacturing technology into economic and financial terms, using the production unit concept. It guides solar cell development in three areas: tactical decision making, strategic planning, and the formulation of alternative options.

  16. Controllability of Deterministic Networks with the Identical Degree Sequence

    PubMed Central

    Ma, Xiujuan; Zhao, Haixing; Wang, Binghong

    2015-01-01

    Controlling complex networks is an essential problem in network science and engineering. Recent advances indicate that the controllability of a complex network depends on its topology. Liu, Barabási, and colleagues speculated that the degree distribution was one of the most important factors affecting controllability for arbitrary complex directed networks with random link weights. In this paper, we analyse the effect of the degree distribution on the controllability of unweighted, undirected deterministic networks. We introduce a class of deterministic networks with identical degree sequence, called (x,y)-flowers. We analyse the controllability of two of these deterministic networks, the (1,3)-flower and the (2,2)-flower, in detail using exact controllability theory, and give exact results for the minimum number of driver nodes for the two networks. In simulations, we compare the controllability of (x,y)-flower networks. Our results show that networks in the (x,y)-flower family have the same degree sequence, but their controllability is totally different. The degree distribution by itself is therefore not sufficient to characterize the controllability of unweighted, undirected deterministic networks. PMID:26020920
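
    As a sketch of the exact controllability computation referenced above, the snippet below evaluates N_D = max over eigenvalues lambda of (N - rank(lambda*I - A)) for an undirected, unweighted network; the star-graph example and numerical tolerances are illustrative choices, not the paper's flower networks.

        import numpy as np

        def min_driver_nodes(adj, tol=1e-8):
            # Minimum number of driver nodes N_D by exact controllability
            # theory for a symmetric adjacency matrix A:
            # N_D = max_lambda (N - rank(lambda*I - A)), at least 1.
            a = np.asarray(adj, dtype=float)
            n = a.shape[0]
            eigvals = np.linalg.eigvalsh(a)      # symmetric: real spectrum
            n_d = 1
            for lam in np.unique(np.round(eigvals, 6)):
                rank = np.linalg.matrix_rank(lam * np.eye(n) - a, tol=tol)
                n_d = max(n_d, n - rank)
            return n_d

        # Star graph on 4 nodes: eigenvalue 0 has multiplicity 2, so N_D = 2.
        star = [[0, 1, 1, 1],
                [1, 0, 0, 0],
                [1, 0, 0, 0],
                [1, 0, 0, 0]]
        print(min_driver_nodes(star))  # 2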

  17. Inverse kinematic problem for a random gradient medium in geometric optics approximation

    NASA Astrophysics Data System (ADS)

    Petersen, N. V.

    1990-03-01

    Scattering at random inhomogeneities in a gradient medium results in systematic deviations of the rays and travel times of refracted body waves from those corresponding to the deterministic velocity component. The character of the difference depends on the parameters of the deterministic and random velocity components. However, at great distances from the source, independently of the velocity parameters (weakly or strongly inhomogeneous medium), the most probable depth of the ray turning point is smaller than that corresponding to the deterministic velocity component, and the most probable travel times are also lower. The relative uncertainty in the deterministic velocity component, derived from the mean travel times using methods developed for laterally homogeneous media (for instance, the Herglotz-Wiechert method), is systematic in character but does not exceed the contrast of the velocity inhomogeneities in magnitude. The gradient of the deterministic velocity component has a significant effect on the travel-time fluctuations. The variance at great distances from the source is mainly controlled by shallow inhomogeneities. The travel-time fluctuations are studied only for weakly inhomogeneous media.

  18. [Survey and analysis of radiation safety education at radiological technology schools].

    PubMed

    Ohba, Hisateru; Ogasawara, Katsuhiko; Aburano, Tamio

    2004-10-01

    We carried out a questionnaire survey of all radiological technology schools, to investigate the status of radiation safety education. The questionnaire consisted of questions concerning full-time teachers, measures being taken for the Radiation Protection Supervisor Qualifying Examination, equipment available for radiation safety education, radiation safety education for other departments, curriculum of radiation safety education, and related problems. The returned questionnaires were analyzed according to different groups categorized by form of education and type of establishment. The overall response rate was 55%, and there were statistically significant differences in the response rates among the different forms of education. No statistically significant differences were found in the items relating to full-time teachers, measures for Radiation Protection Supervisor Qualifying Examination, and radiation safety education for other departments, either for the form of education or type of establishment. Queries on the equipment used for radiation safety education revealed a statistically significant difference in unsealed radioisotope institutes among the forms of education. In terms of curriculum, the percentage of radiological technology schools which dealt with neither the shielding calculation method for radiation facilities nor with the control of medical waste was found to be approximately 10%. Other educational problems that were indicated included shortages of full-time teachers and equipment for radiation safety education. In the future, in order to improve radiation safety education at radiological technology schools, we consider it necessary to develop unsealed radioisotope institutes, to appoint more full-time teachers, and to educate students about risk communication.

  19. 76 FR 62894 - Agency Information Collection Activities: Notice of Request for Renewal of a Previously Approved...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-11

    ... innovative technologies that will improve safety, reduce congestion due to construction, and improve quality... project, the innovative technologies to be used and a description of how these technologies will improve safety, reduce construction congestion, and improve quality. The collected information will be used by...

  20. 49 CFR 234.275 - Processor-based systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... new or novel technology, or which provide safety-critical data to a railroad signal or train control... requirements. New or novel technology refers to a technology not previously recognized for use as of March 7... but which provides safety-critical data to a signal or train control system shall be included in the...

  1. 49 CFR 234.275 - Processor-based systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... new or novel technology, or which provide safety-critical data to a railroad signal or train control... requirements. New or novel technology refers to a technology not previously recognized for use as of March 7... but which provides safety-critical data to a signal or train control system shall be included in the...

  2. 76 FR 80408 - Addendum to the Memorandum of Understanding with the Department of Energy (August 28, 1992); Oak...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-23

    ... East Tennessee Technology Park in Oak Ridge, Tennessee; transfer of employee safety and health... occupational safety and health regulatory authority over employees at the East Tennessee Technology Park in Oak... facilities and properties at the East Tennessee Technology Park were transferred to TOSHA jurisdiction under...

  3. Less is (sometimes) more in cognitive engineering: the role of automation technology in improving patient safety

    PubMed Central

    Vicente, K

    2003-01-01

    

    There is a tendency to assume that medical error can be stamped out by automation. Technology may improve patient safety, but cognitive engineering research findings in several complex safety critical systems, including both aviation and health care, show that more is not always better. Less sophisticated technological systems can sometimes lead to better performance than more sophisticated systems. This "less is more" effect arises because safety critical systems are open systems where unanticipated events are bound to occur. In these contexts, decision support provided by a technological aid will be less than perfect because there will always be situations that the technology cannot accommodate. Designing sophisticated automation that suggests an uncertain course of action seems to encourage people to accept the imperfect advice, even though information to decide independently on a better course of action is available. It may be preferable to create more modest designs that merely provide feedback about the current state of affairs or that critique human-generated solutions than to rush to automate by creating sophisticated technological systems that recommend (fallible) courses of action. PMID:12897363

  4. Safety Features in Anaesthesia Machine

    PubMed Central

    Subrahmanyam, M; Mohan, S

    2013-01-01

    Anaesthesia is one of the few sub-specialties of medicine that has quickly adapted technology to improve patient safety. This application of technology can be seen in patient monitoring, advances in anaesthesia machines, intubating devices, ultrasound for visualisation of nerves and vessels, and more. Anaesthesia machines have come a long way in the last 100 years, the improvements being driven by patient safety as well as by functionality and economy of use. Incorporating safety features in anaesthesia machines, and ensuring that a proper check of the machine is done before use on a patient, ensures patient safety. This review traces all the present safety features in the machine and their evolution. PMID:24249880

  5. Test plan and report for Space Shuttle launch environment testing of Bergen cable technology safety cable

    NASA Technical Reports Server (NTRS)

    Ralph, John

    1992-01-01

    Bergen Cable Technology (BCT) has introduced a new product they refer to as 'safety cable'. This product is intended as a replacement for lockwire when installed per Aerospace Standard (AS) 4536 (included in Appendix D of this document). Installation of safety cable is reportedly faster and more uniform than lockwire. NASA/GSFC proposes to use this safety cable in Shuttle Small Payloads Project (SSPP) applications on upcoming Shuttle missions. To assure that BCT safety cable will provide positive locking of fasteners equivalent to lockwire, the SSPP will conduct vibration and pull tests of the safety cable.

  6. Effect of Uncertainty on Deterministic Runway Scheduling

    NASA Technical Reports Server (NTRS)

    Gupta, Gautam; Malik, Waqar; Jung, Yoon C.

    2012-01-01

    Active runway scheduling involves scheduling departures for takeoff and arrivals for runway crossing subject to numerous constraints. This paper evaluates the effect of uncertainty on a deterministic runway scheduler. The evaluation is done against a first-come-first-served (FCFS) scheme. In particular, the sequence from the deterministic scheduler is frozen and the times adjusted to satisfy all separation criteria; this approach is tested against FCFS. The comparison covers both system performance (throughput and system delay) and predictability, and varying levels of congestion are considered. Uncertainty is modeled in two ways: as equal uncertainty in runway availability for all aircraft, and as uncertainty that increases for later aircraft. Results indicate that the deterministic approach consistently performs better than FCFS in both system performance and predictability.
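
    The frozen-sequence step lends itself to a simple illustration: once the order is fixed, each aircraft gets the earliest time consistent with its own readiness and the required separation behind its predecessor. The sketch below assumes invented aircraft classes, readiness times, and separation values; it is not the scheduler evaluated in the paper.

        def schedule_fixed_sequence(sequence, ready_time, sep):
            # Given a fixed runway sequence (e.g., frozen from a deterministic
            # scheduler), assign each aircraft the earliest time satisfying
            # its ready time and sep[(lead_class, trail_class)] seconds of
            # separation behind its predecessor.
            times = {}
            prev = None
            for ac in sequence:
                t = ready_time[ac["id"]]
                if prev is not None:
                    t = max(t, times[prev["id"]] + sep[(prev["cls"], ac["cls"])])
                times[ac["id"]] = t
                prev = ac
            return times

        seq = [{"id": "AC1", "cls": "heavy"}, {"id": "AC2", "cls": "large"},
               {"id": "AC3", "cls": "large"}]
        ready = {"AC1": 0, "AC2": 30, "AC3": 40}
        sep = {("heavy", "large"): 120, ("large", "large"): 60,
               ("large", "heavy"): 60, ("heavy", "heavy"): 90}
        print(schedule_fixed_sequence(seq, ready, sep))
        # {'AC1': 0, 'AC2': 120, 'AC3': 180}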

  7. Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology.

    PubMed

    Schaff, James C; Gao, Fei; Li, Ye; Novak, Igor L; Slepchenko, Boris M

    2016-12-01

    Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium 'sparks' as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell.
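
    The snippet below caricatures the hybrid idea on a toy birth-decay system: an abundant species follows a deterministic Euler-integrated ODE, while rare discrete events fire stochastically within each step. It is a minimal operator-splitting sketch under invented rates, not the published PDE/Smoldyn coupling.

        import math, random

        def hybrid_step(x_det, n_sto, dt, k_decay, k_burst, burst_size, rng):
            # Operator-split hybrid update: the continuous species x_det
            # follows dx/dt = -k_decay*x + n_sto (explicit Euler), while the
            # discrete species n_sto gains burst_size copies with rate k_burst.
            x_det = x_det + dt * (-k_decay * x_det + n_sto)
            if rng.random() < 1.0 - math.exp(-k_burst * dt):  # P(>=1 event)
                n_sto += burst_size
            return x_det, n_sto

        rng = random.Random(1)
        x, n = 0.0, 0
        for _ in range(10_000):
            x, n = hybrid_step(x, n, dt=0.01, k_decay=1.0, k_burst=0.05,
                               burst_size=1, rng=rng)
        print(round(x, 2), n)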

  8. Implications of Emerging Data Mining

    NASA Astrophysics Data System (ADS)

    Kulathuramaiyer, Narayanan; Maurer, Hermann

    Data Mining describes a technology that discovers non-trivial hidden patterns in a large collection of data. Although this technology has a tremendous impact on our lives, the invaluable contributions of this invisible technology often go unnoticed. This paper discusses advances in data mining while focusing on the emerging data mining capability. Such data mining applications perform multidimensional mining on a wide variety of heterogeneous data sources, providing solutions to many unresolved problems. This paper also highlights the advantages and disadvantages arising from the ever-expanding scope of data mining. Data Mining augments human intelligence by equipping us with a wealth of knowledge and by empowering us to perform our daily tasks better. As the mining scope and capacity increases, users and organizations become more willing to compromise privacy. The huge data stores of the 'master miners' allow them to gain deep insights into individual lifestyles and their social and behavioural patterns. Data integration and analysis capability of combining business and financial trends together with the ability to deterministically track market changes will drastically affect our lives.

  9. A Concept of Operations for an Integrated Vehicle Health Assurance System

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.; Ross, Richard W.; Berger, David E.; Lekki, John D.; Mah, Robert W.; Perey, Danie F.; Schuet, Stefan R.; Simon, Donald L.; Smith, Stephen W.

    2013-01-01

    This document describes a Concept of Operations (ConOps) for an Integrated Vehicle Health Assurance System (IVHAS). This ConOps is associated with the Maintain Vehicle Safety (MVS) between Major Inspections Technical Challenge in the Vehicle Systems Safety Technologies (VSST) Project within NASA's Aviation Safety Program. In particular, this document seeks to describe an integrated system concept for vehicle health assurance that integrates ground-based inspection and repair information with in-flight measurement data for airframe, propulsion, and avionics subsystems. The MVS Technical Challenge intends to maintain vehicle safety between major inspections by developing and demonstrating new integrated health management and failure prevention technologies to assure the integrity of vehicle systems between major inspection intervals and maintain vehicle state awareness during flight. The approach provided by this ConOps is intended to help optimize technology selection and development, allow the initial integration and demonstration of these subsystem technologies over the 5-year span of the VSST program, and serve as a guideline for developing IVHAS technologies under the Aviation Safety Program within the next 5 to 15 years. A long-term vision of IVHAS is provided to describe a basic roadmap for more intelligent and autonomous vehicle systems.

  10. 48 CFR 923.7003 - Contract clauses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Section 923.7003 Federal Acquisition Regulations System DEPARTMENT OF ENERGY SOCIOECONOMIC PROGRAMS ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Environmental, Energy and Water Efficiency, Renewable Energy Technologies, and Occupational Safety...

  11. 48 CFR 923.7003 - Contract clauses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Section 923.7003 Federal Acquisition Regulations System DEPARTMENT OF ENERGY SOCIOECONOMIC PROGRAMS ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Environmental, Energy and Water Efficiency, Renewable Energy Technologies, and Occupational Safety...

  12. 48 CFR 923.7003 - Contract clauses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Section 923.7003 Federal Acquisition Regulations System DEPARTMENT OF ENERGY SOCIOECONOMIC PROGRAMS ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Environmental, Energy and Water Efficiency, Renewable Energy Technologies, and Occupational Safety...

  13. 48 CFR 923.7003 - Contract clauses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Section 923.7003 Federal Acquisition Regulations System DEPARTMENT OF ENERGY SOCIOECONOMIC PROGRAMS ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Environmental, Energy and Water Efficiency, Renewable Energy Technologies, and Occupational Safety...

  14. Alternative food safety intervention technologies

    USDA-ARS?s Scientific Manuscript database

    Alternative nonthermal and thermal food safety interventions are gaining acceptance by the food processing industry and consumers. These technologies include high pressure processing, ultraviolet and pulsed light, ionizing radiation, pulsed and radiofrequency electric fields, cold atmospheric plasm...

  15. Propulsion Airframe Aeroacoustics Technology Evaluation and Selection Using a Multi-Attribute Decision Making Process and Non-Deterministic Design

    NASA Technical Reports Server (NTRS)

    Burg, Cecile M.; Hill, Geoffrey A.; Brown, Sherilyn A.; Geiselhart, Karl A.

    2004-01-01

    The Systems Analysis Branch at NASA Langley Research Center has investigated revolutionary Propulsion Airframe Aeroacoustics (PAA) technologies and configurations for a Blended-Wing-Body (BWB) type aircraft as part of its research for NASA's Quiet Aircraft Technology (QAT) Project. Within the context of the long-term NASA goal of reducing the perceived aircraft noise level by a factor of 4 relative to the 1997 state of the art, major configuration changes in the propulsion airframe integration system were explored with noise as a primary design consideration. An initial down-select and assessment of candidate PAA technologies for the BWB was performed using a Multi-Attribute Decision Making (MADM) process consisting of organized brainstorming and decision-making tools. The assessments focused on the effect the PAA technologies had both on the overall noise level of the BWB and on other major design considerations such as weight, performance, and cost. A probabilistic systems analysis of the PAA configurations that presented the best noise reductions with the least negative impact on the system was then performed. Detailed results from the MADM study and the probabilistic systems analysis will be published in the near future.
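
    A weighted-sum score is the simplest aggregation used in MADM down-selects of this kind. The sketch below scores two hypothetical PAA technology options against normalized attributes; the weights, attribute names, and values are invented for illustration and do not come from the study.

        def weighted_scores(alternatives, weights):
            # Weighted-sum MADM aggregation; attribute values are assumed
            # already normalized to [0, 1] with "higher is better".
            return {name: sum(weights[a] * v for a, v in attrs.items())
                    for name, attrs in alternatives.items()}

        techs = {
            "option-A": {"noise": 0.9, "weight": 0.4, "cost": 0.5},
            "option-B": {"noise": 0.7, "weight": 0.7, "cost": 0.6},
        }
        w = {"noise": 0.5, "weight": 0.3, "cost": 0.2}
        print(weighted_scores(techs, w))  # {'option-A': 0.67, 'option-B': 0.68}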

  16. Practices and Exploration on Competition of Molecular Biological Detection Technology among Students in Food Quality and Safety Major

    ERIC Educational Resources Information Center

    Chang, Yaning; Peng, Yuke; Li, Pengfei; Zhuang, Yingping

    2017-01-01

    With the increasing importance of molecular biological detection technology in the field of food safety, strengthening education in molecular biology experimental techniques is increasingly necessary for training students in the food quality and safety major. However, molecular biology experiments are not always included in curricula…

  17. Nurses' Perceptions of the Impact of Work Systems and Technology on Patient Safety during the Medication Administration Process

    ERIC Educational Resources Information Center

    Gallagher Gordon, Mary

    2012-01-01

    This dissertation examines nurses' perceptions of the impacts of systems and technology utilized during the medication administration process on patient safety and the culture of medication error reporting. This exploratory research study was grounded in a model of patient safety based on Patricia Benner's Novice to Expert Skill Acquisition model,…

  18. Safe laser application requires more than laser safety

    NASA Astrophysics Data System (ADS)

    Frevel, A.; Steffensen, B.; Vassie, L.

    1995-02-01

    An overview is presented concerning aspects of laser safety in European industrial laser use. Surveys indicate that there is a large variation in the safety strategies amongst industrial laser users. Some key problem areas are highlighted. Emission of hazardous substances is a major problem for users of laser material processing systems where the majority of the particulate is of a sub-micrometre size, presenting a respiratory hazard. Studies show that in many cases emissions are not frequently monitored in factories and uncertainty exists over the hazards. Operators of laser machines do not receive adequate job training or safety training. The problem is compounded by a plethora of regulations and standards which are difficult to interpret and implement, and inspectors who are not conversant with the technology or the issues. A case is demonstrated for a more integrated approach to laser safety, taking into account the development of laser applications, organizational and personnel development, in addition to environmental and occupational health and safety aspects. It is necessary to achieve a harmonization between these elements in any organization involved in laser technology. This might be achieved through establishing technology transfer centres in laser technology.

  19. Analysis of Aviation Safety Reporting System Incident Data Associated With the Technical Challenges of the Vehicle Systems Safety Technology Project

    NASA Technical Reports Server (NTRS)

    Withrow, Colleen A.; Reveley, Mary S.

    2014-01-01

    This analysis was conducted to support the Vehicle Systems Safety Technology (VSST) Project of the Aviation Safety Program (AvSP), milestone VSST4.2.1.01, "Identification of VSST-Related Trends." In particular, this is a review of incident data from the NASA Aviation Safety Reporting System (ASRS). The following three VSST-related technical challenges (TCs) were the focus of the incidents searched in the ASRS database: (1) vehicle health assurance; (2) effective crew-system interactions and decisions in all conditions; and (3) aircraft loss-of-control prevention, mitigation, and recovery.

  20. Practices and exploration on competition of molecular biological detection technology among students in food quality and safety major.

    PubMed

    Chang, Yaning; Peng, Yuke; Li, Pengfei; Zhuang, Yingping

    2017-07-08

    With the increasing importance of molecular biological detection technology in the field of food safety, strengthening education in molecular biology experimental techniques is increasingly necessary for training students in the food quality and safety major. However, molecular biology experiments are not always included in the curricula of food quality and safety majors. This paper introduces a project, "competition of molecular biological detection technology for food safety among undergraduate sophomore students in the food quality and safety major." Students participating in this project first learn fundamental molecular biology experimental techniques, such as the principles of molecular biology experiments, genome extraction, PCR, and agarose gel electrophoresis analysis, and then design experiments in groups to identify the meat species in pork and beef products using molecular biological methods. The students complete an experimental report after the basic experiments, then write essays and give a presentation at the end of the designed experiments. The project aims to provide another way for food quality and safety majors to improve their knowledge of molecular biology, especially experimental technology, to bring them closer to scientific research activities, and to give them a chance to learn how to write a professional thesis. In addition, in line with the principle of an open laboratory, the project is also open to students in other majors at East China University of Science and Technology, to help them understand the fields of molecular biology and food safety. © 2017 The International Union of Biochemistry and Molecular Biology, 45(4):343-350, 2017.

  1. Automated Mixed Traffic Vehicle (AMTV) technology and safety study

    NASA Technical Reports Server (NTRS)

    Johnston, A. R.; Peng, T. K. C.; Vivian, H. C.; Wang, P. K.

    1978-01-01

    Technology and safety issues related to the implementation of an Automated Mixed Traffic Vehicle (AMTV) system are discussed. System concepts and technology status were reviewed, and areas where further development is needed are identified. Failure and hazard modes were also analyzed, and methods for prevention were suggested. The results presented are intended as a guide for further efforts in AMTV system design and technology development for both near-term and long-term applications. The AMTV systems discussed include a low-speed system and a hybrid system consisting of low-speed sections and high-speed sections operating in a semi-guideway. The safety analysis identified hazards that may arise in a properly functioning AMTV system, as well as hardware failure modes. Safety-related failure modes were emphasized. A risk assessment was performed to establish a priority order, and significant hazards and failure modes were summarized. Corrective measures were proposed for each hazard.

  2. National Institute of Occupational Safety and Health (NIOSH) Partnered Development of Cryogenic Life Support Technologies

    NASA Technical Reports Server (NTRS)

    Bush, David R.

    2017-01-01

    Cryogenic life support technology, used by NASA to protect crews working around hazardous gases, could soon be called on for a number of life-saving applications as well as the agency's new human spaceflight endeavors. This technology, under development in Kennedy Space Center's Biomedical Laboratory, has the potential to store more than twice the amount of breathable air of traditional compressed-gas systems. The National Institute for Occupational Safety and Health (NIOSH) is contributing funding to this project in the hope that liquid-air-based systems could change the way workers dependent on life support technologies accomplish their missions, improving their safety and efficiency.

  3. The cost-effectiveness of air bags by seating position.

    PubMed

    Graham, J D; Thompson, K M; Goldie, S J; Segui-Gomez, M; Weinstein, M C

    1997-11-05

    Motor vehicle crashes continue to cause significant mortality and morbidity in the United States. Installation of air bags in new passenger vehicles is a major initiative in the field of injury prevention. To assess the net health consequences and cost-effectiveness of driver's side and front passenger air bags from a societal perspective, taking into account the increased risk to children who occupy the front passenger seat and the diminished effectiveness for older adults. A deterministic state transition model tracked a hypothetical cohort of new vehicles over a 20-year period for 3 strategies: (1) installation of safety belts, (2) installation of driver's side air bags in addition to safety belts, and (3) installation of front passenger air bags in addition to safety belts and driver's side air bags. Changes in health outcomes, valued in terms of quality-adjusted life-years (QALYs) and costs (in 1993 dollars), were projected following the recommendations of the Panel on Cost-effectiveness in Health and Medicine. US population-based and convenience sample data were used. Incremental cost-effectiveness ratios. Safety belts are cost saving, even at 50% use. The addition of driver's side air bags to safety belts results in net health benefits at an incremental cost of $24,000 per QALY saved. The further addition of front passenger air bags results in an incremental net benefit at a higher incremental cost of $61,000 per QALY saved. Results were sensitive to the unit cost of air bag systems, their effectiveness, baseline fatality rates, the ratio of injuries to fatalities, and the real discount rate. Both air bag systems save life-years at costs that are comparable to many medical and public health practices. Immediate steps can be taken to enhance the cost-effectiveness of front passenger air bags, such as moving children to the rear seat.
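
    The headline numbers above are incremental cost-effectiveness ratios (ICERs): the extra cost divided by the extra health gain when moving to the next strategy. The sketch below shows the arithmetic; the per-vehicle cost and QALY figures are invented so that the ratio lands on the reported $24,000 per QALY, purely for illustration.

        def icer(cost_new, effect_new, cost_old, effect_old):
            # Incremental cost-effectiveness ratio: extra dollars per extra
            # quality-adjusted life-year (QALY) gained.
            return (cost_new - cost_old) / (effect_new - effect_old)

        # Hypothetical numbers chosen only to reproduce $24,000 per QALY:
        print(f"${icer(1480.0, 0.52, 1000.0, 0.50):,.0f} per QALY")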

  4. Information collection and processing of dam distortion in digital reservoir system

    NASA Astrophysics Data System (ADS)

    Liang, Yong; Zhang, Chengming; Li, Yanling; Wu, Qiulan; Ge, Pingju

    2007-06-01

    The "digital reservoir" is usually understood as describing the whole reservoir with digital information technology to make it serve the human existence and development furthest. Strictly speaking, the "digital reservoir" is referred to describing vast information of the reservoir in different dimension and space-time by RS, GPS, GIS, telemetry, remote-control and virtual reality technology based on computer, multi-media, large-scale memory and wide-band networks technology for the human existence, development and daily work, life and entertainment. The core of "digital reservoir" is to realize the intelligence and visibility of vast information of the reservoir through computers and networks. The dam is main building of reservoir, whose safety concerns reservoir and people's safety. Safety monitoring is important way guaranteeing the dam's safety, which controls the dam's running through collecting the dam's information concerned and developing trend. Safety monitoring of the dam is the process from collection and processing of initial safety information to forming safety concept in the brain. The paper mainly researches information collection and processing of the dam by digital means.

  5. Multiprocessor shared-memory information exchange

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santoline, L.L.; Bowers, M.D.; Crew, A.W.

    1989-02-01

    In distributed microprocessor-based instrumentation and control systems, the inter- and intra-subsystem communication requirements ultimately form the basis for the overall system architecture. This paper describes a software protocol that addresses the intra-subsystem communications problem. Specifically, the protocol allows multiple processors to exchange information via a shared-memory interface. The authors' primary goal is to provide a reliable means for information to be exchanged between central application processor boards (masters) and dedicated function processor boards (slaves) in a single computer chassis. The resultant Multiprocessor Shared-Memory Information Exchange (MSMIE) protocol, a standard master-slave shared-memory interface suitable for use in nuclear safety systems, is designed to pass unidirectional buffers of information between the processors while providing a minimum, deterministic cycle time for this data exchange.
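
    The sketch below illustrates the general shape of such an exchange: a slave publishes complete, sequence-numbered buffers that a master reads in bounded time. It is a toy Python analogue of a master-slave shared-memory mailbox, not the MSMIE protocol itself.

        import threading

        class SharedMailbox:
            # Toy single-writer exchange: the slave always finishes writing
            # a buffer before publishing it, so a master never observes a
            # partial update and the read path has a bounded cost.
            def __init__(self):
                self._lock = threading.Lock()
                self._seq = 0
                self._buf = b""

            def publish(self, data):          # slave side
                with self._lock:
                    self._buf = bytes(data)   # copy-in completes first
                    self._seq += 1            # then the update is published

            def latest(self):                 # master side
                with self._lock:
                    return self._seq, self._buf

        box = SharedMailbox()
        box.publish(b"sensor frame 1")
        print(box.latest())                   # (1, b'sensor frame 1')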

  6. Nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates

    DOEpatents

    Melechko, Anatoli V [Oak Ridge, TN; McKnight, Timothy E [Greenback, TN; Guillorn, Michael A [Ithaca, NY; Ilic, Bojan [Ithaca, NY; Merkulov, Vladimir I [Knoxville, TN; Doktycz, Mitchel J [Knoxville, TN; Lowndes, Douglas H [Knoxville, TN; Simpson, Michael L [Knoxville, TN

    2011-08-23

    Methods, manufactures, machines and compositions are described for nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates. An apparatus, includes a substrate and a nanoreplicant structure coupled to a surface of the substrate.

  7. Climatology and Predictability of Cool-Season High Wind Events in the New York City Metropolitan and Surrounding Area

    NASA Astrophysics Data System (ADS)

    Layer, Michael

    Damaging wind events not associated with severe convective storms or tropical cyclones can occur over the Northeast U.S. during the cool season and can cause significant problems for transportation, infrastructure, and public safety. These non-convective wind events (NCWEs) are difficult for operational forecasters to predict in the NYC region, as revealed by relatively poor verification statistics in recent years. This study investigates the climatology of NCWEs occurring between 15 September and 15 May over the 13 seasons from 2000-2001 through 2012-2013. The events are broken down into three distinct types commonly observed in the region: pre-cold-frontal (PRF), post-cold-frontal (POF), and nor'easter/coastal storm (NEC) cases. Relationships between observed winds and atmospheric parameters such as the 900 hPa height gradient, 3-hour MSLP tendency, low-level wind profile, and stability are also studied. Overall, PRF and NEC events exhibit stronger height gradients, stronger low-level winds, and stronger low-level stability than POF events. Model verification is also conducted over the 2009-2014 period using the Short Range Ensemble Forecast system (SREF) from the National Centers for Environmental Prediction (NCEP). Both deterministic and probabilistic verification metrics are used to evaluate the performance of the ensemble during NCWEs. Although the SREF has better forecast skill than most of the deterministic SREF control members, it is rather poorly calibrated and exhibits significant overforecasting (a positive wind speed bias) in the lower atmosphere.
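
    Of the probabilistic verification metrics mentioned above, one of the most common is the Brier score; the snippet below computes it for a few hypothetical high-wind probability forecasts (all numbers invented).

        def brier_score(forecast_probs, outcomes):
            # Mean squared difference between forecast probability and the
            # 0/1 observed outcome; lower is better, 0 is a perfect forecast.
            n = len(outcomes)
            return sum((p - o) ** 2 for p, o in zip(forecast_probs, outcomes)) / n

        print(brier_score([0.9, 0.2, 0.7], [1, 0, 0]))  # ~0.18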

  8. Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology

    PubMed Central

    Gao, Fei; Li, Ye; Novak, Igor L.; Slepchenko, Boris M.

    2016-01-01

    Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium ‘sparks’ as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell. PMID:27959915

  9. Stochasticity and determinism in models of hematopoiesis.

    PubMed

    Kimmel, Marek

    2014-01-01

    This chapter represents a novel view of modeling in hematopoiesis, synthesizing both deterministic and stochastic approaches. Whereas the stochastic models work in situations where chance dominates, for example when the number of cells is small, or under random mutations, the deterministic models are more important for large-scale, normal hematopoiesis. New types of models are on the horizon. These models attempt to account for distributed environments such as hematopoietic niches and their impact on dynamics. Mixed effects of such structures and chance events are largely unknown and constitute both a challenge and promise for modeling. Our discussion is presented under the separate headings of deterministic and stochastic modeling; however, the connections between both are frequently mentioned. Four case studies are included to elucidate important examples. We also include a primer of deterministic and stochastic dynamics for the reader's use.

  10. Leaving patients to their own devices? Smart technology, safety and therapeutic relationships.

    PubMed

    Ho, Anita; Quick, Oliver

    2018-03-06

    This debate article explores how smart technologies may create a double-edged sword for patient safety and effective therapeutic relationships. Increasing utilization of health monitoring devices by patients will likely become an important aspect of self-care and preventive medicine. It may also help to enhance accurate symptom reports, diagnoses, and prompt referral to specialist care where appropriate. However, the development, marketing, and use of such technology raise significant ethical implications for therapeutic relationships and patient safety. Drawing on lessons learned from other direct-to-consumer health products such as genetic testing, this article explores how smart technology can also pose regulatory challenges and encourage overutilization of healthcare services. In order for smart technology to promote safer care and effective therapeutic encounters, the technology and its utilization must be safe. This article argues for unified regulatory guidelines and better education for both healthcare providers and patients regarding the benefits and risks of these devices.

  11. Identification of Vehicle Health Assurance Related Trends

    NASA Technical Reports Server (NTRS)

    Phojanamongkolkij, Nipa; Evans, Joni K.; Barr, Lawrence C.; Leone, Karen M.; Reveley, Mary S.

    2014-01-01

    Trend analysis in aviation related to vehicle health management (VHM) was performed by reviewing the most current statistical and prognostic data available from the National Transportation Safety Board (NTSB) accident, Federal Aviation Administration (FAA) incident, and NASA Aviation Safety Reporting System (ASRS) incident datasets. In addition, future directions in aviation technology related to VHM research areas were assessed through the Commercial Aviation Safety Team (CAST) Safety Enhancements Reserved for Future Implementations (SERFIs), the NTSB Most-Wanted List and recent open safety recommendations, the National Research Council (NRC) Decadal Survey of Civil Aeronautics, and the Future Aviation Safety Team (FAST) areas of change. Future research direction in the VHM research areas is evidently strong, as seen from recent research solicitations from the Naval Air Systems Command (NAVAIR) and from VHM-related technologies actively being developed by aviation industry leaders, including GE, Boeing, Airbus, and UTC Aerospace Systems. Given the high complexity of VHM systems, modifications can be made in the future so that the Vehicle Systems Safety Technology (VSST) Project technical challenges address inadequate maintenance crew training and skills, and the certification methods for such systems, as recommended by the NTSB, the NRC, and the FAST areas of change.

  12. Technologies for precision manufacture of current and future windows and domes

    NASA Astrophysics Data System (ADS)

    Hallock, Bob; Shorey, Aric

    2009-05-01

    The final finish and characterization of windows and domes presents a number of challenges in achieving the desired precision at acceptable cost and schedule. This becomes more difficult with advanced materials and as window and dome shapes and requirements become more complex, including acute-angle corners, transmitted wavefront specifications, aspheric geometries, and a trend toward conformal surfaces. Magnetorheological Finishing (MRF®) and Magnetorheological Jet (MR Jet®), along with metrology provided by Sub-aperture Stitching Interferometry (SSI®), have several unique attributes that give them advantages in the fabrication of current and next-generation windows and domes. The advantages that MRF brings to the precision finishing of a wide range of shapes, such as flats, spheres (including hemispheres), cylinders, aspheres, and even freeform optics, have been well documented. Recent advancements include the ability to finish freeform shapes up to 2 meters in size as well as progress in finishing challenging IR materials. Because its removal mechanism is shear-based, in contrast to the pressure-based processes of other techniques, edges are not typically rolled, in particular on parts with acute-angle corners. MR Jet provides additional benefits, particularly in finishing the inside of steep concave domes and other irregular shapes. The ability of MR Jet to correct the figure of conformal domes deterministically and to high precision has been demonstrated. Combining these technologies with metrology techniques such as SSI provides a solution for finishing current and future windows and domes in a reliable, deterministic, and cost-effective way. The ability to use the SSI to characterize a range of shapes such as domes and aspheres, as well as progress in using MRF and MR Jet for finishing conventional and conformal windows and domes of increasing size and complexity of design, will be presented.

  13. A Robust Scalable Transportation System Concept

    NASA Technical Reports Server (NTRS)

    Hahn, Andrew; DeLaurentis, Daniel

    2006-01-01

    This report documents the 2005 Revolutionary System Concept for Aeronautics (RSCA) study entitled "A Robust, Scalable Transportation System Concept". The objective of the study was to generate, at a high level of abstraction, characteristics of a new concept for the National Airspace System, or new NAS, under which transportation goals such as increased throughput, delay reduction, and improved robustness could be realized. Since such an objective can be overwhelmingly complex if pursued at the lowest levels of detail, a System-of-Systems (SoS) approach was adopted instead to model alternative air transportation architectures at a high level. The SoS approach allows the consideration not only of the technical aspects of the NAS, but also incorporates policy, socio-economic, and alternative transportation system considerations into one architecture. While the representations of the individual systems are basic, the higher-level approach allows for ways to optimize the SoS at the network level, determining the best topology (i.e. configuration of nodes and links). The final product (concept) is a set of rules of behavior and network structure that not only satisfies national transportation goals, but represents the high-impact rules that accomplish those goals by getting the agents to "do the right thing" naturally. The novel combination of Agent-Based Modeling and Network Theory provides the core analysis methodology in the System-of-Systems approach. Our method of approach is non-deterministic, which means, fundamentally, that it asks and answers different questions than deterministic models do. The non-deterministic method is necessary primarily due to our marriage of human systems with technological ones in a partially unknown set of future worlds. Our goal is to understand and simulate how the SoS, human and technological components combined, evolves.

  14. Safety benefits of implementing adaptive signal control technology : survey results.

    DOT National Transportation Integrated Search

    2013-01-01

    The safety benefits and costs associated with implementing adaptive signal control technology (ASCT) were evaluated in this study. A user-friendly online survey was distributed to 62 agencies that had implemented ASCT in the United States. Twenty...

  15. Alternative food safety intervention technologies: flash pasteurization of finfish

    USDA-ARS?s Scientific Manuscript database

    Alternative nonthermal and thermal food safety interventions are gaining acceptance by the food processing industry and consumers. These technologies include high pressure processing, ultraviolet and pulsed light, ionizing radiation, pulsed and radiofrequency electric fields, cold atmospheric plasm...

  16. SmartPark Technology Demonstration Project, Phase II: Final Report : Technology Brief

    DOT National Transportation Integrated Search

    2018-05-01

    In 2000, the National Transportation Safety Board recommended that the Federal Motor Carrier Safety Administration (FMCSA) create a guide to inform truck drivers about locations and availability of parking. In 2002, the Federal Highway Administration...

  17. Computational toxicity in 21st century safety sciences (China ...

    EPA Pesticide Factsheets

    presentation at the Joint Meeting of Analytical Toxicology and Computational Toxicology Committee (Chinese Society of Toxicology) International Workshop on Advanced Chemical Safety Assessment Technologies on 11 May 2016, Fuzhou University, Fuzhou, China

  18. Improving Patient Safety in Hospitals through Usage of Cloud Supported Video Surveillance.

    PubMed

    Dašić, Predrag; Dašić, Jovan; Crvenković, Bojan

    2017-04-15

    Patient safety in hospitals is of equal importance as providing treatments and urgent healthcare. With the development of Cloud technologies and Big Data analytics, it is possible to employ VSaaS technology virtually anywhere, for any given security purpose. Given these benefits, in this paper we give an overview of the existing cloud surveillance technologies which can be implemented for improving patient safety. Modern VSaaS systems provide higher elasticity and project scalability in dealing with real-time information processing. Modern surveillance technologies can prove to be an effective tool for prevention of patient falls, undesired movement, and tampering with attached life-supporting devices. Given a large number of patients who require constant supervision, a cloud-based monitoring system can dramatically reduce the occurring costs. It provides continuous real-time monitoring, increased overall security and safety, improved staff productivity, prevention of dishonest claims, and long-term digital archiving. Patient safety is a growing issue that can be improved with the usage of high-end centralised surveillance systems, allowing the staff to focus more on treating health issues rather than keeping a watchful eye on potential incidents.

  19. Assessment of the safety benefits of vehicles' advanced driver assistance, connectivity and low level automation systems.

    PubMed

    Yue, Lishengsa; Abdel-Aty, Mohamed; Wu, Yina; Wang, Ling

    2018-08-01

    The Connected Vehicle (CV) technologies together with other Driving Assistance (DA) technologies are believed to have great effects on traffic operation and safety, and they are expected to impact the future of our cities. However, little research has estimated the exact safety benefits when all vehicles are equipped with these technologies. This paper seeks to fill the gap by using a general crash avoidance effectiveness framework for major CV&DA technologies to make a comprehensive crash reduction estimation. Twenty technologies that were tested in recent studies are summarized, and sensitivity analysis is used for estimating their total crash avoidance effectiveness. The results show that crash avoidance effectiveness of CV&DA technology is significantly affected by the vehicle type and the safety estimation methodology. A 70% crash avoidance rate appears to be the highest effectiveness for the CV&DA technologies operating in the real-world environment. Based on the 2005-2008 U.S. GES Crash Records, this research found that the CV&DA technologies could reduce light vehicles' crashes and heavy trucks' crashes by at least 32.99% and 40.88%, respectively. Rear-end crashes for both light vehicles and heavy trucks show the greatest expected benefits from the technologies. The paper also studies the effectiveness of Forward Collision Warning technology (FCW) under fog conditions, and the results show that FCW could reduce near-crash events under fog conditions by 35%. Copyright © 2018 Elsevier Ltd. All rights reserved.
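
    The fleet-level arithmetic behind such estimates can be made concrete with a small sketch: assuming each technology addresses a given crash type with some avoidance effectiveness, weighting by the share of crashes of that type yields an overall reduction. The crash-type shares and effectiveness values below are made-up placeholders, not the paper's GES-derived figures.

        # Illustrative aggregation of per-technology crash-avoidance effectiveness.
        # Crash-type shares and effectiveness values are placeholder assumptions,
        # not the 2005-2008 GES figures used in the paper.
        crash_share = {"rear-end": 0.30, "lane-departure": 0.20,
                       "intersection": 0.25, "other": 0.25}

        # effectiveness[t] = fraction of type-t crashes avoided when all
        # vehicles carry the relevant CV/DA technology
        effectiveness = {"rear-end": 0.50, "lane-departure": 0.35,
                         "intersection": 0.30, "other": 0.0}

        total = sum(crash_share[t] * effectiveness[t] for t in crash_share)
        print(f"estimated overall crash reduction: {total:.1%}")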

  20. Hybrid deterministic/stochastic simulation of complex biochemical systems.

    PubMed

    Lecca, Paola; Bagagiolo, Fabio; Scarpa, Marina

    2017-11-21

    In a biological cell, cellular functions and the genetic regulatory apparatus are implemented and controlled by complex networks of chemical reactions involving genes, proteins, and enzymes. Accurate computational models are indispensable means for understanding the mechanisms behind the evolution of a complex system, not always explored with wet lab experiments. To serve their purpose, computational models, however, should be able to describe and simulate the complexity of a biological system in many of its aspects. Moreover, they should be implemented with efficient algorithms requiring the shortest possible execution time, to avoid excessively lengthening the time between data analysis and any subsequent experiment. Besides the features of their topological structure, the complexity of biological networks also refers to their dynamics, which is often non-linear and stiff. The stiffness is due to the presence of molecular species whose abundance fluctuates by many orders of magnitude. A fully stochastic simulation of a stiff system is computationally expensive. On the other hand, continuous models are less costly, but they fail to capture the stochastic behaviour of small populations of molecular species. We introduce a new efficient hybrid stochastic-deterministic computational model and the software tool MoBioS (MOlecular Biology Simulator) implementing it. The mathematical model of MoBioS uses continuous differential equations to describe the deterministic reactions and a Gillespie-like algorithm to describe the stochastic ones. Unlike the majority of current hybrid methods, the MoBioS algorithm divides the reaction set into fast reactions, moderate reactions, and slow reactions and implements a hysteresis switching between the stochastic model and the deterministic model. Fast reactions are approximated as continuous-deterministic processes and modelled by deterministic rate equations. Moderate reactions are those whose reaction waiting time is greater than the fast reaction waiting time but smaller than the slow reaction waiting time. A moderate reaction is approximated as a stochastic (deterministic) process if it was classified as a stochastic (deterministic) process at the time at which it crosses the threshold of low (high) waiting time. A Gillespie First Reaction Method is implemented to select and execute the slow reactions. The performance of MoBioS was tested on a typical example of hybrid dynamics: the regulation of DNA transcription. The simulated dynamic profile of the reagents' abundance and the estimated error introduced by a fully deterministic approach were used to evaluate the consistency of the computational model and that of the software tool.
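
    A minimal sketch of the partitioning idea follows: reactions whose propensity is high (short expected waiting time) are advanced with a deterministic Euler step, while the remaining slow reactions are handled with a Gillespie First Reaction step. The two-species system, the rate constants, and the single fixed threshold (the sketch omits MoBioS's moderate class and hysteresis switching) are illustrative assumptions, not the MoBioS implementation.

        # Minimal hybrid stochastic-deterministic sketch (illustrative, not MoBioS):
        # fast reactions -> deterministic rate equations; slow -> Gillespie FRM.
        import math, random

        # State: [mRNA, protein]. Reactions as (propensity_fn, stoichiometry):
        reactions = [
            (lambda s: 0.05,         (+1, 0)),   # transcription (slow)
            (lambda s: 0.002 * s[0], (-1, 0)),   # mRNA decay    (slow)
            (lambda s: 10.0 * s[0],  (0, +1)),   # translation   (fast)
            (lambda s: 1.0 * s[1],   (0, -1)),   # protein decay (fast once protein builds up)
        ]
        THRESHOLD = 0.5   # propensity above this -> treat as fast/deterministic

        def step(state, dt_max=0.01):
            fast = [(a, v) for a, v in reactions if a(state) > THRESHOLD]
            slow = [(a, v) for a, v in reactions if a(state) <= THRESHOLD]
            # Gillespie First Reaction Method over the slow set
            waits = [(random.expovariate(a(state)) if a(state) > 0 else math.inf, v)
                     for a, v in slow]
            tau, stoich = min(waits, key=lambda w: w[0], default=(math.inf, None))
            dt = min(tau, dt_max)
            # Euler step for the fast (continuous) reactions
            for a, v in fast:
                rate = a(state)
                state[0] += v[0] * rate * dt
                state[1] += v[1] * rate * dt
            if tau <= dt_max and stoich is not None:   # a slow reaction fires
                state[0] += stoich[0]
                state[1] += stoich[1]
            return dt

        state, t = [1.0, 0.0], 0.0
        while t < 50.0:
            t += step(state)
        print("t=%.1f  mRNA=%.2f  protein=%.2f" % (t, state[0], state[1]))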

  1. Failed rib region prediction in a human body model during crash events with precrash braking.

    PubMed

    Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S

    2018-02-28

    The objective of this study is 2-fold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method for rib fracture predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but 1 case, where they were equal. The failed-region patterns between methods are similar; however, differences arise because element elimination redistributes stress, causing the probabilistic method to continue accumulating failed regions after the deterministic method predicts no further failures. Both the probabilistic and deterministic methods indicate similar trends with regard to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and is more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.
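
    The contrast between the two failure treatments can be expressed compactly: the deterministic rule eliminates elements whose effective plastic strain exceeds 1.8% and declares a failed region once enough contiguous elements have failed, while the probabilistic rule converts each element's strain into an age-dependent failure probability. In the sketch below, the strain data, the three-element contiguity threshold, and the logistic probability curve are illustrative assumptions, not the GHBMC rubric or the published age-based functions.

        # Sketch of deterministic vs probabilistic rib-failure classification.
        # Strain values, contiguity threshold, and the logistic curve are
        # illustrative assumptions, not the actual GHBMC rubric.
        import math

        strains = [0.004, 0.019, 0.021, 0.020, 0.006, 0.017, 0.019]  # plastic strain
        ELIM_STRAIN = 0.018   # deterministic element-elimination threshold (1.8%)
        CONTIGUOUS = 3        # failed elements in a row defining a failed region

        def deterministic_failed_regions(strains):
            regions, run = 0, 0
            for eps in strains:
                run = run + 1 if eps > ELIM_STRAIN else 0
                if run == CONTIGUOUS:      # count each contiguous run once
                    regions += 1
            return regions

        def failure_probability(eps, age=50):
            """Illustrative age-shifted logistic strain-to-failure curve."""
            eps50 = 0.020 - 0.0001 * (age - 45)   # assumed age dependence
            return 1.0 / (1.0 + math.exp(-(eps - eps50) / 0.002))

        p_any = 1.0 - math.prod(1.0 - failure_probability(e) for e in strains)
        print("deterministic failed regions:", deterministic_failed_regions(strains))
        print("probabilistic P(at least one failure): %.2f" % p_any)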

  2. Pro Free Will Priming Enhances “Risk-Taking” Behavior in the Iowa Gambling Task, but Not in the Balloon Analogue Risk Task: Two Independent Priming Studies

    PubMed Central

    Schrag, Yann; Tremea, Alessandro; Lagger, Cyril; Ohana, Noé; Mohr, Christine

    2016-01-01

    Studies indicated that people behave less responsibly after exposure to information containing deterministic statements as compared to free will statements or neutral statements. Thus, deterministic primes should lead to enhanced risk-taking behavior. We tested this prediction in two studies with healthy participants. In experiment 1, we tested 144 students (24 men) in the laboratory using the Iowa Gambling Task. In experiment 2, we tested 274 participants (104 men) online using the Balloon Analogue Risk Task. In the Iowa Gambling Task, the free will priming condition resulted in more risky decisions than both the deterministic and neutral priming conditions. We observed no priming effects on risk-taking behavior in the Balloon Analogue Risk Task. To explain these unpredicted findings, we consider the somatic marker hypothesis, a gain frequency approach, as well as attention to gains and/or inattention to losses. In addition, we highlight the necessity to consider both pro free will and deterministic priming conditions in future studies. Importantly, our and previous results indicate that the effects of pro free will and deterministic priming do not oppose each other on a frequently assumed continuum. PMID:27018854

  3. Pro Free Will Priming Enhances "Risk-Taking" Behavior in the Iowa Gambling Task, but Not in the Balloon Analogue Risk Task: Two Independent Priming Studies.

    PubMed

    Schrag, Yann; Tremea, Alessandro; Lagger, Cyril; Ohana, Noé; Mohr, Christine

    2016-01-01

    Studies indicated that people behave less responsibly after exposure to information containing deterministic statements as compared to free will statements or neutral statements. Thus, deterministic primes should lead to enhanced risk-taking behavior. We tested this prediction in two studies with healthy participants. In experiment 1, we tested 144 students (24 men) in the laboratory using the Iowa Gambling Task. In experiment 2, we tested 274 participants (104 men) online using the Balloon Analogue Risk Task. In the Iowa Gambling Task, the free will priming condition resulted in more risky decisions than both the deterministic and neutral priming conditions. We observed no priming effects on risk-taking behavior in the Balloon Analogue Risk Task. To explain these unpredicted findings, we consider the somatic marker hypothesis, a gain frequency approach, as well as attention to gains and/or inattention to losses. In addition, we highlight the necessity to consider both pro free will and deterministic priming conditions in future studies. Importantly, our and previous results indicate that the effects of pro free will and deterministic priming do not oppose each other on a frequently assumed continuum.

  4. 49 CFR 1.50 - Delegation to the National Highway Traffic Safety Administrator.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... program; (9) Section 2010, motorcyclist safety; (10) Section 2011, child safety and child booster seat... use technologies; (24) Section 10307(b), regulations, in regard to safety labeling requirements; (25...

  5. 49 CFR 1.50 - Delegation to the National Highway Traffic Safety Administrator.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... program; (9) Section 2010, motorcyclist safety; (10) Section 2011, child safety and child booster seat... use technologies; (24) Section 10307(b), regulations, in regard to safety labeling requirements; (25...

  6. 48 CFR 52.250-3 - SAFETY Act Block Designation/Certification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... QATTs have been deployed. It also confers other important benefits. SAFETY Act designation and SAFETY... or requests may be mailed to: Directorate of Science and Technology, SAFETY Act/Room 4320, Department...

  7. 48 CFR 52.250-3 - SAFETY Act Block Designation/Certification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... QATTs have been deployed. It also confers other important benefits. SAFETY Act designation and SAFETY... or requests may be mailed to: Directorate of Science and Technology, SAFETY Act/Room 4320, Department...

  8. 48 CFR 52.250-3 - SAFETY Act Block Designation/Certification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... QATTs have been deployed. It also confers other important benefits. SAFETY Act designation and SAFETY... or requests may be mailed to: Directorate of Science and Technology, SAFETY Act/Room 4320, Department...

  9. 48 CFR 52.250-3 - SAFETY Act Block Designation/Certification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... QATTs have been deployed. It also confers other important benefits. SAFETY Act designation and SAFETY... or requests may be mailed to: Directorate of Science and Technology, SAFETY Act/Room 4320, Department...

  10. 48 CFR 52.250-3 - SAFETY Act Block Designation/Certification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... QATTs have been deployed. It also confers other important benefits. SAFETY Act designation and SAFETY... or requests may be mailed to: Directorate of Science and Technology, SAFETY Act/Room 4320, Department...

  11. Food Safety Informatics: A Public Health Imperative

    PubMed Central

    Tucker, Cynthia A.; Larkin, Stephanie N.; Akers, Timothy A.

    2011-01-01

    To date, little has been written about implementing food safety informatics as a technological tool to protect consumers, in real time, against foodborne illnesses. Food safety outbreaks have become a major public health problem, causing an estimated 48 million illnesses, 128,000 hospitalizations, and 3,000 deaths in the U.S. each year. Yet, government inspectors/regulators that monitor foodservice operations struggle with how to collect, organize, and analyze data and how to implement, monitor, and enforce safe food systems. Currently, standardized technologies have not been implemented to efficiently establish "near-in-time" or "just-in-time" electronic awareness to enhance early detection of public health threats regarding food safety. To address the potential impact of collection, organization, and analysis of data in a foodservice operation, a wireless food safety informatics (FSI) tool was pilot tested at a university student foodservice center. The technological platform in this test collected data every six minutes over a 24-hour period, across two primary domains: time and temperatures within freezers, walk-in refrigerators, and dry storage areas. The results of this pilot study briefly illustrated how technology can assist in food safety surveillance and monitoring by efficiently detecting food safety abnormalities related to time and temperature, so that a proper response can be mounted in real time to prevent potential foodborne illnesses. PMID:23569605
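
    A minimal sketch of the kind of near-in-time monitoring the pilot describes: temperature readings sampled every six minutes are checked against a safe holding range, and sustained excursions raise an alert. The sensor names, the safe ranges, and the two-consecutive-reading persistence rule are illustrative assumptions, not the pilot platform's configuration.

        # Sketch of threshold-based food-safety monitoring over 6-minute samples.
        # Sensor names, safe ranges, and the persistence rule are assumptions.
        SAFE_RANGE_C = {"walk_in_fridge": (0.0, 4.0), "freezer": (-23.0, -15.0)}
        PERSISTENCE = 2   # consecutive out-of-range readings before alerting

        def scan(sensor, readings):
            lo, hi = SAFE_RANGE_C[sensor]
            bad_run = 0
            for i, temp in enumerate(readings):
                bad_run = bad_run + 1 if not (lo <= temp <= hi) else 0
                if bad_run == PERSISTENCE:
                    yield i, temp   # sample index * 6 gives elapsed minutes

        fridge = [2.9, 3.1, 4.6, 5.2, 5.0, 3.3]   # one reading every 6 minutes
        for idx, temp in scan("walk_in_fridge", fridge):
            print(f"ALERT walk_in_fridge: {temp} C sustained at minute {idx * 6}")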

  12. Enhancing the safety and quality of fresh produce and low-moisture foods by waterless non-thermal technologies: Cold plasma and monochromatic light

    USDA-ARS?s Scientific Manuscript database

    NIFA Project 2015-69003-23410 addresses the urgent need for novel technologies that improve the safety of fresh and fresh-cut fruits and vegetables that preserve quality while reducing water usage. This portion of the project is to investigate emerging non-thermal technologies, such as antimicrobial...

  13. A Risk Assessment Model for Reduced Aircraft Separation: A Quantitative Method to Evaluate the Safety of Free Flight

    NASA Technical Reports Server (NTRS)

    Cassell, Rick; Smith, Alex; Connors, Mary; Wojciech, Jack; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    As new technologies and procedures are introduced into the National Airspace System, whether they are intended to improve efficiency, capacity, or safety level, the quantification of potential changes in safety levels is of vital concern. Applications of technology can improve safety levels and allow the reduction of separation standards. An excellent example is the Precision Runway Monitor (PRM). By taking advantage of the surveillance and display advances of PRM, airports can run instrument parallel approaches to runways separated by 3400 feet with the same level of safety as parallel approaches to runways separated by 4300 feet using the standard technology. Despite a wealth of information from flight operations and testing programs, there is no readily quantifiable relationship between numerical safety levels and the separation standards that apply to aircraft on final approach. This paper presents a modeling approach to quantify the risk associated with reducing separation on final approach. Reducing aircraft separation, both laterally and longitudinally, has been the goal of several aviation R&D programs over the past several years. Many of these programs have focused on technological solutions to improve navigation accuracy, surveillance accuracy, aircraft situational awareness, controller situational awareness, and other technical and operational factors that are vital to maintaining flight safety. The risk assessment model relates different types of potential aircraft accidents and incidents and their contribution to overall accident risk. The framework links accident risks to a hierarchy of failsafe mechanisms characterized by procedures and interventions. The model will be used to assess the overall level of safety associated with reducing separation standards and the introduction of new technology and procedures, as envisaged under the Free Flight concept. The model framework can be applied to various aircraft scenarios, including parallel and in-trail approaches. This research was performed under contract to NASA and in cooperation with the FAA's Safety Division (ASY).
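
    One way to read the paper's framework of accident risks gated by a hierarchy of failsafe mechanisms is as a chain of conditional interventions: an initiating deviation leads to an accident only if every layer fails to intervene. The sketch below composes such a chain; the layer names and probabilities are invented placeholders, not the model's calibrated values.

        # Sketch: accident risk as an initiating-event probability gated by a
        # hierarchy of failsafe interventions. All values are placeholders.
        P_BLUNDER = 1e-4   # per-approach probability of an initiating deviation

        # probability that each successive failsafe layer FAILS to intervene
        layer_failure = {
            "pilot detects and corrects":      0.10,
            "controller alert and resolution": 0.05,
            "onboard collision avoidance":     0.02,
        }

        p_accident = P_BLUNDER
        for layer, p_fail in layer_failure.items():
            p_accident *= p_fail

        print(f"per-approach accident probability: {p_accident:.2e}")
        # Tighter separation can be offset by strengthening any layer: halving
        # one layer's failure probability halves the overall risk.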

  14. Final Report: Fire Prevention, Detection, and Suppression Project, Exploration Technology Development Program

    NASA Technical Reports Server (NTRS)

    Ruff, Gary A.

    2011-01-01

    The Fire Prevention, Detection, and Suppression (FPDS) project is a technology development effort within the Exploration Technology Development Program of the Exploration System Missions Directorate (ESMD) that addresses all aspects of fire safety aboard manned exploration systems. The overarching goal for work in the FPDS area is to develop technologies that will ensure crew health and safety on exploration missions by reducing the likelihood of a fire, or, if one does occur, minimizing the risk to the crew, mission, or system. This is accomplished by addressing the areas of (1) fire prevention and material flammability, (2) fire signatures and detection, and (3) fire suppression and response. This report describes the outcomes of this project from the formation of the Exploration Technology Development Program (ETDP) in October 2005 to September 31, 2010 when the Exploration Technology Development Program was replaced by the Enabling Technology Development and Demonstration Program. NASA s fire safety work will continue under this new program and will build upon the accomplishments described herein.

  15. Ion implantation for deterministic single atom devices

    NASA Astrophysics Data System (ADS)

    Pacheco, J. L.; Singh, M.; Perry, D. L.; Wendt, J. R.; Ten Eyck, G.; Manginell, R. P.; Pluym, T.; Luhman, D. R.; Lilly, M. P.; Carroll, M. S.; Bielejec, E.

    2017-12-01

    We demonstrate a capability of deterministic doping at the single atom level using a combination of direct write focused ion beam and solid-state ion detectors. The focused ion beam system can position a single ion to within 35 nm of a targeted location and the detection system is sensitive to single low energy heavy ions. This platform can be used to deterministically fabricate single atom devices in materials where the nanostructure and ion detectors can be integrated, including donor-based qubits in Si and color centers in diamond.

  16. Counterfactual Quantum Deterministic Key Distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Sheng; Wang, Jian; Tang, Chao-Jing

    2013-01-01

    We propose a new counterfactual quantum cryptography protocol for distributing a deterministic key. By adding a controlled blocking operation module to the original protocol [T.G. Noh, Phys. Rev. Lett. 103 (2009) 230501], the correlation between the polarizations of the two parties, Alice and Bob, is extended; therefore, one can distribute both deterministic and random keys using our protocol. We have also given a simple proof of the security of our protocol using the technique we previously applied to the original protocol. Most importantly, our analysis produces a bound tighter than the existing ones.

  17. Ion implantation for deterministic single atom devices

    DOE PAGES

    Pacheco, J. L.; Singh, M.; Perry, D. L.; ...

    2017-12-04

    Here, we demonstrate a capability of deterministic doping at the single atom level using a combination of direct write focused ion beam and solid-state ion detectors. The focused ion beam system can position a single ion to within 35 nm of a targeted location and the detection system is sensitive to single low energy heavy ions. This platform can be used to deterministically fabricate single atom devices in materials where the nanostructure and ion detectors can be integrated, including donor-based qubits in Si and color centers in diamond.

  18. Deterministic quantum splitter based on time-reversed Hong-Ou-Mandel interference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jun; Lee, Kim Fook; Kumar, Prem

    2007-09-15

    By utilizing a fiber-based indistinguishable photon-pair source in the 1.55 μm telecommunications band [J. Chen et al., Opt. Lett. 31, 2798 (2006)], we present the first, to the best of our knowledge, deterministic quantum splitter based on the principle of time-reversed Hong-Ou-Mandel quantum interference. The deterministically separated identical photons' indistinguishability is then verified by using a conventional Hong-Ou-Mandel quantum interference, which exhibits a near-unity dip visibility of 94±1%, making this quantum splitter useful for various quantum information processing applications.

  19. Fire Safety of Passenger Trains: A Review of U.S. and Foreign Approaches

    DOT National Transportation Integrated Search

    1993-12-01

    could develop into potentially life-threatening events. Fire safety is an area of particular interest for both conventional intercity and commuter trains, as well as new alternative high-speed train technologies. These technologies include steel-...

  20. Safety Relevant Observations on the ICE High Speed Train

    DOT National Transportation Integrated Search

    1991-07-01

    The safety of high speed rail technology proposed for possible application in the United States is of concern to the Federal Railroad Administration. This report, one in a series of reports planned for high speed rail technologies presents an initial...

  1. Development and validation of nonthermal and advanced thermal food safety intervention technologies

    USDA-ARS?s Scientific Manuscript database

    Alternative nonthermal and thermal food safety interventions are gaining acceptance by the food processing industry and consumers. These technologies include high pressure processing, ultraviolet and pulsed light, ionizing radiation, pulsed and radiofrequency electric fields, cold atmospheric plasm...

  2. 2015 Accomplishments Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    This report covers selected highlights from the four research pathways in the LWRS Program: Materials Aging and Degradation; Risk-Informed Safety Margin Characterization; Advanced Instrumentation, Information, and Control Systems Technologies; and Reactor Safety Technologies, as well as a look-ahead at planned activities for 2017.

  3. 2016 Accomplishments Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    This report covers selected highlights from the four research pathways in the LWRS Program: Materials Aging and Degradation; Risk-Informed Safety Margin Characterization; Advanced Instrumentation, Information, and Control Systems Technologies; and Reactor Safety Technologies, as well as a look-ahead at planned activities for 2017.

  4. Utah ITS/CVO business plan : using technology to maximize highway safety and improve government and industry productivity

    DOT National Transportation Integrated Search

    1997-12-31

    This plan was produced to maximize highway safety and increase government and industry productivity through the application of Intelligent Transportation System/Commercial Vehicle Operations (ITS/CVO) technologies to support regulatory and enforcemen...

  5. Future Data Communication Architectures for Safety Critical Aircraft Cabin Systems

    NASA Astrophysics Data System (ADS)

    Berkhahn, Sven-Olaf

    2012-05-01

    The cabin of modern aircraft is subject to increasing demands for fast reconfiguration and hence flexibility. These demands require studies of new network architectures and technologies for the electronic cabin systems, which also consider weight and cost reductions as well as safety constraints. Two major approaches are under consideration to reduce the complex and heavy wiring harness: the use of a so-called hybrid data bus technology, which enables several electronic cabin systems with different safety and security requirements to share the same data bus, and the application of wireless data transfer technologies for electronic cabin systems.

  6. History of nuclear technology development in Japan

    NASA Astrophysics Data System (ADS)

    Yamashita, Kiyonobu

    2015-04-01

    Nuclear technology development in Japan has been carried out based on the Atomic Energy Basic Act, brought into effect in 1955. Nuclear technology development is limited to peaceful purposes and is carried out on the principle of assuring safety. Now, technologies for research reactors, radiation applications, and nuclear power plants are being delivered to developing countries. First of all, safety measures of nuclear power plants (NPPs) will be enhanced based on lessons learned from the TEPCO Fukushima Daiichi NPS accident.

  7. OSMA Research and Technology Strategy Team Summary

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2010-01-01

    This slide presentation reviews the work of the Office of Safety and Mission Assurance (OSMA) and the OSMA Research and Technology Strategy (ORTS) team. There is discussion of the charter of the team, Technology Readiness Levels (TRLs), and how the team's responsibilities relate to these TRLs. In order to improve safety at all levels of development through the TRL phases, improved communication, understanding, and cooperation are required at all levels, particularly for mid-level technology development.

  8. Assessment of Superstructure Ice Protection as Applied to Offshore Oil Operations Safety: Ice Protection Technologies, Safety Enhancements, and Development Needs

    DTIC Science & Technology

    2009-04-01

    companies and Web site owners to use their tables and figures. This report was prepared under the general supervision of Janet Hardy, Chief...through reports about the technologies, sales and engineering literature, Web sites, and patents. Information in some circumstances was available from...the technologies are proprietary, some information sources were limited to Web sites and open literature. 5. TRL: Technology Readiness Level (TRL

  9. History of nuclear technology development in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamashita, Kiyonobu, E-mail: yamashita.kiyonobu@jaea.go.jp; General Advisor Nuclear HRD Centre, Japan Atomic Energy Agency, TOKAI-mura, NAKA-gun, IBARAKI-ken, 319-1195

    2015-04-29

    Nuclear technology development in Japan has been carried out based on the Atomic Energy Basic Act, brought into effect in 1955. Nuclear technology development is limited to peaceful purposes and is carried out on the principle of assuring safety. Now, technologies for research reactors, radiation applications, and nuclear power plants are being delivered to developing countries. First of all, safety measures of nuclear power plants (NPPs) will be enhanced based on lessons learned from the TEPCO Fukushima Daiichi NPS accident.

  10. Safety management of complex research operations

    NASA Technical Reports Server (NTRS)

    Brown, W. J.

    1981-01-01

    Complex research and technology operations present many varied potential hazards which must be addressed in a disciplined independent safety review and approval process. The research and technology effort at the Lewis Research Center is divided into programmatic areas of aeronautics, space and energy. Potential hazards vary from high energy fuels to hydrocarbon fuels, high pressure systems to high voltage systems, toxic chemicals to radioactive materials and high speed rotating machinery to high powered lasers. A Safety Permit System presently covers about 600 potentially hazardous operations. The Safety Management Program described in this paper is believed to be a major factor in maintaining an excellent safety record at the Lewis Research Center.

  11. Back-door cost-benefit analysis under a safety-first Clean Air Act

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, D.W.

    The Clean Air Act emphasizes safety over cost considerations, but a cost-conscious administration which emphasizes economic impacts has not enforced the letter of the safety-first law. A solution could be to budget cost-justified rather than safety-first levels of pollution reduction. A comparison of cost-benefit balancing and budgetary control measures examines administrative procedures and probable outcomes in terms of enforcement costs. The author notes that the two concepts require different technology. The higher cost of safety-first technology tends to discourage investment and could lead to less pollution control than the cost-benefit approach. 59 references, 12 figures. (DCK)

  12. Barcode identification for transfusion safety.

    PubMed

    Murphy, M F; Kay, J D S

    2004-09-01

    Errors related to blood transfusion in hospitals may produce catastrophic consequences. This review addresses potential solutions to prevent patient misidentification, including the use of new technologies such as barcoding. A small number of studies using new technology for the transfusion process in hospitals have shown promising results in preventing errors. The studies demonstrated improved transfusion safety and staff preference for new technology, such as bedside handheld scanners, to carry out pretransfusion bedside checking. They also highlighted the need for considerable efforts in the training of staff in the new procedures before their successful implementation. Improvements in hospital transfusion safety are a top priority for transfusion medicine, and will depend on a combined approach including a better understanding of the causes of errors, a reduction in the complexity of routine procedures taking advantage of new technology, improved staff training, and regular monitoring of practice. The use of new technology to improve the safety of transfusion is very promising. Further development of the systems is needed to enable staff to carry out bedside transfusion procedures quickly and accurately, and to increase their functionality to justify the cost of their wider implementation.

  13. The impact of information technology and organizational focus on the visibility of patient care errors.

    PubMed

    Walston, Stephen L; Mwachofi, Ari; Aldosari, Bakheet; Al-Omar, Badran A; Yousef, Asmaa Al; Sheikh, Asiya

    2010-01-01

    The implementation of information systems and the creation of an open culture, characterized by an emphasis on patient safety and problem solving, are two means suggested to improve health care quality. This study examines the effects of the use of information technology and a focus on patient safety and problem solving on the visibility of patient care errors. A survey of nurses in Saudi Arabia is analyzed by means of factor analysis and multiple regression analysis to examine nurses' use of information technology and culture in controlling errors. Our research suggests that greater use of information technology to control patient care errors may reduce the prevalence of such errors, while an increased focus on patient safety and problem solving facilitates an open environment where errors can be more openly discussed and addressed. The use of technology appears to have a role in decreasing errors. Yet, an organization that focuses on problem solving and patient safety can open lines of communication and create a culture in which errors can be discussed and resolved.

  14. Changing contributions of stochastic and deterministic processes in community assembly over a successional gradient.

    PubMed

    Måren, Inger Elisabeth; Kapfer, Jutta; Aarrestad, Per Arild; Grytnes, John-Arvid; Vandvik, Vigdis

    2018-01-01

    Successional dynamics in plant community assembly may result from both deterministic and stochastic ecological processes. The relative importance of different ecological processes is expected to vary over the successional sequence, between different plant functional groups, and with the disturbance levels and land-use management regimes of the successional systems. We evaluate the relative importance of stochastic and deterministic processes in bryophyte and vascular plant community assembly after fire in grazed and ungrazed anthropogenic coastal heathlands in Northern Europe. A replicated series of post-fire successions (n = 12) were initiated under grazed and ungrazed conditions, and vegetation data were recorded in permanent plots over 13 years. We used redundancy analysis (RDA) to test for deterministic successional patterns in species composition repeated across the replicate successional series and analyses of co-occurrence to evaluate to what extent species respond synchronously along the successional gradient. Change in species co-occurrences over succession indicates stochastic successional dynamics at the species level (i.e., species equivalence), whereas constancy in co-occurrence indicates deterministic dynamics (successional niche differentiation). The RDA shows high and deterministic vascular plant community compositional change, especially early in succession. Co-occurrence analyses indicate stochastic species-level dynamics the first two years, which then give way to more deterministic replacements. Grazed and ungrazed successions are similar, but the early stage stochasticity is higher in ungrazed areas. Bryophyte communities in ungrazed successions resemble vascular plant communities. In contrast, bryophytes in grazed successions showed consistently high stochasticity and low determinism in both community composition and species co-occurrence. In conclusion, stochastic and individualistic species responses early in succession give way to more niche-driven dynamics in later successional stages. Grazing reduces predictability in both successional trends and species-level dynamics, especially in plant functional groups that are not well adapted to disturbance. © 2017 The Authors. Ecology, published by Wiley Periodicals, Inc., on behalf of the Ecological Society of America.

  15. A Deterministic and Random Propagation Study with the Design of an Open Path 320 GHz to 340 GHz Transmissometer

    NASA Astrophysics Data System (ADS)

    Scally, Lawrence J.

    This program was implemented by Lawrence J. Scally for a Ph.D. in the EECE department at the University of Colorado at Boulder, with most funding provided by the U.S. Army. Professor Gasiewski, who has decades of experience with programs of this type, is the advisor and guide for the entire program. The program is developing a transmissometer more advanced than those of previous years, called the Terahertz Atmospheric and Ionospheric Propagation, Absorption and Scattering System (TAIPAS), operating on an open path between the roof of the University of Colorado EE building and a mesa owned by the National Institute of Standards and Technology (NIST); NIST has invested money, location, and support in the program. Besides designing and building the transmissometer, which has never been accomplished at this level, the system also analyzes atmospheric propagation by scanning between 320 GHz and 340 GHz, a range that includes the peak absorption frequency at 325.1529 GHz due to water absorption. The processing and characterization of the deterministic and random propagation characteristics of the atmosphere in the real world has begun; this will be carried out with various aerosols for decades on the permanently mounted system, which is accessible 24/7 via a network over the CU Virtual Private Network (VPN).

  16. Trends in Health Information Technology Safety: From Technology-Induced Errors to Current Approaches for Ensuring Technology Safety

    PubMed Central

    2013-01-01

    Objectives Health information technology (HIT) research findings suggested that new healthcare technologies could reduce some types of medical errors while at the same time introducing new classes of medical errors (i.e., technology-induced errors). Technology-induced errors have their origins in HIT, and/or HIT contributes to their occurrence. The objective of this paper is to review current trends in the published literature on HIT safety. Methods A review and synthesis of the medical and life sciences literature focusing on the area of technology-induced error was conducted. Results There were four main trends in the literature on technology-induced error. The following areas were addressed in the literature: definitions of technology-induced errors; models, frameworks, and evidence for understanding how technology-induced errors occur; a discussion of monitoring; and methods for preventing and learning about technology-induced errors. Conclusions The literature focusing on technology-induced errors continues to grow. Research has focused on defining what an error is, models and frameworks used to understand these new types of errors, monitoring of such errors, and methods that can be used to prevent them. More research will be needed to better understand and mitigate these types of errors. PMID:23882411

  17. 48 CFR 323.7000 - Scope of subpart.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Safety and Health 323.7000 Scope of subpart. This subpart prescribes the use of a safety and... administering safety and health provisions. ...

  18. 48 CFR 323.7000 - Scope of subpart.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Safety and Health 323.7000 Scope of subpart. This subpart prescribes the use of a safety and... administering safety and health provisions. ...

  19. 48 CFR 323.7000 - Scope of subpart.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES, OCCUPATIONAL SAFETY, AND DRUG-FREE WORKPLACE Safety and Health 323.7000 Scope of subpart. This subpart prescribes the use of a safety and... administering safety and health provisions. ...

  20. Onboard Safety Technology Survey Synthesis - Final Report

    DOT National Transportation Integrated Search

    2008-01-01

    The Federal Motor Carrier Safety Administration (FMCSA) funded this project to collect, merge, and conduct an assessment of onboard safety system surveys and resulting data sets that may benefit commercial vehicle operations safety and future researc...

  1. Deterministic multidimensional nonuniform gap sampling.

    PubMed

    Worley, Bradley; Powers, Robert

    2015-12-01

    Born from empirical observations in nonuniformly sampled multidimensional NMR data relating to gaps between sampled points, the Poisson-gap sampling method has enjoyed widespread use in biomolecular NMR. While the majority of nonuniform sampling schemes are fully randomly drawn from probability densities that vary over a Nyquist grid, the Poisson-gap scheme employs constrained random deviates to minimize the gaps between sampled grid points. We describe a deterministic gap sampling method, based on the average behavior of Poisson-gap sampling, which performs comparably to its random counterpart with the additional benefit of completely deterministic behavior. We also introduce a general algorithm for multidimensional nonuniform sampling based on a gap equation, and apply it to yield a deterministic sampling scheme that combines burst-mode sampling features with those of Poisson-gap schemes. Finally, we derive a relationship between stochastic gap equations and the expectation value of their sampling probability densities. Copyright © 2015 Elsevier Inc. All rights reserved.
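
    The flavor of gap-based scheduling can be illustrated with a small sketch that places samples on a Nyquist grid with sinusoidally growing gaps, then tunes the gap-scale parameter by bisection so that approximately the requested number of points fits on the grid. The sine-shaped gap law and the bisection tuning below are illustrative assumptions standing in for the paper's actual gap equation.

        # Sketch of deterministic sine-gap sampling on a Nyquist grid: gaps grow
        # sinusoidally across the grid; the gap scale lam is tuned by bisection.
        # The gap law is an illustrative stand-in for the paper's gap equation.
        import math

        def gap_schedule(grid_size, lam):
            """Place points with gap 1 + lam*sin(pi*x/(2*grid_size))."""
            pts, x = [], 0.0
            while x < grid_size:
                pts.append(int(round(x)))
                x += 1.0 + lam * math.sin(math.pi * x / (2 * grid_size))
            return sorted(set(pts))

        def sample(grid_size, n_samples, tol=1e-6):
            lo, hi = 0.0, float(grid_size)     # bisect on the gap scale lam
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if len(gap_schedule(grid_size, mid)) > n_samples:
                    lo = mid                   # too many points -> larger gaps
                else:
                    hi = mid
            return gap_schedule(grid_size, 0.5 * (lo + hi))

        pts = sample(grid_size=128, n_samples=32)
        print(len(pts), "points:", pts[:10], "...")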

  2. Apparatus for fixing latency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, David R; Bartholomew, David B; Moon, Justin

    2009-09-08

    An apparatus for fixing computational latency within a deterministic region on a network comprises a network interface modem, a high priority module and at least one deterministic peripheral device. The network interface modem is in communication with the network. The high priority module is in communication with the network interface modem. The at least one deterministic peripheral device is connected to the high priority module. The high priority module comprises a packet assembler/disassembler, and hardware for performing at least one operation. Also disclosed is an apparatus for executing at least one instruction on a downhole device within a deterministic region, the apparatus comprising a control device, a downhole network, and a downhole device. The control device is near the surface of a downhole tool string. The downhole network is integrated into the tool string. The downhole device is in communication with the downhole network.

  3. Stochastic Petri Net extension of a yeast cell cycle model.

    PubMed

    Mura, Ivan; Csikász-Nagy, Attila

    2008-10-21

    This paper presents the definition, solution, and validation of a stochastic model of the budding yeast cell cycle, based on Stochastic Petri Nets (SPN). A specific family of SPNs is selected for building a stochastic version of a well-established deterministic model. We describe the procedure followed in defining the SPN model from the deterministic ODE model, a procedure that can be largely automated. The validation of the SPN model is conducted with respect to both the results provided by the deterministic one and the experimental results available in the literature. The SPN model captures the behavior of the wild type budding yeast cells and a variety of mutants. We show that the stochastic model matches some characteristics of budding yeast cells that cannot be found with the deterministic model. The SPN model fine-tunes the simulation results, enriching the breadth and the quality of its outcome.
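
    The translation from a deterministic rate equation to its stochastic counterpart can be shown in miniature: a mass-action ODE term becomes a propensity over integer molecule counts, simulated here with Gillespie's direct method. The two-reaction birth-death system below is an illustrative stand-in for the cell-cycle network, not the paper's SPN, and the rate constants are assumed values.

        # Miniature ODE -> stochastic translation (illustrative, not the yeast
        # model): dX/dt = k_syn - k_deg*X becomes a birth-death process
        # simulated with Gillespie's direct method over integer copy numbers.
        import random

        K_SYN, K_DEG = 5.0, 0.1   # mass-action rate constants (assumed values)

        def gillespie(x0=0, t_end=100.0):
            t, x, trace = 0.0, x0, []
            while t < t_end:
                a = [K_SYN, K_DEG * x]          # propensities
                a0 = sum(a)
                t += random.expovariate(a0)     # exponential waiting time
                x += 1 if random.random() < a[0] / a0 else -1
                trace.append((t, x))
            return trace

        trace = gillespie()
        tail = trace[len(trace) // 2:]          # discard the transient
        mean = sum(x for _, x in tail) / len(tail)
        print("stochastic steady-state mean ~", round(mean, 1),
              "| deterministic fixed point:", K_SYN / K_DEG)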

  4. Effect of sample volume on metastable zone width and induction time

    NASA Astrophysics Data System (ADS)

    Kubota, Noriaki

    2012-04-01

    The metastable zone width (MSZW) and the induction time, measured for a large sample (say, >0.1 L), are reproducible and deterministic, while, for a small sample (say, <1 mL), these values are irreproducible and stochastic. These behaviors of MSZW and induction time were discussed theoretically with both stochastic and deterministic models. Equations for the distribution of stochastic MSZW and induction time were derived. The average values of stochastic MSZW and induction time both decreased with an increase in sample volume, while the deterministic MSZW and induction time remained unchanged. These different behaviors with varying sample volume were explained in terms of the detection sensitivity of crystallization events. The average values of MSZW and induction time in the stochastic model were compared with the deterministic MSZW and induction time, respectively. Literature data reported for aqueous paracetamol solutions were explained theoretically with the presented models.
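
    The volume dependence has a simple form under classical nucleation assumptions: if nuclei form at a stationary rate J per unit volume, the waiting time to the first nucleus in a sample of volume V is exponentially distributed with mean 1/(JV), so the average stochastic induction time shrinks as the sample grows while its relative scatter stays large. The sketch below simply simulates that relationship; the numerical value of J is an arbitrary placeholder.

        # Sketch: stochastic induction time under a stationary nucleation rate J.
        # First-nucleation waiting time in volume V is exponential with mean
        # 1/(J*V): larger samples give shorter, more reproducible induction
        # times. The value of J is an arbitrary placeholder.
        import random, statistics

        J = 2.0e3   # nucleation rate per litre per second (assumed)

        def induction_times(volume_l, n=2000):
            return [random.expovariate(J * volume_l) for _ in range(n)]

        for v in (1e-3, 1e-1, 1.0):           # 1 mL, 100 mL, 1 L samples
            ts = induction_times(v)
            print(f"V={v:6.3f} L  mean={statistics.mean(ts):8.4f} s  "
                  f"CV={statistics.stdev(ts) / statistics.mean(ts):.2f}")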

  5. Fencing network direct memory access data transfers in a parallel active messaging interface of a parallel computer

    DOEpatents

    Blocksome, Michael A.; Mamidala, Amith R.

    2015-07-07

    Fencing direct memory access (`DMA`) data transfers in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI including data communications endpoints, each endpoint including specifications of a client, a context, and a task, the endpoints coupled for data communications through the PAMI and through DMA controllers operatively coupled to a deterministic data communications network through which the DMA controllers deliver data communications deterministically, including initiating execution through the PAMI of an ordered sequence of active DMA instructions for DMA data transfers between two endpoints, effecting deterministic DMA data transfers through a DMA controller and the deterministic data communications network; and executing through the PAMI, with no FENCE accounting for DMA data transfers, an active FENCE instruction, the FENCE instruction completing execution only after completion of all DMA instructions initiated prior to execution of the FENCE instruction for DMA data transfers between the two endpoints.

  6. Fencing network direct memory access data transfers in a parallel active messaging interface of a parallel computer

    DOEpatents

    Blocksome, Michael A.; Mamidala, Amith R.

    2015-07-14

    Fencing direct memory access (`DMA`) data transfers in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI including data communications endpoints, each endpoint including specifications of a client, a context, and a task, the endpoints coupled for data communications through the PAMI and through DMA controllers operatively coupled to a deterministic data communications network through which the DMA controllers deliver data communications deterministically, including initiating execution through the PAMI of an ordered sequence of active DMA instructions for DMA data transfers between two endpoints, effecting deterministic DMA data transfers through a DMA controller and the deterministic data communications network; and executing through the PAMI, with no FENCE accounting for DMA data transfers, an active FENCE instruction, the FENCE instruction completing execution only after completion of all DMA instructions initiated prior to execution of the FENCE instruction for DMA data transfers between the two endpoints.

  7. Airline Safety Improvement Through Experience with Near-Misses: A Cautionary Tale.

    PubMed

    Madsen, Peter; Dillon, Robin L; Tinsley, Catherine H

    2016-05-01

    In recent years, the U.S. commercial airline industry has achieved unprecedented levels of safety, with the statistical risk associated with U.S. commercial aviation falling to 0.003 fatalities per 100 million passengers. But decades of research on organizational learning show that success often breeds complacency and failure inspires improvement. With accidents as rare events, can the airline industry continue safety advancements? This question is complicated by the complex system in which the industry operates, where chance combinations of multiple factors contribute to what are largely probabilistic (rather than deterministic) outcomes. Thus, some apparent successes are realized because of good fortune rather than good processes, and this research intends to bring attention to these events, the near-misses. The processes that create these near-misses could pose a threat if multiple contributing factors combine in adverse ways without the intervention of good fortune. Yet, near-misses (if recognized as such) can, theoretically, offer a mechanism for continuing safety improvements, above and beyond learning gleaned from observable failure. We test whether or not this learning is apparent in the airline industry. Using data from 1990 to 2007, fixed effects Poisson regressions show that airlines learn from accidents (their own and others), and from one category of near-misses: those where the possible dangers are salient. Unfortunately, airlines do not improve following near-miss incidents when the focal event has no clear warnings of significant danger. Therefore, while airlines need to and can learn from certain near-misses, we conclude with recommendations for improving airline learning from all near-misses. © 2015 Society for Risk Analysis.
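
    The estimation strategy, fixed-effects Poisson regressions of accident counts on prior accident and near-miss experience, can be sketched on synthetic data. The variable names, the synthetic data-generating process, and the use of statsmodels' GLM interface are assumptions for illustration, not the paper's 1990-2007 panel or specification.

        # Sketch of a fixed-effects Poisson regression of accident counts on
        # prior accident/near-miss experience, on synthetic data.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n_airlines, n_years = 10, 18
        df = pd.DataFrame({
            "airline": np.repeat(np.arange(n_airlines), n_years),
            "prior_accidents": rng.poisson(1.0, n_airlines * n_years),
            "prior_near_misses": rng.poisson(3.0, n_airlines * n_years),
        })
        # synthetic "learning" effect: more prior experience -> fewer accidents
        lam = np.exp(0.2 - 0.10 * df["prior_accidents"]
                     - 0.05 * df["prior_near_misses"])
        df["accidents"] = rng.poisson(lam)

        fit = smf.glm("accidents ~ prior_accidents + prior_near_misses"
                      " + C(airline)",            # airline fixed effects
                      data=df, family=sm.families.Poisson()).fit()
        print(fit.params[["prior_accidents", "prior_near_misses"]])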

  8. Carbon Monoxide Safety

    MedlinePlus

    ... portable generators? Source: National Institute of Standards and Technology.

  9. 75 FR 48366 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-10

    ...: OMB Desk Officer for the Department of Labor--Mine Safety and Health Administration (MSHA), Office of..., electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses. Agency: Mine Safety and Health Administration...

  10. Factors in Decisions to Make, Purchase, and Use On-board Safety Technologies

    DOT National Transportation Integrated Search

    2005-12-01

    In support of its goal to reduce large truck-related fatalities and crashes, FMCSA plans to facilitate the deployment of Intelligent Vehicle Initiative (IVI) technologies that have shown a potential to improve the safety of commercial vehicle operati...

  11. United States Postal Service Alaska Hovercraft Demonstration Project Technology and Safety Assessment

    DOT National Transportation Integrated Search

    2000-02-01

    This report presents the results of the technology and safety assessment of the Bethel/Kuskokwim River hovercraft service,operated by the Alaska Hovercraft Joint Venture (AHJV). The primary purpose of the service was a two-year demonstration of bypas...

  12. Physically-based landslide assessment for railway infrastructure

    NASA Astrophysics Data System (ADS)

    Heyerdahl, Håkon; Høydal, Øyvind

    2017-04-01

    A new high-speed railway line in Eastern Norway passes through areas with Quaternary soil deposits where the stability of natural slopes poses considerable challenges. The ground typically consists of thick layers of marine clay deposits, overlain by 8-10 m of silt and sand. Both shallow landslides in the top layers of silt and sand and deep-seated failures in clay must be accounted for. In one section of the railway, the potential for performing stabilizing measures is limited due to existing cultural heritage on top of the slope. Hence, the stability of a steep top section of the slope needs to be evaluated. Assessment of the slope stability for rainfall-triggered slides relies on many parameters. An approach based only on empirical relations will not comply with the design criteria, which allow only deterministic safety margins. From a classic geotechnical approach, the slope would also normally be considered unsafe. However, considerable suction is assumed to exist in the silty and sandy deposits above the groundwater level, which will improve the stability. The stabilizing effect, however, is highly dependent on rainfall, infiltration, and soil moisture, and thereby varies continuously. An unsaturated geomechanical approach was taken to assess the slope stability. Soil moisture sensors were installed to monitor changes of in situ water content in the vadose zone. Retention curves for silt/sand specimens were measured by pressure plate tests. Some triaxial tests were performed to check the effect of suction on soil shear strength (conducted as drained constant-water-content tests on compacted specimens). Based on the performed laboratory tests, the unsaturated response of the slope will be modelled numerically and compared with measured in situ soil moisture. Work is still ongoing. Initial conditions after dry and wet periods, respectively, need to be coupled with selected rainfall intensities and durations to see the effect on slope stability. The aim of the work is to provide the client with the probability of a landslide in the slope, based on expected critical rainfall. A strictly deterministic criterion for a minimum safety margin may need to be replaced by scenarios for the probability and geometry of potential failures for given return periods and rainfall events.
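
    For the top silt/sand section, the stabilizing role of suction can be illustrated with the extended Mohr-Coulomb infinite-slope expression, in which matric suction adds a term (u_a - u_w) tan(phi_b) to the shear strength. The sketch below evaluates the factor of safety as suction decays during infiltration; all parameter values are invented placeholders, not the site's measured properties.

        # Infinite-slope factor of safety with an unsaturated suction term
        # (extended Mohr-Coulomb). All parameter values are invented placeholders.
        import math

        def factor_of_safety(suction_kpa,
                             c_eff=2.0,     # effective cohesion, kPa
                             phi_eff=33.0,  # effective friction angle, deg
                             phi_b=15.0,    # suction friction angle, deg
                             gamma=18.0,    # unit weight, kN/m^3
                             depth=2.0,     # slip-surface depth, m
                             beta=40.0):    # slope angle, deg
            b, p, pb = map(math.radians, (beta, phi_eff, phi_b))
            normal = gamma * depth * math.cos(b) ** 2      # normal stress, kPa
            shear = gamma * depth * math.sin(b) * math.cos(b)
            strength = (c_eff + normal * math.tan(p)
                        + suction_kpa * math.tan(pb))      # suction term
            return strength / shear

        for s in (20.0, 10.0, 5.0, 0.0):   # suction decaying during infiltration
            print(f"suction {s:4.1f} kPa -> FS = {factor_of_safety(s):.2f}")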

  13. Mathematical modeling and pharmaceutical pricing: analyses used to inform in-licensing and developmental go/No-Go decisions.

    PubMed

    Vernon, John A; Hughen, W Keener; Johnson, Scott J

    2005-05-01

    In the face of significant real healthcare cost inflation, pressured budgets, and ongoing launches of myriad technologies of uncertain value, payers have formalized new valuation techniques that represent a barrier to entry for drugs. Cost-effectiveness analysis predominates among these methods; it involves differencing a new technological intervention's marginal costs and benefits with a comparator's, and comparing the resulting ratio to a payer's willingness-to-pay threshold. In this paper we describe how firms are able to model the feasible range of future product prices when making in-licensing and developmental Go/No-Go decisions by considering payers' use of the cost-effectiveness method. We illustrate this analytic method with a simple deterministic example and then incorporate stochastic assumptions using both analytic and simulation methods.
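
    The threshold-pricing logic is simple arithmetic: given a willingness-to-pay threshold per QALY, the incremental effectiveness, and the incremental non-drug costs, the maximum reimbursable price is the value at which the incremental cost-effectiveness ratio exactly equals the threshold. The sketch below solves this deterministically and then propagates uncertainty in the effect difference by Monte Carlo; all the numbers are invented placeholders, not the paper's example.

        # Threshold pricing from cost-effectiveness: the maximum price at which
        # ICER = (dCost_other + price) / dQALY hits the payer threshold.
        # All values are invented placeholders.
        import random, statistics

        LAMBDA = 50_000.0       # willingness-to-pay, $ per QALY
        D_QALY = 0.30           # incremental effectiveness vs comparator, QALYs
        D_COST_OTHER = 4_000.0  # incremental non-drug costs vs comparator, $

        # Deterministic: price at which the ICER exactly equals the threshold
        p_max = LAMBDA * D_QALY - D_COST_OTHER
        print(f"deterministic maximum price: ${p_max:,.0f}")

        # Stochastic: propagate uncertainty in the effectiveness difference
        draws = [LAMBDA * random.gauss(D_QALY, 0.08) - D_COST_OTHER
                 for _ in range(20_000)]
        draws.sort()
        print(f"median ${statistics.median(draws):,.0f}; "
              f"5th percentile ${draws[int(0.05 * len(draws))]:,.0f}")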

  14. High-performance semiconductor quantum-dot single-photon sources

    NASA Astrophysics Data System (ADS)

    Senellart, Pascale; Solomon, Glenn; White, Andrew

    2017-11-01

    Single photons are a fundamental element of most quantum optical technologies. The ideal single-photon source is an on-demand, deterministic, single-photon source delivering light pulses in a well-defined polarization and spatiotemporal mode, and containing exactly one photon. In addition, for many applications, there is a quantum advantage if the single photons are indistinguishable in all their degrees of freedom. Single-photon sources based on parametric down-conversion are currently used, and while excellent in many ways, scaling to large quantum optical systems remains challenging. In 2000, semiconductor quantum dots were shown to emit single photons, opening a path towards integrated single-photon sources. Here, we review the progress achieved in the past few years, and discuss remaining challenges. The latest quantum dot-based single-photon sources are edging closer to the ideal single-photon source, and have opened new possibilities for quantum technologies.

  15. 48 CFR 923.7001 - Nuclear safety.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Nuclear safety. 923.7001... Efficiency, Renewable Energy Technologies, and Occupational Safety Programs 923.7001 Nuclear safety. The DOE regulates the nuclear safety of its major facilities under its own statutory authority derived from the...

  16. Realistic Simulation for Body Area and Body-To-Body Networks

    PubMed Central

    Alam, Muhammad Mahtab; Ben Hamida, Elyes; Ben Arbia, Dhafer; Maman, Mickael; Mani, Francesco; Denis, Benoit; D’Errico, Raffaele

    2016-01-01

    In this paper, we present an accurate and realistic simulation for body area networks (BAN) and body-to-body networks (BBN) using deterministic and semi-deterministic approaches. First, in the semi-deterministic approach, a real-time measurement campaign is performed and then characterized through statistical analysis. It is able to generate link-correlated and time-varying realistic traces (i.e., with consistent mobility patterns) for on-body and body-to-body shadowing and fading, including body orientations and rotations, by means of stochastic channel models. The fully deterministic approach is particularly targeted at enhancing the IEEE 802.15.6 proposed channel models by introducing space and time variations (i.e., dynamic distances) through biomechanical modeling. In addition, it helps to accurately model the radio link by identifying the link types and corresponding path loss factors for line of sight (LOS) and non-line of sight (NLOS). This approach is particularly important for links that vary over time due to mobility. The communication and protocol stack, including the physical (PHY), medium access control (MAC) and networking models, is also developed for BAN and BBN, and an IEEE 802.15.6-compliant implementation is provided as a benchmark for future research by the community. Finally, the two approaches are compared in terms of successful packet delivery ratio, packet delay and energy efficiency. The results show that the semi-deterministic approach is the best option overall; however, where diverse mobility patterns and scenarios must be covered, biomechanical modeling and the deterministic approach are the better choices. PMID:27104537
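
    As a flavor of the radio-link modeling referred to above, the following sketch implements a generic log-distance path loss model with separate exponents and shadowing for LOS and NLOS links; the exponents and shadowing variances are placeholder values, not the calibrated on-body and body-to-body factors from the measurement campaign.

```python
import numpy as np

rng = np.random.default_rng(42)

def path_loss_db(d, los, d0=1.0, pl0=40.0):
    """Log-distance path loss with log-normal shadowing.
    Placeholder parameters; the paper calibrates per-link factors."""
    n, sigma = (2.0, 3.0) if los else (3.5, 6.0)  # exponent, shadowing std (dB)
    shadowing = rng.normal(0.0, sigma, size=np.shape(d))
    return pl0 + 10.0 * n * np.log10(np.asarray(d) / d0) + shadowing

distances = np.array([0.5, 1.0, 2.0, 4.0])   # meters (e.g., dynamic BBN links)
print("LOS :", np.round(path_loss_db(distances, los=True), 1))
print("NLOS:", np.round(path_loss_db(distances, los=False), 1))
```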

  17. Realistic Simulation for Body Area and Body-To-Body Networks.

    PubMed

    Alam, Muhammad Mahtab; Ben Hamida, Elyes; Ben Arbia, Dhafer; Maman, Mickael; Mani, Francesco; Denis, Benoit; D'Errico, Raffaele

    2016-04-20

    In this paper, we present an accurate and realistic simulation for body area networks (BAN) and body-to-body networks (BBN) using deterministic and semi-deterministic approaches. First, in the semi-deterministic approach, a real-time measurement campaign is performed and then characterized through statistical analysis. It is able to generate link-correlated and time-varying realistic traces (i.e., with consistent mobility patterns) for on-body and body-to-body shadowing and fading, including body orientations and rotations, by means of stochastic channel models. The fully deterministic approach is particularly targeted at enhancing the IEEE 802.15.6 proposed channel models by introducing space and time variations (i.e., dynamic distances) through biomechanical modeling. In addition, it helps to accurately model the radio link by identifying the link types and corresponding path loss factors for line of sight (LOS) and non-line of sight (NLOS). This approach is particularly important for links that vary over time due to mobility. The communication and protocol stack, including the physical (PHY), medium access control (MAC) and networking models, is also developed for BAN and BBN, and an IEEE 802.15.6-compliant implementation is provided as a benchmark for future research by the community. Finally, the two approaches are compared in terms of successful packet delivery ratio, packet delay and energy efficiency. The results show that the semi-deterministic approach is the best option overall; however, where diverse mobility patterns and scenarios must be covered, biomechanical modeling and the deterministic approach are the better choices.

  18. Ceramic Electrolyte Membrane Technology: Enabling Revolutionary Electrochemical Energy Storage

    DTIC Science & Technology

    2015-10-05

    Advancing ceramic electrolyte technology for use in solid-state Li-ion batteries and high specific energy Li-S and Li-air batteries. Solid-state Li-ion batteries could significantly improve safety and eliminate the need for complex...

  19. An evaluation of safety culture initiatives at BNSF Railway

    DOT National Transportation Integrated Search

    2015-04-01

    Major safety culture (SC) initiatives initiated in the FRA Office of Research, Technology and Development (RT&D), such as Clear Signal for Action (CSA), the Investigation of Safety Related Occurrences Protocol (ISROP), the Participative Safety Rules ...

  20. Optimal Joint Remote State Preparation of Arbitrary Equatorial Multi-qudit States

    NASA Astrophysics Data System (ADS)

    Cai, Tao; Jiang, Min

    2017-03-01

    As an important communication technology, quantum information transmission will play a key role in future network communication. It involves two kinds of transmission: quantum teleportation and remote state preparation. In this paper, we put forward a new scheme for optimal joint remote state preparation (JRSP) of an arbitrary equatorial two-qudit state with hybrid dimensions. The receiver can reconstruct the target state deterministically, with 100% success probability, via two spatially separated senders. Using the same strategy, the scheme extends to joint remote preparation of arbitrary equatorial multi-qudit states with hybrid dimensions.

  1. Chemically intuited, large-scale screening of MOFs by machine learning techniques

    NASA Astrophysics Data System (ADS)

    Borboudakis, Giorgos; Stergiannakos, Taxiarchis; Frysali, Maria; Klontzas, Emmanuel; Tsamardinos, Ioannis; Froudakis, George E.

    2017-10-01

    A novel computational methodology for large-scale screening of MOFs is applied to gas storage with the use of machine learning technologies. This approach is a promising trade-off between the accuracy of ab initio methods and the speed of classical approaches, strategically combined with chemical intuition. The results demonstrate that the chemical properties of MOFs are indeed predictable (stochastically, not deterministically) using machine learning methods and automated analysis protocols, with the accuracy of predictions increasing with sample size. Our initial results indicate that this methodology is promising not only for gas storage in MOFs but also for many other materials science projects.
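
    A minimal sketch of the screening idea, with synthetic stand-ins for the MOF descriptors and for the ab initio target quantity: fit a fast regressor on a labeled subset and use it to rank the remaining structures for expensive follow-up evaluation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Placeholder descriptors (e.g., pore diameter, surface area, void fraction);
# the synthetic target stands in for an ab initio gas-uptake calculation.
n_mofs = 5_000
X = rng.uniform(0.0, 1.0, size=(n_mofs, 3))
y = 5 * X[:, 0] + 3 * X[:, 1] ** 2 + rng.normal(0, 0.3, n_mofs)  # toy uptake

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.8, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on unseen MOFs: {model.score(X_te, y_te):.3f}")

# Screening step: rank the unlabeled structures and keep the top candidates
# for expensive (e.g., ab initio) follow-up.
top = np.argsort(model.predict(X_te))[::-1][:10]
print("indices of top-10 predicted performers:", top)
```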

  2. Feed-forward control of a solid oxide fuel cell system with anode offgas recycle

    NASA Astrophysics Data System (ADS)

    Carré, Maxime; Brandenburger, Ralf; Friede, Wolfgang; Lapicque, François; Limbeck, Uwe; da Silva, Pedro

    2015-05-01

    In this work a combined heat and power unit (CHP unit) based on the solid oxide fuel cell (SOFC) technology is analysed. This unit has a special feature: the anode offgas is partially recycled to the anode inlet. Thus it is possible to increase the electrical efficiency and the system can be operated without external water feeding. A feed-forward control concept which allows secure operating conditions of the CHP unit as well as a maximization of its electrical efficiency is introduced and validated experimentally. The control algorithm requires a limited number of measurement values and few deterministic relations for its description.
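
    The paper's specific deterministic relations are not reproduced here; the sketch below only illustrates the general shape of such a feed-forward law, using standard fuel cell electrochemistry (Faraday's law) with assumed utilization and stoichiometry values.

```python
F = 96_485.0  # C/mol, Faraday constant

def feedforward_setpoints(p_el_w, v_cell=0.75, n_cells=60,
                          fuel_util=0.80, air_stoich=2.0):
    """Map an electrical power setpoint to fuel and air flow setpoints.
    Illustrative feed-forward relations; the CHP unit's actual control
    law and the effect of anode offgas recycle are not reproduced here."""
    i_stack = p_el_w / (n_cells * v_cell)         # stack current, A
    h2_consumed = n_cells * i_stack / (2.0 * F)   # mol/s (Faraday's law)
    fuel_feed = h2_consumed / fuel_util           # fresh fuel, mol/s
    o2_needed = n_cells * i_stack / (4.0 * F)     # mol/s
    air_feed = air_stoich * o2_needed / 0.21      # mol/s of air
    return fuel_feed, air_feed

fuel, air = feedforward_setpoints(1_500.0)
print(f"fuel feed: {fuel*1e3:.2f} mmol/s, air feed: {air*1e3:.1f} mmol/s")
```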

  3. The Solar System Large Planets influence on a new Maunder Minimum

    NASA Astrophysics Data System (ADS)

    Yndestad, Harald; Solheim, Jan-Erik

    2016-04-01

    In the 1890s, G. Spörer and E. W. Maunder reported that solar activity stopped for a period of 70 years, from 1645 to 1715. A later reconstruction of solar activity confirmed the grand minima of Maunder (1640-1720), Spörer (1390-1550), and Wolf (1270-1340), and the minima of Oort (1010-1070) and Dalton (1785-1810) since the year 1000 A.D. (Usoskin et al. 2007). These minimum periods have been associated with less irradiation from the Sun and cold climate periods on Earth. The identification of three Maunder-type grand minima and two Dalton-type minima within a thousand-year period indicates that sooner or later a new Maunder- or Dalton-type period will bring a colder climate on Earth. The causes of these minimum periods are not well understood. Anticipating a new Maunder-type period depends on the properties of solar variability: if the solar variability has a deterministic element, we can better estimate a new Maunder grand minimum, whereas purely random solar variability can only explain the past. This investigation is based on the simple idea that if the solar variability has a deterministic property, it must have a deterministic source as a first cause; if this deterministic source is known, we can compute better estimates of the next expected Maunder grand minimum period. The study is based on a TSI ACRIM data series from 1700, a TSI ACRIM data series from 1000 A.D., a sunspot data series from 1611, and a solar barycenter orbit data series from 1000. The analysis method is based on wavelet spectrum analysis, to identify stationary periods, coincidence periods, and their phase relations. The results show that the TSI variability and the sunspot variability have deterministic oscillations, controlled by the large planets Jupiter, Uranus, and Neptune as the first cause. A deterministic model of TSI variability and sunspot variability confirms the known minimum and grand minimum periods since 1000. From this deterministic model we may expect a new Maunder-type sunspot minimum period from about 2018 to 2055. The deterministic model of the TSI ACRIM data series from 1700 computes a new Maunder-type grand minimum period from 2015 to 2071, and a model of the longer TSI ACRIM data series from 1000 computes a new Dalton-to-Maunder-type minimum irradiation period from 2047 to 2068.
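
    As a self-contained illustration of the wavelet spectrum analysis mentioned above, the following sketch computes Morlet wavelet power for a synthetic series containing solar-like 11-year and 84-year oscillations; it is not the authors' processing chain for the TSI and sunspot records.

```python
import numpy as np

def morlet_power(signal, dt, periods, w0=6.0):
    """Continuous wavelet power via direct convolution with Morlet wavelets.
    Returns an array of shape (len(periods), len(signal))."""
    power = np.empty((len(periods), len(signal)))
    for i, p in enumerate(periods):
        s = p * (w0 + np.sqrt(2 + w0**2)) / (4 * np.pi)  # scale for period p
        tau = np.arange(-4 * s, 4 * s + dt, dt)
        wavelet = (np.pi ** -0.25) * np.exp(1j * w0 * tau / s) \
                  * np.exp(-0.5 * (tau / s) ** 2) / np.sqrt(s)
        # Cross-correlation of the signal with the (conjugated) wavelet.
        coef = np.convolve(signal, np.conj(wavelet)[::-1], mode="same") * dt
        power[i] = np.abs(coef) ** 2
    return power

# Synthetic series with ~11-yr and ~84-yr oscillations (sunspot-like periods).
dt, years = 1.0, np.arange(0, 1000)
x = np.sin(2 * np.pi * years / 11.0) + 0.5 * np.sin(2 * np.pi * years / 84.0)
periods = np.array([8.0, 11.0, 22.0, 84.0])
mean_power = morlet_power(x, dt, periods).mean(axis=1)
for p, mp in zip(periods, mean_power):
    print(f"period {p:5.1f} yr : mean wavelet power {mp:.3f}")
```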

  4. An analysis of electronic health record-related patient safety concerns

    PubMed Central

    Meeks, Derek W; Smith, Michael W; Taylor, Lesley; Sittig, Dean F; Scott, Jean M; Singh, Hardeep

    2014-01-01

    Objective A recent Institute of Medicine report called for attention to safety issues related to electronic health records (EHRs). We analyzed EHR-related safety concerns reported within a large, integrated healthcare system. Methods The Informatics Patient Safety Office of the Veterans Health Administration (VA) maintains a non-punitive, voluntary reporting system to collect and investigate EHR-related safety concerns (ie, adverse events, potential events, and near misses). We analyzed completed investigations using an eight-dimension sociotechnical conceptual model that accounted for both technical and non-technical dimensions of safety. Using the framework analysis approach to qualitative data, we identified emergent and recurring safety concerns common to multiple reports. Results We extracted 100 consecutive, unique, closed investigations between August 2009 and May 2013 from 344 reported incidents. Seventy-four involved unsafe technology and 25 involved unsafe use of technology. A majority (70%) involved two or more model dimensions. Most often, non-technical dimensions such as workflow, policies, and personnel interacted in a complex fashion with technical dimensions such as software/hardware, content, and user interface to produce safety concerns. Most (94%) safety concerns related to either unmet data-display needs in the EHR (ie, displayed information available to the end user failed to reduce uncertainty or led to increased potential for patient harm), software upgrades or modifications, data transmission between components of the EHR, or ‘hidden dependencies’ within the EHR. Discussion EHR-related safety concerns involving both unsafe technology and unsafe use of technology persist long after ‘go-live’ and despite the sophisticated EHR infrastructure represented in our data source. Currently, few healthcare institutions have reporting and analysis capabilities similar to the VA. Conclusions Because EHR-related safety concerns have complex sociotechnical origins, institutions with long-standing as well as recent EHR implementations should build a robust infrastructure to monitor and learn from them. PMID:24951796

  5. Food Safety Practices in the Egg Products Industry.

    PubMed

    Viator, Catherine L; Cates, Sheryl C; Karns, Shawn A; Muth, Mary K; Noyes, Gary

    2016-07-01

    We conducted a national census survey of egg product plants (n = 57) to obtain information on the technological and food safety practices of the egg products industry and to assess changes in these practices from 2004 to 2014. The questionnaire asked about operational and sanitation practices, microbiological testing practices, food safety training for employees, other food safety issues, and plant characteristics. The findings suggest that improvements were made in the industry's use of food safety technologies and practices between 2004 and 2014. The percentage of plants using advanced pasteurization technology and an integrated, computerized processing system increased by almost 30 percentage points. Over 90% of plants voluntarily use a written hazard analysis and critical control point (HACCP) plan to address food safety for at least one production step. Further, 90% of plants have management employees who are trained in a written HACCP plan. Most plants (93%) conduct voluntary microbiological testing. The percentage of plants conducting this testing on egg products before pasteurization has increased by almost 30 percentage points since 2004. The survey findings identify strengths and weaknesses in egg product plants' food safety practices and can be used to guide regulatory policymaking and to conduct required regulatory impact analysis of potential regulations.

  6. 75 FR 53345 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-31

    ... the Department of Labor--Mine Safety and Health Administration (MSHA), Office of Management and Budget..., mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses. Agency: Mine Safety and Health Administration. Type of Review...

  7. 78 FR 12065 - National Institute for Occupational Safety and Health Personal Protective Technology for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-21

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention National Institute for Occupational Safety and Health Personal Protective Technology for Pesticide Handlers... for Disease Control and Prevention (CDC), Department of Health and Human Services (HHS). ACTION...

  8. Nonthermal processing technologies as food safety intervention processes

    USDA-ARS?s Scientific Manuscript database

    Foods should provide sensorial satisfaction and nutrition to people. Yet, foodborne pathogens cause significant illness and loss of life to humankind every year. A processing intervention step may be necessary prior to consumption to ensure the safety of foods. Nonthermal processing technologi...

  9. 2009 Human Factors and Roadway Safety Workshop : Teen Driver Safety [SD .WMV (720x480/29fps/177.0 MB)

    DOT National Transportation Integrated Search

    2009-11-05

    Iowa Department of Transportation Research and Technology Bureau video presentation from the 2009 human factors and roadway safety workshop session titled: Teen Driver Safety : Keynote Speaker Dan McGehee, director, Human Factors & Vehicle Safety Res...

  10. Application of the Digital Image Technology in the Visual Monitoring and Prediction of Shuttering Construction Safety

    NASA Astrophysics Data System (ADS)

    Ummin, Okumura; Tian, Han; Zhu, Haiyu; Liu, Fuqiang

    2018-03-01

    Construction safety has always been the first priority in the construction process. A common safety problem is the instability of the template (formwork) support. In order to solve this problem, a digital image measurement technology has been developed to support a real-time monitoring system that is triggered if the deformation value exceeds the specified range. Economic losses can thus be reduced to the lowest level.

  11. NASA Aviation Safety Program Systems Analysis/Program Assessment Metrics Review

    NASA Technical Reports Server (NTRS)

    Louis, Garrick E.; Anderson, Katherine; Ahmad, Tisan; Bouabid, Ali; Siriwardana, Maya; Guilbaud, Patrick

    2003-01-01

    The goal of this project is to evaluate the metrics and processes used by NASA's Aviation Safety Program in assessing technologies that contribute to NASA's aviation safety goals. There were three objectives for reaching this goal. First, NASA's main objectives for aviation safety were documented and their consistency was checked against the main objectives of the Aviation Safety Program. Next, the metrics used for technology investment by the Program Assessment function of AvSP were evaluated. Finally, other metrics that could be used by the Program Assessment Team (PAT) were identified and evaluated. This investigation revealed that the objectives are in fact consistent across organizational levels at NASA and with the FAA. Some of the major issues discussed in this study, which should be investigated further, are the removal of the cost and return-on-investment metrics, the lack of metrics to measure the balance of investment and technology, the interdependencies between some of the metric risk-driver categories, and the conflict between 'fatal accident rate' and 'accident rate' in the language of the aviation safety goal as stated in different sources.

  12. Aviation Trends Related to Atmospheric Environment Safety Technologies Project Technical Challenges

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Barr, Lawrence C.; Evans, Joni K.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    Current and future aviation safety trends related to the National Aeronautics and Space Administration's Atmospheric Environment Safety Technologies Project's three technical challenges (engine icing characterization and simulation capability; airframe icing simulation and engineering tool capability; and atmospheric hazard sensing and mitigation technology capability) were assessed by examining the National Transportation Safety Board (NTSB) accident database (1989 to 2008), incidents from the Federal Aviation Administration (FAA) accident/incident database (1989 to 2006), and literature from various industry and government sources. The accident and incident data were examined for events involving fixed-wing airplanes operating under Federal Aviation Regulation (FAR) Parts 121, 135, and 91 for atmospheric conditions related to airframe icing, ice-crystal engine icing, turbulence, clear air turbulence, wake vortex, lightning, and low visibility (fog, low ceiling, clouds, precipitation, and low lighting). Five future aviation safety risk areas associated with the three AEST technical challenges were identified after an exhaustive survey of a variety of sources and include: approach and landing accident reduction, icing/ice detection, loss of control in flight, super density operations, and runway safety.

  13. Stability analysis of a deterministic dose calculation for MRI-guided radiotherapy.

    PubMed

    Zelyak, O; Fallone, B G; St-Aubin, J

    2017-12-14

    Modern effort in radiotherapy to address the challenges of tumor localization and motion has led to the development of MRI guided radiotherapy technologies. Accurate dose calculations must properly account for the effects of the MRI magnetic fields. Previous work has investigated the accuracy of a deterministic linear Boltzmann transport equation (LBTE) solver that includes magnetic field, but not the stability of the iterative solution method. In this work, we perform a stability analysis of this deterministic algorithm including an investigation of the convergence rate dependencies on the magnetic field, material density, energy, and anisotropy expansion. The iterative convergence rate of the continuous and discretized LBTE including magnetic fields is determined by analyzing the spectral radius using Fourier analysis for the stationary source iteration (SI) scheme. The spectral radius is calculated when the magnetic field is included (1) as a part of the iteration source, and (2) inside the streaming-collision operator. The non-stationary Krylov subspace solver GMRES is also investigated as a potential method to accelerate the iterative convergence, and an angular parallel computing methodology is investigated as a method to enhance the efficiency of the calculation. SI is found to be unstable when the magnetic field is part of the iteration source, but unconditionally stable when the magnetic field is included in the streaming-collision operator. The discretized LBTE with magnetic fields using a space-angle upwind stabilized discontinuous finite element method (DFEM) was also found to be unconditionally stable, but the spectral radius rapidly reaches unity for very low-density media and increasing magnetic field strengths indicating arbitrarily slow convergence rates. However, GMRES is shown to significantly accelerate the DFEM convergence rate showing only a weak dependence on the magnetic field. In addition, the use of an angular parallel computing strategy is shown to potentially increase the efficiency of the dose calculation.
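
    A toy numerical illustration of the convergence behavior discussed above (not the LBTE discretization itself): the spectral radius of the iteration operator governs how slowly plain source iteration converges, and GMRES applied to the equivalent linear system accelerates it.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(0)
n = 200

# Toy fixed-point problem x = M x + b standing in for source iteration (SI);
# the spectral radius rho(M) of the iteration operator sets the convergence rate.
M = rng.standard_normal((n, n))
M *= 0.95 / np.max(np.abs(np.linalg.eigvals(M)))   # force rho(M) = 0.95
b = rng.standard_normal(n)
print(f"spectral radius: {np.max(np.abs(np.linalg.eigvals(M))):.3f}")

# Plain SI: the error contracts by ~rho per sweep, so rho -> 1 means stagnation.
x, sweeps = np.zeros(n), 0
while sweeps < 2000:
    x_new = M @ x + b
    sweeps += 1
    if np.linalg.norm(x_new - x) < 1e-8 * np.linalg.norm(x_new):
        break
    x = x_new
print(f"SI sweeps to converge: {sweeps}")

# GMRES on the equivalent system (I - M) x = b (default tolerance).
A = LinearOperator((n, n), matvec=lambda v: v - M @ v)
residuals = []
x_g, info = gmres(A, b, callback=lambda r: residuals.append(r))
print(f"GMRES iterations: {len(residuals)} (info = {info})")
```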

  14. Stability analysis of a deterministic dose calculation for MRI-guided radiotherapy

    NASA Astrophysics Data System (ADS)

    Zelyak, O.; Fallone, B. G.; St-Aubin, J.

    2018-01-01

    Modern effort in radiotherapy to address the challenges of tumor localization and motion has led to the development of MRI guided radiotherapy technologies. Accurate dose calculations must properly account for the effects of the MRI magnetic fields. Previous work has investigated the accuracy of a deterministic linear Boltzmann transport equation (LBTE) solver that includes magnetic field, but not the stability of the iterative solution method. In this work, we perform a stability analysis of this deterministic algorithm including an investigation of the convergence rate dependencies on the magnetic field, material density, energy, and anisotropy expansion. The iterative convergence rate of the continuous and discretized LBTE including magnetic fields is determined by analyzing the spectral radius using Fourier analysis for the stationary source iteration (SI) scheme. The spectral radius is calculated when the magnetic field is included (1) as a part of the iteration source, and (2) inside the streaming-collision operator. The non-stationary Krylov subspace solver GMRES is also investigated as a potential method to accelerate the iterative convergence, and an angular parallel computing methodology is investigated as a method to enhance the efficiency of the calculation. SI is found to be unstable when the magnetic field is part of the iteration source, but unconditionally stable when the magnetic field is included in the streaming-collision operator. The discretized LBTE with magnetic fields using a space-angle upwind stabilized discontinuous finite element method (DFEM) was also found to be unconditionally stable, but the spectral radius rapidly reaches unity for very low-density media and increasing magnetic field strengths indicating arbitrarily slow convergence rates. However, GMRES is shown to significantly accelerate the DFEM convergence rate showing only a weak dependence on the magnetic field. In addition, the use of an angular parallel computing strategy is shown to potentially increase the efficiency of the dose calculation.

  15. Corrigendum to "Stability analysis of a deterministic dose calculation for MRI-guided radiotherapy".

    PubMed

    Zelyak, Oleksandr; Fallone, B Gino; St-Aubin, Joel

    2018-03-12

    Modern effort in radiotherapy to address the challenges of tumor localization and motion has led to the development of MRI guided radiotherapy technologies. Accurate dose calculations must properly account for the effects of the MRI magnetic fields. Previous work has investigated the accuracy of a deterministic linear Boltzmann transport equation (LBTE) solver that includes magnetic field, but not the stability of the iterative solution method. In this work, we perform a stability analysis of this deterministic algorithm including an investigation of the convergence rate dependencies on the magnetic field, material density, energy, and anisotropy expansion. The iterative convergence rate of the continuous and discretized LBTE including magnetic fields is determined by analyzing the spectral radius using Fourier analysis for the stationary source iteration (SI) scheme. The spectral radius is calculated when the magnetic field is included (1) as a part of the iteration source, and (2) inside the streaming-collision operator. The non-stationary Krylov subspace solver GMRES is also investigated as a potential method to accelerate the iterative convergence, and an angular parallel computing methodology is investigated as a method to enhance the efficiency of the calculation. SI is found to be unstable when the magnetic field is part of the iteration source, but unconditionally stable when the magnetic field is included in the streaming-collision operator. The discretized LBTE with magnetic fields using a space-angle upwind stabilized discontinuous finite element method (DFEM) was also found to be unconditionally stable, but the spectral radius rapidly reaches unity for very low density media and increasing magnetic field strengths indicating arbitrarily slow convergence rates. However, GMRES is shown to significantly accelerate the DFEM convergence rate showing only a weak dependence on the magnetic field. In addition, the use of an angular parallel computing strategy is shown to potentially increase the efficiency of the dose calculation. © 2018 Institute of Physics and Engineering in Medicine.

  16. Deterministic compressive sampling for high-quality image reconstruction of ultrasound tomography.

    PubMed

    Huy, Tran Quang; Tue, Huynh Huu; Long, Ton That; Duc-Tan, Tran

    2017-05-25

    A well-known diagnostic imaging modality, termed ultrasound tomography, was quickly developed for the detection of very small tumors whose sizes are smaller than the wavelength of the incident pressure wave, without the ionizing radiation of the current gold-standard X-ray mammography. Based on the inverse scattering technique, ultrasound tomography uses material properties such as sound contrast or attenuation to detect small targets. The Distorted Born Iterative Method (DBIM), based on the first-order Born approximation, is an efficient diffraction tomography approach. One of the challenges for a high-quality reconstruction is to obtain many measurements from the number of transmitters and receivers. Given the fact that biomedical images are often sparse, the compressed sensing (CS) technique can therefore be effectively applied to ultrasound tomography by reducing the number of transmitters and receivers while maintaining a high quality of image reconstruction. Several existing works on CS place the measurement system at randomly distributed locations. However, this random configuration is relatively difficult to implement in practice. Instead, we adopt a methodology that determines the locations of the measurement devices in a deterministic way. For this, we develop the novel DCS-DBIM algorithm, which is highly applicable in practice and is inspired by the deterministic compressed sensing (DCS) technique introduced by the authors a few years ago, with the image reconstruction process implemented using l1 regularization. Simulation results demonstrate the high performance of the proposed approach: the normalized error is approximately 90% lower than with the conventional approach, while only half the number of measurements and two iterations are needed. The universal image quality index is also evaluated to confirm the efficiency of the proposed approach. Numerical simulation results indicate that the CS and DCS techniques offer equivalent image reconstruction quality, with the DCS configuration being simpler to implement in practice. It would be a very promising approach in practical applications of modern biomedical imaging technology.
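
    The DCS-DBIM pipeline itself is not reproduced here; the sketch below shows only the generic l1-regularized recovery step (iterative soft thresholding, ISTA) on which such CS reconstructions rely, with a random sensing matrix standing in for the deterministic measurement configuration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Sparse ground truth and an (illustrative) underdetermined sensing matrix.
n, m, k = 400, 120, 10
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true + 0.01 * rng.standard_normal(m)

# ISTA for  min_x 0.5*||Ax - y||^2 + lam*||x||_1
lam = 0.02
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(300):
    g = A.T @ (A @ x - y)              # gradient of the data-fit term
    z = x - g / L
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```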

  17. Climate is changing, everything is flowing, stationarity is immortal

    NASA Astrophysics Data System (ADS)

    Koutsoyiannis, Demetris; Montanari, Alberto

    2015-04-01

    There is no doubt that climate is changing -- and ever has been. The environment is also changing, and in the last decades, as a result of demographic change and technological advancement, environmental change has been accelerating. These changes also affect hydrological processes, whose evolution in connection with rapidly changing human systems has been the focus of the new scientific decade 2013-2022 of the International Association of Hydrological Sciences, entitled "Panta Rhei - Everything Flows". In view of the changing systems, it has recently been suggested that, when dealing with water management and hydrological extremes, stationarity is no longer a proper assumption. Hence, it was proposed that hydrological processes should be treated as nonstationary. Two main reasons contributed to this perception. First, climate models project a future hydroclimate that will be different from the current one. Second, as streamflow records become longer, they indicate the presence of upward or downward trends. However, hydroclimatic projections made in the recent past have so far not been verified. At the same time, evidence from much longer records, instrumental or proxy, suggests that local trends are omnipresent but not monotonic; rather, at some point upward trends turn to downward ones and vice versa. These observations suggest that an improvident dismissal of stationarity, and the adoption of nonstationary descriptions based either on climate model outputs or on observed trends, may entail risks. The risks stem from the facts that the future can differ from what was deterministically projected, that deterministic projections are associated with an illusion of decreased uncertainty, and that nonstationary models fitted to observed data may have lower predictive capacity than simpler stationary ones. In most cases, what is actually needed is to revisit the concept of stationarity and to apply it carefully, making it consistent with the presence of local trends, possibly incorporating information from deterministic predictions whenever these prove to be reliable, and estimating the total predictive uncertainty.

  18. Aerospace technology and commercial nuclear power; Proceedings of the Workshop Conference, Williamsburg, VA, November 18-20, 1981

    NASA Technical Reports Server (NTRS)

    Grey, J. (Editor)

    1982-01-01

    An attempt has been made to compare the technologies, institutions and procedures of the aerospace and commercial nuclear power industries, in order to characterize similarities and contrasts as well as to identify the most fruitful means by which to transfer information, technology, and procedures between the two industries. The seven working groups involved in this study took as their topics powerplant design formulation and effectiveness, plant safety and operations, powerplant control technology and integration, economic and financial analyses, public relations, and the management of nuclear waste and spent fuel. Consequential differences are noted between the two industries in matters of certification and licensing procedures, assignment of responsibility for both safety and financial performance, and public viewpoint. Areas for beneficial interaction include systems management and control and safety system technology. No individual items are abstracted in this volume.

  19. Functional safety for the Advanced Technology Solar Telescope

    NASA Astrophysics Data System (ADS)

    Bulau, Scott; Williams, Timothy R.

    2012-09-01

    Since inception, the Advanced Technology Solar Telescope (ATST) has planned to implement a facility-wide functional safety system to protect personnel from harm and prevent damage to the facility or environment. The ATST will deploy an integrated safety-related control system (SRCS) to achieve functional safety throughout the facility rather than relying on individual facility subsystems to provide safety functions on an ad hoc basis. The Global Interlock System (GIS) is an independent, distributed, facility-wide, safety-related control system, comprised of commercial off-the-shelf (COTS) programmable controllers that monitor, evaluate, and control hazardous energy and conditions throughout the facility that arise during operation and maintenance. The GIS has been designed to utilize recent advances in technology for functional safety plus revised national and international standards that allow for a distributed architecture using programmable controllers over a local area network instead of traditional hard-wired safety functions, while providing an equivalent or even greater level of safety. Programmable controllers provide an ideal platform for controlling the often complex interrelationships between subsystems in a modern astronomical facility, such as the ATST. A large, complex hard-wired relay control system is no longer needed. This type of system also offers greater flexibility during development and integration in addition to providing for expanded capability into the future. The GIS features fault detection, self-diagnostics, and redundant communications that will lead to decreased maintenance time and increased availability of the facility.

  20. Improving Patient Safety in Hospitals through Usage of Cloud Supported Video Surveillance

    PubMed Central

    Dašić, Predrag; Dašić, Jovan; Crvenković, Bojan

    2017-01-01

    BACKGROUND: Patient safety in hospitals is of equal importance as providing treatments and urgent healthcare. With the development of Cloud technologies and Big Data analytics, it is possible to employ VSaaS technology virtually anywhere, for any given security purpose. AIM: Given these benefits, in this paper we give an overview of the existing cloud surveillance technologies that can be implemented for improving patient safety. MATERIAL AND METHODS: Modern VSaaS systems provide higher elasticity and project scalability in dealing with real-time information processing. Modern surveillance technologies can prove to be an effective tool for the prevention of patient falls, undesired movement, and tampering with attached life-supporting devices. Given the large number of patients who require constant supervision, a cloud-based monitoring system can dramatically reduce the resulting costs. It provides continuous real-time monitoring, increased overall security and safety, improved staff productivity, prevention of dishonest claims, and long-term digital archiving. CONCLUSION: Patient safety is a growing issue that can be improved with the use of high-end centralised surveillance systems, allowing the staff to focus more on treating health issues rather than keeping a watchful eye on potential incidents. PMID:28507610

  1. Deterministic Computer-Controlled Polishing Process for High-Energy X-Ray Optics

    NASA Technical Reports Server (NTRS)

    Khan, Gufran S.; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian

    2010-01-01

    A deterministic computer-controlled polishing process for large X-ray mirror mandrels is presented. Using the tool's influence function and the material removal rate extracted from polishing experiments, design considerations for polishing laps and optimized operating parameters are discussed.
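
    The record gives no algorithmic detail, but one common formulation of deterministic polishing is sketched below: a dwell-time map is found by iteratively matching the desired removal to the convolution of dwell time with an assumed Gaussian tool influence function (a Richardson-Lucy-style multiplicative update). All profiles and parameters are illustrative.

```python
import numpy as np

# Assumed Gaussian tool influence function (removal per unit dwell time).
x = np.linspace(-1.0, 1.0, 81)
tif = np.exp(-0.5 * (x / 0.15) ** 2)
tif /= tif.sum()

# Desired removal profile along a 1D mandrel trace (illustrative figure error).
s = np.linspace(0.0, 10.0, 501)
target = 1.0 + 0.3 * np.sin(2 * np.pi * s / 3.0)   # microns, strictly positive

# Multiplicative (Richardson-Lucy-style) dwell-time update:
#   dwell <- dwell * target / (dwell convolved with tif)   (stays nonnegative)
dwell = np.full_like(target, target.mean())
for _ in range(200):
    predicted = np.convolve(dwell, tif, mode="same")
    dwell *= target / np.maximum(predicted, 1e-12)

residual = np.convolve(dwell, tif, mode="same") - target
print(f"peak-to-valley residual: {np.ptp(residual):.2e} microns")
```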

  2. Solving difficult problems creatively: a role for energy optimised deterministic/stochastic hybrid computing

    PubMed Central

    Palmer, Tim N.; O’Shea, Michael

    2015-01-01

    How is the brain configured for creativity? What is the computational substrate for ‘eureka’ moments of insight? Here we argue that creative thinking arises ultimately from a synergy between low-energy stochastic and energy-intensive deterministic processing, and is a by-product of a nervous system whose signal-processing capability per unit of available energy has become highly energy optimised. We suggest that the stochastic component has its origin in thermal (ultimately quantum decoherent) noise affecting the activity of neurons. Without this component, deterministic computational models of the brain are incomplete. PMID:26528173

  3. Heart rate variability as determinism with jump stochastic parameters.

    PubMed

    Zheng, Jiongxuan; Skufca, Joseph D; Bollt, Erik M

    2013-08-01

    We use measured heart rate information (RR intervals) to develop a one-dimensional nonlinear map that describes short-term deterministic behavior in the data. Our study suggests that there is a stochastic parameter with persistence which causes the heart rate and rhythm system to wander about a bifurcation point. We propose a modified circle map with a jump-process noise term as a model which can qualitatively capture this behavior of low-dimensional transient determinism with occasional (stochastically defined) jumps from one deterministic system to another within a one-parameter family of deterministic systems.
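
    The paper's exact map and jump statistics are not reproduced here; the sketch below iterates a standard sine circle map whose frequency parameter occasionally jumps (a persistent stochastic parameter), producing RR-interval-like switching between deterministic regimes.

```python
import numpy as np

rng = np.random.default_rng(7)

def circle_map_rr(n_beats=2000, K=1.2, omega=0.35, p_jump=0.005):
    """Sine circle map with a jump-process parameter (illustrative form):
        theta_{n+1} = theta_n + omega_n - (K / 2 pi) * sin(2 pi theta_n) (mod 1)
    omega_n is piecewise constant and jumps with probability p_jump per beat."""
    theta, out, om = 0.1, [], omega
    for _ in range(n_beats):
        if rng.random() < p_jump:          # rare jump to a new regime
            om = rng.uniform(0.25, 0.45)
        theta = (theta + om - K / (2 * np.pi) * np.sin(2 * np.pi * theta)) % 1.0
        out.append(theta)
    return np.array(out)

rr = 0.6 + 0.4 * circle_map_rr()           # map phase onto RR-like intervals (s)
print(f"mean RR: {rr.mean():.3f} s, SDNN-like spread: {rr.std():.3f} s")
```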

  4. Stochastic assembly in a subtropical forest chronosequence: evidence from contrasting changes of species, phylogenetic and functional dissimilarity over succession.

    PubMed

    Mi, Xiangcheng; Swenson, Nathan G; Jia, Qi; Rao, Mide; Feng, Gang; Ren, Haibao; Bebber, Daniel P; Ma, Keping

    2016-09-07

    Deterministic and stochastic processes jointly determine the community dynamics of forest succession. However, it has been widely held in previous studies that deterministic processes dominate forest succession. Furthermore, inference of mechanisms for community assembly may be misleading if based on a single axis of diversity alone. In this study, we evaluated the relative roles of deterministic and stochastic processes along a disturbance gradient by integrating species, functional, and phylogenetic beta diversity in a subtropical forest chronosequence in Southeastern China. We found a general pattern of increasing species turnover, but little-to-no change in phylogenetic and functional turnover over succession at two spatial scales. Meanwhile, the phylogenetic and functional beta diversity were not significantly different from random expectation. This result suggested a dominance of stochastic assembly, contrary to the general expectation that deterministic processes dominate forest succession. On the other hand, we found significant interactions of environment and disturbance and limited evidence for significant deviations of phylogenetic or functional turnover from random expectations for different size classes. This result provided weak evidence of deterministic processes over succession. Stochastic assembly of forest succession suggests that post-disturbance restoration may be largely unpredictable and difficult to control in subtropical forests.

  5. Discrete-State Stochastic Models of Calcium-Regulated Calcium Influx and Subspace Dynamics Are Not Well-Approximated by ODEs That Neglect Concentration Fluctuations

    PubMed Central

    Weinberg, Seth H.; Smith, Gregory D.

    2012-01-01

    Cardiac myocyte calcium signaling is often modeled using deterministic ordinary differential equations (ODEs) and mass-action kinetics. However, spatially restricted “domains” associated with calcium influx are small enough (e.g., 10^-17 liters) that local signaling may involve 1–100 calcium ions. Is it appropriate to model the dynamics of subspace calcium using deterministic ODEs or, alternatively, do we require stochastic descriptions that account for the fundamentally discrete nature of these local calcium signals? To address this question, we constructed a minimal Markov model of a calcium-regulated calcium channel and associated subspace. We compared the expected value of fluctuating subspace calcium concentration (a result that accounts for the small subspace volume) with the corresponding deterministic model (an approximation that assumes large system size). When subspace calcium did not regulate calcium influx, the deterministic and stochastic descriptions agreed. However, when calcium binding altered channel activity in the model, the continuous deterministic description often deviated significantly from the discrete stochastic model, unless the subspace volume is unrealistically large and/or the kinetics of the calcium binding are sufficiently fast. This principle was also demonstrated using a physiologically realistic model of calmodulin regulation of L-type calcium channels introduced by Yue and coworkers. PMID:23509597
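
    A minimal toy model in the spirit of the comparison above (not the authors' Markov model): calcium enters a tiny subspace through a two-state channel whose opening rate depends on the local calcium count, and the exact stochastic simulation (Gillespie) is compared with the mass-action ODE that neglects fluctuations.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy parameters (illustrative): influx J while open, first-order efflux,
# calcium-activated opening rate (a0 + a*N), constant closing rate b.
J, k_out, a0, a, b = 50.0, 5.0, 0.5, 0.4, 8.0   # rates per ms, N in ions
T = 2000.0                                      # ms

def gillespie_mean_n():
    t, N, open_, tN = 0.0, 0, 0, 0.0
    while t < T:
        rates = np.array([J * open_,                 # Ca ion enters
                          k_out * N,                 # Ca ion leaves
                          (a0 + a * N) * (1 - open_),  # channel opens
                          b * open_])                # channel closes
        total = rates.sum()
        dt = rng.exponential(1.0 / total)
        tN += N * min(dt, T - t)                     # time-average of N
        t += dt
        r = rng.choice(4, p=rates / total)
        if r == 0: N += 1
        elif r == 1: N -= 1
        elif r == 2: open_ = 1
        else: open_ = 0
    return tN / T

# Mass-action ODE (forward Euler), treating N and open probability as continuous.
n, p, dt = 0.0, 0.0, 0.01
for _ in range(int(T / dt)):
    n += dt * (J * p - k_out * n)
    p += dt * ((a0 + a * n) * (1 - p) - b * p)

print(f"stochastic mean N: {gillespie_mean_n():.2f}, ODE steady-state N: {n:.2f}")
```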

  6. Mixing Single Scattering Properties in Vector Radiative Transfer for Deterministic and Stochastic Solutions

    NASA Astrophysics Data System (ADS)

    Mukherjee, L.; Zhai, P.; Hu, Y.; Winker, D. M.

    2016-12-01

    Among the primary factors that determine the polarized radiation field of a turbid medium are the single scattering properties of the medium. When multiple types of scatterers are present, the single scattering properties of the scatterers need to be properly mixed in order to find solutions to the vector radiative transfer theory (VRT). The VRT solvers can be divided into two types: deterministic and stochastic. A deterministic solver can only accept one set of single scattering properties in its smallest discretized spatial volume. When the medium contains more than one kind of scatterer, their single scattering properties are averaged and then used as input for the deterministic solver. The stochastic solver can work with different kinds of scatterers explicitly. In this work, two different mixing schemes are studied using the Successive Order of Scattering (SOS) method and Monte Carlo (MC) methods. One scheme is used for the deterministic solver and the other for the stochastic Monte Carlo method. It is found that the solutions from the two VRT solvers using the two different mixing schemes agree with each other extremely well. This confirms the equivalence of the two mixing schemes and also provides a benchmark for the VRT solution for the medium studied.
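
    The deterministic-solver mixing scheme referred to above amounts to scattering-coefficient-weighted averaging of the single scattering properties. A minimal sketch with two hypothetical scatterer types follows; number densities, cross sections and phase functions are illustrative.

```python
import numpy as np

# Two hypothetical scatterer types: number density (1/m^3), extinction and
# scattering cross sections (m^2), and a tabulated phase function P(theta).
theta = np.linspace(0.0, np.pi, 181)
n_i     = np.array([5.0e6, 2.0e7])
c_ext_i = np.array([3.0e-10, 8.0e-11])
c_sca_i = np.array([2.7e-10, 6.0e-11])

# Illustrative Henyey-Greenstein phase functions with different asymmetry g.
def hg(g): return (1 - g**2) / (1 + g**2 - 2 * g * np.cos(theta)) ** 1.5
p_i = np.vstack([hg(0.85), hg(0.60)])

# Bulk (mixed) optical properties for the deterministic solver:
beta_sca = (n_i * c_sca_i).sum()         # scattering coefficient
beta_ext = (n_i * c_ext_i).sum()         # extinction coefficient
ssa = beta_sca / beta_ext                # single scattering albedo
w = n_i * c_sca_i / beta_sca             # scattering-weighted mixing weights
p_mix = (w[:, None] * p_i).sum(axis=0)   # mixed phase function

print(f"single scattering albedo of mixture: {ssa:.3f}")
print(f"mixing weights: {np.round(w, 3)}")
# A stochastic (Monte Carlo) solver instead draws the scatterer type per
# collision with probabilities w and uses that type's own phase function.
```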

  7. Standards Development Activities at White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Baker, D. L.; Beeson, H. D.; Saulsberry, R. L.; Julien, H. L.; Woods, S. S.

    2003-01-01

    The development of standards and standards activities at the JSC White Sands Test Facility (WSTF) has been expanded to include the transfer of technology and standards to voluntary consensus organizations in five technical areas of importance to NASA. This effort is in direct response to the National Technology Transfer Act, designed to accelerate the transfer of technology to industry and promote government-industry partnerships. Technology transfer is especially important for WSTF, whose long-term mission has been to develop and provide vital propellant safety and hazards information to aerospace designers, operations personnel, and safety personnel. Meeting this mission is being accomplished through the preparation of consensus guidelines and standards, propellant hazards analysis protocols, and safety courses for the propellant use of hydrogen, oxygen, and hypergols, as well as the design and inspection of spacecraft pressure vessels and the use of pyrovalves in spacecraft propulsion systems. The overall WSTF technology transfer program is described, and the current status of technology transfer activities is summarized.

  8. Propagation of neutron-reaction uncertainties through multi-physics models of novel LWR's

    NASA Astrophysics Data System (ADS)

    Hernandez-Solis, Augusto; Sjöstrand, Henrik; Helgesson, Petter

    2017-09-01

    The novel design of the renewable boiling water reactor (RBWR) allows a breeding ratio greater than unity and thus aims at providing a self-sustained fuel cycle. The neutron reactions that compose the different microscopic cross-sections and angular distributions are uncertain, so when they are employed in the determination of the spatial distribution of the neutron flux in a nuclear reactor, a methodology should be employed to account for the associated uncertainties. In this work, the Total Monte Carlo (TMC) method is used to propagate the covariances of the different neutron reactions (as well as angular distributions) that are part of the TENDL-2014 nuclear data (ND) library. The main objective is to propagate them through coupled neutronic and thermal-hydraulic models in order to assess the uncertainty of important safety parameters related to multi-physics, such as the peak cladding temperature along the axial direction of an RBWR fuel assembly, and thereby to quantify the impact that the ND covariances of important nuclides such as U-235, U-238, Pu-239 and the thermal scattering of hydrogen in H2O have on the deterministic safety analysis of novel nuclear reactor designs.
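
    A toy Total Monte Carlo loop is sketched below: each history perturbs the uncertain nuclear-data inputs (here reduced to two placeholder cross-section multipliers) and re-runs a stand-in for the coupled model; the spread of the outputs is the propagated ND uncertainty. A real TMC run re-executes the full neutronic/thermal-hydraulic code sequence per random ND file.

```python
import numpy as np

rng = np.random.default_rng(11)

def coupled_model(f_fission, f_capture):
    """Stand-in for a coupled neutronic/thermal-hydraulic calculation:
    maps cross-section perturbation factors to a peak cladding temperature.
    Purely illustrative physics, not the RBWR model."""
    power_peak = 1.0 * f_fission / (0.6 + 0.4 * f_capture)   # toy relation
    return 620.0 + 180.0 * power_peak                        # kelvin

# TMC: one random nuclear-data realization per history.
n_hist = 2_000
pct = rng.normal(1.0, 0.02, size=(n_hist, 2))   # ~2% 1-sigma ND uncertainty
t_pc = np.array([coupled_model(f1, f2) for f1, f2 in pct])

print(f"peak cladding temperature: {t_pc.mean():.1f} K "
      f"+/- {t_pc.std(ddof=1):.1f} K (1 sigma from ND covariances)")
```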

  9. Fast, Safe, Propellant-Efficient Spacecraft Motion Planning Under Clohessy-Wiltshire-Hill Dynamics

    NASA Technical Reports Server (NTRS)

    Starek, Joseph A.; Schmerling, Edward; Maher, Gabriel D.; Barbee, Brent W.; Pavone, Marco

    2016-01-01

    This paper presents a sampling-based motion planning algorithm for real-time and propellant-optimized autonomous spacecraft trajectory generation in near-circular orbits. Specifically, this paper applies recent algorithmic advances in the field of robot motion planning to the problem of impulsively actuated, propellant-optimized rendezvous and proximity operations under the Clohessy-Wiltshire-Hill dynamics model. The approach calls upon a modified version of the FMT* algorithm to grow a set of feasible trajectories over a deterministic, low-dispersion set of sample points covering the free state space. To enforce safety, the tree is only grown over the subset of actively safe samples, from which there exists a feasible one-burn collision-avoidance maneuver that can safely circularize the spacecraft orbit along its coasting arc under a given set of potential thruster failures. Key features of the proposed algorithm include 1) theoretical guarantees in terms of trajectory safety and performance, 2) amenability to real-time implementation, and 3) generality, in the sense that a large class of constraints can be handled directly. As a result, the proposed algorithm offers the potential for widespread application, ranging from on-orbit satellite servicing to orbital debris removal and autonomous inspection missions.
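
    The full FMT* planner is beyond a short sketch; shown below is only the deterministic, low-dispersion sampling ingredient mentioned above, here a Halton sequence scaled to a hypothetical position/velocity box.

```python
import numpy as np

def halton(n_samples, dim):
    """Deterministic low-dispersion Halton sequence in [0, 1]^dim."""
    primes = [2, 3, 5, 7, 11, 13][:dim]
    out = np.empty((n_samples, dim))
    for j, base in enumerate(primes):
        for i in range(n_samples):
            f, r, k = 1.0, 0.0, i + 1
            while k > 0:                 # radical-inverse digit expansion
                f /= base
                r += f * (k % base)
                k //= base
            out[i, j] = r
    return out

# Scale unit-cube samples to a planar state box [positions (m); velocities (m/s)].
lo = np.array([-100.0, -100.0, -0.5, -0.5])
hi = np.array([ 100.0,  100.0,  0.5,  0.5])
samples = lo + halton(1000, 4) * (hi - lo)
print(samples[:3])
```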

  10. A system methodology for optimization design of the structural crashworthiness of a vehicle subjected to a high-speed frontal crash

    NASA Astrophysics Data System (ADS)

    Xia, Liang; Liu, Weiguo; Lv, Xiaojiang; Gu, Xianguang

    2018-04-01

    The structural crashworthiness design of vehicles has become an important research direction to ensure the safety of the occupants. To effectively improve the structural safety of a vehicle in a frontal crash, a system methodology is presented in this study. The surrogate model of Online support vector regression (Online-SVR) is adopted to approximate crashworthiness criteria and different kernel functions are selected to enhance the accuracy of the model. The Online-SVR model is demonstrated to have the advantages of solving highly nonlinear problems and saving training costs, and can effectively be applied for vehicle structural crashworthiness design. By combining the non-dominated sorting genetic algorithm II and Monte Carlo simulation, both deterministic optimization and reliability-based design optimization (RBDO) are conducted. The optimization solutions are further validated by finite element analysis, which shows the effectiveness of the RBDO solution in the structural crashworthiness design process. The results demonstrate the advantages of using RBDO, resulting in not only increased energy absorption and decreased structural weight from a baseline design, but also a significant improvement in the reliability of the design.
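
    A minimal sketch of the Monte Carlo step inside such an RBDO loop: estimate the failure probability of a candidate design by sampling the random inputs through a limit-state function. The closed-form limit state below is a placeholder for the Online-SVR crashworthiness surrogate, and all distributions are assumed.

```python
import numpy as np

rng = np.random.default_rng(2)

def limit_state(thickness, yield_strength, impact_speed):
    """g > 0 is safe, g < 0 is failure. Placeholder closed form standing in
    for the crashworthiness surrogate (e.g., an intrusion limit)."""
    capacity = 1.2 * thickness * yield_strength / 300.0
    demand = 0.002 * impact_speed ** 2
    return capacity - demand

def failure_probability(t_nominal, n=200_000):
    t = rng.normal(t_nominal, 0.05 * t_nominal, n)   # mm, manufacturing scatter
    sy = rng.normal(300.0, 15.0, n)                  # MPa
    v = rng.normal(15.6, 0.5, n)                     # m/s (~56 km/h frontal)
    return np.mean(limit_state(t, sy, v) < 0.0)

for t_nom in [0.40, 0.45, 0.50]:
    print(f"thickness {t_nom:.2f} mm -> Pf = {failure_probability(t_nom):.4f}")
```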

  11. On the Application of a Response Surface Technique to Analyze Roll-over Stability of Capsules with Airbags Using LS-Dyna

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.

    2008-01-01

    As NASA moves towards developing technologies needed to implement its new Exploration program, studies conducted for Apollo in the 1960's to understand the rollover stability of capsules landing are being revisited. Although rigid body kinematics analyses of the roll-over behavior of capsules on impact provided critical insight to the Apollo problem, extensive ground test programs were also used. For the new Orion spacecraft being developed to implement today's Exploration program, new air-bag designs have improved sufficiently for NASA to consider their use to mitigate landing loads to ensure crew safety and to enable re-usability of the capsule. Simple kinematics models provide only limited understanding of the behavior of these air bag systems, and more sophisticated tools must be used. In particular, NASA and its contractors are using the LS-Dyna nonlinear simulation code for impact response predictions of the full Orion vehicle with air bags by leveraging the extensive air bag prediction work previously done by the automotive industry. However, even in today's computational environment, these analyses are still high-dimensional, time consuming, and computationally intensive. To alleviate the computational burden, this paper presents an approach that uses deterministic sampling techniques and an adaptive response surface method to not only use existing LS-Dyna solutions but also to interpolate from LS-Dyna solutions to predict the stability boundaries for a capsule on airbags. Results for the stability boundary in terms of impact velocities, capsule attitude, impact plane orientation, and impact surface friction are discussed.
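
    A minimal stand-in for the response surface idea: fit a radial-basis-function interpolant to a handful of expensive simulation results (a cheap analytic toy replaces LS-Dyna here) and query the surrogate densely to trace a stability boundary.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_sim(v_h, v_v):
    """Toy stand-in for an LS-Dyna run: returns a roll-over margin
    (positive = stable) for horizontal/vertical impact velocities (m/s)."""
    return 1.0 - (v_h / 12.0) ** 2 - 0.3 * (v_v / 10.0) ** 3

# A small deterministic design of experiments over the impact-velocity box.
grid = np.array([(vh, vv) for vh in np.linspace(0, 15, 6)
                          for vv in np.linspace(0, 12, 6)])
margins = np.array([expensive_sim(*p) for p in grid])

surface = RBFInterpolator(grid, margins)    # response surface over 36 "runs"

# Query the cheap surrogate densely to locate the stability boundary.
vh_q = np.linspace(0, 15, 151)
for vv in [2.0, 6.0, 10.0]:
    q = surface(np.column_stack([vh_q, np.full_like(vh_q, vv)]))
    idx = np.argmax(q < 0.0)                # first unstable point, if any
    bound = vh_q[idx] if (q < 0).any() else np.inf
    print(f"v_vertical = {vv:4.1f} m/s -> stable up to v_horizontal ~ {bound:.1f} m/s")
```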

  12. Safety & Health. Resource Guide for Occupational/Technology Education.

    ERIC Educational Resources Information Center

    Kirk, Albert S., Ed.

    This guide is intended to alert occupational/technology teachers, teacher educators, school administrators, and industrial education supervisors to the need and importance of a strong and active safety program. Responsibilities are detailed for all individuals involved. Teacher liability is addressed. A section on emergency procedures covers…

  13. Safety Relevant Observations on the X2000 Train as Developed for the Swedish National Railways

    DOT National Transportation Integrated Search

    1990-01-01

    The safety of high speed rail technology proposed for possible application in the United States is of concern to the Federal Railroad Administration. This report, one in a series of reports planned for high speed rail technologies presents an initial...

  14. 49 CFR 350.319 - What are permissible uses of High Priority Activity Funds?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Implement, promote, and maintain national programs to improve CMV safety. (2) Increase compliance with CMV safety regulations. (3) Increase public awareness about CMV safety. (4) Provide education on CMV safety and related issues. (5) Demonstrate new safety related technologies. (b) These funds will be allocated...

  15. 2009 Human Factors and Roadway Safety Workshop : Overviews of Safety Initiatives in Iowa [SD .WMV (720x480/29fps/80.2 MB)

    DOT National Transportation Integrated Search

    2009-11-05

    Iowa Department of Transportation Research and Technology Bureau video presentation from the 2009 human factors and roadway safety workshop session titled: Overview of Safety Initiatives in Iowa : Tom Welch, Iowa DOT Highway Division Safety Engineer,...

  16. Commercialization of Kennedy Space Center Instrumentation Developed to Improve Safety, Reliability, Cost Effectiveness of Space Shuttle Processing, Launch, and Landing

    NASA Technical Reports Server (NTRS)

    Helms, William R.; Starr, Stanley O.

    1997-01-01

    Priorities and achievements of the Kennedy Space Center (KSC) Instrumentation Laboratories in improving operational safety and decreasing processing costs associated with the Shuttle vehicle are addressed. Technologies that have been or are in the process of technology transfer are reviewed, and routes by which commercial concerns can obtain licenses to other KSC Instrumentation Laboratory technologies are discussed.

  17. The Triangle Model for evaluating the effect of health information technology on healthcare quality and safety

    PubMed Central

    Kern, Lisa M; Abramson, Erika; Kaushal, Rainu

    2011-01-01

    With the proliferation of relatively mature health information technology (IT) systems with large numbers of users, it becomes increasingly important to evaluate the effect of these systems on the quality and safety of healthcare. Previous research on the effectiveness of health IT has had mixed results, which may be in part attributable to the evaluation frameworks used. The authors propose a model for evaluation, the Triangle Model, developed for designing studies of quality and safety outcomes of health IT. This model identifies structure-level predictors, including characteristics of: (1) the technology itself; (2) the provider using the technology; (3) the organizational setting; and (4) the patient population. In addition, the model outlines process predictors, including (1) usage of the technology, (2) organizational support for and customization of the technology, and (3) organizational policies and procedures about quality and safety. The Triangle Model specifies the variables to be measured, but is flexible enough to accommodate both qualitative and quantitative approaches to capturing them. The authors illustrate this model, which integrates perspectives from both health services research and biomedical informatics, with examples from evaluations of electronic prescribing, but it is also applicable to a variety of types of health IT systems. PMID:21857023

  18. Defining the Relationship Between Human Error Classes and Technology Intervention Strategies

    NASA Technical Reports Server (NTRS)

    Wiegmann, Douglas A.; Rantanen, Esa; Crisp, Vicki K. (Technical Monitor)

    2002-01-01

    One of the main factors in all aviation accidents is human error. The NASA Aviation Safety Program (AvSP), therefore, has identified several human-factors safety technologies to address this issue. Some technologies directly address human error either by attempting to reduce the occurrence of errors or by mitigating the negative consequences of errors. However, new technologies and system changes may also introduce new error opportunities or even induce different types of errors. Consequently, a thorough understanding of the relationship between error classes and technology "fixes" is crucial for the evaluation of intervention strategies outlined in the AvSP, so that resources can be effectively directed to maximize the benefit to flight safety. The purpose of the present project, therefore, was to examine the repositories of human factors data to identify the possible relationship between different error classes and technology intervention strategies. The first phase of the project, which is summarized here, involved the development of prototype data structures or matrices that map errors onto "fixes" (and vice versa), with the hope of facilitating the development of standards for evaluating safety products. Possible follow-on phases of this project are also discussed. These additional efforts include a thorough and detailed review of the literature to fill in the data matrix and the construction of a complete database and standards checklists.

  19. Analysis of Aviation Safety Reporting System Incident Data Associated with the Technical Challenges of the System-Wide Safety and Assurance Technologies Project

    NASA Technical Reports Server (NTRS)

    Withrow, Colleen A.; Reveley, Mary S.

    2015-01-01

    The Aviation Safety Program (AvSP) System-Wide Safety and Assurance Technologies (SSAT) Project asked the AvSP Systems and Portfolio Analysis Team to identify SSAT-related trends. SSAT had four technical challenges: advance safety assurance to enable deployment of NextGen systems; automated discovery of precursors to aviation safety incidents; increasing safety of human-automation interaction by incorporating human performance; and prognostic algorithm design for safety assurance. This report reviews incident data from the NASA Aviation Safety Reporting System (ASRS) for system-component-failure-or-malfunction- (SCFM-) related and human-factor-related incidents for commercial or cargo air carriers (Part 121), commuter airlines (Part 135), and general aviation (Part 91). The data were analyzed by Federal Aviation Regulations (FAR) part, phase of flight, SCFM category, human factor category, and a variety of anomalies and results. There were 38 894 SCFM-related incidents and 83 478 human-factor-related incidents analyzed between January 1993 and April 2011.

  20. A Methodology for Validating Safety Heuristics Using Clinical Simulations: Identifying and Preventing Possible Technology-Induced Errors Related to Using Health Information Systems

    PubMed Central

    Borycki, Elizabeth; Kushniruk, Andre; Carvalho, Christopher

    2013-01-01

    Internationally, health information systems (HIS) safety has emerged as a significant concern for governments. Recently, research has emerged that has documented the ability of HIS to be implicated in the harm and death of patients. Researchers have attempted to develop methods that can be used to prevent or reduce technology-induced errors. Some researchers are developing methods that can be employed prior to systems release. These methods include the development of safety heuristics and clinical simulations. In this paper, we outline our methodology for developing safety heuristics specific to identifying the features or functions of a HIS user interface design that may lead to technology-induced errors. We follow this with a description of a methodological approach to validate these heuristics using clinical simulations. PMID:23606902

  1. Sensitivity analysis in economic evaluation: an audit of NICE current practice and a review of its use and value in decision-making.

    PubMed

    Andronis, L; Barton, P; Bryan, S

    2009-06-01

    To determine how we define good practice in sensitivity analysis in general and probabilistic sensitivity analysis (PSA) in particular, and to what extent it has been adhered to in the independent economic evaluations undertaken for the National Institute for Health and Clinical Excellence (NICE) over recent years; to establish what policy impact sensitivity analysis has in the context of NICE, and policy-makers' views on sensitivity analysis and uncertainty, and what use is made of sensitivity analysis in policy decision-making. Three major electronic databases, MEDLINE, EMBASE and the NHS Economic Evaluation Database, were searched from inception to February 2008. The meaning of 'good practice' in the broad area of sensitivity analysis was explored through a review of the literature. An audit was undertaken of the 15 most recent NICE multiple technology appraisal judgements and their related reports to assess how sensitivity analysis has been undertaken by independent academic teams for NICE. A review of the policy and guidance documents issued by NICE aimed to assess the policy impact of the sensitivity analysis and the PSA in particular. Qualitative interview data from NICE Technology Appraisal Committee members, collected as part of an earlier study, were also analysed to assess the value attached to the sensitivity analysis components of the economic analyses conducted for NICE. All forms of sensitivity analysis, notably both deterministic and probabilistic approaches, have their supporters and their detractors. Practice in relation to univariate sensitivity analysis is highly variable, with considerable lack of clarity in relation to the methods used and the basis of the ranges employed. In relation to PSA, there is a high level of variability in the form of distribution used for similar parameters, and the justification for such choices is rarely given. Virtually all analyses failed to consider correlations within the PSA, and this is an area of concern. Uncertainty is considered explicitly in the process of arriving at a decision by the NICE Technology Appraisal Committee, and a correlation between high levels of uncertainty and negative decisions was indicated. The findings suggest considerable value in deterministic sensitivity analysis. Such analyses serve to highlight which model parameters are critical to driving a decision. Strong support was expressed for PSA, principally because it provides an indication of the parameter uncertainty around the incremental cost-effectiveness ratio. The review and the policy impact assessment focused exclusively on documentary evidence, excluding other sources that might have revealed further insights on this issue. In seeking to address parameter uncertainty, both deterministic and probabilistic sensitivity analyses should be used. It is evident that some cost-effectiveness work, especially around the sensitivity analysis components, represents a challenge in making it accessible to those making decisions. This speaks to the training agenda for those sitting on such decision-making bodies, and to the importance of clear presentation of analyses by the academic community.
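
    The PSA the audit examines boils down to sampling parameter distributions and propagating them to a cost-effectiveness decision. A minimal sketch under invented distributions follows; real analyses would use evidence-based distributions and, as the audit notes is rarely done, correlated draws (e.g., via a multivariate normal) where parameters are not independent.

        # Sketch of a probabilistic sensitivity analysis (PSA): sample parameter
        # distributions and propagate to a cost-effectiveness decision.
        # Distributions and values are illustrative, not from any NICE appraisal.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000
        # Incremental cost and incremental QALYs as normals (betas/gammas are
        # also common choices; correlations could be induced with
        # rng.multivariate_normal, a step the audit found virtually absent).
        d_cost = rng.normal(loc=5_000, scale=1_500, size=n)
        d_qaly = rng.normal(loc=0.30, scale=0.10, size=n)

        threshold = 20_000  # GBP per QALY, a commonly cited NICE threshold
        # Net monetary benefit avoids dividing by near-zero QALY differences.
        nmb = threshold * d_qaly - d_cost
        prob_cost_effective = (nmb > 0).mean()
        print(f"P(cost-effective at {threshold}/QALY) = {prob_cost_effective:.2f}")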

  2. INCREASING HEAVY OIL RESERVES IN THE WILMINGTON OIL FIELD THROUGH ADVANCED RESERVOIR CHARACTERIZATION AND THERMAL PRODUCTION TECHNOLOGIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott Hara

    2000-02-18

    The project involves using advanced reservoir characterization and thermal production technologies to improve thermal recovery techniques and lower operating and capital costs in a slope and basin clastic (SBC) reservoir in the Wilmington field, Los Angeles Co., CA. Through March 1999, project work has been completed related to data preparation, basic reservoir engineering, developing a deterministic three dimensional (3-D) geologic model, a 3-D deterministic reservoir simulation model, and a rock-log model, well drilling and completions, and surface facilities. Work is continuing on the stochastic geologic model, developing a 3-D stochastic thermal reservoir simulation model of the Fault Block IIA Tar (Tar II-A) Zone, and operational work and research studies to prevent thermal-related formation compaction. Thermal-related formation compaction is a concern of the project team due to observed surface subsidence in the local area above the steamflood project. Last quarter, on January 12, the steamflood project lost its inexpensive steam source from the Harbor Cogeneration Plant as a result of the recent deregulation of electrical power rates in California. An operational plan was developed and implemented to mitigate the effects of the two situations. Seven water injection wells were placed in service in November and December 1998 on the flanks of the Phase 1 steamflood area to pressure up the reservoir to fill up the existing steam chest. Intensive reservoir engineering and geomechanics studies are continuing to determine the best ways to shut down the steamflood operations in Fault Block II while minimizing any future surface subsidence. The new 3-D deterministic thermal reservoir simulator model is being used to provide sensitivity cases to optimize production, steam injection, future flank cold water injection, and reservoir temperature and pressure. According to the model, reservoir fill up of the steam chest at the current injection rate of 28,000 BPD and gross and net oil production rates of 7,700 BPD and 750 BOPD (injection to production ratio of 4) will occur in October 1999. At that time, the reservoir should act more like a waterflood, and production and cold water injection can be operated at lower net injection rates to be determined. Modeling runs developed this quarter found that varying individual well injection rates to meet added production and local pressure problems by sub-zone could reduce steam chest fill-up by up to one month.

  3. Deterministic models for traffic jams

    NASA Astrophysics Data System (ADS)

    Nagel, Kai; Herrmann, Hans J.

    1993-10-01

    We study several deterministic one-dimensional traffic models. For integer positions and velocities we find the typical high- and low-density phases separated by a simple transition. If positions and velocities are continuous variables, the model shows self-organized criticality driven by the slowest car.
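
    For readers unfamiliar with these models, a minimal sketch of the integer case follows: the classic lattice update rule with the stochastic braking step removed, leaving a fully deterministic system. Parameters are illustrative, and the rule is a generic deterministic variant rather than the paper's exact models.

        # Minimal deterministic traffic model in the spirit of the paper: the
        # Nagel-Schreckenberg lattice update with the random-braking step removed.
        # All parameters are illustrative.
        import numpy as np

        L, N, VMAX, STEPS = 100, 30, 5, 200        # road length, cars, speed limit
        rng = np.random.default_rng(0)
        pos = np.sort(rng.choice(L, size=N, replace=False)).astype(int)
        vel = np.zeros(N, dtype=int)

        for _ in range(STEPS):
            # Gap to the car ahead; the last car sees the first one a lap ahead.
            ahead = np.append(pos[1:], pos[0] + L)
            gap = ahead - pos - 1
            # Deterministic rule: accelerate by 1, never exceed VMAX or the gap.
            vel = np.minimum(np.minimum(vel + 1, VMAX), gap)
            pos = pos + vel                        # positions grow without bound;
                                                   # pos % L gives ring coordinates

        print("mean speed:", vel.mean())  # near VMAX at low density, below it in jams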

  4. Impact assessment of extreme storm events using a Bayesian network

    USGS Publications Warehouse

    den Heijer, C.(Kees); Knipping, Dirk T.J.A.; Plant, Nathaniel G.; van Thiel de Vries, Jaap S. M.; Baart, Fedor; van Gelder, Pieter H. A. J. M.

    2012-01-01

    This paper describes an investigation of the usefulness of Bayesian Networks in the safety assessment of dune coasts. A network has been created that predicts the erosion volume based on hydraulic boundary conditions and a number of cross-shore profile indicators. Field measurement data along a large part of the Dutch coast were used to train the network. Corresponding storm impact on the dunes was calculated with an empirical dune erosion model named duros+. Comparison between the Bayesian Network predictions and the original duros+ results, here considered as observations, yields a skill of up to 0.88, provided that the training data cover the range of predictions. Hence, the predictions from a deterministic model (duros+) can be captured in a probabilistic model (Bayesian Network) such that both the process knowledge and uncertainties can be included in impact and vulnerability assessments.
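
    The reported skill can be read as a standard mean-squared-error skill score; the paper does not spell out its formula here, so the sketch below uses that common definition (1 is perfect, 0 is no better than predicting the mean), with invented erosion volumes.

        # Sketch of a skill computation comparing network predictions with duros+
        # "observations". This is one common definition (MSE skill score); the
        # record does not specify the paper's exact formula.
        import numpy as np

        def skill(pred, obs):
            pred, obs = np.asarray(pred, float), np.asarray(obs, float)
            mse = np.mean((pred - obs) ** 2)
            mse_ref = np.mean((obs - obs.mean()) ** 2)  # climatology reference
            return 1.0 - mse / mse_ref

        print(skill([105, 190, 310], [100, 200, 300]))  # illustrative volumes in m3/m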

  5. Comparison of deterministic and stochastic approaches for isotopic concentration and decay heat uncertainty quantification on elementary fission pulse

    NASA Astrophysics Data System (ADS)

    Lahaye, S.; Huynh, T. D.; Tsilanizara, A.

    2016-03-01

    Uncertainty quantification of outputs of interest in the nuclear fuel cycle is an important issue for nuclear safety, from nuclear facilities to long-term waste repositories. Most of those outputs are functions of the isotopic vector density, which is estimated by fuel cycle codes such as DARWIN/PEPIN2, MENDEL, ORIGEN or FISPACT. The CEA code systems DARWIN/PEPIN2 and MENDEL propagate the uncertainty from nuclear data inputs to isotopic concentrations and decay heat by two different methods. This paper compares the two codes on a Uranium-235 thermal fission pulse. The effect of the choice of nuclear data evaluation (ENDF/B-VII.1, JEFF-3.1.1 and JENDL-2011) is also examined. All results show good agreement between the two codes and methods, supporting the reliability of both approaches for a given evaluation.
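
    The stochastic side of such a comparison is conceptually simple: sample the uncertain nuclear data, solve the decay equations for each sample, and read off the spread of the decay heat. The sketch below does this on a toy two-nuclide chain with invented decay constants, energies, and uncertainties; production codes such as DARWIN/PEPIN2 and MENDEL handle full inventories and covariance data.

        # Sketch of stochastic (Monte Carlo) uncertainty propagation for decay heat
        # after a fission pulse, on a toy chain A -> B -> stable. All numbers are
        # invented for illustration.
        import numpy as np

        rng = np.random.default_rng(0)
        t = 100.0                            # seconds after the pulse
        N0 = 1.0                             # initial amount of A (arbitrary units)
        E1, E2 = 1.5, 0.8                    # MeV released per decay (held fixed)
        samples = []
        for _ in range(5_000):
            lam1 = rng.normal(1e-2, 5e-4)    # decay constants with ~5% uncertainty
            lam2 = rng.normal(1e-3, 5e-5)
            NA = N0 * np.exp(-lam1 * t)      # Bateman solution for the chain
            NB = N0 * lam1 / (lam2 - lam1) * (np.exp(-lam1 * t) - np.exp(-lam2 * t))
            samples.append(lam1 * NA * E1 + lam2 * NB * E2)

        samples = np.array(samples)
        print(f"decay heat: {samples.mean():.4f} +/- {samples.std():.4f} (MeV/s, toy units)")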

  6. 2006 NASA Range Safety Annual Report

    NASA Technical Reports Server (NTRS)

    TenHaken, Ron; Daniels, B.; Becker, M.; Barnes, Zack; Donovan, Shawn; Manley, Brenda

    2007-01-01

    Throughout 2006, Range Safety was involved in a number of exciting and challenging activities and events, from developing, implementing, and supporting Range Safety policies and procedures (such as the Space Shuttle Launch and Landing Plans, the Range Safety Variance Process, and the Expendable Launch Vehicle Safety Program procedures) to evaluating new technologies. Range Safety training development is almost complete, with the last course scheduled to go on line in mid-2007. Range Safety representatives took part in a number of panels and councils, including the newly formed Launch Constellation Range Safety Panel, the Range Commanders Council and its subgroups, the Space Shuttle Range Safety Panel, and the unmanned aircraft systems working group. Space-based range safety demonstration and certification (formerly STARS) and the autonomous flight safety system were successfully tested. The enhanced flight termination system will be tested in early 2007, and the joint advanced range safety system mission analysis software tool is nearing operational status. New technologies being evaluated included a processor for real-time compensation in long range imaging, automated range surveillance using radio interferometry, and a space-based range command and telemetry processor. Next year holds great promise as we continue ensuring safety while pursuing our quest beyond the Moon to Mars.

  7. Ending the use of animals in toxicity testing and risk evaluation.

    PubMed

    Rowan, Andrew N

    2015-10-01

    This article discusses the use of animals for the safety testing of chemicals, including pharmaceuticals, household products, pesticides, and industrial chemicals. It reviews changes in safety testing technology and what those changes mean from the perspective of industrial innovation, public policy and public health, economics, and ethics. It concludes that the continuing use of animals for chemical safety testing should end within the decade as cheaper, quicker, and more predictive technologies are developed and applied.

  8. An Overview of the NASA Aviation Safety Program Propulsion Health Monitoring Element

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.

    2000-01-01

    The NASA Aviation Safety Program (AvSP) has been initiated with aggressive goals to reduce the civil aviation accident rate. To meet these goals, several technology investment areas have been identified, including a sub-element in propulsion health monitoring (PHM). Specific AvSP PHM objectives are to develop and validate propulsion system health monitoring technologies designed to prevent engine malfunctions from occurring in flight, and to mitigate detrimental effects in the event an in-flight malfunction does occur. A review of available propulsion system safety information was conducted to help prioritize PHM areas to focus on under the AvSP. It is noted that when a propulsion malfunction is involved in an aviation accident or incident, it is often a contributing factor rather than the sole cause of the event. Challenging aspects of the development and implementation of PHM technology, such as cost, weight, robustness, and reliability, are discussed. Specific technology plans are overviewed, including vibration diagnostics, model-based controls and diagnostics, advanced instrumentation, and general aviation propulsion system health monitoring technology. Propulsion system health monitoring, in addition to engine design, inspection, maintenance, and pilot training and awareness, is intrinsic to enhancing aviation propulsion system safety.

  9. MEMS sensor technologies for human centred applications in healthcare, physical activities, safety and environmental sensing: a review on research activities in Italy.

    PubMed

    Ciuti, Gastone; Ricotti, Leonardo; Menciassi, Arianna; Dario, Paolo

    2015-03-17

    Over the past few decades, the increased level of public awareness concerning healthcare, physical activities, safety and environmental sensing has created an emerging need for smart sensor technologies and monitoring devices able to sense, classify, and provide feedback on users' health status and physical activities, as well as to evaluate environmental and safety conditions in a pervasive, accurate and reliable fashion. Monitoring and precisely quantifying users' physical activity with inertial measurement unit-based devices, for instance, has also proven to be important in the health management of patients affected by chronic diseases, e.g., Parkinson's disease, many of which are becoming highly prevalent in Italy and in the Western world. This review paper will focus on MEMS sensor technologies developed in Italy in the last three years, describing research achievements for healthcare and physical activity, safety and environmental sensing, in addition to smart systems integration. Innovative and smart integrated solutions for sensing devices, pursued and implemented in Italian research centres, will be highlighted, together with specific applications of such technologies. Finally, the paper will depict the future perspective of sensor technologies and corresponding exploitation opportunities, again with a specific focus on Italy.

  10. Safety Outreach and Incident Response Stakeholder Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosewater, David Martin; Conover, David

    2016-06-01

    The objective of this document is to set out a strategy for reaching all stakeholders who can affect the timely deployment of safe stationary energy storage systems (ESS) in the built environment, providing them with information on ESS technology and safety relevant to their role in deploying the technology.

  11. Intervention technologies for food safety on minimally processed produce:Perspectives on food-borne and plant pathogens

    USDA-ARS?s Scientific Manuscript database

    Produce contamination associated with enteric pathogens such as Escherichia coli O157:H7, Salmonella spp., Listeria monocytogenes, Shigella, and others poses significant challenges to food safety, owing to the illnesses and economic impacts resulting from outbreaks. Innovative technologies for i...

  12. Soil pH mediates the balance between stochastic and deterministic assembly of bacteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tripathi, Binu M.; Stegen, James C.; Kim, Mincheol

    Little is known about the factors affecting the relative influence of stochastic and deterministic processes that govern the assembly of microbial communities in successional soils. Here, we conducted a meta-analysis of bacterial communities using six different successional soils data sets, scattered across different regions, with different pH conditions in early and late successional soils. We found that soil pH was the best predictor of bacterial community assembly and the relative importance of stochastic and deterministic processes along successional soils. Extreme acidic or alkaline pH conditions lead to assembly of phylogenetically more clustered bacterial communities through deterministic processes, whereas pH conditions close to neutral lead to phylogenetically less clustered bacterial communities with more stochasticity. We suggest that the influence of pH, rather than successional age, is the main driving force in producing trends in phylogenetic assembly of bacteria, and that pH also influences the relative balance of stochastic and deterministic processes along successional soils. Given that pH had a much stronger association with community assembly than did successional age, we evaluated whether the inferred influence of pH was maintained when studying globally-distributed samples collected without regard for successional age. This dataset confirmed the strong influence of pH, suggesting that the influence of soil pH on community assembly processes occurs globally. Extreme pH conditions likely exert more stringent limits on survival and fitness, imposing strong selective pressures through ecological and evolutionary time. Taken together, these findings suggest that the degree to which stochastic vs. deterministic processes shape soil bacterial community assembly is a consequence of soil pH rather than successional age.

  13. The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments

    NASA Astrophysics Data System (ADS)

    Chen, Fajing; Jiao, Meiyan; Chen, Jing

    2013-04-01

    Public weather services are trending toward providing users with probabilistic weather forecasts, in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecast (BPF), a new statistical method for probabilistic forecasting, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. Bayes' theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
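
    In the normal-linear special case mentioned above, the posterior is available in closed form: a Gaussian prior on the predictand combined with a linear-Gaussian likelihood for the forecast yields a Gaussian posterior by the usual conjugate update. The sketch below implements that update; the prior, regression coefficients, and forecast value are invented, not taken from the Changsha/Wuhan experiment.

        # Sketch of the normal-linear Bayesian Processor of Forecast: climatological
        # prior on predictand W, linear-Gaussian likelihood for forecast X given W.
        import math

        def normal_linear_bpf(M, S, a, b, sigma, x):
            """Prior W ~ N(M, S^2); likelihood X|W=w ~ N(a + b*w, sigma^2); observe X=x."""
            prec = 1.0 / S**2 + b**2 / sigma**2          # posterior precision
            var = 1.0 / prec
            mean = var * (M / S**2 + b * (x - a) / sigma**2)
            return mean, math.sqrt(var)

        # Illustrative climatological January prior N(5, 4^2) degC; forecast bias,
        # gain, and error spread as estimated from a historical archive.
        mean, sd = normal_linear_bpf(M=5.0, S=4.0, a=0.5, b=0.95, sigma=2.0, x=8.0)
        print(f"posterior: N({mean:.2f}, {sd:.2f}^2)")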

  14. Deterministic Factors Overwhelm Stochastic Environmental Fluctuations as Drivers of Jellyfish Outbreaks.

    PubMed

    Benedetti-Cecchi, Lisandro; Canepa, Antonio; Fuentes, Veronica; Tamburello, Laura; Purcell, Jennifer E; Piraino, Stefano; Roberts, Jason; Boero, Ferdinando; Halpin, Patrick

    2015-01-01

    Jellyfish outbreaks are increasingly viewed as a deterministic response to escalating levels of environmental degradation and climate extremes. However, a comprehensive understanding of the influence of deterministic drivers and stochastic environmental variations favouring population renewal processes has remained elusive. This study quantifies the deterministic and stochastic components of environmental change that lead to outbreaks of the jellyfish Pelagia noctiluca in the Mediterranean Sea. Using data on jellyfish abundance collected at 241 sites along the Catalan coast from 2007 to 2010 we: (1) tested hypotheses about the influence of time-varying and spatial predictors of jellyfish outbreaks; (2) evaluated the relative importance of stochastic vs. deterministic forcing of outbreaks through the environmental bootstrap method; and (3) quantified return times of extreme events. Outbreaks were common in May and June and less likely in other summer months, which resulted in a negative relationship between outbreaks and sea surface temperature (SST). Cross- and along-shore advection by geostrophic flow were important concentrating forces of jellyfish, but most outbreaks occurred in the proximity of two canyons in the northern part of the study area. This result supported the recent hypothesis that canyons can funnel P. noctiluca blooms towards shore during upwelling. This can be a general, yet unappreciated mechanism leading to outbreaks of holoplanktonic jellyfish species. The environmental bootstrap indicated that stochastic environmental fluctuations have negligible effects on return times of outbreaks. Our analysis emphasized the importance of deterministic processes leading to jellyfish outbreaks compared to the stochastic component of environmental variation. A better understanding of how environmental drivers affect demographic and population processes in jellyfish species will increase the ability to anticipate jellyfish outbreaks in the future.
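
    Return times of the kind quantified here reduce, in the simplest independent-interval reading, to the sampling interval divided by the exceedance probability. The toy sketch below illustrates only that arithmetic, with an invented abundance series and outbreak threshold; the paper's environmental-bootstrap machinery is far richer.

        # Toy return-time estimate: with events treated as independent across
        # sampling intervals, expected return time = interval / P(exceedance).
        # Counts and threshold are invented, not the paper's data.
        import numpy as np

        abundance = np.array([0, 2, 0, 1, 9, 0, 3, 12, 0, 1] * 20)  # toy survey counts
        threshold = 8                               # outbreak definition (toy)
        p_exceed = (abundance > threshold).mean()   # per-survey exceedance probability
        print("return time (surveys):", 1.0 / p_exceed)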

  15. Rural hospital information technology implementation for safety and quality improvement: lessons learned.

    PubMed

    Tietze, Mari F; Williams, Josie; Galimbertti, Marisa

    2009-01-01

    This grant involved a hospital collaborative for excellence using information technology over a 3-year period. The project activities focused on the improvement of patient care safety and quality in Southern rural and small community hospitals through the use of technology and education. The technology component of the design involved the implementation of a Web-based business analytic tool that allows hospitals to view data, create reports, and analyze their safety and quality data. Through a preimplementation and postimplementation comparative design, the focus of the implementation team was twofold: to recruit participant hospitals and to implement the technology at each of the 66 hospital sites. Rural hospitals were defined as acute care hospitals located in a county with a population of less than 100 000 or a state-administered Critical Access Hospital, making the total study population target 188 hospitals. Lessons learned during the information technology implementation at these hospitals are reflective of the unique culture, financial characteristics, organizational structure, and technology architecture of rural hospitals. Specific steps such as recruitment, information technology assessment, conference calls for project planning, data file extraction and transfer, technology training, use of e-mail, use of telephones, personnel management, and engaging information technology vendors were found to vary greatly among hospitals.

  16. Beyond usability: designing effective technology implementation systems to promote patient safety.

    PubMed

    Karsh, B-T

    2004-10-01

    Evidence is emerging that certain technologies such as computerized provider order entry may reduce the likelihood of patient harm. However, many technologies that should reduce medical errors have been abandoned because of problems with their design, their impact on workflow, and general dissatisfaction with them by end users. Patient safety researchers have therefore looked to human factors engineering for guidance on how to design technologies to be usable (easy to use) and useful (improving job performance, efficiency, and/or quality). While this is a necessary step towards improving the likelihood of end user satisfaction, it is still not sufficient. Human factors engineering research has shown that the manner in which technologies are implemented also needs to be designed carefully if benefits are to be realized. This paper reviews the theoretical knowledge on what leads to successful technology implementation and how this can be translated into specifically designed processes for successful technology change. The literature on diffusion of innovations, technology acceptance, organisational justice, participative decision making, and organisational change is reviewed and strategies for promoting successful implementation are provided. Given the rapid and ever increasing pace of technology implementation in health care, it is critical for the science of technology implementation to be understood and incorporated into efforts to improve patient safety.

  17. Studying technology use as social practice: the untapped potential of ethnography

    PubMed Central

    2011-01-01

    Information and communications technologies (ICTs) in healthcare are often introduced with expectations of higher-quality, more efficient, and safer care. Many fail to meet these expectations. We argue here that the well-documented failures of ICTs in healthcare are partly attributable to the philosophical foundations of much health informatics research. Positivistic assumptions underpinning the design, implementation and evaluation of ICTs (in particular the notion that technology X has an impact which can be measured and reproduced in new settings), and the deterministic experimental and quasi-experimental study designs which follow from these assumptions, have inherent limitations when ICTs are part of complex social practices involving multiple human actors. We suggest that while experimental and quasi-experimental studies have an important place in health informatics research overall, ethnography is the preferred methodological approach for studying ICTs introduced into complex social systems. But for ethnographic approaches to be accepted and used to their full potential, many in the health informatics community will need to revisit their philosophical assumptions about what counts as research rigor. PMID:21521535

  18. Measuring and improving patient safety through health information technology: The Health IT Safety Framework

    PubMed Central

    Singh, Hardeep

    2016-01-01

    Health information technology (health IT) has potential to improve patient safety but its implementation and use has led to unintended consequences and new safety concerns. A key challenge to improving safety in health IT-enabled healthcare systems is to develop valid, feasible strategies to measure safety concerns at the intersection of health IT and patient safety. In response to the fundamental conceptual and methodological gaps related to both defining and measuring health IT-related patient safety, we propose a new framework, the Health IT Safety (HITS) measurement framework, to provide a conceptual foundation for health IT-related patient safety measurement, monitoring, and improvement. The HITS framework follows both Continuous Quality Improvement (CQI) and sociotechnical approaches and calls for new measures and measurement activities to address safety concerns in three related domains: 1) concerns that are unique and specific to technology (e.g., to address unsafe health IT related to unavailable or malfunctioning hardware or software); 2) concerns created by the failure to use health IT appropriately or by misuse of health IT (e.g. to reduce nuisance alerts in the electronic health record (EHR)), and 3) the use of health IT to monitor risks, health care processes and outcomes and identify potential safety concerns before they can harm patients (e.g. use EHR-based algorithms to identify patients at risk for medication errors or care delays). The framework proposes to integrate both retrospective and prospective measurement of HIT safety with an organization's existing clinical risk management and safety programs. It aims to facilitate organizational learning, comprehensive 360 degree assessment of HIT safety that includes vendor involvement, refinement of measurement tools and strategies, and shared responsibility to identify problems and implement solutions. A long term framework goal is to enable rigorous measurement that helps achieve the safety benefits of health IT in real-world clinical settings. PMID:26369894

  19. 49 CFR 1.95 - Delegations to the National Highway Traffic Safety Administrator.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... motorcyclist safety; (10) Section 2011 [23 U.S.C. 405 note], relating to child safety and child booster seat... 10306, relating to the study of safety belt use technologies; (24) Section 10307(b) [15 U.S.C. 1232 note...

  20. 49 CFR 1.95 - Delegations to the National Highway Traffic Safety Administrator.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... motorcyclist safety; (10) Section 2011 [23 U.S.C. 405 note], relating to child safety and child booster seat... 10306, relating to the study of safety belt use technologies; (24) Section 10307(b) [15 U.S.C. 1232 note...

  1. Advanced biosensing methodologies developed for evaluating performance quality and safety of emerging biophotonics technologies and medical devices (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ilev, Ilko K.; Walker, Bennett; Calhoun, William; Hassan, Moinuddin

    2016-03-01

    Biophotonics is an emerging field in modern biomedical technology that has opened up new horizons for the transfer of state-of-the-art techniques from the areas of lasers, fiber optics and biomedical optics to the life sciences and medicine. This field continues to expand, with advanced developments across the entire spectrum of biomedical applications ranging from fundamental "bench" laboratory studies to clinical patient "bedside" diagnostics and therapeutics. However, in order to translate these technologies to clinical device applications, the scientific and industrial communities and the FDA face the requirement for a thorough evaluation and review of laser radiation safety and efficacy concerns. In many cases, however, the review process is complicated due to the lack of effective means and standard test methods to precisely analyze the safety and effectiveness of some of the newly developed biophotonics techniques and devices. There is, therefore, an immediate public health need for new test protocols, guidance documents and standard test methods to precisely evaluate the fundamental characteristics, performance quality and safety of these technologies and devices. Here, we give an overview of our recent development of novel test methodologies for the safety and efficacy evaluation of some emerging biophotonics technologies and medical devices. These methodologies are based on integrating the advanced features of state-of-the-art optical sensor technologies and approaches such as high-resolution fiber-optic sensing, confocal and optical coherence tomography imaging, and infrared spectroscopy. The presentation will also illustrate some methodologies developed and implemented for testing intraocular lens implants, biochemical contamination of medical devices, ultrahigh-resolution nanoscopy, and femtosecond laser therapeutics.

  2. Cognitive Diagnostic Analysis Using Hierarchically Structured Skills

    ERIC Educational Resources Information Center

    Su, Yu-Lan

    2013-01-01

    This dissertation proposes two modified cognitive diagnostic models (CDMs), the deterministic, inputs, noisy, "and" gate with hierarchy (DINA-H) model and the deterministic, inputs, noisy, "or" gate with hierarchy (DINO-H) model. Both models incorporate the hierarchical structures of the cognitive skills in the model estimation…

  3. Deterministic Mean-Field Ensemble Kalman Filtering

    DOE PAGES

    Law, Kody J. H.; Tembine, Hamidou; Tempone, Raul

    2016-05-03

    The proof of convergence of the standard ensemble Kalman filter (EnKF) from Le Gland, Monbet, and Tran [Large sample asymptotics for the ensemble Kalman filter, in The Oxford Handbook of Nonlinear Filtering, Oxford University Press, Oxford, UK, 2011, pp. 598--631] is extended to non-Gaussian state-space models. In this paper, a density-based deterministic approximation of the mean-field limit EnKF (DMFEnKF) is proposed, consisting of a PDE solver and a quadrature rule. Given a certain minimal order of convergence κ between the two, this extends to the deterministic filter approximation, which is therefore asymptotically superior to standard EnKF for dimension d
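
    For orientation, the stochastic baseline against which the deterministic mean-field filter is compared is the perturbed-observation EnKF analysis step. A toy sketch follows; the state dimension, observation operator, and noise levels are invented.

        # Sketch of the standard (stochastic) EnKF analysis step, the baseline the
        # paper's deterministic mean-field construction improves on. Toy problem.
        import numpy as np

        rng = np.random.default_rng(0)
        d, m, n = 2, 1, 50                       # state dim, obs dim, ensemble size
        X = rng.normal(size=(d, n))              # forecast ensemble (columns = members)
        H = np.array([[1.0, 0.0]])               # observe the first state component
        R = np.array([[0.1]])                    # observation error covariance
        y = np.array([0.7])                      # the observation

        Xm = X.mean(axis=1, keepdims=True)
        A = X - Xm                               # ensemble anomalies
        C = A @ A.T / (n - 1)                    # sample covariance
        K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)   # Kalman gain

        # Perturbed-observation update: each member assimilates a jittered y.
        Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, size=n).T
        Xa = X + K @ (Y - H @ X)
        print("analysis mean:", Xa.mean(axis=1))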

  4. Fault Detection for Nonlinear Process With Deterministic Disturbances: A Just-In-Time Learning Based Data Driven Method.

    PubMed

    Yin, Shen; Gao, Huijun; Qiu, Jianbin; Kaynak, Okyay

    2017-11-01

    Data-driven fault detection plays an important role in industrial systems due to its applicability in cases where physical models are unknown. In fault detection, disturbances must be taken into account as an inherent characteristic of processes. Nevertheless, fault detection for nonlinear processes with deterministic disturbances still receives little attention, especially in the data-driven field. To solve this problem, a just-in-time-learning-based data-driven (JITL-DD) fault detection method for nonlinear processes with deterministic disturbances is proposed in this paper. JITL-DD employs the JITL scheme for process description with local model structures to cope with process dynamics and nonlinearity. The proposed method provides a data-driven fault detection solution for nonlinear processes with deterministic disturbances, and offers inherent online adaptation and high fault detection accuracy. Two nonlinear systems, i.e., a numerical example and a sewage treatment process benchmark, are employed to show the effectiveness of the proposed method.
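
    The JITL idea is easiest to see as a residual generator: for every new sample, fit a local model on the nearest historical neighbours and flag a fault when the prediction error is large. The sketch below is a generic illustration of that scheme (distance-based neighbour selection plus a local linear model), not the paper's exact method.

        # Generic JITL-style residual generator: local linear model on k nearest
        # historical neighbours; a large residual suggests a fault.
        import numpy as np

        def jitl_residual(X_hist, y_hist, x_new, y_new, k=20):
            d = np.linalg.norm(X_hist - x_new, axis=1)      # similarity by distance
            idx = np.argsort(d)[:k]                         # k nearest neighbours
            Xk = np.c_[np.ones(k), X_hist[idx]]             # local linear model
            theta, *_ = np.linalg.lstsq(Xk, y_hist[idx], rcond=None)
            y_pred = np.r_[1.0, x_new] @ theta
            return abs(y_new - y_pred)

        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, size=(500, 2))
        y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.01, 500)  # toy nonlinear plant
        x_q = np.array([0.3, -0.2])
        print(jitl_residual(X, y, x_q, y_new=2.0))   # large residual -> likely fault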

  5. Deterministic Mean-Field Ensemble Kalman Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Law, Kody J. H.; Tembine, Hamidou; Tempone, Raul

    The proof of convergence of the standard ensemble Kalman filter (EnKF) from Le Gland, Monbet, and Tran [Large sample asymptotics for the ensemble Kalman filter, in The Oxford Handbook of Nonlinear Filtering, Oxford University Press, Oxford, UK, 2011, pp. 598--631] is extended to non-Gaussian state-space models. In this paper, a density-based deterministic approximation of the mean-field limit EnKF (DMFEnKF) is proposed, consisting of a PDE solver and a quadrature rule. Given a certain minimal order of convergence κ between the two, this extends to the deterministic filter approximation, which is therefore asymptotically superior to standard EnKF for dimension d

  6. Recent advances in MRI technology: Implications for image quality and patient safety

    PubMed Central

    Sobol, Wlad T.

    2012-01-01

    Recent advances in MRI technology are presented, with emphasis on how this new technology impacts clinical operations (better image quality, faster exam times, and improved throughput). In addition, implications for patient safety are discussed with emphasis on the risk of patient injury due to either high local specific absorption rate (SAR) or large cumulative energy doses delivered during long exam times. Patient comfort issues are examined as well. PMID:23961024

  7. Improving Deterministic Reserve Requirements for Security Constrained Unit Commitment and Scheduling Problems in Power Systems

    NASA Astrophysics Data System (ADS)

    Wang, Fengyu

    Traditional deterministic reserve requirements rely on ad-hoc, rule-of-thumb methods to determine adequate reserve in order to ensure a reliable unit commitment. Since congestion and uncertainties exist in the system, both the quantity and the location of reserves are essential to ensure system reliability and market efficiency. Existing deterministic reserve requirements acquire operating reserves on a zonal basis and do not fully capture the impact of congestion. The purpose of a reserve zone is to ensure that operating reserves are spread across the network. Operating reserves are shared inside each reserve zone, but intra-zonal congestion may block the deliverability of operating reserves within a zone. Thus, improving reserve policies such as reserve zones may improve the location and deliverability of reserve. As more non-dispatchable renewable resources are integrated into the grid, it will become increasingly difficult to predict the transfer capabilities and the network congestion. At the same time, renewable resources require operators to acquire more operating reserves. With existing deterministic reserve requirements unable to ensure optimal reserve locations, the importance of reserve location and reserve deliverability will increase. While stochastic programming can be used to determine reserve by explicitly modelling uncertainties, there are still scalability as well as pricing issues. Therefore, new methods to improve existing deterministic reserve requirements are desired. One key barrier to improving existing deterministic reserve requirements is their potential market impacts. A metric, quality of service, is proposed in this thesis to evaluate the price signal and market impacts of proposed hourly reserve zones. Three main goals of this thesis are: 1) to develop a theoretical and mathematical model to better locate reserve while maintaining the deterministic unit commitment and economic dispatch structure, especially with the consideration of renewables; 2) to develop a market settlement scheme for the proposed dynamic reserve policies such that market efficiency is improved; and 3) to evaluate the market impacts and price signal of the proposed dynamic reserve policies.
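
    The deterministic starting point the thesis improves on can be stated as a small optimization: dispatch energy at least cost while holding back a fixed amount of reserve within capacity limits. The sketch below encodes a two-generator version as a linear program; costs, limits, and the reserve level are invented, and real unit commitment adds binary commitment variables, network constraints, and zonal reserve rules.

        # Toy economic dispatch with a deterministic system-wide reserve requirement.
        # Variables x = [p1, p2, r1, r2]; all numbers are illustrative.
        from scipy.optimize import linprog

        D, R = 120.0, 30.0                 # demand (MW) and reserve requirement (MW)
        pmax = [100.0, 80.0]               # generator capacities
        cost = [20.0, 35.0]                # energy cost ($/MWh); reserve costed at 0

        c = cost + [0.0, 0.0]
        A_eq, b_eq = [[1, 1, 0, 0]], [D]                 # power balance
        A_ub = [[0, 0, -1, -1],                          # r1 + r2 >= R
                [1, 0, 1, 0],                            # p1 + r1 <= pmax1
                [0, 1, 0, 1]]                            # p2 + r2 <= pmax2
        b_ub = [-R, pmax[0], pmax[1]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(0, None)] * 4)
        print(res.x)   # cheap unit runs flat out; the expensive unit carries the reserve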

  8. HFE safety reviews of advanced nuclear power plant control rooms

    NASA Technical Reports Server (NTRS)

    Ohara, John

    1994-01-01

    Advanced control rooms (ACRs) will utilize human-system interface (HSI) technologies that may have significant implications for plant safety in that they will affect the operator's overall role and means of interacting with the system. The Nuclear Regulatory Commission (NRC) reviews the human factors engineering (HFE) aspects of HSIs to ensure that they are designed to good HFE principles and support performance and reliability in order to protect public health and safety. However, the only available NRC guidance was developed more than ten years ago, and does not adequately address the human performance issues and technology changes associated with ACRs. Accordingly, a new approach to ACR safety reviews was developed based upon the concept of 'convergent validity'. This approach to ACR safety reviews is described.

  9. Patient safety trilogy: perspectives from clinical engineering.

    PubMed

    Gieras, Izabella; Sherman, Paul; Minsent, Dennis

    2013-01-01

    This article examines the role a clinical engineering or healthcare technology management (HTM) department can play in promoting patient safety from three different perspectives: a community hospital, a national government health system, and an academic medical center. After a general overview, Izabella Gieras from Huntington Hospital in Pasadena, CA, leads off by examining the growing role of human factors in healthcare technology, and describing how her facility uses clinical simulations in medical equipment evaluations. A section by Paul Sherman follows, examining patient safety initiatives from the perspective of the Veterans Health Administration with a focus on hazard alerts and recalls. Dennis Minsent from Oregon Health & Science University writes about patient safety from an academic healthcare perspective, and details how clinical engineers can engage in multidisciplinary safety opportunities.

  10. Whole blood pathogen reduction technology and blood safety in sub-Saharan Africa: A systematic review with regional discussion

    PubMed Central

    Agbor, Gabriel; Asongalem, Emmanuel; Tagny, Claude; Asonganyi, Tazoacha

    2016-01-01

    Background: Despite vast improvements in transfusion services in sub-Saharan Africa over the last decade, there remain serious concerns on the safety and adequacy of the blood supply across the region. Objective: This review paper ascertains the role of pathogen reduction technology (PRT) in improving blood safety and supply adequacy in the region. Method: The state of blood safety in sub-Saharan Africa was reviewed. Meetings, seminars and correspondence were undertaken with key clinicians, scientists and professional bodies in the region, including the World Health Organization’s Regional Office for Africa, to examine the suitability of PRT for improving the safety of whole blood transfusion, a prevalent transfusion format in the region. Results: Existing literature suggests that combining PRT with current blood safety measures (such as serology) would improve the safety and adequacy of the blood supply for transfusions in sub-Saharan Africa. This was echoed by the findings of the stakeholder meetings. Conclusion: Following a detailed appraisal of two leading PRT systems, the Mirasol® PRT System and the Cerus S-303 System, we suggest that companies conduct comprehensive toxicological evaluation of the agents used for PRT and publish this in the scientific literature. We also recommend that the safety and efficacy of these technologies should be established in a randomised clinical trial conducted in sub-Saharan Africa. PMID:28879109

  11. Nuclear electric propulsion operational reliability and crew safety study: NEP systems/modeling report

    NASA Technical Reports Server (NTRS)

    Karns, James

    1993-01-01

    The objective of this study was to establish the initial quantitative reliability bounds for nuclear electric propulsion systems in a manned Mars mission required to ensure crew safety and mission success. Finding the reliability bounds involves balancing top-down (mission-driven) requirements and bottom-up (technology-driven) capabilities. In seeking this balance we hope to accomplish the following: (1) provide design insights into the achievability of the baseline design in terms of reliability requirements, given the existing technology base; (2) suggest alternative design approaches which might enhance reliability and crew safety; and (3) indicate what technology areas require significant research and development to achieve the reliability objectives.
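
    The top-down half of the balance described above is a reliability allocation exercise: a mission-level requirement flows down to bounds on each subsystem. For a series system the arithmetic is a one-liner, sketched below with an invented target and subsystem count.

        # Toy reliability allocation for a series system: R_mission = prod(R_i),
        # so an equal allocation gives each subsystem a bound to meet or beat.
        # Target and subsystem count are illustrative, not the study's numbers.
        n_subsystems = 5                  # e.g., reactor, converters, PMAD, thrusters
        R_mission = 0.99                  # required probability of propulsion success

        R_each = R_mission ** (1.0 / n_subsystems)
        print(f"each subsystem must achieve R >= {R_each:.5f}")   # ~0.998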

  12. Parameter Estimation in Epidemiology: from Simple to Complex Dynamics

    NASA Astrophysics Data System (ADS)

    Aguiar, Maíra; Ballesteros, Sebastién; Boto, João Pedro; Kooi, Bob W.; Mateus, Luís; Stollenwerk, Nico

    2011-09-01

    We revisit the parameter estimation framework for population biological dynamical systems, and apply it to calibrate various models in epidemiology with empirical time series, namely influenza and dengue fever. When it comes to more complex models, like the multi-strain dynamics used to describe the virus-host interaction in dengue fever, even the most recently developed parameter estimation techniques, such as maximum likelihood iterated filtering, reach their computational limits. However, the first results of parameter estimation with data on dengue fever from Thailand indicate a subtle interplay between stochasticity and the deterministic skeleton. The deterministic system on its own already displays complex dynamics, up to deterministic chaos and coexistence of multiple attractors.
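
    The basic calibration loop the paper builds on can be shown in a few lines: integrate a compartment model, compare it to the time series, and minimize the squared error over the parameters. The sketch below fits a simple SIR model to synthetic data standing in for the influenza or dengue series; multi-strain dengue models and iterated-filtering methods go far beyond this.

        # Least-squares fit of an SIR model to a (synthetic) incidence series.
        # All values are illustrative.
        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import minimize

        def sir(y, t, beta, gamma):
            S, I, R = y
            return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

        t = np.linspace(0, 50, 51)
        true = odeint(sir, [0.99, 0.01, 0.0], t, args=(0.5, 0.1))[:, 1]
        data = true + np.random.default_rng(0).normal(0, 0.002, t.size)

        def loss(p):
            I = odeint(sir, [0.99, 0.01, 0.0], t, args=(p[0], p[1]))[:, 1]
            return np.sum((I - data) ** 2)

        fit = minimize(loss, x0=[0.3, 0.2], method="Nelder-Mead")
        print("beta, gamma =", fit.x)      # should approximately recover [0.5, 0.1]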

  13. Structures and Materials Technologies for Extreme Environments Applied to Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.; Clay, Christopher; Rezin, Marc

    2003-01-01

    This paper provides an overview of the evolution of structures and materials technology approaches to survive the challenging extreme environments encountered by earth-to-orbit space transportation systems, with emphasis on more recent developments in the USA. The evolution of technology requirements and experience in the various approaches to meeting these requirements has significantly influenced the technology approaches. While previous goals were primarily performance driven, more recently dramatic improvements in costs/operations and in safety have been paramount goals. Technologies that focus on the cost/operations and safety goals in the area of hot structures and thermal protection systems for reusable launch vehicles are presented. Assessments of the potential ability of the various technologies to satisfy the technology requirements, and their current technology readiness status are also presented.

  14. Mine Safety Education and Training Seminar. Proceedings: Bureau of Mines Technology Transfer Seminar (Pittsburgh, Pennsylvania, May 17, 1988; Beckley, West Virginia, May 19, 1988; St. Louis, Missouri, May 24, 1988; and Reno, Nevada, May 26, 1988). Information Circular 9185.

    ERIC Educational Resources Information Center

    Bureau of Mines (Dept. of Interior), Washington, DC.

    This publication contains the papers presented at four technology transfer seminars on mine safety education and training. The papers highlight the Bureau of Mines' recent research aimed at improving the effectiveness of mine safety training in order to reduce workplace accidents. The following eight papers are included: "Effect of Training…

  15. 49 CFR 236.911 - Exclusions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... system technology. However, a subsystem or component of an office system must comply with the requirements of this subpart if it performs safety-critical functions within, or affects the safety performance... this subpart if they result in a degradation of safety or a material increase in safety-critical...

  16. 49 CFR 236.911 - Exclusions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... system technology. However, a subsystem or component of an office system must comply with the requirements of this subpart if it performs safety-critical functions within, or affects the safety performance... this subpart if they result in a degradation of safety or a material increase in safety-critical...

  17. 49 CFR 236.911 - Exclusions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... system technology. However, a subsystem or component of an office system must comply with the requirements of this subpart if it performs safety-critical functions within, or affects the safety performance... this subpart if they result in a degradation of safety or a material increase in safety-critical...

  18. 49 CFR 236.911 - Exclusions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... system technology. However, a subsystem or component of an office system must comply with the requirements of this subpart if it performs safety-critical functions within, or affects the safety performance... this subpart if they result in a degradation of safety or a material increase in safety-critical...

  19. 49 CFR 236.911 - Exclusions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... system technology. However, a subsystem or component of an office system must comply with the requirements of this subpart if it performs safety-critical functions within, or affects the safety performance... this subpart if they result in a degradation of safety or a material increase in safety-critical...

  20. The effects of safety practice, technology adoption, and firm characteristics on motor carrier safety

    DOT National Transportation Integrated Search

    2004-01-01

    The theory of the firm suggests that firms should maximize profit by investing in safety until marginal cost is equal to the marginal benefit. This paper addresses motor carrier safety from the perspective of the firm, developing the theoretical ...
