Sample records for level detailed analysis

  1. Geometric Modelling of Tree Roots with Different Levels of Detail

    NASA Astrophysics Data System (ADS)

    Guerrero Iñiguez, J. I.

    2017-09-01

This paper presents a geometric approach for modelling tree roots with different Levels of Detail, suitable for analysis of the tree anchoring, potentially occupied underground space, interaction with urban elements, and damage caused and sustained in the built environment. Three types of tree roots are considered to cover several species: tap root, heart-shaped root and lateral roots. Shrubs and smaller plants are not considered; however, a similar approach can be applied if the information is available for individual species. The geometrical approach considers the difficulties of modelling the actual roots, which are dynamic and almost opaque to direct observation, proposing generalized versions. For each type of root, different geometric models are considered to capture the overall shape of the root, a simplified block model, and a planar or surface projected version. Lower detail versions are considered as compatibility versions for 2D systems while higher detail models are suitable for 3D analysis and visualization. The proposed levels of detail are matched with CityGML Levels of Detail, enabling both analysis and aesthetic views for urban modelling.

  2. Decoupled 1D/3D analysis of a hydraulic valve

    NASA Astrophysics Data System (ADS)

    Mehring, Carsten; Zopeya, Ashok; Latham, Matt; Ihde, Thomas; Massie, Dan

    2014-10-01

Analysis approaches during product development of fluid valves and other aircraft fluid delivery components vary greatly depending on the development stage. Traditionally, empirical or simplistic one-dimensional tools are deployed during preliminary design, whereas detailed analysis tools such as CFD (Computational Fluid Dynamics) are used to refine a selected design during the detailed design stage. In recent years, combined 1D/3D co-simulation has been deployed specifically for system level simulations requiring an increased level of analysis detail for one or more components. The present paper presents a decoupled 1D/3D analysis approach where 3D CFD analysis results are utilized to enhance the fidelity of a dynamic 1D model in the context of an aircraft fuel valve.

  3. Beyond the Shadow of a Trait: Understanding Discounting through Item-Level Analysis of Personality Scales

    ERIC Educational Resources Information Center

    Charlton, Shawn R.; Gossett, Bradley D.; Charlton, Veda A.

    2011-01-01

    Temporal discounting, the loss in perceived value associated with delayed outcomes, correlates with a number of personality measures, suggesting that an item-level analysis of trait measures might provide a more detailed understanding of discounting. The current report details two studies that investigate the utility of such an item-level…

  4. The Economic Contribution of Canada's Colleges and Institutes. An Analysis of Investment Effectiveness and Economic Growth. Volume 2: Detailed Results by Gender and Entry Level of Education

    ERIC Educational Resources Information Center

    Robison, M. Henry; Christophersen, Kjell A.

    2008-01-01

    The purpose of this volume is to present the results of the economic impact analysis in detail by gender and entry level of education. On the data entry side, gender and entry level of education are important variables that help characterize the student body profile. This profile data links to national statistical databases which are already…

  5. Automated distress surveys : analysis of network-level data.

    DOT National Transportation Integrated Search

    2017-04-01

    TxDOT Project 0-6663, Phase 1: Rutting : Applus, Dynatest, Fugro, Pathway and TxDOT : Reference: detailed project level (24 550-ft sections) : Phase 2: Distresses : Dynatest, Fugro, WayLink-OSU and TxDOT : Reference: detailed proj...

  6. Micro-Macro Analysis of Complex Networks

    PubMed Central

    Marchiori, Massimo; Possamai, Lino

    2015-01-01

    Complex systems have attracted considerable interest because of their wide range of applications, and are often studied via a “classic” approach: study a specific system, find a complex network behind it, and analyze the corresponding properties. This simple methodology has produced a great deal of interesting results, but relies on an often implicit underlying assumption: the level of detail on which the system is observed. However, in many situations, physical or abstract, the level of detail can be one out of many, and might also depend on intrinsic limitations in viewing the data with a different level of abstraction or precision. So, a fundamental question arises: do properties of a network depend on its level of observability, or are they invariant? If there is a dependence, then an apparently correct network modeling could in fact just be a bad approximation of the true behavior of a complex system. In order to answer this question, we propose a novel micro-macro analysis of complex systems that quantitatively describes how the structure of complex networks varies as a function of the detail level. To this extent, we have developed a new telescopic algorithm that abstracts from the local properties of a system and reconstructs the original structure according to a fuzziness level. This way we can study what happens when passing from a fine level of detail (“micro”) to a different scale level (“macro”), and analyze the corresponding behavior in this transition, obtaining a deeper spectrum analysis. The obtained results show that many important properties are not universally invariant with respect to the level of detail, but instead strongly depend on the specific level on which a network is observed. Therefore, caution should be taken in every situation where a complex network is considered, if its context allows for different levels of observability. PMID:25635812
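
    As a hedged illustration of the level-of-detail question raised in this abstract (not the authors' telescopic algorithm), the Python sketch below coarse-grains a random graph by merging nodes into blocks and compares a structural property across the two observation levels; the graph parameters and block size are arbitrary assumptions.

    ```python
    # Illustrative sketch only: compare a network property at two levels of
    # detail by merging groups of nodes ("macro") of a random graph ("micro").
    # This is not the telescopic algorithm described in the paper.
    import networkx as nx

    def coarsen(graph, block_size):
        """Merge consecutive node ids into blocks and keep inter-block edges."""
        coarse = nx.Graph()
        block = {node: node // block_size for node in graph.nodes()}
        coarse.add_nodes_from(set(block.values()))
        for u, v in graph.edges():
            if block[u] != block[v]:
                coarse.add_edge(block[u], block[v])
        return coarse

    micro = nx.erdos_renyi_graph(n=1000, p=0.01, seed=1)   # assumed parameters
    macro = coarsen(micro, block_size=10)

    for name, g in (("micro", micro), ("macro", macro)):
        print(name, g.number_of_nodes(), g.number_of_edges(),
              round(nx.average_clustering(g), 3))
    ```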

  7. A Primer on Architectural Level Fault Tolerance

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    2008-01-01

    This paper introduces the fundamental concepts of fault tolerant computing. Key topics covered are voting, fault detection, clock synchronization, Byzantine Agreement, diagnosis, and reliability analysis. Low level mechanisms such as Hamming codes or low level communications protocols are not covered. The paper is tutorial in nature and does not cover any topic in detail. The focus is on rationale and approach rather than detailed exposition.

  8. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

Biegel, Bryan A. (Technical Monitor); Jost, G.; Jin, H.; Labarta, J.; Gimenez, J.; Caubet, J.

    2003-01-01

    Parallel programming paradigms include process level parallelism, thread level parallelization, and multilevel parallelism. This viewgraph presentation describes a detailed performance analysis of these paradigms for Shared Memory Architecture (SMA). This analysis uses the Paraver Performance Analysis System. The presentation includes diagrams of a flow of useful computations.

  9. Development of Multidisciplinary, Multifidelity Analysis, Integration, and Optimization of Aerospace Vehicles

    DTIC Science & Technology

    2010-02-27

investigated in more detail. The intermediate level of fidelity, though more expensive, is then used to refine the analysis, add geometric detail, and...design stage is used to further refine the analysis, narrowing the design to a handful of options. Figure 1. Integrated Hierarchical Framework. In...computational structural and computational fluid modeling. For the structural analysis tool we used McIntosh Structural Dynamics' finite element code CNEVAL

  10. Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis

    NASA Technical Reports Server (NTRS)

    Sexstone, Matthew G.

    1998-01-01

    This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.
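
    ELAPS itself is not reproduced here; as a generic, hedged illustration of the kind of stochastic mass-property calculation mentioned above, the Python sketch below propagates assumed input uncertainties through a placeholder wing-mass relation by Monte Carlo sampling. The relation, distributions, and numbers are hypothetical, not ELAPS equations or study results.

    ```python
    # Hypothetical Monte Carlo sketch of a stochastic wing-mass estimate.
    # The mass relation and input distributions below are placeholders, not
    # ELAPS output or a validated weight equation.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    skin_thickness = rng.normal(2.0e-3, 0.2e-3, n)   # m, assumed
    wing_area      = rng.normal(60.0, 1.5, n)        # m^2, assumed
    rho            = rng.normal(1600.0, 50.0, n)     # kg/m^3 composite, assumed
    nonstructural  = rng.normal(150.0, 20.0, n)      # kg, assumed

    # Placeholder relation: two skins of uniform thickness plus non-structural mass.
    wing_mass = 2.0 * rho * wing_area * skin_thickness + nonstructural

    print(f"mean = {wing_mass.mean():.1f} kg, "
          f"std = {wing_mass.std():.1f} kg, "
          f"95th pct = {np.percentile(wing_mass, 95):.1f} kg")
    ```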

  12. Land use classification and change analysis using ERTS-1 imagery in CARETS

    NASA Technical Reports Server (NTRS)

    Alexander, R. H.

    1973-01-01

Land use detail in the CARETS area obtainable from ERTS exceeds the expectations of the Interagency Steering Committee and the USGS proposed standardized classification, which presents Level 1 categories for ERTS and Level 2 for high altitude aircraft data. Some Level 2 and Level 3 categories, in addition to Level 1, were identified on ERTS data. Significant land use changes totaling 39.2 sq km in the Norfolk-Portsmouth SMSA were identified and mapped at Level 2 detail using a combination of procedures employing ERTS and high altitude aircraft data.

  13. Cost Analysis Sources and Documents Data Base Reference Manual (Update)

    DTIC Science & Technology

    1989-06-01

M: Reference Manual PRICE H: Training Course Workbook 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical...Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required...Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986

  14. Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items,

    DTIC Science & Technology

    1975-12-01

Note ... Adaptive Single Exponential Smoothing ... Choosing the Smoothing Constant ... methodology used in the study, an analysis of results, and a detailed summary. Chapter I, Methodology, contains a description of the data, a ... Chapter IV, Detailed Summary, presents a detailed summary of the findings, lists the limitations inherent in the research methodology, and

  15. Viewer: a User Interface for Failure Region Analysis

    DTIC Science & Technology

    1990-12-01

another possible area of continued research. The program could detect whether the user is a beginner, intermediate, or expert and provide different...interfaces for each level. The beginner level would provide detailed help functions, and prompt the user with detailed explanations of what the program...June 1990. Brooke, J.B. and Duncan, K.D., "Experimental Studies of Flowchart Use at Different Stages of Program Debugging" (Ergonomics, Vol 23, No

  16. STEM: Science Technology Engineering Mathematics. State-Level Analysis

    ERIC Educational Resources Information Center

    Carnevale, Anthony P.; Smith, Nicole; Melton, Michelle

    2011-01-01

    The science, technology, engineering, and mathematics (STEM) state-level analysis provides policymakers, educators, state government officials, and others with details on the projections of STEM jobs through 2018. This report delivers a state-by-state snapshot of the demand for STEM jobs, including: (1) The number of forecast net new and…

  17. A Comparison of Functional Models for Use in the Function-Failure Design Method

    NASA Technical Reports Server (NTRS)

    Stock, Michael E.; Stone, Robert B.; Tumer, Irem Y.

    2006-01-01

When failure analysis and prevention, guided by historical design knowledge, are coupled with product design at its conception, shorter design cycles are possible. By decreasing the design time of a product in this manner, design costs are reduced and the product will better suit the customer's needs. Prior work indicates that similar failure modes occur with products (or components) with similar functionality. To capitalize on this finding, a knowledge base of historical failure information linked to functionality is assembled for use by designers. One possible use for this knowledge base is within the Elemental Function-Failure Design Method (EFDM). This design methodology and failure analysis tool begins at conceptual design and keeps the designer cognizant of failures that are likely to occur based on the product's functionality. The EFDM offers potential improvement over current failure analysis methods, such as FMEA, FMECA, and Fault Tree Analysis, because it can be implemented hand in hand with other conceptual design steps and carried throughout a product's design cycle. These other failure analysis methods can only truly be effective after a physical design has been completed. The EFDM, however, is only as good as the knowledge base that it draws from, and therefore it is of utmost importance to develop a knowledge base that will be suitable for use across a wide spectrum of products. One fundamental question that arises in using the EFDM is: At what level of detail should functional descriptions of components be encoded? This paper explores two approaches to populating a knowledge base with actual failure occurrence information from Bell 206 helicopters. Functional models expressed at various levels of detail are investigated to determine the necessary detail for an applicable knowledge base that can be used by designers in both new designs as well as redesigns. High-level and more detailed functional descriptions are derived for each failed component based on NTSB accident reports. To best record this data, standardized functional and failure mode vocabularies are used. Two separate function-failure knowledge bases are then created and compared. Results indicate that encoding failure data using more detailed functional models allows for a more robust knowledge base. Interestingly, however, when applying the EFDM, high-level descriptions continue to produce useful results when using the knowledge base generated from the detailed functional models.

  18. Using Toulmin analysis to analyse an instructor's proof presentation in abstract algebra

    NASA Astrophysics Data System (ADS)

Fukawa-Connelly, Timothy

    2014-01-01

    This paper provides a method for analysing undergraduate teaching of proof-based courses using Toulmin's model (1969) of argumentation. It presents a case study of one instructor's presentation of proofs. The analysis shows that the instructor presents different levels of detail in different proofs; thus, the students have an inconsistent set of written models for their work. Similarly, the analysis shows that the details the instructor says aloud differ from what she writes down. Although her verbal commentary provides additional detail and appears to have pedagogical value, for instance, by modelling thinking that supports proof writing, this value might be better realized if she were to change her teaching practices.

  19. Horizontal Structure: A Neo-Piagetian Analysis of Structural Parallels across Domains.

    ERIC Educational Resources Information Center

    McKeough, Anne M.

    An analysis of children's narrative composition and art revealed concurrent development at both a general structural level and at a fine-grained detail level. A three-part study investigated whether this general cognitive pattern would be maintained across a different range of tasks: literary composition, scientific reasoning, and working memory.…

  20. The Culture of Distance Education: Implementing an Online Graduate Level Course in Audience Analysis.

    ERIC Educational Resources Information Center

    Duin, Ann Hill

    1998-01-01

    Details the experience of designing, implementing, and evaluating an online course in audience analysis at the graduate level. Describes how the educational culture of the Land Grant Mission flowed into efforts to create a quality learning experience. Discusses how the Web modules and asynchronous (listserv) and synchronous (MOO) conversations…

  1. Mind the Gap: An Initial Analysis of the Transition of a Second Level Curriculum Reform to Higher Education

    ERIC Educational Resources Information Center

    Prendergast, Mark; Faulkner, Fiona; Breen, Cormac; Carr, Michael

    2017-01-01

    This article details an initial analysis of the transition of a second level curriculum reform to higher education in Ireland. The reform entitled 'Project Maths' involved changes to what second level students learn in mathematics, how they learn it, and how they are assessed. Changes were rolled out nationally on a phased basis in September 2010.…

  2. Quantum chemical characterization of N-(2-hydroxybenzylidene)acetohydrazide (HBAH): a detailed vibrational and NLO analysis.

    PubMed

    Tamer, Ömer; Avcı, Davut; Atalay, Yusuf

    2014-01-03

    The molecular modeling of N-(2-hydroxybenzylidene)acetohydrazide (HBAH) was carried out using B3LYP, CAMB3LYP and PBE1PBE levels of density functional theory (DFT). The molecular structure of HBAH was solved by means of IR, NMR and UV-vis spectroscopies. In order to find the stable conformers, conformational analysis was performed based on B3LYP level. A detailed vibrational analysis was made on the basis of potential energy distribution (PED). HOMO and LUMO energies were calculated, and the obtained energies displayed that charge transfer occurs in HBAH. NLO analysis indicated that HBAH can be used as an effective NLO material. NBO analysis also proved that charge transfer, conjugative interactions and intramolecular hydrogen bonding interactions occur through HBAH. Additionally, major contributions from molecular orbitals to the electronic transitions were investigated theoretically. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Training Analysis of P-3 Replacement Pilot Training.

    ERIC Educational Resources Information Center

    Browning, Robert F.; And Others

    The report covers an evaluation of current P-3 pilot training programs at the replacement squadron level. It contains detailed discussions concerning training hardware and software that have been supplied. A detailed examination is made of the curriculum and the simulation capabilities and utilization of P-3 operational flight trainers. Concurrent…

  4. Predict Dem Bones!

    ERIC Educational Resources Information Center

    Gray, John S.

    1994-01-01

    A detailed analysis and computer-based solution to a puzzle addressing the arrangement of dominoes on a grid is presented. The problem is one used in a college-level data structures or algorithms course. The solution uses backtracking to generate all possible answers. Details of the use of backtracking and techniques for mapping abstract problems…

  5. Abstractions for DNA circuit design.

    PubMed

    Lakin, Matthew R; Youssef, Simon; Cardelli, Luca; Phillips, Andrew

    2012-03-07

    DNA strand displacement techniques have been used to implement a broad range of information processing devices, from logic gates, to chemical reaction networks, to architectures for universal computation. Strand displacement techniques enable computational devices to be implemented in DNA without the need for additional components, allowing computation to be programmed solely in terms of nucleotide sequences. A major challenge in the design of strand displacement devices has been to enable rapid analysis of high-level designs while also supporting detailed simulations that include known forms of interference. Another challenge has been to design devices capable of sustaining precise reaction kinetics over long periods, without relying on complex experimental equipment to continually replenish depleted species over time. In this paper, we present a programming language for designing DNA strand displacement devices, which supports progressively increasing levels of molecular detail. The language allows device designs to be programmed using a common syntax and then analysed at varying levels of detail, with or without interference, without needing to modify the program. This allows a trade-off to be made between the level of molecular detail and the computational cost of analysis. We use the language to design a buffered architecture for DNA devices, capable of maintaining precise reaction kinetics for a potentially unbounded period. We test the effectiveness of buffered gates to support long-running computation by designing a DNA strand displacement system capable of sustained oscillations.
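
    The trade-off between level of molecular detail and analysis cost described above can be illustrated, outside the paper's own language, with a small ODE sketch in Python: the same displacement reaction is simulated once as a single bimolecular step (coarse) and once with an explicit toehold-bound intermediate (detailed). All rate constants and concentrations are assumed values chosen only so the two levels roughly agree.

    ```python
    # Illustrative only: one strand-displacement reaction modelled at two
    # levels of detail. Rates and concentrations are assumed, not from the paper.
    import numpy as np
    from scipy.integrate import solve_ivp

    k_eff = 1e5               # 1/M/s, assumed effective rate (coarse model)
    k_on, k_off = 1e6, 10.0   # 1/M/s and 1/s, assumed toehold binding/unbinding
    k_mig = 1.0               # 1/s, assumed branch-migration/displacement rate

    def coarse(t, y):
        x, g, out = y          # input strand, gate, output strand
        r = k_eff * x * g
        return [-r, -r, r]

    def detailed(t, y):
        x, g, inter, out = y   # adds an explicit toehold-bound intermediate
        bind = k_on * x * g - k_off * inter
        disp = k_mig * inter
        return [-bind, -bind, bind - disp, disp]

    y0 = 1e-7                  # 100 nM initial concentrations, assumed
    t_eval = np.linspace(0, 2000, 50)
    c = solve_ivp(coarse, (0, 2000), [y0, y0, 0.0], t_eval=t_eval, method="LSODA")
    d = solve_ivp(detailed, (0, 2000), [y0, y0, 0.0, 0.0], t_eval=t_eval, method="LSODA")
    print("output at t = 2000 s:", c.y[2, -1], "(coarse) vs", d.y[3, -1], "(detailed)")
    ```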

  6. Radarsat-1 and ERS InSAR analysis over southeastern coastal Louisiana: Implications for mapping water-level changes beneath swamp forests

    USGS Publications Warehouse

    Lu, Z.; Kwoun, Oh-Ig

    2008-01-01

Detailed analysis of C-band European Remote Sensing 1 and 2 (ERS-1/ERS-2) and Radarsat-1 interferometric synthetic aperture radar (InSAR) imagery was conducted to study water-level changes of coastal wetlands of southeastern Louisiana. Radar backscattering and InSAR coherence suggest that the dominant radar backscattering mechanism for swamp forest and saline marsh is double-bounce backscattering, implying that InSAR images can be used to estimate water-level changes with unprecedented spatial details. On the one hand, InSAR images suggest that water-level changes over the study site can be dynamic and spatially heterogeneous and cannot be represented by readings from sparsely distributed gauge stations. On the other hand, InSAR phase measurements are disconnected by structures and other barriers and require absolute water-level measurements from gauge stations or other sources to convert InSAR phase values to absolute water-level changes. © 2006 IEEE.

  7. Two-dimensional analysis of coupled heat and moisture transport in masonry structures

    NASA Astrophysics Data System (ADS)

    Krejčí, Tomáš

    2016-06-01

Reconstruction and maintenance of historical buildings and bridges require good knowledge of temperature and moisture distribution. Sharp changes in the temperature and moisture can lead to damage. This paper describes analysis of coupled heat and moisture transfer in masonry based on a two-level approach. The macro-scale level describes the whole structure, while the meso-scale level takes into account the detailed composition of the masonry. The two-level approach is very computationally demanding and was implemented in parallel. The two-level approach was used in analysis of temperature and moisture distribution in the Charles Bridge in Prague, Czech Republic.

  8. Computer Analysis of Air Pollution from Highways, Streets, and Complex Interchanges

    DOT National Transportation Integrated Search

    1974-03-01

    A detailed computer analysis of air quality for a complex highway interchange was prepared, using an in-house version of the Environmental Protection Agency's Gaussian Highway Line Source Model. This analysis showed that the levels of air pollution n...

  9. Aquarius Reflector Surface Temperature Monitoring Test and Analysis

    NASA Technical Reports Server (NTRS)

    Abbott, Jamie; Lee, Siu-Chun; Becker, Ray

    2008-01-01

    The presentation addresses how to infer the front side temperatures for the Aquarius L-band reflector based upon backside measurement sites. Slides discussing the mission objectives and design details are at the same level found on typical project outreach websites and in conference papers respectively. The test discussion provides modest detail of an ordinary thermal balance test using mockup hardware. The photographs show an off-Lab vacuum chamber facility with no compromising details.

10. Detailed α-decay study of 180Tl

    NASA Astrophysics Data System (ADS)

    Andel, B.; Andreyev, A. N.; Antalic, S.; Barzakh, A.; Bree, N.; Cocolios, T. E.; Comas, V. F.; Diriken, J.; Elseviers, J.; Fedorov, D. V.; Fedosseev, V. N.; Franchoo, S.; Ghys, L.; Heredia, J. A.; Huyse, M.; Ivanov, O.; Köster, U.; Liberati, V.; Marsh, B. A.; Nishio, K.; Page, R. D.; Patronis, N.; Seliverstov, M. D.; Tsekhanovich, I.; Van den Bergh, P.; Van De Walle, J.; Van Duppen, P.; Venhart, M.; Vermote, S.; Veselský, M.; Wagemans, C.

    2017-11-01

A detailed α-decay spectroscopy study of 180Tl has been performed at ISOLDE (CERN). Z-selective ionization by the Resonance Ionization Laser Ion Source (RILIS) coupled to mass separation provided a high-purity beam of 180Tl. Fine-structure α decays to excited levels in the daughter 176Au were identified and an α-decay scheme of 180Tl was constructed based on an analysis of α-γ and α-γ-γ coincidences. Multipolarities of several γ-ray transitions deexciting levels in 176Au were determined. Based on the analysis of reduced α-decay widths, it was found that all α decays are hindered, which signifies a change of configuration between the parent and all daughter states.

  11. Dasymetric high resolution population distribution estimates for improved decision making, with a case study of sea-level rise vulnerability in Boca Raton, Florida

    NASA Astrophysics Data System (ADS)

    Ziegler, Hannes Moritz

Planners and managers often rely on coarse population distribution data from the census for addressing various social, economic, and environmental problems. In the analysis of physical vulnerabilities to sea-level rise, census units such as blocks or block groups are coarse relative to the required decision-making application. This study explores the benefits offered by integrating image classification and dasymetric mapping at the household level to provide detailed small area population estimates at the scale of residential buildings. In a case study of Boca Raton, FL, a sea-level rise inundation grid based on mapping methods by NOAA is overlaid on the highly detailed population distribution data to identify vulnerable residences and estimate population displacement. The enhanced spatial detail offered through this method has the potential to better guide targeted strategies for future development, mitigation, and adaptation efforts.
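
    A minimal sketch of the dasymetric idea described above, under hypothetical inputs: a census-block population is reallocated to individual residential buildings in proportion to footprint area, and the buildings falling inside an inundation mask are summed. This shows only the core arithmetic, not the study's actual workflow or data.

    ```python
    # Hypothetical dasymetric reallocation: block population is distributed to
    # residential buildings by footprint area, then exposed population is the
    # sum over buildings inside an inundation zone. All inputs are made up.
    buildings = [
        # (building_id, block_id, footprint_m2, inundated)
        ("b1", "blk1", 180.0, True),
        ("b2", "blk1", 120.0, False),
        ("b3", "blk1", 300.0, True),
        ("b4", "blk2", 250.0, False),
    ]
    block_population = {"blk1": 90, "blk2": 40}   # census counts, assumed

    area_per_block = {}
    for _, blk, area, _ in buildings:
        area_per_block[blk] = area_per_block.get(blk, 0.0) + area

    exposed = 0.0
    for _, blk, area, inundated in buildings:
        share = block_population[blk] * area / area_per_block[blk]
        if inundated:
            exposed += share

    print(f"estimated population in inundation zone: {exposed:.1f}")
    ```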

  12. Transportation analyses for the lunar-Mars initiative

    NASA Technical Reports Server (NTRS)

    Woodcock, Gordon R.; Buddington, Patricia A.

    1991-01-01

This paper focuses on certain results of an ongoing NASA-sponsored study by Boeing, including (1) a series of representative space exploration scenarios; (2) the levels of effort required to accomplish each; and (3) a range of candidate transportation systems as partial implementations of the scenarios. This effort predated release of the Synthesis report; the three levels of activity described are not responses to the Synthesis architectures. These three levels (minimum, median, and ambitious) do envelop the range of scope described in the four Synthesis architecture models. The level of analysis detail matched the currently known level of detail of transportation hardware systems and mission scenarios. The study did not include detailed analysis of earth-to-orbit transportation, surface systems, or tracking and communications systems. The influence of earth-to-orbit systems was considered in terms of delivery capacity and cost. Aspects of additional options, such as in situ resource utilization, are explored as needed to indicate potential benefits. Results favored cryogenic chemical propulsion for low activity levels and undemanding missions (such as cargo and some lunar missions), nuclear thermal propulsion for median activity levels similar to the Synthesis architectures, and nuclear thermal propulsion with aerobraking or nuclear electric propulsion for high activity levels. Solar electric propulsion was seen as having an important role if the present high unit cost (i.e., dollars per watt) of space photovoltaics could be reduced by a factor of five or more at production rates of megawatts per year.

  13. A Novel Method for the In-Depth Multimodal Analysis of Student Learning Trajectories in Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Liu, Ran; Stamper, John; Davenport, Jodi

    2018-01-01

    Temporal analyses are critical to understanding learning processes, yet understudied in education research. Data from different sources are often collected at different grain sizes, which are difficult to integrate. Making sense of data at many levels of analysis, including the most detailed levels, is highly time-consuming. In this paper, we…

  14. Falcon: A Temporal Visual Analysis System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.

    2016-09-05

Flexible visual exploration of long, high-resolution time series from multiple sensor streams is a challenge in several domains. Falcon is a visual analytics approach that helps researchers acquire a deep understanding of patterns in log and imagery data. Falcon allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations with multiple levels of detail. These capabilities are applicable to the analysis of any quantitative time series.
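
    As a rough sketch (not Falcon's implementation), the multiple levels of detail mentioned above can be produced by min/max binning of a long time series, so that overview displays keep extremes while detail views use the raw samples; the bin sizes and synthetic series here are arbitrary assumptions.

    ```python
    # Sketch of multi-level-of-detail downsampling for long time series:
    # each coarser level stores per-bin min and max so an overview still
    # shows extremes. Not Falcon's actual algorithm.
    import numpy as np

    def min_max_levels(values, bin_sizes=(10, 100, 1000)):
        levels = {1: values}                      # level 1 = raw samples
        for b in bin_sizes:
            n = len(values) // b
            binned = values[: n * b].reshape(n, b)
            levels[b] = np.stack([binned.min(axis=1), binned.max(axis=1)], axis=1)
        return levels

    rng = np.random.default_rng(0)
    series = np.sin(np.linspace(0, 200, 1_000_000)) + rng.normal(0, 0.1, 1_000_000)
    for b, arr in min_max_levels(series).items():
        print(f"bin size {b}: stored shape {arr.shape}")
    ```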

  15. 49 CFR 611.5 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... following definitions apply: Alternatives analysis is a corridor level analysis which evaluates all... should include transit improvements lower in cost than the new start which result in a better ratio of... preparation of final construction plans (including construction management plans), detailed specifications...

  16. 49 CFR 611.5 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... following definitions apply: Alternatives analysis is a corridor level analysis which evaluates all... should include transit improvements lower in cost than the new start which result in a better ratio of... preparation of final construction plans (including construction management plans), detailed specifications...

  17. 49 CFR 611.5 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... following definitions apply: Alternatives analysis is a corridor level analysis which evaluates all... should include transit improvements lower in cost than the new start which result in a better ratio of... preparation of final construction plans (including construction management plans), detailed specifications...

18. How could multimedia information about dental implant surgery affect patients' anxiety levels?

    PubMed

    Kazancioglu, H-O; Dahhan, A-S; Acar, A-H

    2017-01-01

To evaluate the effects of different patient education techniques on patients' anxiety levels before and after dental implant surgery, sixty patients were randomized into three groups of 20: group 1 received basic information verbally, with details of the operation and recovery; group 2 (the study group) received basic information verbally with details of the operative procedures and recovery, and also watched a movie on single-implant surgery; and a control group received basic information verbally, devoid of the details of the operative procedures and recovery. Anxiety levels were assessed using Spielberger's State-Trait Anxiety Inventory (STAI) and the Modified Dental Anxiety Scale (MDAS). Pain was assessed with a visual analog scale (VAS). The most significant changes were observed in the movie group (P < 0.05). Patients who were more anxious also used more analgesic medication. Linear regression analysis showed that female patients had higher levels of anxiety (P < 0.05). Preoperative multimedia information increases anxiety levels.

  19. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results are presented of the hazards analysis which was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top level functional flow diagrams, to perform the first level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in total.

  20. Dynamics of land change in India: a fine-scale spatial analysis

    NASA Astrophysics Data System (ADS)

    Meiyappan, P.; Roy, P. S.; Sharma, Y.; Jain, A. K.; Ramachandran, R.; Joshi, P. K.

    2015-12-01

Land is scarce in India: India occupies 2.4% of the world's land area, but supports over 1/6th of the world's human and livestock population. This high population-to-land ratio, combined with socioeconomic development and increasing consumption, has placed tremendous pressure on India's land resources for food, feed, and fuel. In this talk, we present contemporary (1985 to 2005) spatial estimates of land change in India using national-level analysis of Landsat imageries. Further, we investigate the causes of the spatial patterns of change using two complementary lines of evidence. First, we use statistical models estimated at macro-scale to understand the spatial relationships between land change patterns and their concomitant drivers. This analysis, using our newly compiled extensive socioeconomic database at village level (~630,000 units), is 100x higher in spatial resolution compared to existing datasets and covers over 200 variables. The detailed socioeconomic data enabled the fine-scale spatial analysis with Landsat data. Second, we synthesized information from over 130 survey-based case studies on land use drivers in India to complement our macro-scale analysis. The case studies are especially useful to identify unobserved variables (e.g. farmer's attitude towards risk). Ours is the most detailed analysis of contemporary land change in India, both in terms of national extent, and the use of detailed spatial information on land change, socioeconomic factors, and synthesis of case studies.

  1. Tomcat-Projects_RF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warrant, Marilyn M.; Garcia, Rudy J.; Zhang, Pengchu

    2004-09-15

Tomcat-Projects_RF is a software package for analyzing sensor data obtained from a database and displaying the results with JavaServer Pages (JSP). SQL Views into the dataset are tailored for personnel having different roles in monitoring the items in a storage facility. For example, an inspector, a host treaty compliance officer, a system engineer, and software developers were the users identified that would need to access data at different levels of detail. The analysis provides a high-level status of the storage facility and allows the user to go deeper into the data details if the user desires.

  2. Climate Action Benefits: Methods of Analysis

    EPA Pesticide Factsheets

    This page provides detailed information on the methods used in the CIRA analyses, including the overall framework, temperature projections, precipitation projections, sea level rise projections, uncertainty, and limitations.

  3. Enhanced LOD Concepts for Virtual 3d City Models

    NASA Astrophysics Data System (ADS)

    Benner, J.; Geiger, A.; Gröger, G.; Häfele, K.-H.; Löwner, M.-O.

    2013-09-01

Virtual 3D city models contain digital three dimensional representations of city objects like buildings, streets or technical infrastructure. Because the size and complexity of these models continuously grow, a Level of Detail (LoD) concept is indispensable, one that effectively supports the partitioning of a complete model into alternative models of different complexity and provides metadata addressing the informational content, complexity and quality of each alternative model. After a short overview on various LoD concepts, this paper discusses the existing LoD concept of the CityGML standard for 3D city models and identifies a number of deficits. Based on this analysis, an alternative concept is developed and illustrated with several examples. It differentiates, first, between a Geometric Level of Detail (GLoD) and a Semantic Level of Detail (SLoD), and second, between the building interior and its exterior shell. Finally, a possible implementation of the new concept is demonstrated by means of a UML model.

  4. Integrated approach for stress analysis of high performance diesel engine cylinder head

    NASA Astrophysics Data System (ADS)

    Chainov, N. D.; Myagkov, L. L.; Malastowski, N. S.; Blinov, A. S.

    2018-03-01

Growing thermal and mechanical loads due to the development of engines with a high level of mean effective pressure determine the requirements for cylinder head durability. In this paper, computational schemes for thermal and mechanical stress analysis of a high performance diesel engine cylinder head are described. The most important aspects of this approach are accounting for the temperature fields of mating parts (valves and valve seats), heat transfer modeling in the cooling jacket of the cylinder head, and topology optimization of the part's load-bearing layout. Simulation results are shown and analyzed.

  5. A Homegrown Design for Data Warehousing: A District Customizes Its Own Process for Generating Detailed Information about Students in Real Time

    ERIC Educational Resources Information Center

    Thompson, Terry J.; Gould, Karen J.

    2005-01-01

    In recent years the Metropolitan School District of Wayne Township in Indianapolis has been awash in data. In attempts to improve levels of student achievement, the authors collected all manner of statistical details about students and schools and attempted to perform data analysis as part of the school improvement process. The authors were never…

  6. Visualizing Morphological Changes of Abscission Zone Cells in Arabidopsis by Scanning Electron Microscope.

    PubMed

    Shi, Chun-Lin; Butenko, Melinka A

    2018-01-01

    Scanning electron microscope (SEM) is a type of electron microscope which produces detailed images of surface structures. It has been widely used in plants and animals to study cellular structures. Here, we describe a detailed protocol to prepare samples of floral abscission zones (AZs) for SEM, as well as further image analysis. We show that it is a powerful tool to detect morphologic changes at the cellular level during the course of abscission in wild-type plants and to establish the details of phenotypic alteration in abscission mutants.

  7. Plane-wave decomposition by spherical-convolution microphone array

    NASA Astrophysics Data System (ADS)

    Rafaely, Boaz; Park, Munhum

    2004-05-01

    Reverberant sound fields are widely studied, as they have a significant influence on the acoustic performance of enclosures in a variety of applications. For example, the intelligibility of speech in lecture rooms, the quality of music in auditoria, the noise level in offices, and the production of 3D sound in living rooms are all affected by the enclosed sound field. These sound fields are typically studied through frequency response measurements or statistical measures such as reverberation time, which do not provide detailed spatial information. The aim of the work presented in this seminar is the detailed analysis of reverberant sound fields. A measurement and analysis system based on acoustic theory and signal processing, designed around a spherical microphone array, is presented. Detailed analysis is achieved by decomposition of the sound field into waves, using spherical Fourier transform and spherical convolution. The presentation will include theoretical review, simulation studies, and initial experimental results.
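
    As a toy, hedged sketch of the spherical Fourier transform step mentioned above (not the full plane-wave decomposition with radial compensation, and not the authors' system), the Python snippet below decomposes a pressure field sampled on a sphere into spherical harmonic coefficients by direct quadrature on an equal-angle grid; the grid and test field are assumed for illustration.

    ```python
    # Toy spherical Fourier transform of a pressure field sampled on a sphere,
    # by direct quadrature on an equal-angle grid. No radial (mode-strength)
    # compensation is applied, so this is only the angular decomposition step.
    import numpy as np
    from scipy.special import sph_harm   # SciPy convention: sph_harm(m, n, azimuth, polar)

    n_az, n_pol = 72, 36
    az = np.linspace(0, 2 * np.pi, n_az, endpoint=False)
    pol = (np.arange(n_pol) + 0.5) * np.pi / n_pol
    AZ, POL = np.meshgrid(az, pol, indexing="ij")
    dA = (2 * np.pi / n_az) * (np.pi / n_pol) * np.sin(POL)   # surface element

    # Assumed test field: the real part of a single harmonic of order (n=2, m=1).
    pressure = sph_harm(1, 2, AZ, POL).real

    def sft_coefficient(field, m, n):
        """Approximate the integral of field * conj(Y_nm) over the sphere."""
        return np.sum(field * np.conj(sph_harm(m, n, AZ, POL)) * dA)

    # Should recover |c| ~= 0.5 at (n, m) = (2, +1) and (2, -1).
    for n in range(3):
        for m in range(-n, n + 1):
            c = sft_coefficient(pressure, m, n)
            if abs(c) > 1e-3:
                print(f"n={n}, m={m}: |c| = {abs(c):.3f}")
    ```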

  8. GOATS - Orbitology Component

    NASA Technical Reports Server (NTRS)

    Haber, Benjamin M.; Green, Joseph J.

    2010-01-01

The GOATS Orbitology Component software was developed to specifically address the concerns presented by orbit analysis tools that are often written as stand-alone applications. These applications do not easily interface with standard JPL first-principles analysis tools, and have a steep learning curve due to their complicated nature. This toolset is written as a series of MATLAB functions, allowing seamless integration into existing JPL optical systems engineering modeling and analysis modules. The functions are completely open, and allow for advanced users to delve into and modify the underlying physics being modeled. Additionally, this software module fills an analysis gap, allowing for quick, high-level mission analysis trades without the need for detailed and complicated orbit analysis using commercial stand-alone tools. This software consists of a series of MATLAB functions to provide for geometric orbit-related analysis. This includes propagation of orbits to varying levels of generalization. In the simplest case, geosynchronous orbits can be modeled by specifying a subset of three orbit elements. The next case is a circular orbit, which can be specified by a subset of four orbit elements. The most general case is an arbitrary elliptical orbit specified by all six orbit elements. These orbits are all solved geometrically, under the basic problem of an object in circular (or elliptical) orbit around a rotating spheroid. The orbit functions output time series ground tracks, which serve as the basis for more detailed orbit analysis. This software module also includes functions to track the positions of the Sun, Moon, and arbitrary celestial bodies specified by right ascension and declination. Also included are functions to calculate line-of-sight geometries to ground-based targets, angular rotations and decompositions, and other line-of-sight calculations. The toolset allows for the rapid execution of orbit trade studies at the level of detail required for the early stage of mission concept development.
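
    Not the GOATS MATLAB code itself, but a minimal Python sketch of the kind of geometric ground-track calculation described above, for a circular orbit around a rotating spherical Earth; the orbit elements and sampling below are assumed example values.

    ```python
    # Minimal ground-track sketch for a circular orbit over a rotating spherical
    # Earth. Purely geometric, no perturbations; Greenwich is assumed aligned
    # with the vernal equinox at t = 0. Not the GOATS toolset.
    import numpy as np

    MU = 3.986004418e14        # m^3/s^2, Earth gravitational parameter
    W_EARTH = 7.2921159e-5     # rad/s, Earth rotation rate

    def ground_track(alt_km, incl_deg, raan_deg, t):
        a = 6378.137e3 + alt_km * 1e3   # orbit radius (circular orbit)
        n = np.sqrt(MU / a**3)          # mean motion
        u = n * t                       # argument of latitude from ascending node
        i = np.radians(incl_deg)
        lat = np.degrees(np.arcsin(np.sin(i) * np.sin(u)))
        lon_inertial = np.radians(raan_deg) + np.arctan2(np.cos(i) * np.sin(u), np.cos(u))
        lon = np.degrees((lon_inertial - W_EARTH * t + np.pi) % (2 * np.pi) - np.pi)
        return lat, lon

    t = np.arange(0, 2 * 5740, 60.0)    # roughly two ~96-minute orbits, assumed
    lat, lon = ground_track(alt_km=550, incl_deg=51.6, raan_deg=0.0, t=t)
    print(lat[:3], lon[:3])
    ```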

  9. Procedures for numerical analysis of circadian rhythms

    PubMed Central

REFINETTI, ROBERTO; CORNÉLISSEN, GERMAINE; HALBERG, FRANZ

    2010-01-01

    This article reviews various procedures used in the analysis of circadian rhythms at the populational, organismal, cellular and molecular levels. The procedures range from visual inspection of time plots and actograms to several mathematical methods of time series analysis. Computational steps are described in some detail, and additional bibliographic resources and computer programs are listed. PMID:23710111
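
    One of the standard numerical procedures in this literature, single-component cosinor fitting, can be sketched in a few lines of Python: the rhythm is fit by least squares as mesor + amplitude·cos(2πt/τ + acrophase) with the period τ assumed known (24 h here). The data below are synthetic, for illustration only.

    ```python
    # Sketch of a single-component cosinor fit (linear least squares) with a
    # fixed 24 h period; the data are synthetic.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(0, 72, 0.5)                                  # hours
    y = 10 + 3 * np.cos(2 * np.pi * t / 24 - 1.0) + rng.normal(0, 0.5, t.size)

    period = 24.0
    X = np.column_stack([np.ones_like(t),
                         np.cos(2 * np.pi * t / period),
                         np.sin(2 * np.pi * t / period)])
    mesor, a, b = np.linalg.lstsq(X, y, rcond=None)[0]
    amplitude = np.hypot(a, b)
    acrophase = np.arctan2(-b, a)                              # radians

    print(f"mesor={mesor:.2f}, amplitude={amplitude:.2f}, acrophase={acrophase:.2f} rad")
    ```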

  10. Second-level post-occupancy evaluation (POE) analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, B.; Fisher, W.; Marans, R.W.

    1989-02-14

    Findings from a detailed analysis of post-occupancy evaluation data, sponsored by LRI, which involved thirteen office buildings typical of current design practice, will be discussed. Analysis of the data indicates that occupant satisfaction can be related to type of lighting system, presence of daylight, and patterns of luminance in the office. 15 refs., 9 figs., 3 tabs.

  11. 2016 Cost of Wind Energy Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stehly, Tyler J.; Heimiller, Donna M.; Scott, George N.

    This report uses representative utility-scale projects to estimate the levelized cost of energy (LCOE) for land-based and offshore wind power plants in the United States. Data and results detailed here are derived from 2016 commissioned plants. More specifically, analysis detailed here relies on recent market data and state-of-the-art modeling capabilities to maintain an up-to-date understanding of wind energy cost trends and drivers. This report is intended to provide insight into current component-level costs as well as a basis for understanding variability in LCOE across the country. This publication represents the sixth installment of this annual report.

  12. 2015 Cost of Wind Energy Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moné, Christopher; Hand, Maureen; Bolinger, Mark

    This report uses representative utility-scale projects to estimate the levelized cost of energy (LCOE) for land-based and offshore wind plants in the United States. Data and results detailed here are derived from 2015 commissioned plants. More specifically, analysis detailed here relies on recent market data and state-of-the-art modeling capabilities to maintain an up-to-date understanding of wind energy cost trends and drivers. It is intended to provide insight into current component-level costs as well as a basis for understanding variability in LCOE across the industry. This publication reflects the fifth installment of this annual report.

  13. RHSEG and Subdue: Background and Preliminary Approach for Combining these Technologies for Enhanced Image Data Analysis, Mining and Knowledge Discovery

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Cook, Diane J.

    2008-01-01

    Under a project recently selected for funding by NASA's Science Mission Directorate under the Applied Information Systems Research (AISR) program, Tilton and Cook will design and implement the integration of the Subdue graph based knowledge discovery system, developed at the University of Texas Arlington and Washington State University, with image segmentation hierarchies produced by the RHSEG software, developed at NASA GSFC, and perform pilot demonstration studies of data analysis, mining and knowledge discovery on NASA data. Subdue represents a method for discovering substructures in structural databases. Subdue is devised for general-purpose automated discovery, concept learning, and hierarchical clustering, with or without domain knowledge. Subdue was developed by Cook and her colleague, Lawrence B. Holder. For Subdue to be effective in finding patterns in imagery data, the data must be abstracted up from the pixel domain. An appropriate abstraction of imagery data is a segmentation hierarchy: a set of several segmentations of the same image at different levels of detail in which the segmentations at coarser levels of detail can be produced from simple merges of regions at finer levels of detail. The RHSEG program, a recursive approximation to a Hierarchical Segmentation approach (HSEG), can produce segmentation hierarchies quickly and effectively for a wide variety of images. RHSEG and HSEG were developed at NASA GSFC by Tilton. In this presentation we provide background on the RHSEG and Subdue technologies and present a preliminary analysis on how RHSEG and Subdue may be combined to enhance image data analysis, mining and knowledge discovery.
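
    As a hedged sketch of the bridge between the two tools described above (not the project's actual RHSEG/Subdue interface), a segmentation at one hierarchy level, stored as a 2-D label image, can be turned into a region adjacency graph of the kind a graph-based discovery system could consume:

    ```python
    # Sketch: convert a 2-D label image (one level of a segmentation hierarchy)
    # into a region adjacency graph. Toy stand-in for the segmentation-to-graph
    # step; not the project's actual code.
    import numpy as np

    labels = np.array([[1, 1, 2, 2],
                       [1, 3, 3, 2],
                       [4, 3, 3, 2]])          # assumed toy segmentation

    def region_adjacency(label_img):
        edges = set()
        right = zip(label_img[:, :-1].ravel(), label_img[:, 1:].ravel())
        down = zip(label_img[:-1, :].ravel(), label_img[1:, :].ravel())
        for a, b in list(right) + list(down):
            if a != b:
                edges.add((int(min(a, b)), int(max(a, b))))
        return sorted(edges)

    print(region_adjacency(labels))   # [(1, 2), (1, 3), (1, 4), (2, 3), (3, 4)]
    ```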

14. The low-level waste handbook: A user's guide to the Low-Level Radioactive Waste Policy Amendments Act of 1985. [Contains glossary]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, H.

    1986-11-01

    This report provides a detailed, section-by-section analysis of the Low-Level Radioactive Waste Policy Amendments Act of 1985. Appendices include lists of relevant law and legislation, relevant Congressional committees, members of Congress mentioned in the report, and exact copies of the 1980 and 1985 Acts. (TEM)

  15. Automation Applications in an Advance Air Traffic Management System : Volume IIB : Functional Analysis of Air Traffic Management (Cont'd)

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 2 contains the analysis and description of air traffic management activities at three levels of detail - functions, subfunctions, and tasks. A total of 265 tasks are identified and described, and the flow of information inputs and outputs amon...

  16. Automated Applications in an Advanced Air Traffic Management System : Volume 2B. Functional Analysis of Air Traffic Management (Cont'd.)

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 2 contains the analysis and description of air traffic management activities at three levels of detail - functions, subfunctions, and tasks. A total of 265 tasks are identified and described, and the flow of information inputs and outputs amon...

  17. Automation Applications in an Advanced Air Traffic Management System : Volume 2A. Functional Analysis of Air Traffic Management.

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 2 contains the analysis and description of air traffic management activities at three levels of detail - functions, subfunctions, and tasks. A total of 265 tasks are identified and described, and the flow of information inputs and outputs amon...

  18. Automation Applications in an Advanced Air Traffic Management System : Volume 2C. Functional Analysis of Air Traffic Management (Cont.'d)

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 2 contains the analysis and description of air traffic management activities at three levels of detail - functions, subfunctions, and tasks. A total of 265 tasks are identified and described, and the flow of information inputs and outputs amon...

  19. Analysis of Particle Image Velocimetry (PIV) Data for Acoustic Velocity Measurements

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    Acoustic velocity measurements were taken using Particle Image Velocimetry (PIV) in a Normal Incidence Tube configuration at various frequency, phase, and amplitude levels. This report presents the results of the PIV analysis and data reduction portions of the test and details the processing that was done. Estimates of lower measurement sensitivity levels were determined based on PIV image quality, correlation, and noise level parameters used in the test. Comparison of measurements with linear acoustic theory are presented. The onset of nonlinear, harmonic frequency acoustic levels were also studied for various decibel and frequency levels ranging from 90 to 132 dB and 500 to 3000 Hz, respectively.
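
    As a worked order-of-magnitude check (not the report's own numbers), the plane-wave relation u = p/(ρc) links sound pressure level to acoustic particle velocity; the sketch below evaluates it across the quoted 90-132 dB range, assuming air at roughly 20 °C and ignoring standing-wave effects in the impedance tube.

    ```python
    # Worked example: plane-wave acoustic particle velocity u = p / (rho * c)
    # for the SPL range quoted in the abstract. Assumes rho*c ~= 413 rayl (air,
    # ~20 C) and the standard 20 uPa reference pressure.
    P_REF = 20e-6        # Pa
    RHO_C = 413.0        # rayl (kg m^-2 s^-1)

    for spl_db in (90, 110, 132):
        p_rms = P_REF * 10 ** (spl_db / 20)
        u_rms = p_rms / RHO_C
        print(f"{spl_db} dB -> p = {p_rms:.3g} Pa, u = {u_rms * 1000:.3g} mm/s")
    ```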

  20. Advanced Information Processing System (AIPS)-based fault tolerant avionics architecture for launch vehicles

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.

    1990-01-01

An avionics architecture for the advanced launch system (ALS) that uses validated hardware and software building blocks developed under the advanced information processing system program is presented. The AIPS for ALS architecture defined is preliminary, and reliability requirements can be met by the AIPS hardware and software building blocks that are built using the state-of-the-art technology available in the 1992-93 time frame. The level of detail in the architecture definition reflects the level of detail available in the ALS requirements. As the avionics requirements are refined, the architecture can also be refined and defined in greater detail with the help of analysis and simulation tools. A useful methodology is demonstrated for investigating the impact of the avionics suite on the recurring cost of the ALS. It is shown that allowing the vehicle to launch with selected detected failures can potentially reduce the recurring launch costs. A comparative analysis shows that validated fault-tolerant avionics built out of Class B parts can result in lower life-cycle cost in comparison to simplex avionics built out of Class S parts or other redundant architectures.

  1. Systematic comparison of the behaviors produced by computational models of epileptic neocortex.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warlaumont, A. S.; Lee, H. C.; Benayoun, M.

    2010-12-01

Two existing models of brain dynamics in epilepsy, one detailed (i.e., realistic) and one abstract (i.e., simplified), are compared in terms of behavioral range and match to in vitro mouse recordings. A new method is introduced for comparing across computational models that may have very different forms. First, high-level metrics were extracted from model and in vitro output time series. A principal components analysis was then performed over these metrics to obtain a reduced set of derived features. These features define a low-dimensional behavior space in which quantitative measures of behavioral range and degree of match to real data can be obtained. The detailed and abstract models and the mouse recordings overlapped considerably in behavior space. Both the range of behaviors and similarity to mouse data were similar between the detailed and abstract models. When no high-level metrics were used and principal components analysis was computed over raw time series, the models overlapped minimally with the mouse recordings. The method introduced here is suitable for comparing across different kinds of model data and across real brain recordings. It appears that, despite differences in form and computational expense, detailed and abstract models do not necessarily differ in their behaviors.
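
    The comparison pipeline described above (high-level metrics, then PCA into a behavior space) can be sketched generically in Python; the metrics and the synthetic "model" series below are placeholders, not the study's data or metric set.

    ```python
    # Generic sketch of the comparison pipeline from the abstract: extract simple
    # high-level metrics from each time series, then project all series into a
    # low-dimensional "behavior space" with PCA. Series and metrics are synthetic.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)

    def metrics(x):
        return [x.mean(), x.std(), np.abs(np.diff(x)).mean(),
                (x > x.mean() + 2 * x.std()).mean()]   # crude "event rate"

    series = {f"model_A_{i}": rng.normal(0, 1, 2000) for i in range(5)}
    series.update({f"model_B_{i}": np.sin(np.linspace(0, 40, 2000)) +
                   rng.normal(0, 0.3, 2000) for i in range(5)})

    names = list(series)
    features = np.array([metrics(series[k]) for k in names])
    features = (features - features.mean(0)) / features.std(0)   # standardize

    coords = PCA(n_components=2).fit_transform(features)
    for name, (pc1, pc2) in zip(names, coords):
        print(f"{name}: PC1={pc1:+.2f}, PC2={pc2:+.2f}")
    ```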

  2. 14 CFR 1274.801 - Adjustments to performance costs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... NASA's initial cost share or funding levels, detailed cost analysis techniques may be applied, which... shall continue to maintain the share ratio requirements (normally 50/50) stated in § 1274.204(b). ...

  3. The element level time domain (ELTD) method for the analysis of nano-optical systems: I. Nondispersive media

    NASA Astrophysics Data System (ADS)

    Fallahi, Arya; Oswald, Benedikt; Leidenberger, Patrick

    2012-04-01

    We study a 3-dimensional, dual-field, fully explicit method for the solution of Maxwell's equations in the time domain on unstructured, tetrahedral grids. The algorithm uses the element level time domain (ELTD) discretization of the electric and magnetic vector wave equations. In particular, the suitability of the method for the numerical analysis of nanometer structured systems in the optical region of the electromagnetic spectrum is investigated. The details of the theory and its implementation as a computer code are introduced, and its convergence behavior as well as the conditions for stable time domain integration are examined. Here, we restrict ourselves to non-dispersive dielectric material properties since dielectric dispersion will be treated in a subsequent paper. Analytically solvable problems are analyzed in order to benchmark the method. Eventually, a dielectric microlens is considered to demonstrate the potential of the method. A flexible method of 2nd-order accuracy is obtained that is applicable to a wide range of nano-optical configurations and can be a serious competitor to more conventional finite difference time domain schemes, which operate only on hexahedral grids. The ELTD scheme can resolve geometries with a wide span of characteristic length scales and with the appropriate level of detail, using small tetrahedra where delicate, physically relevant details must be modeled.

  4. Research on three-dimensional visualization based on virtual reality and Internet

    NASA Astrophysics Data System (ADS)

    Wang, Zongmin; Yang, Haibo; Zhao, Hongling; Li, Jiren; Zhu, Qiang; Zhang, Xiaohong; Sun, Kai

    2007-06-01

    To disclose and display water information, a three-dimensional visualization system based on Virtual Reality (VR) and the Internet is developed, both to demonstrate a "digital water conservancy" application and to support routine reservoir management. To explore and mine in-depth information, after a high-resolution DEM of reliable quality is modeled, topographical analysis, visibility analysis, and reservoir volume computation are studied. In addition, parameters including slope, water level, and NDVI are selected to classify landslide-prone zones within the water-level-fluctuation zone of the reservoir area. To establish the virtual reservoir scene, two kinds of methods are used to provide immersion, interaction, and imagination (3I). The first virtual scene contains more detailed textures to increase realism and runs on a graphical workstation with the virtual reality engine OpenSceneGraph (OSG). The second virtual scene targets Internet users, with fewer details to ensure fluent rendering speed.

  5. What Can We Expect from A-Level Mathematics Students?

    ERIC Educational Resources Information Center

    Lawson, Duncan

    1997-01-01

    Discusses the results obtained by students with A-level mathematics on Coventry University's diagnostic test in October 1997. Compares these results with those of students who entered the university in 1991. Provides detailed analysis of specific questions that indicate those areas in which students with grade A clearly outperform students with…

  6. Measuring Structural Gender Equality in Mexico: A State Level Analysis

    ERIC Educational Resources Information Center

    Frias, Sonia M.

    2008-01-01

    The main goal of this article is to assess the level of gender equality across the 32 Mexican states. After reviewing conceptual and methodological issues related to previous measures of structural inequality I detail the logic and methodology involved in the construction of a composite and multidimensional measure of gender equality, at the…

  7. High-Level Ab Initio Calculations of Intermolecular Interactions: Heavy Main-Group Element π-Interactions.

    PubMed

    Krasowska, Małgorzata; Schneider, Wolfgang B; Mehring, Michael; Auer, Alexander A

    2018-05-02

    This work reports high-level ab initio calculations and a detailed analysis on the nature of intermolecular interactions of heavy main-group element compounds and π systems. For this purpose we have chosen a set of benchmark molecules of the form MR3, in which M=As, Sb, or Bi, and R=CH3, OCH3, or Cl. Several methods for the description of weak intermolecular interactions are benchmarked including DFT-D, DFT-SAPT, MP2, and high-level coupled cluster methods in the DLPNO-CCSD(T) approximation. Using local energy decomposition (LED) and an analysis of the electron density, details of the nature of this interaction are unraveled. The results yield insight into the nature of dispersion and donor-acceptor interactions in this type of system, including systematic trends in the periodic table, and also provide a benchmark for dispersion interactions in heavy main-group element compounds. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Quantitative Analysis of High-Quality Officer Selection by Commandants Career-Level Education Board

    DTIC Science & Technology

    2017-03-01

    due to Marines being evaluated before the end of their initial service commitment. Our research utilizes quantitative variables to analyze the...not provide detailed information why. B. LIMITATIONS The photograph analysis in this research is strictly limited to a quantitative analysis in...

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huff, Kathryn D.

    Component level and system level abstraction of detailed computational geologic repository models have resulted in four rapid computational models of hydrologic radionuclide transport at varying levels of detail. Those models are described, as is their implementation in Cyder, a software library of interchangeable radionuclide transport models appropriate for representing natural and engineered barrier components of generic geologic repository concepts. A proof of principle demonstration was also conducted in which these models were used to represent the natural and engineered barrier components of a repository concept in a reducing, homogeneous, generic geology. This base case demonstrates integration of the Cyder open source library with the Cyclus computational fuel cycle systems analysis platform to facilitate calculation of repository performance metrics with respect to fuel cycle choices. (authors)

  10. Identification of large geomorphological anomalies based on 2D discrete wavelet transform

    NASA Astrophysics Data System (ADS)

    Doglioni, A.; Simeone, V.

    2012-04-01

    The identification and analysis of large geomorphological anomalies based on quantitative evidence is an important stage in the study of large landslides. Numerical geomorphic analyses represent an interesting approach to such studies, allowing a detailed and fairly accurate identification of hidden topographic anomalies that may be related to large landslides. Here a numerical geomorphic analysis of the Digital Terrain Model (DTM) is presented. The approach is based on the 2D discrete wavelet transform (Antoine et al., 2003; Bruun and Nilsen, 2003; Booth et al., 2009). The 2D wavelet decomposition of the DTM, and in particular the analysis of the detail coefficients of the wavelet transform, can provide evidence of anomalies or singularities, i.e. discontinuities of the land surface. These discontinuities are not very evident from the DTM itself, while the 2D wavelet transform allows a grid-based analysis of the DTM and a mapping of the decomposition. In fact, the grid-based DTM can be treated as a matrix on which a discrete wavelet transform (Daubechies, 1992) is performed column-wise and row-wise, corresponding to the horizontal and vertical directions. The outcomes of this analysis are low-frequency approximation coefficients and high-frequency detail coefficients. The detail coefficients are analyzed, since their variations are associated with discontinuities of the DTM. Detail coefficients are computed by performing the 2D wavelet transform both along the horizontal direction (east-west) and along the vertical direction (north-south). The detail coefficients are then mapped for both cases, allowing potential anomalies of the land surface to be visualized and quantified. Moreover, the wavelet decomposition can be pushed to further levels by assuming a higher scale number of the transform, which may return further interesting results in terms of identification of land-surface anomalies. In this kind of approach, the choice of a proper mother wavelet function is a delicate point, since it conditions the analysis and its outcomes. Therefore, multiple decomposition levels as well as multiple wavelet analyses are considered. The introduced approach is applied to some interesting case studies in southern Italy, in particular for the identification of large anomalies associated with large landslides at the transition between the Apennine chain domain and the foredeep domain; the lower Biferno valley and the Fortore valley are analyzed here. Finally, the wavelet transforms are performed on multiple levels, in order to address the problem of which decomposition level provides an analysis accurate enough for a specific problem. Antoine J.P., Carrette P., Murenzi R., and Piette B., (2003), Image analysis with two-dimensional continuous wavelet transform, Signal Processing, 31(3), pp. 241-272, doi:10.1016/0165-1684(93)90085-O. Booth A.M., Roering J.J., and Taylor Perron J., (2009), Automated landslide mapping using spectral analysis and high-resolution topographic data: Puget Sound lowlands, Washington, and Portland Hills, Oregon, Geomorphology, 109(3-4), pp. 132-147, doi:10.1016/j.geomorph.2009.02.027. Bruun B.T., and Nilsen S., (2003), Wavelet representation of large digital terrain models, Computers and Geoscience, 29(6), pp. 695-703, doi:10.1016/S0098-3004(03)00015-3. Daubechies, I. (1992), Ten lectures on wavelets, SIAM.
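
    A minimal sketch of the screening idea described above, under stated assumptions: the grid-based DTM is decomposed with a 2D discrete wavelet transform and large horizontal/vertical detail coefficients are flagged as candidate discontinuities. The synthetic DTM, the 'haar' mother wavelet, the decomposition levels, and the threshold are all assumptions for illustration, not choices reported by the authors; PyWavelets is assumed to be available.

        # Sketch: flag candidate land-surface discontinuities from DTM detail coefficients.
        import numpy as np
        import pywt

        rng = np.random.default_rng(1)
        dtm = rng.normal(size=(512, 512)).cumsum(axis=0).cumsum(axis=1)  # synthetic stand-in DTM

        # One-level 2D discrete wavelet transform: approximation plus three detail bands.
        cA, (cH, cV, cD) = pywt.dwt2(dtm, "haar")

        # Horizontal (east-west) and vertical (north-south) detail coefficients highlight
        # discontinuities; large magnitudes flag candidate anomalies (threshold is assumed).
        threshold = 3 * cH.std()
        anomaly_mask = (np.abs(cH) > threshold) | (np.abs(cV) > threshold)
        print("flagged cells:", int(anomaly_mask.sum()))

        # Deeper decompositions can be inspected band by band.
        coeffs = pywt.wavedec2(dtm, "haar", level=3)
        for k, (cH_k, cV_k, cD_k) in enumerate(coeffs[1:], start=1):
            print("detail band", k, "max |coefficient|:",
                  float(max(np.abs(cH_k).max(), np.abs(cV_k).max())))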

  11. Analyzing Visibility Configurations.

    PubMed

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the thus extracted feature vectors. Our method allows perceptually motivated level of detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
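
    An illustrative sketch of the underlying idea, heavily hedged: the paper generalizes co-occurrence matrices to clusters of triangular surfaces, whereas the toy code below computes co-occurrence features over a simple binary visible/occluded grid labeling and classifies two synthetic configuration classes. All data, class names, and the choice of classifier are placeholders, not the authors' method or data.

        # Toy sketch: co-occurrence features of a visibility labeling fed to a classifier.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def cooccurrence_features(mask):
            """2x2 co-occurrence counts of a binary visibility mask over right/down neighbors."""
            m = np.zeros((2, 2))
            for a, b in ((mask[:, :-1], mask[:, 1:]), (mask[:-1, :], mask[1:, :])):
                for i in (0, 1):
                    for j in (0, 1):
                        m[i, j] += np.sum((a == i) & (b == j))
            return (m / m.sum()).ravel()

        rng = np.random.default_rng(2)

        def sample(kind):
            # "blocked": one large occluded region; "scattered": salt-and-pepper occlusion.
            if kind == "blocked":
                mask = np.ones((32, 32), dtype=int)
                mask[8:24, 8:24] = 0
                return mask ^ (rng.random((32, 32)) < 0.02).astype(int)
            return (rng.random((32, 32)) > 0.25).astype(int)

        X = np.array([cooccurrence_features(sample(k)) for k in ["blocked", "scattered"] * 100])
        y = np.array(["blocked", "scattered"] * 100)

        clf = RandomForestClassifier(random_state=0).fit(X[:150], y[:150])
        print("held-out accuracy:", clf.score(X[150:], y[150:]))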

  12. Structural design and analysis of a Mach zero to five turbo-ramjet system

    NASA Technical Reports Server (NTRS)

    Spoth, Kevin A.; Moses, Paul L.

    1993-01-01

    The paper discusses the structural design and analysis of a Mach zero to five turbo-ramjet propulsion system for a Mach five waverider-derived cruise vehicle. The level of analysis detail necessary for a credible conceptual design is shown. The results of a finite-element failure mode sizing analysis for the engine primary structure is presented. The importance of engine/airframe integration is also discussed.

  13. Macro scale models for freight railroad terminals.

    DOT National Transportation Integrated Search

    2016-03-02

    The project has developed a yard capacity model for macro-level analysis. The study considers the detailed sequence and scheduling in classification yards and their impacts on yard capacities simulate typical freight railroad terminals, and statistic...

  14. Summary and Analysis of President Obama's Education Budget Request, Fiscal Year 2012: Issue Brief

    ERIC Educational Resources Information Center

    New America Foundation, 2011

    2011-01-01

    President Barack Obama submitted his third budget request to Congress on February 14th, 2011. The detailed budget request includes proposed funding levels for federal programs and agencies in aggregate for the upcoming 10 fiscal years, and specific fiscal year 2012 funding levels for individual programs subject to appropriations. Congress will use…

  15. Analysis of Particle Image Velocimetry (PIV) Data for Application to Subsonic Jet Noise Studies

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    Global velocimetry measurements were taken using Particle Image Velocimetry (PIV) in the subsonic flow exiting a 1 inch circular nozzle in an attempt to better understand the turbulence characteristics of its shear layer region. This report presents the results of the PIV analysis and data reduction portions of the test and details the processing that was done. Custom data analysis and data validation algorithms were developed and applied to a data ensemble consisting of over 750 PIV 70 mm photographs taken in the Mach 0.85 flow facility. Results are presented detailing spatial characteristics of the flow including the ensemble mean and standard deviation, turbulence intensities and Reynolds stress levels, and 2-point spatial correlations.
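
    A short sketch of the ensemble statistics named above (mean, standard deviation, turbulence intensity, Reynolds stress), assuming the instantaneous velocity components are available as arrays of shape (snapshots, ny, nx). The synthetic fields and magnitudes are placeholders, not the reported data.

        # Sketch: ensemble turbulence statistics from a stack of PIV velocity fields.
        import numpy as np

        rng = np.random.default_rng(3)
        n, ny, nx = 750, 64, 64
        u = 290.0 + 15.0 * rng.standard_normal((n, ny, nx))   # stand-in streamwise fields (m/s)
        v = 5.0 * rng.standard_normal((n, ny, nx))            # stand-in cross-stream fields

        u_mean, v_mean = u.mean(axis=0), v.mean(axis=0)        # ensemble mean fields
        u_std, v_std = u.std(axis=0), v.std(axis=0)            # ensemble standard deviations

        u_fluct, v_fluct = u - u_mean, v - v_mean              # fluctuating components
        Tu = u_std / np.abs(u_mean)                            # streamwise turbulence intensity
        reynolds_uv = (u_fluct * v_fluct).mean(axis=0)         # Reynolds shear stress per unit density

        print("peak turbulence intensity:", float(Tu.max()))
        print("peak |u'v'|:", float(np.abs(reynolds_uv).max()))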

  16. High resolution imaging of latent fingerprints by localized corrosion on brass surfaces.

    PubMed

    Goddard, Alex J; Hillman, A Robert; Bond, John W

    2010-01-01

    The Atomic Force Microscope (AFM) is capable of imaging fingerprint ridges on polished brass substrates at an unprecedented level of detail. While exposure to elevated humidity at ambient or slightly raised temperatures does not change the image appreciably, subsequent brief heating in a flame results in complete loss of the sweat deposit and the appearance of pits and trenches. Localized elemental analysis (using EDAX, coupled with SEM imaging) shows the presence of the constituents of salt in the initial deposits. Together with water and atmospheric oxygen--and with thermal enhancement--these are capable of driving a surface corrosion process. This process is sufficiently localized that it has the potential to generate a durable negative topographical image of the fingerprint. AFM examination of surface regions between ridges revealed small deposits (probably microscopic "spatter" of sweat components or transferred particulates) that may ultimately limit the level of ridge detail analysis.

  17. NeuroLines: A Subway Map Metaphor for Visualizing Nanoscale Neuronal Connectivity.

    PubMed

    Al-Awami, Ali K; Beyer, Johanna; Strobelt, Hendrik; Kasthuri, Narayanan; Lichtman, Jeff W; Pfister, Hanspeter; Hadwiger, Markus

    2014-12-01

    We present NeuroLines, a novel visualization technique designed for scalable detailed analysis of neuronal connectivity at the nanoscale level. The topology of 3D brain tissue data is abstracted into a multi-scale, relative distance-preserving subway map visualization that allows domain scientists to conduct an interactive analysis of neurons and their connectivity. Nanoscale connectomics aims at reverse-engineering the wiring of the brain. Reconstructing and analyzing the detailed connectivity of neurons and neurites (axons, dendrites) will be crucial for understanding the brain and its development and diseases. However, the enormous scale and complexity of nanoscale neuronal connectivity pose big challenges to existing visualization techniques in terms of scalability. NeuroLines offers a scalable visualization framework that can interactively render thousands of neurites, and that supports the detailed analysis of neuronal structures and their connectivity. We describe and analyze the design of NeuroLines based on two real-world use-cases of our collaborators in developmental neuroscience, and investigate its scalability to large-scale neuronal connectivity data.

  18. Measurement and control of detailed electronic properties in a single molecule break junction.

    PubMed

    Wang, Kun; Hamill, Joseph; Zhou, Jianfeng; Guo, Cunlan; Xu, Bingqian

    2014-01-01

    The lack of detailed experimental controls has been one of the major obstacles hindering progress in molecular electronics. While large fluctuations have been occurring in the experimental data, specific details, related mechanisms, and data analysis techniques are in high demand to promote our physical understanding at the single-molecule level. A series of modulations we recently developed, based on traditional scanning probe microscopy break junctions (SPMBJs), have helped to discover significant properties in detail which are hidden in the contact interfaces of a single-molecule break junction (SMBJ). For example, in the past we have shown that the correlated force and conductance changes under the saw tooth modulation and stretch-hold mode of PZT movement revealed inherent differences in the contact geometries of a molecular junction. In this paper, using a bias-modulated SPMBJ and utilizing emerging data analysis techniques, we report on the measurement of the altered alignment of the HOMO of benzene molecules with changing the anchoring group which coupled the molecule to metal electrodes. Further calculations based on Landauer fitting and transition voltage spectroscopy (TVS) demonstrated the effects of modulated bias on the location of the frontier molecular orbitals. Understanding the alignment of the molecular orbitals with the Fermi level of the electrodes is essential for understanding the behaviour of SMBJs and for the future design of more complex devices. With these modulations and analysis techniques, fruitful information has been found about the nature of the metal-molecule junction, providing us insightful clues towards the next step for in-depth study.

  19. Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Fu, Pei-hua; Yin, Hong-bo

    In this thesis, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First, we present the evaluation index system, which contains basic information, management level, technical strength, transport capacity, informatization level, market competition, and customer service. We determine the index weights according to the grades and evaluate the integrated ability of the logistics enterprises using the fuzzy cluster analysis method. The system evaluation module and the cluster analysis module are described in detail, including how the two modules were implemented. Finally, we give the results of the system.
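
    A minimal sketch of a fuzzy clustering step of the kind described above, assuming the indices listed in the abstract, a set of graded scores, and index weights that are all illustrative placeholders. The fuzzy c-means implementation below is a plain NumPy version, not the authors' system.

        # Sketch: group logistics enterprises by weighted index scores with fuzzy c-means.
        import numpy as np

        indices = ["basic information", "management level", "technical strength",
                   "transport capacity", "informatization level",
                   "market competition", "customer service"]
        weights = np.array([0.10, 0.20, 0.15, 0.15, 0.15, 0.10, 0.15])  # assumed weights

        rng = np.random.default_rng(4)
        scores = rng.uniform(1, 5, size=(30, len(indices)))    # 30 enterprises, graded 1-5
        X = scores * weights                                    # weighted evaluation matrix

        def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
            """Plain fuzzy c-means: returns cluster centers and the membership matrix."""
            rng = np.random.default_rng(seed)
            U = rng.random((len(X), c))
            U /= U.sum(axis=1, keepdims=True)
            for _ in range(iters):
                Um = U ** m
                centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
                dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
                U = 1.0 / (dist ** (2 / (m - 1)))
                U /= U.sum(axis=1, keepdims=True)
            return centers, U

        centers, U = fuzzy_cmeans(X)
        print("cluster of each enterprise:", U.argmax(axis=1))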

  20. An Introductory Classroom Exercise on Protein Molecular Model Visualization and Detailed Analysis of Protein-Ligand Binding

    ERIC Educational Resources Information Center

    Poeylaut-Palena, Andres, A.; de los Angeles Laborde, Maria

    2013-01-01

    A learning module for molecular level analysis of protein structure and ligand/drug interaction through the visualization of X-ray diffraction is presented. Using DeepView as molecular model visualization software, students learn about the general concepts of protein structure. This Biochemistry classroom exercise is designed to be carried out by…

  1. Questioning ORACLE: An Assessment of ORACLE's Analysis of Teachers' Questions and [A Comment on "Questioning ORACLE"].

    ERIC Educational Resources Information Center

    Scarth, John; And Others

    1986-01-01

    Analysis of teachers' questions, part of the ORACLE (Observation Research and Classroom Learning Evaluation) project research, is examined in detail. Scarth and Hammersley argue that the rules ORACLE uses for identifying different types of questions involve levels of ambiguity and inference that threaten reliability and validity of the study's…

  2. Using Toulmin Analysis to Analyse an Instructor's Proof Presentation in Abstract Algebra

    ERIC Educational Resources Information Center

    Fukawa-Connelly, Timothy

    2014-01-01

    This paper provides a method for analysing undergraduate teaching of proof-based courses using Toulmin's model (1969) of argumentation. It presents a case study of one instructor's presentation of proofs. The analysis shows that the instructor presents different levels of detail in different proofs; thus, the students have an inconsistent set of…

  3. Lessons Learned from Application of System and Software Level RAMS Analysis to a Space Control System

    NASA Astrophysics Data System (ADS)

    Silva, N.; Esper, A.

    2012-01-01

    The work presented in this article represents the results of applying RAMS analysis to a critical space control system, both at the system and software levels. The system level RAMS analysis allowed the assignment of criticalities to the high level components, which was further refined by a tailored software level RAMS analysis. The importance of the software level RAMS analysis in the identification of new failure modes, and its impact on the system level RAMS analysis, is discussed. Changes in the software architecture have also been recommended in order to reduce the criticality of the SW components to an acceptable minimum. The dependability analysis was performed in accordance with ECSS-Q-ST-80, which had to be tailored and complemented in some aspects. This tailoring is also detailed in the article, and lessons learned from its application are shared, highlighting its importance for space system safety evaluations. The paper presents the applied techniques, the relevant results obtained, the effort required to perform the tasks, and the planned strategy for ROI estimation, as well as the soft skills required and acquired during these activities.

  4. Confounding adjustment in comparative effectiveness research conducted within distributed research networks.

    PubMed

    Toh, Sengwee; Gagne, Joshua J; Rassen, Jeremy A; Fireman, Bruce H; Kulldorff, Martin; Brown, Jeffrey S

    2013-08-01

    A distributed research network (DRN) of electronic health care databases, in which data reside behind the firewall of each data partner, can support a wide range of comparative effectiveness research (CER) activities. An essential component of a fully functional DRN is the capability to perform robust statistical analyses to produce valid, actionable evidence without compromising patient privacy, data security, or proprietary interests. We describe the strengths and limitations of different confounding adjustment approaches that can be considered in observational CER studies conducted within DRNs, and the theoretical and practical issues to consider when selecting among them in various study settings. Several methods can be used to adjust for multiple confounders simultaneously, either as individual covariates or as confounder summary scores (eg, propensity scores and disease risk scores), including: (1) centralized analysis of patient-level data, (2) case-centered logistic regression of risk set data, (3) stratified or matched analysis of aggregated data, (4) distributed regression analysis, and (5) meta-analysis of site-specific effect estimates. These methods require different granularities of information be shared across sites and afford investigators different levels of analytic flexibility. DRNs are growing in use and sharing of highly detailed patient-level information is not always feasible in DRNs. Methods that incorporate confounder summary scores allow investigators to adjust for a large number of confounding factors without the need to transfer potentially identifiable information in DRNs. They have the potential to let investigators perform many analyses traditionally conducted through a centralized dataset with detailed patient-level information.
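
    A hedged sketch of one of the approaches listed above (a confounder summary score, i.e., a propensity score, used in a stratified analysis), which is the kind of method that lets sites share only aggregated stratum counts rather than patient-level data. The synthetic data, variable names, and five-stratum choice are illustrative assumptions, not the authors' implementation; scikit-learn is assumed.

        # Sketch: propensity-score estimation and stratified risk-difference comparison.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(5)
        n = 5000
        confounders = rng.normal(size=(n, 4))                       # stand-in covariates
        treat = rng.binomial(1, 1 / (1 + np.exp(-confounders[:, 0])))
        outcome = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * treat + confounders[:, 1]))))

        # Propensity score: probability of treatment given the confounders.
        ps = LogisticRegression(max_iter=1000).fit(confounders, treat).predict_proba(confounders)[:, 1]
        strata = np.digitize(ps, np.quantile(ps, [0.2, 0.4, 0.6, 0.8]))

        # Stratified risk difference across propensity-score quintiles, weighted by stratum size.
        diffs, sizes = [], []
        for s in range(5):
            idx = strata == s
            t, c = outcome[idx & (treat == 1)], outcome[idx & (treat == 0)]
            if len(t) and len(c):
                diffs.append(t.mean() - c.mean())
                sizes.append(idx.sum())
        print("stratified risk difference:", np.average(diffs, weights=sizes))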

  5. Computational Analysis of Hybrid Two-Photon Absorbers with Excited State Absorption

    DTIC Science & Technology

    2007-03-01

    level. This hybrid arrangement creates a complex dynamical system in which the electron carrier concentration of every photo-activated energy level...spatiotemporal details of the electron population densities of each photo-activated energy level as well as the pulse shape in space and time. The main...experiments at low input energy. However, further additions must be done to the calculation of the optical path for high input energy.

  6. Low Thrust Orbital Maneuvers Using Ion Propulsion

    NASA Astrophysics Data System (ADS)

    Ramesh, Eric

    2011-10-01

    Low-thrust maneuver options, such as electric propulsion, offer specific challenges within mission-level Modeling, Simulation, and Analysis (MS&A) tools. This project seeks to transition techniques for simulating low-thrust maneuvers from detailed engineering level simulations such as AGI's Satellite ToolKit (STK) Astrogator to mission level simulations such as the System Effectiveness Analysis Simulation (SEAS). Our project goals are as follows: A) Assess different low-thrust options to achieve various orbital changes; B) Compare such approaches to more conventional, high-thrust profiles; C) Compare computational cost and accuracy of various approaches to calculate and simulate low-thrust maneuvers; D) Recommend methods for implementing low-thrust maneuvers in high-level mission simulations; E) Prototype recommended solutions.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. L. Renner

    Recent national focus on the value of increasing our supply of indigenous, renewable energy underscores the need for reevaluating all alternatives, particularly those that are large and well-distributed nationally. This analysis will help determine how we can enlarge and diversify the portfolio of options we should be vigorously pursuing. One such option that is often ignored is geothermal energy, produced from both conventional hydrothermal and Enhanced (or engineered) Geothermal Systems (EGS). An 18-member assessment panel was assembled in September 2005 to evaluate the technical and economic feasibility of EGS becoming a major supplier of primary energy for U.S. base-load generation capacity by 2050. This report documents the work of the panel at three separate levels of detail. The first is a Synopsis, which provides a brief overview of the scope, motivation, approach, major findings, and recommendations of the panel. At the second level, an Executive Summary reviews each component of the study, providing major results and findings. The third level provides full documentation in eight chapters, with each detailing the scope, approach, and results of the analysis and modeling conducted in each area.

  8. Nuclear powered Mars cargo transport mission utilizing advanced ion propulsion

    NASA Technical Reports Server (NTRS)

    Galecki, Diane L.; Patterson, Michael J.

    1987-01-01

    Nuclear-powered ion propulsion technology was combined with detailed trajectory analysis to determine propulsion system and trajectory options for an unmanned cargo mission to Mars in support of manned Mars missions. A total of 96 mission scenarios were identified by combining two power levels, two propellants, four values of specific impulse per propellant, three starting altitudes, and two starting velocities. Sixty of these scenarios were selected for a detailed trajectory analysis; a complete propulsion system study was then conducted for 20 of these trajectories. Trip times ranged from 344 days for a xenon propulsion system operating at 300 kW total power and starting from lunar orbit with escape velocity, to 770 days for an argon propulsion system operating at 300 kW total power and starting from nuclear start orbit with circular velocity. Trip times for the 3 MW cases studied ranged from 356 to 413 days. Payload masses ranged from 5700 to 12,300 kg for the 300 kW power level, and from 72,200 to 81,500 kg for the 3 MW power level.

  9. Multidisciplinary optimization of an HSCT wing using a response surface methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giunta, A.A.; Grossman, B.; Mason, W.H.

    1994-12-31

    Aerospace vehicle design is traditionally divided into three phases: conceptual, preliminary, and detailed. Each of these design phases entails a particular level of accuracy and computational expense. While there are several computer programs which perform inexpensive conceptual-level aircraft multidisciplinary design optimization (MDO), aircraft MDO remains prohibitively expensive using preliminary- and detailed-level analysis tools. This occurs due to the expense of computational analyses and because gradient-based optimization requires the analysis of hundreds or thousands of aircraft configurations to estimate design sensitivity information. A further hindrance to aircraft MDO is the problem of numerical noise which occurs frequently in engineering computations. Computer models produce numerical noise as a result of the incomplete convergence of iterative processes, round-off errors, and modeling errors. Such numerical noise is typically manifested as a high frequency, low amplitude variation in the results obtained from the computer models. Optimization attempted using noisy computer models may result in the erroneous calculation of design sensitivities and may slow or prevent convergence to an optimal design.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad Allen

    EDENx is a multivariate data visualization tool that allows interactive user driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks while EDEN is more appropriate for detail data investigations.

  11. Model-based time-series analysis of FIA panel data absent re-measurements

    Treesearch

    Raymond L. Czaplewski; Mike T. Thompson

    2013-01-01

    An epidemic of lodgepole pine (Pinus contorta) mortality from the mountain pine beetle (Dendroctonus ponderosae) has swept across the Interior West. Aerial surveys monitor the areal extent of the epidemic, but only Forest Inventory and Analysis (FIA) field data support a detailed assessment at the tree level. Dynamics of the lodgepole pine population occur at a more...

  12. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
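
    A minimal sketch of a levelized cost of electricity (LCOE) calculation of the general kind such a model performs: discounted lifetime costs divided by discounted lifetime generation. All numerical values below are placeholders for illustration, not figures from the tool or the NETL reports.

        # Sketch: levelized cost of electricity as discounted costs over discounted energy.
        def lcoe(capital, om_per_year, fuel_per_year, mwh_per_year, rate, years):
            """Levelized production cost in $/MWh."""
            costs = capital + sum((om_per_year + fuel_per_year) / (1 + rate) ** t
                                  for t in range(1, years + 1))
            energy = sum(mwh_per_year / (1 + rate) ** t for t in range(1, years + 1))
            return costs / energy

        # Example: hypothetical plant -- $2.5B capital, $60M O&M and $120M fuel per year,
        # 7,000,000 MWh/yr (roughly 1 GW at ~80% capacity factor), 7% discount rate, 30 years.
        print(round(lcoe(2.5e9, 60e6, 120e6, 7_000_000, 0.07, 30), 2), "$/MWh")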

  13. Relation of pediatric blood lead levels to lead in gasoline.

    PubMed Central

    Billick, I H; Curran, A S; Shier, D R

    1980-01-01

    Analysis of a large data set of pediatric blood lead levels collected in New York City (1970-1976) shows a highly significant association between geometric mean blood lead levels and the amount of lead present in gasoline sold during the same period. This association was observed for all age and ethnic groups studied, and it suggests that possible exposure pathways other than ambient air should be considered. Even without detailed knowledge of the exact exposure pathways, sufficient information now exists for policy analysis and decisions relevant to controls and standards related to lead in gasoline and its effect on subsets of the population. PMID:7389685

  14. The SOBANE risk management strategy and the Déparis method for the participatory screening of the risks.

    PubMed

    Malchaire, J B

    2004-08-01

    The first section of the document describes a risk-prevention strategy, called SOBANE, in four levels: screening, observation, analysis and expertise. The aim is to make risk prevention faster, more cost effective, and more effective in coordinating the contributions of the workers themselves, their management, the internal and external occupational health (OH) practitioners and the experts. These four levels are: screening, where the risk factors are detected by the workers and their management, and obvious solutions are implemented; observation, where the remaining problems are studied in more detail, one by one, and the reasons and the solutions are discussed in detail; analysis, where, when necessary, an OH practitioner is called upon to carry out appropriate measurements to develop specific solutions; expertise, where, in very sophisticated and rare cases, the assistance of an expert is called upon to solve a particular problem. The method for the participatory screening of the risks (in French: Dépistage Participatif des Risques), Déparis, is proposed for the first level screening of the SOBANE strategy. The work situation is systematically reviewed and all the aspects conditioning the easiness, the effectiveness and the satisfaction at work are discussed, in search of practical prevention measures. The points to be studied more in detail at level 2, observation, are identified. The method is carried out during a meeting of key workers and technical staff. The method proves to be simple, sparing in time and means and playing a significant role in the development of a dynamic plan of risk management and of a culture of dialogue in the company.

  15. Maximizing data holdings and data documentation with a hierarchical system for sample-based geochemical data

    NASA Astrophysics Data System (ADS)

    Hsu, L.; Lehnert, K. A.; Walker, J. D.; Chan, C.; Ash, J.; Johansson, A. K.; Rivera, T. A.

    2011-12-01

    Sample-based measurements in geochemistry are highly diverse, due to the large variety of sample types, measured properties, and idiosyncratic analytical procedures. In order to ensure the utility of sample-based data for re-use in research or education they must be associated with a high quality and quantity of descriptive, discipline-specific metadata. Without an adequate level of documentation, it is not possible to reproduce scientific results or have confidence in using the data for new research inquiries. The required detail in data documentation makes it challenging to aggregate large sets of data from different investigators and disciplines. One solution to this challenge is to build data systems with several tiers of intricacy, where the less detailed tiers are geared toward discovery and interoperability, and the more detailed tiers have higher value for data analysis. The Geoinformatics for Geochemistry (GfG) group, which is part of the Integrated Earth Data Applications facility (http://www.iedadata.org), has taken this approach to provide services for the discovery, access, and analysis of sample-based geochemical data for a diverse user community, ranging from the highly informed geochemist to non-domain scientists and undergraduate students. GfG builds and maintains three tiers in the sample based data systems, from a simple data catalog (Geochemical Resource Library), to a substantially richer data model for the EarthChem Portal (EarthChem XML), and finally to detailed discipline-specific data models for petrologic (PetDB), sedimentary (SedDB), hydrothermal spring (VentDB), and geochronological (GeoChron) samples. The data catalog, the lowest level in the hierarchy, contains the sample data values plus metadata only about the dataset itself (Dublin Core metadata such as dataset title and author), and therefore can accommodate the widest diversity of data holdings. The second level includes measured data values from the sample, basic information about the analytical method, and metadata about the samples such as geospatial information and sample type. The third and highest level includes detailed data quality documentation and more specific information about the scientific context of the sample. The three tiers are linked to allow users to quickly navigate to their desired level of metadata detail. Links are based on the use of unique identifiers: (a) DOI at the granularity of datasets, and (b) the International Geo Sample Number IGSN at the granularity of samples. Current developments in the GfG sample-based systems include new registry architecture for the IGSN to advance international implementation, growth and modification of EarthChemXML to include geochemical data for new sample types such as soils and liquids, and the construction of a hydrothermal vent data system. This flexible, tiered, model provides a solution for offering varying levels of detail in order to aggregate a large quantity of data and serve the largest user group of both disciplinary novices and experts.
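
    A toy illustration of the tiered idea described above, under stated assumptions: a catalog tier with dataset-level discovery metadata, a portal tier with sample values and basic context, and a detailed tier with discipline-specific documentation, linked by DOI and IGSN. The field names and identifier values are illustrative placeholders, not the EarthChemXML schema or real IEDA identifiers.

        # Toy sketch: three metadata tiers linked by dataset DOI and sample IGSN.
        from dataclasses import dataclass, field

        @dataclass
        class CatalogRecord:                 # tier 1: dataset-level discovery metadata
            doi: str
            title: str
            author: str

        @dataclass
        class PortalRecord:                  # tier 2: sample values plus basic context
            igsn: str
            dataset_doi: str
            sample_type: str
            latitude: float
            longitude: float
            measurements: dict = field(default_factory=dict)

        @dataclass
        class DetailedRecord:                # tier 3: discipline-specific documentation
            igsn: str
            analytical_method: str
            data_quality_notes: str

        catalog = CatalogRecord("doi:10.0000/example-dataset", "Example basalt dataset", "A. Author")
        portal = PortalRecord("EXAMPLE0001", catalog.doi, "basalt", -9.5, -110.2,
                              {"SiO2_wt%": 49.2, "MgO_wt%": 8.1})
        detail = DetailedRecord(portal.igsn, "XRF", "standards run every 10 samples")
        print(portal.dataset_doi, "->", detail.igsn)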

  16. Setting Strategic Directions Using Critical Success Factors.

    ERIC Educational Resources Information Center

    Bourne, Bonnie; Gates, Larry; Cofer, James

    2000-01-01

    Describes implementation of a system-level planning model focused on institutional improvement and effectiveness at the University of Missouri. Details implementation of three phases of the strategic planning model (strategic analysis, strategic thinking/decision-making, and campus outreach/systems administration planning); identifies critical…

  17. Pre-Deployment Handbook: Timor-Leste

    DTIC Science & Technology

    2014-05-01

    events as opposed to the detail. In a community where literacy levels are low, the telling of stories in public is an important part of recording...Tempo Semanal is the main national newspaper. However, low literacy levels make this less effective as a means of sharing information. English... information that will assist in understanding the complex environment that is Timor-Leste. The research and analysis supports a range of contingencies

  18. Sizing and Lifecycle Cost Analysis of an Ares V Composite Interstage

    NASA Technical Reports Server (NTRS)

    Mann, Troy; Smeltzer, Stan; Grenoble, Ray; Mason, Brian; Rosario, Sev; Fairbairn, Bob

    2012-01-01

    The Interstage Element of the Ares V launch vehicle was sized using a commercially available structural sizing software tool. Two different concepts were considered, a metallic design and a composite design. Both concepts were sized using similar levels of analysis fidelity and included the influence of design details on each concept. Additionally, the impact of the different manufacturing techniques and failure mechanisms for composite and metallic construction were considered. Significant details were included in analysis models of each concept, including penetrations for human access, joint connections, as well as secondary loading effects. The designs and results of the analysis were used to determine lifecycle cost estimates for the two Interstage designs. Lifecycle cost estimates were based on industry provided cost data for similar launch vehicle components. The results indicated that significant mass as well as cost savings are attainable for the chosen composite concept as compared with a metallic option.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Espinosa-Paredes, Gilberto; Prieto-Guerrero, Alfonso; Nunez-Carrera, Alejandro

    This paper introduces a wavelet-based method to analyze instability events in a boiling water reactor (BWR) during transient phenomena. The methodology to analyze BWR signals includes the following: (a) the short-time Fourier transform (STFT) analysis, (b) decomposition using the continuous wavelet transform (CWT), and (c) application of multiresolution analysis (MRA) using discrete wavelet transform (DWT). STFT analysis permits the study, in time, of the spectral content of analyzed signals. The CWT provides information about ruptures, discontinuities, and fractal behavior. To detect these important features in the signal, a mother wavelet has to be chosen and applied at several scales to obtain optimum results. MRA allows fast implementation of the DWT. Features like important frequencies, discontinuities, and transients can be detected with analysis at different levels of detail coefficients. The STFT was used to provide a comparison between a classic method and the wavelet-based method. The damping ratio, which is an important stability parameter, was calculated as a function of time. The transient behavior can be detected by analyzing the maximum contained in detail coefficients at different levels in the signal decomposition. This method allows analysis of both stationary signals and highly nonstationary signals in the timescale plane. This methodology has been tested with the benchmark power instability event of Laguna Verde nuclear power plant (NPP) Unit 1, which is a BWR-5 NPP.
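
    A hedged sketch of the signal-processing chain listed above, applied to a synthetic signal with a slowly growing oscillation standing in for an instability event; it is not the authors' implementation or the Laguna Verde data. The sampling rate, wavelet, number of levels, and oscillation frequency are assumptions; SciPy and PyWavelets are assumed to be available.

        # Sketch: STFT followed by DWT multiresolution analysis of a nonstationary signal.
        import numpy as np
        import pywt
        from scipy.signal import stft

        fs = 25.0                                    # assumed sampling rate (Hz)
        t = np.arange(0, 200, 1 / fs)
        signal = (np.exp(0.01 * t) * np.sin(2 * np.pi * 0.5 * t)
                  + 0.3 * np.random.default_rng(6).standard_normal(t.size))

        # (a) Short-time Fourier transform: spectral content versus time.
        f, tau, Z = stft(signal, fs=fs, nperseg=256)
        dominant = f[np.abs(Z).argmax(axis=0)]
        print("dominant frequency ranges from", dominant.min(), "to", dominant.max(), "Hz")

        # (c) Multiresolution analysis via the DWT: maxima of the detail coefficients at
        # each level flag transients and discontinuities in the decomposition.
        coeffs = pywt.wavedec(signal, "db4", level=5)
        for level, d in enumerate(reversed(coeffs[1:]), start=1):
            print(f"detail level {level}: max |coefficient| = {np.abs(d).max():.3f}")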

  20. Levels of detail analysis of microwave scattering from human head models for brain stroke detection

    PubMed Central

    2017-01-01

    In this paper, we have presented a microwave scattering analysis of multiple human head models. This study incorporates different levels of detail in the human head models and examines their effect on the microwave scattering phenomenon. Two levels of detail are taken into account: (i) a simplified ellipse-shaped head model and (ii) an anatomically realistic head model, both implemented using 2-D geometry. In addition, the heterogeneous and frequency-dispersive behavior of the brain tissues has been incorporated in our head models. It is identified during this study that the microwave scattering phenomenon changes significantly once the complexity of the head model is increased by incorporating more details from a magnetic resonance imaging database. It is also found that the microwave scattering results match for both types of head model (i.e., geometrically simple and anatomically realistic) when the measurements are made in structurally simple regions. However, the results diverge considerably in the complex areas of the brain due to the arbitrarily shaped interfaces of tissue layers in the anatomically realistic head model. After incorporating the various levels of detail, the solution of the microwave scattering problem and the measurement of transmitted and backscattered signals were obtained using the finite element method. A mesh convergence analysis was also performed to achieve error-free results with a minimum number of mesh elements, fewer degrees of freedom, and fast computational times. The results were promising, and the E-field values converged for both the simple and the complex geometrical models. However, the E-field difference between the two types of head model at the same reference point varied considerably in magnitude: at a complex location a high difference value of 0.04236 V/m was measured, compared to 0.00197 V/m at a simple location. This study also provides a comparison between direct and iterative solvers for obtaining the solution of the microwave scattering problem with minimal computational time and memory requirements. It is seen from this study that microwave imaging may effectively be utilized for the detection, localization, and differentiation of different types of brain stroke. The simulation results verified that microwave imaging can be efficiently exploited to study the significant contrast between the electric field values of normal and abnormal brain tissues for the investigation of brain anomalies. In the end, a specific absorption rate analysis was carried out to compare the effects of the microwave signals on the different types of head model, using a factor of safety for brain tissues. After a careful study of various inversion methods in practice for microwave head imaging, it is also suggested that the contrast source inversion method may be more suitable and computationally efficient for such problems. PMID:29177115

  1. Multi-level Discourse Analysis in a Physics Teaching Methods Course from the Psychological Perspective of Activity Theory

    NASA Astrophysics Data System (ADS)

    Vieira, Rodrigo Drumond; Kelly, Gregory J.

    2014-11-01

    In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective, affords opportunities for analysts to perform a theoretically based detailed analysis of discourse events. Along with the presentation of analysis, we show and discuss how the articulation of different levels offers interpretative criteria for analyzing instructional conversations. We synthesize the results into a model for a teacher's practice and discuss the implications and possibilities of this approach for the field of discourse analysis in science classrooms. Finally, we reflect on how the development of teachers' understanding of their activity structures can contribute to forms of progressive discourse of science education.

  2. Rapid-estimation method for assessing scour at highway bridges

    USGS Publications Warehouse

    Holnbeck, Stephen R.

    1998-01-01

    A method was developed by the U.S. Geological Survey for rapid estimation of scour at highway bridges using limited site data and analytical procedures to estimate pier, abutment, and contraction scour depths. The basis for the method was a procedure recommended by the Federal Highway Administration for conducting detailed scour investigations, commonly referred to as the Level 2 method. Using pier, abutment, and contraction scour results obtained from Level 2 investigations at 122 sites in 10 States, envelope curves and graphical relations were developed that enable determination of scour-depth estimates at most bridge sites in a matter of a few hours. Rather than using complex hydraulic variables, surrogate variables more easily obtained in the field were related to calculated scour-depth data from Level 2 studies. The method was tested by having several experienced individuals apply the method in the field, and results were compared among the individuals and with previous detailed analyses performed for the sites. Results indicated that the variability in predicted scour depth among individuals applying the method generally was within an acceptable range, and that conservatively greater scour depths generally were obtained by the rapid-estimation method compared to the Level 2 method. The rapid-estimation method is considered most applicable for conducting limited-detail scour assessments and as a screening tool to determine those bridge sites that may require more detailed analysis. The method is designed to be applied only by a qualified professional possessing knowledge and experience in the fields of bridge scour, hydraulics, and flood hydrology, and having specific expertise with the Level 2 method.

  3. HydroApps: An R package for statistical simulation to use in regional analysis

    NASA Astrophysics Data System (ADS)

    Ganora, D.

    2013-12-01

    The HydroApps package is a new R extension initially developed to support the use of a recent model for flood frequency estimation developed for applications in Northwestern Italy; it also contains some general tools for regional analyses and can easily be extended to include other statistical models. The package is currently at an experimental level of development. HydroApps is a corollary of the SSEM project for regional flood frequency analysis, although it was developed independently to support various kinds of regional analyses. Its aim is to provide a basis for interplay between statistical simulation and practical operational use. In particular, the main module of the package deals with building the confidence bands of flood frequency curves expressed by means of their L-moments. Other functions include pre-processing and visualization of hydrologic time series and analysis of the optimal design flood under uncertainty, as well as tools useful in water resources management for the estimation of flow duration curves and their sensitivity to water withdrawals. Particular attention is devoted to the code granularity, i.e. the level of detail and aggregation of the code: greater detail means more low-level functions, which brings more flexibility but reduces ease of use in practice. A balance between detail and simplicity is necessary and can be achieved with appropriate wrapper functions and specific help pages for each working block. From a more general viewpoint, the package does not really have a user-friendly interface, but it runs on multiple operating systems and is easy to update, like many other open-source projects. The HydroApps functions and their features are reported in order to share ideas and materials and to improve the 'technological' and information transfer between scientific communities and end users such as policy makers.

  4. Investigating the Influence Relationship Models for Stocks in Indian Equity Market: A Weighted Network Modelling Study

    PubMed Central

    Acharjee, Animesh

    2016-01-01

    The socio-economic systems today possess high levels of both interconnectedness and interdependencies, and such system-level relationships behave very dynamically. In such situations, it is all around perceived that influence is a perplexing power that has an overseeing part in affecting the dynamics and behaviours of involved ones. As a result of the force & direction of influence, the transformative change of one entity has a cogent aftereffect on the other entities in the system. The current study employs directed weighted networks for investigating the influential relationship patterns existent in a typical equity market as an outcome of inter-stock interactions happening at the market level, the sectorial level and the industrial level. The study dataset is derived from 335 constituent stocks of ‘Standard & Poor Bombay Stock Exchange 500 index’ and study period is 1st June 2005 to 30th June 2015. The study identifies the set of most dynamically influential stocks & their respective temporal pattern at three hierarchical levels: the complete equity market, different sectors, and constituting industry segments of those sectors. A detailed influence relationship analysis is performed for the sectorial level network of the construction sector, and it was found that stocks belonging to the cement industry possessed high influence within this sector. Also, the detailed network analysis of construction sector revealed that it follows scale-free characteristics and power law distribution. In the industry specific influence relationship analysis for cement industry, methods based on threshold filtering and minimum spanning tree were employed to derive a set of sub-graphs having temporally stable high-correlation structure over this ten years period. PMID:27846251

  5. Investigating the Influence Relationship Models for Stocks in Indian Equity Market: A Weighted Network Modelling Study.

    PubMed

    Bhattacharjee, Biplab; Shafi, Muhammad; Acharjee, Animesh

    2016-01-01

    The socio-economic systems today possess high levels of both interconnectedness and interdependencies, and such system-level relationships behave very dynamically. In such situations, it is all around perceived that influence is a perplexing power that has an overseeing part in affecting the dynamics and behaviours of involved ones. As a result of the force & direction of influence, the transformative change of one entity has a cogent aftereffect on the other entities in the system. The current study employs directed weighted networks for investigating the influential relationship patterns existent in a typical equity market as an outcome of inter-stock interactions happening at the market level, the sectorial level and the industrial level. The study dataset is derived from 335 constituent stocks of 'Standard & Poor Bombay Stock Exchange 500 index' and study period is 1st June 2005 to 30th June 2015. The study identifies the set of most dynamically influential stocks & their respective temporal pattern at three hierarchical levels: the complete equity market, different sectors, and constituting industry segments of those sectors. A detailed influence relationship analysis is performed for the sectorial level network of the construction sector, and it was found that stocks belonging to the cement industry possessed high influence within this sector. Also, the detailed network analysis of construction sector revealed that it follows scale-free characteristics and power law distribution. In the industry specific influence relationship analysis for cement industry, methods based on threshold filtering and minimum spanning tree were employed to derive a set of sub-graphs having temporally stable high-correlation structure over this ten years period.

  6. Lifetime Measurement in the Yrast Band of 119I

    NASA Astrophysics Data System (ADS)

    Lobach, Yu. N.; Pasternak, A. A.; Srebrny, J.; Droste, Ch.; Hagemann, G. B.; Juutinen, S.; Morek, T.; Piiparinen, M.; Podsvirova, E. O.; Toermaenen, S.; Starosta, K.; Virtanen, A.; Wasilewski, A. A.

    1999-05-01

    The lifetimes of levels in the yrast band of 119I were measured by DSAM and RDM using the 109Ag(13C,3n) reaction at E=54 MeV. A detailed description of the data analysis, including the stopping-power determination and the estimation of the side-feeding time, is given. A modified method of RDM data analysis, Recoil Distance Doppler Shape Attenuation (RDDSA), is used.

  7. Large Deployable Reflector (LDR) thermal characteristics

    NASA Technical Reports Server (NTRS)

    Miyake, R. N.; Wu, Y. C.

    1988-01-01

    The thermal support group, which is part of the lightweight composite reflector panel program, developed thermal test and analysis evaluation tools necessary to support the integrated interdisciplinary analysis (IIDA) capability. A detailed thermal mathematical model and a simplified spacecraft thermal math model were written. These models determine the orbital temperature level and variation, and the thermally induced gradients through and across a panel, for inclusion in the IIDA.

  8. A Cost and Performance System (CAPS) in a Federal agency

    NASA Technical Reports Server (NTRS)

    Huseonia, W. F.; Penton, P. G.

    1994-01-01

    Cost and Performance System (CAPS) is an automated system used from the planning phase through implementation to analysis and documentation. Data is retrievable or available for analysis of cost versus performance anomalies. CAPS provides a uniform system across intra- and international elements. A common system is recommended throughout an entire cost or profit center. Data can be easily accumulated and aggregated into higher levels of tracking and reporting of cost and performance. The level and quality of performance or productivity is indicated in the CAPS model and its process. The CAPS model provides the necessary decision information and insight to the principal investigator/project engineer for a successful project management experience. CAPS provides all levels of management with the appropriate detailed level of data.

  9. AIR EMISSIONS FROM COMBUSTION OF SOLVENT REFINED COAL

    EPA Science Inventory

    The report gives details of a Solvent Refined Coal (SRC) combustion test at Georgia Power Company's Plant Mitchell, March, May, and June 1977. Flue gas samples were collected for modified EPA Level 1 analysis; analytical results are reported. Air emissions from the combustion of ...

  10. Measurement and the Professions: Lessons from Accounting, Law, and Medicine.

    ERIC Educational Resources Information Center

    Nowakowski, Jeri; And Others

    1983-01-01

    This detailed analysis of the role of measurement across the three professions of law, medicine, and accounting offers insights into entry-level and performance barriers in occupations that rely on certification, licensing, and regulation to influence performance, ethics, and training. (Author/PN)

  11. Environmental research program. 1995 Annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, N.J.

    1996-06-01

    The objective of the Environmental Research Program is to enhance the understanding of, and mitigate the effects of pollutants on health, ecological systems, global and regional climate, and air quality. The program is multidisciplinary and includes fundamental research and development in efficient and environmentally benign combustion, pollutant abatement and destruction, and novel methods of detection and analysis of criteria and noncriteria pollutants. This diverse group conducts investigations in combustion, atmospheric and marine processes, flue-gas chemistry, and ecological systems. Combustion chemistry research emphasizes modeling at microscopic and macroscopic scales. At the microscopic scale, functional sensitivity analysis is used to explore the nature of the potential-to-dynamics relationships for reacting systems. Rate coefficients are estimated using quantum dynamics and path integral approaches. At the macroscopic level, combustion processes are modelled using chemical mechanisms at the appropriate level of detail dictated by the requirements of predicting particular aspects of combustion behavior. Parallel computing has facilitated the efforts to use detailed chemistry in models of turbulent reacting flow to predict minor species concentrations.

  12. Citygml and the Streets of New York - a Proposal for Detailed Street Space Modelling

    NASA Astrophysics Data System (ADS)

    Beil, C.; Kolbe, T. H.

    2017-10-01

    Three-dimensional semantic city models are increasingly used for the analysis of large urban areas. Until now the focus has mostly been on buildings. Nonetheless many applications could also benefit from detailed models of public street space for further analysis. However, there are only few guidelines for representing roads within city models. Therefore, related standards dealing with street modelling are examined and discussed. Nearly all street representations are based on linear abstractions. However, there are many use cases that require or would benefit from the detailed geometrical and semantic representation of street space. A variety of potential applications for detailed street space models are presented. Subsequently, based on related standards as well as on user requirements, a concept for a CityGML-compliant representation of street space in multiple levels of detail is developed. In the course of this process, the CityGML Transportation model of the currently valid OGC standard CityGML2.0 is examined to discover possibilities for further developments. Moreover, a number of improvements are presented. Finally, based on open data sources, the proposed concept is implemented within a semantic 3D city model of New York City generating a detailed 3D street space model for the entire city. As a result, 11 thematic classes, such as roadbeds, sidewalks or traffic islands are generated and enriched with a large number of thematic attributes.

  13. Digital classification of Landsat data for vegetation and land-cover mapping in the Blackfoot River watershed, southeastern Idaho

    USGS Publications Warehouse

    Pettinger, L.R.

    1982-01-01

    This paper documents the procedures, results, and final products of a digital analysis of Landsat data used to produce a vegetation and land-cover map of the Blackfoot River watershed in southeastern Idaho. Resource classes were identified at two levels of detail: generalized Level I classes (for example, forest land and wetland) and detailed Levels II and III classes (for example, conifer forest, aspen, wet meadow, and riparian hardwoods). Training set statistics were derived using a modified clustering approach. Environmental stratification that separated uplands from lowlands improved discrimination between resource classes having similar spectral signatures. Digital classification was performed using a maximum likelihood algorithm. Classification accuracy was determined on a single-pixel basis from a random sample of 25-pixel blocks. These blocks were transferred to small-scale color-infrared aerial photographs, and the image area corresponding to each pixel was interpreted. Classification accuracy, expressed as percent agreement of digital classification and photo-interpretation results, was 83.0 ± 2.1 percent (0.95 probability level) for generalized (Level I) classes and 52.2 ± 2.8 percent (0.95 probability level) for detailed (Levels II and III) classes. After the classified images were geometrically corrected, two types of maps were produced of Level I and Levels II and III resource classes: color-coded maps at a 1:250,000 scale, and flatbed-plotter overlays at a 1:24,000 scale. The overlays are more useful because of their larger scale, familiar format to users, and compatibility with other types of topographic and thematic maps of the same scale.
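
    The abstract mentions a maximum likelihood algorithm applied per pixel with class statistics derived from training clusters. The following is a minimal sketch of that kind of Gaussian maximum likelihood classifier; the band count, class names, and synthetic training pixels are assumptions for illustration only and do not reproduce the Landsat processing described above.

```python
# Sketch: per-pixel Gaussian maximum likelihood classification of multispectral data.
# Synthetic values stand in for Landsat band measurements; class names are illustrative.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
classes = ["forest", "wetland", "rangeland"]

# Hypothetical training pixels: (n_samples, n_bands) per class.
training = {
    "forest":    rng.normal([30, 25, 60, 70], 4, size=(200, 4)),
    "wetland":   rng.normal([20, 18, 40, 30], 4, size=(200, 4)),
    "rangeland": rng.normal([45, 50, 55, 50], 4, size=(200, 4)),
}

# Class statistics (mean vector, covariance matrix) estimated from the training sets.
stats = {c: (x.mean(axis=0), np.cov(x, rowvar=False)) for c, x in training.items()}

def classify(pixels):
    """Assign each pixel (row of band values) to the class with the highest log-likelihood."""
    loglik = np.column_stack([
        multivariate_normal.logpdf(pixels, mean=m, cov=cov) for m, cov in stats.values()
    ])
    return np.array(classes)[loglik.argmax(axis=1)]

test_pixels = rng.normal([30, 25, 60, 70], 4, size=(5, 4))   # synthetic "forest-like" pixels
print(classify(test_pixels))
```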

  14. Analysis of 3d Building Models Accuracy Based on the Airborne Laser Scanning Point Clouds

    NASA Astrophysics Data System (ADS)

    Ostrowski, W.; Pilarska, M.; Charyton, J.; Bakuła, K.

    2018-05-01

    Creating 3D building models on a large scale is becoming more popular and finds many applications. Nowadays, the broad term "3D building models" can be applied to several types of products: the well-known CityGML solid models (available at a few Levels of Detail), which are mainly generated from Airborne Laser Scanning (ALS) data, as well as 3D mesh models that can be created from both nadir and oblique aerial images. City authorities and national mapping agencies are interested in obtaining 3D building models. Apart from the completeness of the models, the accuracy aspect is also important. The final accuracy of a building model depends on various factors (accuracy of the source data, complexity of the roof shapes, etc.). In this paper the methodology for inspection of a dataset containing 3D models is presented. The proposed approach checks every building in the dataset against ALS point clouds, testing both accuracy and level of detail. Analysis of statistical parameters of the normal heights for the reference point cloud and the tested planes, together with segmentation of the point cloud, provides a tool that can indicate which buildings and which roof planes do not fulfill the requirements of model accuracy and detail correctness. The proposed method was tested on two datasets: a solid model and a mesh model.

  15. Topographica: Building and Analyzing Map-Level Simulations from Python, C/C++, MATLAB, NEST, or NEURON Components

    PubMed Central

    Bednar, James A.

    2008-01-01

    Many neural regions are arranged into two-dimensional topographic maps, such as the retinotopic maps in mammalian visual cortex. Computational simulations have led to valuable insights about how cortical topography develops and functions, but further progress has been hindered by the lack of appropriate tools. It has been particularly difficult to bridge across levels of detail, because simulators are typically geared to a specific level, while interfacing between simulators has been a major technical challenge. In this paper, we show that the Python-based Topographica simulator makes it straightforward to build systems that cross levels of analysis, as well as providing a common framework for evaluating and comparing models implemented in other simulators. These results rely on the general-purpose abstractions around which Topographica is designed, along with the Python interfaces becoming available for many simulators. In particular, we present a detailed, general-purpose example of how to wrap an external spiking PyNN/NEST simulation as a Topographica component using only a dozen lines of Python code, making it possible to use any of the extensive input presentation, analysis, and plotting tools of Topographica. Additional examples show how to interface easily with models in other types of simulators. Researchers simulating topographic maps externally should consider using Topographica's analysis tools (such as preference map, receptive field, or tuning curve measurement) to compare results consistently, and for connecting models at different levels. This seamless interoperability will help neuroscientists and computational scientists to work together to understand how neurons in topographic maps organize and operate. PMID:19352443

  16. DC Bus Regulation with a Flywheel Energy Storage System

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.; Kascak, Peter E.

    2003-01-01

    This paper describes the DC bus regulation control algorithm for the NASA flywheel energy storage system during charge, charge reduction and discharge modes of operation. The algorithm was experimentally verified with results given in a previous paper. This paper presents the necessary models for simulation with detailed block diagrams of the controller algorithm. It is shown that the flywheel system and the controller can be modeled in three levels of detail depending on the type of analysis required. The three models are explained and then compared using simulation results.

  17. Comparative utility of LANDSAT-1 and Skylab data for coastal wetland mapping and ecological studies

    NASA Technical Reports Server (NTRS)

    Anderson, R.; Alsid, L.; Carter, V.

    1975-01-01

    Skylab 190-A photography and LANDSAT-1 analog data have been analyzed to determine coastal wetland mapping potential as a near term substitute for aircraft data and as a long term monitoring tool. The level of detail and accuracy of each was compared. Skylab data provides more accurate classification of wetland types, better delineation of freshwater marshes and more detailed analysis of drainage patterns. LANDSAT-1 analog data is useful for general classification, boundary definition and monitoring of human impact in wetlands.

  18. The levels of analysis revisited

    PubMed Central

    MacDougall-Shackleton, Scott A.

    2011-01-01

    The term levels of analysis has been used in several ways: to distinguish between ultimate and proximate levels, to categorize different kinds of research questions and to differentiate levels of reductionism. Because questions regarding ultimate function and proximate mechanisms are logically distinct, I suggest that distinguishing between these two levels is the best use of the term. Integrating across levels in research has potential risks, but many benefits. Consideration at one level can help generate novel hypotheses at the other, define categories of behaviour and set criteria that must be addressed. Taking an adaptationist stance thus strengthens research on proximate mechanisms. Similarly, it is critical for researchers studying adaptation and function to have detailed knowledge of proximate mechanisms that may constrain or modulate evolutionary processes. Despite the benefits of integrating across ultimate and proximate levels, failure to clearly identify levels of analysis, and whether or not hypotheses are exclusive alternatives, can create false debates. Such non-alternative hypotheses may occur between or within levels, and are not limited to integrative approaches. In this review, I survey different uses of the term levels of analysis and the benefits of integration, and highlight examples of false debate within and between levels. The best integrative biology reciprocally uses ultimate and proximate hypotheses to generate a more complete understanding of behaviour. PMID:21690126

  19. IAC level "O" program development

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1982-01-01

    The current status of the IAC development activity is summarized. The listed prototype software and documentation were delivered, and detailed plans were made for development of the level 1 operational system. The planned end-product IAC is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of required technical disciplines. The long-term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance and instrument optical performance) that will function together with the IAC supporting software in an integrated and user-friendly manner; and a general framework whereby new analysis modules can readily be incorporated into IAC or be allowed to communicate with it.

  20. The Geomorphology of Puget Sound Beaches

    DTIC Science & Technology

    2006-10-01

    of longer-term climate variations it is referred to as a meteorological residual. An analysis of regional air pressure and water level observations...wave and tidal climate. For further details on the analysis rationale and methods, see Finlayson (2006). The clustering analysis resulted in four profile...energy compared with incident waves on the Pacific Coast, and (2) the wave climate is tightly coupled with local wind patterns. The direction of

  1. Killing Barney Fife: Law Enforcements Socially Constructed Perception of Violence and its Influence on Police Militarization

    DTIC Science & Technology

    2015-09-01

    then examines the correlation between violence and police militarization. A statistical analysis of crime data found an inverse relationship between...violence and police militarization. A statistical analysis of crime data found an inverse relationship between levels of reported violence and...events. The research then focused on the correlation between violence and police militarization. The research began with a detailed statistical

  2. Cost and Price Collaboration

    DTIC Science & Technology

    2016-04-30

    competitions and dual sourcing. [patrick.n.watkins4.civ@mail.mil] Abstract This paper examines how collaboration between the cost analysis and price ...analysis. A review of the CSDRs by the price analyst led to the conclusion that the level of detail in the CSDR between recurring and non-recurring hours...Reports (CSDRs) – Starting in 2004 renewed emphasis on contractually requiring CSDR – CSDRs report actual and non-recurring costs • Price Negotiation

  3. High speed cylindrical roller bearing analysis. SKF computer program CYBEAN. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Dyba, G. J.; Kleckner, R. J.

    1981-01-01

    CYBEAN (CYlindrical BEaring ANalysis) was created to detail radially loaded, aligned and misaligned cylindrical roller bearing performance under a variety of operating conditions. Emphasis was placed on detailing the effects of high speed, preload and system thermal coupling. Roller tilt, skew, radial, circumferential and axial displacement as well as flange contact were considered. Variable housing and flexible out-of-round outer ring geometries, and both steady state and time transient temperature calculations were enabled. The complete range of elastohydrodynamic contact considerations, employing full and partial film conditions were treated in the computation of raceway and flange contacts. The practical and correct implementation of CYBEAN is discussed. The capability to execute the program at four different levels of complexity was included. In addition, the program was updated to properly direct roller-to-raceway contact load vectors automatically in those cases where roller or ring profiles have small radii of curvature. Input and output architectures containing guidelines for use and two sample executions are detailed.

  4. Structural Design of Ares V Interstage Composite Structure

    NASA Technical Reports Server (NTRS)

    Sleigh, David W.; Sreekantamurthy, Thammaiah; Kosareo, Daniel N.; Martin, Robert A.; Johnson, Theodore F.

    2011-01-01

    Preliminary and detailed design studies were performed to mature composite structural design concepts for the Ares V Interstage structure as a part of NASA's Advanced Composite Technologies Project. Aluminum honeycomb sandwich and hat-stiffened composite panel structural concepts were considered. The structural design and analysis studies were performed using HyperSizer design sizing software and MSC Nastran finite element analysis software. System-level design trade studies were carried out to predict weight and margins of safety for composite honeycomb-core sandwich and composite hat-stiffened skin design concepts. Details of both preliminary and detailed design studies are presented in the paper. For the range of loads and geometry considered in this work, the hat-stiffened designs were found to be approximately 11-16 percent lighter than the sandwich designs. A down-select process was used to choose the most favorable structural concept based on a set of figures of merit, and the honeycomb sandwich design was selected as the best concept based on advantages in manufacturing cost.

  5. Weekly Time Course of Neuro-Muscular Adaptation to Intensive Strength Training.

    PubMed

    Brown, Niklas; Bubeck, Dieter; Haeufle, Daniel F B; Weickenmeier, Johannes; Kuhl, Ellen; Alt, Wilfried; Schmitt, Syn

    2017-01-01

    Detailed description of the time course of muscular adaptation is rarely found in literature. Thus, models of muscular adaptation are difficult to validate since no detailed data of adaptation are available. In this article, as an initial step toward a detailed description and analysis of muscular adaptation, we provide a case report of 8 weeks of intense strength training with two active, male participants. Muscular adaptations were analyzed on a morphological level with MRI scans of the right quadriceps muscle and the calculation of muscle volume, on a voluntary strength level by isometric voluntary contractions with doublet stimulation (interpolated twitch technique) and on a non-voluntary level by resting twitch torques. Further, training volume and isokinetic power were closely monitored during the training phase. Data were analyzed weekly for 1 week prior to training, pre-training, 8 weeks of training and 2 weeks of detraining (no strength training). Results show a very individual adaptation to the intense strength training protocol. While training volume and isokinetic power increased linearly during the training phase, resting twitch parameters decreased for both participants after the first week of training and stayed below baseline until de-training. Voluntary activation level showed an increase in the first 4 weeks of training, while maximum voluntary contraction showed only little increase compared to baseline. Muscle volume increased for both subjects. Especially training status seemed to influence the acute reaction to intense strength training. Fatigue had a major influence on performance and could only be overcome by one participant. The results give a first detailed insight into muscular adaptation to intense strength training on various levels, providing a basis of data for a validation of muscle fatigue and adaptation models.

  6. Assessing efficiency of software production for NASA-SEL data

    NASA Technical Reports Server (NTRS)

    Vonmayrhauser, Anneliese; Roeseler, Armin

    1993-01-01

    This paper uses production models to identify and quantify efficient allocation of resources and key drivers of software productivity for project data in the NASA-SEL database. While analysis allows identification of efficient projects, many of the metrics that could have provided a more detailed analysis are not at a level of measurement to allow production model analysis. Production models must be used with proper parameterization to be successful. This may mean a new look at which metrics are helpful for efficiency assessment.

  7. Mod 1 wind turbine generator failure modes and effects analysis

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A failure modes and effects analysis (FMEA) was directed primarily at identifying those critical failure modes that would be hazardous to life or would result in major damage to the system. Each subsystem was approached from the top down, and broken down to successive lower levels where it appeared that the criticality of the failure mode warranted more detailed analysis. The results were reviewed by specialists from outside the Mod 1 program, and corrective action was taken wherever recommended.

  8. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    NASA Technical Reports Server (NTRS)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines, which focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  9. Low heat transfer oxidizer heat exchanger design and analysis

    NASA Technical Reports Server (NTRS)

    Kanic, P. G.; Kmiec, T. D.; Peckham, R. J.

    1987-01-01

    The RL10-IIB engine, a derivative of the RL10, is capable of multi-mode thrust operation. This engine operates at two low thrust levels: tank head idle (THI), which is approximately 1 to 2 percent of full thrust, and pumped idle (PI), which is 10 percent of full thrust. Operation at THI provides vehicle propellant settling thrust and efficient engine thermal conditioning; PI operation provides vehicle tank pre-pressurization and maneuver thrust for low-g deployment. Stable combustion of the RL10-IIB engine at THI and PI thrust levels can be accomplished by providing gaseous oxygen at the propellant injector. Using gaseous hydrogen from the thrust chamber jacket as an energy source, a heat exchanger can be used to vaporize liquid oxygen without creating flow instability. This report summarizes the design and analysis of a United Aircraft Products (UAP) low-rate heat transfer heat exchanger concept for the RL10-IIB rocket engine. The design represents a second iteration of the RL10-IIB heat exchanger investigation program. The design and analysis of the first heat exchanger effort is presented in more detail in NASA CR-174857. Testing of the previous design is detailed in NASA CR-179487.

  10. Detailed Vibration Analysis of Pinion Gear with Time-Frequency Methods

    NASA Technical Reports Server (NTRS)

    Mosher, Marianne; Pryor, Anna H.; Lewicki, David G.

    2003-01-01

    In this paper, the authors show a detailed analysis of the vibration signal from the destructive testing of a spiral bevel gear and pinion pair containing seeded faults. The vibration signal is analyzed in the time domain, frequency domain and with four time-frequency transforms: the Short Time Frequency Transform (STFT), the Wigner-Ville Distribution with the Choi-Williams kernel (WV-CW), the Continuous Wavelet Transform (CWT) and the Discrete Wavelet Transform (DWT). Vibration data of bevel gear tooth fatigue cracks, under a variety of operating load levels and damage conditions, are analyzed using these methods. A new metric for automatic anomaly detection is developed and can be produced from any systematic numerical representation of the vibration signals. This new metric reveals indications of gear damage with all of the time-frequency transforms, as well as time and frequency representations, on this data set. Analysis with the CWT detects changes in the signal at low torque levels not found with the other transforms. The WV-CW and CWT use considerably more resources than the STFT and the DWT. More testing of the new metric is needed to determine its value for automatic anomaly detection and to develop fault detection methods for the metric.
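
    As a rough illustration of the kinds of transforms the paper compares, the sketch below computes an STFT and a discrete wavelet decomposition of a synthetic gear-vibration-like signal. The sampling rate, signal model, wavelet choice and band-energy summary are all assumptions, not the authors' data or their anomaly metric.

```python
# Sketch: time-frequency views of a synthetic vibration signal (STFT and DWT).
# The signal model and parameters are illustrative, not the actual gear test data.
import numpy as np
from scipy.signal import stft
import pywt

fs = 20_000                                   # assumed sampling rate [Hz]
t = np.arange(0, 1.0, 1.0 / fs)
mesh = np.sin(2 * np.pi * 1_000 * t)          # "gear mesh" tone
fault = 0.3 * np.sin(2 * np.pi * 150 * t) * (t > 0.5)   # modulation appearing late in the run
signal = mesh + fault + 0.05 * np.random.default_rng(2).normal(size=t.size)

# Short Time Fourier Transform: spectrogram-style magnitude matrix.
f, seg_times, Z = stft(signal, fs=fs, nperseg=1024)
print("STFT magnitude matrix shape:", np.abs(Z).shape)

# Discrete Wavelet Transform: multilevel decomposition with a Daubechies wavelet.
coeffs = pywt.wavedec(signal, "db4", level=5)
print("DWT detail-band energies:", [round(float(np.sum(c**2)), 1) for c in coeffs[1:]])
```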

  11. Toward Theory-Based Instruction in Scientific Problem Solving.

    ERIC Educational Resources Information Center

    Heller, Joan I.; And Others

    Several empirical and theoretical analyses related to scientific problem-solving are reviewed, including: detailed studies of individuals at different levels of expertise, and computer models simulating some aspects of human information processing during problem solving. Analysis of these studies has revealed many facets about the nature of the…

  12. 49 CFR 229.209 - Alternative locomotive crashworthiness designs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... locomotive crashworthiness design, in detail; (3) The intended type of service for locomotives built under the proposed design; and (4) Appropriate data and analysis showing how the design either satisfies the requirements of § 229.205 for the type of locomotive or provides at least an equivalent level of safety. Types...

  13. Monitoring and Controlling Engineering and Construction Management Cost Performance Within the Corps of Engineers

    DTIC Science & Technology

    1988-12-01

    COST MANAGEMENT: The CMIF approach addresses total costs but does not permit the analysis of indirect costs. We found that indirect costs vary... [table fragment relating CMIF level of detail to responsibility (USACE/divisions, districts/divisions, districts) and to fund type (G&A, technical indirect, burden)]

  14. Course Level and the Relationship between Research Productivity and Teaching Effectiveness

    ERIC Educational Resources Information Center

    Arnold, Ivo J. M.

    2008-01-01

    The author examines the relationship between research productivity and teaching effectiveness using data from the Erasmus School of Economics. The initial findings indicate a positive overall relationship between the variables. A more detailed analysis reveals a sharp reversal in the nature of the relationship. Although the relationship is…

  15. Performance Analysis of Saturated Induction Motors by Virtual Tests

    ERIC Educational Resources Information Center

    Ojaghi, M.; Faiz, J.; Kazemi, M.; Rezaei, M.

    2012-01-01

    Many undergraduate-level electrical machines textbooks give detailed treatments of the performance of induction motors. Students can deepen this understanding of motor performance by performing the appropriate practical work in laboratories or in simulation using proper software packages. This paper considers various common and less-common tests…

  16. Personality and Subjective Well-Being: What Hides behind Global Analyses?

    ERIC Educational Resources Information Center

    Albuquerque, Isabel; de Lima, Margarida Pedroso; Matos, Marcela; Figueiredo, Claudia

    2012-01-01

    The relation between personality and subjective well-being (SWB) remains involved in a considerable ambiguity and the numerous studies conducted have neglected an approach at a more detailed level of analysis. This study explores the idea that neuroticism, extraversion and conscientiousness facets predict differentially each SWB component. A…

  17. Measuring Pressure Volume Loops in the Mouse.

    PubMed

    Townsend, DeWayne

    2016-05-02

    Understanding the causes and progression of heart disease presents a significant challenge to the biomedical community. The genetic flexibility of the mouse provides great potential to explore cardiac function at the molecular level. The mouse's small size does present some challenges in regards to performing detailed cardiac phenotyping. Miniaturization and other advancements in technology have made many methods of cardiac assessment possible in the mouse. Of these, the simultaneous collection of pressure and volume data provides a detailed picture of cardiac function that is not available through any other modality. Here a detailed procedure for the collection of pressure-volume loop data is described. Included is a discussion of the principles underlying the measurements and the potential sources of error. Anesthetic management and surgical approaches are discussed in great detail as they are both critical to obtaining high quality hemodynamic measurements. The principles of hemodynamic protocol development and relevant aspects of data analysis are also addressed.

  18. GOplot: an R package for visually combining expression data with functional analysis.

    PubMed

    Walter, Wencke; Sánchez-Cabo, Fátima; Ricote, Mercedes

    2015-09-01

    Despite the plethora of methods available for the functional analysis of omics data, obtaining a comprehensive yet detailed understanding of the results remains challenging. This is mainly due to the lack of publicly available tools for the visualization of this type of information. Here we present an R package called GOplot, based on ggplot2, for enhanced graphical representation. Our package takes the output of any general enrichment analysis and generates plots at different levels of detail: from a general overview to identify the most enriched categories (bar plot, bubble plot) to a more detailed view displaying different types of information for molecules in a given set of categories (circle plot, chord plot, cluster plot). The package provides a deeper insight into omics data and allows scientists to generate insightful plots with only a few lines of code to easily communicate the findings. The R package GOplot is available via CRAN-The Comprehensive R Archive Network: http://cran.r-project.org/web/packages/GOplot. The shiny web application of the Venn diagram can be found at: https://wwalter.shinyapps.io/Venn/. A detailed manual of the package with sample figures can be found at https://wencke.github.io/. Contact: fscabo@cnic.es or mricote@cnic.es. © The Author 2015. Published by Oxford University Press. All rights reserved.

  19. Analysis of the impact of simulation model simplifications on the quality of low-energy buildings simulation results

    NASA Astrophysics Data System (ADS)

    Klimczak, Marcin; Bojarski, Jacek; Ziembicki, Piotr; Kęskiewicz, Piotr

    2017-11-01

    The requirements concerning the energy performance of buildings and their internal installations, particularly HVAC systems, have been growing continuously in Poland and all over the world. The existing, traditional calculation methods based on a static heat exchange model are frequently not sufficient for a reasonable heating design of a building. Both in Poland and elsewhere in the world, methods and software are employed which allow a detailed simulation of the heating and moisture conditions in a building, as well as an analysis of the performance of HVAC systems within it. However, these tools are usually complex and difficult to use. In addition, developing a simulation model that adequately represents the real building demands considerable designer involvement and is time-consuming and laborious. Simplifying the simulation model of a building makes it possible to reduce the cost of computer simulations. The paper analyses in detail the effect of introducing a number of different variants of the simulation model developed in Design Builder on the quality of the final results. The objective of this analysis is to find simplifications which yield simulation results with an acceptable level of deviation from the detailed model, thus facilitating a quick energy performance analysis of a given building.

  20. Simulation on a car interior aerodynamic noise control based on statistical energy analysis

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Wang, Dengfeng; Ma, Zhengdong

    2012-09-01

    How to simulate interior aerodynamic noise accurately is an important question in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces is shown to be the key factor in controlling car interior aerodynamic noise at high frequency and high speed. In this paper, a detailed statistical energy analysis (SEA) model is built, and vibro-acoustic power inputs are loaded onto the model to obtain valid results for car interior noise analysis. The model is a solid foundation for further optimization of car interior noise control. After the subsystems whose power contributions to car interior noise are most sensitive are identified by a comprehensive SEA analysis, the sound pressure level of car interior aerodynamic noise can be reduced by improving their sound and damping characteristics. Further vehicle testing results show that it is possible to improve the interior acoustic performance by using the detailed SEA model, which comprises more than 80 subsystems, together with the unsteady aerodynamic pressure calculation on body surfaces and improvements to the sound/damping properties of materials. A reduction of more than 2 dB is achieved at the central frequencies of the spectrum above 800 Hz. The proposed optimization method can be regarded as a reference for car interior aerodynamic noise control using the detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and sensitivity analysis of acoustic contributions.

  1. Language Geography from Microblogging Platforms

    NASA Astrophysics Data System (ADS)

    Mocanu, Delia; Baronchelli, Andrea; Perra, Nicola; Gonçalves, Bruno; Vespignani, Alessandro

    2013-03-01

    Microblogging platforms have now become major open source indicators for complex social interactions. With the advent of smartphones, the ever-increasing mobile Internet traffic gives us an unprecedented opportunity to complement studies of complex social phenomena with real-time location information. In this work, we show that the data now accessible allows for detailed studies at different scales, ranging from country-level aggregate analysis to the analysis of linguistic communities within specific neighborhoods. The high resolution and coverage of this data permit us to investigate such issues as the linguistic homogeneity of different countries, touristic seasonal patterns within countries, and the geographical distribution of different languages in bilingual regions. This work highlights the potential of geolocalized studies of open data sources that can provide an extremely detailed picture of the language geography.

  2. U.S. Army Base Closure Program, Final Decision Document, Cameron Station, Alexandria, Virginia

    DTIC Science & Technology

    1993-11-01

    Fragment of the report's front matter (list of tables and acronyms): Table 2-7, Chemicals Contributing to Excess Cancer Risk at a Pathway Level of 1E-6 or Greater; Table 2-8, Summary of Detailed Analysis, OU-1 PCB Transformer Service, Storage and Spill Areas; acronym entries including MCL (Maximum Contaminant Level), mg (milligram), and O&M.

  3. Probing photoelectron multiple interferences via Fourier spectroscopy in energetic photoionization of Xe@C60

    NASA Astrophysics Data System (ADS)

    Potter, Andrea; McCune, Matthew A.; de, Ruma; Madjet, Mohamed E.; Chakraborty, Himadri S.

    2010-09-01

    Considering the photoionization of the Xe@C60 endohedral compound, we study in detail the ionization cross sections of various levels of the system at energies higher than the plasmon resonance region. Five classes of single-electron levels are identified depending on their spectral character. Each class engenders distinct oscillations in the cross section, emerging from the interference between active ionization modes specific to that class. Analysis of the cross sections based on their Fourier transforms unravels oscillation frequencies that carry unique fingerprints of the emitting level.

  4. 78 FR 66929 - Intent To Conduct a Detailed Economic Impact Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... EXPORT-IMPORT BANK Intent To Conduct a Detailed Economic Impact Analysis AGENCY: Policy and... Federal Register notice informing the public of its intent to conduct a detailed economic impact analysis... subject to a detailed economic impact analysis. DATES: The Federal Register notice published on August 5...

  5. Base-By-Base: single nucleotide-level analysis of whole viral genome alignments.

    PubMed

    Brodie, Ryan; Smith, Alex J; Roper, Rachel L; Tcherepanov, Vasily; Upton, Chris

    2004-07-14

    With ever increasing numbers of closely related virus genomes being sequenced, it has become desirable to be able to compare two genomes at a level more detailed than gene content because two strains of an organism may share the same set of predicted genes but still differ in their pathogenicity profiles. For example, detailed comparison of multiple isolates of the smallpox virus genome (each approximately 200 kb, with 200 genes) is not feasible without new bioinformatics tools. A software package, Base-By-Base, has been developed that provides visualization tools to enable researchers to 1) rapidly identify and correct alignment errors in large, multiple genome alignments; and 2) generate tabular and graphical output of differences between the genomes at the nucleotide level. Base-By-Base uses detailed annotation information about the aligned genomes and can list each predicted gene with nucleotide differences, display whether variations occur within promoter regions or coding regions and whether these changes result in amino acid substitutions. Base-By-Base can connect to our mySQL database (Virus Orthologous Clusters; VOCs) to retrieve detailed annotation information about the aligned genomes or use information from text files. Base-By-Base enables users to quickly and easily compare large viral genomes; it highlights small differences that may be responsible for important phenotypic differences such as virulence. It is available via the Internet using Java Web Start and runs on Macintosh, PC and Linux operating systems with the Java 1.4 virtual machine.
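
    A minimal sketch of the kind of nucleotide-level comparison the abstract describes: given two pre-aligned sequences, report each position that differs and whether it falls inside an annotated gene. The sequences, gene coordinates and output format are invented for illustration and do not reflect Base-By-Base's internals or file formats.

```python
# Sketch: nucleotide-level comparison of two pre-aligned genome fragments.
# Sequences and gene annotations are toy examples, not real viral data.

ref   = "ATGGCTA-ACGTTGACCTA"
query = "ATGGCTAGACGTCGAC-TA"
genes = {"geneA": (0, 8), "geneB": (11, 19)}   # hypothetical half-open alignment coordinates

def diff_alignment(ref_seq, query_seq, annotations):
    """Yield (alignment_position, ref_base, query_base, gene_or_None) for each difference."""
    assert len(ref_seq) == len(query_seq), "sequences must be pre-aligned"
    for pos, (r, q) in enumerate(zip(ref_seq, query_seq)):
        if r != q:
            gene = next((name for name, (s, e) in annotations.items() if s <= pos < e), None)
            yield pos, r, q, gene

for pos, r, q, gene in diff_alignment(ref, query, genes):
    kind = "indel" if "-" in (r, q) else "substitution"
    where = gene if gene else "intergenic"
    print(f"pos {pos:2d}: {r} -> {q}  ({kind}, {where})")
```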

  6. Training Guide for Severe Weather Forecasters

    DTIC Science & Technology

    1979-11-01

    that worked very well for the example forecast is used to show the importance of parameter intensities and the actual thought processes that go into the...simplify the explanation of the complete level analysis. This entire process will be repeated for the 700 mb and 500 mb levels. Details in Figures 1 through...parameters of moderate to strong intensity must occur, in the same place at the same time. A description of what constitutes a weak, moderate, or strong

  7. Alternative Fuels Data Center: State Alternative Fuel and Advanced Vehicle

    Science.gov Websites

    2017, the number of new state-level legislative, executive, private, and utility activities related to 2016 (54 total), but in line with previous years. The analysis below provides detail on trends in new (PEVs). Also consistent with recent prior years, new utility and private incentives primarily addressed

  8. The Use of Images in Intelligent Advisor Systems.

    ERIC Educational Resources Information Center

    Boulet, Marie-Michele

    This paper describes the intelligent advisor system, named CODAMA, used in teaching a university-level systems analysis and design course. The paper discusses: (1) the use of CODAMA to assist students to transfer theoretical knowledge to the practical; (2) details of how CODAMA is applied in conjunction with a computer-aided software engineering…

  9. Evolutionary conservation, diversity and specificity of LTR retrotransposons in flowering plants: insights from genome-wide analysis and multi-specific comparison

    USDA-ARS?s Scientific Manuscript database

    The availability of complete or nearly complete genome sequences from several plant species permits detailed discovery and cross-species comparison of transposable elements (TEs) at the whole genome level. We initially investigated 510 LTR-retrotransposon (LTR-RT) families that are comprised of 32,...

  10. Comparative anatomy of the female genitalia of generic-level taxa in tribe Aedini (Diptera: Culicidae). Part XXXIV. Genus Catageiomyia Theobald

    USDA-ARS?s Scientific Manuscript database

    A comparative, morphological analysis of the female genitalia of species included in genus Catageiomyia Theobald was conducted. Treatment of the genital morphology of the genus includes a composite description of the genus, a detailed description and illustration of the type species (Cg. irritans (...

  11. Leveling the Playing Field: Increasing Student Achievement through Data-Driven Ability Grouping and Instructional Practices

    ERIC Educational Resources Information Center

    Sexton, Jami

    2010-01-01

    This action research project focuses on increasing student comprehension and achievement. The study examined the effectiveness of completing detailed item analysis of assessments for the purpose of placing students into different Language Arts classes and learning groups within those classes. Research advocates placing students of similar ability…

  12. Teaching Digital Libraries in Spain: Context and Experiences

    ERIC Educational Resources Information Center

    Garcia-Marco, Francisco-Javier

    2009-01-01

    The situation of digital libraries teaching and learning in Spain up to 2008 is examined. A detailed analysis of the different curricula and subjects is provided both at undergraduate and postgraduate level. Digital libraries have been mostly a postgraduate topic in Spain, but they should become mainstream, with special subjects devoted to them,…

  13. The "Virtual ChemLab" Project: A Realistic and Sophisticated Simulation of Organic Synthesis and Organic Qualitative Analysis

    ERIC Educational Resources Information Center

    Woodfield, Brian F.; Andrus, Merritt B.; Waddoups, Gregory L.; Moore, Melissa S.; Swan, Richard; Allen, Rob; Bodily, Greg; Andersen, Tricia; Miller, Jordan; Simmons, Bryon; Stanger, Richard

    2005-01-01

    A set of sophisticated and realistic laboratory simulations is created for use in freshman- and sophomore-level chemistry classes and laboratories called 'Virtual ChemLab'. A detailed assessment of student responses is provided and the simulation's pedagogical utility is described using the organic simulation.

  14. Revealing the structural nature of the Cd isotopes

    NASA Astrophysics Data System (ADS)

    Garrett, P. E.; Diaz Varela, A.; Green, K. L.; Jamieson, D. S.; Jigmeddorj, B.; Wood, J. L.; Yates, S. W.

    2015-10-01

    The even-even Cd isotopes have provided fertile ground for the investigation of collectivity in nuclei. Soon after the development of the Bohr model, the stable Cd isotopes were identified as nearly harmonic vibrators based on their excitation energy patterns. The measurements of enhanced B(E2) values appeared to support this interpretation. Shape-coexisting rotational-like intruder bands were discovered, and mixing between the configurations was invoked to explain the deviation of the decay pattern of multiphonon vibrational states. Very recently, a detailed analysis of the low-lying levels of 110Cd, combining results of the (n,n'γ) reaction and high-statistics β decay, provided strong evidence that the mixing between configurations is weak, except for the ground-state band and the "Kπ = 0+" intruder band. The analysis of the levels in 110Cd has now been extended to 3 MeV and, combined with data for 112Cd and previous Coulomb excitation data for 114Cd, enables a detailed map of the E2 collectivity in these nuclei, demanding a complete re-interpretation of the structure of the stable Cd isotopes.

  15. Lunar Exploration Architecture Level Key Drivers and Sensitivities

    NASA Technical Reports Server (NTRS)

    Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher

    2009-01-01

    Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option are properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus- variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.
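
    The abstract describes combining deterministic and probabilistic modeling and rolling results up into high-level Figures of Merit. A much-reduced sketch of that pattern is shown below: a Monte Carlo loop over uncertain cost and mission-rate inputs that produces distributions for two illustrative FOMs. The input ranges and FOM definitions are invented for the example and are not the Constellation analysis.

```python
# Sketch: probabilistic roll-up of uncertain architecture inputs into illustrative FOMs.
# All distributions and FOM definitions are assumptions made for this example.
import numpy as np

rng = np.random.default_rng(5)
n = 10_000

# Uncertain inputs (triangular distributions: low, mode, high are assumed values).
lander_cost   = rng.triangular(1.8, 2.2, 3.0, n)     # $B per cargo lander
missions_year = rng.triangular(1.0, 2.0, 3.0, n)     # flight rate
payload_t     = rng.triangular(12.0, 15.0, 17.0, n)  # delivered cargo per mission [t]

fixed_cost = 1.5                                      # $B/yr, assumed fixed infrastructure cost

# Illustrative FOMs: annual cost and cost per delivered tonne.
annual_cost = fixed_cost + lander_cost * missions_year
cost_per_tonne = annual_cost / (missions_year * payload_t)

for name, fom in [("annual cost [$B]", annual_cost), ("cost per tonne [$B/t]", cost_per_tonne)]:
    p10, p50, p90 = np.percentile(fom, [10, 50, 90])
    print(f"{name:22s} P10={p10:.2f}  P50={p50:.2f}  P90={p90:.2f}")
```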

  16. Analyzing developmental processes on an individual level using nonstationary time series modeling.

    PubMed

    Molenaar, Peter C M; Sinclair, Katerina O; Rovine, Michael J; Ram, Nilam; Corneal, Sherry E

    2009-01-01

    Individuals change over time, often in complex ways. Generally, studies of change over time have combined individuals into groups for analysis, which is inappropriate in most, if not all, studies of development. The authors explain how to identify appropriate levels of analysis (individual vs. group) and demonstrate how to estimate changes in developmental processes over time using a multivariate nonstationary time series model. They apply this model to describe the changing relationships between a biological son and father and a stepson and stepfather at the individual level. The authors also explain how to use an extended Kalman filter with iteration and smoothing estimator to capture how dynamics change over time. Finally, they suggest further applications of the multivariate nonstationary time series model and detail the next steps in the development of statistical models used to analyze individual-level data.
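
    The article's machinery (a multivariate nonstationary time series model estimated with an iterated, smoothed extended Kalman filter) is more involved than can be reproduced here, but the sketch below shows the core idea in its simplest form: a state-space model in which the coefficient linking two individual-level series drifts over time and is tracked with a basic scalar Kalman filter. The data-generating process and all parameter values are assumptions.

```python
# Sketch: tracking a time-varying relationship between two individual-level series
# with a random-walk-coefficient state-space model and a basic Kalman filter.
import numpy as np

rng = np.random.default_rng(3)
T = 300
x = rng.normal(size=T)                              # e.g., one family member's daily score
true_beta = np.linspace(0.2, 1.2, T)                # relationship strengthens over time
y = true_beta * x + 0.3 * rng.normal(size=T)        # the other member's daily score

q, r = 1e-3, 0.3**2        # assumed state-drift and observation noise variances
beta_hat, P = 0.0, 1.0     # initial coefficient estimate and its variance
estimates = []
for t in range(T):
    # Predict: beta follows a random walk, so the mean is unchanged and variance grows.
    P += q
    # Update with the observation y[t] = beta * x[t] + noise.
    S = x[t] * P * x[t] + r                 # innovation variance
    K = P * x[t] / S                        # Kalman gain
    beta_hat += K * (y[t] - x[t] * beta_hat)
    P *= (1.0 - K * x[t])
    estimates.append(beta_hat)

print("final estimated coefficient:", round(estimates[-1], 3), "true:", round(true_beta[-1], 3))
```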

  17. The use of open source bioinformatics tools to dissect transcriptomic data.

    PubMed

    Nitsche, Benjamin M; Ram, Arthur F J; Meyer, Vera

    2012-01-01

    Microarrays are a valuable technology for studying fungal physiology at the transcriptomic level. Various microarray platforms are available, comprising both single- and two-channel arrays. Despite the different technologies, preprocessing of microarray data generally includes quality control, background correction, normalization, and summarization of probe-level data. Subsequently, depending on the experimental design, diverse statistical analyses can be performed, including the identification of differentially expressed genes and the construction of gene coexpression networks. We describe how Bioconductor, a collection of open source and open development packages for the statistical programming language R, can be used for dissecting microarray data. We provide fundamental details that facilitate the process of getting started with R and Bioconductor. Using two publicly available microarray datasets from Aspergillus niger, we give detailed protocols on how to identify differentially expressed genes and how to construct gene coexpression networks.

  18. Unraveling Mixed Hydrate Formation: Microscopic Insights into Early Stage Behavior.

    PubMed

    Hall, Kyle Wm; Zhang, Zhengcai; Kusalik, Peter G

    2016-12-29

    The molecular-level details of mixed hydrate nucleation remain unclear despite the broad implications of this process for a variety of scientific domains. Through analysis of mixed hydrate nucleation in a prototypical CH4/H2S/H2O system, we demonstrate that high-level kinetic similarities between mixed hydrate systems and corresponding pure hydrate systems are not a reliable basis for estimating the composition of early stage mixed hydrate nuclei. Moreover, we show that solution compositions prior to and during nucleation are not necessarily effective proxies for the composition of early stage mixed hydrate nuclei. Rather, microscopic details (e.g., guest-host interactions and previously neglected cage types) apparently play key roles in determining early stage behavior of mixed hydrates. This work thus provides key foundational concepts and insights for understanding mixed hydrate nucleation.

  19. Spatio-temporal Analysis for New York State SPARCS Data

    PubMed Central

    Chen, Xin; Wang, Yu; Schoenfeld, Elinor; Saltz, Mary; Saltz, Joel; Wang, Fusheng

    2017-01-01

    Increased accessibility of health data provides unique opportunities to discover spatio-temporal patterns of diseases. For example, New York State SPARCS (Statewide Planning and Research Cooperative System) data collects patient level detail on patient demographics, diagnoses, services, and charges for each hospital inpatient stay and outpatient visit. Such data also provides home addresses for each patient. This paper presents our preliminary work on spatial, temporal, and spatial-temporal analysis of disease patterns for New York State using SPARCS data. We analyzed spatial distribution patterns of typical diseases at ZIP code level. We performed temporal analysis of common diseases based on 12 years’ historical data. We then compared the spatial variations for diseases with different levels of clustering tendency, and studied the evolution history of such spatial patterns. Case studies based on asthma demonstrated that the discovered spatial clusters are consistent with prior studies. We visualized our spatial-temporal patterns as animations through videos. PMID:28815148
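
    A minimal sketch of the kind of ZIP-code-level and month-by-month aggregation described above, using a synthetic visit table in place of SPARCS records (which are not public at patient level); the column names, ZIP codes and asthma filter are illustrative assumptions.

```python
# Sketch: spatio-temporal aggregation of synthetic inpatient/outpatient visit records.
# Column names mimic, but do not reproduce, a SPARCS-like extract.
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
n = 5_000
visits = pd.DataFrame({
    "zip":       rng.choice(["11794", "11790", "11733", "11776"], size=n),
    "diagnosis": rng.choice(["asthma", "diabetes", "influenza"], size=n, p=[0.3, 0.4, 0.3]),
    "date":      pd.to_datetime("2010-01-01") + pd.to_timedelta(rng.integers(0, 365 * 3, n), "D"),
})

asthma = visits[visits["diagnosis"] == "asthma"]

# Spatial pattern: visit counts per ZIP code.
by_zip = asthma.groupby("zip").size().rename("asthma_visits")

# Temporal pattern: monthly counts (seasonality would show up here).
by_month = asthma.groupby(asthma["date"].dt.to_period("M")).size()

# Spatio-temporal: ZIP x month matrix, the raw material for clustering or animation.
zip_month = asthma.groupby(["zip", asthma["date"].dt.to_period("M")]).size().unstack(fill_value=0)

print(by_zip)
print(zip_month.iloc[:, :3])
```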

  20. Independent Orbiter Assessment (IOA): Weibull analysis report

    NASA Technical Reports Server (NTRS)

    Raffaelli, Gary G.

    1987-01-01

    The Auxiliary Power Unit (APU) and Hydraulic Power Unit (HPU) Space Shuttle Subsystems were reviewed as candidates for demonstrating the Weibull analysis methodology. Three hardware components were identified as analysis candidates: the turbine wheel, the gearbox, and the gas generator. Detailed review of subsystem level wearout and failure history revealed the lack of actual component failure data. In addition, component wearout data were not readily available or would require a separate data accumulation effort by the vendor. Without adequate component history data being available, the Weibull analysis methodology application to the APU and HPU subsystem group was terminated.
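
    As a reminder of what the terminated Weibull analysis would have involved had component failure data been available, the sketch below fits a two-parameter Weibull distribution to synthetic time-to-failure data and reports the shape parameter that distinguishes infant mortality (<1), random failures (~1), and wearout (>1). The failure times and the use of scipy are assumptions for illustration only.

```python
# Sketch: two-parameter Weibull fit to synthetic component time-to-failure data.
# The failure times are simulated; a real analysis would use field or test data.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(4)
true_shape, true_scale = 2.5, 1_000.0                  # wearout-dominated example (shape > 1)
failures = true_scale * rng.weibull(true_shape, size=60)   # synthetic failure times [h]

# Fit with the location parameter fixed at zero (the usual 2-parameter Weibull).
shape, loc, scale = weibull_min.fit(failures, floc=0)

print(f"estimated shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} h")
print("interpretation:", "wearout" if shape > 1 else "infant mortality / random failures")
# B10 life: time by which 10% of units are expected to fail.
print("B10 life ~", round(weibull_min.ppf(0.10, shape, scale=scale), 1), "h")
```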

  1. Structural dynamic analysis of the Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Scott, L. P.; Jamison, G. T.; Mccutcheon, W. A.; Price, J. M.

    1981-01-01

    This structural dynamic analysis supports development of the SSME by evaluating components subjected to critical dynamic loads, identifying significant parameters, and evaluating solution methods. Engine operating parameters at both rated and full power levels are considered. Detailed structural dynamic analyses of operationally critical and life limited components support the assessment of engine design modifications and environmental changes. Engine system test results are utilized to verify analytic model simulations. The SSME main chamber injector assembly is an assembly of 600 injector elements which are called LOX posts. The overall LOX post analysis procedure is shown.

  2. Analysis of polonium-210 in food products and bioassay samples by isotope-dilution alpha spectrometry.

    PubMed

    Lin, Zhichao; Wu, Zhongyu

    2009-05-01

    A rapid and reliable radiochemical method coupled with a simple and compact plating apparatus was developed, validated, and applied for the analysis of (210)Po in a variety of food products and bioassay samples. The method performance characteristics, including accuracy, precision, robustness, and specificity, were evaluated along with a detailed measurement uncertainty analysis. With high Po recovery, improved energy resolution, and effective removal of interfering elements by chromatographic extraction, the overall method accuracy was determined to be better than 5%, with a measurement precision of 10% at the 95% confidence level.
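
    The abstract reports accuracy and precision together with a detailed measurement uncertainty analysis. As a generic illustration of how such a combined standard uncertainty is assembled (not the authors' actual uncertainty budget), the sketch below propagates a few relative uncertainty components in quadrature for an isotope-dilution-style measurement; all component values are invented.

```python
# Sketch: quadrature combination of relative uncertainty components for an
# isotope-dilution-style activity measurement. Component values are illustrative.
import math

# Assumed relative standard uncertainties (as fractions) for a generic budget.
components = {
    "tracer activity":                 0.020,
    "sample counting statistics":      0.030,
    "tracer counting statistics":      0.025,
    "sample mass / aliquoting":        0.005,
    "detector background subtraction": 0.010,
}

combined = math.sqrt(sum(u**2 for u in components.values()))
expanded = 2.0 * combined          # coverage factor k=2, roughly 95% confidence level

activity = 0.85                    # hypothetical measured 210Po activity [Bq/kg]
print(f"combined relative uncertainty: {combined:.1%}")
print(f"result: {activity:.2f} Bq/kg +/- {expanded * activity:.2f} Bq/kg (k=2)")
```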

  3. Algorithms and architecture for multiprocessor based circuit simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deutsch, J.T.

    Accurate electrical simulation is critical to the design of high performance integrated circuits. Logic simulators can verify function and give first-order timing information. Switch level simulators are more effective at dealing with charge sharing than standard logic simulators, but cannot provide accurate timing information or discover DC problems. Delay estimation techniques and cell level simulation can be used in constrained design methods, but must be tuned for each application, and circuit simulation must still be used to generate the cell models. None of these methods has the guaranteed accuracy that many circuit designers desire, and none can provide detailed waveform information. Detailed electrical-level simulation can predict circuit performance if devices and parasitics are modeled accurately. However, the computational requirements of conventional circuit simulators make it impractical to simulate current large circuits. In this dissertation, the implementation of Iterated Timing Analysis (ITA), a relaxation-based technique for accurate circuit simulation, on a special-purpose multiprocessor is presented. The ITA method is an SOR-Newton, relaxation-based method which uses event-driven analysis and selective trace to exploit the temporal sparsity of the electrical network. Because event-driven selective trace techniques are employed, this algorithm lends itself to implementation on a data-driven computer.

  4. Analysis of LMNB1 Duplications in Autosomal Dominant Leukodystrophy Provides Insights into Duplication Mechanisms and Allele-Specific Expression

    PubMed Central

    Giorgio, Elisa; Rolyan, Harshvardhan; Kropp, Laura; Chakka, Anish Baswanth; Yatsenko, Svetlana; Gregorio, Eleonora Di; Lacerenza, Daniela; Vaula, Giovanna; Talarico, Flavia; Mandich, Paola; Toro, Camilo; Pierre, Eleonore Eymard; Labauge, Pierre; Capellari, Sabina; Cortelli, Pietro; Vairo, Filippo Pinto; Miguel, Diego; Stubbolo, Danielle; Marques, Lourenco Charles; Gahl, William; Boespflug-Tanguy, Odile; Melberg, Atle; Hassin-Baer, Sharon; Cohen, Oren S; Pjontek, Rastislav; Grau, Armin; Klopstock, Thomas; Fogel, Brent; Meijer, Inge; Rouleau, Guy; Bouchard, Jean-Pierre L; Ganapathiraju, Madhavi; Vanderver, Adeline; Dahl, Niklas; Hobson, Grace; Brusco, Alfredo; Brussino, Alessandro; Padiath, Quasar Saleem

    2013-01-01

    Autosomal dominant leukodystrophy (ADLD) is an adult onset demyelinating disorder that is caused by duplications of the lamin B1 (LMNB1) gene. However, as only a few cases have been analyzed in detail, the mechanisms underlying LMNB1 duplications are unclear. We report the detailed molecular analysis of the largest collection of ADLD families studied to date. We have identified the minimal duplicated region necessary for the disease, defined all the duplication junctions at the nucleotide level, and identified the first inverted LMNB1 duplication. We have demonstrated that the duplications are not recurrent: patients with identical duplications share the same haplotype, likely inherited from a common founder, and the duplications originated from intrachromosomal events. The duplication junction sequences indicated that nonhomologous end joining or replication-based mechanisms such as fork stalling and template switching or microhomology-mediated break-induced repair are likely to be involved. LMNB1 expression was increased in patients' fibroblasts at both the mRNA and protein levels, and the three LMNB1 alleles in ADLD patients show equal expression, suggesting that regulatory regions are maintained within the rearranged segment. These results have allowed us to elucidate duplication mechanisms and provide insights into allele-specific LMNB1 expression levels. PMID:23649844

  5. Enhancement of low light level images using color-plus-mono dual camera.

    PubMed

    Jung, Yong Ju

    2017-05-15

    In digital photography, improving imaging quality in low-light shooting is a key user need. Unfortunately, conventional smartphone cameras that use a single, small image sensor cannot provide satisfactory quality in low-light images. A color-plus-mono dual camera consisting of two horizontally separated image sensors, which simultaneously captures a color and mono image pair of the same scene, can be useful for improving the quality of low-light images. However, incorrect image fusion between the color and mono image pair can also have negative effects, such as the introduction of severe visual artifacts in the fused images. This paper proposes a selective image fusion technique that applies adaptive guided-filter-based denoising and selective detail transfer only to those pixels deemed reliable with respect to binocular image fusion. We employ a dissimilarity measure and binocular just-noticeable-difference (BJND) analysis to identify unreliable pixels that are likely to cause visual artifacts during image fusion via joint color image denoising and detail transfer from the mono image. By constructing an experimental color-plus-mono camera system, we demonstrate that the BJND-aware denoising and selective detail transfer improve image quality in low-light shooting.
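
    As a rough sketch of the selective detail-transfer idea (not the paper's BJND-based method; the dissimilarity threshold, blur scale and synthetic images below are assumptions), high-frequency detail from the mono image is added to the color image's luma only where the two views agree locally:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(0)
        # Hypothetical inputs: a low-light RGB frame and a better-exposed mono frame.
        color = rng.random((120, 160, 3)).astype(np.float32)
        mono = rng.random((120, 160)).astype(np.float32)

        # Luma of the color image (Rec. 601 weights).
        luma = 0.299 * color[..., 0] + 0.587 * color[..., 1] + 0.114 * color[..., 2]

        # Local (blurred) versions; the mono "detail layer" is mono minus its blur.
        sigma = 2.0
        luma_base = gaussian_filter(luma, sigma)
        mono_base = gaussian_filter(mono, sigma)
        mono_detail = mono - mono_base

        # Dissimilarity between the two views after smoothing; pixels where the views
        # disagree (parallax, occlusion) are treated as unreliable and left untouched.
        dissimilarity = np.abs(luma_base - mono_base)
        reliable = dissimilarity < 0.15          # arbitrary threshold (assumption)

        # Selective detail transfer: add mono detail to the luma only where reliable.
        fused_luma = luma + np.where(reliable, mono_detail, 0.0)

        # Push the fused luma back onto the RGB channels by a per-pixel gain.
        gain = (fused_luma + 1e-6) / (luma + 1e-6)
        fused = np.clip(color * gain[..., None], 0.0, 1.0)

        print("reliable pixels: %.1f%%" % (100.0 * reliable.mean()))
        print("fused image shape:", fused.shape)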

  6. User modeling techniques for enhanced usability of OPSMODEL operations simulation software

    NASA Technical Reports Server (NTRS)

    Davis, William T.

    1991-01-01

    The PC-based OPSMODEL operations software for modeling and simulation of space station crew activities supports engineering and cost analyses and operations planning. Using top-down modeling, the level of detail required in the data base can be limited to being commensurate with the results required of any particular analysis. To perform a simulation, a resource environment consisting of locations, crew definition, equipment, and consumables is first defined. Activities to be simulated are then defined as operations and scheduled as desired. These operations are defined within a 1000-level priority structure. The simulation on OPSMODEL, then, consists of the following: user-defined, user-scheduled operations executing within an environment of user-defined resource and priority constraints. Techniques for prioritizing operations to realistically model a representative daily scenario of on-orbit space station crew activities are discussed. The large number of priority levels allows priorities to be assigned commensurate with the detail necessary for a given simulation. Several techniques for realistic modeling of day-to-day work carryover are also addressed.

  7. Magnetic interactions and magnetic anisotropy in exchange coupled 4f-3d systems: a case study of a heterodinuclear Ce3+-Fe3+ cyanide-bridged complex.

    PubMed

    Sorace, Lorenzo; Sangregorio, Claudio; Figuerola, Albert; Benelli, Cristiano; Gatteschi, Dante

    2009-01-01

    We report here a detailed single-crystal EPR and magnetic study of a homologous series of complexes of the type Ln-M (Ln = La(III), Ce(III); M = Fe(III), Co(III)). We were able to obtain a detailed picture of the low-lying levels of the Ce(III) and Fe(III) centres through the combined use of single-crystal EPR and magnetic susceptibility data. We show that classical ligand field theory can be of great help in rationalising the energies of the low-lying levels of both the transition-metal and rare-earth ions. The combined analysis of single-crystal EPR and magnetic data of the coupled system Ce-Fe confirmed the great complexity of the interactions involving rare-earth elements. It emerged clearly that the description of the interaction involving the lowest-lying spin levels requires the introduction of isotropic, anisotropic and antisymmetric terms.

  8. Rapid and Near Real-Time Assessments of Population Displacement Using Mobile Phone Data Following Disasters: The 2015 Nepal Earthquake.

    PubMed

    Wilson, Robin; Zu Erbach-Schoenberg, Elisabeth; Albert, Maximilian; Power, Daniel; Tudge, Simon; Gonzalez, Miguel; Guthrie, Sam; Chamberlain, Heather; Brooks, Christopher; Hughes, Christopher; Pitonakova, Lenka; Buckee, Caroline; Lu, Xin; Wetter, Erik; Tatem, Andrew; Bengtsson, Linus

    2016-02-24

    Sudden impact disasters often result in the displacement of large numbers of people. These movements can occur prior to events, due to early warning messages, or take place post-event due to damage to shelters and livelihoods as well as a result of long-term reconstruction efforts. Displaced populations are especially vulnerable and often in need of support. However, timely and accurate data on the numbers and destinations of displaced populations are extremely challenging to collect across temporal and spatial scales, especially in the aftermath of disasters. Mobile phone call detail records were shown to be a valid data source for estimates of population movements after the 2010 Haiti earthquake, but their potential to provide near real-time ongoing measurements of population displacements immediately after a natural disaster has not been demonstrated. A computational architecture and analytical capacity were rapidly deployed within nine days of the Nepal earthquake of 25th April 2015, to provide spatiotemporally detailed estimates of population displacements from call detail records based on movements of 12 million de-identified mobile phone users. Analysis shows the evolution of population mobility patterns after the earthquake and the patterns of return to affected areas, at a high level of detail. Particularly notable is the movement of an estimated 390,000 people above normal from the Kathmandu valley after the earthquake, with most people moving to surrounding areas and the highly populated areas in the central southern area of Nepal. This analysis provides an unprecedented level of information about human movement after a natural disaster, provided within a very short timeframe after the earthquake occurred. The patterns revealed using this method are almost impossible to find through other methods, and are of great interest to humanitarian agencies.
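
    A minimal sketch of the underlying idea, estimating displacement as the deviation of post-event subscriber counts from a pre-event baseline per region, is shown below; the region names, counts and dates are hypothetical, and this is not the authors' production pipeline:

        import pandas as pd

        # Hypothetical daily counts of de-identified subscribers observed per region
        # (in practice these are aggregated from call detail records).
        records = pd.DataFrame({
            "date":   pd.to_datetime(["2015-04-20", "2015-04-20",
                                      "2015-04-30", "2015-04-30"]),
            "region": ["Kathmandu", "Terai", "Kathmandu", "Terai"],
            "users":  [1_000_000, 800_000, 610_000, 1_150_000],
        })
        event_date = pd.Timestamp("2015-04-25")

        # Baseline: mean pre-event count per region; post-event mean for comparison.
        baseline = (records[records["date"] < event_date]
                    .groupby("region")["users"].mean().rename("baseline"))
        post = (records[records["date"] >= event_date]
                .groupby("region")["users"].mean().rename("post_event"))

        flows = pd.concat([baseline, post], axis=1)
        flows["above_normal"] = flows["post_event"] - flows["baseline"]
        # Negative values indicate a net outflow (people above normal leaving the
        # region); positive values indicate a net inflow to receiving areas.
        print(flows)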

  9. Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance

    NASA Technical Reports Server (NTRS)

    Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.

    2016-01-01

    Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research evaluates contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled by choosing modules representative of the elements of an optical system, assigning the minimum detectable molecular contamination level for a chosen inspection and analysis method, and determining the effect of that contamination on the system. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine whether a planned method is adequate to meet system requirements and manage contamination risk.

  10. Inferring Group Processes from Computer-Mediated Affective Text Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, Jack C; Begoli, Edmon; Jose, Ajith

    2011-02-01

    Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.
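
    A crude co-occurrence scorer conveys the flavour of seed-list-based affect extraction (it is not the TEAMSTER propagation algorithm; the two seed lists, entities and sentences below are hypothetical, and the real system uses 22 categories plus a propagation step):

        import itertools
        from collections import defaultdict

        # Hypothetical seed lists for two affect categories (the real system uses
        # 22 categories and a propagation step beyond this simple counting).
        seeds = {
            "hostility":   {"attack", "condemn", "threaten"},
            "affiliation": {"support", "praise", "ally"},
        }
        entities = {"PartyA", "PartyB", "PartyC"}
        sentences = [
            "PartyA will support PartyB on the new bill",
            "PartyC moved to condemn PartyA in the assembly",
            "PartyB and PartyC praise the agreement",
        ]

        # Score affect between every entity pair that co-occurs in a sentence.
        scores = defaultdict(lambda: defaultdict(int))
        for sent in sentences:
            tokens = set(sent.lower().split())
            present = {e for e in entities if e.lower() in tokens}
            for affect, terms in seeds.items():
                hits = len(tokens & terms)
                if hits == 0:
                    continue
                for a, b in itertools.combinations(sorted(present), 2):
                    scores[(a, b)][affect] += hits

        for pair, affects in scores.items():
            print(pair, dict(affects))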

  11. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  12. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset-level and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
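
    The notional structure, risk as the product of threat likelihood, vulnerability and consequence, aggregated over a portfolio with a first-order interdependency adjustment, can be sketched as follows; all numbers are hypothetical and the interdependency treatment is a simplification of the article's portfolio consequence model:

        # Hypothetical asset data: annual threat likelihood, probability that an
        # attack succeeds given it occurs (vulnerability), and consequence in $M.
        assets = {
            "substation":  {"threat": 0.02, "vulnerability": 0.6, "consequence": 50.0},
            "control_ctr": {"threat": 0.01, "vulnerability": 0.3, "consequence": 200.0},
            "pipeline":    {"threat": 0.05, "vulnerability": 0.4, "consequence": 80.0},
        }
        # Simple interdependency factors: fraction of additional portfolio loss
        # triggered by a successful attack on each asset (assumption).
        interdependency = {"substation": 0.20, "control_ctr": 0.50, "pipeline": 0.10}

        portfolio_risk = 0.0
        for name, a in assets.items():
            direct = a["threat"] * a["vulnerability"] * a["consequence"]
            indirect = direct * interdependency[name]   # first-order cascade estimate
            asset_risk = direct + indirect
            portfolio_risk += asset_risk
            print(f"{name:12s} expected annual loss = {asset_risk:6.2f} $M")

        print(f"portfolio expected annual loss = {portfolio_risk:.2f} $M")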

  13. Constructing graph models for software system development and analysis

    NASA Astrophysics Data System (ADS)

    Pogrebnoy, Andrey V.

    2017-01-01

    We propose a concept for creating instrumentation that supports the rationale for functional and structural decisions during software system (SS) development. We propose to develop the SS simultaneously on two models: a functional model (FM) and a structural model (SM). The FM is the source code of the SS. An adequate representation of the FM in the form of a graph model (GM) is generated automatically and is called the SM. The problem of creating and visualizing the GM is considered from the point of view of applying it as a uniform platform for adequately representing the SS source code. We propose three levels of GM detail: GM1 for visual analysis of the source code and for SS version control, GM2 for resource optimization and analysis of connections between SS components, and GM3 for analysis of how the SS functions dynamically. The paper includes examples of constructing all levels of GM.
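
    One way to derive a graph model automatically from the functional model (the source code) is to parse the code and record call relationships between functions; the snippet below is a minimal Python-specific sketch of that idea, not the authors' GM1-GM3 tooling:

        import ast
        import textwrap

        # Hypothetical source text standing in for the functional model (FM).
        source = textwrap.dedent("""
            def load(path):
                return open(path).read()

            def parse(text):
                return text.split()

            def main(path):
                return parse(load(path))
            """)

        tree = ast.parse(source)
        nodes, edges = [], []
        for fn in [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]:
            nodes.append(fn.name)
            for call in ast.walk(fn):
                # Record direct calls to plainly named callees (method calls skipped).
                if isinstance(call, ast.Call) and isinstance(call.func, ast.Name):
                    edges.append((fn.name, call.func.id))

        print("nodes:", nodes)
        print("edges:", [e for e in edges if e[1] in nodes])   # in-module edges only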

  14. PathFinder: reconstruction and dynamic visualization of metabolic pathways.

    PubMed

    Goesmann, Alexander; Haubrock, Martin; Meyer, Folker; Kalinowski, Jörn; Giegerich, Robert

    2002-01-01

    Beyond methods for gene-wise annotation and analysis of sequenced genomes, new automated methods for functional analysis at a higher level are needed. The identification of realized metabolic pathways provides valuable information on gene expression and regulation. Detection of incomplete pathways helps to improve a constantly evolving genome annotation or to discover alternative biochemical pathways. To utilize automated genome analysis at the level of metabolic pathways, new methods for the dynamic representation and visualization of pathways are needed. PathFinder is a tool for the dynamic visualization of metabolic pathways based on annotation data. Pathways are represented as directed acyclic graphs; graph layout algorithms accomplish the dynamic drawing and visualization of the metabolic maps. A more detailed analysis of the input data at the level of biochemical pathways helps to identify genes and detect improper parts of annotations. As a Relational Database Management System (RDBMS)-based internet application, PathFinder reads a list of EC numbers or a given annotation in EMBL or GenBank format and dynamically generates pathway graphs.

  15. Geographic information system as country-level development and monitoring tool, Senegal example

    USGS Publications Warehouse

    Moore, Donald G.; Howard, Stephen M.; ,

    1990-01-01

    Geographic information systems (GIS) allow an investigator to merge and analyze numerous types of country-level resource data. Hypothetical resource analysis applications in Senegal were conducted to illustrate the utility of a GIS for development planning and resource monitoring. Map and attribute data for soils, vegetation, population, infrastructure, and administrative units were merged to form a database within a GIS. Several models were implemented using a GIS to: analyze development potential for sustainable dryland agriculture; prioritize where agricultural development should occur based upon a regional food budget; and monitor dynamic events with remote sensing. The steps for implementing a GIS analysis are described and illustrated, and the use of a GIS for conducting an economic analysis is outlined. Using a GIS for analysis and display of results opens new methods of communication between resource scientists and decision makers. Analyses yielding country-wide map output and detailed statistical data for each level of administration provide the advantage of a single system that can serve a variety of users.

  16. Computational Psychiatry of ADHD: Neural Gain Impairments across Marrian Levels of Analysis

    PubMed Central

    Hauser, Tobias U.; Fiore, Vincenzo G.; Moutoussis, Michael; Dolan, Raymond J.

    2016-01-01

    Attention-deficit hyperactivity disorder (ADHD), one of the most common psychiatric disorders, is characterised by unstable response patterns across multiple cognitive domains. However, the neural mechanisms that explain these characteristic features remain unclear. Using a computational multilevel approach, we propose that ADHD is caused by impaired gain modulation in systems that generate this phenotypic increased behavioural variability. Using Marr's three levels of analysis as a heuristic framework, we focus on this variable behaviour, detail how it can be explained algorithmically, and how it might be implemented at a neural level through catecholamine influences on corticostriatal loops. This computational, multilevel, approach to ADHD provides a framework for bridging gaps between descriptions of neuronal activity and behaviour, and provides testable predictions about impaired mechanisms. PMID:26787097
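
    At the algorithmic level, gain is often modelled as the inverse temperature of a softmax choice rule, and lowering it increases behavioural variability; the toy simulation below illustrates that effect under this assumption and is not the authors' model:

        import numpy as np

        rng = np.random.default_rng(1)
        values = np.array([1.0, 0.6, 0.2])          # hypothetical action values

        def softmax_choice_rates(gain, n_trials=20_000):
            """Simulate softmax choices; higher gain = more deterministic behaviour."""
            p = np.exp(gain * values)
            p /= p.sum()
            choices = rng.choice(len(values), size=n_trials, p=p)
            return np.bincount(choices, minlength=len(values)) / n_trials

        for gain in (4.0, 1.0):                      # "typical" vs. "reduced" gain
            rates = softmax_choice_rates(gain)
            entropy = -(rates * np.log(rates + 1e-12)).sum()
            print(f"gain={gain:3.1f}  choice rates={np.round(rates, 3)}  "
                  f"entropy={entropy:.2f} nats")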

  17. Benefits of Spacecraft Level Vibration Testing

    NASA Technical Reports Server (NTRS)

    Gordon, Scott; Kern, Dennis L.

    2015-01-01

    NASA-HDBK-7008 Spacecraft Level Dynamic Environments Testing discusses the approaches, benefits, dangers, and recommended practices for spacecraft level dynamic environments testing, including vibration testing. This paper discusses in additional detail the benefits and actual experiences of vibration testing spacecraft for NASA Goddard Space Flight Center (GSFC) and Jet Propulsion Laboratory (JPL) flight projects. JPL and GSFC have both similarities and differences in their spacecraft level vibration test approach: JPL uses a random vibration input and a frequency range usually starting at 5 Hz and extending to as high as 250 Hz. GSFC uses a sine sweep vibration input and a frequency range usually starting at 5 Hz and extending only to the limits of the coupled loads analysis (typically 50 to 60 Hz). However, both JPL and GSFC use force limiting to realistically notch spacecraft resonances and response (acceleration) limiting as necessary to protect spacecraft structure and hardware from exceeding design strength capabilities. Despite GSFC and JPL differences in spacecraft level vibration test approaches, both have uncovered a significant number of spacecraft design and workmanship anomalies in vibration tests. This paper will give an overview of JPL and GSFC spacecraft vibration testing approaches and provide a detailed description of spacecraft anomalies revealed.

  18. FUSE: a profit maximization approach for functional summarization of biological networks.

    PubMed

    Seah, Boon-Siew; Bhowmick, Sourav S; Dewey, C Forbes; Yu, Hanry

    2012-03-21

    The availability of large-scale curated protein interaction datasets has given rise to the opportunity to investigate higher level organization and modularity within the protein interaction network (PPI) using graph theoretic analysis. Despite the recent progress, systems-level analysis of PPIs remains a daunting task, as it is challenging to make sense out of the deluge of high-dimensional interaction data. Specifically, techniques that automatically abstract and summarize PPIs at multiple resolutions to provide high level views of its functional landscape are still lacking. We present a novel data-driven and generic algorithm called FUSE (Functional Summary Generator) that generates functional maps of a PPI at different levels of organization, from broad process-process level interactions to in-depth complex-complex level interactions, through a profit maximization approach that exploits the Minimum Description Length (MDL) principle to maximize information gain of the summary graph while satisfying the level of detail constraint. We evaluate the performance of FUSE on several real-world PPIs. We also compare FUSE to state-of-the-art graph clustering methods with GO term enrichment by constructing the biological process landscape of the PPIs. Using the AD network as our case study, we further demonstrate the ability of FUSE to quickly summarize the network and identify many different processes and complexes that regulate it. Finally, we study the higher-order connectivity of the human PPI. By simultaneously evaluating interaction and annotation data, FUSE abstracts higher-order interaction maps by reducing the details of the underlying PPI to form a functional summary graph of interconnected functional clusters. Our results demonstrate its effectiveness and superiority over state-of-the-art graph clustering methods with GO term enrichment.

  19. The Budget Enforcement Act: Implications for Children and Families.

    ERIC Educational Resources Information Center

    Baehler, Karen

    This analysis of the Budget Enforcement Act of 1990 (BEA) and its implications for public financing of education and other children's services notes that voters want more and better education and related services, and at the same time want to pay less in taxes and balance budgets at every governmental level. The first section details recent…

  20. Job Analysis Techniques for Restructuring Health Manpower Education and Training in the Navy Medical Department. Attachment 1. Radiation QPCB Task Sort for Radiation.

    ERIC Educational Resources Information Center

    Technomics, Inc., McLean, VA.

    This publication is Attachment 1 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in radiation. (BT)

  1. National and International Disability Rights Legislation: A Qualitative Account of Its Enactment in Australia

    ERIC Educational Resources Information Center

    Whitburn, Ben

    2015-01-01

    In this paper, a detailed analysis based on the lived experiences of the study participants and the researcher (each with vision impairment) in education, post-school, and in the pursuit of employment is developed. The policy discourses of disability legislation--both at national and international levels--are explored with particular reference to…

  2. Comparative anatomy of the female genitalia of generic-level taxa in tribe Aedini (Diptera: Culicidae). Part XXXVII. Genus Bifidistylus Reinert, Harbach and Kitching

    USDA-ARS?s Scientific Manuscript database

    A comparative, morphological analysis of the female genitalia of species included in genus Bifidistylus Reinert, Harbach and Kitching was conducted. Treatment of the genital morphology of the genus includes a composite description of the genus, a detailed description and illustration of the type sp...

  3. Comparative anatomy of the female genitalia of generic-level taxa in tribe Aedini (Diptera: Culicidae). Part XXXVI. Genus Polyleptiomyia Theobald

    USDA-ARS?s Scientific Manuscript database

    A morphological analysis of the female genitalia of species included in genus Polyleptiomyia Theobald was conducted. Treatment of the genital morphology of the genus includes a description of the genus, a detailed description and illustration of the type species, Po. albocephala (Theobald), a list ...

  4. An Analysis of Conceptual Flow Patterns and Structures in the Physics Classroom

    ERIC Educational Resources Information Center

    Eshach, Haim

    2010-01-01

    The aim of the current research is to characterize the conceptual flow processes occurring in whole-class dialogic discussions with a high level of interanimation; in the present case, of a high-school class learning about image creation on plane mirrors. Using detailed chains of interaction and conceptual flow discourse maps--both developed for…

  5. Total hydrocarbon content (THC) testing in liquid oxygen (LOX) systems

    NASA Astrophysics Data System (ADS)

    Meneghelli, B. J.; Obregon, R. E.; Ross, H. R.; Hebert, B. J.; Sass, J. P.; Dirschka, G. E.

    2015-12-01

    The measured Total Hydrocarbon Content (THC) levels in liquid oxygen (LOX) systems at Stennis Space Center (SSC) have shown wide variations. Examples of these variations include the following: 1) differences between vendor-supplied THC values and those obtained using standard SSC analysis procedures; and 2) increasing THC values over time at an active SSC test stand in both storage and run vessels. A detailed analysis of LOX sampling techniques, analytical instrumentation, and sampling procedures will be presented. Additional data obtained on LOX system operations and LOX delivery trailer THC values during the past 12-24 months will also be discussed. Field test results showing THC levels and the distribution of the THCs in the test stand run tank, modified for THC analysis via dip tubes, will be presented.

  6. Total Hydrocarbon Content (THC) Testing in Liquid Oxygen (LOX)

    NASA Technical Reports Server (NTRS)

    Meneghelli, B. J.; Obregon, R. E.; Ross, H. R.; Hebert, B. J.; Sass, J. P.; Dirschka, G. E.

    2016-01-01

    The measured Total Hydrocarbon Content (THC) levels in liquid oxygen (LOX) systems at Stennis Space Center (SSC) have shown wide variations. Examples of these variations include the following: 1) differences between vendor-supplied THC values and those obtained using standard SSC analysis procedures; and 2) increasing THC values over time at an active SSC test stand in both storage and run vessels. A detailed analysis of LOX sampling techniques, analytical instrumentation, and sampling procedures will be presented. Additional data obtained on LOX system operations and LOX delivery trailer THC values during the past 12-24 months will also be discussed. Field test results showing THC levels and the distribution of the THCs in the test stand run tank, modified for THC analysis via dip tubes, will be presented.

  7. Understanding the structure of skill through a detailed analysis of Individuals' performance on the Space Fortress game.

    PubMed

    Towne, Tyler J; Boot, Walter R; Ericsson, K Anders

    2016-09-01

    In this paper we describe a novel approach to the study of individual differences in acquired skilled performance in complex laboratory tasks based on an extension of the methodology of the expert-performance approach (Ericsson & Smith, 1991) to shorter periods of training and practice. In contrast to more traditional approaches that study the average performance of groups of participants, we explored detailed behavioral changes for individual participants across their development on the Space Fortress game. We focused on dramatic individual differences in learning and skill acquisition at the individual level by analyzing the archival game data of several interesting players to uncover the specific structure of their acquired skill. Our analysis revealed that even after maximal values for game-generated subscores were reached, the most skilled participant's behaviors such as his flight path, missile firing, and mine handling continued to be refined and improved (Participant 17 from Boot et al., 2010). We contrasted this participant's behavior with the behavior of several other participants and found striking differences in the structure of their performance, which calls into question the appropriateness of averaging their data. For example, some participants engaged in different control strategies such as "world wrapping" or maintaining a finely-tuned circular flight path around the fortress (in contrast to Participant 17's angular flight path). In light of these differences, we raise fundamental questions about how skill acquisition for individual participants should be studied and described. Our data suggest that a detailed analysis of individuals' data is an essential step for generating a general theory of skill acquisition that explains improvement at the group and individual levels. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Impacts of Realistic Urban Heating, Part I: Spatial Variability of Mean Flow, Turbulent Exchange and Pollutant Dispersion

    NASA Astrophysics Data System (ADS)

    Nazarian, Negin; Martilli, Alberto; Kleissl, Jan

    2018-03-01

    As urbanization progresses, more realistic methods are required to analyze the urban microclimate. However, given the complexity and computational cost of numerical models, the effects of realistic representations should be evaluated to identify the level of detail required for an accurate analysis. We consider the realistic representation of surface heating in an idealized three-dimensional urban configuration, and evaluate the spatial variability of flow statistics (mean flow and turbulent fluxes) in urban streets. Large-eddy simulations coupled with an urban energy balance model are employed, and the heating distribution of urban surfaces is parametrized using sets of horizontal and vertical Richardson numbers, characterizing thermal stratification and heating orientation with respect to the wind direction. For all studied conditions, the thermal field is strongly affected by the orientation of heating with respect to the airflow. The modification of airflow by the horizontal heating is also pronounced for strongly unstable conditions. The formation of the canyon vortices is affected by the three-dimensional heating distribution in both spanwise and streamwise street canyons, such that the secondary vortex is seen adjacent to the windward wall. For the dispersion field, however, the overall heating of urban surfaces, and more importantly, the vertical temperature gradient, dominate the distribution of concentration and the removal of pollutants from the building canyon. Accordingly, the spatial variability of concentration is not significantly affected by the detailed heating distribution. The analysis is extended to assess the effects of three-dimensional surface heating on turbulent transfer. Quadrant analysis reveals that the differential heating also affects the dominance of ejection and sweep events and the efficiency of turbulent transfer (exuberance) within the street canyon and at the roof level, while the vertical variation of these parameters is less dependent on the detailed heating of urban facets.

  9. Critical evaluation of measured rotational-vibrational transitions of four sulphur isotopologues of S¹⁶O₂

    NASA Astrophysics Data System (ADS)

    Tóbiás, Roland; Furtenbacher, Tibor; Császár, Attila G.; Naumenko, Olga V.; Tennyson, Jonathan; Flaud, Jean-Marie; Kumar, Praveen; Poirier, Bill

    2018-03-01

    A critical evaluation and validation of the complete set of previously published experimental rotational-vibrational line positions is reported for the four stable sulphur isotopologues of the semirigid SO₂ molecule, i.e., ³²S¹⁶O₂, ³³S¹⁶O₂, ³⁴S¹⁶O₂, and ³⁶S¹⁶O₂. The experimentally measured, assigned, and labeled transitions are collated from 43 sources. The ³²S¹⁶O₂, ³³S¹⁶O₂, ³⁴S¹⁶O₂, and ³⁶S¹⁶O₂ datasets contain 40,269, 15,628, 31,080, and 31 lines, respectively. Of the datasets collated, only the extremely limited ³⁶S¹⁶O₂ dataset is not subjected to a detailed analysis. As part of a detailed analysis of the experimental spectroscopic networks corresponding to the ground electronic states of the ³²S¹⁶O₂, ³³S¹⁶O₂, and ³⁴S¹⁶O₂ isotopologues, the MARVEL (Measured Active Rotational-Vibrational Energy Levels) procedure is used to determine the rovibrational energy levels. The rovibrational levels and their vibrational parent and asymmetric-top quantum numbers are compared to ones obtained from accurate variational nuclear-motion computations as well as to results of carefully designed effective Hamiltonian models. The rovibrational energy levels of the three isotopologues having the same labels are also compared against each other to ensure self-consistency. This careful, multifaceted analysis gives rise to 15,130, 5852, and 10,893 validated rovibrational energy levels, with a typical accuracy of a few 0.0001 cm⁻¹, for ³²S¹⁶O₂, ³³S¹⁶O₂, and ³⁴S¹⁶O₂, respectively. The extensive list of validated experimental lines and empirical (MARVEL) energy levels of the S¹⁶O₂ isotopologues studied are deposited in the Supplementary Material of this article, as well as in the distributed information system ReSpecTh (http://respecth.hu).
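
    The core linear-algebra step of a MARVEL-type analysis is a weighted least-squares inversion of measured transition wavenumbers into empirical energy levels over the spectroscopic network, with the ground level fixed at zero. The transitions below are synthetic and illustrate only this step, not the full validation procedure:

        import numpy as np

        # Synthetic network: 4 energy levels (0..3), level 0 is the ground state.
        # Each transition is (upper, lower, wavenumber in cm^-1, uncertainty).
        transitions = [
            (1, 0, 10.002, 0.001),
            (2, 0, 25.001, 0.002),
            (2, 1, 14.998, 0.001),
            (3, 1, 30.003, 0.002),
            (3, 2, 15.004, 0.002),
        ]
        n_levels = 4

        # Design matrix for wavenumber = E_upper - E_lower, rows weighted by 1/unc.
        A = np.zeros((len(transitions), n_levels))
        y = np.zeros(len(transitions))
        for row, (up, lo, nu, unc) in enumerate(transitions):
            w = 1.0 / unc
            A[row, up], A[row, lo], y[row] = w, -w, w * nu

        # Fix the ground-state energy to zero by dropping its column before solving.
        E_excited, *_ = np.linalg.lstsq(A[:, 1:], y, rcond=None)
        energies = np.concatenate(([0.0], E_excited))
        print("empirical energy levels (cm^-1):", np.round(energies, 4))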

  10. Generating mouse lines for lineage tracing and knockout studies.

    PubMed

    Kraus, Petra; Sivakamasundari, V; Xing, Xing; Lufkin, Thomas

    2014-01-01

    In 2007 Capecchi, Evans, and Smithies received the Nobel Prize in recognition for discovering the principles for introducing specific gene modifications in mice via embryonic stem cells, a technology, which has revolutionized the field of biomedical science allowing for the generation of genetically engineered animals. Here we describe detailed protocols based on and developed from these ground-breaking discoveries, allowing for the modification of genes not only to create mutations to study gene function but additionally to modify genes with fluorescent markers, thus permitting the isolation of specific rare wild-type and mutant cell types for further detailed analysis at the biochemical, pathological, and genomic levels.

  11. Measurement of airfoil heat transfer coefficients on a turbine stage

    NASA Technical Reports Server (NTRS)

    Dring, R. P.; Blair, M. F.

    1984-01-01

    The primary basis for heat transfer analysis of turbine airfoils is experimental data obtained in linear cascades. A detailed set of heat transfer coefficients was obtained along the midspan of a stator and a rotor in a rotating turbine stage. The data are to be compared to standard analyses of blade boundary layer heat transfer. A detailed set of heat transfer coefficients was obtained along the midspan of a stator located in the wake of a full upstream turbine stage. Two levels of inlet turbulence (1 and 10 percent) were used. The analytical capability will be examined to improve prediction of the experimental data.

  12. Structural Element Testing in Support of the Design of the NASA Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Kellas, Sotiris; Jackson, Wade C.; Thesken, John C.; Schleicher, Eric; Wagner, Perry; Kirsch, Michael T.

    2012-01-01

    In January 2007, the NASA Administrator and Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center (NESC) to design, build, and test a full-scale Composite Crew Module (CCM). For the design and manufacturing of the CCM, the team adopted the building block approach, in which design and manufacturing risks were mitigated through manufacturing trials and structural testing at various levels of complexity. Following NASA's Structural Design Verification Requirements, a further objective was the verification of design analysis methods and the provision of design data for critical structural features. Test articles increasing in complexity from basic material characterization coupons through structural feature elements and large structural components, to full-scale structures were evaluated. This paper discusses only four element tests, three of which include joints and one that includes a tapering honeycomb core detail. For each test series, specimen details, instrumentation, test results, a brief analysis description, test-analysis correlation, and conclusions are included.

  13. Cnn Based Retinal Image Upscaling Using Zero Component Analysis

    NASA Astrophysics Data System (ADS)

    Nasonov, A.; Chesnakov, K.; Krylov, A.

    2017-05-01

    The aim of the paper is to obtain high-quality image upscaling for the noisy images that are typical in medical image processing. A new training scenario for a convolutional neural network based image upscaling method is proposed. Its main idea is a novel dataset preparation method for deep learning. The dataset contains pairs of noisy low-resolution images and corresponding noiseless high-resolution images. To achieve better results at edges and in textured areas, Zero Component Analysis is applied to these images. The upscaling results are compared with other state-of-the-art methods such as DCCI, SI-3 and SRCNN on noisy medical ophthalmological images. Objective evaluation of the results confirms the high quality of the proposed method. Visual analysis shows that fine details and structures such as blood vessels are preserved, the noise level is reduced, and no artifacts or non-existing details are added. These properties are essential in establishing a retinal diagnosis, so the proposed algorithm is recommended for use in real medical applications.
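
    Assuming "Zero Component Analysis" refers to ZCA (zero-phase component analysis) whitening, the preprocessing step can be sketched as decorrelating patch data while staying close to the original pixel space; the patches and the regularizer eps below are arbitrary:

        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical training data: 1000 flattened 8x8 grayscale patches.
        patches = rng.random((1000, 64))

        # Center the data and form the sample covariance matrix.
        mean = patches.mean(axis=0)
        X = patches - mean
        cov = X.T @ X / X.shape[0]

        # Eigendecomposition; eps avoids amplifying near-zero (noise) directions.
        eigvals, eigvecs = np.linalg.eigh(cov)
        eps = 1e-3
        W_zca = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + eps)) @ eigvecs.T

        X_white = X @ W_zca
        print("whitened covariance close to identity:",
              np.allclose(X_white.T @ X_white / X_white.shape[0],
                          np.eye(64), atol=0.15))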

  14. Analysis and modeling of wafer-level process variability in 28 nm FD-SOI using split C-V measurements

    NASA Astrophysics Data System (ADS)

    Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard

    2018-07-01

    This work details the analysis of wafer-level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on-wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.

  15. A statistical approach to deriving subsystem specifications. [for spacecraft shock and vibrational environment tests

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1974-01-01

    To produce cost-effective environmental test programs, test specifications must be realistic and, to be useful, must be available early in the life of a program. This paper describes a method for achieving such specifications for subsystems by utilizing the results of a statistical analysis of data acquired at subsystem mounting locations during system-level environmental tests. The paper describes the details of this statistical analysis. The resultant recommended levels are a function of the subsystem's mounting location in the spacecraft. Methods of determining this mounting 'zone' are described. Recommendations are then made as to which of the various problem areas encountered should be pursued further.
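
    A common way to turn measured responses into a zone specification is an enveloping statistic such as the mean plus a multiple of the standard deviation computed in dB; the sketch below uses hypothetical data and a P95-style 1.645-sigma factor, which is an assumption rather than necessarily the statistic used in the paper:

        import numpy as np

        rng = np.random.default_rng(2)
        # Hypothetical acceleration spectral densities (g^2/Hz) measured at subsystem
        # mounting locations during several system-level tests, grouped by zone.
        zones = {
            "equipment_panel": rng.lognormal(mean=np.log(0.02), sigma=0.5, size=12),
            "shelf_outboard":  rng.lognormal(mean=np.log(0.05), sigma=0.7, size=9),
        }

        for zone, asd in zones.items():
            level_db = 10.0 * np.log10(asd)    # work in dB, where the data are ~normal
            spec_db = level_db.mean() + 1.645 * level_db.std(ddof=1)  # ~95th percentile
            spec = 10.0 ** (spec_db / 10.0)
            print(f"{zone:16s} n={len(asd):2d}  recommended level ~ {spec:.3f} g^2/Hz")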

  16. Safety evaluation methodology for advanced coal extraction systems

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.

    1981-01-01

    Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.

  17. USB environment measurements based on full-scale static engine ground tests

    NASA Technical Reports Server (NTRS)

    Sussman, M. B.; Harkonen, D. L.; Reed, J. B.

    1976-01-01

    Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle, and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data, and to establish a basis for future flight test comparisons.

  18. Market assessment overview

    NASA Technical Reports Server (NTRS)

    Habib-Agahi, H.

    1981-01-01

    Market assessment, refined with analysis disaggregated from a national level to the regional level and to specific market applications, resulted in more accurate and detailed market estimates. The development of an integrated set of computer simulations, coupled with refined market data, allowed progress in the ability to evaluate the worth of solar thermal parabolic dish systems. In-depth analyses of both electric and thermal market applications of these systems are described. The following market assessment studies were undertaken: (1) regional analysis of the near term market for parabolic dish systems; (2) potential early market estimate for electric applications; (3) potential early market estimate for industrial process heat/cogeneration applications; and (4) selection of thermal and electric application case studies for fiscal year 1981.

  19. Historical Analysis and Characterization of Ground Level Ozone for Canada and the United States

    NASA Astrophysics Data System (ADS)

    Lin, H.; Li, H.; Auld, H.

    2003-12-01

    Ground-level ozone has long been recognized as an important health and ecosystem-related air quality concern in Canada and the United States. In this work we seek to understand the characteristics of ground-level ozone conditions in Canada and the United States to support the Ozone Annex under the Canada-U.S. Air Quality Agreement. Our analyses are based upon data collected by the Canadian National Air Pollution Surveillance (NAPS) network; the NAPS database has also been expanded to include U.S. EPA ground-level ozone data. Historical ozone data from 1974 to 2002 at a total of 538 stations (253 Canadian stations and 285 U.S. stations) were statistically analyzed using several methodologies, including the Canada Wide Standard (CWS). A more detailed analysis including hourly, daily, monthly, seasonal and yearly ozone concentration distributions and trends was undertaken for 54 stations.

  20. Dynamical tests on fiber optic data taken from the riser section of a circulating fluidized bed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, E.M.; Guenther, C.P.; Breault, R.W.

    2007-11-01

    Dynamical tests have been applied to fiber optic data taken from a cold-flow circulating fluidized bed to characterize flow conditions, identify three time and/or length scales (macro, meso, and micro), and understand the contribution these scales have on the raw data. The characteristic variable analyzed is the raw voltage signal obtained from a fiber-optic probe taken at various axial and radial positions under different loading conditions so that different flow regimes could be attained. These experiments were carried out with the bed material of 812 μm cork particles. The characterization was accomplished through analysis of the distribution of the signal through the third and fourth moments of skewness and excess kurtosis. A generalization of the autocorrelation function known as the average mutual information function was analyzed by examining the function's first minimum, identifying the point at which successive elements are no longer correlated. Further characterization was accomplished through the correlation dimension, a measure of the complexity of the attractor. Lastly, the amount of disorder of the system is described by a Kolmogorov-type entropy estimate. All six aforementioned tests were also implemented on ten levels of detail coefficients resulting from a discrete wavelet transformation of the same signal as used above. Through this analysis it is possible to identify and describe micro (particle level), meso (clustering or turbulence level), and macro (physical or dimensional level) length scales even though some literature considers these scales inseparable [6]. This investigation also used detail wavelet coefficients in conjunction with ANOVA analysis to show which scales have the most impact on the raw signal resulting from local hydrodynamic conditions.
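
    The moment and average-mutual-information parts of such a test suite can be sketched on a synthetic signal as follows; the signal, bin count and lag grid are assumptions, and the correlation dimension, entropy and wavelet steps are omitted:

        import numpy as np
        from scipy.stats import skew, kurtosis

        rng = np.random.default_rng(3)
        # Synthetic stand-in for a fiber-optic voltage trace: a slow oscillation
        # (macro), an intermediate modulation (meso) and noise (micro).
        t = np.arange(20_000)
        x = (np.sin(2 * np.pi * t / 2000)
             + 0.4 * np.sin(2 * np.pi * t / 150)
             + 0.2 * rng.standard_normal(t.size))

        print("skewness        :", round(float(skew(x)), 3))
        print("excess kurtosis :", round(float(kurtosis(x)), 3))   # Fisher definition

        def mutual_information(a, b, bins=16):
            """Histogram estimate of the mutual information I(a; b) in nats."""
            joint, _, _ = np.histogram2d(a, b, bins=bins)
            p = joint / joint.sum()
            px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
            mask = p > 0
            return float((p[mask] * np.log(p[mask] / (px @ py)[mask])).sum())

        # Average mutual information vs. lag; its first local minimum marks the lag
        # at which successive samples stop being (nonlinearly) correlated.
        lags = list(range(1, 400, 5))
        ami = [mutual_information(x[:-lag], x[lag:]) for lag in lags]
        first_min = next((i for i in range(1, len(ami) - 1)
                          if ami[i] < ami[i - 1] and ami[i] <= ami[i + 1]), None)
        print("first AMI minimum at lag ~",
              None if first_min is None else lags[first_min])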

  1. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

    Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
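
    The AHP ingredient can be sketched as extracting the principal eigenvector of a pairwise comparison matrix and checking its consistency ratio; the criteria and judgments below are hypothetical and are not taken from the NASA GRC/Boeing model:

        import numpy as np

        # Hypothetical pairwise comparisons of three criteria (cost-risk, performance,
        # schedule) on Saaty's 1-9 scale; A[i, j] = importance of i relative to j.
        A = np.array([
            [1.0, 3.0, 5.0],
            [1 / 3, 1.0, 2.0],
            [1 / 5, 1 / 2, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                 # priority weights of the criteria

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
        ri = 0.58                                # Saaty's random index for n = 3
        print("criterion weights :", np.round(weights, 3))
        print("consistency ratio :", round(ci / ri, 3))   # < 0.1 is conventionally OK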

  2. A detailed report of the resource use and costs associated with implementation of a short stay programme for breast cancer surgery.

    PubMed

    Ament, Stephanie M C; de Kok, Mascha; van de Velde, Cornelis J H; Roukema, Jan A; Bell, Toine V R J; van der Ent, Fred W; van der Weijden, Trudy; von Meyenfeldt, Maarten F; Dirksen, Carmen D

    2015-05-27

    Despite the increased attention to assessing the effectiveness of implementation strategies, most implementation studies provide little or no information on the associated costs. The focus of the current study was to provide a detailed report of the resource use and costs associated with implementation of a short stay programme for breast cancer surgery in four Dutch hospitals. The analysis was performed alongside a multi-centre implementation study. The process of identification, measurement and valuation of the implementation activities was based on recommendations for the design, analysis and reporting of health technology assessments. A scoring form was developed to prospectively determine the implementation activities at the professional and implementation-expert levels. A time horizon of 5 years was used to calculate the implementation costs per patient. The identified activities consisted of the development and execution of the implementation strategy during the implementation project. Total implementation costs over the four hospitals were €83,293. Mean implementation costs, calculated for 660 patients treated over a period of 5 years, were €25 per patient. Subgroup analyses showed that the implementation costs ranged from €3,942 to €32,000 at the hospital level. From a local hospital perspective, overall implementation costs were €21 per patient after exclusion of the costs incurred by the expert centre. We provided a detailed case description of how implementation costs can be determined. Notable differences in implementation costs between hospitals were observed. ISRCTN77253391.

  3. Chemometric analysis of soil pollution data using the Tucker N-way method.

    PubMed

    Stanimirova, I; Zehl, K; Massart, D L; Vander Heyden, Y; Einax, J W

    2006-06-01

    N-way methods, particularly the Tucker method, are often the methods of choice when analyzing data sets arranged in three- (or higher) way arrays, which is the case for most environmental data sets. In the future, applying N-way methods will become an increasingly popular way to uncover hidden information in complex data sets. The reason for this is that classical two-way approaches such as principal component analysis are not as good at revealing the complex relationships present in such data sets. This study describes in detail the application of a chemometric N-way approach, namely the Tucker method, to evaluate the level of pollution in soil from a contaminated site. The analyzed soil data set was five-way in nature. The samples were collected at different depths (way 1) from two locations (way 2), and the levels of thirteen metals (way 3) were analyzed using a four-step sequential extraction procedure (way 4), allowing detailed information to be obtained about the bioavailability and activity of the different binding forms of the metals. Furthermore, the measurements were performed under two conditions (way 5), inert and non-inert. The preferred Tucker model of definite complexity showed that there was no significant difference between measurements analyzed under inert or non-inert conditions. It also allowed two depth horizons, characterized by different accumulation pathways, to be distinguished, and it allowed the relationships between the chemical elements and their biological activities and mobilities in the soil to be described in detail.
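
    A Tucker-style decomposition can be sketched with a truncated higher-order SVD (HOSVD), which yields orthogonal factor matrices and a core tensor; the five-way array below is synthetic and the chosen ranks are arbitrary assumptions, so this is only a minimal stand-in for the Tucker analysis described above:

        import numpy as np

        def unfold(X, mode):
            """Mode-n unfolding: the given mode becomes rows, the rest are flattened."""
            return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

        def mode_dot(X, U, mode):
            """Multiply tensor X by matrix U along the given mode."""
            rest = [s for i, s in enumerate(X.shape) if i != mode]
            Xm = U @ unfold(X, mode)
            return np.moveaxis(Xm.reshape([U.shape[0]] + rest), 0, mode)

        rng = np.random.default_rng(4)
        # Synthetic five-way array: depth x location x metal x extraction step x condition.
        X = rng.random((10, 2, 13, 4, 2))
        ranks = (3, 2, 4, 3, 2)                  # arbitrary Tucker ranks (assumption)

        # Truncated HOSVD: leading left singular vectors of each unfolding ...
        factors = [np.linalg.svd(unfold(X, m), full_matrices=False)[0][:, :r]
                   for m, r in enumerate(ranks)]
        # ... and a core tensor obtained by projecting X onto those factors.
        core = X
        for m, U in enumerate(factors):
            core = mode_dot(core, U.T, m)

        # Reconstruction from the compressed (core, factors) representation.
        X_hat = core
        for m, U in enumerate(factors):
            X_hat = mode_dot(X_hat, U, m)
        rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
        print("core shape:", core.shape, " relative reconstruction error: %.3f" % rel_err)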

  4. PTools: an opensource molecular docking library

    PubMed Central

    Saladin, Adrien; Fiorucci, Sébastien; Poulain, Pierre; Prévost, Chantal; Zacharias, Martin

    2009-01-01

    Background: Macromolecular docking is a challenging field of bioinformatics. Developing new algorithms is a slow process generally involving routine tasks that should be found in a robust library and not programmed from scratch for every new software application. Results: We present an object-oriented Python/C++ library to help the development of new docking methods. This library contains low-level routines like PDB-format manipulation functions as well as high-level tools for docking and analyzing results. We also illustrate the ease of use of this library with the detailed implementation of a 3-body docking procedure. Conclusion: The PTools library can handle molecules at coarse-grained or atomic resolution and allows users to rapidly develop new software. The library is already in use for protein-protein and protein-DNA docking with the ATTRACT program and for simulation analysis. This library is freely available under the GNU GPL license, together with detailed documentation. PMID:19409097

  5. PTools: an opensource molecular docking library.

    PubMed

    Saladin, Adrien; Fiorucci, Sébastien; Poulain, Pierre; Prévost, Chantal; Zacharias, Martin

    2009-05-01

    Macromolecular docking is a challenging field of bioinformatics. Developing new algorithms is a slow process generally involving routine tasks that should be found in a robust library and not programmed from scratch for every new software application. We present an object-oriented Python/C++ library to help the development of new docking methods. This library contains low-level routines like PDB-format manipulation functions as well as high-level tools for docking and analyzing results. We also illustrate the ease of use of this library with the detailed implementation of a 3-body docking procedure. The PTools library can handle molecules at coarse-grained or atomic resolution and allows users to rapidly develop new software. The library is already in use for protein-protein and protein-DNA docking with the ATTRACT program and for simulation analysis. This library is freely available under the GNU GPL license, together with detailed documentation.

  6. Gait Analysis From a Single Ear-Worn Sensor: Reliability and Clinical Evaluation for Orthopaedic Patients.

    PubMed

    Jarchi, Delaram; Lo, Benny; Wong, Charence; Ieong, Edmund; Nathwani, Dinesh; Yang, Guang-Zhong

    2016-08-01

    Objective assessment of detailed gait patterns after orthopaedic surgery is important for post-surgical follow-up and rehabilitation. The purpose of this paper is to assess the use of a single ear-worn sensor for clinical gait analysis. A reliability measure is devised for indicating the confidence level of the estimated gait events, allowing it to be used in free-walking environments and for facilitating clinical assessment of orthopaedic patients after surgery. Patient groups prior to or following anterior cruciate ligament (ACL) reconstruction and knee replacement were recruited to assess the proposed method. The ability of the sensor for detailed longitudinal analysis is demonstrated with a group of patients after lower limb reconstruction by considering parameters such as temporal and force-related gait asymmetry derived from gait events. The results suggest that the ear-worn sensor can be used for objective gait assessments of orthopaedic patients without the requirement and expense of an elaborate laboratory setup for gait analysis. It significantly simplifies the monitoring protocol and opens the possibilities for home-based remote patient assessment.

  7. Rapid Disaster Damage Estimation

    NASA Astrophysics Data System (ADS)

    Vu, T. T.

    2012-07-01

    Experience from recent disaster events has shown that detailed information derived from high-resolution satellite images can accommodate the requirements of damage analysts and disaster management practitioners. The richer information contained in such high-resolution images, however, increases the complexity of image analysis. As a result, few image analysis solutions can be practically used under time pressure in the context of post-disaster and emergency response. To fill the gap in the employment of remote sensing in disaster response, this research develops a rapid high-resolution satellite mapping solution built upon a dual-scale contextual framework to support damage estimation after a catastrophe. The target objects are buildings (or building blocks) and their condition. On the coarse processing level, statistical region merging is deployed to group pixels into a number of coarse clusters. Based on majority rules over a vegetation index, a water index and a shadow index, it is possible to eliminate the irrelevant clusters. The remaining clusters likely consist of building structures and others. On the fine processing level, within each remaining cluster, smaller objects are formed using morphological analysis. Numerous indicators, including spectral, textural and shape indices, are computed and used in a rule-based object classification. The computation time of raster-based analysis depends strongly on the image size, or in other words the number of processed pixels. Breaking the analysis into two processing levels helps to reduce the number of processed pixels and the redundancy of processing irrelevant information. In addition, it allows a data- and task-based parallel implementation. The performance is demonstrated with QuickBird images of a disaster-affected area of Phanga, Thailand, captured after the 2004 Indian Ocean tsunami. The developed solution will be implemented on different platforms as well as a web processing service for operational use.
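
    The coarse-level cluster elimination can be sketched as majority voting on per-cluster spectral indices; the 4-band layout, the NDVI and brightness thresholds, and the random segmentation below are assumptions, not the paper's exact rules:

        import numpy as np

        rng = np.random.default_rng(5)
        H, W = 100, 100
        # Hypothetical 4-band image (blue, green, red, near-infrared) in [0, 1] and a
        # label map from a coarse segmentation (e.g., statistical region merging).
        image = rng.random((H, W, 4))
        labels = rng.integers(0, 50, size=(H, W))     # 50 coarse clusters

        red, nir = image[..., 2], image[..., 3]
        ndvi = (nir - red) / (nir + red + 1e-6)       # vegetation index
        brightness = image[..., :3].mean(axis=-1)     # crude shadow/water proxy

        keep = []
        for lab in np.unique(labels):
            m = labels == lab
            # Majority rules (thresholds are arbitrary assumptions): discard clusters
            # that are mostly vegetation or mostly dark (shadow/water).
            vegetation = (ndvi[m] > 0.3).mean() > 0.5
            dark = (brightness[m] < 0.2).mean() > 0.5
            if not vegetation and not dark:
                keep.append(lab)

        print(f"{len(keep)} of {labels.max() + 1} coarse clusters kept "
              f"for fine-level analysis")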

  8. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi; Sales, Brian C.; Sefat, Athena S.; Jesse, Stephen; Pan, Minghu; Kalinin, Sergei V.

    2014-12-01

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.
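
    The multivariate workflow described above (dimensionality reduction of the spectra followed by clustering into regions of similar electronic behavior) can be sketched generically; the array sizes, component count and cluster count below are placeholders, not the values used in the study.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    # Synthetic stand-in for a current-imaging tunneling-spectroscopy data set:
    # a 64 x 64 grid of tunneling spectra sampled at 128 bias values.
    rng = np.random.default_rng(0)
    cube = rng.normal(size=(64, 64, 128))

    nx_pix, ny_pix, n_energy = cube.shape
    spectra = cube.reshape(nx_pix * ny_pix, n_energy)        # one spectrum per pixel

    scores = PCA(n_components=5).fit_transform(spectra)      # compress each spectrum
    labels = KMeans(n_clusters=3, n_init=10).fit_predict(scores)

    cluster_map = labels.reshape(nx_pix, ny_pix)             # spatial map of dissimilar regions
    ```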

  9. An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft

    NASA Technical Reports Server (NTRS)

    Olson, E. D.; Mavris, D. N.

    2000-01-01

    An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.

  10. A Technical Analysis Information Fusion Approach for Stock Price Analysis and Modeling

    NASA Astrophysics Data System (ADS)

    Lahmiri, Salim

    In this paper, we address the problem of technical analysis information fusion in improving stock market index-level prediction. We present an approach for analyzing stock market price behavior based on different categories of technical analysis metrics and a multiple predictive system. Each category of technical analysis measures is used to characterize stock market price movements. The presented predictive system is based on an ensemble of neural networks (NN) coupled with particle swarm intelligence for parameter optimization, where each single neural network is trained with a specific category of technical analysis measures. The experimental evaluation on three international stock market indices and three individual stocks shows that the presented ensemble-based technical indicator fusion system significantly improves forecasting accuracy in comparison with a single NN. It also outperforms the classical neural network trained with index-level lagged values and the NN trained with stationary wavelet transform detail and approximation coefficients. As a result, technical information fusion in an NN ensemble architecture helps improve prediction accuracy.
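
    A minimal sketch of the ensemble idea, with one small network per indicator category and a simple average of member forecasts; the particle swarm optimization of network parameters used in the paper is omitted here, and the hidden-layer size is an arbitrary assumption.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def fit_category_ensemble(categories, target):
        """Train one small network per technical-indicator category.
        categories: list of arrays, each of shape (n_samples, n_indicators_in_category)."""
        models = []
        for features in categories:
            net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
            models.append(net.fit(features, target))
        return models

    def ensemble_forecast(models, categories):
        """Average the member forecasts (the paper's PSO tuning step is not reproduced)."""
        preds = [model.predict(features) for model, features in zip(models, categories)]
        return np.mean(preds, axis=0)
    ```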

  11. On the topological structure of multinationals network

    NASA Astrophysics Data System (ADS)

    Joyez, Charlie

    2017-05-01

    This paper uses a weighted network analysis to examine the structure of the network of multinationals' implantation countries. Based on a French firm-level dataset of multinational enterprises (MNEs), the network analysis provides information on each country's position in the network and in the internationalization strategies of French MNEs through connectivity preferences among the nodes. The paper also details network-wide features and their recent evolution toward a more decentralized structure. While much has been said about the international trade network, this paper shows that studies of multinational firms would also benefit from network analysis, notably by investigating the sensitivity of the network construction to firm heterogeneity.
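
    A toy illustration of the kind of weighted-network measures involved; the country codes and edge weights are invented, and the paper's actual firm-level dataset is not reproduced here.

    ```python
    import networkx as nx

    # Toy weighted country network; an edge weight could encode how many French
    # MNEs are implanted in both countries (values invented for illustration).
    G = nx.Graph()
    G.add_weighted_edges_from([
        ("FR", "DE", 120), ("FR", "US", 95), ("FR", "CN", 60),
        ("DE", "US", 40), ("US", "CN", 25),
    ])

    degree = dict(G.degree())                    # number of partner countries
    strength = dict(G.degree(weight="weight"))   # weighted degree ("strength")
    centrality = nx.eigenvector_centrality(G, weight="weight")
    print(degree, strength, centrality, sep="\n")
    ```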

  12. Thermal structure of Sikhote Alin and adjacent areas based on spectral analysis of the anomalous magnetic field

    NASA Astrophysics Data System (ADS)

    Didenko, A. N.; Nosyrev, M. Yu.; Shevchenko, B. F.; Gilmanova, G. Z.

    2017-11-01

    The depth of the base of the magnetoactive layer and the geothermal gradient in the Sikhote Alin crust are estimated with a method that determines the Curie point depth of magnetoactive masses using spectral analysis of the anomalous magnetic field. A detailed map of the geothermal gradient is constructed for the first time for the Sikhote Alin and adjacent areas of the Central Asian belt. Analysis of this map shows that the zones with a higher geothermal gradient geographically coincide with the areas of higher seismicity.
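
    A hedged sketch of the spectral step behind such estimates: a Spector-and-Grant-type depth-to-top estimate from the slope of the radially averaged log power spectrum. The grid spacing, wavenumber band and the omission of the centroid-depth step needed for a full Curie-depth estimate are simplifying assumptions, not the authors' procedure.

    ```python
    import numpy as np

    def depth_to_top(anomaly_grid, dx_km, kmin=0.05, kmax=0.5, nbins=20):
        """Estimate depth to the top of magnetic sources (km) from the slope of the
        radially averaged log power spectrum, ln P(k) ~ const - 2*k*z_t for k in rad/km."""
        ny, nx = anomaly_grid.shape
        power = np.abs(np.fft.fft2(anomaly_grid)) ** 2
        kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx_km)   # rad/km
        ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx_km)
        k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)

        edges = np.linspace(kmin, kmax, nbins)
        centers, ln_power = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):      # radial averaging in k-bins
            mask = (k >= lo) & (k < hi)
            if mask.any():
                centers.append(0.5 * (lo + hi))
                ln_power.append(np.log(power[mask].mean()))

        slope, _ = np.polyfit(centers, ln_power, 1)
        return -slope / 2.0

    # usage with a real gridded anomaly (nT) on a 1 km grid: depth_to_top(grid, dx_km=1.0)
    ```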

  13. Large-Scale Femtoliter Droplet Array for Single Cell Efflux Assay of Bacteria.

    PubMed

    Iino, Ryota; Sakakihara, Shouichi; Matsumoto, Yoshimi; Nishino, Kunihiko

    2018-01-01

    Large-scale femtoliter droplet array as a platform for single cell efflux assay of bacteria is described. Device microfabrication, femtoliter droplet array formation and concomitant enclosure of single bacterial cells, fluorescence-based detection of efflux activity at the single cell level, and collection of single cells from droplet and subsequent gene analysis are described in detail.

  14. The Florida Community College Accountability Plan: An Analysis of Institutional Characteristics and Success at Meeting State Defined Performance Measures.

    ERIC Educational Resources Information Center

    Windham, Patricia W.; Hackett, E. Raymond

    In response to the increasing use of state-based performance indicators for postsecondary education, a study was undertaken to review the reliability and validity of state-level indicators in the Florida Community College System (FCCS). Data were collected from literature reviews and the 1996 FCCS Accountability Report, detailing outcomes for 17…

  15. Land use analysis of US urban areas using high-resolution imagery from Skylab

    NASA Technical Reports Server (NTRS)

    Gallagher, D. B. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. The S-190B imagery from Skylab 3 permitted the detection of higher levels of land use detail than any satellite imagery previously evaluated using manual interpretation techniques. Resolution approaches that of 1:100,000 scale infrared aircraft photography, especially regarding urban areas. Nonurban areas are less distinct.

  16. Job Analysis Techniques for Restructuring Health Manpower Education and Training in the Navy Medical Department. Attachment 9. Laboratory QPCB Task Sort for Medical Laboratory Technology.

    ERIC Educational Resources Information Center

    Technomics, Inc., McLean, VA.

    This publication is Attachment 9 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in medical laboratory technology. (BT)

  17. Impact of Machine-Translated Text on Entity and Relationship Extraction

    DTIC Science & Technology

    2014-12-01

    1. Introduction Using social network analysis tools is an important asset in...semantic modeling software to automatically build detailed network models from unstructured text. Contour imports unstructured text and then maps the text...onto an existing ontology of frames at the sentence level, using FrameNet, a structured language model, and through Semantic Role Labeling (SRL

  18. The Use of Cohesive Devices in Argumentative Writing by Chinese EFL Learners at Different Proficiency Levels

    ERIC Educational Resources Information Center

    Yang, Wenxing; Sun, Ying

    2012-01-01

    This article reports on a study that comparatively investigated the differences and similarities in the (incorrect) use of cohesive devices by second-year and fourth-year undergraduate Chinese EFL learners in their argumentative writings. Via detailed analysis of the quantitative and qualitative data, this study seeks to reveal if the patterns of…

  19. Detailed Analysis of Misconceptions as a Basis for Developing Remedial Instruction: The Case of Photosynthesis.

    ERIC Educational Resources Information Center

    Amir, Ruth; Tamir, Pinchas

    A great number of misconceptions in diverse subject areas as well as across age levels have been documented and described. Photosynthesis is one of the more intensively studied areas in biology. The purpose of this research was to carefully select and define misconceptions about photosynthesis needing remedial efforts. To achieve this, a specially…

  20. Comparative anatomy of the female genitalia of generic-level taxa in tribe Aedini (Diptera: Culicidae). Part XXXIII. Genus Lewnielsenius Reinert, Harbach and Kitching

    USDA-ARS?s Scientific Manuscript database

    A morphological analysis of the female genitalia of the species included in genus Lewnielsenius Reinert, Harbach and Kitching was conducted. The genitalia of the type species of the genus, Ln. muelleri (Dyar), are illustrated. Treatment of the genital morphology of the genus includes a detailed de...

  1. Understanding product cost vs. performance through an in-depth system Monte Carlo analysis

    NASA Astrophysics Data System (ADS)

    Sanson, Mark C.

    2017-08-01

    The manner in which an optical system is toleranced and compensated greatly affects the cost to build it. By having a detailed understanding of different tolerance and compensation methods, the end user can decide on the balance of cost and performance. A detailed, phased Monte Carlo analysis can be used to demonstrate the trade-offs between cost and performance. In complex, high-performance optical systems, performance is fine-tuned by making adjustments to the optical systems after they are initially built. This process enables the best overall system performance without the need to fabricate components to stringent tolerance levels that can often be outside a fabricator's manufacturing capabilities. A good simulation of as-built performance can interrogate different steps of the fabrication and build process. Such a simulation may aid the evaluation of whether the measured parameters are within the acceptable range of system performance at that stage of the build process. Finding errors before an optical system progresses further into the build process saves both time and money. Having the appropriate tolerances and compensation strategy tied to a specific performance level will optimize the overall product cost.
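
    A minimal sketch of the Monte Carlo bookkeeping described above, assuming an invented two-parameter tolerance model and a simple focus-style compensator; the sensitivities, distributions and specification value are illustrative only, not taken from any real design.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def as_built_error(tilt_urad, despace_um, compensate):
        """Toy wavefront-error model (waves RMS); coefficients are invented purely
        to illustrate the bookkeeping."""
        error = 0.02 + 0.004 * abs(tilt_urad) + 0.006 * abs(despace_um)
        if compensate:
            error -= 0.005 * min(abs(despace_um), 3.0)   # compensator recovers part of the error
        return max(error, 0.0)

    def build_yield(spec, n_trials=10_000, compensate=True):
        tilts = rng.normal(0.0, 5.0, n_trials)      # element tilt tolerance, urad (1-sigma)
        despaces = rng.normal(0.0, 2.0, n_trials)   # element despace tolerance, um (1-sigma)
        errors = np.array([as_built_error(t, d, compensate) for t, d in zip(tilts, despaces)])
        return np.mean(errors <= spec)              # fraction of builds meeting the spec

    print("yield without compensation:", build_yield(0.07, compensate=False))
    print("yield with compensation:   ", build_yield(0.07, compensate=True))
    ```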

  2. Internal Carotid Artery Web as the Cause of Recurrent Cryptogenic Ischemic Stroke.

    PubMed

    Antigüedad-Muñoz, Jon; de la Riva, Patricia; Arenaza Choperena, Gorka; Muñoz Lopetegi, Amaia; Andrés Marín, Naiara; Fernández-Eulate, Gorka; Moreno Valladares, Manuel; Martínez Zabaleta, Maite

    2018-05-01

    Carotid artery web is considered an exceptional cause of recurrent ischemic strokes in the affected arterial territory. The underlying pathology proposed for this entity is an atypical fibromuscular dysplasia. We present the case of a 43-year-old woman with no cardiovascular risk factors who had experienced 2 cryptogenic ischemic strokes in the same arterial territory within an 11-month period. Although all diagnostic tests initially yielded normal results, detailed analysis of the computed tomography angiography images revealed a carotid web; catheter angiography subsequently confirmed the diagnosis. Carotid surgery was performed, since which time the patient has remained completely asymptomatic. The histological finding of intimal hyperplasia is consistent with previously reported cases of carotid artery web. Carotid artery web is an infrequent cause of stroke, and this diagnosis requires a high level of suspicion plus a detailed analysis of vascular imaging studies. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  3. Helios1A EoL: A Success. For the first Time a Long Final Thrust Scenario, Respecting the French Law on Space Operations

    NASA Astrophysics Data System (ADS)

    Guerry, Agnes; Moussi, Aurelie; Sartine, Christian; Beaumet, Gregory

    2013-09-01

    HELIOS1A End Of Life (EOL) operations occurred in early 2012. Through this EOL operation, CNES wanted to set an example of French Space Act compliance. Because the satellite was not originally designed for such an EOL phase, the operation was delicate and risky. It was organized as a full project in order to assess all scenario details with a dedicated Mission Analysis, to secure the operations through detailed risk analysis at system level, and to consider the major failures that could occur during the EOL. A short scenario meeting several objectives was eventually selected. The main objective of this project was to preserve the space environment. The operations were conducted on a "best effort" basis. The French Space Operations Act (FSOA) requirements were met: HELIOS-1A EOL operations were completed successfully.

  4. Criteria-based evaluation of group 3 level memory telefacsimile equipment for interlibrary loan.

    PubMed Central

    Bennett, V M; Wood, M S; Malcom, D L

    1990-01-01

    The Interlibrary Loan, Document Delivery, and Union List Task Force of the Health Sciences Libraries Consortium (HSLC)--with nineteen libraries located in Philadelphia, Pittsburgh, and Hershey, Pennsylvania, and Delaware--accepted the charge of evaluating and recommending for purchase telefacsimile hardware to further interlibrary loan among HSLC members. To allow a thorough and scientific evaluation of group 3 level telefacsimile equipment, the task force identified ninety-six hardware features, which were grouped into nine broad criteria. These features formed the basis of a weighted analysis that identified three final candidates, with one model recommended to the HSLC board. This article details each of the criteria and discusses features in terms of library applications. The evaluation grid developed in the weighted analysis process should aid librarians charged with the selection of level 3 telefacsimile equipment. PMID:2328361
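
    The weighted analysis mentioned above can be sketched as a simple weighted-sum scoring of candidates against grouped criteria; the criteria, weights and scores below are invented and do not reproduce the task force's ninety-six features or nine criteria.

    ```python
    # Weighted scoring of candidate fax machines against grouped criteria (illustrative only).
    weights = {"transmission speed": 0.20, "memory": 0.15, "paper handling": 0.15,
               "image quality": 0.20, "ease of use": 0.10, "cost": 0.20}

    candidates = {
        "model A": {"transmission speed": 8, "memory": 7, "paper handling": 6,
                    "image quality": 9, "ease of use": 7, "cost": 5},
        "model B": {"transmission speed": 7, "memory": 9, "paper handling": 8,
                    "image quality": 7, "ease of use": 8, "cost": 6},
    }

    def weighted_score(scores, weights):
        """Sum of criterion scores multiplied by their weights (weights sum to 1.0)."""
        return sum(weights[c] * scores[c] for c in weights)

    ranking = sorted(candidates, key=lambda m: weighted_score(candidates[m], weights), reverse=True)
    print(ranking)
    ```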

  5. Analysis of transport eco-efficiency scenarios to support sustainability assessment: a study on Dhaka City, Bangladesh.

    PubMed

    Iqbal, Asif; Allan, Andrew; Afroze, Shirina

    2017-08-01

    The study focused on assessing the level of efficiency (in both emissions and service quality) that can be achieved for the transport system in Dhaka City, Bangladesh. The assessment technique attempted to quantify the extent of eco-efficiency achievable through system modifications arising from planning or strategy. The eco-efficiency analysis was supported by detailed survey data on the Dhaka City transport system, collected over 9 months in 2012-2013. Line source modelling (CALINE4) was incorporated to estimate on-road emission concentrations. The eco-efficiency of the transport systems was assessed with the multi-criteria analysis (MCA) technique, which enabled the valuation of the systems' qualitative and quantitative parameters. According to the analysis, eliminating driving indiscipline on the road alone could yield about 47% reduction in emissions; together with the number of private vehicles, driving indiscipline was the main stressor restricting eco-efficiency in Dhaka City. Detailed analysis of the transport system, together with potential transport system scenarios, can offer policy makers a checklist for identifying the actions needed to deliver better services to city dwellers with lower emissions, which in turn can bring sustainability to the system.

  6. New insights on ion track morphology in pyrochlores by aberration corrected scanning transmission electron microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sachan, Ritesh; Zhang, Yanwen; Ou, Xin

    Here we demonstrate the enhanced imaging capabilities of an aberration corrected scanning transmission electron microscope to advance the understanding of ion track structure in pyrochlore structured materials (i.e., Gd2Ti2O7 and Gd2TiZrO7). Track formation occurs due to the inelastic transfer of energy from incident ions to electrons, and atomic-level details of track morphology as a function of energy loss are revealed in the present work. A comparison of imaging details obtained by varying the collection angles of the detectors is also discussed. A quantitative analysis of phase identification using high-angle annular dark field imaging is performed on the ion tracks. Finally, a novel 3-dimensional track reconstruction method is provided that is based on depth-dependent imaging of the ion tracks. The technique is used in extracting the atomic-level details of nanoscale features, such as the disordered ion tracks, which are embedded in a relatively thick matrix. The relevance of the method is further shown by measuring the tilt of the ion tracks relative to the electron beam incidence, which helps in determining the structure and geometry of the ion tracks quantitatively.

  7. New insights on ion track morphology in pyrochlores by aberration corrected scanning transmission electron microscopy

    DOE PAGES

    Sachan, Ritesh; Zhang, Yanwen; Ou, Xin; ...

    2016-12-13

    Here we demonstrate the enhanced imaging capabilities of an aberration corrected scanning transmission electron microscope to advance the understanding of ion track structure in pyrochlore structured materials (i.e., Gd2Ti2O7 and Gd2TiZrO7). Track formation occurs due to the inelastic transfer of energy from incident ions to electrons, and atomic-level details of track morphology as a function of energy loss are revealed in the present work. A comparison of imaging details obtained by varying the collection angles of the detectors is also discussed. A quantitative analysis of phase identification using high-angle annular dark field imaging is performed on the ion tracks. Finally, a novel 3-dimensional track reconstruction method is provided that is based on depth-dependent imaging of the ion tracks. The technique is used in extracting the atomic-level details of nanoscale features, such as the disordered ion tracks, which are embedded in a relatively thick matrix. The relevance of the method is further shown by measuring the tilt of the ion tracks relative to the electron beam incidence, which helps in determining the structure and geometry of the ion tracks quantitatively.

  8. Design for testability and diagnosis at the system-level

    NASA Technical Reports Server (NTRS)

    Simpson, William R.; Sheppard, John W.

    1993-01-01

    The growing complexity of full-scale systems has surpassed the capabilities of most simulation software to provide detailed models or gate-level failure analyses. The process of system-level diagnosis approaches the fault-isolation problem in a manner that differs significantly from the traditional and exhaustive failure mode search. System-level diagnosis is based on a functional representation of the system. For example, one can exercise one portion of a radar algorithm (the Fast Fourier Transform (FFT) function) by injecting several standard input patterns and comparing the results to standardized output results. An anomalous output would point to one of several items (including the FFT circuit) without specifying the gate or failure mode. For system-level repair, identifying an anomalous chip is sufficient. We describe here an information theoretic and dependency modeling approach that discards much of the detailed physical knowledge about the system and analyzes its information flow and functional interrelationships. The approach relies on group and flow associations and, as such, is hierarchical. Its hierarchical nature allows the approach to be applicable to any level of complexity and to any repair level. This approach has been incorporated in a product called STAMP (System Testability and Maintenance Program) which was developed and refined through more than 10 years of field-level applications to complex system diagnosis. The results have been outstanding, even spectacular in some cases. In this paper we describe system-level testability, system-level diagnoses, and the STAMP analysis approach, as well as a few STAMP applications.
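
    A minimal sketch of dependency-model fault isolation in the spirit described above, where each functional test covers a set of items, a passed test exonerates its cover and a failed test narrows the suspects. The component and test names are illustrative and not taken from STAMP.

    ```python
    # Dependency model: which replaceable items each functional test exercises.
    dependencies = {
        "fft_pattern_test": {"adc", "fft_circuit", "output_buffer"},
        "adc_ramp_test":    {"adc"},
        "buffer_loopback":  {"output_buffer"},
    }

    def isolate(outcomes):
        """outcomes: dict test -> 'pass' | 'fail'; returns the candidate fault set."""
        suspects = set().union(*dependencies.values())
        for test, result in outcomes.items():
            if result == "pass":
                suspects -= dependencies[test]    # everything a passing test exercises is exonerated
        for test, result in outcomes.items():
            if result == "fail":
                suspects &= dependencies[test]    # a failure narrows suspects to its cover
        return suspects

    print(isolate({"fft_pattern_test": "fail", "adc_ramp_test": "pass", "buffer_loopback": "pass"}))
    # -> {'fft_circuit'}  (anomalous FFT output, with the ADC and buffer ruled out)
    ```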

  9. Comparison of cross culture engineering ethics training using the simulator for engineering ethics education.

    PubMed

    Chung, Christopher

    2015-04-01

    This paper describes the use and analysis of the Simulator for Engineering Ethics Education (SEEE) to perform cross-cultural engineering ethics training and analysis. Details of the first- and second-generation development of the SEEE are published in Chung and Alfred, Science and Engineering Ethics, vol. 15, 2009 and Alfred and Chung, Science and Engineering Ethics, vol. 18, 2012. In this effort, a group of far-eastern-educated students operated the simulator in the instructional, training, scenario, and evaluation modes. The pre- and post-treatment performance of these students was compared with that of U.S.-educated students. Analysis of the performance indicated that the far-eastern-educated students increased their level of knowledge by 23.7 percent, while U.S.-educated students increased their level of knowledge by 39.3 percent.

  10. Comparison of Australian and US Cost-Benefit Approaches to MEPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMahon, James E.

    2004-03-12

    The Australian Greenhouse Office contracted with the Collaborative Labeling and Appliance Standards Program (CLASP) for LBNL to compare US and Australian approaches to analyzing costs and benefits of minimum energy performance standards (MEPS). This report compares the approaches for three types of products: household refrigerators and freezers, small electric storage water heaters, and commercial/industrial air conditioners. This report presents the findings of similarities and differences between the approaches of the two countries and suggests changes to consider in the approach taken in Australia. The purpose of the Australian program is to reduce greenhouse gas emissions, while the US program is intended to increase energy efficiency; each program is thus subject to specific constraints. The market and policy contexts are different, with the USA producing most of its own products and conducting pioneering engineering-economic studies to identify maximum energy efficiency levels that are technologically feasible and economically justified. In contrast, Australia imports a large share of its products and adopts MEPS already in place elsewhere. With these differences in circumstances, Australia's analysis approach could be expected to have less analytical detail and still result in MEPS levels that are appropriate for their policy and market context. In practice, the analysis required to meet these different objectives is quite similar. To date, Australia's cost-benefit analysis has served the goals and philosophies of the program well and been highly effective in successfully identifying MEPS that are significantly reducing greenhouse gas emissions while providing economic benefits to consumers. In some cases, however, the experience of the USA--using more extensive data sets and more detailed analysis--suggests possible improvements to Australia's cost-benefit analysis. The principal findings of the comparison are: (1) The Technology and Market Assessments are similar; no changes are recommended. (2) The Australian approach to determining the relationship of price to energy efficiency is based on current market, while the US approach uses prospective estimates. Both approaches may benefit from increased retrospective analysis of impacts of MEPS on appliance and equipment prices. Under some circumstances, Australia may wish to consider analyzing two separate components leading to price impacts: (a) changes in manufacturing costs and (b) markups used to convert from manufacturing costs to consumer price. (3) The Life-Cycle Cost methods are similar, but the USA has statistical surveys that permit a more detailed analysis. Australia uses average values, while the US uses full distributions. If data and resources permit, Australia may benefit from greater depth here as well. If implemented, the changes will provide more information about the benefits and costs of the program, in particular identifying who benefits and who bears net costs so that programs can be designed to offset unintended negative consequences, and may assist the government in convincing affected parties of the justification for some MEPS. However, without a detailed and statistically representative national survey, such an approach may not be practical for Australia at this time. (4) The National Benefits and Costs methods are similar prospective estimates of shipments, costs and energy savings, as well as greenhouse gas emissions.
Additional sensitivity studies could further illustrate the ranges in these estimates. Consideration of lower discount rates could lead to more stringent MEPS in some cases. (5) Both the Australian and US analyses of impacts on industry, competition, and trade ultimately depend upon sufficient consultation with industry experts. While the Australian analysis of financial impacts on manufacturers is less detailed than that of the US, the Australian treatment of impacts on market shares imported from different regions of the world is more detailed. No change is recommended. Implementing these changes would increase the depth of analysis, require additional data collection and analysis, and incur associated costs and time. The recommended changes are likely to have incremental rather than dramatic impacts on the substance and implications of the analysis as currently conducted.

  11. Large Terrain Continuous Level of Detail 3D Visualization Tool

    NASA Technical Reports Server (NTRS)

    Myint, Steven; Jain, Abhinandan

    2012-01-01

    This software solved the problem of displaying terrains that are usually too large to be displayed on standard workstations in real time. The software can visualize terrain data sets composed of billions of vertices, and can display these data sets at greater than 30 frames per second. The Large Terrain Continuous Level of Detail 3D Visualization Tool allows large terrains, which can be composed of billions of vertices, to be visualized in real time. It utilizes a continuous level of detail technique called clipmapping to support this. It offloads much of the work involved in breaking up the terrain into levels of details onto the GPU (graphics processing unit) for faster processing.
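
    A much-simplified sketch of distance-based level-of-detail selection, assuming each coarser level roughly halves terrain resolution; the thresholds are arbitrary and this is not the tool's clipmapping or GPU implementation.

    ```python
    import math

    def lod_level(distance, base_distance=512.0, max_level=8):
        """Pick a terrain level of detail from viewer distance; each coarser level
        roughly halves resolution, as in nested clipmap rings."""
        if distance <= base_distance:
            return 0
        return min(int(math.log2(distance / base_distance)) + 1, max_level)

    for d in (100.0, 600.0, 2500.0, 100000.0):
        print(d, "->", lod_level(d))
    ```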

  12. Institute for Home Economics Teachers on Initiating, Developing, and Evaluating Programs at the Post High School Level to Prepare Food Service Supervisors and Assistants to Directors of Child Care Services: Volume I: A Post High School Program in Home Economics (May 1, 1966-June 30, 1967). Final Report.

    ERIC Educational Resources Information Center

    Georgia Univ., Athens. Coll. of Education.

    The institute was designed to provide information and develop some ability in initiating, developing, and evaluating programs for training workers as food service supervisors in post-high school level programs. Organizational details, student and faculty qualifications, a job description and analysis of the food service supervisor occupation are…

  13. Computational study of some fluoroquinolones: Structural, spectral and docking investigations

    NASA Astrophysics Data System (ADS)

    Sayin, Koray; Karakaş, Duran; Kariper, Sultan Erkan; Sayin, Tuba Alagöz

    2018-03-01

    Quantum chemical calculations are performed on norfloxacin, tosufloxacin and levofloxacin. The most stable structure of each molecule is determined from thermodynamic parameters, and the most suitable level of theory is selected by benchmark analysis; the M062X/6-31+G(d) level is used in the calculations. IR, UV-VIS and NMR spectra are calculated and examined in detail. Some quantum chemical parameters are calculated and the activity trend is suggested. Additionally, molecular docking calculations are performed between the related compounds and a protein (ID: 2J9N).

  14. Development and testing of a fast conceptual river water quality model.

    PubMed

    Keupers, Ingrid; Willems, Patrick

    2017-04-15

    Modern, model-based river quality management strongly relies on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too demanding of computation time, especially when simulating the long time periods needed for statistical analysis of the results, or when model sensitivity analysis, calibration and validation require a large number of model runs. To overcome this problem, a structure identification method to set up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as locations of interest and boundary inputs. These reservoirs are modelled as Plug Flow Reactors (PFR) and Continuously Stirred Tank Reactors (CSTR) to describe advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered, but with adjusted residence times based on the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows for a much faster calculation time (a factor of 10⁵) without significant loss of accuracy, making it feasible to perform time-demanding scenario runs. Copyright © 2017 Elsevier Ltd. All rights reserved.
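
    A minimal sketch of routing a concentration series through a chain of completely stirred reservoirs with first-order decay, in the spirit of the conceptual model described above; the flow, volumes, decay rate and time step are invented values, and the calibration to detailed simulation results is not shown.

    ```python
    import numpy as np

    def simulate_cstr_chain(c_in, flow, volumes, decay, dt):
        """Route an inflow concentration series through CSTRs in series using
        dC/dt = Q/V * (C_upstream - C) - k*C (explicit Euler).  A PFR can be
        approximated by a pure time delay or by many small CSTRs in series."""
        n_steps, n_res = len(c_in), len(volumes)
        conc = np.zeros((n_steps, n_res))
        for t in range(1, n_steps):
            upstream = c_in[t - 1]
            for r in range(n_res):
                c = conc[t - 1, r]
                conc[t, r] = c + dt * (flow / volumes[r] * (upstream - c) - decay * c)
                upstream = conc[t, r]
        return conc

    # a step input of 5 mg/L routed through three conceptual reservoirs (invented values)
    bod_in = np.full(200, 5.0)
    profile = simulate_cstr_chain(bod_in, flow=2.0, volumes=[5e4, 8e4, 6e4], decay=1e-5, dt=300.0)
    ```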

  15. New approach for determination of the influence of long-range order and selected ring oscillations on IR spectra in zeolites

    NASA Astrophysics Data System (ADS)

    Mikuła, Andrzej; Król, Magdalena; Mozgawa, Włodzimierz; Koleżyński, Andrzej

    2018-04-01

    Vibrational spectroscopy can be considered one of the most important methods for the structural characterization of various porous aluminosilicate materials, including zeolites. On the other hand, vibrational spectra of zeolites are still difficult to interpret, particularly in the pseudolattice region, where bands related to ring oscillations are observed. Using a combination of theoretical and computational approaches, a detailed analysis of these regions of the spectra is possible; such analysis should, however, be carried out using models of different levels of complexity treated at the same level of theory. In this work, an attempt was made to identify ring oscillations in the vibrational spectra of selected zeolite structures. A series of ab initio calculations has been carried out for S4R, S6R and, as a novelty, 5-1 isolated clusters, as well as for periodic siliceous frameworks built from those building units (ferrierite (FER), mordenite (MOR) and heulandite (HEU) type). Owing to the hierarchical structure of zeolite frameworks, the total envelope of a zeolite spectrum can be expected to be, to a good approximation, a sum of the spectra of the structural elements that build the framework. Based on the results of the HF calculations, normal vibrations have been visualized and a detailed analysis of the pseudolattice range of the resulting theoretical spectra has been carried out. The results obtained have been applied to the interpretation of experimental spectra of selected zeolites.

  16. An efficient liner cooling scheme for advanced small gas turbine combustors

    NASA Technical Reports Server (NTRS)

    Paskin, Marc D.; Mongia, Hukam C.; Acosta, Waldo A.

    1993-01-01

    A joint Army/NASA program was conducted to design, fabricate, and test an advanced, small gas turbine, reverse-flow combustor utilizing a compliant metal/ceramic (CMC) wall cooling concept. The objectives of this effort were to develop a design method (basic design data base and analysis) for the CMC cooling technique and then demonstrate its application to an advanced cycle, small, reverse-flow combustor with 3000 F burner outlet temperature. The CMC concept offers significant improvements in wall cooling effectiveness resulting in a large reduction in cooling air requirements. Therefore, more air is available for control of burner outlet temperature pattern in addition to the benefits of improved efficiency, reduced emissions, and lower smoke levels. The program was divided into four tasks. Task 1 defined component materials and localized design of the composite wall structure in conjunction with development of basic design models for the analysis of flow and heat transfer through the wall. Task 2 included implementation of the selected materials and validated design models during combustor preliminary design. Detail design of the selected combustor concept and its refinement with 3D aerothermal analysis were completed in Task 3. Task 4 covered detail drawings, process development and fabrication, and a series of burner rig tests. The purpose of this paper is to provide details of the investigation into the fundamental flow and heat transfer characteristics of the CMC wall structure as well as implementation of the fundamental analysis method for full-scale combustor design.

  17. Mission planning, mission analysis and software formulation. Level C requirements for the shuttle mission control center orbital guidance software

    NASA Technical Reports Server (NTRS)

    Langston, L. J.

    1976-01-01

    The formulation of Level C requirements for guidance software was reported. Requirements for a PEG supervisor which controls all input/output interfaces with other processors and determines which PEG mode is to be utilized were studied in detail. A description of the two guidance modes for which Level C requirements have been formulated was presented. Functions required for proper execution of the guidance software were defined. The requirements for a navigation function that is used in the prediction logic of PEG mode 4 were discussed. It is concluded that this function is extracted from the current navigation FSSR.

  18. Investigation of the turbulent wind field below 500 feet altitude at the Eastern Test Range, Florida

    NASA Technical Reports Server (NTRS)

    Blackadar, A. K.; Panofsky, H. A.; Fiedler, F.

    1974-01-01

    A detailed analysis of wind profiles and turbulence at the 150 m Cape Kennedy Meteorological Tower is presented. Various methods are explored for the estimation of wind profiles, wind variances, high-frequency spectra, and coherences between various levels, given roughness length and either low-level wind and temperature data, or geostrophic wind and insolation. The relationship between planetary Richardson number, insolation, and geostrophic wind is explored empirically. Techniques were devised which resulted in surface stresses reasonably well correlated with the surface stresses obtained from low-level data. Finally, practical methods are suggested for the estimation of wind profiles and wind statistics.

  19. An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.

    PubMed Central

    Undrill, P E; Frazer, S C

    1979-01-01

    A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
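
    A hedged sketch of a tabular cumulative sum check of the kind mentioned above; the target value, standard deviation, control-serum results and the k/h design constants are typical textbook choices, not values from the paper.

    ```python
    def tabular_cusum(values, target, sigma, k=0.5, h=5.0):
        """One-sided upper/lower CUSUMs on control-serum results; an alarm is raised
        when either sum exceeds h standard deviations."""
        s_hi = s_lo = 0.0
        alarms = []
        for i, x in enumerate(values):
            z = (x - target) / sigma
            s_hi = max(0.0, s_hi + z - k)   # accumulates upward drift
            s_lo = max(0.0, s_lo - z - k)   # accumulates downward drift
            if s_hi > h or s_lo > h:
                alarms.append(i)
                s_hi = s_lo = 0.0           # reset after signalling
        return alarms

    # e.g. daily glucose control-serum results drifting upward from a 5.50 mmol/L target
    results = [5.49, 5.52, 5.48, 5.55, 5.61, 5.66, 5.70, 5.74, 5.79, 5.83]
    print(tabular_cusum(results, target=5.50, sigma=0.05))
    ```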

  20. Use of Probabilistic Engineering Methods in the Detailed Design and Development Phases of the NASA Ares Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Fayssal, Safie; Weldon, Danny

    2008-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program called Constellation to send crew and cargo to the international Space Station, to the moon, and beyond. As part of the Constellation program, a new launch vehicle, Ares I, is being developed by NASA Marshall Space Flight Center. Designing a launch vehicle with high reliability and increased safety requires a significant effort in understanding design variability and design uncertainty at the various levels of the design (system, element, subsystem, component, etc.) and throughout the various design phases (conceptual, preliminary design, etc.). In a previous paper [1] we discussed a probabilistic functional failure analysis approach intended mainly to support system requirements definition, system design, and element design during the early design phases. This paper provides an overview of the application of probabilistic engineering methods to support the detailed subsystem/component design and development as part of the "Design for Reliability and Safety" approach for the new Ares I Launch Vehicle. Specifically, the paper discusses probabilistic engineering design analysis cases that had major impact on the design and manufacturing of the Space Shuttle hardware. The cases represent important lessons learned from the Space Shuttle Program and clearly demonstrate the significance of probabilistic engineering analysis in better understanding design deficiencies and identifying potential design improvement for Ares I. The paper also discusses the probabilistic functional failure analysis approach applied during the early design phases of Ares I and the forward plans for probabilistic design analysis in the detailed design and development phases.

  1. Supersonic molecular beam-hyperthermal surface ionisation coupled with time-of-flight mass spectrometry applied to trace level detection of polynuclear aromatic hydrocarbons in drinking water for reduced sample preparation and analysis time.

    PubMed

    Davis, S C; Makarov, A A; Hughes, J D

    1999-01-01

    Analysis of sub-ppb levels of polynuclear aromatic hydrocarbons (PAHs) in drinking water by high performance liquid chromatography (HPLC) fluorescence detection typically requires large water samples and lengthy extraction procedures. The detection itself, although selective, does not give compound identity confirmation. Benchtop gas chromatography/mass spectrometry (GC/MS) systems operating in the more sensitive selected ion monitoring (SIM) acquisition mode discard spectral information and, when operating in scanning mode, are less sensitive and scan too slowly. The selectivity of hyperthermal surface ionisation (HSI), the high column flow rate capacity of the supersonic molecular beam (SMB) GC/MS interface, and the high acquisition rate of time-of-flight (TOF) mass analysis, are combined here to facilitate a rapid, specific and sensitive technique for the analysis of trace levels of PAHs in water. This work reports the advantages gained by using the GC/HSI-TOF system over the HPLC fluorescence method, and discusses in some detail the nature of the instrumentation used.

  2. Meta-T: TetrisⓇ as an experimental paradigm for cognitive skills research.

    PubMed

    Lindstedt, John K; Gray, Wayne D

    2015-12-01

    Studies of human performance in complex tasks using video games are an attractive prospect, but many existing games lack a comprehensive way to modify the game and track performance beyond basic levels of analysis. Meta-T provides experimenters a tool to study behavior in a dynamic task environment with time-stressed decision-making and strong perceptual-motor elements, offering a host of experimental manipulations with a robust and detailed logging system for all user events, system events, and screen objects. Its experimenter-friendly interface provides control over detailed parameters of the task environment without need for programming expertise. Support for eye-tracking and computational cognitive modeling extend the paradigm's scope.

  3. Evidence from mixed hydrate nucleation for a funnel model of crystallization.

    PubMed

    Hall, Kyle Wm; Carpendale, Sheelagh; Kusalik, Peter G

    2016-10-25

    The molecular-level details of crystallization remain unclear for many systems. Previous work has speculated on the phenomenological similarities between molecular crystallization and protein folding. Here we demonstrate that molecular crystallization can involve funnel-shaped potential energy landscapes through a detailed analysis of mixed gas hydrate nucleation, a prototypical multicomponent crystallization process. Through this, we contribute both: (i) a powerful conceptual framework for exploring and rationalizing molecular crystallization, and (ii) an explanation of phenomenological similarities between protein folding and crystallization. Such funnel-shaped potential energy landscapes may be typical of broad classes of molecular ordering processes, and can provide a new perspective for both studying and understanding these processes.

  4. Evidence from mixed hydrate nucleation for a funnel model of crystallization

    PubMed Central

    Hall, Kyle Wm.; Carpendale, Sheelagh; Kusalik, Peter G.

    2016-01-01

    The molecular-level details of crystallization remain unclear for many systems. Previous work has speculated on the phenomenological similarities between molecular crystallization and protein folding. Here we demonstrate that molecular crystallization can involve funnel-shaped potential energy landscapes through a detailed analysis of mixed gas hydrate nucleation, a prototypical multicomponent crystallization process. Through this, we contribute both: (i) a powerful conceptual framework for exploring and rationalizing molecular crystallization, and (ii) an explanation of phenomenological similarities between protein folding and crystallization. Such funnel-shaped potential energy landscapes may be typical of broad classes of molecular ordering processes, and can provide a new perspective for both studying and understanding these processes. PMID:27790987

  5. SemanticSCo: A platform to support the semantic composition of services for gene expression analysis.

    PubMed

    Guardia, Gabriela D A; Ferreira Pires, Luís; da Silva, Eduardo G; de Farias, Cléver R G

    2017-02-01

    Gene expression studies often require the combined use of a number of analysis tools. However, manual integration of analysis tools can be cumbersome and error prone. To support a higher level of automation in the integration process, efforts have been made in the biomedical domain towards the development of semantic web services and supporting composition environments. Yet, most environments consider only the execution of simple service behaviours and require users to focus on technical details of the composition process. We propose a novel approach to the semantic composition of gene expression analysis services that addresses the shortcomings of the existing solutions. Our approach includes an architecture designed to support the service composition process for gene expression analysis, and a flexible strategy for the (semi) automatic composition of semantic web services. Finally, we implement a supporting platform called SemanticSCo to realize the proposed composition approach and demonstrate its functionality by successfully reproducing a microarray study documented in the literature. The SemanticSCo platform provides support for the composition of RESTful web services semantically annotated using SAWSDL. Our platform also supports the definition of constraints/conditions regarding the order in which service operations should be invoked, thus enabling the definition of complex service behaviours. Our proposed solution for semantic web service composition takes into account the requirements of different stakeholders and addresses all phases of the service composition process. It also provides support for the definition of analysis workflows at a high level of abstraction, thus enabling users to focus on biological research issues rather than on the technical details of the composition process. The SemanticSCo source code is available at https://github.com/usplssb/SemanticSCo. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. USB environment measurements based on full-scale static engine ground tests. [Upper Surface Blowing for YC-14

    NASA Technical Reports Server (NTRS)

    Sussman, M. B.; Harkonen, D. L.; Reed, J. B.

    1976-01-01

    Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive-lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data and to establish a basis for future flight test comparisons.

  7. Capturing Fine Details Involving Low-Cost Sensors -a Comparative Study

    NASA Astrophysics Data System (ADS)

    Rehany, N.; Barsi, A.; Lovas, T.

    2017-11-01

    Capturing the fine details on the surface of small objects is a real challenge to many conventional surveying methods. Our paper discusses the investigation of several data acquisition technologies, such as arm scanner, structured light scanner, terrestrial laser scanner, object line-scanner, DSLR camera, and mobile phone camera. A palm-sized embossed sculpture reproduction was used as a test object; it has been surveyed by all the instruments. The resulting point clouds and meshes were then analyzed, using the arm scanner's dataset as the reference. In addition to general statistics, the results have been evaluated based on both 3D deviation maps and 2D deviation graphs; the latter allow an even more accurate analysis of the characteristics of the different data acquisition approaches. Additionally, custom-developed local-minimum maps were created that clearly visualize the potential level of detail provided by the applied technologies. Besides the usual geometric assessment, the paper discusses the different resource needs (cost, time, expertise) of the techniques. Our results show that even consumer-grade sensors operated by non-expert users can provide high-quality datasets that enable engineering analysis. Based on the results, the paper closes with an outlook on potential future investigations in this field.

  8. Cost/Effort Drivers and Decision Analysis

    NASA Technical Reports Server (NTRS)

    Seidel, Jonathan

    2010-01-01

    Engineering trade study analyses demand consideration of performance, cost and schedule impacts across the spectrum of alternative concepts and in direct reference to product requirements. Prior to detailed design, requirements are too often ill-defined (only goals) and prone to creep, extending well beyond the Systems Requirements Review. Though the lack of engineering design detail and definitive requirements inhibits the ability to perform detailed cost analyses, affordability trades still form the foundation of these future product decisions and must evolve in concert. This presentation excerpts results of the recent NASA subsonic Engine Concept Study for an Advanced Single Aisle Transport to demonstrate an affordability evaluation of performance characteristics and the subsequent impacts on engine architecture decisions. Applying the Process Based Economic Analysis Tool (PBEAT), development cost, production cost, as well as operation and support costs were considered in a traditional weighted ranking of the following system-level figures of merit: mission fuel burn, take-off noise, NOx emissions, and cruise speed. Weighting factors were varied to ascertain the architecture ranking sensitivities to these performance figures of merit with companion cost considerations. A more detailed examination of supersonic variable cycle engine cost is also briefly presented, with observations and recommendations for further refinements.

  9. 13. DETAIL, IRON STAIR RAILING, STREET LEVEL TO GROUND FLOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. DETAIL, IRON STAIR RAILING, STREET LEVEL TO GROUND FLOOR LEVEL (2 x 2 negative; 5 x 7 print) - Patent Office Building, Bounded by Seventh, Ninth, F & G Streets, Northwest, Washington, District of Columbia, DC

  10. Unmanned vehicles for maritime spill response case study: Exercise Cathach.

    PubMed

    Dooly, Gerard; Omerdic, Edin; Coleman, Joseph; Miller, Liam; Kaknjo, Admir; Hayes, James; Braga, Jóse; Ferreira, Filipe; Conlon, Hugh; Barry, Hugh; Marcos-Olaya, Jesús; Tuohy, Thomas; Sousa, João; Toal, Dan

    2016-09-15

    This paper deals with two aspects, namely a historical analysis of the use of unmanned vehicles (UAVs, ROVs, AUVs) in maritime spill incidents and a detailed description of a multi-agency oil and HNS incident response exercise involving the integration and analysis of unmanned vehicles' environmental sensing equipment. The exercise was a first in terms of the level of robotic systems deployed to assist in survey, surveillance and inspection roles for oil spills and harmful and noxious substances. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Texture Analysis of Recurrence Plots Based on Wavelets and PSO for Laryngeal Pathologies Detection.

    PubMed

    Souza, Taciana A; Vieira, Vinícius J D; Correia, Suzete E N; Costa, Silvana L N C; de A Costa, Washington C; Souza, Micael A

    2015-01-01

    This paper deals with the discrimination between healthy and pathological speech signals using recurrence plots and wavelet transform with texture features. Approximation and detail coefficients are obtained from the recurrence plots using Haar wavelet transform, considering one decomposition level. The considered laryngeal pathologies are: paralysis, Reinke's edema and nodules. Accuracy rates above 86% were obtained by means of the employed method.
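
    A minimal sketch of the pipeline described above (recurrence plot, one-level 2D Haar decomposition, simple texture statistics per sub-band); the recurrence threshold, test signal and feature set are assumptions, and the classifier stage is omitted.

    ```python
    import numpy as np
    import pywt

    def recurrence_plot(signal, eps):
        """Binary recurrence plot: R[i, j] = 1 when samples i and j are within eps."""
        distances = np.abs(signal[:, None] - signal[None, :])
        return (distances < eps).astype(float)

    def haar_texture_features(rp):
        """One-level 2D Haar decomposition, then simple statistics per sub-band."""
        cA, (cH, cV, cD) = pywt.dwt2(rp, "haar")
        features = []
        for band in (cA, cH, cV, cD):
            features += [band.mean(), band.std(), np.mean(band ** 2)]  # mean, spread, energy
        return np.array(features)

    frame = np.sin(np.linspace(0, 8 * np.pi, 256))   # stand-in for a voiced speech frame
    features = haar_texture_features(recurrence_plot(frame, eps=0.1))
    ```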

  12. The Economics of Higher Education: A Report Prepared by the Department of the Treasury with the Department of Education

    ERIC Educational Resources Information Center

    US Department of the Treasury, 2012

    2012-01-01

    This report discusses the current state of higher education, with a brief high-level overview of the market and a more detailed discussion and analysis of the financial aid system. It also discusses the important changes President Obama has made to make higher education more accessible and affordable. The key findings are: (1) The economic returns…

  13. Preliminary candidate advanced avionics system for general aviation

    NASA Technical Reports Server (NTRS)

    Mccalla, T. M.; Grismore, F. L.; Greatline, S. E.; Birkhead, L. M.

    1977-01-01

    An integrated avionics system design was carried out to a level that indicates subsystem functions and the methods of overall system integration. Sufficient detail was included to allow identification of possible system component technologies and to perform reliability, modularity, maintainability, cost, and risk analyses on the system design. Retrofit to older aircraft and the availability of this system for single-engine, two-place aircraft were also considered.

  14. Systems Analysis of Amphibious Landing Craft: Comparisons of Preliminary Designs of Advanced Landing Craft

    DTIC Science & Technology

    the SRI program GAMUT, which is a simulation covering much the same ground as the STS-2 package but with a great reduction in the level of detail...that is considered. It provides the means of rapidly and cheaply changing the input conditions and operating procedures used in the simulation. Selected preliminary results of the GAMUT model are given.

  15. The Nature of the Nodes, Weights and Degree of Precision in Gaussian Quadrature Rules

    ERIC Educational Resources Information Center

    Prentice, J. S. C.

    2011-01-01

    We present a comprehensive proof of the theorem that relates the weights and nodes of a Gaussian quadrature rule to its degree of precision. This level of detail is often absent in modern texts on numerical analysis. We show that the degree of precision is maximal, and that the approximation error in Gaussian quadrature is minimal, in a…
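
    For reference, the standard result the abstract refers to can be stated compactly: an n-point Gaussian rule, with nodes at the roots of the degree-n polynomial orthogonal with respect to the weight w, has degree of precision 2n-1 and positive weights.

    ```latex
    \[
      \int_a^b w(x)\,p(x)\,dx = \sum_{i=1}^{n} w_i\,p(x_i)
      \qquad \text{for all } p \in \mathbb{P}_{2n-1},
    \]
    \[
      w_i = \int_a^b w(x)\prod_{\substack{j=1\\ j\neq i}}^{n}\frac{x-x_j}{x_i-x_j}\,dx > 0 .
    \]
    ```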

  16. Improving Analytical Characterization of Glycoconjugate Vaccines through Combined High-Resolution MS and NMR: Application to Neisseria meningitidis Serogroup B Oligosaccharide-Peptide Glycoconjugates.

    PubMed

    Yu, Huifeng; An, Yanming; Battistel, Marcos D; Cipollo, John F; Freedberg, Darón I

    2018-04-17

    Conjugate vaccines are highly heterogeneous in terms of glycosylation sites and linked oligosaccharide length. Therefore, the characterization of conjugate vaccines' glycosylation state is challenging. However, improved product characterization can lead to enhancements in product control and product quality. Here, we present a synergistic combination of high-resolution mass spectrometry (MS) and nuclear magnetic resonance spectroscopy (NMR) for the analysis of glycoconjugates. We use the power of this strategy to characterize model polysaccharide conjugates and to demonstrate a detailed level of glycoproteomic analysis. These are first steps on model compounds that will help untangle the details of complex product characterization in conjugate vaccines. Ultimately, this strategy can be applied to enhance the characterization of polysaccharide conjugate vaccines. In this study, we lay the groundwork for the analysis of conjugate vaccines. To begin this effort, oligosaccharide-peptide conjugates were synthesized by periodate oxidation of an oligosaccharide of a defined length, α,2-8 sialic acid trimer, followed by a reductive amination, and linking the trimer to an immunogenic peptide from tetanus toxoid. Combined mass spectrometry and nuclear magnetic resonance were used to monitor each reaction and conjugation products. Complete NMR peak assignment and detailed MS information on oxidized oligosialic acid and conjugates are reported. These studies provide a deeper understanding of the conjugation chemistry process and products, which can lead to a better controlled production process.

  17. CERES Product Level Details

    Atmospheric Science Data Center

    2013-02-28

    ... CERES Product Level Details   Level 1B:  Data products are processed to sensor units. The BDS product contains CERES ... position and velocity, and all raw engineering and status data from the instrument. Level 2:  Data products are derived ... between average global net TOA flux imbalance and ocean heat storage). ...

  18. 14. DETAIL, IRON STAIR RAILING, STREET LEVEL TO GROUND FLOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. DETAIL, IRON STAIR RAILING, STREET LEVEL TO GROUND FLOOR LEVEL, SHOWING NEWEL POST (2 x 2 negative; 5 x 7 print) - Patent Office Building, Bounded by Seventh, Ninth, F & G Streets, Northwest, Washington, District of Columbia, DC

  19. Proteomics Analysis of Tissue Samples Reveals Changes in Mitochondrial Protein Levels in Parathyroid Hyperplasia over Adenoma

    PubMed Central

    AKPINAR, GURLER; KASAP, MURAT; CANTURK, NUH ZAFER; ZULFIGAROVA, MEHIN; ISLEK, EYLÜL ECE; GULER, SERTAC ATA; SIMSEK, TURGAY; CANTURK, ZEYNEP

    2017-01-01

    Background/Aim: To unveil the pathophysiology of primary hyperparathyroidism, molecular details of parathyroid hyperplasia and adenoma have to be revealed. Such details will provide the tools necessary for differentiation of these two look-alike diseases. Therefore, in the present study, a comparative proteomic study using postoperative tissue samples from the parathyroid adenoma and parathyroid hyperplasia patients was performed. Materials and Methods: Protein extracts were prepared from tissue samples (n=8 per group). Protein pools were created for each group and subjected to DIGE and conventional 2DE. Following image analysis, spots representing the differentially regulated proteins were excised from the gels and used for identification via MALDI-TOF/TOF analysis. Results: The identities of 40 differentially-expressed proteins were revealed. Fourteen of these proteins were over-expressed in the hyperplasia while 26 of them were over-expressed in the adenoma. Conclusion: Most proteins found to be over-expressed in the hyperplasia samples were mitochondrial, underscoring the importance of mitochondrial activity as a potential biomarker for differentiation of parathyroid hyperplasia from adenoma. PMID:28446534

  20. 13C cell wall enrichment and ionic liquid NMR analysis: progress towards a high-throughput detailed chemical analysis of the whole plant cell wall.

    PubMed

    Foston, Marcus; Samuel, Reichel; Ragauskas, Arthur J

    2012-09-07

    The ability to accurately and rapidly measure plant cell wall composition, relative monolignol content and lignin-hemicellulose inter-unit linkage distributions has become essential to efforts centered on reducing the recalcitrance of biomass by genetic engineering. Growing (13)C enriched transgenic plants is a viable route to achieve the high-throughput, detailed chemical analysis of the whole plant cell wall before and after pretreatment and microbial or enzymatic utilization by (13)C nuclear magnetic resonance (NMR) in a perdeuterated ionic liquid solvent system not requiring component isolation. 1D (13)C whole cell wall ionic liquid NMR of natural abundance and (13)C enriched corn stover stem samples suggests that a high level of uniform labeling (>97%) can significantly reduce the total NMR experiment time, by up to ~220 times. Similarly, a significant reduction in total NMR experiment time (~39 times) was found for 2D (13)C-(1)H heteronuclear single quantum coherence NMR of the (13)C enriched corn stover stem samples.

  1. Analytical Prediction of the Seismic Response of a Reinforced Concrete Containment Vessel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, R.J.; Rashid, Y.R.; Cherry, J.L.

    Under the sponsorship of the Ministry of International Trade and Industry (MITI) of Japan, the Nuclear Power Engineering Corporation (NUPEC) is investigating the seismic behavior of a Reinforced Concrete Containment Vessel (RCCV) through scale-model testing using the high-performance shaking table at the Tadotsu Engineering Laboratory. A series of tests representing design-level seismic ground motions was initially conducted to gather valuable experimental measurements for use in design verification. Additional tests will be conducted with increasing amplifications of the seismic input until a structural failure of the test model occurs. In a cooperative program with NUPEC, the US Nuclear Regulatory Commission (USNRC), through Sandia National Laboratories (SNL), is conducting analytical research on the seismic behavior of RCCV structures. As part of this program, pretest analytical predictions of the model tests are being performed. The dynamic time-history analysis utilizes a highly detailed concrete constitutive model applied to a three-dimensional finite element representation of the test structure. This paper describes the details of the analysis model and provides analysis results.

  2. Sensor image prediction techniques

    NASA Astrophysics Data System (ADS)

    Stenger, A. J.; Stone, W. R.; Berry, L.; Murray, T. J.

    1981-02-01

    The preparation of prediction imagery is a complex, costly, and time consuming process. Image prediction systems which produce a detailed replica of the image area require the extensive Defense Mapping Agency data base. The purpose of this study was to analyze the use of image predictions in order to determine whether a reduced set of more compact image features contains enough information to produce acceptable navigator performance. A job analysis of the navigator's mission tasks was performed. It showed that the cognitive and perceptual tasks he performs during navigation are identical to those performed for the targeting mission function. In addition, the results of the analysis of his performance when using a particular sensor can be extended to the analysis of his mission tasks using any sensor. An experimental approach was used to determine the relationship between navigator performance and the type and amount of information in the prediction image. A number of subjects were given image predictions containing varying levels of scene detail and different image features, and then asked to identify the predicted targets in corresponding dynamic flight sequences over scenes of cultural, terrain, and mixed (both cultural and terrain) content.

  3. Destructive physical analysis of hollow cathodes from the Deep Space 1 Flight spare ion engine 30,000 hr life test

    NASA Technical Reports Server (NTRS)

    Sengupta, Anita

    2005-01-01

    Destructive physical analysis of the discharge and neutralizer hollow cathode assemblies from the Deep Space 1 Flight Spare 30,000 Hr life test was performed to characterize physical and chemical evidence of operationally induced effects after 30,372 hours of operation with beam extraction. Post-test inspection of the discharge-cathode assembly was subdivided into detailed analyses at the subcomponent level. Detailed materials analysis and optical inspection of the insert, orifice plate, cathode tube, heater, keeper assembly, insulator, and low-voltage propellant isolator were performed. Energy dispersive X-ray (EDX) and scanning electron microscopy (SEM) analyses were used to determine the extent and composition of regions of net deposition and erosion of both the discharge and neutralizer inserts. A comparative approach with an un-operated 4:1:1 insert was used to determine the extent of impregnant material depletion as a function of depth from the ID surface and axial position from the orifice plate. Analysis results are compared and contrasted with those obtained from similar analyses on components from shorter term tests, and provide insight regarding the prospect for successful longer-term operation consistent with SOA ion engine program life objectives at NASA.

  4. Clinical professional governance for detailed clinical models.

    PubMed

    Goossen, William; Goossen-Baremans, Anneke

    2013-01-01

    This chapter describes the need for Detailed Clinical Models for contemporary Electronic Health Systems, data exchange and data reuse. It starts with an explanation of the components related to Detailed Clinical Models with a brief summary of knowledge representation, including terminologies representing clinically relevant "things" in the real world, and information models that abstract these in order to let computers process data about these things. Next, Detailed Clinical Models are defined and their purpose is described. It builds on existing developments around the world and culminates in current work to create a technical specification at the level of the International Standards Organization. The core components of properly expressed Detailed Clinical Models are illustrated, including clinical knowledge and context, data element specification, code bindings to terminologies, and meta-information about authors and versioning, among others. Detailed Clinical Models to date are heavily based on user requirements and specify the conceptual and logical levels of modelling. This is not precise enough for specific implementations, which require an additional step. However, this allows Detailed Clinical Models to serve as specifications for many different kinds of implementations. Examples of Detailed Clinical Models are presented both in text and in Unified Modelling Language. Detailed Clinical Models can be positioned in health information architectures, where they serve at the most detailed granular level. The chapter ends with examples of projects that create and deploy Detailed Clinical Models. All have in common that they can often reuse materials from earlier projects, and that strict governance of these models is essential to use them safely in health care information and communication technology. Clinical validation is one point of such governance, and model testing another. The Plan Do Check Act cycle can be applied for governance of Detailed Clinical Models. Finally, collections of clinical models do require a repository in which they can be stored, searched, and maintained. Governance of Detailed Clinical Models is required at local, national, and international levels.

  5. To acquire more detailed radiation drive by use of ``quasi-steady'' approximation in atomic kinetics

    NASA Astrophysics Data System (ADS)

    Ren, Guoli; Pei, Wenbing; Lan, Ke; Gu, Peijun; Li, Xin

    2012-10-01

    In current routine 2D simulations of hohlraum physics, we adopt the principal-quantum-number (n-level) average atom model (AAM) for the NLTE plasma description. However, the detailed experimental frequency-dependent radiative drive differs from our n-level simulated drive, which points to the need for a more detailed atomic kinetics description. The orbital-quantum-number (nl-level) average atom model is a natural consideration; however, the nl-level in-line calculation needs much more computational resources. By distinguishing the rapid bound-bound atomic processes from the relatively slow bound-free atomic processes, we found a method to build up a more detailed bound-electron distribution (nl-level or even nlm-level) using the in-line n-level calculated plasma conditions (temperature, density, and average ionization degree). We name this method the ``quasi-steady approximation'' in atomic kinetics. Using this method, we re-build the nl-level bound-electron distribution (Pnl) and acquire a new hohlraum radiative drive by post-processing. Comparison with the n-level post-processed hohlraum drive shows that we get an almost identical radiation flux but with finer frequency-dependent spectral structure, which appears only in nl-level transitions with the same n number (Δn = 0).

  6. Correlation of contrast-detail analysis and clinical image quality assessment in chest radiography with a human cadaver study.

    PubMed

    De Crop, An; Bacher, Klaus; Van Hoof, Tom; Smeets, Peter V; Smet, Barbara S; Vergauwen, Merel; Kiendys, Urszula; Duyck, Philippe; Verstraete, Koenraad; D'Herde, Katharina; Thierens, Hubert

    2012-01-01

    To determine the correlation between the clinical and physical image quality of chest images by using cadavers embalmed with the Thiel technique and a contrast-detail phantom. The use of human cadavers fulfilled the requirements of the institutional ethics committee. Clinical image quality was assessed by using three human cadavers embalmed with the Thiel technique, which results in excellent preservation of the flexibility and plasticity of organs and tissues. As a result, lungs can be inflated during image acquisition to simulate the pulmonary anatomy seen on a chest radiograph. Both contrast-detail phantom images and chest images of the Thiel-embalmed bodies were acquired with an amorphous silicon flat-panel detector. Tube voltage (70, 81, 90, 100, 113, 125 kVp), copper filtration (0.1, 0.2, 0.3 mm Cu), and exposure settings (200, 280, 400, 560, 800 speed class) were altered to simulate different quality levels. Four experienced radiologists assessed the image quality by using a visual grading analysis (VGA) technique based on European Quality Criteria for Chest Radiology. The phantom images were scored manually and automatically with use of dedicated software, both resulting in an inverse image quality figure (IQF). Spearman rank correlations between inverse IQFs and VGA scores were calculated. A statistically significant correlation (r = 0.80, P < .01) was observed between the VGA scores and the manually obtained inverse IQFs. Comparison of the VGA scores and the automated evaluated phantom images showed an even better correlation (r = 0.92, P < .001). The results support the value of contrast-detail phantom analysis for evaluating clinical image quality in chest radiography. © RSNA, 2011.
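
    The correlation step reported above can be reproduced in outline with a rank-correlation routine. The sketch below uses hypothetical VGA scores and inverse IQF values (illustrative numbers, not the study's data) and SciPy's spearmanr.

      import numpy as np
      from scipy.stats import spearmanr

      # Hypothetical results for nine acquisition settings: visual grading analysis (VGA)
      # scores from the observers and inverse image quality figures (IQF) from the phantom.
      vga_scores   = np.array([0.42, 0.55, 0.61, 0.68, 0.74, 0.80, 0.83, 0.88, 0.91])
      inverse_iqfs = np.array([1.9,  2.3,  2.2,  2.8,  3.1,  3.0,  3.4,  3.7,  3.9])

      # Spearman rank correlation between clinical (VGA) and physical (inverse IQF) image quality.
      rho, p_value = spearmanr(vga_scores, inverse_iqfs)
      print(f"Spearman rho = {rho:.2f}, P = {p_value:.3f}")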

  7. Association of Lead Levels and Cerebral Palsy

    PubMed Central

    Bansal, Neha; Aggarwal, Anju; Faridi, M. M. A.; Sharma, Tusha; Baneerjee, B. D.

    2017-01-01

    Background: Cerebral palsy is a common motor disability in childhood. Raised lead levels affect cognition. Children with cerebral palsy may have raised lead levels, further impairing their residual cognitive, motor, and behavioral abilities. Environmental exposure and abnormal eating habits may lead to increased lead levels. Aims and Objectives: To measure blood lead levels in children with cerebral palsy and compare them with healthy neurologically normal children. To correlate blood lead levels with environmental factors. Material and Methods: Design: Prospective case-control study. Setting: Tertiary care hospital. Participants: Cases comprised 34 children with cerebral palsy, and controls comprised 34 neurologically normal, age- and sex-matched children. Methods: Clinical and demographic details were recorded as per proforma. A detailed environmental history was recorded to identify the source of lead exposure. These children were investigated and treated as per protocol. Venous blood was collected in ethylenediaminetetraacetic acid vials for analysis of blood lead levels. Lead levels were estimated with a Shimadzu Flame AA-6800 (atomic absorption spectrophotometer). Data were analyzed using SPSS version 17. P < .05 was taken as significant. Results: Mean blood lead levels were 9.20 ± 8.31 µg/dL in cerebral palsy cases and 2.89 ± 3.04 µg/dL in their controls (P < .001). Among children with cerebral palsy, 19 (55.88%) children had blood lead levels ≥5 µg/dL. Lead levels in children with pica were 12.33 ± 10.02 µg/dL in comparison to children with no history of pica, 6.70 ± 4.60 µg/dL (P = .029). No correlation was found between hemoglobin and blood lead levels in cases and controls. Conclusion: In our study, blood lead levels are raised in children with cerebral palsy. However, further studies are required to show effects of raised levels in these children. PMID:28491920
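
    As an illustration of the kind of group comparison reported above, the sketch below generates hypothetical case and control lead levels with roughly the quoted means and standard deviations and compares them with Welch's t-test; the paper does not state which statistical test was run in SPSS, so the choice of test and the simulated values are assumptions.

      import numpy as np
      from scipy.stats import ttest_ind

      rng = np.random.default_rng(3)

      # Hypothetical blood lead levels (ug/dL) for 34 cases and 34 controls,
      # generated only to illustrate the comparison; not the study's data.
      cases    = np.clip(rng.normal(9.2, 8.3, 34), 0.5, None)
      controls = np.clip(rng.normal(2.9, 3.0, 34), 0.5, None)

      t_stat, p_value = ttest_ind(cases, controls, equal_var=False)  # Welch's t-test
      print(f"cases {cases.mean():.1f} +/- {cases.std(ddof=1):.1f} ug/dL, "
            f"controls {controls.mean():.1f} +/- {controls.std(ddof=1):.1f} ug/dL, P = {p_value:.4f}")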

  8. SCOWLP classification: Structural comparison and analysis of protein binding regions

    PubMed Central

    Teyra, Joan; Paszkowski-Rogacz, Maciej; Anders, Gerd; Pisabarro, M Teresa

    2008-01-01

    Background Detailed information about protein interactions is critical for our understanding of the principles governing protein recognition mechanisms. The structures of many proteins have been experimentally determined in complex with different ligands bound either in the same or different binding regions. Thus, the structural interactome requires the development of tools to classify protein binding regions. A proper classification may provide a general view of the regions that a protein uses to bind others and also facilitate a detailed comparative analysis of the interacting information for specific protein binding regions at the atomic level. Such classification might be of potential use for deciphering protein interaction networks, understanding protein function, rational engineering and design. Description Protein binding regions (PBRs) might be ideally described as well-defined separated regions that share no interacting residues with one another. However, PBRs are often irregular, discontinuous and can share a wide range of interacting residues among them. The criteria to define an individual binding region can often be arbitrary and may differ from other binding regions within a protein family. Therefore, the rationale behind protein interface classification should aim to fulfil the requirements of the analysis to be performed. We extract detailed interaction information of protein domains, peptides and interfacial solvent from the SCOWLP database and we classify the PBRs of each domain family. For this purpose, we define a similarity index based on the overlapping of interacting residues mapped in pair-wise structural alignments. We perform our classification with agglomerative hierarchical clustering using the complete-linkage method. Our classification is calculated at different similarity cut-offs to allow flexibility in the analysis of PBRs, a feature especially interesting for those protein families with conflictive binding regions. The hierarchical classification of PBRs is implemented into the SCOWLP database and extends the SCOP classification with three additional family sub-levels: Binding Region, Interface and Contacting Domains. SCOWLP contains 9,334 binding regions distributed within 2,561 families. In 65% of the cases we observe families containing more than one binding region. In addition, 22% of the regions form complexes with more than one protein family. Conclusion The current SCOWLP classification and its web application represent a framework for the study of protein interfaces and comparative analysis of protein family binding regions. This comparison can be performed at the atomic level and allows the user to study interactome conservation and variability. The new SCOWLP classification may be of great utility for reconstruction of protein complexes, understanding protein networks and ligand design. SCOWLP will be updated with every SCOP release. The web application is available at . PMID:18182098
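
    A minimal sketch of the clustering step described above, assuming a simple residue-overlap index as a stand-in for the SCOWLP similarity measure: pair-wise distances are derived from hypothetical interface residue sets and fed to SciPy's complete-linkage agglomerative clustering, with the tree cut at two different similarity cut-offs. The interface names and residue sets are invented for illustration.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      # Hypothetical interfaces: each maps to its set of interacting residue positions
      # (after pair-wise structural alignment of the family members).
      interfaces = {
          "ifaceA": {10, 11, 14, 15, 18, 42},
          "ifaceB": {10, 11, 14, 18, 43},
          "ifaceC": {70, 71, 74, 75},
          "ifaceD": {70, 74, 75, 78, 79},
      }
      names = list(interfaces)

      def overlap_similarity(a, b):
          """Fraction of shared interacting residues (an assumed stand-in for the SCOWLP index)."""
          return len(a & b) / len(a | b)

      n = len(names)
      dist = np.zeros((n, n))
      for i in range(n):
          for j in range(i + 1, n):
              sim = overlap_similarity(interfaces[names[i]], interfaces[names[j]])
              dist[i, j] = dist[j, i] = 1.0 - sim

      # Complete-linkage agglomerative clustering on the condensed distance matrix.
      tree = linkage(squareform(dist), method="complete")

      # Cutting the tree at different similarity cut-offs yields coarser or finer binding regions.
      for similarity_cutoff in (0.3, 0.7):
          labels = fcluster(tree, t=1.0 - similarity_cutoff, criterion="distance")
          print(similarity_cutoff, dict(zip(names, labels)))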

  9. Considerations for Reporting Finite Element Analysis Studies in Biomechanics

    PubMed Central

    Erdemir, Ahmet; Guess, Trent M.; Halloran, Jason; Tadepalli, Srinivas C.; Morrison, Tina M.

    2012-01-01

    Simulation-based medicine and the development of complex computer models of biological structures is becoming ubiquitous for advancing biomedical engineering and clinical research. Finite element analysis (FEA) has been widely used in the last few decades to understand and predict biomechanical phenomena. Modeling and simulation approaches in biomechanics are highly interdisciplinary, involving novice and skilled developers in all areas of biomedical engineering and biology. While recent advances in model development and simulation platforms offer a wide range of tools to investigators, the decision making process during modeling and simulation has become more opaque. Hence, reliability of such models used for medical decision making and for driving multiscale analysis comes into question. Establishing guidelines for model development and dissemination is a daunting task, particularly with the complex and convoluted models used in FEA. Nonetheless, if better reporting can be established, researchers will have a better understanding of a model’s value and the potential for reusability through sharing will be bolstered. Thus, the goal of this document is to identify resources and considerate reporting parameters for FEA studies in biomechanics. These entail various levels of reporting parameters for model identification, model structure, simulation structure, verification, validation, and availability. While we recognize that it may not be possible to provide and detail all of the reporting considerations presented, it is possible to establish a level of confidence with selective use of these parameters. More detailed reporting, however, can establish an explicit outline of the decision-making process in simulation-based analysis for enhanced reproducibility, reusability, and sharing. PMID:22236526

  10. Tethered Satellite System Contingency Investigation Board

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Tethered Satellite System (TSS-1) was launched aboard the Space Shuttle Atlantis (STS-46) on July 31, 1992. During the attempted on-orbit operations, the Tethered Satellite System failed to deploy successfully beyond 256 meters. The satellite was retrieved successfully and was returned on August 6, 1992. The National Aeronautics and Space Administration (NASA) Associate Administrator for Space Flight formed the Tethered Satellite System (TSS-1) Contingency Investigation Board on August 12, 1992. The TSS-1 Contingency Investigation Board was asked to review the anomalies which occurred, to determine the probable cause, and to recommend corrective measures to prevent recurrence. The board was supported by the TSS Systems Working group as identified in MSFC-TSS-11-90, 'Tethered Satellite System (TSS) Contingency Plan'. The board identified five anomalies for investigation: initial failure to retract the U2 umbilical; initial failure to flyaway; unplanned tether deployment stop at 179 meters; unplanned tether deployment stop at 256 meters; and failure to move tether in either direction at 224 meters. Initial observations of the returned flight hardware revealed evidence of mechanical interference by a bolt with the level wind mechanism travel as well as a helical shaped wrap of tether which indicated that the tether had been unwound from the reel beyond the travel by the level wind mechanism. Examination of the detailed mission events from flight data and mission logs related to the initial failure to flyaway and the failure to move in either direction at 224 meters, together with known preflight concerns regarding slack tether, focused the assessment of these anomalies on the upper tether control mechanism. After the second meeting, the board requested the working group to complete and validate a detailed integrated mission sequence to focus the fault tree analysis on a stuck U2 umbilical, level wind mechanical interference, and slack tether in upper tether control mechanism and to prepare a detailed plan for hardware inspection, test, and analysis including any appropriate hardware disassembly.

  11. Tethered Satellite System Contingency Investigation Board

    NASA Astrophysics Data System (ADS)

    1992-11-01

    The Tethered Satellite System (TSS-1) was launched aboard the Space Shuttle Atlantis (STS-46) on July 31, 1992. During the attempted on-orbit operations, the Tethered Satellite System failed to deploy successfully beyond 256 meters. The satellite was retrieved successfully and was returned on August 6, 1992. The National Aeronautics and Space Administration (NASA) Associate Administrator for Space Flight formed the Tethered Satellite System (TSS-1) Contingency Investigation Board on August 12, 1992. The TSS-1 Contingency Investigation Board was asked to review the anomalies which occurred, to determine the probable cause, and to recommend corrective measures to prevent recurrence. The board was supported by the TSS Systems Working group as identified in MSFC-TSS-11-90, 'Tethered Satellite System (TSS) Contingency Plan'. The board identified five anomalies for investigation: initial failure to retract the U2 umbilical; initial failure to flyaway; unplanned tether deployment stop at 179 meters; unplanned tether deployment stop at 256 meters; and failure to move tether in either direction at 224 meters. Initial observations of the returned flight hardware revealed evidence of mechanical interference by a bolt with the level wind mechanism travel as well as a helical shaped wrap of tether which indicated that the tether had been unwound from the reel beyond the travel by the level wind mechanism. Examination of the detailed mission events from flight data and mission logs related to the initial failure to flyaway and the failure to move in either direction at 224 meters, together with known preflight concerns regarding slack tether, focused the assessment of these anomalies on the upper tether control mechanism. After the second meeting, the board requested the working group to complete and validate a detailed integrated mission sequence to focus the fault tree analysis on a stuck U2 umbilical, level wind mechanical interference, and slack tether in upper tether control mechanism and to prepare a detailed plan for hardware inspection, test, and analysis including any appropriate hardware disassembly.

  12. Modelling near field regional uplift patterns in West Greenland/Disko Bay with plane-Earth finite element models.

    NASA Astrophysics Data System (ADS)

    Meldgaard, Asger; Nielsen, Lars; Iaffaldano, Giampiero

    2017-04-01

    Relative sea level data, primarily obtained through isolation basin analysis in western Greenland and on Disko Island, indicates asynchronous rates of uplift during the Early Holocene with larger rates of uplift in southern Disko Bay compared to the northern part of the bay. Similar short-wavelength variations can be inferred from the Holocene marine limit as observations on the north and south side of Disko Island differ by as much as 60 m. While global isostatic adjustment models are needed to account for far field contributions to the relative sea level and for the calculation of accurate ocean functions, they are generally not suited for a detailed analysis of the short-wavelength uplift patterns observed close to present ice margins. This is in part due to the excessive computational cost required for sufficient resolution, and because these models generally ignore regional lateral heterogeneities in mantle and lithosphere rheology. To mitigate this problem, we perform sensitivity tests to investigate the effects of near field loading on a regional plane-Earth finite element model of the lithosphere and mantle of the Disko Bay area, where the global isostatic uplift chronology is well documented. By loading the model area through detailed regional ocean function and ice models, and by including a high resolution topography model of the area, we seek to assess the isostatic rebound generated by surface processes with wavelengths similar to those of the observed rebound signal. We also investigate possible effects of varying lithosphere and mantle rheology, which may play an important role in explaining the rebound signal. We use the abundance of relative sea level curves obtained in the region primarily through isolation basin analysis on Disko Island to constrain the parameters of the Earth model.

  13. Orion Entry, Descent, and Landing Simulation

    NASA Technical Reports Server (NTRS)

    Hoelscher, Brian R.

    2007-01-01

    The Orion Entry, Descent, and Landing simulation was created over the past two years to serve as the primary Crew Exploration Vehicle guidance, navigation, and control (GN&C) design and analysis tool at the National Aeronautics and Space Administration (NASA). The Advanced NASA Technology Architecture for Exploration Studies (ANTARES) simulation is a six degree-of-freedom tool with a unique design architecture which has a high level of flexibility. This paper describes the decision history and motivations that guided the creation of this simulation tool. The capabilities of the models within ANTARES are presented in detail. Special attention is given to features of the highly flexible GN&C architecture and the details of the implemented GN&C algorithms. ANTARES provides a foundation simulation for the Orion Project that has already been successfully used for requirements analysis, system definition analysis, and preliminary GN&C design analysis. ANTARES will find useful application in engineering analysis, mission operations, crew training, avionics-in-the-loop testing, etc. This paper focuses on the entry simulation aspect of ANTARES, which is part of a bigger simulation package supporting the entire mission profile of the Orion vehicle. The unique aspects of entry GN&C design are covered, including how the simulation is being used for Monte Carlo dispersion analysis and for support of linear stability analysis. Sample simulation output from ANTARES is presented in an appendix.

  14. Single Upconversion Nanoparticle-Bacterium Cotrapping for Single-Bacterium Labeling and Analysis.

    PubMed

    Xin, Hongbao; Li, Yuchao; Xu, Dekang; Zhang, Yueli; Chen, Chia-Hung; Li, Baojun

    2017-04-01

    Detecting and analyzing pathogenic bacteria in an effective and reliable manner is crucial for the diagnosis of acute bacterial infection and initial antibiotic therapy. However, precise labeling and analysis of bacteria at the single-bacterium level are technically challenging but very important for revealing details about the heterogeneity of cells and their responses to the environment. This study demonstrates an optical strategy for single-bacterium labeling and analysis by the cotrapping of single upconversion nanoparticles (UCNPs) and bacteria together. A single UCNP with an average size of ≈120 nm is first optically trapped. Both ends of a single bacterium are then trapped and labeled with single UCNPs emitting green light. The labeled bacterium can be flexibly moved to designated locations for further analysis. Signals from bacteria of different sizes are detected in real time for single-bacterium analysis. This cotrapping method provides a new approach for single-pathogenic-bacterium labeling, detection, and real-time analysis at the single-particle and single-bacterium level. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Nuclear reactor descriptions for space power systems analysis

    NASA Technical Reports Server (NTRS)

    Mccauley, E. W.; Brown, N. J.

    1972-01-01

    For the small, high performance reactors required for space electric applications, adequate neutronic analysis is of crucial importance, but in terms of computational time consumed, nuclear calculations probably yield the least amount of detail for mission analysis study. It has been found possible, after generation of only a few designs of a reactor family in elaborate thermomechanical and nuclear detail, to use simple curve-fitting techniques to assure desired neutronic performance while still performing the thermomechanical analysis in explicit detail. The resulting speed-up in computation time permits a broad, detailed examination of constraints by the mission analyst.

  16. Evans hole and non linear optical activity in Bis(melaminium) sulphate dihydrate: A vibrational spectral study.

    PubMed

    Suresh Kumar, V R; Binoy, J; Dawn Dharma Roy, S; Marchewka, M K; Jayakumar, V S

    2015-01-01

    Bis(melaminium) sulphate dihydrate (BMSD), an interesting melaminium derivative for nonlinear optical activity, has been subjected to vibrational spectral analysis using FT IR and FT Raman spectra. The analysis has been aided by the Potential Energy Distribution (PED) of vibrational spectral bands, derived using density functional theory (DFT) at B3LYP/6-31G(d) level. The geometry is found to correlate well with the XRD structure and the band profiles for certain vibrations in the finger print region have been theoretically explained using Evans hole. The detailed Natural Bond Orbital (NBO) analysis of the hydrogen bonding in BMSD has also been carried out to understand the correlation between the stabilization energy of hyperconjugation of the lone pair of donor with the σ(∗) orbital of hydrogen-acceptor bond and the strength of hydrogen bond. The theoretical calculation shows that BMSD has NLO efficiency, 2.66 times that of urea. The frontier molecular orbital analysis points to a charge transfer, which contributes to NLO activity, through N-H…O intermolecular hydrogen bonding between the melaminium ring and the sulphate. The molecular electrostatic potential (MEP) mapping has also been performed for the detailed analysis of the mutual interactions between melaminium ring and sulphate ion. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Infrasound, Its Sources and Its Effects on Man

    DTIC Science & Technology

    1976-05-01

    modulated by an infrasonic frequency. For instance, the amplified ...quency sound. In general, infrasound does not often occur at levels ... essential for detailed analysis, and from these recordings a narrow band spectral ... Annoyance has been broken out as a separate topic because I believe that the greatest ... importance is the high frequency response of the measurement system. Measurement of infra... changes in barometric pressure would be considered infrasonic.

  18. Prototyping with Data Dictionaries for Requirements Analysis.

    DTIC Science & Technology

    1985-03-01

    statistical packages and software for screen layout. These items work at a higher level than another category of prototyping tool, program generators... Program generators are software packages which, when given specifications, produce source listings, usually in a high order language such as COBOL...with users and this will not happen if he must stop to develop a detailed program. [Ref. 241] Hardware as well as software should be considered in

  19. Analysis of the effects of combustion emissions and Santa Ana winds on ambient ozone during the October 2007 southern California wildfires

    Treesearch

    A. Bytnerowicz; D. Cayan; P. Riggan; S. Schilling; P. Dawson; M. Tyree; L. Wolden; R. Tissell; H. Preisler

    2010-01-01

    Combustion emissions and strong Santa Ana winds had pronounced effects on patterns and levels of ambient ozone (O3) in southern California during the extensive wildland fires of October 2007. These changes are described in detail for a rural receptor site, the Santa Margarita Ecological Reserve, located among large fires in San Diego and Orange counties. In addition,...

  20. The Michigan data needs questionnaire

    NASA Technical Reports Server (NTRS)

    Hill-Rowley, R.

    1981-01-01

    The data needs questionnaire is an element in the project design study for the Michigan Resource Inventory Act and is aimed at gathering information on what inventory information is required by land use planners throughout the state. Analysis of questionnaire responses is discussed. Some information on current use categories was tabulated. The respondents selected a broad range of categories at all levels of detail. Those most frequently indicated were urban categories.

  1. Job Analysis Techniques for Restructuring Health Manpower Education and Training in the Navy Medical Department. Attachment 4. Clinic QPCB Task Sort for Clinical Physician Assistants--Dermatology, ENT, Ophthalmology, Orthopedics, and Urology.

    ERIC Educational Resources Information Center

    Technomics, Inc., McLean, VA.

    This publication is Attachment 4 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties for clinical physician assistants. (BT)

  2. Defense Advanced Research Projects Agency: Key Factors Drive Transition of Technologies, but Better Training and Data Dissemination Can Increase Success

    DTIC Science & Technology

    2015-11-01

    more detail. Table 1: Overview of DARPA Programs Selected for GAO Case Study Analyses (columns: Program name, Program description), including the Advanced Wireless Networks for the Soldier program ... A second table compares selected DARPA programs (Program name; According to DARPA portfolio-level database; According to GAO analysis) against criteria such as "... with potential transition partners," "Achievement of clearly defined technical goals," and "Successful transition," again listing Advanced Wireless Networks for the Soldier.

  3. Improving Legacy Aircraft Systems Through Condition-Based Maintenance: An H-60 Case Study

    DTIC Science & Technology

    2014-09-01

    level functions. These decompositions are equivalent to a detailed design effort in systems engineering. NAMPSOPs have a common architectural structure ... (component listings from the H-60 case study include: Assembly, Power Available Spindle Cables, No. 1 Engine Load Demand Spindle Control Cables, Engine Pneumatic Starters, Auxiliary Power Unit, IRCM, FLIR, Mission ... Analysis, Fuel System, Main Rotor Head, Main Module, Main Gear Box, Radiator, Engine Output Shaft, Auxiliary Power Unit, Flight Control Cables, Tail Landing ...)

  4. Job Analysis Techniques for Restructuring Health Manpower Education and Training in the Navy Medical Department. Attachment 5. Biotronics QPCB Task Sort for Cardio-Pulmonary, EEG, EKG, Inhalation Therapy.

    ERIC Educational Resources Information Center

    Technomics, Inc., McLean, VA.

    This publication is Attachment 5 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in cardio-pulmonary, EEG, EKG, and inhalation therapy. (BT)

  5. Analysis of a Farquhar-von Caemmerer-Berry leaf-level photosynthetic rate model for Populus tremuloides in the context of modeling and measurement limitations

    Treesearch

    K.E. Lenz; G.E. Host; K. Roskoski; A. Noormets; A. Sober; D.F. Karnosky

    2010-01-01

    The balance of mechanistic detail with mathematical simplicity contributes to the broad use of the Farquhar, von Caemmerer and Berry (FvCB) photosynthetic rate model. Here the FvCB model was coupled with a stomatal conductance model to form an [A,gs] model, and parameterized for mature Populus tremuloides leaves under varying CO2...
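
    For orientation, a minimal coupled [A, gs] iteration in the spirit described above is sketched below, using the standard FvCB limiting-rate expressions and a Ball-Berry-type conductance closure; the parameter values are illustrative and are not the paper's Populus tremuloides parameterization, and the Ball-Berry form is an assumed stand-in for whatever conductance model the authors coupled.

      # Illustrative parameter values (not fitted to Populus tremuloides).
      Vcmax, J, Rd = 60.0, 120.0, 1.0     # umol m-2 s-1
      Kc, Ko, O    = 404.0, 278.0, 210.0  # umol mol-1, mmol mol-1, mmol mol-1
      gamma_star   = 42.0                 # CO2 compensation point, umol mol-1
      g0, g1, hs   = 0.01, 9.0, 0.7       # Ball-Berry intercept, slope, relative humidity
      Ca           = 400.0                # ambient CO2, umol mol-1

      def fvcb_assimilation(Ci):
          """Farquhar-von Caemmerer-Berry net assimilation at intercellular CO2 Ci."""
          Ac = Vcmax * (Ci - gamma_star) / (Ci + Kc * (1.0 + O / Ko))   # Rubisco-limited
          Aj = J * (Ci - gamma_star) / (4.0 * Ci + 8.0 * gamma_star)    # RuBP-regeneration-limited
          return min(Ac, Aj) - Rd

      # Fixed-point iteration coupling A with the stomatal conductance model.
      Ci = 0.7 * Ca
      for _ in range(50):
          A  = fvcb_assimilation(Ci)
          gs = g0 + g1 * A * hs / Ca      # stomatal conductance to water vapour, mol m-2 s-1
          Ci = Ca - 1.6 * A / gs          # 1.6 converts gs(H2O) to gs(CO2)
      print(f"A = {A:.1f} umol m-2 s-1, gs = {gs:.3f} mol m-2 s-1, Ci = {Ci:.0f} umol mol-1")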

  6. UAV-based urban structural damage assessment using object-based image analysis and semantic reasoning

    NASA Astrophysics Data System (ADS)

    Fernandez Galarreta, J.; Kerle, N.; Gerke, M.

    2015-06-01

    Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. Also the UAV-derived and OBIA-processed oblique images are shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we also demonstrate the possibility of aggregating the multi-perspective damage information at building level.

  7. Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.

    1994-01-01

    This research addresses issues in simulation-based system level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system level analysis are discussed and a methodology that addresses and tackles these issues is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, routing algorithms as well as other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined as it is normally done. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms and hybrid simulation to reduce simulation time is introduced.
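
    A toy illustration of simulation-based fault injection (not the DEPEND package itself): faults are injected into a small pool of active units at random exponentially distributed times, spares are switched in while they last, and mission reliability is estimated by repeated runs. All rates, counts, and times are hypothetical.

      import random

      def simulate_mission(n_active=3, n_spares=2, fault_rate=1e-3,
                           repair_time=5.0, mission_time=1000.0):
          """Toy dependability run: inject random faults into active units, switch in
          spares while any remain, and report whether the mission survives."""
          t, spares = 0.0, n_spares
          while t < mission_time:
              # Time to the next injected fault across all active units.
              t += random.expovariate(n_active * fault_rate)
              if t >= mission_time:
                  return True               # mission completed without exhausting redundancy
              if spares == 0:
                  return False              # uncovered fault: system failure
              spares -= 1
              t += repair_time              # reconfiguration delay while the spare is switched in
          return True

      random.seed(1)
      runs = 10_000
      survived = sum(simulate_mission() for _ in range(runs))
      print(f"estimated mission reliability: {survived / runs:.3f}")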

  8. Satellite Imagery Analysis for Automated Global Food Security Forecasting

    NASA Astrophysics Data System (ADS)

    Moody, D.; Brumby, S. P.; Chartrand, R.; Keisler, R.; Mathis, M.; Beneke, C. M.; Nicholaeff, D.; Skillman, S.; Warren, M. S.; Poehnelt, J.

    2017-12-01

    The recent computing performance revolution has driven improvements in sensor, communication, and storage technology. Multi-decadal remote sensing datasets at the petabyte scale are now available in commercial clouds, with new satellite constellations generating petabytes/year of daily high-resolution global coverage imagery. Cloud computing and storage, combined with recent advances in machine learning, are enabling understanding of the world at a scale and at a level of detail never before feasible. We present results from an ongoing effort to develop satellite imagery analysis tools that aggregate temporal, spatial, and spectral information and that can scale with the high-rate and dimensionality of imagery being collected. We focus on the problem of monitoring food crop productivity across the Middle East and North Africa, and show how an analysis-ready, multi-sensor data platform enables quick prototyping of satellite imagery analysis algorithms, from land use/land cover classification and natural resource mapping, to yearly and monthly vegetative health change trends at the structural field level.

  9. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    DOE PAGES

    Belianinov, Alex; Panchapakesan, G.; Lin, Wenzhi; ...

    2014-12-02

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.
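
    The pipeline described above, a multivariate decomposition of a spectroscopy cube followed by clustering of the component scores, can be sketched as follows on synthetic data; PCA and k-means are used here as generic stand-ins for the multivariate and clustering steps reported in the paper, and the cube dimensions are invented.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      # Hypothetical current-imaging tunneling-spectroscopy cube:
      # a 64 x 64 grid of tunneling spectra with 200 bias points each.
      rng = np.random.default_rng(0)
      cube = rng.normal(size=(64, 64, 200))

      spectra = cube.reshape(-1, cube.shape[-1])      # one spectrum per pixel

      # Multivariate statistical analysis: keep the leading principal components.
      scores = PCA(n_components=4).fit_transform(spectra)

      # Clustering of the component scores groups pixels with similar electronic behavior
      # (on real data this would separate the chalcogen-like regions).
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
      cluster_map = labels.reshape(cube.shape[:2])    # spatial map of the cluster assignments
      print(cluster_map.shape, np.bincount(labels))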

  10. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belianinov, Alex, E-mail: belianinova@ornl.gov; Ganesh, Panchapakesan; Lin, Wenzhi

    2014-12-01

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.

  11. The view from the tip of the iceberg.

    PubMed

    Josephs, L

    1997-01-01

    In recent years there has been a growing interest in refining the technique of ego defense analysis. All of these approaches share in common an attempt to work closely with the patient's free associations, to interpret at a level that is accessible to the patient's consciously observing ego, and to avoid bypassing the analysis of the patient's most surface-level resistances in an effort to understand unconscious conflict. These innovations reflect a commendable effort to work in a way that is rigorously empirical, that respects the patient's autonomy, and that minimizes the pressure of the analyst's transferential authority in the patient's acceptance of the analyst's interpretations. Despite the undeniable value of these technical innovations, such approaches to ego defense analysis may inadvertently result in certain overemphases in technique that may unnecessarily constrain the analytic process. They may result in a sort of obsessive tunnel vision that is overly focused on small details to the exclusion of the larger picture. An approach that counterbalances the microscopic and the macroscopic analysis of ego defense is recommended.

  12. Man power/cost estimation model: Automated planetary projects

    NASA Technical Reports Server (NTRS)

    Kitchen, L. D.

    1975-01-01

    A manpower/cost estimation model is developed which is based on a detailed level of financial analysis of over 30 million raw data points, which are then compacted by more than three orders of magnitude to the level at which the model is applicable. The major parameter of expenditure is manpower (specifically direct labor hours) for all spacecraft subsystem and technical support categories. The resultant model is able to provide a mean absolute error of less than fifteen percent for the eight programs comprising the model data base. The model includes cost-saving inheritance factors, broken down into four levels, for estimating follow-on type programs where hardware and design inheritance are evident or expected.
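
    A much-simplified sketch of a manpower estimate with inheritance factors, in the spirit of the model described above; the subsystem hours, the four factor values, the inheritance assignments, and the labor rate are all hypothetical and are not taken from the report.

      # Hypothetical baseline direct-labor-hour estimates per category (thousands of hours).
      baseline_hours = {
          "structure": 120.0,
          "power": 95.0,
          "attitude_control": 140.0,
          "telecom": 80.0,
          "technical_support": 200.0,
      }

      # Assumed inheritance factors: fraction of new design effort required at each of four
      # levels, from an all-new design (1.0) down to a near-copy of existing hardware (0.25).
      inheritance_levels = {"new": 1.00, "modified": 0.75, "minor_mods": 0.50, "build_to_print": 0.25}

      # Inheritance assumed for a follow-on mission, category by category.
      follow_on = {
          "structure": "build_to_print",
          "power": "minor_mods",
          "attitude_control": "modified",
          "telecom": "minor_mods",
          "technical_support": "modified",
      }

      labor_rate = 65.0  # assumed fully burdened cost per hour, dollars

      total_khours = sum(baseline_hours[c] * inheritance_levels[follow_on[c]] for c in baseline_hours)
      print(f"estimated direct labor: {total_khours:.0f} k-hours, "
            f"~${total_khours * 1e3 * labor_rate / 1e6:.1f} M")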

  13. Protein disorder is positively correlated with gene expression in E. coli

    PubMed Central

    Paliy, Oleg; Gargac, Shawn M.; Cheng, Yugong; Uversky, Vladimir N.; Dunker, A. Keith

    2009-01-01

    We considered on a global scale the relationship between the predicted fraction of protein disorder and RNA and protein expression in E. coli. Fraction of protein disorder correlated positively with both measured RNA expression levels of E. coli genes in three different growth media and with predicted abundance levels of E. coli proteins. Though weak, the correlation was highly significant. Correlation of protein disorder with RNA expression did not depend on the growth rate of E. coli cultures and was not caused by a small subset of genes showing exceptionally high concordance in their disorder and expression levels. Global analysis was complemented by detailed consideration of several groups of proteins. PMID:18465893

  14. Theater-Level Gaming and Analysis Workshop for Force Planning. Volume II. Summary, Discussion of Issues and Requirements for Research. September 27- 29, 1977, Held at Xerox International Center for Training and Management Development, Leesburg, Virginia

    DTIC Science & Technology

    1981-05-01

    be allocated to targets on the battlefield and in the rear area. The speaker describes the VECTOR I/NUCLEAR model, a combination of the UNICORN target...outlined. UNICORN is compatible with VECTOR 1 in level of detail. It is an expected value damage model and uses linear programming to optimize the...and a growing appreciation for the power of simulation in addressing large, complex problems, it was only a few short years before these games had

  15. Validation of helicopter noise prediction techniques

    NASA Technical Reports Server (NTRS)

    Succi, G. P.

    1981-01-01

    The current techniques of helicopter rotor noise prediction attempt to describe the details of the noise field precisely and remove the empiricisms and restrictions inherent in previous methods. These techniques require detailed inputs of the rotor geometry, operating conditions, and blade surface pressure distribution. The purpose of this paper is to review those techniques in general and the Farassat/Nystrom analysis in particular. The predictions of the Farassat/Nystrom noise computer program, using both measured and calculated blade surface pressure data, are compared to measured noise level data. This study is based on a contract from NASA to Bolt Beranek and Newman Inc. with measured data from the AH-1G Helicopter Operational Loads Survey flight test program supplied by Bell Helicopter Textron.

  16. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
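
    As an example of the levelized-cost metric mentioned above, a minimal calculation discounts lifetime costs and lifetime energy output to a common basis and takes their ratio; the plant size, capacity factor, costs, and discount rate are illustrative assumptions.

      def levelized_cost_of_energy(capital, annual_om, annual_mwh, discount_rate, lifetime_years):
          """Levelized cost of energy: discounted lifetime costs divided by discounted lifetime output."""
          costs  = capital + sum(annual_om  / (1 + discount_rate) ** t
                                 for t in range(1, lifetime_years + 1))
          energy =           sum(annual_mwh / (1 + discount_rate) ** t
                                 for t in range(1, lifetime_years + 1))
          return costs / energy   # $/MWh

      # Hypothetical 100 MW plant at a 35% capacity factor over 25 years.
      annual_mwh = 100 * 8760 * 0.35
      print(f"LCOE = {levelized_cost_of_energy(1.5e8, 2.0e6, annual_mwh, 0.07, 25):.1f} $/MWh")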

  17. Cryptic or pseudocryptic: can morphological methods inform copepod taxonomy? An analysis of publications and a case study of the Eurytemora affinis species complex

    PubMed Central

    Lajus, Dmitry; Sukhikh, Natalia; Alekseev, Victor

    2015-01-01

    Interest in cryptic species has increased significantly with current progress in genetic methods. The large number of cryptic species suggests that the resolution of traditional morphological techniques may be insufficient for taxonomical research. However, some species now considered to be cryptic may, in fact, be designated pseudocryptic after close morphological examination. Thus the “cryptic or pseudocryptic” dilemma speaks to the resolution of morphological analysis and its utility for identifying species. We address this dilemma first by systematically reviewing data published from 1980 to 2013 on cryptic species of Copepoda and then by performing an in-depth morphological study of the former Eurytemora affinis complex of cryptic species. Analyzing the published data showed that, in 5 of 24 revisions eligible for systematic review, cryptic species assignment was based solely on the genetic variation of forms without detailed morphological analysis to confirm the assignment. Therefore, some newly described cryptic species might be designated pseudocryptic under more detailed morphological analysis, as happened with the Eurytemora affinis complex. Recent genetic analyses of the complex found high levels of heterogeneity without morphological differences, and the complex was argued to be cryptic. However, subsequent detailed morphological analyses allowed a number of valid species to be described. Our study of this species complex, using in-depth statistical analyses not usually applied when describing new species, confirmed considerable differences between the formerly cryptic species. In particular, fluctuating asymmetry (FA), the random variation of left and right structures, was significantly different between forms and provided independent information about their status. Our work showed that multivariate statistical approaches, such as principal component analysis, can be powerful techniques for the morphological discrimination of cryptic taxons. Despite increasing cryptic species designations, morphological techniques have great potential in determining copepod taxonomy. PMID:26120427
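
    The two statistical ideas highlighted above, fluctuating asymmetry as an independent signal and principal component analysis for morphological discrimination, can be sketched on synthetic bilateral measurements as follows; the trait values, sample sizes, and group differences are invented for illustration.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(42)

      # Hypothetical bilateral measurements (left/right structures, arbitrary units)
      # for two putative forms of 30 specimens and 5 traits each.
      def make_form(mean, fa_sd, n=30, traits=5):
          base  = rng.normal(mean, 0.5, size=(n, traits))
          left  = base + rng.normal(0.0, fa_sd, size=(n, traits))
          right = base + rng.normal(0.0, fa_sd, size=(n, traits))
          return left, right

      left1, right1 = make_form(mean=10.0, fa_sd=0.05)
      left2, right2 = make_form(mean=10.6, fa_sd=0.15)   # second form: larger traits, higher FA

      # Fluctuating asymmetry per specimen: mean |L - R| scaled by trait size.
      def fa_index(left, right):
          return np.mean(np.abs(left - right) / ((left + right) / 2.0), axis=1)

      print("mean FA, form 1:", fa_index(left1, right1).mean())
      print("mean FA, form 2:", fa_index(left2, right2).mean())

      # Principal component analysis of the side-averaged traits for morphological discrimination.
      traits = np.vstack([(left1 + right1) / 2.0, (left2 + right2) / 2.0])
      scores = PCA(n_components=2).fit_transform(traits)
      print("PC1 group means:", scores[:30, 0].mean(), scores[30:, 0].mean())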

  18. GIS-based Landing-Site Analysis and Passive Decision Support

    NASA Astrophysics Data System (ADS)

    van Gasselt, Stephan; Nass, Andrea

    2016-04-01

    The increase of surface coverage and the availability and accessibility of planetary data allow researchers and engineers to remotely perform detailed studies on surface processes and properties, in particular on objects such as Mars and the Moon for which Terabytes of multi-temporal data at multiple spatial resolution levels have become available during the last 15 years. Orbiters, rovers and landers have been returning information and insights into the surface evolution of the terrestrial planets in unprecedented detail. While rover- and lander-based analyses are one major research aim to obtain ground truth, resource exploration or even potential establishment of bases using autonomous platforms are others, and they require detailed investigation of settings in order to identify spots on the surface that are suitable for spacecraft to land and operate safely and over a long period of time. What has been done using hardcopy material in the past is today being carried out using either in-house developments or off-the-shelf spatial information system technology, which allows users to manage, integrate and analyse data as well as visualize and create user-defined reports for performing assessments. Usually, such analyses can be broken down (manually) by considering scientific wishes, engineering boundary conditions, potential hazards and various tertiary constraints. We here (1) review standard tasks of landing site analyses, (2) discuss issues inherently related to the analysis using integrated spatial analysis systems and (3) demonstrate a modular analysis framework for integration of data and for the evaluation of results from individual tasks in order to support decisions for landing-site selection.

  19. The national biennial RCRA hazardous waste report (based on 1999 data) : state detail analysis

    DOT National Transportation Integrated Search

    2001-06-01

    The State Detail Analysis is a detailed look at each State's waste handling practices, including overall totals for generation, management, and shipments and receipts, as well as totals for the largest fifty facilities.

  20. A land use and environmental impact analysis of the Norfolk-Portsmouth SMSA

    NASA Technical Reports Server (NTRS)

    Mitchel, W. B.; Berlin, G. L.

    1973-01-01

    The feasibility of using remote sensing techniques for land use and environmental assessment in the Norfolk-Portsmouth area is discussed. Data cover the use of high altitude aircraft and satellite remote sensing data for: (1) identifying various hierarchical levels of land use, (2) monitoring land use changes on a repetitive basis, (3) assessing the impact of competing land uses, and (4) identifying areas of potential environmental deterioration. High altitude aircraft photographs (scale 1:120,000) acquired in 1959, 1970, and 1972, plus Earth Resources Technology Satellite (ERTS-1) color composite images acquired in 1972 were used for the land use and environmental assessments. The high altitude aircraft photography, as expected, was successfully used to map Level 1, Level 2, as well as some urban Level 3 land use categories. However, the detail of land use analysis obtainable from the ERTS imagery exceeded the expectations for the U.S. Geological Survey's land use classification scheme. Study results are consistent with the initial investigation which determined Level 1 land use change to be 16.7 square km per year.

  1. A THIRD OF PATIENTS TREATED AT A TERTIARY LEVEL SURGICAL SERVICE COULD BE TREATED AT A SECONDARY LEVEL FACILITY.

    PubMed

    Van Straten, S K; Stannard, C J; Bulabula, J; Paul, K; Leong, J; Klipin, M

    2017-06-01

    South Africa has an overburdened public healthcare system. Some admissions to Charlotte Maxeke Johannesburg Academic Hospital (CMJAH) may not require tertiary care; the numbers and details thereof are uncertain. Clinical research is limited by skills and access to data. A retrospective analysis was performed of Electronic Discharge (ED) summaries from the Department of Surgery at CMJAH between 01 April 2015 and 01 April 2016. An SQL query of the database generated a .csv file of all discharges with the fields database reference number, length of stay and level of care. The details and level of care of each record were verified by MBBCh 5 medical students using a defined level-of-care template with review of the full discharge summary. The data were reviewed by a senior clinician. There were 3007 discharge summaries; 97 were not classifiable, two were test records and one was a duplicate. These 100 records were excluded. There were no primary level records. Secondary level patients represented 29% (854) of patients discharged and 19% of total bed days. Tertiary and quaternary together represented 71% of the total patients and 81% of bed days. The average length of stay was 4.31 days for secondary, 6.98 days for tertiary and 9.77 days for quaternary level of care allocation. Almost a third (29%) of patients discharged from the CMJAH Department of Surgery were deemed suitable for secondary level care. These admissions have a shorter length of stay and comprise 19% of total bed days. Students and electronic databases are useful research resources.
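
    A minimal sketch of the kind of extraction and aggregation described above, assuming a hypothetical SQLite export with a `discharge_summaries` table and the three exported fields; the actual CMJAH database schema is not given in the abstract.

    ```python
    import csv
    import sqlite3

    # Assumes a populated SQLite file; table and column names are hypothetical.
    conn = sqlite3.connect("discharges.db")
    rows = conn.execute(
        """
        SELECT reference_number, length_of_stay, level_of_care
        FROM discharge_summaries
        WHERE discharge_date >= '2015-04-01' AND discharge_date < '2016-04-01'
        """
    ).fetchall()

    # Export to .csv for review by the student classifiers.
    with open("discharges.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["reference_number", "length_of_stay", "level_of_care"])
        writer.writerows(rows)

    # Aggregate patients and bed days per level of care.
    totals = {}
    for _, los, level in rows:
        n, days = totals.get(level, (0, 0))
        totals[level] = (n + 1, days + los)
    for level, (n, days) in sorted(totals.items()):
        print(f"{level}: {n} patients, {days} bed days, mean LOS {days / n:.2f}")
    ```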

  2. Aircraft Conceptual Design and Risk Analysis Using Physics-Based Noise Prediction

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.; Mavris, Dimitri N.

    2006-01-01

    An approach was developed which allows for design studies of commercial aircraft using physics-based noise analysis methods while retaining the ability to perform the rapid trade-off and risk analysis studies needed at the conceptual design stage. A prototype integrated analysis process was created for computing the total aircraft EPNL at the Federal Aviation Regulations Part 36 certification measurement locations using physics-based methods for fan rotor-stator interaction tones and jet mixing noise. The methodology was then used in combination with design of experiments to create response surface equations (RSEs) for the engine and aircraft performance metrics, geometric constraints and take-off and landing noise levels. In addition, Monte Carlo analysis was used to assess the expected variability of the metrics under the influence of uncertainty, and to determine how the variability is affected by the choice of engine cycle. Finally, the RSEs were used to conduct a series of proof-of-concept conceptual-level design studies demonstrating the utility of the approach. The study found that a key advantage to using physics-based analysis during conceptual design lies in the ability to assess the benefits of new technologies as a function of the design to which they are applied. The greatest difficulty in implementing physics-based analysis proved to be the generation of design geometry at a sufficient level of detail for high-fidelity analysis.
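
    A hedged sketch of the response-surface-plus-Monte-Carlo pattern described above: a quadratic response surface equation (RSE) is fit to a small designed sample of a stand-in noise function, and input uncertainty is then propagated through the inexpensive surrogate. The design variables, the stand-in function and the uncertainty levels are illustrative assumptions, not the physics-based EPNL analysis of the study.

    ```python
    import numpy as np

    def noise_model(x):
        # Stand-in for the expensive physics-based noise analysis (illustrative only).
        fan_pr, bypass_ratio = x[:, 0], x[:, 1]
        return 95 + 3.0 * fan_pr - 1.5 * bypass_ratio + 0.4 * fan_pr * bypass_ratio

    def quad_features(x):
        # Quadratic response surface basis: 1, x1, x2, x1*x2, x1^2, x2^2.
        x1, x2 = x[:, 0], x[:, 1]
        return np.column_stack([np.ones(len(x)), x1, x2, x1 * x2, x1**2, x2**2])

    # "Design of experiments": a small full-factorial sample of the design space.
    grid = np.array([[a, b] for a in np.linspace(1.4, 1.8, 5)
                            for b in np.linspace(5.0, 9.0, 5)])
    coeffs, *_ = np.linalg.lstsq(quad_features(grid), noise_model(grid), rcond=None)

    # Monte Carlo: sample design uncertainty and evaluate the cheap RSE surrogate.
    rng = np.random.default_rng(1)
    samples = rng.normal([1.6, 7.0], [0.05, 0.3], size=(10000, 2))
    epnl = quad_features(samples) @ coeffs
    print(f"mean = {epnl.mean():.2f} EPNdB, 95th percentile = {np.percentile(epnl, 95):.2f}")
    ```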

  3. Selenium Speciation in the Fountain Creek Watershed (Colorado, USA) Correlates with Water Hardness, Ca and Mg Levels.

    PubMed

    Carsella, James S; Sánchez-Lombardo, Irma; Bonetti, Sandra J; Crans, Debbie C

    2017-04-30

    The environmental levels of selenium (Se) are regulated and strictly enforced by the Environmental Protection Agency (EPA) because of the toxicity that Se can exert at high levels. However, speciation plays an important role in the overall toxicity of Se, and only when speciation analysis has been conducted will a detailed understanding of the system be possible. In the following, we carried out the speciation analysis of the creek waters in three of the main tributaries (Upper Fountain Creek, Monument Creek and Lower Fountain Creek) located in the Fountain Creek Watershed (Colorado, USA). There are statistically significant differences between the Se, Ca and Mg levels in each of the tributaries, and seasonal swings in Se, Ca and Mg levels have been observed. There are also statistically significant differences between the Se levels when grouped by Pierre Shale type. These factors are considered when determining the forms of Se present and analyzing their chemistry using the reported thermodynamic relationships for Ca²⁺, Mg²⁺, SeO₄²⁻, SeO₃²⁻ and carbonates. This analysis demonstrated that the correlation between Se and water hardness can be explained in terms of formation of soluble CaSeO₄. The speciation analysis demonstrated that for the Fountain Creek waters, the Ca²⁺ ion may be mainly responsible for the observed correlation with the Se level. Considering that the Mg²⁺ level also correlates linearly with the Se levels, it is important to recognize that without Mg²⁺ the Ca²⁺ level would be significantly reduced. The major role of Mg²⁺ is thus to raise the Ca²⁺ level despite the equilibria with carbonate and other anions that would otherwise decrease Ca²⁺ levels.

  4. A detailed transcript-level probe annotation reveals alternative splicing based microarray platform differences

    PubMed Central

    Lee, Joseph C; Stiles, David; Lu, Jun; Cam, Margaret C

    2007-01-01

    Background Microarrays are a popular tool used in experiments to measure gene expression levels. Improving the reproducibility of microarray results produced by different chips from various manufacturers is important to create comparable and combinable experimental results. Alternative splicing has been cited as a possible cause of differences in expression measurements across platforms, though no study to this point has been conducted to show its influence in cross-platform differences. Results Using probe sequence data, a new microarray probe/transcript annotation was created based on the AceView Aug05 release that allowed for the categorization of genes based on their expression measurements' susceptibility to alternative splicing differences across microarray platforms. Examining gene expression data from multiple platforms in light of the new categorization, genes unsusceptible to alternative splicing differences showed higher signal agreement than those genes most susceptible to alternative splicing differences. The analysis gave rise to a different probe-level visualization method that can highlight probe differences according to transcript specificity. Conclusion The results highlight the need for detailed probe annotation at the transcriptome level. The presence of alternative splicing within a given sample can affect gene expression measurements and is a contributing factor to overall technical differences across platforms. PMID:17708771

  5. New approach for determination of the influence of long-range order and selected ring oscillations on IR spectra in zeolites.

    PubMed

    Mikuła, Andrzej; Król, Magdalena; Mozgawa, Włodzimierz; Koleżyński, Andrzej

    2018-04-15

    Vibrational spectroscopy can be considered one of the most important methods used for structural characterization of various porous aluminosilicate materials, including zeolites. On the other hand, vibrational spectra of zeolites are still difficult to interpret, particularly in the pseudolattice region, where bands related to ring oscillations can be observed. Using a combination of theoretical and computational approaches, a detailed analysis of these regions of the spectra is possible; such analysis should, however, be carried out employing models of different levels of complexity at the same level of theory. In this work, an attempt was made to identify ring oscillations in the vibrational spectra of selected zeolite structures. A series of ab initio calculations has been carried out, focused on S4R, S6R and, as a novelty, 5-1 isolated clusters, as well as on periodic siliceous frameworks built from those building units (ferrierite (FER), mordenite (MOR) and heulandite (HEU) type). Due to the hierarchical structure of zeolite frameworks, it can be expected that the total envelope of the zeolite spectrum should be, with good accuracy, a sum of the spectra of the structural elements that build each zeolite framework. Based on the results of the HF calculations, normal vibrations have been visualized and a detailed analysis of the pseudolattice range of the resulting theoretical spectra has been carried out. The obtained results have been applied to the interpretation of experimental spectra of selected zeolites. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. The physics behind the larger scale organization of DNA in eukaryotes.

    PubMed

    Emanuel, Marc; Radja, Nima Hamedani; Henriksson, Andreas; Schiessel, Helmut

    2009-07-01

    In this paper, we discuss in detail the organization of chromatin during a cell cycle at several levels. We show that current experimental data on large-scale chromatin organization have not yet reached the level of precision to allow for detailed modeling. We speculate in some detail about the possible physics underlying the larger scale chromatin organization.

  7. Shock wave viscosity measurements

    NASA Astrophysics Data System (ADS)

    Celliers, Peter

    2013-06-01

    Several decades ago a method was proposed and demonstrated to measure the viscosity of fluids at high pressure by observing the oscillatory damping of sinusoidal perturbations on a shock front. A detailed mathematical analysis of the technique, carried out subsequently by Miller and Ahrens, revealed its potential as well as a deep level of complexity in the analysis. We revisit the ideas behind this technique in the context of a recent experimental development: two-dimensional imaging velocimetry. The new technique allows one to capture a broad spectrum of perturbations, down to few-micron scale lengths, imposed on a shock front by an initial perturbation. The detailed evolution of the perturbation spectrum is sensitive to the viscosity of the fluid behind the shock front. Initial experiments are aimed at examining the viscosity of shock-compressed SiO2 just above the shock melting transition. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  8. The methodology of multi-viewpoint clustering analysis

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala; Wild, Chris

    1993-01-01

    One of the greatest challenges facing the software engineering community is the ability to produce large and complex computer systems, such as ground support systems for unmanned scientific missions, that are reliable and cost effective. In order to build and maintain these systems, it is important that the knowledge in the system be suitably abstracted, structured, and otherwise clustered in a manner which facilitates its understanding, manipulation, testing, and utilization. Development of complex mission-critical systems will require the ability to abstract overall concepts in the system at various levels of detail and to consider the system from different points of view. The Multi-ViewPoint Clustering Analysis (MVP-CA) methodology has been developed to provide multiple views of large, complicated systems. MVP-CA provides an ability to discover significant structures by providing an automated mechanism to structure both hierarchically (from detail to abstract) and orthogonally (from different perspectives). We propose to integrate MVP-CA into an overall software engineering life cycle to support the development and evolution of complex mission-critical systems.

  9. A nonstationary analysis for the Northern Adriatic extreme sea levels

    NASA Astrophysics Data System (ADS)

    Masina, Marinella; Lamberti, Alberto

    2013-09-01

    The historical data from the Trieste, Venice, Porto Corsini, and Rimini tide gauges have been used to investigate the spatial and temporal changes in extreme high water levels in the Northern Adriatic. A detailed analysis of annual mean sea level evolution at the three longest operating stations shows a coherent behavior on both a regional and a global scale. A slight increase in the magnitude of extreme water elevations, after the removal of the regularized annual mean sea level necessary to eliminate the effect of local subsidence and sea level rise, is found at the Venice and Porto Corsini stations. It seems to be mainly associated with a wind regime change that occurred in the 1990s, due to an intensification of Bora wind events after their decrease in frequency and intensity during the second half of the 20th century. The extreme values, adjusted for the annual mean sea level trend, are modeled using a time-dependent GEV distribution. The inclusion of seasonality in the GEV parameters considerably improves the data fitting. The interannual fluctuations of the detrended monthly maxima exhibit a significant correlation with the variability of the large-scale atmospheric circulation represented by the North Atlantic Oscillation and Arctic Oscillation indices. The different coast exposure to the Bora and Sirocco winds and their seasonal character explains the various seasonal patterns of extreme sea levels observed at the tide gauges considered in the present analysis.
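
    As an illustration of the kind of time-dependent GEV fit described above, the sketch below maximizes a GEV likelihood whose location parameter varies seasonally. The covariate structure and the synthetic data are assumptions for illustration, not the paper's actual model.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import genextreme

    def neg_log_lik(params, t, x):
        # Seasonal location: mu(t) = mu0 + mu1*sin(2*pi*t) + mu2*cos(2*pi*t), t in years.
        mu0, mu1, mu2, log_sigma, xi = params
        mu = mu0 + mu1 * np.sin(2 * np.pi * t) + mu2 * np.cos(2 * np.pi * t)
        sigma = np.exp(log_sigma)
        # scipy's shape convention: c = -xi relative to the usual GEV notation.
        return -genextreme.logpdf(x, c=-xi, loc=mu, scale=sigma).sum()

    # Synthetic monthly maxima (cm) with an annual cycle, for illustration only.
    rng = np.random.default_rng(2)
    t = np.arange(240) / 12.0
    true_mu = 60 + 15 * np.cos(2 * np.pi * t)
    x = genextreme.rvs(c=-0.1, loc=true_mu, scale=10, size=t.size, random_state=rng)

    res = minimize(neg_log_lik, x0=[60, 0, 0, np.log(10), 0.05], args=(t, x),
                   method="Nelder-Mead", options={"maxiter": 5000})
    mu0, mu1, mu2, log_sigma, xi = res.x
    print(f"mu0={mu0:.1f}, seasonal amplitude={np.hypot(mu1, mu2):.1f}, "
          f"sigma={np.exp(log_sigma):.1f}, xi={xi:.2f}")
    ```

    A likelihood-ratio test against the stationary fit (mu1 = mu2 = 0) is the usual way to check whether the seasonal terms improve the fit, in line with the improvement from seasonality reported above.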

  10. 12. DETAIL OF PAINTED SIGNS AND UPPER LEVEL WINDOWS ON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. DETAIL OF PAINTED SIGNS AND UPPER LEVEL WINDOWS ON SOUTH SIDE. VIEW TO NORTHWEST. - Commercial & Industrial Buildings, Dubuque Seed Company Warehouse, 169-171 Iowa Street, Dubuque, Dubuque County, IA

  11. Automating a Detailed Cognitive Task Analysis for Structuring Curriculum

    DTIC Science & Technology

    1991-08-01

    Title: Automating a Detailed Cognitive Task Analysis for Structuring Curriculum. Activities: To date we have completed task...The Institute for Management Sciences. Although the particular application of the modified GOMS cognitive task analysis technique under development is...Research Plan, Year 1: Task 1.0 Design; Task 1.1 Conduct...

  12. Methodological challenges in international performance measurement using patient-level administrative data.

    PubMed

    Kiivet, Raul; Sund, Reijo; Linna, Miika; Silverman, Barbara; Pisarev, Heti; Friedman, Nurit

    2013-09-01

    We conducted this case study in order to test how health system performance could be compared using existing national administrative health databases containing individual data. In this comparative analysis we used national data sets from three countries, Estonia, Israel and Finland, to follow the medical history, treatment outcome and resource use of patients with a chronic disease (diabetes) for 8 years after medical treatment was initiated. This study showed that several clinically important aspects of quality of care, as well as health policy issues of cost-effectiveness and efficiency of health systems, can be assessed by using national administrative health data systems, provided that they collect person-level health service data. We developed a structured study protocol and detailed data specifications to generate standardized data sets, in each country, for long-term follow up of an incident cohort of diabetic persons, as well as shared analysis programs to produce performance measures from the standardized data sets. This stepwise decentralized approach and the use of anonymous person-level data allowed us to mitigate any legal, ownership, confidentiality and privacy concerns and to create internationally comparative data with an extent of detail that is seldom seen. For example, our preliminary performance comparisons indicate that higher mortality among relatively young diabetes patients in Estonia may be related to considerably higher rates of cardiovascular complications and lower use of statins. Modern administrative person-level health service databases contain data sufficiently rich in detail to assess the performance of health systems in the management of chronic diseases. This paper presents and discusses the methodological challenges and the way the problems were solved or avoided to enhance the representativeness and comparability of results. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  13. Internal displacement and the Syrian crisis: an analysis of trends from 2011-2014.

    PubMed

    Doocy, Shannon; Lyles, Emily; Delbiso, Tefera D; Robinson, Courtland W

    2015-01-01

    Since the start of the Syrian crisis in 2011, civil unrest and armed conflict in the country have resulted in a rapidly increasing number of people displaced both within and outside of Syria. Those displaced face immense challenges in meeting their basic needs. This study sought to characterize internal displacement in Syria, including trends in both time and place, and to provide insights on the association between displacement and selected measures of household well-being and humanitarian needs. This study presents findings from two complementary methods: a desk review of displaced population estimates and movements and a needs assessment of 3930 Syrian households affected by the crisis. The first method, a desk review of displaced population estimates and movements, provides a retrospective analysis of national trends in displacement from March 2011 through June 2014. The second method, analysis of findings from a 2014 needs assessment by displacement status, provides insight into the displaced population and the association between displacement and humanitarian needs. Findings indicate that while displacement often corresponds to conflict levels, such trends were not uniformly observed in governorate-level analysis. Governorate-level IDP estimates do not provide information on a scale detailed enough to adequately plan humanitarian assistance. Furthermore, such estimates are often influenced by obstructed access to certain areas, unsubstantiated reports, and substantial discrepancies in reporting. Secondary displacement is not consistently reported across sources, nor are additional details about displacement, including whether displaced individuals originated within the current governorate or outside of it. More than half (56.4%) of households reported being displaced more than once, with a majority displaced for more than one year (73.3%). Some differences between displaced and non-displaced populations were observed in residence crowding, food consumption, health access, and education. Differences in reported living conditions and key health, nutrition, and education indicators between displaced and non-displaced populations indicate a need to better understand migration trends in order to inform planning and provision of life-saving humanitarian assistance.

  14. Epistaxis grading in Osler's disease: comparison of comprehensive scores with detailed bleeding diaries.

    PubMed

    Parzefall, Thomas; Wolf, Axel; Frei, Klemens; Kaider, Alexandra; Riss, Dominik

    2017-03-01

    Use of reliable grading scores to measure epistaxis severity in hereditary hemorrhagic telangiectasia (HHT) is essential in clinical routine and for scientific purposes. For practical reasons, visual analog scale (VAS) scoring and the Epistaxis Severity Score (ESS) are widely used. VAS scores are purely subjective, and a potential shortcoming of the ESS is that it is based on self-reported anamnestic bleeding data. The aim of this study was to validate the level of correlation between VAS scores, the ESS, and actual bleeding events, based on detailed epistaxis diaries of patients. Records from daily epistaxis diaries maintained by 16 HHT patients over 112 consecutive days were compared with the monthly ESS and daily VAS scores in the corresponding time period. The Spearman rank correlation coefficient, analysis of variance models, and multiple R² measures were used for statistical analysis. Although the ESS and VAS scores generally showed a high degree of correlation with actual bleeding events, mild events were underrepresented in both scores. Our results highlight the usefulness of the ESS as a standard epistaxis score in cohorts with moderate to severe degrees of epistaxis. The use of detailed epistaxis diaries should be considered when monitoring patients and cohorts with mild forms of HHT. © 2016 ARS-AAOA, LLC.

  15. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  16. Shuttle payload interface verification equipment study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A preliminary design analysis is provided of a stand-alone payload interface verification equipment (IVE) capable of verifying payload compatibility in form, fit and function with the shuttle orbiter prior to on-line payload/orbiter operations. The IVE is a high fidelity replica of the orbiter payload accommodations capable of supporting payload functional checkout and mission simulation. A top-level payload integration analysis developed detailed functional flow block diagrams of the payload integration process for the broad spectrum of payloads and identified the degree of orbiter data required by the payload user and potential applications of the IVE.

  17. Visual Environments for CFD Research

    NASA Technical Reports Server (NTRS)

    Watson, Val; George, Michael W. (Technical Monitor)

    1994-01-01

    This viewgraph presentation gives an overview of visual environments for computational fluid dynamics (CFD) research. It includes details on critical needs for the future computing environment, the features needed to attain this environment, prospects for changes in the human-computer interface and the impact of the visualization revolution on it, human processing capabilities, and the limits of the personal environment and its extension with computers. Information is given on the need for more 'visual' thinking (including instances of visual thinking), an evaluation of alternate approaches to and levels of interactive computer graphics, a visual analysis of computational fluid dynamics, and an analysis of visualization software.

  18. The space station assembly phase: Flight telerobotic servicer feasibility. Volume 2: Methodology and case study

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Gyamfi, Max A.; Volkmer, Kent; Zimmerman, Wayne F.

    1987-01-01

    A methodology is described for examining the feasibility of a Flight Telerobotic Servicer (FTS) using two assembly scenarios, defined at the EVA task level, for the 30 shuttle flights (beginning with MB-1) over a four-year period. Performing all EVA tasks by crew only is compared to a scenario in which crew EVA is augmented by FTS. A reference FTS concept is used as a technology baseline and life-cycle cost analysis is performed to highlight cost tradeoffs. The methodology, procedure, and data used to complete the analysis are documented in detail.

  19. Pressure Measurements Using an Airborne Differential Absorption Lidar. Part 1; Analysis of the Systematic Error Sources

    NASA Technical Reports Server (NTRS)

    Flamant, Cyrille N.; Schwemmer, Geary K.; Korb, C. Laurence; Evans, Keith D.; Palm, Stephen P.

    1999-01-01

    Remote airborne measurements of the vertical and horizontal structure of the atmospheric pressure field in the lower troposphere are made with an oxygen differential absorption lidar (DIAL). A detailed analysis of this measurement technique is provided which includes corrections for imprecise knowledge of the detector background level, the oxygen absorption line parameters, and variations in the laser output energy. In addition, we analyze other possible sources of systematic errors, including spectral effects related to aerosol and molecular scattering, interference by rotational Raman scattering, and interference by isotopic oxygen lines.

  20. Oracle Applications Patch Administration Tool (PAT) Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2002-01-04

    PAT is a Patch Administration Tool that provides analysis, tracking, and management of Oracle Application patches. Its capabilities are outlined below.
    Patch Analysis and Management:
    - Administration: patch data maintenance -- track which Oracle Application patches are applied to which database instance and machine.
    - Patch analysis: capture text files (readme.txt and driver files); form comparison detail; report comparison detail; PL/SQL package comparison detail; SQL scripts detail; JSP module comparison detail; parse and load the current applptch.txt (10.7) or load patch data from Oracle Application database patch tables (11i).
    - Display analysis: compare the patch to be applied with the current Oracle Application installed Appl_top code versions; patch detail; module comparison detail; analyze and display one Oracle Application module patch.
    Patch Management (automatic queue and execution of patches):
    - Administration: parameter maintenance -- settings for the directory structure of the Oracle Application appl_top; validation data maintenance -- machine names and instances to patch.
    - Operation: patch data maintenance; schedule a patch (queue for later execution); run a patch (queue for immediate execution); review the patch logs; patch management reports.

  1. The applicability of frame imaging from a spinning spacecraft. Volume 1: Summary report

    NASA Technical Reports Server (NTRS)

    Botticelli, R. A.; Johnson, R. O.; Wallmark, G. N.

    1973-01-01

    A detailed study was made of frame-type imaging systems for use on board a spin stabilized spacecraft for outer planets applications. All types of frame imagers capable of performing this mission were considered, regardless of the current state of the art. Detailed sensor models of these systems were developed at the component level and used in the subsequent analyses. An overall assessment was then made of the various systems based upon results of a worst-case performance analysis, foreseeable technology problems, and the relative reliability and radiation tolerance of the systems. Special attention was directed at restraints imposed by image motion and the limited data transmission and storage capability of the spacecraft. Based upon this overall assessment, the most promising systems were selected and then examined in detail for a specified Jupiter orbiter mission. The relative merits of each selected system were then analyzed, and the system design characteristics were demonstrated using preliminary configurations, block diagrams, and tables of estimated weights, volumes and power consumption.

  2. Fossil insect evidence for the end of the Western Settlement in Norse Greenland

    NASA Astrophysics Data System (ADS)

    Panagiotakopulu, Eva; Skidmore, Peter; Buckland, Paul

    2007-04-01

    The fate of Norse farming settlements in southwest Greenland has often been seen as one of the great mysteries of North Atlantic colonization and expansion. Preservation of organic remains in the permafrost of the area of the Western Settlement, inland from the modern capital Nuuk, allowed very detailed study of the phases of occupation. Samples were taken from house floors and middens during the archaeological excavations, and from these, insect remains were extracted and identified in the laboratory. In this study, we present a new paleoecological approach principally examining the fossil fly faunas from house floors. The results of our study provide contrasting detailed pictures of the demise of two neighboring farms, Gården under Sandet and Nipaatsoq, one where abandonment appears as part of a normal process of site selection and desertion, and the other where the end was more traumatic. The level of detail, which was obtained by analysis of the dipterous (true fly) remains, exceeds all previous work and provides insights otherwise unobtainable.

  3. Retrofitting solutions for two different occupancy levels of educational buildings in tropics

    NASA Astrophysics Data System (ADS)

    Yang, Junjing; Pantazaras, Alexandros; Lee, Siew Eang; Santamouris, Mattheos

    2018-01-01

    Because educational buildings serve many functions, their energy conservation potential can vary widely. In addition, among the retrofitting solutions investigated, which involve interventions on the building envelope, ventilation strategies, artificial lighting systems and equipment upgrading, the savings come from different aspects of the building. Quantifying the energy saving potential both overall and for each retrofitting measure individually is therefore useful and important for building renovation decision making. This study presents a detailed retrofitting study of two different educational buildings. One represents a building with average occupancy variation, containing mainly offices and labs; the other represents a building with high occupancy variation, containing mainly lecture rooms and studios. The comparison of the results indicates the different energy saving potential of different types of educational buildings. Principal component analysis is also adopted to investigate in detail the performance of one of the buildings, which is influenced more strongly by these retrofitting solutions.

  4. Fossil insect evidence for the end of the Western Settlement in Norse Greenland.

    PubMed

    Panagiotakopulu, Eva; Skidmore, Peter; Buckland, Paul

    2007-04-01

    The fate of Norse farming settlements in southwest Greenland has often been seen as one of the great mysteries of North Atlantic colonization and expansion. Preservation of organic remains in the permafrost of the area of the Western Settlement, inland from the modern capital Nuuk, allowed very detailed study of the phases of occupation. Samples were taken from house floors and middens during the archaeological excavations, and from these, insect remains were extracted and identified in the laboratory. In this study, we present a new paleoecological approach principally examining the fossil fly faunas from house floors. The results of our study provide contrasting detailed pictures of the demise of two neighboring farms, Gården under Sandet and Nipaatsoq, one where abandonment appears as part of a normal process of site selection and desertion, and the other where the end was more traumatic. The level of detail, which was obtained by analysis of the dipterous (true fly) remains, exceeds all previous work and provides insights otherwise unobtainable.

  5. Transcriptomic analysis of Arabidopsis developing stems: a close-up on cell wall genes

    PubMed Central

    Minic, Zoran; Jamet, Elisabeth; San-Clemente, Hélène; Pelletier, Sandra; Renou, Jean-Pierre; Rihouey, Christophe; Okinyo, Denis PO; Proux, Caroline; Lerouge, Patrice; Jouanin, Lise

    2009-01-01

    Background Different strategies (genetics, biochemistry, and proteomics) can be used to study proteins involved in cell biogenesis. The availability of the complete sequences of several plant genomes allowed the development of transcriptomic studies. Although the expression patterns of some Arabidopsis thaliana genes involved in cell wall biogenesis were identified at different physiological stages, detailed microarray analysis of plant cell wall genes has not been performed on any plant tissues. Using transcriptomic and bioinformatic tools, we studied the regulation of cell wall genes in Arabidopsis stems, i.e. genes encoding proteins involved in cell wall biogenesis and genes encoding secreted proteins. Results Transcriptomic analyses of stems were performed at three different developmental stages, i.e., young stems, intermediate stage, and mature stems. Many genes involved in the synthesis of cell wall components such as polysaccharides and monolignols were identified. A total of 345 genes encoding predicted secreted proteins with moderate or high levels of transcripts were analyzed in detail. The encoded proteins were distributed into 8 classes, based on the presence of predicted functional domains. Proteins acting on carbohydrates and proteins of unknown function constituted the two most abundant classes. Other proteins were proteases, oxido-reductases, proteins with interacting domains, proteins involved in signalling, and structural proteins. Particularly high levels of expression were established for genes encoding pectin methylesterases, germin-like proteins, arabinogalactan proteins, fasciclin-like arabinogalactan proteins, and structural proteins. Finally, the results of this transcriptomic analysis were compared with those obtained through a cell wall proteomic analysis of the same material. Only a small proportion of genes identified by previous proteomic analyses were identified by transcriptomics. Conversely, only a few proteins encoded by genes having moderate or high levels of transcripts were identified by proteomics. Conclusion Analysis of the genes predicted to encode cell wall proteins revealed that about 345 genes had moderate or high levels of transcripts. Among them, we identified many new genes possibly involved in cell wall biogenesis. The discrepancies observed between the results of this transcriptomic study and a previous proteomic study on the same material revealed post-transcriptional mechanisms of regulation of expression of genes encoding cell wall proteins. PMID:19149885

  6. Measurement and statistical analysis of single-molecule current-voltage characteristics, transition voltage spectroscopy, and tunneling barrier height.

    PubMed

    Guo, Shaoyin; Hihath, Joshua; Díez-Pérez, Ismael; Tao, Nongjian

    2011-11-30

    We report on the measurement and statistical study of thousands of current-voltage characteristics and transition voltage spectra (TVS) of single-molecule junctions with different contact geometries that are rapidly acquired using a new break junction method at room temperature. This capability allows one to obtain current-voltage, conductance-voltage, and transition voltage histograms, thus adding a new dimension to the previous conductance histogram analysis at a fixed low-bias voltage for single molecules. This method confirms the low-bias conductance values of alkanedithiols and biphenyldithiol reported in the literature. However, at high biases the current shows large nonlinearity and asymmetry, and TVS allows for the determination of a critically important parameter, the tunneling barrier height or energy level alignment between the molecule and the electrodes of single-molecule junctions. The energy level alignment is found to depend on the molecule and also on the contact geometry, revealing the role of contact geometry in both the contact resistance and the energy level alignment of a molecular junction. Detailed statistical analysis further reveals that, despite the dependence of the energy level alignment on contact geometry, the variation in single-molecule conductance is primarily due to contact resistance rather than variations in the energy level alignment.

  7. Detail of basement level concrete beams at southwest corner; camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of basement level concrete beams at southwest corner; camera facing west. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  8. NOSS Altimeter Detailed Algorithm specifications

    NASA Technical Reports Server (NTRS)

    Hancock, D. W.; Mcmillan, J. D.

    1982-01-01

    The details of the algorithms and data sets required for satellite radar altimeter data processing are documented in a form suitable for (1) development of the benchmark software and (2) coding the operational software. The algorithms reported in detail are those established for altimeter processing. The algorithms which required some additional development before documenting for production were only scoped. The algorithms are divided into two levels of processing. The first level converts the data to engineering units and applies corrections for instrument variations. The second level provides geophysical measurements derived from altimeter parameters for oceanographic users.

  9. Sensitive Multi-Species Emissions Monitoring: Infrared Laser-Based Detection of Trace-Level Contaminants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steill, Jeffrey D.; Huang, Haifeng; Hoops, Alexandra A.

    This report summarizes our development of spectroscopic chemical analysis techniques and spectral modeling for trace-gas measurements of highly-regulated low-concentration species present in flue gas emissions from utility coal boilers, such as HCl under conditions of high humidity. Detailed spectral modeling of the spectroscopy of HCl and other important combustion and atmospheric species such as H₂O, CO₂, N₂O, NO₂, SO₂, and CH₄ demonstrates that IR-laser spectroscopy is a sensitive multi-component analysis strategy. Experimental measurements from techniques based on IR laser spectroscopy are presented that demonstrate sub-ppm sensitivity levels to these species. Photoacoustic infrared spectroscopy is used to detect and quantify HCl at ppm levels with extremely high signal-to-noise even under conditions of high relative humidity. Additionally, cavity ring-down IR spectroscopy is used to achieve an extremely high sensitivity to combustion trace gases in this spectral region; ppm-level CH₄ is one demonstrated example. The importance of spectral resolution in the sensitivity of a trace-gas measurement is examined by spectral modeling in the mid- and near-IR, and efforts to improve measurement resolution through novel instrument development are described. While previous project reports focused on benefits and complexities of the dual-etalon cavity ring-down infrared spectrometer, here details on steps taken to implement this unique and potentially revolutionary instrument are described. This report also illustrates and critiques the general strategy of IR-laser photodetection of trace gases, leading to the conclusion that mid-IR laser spectroscopy techniques provide a promising basis for further instrument development and implementation that will enable cost-effective sensitive detection of multiple key contaminant species simultaneously.

  10. mapDIA: Preprocessing and statistical analysis of quantitative proteomics data from data independent acquisition mass spectrometry.

    PubMed

    Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon

    2015-11-03

    Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software was written in C++ and the source code is available for free through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
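
    A rough sketch of the first two steps of such a workflow (normalization and fragment selection) on a toy table; this is not mapDIA's implementation, and the column names and the correlation-based filter are illustrative assumptions.

    ```python
    import pandas as pd

    # Toy fragment-level intensity table: one row per fragment, one column per sample.
    frag = pd.DataFrame({
        "protein":  ["P1"] * 4 + ["P2"] * 4,
        "fragment": [f"f{i}" for i in range(1, 5)] * 2,
        "s1": [100, 120, 90, 500, 200, 210, 190, 20],
        "s2": [150, 180, 140, 100, 300, 310, 290, 400],
        "s3": [130, 160, 120, 300, 250, 260, 240, 100],
    })
    samples = ["s1", "s2", "s3"]

    # Step 1: normalize each sample by its total fragment intensity.
    norm = frag.copy()
    norm[samples] = frag[samples] / frag[samples].sum() * frag[samples].sum().mean()

    # Step 2: drop outlier fragments whose profile disagrees with the protein median.
    def select(group, min_corr=0.5):
        median_profile = group[samples].median()
        corr = group[samples].apply(lambda row: row.corr(median_profile), axis=1)
        return group[corr >= min_corr]

    selected = norm.groupby("protein", group_keys=False).apply(select)

    # Step 3 would fit a model to protein-level summaries; here we just average.
    print(selected.groupby("protein")[samples].mean())
    ```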

  11. Application of a Resilience Framework to Military Installations: A Methodology for Energy Resilience Business Case Decisions

    DTIC Science & Technology

    2016-09-01

    Some technologies that were not included in the analysis (due to site-level evaluations), but could be added in the future, include: wind turbines ...number of entities involved in the procurement, operation, maintenance, testing, and fueling of the generators, detailed inventory and cost data is difficult to obtain. The DPW is often understaffed, leading to uneven testing and maintenance of the equipment despite their best efforts. The

  12. Application of a Resilience Framework to Military Installations: A Methodology for Energy Resilience Business Case Decisions

    DTIC Science & Technology

    2016-10-04

    analysis (due to site-level evaluations), but could be added in the future, include: wind turbines (the installations we visited were not interested due...procurement, operation, maintenance, testing, and fueling of the generators, detailed inventory and cost data is difficult to obtain. The DPW is often understaffed, leading to uneven testing and maintenance of the equipment despite their best efforts. The reliability of these generators is typically

  13. Bureau of Labor Statistics Employment Projections: Detailed Analysis of Selected Occupations and Industries. Report to the Honorable Berkley Bedell, United States House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC.

    To compile its projections of future employment levels, the Bureau of Labor Statistics (BLS) combines the following five interlinked models in a six-step process: a labor force model, an econometric model of the U.S. economy, an industry activity model, an industry labor demand model, and an occupational labor demand model. The BLS was asked to…

  14. Analysis of FY79 Army Aircraft Accidents.

    DTIC Science & Technology

    1980-04-01

    ...maintenance and field manuals ... 'real world' Army operations. It includes detailed lessons ... Additional requirements identified by the results of the ... 2. Emphasis and direction to upgrade training at unit ... and school levels. Review the current regulations and manuals to ... 3. Unit ... Evaluation and revision of Army regulations, technical manuals, field manuals, and other written ... Evaluate effectiveness of programs designed to insure ...

  15. Proteomic analysis of plasma-purified VLDL, LDL, and HDL fractions from atherosclerotic patients undergoing carotid endarterectomy: identification of serum amyloid A as a potential marker.

    PubMed

    Lepedda, Antonio J; Nieddu, Gabriele; Zinellu, Elisabetta; De Muro, Pierina; Piredda, Franco; Guarino, Anna; Spirito, Rita; Carta, Franco; Turrini, Francesco; Formato, Marilena

    2013-01-01

    Apolipoproteins are a very heterogeneous protein family, implicated in plasma lipoprotein structural stabilization, lipid metabolism, inflammation, and immunity. Obtaining detailed information on apolipoprotein composition and structure may contribute to elucidating lipoprotein roles in atherogenesis and to developing new therapeutic strategies for the treatment of lipoprotein-associated disorders. This study aimed at developing a comprehensive method for characterizing the apolipoprotein component of plasma VLDL, LDL, and HDL fractions from patients undergoing carotid endarterectomy, by means of two-dimensional electrophoresis (2-DE) coupled with mass spectrometry analysis, useful for identifying potential markers of plaque presence and vulnerability. The adopted method allowed obtaining reproducible 2-DE maps of exchangeable apolipoproteins from VLDL, LDL, and HDL. Twenty-three protein isoforms were identified by peptide mass fingerprinting analysis. Differential proteomic analysis allowed for identifying increased levels of acute-phase serum amyloid A protein (AP SAA) in all lipoprotein fractions, especially in LDL from atherosclerotic patients. Results have been confirmed by western blotting analysis on each lipoprotein fraction using apo AI levels for data normalization. The higher levels of AP SAA found in patients suggest a role of LDL as an AP SAA carrier into the subendothelial space of the artery wall, where AP SAA accumulates and may exert noxious effects.

  16. PBF (PER620) interior, basement level. Detail of coolant piping. Date: ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF (PER-620) interior, basement level. Detail of coolant piping. Date: May 2004. INEEL negative no. HD-41-5-2 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  17. DETAIL INTERIOR VIEW OF ELECTRIC GENERATOR ON UPPER LEVEL ON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL INTERIOR VIEW OF ELECTRIC GENERATOR ON UPPER LEVEL ON HYDROELECTRIC POWER HOUSE - St. Lucie Canal, Lock No. 1, Hydroelectric Power House, St. Lucie, Cross State Canal, Okeechobee Intracoastal Waterway, Stuart, Martin County, FL

  18. Detail view of fourth level platform winch used to lift ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail view of fourth level platform winch used to lift platform segments away from the Shuttle assembly during testing. - Marshall Space Flight Center, Saturn V Dynamic Test Facility, East Test Area, Huntsville, Madison County, AL

  19. Analysis of Mining-Induced Subsidence Prediction by Exponent Knothe Model Combined with Insar and Leveling

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Zhang, Liguo; Tang, Yixian; Zhang, Hong

    2018-04-01

    The principle of the exponent Knothe model is introduced in detail, and the variation of mining subsidence with time is analysed based on the formulas for subsidence, subsidence velocity and subsidence acceleration. Five scenes of radar images and six levelling measurements were collected to extract ground deformation characteristics in a coal mining area. The unknown parameters of the exponent Knothe model were then estimated by combining levelling data with line-of-sight deformation obtained by the InSAR technique. Comparing the fitting and prediction results obtained from InSAR plus levelling with those obtained from levelling alone showed that the accuracy of the combined approach was clearly better. The InSAR measurements can therefore significantly improve the fitting and prediction accuracy of the exponent Knothe model.
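
    For reference, a commonly cited form of the exponent Knothe time function and its first two time derivatives is written out below; the exact parameterization used in the study is not given in the abstract, so this form is an assumption (with W_0 the final subsidence, c the time coefficient and k the exponent).

    ```latex
    % Assumed exponent Knothe time function and its derivatives
    % (subsidence, subsidence velocity, subsidence acceleration):
    \begin{align}
      W(t) &= W_0 \left(1 - e^{-ct}\right)^{k},\\
      v(t) &= \frac{\mathrm{d}W}{\mathrm{d}t}
            = W_0\, k\, c\, e^{-ct} \left(1 - e^{-ct}\right)^{k-1},\\
      a(t) &= \frac{\mathrm{d}^{2}W}{\mathrm{d}t^{2}}
            = W_0\, k\, c^{2}\, e^{-ct} \left(1 - e^{-ct}\right)^{k-2}
              \left(k\, e^{-ct} - 1\right).
    \end{align}
    ```

    Fitting then amounts to estimating the unknown parameters (here W_0, c and k) from the combined InSAR line-of-sight displacements and the levelling benchmarks.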

  20. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2015-04-01

    The modern society is increasingly dependent on infrastructures to maintain its function, and disruption in one of the infrastructure systems may have severe consequences. The Norwegian municipalities have, according to legislation, a duty to carry out a risk and vulnerability analysis and to plan and prepare for emergencies in a short- and long-term perspective. Vulnerability analysis of the infrastructures and their interdependencies is an important part of this analysis. This paper proposes a model for assessing the risk posed by natural hazards to infrastructures. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. This paper focuses on the second level, which consists of a semi-quantitative analysis. The purpose of this analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures identified in the level 1 analysis and to investigate the need for further analyses, i.e. level 3 quantitative analyses. The proposed level 2 analysis considers the frequency of the natural hazard and different aspects of vulnerability, including the physical vulnerability of the infrastructure itself and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale. The proposed indicators characterize the robustness of the infrastructure and the importance of the infrastructure, as well as interdependencies between society and infrastructure affecting the potential for cascading effects. Each indicator is ranked on a 1-5 scale based on pre-defined ranking criteria. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators and quantitative estimates of the frequency of the natural hazard and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented, where risk to the primary road, water supply and power network threatened by storm and landslide is assessed. The application examples show that the proposed model provides a useful tool for screening of undesirable events, with the ultimate goal of reducing societal vulnerability.
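
    A minimal sketch of a level-2 screening calculation in the spirit described above; the indicator names, weights, aggregation rule and threshold are illustrative assumptions, not the scheme proposed in the paper.

    ```python
    import numpy as np

    def screen(hazard_freq_per_yr, n_users, indicators, weights=None):
        """Combine 1-5 indicator scores (5 = most vulnerable/critical) with
        quantitative frequency and exposure into a single screening score."""
        scores = np.array(list(indicators.values()), dtype=float)
        weights = np.ones_like(scores) if weights is None else np.asarray(weights, float)
        vulnerability = np.average(scores, weights=weights)   # semi-quantitative part
        return hazard_freq_per_yr * n_users * vulnerability   # quantitative part

    # Two illustrative scenarios (all numbers assumed, not from the case studies).
    road_storm = screen(0.2, 12000, {"robustness": 4, "importance": 5, "interdependency": 3})
    water_landslide = screen(0.05, 30000, {"robustness": 2, "importance": 5, "interdependency": 4})

    for name, value in [("primary road / storm", road_storm),
                        ("water supply / landslide", water_landslide)]:
        flag = "candidate for level-3 analysis" if value > 5000 else "no further analysis"
        print(f"{name}: screening score {value:,.0f} -> {flag}")
    ```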

  1. Spatial-structural analysis of leafless woody riparian vegetation for hydraulic considerations

    NASA Astrophysics Data System (ADS)

    Weissteiner, Clemens; Jalonen, Johanna; Järvelä, Juha; Rauch, Hans Peter

    2013-04-01

    Woody riparian vegetation is a vital element of riverine environments. On one hand, woody riparian vegetation has to be taken into account from a civil engineering point of view due to boundary shear stress and vegetation drag. On the other hand, it has to be considered from a river ecological point of view due to shadowing effects and as a source of organic material for aquatic habitats. In hydrodynamic and hydro-ecological studies, the effects of woody riparian vegetation on flow patterns are usually investigated at a very detailed level. On the contrary, vegetation elements and their spatial patterns are generally analysed and discussed on the basis of an integral approach, measuring, for example, basal diameters, heights and projected plant areas. For a better understanding of the influence of woody riparian vegetation on turbulent flow and on river ecology, it is essential to record and analyse plant data sets at the same level of quality as for hydrodynamic or hydro-ecological purposes. As a result of the same scale of analysis, it is possible to incorporate riparian vegetation as a sub-model in the hydraulic analysis. For plant structural components, such as branches on different topological levels, it is crucial to record plant geometrical parameters describing the habitus of the plant at branch level. An exact 3D geometrical model of real plants allows for the extraction of various spatial-structural plant parameters. In addition, allometric relationships help to summarize and describe plant traits of riparian vegetation. This paper focuses on the spatial-structural composition of leafless riparian woody vegetation. Structural and spatial analyses determine detailed geometric properties of the structural components of the plants. Geometrical and topological parameters were recorded with an electro-magnetic scanning device. In total, 23 plants (willows, alders and birches) were analysed in the study. Data were recorded at branch level, which allowed for the development of a 3D geometric plant model. The results are expected to improve knowledge of how the architectural system and allometric relationships of the plants relate to ecological and hydrodynamic properties.

  2. Analysis of enzyme production by submerged culture of Aspergillus oryzae using whole barley.

    PubMed

    Masuda, Susumu; Kikuchi, Kaori; Matsumoto, Yuko; Sugimoto, Toshikazu; Shoji, Hiroshi; Tanabe, Masayuki

    2009-10-01

    We have reported on high enzyme production by submerged culture of Aspergillus kawachii using barley with the husk (whole barley). To elucidate the mechanism underlying this high enzyme production, we performed a detailed analysis. Aspergillus oryzae RIB40 was submerged-cultured using whole barley and milled whole barley. Enzyme production was analyzed in terms of changes in medium components and gene expression levels. When whole barley was used, high production of glucoamylase and alpha-amylase and high gene expression levels of these enzymes were observed. Low ammonium concentrations were maintained with nitrate ion uptake continuing into the late stage using whole barley. These findings suggest that the sustainability of nitrogen metabolism is related to high enzyme production, and that a mechanism other than that associated with the conventional amylase expression system is involved in this relationship.

  3. Environmental regulation of plant gene expression: an RT-qPCR laboratory project for an upper-level undergraduate biochemistry or molecular biology course.

    PubMed

    Eickelberg, Garrett J; Fisher, Alison J

    2013-01-01

    We present a novel laboratory project employing "real-time" RT-qPCR to measure the effect of environment on the expression of the FLOWERING LOCUS C gene, a key regulator of floral timing in Arabidopsis thaliana plants. The project requires four 3-hr laboratory sessions and is aimed at upper-level undergraduate students in biochemistry or molecular biology courses. The project provides students with hands-on experience with RT-qPCR, the current "gold standard" for gene expression analysis, including detailed data analysis using the common 2^(-ΔΔCT) method. Moreover, it provides a convenient starting point for many inquiry-driven projects addressing diverse questions concerning ecological biochemistry, naturally occurring genetic variation, developmental biology, and the regulation of gene expression in nature. Copyright © 2013 Wiley Periodicals, Inc.
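
    A worked example of the 2^(-ΔΔCT) calculation used in the data-analysis step; the Ct values below are made up for illustration, and the reference gene named in the comment is only an example of a typical choice.

    ```python
    def ddct_fold_change(ct_target_treat, ct_ref_treat, ct_target_ctrl, ct_ref_ctrl):
        """Relative expression by the 2^(-ddCt) method."""
        dct_treat = ct_target_treat - ct_ref_treat   # normalize to the reference gene
        dct_ctrl = ct_target_ctrl - ct_ref_ctrl
        ddct = dct_treat - dct_ctrl                  # compare treatment to control
        return 2.0 ** (-ddct)

    # Hypothetical mean Ct values: FLC (target) and a housekeeping reference gene
    # (e.g. UBQ10), in cold-treated versus control plants.
    fold = ddct_fold_change(ct_target_treat=28.5, ct_ref_treat=21.0,
                            ct_target_ctrl=24.0, ct_ref_ctrl=20.5)
    print(f"FLC expression in treated plants = {fold:.2f}-fold relative to control")
    ```

    With these numbers the fold change is 2^-(7.5 - 3.5) = 2^-4 ≈ 0.06, i.e. roughly a 16-fold reduction, the direction expected if the treatment represses FLC.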

  4. Evaluation of the Williams-type model for barley yields in North Dakota and Minnesota

    NASA Technical Reports Server (NTRS)

    Barnett, T. L. (Principal Investigator)

    1981-01-01

    The Williams-type yield model is based on multiple regression analysis of historical time series data at CRD level pooled to regional level (groups of similar CRDs). Basic variables considered in the analysis include USDA yield, monthly mean temperature, monthly precipitation, soil texture and topographic information, and variables derived from these. Technologic trend is represented by piecewise linear and/or quadratic functions of year. Indicators of yield reliability obtained from a ten-year bootstrap test (1970-1979) demonstrate that biases are small and performance based on root mean square appears to be acceptable for the intended AgRISTARS large area applications. The model is objective, adequate, timely, simple, and not costly. It considers scientific knowledge on a broad scale but not in detail, and does not provide a good current measure of modeled yield reliability.
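
    A hedged sketch of the kind of regression described above (weather covariates plus a piecewise-linear technology trend) is given below; the synthetic data and the knot year are placeholders, not the model's actual specification.

      import numpy as np

      rng = np.random.default_rng(0)
      years = np.arange(1960, 1980)
      yields = rng.normal(30, 3, years.size)    # synthetic yields
      precip = rng.normal(80, 15, years.size)   # synthetic precipitation covariate
      temp = rng.normal(18, 1.5, years.size)    # synthetic mean-temperature covariate

      knot = 1970                                # hypothetical trend breakpoint
      trend = years - years.min()
      post_knot = np.maximum(years - knot, 0)    # extra slope after the knot
      X = np.column_stack([np.ones(years.size), trend, post_knot, precip, temp])

      coef, *_ = np.linalg.lstsq(X, yields, rcond=None)
      print(dict(zip(["intercept", "trend", "post-knot", "precip", "temp"],
                     np.round(coef, 3))))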

  5. Advanced manufacturing development of a composite empennage component for l-1011 aircraft

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Tooling concepts were developed which would permit co-curing of the hat stiffeners to the skin to form the cover assembly in a single autoclave cycle. These tooling concepts include the use of solid rubber mandrels, foam mandrels, and formed elastomeric bladders. A simplification of the root end design of the cover hat stiffeners was accomplished in order to facilitate fabrication. The conversion of the 3D NASTRAN model from level 15 to level 16 was completed and a successful check run accomplished. A detailed analysis of the thermal load requirement for the environmental chambers was carried out. Based on the thermal analysis, best function requirements, load inputs and ease of access, a system involving four chambers, two for the covers containing 6 and 4 specimens, respectively, and two for the spars containing 6 and 4 specimens, respectively, evolved.

  6. 33. DETAIL INTERIOR VIEW OF LEVEL +55 IN POWERHOUSE #1, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    33. DETAIL INTERIOR VIEW OF LEVEL +55 IN POWERHOUSE #1, SHOWING TURBINE/GENERATOR CONTROL PANEL FOR TURBINE/GENERATOR UNIT NO 1. - Bonneville Project, Powerhouse No.1, Spanning Bradford Slough, from Bradford Island, Bonneville, Multnomah County, OR

  7. 4. VIEW NORTH, DETAIL SHOWING TERRA COTTA ROUND ARCH AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. VIEW NORTH, DETAIL SHOWING TERRA COTTA ROUND ARCH AND STRING COURSE AT FOURTH FLOOR LEVEL AND FRIEZE AT FIFTH FLOOR LEVEL - West Lexington Street, Number 308 (Commercial Building), 308 West Lexington Street, Baltimore, Independent City, MD

  8. A Dynamic Simulation of Musculoskeletal Function in the Mouse Hindlimb During Trotting Locomotion

    PubMed Central

    Charles, James P.; Cappellari, Ornella; Hutchinson, John R.

    2018-01-01

    Mice are often used as animal models of various human neuromuscular diseases, and analysis of these models often requires detailed gait analysis. However, little is known of the dynamics of the mouse musculoskeletal system during locomotion. In this study, we used computer optimization procedures to create a simulation of trotting in a mouse, using a previously developed mouse hindlimb musculoskeletal model in conjunction with new experimental data, allowing muscle forces, activation patterns, and levels of mechanical work to be estimated. Analyzing musculotendon unit (MTU) mechanical work throughout the stride allowed a deeper understanding of their respective functions, with the rectus femoris MTU dominating the generation of positive and negative mechanical work during the swing and stance phases. This analysis also tested previous functional inferences of the mouse hindlimb made from anatomical data alone, such as the existence of a proximo-distal gradient of muscle function, thought to reflect adaptations for energy-efficient locomotion. The results do not strongly support the presence of this gradient within the mouse musculoskeletal system, particularly given relatively high negative net work output from the ankle plantarflexor MTUs, although more detailed simulations could test this further. This modeling analysis lays a foundation for future studies of the control of vertebrate movement through the development of neuromechanical simulations. PMID:29868576

  9. Approach for Estimating Exposures and Incremental Health ...

    EPA Pesticide Factsheets

    “Approach for Estimating Exposures and Incremental Health Effects from Lead During Renovation, Repair, and Painting Activities in Public and Commercial Buildings” (Technical Approach Document). Also available for public review and comment are two supplementary documents: the detailed appendices for the Technical Approach Document and a supplementary report entitled “Developing a Concentration-Response Function for Pb Exposure and Cardiovascular Disease-Related Mortality.” Together, these documents describe an analysis for estimating exposures and incremental health effects created by renovations of public and commercial buildings (P&CBs). This analysis could be used to identify and evaluate hazards from renovation, repair, and painting activities in P&CBs. A general overview of how this analysis can be used to inform EPA’s hazard finding is described in the Framework document that was previously made available for public comment (79 FR 31072; FRL9910-44). The analysis can be used in any proposed rulemaking to estimate the reduction in deleterious health effects that would result from any proposed regulatory requirements to mitigate exposure from P&CB renovation activities. The Technical Approach Document describes in detail how the analyses under this approach have been performed and presents the results – expected changes in blood lead levels and health effects due to lead exposure from renovation activities.

  10. Association Between Academic Medical Center Pharmaceutical Detailing Policies and Physician Prescribing.

    PubMed

    Larkin, Ian; Ang, Desmond; Steinhart, Jonathan; Chao, Matthew; Patterson, Mark; Sah, Sunita; Wu, Tina; Schoenbaum, Michael; Hutchins, David; Brennan, Troyen; Loewenstein, George

    2017-05-02

    In an effort to regulate physician conflicts of interest, some US academic medical centers (AMCs) enacted policies restricting pharmaceutical representative sales visits to physicians (known as detailing) between 2006 and 2012. Little is known about the effect of these policies on physician prescribing. To analyze the association between detailing policies enacted at AMCs and physician prescribing of actively detailed and not detailed drugs. The study used a difference-in-differences multivariable regression analysis to compare changes in prescribing by physicians before and after implementation of detailing policies at AMCs in 5 states (California, Illinois, Massachusetts, Pennsylvania, and New York) that made up the intervention group with changes in prescribing by a matched control group of similar physicians not subject to a detailing policy. Academic medical center implementation of policies regulating pharmaceutical salesperson visits to attending physicians. The monthly within-drug class market share of prescriptions written by an individual physician for detailed and nondetailed drugs in 8 drug classes (lipid-lowering drugs, gastroesophageal reflux disease drugs, diabetes drugs, antihypertensive drugs, hypnotic drugs approved for the treatment of insomnia [sleep aids], attention-deficit/hyperactivity disorder drugs, antidepressant drugs, and antipsychotic drugs) comparing the 10- to 36-month period before implementation of the detailing policies with the 12- to 36-month period after implementation, depending on data availability. The analysis included 16 121 483 prescriptions written between January 2006 and June 2012 by 2126 attending physicians at the 19 intervention group AMCs and by 24 593 matched control group physicians. The sample mean market share at the physician-drug-month level for detailed and nondetailed drugs prior to enactment of policies was 19.3% and 14.2%, respectively. Exposure to an AMC detailing policy was associated with a decrease in the market share of detailed drugs of 1.67 percentage points (95% CI, -2.18 to -1.18 percentage points; P < .001) and an increase in the market share of nondetailed drugs of 0.84 percentage points (95% CI, 0.54 to 1.14 percentage points; P < .001). Associations were statistically significant for 6 of 8 study drug classes for detailed drugs (lipid-lowering drugs, gastroesophageal reflux disease drugs, antihypertensive drugs, sleep aids, attention-deficit/hyperactivity disorder drugs, and antidepressant drugs) and for 9 of the 19 AMCs that implemented policies. Eleven of the 19 AMCs regulated salesperson gifts to physicians, restricted salesperson access to facilities, and incorporated explicit enforcement mechanisms. For 8 of these 11 AMCs, there was a significant change in prescribing. In contrast, there was a significant change at only 1 of 8 AMCs that did not enact policies in all 3 areas. Implementation of policies at AMCs that restricted pharmaceutical detailing between 2006 and 2012 was associated with modest but significant reductions in prescribing of detailed drugs across 6 of 8 major drug classes; however, changes were not seen in all of the AMCs that enacted policies.
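
    The difference-in-differences design described above can be sketched, under simplifying assumptions and with synthetic data, as an interaction term in an ordinary regression; this is not the paper's full matched, clustered specification.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(42)
      n = 2000
      df = pd.DataFrame({
          "treated": rng.integers(0, 2, n),   # physician at an AMC that enacted a policy
          "post": rng.integers(0, 2, n),      # observation after policy enactment
      })
      # Synthetic outcome with a built-in -1.7 point post-policy effect for treated physicians
      df["market_share"] = 19.0 - 1.7 * df["treated"] * df["post"] + rng.normal(0, 5, n)

      fit = smf.ols("market_share ~ treated * post", data=df).fit()
      print(fit.params["treated:post"])       # the difference-in-differences estimate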

  11. A three-level model for binary time-series data: the effects of air pollution on school absences in the Southern California Children's Health Study.

    PubMed

    Rondeau, Virginie; Berhane, Kiros; Thomas, Duncan C

    2005-04-15

    A three-level model is proposed to simultaneously examine the effects of daily exposure to air pollution and individual risk factors on health outcomes without aggregating over subjects or time. We used a logistic transition model with random effects to take into account heterogeneity and overdispersion of the observations. A distributed lag structure for pollution has been included, assuming that the event on day t for a subject depends on the levels of air pollution for several preceding days. We illustrate this proposed model via detailed analysis of the effect of air pollution on school absenteeism based on data from the Southern California Children's Health Study.
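
    A much-simplified sketch of a distributed-lag logistic regression (same-day and lagged pollution terms, no random effects) is shown below with synthetic data; it is an assumption-level illustration, not the three-level model itself.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      days = 365
      pollution = rng.normal(40, 10, days)   # synthetic daily pollutant levels
      absent = rng.integers(0, 2, days)      # synthetic daily absence indicator

      max_lag = 3
      X = np.array([[1.0] + [pollution[t - l] for l in range(max_lag + 1)]
                    for t in range(max_lag, days)])
      y = absent[max_lag:]

      fit = sm.Logit(y, X).fit(disp=False)
      print(fit.params)                      # intercept, then lag-0..lag-3 coefficients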

  12. Voltage Impacts of Utility-Scale Distributed Wind

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, A.

    2014-09-01

    Although most utility-scale wind turbines in the United States are added at the transmission level in large wind power plants, distributed wind power offers an alternative that could increase the overall wind power penetration without the need for additional transmission. This report examines the distribution feeder-level voltage issues that can arise when adding utility-scale wind turbines to the distribution system. Four of the Pacific Northwest National Laboratory taxonomy feeders were examined in detail to study the voltage issues associated with adding wind turbines at different distances from the substation. General rules relating feeder resistance up to the point of turbine interconnection to the expected maximum voltage change levels were developed. Additional analysis examined line and transformer overvoltage conditions.

  13. MC-1 Engine Valves, Lessons Learned

    NASA Technical Reports Server (NTRS)

    Laszar, John

    2003-01-01

    Many lessons were learned during the development of the valves for the MC-1 engine. The purpose of this report is to focus on a variety of issues related to the engine valves and convey the lessons learned. This paper will not delve into detailed technical analysis of the components. None of the lessons learned are new or surprising, but simply reinforce the importance of addressing the details of the design early, at the component level. The Marshall Space Flight Center (MSFC), Huntsville, Alabama, developed the MC-1 engine, a LOX/RP-1, 60,000-pound-thrust engine. This engine was developed under the Low Cost Boost Technology office at MSFC and proved to be a very successful project for the MSFC Propulsion team and the various subcontractors working the development of the engine and its components.

  14. A visual analysis of multi-attribute data using pixel matrix displays

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel; Schreck, Tobias

    2007-01-01

    Charts and tables are commonly used to visually analyze data. These graphics are simple and easy to understand, but charts show only highly aggregated data and present only a limited number of data values while tables often show too many data values. As a consequence, these graphics may either lose or obscure important information, so different techniques are required to monitor complex datasets. Users need more powerful visualization techniques to digest and compare detailed multi-attribute data to analyze the health of their business. This paper proposes an innovative solution based on the use of pixel-matrix displays to represent transaction-level information. With pixel-matrices, users can visualize areas of importance at a glance, a capability not provided by common charting techniques. We present our solutions to use colored pixel-matrices in (1) charts for visualizing data patterns and discovering exceptions, (2) tables for visualizing correlations and finding root-causes, and (3) time series for visualizing the evolution of long-running transactions. The solutions have been applied with success to product sales, Internet network performance analysis, and service contract applications demonstrating the benefits of our method over conventional graphics. The method is especially useful when detailed information is a key part of the analysis.
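
    A rough sketch of one pixel-matrix cell is given below: each pixel represents a single transaction, colored by its value, so a chart bar can be replaced by a small matrix of transaction-level detail. The data and layout are illustrative assumptions.

      import numpy as np
      import matplotlib.pyplot as plt

      rng = np.random.default_rng(1)
      transactions = rng.gamma(shape=2.0, scale=100.0, size=600)  # synthetic sales amounts

      cols = 30
      rows = int(np.ceil(transactions.size / cols))
      cell = np.full(rows * cols, np.nan)
      cell[:transactions.size] = np.sort(transactions)  # ordered fill makes outliers stand out
      cell = cell.reshape(rows, cols)

      plt.imshow(cell, cmap="viridis", aspect="auto")
      plt.colorbar(label="transaction value")
      plt.title("Pixel-matrix cell (illustrative)")
      plt.show()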

  15. Strategies for the profiling, characterisation and detailed structural analysis of N-linked oligosaccharides.

    PubMed

    Tharmalingam, Tharmala; Adamczyk, Barbara; Doherty, Margaret A; Royle, Louise; Rudd, Pauline M

    2013-02-01

    Many post-translational modifications, including glycosylation, are pivotal for the structural integrity, location and functional activity of glycoproteins. Sub-populations of proteins that are relocated or functionally changed by such modifications can change resting proteins into active ones, mediating specific effector functions, as in the case of monoclonal antibodies. To ensure safe and efficacious drugs it is essential to employ appropriate robust, quantitative analytical strategies that can (i) perform detailed glycan structural analysis, (ii) characterise specific subsets of glycans to assess known critical features of therapeutic activities, and (iii) rapidly profile glycan pools for at-line monitoring or high-level batch-to-batch screening. Here we focus on these aspects of glycan analysis, showing how state-of-the-art technologies are required at all stages during the production of recombinant glycotherapeutics. These data can provide insights into processing pathways and suggest markers for intervention at critical control points in bioprocessing and also critical decision points in disease and drug monitoring in patients. Importantly, these tools are now enabling the first glycome/genome studies in large populations, allowing the integration of glycomics into other 'omics platforms in a systems biology context.

  16. Binary partition tree analysis based on region evolution and its application to tree simplification.

    PubMed

    Lu, Huihai; Woods, John C; Ghanbari, Mohammed

    2007-04-01

    Pyramid image representations via tree structures are recognized methods for region-based image analysis. Binary partition trees can be applied which document the merging process with small details found at the bottom levels and larger ones close to the root. Hindsight of the merging process is stored within the tree structure and provides the change histories of an image property from the leaf to the root node. In this work, the change histories are modelled by evolvement functions and their second order statistics are analyzed by using a knee function. Knee values show the reluctance of each merge. We have systematically formulated these findings to provide a novel framework for binary partition tree analysis, where tree simplification is demonstrated. Based on an evolvement function, for each upward path in a tree, the tree node associated with the first reluctant merge is considered as a pruning candidate. The result is a simplified version providing a reduced solution space and still complying with the definition of a binary tree. The experiments show that image details are preserved whilst the number of nodes is dramatically reduced. An image filtering tool also results which preserves object boundaries and has applications for segmentation.
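
    The knee-based pruning idea can be sketched as follows; the max-distance-to-chord criterion used here is an assumption standing in for the paper's knee function, and the change history is made up.

      import numpy as np

      def knee_index(history):
          """Index of the point farthest from the chord joining the endpoints."""
          y = np.asarray(history, dtype=float)
          x = np.arange(y.size)
          x0, y0, x1, y1 = x[0], y[0], x[-1], y[-1]
          # perpendicular distance from each point to the chord
          dist = np.abs((y1 - y0) * x - (x1 - x0) * y + x1 * y0 - y1 * x0)
          dist /= np.hypot(y1 - y0, x1 - x0)
          return int(np.argmax(dist))

      # Region-size history along one upward path: slow growth, then one reluctant merge
      history = [10, 12, 15, 19, 24, 30, 380, 401]
      print(knee_index(history))   # candidate node for pruning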

  17. The Active for Life Year 5 (AFLY5) school-based cluster randomised controlled trial protocol: detailed statistical analysis plan.

    PubMed

    Lawlor, Debbie A; Peters, Tim J; Howe, Laura D; Noble, Sian M; Kipping, Ruth R; Jago, Russell

    2013-07-24

    The Active For Life Year 5 (AFLY5) randomised controlled trial protocol was published in this journal in 2011. It provided a summary analysis plan. This publication is an update of that protocol and provides a detailed analysis plan. This update provides a detailed analysis plan of the effectiveness and cost-effectiveness of the AFLY5 intervention. The plan includes details of how variables will be quality control checked and the criteria used to define derived variables. Details of four key analyses are provided: (a) effectiveness analysis 1 (the effect of the AFLY5 intervention on primary and secondary outcomes at the end of the school year in which the intervention is delivered); (b) mediation analyses (secondary analyses examining the extent to which any effects of the intervention are mediated via self-efficacy, parental support and knowledge, through which the intervention is theoretically believed to act); (c) effectiveness analysis 2 (the effect of the AFLY5 intervention on primary and secondary outcomes 12 months after the end of the intervention) and (d) cost effectiveness analysis (the cost-effectiveness of the AFLY5 intervention). The details include how the intention to treat and per-protocol analyses were defined and planned sensitivity analyses for dealing with missing data. A set of dummy tables is provided in Additional file 1. This detailed analysis plan was written prior to any analyst having access to any data and was approved by the AFLY5 Trial Steering Committee. Its publication will ensure that analyses are in accordance with an a priori plan related to the trial objectives and not driven by knowledge of the data. ISRCTN50133740.

  18. Potential-scour assessments and estimates of scour depth using different techniques at selected bridge sites in Missouri

    USGS Publications Warehouse

    Huizinga, Richard J.; Rydlund, Jr., Paul H.

    2004-01-01

    The evaluation of scour at bridges throughout the state of Missouri has been ongoing since 1991 in a cooperative effort by the U.S. Geological Survey and Missouri Department of Transportation. A variety of assessment methods have been used to identify bridges susceptible to scour and to estimate scour depths. A potential-scour assessment (Level 1) was used at 3,082 bridges to identify bridges that might be susceptible to scour. A rapid estimation method (Level 1+) was used to estimate contraction, pier, and abutment scour depths at 1,396 bridge sites to identify bridges that might be scour critical. A detailed hydraulic assessment (Level 2) was used to compute contraction, pier, and abutment scour depths at 398 bridges to determine which bridges are scour critical and would require further monitoring or application of scour countermeasures. The rapid estimation method (Level 1+) was designed to be a conservative estimator of scour depths compared to depths computed by a detailed hydraulic assessment (Level 2). Detailed hydraulic assessments were performed at 316 bridges that also had received a rapid estimation assessment, providing a broad data base to compare the two scour assessment methods. The scour depths computed by each of the two methods were compared for bridges that had similar discharges. For Missouri, the rapid estimation method (Level 1+) did not provide a reasonable conservative estimate of the detailed hydraulic assessment (Level 2) scour depths for contraction scour, but the discrepancy was the result of using different values for variables that were common to both of the assessment methods. The rapid estimation method (Level 1+) was a reasonable conservative estimator of the detailed hydraulic assessment (Level 2) scour depths for pier scour if the pier width is used for piers without footing exposure and the footing width is used for piers with footing exposure. Detailed hydraulic assessment (Level 2) scour depths were conservatively estimated by the rapid estimation method (Level 1+) for abutment scour, but there was substantial variability in the estimates and several substantial underestimations.

  19. Neck/shoulder pain in adolescents is not related to the level or nature of self-reported physical activity or type of sedentary activity in an Australian pregnancy cohort.

    PubMed

    Briggs, Andrew M; Straker, Leon M; Bear, Natasha L; Smith, Anne J

    2009-07-20

    An inconsistent relationship between physical activity and neck/shoulder pain (NSP) in adolescents has been reported in the literature. Earlier studies may be limited by not assessing physical activity in sufficient detail. The aim of this study was to comprehensively examine the association between NSP and the level and nature of physical activity, and type of sedentary activity in adolescents. A cross-sectional analysis using data from 924 adolescents in the Western Australian Pregnancy Cohort (RAINE) study was performed. Complete data were available for 643 adolescents (54.6% female) at the 14-year follow-up. Physical activity was measured using a detailed self-report electronic activity diary requiring participants to input details of all physical activities over the day in segments of 5 minutes for a one-week period. Physical activity levels were categorised as: sedentary, light, moderate, or vigorous based on metabolic energy equivalents. Nature of activity was determined by assigning each activity to categories based on the amount of movement (static/dynamic) and the main posture assumed for the activity (standing/sitting/lying). Type of sedentary activity was characterised by exposure time to watching TV, using a computer, and reading. Logistic regression was used to explore the association between NSP and activity. Females reported a higher prevalence of lifetime, 1-month and chronic NSP than males (50.9 vs 41.7%, 34.1 vs 23.5%, and 9.2 vs 6.2% respectively). No consistent, dose-response relationship was found between NSP and the level, nature, and type of physical activity. Self-reported one month and lifetime NSP prevalence in adolescents is not related to the level or intensity of physical activity or the type of sedentary activity over a one week period.

  20. Consciousness as a graded and an all-or-none phenomenon: A conceptual analysis.

    PubMed

    Windey, Bert; Cleeremans, Axel

    2015-09-01

    The issue whether consciousness is a graded or an all-or-none phenomenon has been and continues to be a debate. Both contradictory accounts are supported by solid evidence. Starting from a level of processing framework allowing for states of partial awareness, here we further elaborate our view that visual experience, as it is most often investigated in the literature, is both graded and all-or-none. Low-level visual experience is graded, whereas high-level visual experience is all-or-none. We then present a conceptual analysis starting from the notion that consciousness is a general concept. We specify a number of different subconcepts present in the literature on consciousness, and outline how each of them may be seen as either graded, all-or-none, or both. We argue that such specifications are necessary to lead to a detailed and integrated understanding of how consciousness should be conceived of as graded and all-or-none. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Unrecorded alcohol consumption in Russia: toxic denaturants and disinfectants pose additional risks

    PubMed Central

    Solodun, Yuriy V.; Monakhova, Yulia B.; Kuballa, Thomas; Samokhvalov, Andriy V.; Rehm, Jürgen; Lachenmeier, Dirk W.

    2011-01-01

    In 2005, 30% of all alcohol consumption in Russia was unrecorded. This paper describes the chemical composition of unrecorded and low cost alcohol, including a toxicological evaluation. Alcohol products (n=22) from both recorded and unrecorded sources were obtained from three Russian cities (Saratov, Lipetsk and Irkutsk) and were chemically analyzed. Unrecorded alcohols included homemade samogons, medicinal alcohols and surrogate alcohols. Analysis included alcoholic strength, levels of volatile compounds (methanol, acetaldehyde, higher alcohols), ethyl carbamate, diethyl phthalate (DEP) and polyhexamethyleneguanidine hydrochloride (PHMG). Single samples showed contamination with DEP (275–1269 mg/l) and PHMG (515 mg/l) above levels of toxicological concern. Our detailed chemical analysis of Russian alcohols showed that the composition of vodka, samogon and medicinal alcohols generally did not raise major public health concerns other than for ethanol. It was shown, however, that concentration levels of DEP and PHMG in some surrogate alcohols make these samples unfit for human consumption as even moderate drinking would exceed acceptable daily intakes. PMID:22319254
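
    The intake comparison underlying the last statement can be sketched as simple arithmetic; the drinking volume, body weight and the ADI threshold below are placeholders for illustration, not official toxicological reference values.

      def daily_intake_mg_per_kg(conc_mg_per_l, volume_l_per_day, body_weight_kg):
          # intake (mg/kg bw/day) = concentration x daily volume / body weight
          return conc_mg_per_l * volume_l_per_day / body_weight_kg

      dep_conc = 1269.0        # mg/l, highest DEP level reported above
      intake = daily_intake_mg_per_kg(dep_conc, volume_l_per_day=0.1, body_weight_kg=70.0)
      adi_placeholder = 0.5    # mg/kg bw/day, illustrative threshold only
      print(f"intake {intake:.2f} mg/kg/day exceeds placeholder ADI: {intake > adi_placeholder}")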

  2. Unrecorded alcohol consumption in Russia: toxic denaturants and disinfectants pose additional risks.

    PubMed

    Solodun, Yuriy V; Monakhova, Yulia B; Kuballa, Thomas; Samokhvalov, Andriy V; Rehm, Jürgen; Lachenmeier, Dirk W

    2011-12-01

    In 2005, 30% of all alcohol consumption in Russia was unrecorded. This paper describes the chemical composition of unrecorded and low cost alcohol, including a toxicological evaluation. Alcohol products (n=22) from both recorded and unrecorded sources were obtained from three Russian cities (Saratov, Lipetsk and Irkutsk) and were chemically analyzed. Unrecorded alcohols included homemade samogons, medicinal alcohols and surrogate alcohols. Analysis included alcoholic strength, levels of volatile compounds (methanol, acetaldehyde, higher alcohols), ethyl carbamate, diethyl phthalate (DEP) and polyhexamethyleneguanidine hydrochloride (PHMG). Single samples showed contamination with DEP (275-1269 mg/l) and PHMG (515 mg/l) above levels of toxicological concern. Our detailed chemical analysis of Russian alcohols showed that the composition of vodka, samogon and medicinal alcohols generally did not raise major public health concerns other than for ethanol. It was shown, however, that concentration levels of DEP and PHMG in some surrogate alcohols make these samples unfit for human consumption as even moderate drinking would exceed acceptable daily intakes.

  3. Long High Redshift GRB and Xrt/swift Lightcurves

    NASA Astrophysics Data System (ADS)

    Arkhangelskaja, Irene

    As of February 2010, the subset of Swift GRBs with known redshift comprised more than 150 bursts. Analysis of the long GRB redshift distribution has shown that the confidence level of a single-peak approximation of this distribution is only ~60%. Moreover, more than 40% of GRBs lie in very heavy tails outside the 3σ level of this fit. A more detailed analysis of the long GRB redshift distribution reveals that, at the 97% confidence level, at least two subgroups can be separated, with mean redshifts of 0.9 ± 0.1 and 2.7 ± 0.2. This allows the conclusion that the Swift long GRB source subset is not uniform. In the presented article, attention is paid to the discrepancy between long GRBs with z>3 and the subset of other long GRBs with known redshifts. XRT/Swift lightcurves for these groups of GRBs were considered, showing that at least 90% of the XRT/Swift lightcurves for GRBs with z>3 are more complicated and have a number of maxima.
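
    The two-subgroup separation described above can be sketched as a Gaussian mixture fit with model comparison; the redshift sample below is synthetic, not the Swift catalogue.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(3)
      z = np.concatenate([rng.normal(0.9, 0.4, 90), rng.normal(2.7, 0.6, 60)])
      z = z[z > 0].reshape(-1, 1)

      for k in (1, 2):
          gm = GaussianMixture(n_components=k, random_state=0).fit(z)
          print(k, "component(s): BIC =", round(gm.bic(z), 1),
                "means =", np.round(gm.means_.ravel(), 2))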

  4. Modulation by steroid hormones of a "sexy" acoustic signal in an Oscine species, the Common Canary Serinus canaria.

    PubMed

    Rybak, Fanny; Gahr, Manfred

    2004-06-01

    The respective influence of testosterone and estradiol on the structure of the Common Canary Serinus canaria song was studied by experimentally controlling blood levels of steroid hormones in males and analyzing the consequent effects on acoustic parameters. A detailed acoustic analysis of the songs produced before and after hormonal manipulation revealed that testosterone and estradiol seem to control distinct song parameters independently. The presence of receptors for testosterone and estradiol in the brain neural pathway controlling song production strongly suggests that the observed effects are mediated by a steroid action at the neuronal level.

  5. The Partition of Multi-Resolution LOD Based on Qtm

    NASA Astrophysics Data System (ADS)

    Hou, M.-L.; Xing, H.-Q.; Zhao, X.-S.; Chen, J.

    2011-08-01

    The partition hierarchy of the Quaternary Triangular Mesh (QTM) determines the accuracy of spatial analysis and applications based on QTM. In order to resolve the problem that the partition hierarchy of QTM is limited by the level of the computer hardware, a new method of Multi-Resolution LOD (Level of Detail) based on QTM is discussed in this paper. This method makes the resolution of the cells vary with the viewpoint position by partitioning the cells of QTM, selecting the particular area according to the viewpoint, and dealing with the cracks caused by different subdivisions; it satisfies the requirement of unlimited partitioning of local areas.
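
    A minimal sketch of viewpoint-dependent level selection is given below: cells nearer the viewpoint receive a deeper subdivision level. The distance scaling and level bounds are illustrative assumptions, not the paper's scheme.

      import math

      def lod_level(cell_centre, viewpoint, base_level=4, max_level=10, scale=1000.0):
          """Subdivision level that increases as the cell approaches the viewpoint."""
          d = math.dist(cell_centre, viewpoint)
          extra = max(0, int(math.log2(scale / max(d, 1.0))))
          return min(base_level + extra, max_level)

      print(lod_level((10.0, 20.0), (12.0, 21.0)))    # nearby cell -> fine subdivision
      print(lod_level((10.0, 20.0), (900.0, 900.0)))  # distant cell -> coarse subdivision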

  6. Mapping cumulative noise from shipping to inform marine spatial planning.

    PubMed

    Erbe, Christine; MacGillivray, Alexander; Williams, Rob

    2012-11-01

    Including ocean noise in marine spatial planning requires predictions of noise levels on large spatiotemporal scales. Based on a simple sound transmission model and ship track data (Automatic Identification System, AIS), cumulative underwater acoustic energy from shipping was mapped throughout 2008 in the west Canadian Exclusive Economic Zone, showing high noise levels in critical habitats for endangered resident killer whales, exceeding limits of "good conservation status" under the EU Marine Strategy Framework Directive. Error analysis proved that rough calculations of noise occurrence and propagation can form a basis for management processes, because spending resources on unnecessary detail is wasteful and delays remedial action.
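
    In the spirit of the rough calculations argued for above, the sketch below accumulates shipping noise energy on a grid using a basic spherical-spreading loss; the source levels, positions and spreading law are simplifying assumptions, not the paper's calibrated transmission model.

      import numpy as np

      gx, gy = np.meshgrid(np.linspace(0, 50_000, 100), np.linspace(0, 50_000, 100))
      energy = np.zeros_like(gx)

      ships = [(10_000, 12_000, 175.0), (30_000, 35_000, 180.0)]  # x (m), y (m), source level (dB)
      for sx, sy, sl_db in ships:
          r = np.maximum(np.hypot(gx - sx, gy - sy), 10.0)  # range, clipped near the source
          rl_db = sl_db - 20.0 * np.log10(r)                # received level per grid cell
          energy += 10.0 ** (rl_db / 10.0)                  # accumulate in linear units

      cumulative_db = 10.0 * np.log10(energy)               # back to dB for mapping
      print(round(float(cumulative_db.max()), 1))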

  7. Space transfer concepts and analysis for exploration missions. Implementation plan and element description document. Volume 1: Major trades. Book 1: Draft final

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This document presents trade studies and reference concept designs accomplished during a study of Space Transfer Concepts and Analyses for Exploration Missions (STCAEM). This volume contains the major top level trades, level 2 trades conducted in support of NASA's Lunar/Mars Exploration Program Office, and a synopsis of the vehicles for different propulsion systems under trade consideration. The vehicles are presented in more detail in other volumes of this report. Book 1 of Volume 1 covers the following analyses: lunar/Mars commonality trades, lunar/Mars mission operations, and Mars transfer systems.

  8. Formulation of advanced consumables management models: Environmental control and electrical power system performance models requirements

    NASA Technical Reports Server (NTRS)

    Daly, J. K.; Torian, J. G.

    1979-01-01

    Software design specifications for developing environmental control and life support system (ECLSS) and electrical power system (EPS) programs into interactive computer programs are presented. Specifications for the ECLSS program are at the detail design level with respect to modification of an existing batch mode program. The FORTRAN environmental analysis routines (FEAR) are the subject batch mode program. The characteristics of the FEAR program are included for use in modifying batch mode programs to form interactive programs. The EPS program specifications are at the preliminary design level. Emphasis is on top-down structuring in the development of an interactive program.

  9. Conceptual design of a crewed reusable space transportation system aimed at parabolic flights: stakeholder analysis, mission concept selection, and spacecraft architecture definition

    NASA Astrophysics Data System (ADS)

    Fusaro, Roberta; Viola, Nicole; Fenoglio, Franco; Santoro, Francesco

    2017-03-01

    This paper proposes a methodology to derive architectures and operational concepts for future earth-to-orbit and sub-orbital transportation systems. In particular, at first, it describes the activity flow, methods, and tools leading to the generation of a wide range of alternative solutions to meet the established goal. Subsequently, the methodology allows selecting a small number of feasible options among which the optimal solution can be found. For the sake of clarity, the first part of the paper describes the methodology from a theoretical point of view, while the second part proposes the selection of mission concepts and of a proper transportation system aimed at sub-orbital parabolic flights. Starting from a detailed analysis of the stakeholders and their needs, the major objectives of the mission have been derived. Then, following a system engineering approach, functional analysis tools as well as concept of operations techniques allowed generating a very high number of possible ways to accomplish the envisaged goals. After a preliminary pruning activity, aimed at defining the feasibility of these concepts, more detailed analyses have been carried out. Going on through the procedure, the designer should move from qualitative to quantitative evaluations, and for this reason, to support the trade-off analysis, ad-hoc built-in mission simulation software has been exploited. This support tool aims at estimating major mission drivers (mass, heat loads, manoeuvrability, earth visibility, and volumetric efficiency) as well as proving the feasibility of the concepts. Other crucial and multi-domain mission drivers, such as complexity, innovation level, and safety have been evaluated through the other appropriate analyses. Eventually, one single mission concept has been selected and detailed in terms of layout, systems, and sub-systems, highlighting also logistic, safety, and maintainability aspects.

  10. Australia’s first national level quantitative environmental justice assessment of industrial air pollution

    NASA Astrophysics Data System (ADS)

    Chakraborty, Jayajit; Green, Donna

    2014-04-01

    This study presents the first national level quantitative environmental justice assessment of industrial air pollution in Australia. Specifically, our analysis links the spatial distribution of sites and emissions associated with industrial pollution sources derived from the National Pollution Inventory, to Indigenous status and social disadvantage characteristics of communities derived from Australian Bureau of Statistics indicators. Our results reveal a clear national pattern of environmental injustice based on the locations of industrial pollution sources, as well as volume, and toxicity of air pollution released at these locations. Communities with the highest number of polluting sites, emission volume, and toxicity-weighted air emissions indicate significantly greater proportions of Indigenous population and higher levels of socio-economic disadvantage. The quantities and toxicities of industrial air pollution are particularly higher in communities with the lowest levels of educational attainment and occupational status. These findings emphasize the need for more detailed analysis in specific regions and communities where socially disadvantaged groups are disproportionately impacted by industrial air pollution. Our empirical findings also underscore the growing necessity to incorporate environmental justice considerations in environmental planning and policy-making in Australia.
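
    The linkage described above can be sketched as a toxicity-weighted aggregation of emissions per community joined to a disadvantage indicator; all column names and values below are hypothetical placeholders rather than actual NPI or ABS fields.

      import pandas as pd

      facilities = pd.DataFrame({
          "community": ["A", "A", "B", "C", "C", "C"],
          "emissions_kg": [1200, 300, 80, 2500, 900, 400],
          "toxicity_weight": [0.5, 2.0, 1.0, 3.0, 0.2, 1.5],
      })
      communities = pd.DataFrame({
          "community": ["A", "B", "C"],
          "disadvantage_index": [0.7, 0.2, 0.9],   # higher = more disadvantaged
      })

      facilities["tox_weighted"] = facilities["emissions_kg"] * facilities["toxicity_weight"]
      burden = facilities.groupby("community")["tox_weighted"].sum().reset_index()
      merged = burden.merge(communities, on="community")
      print(merged.corr(numeric_only=True).loc["tox_weighted", "disadvantage_index"])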

  11. A third of patients treated at a tertiary-level surgical service could be treated at a secondary-level facility.

    PubMed

    Van Straten, S; Stannard, C; Bulabula, J; Boodhia, K; Paul, K; Leong, J; Klipin, M J

    2017-08-25

    South Africa (SA) has an overburdened public healthcare system. Some patients admitted to Charlotte Maxeke Johannesburg Academic Hospital (CMJAH), SA, may not require tertiary care, but the numbers and details are uncertain. Clinical research in SA is limited by scarce skills and limited access to data. To determine the proportion of and length of stay for secondary-, tertiary- and quaternary-level patients discharged from the Department of Surgery at CMJAH over 1 year. This is a retrospective analysis of electronic discharge (ED) summaries from the Department of Surgery at CMJAH between 1 April 2015 and 1 April 2016. An SQL query of the database generated a .csv file of all discharges with the following fields: database reference number, length of stay and level of care. The details of each record were verified by MBBCh V students, using a defined level-of-care template and the full discharge summary. The data were reviewed by a senior clinician. There were 3 007 discharge summaries - 97 were not classifiable, two were test records and one was a duplicate. These 100 records were excluded. There were no primary-level records. Secondary-level patients represented 29% (854) of those discharged and 19% of total bed days. Tertiary- and quaternary-level patients together represented 71% of the total and 81% of bed days. The average length of stay was 4.31 days for secondary, 6.98 days for tertiary and 9.77 days for quaternary level-of-care allocation. Almost one-third (29%) of patients discharged from CMJAH's Department of Surgery were deemed suitable for secondary-level care. These patients had a shorter length of stay and comprised 19% of total bed days. Students and electronic databases represent an important research resource.

  12. 38. DETAIL OF CYLINDER LEVELING SYSTEM SHOWING TYPICAL UPPER AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    38. DETAIL OF CYLINDER LEVELING SYSTEM SHOWING TYPICAL UPPER AND LOWER PULLEY BRACKET. F.C. TORKELSON DRAWING NUMBER 842-ARVFS-701-S-8. INEL INDEX CODE - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  13. DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM ESOUTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, HB-3, FACING SOUTHWEST - Cape Canaveral Air Force Station, Launch Complex 39, Vehicle Assembly Building, VAB Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  14. Regulatory-associated protein of TOR (RAPTOR) alters the hormonal and metabolic composition of Arabidopsis seeds, controlling seed morphology, viability and germination potential.

    PubMed

    Salem, Mohamed A; Li, Yan; Wiszniewski, Andrew; Giavalisco, Patrick

    2017-11-01

    Target of Rapamycin (TOR) is a positive regulator of growth and development in all eukaryotes, which positively regulates anabolic processes like protein synthesis, while repressing catabolic processes, including autophagy. To better understand TOR function we decided to analyze its role in seed development and germination. We therefore performed a detailed phenotypic analysis using mutants of the REGULATORY-ASSOCIATED PROTEIN OF TOR 1B (RAPTOR1B), a conserved TOR interactor, acting as a scaffold protein, which recruits substrates for the TOR kinase. Our results show that raptor1b plants produced seeds that were delayed in germination and less resistant to stresses, leading to decreased viability. These physiological phenotypes were accompanied by morphological changes including decreased seed-coat pigmentation and reduced production of seed-coat mucilage. A detailed molecular analysis revealed that many of these morphological changes were associated with significant changes of the metabolic content of raptor1b seeds, including elevated levels of free amino acids, as well as reduced levels of protective secondary metabolites and storage proteins. Most of these observed changes were accompanied by significantly altered phytohormone levels in the raptor1b seeds, with increases in abscisic acid, auxin and jasmonic acid, which are known to inhibit germination. Delayed germination and seedling growth, observed in the raptor1b seeds, could be partially restored by the exogenous supply of gibberellic acid, indicating that TOR is at the center of a regulatory hub controlling seed metabolism, maturation and germination. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  15. Tile-based Level of Detail for the Parallel Age

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niski, K; Cohen, J D

    Today's PCs incorporate multiple CPUs and GPUs and are easily arranged in clusters for high-performance, interactive graphics. We present an approach based on hierarchical, screen-space tiles to parallelizing rendering with level of detail. Adapt tiles, render tiles, and machine tiles are associated with CPUs, GPUs, and PCs, respectively, to efficiently parallelize the workload with good resource utilization. Adaptive tile sizes provide load balancing while our level of detail system allows total and independent management of the load on CPUs and GPUs. We demonstrate our approach on parallel configurations consisting of both single PCs and a cluster of PCs.
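
    The adaptive-tile idea can be sketched as a recursive split of screen-space tiles whenever an estimated cost exceeds a per-worker budget; the cost model and budget below are made-up stand-ins, not the paper's load metric.

      def subdivide(tile, cost_fn, budget, max_depth=6, depth=0):
          """Split a (x, y, w, h) tile into four children until its cost fits the budget."""
          x, y, w, h = tile
          if depth >= max_depth or cost_fn(tile) <= budget:
              return [tile]                              # leaf tile: assign to one worker
          hw, hh = w / 2, h / 2
          children = [(x, y, hw, hh), (x + hw, y, hw, hh),
                      (x, y + hh, hw, hh), (x + hw, y + hh, hw, hh)]
          out = []
          for child in children:
              out.extend(subdivide(child, cost_fn, budget, max_depth, depth + 1))
          return out

      # Toy cost model: cost proportional to tile area, weighted towards the screen centre
      def toy_cost(tile, centre=(960, 540)):
          x, y, w, h = tile
          cx, cy = x + w / 2, y + h / 2
          falloff = 1.0 / (1.0 + ((cx - centre[0]) ** 2 + (cy - centre[1]) ** 2) ** 0.5 / 500)
          return w * h * falloff

      tiles = subdivide((0, 0, 1920, 1080), toy_cost, budget=200_000)
      print(len(tiles), "tiles")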

  16. Fluorescence multi-scale endoscopy and its applications in the study and diagnosis of gastro-intestinal diseases: set-up design and software implementation

    NASA Astrophysics Data System (ADS)

    Gómez-García, Pablo Aurelio; Arranz, Alicia; Fresno, Manuel; Desco, Manuel; Mahmood, Umar; Vaquero, Juan José; Ripoll, Jorge

    2015-06-01

    Endoscopy is frequently used in the diagnosis of several gastro-intestinal pathologies as Crohn disease, ulcerative colitis or colorectal cancer. It has great potential as a non-invasive screening technique capable of detecting suspicious alterations in the intestinal mucosa, such as inflammatory processes. However, these early lesions usually cannot be detected with conventional endoscopes, due to lack of cellular detail and the absence of specific markers. Due to this lack of specificity, the development of new endoscopy technologies, which are able to show microscopic changes in the mucosa structure, are necessary. We here present a confocal endomicroscope, which in combination with a wide field fluorescence endoscope offers fast and specific macroscopic information through the use of activatable probes and a detailed analysis at cellular level of the possible altered tissue areas. This multi-modal and multi-scale imaging module, compatible with commercial endoscopes, combines near-infrared fluorescence (NIRF) measurements (enabling specific imaging of markers of disease and prognosis) and confocal endomicroscopy making use of a fiber bundle, providing a cellular level resolution. The system will be used in animal models exhibiting gastro-intestinal diseases in order to analyze the use of potential diagnostic markers in colorectal cancer. In this work, we present in detail the set-up design and the software implementation in order to obtain simultaneous RGB/NIRF measurements and short confocal scanning times.

  17. Detailed clinical models: a review.

    PubMed

    Goossen, William; Goossen-Baremans, Anneke; van der Zel, Michael

    2010-12-01

    Due to the increasing use of electronic patient records and other health care information technology, we see an increase in requests to utilize these data. A high level of standardization is required during the gathering of these data in the clinical context in order to use them for analyses. Detailed Clinical Models (DCM) have been created toward this purpose and several initiatives have been implemented in various parts of the world to create standardized models. This paper presents a review of DCM. Two types of analyses are presented; one comparing DCM against health care information architectures and a second bottom up approach from concept analysis to representation. In addition core parts of the draft ISO standard 13972 on DCM are used such as clinician involvement, data element specification, modeling, meta information, and repository and governance. Six initiatives were selected: Intermountain Healthcare, 13606/OpenEHR Archetypes, Clinical Templates, Clinical Contents Models, Health Level 7 templates, and Dutch Detailed Clinical Models. Each model selected was reviewed for its overall development, involvement of clinicians, use of data types, code bindings, expressing semantics, modeling, meta information, use of repository and governance. Using both a top down and bottom up approach to comparison reveals many commonalities and differences between initiatives. Important differences include the use of or lack of a reference model and expressiveness of models. Applying clinical data element standards facilitates the use of conceptual DCM models in different technical representations.

  18. Monitoring Change Through Hierarchical Segmentation of Remotely Sensed Image Data

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Lawrence, William T.

    2005-01-01

    NASA's Goddard Space Flight Center has developed a fast and effective method for generating image segmentation hierarchies. These segmentation hierarchies organize image data in a manner that makes their information content more accessible for analysis. Image segmentation enables analysis through the examination of image regions rather than individual image pixels. In addition, the segmentation hierarchy provides additional analysis clues through the tracing of the behavior of image region characteristics at several levels of segmentation detail. The potential for extracting the information content from imagery data based on segmentation hierarchies has not been fully explored for the benefit of the Earth and space science communities. This paper explores the potential of exploiting these segmentation hierarchies for the analysis of multi-date data sets, and for the particular application of change monitoring.
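
    A toy sketch of building such a hierarchy by iterative region merging is shown below on a one-dimensional "image"; it records the merge order so coarser levels can be revisited, but it is only an illustration of the general idea, not the GSFC algorithm.

      import numpy as np

      pixels = np.array([10., 11., 12., 55., 56., 90., 91., 92.])
      regions = [[i] for i in range(pixels.size)]      # each region = list of pixel indices
      merge_log = []                                   # (regions remaining, merged members)

      while len(regions) > 1:
          means = [pixels[r].mean() for r in regions]
          # merge the most similar adjacent pair of regions
          i = int(np.argmin([abs(means[k + 1] - means[k]) for k in range(len(means) - 1)]))
          regions[i] = regions[i] + regions[i + 1]
          del regions[i + 1]
          merge_log.append((len(regions), list(regions[i])))

      for level, members in merge_log:
          print(f"level with {level} regions: merged {members}")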

  19. Effect of the statin therapy on biochemical laboratory tests--a chemometrics study.

    PubMed

    Durceková, Tatiana; Mocák, Ján; Boronová, Katarína; Balla, Ján

    2011-01-05

    Statins are the first-line choice for lowering total and LDL cholesterol levels and very important medicaments for reducing the risk of coronary artery disease. The aim of this study is therefore assessment of the results of biochemical tests characterizing the condition of 172 patients before and after administration of statins. For this purpose, several chemometric tools, namely principal component analysis, cluster analysis, discriminant analysis, logistic regression, KNN classification, ROC analysis, descriptive statistics and ANOVA were used. Mutual relations of 11 biochemical laboratory tests, the patient's age and gender were investigated in detail. Achieved results enable to evaluate the extent of the statin treatment in each individual case. They may also help in monitoring the dynamic progression of the disease. Copyright © 2010 Elsevier B.V. All rights reserved.

  20. Near-infrared Spectroscopy to Reduce Prophylactic Fasciotomies for and Missed Cases of Acute Compartment Syndrome in Soldiers Injured in OEF/OIF

    DTIC Science & Technology

    2012-10-01

    studies demonstrated that NIRS measurement of hemoglobin oxygen saturation in the tibial compartment provided reliable and sensitive correlation to...pressure increases with muscle damage, there is not a complete loss of tissue oxygen saturation in the tissue over the 14 hours of the protocol. In...allow greater detail of information and flexibility in the analysis of tissue oxygenation levels. Although the 7610 oximeter has not been

  1. TAIR: A transonic airfoil analysis computer code

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.

    1981-01-01

    The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.

  2. Studies to design and develop improved remote manipulator systems

    NASA Technical Reports Server (NTRS)

    Hill, J. W.; Sword, A. J.

    1973-01-01

    The remote manipulator control considered is based on several levels of automatic supervision which derive manipulator commands from an analysis of sensor states and task requirements. The principal sensors are manipulator joint position, tactile, and current sensors. The tactile sensor states can be displayed visually in perspective, replicated in the operator's control handle, or perceived by the automatic supervisor. Studies are reported on control organization, operator performance and system performance measures. Unusual hardware and software details are described.

  3. OpenMDAO Framework Status

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia Gutierrez

    2010-01-01

    Advancing and exploring the science of Multidisciplinary Analysis & Optimization (MDAO) capabilities are high-level goals in the Fundamental Aeronautics Program's Subsonic Fixed Wing (SFW) project. The OpenMDAO team has made significant progress toward completing the Alpha OpenMDAO deliverable due in September 2010. Included in the presentation are: details of progress on developing the OpenMDAO framework, example usage of OpenMDAO, technology transfer plans, near term plans, progress toward establishing partnerships with external parties, and discussion of additional potential collaborations.

  4. Characterization and correction of the false-discovery rates in resting state connectivity using functional near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Santosa, Hendrik; Aarabi, Ardalan; Perlman, Susan B.; Huppert, Theodore J.

    2017-05-01

    Functional near-infrared spectroscopy (fNIRS) is a noninvasive neuroimaging technique that uses low levels of red to near-infrared light to measure changes in cerebral blood oxygenation. Spontaneous (resting state) functional connectivity (sFC) has become a critical tool for cognitive neuroscience for understanding task-independent neural networks, revealing pertinent details differentiating healthy from disordered brain function, and discovering fluctuations in the synchronization of interacting individuals during hyperscanning paradigms. Two of the main challenges to sFC-NIRS analysis are (i) the slow temporal structure of both systemic physiology and the response of blood vessels, which introduces false spurious correlations, and (ii) motion-related artifacts that result from movement of the fNIRS sensors on the participants' head and can introduce non-normal and heavy-tailed noise structures. In this work, we systematically examine the false-discovery rates of several time- and frequency-domain metrics of functional connectivity for characterizing sFC-NIRS. Specifically, we detail the modifications to the statistical models of these methods needed to avoid high levels of false-discovery related to these two sources of noise in fNIRS. We compare these analysis procedures using both simulated and experimental resting-state fNIRS data. Our proposed robust correlation method has better performance in terms of being more reliable to the noise outliers due to the motion artifacts.
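
    A hedged sketch of a robust correlation between two noisy channels is given below: a bisquare-weighted robust regression supplies weights that are reused in a weighted correlation, down-weighting motion-like outliers. This is an assumption-level illustration, not the exact pipeline proposed in the paper.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n = 500
      x = rng.normal(size=n)
      y = 0.6 * x + rng.normal(scale=0.8, size=n)
      y[::50] += rng.normal(scale=15, size=n // 50)     # heavy-tailed, motion-like spikes

      # Robust (Tukey bisquare) regression of y on x provides per-sample weights
      rlm = sm.RLM(y, sm.add_constant(x), M=sm.robust.norms.TukeyBiweight()).fit()
      w = rlm.weights

      def weighted_corr(a, b, w):
          a = a - np.average(a, weights=w)
          b = b - np.average(b, weights=w)
          return np.sum(w * a * b) / np.sqrt(np.sum(w * a * a) * np.sum(w * b * b))

      print("ordinary r:", round(np.corrcoef(x, y)[0, 1], 3))
      print("robust   r:", round(weighted_corr(x, y, w), 3))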

  5. Adaptation of a Control Center Development Environment for Industrial Process Control

    NASA Technical Reports Server (NTRS)

    Killough, Ronnie L.; Malik, James M.

    1994-01-01

    In the control center, raw telemetry data is received for storage, display, and analysis. This raw data must be combined and manipulated in various ways by mathematical computations to facilitate analysis, provide diversified fault detection mechanisms, and enhance display readability. A development tool called the Graphical Computation Builder (GCB) has been implemented which provides flight controllers with the capability to implement computations for use in the control center. The GCB provides a language that contains both general programming constructs and language elements specifically tailored for the control center environment. The GCB concept allows staff who are not skilled in computer programming to author and maintain computer programs. The GCB user is isolated from the details of external subsystem interfaces and has access to high-level functions such as matrix operators, trigonometric functions, and unit conversion macros. The GCB provides a high level of feedback during computation development that improves upon the often cryptic errors produced by computer language compilers. An equivalent need can be identified in the industrial data acquisition and process control domain: that of an integrated graphical development tool tailored to the application to hide the operating system, computer language, and data acquisition interface details. The GCB features a modular design which makes it suitable for technology transfer without significant rework. Control center-specific language elements can be replaced by elements specific to industrial process control.

  6. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  7. AGT-102 automotive gas turbine

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Development of a gas turbine powertrain with a 30% fuel economy improvement over a comparable SI reciprocating engine, operation within 0.41 HC, 3.4 CO, and 0.40 NOx grams per mile emissions levels, and ability to use a variety of alternate fuels is summarized. The powertrain concept consists of a single-shaft engine with a ceramic inner shell for containment of hot gases and support of twin regenerators. It uses a fixed-geometry, lean, premixed, prevaporized combustor, and a ceramic radial turbine rotor supported by an air-lubricated journal bearing. The engine is coupled to the vehicle through a wide-range continuously variable transmission, which utilizes gearing and a variable-ratio metal compression belt. A response assist flywheel is used to achieve acceptable levels of engine response. The package offers a 100 lb weight advantage in a Chrysler K Car front-wheel-drive installation. Initial layout studies, preliminary transient thermal analysis, ceramic inner housing structural analysis, and detailed performance analysis were carried out for the basic engine.

  8. In Vivo Assessment of Cold Tolerance through Chlorophyll-a Fluorescence in Transgenic Zoysiagrass Expressing Mutant Phytochrome A

    PubMed Central

    Gururani, Mayank Anand; Venkatesh, Jelli; Ganesan, Markkandan; Strasser, Reto Jörg; Han, Yunjeong; Kim, Jeong-Il; Lee, Hyo-Yeon; Song, Pill-Soon

    2015-01-01

    Chlorophyll-a fluorescence analysis provides relevant information about the physiology of plants growing under abiotic stress. In this study, we evaluated the influence of cold stress on the photosynthetic machinery of transgenic turfgrass, Zoysia japonica, expressing oat phytochrome A (PhyA) or a hyperactive mutant phytochrome A (S599A) with post-translational phosphorylation blocked. Biochemical analysis of zoysiagrass subjected to cold stress revealed reduced levels of hydrogen peroxide, increased proline accumulation, and enhanced specific activities of antioxidant enzymes compared to those of control plants. Detailed analyses of the chlorophyll-a fluorescence data through the so-called OJIP test exhibited a marked difference in the physiological status among transgenic and control plants. Overall, these findings suggest an enhanced level of cold tolerance in S599A zoysiagrass cultivars as reflected in the biochemical and physiological analyses. Further, we propose that chlorophyll-a fluorescence analysis using OJIP test is an efficient tool in determining the physiological status of plants under cold stress conditions. PMID:26010864
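
    As an aside for readers unfamiliar with the JIP test, the sketch below computes two of its standard parameters (Fv/Fm and the relative variable fluorescence at the J step) from a fluorescence transient; it is an illustration only, using an invented transient and NumPy, and is not the authors' analysis code.

      import numpy as np

      # Hypothetical OJIP transient: time in ms, fluorescence in arbitrary units.
      t_ms = np.array([0.05, 0.3, 2.0, 30.0, 300.0, 1000.0])
      f = np.array([480.0, 620.0, 900.0, 1400.0, 1900.0, 2000.0])

      f0 = f[0]                        # minimal fluorescence (O step, ~50 us)
      fm = f.max()                     # maximal fluorescence (P step)
      fv_over_fm = (fm - f0) / fm      # maximum quantum yield of PSII photochemistry

      fj = np.interp(2.0, t_ms, f)     # fluorescence at the J step (~2 ms)
      vj = (fj - f0) / (fm - f0)       # relative variable fluorescence at J

      print(f"Fv/Fm = {fv_over_fm:.3f}, Vj = {vj:.3f}")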

  9. Stability numerical analysis of soil cave in karst area to drawdown of underground water level

    NASA Astrophysics Data System (ADS)

    Mo, Yizheng; Xiao, Rencheng; Deng, Zongwei

    2018-05-01

    As the underground water level falls, reliable estimates of the stability and deformation characteristics of soil caves in karst regions are required for engineering design. To this end, and in combination with practical engineering and field geotechnical testing, detailed analyses of the maximum vertical displacement of the cave roof, the maximum vertical displacement of the ground surface, the maximum principal stress and the maximum shear stress were conducted with finite element software, with an emphasis on two varying factors: the size and the depth of the soil cave. The calculations show that the stability of the soil cave is affected by both its size and its depth, and that collapse accompanies the falling underground water level only once a certain limit is exceeded. Additionally, the maximum shear stress occurs at the arch toes, and the trend of the maximum-displacement deformation curve is similar to that of the maximum shear stress, which further verifies that the collapse of the soil cave is mainly due to shear failure.

  10. The Review of Nuclear Microscopy Techniques: An Approach for Nondestructive Trace Elemental Analysis and Mapping of Biological Materials.

    PubMed

    Mulware, Stephen Juma

    2015-01-01

    The properties of many biological materials often depend on the spatial distribution and concentration of the trace elements present in a matrix. Scientists have over the years tried various approaches, including classical physical and chemical analytical techniques, each with a relative level of accuracy. However, with the development of spatially sensitive submicron beams, nuclear microprobe techniques using focused proton beams for the elemental analysis of biological materials have yielded significant success. In this paper, the basic principles of the commonly used microprobe techniques of STIM, RBS, and PIXE for trace elemental analysis are discussed, together with the details of sample preparation, detection, and data collection and analysis. Finally, an application of the techniques to the analysis of corn roots for elemental distribution and concentration is presented.

  11. DETAIL VIEW ON THE MAIN ASSEMBLY LEVEL OF ELEVATOR SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW ON THE MAIN ASSEMBLY LEVEL OF ELEVATOR SHOWING THE DOUBLED COLUMN FOR THE BUILDING EXPANSION JOINT AT COLUMN LINE AA-18. - Offutt Air Force Base, Glenn L. Martin-Nebraska Bomber Plant, Building D, Peacekeeper Drive, Bellevue, Sarpy County, NE

  12. Accuracy and Spatial Variability in GPS Surveying for Landslide Mapping on Road Inventories at a Semi-Detailed Scale: the Case in Colombia

    NASA Astrophysics Data System (ADS)

    Murillo Feo, C. A.; Martnez Martinez, L. J.; Correa Muñoz, N. A.

    2016-06-01

    The accuracy of locating attributes on topographic surfaces when using GPS in mountainous areas is affected by obstacles to wave propagation. As part of this research on the semi-automatic detection of landslides, we evaluate the accuracy and spatial distribution of the horizontal error in GPS positioning in the tertiary road network of six municipalities located in mountainous areas in the department of Cauca, Colombia, using geo-referencing with GPS mapping equipment and static-fast and pseudo-kinematic methods. We obtained quality parameters for the GPS surveys with differential correction, using a post-processing method. The consolidated database underwent exploratory analyses to determine the statistical distribution, a multivariate analysis to establish relationships and associations between the variables, and an analysis of the spatial variability and calculation of accuracy, considering the effect of non-Gaussian distribution errors. The evaluation of the internal validity of the data provides metrics with a confidence level of 95% between 1.24 and 2.45 m in the static-fast mode and between 0.86 and 4.2 m in the pseudo-kinematic mode. The external validity had an absolute error of 4.69 m, indicating that this descriptor is more critical than precision. Based on the ASPRS standard, the scale obtained with the evaluated equipment was on the order of 1:20000, a level of detail expected in the landslide-mapping project. Modelling the spatial variability of the horizontal errors from the empirical semi-variogram analysis showed prediction errors close to the external validity of the devices.
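
    To make the error and variability descriptors concrete, the following sketch computes per-point horizontal errors, an RMSE-style accuracy figure and a simple empirical semivariogram of those errors with NumPy; the coordinates, lag width and bin count are invented for illustration and do not reproduce the study's processing chain.

      import numpy as np

      # Hypothetical GPS points: measured vs. reference coordinates (metres, local grid).
      xy_meas = np.array([[100.2, 200.5], [150.9, 260.1], [210.4, 310.8], [260.0, 405.3]])
      xy_ref = np.array([[101.0, 201.9], [149.8, 261.5], [211.9, 309.6], [258.7, 406.8]])

      err = np.hypot(*(xy_meas - xy_ref).T)     # horizontal error per point
      rmse = np.sqrt(np.mean(err ** 2))         # accuracy descriptor

      def empirical_semivariogram(coords, values, lag, n_lags):
          """gamma(h) = half the mean squared difference of values separated by ~h."""
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          dv2 = (values[:, None] - values[None, :]) ** 2
          gammas = []
          for k in range(1, n_lags + 1):
              mask = (d > (k - 1) * lag) & (d <= k * lag)
              gammas.append(0.5 * dv2[mask].mean() if mask.any() else np.nan)
          return np.array(gammas)

      print("RMSE (m):", round(float(rmse), 2))
      print("empirical semivariogram:", empirical_semivariogram(xy_ref, err, lag=100.0, n_lags=3))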

  13. Risk assessment based on a combination of historical analysis, a detailed field study and numerical modeling on the alluvial fan Gadeinerbach as a basis for a risk management concept

    NASA Astrophysics Data System (ADS)

    Moser, M.

    2009-04-01

    The catchment Gadeinerbach in the District of Lungau/Salzburg/Austria is prone to debris flows. Large debris flow events date back to the years 1934 and 1953. In the upper catchment, large mass movements represent debris sources. A field study shows the debris potential, and the catchment looks like a "sleeping torrential giant". To plan and carry out mitigation measures, a detailed risk management concept was developed, based on a risk assessment combining historical analysis, a field study and numerical modeling on the alluvial fan. Human activities have partly altered the surface of the alluvial fan Gadeinerbach, but nevertheless some important hazard indicators could be found. With these hazard indicators and photo analysis from the large 1934 debris flow event, the character of the catchment could be established. With the help of these historical data sets (hazard indicators, sediment and debris amounts...), it is possible to calibrate the numerical models and to gain useful knowledge about their pros and cons and their range of application. The results were used to simulate the design event and, furthermore, to derive mitigation measures. On this basis, the most effective protection against debris flows, a reduction of the high energy level to a lower level through controlled energy conversion in combination with a debris/bedload deposition area, was carried out. Expert opinion, the study of historical data and field work are, in addition to numerical simulation techniques, essential for work in the field of natural hazard management.

  14. A Study of the Use of Ontologies for Building Computer-Aided Control Engineering Self-Learning Educational Software

    NASA Astrophysics Data System (ADS)

    García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel

    2013-08-01

    This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies are able to represent in the computer a very rich conceptual model of a given domain. This model can be used later for a number of purposes in different software applications. In this study, domain ontology about the field of lead-lag compensator design has been built and used for automatic exercise generation, graphical user interface population and interaction with the user at any level of detail, including explanations about why things occur. An application called Onto-CELE (ontology-based control engineering learning environment) uses the ontology for implementing a learning environment that can be used for self and lifelong learning purposes. The experience has shown that the use of knowledge models as the basis for educational software applications is capable of showing students the whole complexity of the analysis and design processes at any level of detail. A practical experience with postgraduate students has shown the mentioned benefits and possibilities of the approach.

  15. Comparative study between single core model and detail core model of CFD modelling on reactor core cooling behaviour

    NASA Astrophysics Data System (ADS)

    Darmawan, R.

    2018-01-01

    Nuclear power industry is facing uncertainties since the occurrence of the unfortunate accident at Fukushima Daiichi Nuclear Power Plant. The issue of nuclear power plant safety becomes the major hindrance in the planning of nuclear power program for new build countries. Thus, the understanding of the behaviour of reactor system is very important to ensure the continuous development and improvement on reactor safety. Throughout the development of nuclear reactor technology, investigation and analysis on reactor safety have gone through several phases. In the early days, analytical and experimental methods were employed. For the last four decades 1D system level codes were widely used. The continuous development of nuclear reactor technology has brought about more complex system and processes of nuclear reactor operation. More detailed dimensional simulation codes are needed to assess these new reactors. Recently, 2D and 3D system level codes such as CFD are being explored. This paper discusses a comparative study on two different approaches of CFD modelling on reactor core cooling behaviour.

  16. A longitudinal study on the development of moral judgement in Korean nursing students.

    PubMed

    Kim, Yong-Soon; Park, Jee-Won; Son, Youn-Jung; Han, Sung-Suk

    2004-01-01

    This longitudinal study examined the development of moral judgement in 37 nursing students attending a university in Suwon, Korea. The participants completed the Korean version of the Defining Issues Test to allow analysis of their level of moral judgement. The development of moral judgement was quantified using 'the moral development score' at each stage (i.e. the six stages detailed by Kohlberg) and the 'P(%) score' (a measure of the overall moral judgement level). The results were as follows: (1) the moral development score for stage 5A was consistently the highest across the four years of the students' course, showing significant differences in some sociodemographic factors including home, birth order and monthly income; and (2) the P(%) score was higher in fourth-year (47.47 +/- 11.21) than in first-year (46.13 +/- 9.73) students. There was no significant difference in the P(%) score according to sociodemographic factors. Further studies will examine in detail the correlation between curriculum and moral judgement development. We suggest that courses in ethics education should be made more relevant.

  17. Method for VAWT Placement on a Complex Building Structure

    DTIC Science & Technology

    2013-06-01

    The report's appendices include: Appendix C, ANSYS CFX specifications for wind flow analysis; Appendix D, single rotor analysis ANSYS CFX mesh details; Appendix E, single rotor analysis ANSYS CFX specifics; Appendix F, detailed results of single rotor...; Appendix I, dual rotor analysis ANSYS CFX specifications (6-bladed VAWTs).

  18. Structure and dynamics of European sports science textual contents: Analysis of ECSS abstracts (1996-2014).

    PubMed

    Hristovski, Robert; Aceski, Aleksandar; Balague, Natalia; Seifert, Ludovic; Tufekcievski, Aleksandar; Cecilia, Aguirre

    2017-02-01

    The article discusses general structure and dynamics of the sports science research content as obtained from the analysis of 21998 European College of Sport Science abstracts belonging to 12 science topics. The structural analysis showed intertwined multidisciplinary and unifying tendencies structured along horizontal (scope) and vertical (level) axes. Methodological (instrumental and mode of inquiry) integrative tendencies are dominant. Theoretical integrative tendencies are much less detectable along both horizontal and vertical axes. The dynamic analysis of written abstracts text content over the 19 years reveals the contextualizing and guiding role of thematic skeletons of each sports science topic in forming more detailed contingent research ideas and the role of the latter in stabilizing and procreating the former. This circular causality between both hierarchical levels and functioning on separate characteristic time scales is crucial for understanding how stable research traditions self-maintain and self-procreate through innovative contingencies. The structure of sports science continuously rebuilds itself through use and re-use of contingent research ideas. The thematic skeleton ensures its identity and the contingent conceptual sets its flexibility and adaptability to different research or applicative problems.

  19. Acquisition and production of skilled behavior in dynamic decision-making tasks

    NASA Technical Reports Server (NTRS)

    Kirlik, Alex

    1992-01-01

    Detailed summaries of two NASA-funded research projects are provided. The first project was an ecological task analysis of the Star Cruiser model. Star Cruiser is a psychological model designed to test a subject's level of cognitive activity. Ecological task analysis is used as a framework to predict the types of cognitive activity required to achieve productive behavior and to suggest how interfaces can be manipulated to alleviate certain types of cognitive demands. The second project is presented in the form of a thesis for the Masters Degree. The thesis discusses the modeling of decision-making through the use of neural network and genetic-algorithm machine learning technologies.

  20. Analysis and design of the Multimission Modular Spacecraft hydrazine propulsion module

    NASA Technical Reports Server (NTRS)

    Etheridge, F. G.; Woodruff, W. L.

    1978-01-01

    The translational velocity increment, stabilization and control requirements, vehicle weight, and geometric considerations of the Multimission Modular Spacecraft (MMS) provided the basic data on which to initiate the analysis and design of the hydrazine propulsion modules. The Landsat D was used as the mission model. Tradeoff studies were conducted on thrust level, thruster location, and clustering arrangement together with tankage volume and location. The impact of the use of single and dual seat thruster valves on plumbing configuration, reliability, and overall system cost was studied in detail. Conceptual designs of a recommended propulsion module configuration for both the Delta 3910 and Shuttle were prepared.

  1. Wavelets and molecular structure

    NASA Astrophysics Data System (ADS)

    Carson, Mike

    1996-08-01

    The wavelet method offers possibilities for display, editing, and topological comparison of proteins at a user-specified level of detail. Wavelets are a mathematical tool that first found application in signal processing. The multiresolution analysis of a signal via wavelets provides a hierarchical series of 'best' lower-resolution approximations. B-spline ribbons model the protein fold, with one control point per residue. Wavelet analysis sets limits on the information required to define the winding of the backbone through space, suggesting that a recognizable fold is generated from a number of points equal to 1/4 or less of the number of residues. Wavelets applied to surfaces and volumes show promise in structure-based drug design.
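
    As a simple illustration of the multiresolution idea (not the author's method or data), the sketch below decomposes a one-sample-per-residue signal with a 3-level discrete wavelet transform and rebuilds a lower level of detail by zeroing the finest detail bands; it assumes NumPy and the PyWavelets package.

      import numpy as np
      import pywt  # PyWavelets, assumed available

      # Hypothetical 1-D "backbone coordinate" signal, one sample per residue.
      x = np.cumsum(np.random.default_rng(0).normal(size=128))

      # Multiresolution analysis: 3-level discrete wavelet transform.
      coeffs = pywt.wavedec(x, 'db2', level=3)

      # Keep the coarse approximation plus the coarsest detail band and zero the
      # two finest detail bands to obtain a lower level of detail.
      coeffs_lod = coeffs[:2] + [np.zeros_like(c) for c in coeffs[2:]]
      x_lod = pywt.waverec(coeffs_lod, 'db2')

      retained = sum(c.size for c in coeffs[:2])
      print(f"{retained} retained coefficients approximate a {x.size}-sample signal")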

  2. Quantum Gravitational Effects on the Boundary

    NASA Astrophysics Data System (ADS)

    James, F.; Park, I. Y.

    2018-04-01

    Quantum gravitational effects might hold the key to some of the outstanding problems in theoretical physics. We analyze the perturbative quantum effects on the boundary of a gravitational system and the Dirichlet boundary condition imposed at the classical level. Our analysis reveals that for a black hole solution, there is a contradiction between the quantum effects and the Dirichlet boundary condition: the black hole solution of the one-particle-irreducible action no longer satisfies the Dirichlet boundary condition as would be expected without going into details. The analysis also suggests that the tension between the Dirichlet boundary condition and loop effects is connected with a certain mechanism of information storage on the boundary.

  3. Characterisation of the PXIE Allison-type emittance scanner

    DOE PAGES

    D'Arcy, R.; Alvarez, M.; Gaynier, J.; ...

    2016-01-26

    An Allison-type emittance scanner has been designed for PXIE at FNAL with the goal of providing fast and accurate phase space reconstruction. The device has been modified from previous LBNL/SNS designs to operate in both pulsed and DC modes with the addition of water-cooled front slits. Extensive calibration techniques and error analysis allowed confinement of uncertainty to the <5% level (with known caveats). With a 16-bit, 1 MHz electronics scheme the device is able to analyse a pulse with a resolution of 1 μs, allowing for analysis of neutralisation effects. As a result, this paper describes a detailed breakdown of the R&D, as well as post-run analysis techniques.

  4. Biodiversity in cities needs space: a meta-analysis of factors determining intra-urban biodiversity variation.

    PubMed

    Beninde, Joscha; Veith, Michael; Hochkirch, Axel

    2015-06-01

    Understanding varying levels of biodiversity within cities is pivotal to protect it in the face of global urbanisation. In the early stages of urban ecology studies on intra-urban biodiversity focused on the urban-rural gradient, representing a broad generalisation of features of the urban landscape. Increasingly, studies classify the urban landscape in more detail, quantifying separately the effects of individual urban features on biodiversity levels. However, while separate factors influencing biodiversity variation among cities worldwide have recently been analysed, a global analysis on the factors influencing biodiversity levels within cities is still lacking. We here present the first meta-analysis on intra-urban biodiversity variation across a large variety of taxonomic groups of 75 cities worldwide. Our results show that patch area and corridors have the strongest positive effects on biodiversity, complemented by vegetation structure. Local, biotic and management habitat variables were significantly more important than landscape, abiotic or design variables. Large sites greater than 50 ha are necessary to prevent a rapid loss of area-sensitive species. This indicates that, despite positive impacts of biodiversity-friendly management, increasing the area of habitat patches and creating a network of corridors is the most important strategy to maintain high levels of urban biodiversity. © 2015 John Wiley & Sons Ltd/CNRS.

  5. 2. DETAIL VIEW SHOWING WOODEN CRIBBING WITH LOWERED LAKE LEVEL, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. DETAIL VIEW SHOWING WOODEN CRIBBING WITH LOWERED LAKE LEVEL, EAST DAM, LOOKING NORTHEAST (View is middle of the perimeter showing in MT-88-A-1 above.) - Three Bears Lake & Dams, East Dam, North of Marias Pass, East Glacier Park, Glacier County, MT

  6. A kernel regression approach to gene-gene interaction detection for case-control studies.

    PubMed

    Larson, Nicholas B; Schaid, Daniel J

    2013-11-01

    Gene-gene interactions are increasingly being addressed as a potentially important contributor to the variability of complex traits. Consequently, attention has moved beyond single-locus analysis of association to more complex genetic models. Although several single-marker approaches toward interaction analysis have been developed, such methods suffer from very high testing dimensionality and do not take advantage of existing information, notably the definition of genes as functional units. Here, we propose a comprehensive family of gene-level score tests for identifying genetic elements of disease risk, in particular pairwise gene-gene interactions. Using kernel machine methods, we devise score-based variance component tests under a generalized linear mixed model framework. We conducted simulations based upon coalescent genetic models to evaluate the performance of our approach under a variety of disease models. These simulations indicate that our methods are generally higher powered than alternative gene-level approaches and at worst competitive with exhaustive SNP-level (where SNP is single-nucleotide polymorphism) analyses. Furthermore, we observe that simulated epistatic effects resulted in significant marginal testing results for the involved genes regardless of whether or not true main effects were present. We detail the benefits of our methods and discuss potential genome-wide analysis strategies for gene-gene interaction analysis in a case-control study design. © 2013 WILEY PERIODICALS, INC.
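
    For orientation, a variance-component score statistic of the general kind described (a kernel built over a gene's markers, evaluated against null-model residuals) can be sketched as follows; the genotypes, the intercept-only null model and the linear kernel are illustrative assumptions, and the calibration of the statistic against its null distribution is omitted.

      import numpy as np

      rng = np.random.default_rng(1)
      n, p = 200, 10
      G = rng.integers(0, 3, size=(n, p)).astype(float)    # genotypes for one gene (0/1/2)
      y = rng.integers(0, 2, size=n).astype(float)          # case/control status

      # Null model kept to an intercept for brevity; real analyses adjust for covariates.
      mu = np.full(n, y.mean())
      resid = y - mu

      K = G @ G.T                  # linear kernel between individuals over the gene
      Q = resid @ K @ resid        # variance-component score statistic

      # A pairwise gene-gene interaction kernel could be formed from two gene kernels,
      # for example as their element-wise (Hadamard) product.
      print("score statistic Q =", round(float(Q), 2))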

  7. Supporting tactical intelligence using collaborative environments and social networking

    NASA Astrophysics Data System (ADS)

    Wollocko, Arthur B.; Farry, Michael P.; Stark, Robert F.

    2013-05-01

    Modern military environments place an increased emphasis on the collection and analysis of intelligence at the tactical level. The deployment of analytical tools at the tactical level helps support the Warfighter's need for rapid collection, analysis, and dissemination of intelligence. However, given the lack of experience and staffing at the tactical level, most of the available intelligence is not exploited. Tactical environments are staffed by a new generation of intelligence analysts who are well-versed in modern collaboration environments and social networking. An opportunity exists to enhance tactical intelligence analysis by exploiting these personnel strengths, but is dependent on appropriately designed information sharing technologies. Existing social information sharing technologies enable users to publish information quickly, but do not unite or organize information in a manner that effectively supports intelligence analysis. In this paper, we present an alternative approach to structuring and supporting tactical intelligence analysis that combines the benefits of existing concepts, and provide detail on a prototype system embodying that approach. Since this approach employs familiar collaboration support concepts from social media, it enables new-generation analysts to identify the decision-relevant data scattered among databases and the mental models of other personnel, increasing the timeliness of collaborative analysis. Also, the approach enables analysts to collaborate visually to associate heterogeneous and uncertain data within the intelligence analysis process, increasing the robustness of collaborative analyses. Utilizing this familiar dynamic collaboration environment, we hope to achieve a significant reduction of time and skill required to glean actionable intelligence in these challenging operational environments.

  8. Kalman approach to accuracy management for interoperable heterogeneous model abstraction within an HLA-compliant simulation

    NASA Astrophysics Data System (ADS)

    Leskiw, Donald M.; Zhau, Junmei

    2000-06-01

    This paper reports on results from an ongoing project to develop methodologies for representing and managing multiple, concurrent levels of detail and enabling high performance computing using parallel arrays within distributed object-based simulation frameworks. At this time we present the methodology for representing and managing multiple, concurrent levels of detail and modeling accuracy by using a representation based on the Kalman approach for estimation. The Kalman System Model equations are used to represent model accuracy, Kalman Measurement Model equations provide transformations between heterogeneous levels of detail, and interoperability among disparate abstractions is provided using a form of the Kalman Update equations.
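
    The Kalman equations referred to above are the standard predict/update recursions; the sketch below runs them for a one-dimensional constant-velocity model in NumPy. The matrices and measurement values are illustrative assumptions, not the paper's simulation models.

      import numpy as np

      # 1-D constant-velocity model: state = [position, velocity].
      F = np.array([[1.0, 1.0], [0.0, 1.0]])   # system (state-transition) model
      H = np.array([[1.0, 0.0]])               # measurement model (observe position only)
      Q = 0.01 * np.eye(2)                     # process noise covariance
      R = np.array([[0.5]])                    # measurement noise covariance

      x = np.array([0.0, 1.0])                 # state estimate
      P = np.eye(2)                            # estimate covariance ("accuracy")

      for z in [1.1, 1.9, 3.2, 3.8]:           # incoming measurements
          # Predict
          x = F @ x
          P = F @ P @ F.T + Q
          # Update
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
          x = x + K @ (np.array([z]) - H @ x)
          P = (np.eye(2) - K @ H) @ P

      print("state:", x.round(3), "covariance diagonal:", np.diag(P).round(3))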

  9. Pixel-based image fusion with false color mapping

    NASA Astrophysics Data System (ADS)

    Zhao, Wei; Mao, Shiyi

    2003-06-01

    In this paper, we propose a pixel-based image fusion algorithm that combines the gray-level image fusion method with false color mapping. This algorithm integrates two gray-level images representing different sensor modalities or different frequencies and produces a fused false-color image. The resulting image has higher information content than each of the original images, and the objects in the fused color image are easy to recognize. The algorithm has three steps: first, obtaining the fused gray-level image of the two original images; second, computing the generalized high-boost filtered images between the fused gray-level image and each of the two source images; third, generating the fused false-color image. We use the hybrid averaging and selection fusion method to obtain the fused gray-level image. The fused gray-level image provides better detail than the two original images and reduces noise at the same time. However, the fused gray-level image cannot contain all of the detail information in the two source images, and details in a gray-level image cannot be discerned as easily as in a color image, so a fused color image is necessary. In order to create color variation and enhance details in the final fused image, we produce three generalized high-boost filtered images, which are displayed through the red, green and blue channels respectively, producing the final fused color image. This method is used to fuse two SAR images acquired over the San Francisco area (California, USA). The result shows that the fused false-color image enhances the visibility of certain details. The resolution of the final false-color image is the same as the resolution of the input images.
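
    A minimal sketch of the three described steps is given below, using synthetic arrays in place of the SAR images; the selection threshold, boost factor and channel assignment are assumptions for illustration and do not reproduce the paper's exact fusion rules.

      import numpy as np

      rng = np.random.default_rng(2)
      img_a = rng.random((64, 64))             # source image 1 (e.g. one SAR acquisition)
      img_b = rng.random((64, 64))             # source image 2

      # Step 1: hybrid averaging/selection fusion (select where the sources differ strongly).
      diff = np.abs(img_a - img_b)
      fused = np.where(diff > 0.5, np.maximum(img_a, img_b), 0.5 * (img_a + img_b))

      # Step 2: generalized high-boost images between the fused image and each source.
      amp = 1.5                                # boost factor (assumed)
      boost_a = np.clip(amp * fused - img_a, 0.0, None)
      boost_b = np.clip(amp * fused - img_b, 0.0, None)

      # Step 3: map the fused/boosted images to the R, G and B channels of a composite.
      rgb = np.stack([boost_a, fused, boost_b], axis=-1)
      rgb /= rgb.max()                         # normalize for display
      print("false-color image shape:", rgb.shape)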

  10. A survey of computational methods and error rate estimation procedures for peptide and protein identification in shotgun proteomics

    PubMed Central

    Nesvizhskii, Alexey I.

    2010-01-01

    This manuscript provides a comprehensive review of the peptide and protein identification process using tandem mass spectrometry (MS/MS) data generated in shotgun proteomic experiments. The commonly used methods for assigning peptide sequences to MS/MS spectra are critically discussed and compared, from basic strategies to advanced multi-stage approaches. Particular attention is paid to the problem of false-positive identifications. Existing statistical approaches for assessing the significance of peptide to spectrum matches are surveyed, ranging from single-spectrum approaches such as expectation values to global error rate estimation procedures such as false discovery rates and posterior probabilities. The importance of using auxiliary discriminant information (mass accuracy, peptide separation coordinates, digestion properties, etc.) is discussed, and advanced computational approaches for joint modeling of multiple sources of information are presented. This review also includes a detailed analysis of the issues affecting the interpretation of data at the protein level, including the amplification of error rates when going from peptide to protein level, and the ambiguities in inferring the identities of sample proteins in the presence of shared peptides. Commonly used methods for computing protein-level confidence scores are discussed in detail. The review concludes with a discussion of several outstanding computational issues. PMID:20816881
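
    As an example of one of the global error-rate procedures surveyed, the sketch below estimates target-decoy false discovery rates and monotone q-values from a ranked list of peptide-spectrum match scores; the scores and decoy labels are invented, and real pipelines differ in scoring and filtering details.

      import numpy as np

      # Hypothetical PSM scores and labels (True = decoy hit, False = target hit).
      scores = np.array([9.1, 8.7, 8.2, 7.9, 7.5, 7.1, 6.8, 6.4, 6.0, 5.5])
      is_decoy = np.array([0, 0, 0, 1, 0, 0, 1, 0, 1, 1], dtype=bool)

      order = np.argsort(-scores)                      # best score first
      decoys = np.cumsum(is_decoy[order])
      targets = np.cumsum(~is_decoy[order])

      fdr = decoys / np.maximum(targets, 1)            # estimated FDR at each score cutoff
      qvals = np.minimum.accumulate(fdr[::-1])[::-1]   # enforce monotone q-values

      for s, q in zip(scores[order], qvals):
          print(f"score >= {s:.1f}  ->  q ~ {q:.2f}")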

  11. The Effects of Varying Pictorial Detail and Presentation Strategy on Concept Formation.

    ERIC Educational Resources Information Center

    Gorman, Don A.

    The purpose of this study is to determine the effects of varying pictorial detail and presentation strategy on learners of varying grade levels in a visually transmitted concept formation task. Specifically, line drawings containing only relevant details and halftones containing relevant and irrelevant detail were presented successively and…

  12. Statistical analysis of electromagnetic radiation measurements in the vicinity of GSM/UMTS base station antenna masts.

    PubMed

    Koprivica, Mladen; Neskovic, Natasa; Neskovic, Aleksandar; Paunovic, George

    2014-01-01

    As a result of dense installations of public mobile base stations, additional electromagnetic radiation occurs in the living environment. In order to determine the level of radio-frequency radiation generated by base stations, extensive electromagnetic field strength measurements were carried out for 664 base station locations. Base station locations were classified into three categories: indoor, masts and locations with installations on buildings. Given the large percentage (47%) of sites with antenna masts, a detailed analysis of this location category was performed and the measurement results are presented. It was concluded that the total electric field strength in the vicinity of base station antenna masts in no case exceeded 10 V m(-1), which is well below the International Commission on Non-Ionizing Radiation Protection reference levels. At horizontal distances >50 m from the mast bottom, the median and maximum values were <1 and 2 V m(-1), respectively.

  13. Automatic Parametrization of Somatosensory Evoked Potentials With Chirp Modeling.

    PubMed

    Vayrynen, Eero; Noponen, Kai; Vipin, Ashwati; Thow, X Y; Al-Nashash, Hasan; Kortelainen, Jukka; All, Angelo

    2016-09-01

    In this paper, an approach using polynomial phase chirp signals to model somatosensory evoked potentials (SEPs) is proposed. SEP waveforms are assumed as impulses undergoing group velocity dispersion while propagating along a multipath neural connection. Mathematical analysis of pulse dispersion resulting in chirp signals is performed. An automatic parameterization of SEPs is proposed using chirp models. A Particle Swarm Optimization algorithm is used to optimize the model parameters. Features describing the latencies and amplitudes of SEPs are automatically derived. A rat model is then used to evaluate the automatic parameterization of SEPs in two experimental cases, i.e., anesthesia level and spinal cord injury (SCI). Experimental results show that chirp-based model parameters and the derived SEP features are significant in describing both anesthesia level and SCI changes. The proposed automatic optimization based approach for extracting chirp parameters offers potential for detailed SEP analysis in future studies. The method implementation in Matlab technical computing language is provided online.
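
    To illustrate the signal family involved (not the paper's exact parameterisation or its PSO fitting), the sketch below generates a Gaussian-windowed chirp with a quadratic polynomial phase and extracts simple latency/amplitude descriptors; all parameter values are invented.

      import numpy as np

      def chirp_component(t, amp, t0, width, f0, k, phi0):
          """Gaussian-windowed chirp with a quadratic (polynomial) phase."""
          phase = 2.0 * np.pi * (phi0 + f0 * (t - t0) + 0.5 * k * (t - t0) ** 2)
          return amp * np.exp(-((t - t0) / width) ** 2) * np.cos(phase)

      t = np.linspace(0.0, 0.05, 2000)          # 50 ms epoch, in seconds
      sep = chirp_component(t, amp=4.0, t0=0.012, width=0.004, f0=180.0, k=-3e3, phi0=0.0)
      sep += 0.3 * np.random.default_rng(3).normal(size=t.size)   # measurement noise

      # Features analogous to classical SEP descriptors: peak latency and amplitude.
      i_peak = int(np.argmax(np.abs(sep)))
      print(f"peak latency ~ {1e3 * t[i_peak]:.1f} ms, amplitude ~ {sep[i_peak]:.2f}")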

  14. Combinational pixel-by-pixel and object-level classifying, segmenting, and agglomerating in performing quantitative image analysis that distinguishes between healthy non-cancerous and cancerous cell nuclei and delineates nuclear, cytoplasm, and stromal material objects from stained biological tissue materials

    DOEpatents

    Boucheron, Laura E

    2013-07-16

    Quantitative object and spatial arrangement-level analysis of tissue are detailed using expert (pathologist) input to guide the classification process. A two-step method is disclosed for imaging tissue, by classifying one or more biological materials, e.g. nuclei, cytoplasm, and stroma, in the tissue into one or more identified classes on a pixel-by-pixel basis, and segmenting the identified classes to agglomerate one or more sets of identified pixels into segmented regions. Typically, the one or more biological materials comprises nuclear material, cytoplasm material, and stromal material. The method further allows a user to markup the image subsequent to the classification to re-classify said materials. The markup is performed via a graphic user interface to edit designated regions in the image.
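
    The two-step idea (pixel-by-pixel classification followed by agglomeration into segmented regions) can be sketched as follows; the single intensity feature, the class centroids and the use of nearest-centroid labelling plus connected-component labelling are illustrative assumptions, not the patented method.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(4)
      img = rng.random((32, 32))               # stand-in feature image for stained tissue

      # Step 1: pixel-by-pixel classification into stroma / cytoplasm / nuclei by
      # nearest class centroid in the feature space (centroid values are assumed).
      centroids = np.array([0.2, 0.5, 0.8])    # stroma, cytoplasm, nuclei
      labels = np.argmin(np.abs(img[..., None] - centroids), axis=-1)

      # Step 2: agglomerate identified nuclei pixels into segmented regions (objects).
      nuclei_mask = labels == 2
      regions, n_regions = ndimage.label(nuclei_mask)
      sizes = ndimage.sum(nuclei_mask, regions, index=range(1, n_regions + 1))

      print(f"{n_regions} candidate nuclei regions; mean size {np.mean(sizes):.1f} px")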

  15. Granularity analysis for mathematical proofs.

    PubMed

    Schiller, Marvin R G

    2013-04-01

    Mathematical proofs generally allow for various levels of detail and conciseness, such that they can be adapted for a particular audience or purpose. Using automated reasoning approaches for teaching proof construction in mathematics presupposes that the step size of proofs in such a system is appropriate within the teaching context. This work proposes a framework that supports the granularity analysis of mathematical proofs, to be used in the automated assessment of students' proof attempts and for the presentation of hints and solutions at a suitable pace. Models for granularity are represented by classifiers, which can be generated by hand or inferred from a corpus of sample judgments via machine-learning techniques. This latter procedure is studied by modeling granularity judgments from four experts. The results provide support for the granularity of assertion-level proofs but also illustrate a degree of subjectivity in assessing step size. Copyright © 2013 Cognitive Science Society, Inc.

  16. Experimental Analysis of Steel Beams Subjected to Fire Enhanced by Brillouin Scattering-Based Fiber Optic Sensor Data.

    PubMed

    Bao, Yi; Chen, Yizheng; Hoehler, Matthew S; Smith, Christopher M; Bundy, Matthew; Chen, Genda

    2017-01-01

    This paper presents high temperature measurements using a Brillouin scattering-based fiber optic sensor and the application of the measured temperatures and building code recommended material parameters into enhanced thermomechanical analysis of simply supported steel beams subjected to combined thermal and mechanical loading. The distributed temperature sensor captures detailed, nonuniform temperature distributions that are compared locally with thermocouple measurements with less than 4.7% average difference at 95% confidence level. The simulated strains and deflections are validated using measurements from a second distributed fiber optic (strain) sensor and two linear potentiometers, respectively. The results demonstrate that the temperature-dependent material properties specified in the four investigated building codes lead to strain predictions with less than 13% average error at 95% confidence level and that the Europe building code provided the best predictions. However, the implicit consideration of creep in Europe is insufficient when the beam temperature exceeds 800°C.

  17. Analysis of the Flicker Level Produced by a Fixed-Speed Wind Turbine

    NASA Astrophysics Data System (ADS)

    Suppioni, Vinicius; P. Grilo, Ahda

    2013-10-01

    In this article, the analysis of the flicker emission during continuous operation of a mid-scale fixed-speed wind turbine connected to a distribution system is presented. Flicker emission is investigated based on simulation results, and the dependence of flicker emission on short-circuit capacity, grid impedance angle, mean wind speed, and wind turbulence is analyzed. The simulations were conducted in different programs in order to provide a more realistic wind emulation and detailed model of mechanical and electrical components of the wind turbine. Such aim is accomplished by using FAST (Fatigue, Aerodynamics, Structures, and Turbulence) to simulate the mechanical parts of the wind turbine, Simulink/MatLab to simulate the electrical system, and TurbSim to obtain the wind model. The results show that, even for a small wind generator, the flicker level can limit the wind power capacity installed in a distribution system.

  18. Developments in remote sensing technology enable more detailed urban flood risk analysis.

    NASA Astrophysics Data System (ADS)

    Denniss, A.; Tewkesbury, A.

    2009-04-01

    Spaceborne remote sensors have been allowing us to build up a profile of planet earth for many years. With each new satellite launched we see the capabilities improve: new bands of data, higher resolution imagery, the ability to derive better elevation information. The combination of these geospatial data to create land cover and usage maps all helps inform catastrophe modelling systems. From Landsat 30m resolution to 2.44m QuickBird multispectral imagery; from 1m radar data collected by TerraSAR-X, which enables rapid tracking of the rise and fall of a flood event and will shortly have a twin satellite launched enabling elevation data creation; we are spoilt for choice in available data. However, just what is cost effective? It is always a question of choosing the appropriate level of input data detail for modelling, depending on the value of the risk. In the summer of 2007, the cost of the flooding in the UK was approximately £3bn and affected over 58,000 homes and businesses. When it comes to flood risk, we have traditionally considered rising river levels and surge tides, but with climate change and variations in our own construction behaviour, there are other factors to be taken into account. During those summer 2007 events, the Environment Agency suggested that around 70% of the properties damaged were the result of pluvial flooding, where high localised rainfall events overload localised drainage infrastructure, causing widespread flooding of properties and infrastructure. To create a risk model that is able to simulate such an event requires much more accurate source data than can be provided from satellite or radar. As these flood events cause considerable damage within relatively small, complex urban environments, new high resolution remote sensing techniques have to be applied to better model these events. Detailed terrain data of England and Wales, plus cities in Scotland, have been produced by combining terrain measurements from the latest digital airborne sensors, both optical and lidar, to form the input layer for surface water flood modelling. A national flood map product has been created. The new product utilises sophisticated modelling techniques, perfected over many years, which harness graphical processing power. This product will prove particularly valuable for risk assessment decision support within insurance/reinsurance, property/environmental, utilities, risk management and government agencies. However, it is not just the ground elevation that determines the behaviour of surface water. By combining height information (surface and terrain) with high resolution aerial photography and colour infrared imagery, a high definition land cover mapping dataset (LandBase) is being produced, which provides a precise measure of sealed versus non-sealed surfaces. This will allow even more sophisticated modelling of flood scenarios. Thus, the value of airborne survey data can be demonstrated by flood risk analysis down to individual addresses in urban areas. However, for some risks an even more detailed survey may be justified. In order to achieve this, Infoterra is testing new 360° mobile lidar technology. Collecting lidar data from a moving vehicle allows each street to be mapped in very high detail, allowing precise information about the location, size and shape of features such as kerbstones, gullies, road camber and building threshold level to be captured quickly and accurately. These data can then be used to model the problem of overland flood risk at the scale of individual properties. Whilst at present it might be impractical to undertake such detailed modelling for all properties, these techniques can certainly be used to improve the flood risk analysis of key locations. This paper will demonstrate how these new high resolution remote sensing techniques can be combined to provide a new resolution of detail to aid urban flood modelling.

  19. Exterior building details of Building B, west façade: road level ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Exterior building details of Building B, west façade: road level four-over-four double-hung painted-wood windows with brick sill and arch brick lintels; easterly view - San Quentin State Prison, Building 22, Point San Quentin, San Quentin, Marin County, CA

  20. Forensic genetic analysis of bio-geographical ancestry.

    PubMed

    Phillips, Chris

    2015-09-01

    With the great strides made in the last ten years in the understanding of human population variation and the detailed characterization of the genome, it is now possible to identify sets of ancestry informative markers suitable for relatively small-scale PCR-based assays and use them to analyze the ancestry of an individual from forensic DNA. This review outlines some of the current understanding of past human population structure and how it may have influenced the complex distribution of contemporary human diversity. A simplified description of human diversity can provide a suitable basis for choosing the best ancestry-informative markers, which is important given the constraints of multiplex sizes in forensic DNA tests. It is also important to decide the level of geographic resolution that is realistic to ensure the balance between informativeness and an over-simplification of complex human diversity patterns. A detailed comparison is made of the most informative ancestry markers suitable for forensic use and assessments are made of the data analysis regimes that can provide statistical inferences of a DNA donor's bio-geographical ancestry. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  1. Response of resin transfer molded (RTM) composites under reversed cyclic loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahfuz, H.; Haque, A.; Yu, D.

    1996-01-01

    Compressive behavior and the tension-compression fatigue response of resin transfer molded IM7 PW/PR 500 composite laminate with a circular notch have been studied. Fatigue damage characteristics have been investigated through the changes in the laminate strength and stiffness by gradually incrementing the fatigue cycles at a preselected load level. Progressive damage in the surface of the laminate during fatigue has been investigated using cellulose replicas. Failure mechanisms during static and cyclic tests have been identified and presented in detail. Extensive debonding of filaments and complete fiber bundle fracture accompanied by delamination were found to be responsible for fatigue failures, while fiber buckling, partial fiber fracture and delamination were characterized as the failure modes during static tests. Weibull analysis of the static, cyclic and residual tests has been performed and described in detail. Fractured as well as untested specimens were C-scanned, and the progressive damage growth during fatigue is presented. Optical Microscopy (OM) and Scanning Electron Microscopy (SEM) for the fractured specimen were also performed and the analysis of the failure behavior is presented.
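
    For readers unfamiliar with the Weibull analysis mentioned, the sketch below fits a two-parameter Weibull distribution to hypothetical strength data with SciPy and reads off a survival probability; the values and the fixed zero location are assumptions and do not reproduce the study's data or procedure.

      import numpy as np
      from scipy import stats

      # Hypothetical residual-strength data (MPa) after a block of fatigue cycles.
      strength = np.array([412.0, 398.0, 441.0, 385.0, 420.0, 430.0, 405.0, 391.0, 447.0, 416.0])

      # Two-parameter Weibull fit (location fixed at zero).
      shape, loc, scale = stats.weibull_min.fit(strength, floc=0)

      # Probability of surviving a 400 MPa applied stress under the fitted model.
      survival_400 = stats.weibull_min.sf(400.0, shape, loc=loc, scale=scale)
      print(f"Weibull modulus ~ {shape:.1f}, characteristic strength ~ {scale:.0f} MPa, "
            f"P(survive 400 MPa) ~ {survival_400:.2f}")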

  2. A high-level 3D visualization API for Java and ImageJ.

    PubMed

    Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin

    2010-05-21

    Current imaging methods such as Magnetic Resonance Imaging (MRI), Confocal microscopy, Electron Microscopy (EM) or Selective Plane Illumination Microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. The reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. Our framework enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that removes the low-level details enables concentrating software development efforts on the algorithm implementation parts. Our framework enables biomedical image software development to be built with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.

  3. System analysis to estimate subsurface flow: from global level to the State of Minnesota

    NASA Astrophysics Data System (ADS)

    Shmagin, Boris A.; Kanivetsky, Roman

    2002-06-01

    Stream runoff data globally and in the state of Minnesota were used to estimate subsurface water flow. This system approach is based, in principle, on the unity of groundwater and surface water systems, and it is in stark contrast to the traditional deterministic approach based on modeling. In accordance with the methodology of system analysis, two levels of study were used to estimate subsurface flow. First, the global stream runoff data were assessed to estimate the temporal-spatial variability of surface water runoff. Factor analysis was used to study the temporal-spatial variability of global runoff for the period from 1918 to 1967. Results of these analyses demonstrate that the variability of global runoff could be represented by seven major components (factor scores) that could be grouped into seven distinct independent groupings from the total of 18 continental slopes on the Earth. The computed variance value in this analysis is 76% and supports such an analysis. The global stream runoff for this period is stationary, and is more closely connected with the stream flow of Asia to the Pacific Ocean as well as with the stream runoff of North America towards the Arctic and Pacific Oceans. The second level examines the distribution of river runoff (annual and for February) for various landscapes and the hydrogeological conditions in the State of Minnesota (218,000 km2). The annual and minimal monthly rates of stream runoff for 115 gauging stations with a period of observation of 47 years (1935-1981) were used to characterize the spatio-temporal distribution of stream runoff in Minnesota. Results of this analysis demonstrate that the annual stream runoff rate changes from 6.3, towards 3.95, and then to 2.09 l s-1 km-2 (the differences are significant based on Student's criterion). These values in Minnesota correspond to ecological provinces from a mixed forest province towards the broadleaf forest and to the prairie province, respectively. The distribution of the minimal monthly stream runoff rate (February runoff) is controlled by hydrogeological systems in Minnesota. The difference between the two hydrogeological regions, the Precambrian crystalline basement and the Paleozoic artesian basin, of 0.83 and 2.09 l/s/km2, is statistically significant. Within these regions, the monthly minimal runoff (0.5 and 1.68, and 0.87 and 3.11 l s-1 km-2 for February, respectively) is also distinctly different for delineated subregions, depending on whether or not the Quaternary cover is present. The spatio-temporal structure that emerges could thus be used to generate river runoff and subsurface flow maps at any scale - from the global level to local detail. Such analysis was carried out in Minnesota with the detailed mapping of the subsurface flow for the Twin Cities Metropolitan area.

  4. System analysis to estimate subsurface flow: From global level to the State of Minnesota

    USGS Publications Warehouse

    Shmagin, B.A.; Kanivetsky, R.

    2002-01-01

    Stream runoff data globally and in the state of Minnesota were used to estimate subsurface water flow. This system approach is based, in principle, on the unity of groundwater and surface water systems, and it is in stark contrast to the traditional deterministic approach based on modeling. In accordance with the methodology of system analysis, two levels of study were used to estimate subsurface flow. First, the global stream runoff data were assessed to estimate the temporal-spatial variability of surface water runoff. Factor analysis was used to study the temporal-spatial variability of global runoff for the period from 1918 to 1967. Results of these analyses demonstrate that the variability of global runoff could be represented by seven major components (factor scores) that could be grouped into seven distinct independent groupings from the total of 18 continental slopes on the Earth. The computed variance value in this analysis is 76% and supports such an analysis. The global stream runoff for this period is stationary, and is more closely connected with the stream flow of Asia to the Pacific Ocean as well as with the stream runoff of North America towards the Arctic and Pacific Oceans. The second level examines the distribution of river runoff (annual and for February) for various landscapes and the hydrogeological conditions in the State of Minnesota (218,000 km2). The annual and minimal monthly rates of stream runoff for 115 gauging stations with a period of observation of 47 years (1935-1981) were used to characterize the spatio-temporal distribution of stream runoff in Minnesota. Results of this analysis demonstrate that the annual stream runoff rate changes from 6.3, towards 3.95, and then to 2.09 l s-1 km-2 (the differences are significant based on Student's criterion). These values in Minnesota correspond to ecological provinces from a mixed forest province towards the broadleaf forest and to the prairie province, respectively. The distribution of the minimal monthly stream runoff rate (February runoff) is controlled by hydrogeological systems in Minnesota. The difference between the two hydrogeological regions, the Precambrian crystalline basement and the Paleozoic artesian basin, of 0.83 and 2.09 l/s/km2, is statistically significant. Within these regions, the monthly minimal runoff (0.5 and 1.68, and 0.87 and 3.11 l s-1 km-2 for February, respectively) is also distinctly different for delineated subregions, depending on whether or not the Quaternary cover is present. The spatio-temporal structure that emerges could thus be used to generate river runoff and subsurface flow maps at any scale - from the global level to local detail. Such analysis was carried out in Minnesota with the detailed mapping of the subsurface flow for the Twin Cities Metropolitan area.

  5. Multiscale assessment of landscape structure in heterogeneous forested area

    NASA Astrophysics Data System (ADS)

    Simoniello, T.; Pignatti, S.; Carone, M. T.; Fusilli, L.; Lanfredi, M.; Coppola, R.; Santini, F.

    2010-05-01

    The characterization of landscape structure in space or time is fundamental for inferring ecological processes (Ingegnoli, 2002). Landscape pattern arrangements strongly influence forest ecological functioning and biodiversity; for example, landscape fragmentation can induce habitat degradation, reducing forest species populations or limiting their recolonization. Such arrangements are spatially correlated and scale-dependent; therefore, they have distinctive operational scales at which they can be best characterized (Wu, 2004). In addition, the detail of the land cover classification can have substantial influences on resulting pattern quantification (Greenberg et al. 2001). In order to evaluate the influence of the observational scales and labelling details, we investigated a forested area (Pollino National Park; southern Italy) by analyzing the patch arrangement derived from three remote sensing sensors having different spectral and spatial resolutions. In particular, we processed data from the hyperspectral MIVIS (102 bands; ~7m) and Hyperion (220 bands; 30m), and the multispectral Landsat-TM (7 bands; 30m). Moreover, to assess the landscape evolution we investigated the hierarchical structure of the study area (landscape, class, patch) by processing two Landsat-TM scenes acquired in 1987 and 1998. Preprocessed data were classified by adopting a supervised procedure based on the Minimum Distance classifier. The obtained labelling corresponds to Corine level 5 for the high resolution MIVIS data, to Corine level 4 for Hyperion and to an intermediate level 4-3 for TM data. The analysis was performed by taking into account patch density, diversity and evenness at landscape level; mean patch size and interdispersion at class level; patch structure and perimeter regularity at patch level. The three sensors described a landscape with a quite high level of richness and distribution. The high spectral and spatial resolution of MIVIS data provided the highest diversity level (SHDI = 2.05), even if the results obtained for TM were not so different (1.93); Hyperion showed the lowest value (1.79). The obtained evenness index was similar for all the landscapes (~ 0.72). At class level, the interdispersion increases as the spatial and spectral resolution power decreases. Due to the low labelling detail, TM classes represent an aggregation of MIVIS and Hyperion classes; therefore they are larger and more widespread over the territory, favouring higher interspersion values in the computation. The investigation of the patch structure highlighted the superior capability of MIVIS in describing patch articulation; Hyperion and TM showed a quite similar situation. The historical analysis based on TM imagery showed a fragmentation process for some forested patches (mainly beeches): an increase in structural complexity (higher FRACT) is coupled with a higher patch number and a reduction in extent. On the whole, the obtained results showed that the multispectral Landsat-TM images represent a good data source for supporting studies on landscape structure of forested areas and that for analyzing the articulation of particular species the high spectral resolution needs to be coupled with a high spatial resolution, i.e. Hyperion sampling is not adequate for such a purpose.
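
    The landscape-level indices quoted (SHDI and evenness) follow standard definitions; the short sketch below computes them for an assumed set of class-area proportions rather than the study's classifications.

      import numpy as np

      # Hypothetical proportions of landscape area per land-cover class (sum to 1).
      p = np.array([0.30, 0.22, 0.18, 0.12, 0.08, 0.06, 0.04])

      shdi = -np.sum(p * np.log(p))      # Shannon diversity index at landscape level
      shei = shdi / np.log(p.size)       # Shannon evenness index (0 to 1)

      print(f"SHDI = {shdi:.2f}, SHEI = {shei:.2f}")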

  6. Computational Analysis of a Prototype Martian Rotorcraft Experiment

    NASA Technical Reports Server (NTRS)

    Corfeld, Kelly J.; Strawn, Roger C.; Long, Lyle N.

    2002-01-01

    This paper presents Reynolds-averaged Navier-Stokes calculations for a prototype Martian rotorcraft. The computations are intended for comparison with an ongoing Mars rotor hover test at NASA Ames Research Center. These computational simulations present a new and challenging problem, since rotors that operate on Mars will experience a unique low Reynolds number and high Mach number environment. Computed results for the 3-D rotor differ substantially from 2-D sectional computations in that the 3-D results exhibit a stall delay phenomenon caused by rotational forces along the blade span. Computational results have yet to be compared to experimental data, but computed performance predictions match the experimental design goals fairly well. In addition, the computed results provide a high level of detail in the rotor wake and blade surface aerodynamics. These details provide an important supplement to the expected experimental performance data.

  7. Performance evaluation of the use of photovoltaics to power a street light in Lowell

    NASA Astrophysics Data System (ADS)

    Crowell, Adam B.

    Commercial, off-grid photovoltaic (PV) lighting systems present an attractive alternative to traditional outdoor lighting at sites where grid power is unavailable or unreliable. This study presents a comprehensive theoretical site analysis for the installation of standalone PV lighting systems at the Lowell National Historic Park in Lowell, MA. Detailed insolation studies are performed at the target site, resulting in expected daily Watt-hour totals available for battery charging for each month of the year. Illumination simulations are presented, detailing the expected lighting performance of the systems at night. Light levels are compared to those dictated by accepted standards. While it is acknowledged that the target site presents significant challenges to photovoltaics, such as severe shading, final system component specifications are provided, along with programming and positioning recommendations that will yield the best achievable performance.

  8. Overview and technical and practical aspects for use of geostatistics in hazardous-, toxic-, and radioactive-waste-site investigations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bossong, C.R.; Karlinger, M.R.; Troutman, B.M.

    1999-10-01

    Technical and practical aspects of applying geostatistics are developed for individuals involved in investigation at hazardous-, toxic-, and radioactive-waste sites. Important geostatistical concepts, such as variograms and ordinary, universal, and indicator kriging, are described in general terms for introductory purposes and in more detail for practical applications. Variogram modeling using measured ground-water elevation data is described in detail to illustrate principles of stationarity, anisotropy, transformations, and cross validation. Several examples of kriging applications are described using ground-water-level elevations, bedrock elevations, and ground-water-quality data. A review of contemporary literature and selected public domain software associated with geostatistics also is provided, as is a discussion of alternative methods for spatial modeling, including inverse distance weighting, triangulation, splines, trend-surface analysis, and simulation.
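    Among the alternative spatial-modeling methods listed, inverse distance weighting is the simplest to sketch; the snippet below interpolates a value from a handful of hypothetical ground-water-elevation points, and is only a toy illustration of that method, not of the report's kriging workflow.

        # Minimal sketch of inverse distance weighting (IDW); sample data are invented.
        import math

        def idw(x, y, samples, power=2.0):
            """Interpolate a value at (x, y) from (xi, yi, zi) samples using IDW."""
            num, den = 0.0, 0.0
            for xi, yi, zi in samples:
                d = math.hypot(x - xi, y - yi)
                if d == 0.0:
                    return zi                 # exact hit on a sample point
                w = 1.0 / d ** power
                num += w * zi
                den += w
            return num / den

        # Hypothetical ground-water observations: (easting, northing, head in m)
        obs = [(0, 0, 12.3), (100, 0, 11.8), (0, 100, 12.9), (100, 100, 12.1)]
        print(round(idw(40, 60, obs), 2))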

  9. Measuring the Carotid to Femoral Pulse Wave Velocity (Cf-PWV) to Evaluate Arterial Stiffness.

    PubMed

    Ji, Hongwei; Xiong, Jing; Yu, Shikai; Chi, Chen; Bai, Bin; Teliewubai, Jiadela; Lu, Yuyan; Zhang, Yi; Xu, Yawei

    2018-05-03

    For the elderly, arterial stiffening is a good marker for evaluating aging, and it is recommended that arterial stiffness be determined noninvasively by measurement of the carotid to femoral pulse wave velocity (cf-PWV) (Class I; Level of Evidence A). In the literature, numerous community-based or disease-specific studies have reported that higher cf-PWV is associated with increased cardiovascular risk. Here, we discuss strategies to evaluate arterial stiffness with cf-PWV. By following the well-defined steps detailed here, e.g., proper positioning, distance measurement, and tonometer placement, a standard cf-PWV value for evaluating arterial stiffness can be obtained. In this paper, a detailed stepwise method to record a good quality PWV and pulse wave analysis (PWA) using a non-invasive tonometry-based device is discussed.
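    A minimal sketch of the core cf-PWV arithmetic (velocity = travelled distance divided by transit time) follows; the 80% distance scaling and the example numbers are common conventions assumed for illustration, not values taken from this protocol.

        # Minimal sketch: cf-PWV = (corrected distance) / (transit time).
        def cf_pwv(carotid_femoral_distance_m, transit_time_s, distance_factor=0.8):
            """Return carotid-femoral pulse wave velocity in m/s (assumed 80% correction)."""
            return (distance_factor * carotid_femoral_distance_m) / transit_time_s

        # Hypothetical measurement: 0.62 m direct surface distance, 60 ms transit time
        print(round(cf_pwv(0.62, 0.060), 1), "m/s")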

  10. Basic as well as detailed neurosonograms can be performed by offline analysis of three-dimensional fetal brain volumes.

    PubMed

    Bornstein, E; Monteagudo, A; Santos, R; Strock, I; Tsymbal, T; Lenchner, E; Timor-Tritsch, I E

    2010-07-01

    To evaluate the feasibility and the processing time of offline analysis of three-dimensional (3D) brain volumes to perform a basic, as well as a detailed, targeted, fetal neurosonogram. 3D fetal brain volumes were obtained in 103 consecutive healthy fetuses that underwent routine anatomical survey at 20-23 postmenstrual weeks. Transabdominal gray-scale and power Doppler volumes of the fetal brain were acquired by one of three experienced sonographers (an average of seven volumes per fetus). Acquisition was first attempted in the sagittal and coronal planes. When the fetal position did not enable easy and rapid access to these planes, axial acquisition at the level of the biparietal diameter was performed. Offline analysis of each volume was performed by two of the authors in a blinded manner. A systematic technique of 'volume manipulation' was used to identify a list of 25 brain dimensions/structures comprising a complete basic evaluation, intracranial biometry and a detailed targeted fetal neurosonogram. The feasibility and reproducibility of obtaining diagnostic-quality images of the different structures was evaluated, and processing times were recorded, by the two examiners. Diagnostic-quality visualization was feasible in all of the 25 structures, with an excellent visualization rate (85-100%) reported in 18 structures, a good visualization rate (69-97%) reported in five structures and a low visualization rate (38-54%) reported in two structures, by the two examiners. An average of 4.3 and 5.4 volumes were used to complete the examination by the two examiners, with a mean processing time of 7.2 and 8.8 minutes, respectively. The overall agreement rate for diagnostic visualization of the different brain structures between the two examiners was 89.9%, with a kappa coefficient of 0.5 (P < 0.001). In experienced hands, offline analysis of 3D brain volumes is a reproducible modality that can identify all structures necessary to complete both a basic and a detailed second-trimester fetal neurosonogram. Copyright 2010 ISUOG. Published by John Wiley & Sons, Ltd.
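    Since the abstract reports inter-examiner agreement and a kappa coefficient, a minimal sketch of how overall agreement and Cohen's kappa are computed for two raters scoring binary visualization outcomes follows; the ratings below are invented, not the study's data.

        # Minimal sketch: observed agreement and Cohen's kappa for two raters scoring
        # each structure as visualized (1) or not (0).
        def cohens_kappa(rater_a, rater_b):
            n = len(rater_a)
            po = sum(a == b for a, b in zip(rater_a, rater_b)) / n      # observed agreement
            p_yes = (sum(rater_a) / n) * (sum(rater_b) / n)             # chance both say 1
            p_no = (1 - sum(rater_a) / n) * (1 - sum(rater_b) / n)      # chance both say 0
            pe = p_yes + p_no
            return po, (po - pe) / (1 - pe)

        a = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]
        b = [1, 1, 0, 0, 1, 1, 1, 1, 1, 1]
        agreement, kappa = cohens_kappa(a, b)
        print(round(agreement, 2), round(kappa, 2))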

  11. Study of mathematical modeling of communication systems transponders and receivers

    NASA Technical Reports Server (NTRS)

    Walsh, J. R.

    1972-01-01

    The modeling of communication receivers is described at both the circuit detail level and at the block level. The largest effort was devoted to developing new models at the block modeling level. The available effort did not permit full development of all of the block modeling concepts envisioned, but idealized blocks were developed for signal sources, a variety of filters, limiters, amplifiers, mixers, and demodulators. These blocks were organized into an operational computer simulation of communications receiver circuits identified as the frequency and time circuit analysis technique (FATCAT). The simulation operates in both the time and frequency domains, and permits output plots or listings of either frequency spectra or time waveforms from any model block. Transfer between domains is handled with a fast Fourier transform algorithm.
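    As an illustration of the time/frequency domain transfer handled by an FFT in this kind of block-level receiver simulation, a minimal sketch follows; the sample rate, block length, and test tone are arbitrary values and are unrelated to the original FATCAT program.

        # Minimal sketch: transform a block's time waveform to a spectrum and back.
        import numpy as np

        fs = 1.0e6                                   # sample rate, Hz
        t = np.arange(1024) / fs                     # time axis for one block
        waveform = np.cos(2 * np.pi * 100e3 * t)     # 100 kHz test tone into a block

        spectrum = np.fft.rfft(waveform)             # time domain -> frequency domain
        freqs = np.fft.rfftfreq(waveform.size, d=1 / fs)
        recovered = np.fft.irfft(spectrum, n=waveform.size)   # frequency -> time domain

        peak = freqs[np.argmax(np.abs(spectrum))]
        print(f"peak near {peak/1e3:.1f} kHz, round-trip error {np.max(np.abs(recovered - waveform)):.2e}")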

  12. Astronaut Risk Levels During Crew Module (CM) Land Landing

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Carney, Kelly S.; Littell, Justin

    2007-01-01

    The NASA Engineering Safety Center (NESC) is investigating the merits of water and land landings for the crew exploration vehicle (CEV). The merits of these two options are being studied in terms of cost and risk to the astronauts, vehicle, support personnel, and general public. The objective of the present work is to determine the astronaut dynamic response index (DRI), which measures injury risks. Risks are determined for a range of vertical and horizontal landing velocities. A structural model of the crew module (CM) is developed and computational simulations are performed using a transient dynamic simulation analysis code (LS-DYNA) to determine acceleration profiles. Landing acceleration profiles are input in a human factors model that determines astronaut risk levels. Details of the modeling approach, the resulting accelerations, and astronaut risk levels are provided.
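    A minimal sketch of how a Dynamic Response Index can be obtained from a landing acceleration pulse follows: a single-degree-of-freedom spring-damper model of the spine is driven by the pulse and DRI = omega_n^2 * max_deflection / g. The natural frequency, damping ratio, and half-sine pulse below are commonly quoted illustration values and are assumptions, not parameters from this LS-DYNA study.

        # Minimal sketch of a DRI calculation from a vertical acceleration pulse.
        import math

        def dri(accel_g, dt, omega_n=52.9, zeta=0.224, g=9.81):
            """Integrate the spinal model with simple time stepping and return the DRI."""
            x, v, x_max = 0.0, 0.0, 0.0
            for a in accel_g:
                acc = a * g - 2 * zeta * omega_n * v - omega_n**2 * x
                v += acc * dt
                x += v * dt
                x_max = max(x_max, x)
            return omega_n**2 * x_max / g

        dt = 1e-4
        pulse = [10.0 * math.sin(math.pi * i * dt / 0.05) for i in range(int(0.05 / dt))]  # 10 g, 50 ms half-sine
        pulse += [0.0] * 2000                          # let the model ring down
        print(round(dri(pulse, dt), 1))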

  13. Comments on settling chamber design for quiet, blowdown wind tunnels

    NASA Technical Reports Server (NTRS)

    Beckwith, I. E.

    1981-01-01

    Transfer of an existing continuous-circuit supersonic wind tunnel to Langley and its operation there as a blowdown tunnel is planned. Flow disturbance requirements in the supply section and methods for reducing the high level broad band acoustic disturbances present in typical blowdown tunnels are reviewed. Based on recent data and the analysis of two blowdown facilities at Langley, methods for reducing the total turbulence levels in the settling chamber, including both acoustic and vorticity modes, to less than one percent are recommended. The pertinent design details of the damping screens and honeycomb and the recommended minimum pressure drop across the porous components providing the required two orders of magnitude attenuation of acoustic noise levels are given. A suggestion for the support structure of these high pressure drop porous components is offered.

  14. Topography-based analysis of Hurricane Katrina inundation of New Orleans: Chapter 3G in Science and the storms-the USGS response to the hurricanes of 2005

    USGS Publications Warehouse

    Gesch, Dean

    2007-01-01

    The ready availability of high-resolution, high-accuracy elevation data proved valuable for development of topography-based products to determine rough estimates of the inundation of New Orleans, La., from Hurricane Katrina. Because of its high level of spatial detail and vertical accuracy of elevation measurements, light detection and ranging (lidar) remote sensing is an excellent mapping technology for use in low-relief hurricane-prone coastal areas.

  15. Intelligence Fusion Paradigm: Understanding Complex Operational Environments Implementing the Institutional Analysis and Development Framework

    DTIC Science & Technology

    2012-12-14

    Whitfield, Christy L., Major, U.S. Army. ...the level of details necessary to consumer's needs. It must also contain accurate assessments of all applicable material and apply to the end ...half of the 20th century, focused on cultural perspectives as a pivotal concern regarding economic, political, and military powers and its global

  16. FBI fingerprint identification automation study: AIDS 3 evaluation report. Volume 7: Top down functional analysis

    NASA Technical Reports Server (NTRS)

    Mulhall, B. D. L.

    1980-01-01

    The functions are identified and described in chart form as a tree in which the basic function, to 'Provide National Identification Service,' is shown at the top. The lower levels of the tree branch out to indicate functions and sub-functions. Symbols are used to indicate whether or not a function was automated in the AIDS 1 or 2 system or is planned to be automated in the AIDS 3 system. The tree chart is shown in detail.

  17. Review of USACE Institutional Information Related to Evaluation of Incremental Changes in Water Resources Planning

    DTIC Science & Technology

    2011-03-01

    The Corps will deliver a more holistic approach to solving water resources challenges that effectively considers the broad variety of economic ... scales, and standards for a balanced evaluation of economic, social, and environmental factors, should be updated and expanded to a level of detail ... comparable to current standards for traditional benefit-cost analysis of economic objectives of a project” (pp 5–6). • “The Corps should ensure that

  18. ATLAS I/O performance optimization in as-deployed environments

    NASA Astrophysics Data System (ADS)

    Maier, T.; Benjamin, D.; Bhimji, W.; Elmsheuser, J.; van Gemmeren, P.; Malon, D.; Krumnack, N.

    2015-12-01

    This paper provides an overview of an integrated program of work underway within the ATLAS experiment to optimise I/O performance for large-scale physics data analysis in a range of deployment environments. It proceeds to examine in greater detail one component of that work, the tuning of job-level I/O parameters in response to changes to the ATLAS event data model, and considers the implications of such tuning for a number of measures of I/O performance.

  19. Distillation tray structural parameter study: Phase 1

    NASA Technical Reports Server (NTRS)

    Winter, J. Ronald

    1991-01-01

    The purpose here is to identify the structural parameters (plate thickness, liquid level, beam size, number of beams, tray diameter, etc.) that affect the structural integrity of distillation trays in distillation columns. Once the sensitivity of the trays' dynamic response to these parameters has been established, the designer will be able to use this information to prepare more accurate specifications for the construction of new trays. Information is given on both static and dynamic analysis, modal response, and tray failure details.

  20. Ten New Recorded Species of Macrofungi on Ulleung Island, Korea

    PubMed Central

    Park, Myung Soo; Cho, Hae Jin; Kim, Nam Kyu; Park, Jae Young; Lee, Hyun; Park, Ki Hyeong; Kim, Min-Ji; Kim, Jae-Jin; Kim, Changmu

    2017-01-01

    Ulleung Island is a biodiversity hotspot in South Korea. During a survey of indigenous fungal species from Ulleung Island conducted from 2015 to 2016, we discovered 10 unrecorded macrofungi in Korea. These macrofungi were identified to the species level using morphological features and phylogenetic analysis based on the internal transcribed spacer region: Deconica phyllogena, Mycena zephirus, Phaeomarasmius proximans, Phlebia radiata, Pluteus semibulbosus, Postia alni, Resinicium pinicola, Scytinostroma portentosum, Tricholomopsis flammula, and Tyromyces kmetii. We also provide detailed morphological descriptions for these 10 species. PMID:29371796

  1. Comparative genomic analysis as a tool for biological discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nobrega, Marcelo A.; Pennacchio, Len A.

    2003-03-30

    Biology is a discipline rooted in comparisons. Comparative physiology has assembled a detailed catalogue of the biological similarities and differences between species, revealing insights into how life has adapted to fill a wide range of environmental niches. For example, the oxygen and carbon dioxide carrying capacity of vertebrates has evolved to provide strong advantages for species respiring at sea level, at high elevation, or within water. Comparative anatomy, biochemistry, pharmacology, immunology, and cell biology have provided the fundamental paradigms from which each discipline has grown.

  2. Coup d’Oeil: Military Geography and the Operational Level of War

    DTIC Science & Technology

    1991-05-16

    COUP D'OEIL. Every day I feel more and more in need of an atlas, as geography in the minutest details is essential to a true military education. ... categorizing terrain have provided the essential prerequisites for the development of the IPB process. The process allows for an in-depth technical analysis of ... the lines of operation ... essential to the commander's plan ... defined by a competent authority ... center of gravity ...

  3. System-Level Experimental Validations for Supersonic Commercial Transport Aircraft Entering Service in the 2018-2020 Time Period

    NASA Technical Reports Server (NTRS)

    Magee, Todd E.; Wilcox, Peter A.; Fugal, Spencer R.; Acheson, Kurt E.; Adamson, Eric E.; Bidwell, Alicia L.; Shaw, Stephen G.

    2013-01-01

    This report describes the work conducted by The Boeing Company under American Recovery and Reinvestment Act (ARRA) and NASA funding to experimentally validate the conceptual design of a supersonic airliner feasible for entry into service in the 2018 to 2020 timeframe (NASA N+2 generation). The report discusses the design, analysis and development of a low-boom concept that meets aggressive sonic boom and performance goals for a cruise Mach number of 1.8. The design is achieved through integrated multidisciplinary optimization tools. The report also describes the detailed design and fabrication of both sonic boom and performance wind tunnel models of the low-boom concept. Additionally, a description of the detailed validation wind tunnel testing that was performed with the wind tunnel models is provided along with validation comparisons with pretest Computational Fluid Dynamics (CFD). Finally, the report describes the evaluation of existing NASA sonic boom pressure rail measurement instrumentation and a detailed description of new sonic boom measurement instrumentation that was constructed for the validation wind tunnel testing.

  4. Digital Architecture – Results From a Gap Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna Helene; Thomas, Kenneth David; Fitzgerald, Kirk

    The digital architecture is defined as a collection of IT capabilities needed to support and integrate a wide-spectrum of real-time digital capabilities for nuclear power plant performance improvements. The digital architecture can be thought of as an integration of the separate I&C and information systems already in place in NPPs, brought together for the purpose of creating new levels of automation in NPP work activities. In some cases, it might be an extension of the current communication systems, to provide digital communications where they are currently analog only. This collection of IT capabilities must in turn be based on a set of user requirements that must be supported for the interconnected technologies to operate in an integrated manner. These requirements, simply put, are a statement of what sorts of digital work functions will be exercised in a fully-implemented seamless digital environment and how much they will be used. The goal of the digital architecture research is to develop a methodology for mapping nuclear power plant operational and support activities into the digital architecture, which includes the development of a consensus model for advanced information and control architecture. The consensus model should be developed at a level of detail that is useful to the industry. In other words, not so detailed that it specifies specific protocols and not so vague that it is only provides a high level description of technology. The next step towards the model development is to determine the current state of digital architecture at typical NPPs. To investigate the current state, the researchers conducted a gap analysis to determine to what extent the NPPs can support the future digital technology environment with their existing I&C and IT structure, and where gaps exist with respect to the full deployment of technology over time. The methodology, result, and conclusions from the gap analysis are described in this report.

  5. Association Between Academic Medical Center Pharmaceutical Detailing Policies and Physician Prescribing

    PubMed Central

    Ang, Desmond; Steinhart, Jonathan; Chao, Matthew; Patterson, Mark; Sah, Sunita; Wu, Tina; Schoenbaum, Michael; Hutchins, David; Brennan, Troyen; Loewenstein, George

    2017-01-01

    Importance In an effort to regulate physician conflicts of interest, some US academic medical centers (AMCs) enacted policies restricting pharmaceutical representative sales visits to physicians (known as detailing) between 2006 and 2012. Little is known about the effect of these policies on physician prescribing. Objective To analyze the association between detailing policies enacted at AMCs and physician prescribing of actively detailed and not detailed drugs. Design, Setting, and Participants The study used a difference-in-differences multivariable regression analysis to compare changes in prescribing by physicians before and after implementation of detailing policies at AMCs in 5 states (California, Illinois, Massachusetts, Pennsylvania, and New York) that made up the intervention group with changes in prescribing by a matched control group of similar physicians not subject to a detailing policy. Exposures Academic medical center implementation of policies regulating pharmaceutical salesperson visits to attending physicians. Main Outcomes and Measures The monthly within-drug class market share of prescriptions written by an individual physician for detailed and nondetailed drugs in 8 drug classes (lipid-lowering drugs, gastroesophageal reflux disease drugs, diabetes drugs, antihypertensive drugs, hypnotic drugs approved for the treatment of insomnia [sleep aids], attention-deficit/hyperactivity disorder drugs, antidepressant drugs, and antipsychotic drugs) comparing the 10- to 36-month period before implementation of the detailing policies with the 12- to 36-month period after implementation, depending on data availability. Results The analysis included 16 121 483 prescriptions written between January 2006 and June 2012 by 2126 attending physicians at the 19 intervention group AMCs and by 24 593 matched control group physicians. The sample mean market share at the physician-drug-month level for detailed and nondetailed drugs prior to enactment of policies was 19.3% and 14.2%, respectively. Exposure to an AMC detailing policy was associated with a decrease in the market share of detailed drugs of 1.67 percentage points (95% CI, −2.18 to −1.18 percentage points; P < .001) and an increase in the market share of nondetailed drugs of 0.84 percentage points (95% CI, 0.54 to 1.14 percentage points; P < .001). Associations were statistically significant for 6 of 8 study drug classes for detailed drugs (lipid-lowering drugs, gastroesophageal reflux disease drugs, antihypertensive drugs, sleep aids, attention-deficit/hyperactivity disorder drugs, and antidepressant drugs) and for 9 of the 19 AMCs that implemented policies. Eleven of the 19 AMCs regulated salesperson gifts to physicians, restricted salesperson access to facilities, and incorporated explicit enforcement mechanisms. For 8 of these 11 AMCs, there was a significant change in prescribing. In contrast, there was a significant change at only 1 of 8 AMCs that did not enact policies in all 3 areas. Conclusions and Relevance Implementation of policies at AMCs that restricted pharmaceutical detailing between 2006 and 2012 was associated with modest but significant reductions in prescribing of detailed drugs across 6 of 8 major drug classes; however, changes were not seen in all of the AMCs that enacted policies. PMID:28464141
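    As an illustration of the difference-in-differences design described above, a minimal sketch with simulated data follows; the variable names, the simulated effect size, and the use of statsmodels are assumptions for illustration, not the study's actual model or prescribing data.

        # Minimal sketch of a difference-in-differences regression: the coefficient on the
        # treated-by-post interaction is the DiD estimate. Data are simulated.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 4000
        treated = rng.integers(0, 2, n)                 # physician at an AMC with a policy?
        post = rng.integers(0, 2, n)                    # observation after policy implementation?
        share = (19.0 - 1.7 * treated * post            # built-in "true" effect of -1.7 points
                 + 0.5 * treated + 0.3 * post + rng.normal(0, 5, n))

        df = pd.DataFrame({"share": share, "treated": treated, "post": post})
        fit = smf.ols("share ~ treated * post", data=df).fit()
        print(fit.params["treated:post"])               # DiD estimate, should be near -1.7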

  6. DETAIL VIEW ABOVE THE MAIN ASSEMBLY LEVEL SHOWING HOIST AT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW ABOVE THE MAIN ASSEMBLY LEVEL SHOWING HOIST AT COLUMN LINE U-6 USED FOR LIFTING WING COMPONENTS FROM THE WING ASSEMBLY ANNEX TO THE B-29 PRODUCTION LINE. - Offutt Air Force Base, Glenn L. Martin-Nebraska Bomber Plant, Building D, Peacekeeper Drive, Bellevue, Sarpy County, NE

  7. 32. DETAIL INTERIOR VIEW OF LEVEL +55 IN POWERHOUSE #1, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    32. DETAIL INTERIOR VIEW OF LEVEL +55 IN POWERHOUSE #1, SHOWING GOVERNOR CONTROL CABINET BETWEEN TURBINE/GENERATOR UNIT NO 1 (ON FAR LEFT) AND NO 2 (OUT OF VIEW ON RIGHT). - Bonneville Project, Powerhouse No.1, Spanning Bradford Slough, from Bradford Island, Bonneville, Multnomah County, OR

  8. Software quality assurance | News

    Science.gov Websites

    Measure was removed: "Sufficient level of detail in the requirements to develop test cases." This control measure was removed since the sufficient level of detail needed to develop test cases is recorded for all test cases. (Note: This is mandatory for applications graded with a High Quality Assurance

  9. Simulation tools for particle-based reaction-diffusion dynamics in continuous space

    PubMed Central

    2014-01-01

    Particle-based reaction-diffusion algorithms facilitate the modeling of the diffusional motion of individual molecules and the reactions between them in cellular environments. A physically realistic model, depending on the system at hand and the questions asked, would require different levels of modeling detail, such as particle diffusion, geometrical confinement, particle volume exclusion or particle-particle interaction potentials. Higher levels of detail usually correspond to an increased number of parameters and higher computational cost. Certain systems, however, require these investments to be modeled adequately. Here we present a review of the current field of particle-based reaction-diffusion software packages operating on continuous space. Four nested levels of modeling detail are identified that capture an incrementing amount of detail. Their applicability to different biological questions is discussed, arching from straight diffusion simulations to sophisticated and expensive models that bridge towards coarse-grained molecular dynamics. PMID:25737778
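    To illustrate the lowest of the nested levels of modeling detail (plain particle diffusion), here is a minimal Brownian-dynamics sketch using an Euler-Maruyama step; the diffusion coefficient, timestep, and particle count are arbitrary, and no reactions, confinement, or interactions are included.

        # Minimal sketch: free Brownian diffusion of point particles in 3D.
        import numpy as np

        def brownian_step(positions, diff_coeff, dt, rng):
            """Displace each particle by a Gaussian step with variance 2*D*dt per axis."""
            sigma = np.sqrt(2.0 * diff_coeff * dt)
            return positions + rng.normal(0.0, sigma, positions.shape)

        rng = np.random.default_rng(42)
        pos = np.zeros((1000, 3))                 # 1000 particles at the origin (micrometres)
        D, dt = 1.0, 1e-3                         # um^2/s and s
        for _ in range(1000):                     # simulate 1 s of free diffusion
            pos = brownian_step(pos, D, dt, rng)
        print(np.mean(np.sum(pos**2, axis=1)))    # mean squared displacement, expect ~6*D*t = 6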

  10. 3D structure of macropore networks within natural and de-embanked estuary saltmarsh sediments: towards an improved understanding of network structural control over hydrologic function

    NASA Astrophysics Data System (ADS)

    Carr, Simon; Spencer, Kate; James, Tempest; Lucy, Diggens

    2015-04-01

    Saltmarshes are globally important environments which, though occupying < 4% of the Earth's surface, provide a range of ecosystem services. Yet they are threatened by sea level rise, human population growth, urbanization and pollution, resulting in degradation. To compensate for this habitat loss, many coastal restoration projects have been implemented over the last few decades, largely driven by legislative requirements for improved biodiversity, e.g. the EU Habitats Directive and Birds Directive. However, there is growing evidence that restored saltmarshes, recreated through the return to tidal inundation of previously drained and defended low-lying coastal land, do not have the same species composition even after 100 years, and while environmental enhancement has been achieved, there may be consequences for ecosystem functioning. This study presents the findings of a comparative analysis of detailed sediment structure and hydrological functioning of equivalent natural and de-embanked saltmarsh sediments at Orplands Farm, Essex, UK. 3D x-ray CT scanning of triplicate undisturbed sediment cores recovered in 2013 has been used to derive detailed volumetric reconstructions of macropore structure and networks, and to infer differences in bulk microporosity between natural and de-embanked saltmarshes. These volumes have been further visualised for qualitative analysis of the main sediment components, and for extraction of key macropore space parameters for quantified analysis, including total porosity and connectivity, as well as the structure, organisation and efficiency (tortuosity) of macropore networks. Although total porosity was significantly greater within the de-embanked saltmarsh sediments, pore networks in these samples were less organised and more tortuous, and were also inferred to have significantly lower micro-porosity than those of the natural saltmarsh. These datasets are applied to explain significant differences in the hydraulic behaviour and functioning observed between the natural and de-embanked saltmarsh at Orplands. Piezometer wells and pressure transducers recorded fluctuations in water level at 15 minute intervals over a 4.5 month period (winter 2011-2012). Basic patterns for water level fluctuations in both the natural and de-embanked saltmarsh are similar and reflect tidal flooding. However, in the de-embanked saltmarsh, water levels are higher and less responsive to tidal flooding.

  11. Content, Structure, and Sequence of the Detailing Discipline at Kendall College of Art and Design.

    ERIC Educational Resources Information Center

    Mulder, Bruce E.

    A study identified the appropriate general content, structure, and sequence for a detailing discipline that promoted student achievement to professional levels. Its focus was the detailing discipline, a sequence of studio courses within the furniture design program at Kendall College of Art and Design, Grand Rapids, Michigan. (Detailing, an…

  12. Geomorphic Map of Worcester County, Maryland, Interpreted from a LIDAR-Based, Digital Elevation Model

    USGS Publications Warehouse

    Newell, Wayne L.; Clark, Inga

    2008-01-01

    A recently compiled mosaic of a LIDAR-based digital elevation model (DEM) is presented with geomorphic analysis of new macro-topographic details. The geologic framework of the surficial and near surface late Cenozoic deposits of the central uplands, Pocomoke River valley, and the Atlantic Coast includes Cenozoic to recent sediments from fluvial, estuarine, and littoral depositional environments. Extensive Pleistocene (cold climate) sandy dune fields are deposited over much of the terraced landscape. The macro details from the LIDAR image reveal 2 meter-scale resolution of details of the shapes of individual dunes, and fields of translocated sand sheets. Most terrace surfaces are overprinted with circular to elliptical rimmed basins that represent complex histories of ephemeral ponds that were formed, drained, and overprinted by younger basins. The terrains of composite ephemeral ponds and the dune fields are inter-shingled at their margins indicating contemporaneous erosion, deposition, and re-arrangement and possible internal deformation of the surficial deposits. The aggregate of these landform details and their deposits are interpreted as the products of arid, cold climate processes that were common to the mid-Atlantic region during the Last Glacial Maximum. In the Pocomoke valley and its larger tributaries, erosional remnants of sandy flood plains with anastomosing channels indicate the dynamics of former hydrology and sediment load of the watershed that prevailed at the end of the Pleistocene. As the climate warmed and precipitation increased during the transition from late Pleistocene to Holocene, dune fields were stabilized by vegetation, and the stream discharge increased. The increased discharge and greater local relief of streams graded to lower sea levels stimulated down cutting and created the deeply incised valleys out onto the continental shelf. These incised valleys have been filling with fluvial to intertidal deposits that record the rising sea level and warmer, more humid climate in the mid-Atlantic region throughout the Holocene. Thus, the geomorphic details provided by the new LIDAR DEM actually record the response of the landscape to abrupt climate change. Holocene trends and land-use patterns from Colonial to modern times can also be interpreted from the local macro-scale details of the landscape. Beyond the obvious utility of these data for land-use planning and assessments of resources and hazards, the new map presents new details on the impact of climate changes on a mid-latitude, outer Coastal plain landscape.

  13. Improved low-level radioactive waste management practices for hospitals and research institutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-07-01

    This report provides a general overview and a compendium of source material on low-level radioactive waste management practices in the institutional sector. Institutional sector refers to hospitals, universities, clinics, and research facilities that use radioactive materials in scientific research and the practice of medicine, and the manufacturers of radiopharmaceuticals and radiography devices. This report provides information on effective waste management practices for institutional waste to state policymakers, regulatory agency officials, and waste generators. It is not intended to be a handbook for actual waste management, but rather a sourcebook of general information, as well as a survey of the more detailed analysis.

  14. Effect of impurities and processing on silicon solar cells. Volume 1: Characterization methods for impurities in silicon and impurity effects data base

    NASA Technical Reports Server (NTRS)

    Hopkins, R. H.; Davis, J. R.; Rohatgi, A.; Campbell, R. B.; Blais, P. D.; Rai-Choudhury, P.; Stapleton, R. E.; Mollenkopf, H. C.; Mccormick, J. R.

    1980-01-01

    Two major topics are treated: methods to measure and evaluate impurity effects in silicon and comprehensive tabulations of data derived during the study. Discussions of deep level spectroscopy, detailed dark I-V measurements, recombination lifetime determination, scanned laser photo-response, conventional solar cell I-V techniques, and descriptions of silicon chemical analysis are presented and discussed. The tabulated data include lists of impurity segregation coefficients, ingot impurity analyses and estimated concentrations, typical deep level impurity spectra, photoconductive and open circuit decay lifetimes for individual metal-doped ingots, and a complete tabulation of the cell I-V characteristics of nearly 200 ingots.

  15. Decoherence-free evolution of time-dependent superposition states of two-level systems and thermal effects

    NASA Astrophysics Data System (ADS)

    Prado, F. O.; de Almeida, N. G.; Duzzioni, E. I.; Moussa, M. H. Y.; Villas-Boas, C. J.

    2011-07-01

    In this paper we detail some results advanced in a recent letter [Prado et al., Phys. Rev. Lett. 102, 073008 (2009)] showing how to engineer reservoirs for two-level systems at absolute zero by means of a time-dependent master equation leading to a nonstationary superposition equilibrium state. We also present a general recipe showing how to build nonadiabatic coherent evolutions of a fermionic system interacting with a bosonic mode and investigate the influence of thermal reservoirs at finite temperature on the fidelity of the protected superposition state. Our analytical results are supported by numerical analysis of the full Hamiltonian model.

  16. Level II scour analysis for Bridge 28 (BRNATH00660028) on Town Highway 66, crossing Locust Creek, Barnard, Vermont

    USGS Publications Warehouse

    Severence, Timothy

    1997-01-01

    The Town Highway 66 crossing of the Locust Creek is a 41-ft-long, one-lane bridge consisting of a 39 ft steel stringer type bridge with a concrete deck (Vermont Agency of Transportation, written communication, August 24, 1994). The clear span is 36.8 ft. The bridge is supported by vertical, concrete abutments with wingwalls. The upstream right wingwall is protected by stone fill. The channel is skewed approximately 10 degrees to the opening while the opening-skew-to-roadway is 0 degrees. Additional details describing conditions at the site are included in the Level II Summary and Appendices D and E.

  17. Watershed-based Morphometric Analysis: A Review

    NASA Astrophysics Data System (ADS)

    Sukristiyanti, S.; Maria, R.; Lestiana, H.

    2018-02-01

    Drainage basin/watershed analysis based on morphometric parameters is very important for watershed planning. Morphometric analysis of a watershed is the best method to identify the relationships among the various aspects of the area. Although many technical papers have dealt with this area of study, there is no particular standard classification or interpretation for each parameter, which makes it difficult to evaluate the value of any single morphometric parameter. This paper deals with the meaning of the values of the various morphometric parameters, with adequate contextual information. A critical review is presented of each classification, the range of values, and their implications. Besides classification and its impact, the authors also address the quality of the input data, both in data preparation and in the scale/level of detail of the mapping. This review hopefully gives a comprehensive explanation to assist upcoming research dealing with morphometric analysis.
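    To make the discussion of parameter values concrete, here is a minimal sketch of a few commonly used morphometric indices (drainage density, circularity ratio, elongation ratio, bifurcation ratio) under their standard textbook definitions; the basin numbers are invented for illustration.

        # Minimal sketch of common watershed morphometric indices with invented inputs.
        import math

        def drainage_density(total_stream_length_km, basin_area_km2):
            return total_stream_length_km / basin_area_km2          # km per km^2

        def circularity_ratio(basin_area_km2, basin_perimeter_km):
            return 4.0 * math.pi * basin_area_km2 / basin_perimeter_km**2

        def elongation_ratio(basin_area_km2, basin_length_km):
            return (2.0 / basin_length_km) * math.sqrt(basin_area_km2 / math.pi)

        def bifurcation_ratio(stream_counts_by_order):
            """Mean ratio of stream counts between successive orders, e.g. [68, 15, 4, 1]."""
            ratios = [n1 / n2 for n1, n2 in zip(stream_counts_by_order, stream_counts_by_order[1:])]
            return sum(ratios) / len(ratios)

        print(round(drainage_density(182.0, 95.0), 2),
              round(circularity_ratio(95.0, 52.0), 2),
              round(elongation_ratio(95.0, 18.0), 2),
              round(bifurcation_ratio([68, 15, 4, 1]), 2))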

  18. Flight survey of the 757 wing noise field and its effects on laminar boundary layer transition. Volume 3: Extended data analysis

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A flight program was completed in June of 1985 using the Boeing 757 flight research aircraft with an NLF glove installed on the right wing just outboard of the engine. The objectives of this program were to measure noise levels on the wing and to investigate the effect of engine noise on the extent of laminar flow on the glove. Details of the flight test program and results are contained in Volume 1 of this document. Tabulations and plots of the measured data are contained in Volume 2. The present volume contains the results of additional engineering analysis of the data. The latter includes analysis of the measured noise data, a comparison of predicted and measured noise data, a boundary layer stability analysis of 21 flight data cases, and an analysis of the effect of noise on boundary layer transition.

  19. Multi-level of Fidelity Multi-Disciplinary Design Optimization of Small, Solid-Propellant Launch Vehicles

    NASA Astrophysics Data System (ADS)

    Roshanian, Jafar; Jodei, Jahangir; Mirshams, Mehran; Ebrahimi, Reza; Mirzaee, Masood

    A new automated multi-level-of-fidelity Multi-Disciplinary Design Optimization (MDO) methodology has been developed at the MDO Laboratory of K.N. Toosi University of Technology. This paper explains the new design approach through the formulation of the developed disciplinary modules. A conceptual design for a small, solid-propellant launch vehicle was considered within a two-level fidelity structure. Low and medium level-of-fidelity (LoF) disciplinary codes were developed and linked. Appropriate design and analysis codes were defined according to their effect on the conceptual design process. Simultaneous optimization of the launch vehicle was performed at the discipline level and the system level. Propulsion, aerodynamics, structure and trajectory disciplinary codes were used. To reach the minimum launch weight, the low LoF code first searches the whole design space to achieve the mission requirements. The medium LoF code then takes the output of the low LoF code and gives a value near the optimum launch weight with more detail and higher fidelity.

  20. Multielemental analysis of 18 essential and toxic elements in amniotic fluid samples by ICP-MS: Full procedure validation and estimation of measurement uncertainty.

    PubMed

    Markiewicz, B; Sajnóg, A; Lorenc, W; Hanć, A; Komorowicz, I; Suliburska, J; Kocyłowski, R; Barałkiewicz, D

    2017-11-01

    Amniotic fluid is a substantial factor in the development of the embryo and fetus, since water and solutes contained in it penetrate the fetal membranes hydrostatically and osmotically as well as being swallowed by the fetus. The elemental composition of amniotic fluid influences the growth and health of the fetus; therefore, analysis of amniotic fluid is important because the results can indicate abnormal levels of minerals or toxic elements. Inductively coupled plasma mass spectrometry (ICP-MS) is often used for the determination of trace and ultra-trace level elements in a wide range of matrices, including biological samples, because of its unique analytical capabilities. In the case of trace and ultra-trace level analysis, detailed characterization of the analytical procedure as well as the properties of the analytical result are particularly important. The purpose of this study was to develop a new analytical procedure for multielemental analysis of 18 elements (Al, As, Ba, Ca, Cd, Co, Cr, Cu, Mg, Mn, Ni, Pb, Sb, Se, Sr, U, V and Zn) in amniotic fluid samples using ICP-MS. A dynamic reaction cell (DRC) with two reaction gases, ammonia and oxygen, was used to eliminate spectral interferences. Detailed validation was conducted using 3 certified reference materials (CRMs) and real amniotic fluid samples collected from patients. Repeatability for all analyzed analytes was found to range from 0.70% to 8.0%, and intermediate precision results varied from 1.3% to 15%. Trueness expressed as recovery ranged from 80% to 125%. Traceability was assured through the analyses of CRMs. Uncertainty of the results was also evaluated using a single-laboratory validation approach. The obtained expanded uncertainty (U) results for the CRMs, expressed as a percentage of the concentration of an analyte, were found to be between 8.3% for V and 45% for Cd. The standard uncertainty of the precision was found to have a greater influence on the combined standard uncertainty than the trueness factor. Copyright © 2017 Elsevier B.V. All rights reserved.
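    A minimal sketch of two of the validation quantities discussed (recovery against a certified value, and an expanded uncertainty U = k * u_c) follows; the combination shown is a simple quadrature of precision and trueness contributions under assumed values, much coarser than the paper's single-laboratory validation budget.

        # Minimal sketch: recovery and expanded uncertainty from assumed inputs.
        import math

        def recovery_percent(measured, certified):
            return 100.0 * measured / certified

        def expanded_uncertainty(u_precision, u_trueness, k=2.0):
            """Combine relative standard uncertainties in quadrature and expand with k (~95%)."""
            u_combined = math.sqrt(u_precision**2 + u_trueness**2)
            return k * u_combined

        # Hypothetical Cd result: measured 0.105 ug/L vs certified 0.100 ug/L, with
        # relative standard uncertainties of 6% (precision) and 4% (trueness)
        print(round(recovery_percent(0.105, 0.100), 1), "% recovery")
        print(round(expanded_uncertainty(6.0, 4.0), 1), "% expanded uncertainty (k=2)")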

  1. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, J.

    1999-01-01

    A new atmospheric objective analysis system designed for parallel computers is described. The system can produce a global analysis (on a 1 x 1 lat-lon grid with 18 levels of heights and winds and 10 levels of moisture) using 120,000 observations in 17 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g., MPI or multitasking), and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs, which will allow for much higher resolution and significant increases in input data. The system scales linearly with the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first-guess error statistics to produce the best estimate of the atmospheric state. Static tests with a 2 x 2.5 resolution version of this system showed its analysis increments are comparable to those of the latest NASA operational system, including maintenance of mass-wind balance. Results from several months of cycling tests in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) as the current operational system.
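    Since the abstract notes that the system has the characteristics of optimal interpolation, a minimal sketch of the standard OI update on a toy three-point grid is given below; the covariances and observations are invented and the gain formula is the textbook one, not anything specific to this analysis system.

        # Minimal sketch of an optimal-interpolation update:
        # analysis = background + K * (obs - H @ background), K = B H^T (H B H^T + R)^-1.
        import numpy as np

        xb = np.array([10.0, 12.0, 11.0])            # background (first guess) at 3 grid points
        B = np.array([[1.0, 0.5, 0.2],
                      [0.5, 1.0, 0.5],
                      [0.2, 0.5, 1.0]])              # background error covariance
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 0.0, 1.0]])              # two observations, at grid points 1 and 3
        R = 0.5 * np.eye(2)                          # observation error covariance
        y = np.array([11.0, 10.2])                   # observed values

        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R) # gain weighting obs vs. first-guess errors
        xa = xb + K @ (y - H @ xb)                   # analysis
        print(np.round(xa, 2))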

  2. Seal Technology Development for Advanced Component for Airbreathing Engines

    NASA Technical Reports Server (NTRS)

    Snyder, Philip H.

    2008-01-01

    Key aspects of the design of sealing systems for On Rotor Combustion/Wave Rotor (ORC/WR) systems were addressed. ORC/WR systems fit within a broad class of pressure-gain Constant Volume Combustors (CVCs) or Pulse Detonation Combustors (PDCs), which are currently being considered for use in many classes of turbine engines for dramatic efficiency improvement. The technology readiness level (TRL) of the ORC/WR approach is presently 2.0. Detailed modeling of an ORC/WR system applied to a regional jet engine showed that a high degree of pressure gain can be captured. Engine cycle analysis indicated specific fuel consumption (SFC) benefits of 17 percent. The potential losses in pressure gain due to leakage were found to be closely coupled to the wave processes at the rotor endpoints of the ORC/WR system. Extensive investigation into sealing approaches is reported. Sensitivity studies show that SFC gains of 10 percent remain available even when pressure gain levels are highly penalized. This indicates that ORC/WR systems have a high degree of tolerance to rotor leakage effects, but it also emphasizes their importance. An engine demonstration of an ORC/WR system is seen as key to progressing the TRL of this technology. An industrial engine was judged to be a highly advantageous platform for demonstration of a first-generation ORC/WR system. Prior to such a demonstration, the existing NASA pressure exchanger wave rotor rig was identified as an opportunity to apply the expanded analytical modeling capabilities developed within this program and to identify and fix leakage issues in this rig. Extensive leakage analysis of the rig was performed and a detailed design of additional sealing strategies for the rig was generated.

  3. Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henneke, Dennis W.; Robinson, James

    In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH’s Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all hazards/all modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus-areas for future R&D, and conclusions about the PRISM design.

  4. Comparison between Proteome and Transcriptome Response in Potato (Solanum tuberosum L.) Leaves Following Potato Virus Y (PVY) Infection.

    PubMed

    Stare, Tjaša; Stare, Katja; Weckwerth, Wolfram; Wienkoop, Stefanie; Gruden, Kristina

    2017-07-06

    Plant diseases caused by viral infection affect all major crops. Because viruses are obligate intracellular organisms, chemical control of these pathogens is so far not applied in the field, except to control the insect vectors of the viruses. Understanding the molecular responses of plant immunity is therefore economically important, guiding the reinforcement of crop resistance. To disentangle the complex regulatory mechanisms of the plant immune response, understanding the system as a whole is a must. However, integrating data from different molecular analyses (transcriptomics, proteomics, metabolomics, small RNA regulation, etc.) is not straightforward. We evaluated the response of potato ( Solanum tuberosum L.) following infection with potato virus Y (PVY). The response was analyzed on two molecular levels, with microarray transcriptome analysis and mass spectrometry-based proteomics. Within this report, we performed a detailed analysis of the results on both levels and compared two different approaches for the analysis of proteomic data (spectral count versus MaxQuant). To link the data on the different molecular levels, each protein was mapped to the corresponding potato transcript according to StNIB paralogue grouping. Only 33% of the proteins mapped to microarray probes in a one-to-one relation, and many additionally showed discordance between detected protein levels and the corresponding transcripts. We discussed the functional importance of true biological differences between both levels and showed that the reason for the discordance between transcript and protein abundance lies partly in the complexity and structure of the biological regulation of the proteome and transcriptome, and partly in technical issues contributing to it.

  5. Comparison between Proteome and Transcriptome Response in Potato (Solanum tuberosum L.) Leaves Following Potato Virus Y (PVY) Infection

    PubMed Central

    Stare, Tjaša; Stare, Katja; Weckwerth, Wolfram; Wienkoop, Stefanie

    2017-01-01

    Plant diseases caused by viral infection affect all major crops. Because viruses are obligate intracellular organisms, chemical control of these pathogens is so far not applied in the field, except to control the insect vectors of the viruses. Understanding the molecular responses of plant immunity is therefore economically important, guiding the reinforcement of crop resistance. To disentangle the complex regulatory mechanisms of the plant immune response, understanding the system as a whole is a must. However, integrating data from different molecular analyses (transcriptomics, proteomics, metabolomics, small RNA regulation, etc.) is not straightforward. We evaluated the response of potato (Solanum tuberosum L.) following infection with potato virus Y (PVY). The response was analyzed on two molecular levels, with microarray transcriptome analysis and mass spectrometry-based proteomics. Within this report, we performed a detailed analysis of the results on both levels and compared two different approaches for the analysis of proteomic data (spectral count versus MaxQuant). To link the data on the different molecular levels, each protein was mapped to the corresponding potato transcript according to StNIB paralogue grouping. Only 33% of the proteins mapped to microarray probes in a one-to-one relation, and many additionally showed discordance between detected protein levels and the corresponding transcripts. We discussed the functional importance of true biological differences between both levels and showed that the reason for the discordance between transcript and protein abundance lies partly in the complexity and structure of the biological regulation of the proteome and transcriptome, and partly in technical issues contributing to it. PMID:28684682

  6. Elastomeric Structural Attachment Concepts for Aircraft Flap Noise Reduction - Challenges and Approaches to Hyperelastic Structural Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Sreekantamurthy, Thammaiah; Turner, Travis L.; Moore, James B.; Su, Ji

    2014-01-01

    Airframe noise is a significant part of the overall noise of transport aircraft during the approach and landing phases of flight. Airframe noise reduction is currently emphasized under the Environmentally Responsible Aviation (ERA) and Fixed Wing (FW) Project goals of NASA. A promising concept for trailing-edge-flap noise reduction is a flexible structural element or link that connects the side edges of the deployable flap to the adjacent main-wing structure. The proposed solution is distinguished by minimization of the span-wise extent of the structural link, thereby minimizing the aerodynamic load on the link structure at the expense of increased deformation requirement. Development of such a flexible structural link necessitated application of hyperelastic materials, atypical structural configurations and novel interface hardware. The resulting highly-deformable structural concept was termed the FLEXible Side Edge Link (FLEXSEL) concept. Prediction of atypical elastomeric deformation responses from detailed structural analysis was essential for evaluating feasible concepts that met the design constraints. The focus of this paper is to describe the many challenges encountered with hyperelastic finite element modeling and the nonlinear structural analysis of evolving FLEXSEL concepts. Detailed herein is the nonlinear analysis of FLEXSEL concepts that emerged during the project which include solid-section, foamcore, hollow, extended-span and pre-stressed concepts. Coupon-level analysis performed on elastomeric interface joints, which form a part of the FLEXSEL topology development, are also presented.

  7. Universality and diversity of folding mechanics for three-helix bundle proteins.

    PubMed

    Yang, Jae Shick; Wallin, Stefan; Shakhnovich, Eugene I

    2008-01-22

    In this study we evaluate, at full atomic detail, the folding processes of two small helical proteins, the B domain of protein A and the Villin headpiece. Folding kinetics are studied by performing a large number of ab initio Monte Carlo folding simulations using a single transferable all-atom potential. Using these trajectories, we examine the relaxation behavior, secondary structure formation, and transition-state ensembles (TSEs) of the two proteins and compare our results with experimental data and previous computational studies. To obtain detailed structural information on the folding dynamics viewed as an ensemble process, we perform a clustering analysis procedure based on graph theory. Moreover, rigorous p(fold) analysis is used to obtain representative samples of the TSEs, and good quantitative agreement between experimental and simulated Phi values is obtained for protein A. Phi values for Villin are also obtained and left as predictions to be tested by future experiments. Our analysis shows that the two-helix hairpin is a common partially stable structural motif that forms before entry into the TSE in the studied proteins. These results, together with our earlier study of the Engrailed Homeodomain and recent experimental studies, provide a comprehensive, atomic-level picture of the folding mechanics of three-helix bundle proteins.

  8. Malassezia globosa and restricta: breakthrough understanding of the etiology and treatment of dandruff and seborrheic dermatitis through whole-genome analysis.

    PubMed

    Dawson, Thomas L

    2007-12-01

    Dandruff and seborrheic dermatitis (D/SD) share an etiology dependent upon three factors: sebum, microbial metabolism (specifically, Malassezia yeasts), and individual susceptibility. Advances in microbiological and analytical techniques permit a more detailed understanding of these etiologic factors, especially the role of Malassezia. Malassezia are lipid-dependent and demonstrate adaptation allowing them to exploit a narrow niche on sebum-rich skin. Work in our and our collaborators' laboratories has focused on understanding these adaptations by detailed analysis of biochemistry and gene expression. We have shown that Malassezia globosa and M. restricta predominate on dandruff scalp, that oleic acid alone can initiate dandruff-like desquamation, that M. globosa is the most likely initiating organism by virtue of its high lipase activity, and that an M. globosa lipase is expressed on human scalp. Considering the importance of M. globosa in D/SD (and the overall importance of commensal fungi), we have sequenced the M. globosa and M. restricta genomes. Genomic analysis indicates key adaptations to the skin environment, several of which yield important clues to the role Malassezia play in human disease. This work offers the promise of defining new treatments to D/SD that are targeted at changing the level or activities of Malassezia genes.

  9. Technical aspects and recommendations for single-cell qPCR.

    PubMed

    Ståhlberg, Anders; Kubista, Mikael

    2018-02-01

    Single cells are basic physiological and biological units that can function individually as well as in groups in tissues and organs. It is central to identify, characterize and profile single cells at the molecular level to be able to distinguish different kinds, to understand their functions and to determine how they interact with each other. During the last decade several technologies for single-cell profiling have been developed and used in various applications, revealing many novel findings. Quantitative PCR (qPCR) is one of the most developed methods for single-cell profiling and can be used to interrogate several analytes, including DNA, RNA and protein. Single-cell qPCR has the potential to become routine methodology, but the technique is still challenging, as it involves several experimental steps and only a few molecules are handled. Here, we discuss technical aspects and provide recommendations for single-cell qPCR analysis. The workflow includes experimental design, sample preparation, single-cell collection, direct lysis, reverse transcription, preamplification, qPCR and data analysis. Detailed reporting and sharing of experimental details and data will promote further development and make validation studies possible. Efforts aiming to standardize single-cell qPCR open up means to move single-cell analysis from specialized research settings to standard research laboratories. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. The International College of Neuropsychopharmacology (CINP) Treatment Guidelines for Bipolar Disorder in Adults (CINP-BD-2017), Part 1: Background and Methods of the Development of Guidelines

    PubMed Central

    Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried

    2017-01-01

    Abstract Background: This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. Methods: The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. Results: The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. Conclusion: The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. PMID:27815414

  11. Variational Bayesian Parameter Estimation Techniques for the General Linear Model

    PubMed Central

    Starke, Ludger; Ostwald, Dirk

    2017-01-01

    Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for both neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
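
    To make the modeling scenario concrete, the following minimal sketch estimates GLM parameters by generalized least squares under a known AR(1) error covariance, the kind of non-spherical first-level model the abstract refers to. It is not the authors' derivation or code; the design matrix, AR coefficient, and noise level are assumptions chosen for illustration.

```python
# Hedged sketch (not the authors' code): generalized least-squares / ML estimation
# of GLM parameters y = X @ beta + e under a known non-spherical (AR(1)) error
# covariance, as in first-level fMRI models. Design, rho and noise are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, rho = 200, 0.4

# Hypothetical design: a boxcar regressor plus an intercept.
X = np.column_stack([np.tile([1.0] * 10 + [0.0] * 10, n // 20), np.ones(n)])
beta_true = np.array([2.0, 1.0])

# AR(1) error covariance V_ij = rho^|i-j| and a matching noise realization.
idx = np.arange(n)
V = rho ** np.abs(idx[:, None] - idx[None, :])
y = X @ beta_true + np.linalg.cholesky(V) @ rng.standard_normal(n)

# GLS estimate given V: beta_hat = (X' V^-1 X)^-1 X' V^-1 y
Vinv = np.linalg.inv(V)
beta_hat = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
print(beta_hat)  # close to beta_true
```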

  12. Design Through Manufacturing: The Solid Model - Finite Element Analysis Interface

    NASA Technical Reports Server (NTRS)

    Rubin, Carol

    2003-01-01

    State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts which reflect every detail of the finished product. Ideally, these models should fulfill two very important functions: (1) they must provide numerical control information for automated manufacturing of precision parts, and (2) they must enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in space missions. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. The research performed during the fellowship period investigated the transition process from the solid CAD model to the FEA stress analysis model with the final goal of creating an automatic interface between the two. During the period of the fellowship a detailed multi-year program for the development of such an interface was created. The ultimate goal of this program will be the development of a fully parameterized automatic ProE/FEA translator for parts and assemblies, with the incorporation of data base management into the solution, and ultimately including computational fluid dynamics and thermal modeling in the interface.

  13. Heavy Metal Level in Human Semen with Different Fertility: a Meta-Analysis.

    PubMed

    Sun, Jiantao; Yu, Guangxia; Zhang, Yucheng; Liu, Xi; Du, Chuang; Wang, Lu; Li, Zhen; Wang, Chunhong

    2017-03-01

    There are conflicting reports on heavy metal levels in human semen with different fertilities. The purpose of this analysis is to merge and analyze the differences in lead (Pb), cadmium (Cd), zinc (Zn), and copper (Cu) levels in the semen of men with normal and low fertility. All articles in Chinese and English were collected from the PubMed, Web of Science, and Chinese National Knowledge Infrastructure (CNKI) databases from inception to February 19, 2016. We used RevMan software (version 5.2) for the meta-analysis and Stata software (version 12.0) for the meta-regression and sensitivity analyses. A total of 20 articles were included in the study. The meta-analysis indicates significant differences between fertility groups for three metals (Pb, Cd, Zn) but no significant difference for copper, detailed as follows: (i) 10 studies on lead concentrations with a standardized mean difference (SMD) = 2.07, 95% CI (0.97, 3.17), P < 0.01; (ii) 13 studies on cadmium concentrations with an SMD = 0.75, 95% CI (0.44, 1.07), P < 0.01; (iii) 8 studies on zinc concentrations with an SMD = -0.61, 95% CI (-1.08, -0.14), P < 0.01; and (iv) 9 studies on copper concentrations with an SMD = 0.42, 95% CI (-0.29, 1.13), P = 0.247. The results indicate that men with low fertility have higher semen Pb and Cd levels and lower semen Zn levels; more studies are needed to clarify the association of semen copper levels with fertility.
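
    For readers unfamiliar with how such pooled SMDs are produced, the sketch below pools per-study standardized mean differences with a DerSimonian-Laird random-effects model. It is not the published RevMan/Stata analysis; the study-level SMDs and variances are hypothetical.

```python
# Hedged sketch (not the published RevMan/Stata analysis): pooling standardized
# mean differences (SMDs) with a DerSimonian-Laird random-effects model.
# The per-study SMDs and sampling variances below are hypothetical.
import numpy as np

smd = np.array([1.8, 2.4, 1.1, 2.9])       # hypothetical per-study SMD estimates
var = np.array([0.20, 0.35, 0.15, 0.50])   # hypothetical per-study variances

w_fixed = 1.0 / var
mean_fixed = np.sum(w_fixed * smd) / np.sum(w_fixed)
q = np.sum(w_fixed * (smd - mean_fixed) ** 2)               # Cochran's Q
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(smd) - 1)) / c)                   # between-study variance

w = 1.0 / (var + tau2)                                      # random-effects weights
pooled = np.sum(w * smd) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled SMD = {pooled:.2f}, "
      f"95% CI = ({pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f})")
```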

  14. A Three Dimensional Kinematic and Kinetic Study of the Golf Swing

    PubMed Central

    Nesbit, Steven M.

    2005-01-01

    This paper discusses the three-dimensional kinematics and kinetics of a golf swing as performed by 84 male and one female amateur subjects of various skill levels. The analysis was performed using a variable full-body computer model of a human coupled with a flexible model of a golf club. Data to drive the model was obtained from subject swings recorded using a multi-camera motion analysis system. Model output included club trajectories, golfer/club interaction forces and torques, work and power, and club deflections. These data formed the basis for a statistical analysis of all subjects, and a detailed analysis and comparison of the swing characteristics of four of the subjects. The analysis generated much new data concerning the mechanics of the golf swing. It revealed that a golf swing is a highly coordinated and individual motion and subject-to-subject variations were significant. The study highlighted the importance of the wrists in generating club head velocity and orienting the club face. The trajectory of the hands and the ability to do work were the factors most closely related to skill level. Key Points Full-body model of the golf swing. Mechanical description of the golf swing. Statistical analysis of golf swing mechanics. Comparisons of subject swing mechanics PMID:24627665

  15. A three dimensional kinematic and kinetic study of the golf swing.

    PubMed

    Nesbit, Steven M

    2005-12-01

    This paper discusses the three-dimensional kinematics and kinetics of a golf swing as performed by 84 male and one female amateur subjects of various skill levels. The analysis was performed using a variable full-body computer model of a human coupled with a flexible model of a golf club. Data to drive the model was obtained from subject swings recorded using a multi-camera motion analysis system. Model output included club trajectories, golfer/club interaction forces and torques, work and power, and club deflections. These data formed the basis for a statistical analysis of all subjects, and a detailed analysis and comparison of the swing characteristics of four of the subjects. The analysis generated much new data concerning the mechanics of the golf swing. It revealed that a golf swing is a highly coordinated and individual motion and subject-to-subject variations were significant. The study highlighted the importance of the wrists in generating club head velocity and orienting the club face. The trajectory of the hands and the ability to do work were the factors most closely related to skill level. Key Points: Full-body model of the golf swing. Mechanical description of the golf swing. Statistical analysis of golf swing mechanics. Comparisons of subject swing mechanics.

  16. C-Band Airport Surface Communications System Engineering-Initial High-Level Safety Risk Assessment and Mitigation

    NASA Technical Reports Server (NTRS)

    Zelkin, Natalie; Henriksen, Stephen

    2011-01-01

    This document is being provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract: "New ATM Requirements--Future Communications, C-Band and L-Band Communications Standard Development." ITT has completed a safety hazard analysis providing a preliminary safety assessment for the proposed C-band (5091- to 5150-MHz) airport surface communication system. The assessment was performed following the guidelines outlined in the Federal Aviation Administration Safety Risk Management Guidance for System Acquisitions document. The safety analysis did not identify any hazards with an unacceptable risk, though a number of hazards with a medium risk were documented. This effort represents an initial high-level safety hazard analysis and notes the triggers for risk reassessment. A detailed safety hazards analysis is recommended as a follow-on activity to assess particular components of the C-band communication system after the profile is finalized and system rollout timing is determined. A security risk assessment has been performed by NASA as a parallel activity. While safety analysis is concerned with a prevention of accidental errors and failures, the security threat analysis focuses on deliberate attacks. Both processes identify the events that affect operation of the system; and from a safety perspective the security threats may present safety risks.

  17. NASA Multidisciplinary Design and Analysis Fellowship Program

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This report is a Year 1 interim report of the progress on the NASA multidisciplinary Design and Analysis Fellowship Program covering the period, January 1, 1995 through September 30, 1995. It summarizes progress in establishing the MDA Fellowship Program at Georgia Tech during the initial year. Progress in the advertisement of the program, recruiting results for the 1995-96 academic year, placement of the Fellows in industry during Summer 1995, program development at the M.S. and Ph.D. levels, and collaboration and dissemination of results are summarized in this report. Further details of the first year's progress will be included in the report from the Year 1 Workshop to be held at NASA Langley on December 7-8, 1995.

  18. Part A: Cirrus ice crystal nucleation and growth. Part B: Automated analysis of aircraft ice particle data

    NASA Technical Reports Server (NTRS)

    Arnott, William P.; Hallett, John; Hudson, James G.

    1995-01-01

    Specific measurements of cirrus crystals by aircraft and of temperature-modified CN are used to specify the measurements necessary to provide a basis for a conceptual model of cirrus particle formation. Key to this is the ability to measure the complete spectrum of particles at cirrus levels. The most difficult size range for such measurement, from a few microns to 100 microns, requires a replicator. The details of the system to automate replicator data analysis are given, together with an example case study from a cirrus cloud in FIRE 2, with particles detectable by the replicator and FSSP, but not the 2DC.

  19. Characterization of Microgravity Environment on Mir

    NASA Technical Reports Server (NTRS)

    Kim, Hyoung; Kaouk, Mohamed

    2000-01-01

    This paper presents the microgravity analysis results using dynamic response data collected during the first phase of the Mir Structural Dynamics Experiment (MiSDE). Although MiSDE was designed and performed to verify structural dynamic models, it also provided information for determining microgravity characteristics of the structure. This study analyzed ambient responses acquired during orbital day-to-night and night-to-day transitions, crew treadmill and ergometer exercises, and intentional crew activities. Acceleration levels for one-third octave bands were calculated to characterize the microgravity environment of the station. Spectrograms were also used to analyze the time transient nature of the responses. Detailed theoretical background and analysis results will also be included in the final draft.
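
    The one-third-octave-band characterization mentioned above can be illustrated with the following sketch, which integrates a Welch power spectral density over base-2 third-octave bands to obtain RMS acceleration per band. It is not the MiSDE processing chain; the sampling rate, record length, and synthetic acceleration signal are assumptions.

```python
# Hedged sketch (not the MiSDE processing chain): RMS acceleration per one-third-
# octave band, obtained by summing a Welch power spectral density over each band.
# Sample rate, record length and the synthetic signal are assumptions.
import numpy as np
from scipy.signal import welch

fs = 250.0                                    # assumed sample rate [Hz]
t = np.arange(0, 60, 1 / fs)
accel = 1e-4 * np.sin(2 * np.pi * 17 * t)     # synthetic acceleration record [g]

f, psd = welch(accel, fs=fs, nperseg=4096)    # one-sided PSD in g^2/Hz
df = f[1] - f[0]

centers = 2.0 ** (np.arange(-6, 20) / 3.0)    # third-octave centers, ~0.25-80 Hz
for fc in centers:
    lo, hi = fc * 2 ** (-1 / 6), fc * 2 ** (1 / 6)
    band = (f >= lo) & (f < hi)
    g_rms = np.sqrt(psd[band].sum() * df) if band.any() else 0.0
    print(f"{fc:7.2f} Hz : {g_rms:.2e} g_RMS")
```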

  20. [The application of inductively coupled plasma atomic emission spectrometry/mass spectrometry to the analysis of advanced ceramic materials].

    PubMed

    Wang, Zheng; Wang, Shi-Wei; Qiu, De-Ren; Yang, Peng-Yuan

    2009-10-01

    Advanced ceramics have been applied to various important fields such as information science, aeronautics and astronautics, and life sciences. However, the optical and electrical properties of ceramics are significantly affected by the micro- and trace-level impurities in the material, even at very low concentration levels. Thus, the accurate determination of impurities is important for materials preparation and performance. The methodology of analysis of advanced ceramic materials using ICP-AES/MS over the past decade is reviewed in the present paper. Various techniques of sample introduction, especially advances in the authors' recent work, are described in detail. The developing trend is also presented. Sixty references are cited.

  1. A system for analysis and classification of voice communications

    NASA Technical Reports Server (NTRS)

    Older, H. J.; Jenney, L. L.; Garland, L.

    1973-01-01

    A method for analysis and classification of verbal communications typically associated with manned space missions or simulations was developed. The study was carried out in two phases. Phase 1 was devoted to identification of crew tasks and activities which require voice communication for accomplishment or reporting. Phase 2 entailed development of a message classification system and a preliminary test of its feasibility. The classification system permits voice communications to be analyzed to three progressively more specific levels of detail and to be described in terms of message content, purpose, and the participants in the information exchange. A coding technique was devised to allow messages to be recorded by an eight-digit number.
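
    The report does not specify the digit layout of the eight-digit code, so the sketch below is purely illustrative: it shows one hypothetical way to pack three levels of content detail, the message purpose, and the two participants into an eight-digit message code.

```python
# Illustrative only: the report does not give the digit layout, so the field
# widths below are hypothetical. The sketch packs three levels of content detail,
# a purpose code and the two participants into an eight-digit message code.
from dataclasses import dataclass

@dataclass
class MessageCode:
    content_l1: int   # broad content category (1 digit)
    content_l2: int   # more specific content (1 digit)
    content_l3: int   # most specific content (1 digit)
    purpose: int      # message purpose (1 digit)
    sender: int       # sending participant (2 digits)
    receiver: int     # receiving participant (2 digits)

    def as_eight_digits(self) -> str:
        return (f"{self.content_l1}{self.content_l2}{self.content_l3}"
                f"{self.purpose}{self.sender:02d}{self.receiver:02d}")

print(MessageCode(3, 1, 4, 2, 7, 12).as_eight_digits())  # "31420712"
```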

  2. Developing the Coach Analysis and Intervention System (CAIS): establishing validity and reliability of a computerised systematic observation instrument.

    PubMed

    Cushion, Christopher; Harvey, Stephen; Muir, Bob; Nelson, Lee

    2012-01-01

    We outline the evolution of a computerised systematic observation tool and describe the process for establishing the validity and reliability of this new instrument. The Coach Analysis and Interventions System (CAIS) has 23 primary behaviours related to physical behaviour, feedback/reinforcement, instruction, verbal/non-verbal, questioning and management. The instrument also analyses secondary coach behaviour related to performance states, recipient, timing, content and questioning/silence. The CAIS is a multi-dimensional and multi-level mechanism able to provide detailed and contextualised data about specific coaching behaviours occurring in complex and nuanced coaching interventions and environments that can be applied to both practice sessions and competition.

  3. Development of Improved Models, Stochasticity, and Frameworks for the MIT Extensible Air Network Simulation

    NASA Technical Reports Server (NTRS)

    Clarke, John-Paul

    2004-01-01

    MEANS, the MIT Extensible Air Network Simulation, was created in February of 2001, and has been developed with support from NASA Ames since August of 2001. MEANS is a simulation tool which is designed to maximize fidelity without requiring data of such a low level as to preclude easy examination of alternative scenarios. To this end, MEANS is structured in a modular fashion to allow more detailed components to be brought in when desired, and left out when they would only be an impediment. Traditionally, one of the difficulties with high-fidelity models is that they require a level of detail in their data that is difficult to obtain. For analysis of past scenarios, the required data may not have been collected, or may be considered proprietary and thus difficult for independent researchers to obtain. For hypothetical scenarios, generation of the data is sufficiently difficult to be a task in and of itself. Often, simulations designed by a researcher will model exactly one element of the problem well and in detail, while assuming away other parts of the problem which are not of interest or for which data is not available. While these models are useful for working with the task at hand, they are very often not applicable to future problems. The MEANS simulation attempts to address these problems by using a modular design which provides components of varying fidelity for each aspect of the simulation. This allows the most accurate model for which data is available to be used. It also provides for easy analysis of sensitivity to data accuracy. This can be particularly useful in the case where accurate data is available for some subset of the situations that are to be considered. Furthermore, the ability to use the same model while examining effects on different parts of a system reduces the time spent learning the simulation, and provides for easier comparisons between changes to different parts of the system.

  4. A wavelet based method for automatic detection of slow eye movements: a pilot study.

    PubMed

    Magosso, Elisa; Provini, Federica; Montagna, Pasquale; Ursino, Mauro

    2006-11-01

    Electro-oculographic (EOG) activity during the wake-sleep transition is characterized by the appearance of slow eye movements (SEM). The present work describes an algorithm for the automatic localisation of SEM events from EOG recordings. The algorithm is based on a wavelet multiresolution analysis of the difference between right and left EOG tracings, and includes three main steps: (i) wavelet decomposition down to 10 detail levels (i.e., 10 scales), using Daubechies order 4 wavelet; (ii) computation of energy in 0.5s time steps at any level of decomposition; (iii) construction of a non-linear discriminant function expressing the relative energy of high-scale details to both high- and low-scale details. The main assumption is that the value of the discriminant function increases above a given threshold during SEM episodes due to energy redistribution toward higher scales. Ten EOG recordings from ten male patients with obstructive sleep apnea syndrome were used. All tracings included a period from pre-sleep wakefulness to stage 2 sleep. Two experts inspected the tracings separately to score SEMs. A reference set of SEM (gold standard) were obtained by joint examination by both experts. Parameters of the discriminant function were assigned on three tracings (design set) to minimize the disagreement between the system classification and classification by the two experts; the algorithm was then tested on the remaining seven tracings (test set). Results show that the agreement between the algorithm and the gold standard was 80.44+/-4.09%, the sensitivity of the algorithm was 67.2+/-7.37% and the selectivity 83.93+/-8.65%. However, most errors were not caused by an inability of the system to detect intervals with SEM activity against NON-SEM intervals, but were due to a different localisation of the beginning and end of some SEM episodes. The proposed method may be a valuable tool for computerized EOG analysis.
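
    A minimal sketch of this kind of analysis is given below, using the PyWavelets library: a Daubechies-4 decomposition to 10 detail levels, energy computed in 0.5 s windows, and a discriminant function taken here as the fraction of energy in the slow (high-scale) details. The sampling rate, the level grouping, the threshold, and the exact form of the discriminant function are assumptions, not the authors' implementation.

```python
# Hedged sketch, not the authors' implementation: multiresolution analysis of a
# bipolar EOG trace (right minus left) with a Daubechies-4 wavelet, 10 detail
# levels, energy in 0.5 s windows, and a discriminant function taken here as the
# fraction of energy in the slow (high-scale) details. Sampling rate, level
# grouping and threshold are assumptions.
import numpy as np
import pywt

fs = 128                                         # assumed EOG sampling rate [Hz]
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
eog = np.sin(2 * np.pi * 0.3 * t) + 0.2 * rng.standard_normal(t.size)  # toy R-L EOG

wavelet, levels, step = "db4", 10, int(0.5 * fs)
coeffs = pywt.wavedec(eog, wavelet, level=levels)

def detail_signal(j):
    """Time-domain detail signal at decomposition level j (1 = finest scale)."""
    c = [np.zeros_like(a) for a in coeffs]
    c[levels - j + 1] = coeffs[levels - j + 1]   # keep only cD_j
    return pywt.waverec(c, wavelet)[: eog.size]

# Energy per 0.5 s window at every level, then the relative energy of the
# high-scale (slow) details, levels 7-10, as the discriminant function.
energies = np.array([[np.sum(d[i:i + step] ** 2)
                      for i in range(0, eog.size - step + 1, step)]
                     for d in (detail_signal(j) for j in range(1, levels + 1))])
discriminant = energies[6:].sum(axis=0) / (energies.sum(axis=0) + 1e-12)
sem_windows = discriminant > 0.8                 # placeholder threshold
print(np.where(sem_windows)[0] * 0.5)            # start times [s] of candidate SEM windows
```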

  5. Exterior building details of Building A; west façade: white painted ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Exterior building details of Building A; west façade: white painted brick wall of road and second level, road level: paired four-light casement window and a small single-light wood casement window; second level: four-over-four wood double-hung window and a six-light horizontal pivot over a three-light fixed window; easterly view - San Quentin State Prison, Building 22, Point San Quentin, San Quentin, Marin County, CA

  6. PREDICTIONS OF STREAM WOOD RECRUITMENT FROM RIPARIAN FORESTS: EFFECTS OF DATA RESOLUTION

    EPA Science Inventory

    We evaluate whether different levels of detail of riparian forest characterizations result in different predictions of stream wood recruitment from riparian forests in northwestern Oregon. If less detailed information provides the same estimate of this function as more detailed i...

  7. Mapping Patterns and Trends in the Spatial Availability of Alcohol Using Low-Level Geographic Data: A Case Study in England 2003-2013.

    PubMed

    Angus, Colin; Holmes, John; Maheswaran, Ravi; Green, Mark A; Meier, Petra; Brennan, Alan

    2017-04-12

    Much literature examines the relationship between the spatial availability of alcohol and alcohol-related harm. This study aims to address an important gap in this evidence by using detailed outlet data to examine recent temporal trends in the sociodemographic distribution of spatial availability for different types of alcohol outlet in England. Descriptive analysis of measures of alcohol outlet density and proximity using extremely high resolution market research data stratified by outlet type and quintiles of area-level deprivation from 2003, 2007, 2010 and 2013 was undertaken and hierarchical linear growth models fitted to explore the significance of socioeconomic differences. We find that overall availability of alcohol changed very little from 2003 to 2013 (density +1.6%), but this conceals conflicting trends by outlet type and area-level deprivation. Mean on-trade density has decreased substantially (-2.2 outlets within 1 km (Inter-Quartile Range (IQR) -3 to 0)), although access to restaurants has increased (+1.0 outlets (IQR 0-1)), while off-trade access has risen substantially (+2.4 outlets (IQR 0-3)). Availability is highest in the most deprived areas (p < 0.0001), although these areas have also seen the greatest falls in on-trade outlet availability (p < 0.0001). This study underlines the importance of using detailed, low-level geographic data to understand patterns and trends in the spatial availability of alcohol. There are significant variations in these trends by outlet type and deprivation level which may have important implications for health inequalities and public health policy.
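
    The outlet-density measure described above (outlets within 1 km of a location) can be computed efficiently with a spatial index. The sketch below is illustrative only, since the market research data are not public; the coordinates are synthetic and assumed to be in a projected metric system such as the British National Grid.

```python
# Illustrative sketch only (the study's market research data are not public):
# counting alcohol outlets within 1 km of each residential location with a KD-tree.
# Coordinates are synthetic and assumed to be in a projected metric CRS (metres).
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
outlets = rng.uniform(0, 10_000, size=(500, 2))   # outlet easting/northing [m]
homes = rng.uniform(0, 10_000, size=(1_000, 2))   # residential locations [m]

tree = cKDTree(outlets)
counts = np.array([len(ix) for ix in tree.query_ball_point(homes, r=1_000)])
print(f"mean outlets within 1 km: {counts.mean():.1f}, "
      f"IQR: {np.percentile(counts, 25):.0f}-{np.percentile(counts, 75):.0f}")
```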

  8. Mapping Patterns and Trends in the Spatial Availability of Alcohol Using Low-Level Geographic Data: A Case Study in England 2003–2013

    PubMed Central

    Angus, Colin; Holmes, John; Maheswaran, Ravi; Green, Mark A.; Meier, Petra; Brennan, Alan

    2017-01-01

    Much literature examines the relationship between the spatial availability of alcohol and alcohol-related harm. This study aims to address an important gap in this evidence by using detailed outlet data to examine recent temporal trends in the sociodemographic distribution of spatial availability for different types of alcohol outlet in England. Descriptive analysis of measures of alcohol outlet density and proximity using extremely high resolution market research data stratified by outlet type and quintiles of area-level deprivation from 2003, 2007, 2010 and 2013 was undertaken and hierarchical linear growth models fitted to explore the significance of socioeconomic differences. We find that overall availability of alcohol changed very little from 2003 to 2013 (density +1.6%), but this conceals conflicting trends by outlet type and area-level deprivation. Mean on-trade density has decreased substantially (−2.2 outlets within 1 km (Inter-Quartile Range (IQR) −3–0), although access to restaurants has increased (+1.0 outlets (IQR 0–1)), while off-trade access has risen substantially (+2.4 outlets (IQR 0–3)). Availability is highest in the most deprived areas (p < 0.0001) although these areas have also seen the greatest falls in on-trade outlet availability (p < 0.0001). This study underlines the importance of using detailed, low-level geographic data to understand patterns and trends in the spatial availability of alcohol. There are significant variations in these trends by outlet type and deprivation level which may have important implications for health inequalities and public health policy. PMID:28417941

  9. Analysis respons to the implementation of nuclear installations safety culture using AHP-TOPSIS

    NASA Astrophysics Data System (ADS)

    Situmorang, J.; Kuntoro, I.; Santoso, S.; Subekti, M.; Sunaryo, G. R.

    2018-02-01

    An analysis of responses to the implementation of nuclear installation safety culture has been carried out using AHP (Analytic Hierarchy Process) and TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution). Safety culture is considered as the collective commitment of the decision-making level, the management level, and the individual level; each level therefore provides a subjective perspective as an alternative approach to implementation. Safety culture is further described by five characteristics which, in more detailed form, consist of 37 attributes, and it can therefore be expressed as a multi-attribute state. These characteristics and attributes serve as criteria whose values are difficult to determine, and they strongly influence the implementation of the corresponding safety culture. The pattern and magnitude of this influence are determined using TOPSIS, which is based on a decision matrix composed of alternatives and criteria; the weight of each criterion is determined by the AHP technique. The data were collected through questionnaires at the 2015 workshop on safety and health. A reliability test of the data gives a Cronbach's alpha of 95.5%, which meets the reliability criterion. A validity test using bivariate correlation analysis between attributes shows that the Pearson correlations for all attributes are significant at the 0.01 level. Confirmatory factor analysis gives a Kaiser-Meyer-Olkin measure of sampling adequacy (KMO) of 0.719, which exceeds the acceptance criterion of 0.5, with a significance level of 0.000, well below 0.05, indicating that further analysis can be performed. The analysis finds that responses from the decision-maker level (second echelon) dominate the preference ranking as the best solution for strengthening nuclear installation safety culture, except for the first characteristic, safety is a clearly recognized value. The rank of preference order is obtained sequentially for the policy-maker, management, and individual (staff) levels.
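
    A minimal sketch of the AHP-TOPSIS combination is given below: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix (AHP), and the three respondent levels are then ranked by closeness to the ideal solution (TOPSIS). The judgments, scores, and the reduced three-criterion setup are hypothetical, not the workshop data.

```python
# Hedged sketch (judgments and scores are hypothetical, not the workshop data):
# AHP weights from the principal eigenvector of a pairwise comparison matrix,
# followed by TOPSIS ranking of the three respondent levels against three criteria.
import numpy as np

# AHP: Saaty-scale pairwise comparison of three criteria (illustrative values).
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])
vals, vecs = np.linalg.eig(pairwise)
w = np.real(vecs[:, np.argmax(np.real(vals))])
weights = w / w.sum()                                # criterion weights

# TOPSIS: decision matrix, rows = alternatives, columns = benefit-type criteria.
alternatives = ["decision-maker", "management", "individual/staff"]
D = np.array([[4.2, 3.9, 4.5],
              [3.8, 4.1, 4.0],
              [3.5, 3.7, 3.9]])
R = D / np.sqrt((D ** 2).sum(axis=0))                # vector normalization
V = R * weights                                      # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)
d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
closeness = d_minus / (d_plus + d_minus)             # relative closeness to ideal
for name, c in sorted(zip(alternatives, closeness), key=lambda x: -x[1]):
    print(f"{name:16s} closeness = {c:.3f}")
```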

  10. The Impacts of Water Quality and Food Availability on Children's Health in West Africa: A Spatial Analysis Using Remotely Sensed Data and Small-Scale Water Quality Data and Country-level Health Data

    NASA Astrophysics Data System (ADS)

    Frederick, L.; Grace, K.; Lloyd, B.

    2014-12-01

    As the global climate changes and the populations of many African countries grow, ensuring clean drinking water and food has become a pressing concern. Because of their vulnerability to malnutrition and food insecurity, children face the greatest risk for adverse health outcomes related to climate change. Vulnerability, however, is highly variable, with some children in food insecure communities showing healthy growth, while some children in food secure communities show signs of malnutrition. In West Africa, Burkina Faso faces high levels of child malnutrition, losses of farmland, and a large share of the population has no access to clean water. Because the overwhelming majority of children rely on locally grown, rainfed agriculture and well/surface water, the combined impact of climate change and population growth decreases water availability and farmland per person. However, there is notable community and individual variation in malnutrition levels suggesting that there are important coping strategies that vulnerable families may use to secure their children's health. No spatially relevant analysis of water and food insecurity and children's health exists for Burkina Faso. The goal of this research is to identify and quantify the combined and inter-related impact of unsafe drinking water and community-level food availability on the physical health outcomes of Burkinabe children under five years of age. To accomplish this goal we rely on a publicly available, highly detailed, geo-referenced data set (Demographic and Health Survey (DHS)) to provide information on measures of childhood malnutrition and details on parental characteristics related to children's health. Information on water source (covered/uncovered well, piped water, etc.) and water quality (measures of arsenic and pollution) comes from DHS along with a recently collected geo-referenced US Agency for International Development (USAID) data set. Critical information on food production, environmental characteristics and population density comes from high resolution remotely sensed data.

  11. The Impacts of Water Quality and Food Availability on Children's Health in West Africa: A Spatial Analysis Using Remotely Sensed Data and Small-Scale Water Quality Data and Country-level Health Data

    NASA Astrophysics Data System (ADS)

    Frederick, L.; Grace, K.; Lloyd, B.

    2015-12-01

    As the global climate changes and the populations of many African countries grow, ensuring clean drinking water and food has become a pressing concern. Because of their vulnerability to malnutrition and food insecurity, children face the greatest risk for adverse health outcomes related to climate change. Vulnerability, however, is highly variable, with some children in food insecure communities showing healthy growth, while some children in food secure communities show signs of malnutrition. In West Africa, Burkina Faso faces high levels of child malnutrition, losses of farmland, and a large share of the population has no access to clean water. Because the overwhelming majority of children rely on locally grown, rainfed agriculture and well/surface water, the combined impact of climate change and population growth decreases water availability and farmland per person. However, there is notable community and individual variation in malnutrition levels suggesting that there are important coping strategies that vulnerable families may use to secure their children's health. No spatially relevant analysis of water and food insecurity and children's health exists for Burkina Faso. The goal of this research is to identify and quantify the combined and inter-related impact of unsafe drinking water and community-level food availability on the physical health outcomes of Burkinabe children under five years of age. To accomplish this goal we rely on a publicly available, highly detailed, geo-referenced data set (Demographic and Health Survey (DHS)) to provide information on measures of childhood malnutrition and details on parental characteristics related to children's health. Information on water source (covered/uncovered well, piped water, etc.) and water quality (measures of arsenic and pollution) comes from DHS along with a recently collected geo-referenced US Agency for International Development (USAID) data set. Critical information on food production, environmental characteristics and population density comes from high resolution remotely sensed data.

  12. High Speed Cylindrical Roller Bearing Analysis, SKF Computer Program CYBEAN. Volume 1: Analysis

    NASA Technical Reports Server (NTRS)

    Kleckner, R. J.; Pirvics, J.

    1978-01-01

    The CYBEAN (CYlindrical BEaring ANalysis) program was created to detail radially loaded, aligned and misaligned cylindrical roller bearing performance under a variety of operating conditions. The models and associated mathematics used within CYBEAN are described. The user is referred to the material for formulation assumptions and algorithm detail.

  13. Hydrology of Indiana lakes

    USGS Publications Warehouse

    Perrey, Joseph Irving; Corbett, Don Melvin

    1956-01-01

    The stabilization of lake levels often requires the construction of outlet control structures. A detailed study of past lake-level elevations and other hydrologic data is necessary to establish a level that can be maintained and to determine the means necessary for maintaining the established level. Detailed lake-level records for 28 lakes are included in the report, and records for over 100 other lakes are available in the U.S. Geological Survey office, Indianapolis, Ind. Evaporation data from the four Class A evaporation stations of the U.S. Weather Bureau have been compiled in this report. A table showing the established legal lake level and related data is included.

  14. Lithofacies, paleoenvironment and high-resolution stratigraphy of the D5 and D6 members of the Middle Jurassic carbonates Dhruma Formation, outcrop analog, central Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Yousif, Ibrahim M.; Abdullatif, Osman M.; Makkawi, Mohammad H.; Bashri, Mazin A.; Abdulghani, Waleed M.

    2018-03-01

    This study characterizes the lithofacies, paleoenvironment and stratigraphic architecture of the D5 and D6 members of carbonates Dhruma Formation outcrops in central Saudi Arabia. The study integrates detailed lithofacies analysis based on vertical and lateral profiles, in addition to thin-sections petrography to reveal the high-resolution architecture framework. Nine lithofacies types (LFTs) were defined namely: (1) skeletal peletal spiculitic wackestone (15%), (2) peloidal echinoderm packstone (19%), (3) fissile shale (36%), (4) peloidal spiculitic echinoderm pack-grainstone (5%), (5) cross-bedded peloidal skeletal oolitic grainstone (7%), (6) oolitic grainstone (2%), (7) intraformational rudstone (<1%), (8) skeletal peloidal foraminiferal packstone (12%) and (9) skeletal foraminiferal wackestone (4%). These lithofacies types were grouped into five major carbonate paleoenvironments that range from distal-to-proximal carbonate ramp setting. The detailed stratigraphic analysis revealed around 53 cycles and cycle sets with 5th to 6th orders magnitude, and thickness ranges from a few centimeters up to 6 m with an average of 1.5 m. Those are stacked to form four high-frequency sequences with thickness range from 1 m up to 14 m. The latter were grouped into a single depositional sequence of 3rd order magnitude. The architectural analysis also shows that the potential reservoir units were intensively affected by muddy-textured rocks which act as reservoir seals. These variations in the stratigraphic sequences in Middle Jurassic Dhruma Formation and its equivalents could be attributed to the eustatic sea-level changes, climate, tectonics, and local paleoenvironments. This study attempts to provide detailed insight into reservoir heterogeneity and architecture. The analog may help to understand and predict lithofacies heterogeneity, architecture, and quality in the subsurface equivalent reservoirs.

  15. Vehicle Design Evaluation Program (VDEP). A computer program for weight sizing, economic, performance and mission analysis of fuel-conservative aircraft, multibodied aircraft and large cargo aircraft using both JP and alternative fuels

    NASA Technical Reports Server (NTRS)

    Oman, B. H.

    1977-01-01

    The NASA Langley Research Center vehicle design evaluation program (VDEP-2) was expanded by (1) incorporating into the program a capability to conduct preliminary design studies on subsonic commercial transport type aircraft using both JP and such alternate fuels as hydrogen and methane; (2) incorporating an aircraft detailed mission and performance analysis capability; and (3) developing and incorporating an external loads analysis capability. The resulting computer program (VDEP-3) provides a preliminary design tool that enables the user to perform integrated sizing, structural analysis, and cost studies on subsonic commercial transport aircraft. Both versions of the VDEP-3 program, designated Preliminary Analysis VDEP-3 and Detailed Analysis VDEP, utilize the same vehicle sizing subprogram, which includes a detailed mission analysis capability as well as a geometry and weight analysis for multibodied configurations.

  16. Identification of pumping influences in long-term water level fluctuations.

    PubMed

    Harp, Dylan R; Vesselinov, Velimir V

    2011-01-01

    Identification of the pumping influences at monitoring wells caused by spatially and temporally variable water supply pumping can be a challenging, yet important hydrogeological task. The information that can be obtained can be critical for conceptualization of the hydrogeological conditions and indications of the zone of influence of the individual pumping wells. However, the pumping influences are often intermittent and small in magnitude, with variable production rates from multiple pumping wells. While these difficulties may support an inclination to abandon the existing dataset and conduct a dedicated cross-hole pumping test, that option can be challenging and expensive to coordinate and execute. This paper presents a method that applies a simple analytical model to a long-term water level record within an inverse modeling framework. The methodology allows the identification of pumping wells influencing the water level fluctuations. Thus, the analysis provides an efficient and cost-effective alternative to designed and coordinated cross-hole pumping tests. We apply this method to a dataset from the Los Alamos National Laboratory site. Our analysis also provides (1) an evaluation of the information content of the transient water level data; (2) indications of potential structures of the aquifer heterogeneity inhibiting or promoting pressure propagation; and (3) guidance for the development of more complicated models requiring detailed specification of the aquifer heterogeneity.
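
    In the spirit of the approach described, the sketch below superposes Theis drawdowns from two intermittently started supply wells at a monitoring well and estimates transmissivity and storativity by least squares. It is not the authors' code; the pumping rates, distances, switch-on times, and the synthetic observed record are hypothetical.

```python
# Hedged sketch in the spirit of the approach described (not the authors' code):
# superposed Theis drawdowns at a monitoring well from two intermittently started
# supply wells, with transmissivity T and storativity S fitted by least squares.
# Rates, distances, start times and the "observed" record are hypothetical.
import numpy as np
from scipy.special import exp1
from scipy.optimize import least_squares

def theis(Q, r, t, T, S):
    """Theis drawdown [m] for rate Q [m^3/d], distance r [m], elapsed time t [d]."""
    t = np.asarray(t, dtype=float)
    s = np.zeros_like(t)
    on = t > 0
    u = r ** 2 * S / (4.0 * T * t[on])
    s[on] = Q / (4.0 * np.pi * T) * exp1(u)      # well function W(u) = E1(u)
    return s

t = np.linspace(0.0, 200.0, 400)                 # observation times [d]
wells = [dict(Q=800.0, r=350.0, t_on=20.0),      # hypothetical supply wells
         dict(Q=500.0, r=600.0, t_on=90.0)]

def model(params, t):
    T, S = params
    return sum(theis(w["Q"], w["r"], t - w["t_on"], T, S) for w in wells)

rng = np.random.default_rng(3)
observed = model([450.0, 2e-4], t) + 0.01 * rng.standard_normal(t.size)

fit = least_squares(lambda p: model(p, t) - observed, x0=[200.0, 1e-3],
                    bounds=([1.0, 1e-6], [5000.0, 1e-1]))
print("estimated T [m^2/d], S [-]:", fit.x)
```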

  17. Molecular and Biochemical Characterization of a Cytokinin Oxidase from Maize

    PubMed Central

    Bilyeu, Kristin D.; Cole, Jean L.; Laskey, James G.; Riekhof, Wayne R.; Esparza, Thomas J.; Kramer, Michelle D.; Morris, Roy O.

    2001-01-01

    It is generally accepted that cytokinin oxidases, which oxidatively remove cytokinin side chains to produce adenine and the corresponding isopentenyl aldehyde, play a major role in regulating cytokinin levels in planta. Partially purified fractions of cytokinin oxidase from various species have been studied for many years, but have yet to clearly reveal the properties of the enzyme or to define its biological significance. Details of the genomic organization of the recently isolated maize (Zea mays) cytokinin oxidase gene (ckx1) and some of its Arabidopsis homologs are now presented. Expression of an intronless ckx1 in Pichia pastoris allowed production of large amounts of recombinant cytokinin oxidase and facilitated detailed kinetic and cofactor analysis and comparison with the native enzyme. The enzyme is a flavoprotein containing covalently bound flavin adenine dinucleotide, but no detectable heavy metals. Expression of the oxidase in maize tissues is described. PMID:11154345

  18. The Microstructure of RR1000 Nickel-Base Superalloy: The FIB-SEM Dual-Beam Approach

    NASA Astrophysics Data System (ADS)

    Croxall, S. A.; Hardy, M. C.; Stone, H. J.; Midgley, P. A.

    Nickel-base superalloys are aerospace materials that exhibit exceptional mechanical properties and corrosion resistance at very high temperatures. RR1000 is used in discs in gas turbine engines, where temperatures reach in excess of 650°C with high mechanical stresses. Study of the microstructure at the micron and sub-micron level has conventionally been undertaken using scanning electron microscope images, often meaning the underlying 3D microstructure can be inferred only with additional knowledge. Using a dual-beam workstation, we are able to interrogate directly the 3D microstructure using a serial sectioning approach. The 3D data set, typically (10 µm)³ in volume, reveals microstructural detail with a lateral resolution of circa 8 nm and a depth resolution dictated by the slice thickness, typically 50 nm. Morphological and volumetric analysis of the 3D reconstruction of RR1000 superalloy reveals microstructural details hitherto unseen.
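
    As an example of the morphological and volumetric analysis mentioned, the sketch below computes a volume fraction and per-precipitate volumes from a binary 3D stack using the anisotropic voxel size quoted above (8 nm x 8 nm laterally, 50 nm slice thickness). The segmentation itself is not shown and the stack here is a random placeholder.

```python
# Illustrative only (segmentation is not shown): volume fraction and per-precipitate
# volumes from a binary 3D FIB-SEM stack, using the anisotropic voxel size quoted
# above (8 nm x 8 nm lateral, 50 nm slice thickness). The stack is a random placeholder.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(7)
stack = rng.random((20, 128, 128)) > 0.97        # placeholder binary segmentation
voxel_nm3 = 50.0 * 8.0 * 8.0                     # voxel volume [nm^3]

labels, n = ndimage.label(stack)                 # connected precipitates
volumes = ndimage.sum(stack, labels, index=np.arange(1, n + 1)) * voxel_nm3

print(f"volume fraction: {stack.mean():.3f}")
print(f"{n} connected precipitates, mean volume {volumes.mean():.0f} nm^3")
```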

  19. Contracts for dispatchable power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahn, E.P.; Stoft, S.; Marnay, C.

    1990-10-01

    Competitive bidding for electric power is maturing. Increasing numbers of utilities are soliciting proposals from private suppliers. The amount of capacity being sought is increasing, and potential suppliers appear to be abundant. Analysis of these developments still remains limited. Evidence on the behavior of this market is scarce and sketchy. The underlying economic principles that are shaping the market have not clearly been articulated. In this report we examine the economics of competitive bidding both empirically and analytically. Previous study of this market has focused on the evaluation criteria specified in Requests for Proposals (RFPs), and highly aggregated summary statistics on participation and results. We continue the examination of RFPs, but also survey the details of long term contracts that have emerged from competitive bidding. Contracts provide a new level of specific detail that has not been previously available. 68 refs., 13 figs., 25 tabs.

  20. Advanced microwave radiometer antenna system study

    NASA Technical Reports Server (NTRS)

    Kummer, W. H.; Villeneuve, A. T.; Seaton, A. F.

    1976-01-01

    The practicability of a multi-frequency antenna for spaceborne microwave radiometers was considered in detail. The program consisted of a comparative study of various antenna systems, both mechanically and electronically scanned, in relation to specified design goals and desired system performance. The study involved several distinct tasks: definition of candidate antennas that are lightweight and that, at the specified frequencies of 5, 10, 18, 22, and 36 GHz, can provide conical scanning, dual linear polarization, and simultaneous multiple frequency operation; examination of various feed systems and phase-shifting techniques; detailed analysis of several key performance parameters such as beam efficiency, sidelobe level, and antenna beam footprint size; and conception of an antenna/feed system that could meet the design goals. Candidate antennas examined include phased arrays, lenses, and optical reflector systems. Mechanical, electrical, and performance characteristics of the various systems were tabulated for ease of comparison.
