The national biennial RCRA hazardous waste report (based on 1999 data): state detail analysis
DOT National Transportation Integrated Search
2001-06-01
The State Detail Analysis is a detailed look at each State's waste handling practices, including overall totals for generation, management, and shipments and receipts, as well as totals for the largest fifty facilities.
Hopkins, F B; Gravett, M R; Self, A J; Wang, M; Chua, Hoe-Chee; Hoe-Chee, C; Lee, H S Nancy; Sim, N Lee Hoi; Jones, J T A; Timperley, C M; Riches, J R
2014-08-01
Detailed chemical analysis of solutions used to decontaminate chemical warfare agents can be used to support verification and forensic attribution. Decontamination solutions are amongst the most difficult matrices for chemical analysis because of their corrosive and potentially emulsion-based nature. Consequently, there are relatively few publications that report their detailed chemical analysis. This paper describes the application of modern analytical techniques to the analysis of decontamination solutions following decontamination of the chemical warfare agent O-ethyl S-2-diisopropylaminoethyl methylphosphonothiolate (VX). We confirm the formation of N,N-diisopropylformamide and N,N-diisopropylamine following decontamination of VX with a hypochlorite-based solution, whereas neither compound was detected in extracts of hydroxide-based decontamination solutions by nuclear magnetic resonance (NMR) spectroscopy or gas chromatography-mass spectrometry. We report the electron ionisation and chemical ionisation mass spectrometric data, retention indices, and NMR spectra of N,N-diisopropylformamide and N,N-diisopropylamine, as well as analytical methods suitable for their analysis and identification in solvent extracts and decontamination residues.
Finite element based micro-mechanics modeling of textile composites
NASA Technical Reports Server (NTRS)
Glaessgen, E. H.; Griffin, O. H., Jr.
1995-01-01
Textile composites have the advantage over laminated composites of a significantly greater damage tolerance and resistance to delamination. Currently, a disadvantage of textile composites is the inability to examine the details of the internal response of these materials under load. Traditional approaches to the study of textile-based composite materials neglect many of the geometric details that affect the performance of the material. The present three-dimensional analysis, based on the representative volume element (RVE) of a plain weave, allows prediction of the internal details of displacement, strain, stress, and failure quantities. Through this analysis, the effects of geometric and material parameters on the aforementioned quantities are studied.
A Comparison of Functional Models for Use in the Function-Failure Design Method
NASA Technical Reports Server (NTRS)
Stock, Michael E.; Stone, Robert B.; Tumer, Irem Y.
2006-01-01
When failure analysis and prevention, guided by historical design knowledge, are coupled with product design at its conception, shorter design cycles are possible. By decreasing the design time of a product in this manner, design costs are reduced and the product will better suit the customer's needs. Prior work indicates that similar failure modes occur with products (or components) with similar functionality. To capitalize on this finding, a knowledge base of historical failure information linked to functionality is assembled for use by designers. One possible use for this knowledge base is within the Elemental Function-Failure Design Method (EFDM). This design methodology and failure analysis tool begins at conceptual design and keeps the designer cognizant of failures that are likely to occur based on the product's functionality. The EFDM offers potential improvement over current failure analysis methods, such as FMEA, FMECA, and Fault Tree Analysis, because it can be implemented hand in hand with other conceptual design steps and carried throughout a product's design cycle. These other failure analysis methods can only truly be effective after a physical design has been completed. The EFDM, however, is only as good as the knowledge base that it draws from, and therefore it is of utmost importance to develop a knowledge base that will be suitable for use across a wide spectrum of products. One fundamental question that arises in using the EFDM is: At what level of detail should functional descriptions of components be encoded? This paper explores two approaches to populating a knowledge base with actual failure occurrence information from Bell 206 helicopters. Functional models expressed at various levels of detail are investigated to determine the necessary detail for an applicable knowledge base that can be used by designers in both new designs as well as redesigns.
High level and more detailed functional descriptions are derived for each failed component based on NTSB accident reports. To best record this data, standardized functional and failure mode vocabularies are used. Two separate function-failure knowledge bases are then created and compared. Results indicate that encoding failure data using more detailed functional models allows for a more robust knowledge base. Interestingly, however, when applying the EFDM, high level descriptions continue to produce useful results when using the knowledge base generated from the detailed functional models.
An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks
NASA Astrophysics Data System (ADS)
Zhao, Peng-yuan; Huang, Xiao-ping
2018-03-01
The traditional spectral analysis method, which assumes a linear system, introduces errors when calculating the fatigue damage of details in liquid cargo tanks because of the nonlinear relationship between the dynamic stress and the ship acceleration. An improved spectral analysis method for the assessment of the fatigue damage of details in liquid cargo tanks is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves of different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of the harmonic components with the same frequency. This method takes the nonlinear relationship into consideration, and the fatigue damage is then calculated from the stress PSD. Taking an independent tank in an LNG carrier as an example, the improved spectral analysis method is shown to be much more accurate than the traditional one by comparing the damage results of both with those calculated by the time domain method. The proposed spectral analysis method is thus more accurate for calculating the fatigue damage of details in ship liquid cargo tanks.
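The final damage-from-PSD step in such an analysis can be sketched with the standard narrow-band closed form (this is generic spectral-fatigue practice, not the authors' improved method; the S-N parameters and the toy PSD below are purely illustrative):

```python
import numpy as np
from math import gamma

def spectral_moment(freq, psd, n):
    """n-th spectral moment m_n = integral of f^n * S(f) df (trapezoidal rule)."""
    y = freq ** n * psd
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(freq)))

def narrowband_damage(freq, psd, m, C, T):
    """Narrow-band fatigue damage over duration T [s] for an S-N curve
    N = C / S^m, given a one-sided stress PSD [MPa^2/Hz]."""
    m0 = spectral_moment(freq, psd, 0)       # variance of the stress process
    m2 = spectral_moment(freq, psd, 2)
    nu0 = np.sqrt(m2 / m0)                   # mean zero-upcrossing rate [Hz]
    sigma = np.sqrt(m0)                      # RMS stress
    # Rayleigh-distributed peaks give a closed-form expected damage
    return nu0 * T / C * (np.sqrt(2.0) * sigma) ** m * gamma(1.0 + m / 2.0)

# toy single-peak stress PSD
f = np.linspace(0.01, 2.0, 500)
S_f = 100.0 * np.exp(-((f - 0.5) / 0.1) ** 2)    # MPa^2/Hz
D = narrowband_damage(f, S_f, m=3.0, C=1e12, T=3600.0)
```

The nonlinear stress construction described in the abstract would enter through the PSD itself; once a stress PSD is available, the damage evaluation is the same.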
Structural analysis consultation using artificial intelligence
NASA Technical Reports Server (NTRS)
Melosh, R. J.; Marcal, P. V.; Berke, L.
1978-01-01
The primary goal of consultation is definition of the best strategy to deal with a structural engineering analysis objective. The knowledge base designed to meet this need identifies the type of numerical analysis, the needed modeling detail, and the specific analysis data required. Decisions are constructed on the basis of the data in the knowledge base - material behavior, relations between geometry and structural behavior, measures of the importance of time and temperature changes - and user-supplied specifics: characteristics of the spectrum of analysis types, the relation between accuracy and model detail, and details of the structure, its mechanical loadings, and its temperature states. Existing software demonstrated the feasibility of the approach, encompassing the 36 analysis classes spanning nonlinear, temperature-affected, incremental analyses which track the behavior of structural systems.
Relationship between Preferred and Actual Opinions about Inquiry-Based Instruction Classroom
ERIC Educational Resources Information Center
Nuangchalerm, Prasart
2017-01-01
Based on 10 preservice science teachers in 4 schools, this study presents a detailed analysis of how preservice teacher expectation interacts with school practicum and authentic classroom action of inquiry-based instruction. Classroom observation, lesson plan analysis, and interviews revealed that inquiry-based instruction in the expectation and…
Inspection of Piezoceramic Transducers Used for Structural Health Monitoring
Mueller, Inka; Fritzen, Claus-Peter
2017-01-01
The use of piezoelectric wafer active sensors (PWAS) for structural health monitoring (SHM) purposes is state of the art for acousto-ultrasonics-based methods. For system reliability, detailed information about the PWAS itself is necessary. This paper gives an overview of frequent PWAS faults and presents the effects of these faults on the wave propagation used for active acousto-ultrasonics-based SHM. The analysis of the wave field is based on velocity measurements using a laser Doppler vibrometer (LDV). New and established methods of PWAS inspection are explained in detail, listing advantages and disadvantages. The electro-mechanical impedance spectrum, as the basis for these methods, is discussed for different sensor faults. In this way, the contribution focuses on a detailed analysis of PWAS and the need for their inspection to increase the reliability of SHM systems. PMID:28772431
Development of an Aerothermoelastic-Acoustics Simulation Capability of Flight Vehicles
NASA Technical Reports Server (NTRS)
Gupta, K. K.; Choi, S. B.; Ibrahim, A.
2010-01-01
A novel numerical, finite element based analysis methodology is presented in this paper suitable for accurate and efficient simulation of practical, complex flight vehicles. An associated computer code, developed in this connection, is also described in some detail. Thermal effects of high speed flow obtained from a heat conduction analysis are incorporated in the modal analysis which in turn affects the unsteady flow arising out of interaction of elastic structures with the air. Numerical examples pertaining to representative problems are given in much detail testifying to the efficacy of the advocated techniques. This is a unique implementation of temperature effects in a finite element CFD based multidisciplinary simulation analysis capability involving large scale computations.
Testing for detailed balance in a financial market
NASA Astrophysics Data System (ADS)
Fiebig, H. R.; Musgrove, D. P.
2015-06-01
We test a historical price-time series in a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to its usage in prevalent economic theory, the term equilibrium here is tied to the returns rather than to the price-time series. The test is based on constructing an action functional S from the elements of the detailed balance condition and the historical data set, and then analyzing S by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
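A simplified illustration of checking detailed balance on an observed state sequence (not the authors' action-functional and simulated-annealing procedure; the symmetry score and the reflected-random-walk data below are illustrative stand-ins):

```python
import numpy as np

def transition_counts(states, n_states):
    """Count one-step transitions i -> j along an observed state sequence."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(states[:-1], states[1:]):
        counts[i, j] += 1
    return counts

def detailed_balance_score(counts):
    """Mean relative asymmetry of transition flows: 0 when n_ij == n_ji for
    all pairs (consistent with detailed balance), larger when violated."""
    num = np.abs(counts - counts.T)
    den = counts + counts.T
    ratio = np.divide(num, den, out=np.zeros_like(num), where=den > 0)
    return float(ratio.sum() / counts.size)

# reversible toy data: a reflected random walk on {0, ..., 4},
# which satisfies detailed balance, so the score should be near zero
rng = np.random.default_rng(0)
s = [2]
for _ in range(20000):
    s.append(min(4, max(0, s[-1] + rng.choice([-1, 1]))))
score = detailed_balance_score(transition_counts(np.array(s), 5))
```

For real returns one would first discretize the return series into states; the score only approaches zero up to sampling noise even for a genuinely reversible process.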
Aeroservoelastic and Flight Dynamics Analysis Using Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Arena, Andrew S., Jr.
1999-01-01
This document is based in large part on the Master's Thesis of Cole Stephens. It encompasses a variety of technical and practical issues involved in using the STARS codes for aeroservoelastic analysis of vehicles. The document covers in great detail a number of technical issues and the step-by-step procedures involved in the simulation of a system in which aerodynamics, structures, and controls are tightly coupled. Comparisons are made to a benchmark experimental program conducted at NASA Langley. One significant advantage of the methodology detailed is that the technique used to accelerate the CFD-based simulation produces a systems model which is very useful for developing the control law strategy and for subsequent high-speed simulations.
Cost Analysis Sources and Documents Data Base Reference Manual (Update)
1989-06-01
M: Reference Manual PRICE H: Training Course Workbook 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical...Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required...Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986
Gregory P. Asner; Michael Keller; Rodrigo Pereira; Johan C. Zweede
2002-01-01
We combined a detailed field study of forest canopy damage with calibrated Landsat 7 Enhanced Thematic Mapper Plus (ETM+) reflectance data and texture analysis to assess the sensitivity of basic broadband optical remote sensing to selective logging in Amazonia. Our field study encompassed measurements of ground damage and canopy gap fractions along a chronosequence of...
Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items,
1975-12-01
Note ... 21 Adaptive Single Exponential Smoothing ... 21 Choosing the Smoothing Constant... methodology used in the study, an analysis of results, and a detailed summary. Chapter I, Methodology, contains a description of the data, a...Chapter IV, Detailed Summary, presents a detailed summary of the findings, lists the limitations inherent in the research methodology, and
Citygml and the Streets of New York - a Proposal for Detailed Street Space Modelling
NASA Astrophysics Data System (ADS)
Beil, C.; Kolbe, T. H.
2017-10-01
Three-dimensional semantic city models are increasingly used for the analysis of large urban areas. Until now the focus has mostly been on buildings. Nonetheless many applications could also benefit from detailed models of public street space for further analysis. However, there are only few guidelines for representing roads within city models. Therefore, related standards dealing with street modelling are examined and discussed. Nearly all street representations are based on linear abstractions. However, there are many use cases that require or would benefit from the detailed geometrical and semantic representation of street space. A variety of potential applications for detailed street space models are presented. Subsequently, based on related standards as well as on user requirements, a concept for a CityGML-compliant representation of street space in multiple levels of detail is developed. In the course of this process, the CityGML Transportation model of the currently valid OGC standard CityGML2.0 is examined to discover possibilities for further developments. Moreover, a number of improvements are presented. Finally, based on open data sources, the proposed concept is implemented within a semantic 3D city model of New York City generating a detailed 3D street space model for the entire city. As a result, 11 thematic classes, such as roadbeds, sidewalks or traffic islands are generated and enriched with a large number of thematic attributes.
Using Toulmin analysis to analyse an instructor's proof presentation in abstract algebra
NASA Astrophysics Data System (ADS)
Fukawa-Connelly, Timothy
2014-01-01
This paper provides a method for analysing undergraduate teaching of proof-based courses using Toulmin's model (1969) of argumentation. It presents a case study of one instructor's presentation of proofs. The analysis shows that the instructor presents different levels of detail in different proofs; thus, the students have an inconsistent set of written models for their work. Similarly, the analysis shows that the details the instructor says aloud differ from what she writes down. Although her verbal commentary provides additional detail and appears to have pedagogical value, for instance, by modelling thinking that supports proof writing, this value might be better realized if she were to change her teaching practices.
Automatic network coupling analysis for dynamical systems based on detailed kinetic models.
Lebiedz, Dirk; Kammerer, Julia; Brandt-Pollmann, Ulrich
2005-10-01
We introduce a numerical complexity reduction method for the automatic identification and analysis of dynamic network decompositions in (bio)chemical kinetics based on error-controlled computation of a minimal model dimension represented by the number of (locally) active dynamical modes. Our algorithm exploits a generalized sensitivity analysis along state trajectories and subsequent singular value decomposition of sensitivity matrices for the identification of these dominant dynamical modes. It allows for a dynamic coupling analysis of (bio)chemical species in kinetic models that can be exploited for the piecewise computation of a minimal model on small time intervals and offers valuable functional insight into highly nonlinear reaction mechanisms and network dynamics. We present results for the identification of network decompositions in a simple oscillatory chemical reaction, time scale separation based model reduction in a Michaelis-Menten enzyme system and network decomposition of a detailed model for the oscillatory peroxidase-oxidase enzyme system.
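The mode-counting step via singular value decomposition of a sensitivity matrix might be sketched as follows (a toy illustration under an assumed tolerance, not the authors' error-controlled algorithm; `active_modes` and the matrix below are hypothetical names and data):

```python
import numpy as np

def active_modes(sensitivity, rel_tol=1e-6):
    """Count (locally) active dynamical modes: singular values of the
    sensitivity matrix exceeding rel_tol times the largest one."""
    sv = np.linalg.svd(sensitivity, compute_uv=False)
    return int(np.sum(sv > rel_tol * sv[0]))

# toy 4-species sensitivity matrix with exactly two independent modes
u = np.array([[1.0, 0.0],
              [0.5, 1.0],
              [0.0, 2.0],
              [1.0, 1.0]])
S_sens = u @ u.T              # rank-2 by construction
n_active = active_modes(S_sens)
```

In the paper's setting the sensitivity matrices are computed along state trajectories, so the count - and hence the minimal model dimension - can change from one time interval to the next.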
Lawlor, Debbie A; Peters, Tim J; Howe, Laura D; Noble, Sian M; Kipping, Ruth R; Jago, Russell
2013-07-24
The Active For Life Year 5 (AFLY5) randomised controlled trial protocol was published in this journal in 2011. It provided a summary analysis plan. This publication is an update of that protocol and provides a detailed analysis plan. This update provides a detailed analysis plan of the effectiveness and cost-effectiveness of the AFLY5 intervention. The plan includes details of how variables will be quality control checked and the criteria used to define derived variables. Details of four key analyses are provided: (a) effectiveness analysis 1 (the effect of the AFLY5 intervention on primary and secondary outcomes at the end of the school year in which the intervention is delivered); (b) mediation analyses (secondary analyses examining the extent to which any effects of the intervention are mediated via self-efficacy, parental support and knowledge, through which the intervention is theoretically believed to act); (c) effectiveness analysis 2 (the effect of the AFLY5 intervention on primary and secondary outcomes 12 months after the end of the intervention) and (d) cost effectiveness analysis (the cost-effectiveness of the AFLY5 intervention). The details include how the intention to treat and per-protocol analyses were defined and planned sensitivity analyses for dealing with missing data. A set of dummy tables are provided in Additional file 1. This detailed analysis plan was written prior to any analyst having access to any data and was approved by the AFLY5 Trial Steering Committee. Its publication will ensure that analyses are in accordance with an a priori plan related to the trial objectives and not driven by knowledge of the data. ISRCTN50133740.
75 FR 72611 - Assessments, Large Bank Pricing
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-24
... the worst risk ranking and are included in the statistical analysis. Appendix 1 to the NPR describes the statistical analysis in detail. \\12\\ The percentage approximated by factors is based on the statistical model for that particular year. Actual weights assigned to each scorecard measure are largely based...
Titanium and advanced composite structures for a supersonic cruise arrow wing configuration
NASA Technical Reports Server (NTRS)
Turner, M. J.; Hoy, J. M.
1976-01-01
Structural design studies were made, based on current technology and on an estimate of technology to be available in the mid 1980's, to assess the relative merits of structural concepts and materials for an advanced arrow wing configuration cruising at Mach 2.7. Preliminary studies were made to insure compliance of the configuration with general design criteria, integrate the propulsion system with the airframe, and define an efficient structural arrangement. Material and concept selection, detailed structural analysis, structural design and airplane mass analysis were completed based on current technology. Based on estimated future technology, structural sizing for strength and a preliminary assessment of the flutter of a strength designed composite structure were completed. An advanced computerized structural design system was used, in conjunction with a relatively complex finite element model, for detailed analysis and sizing of structural members.
Design of robotic cells based on relative handling modules with use of SolidWorks system
NASA Astrophysics Data System (ADS)
Gaponenko, E. V.; Anciferov, S. I.
2018-05-01
The article presents a diagrammed engineering solution for a robotic cell with six degrees of freedom for the machining of complex parts, consisting of a base with a tool installation module and a workpiece machining module, both built as parallel structure mechanisms. The output links of the workpiece machining module and the tool installation module can each move along the X, Y and Z coordinate axes. A 3D model of the complex is designed in the SolidWorks system; it will be used further for engineering calculations, mathematical analysis and the preparation of all required documentation.
Mei, Liang; Svanberg, Sune
2015-03-20
This work presents a detailed study of the theoretical aspects of the Fourier analysis method, which has been utilized for gas absorption harmonic detection in wavelength modulation spectroscopy (WMS). The lock-in detection of the harmonic signal is accomplished by studying the phase term of the inverse Fourier transform of the Fourier spectrum that corresponds to the harmonic signal. The mathematics and the corresponding simulation results are given for each procedure when applying the Fourier analysis method. The present work provides a detailed view of the WMS technique when applying the Fourier analysis method.
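A minimal sketch of harmonic extraction by Fourier band-selection, in the spirit of the method described (assumed parameters throughout; a real WMS signal would come from a detector, not a synthetic cosine):

```python
import numpy as np

def fourier_lockin(signal, fs, f_m, n, bw=2.0):
    """Extract the time-domain n-th harmonic of the modulation frequency f_m
    by zeroing the Fourier spectrum outside a band around n * f_m."""
    N = len(signal)
    F = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(N, 1.0 / fs)
    F_band = np.where(np.abs(freqs - n * f_m) < bw, F, 0.0)
    return np.fft.irfft(F_band, N)          # only the selected harmonic survives

fs, f_m = 10000.0, 50.0
t = np.arange(0, 1.0, 1.0 / fs)
# toy detector signal containing 1f and 2f components
sig = 0.8 * np.cos(2 * np.pi * f_m * t) + 0.3 * np.cos(2 * np.pi * 2 * f_m * t + 0.4)
h2 = fourier_lockin(sig, fs, f_m, n=2)
amp2 = np.sqrt(2.0 * np.mean(h2 ** 2))     # RMS -> amplitude of the 2f harmonic
```

The phase information discussed in the paper is likewise available from the retained Fourier coefficients; here only the harmonic amplitude is recovered.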
Proposed Land Conveyance for Construction of Three Facilities at March Air Force Base, California
1988-09-01
identified would result from future development on the 845-acre parcel after it has been conveyed. Therefore, detailed development review and...Impact Analysis Process (EIAP) of the Air Force. This detailed development review is within the purview of the state and local government with...establishes the process under which subsequent detailed environmental review would be conducted. CEQA and its implementing regulations are administered by
Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.; ...
2012-01-01
A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.
Nuclear Security: Target Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Surinder Paul; Gibbs, Philip W.; Bultz, Garl A.
2014-03-01
The objectives of this session were to understand the basic steps of target identification; describe the SNRI targets in detail; characterize specific targets in more detail; prioritize targets based on guidance documents; understand the graded safeguards concept; identify roll-up and understand why it is a concern; and recognize the category for different materials.
ERIC Educational Resources Information Center
Gray, John S.
1994-01-01
A detailed analysis and computer-based solution to a puzzle addressing the arrangement of dominoes on a grid is presented. The problem is one used in a college-level data structures or algorithms course. The solution uses backtracking to generate all possible answers. Details of the use of backtracking and techniques for mapping abstract problems…
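The backtracking approach the abstract describes can be illustrated with a small enumerator of domino tilings (a generic sketch, not the article's actual solution; grid sizes and function names are illustrative):

```python
def count_tilings(rows, cols):
    """Backtracking enumeration of all domino tilings of a rows x cols grid:
    cover the first empty cell horizontally or vertically, recurse, undo."""
    grid = [[False] * cols for _ in range(rows)]

    def first_empty():
        for r in range(rows):
            for c in range(cols):
                if not grid[r][c]:
                    return r, c
        return None

    def solve():
        cell = first_empty()
        if cell is None:
            return 1                              # every cell covered: one tiling
        r, c = cell
        total = 0
        if c + 1 < cols and not grid[r][c + 1]:   # horizontal placement
            grid[r][c] = grid[r][c + 1] = True
            total += solve()
            grid[r][c] = grid[r][c + 1] = False   # backtrack
        if r + 1 < rows and not grid[r + 1][c]:   # vertical placement
            grid[r][c] = grid[r + 1][c] = True
            total += solve()
            grid[r][c] = grid[r + 1][c] = False   # backtrack
        return total

    return solve()
```

Because every placement is undone after its subtree is explored, the search visits each complete arrangement exactly once, which is the "generate all possible answers" behavior the abstract mentions.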
Turbulent Mixing of Primary and Secondary Flow Streams in a Rocket-Based Combined Cycle Engine
NASA Technical Reports Server (NTRS)
Cramer, J. M.; Greene, M. U.; Pal, S.; Santoro, R. J.; Turner, Jim (Technical Monitor)
2002-01-01
This viewgraph presentation gives an overview of the turbulent mixing of primary and secondary flow streams in a rocket-based combined cycle (RBCC) engine. A significant RBCC ejector mode database has been generated, detailing single and twin thruster configurations and global and local measurements. On-going analysis and correlation efforts include Marshall Space Flight Center computational fluid dynamics modeling and turbulent shear layer analysis. Potential follow-on activities include detailed measurements of air flow static pressure and velocity profiles, investigations into other thruster spacing configurations, performing a fundamental shear layer mixing study, and demonstrating single-shot Raman measurements.
Supporting Air and Space Expeditionary Forces: Analysis of Combat Support Basing Options
2004-01-01
Brooke et al., 2003. For more information on Set Covering models, see Daskin, 1995. ... Transportation Model. A detailed...
Aquarius Reflector Surface Temperature Monitoring Test and Analysis
NASA Technical Reports Server (NTRS)
Abbott, Jamie; Lee, Siu-Chun; Becker, Ray
2008-01-01
The presentation addresses how to infer the front side temperatures for the Aquarius L-band reflector based upon backside measurement sites. Slides discussing the mission objectives and design details are at the same level found on typical project outreach websites and in conference papers respectively. The test discussion provides modest detail of an ordinary thermal balance test using mockup hardware. The photographs show an off-Lab vacuum chamber facility with no compromising details.
NASA Astrophysics Data System (ADS)
Yu, Yang; Zeng, Zheng
2009-10-01
The study examines the causes behind the high ratio of amendments made during the implementation of urban regulatory detailed plans in China, despite their law-ensured status. It aims to reconcile the conflict between the legal authority of regulatory detailed planning and the insufficient scientific support for its decision-making and compilation by introducing spatial analysis based on GIS technology and 3D modeling into the process, thus presenting a more scientific and flexible approach to regulatory detailed planning in China. The study first points out that the current compilation process of urban regulatory detailed plans in China employs a mainly empirical approach, which leaves the plans constantly subject to amendments. It then discusses the need for and current utilization of GIS in the Chinese system and proposes the framework of a GIS-assisted 3D spatial analysis process from the designer's perspective, which can be regarded as an alternation between descriptive codes and physical design in the compilation of regulatory detailed planning. With a case study of the processes and results from the application of the framework, the paper concludes that the proposed framework can be an effective instrument that brings more rationality, flexibility and thus more efficiency to the compilation and decision-making process of urban regulatory detailed plans in China.
High-throughput sequencing: a failure mode analysis.
Yang, George S; Stott, Jeffery M; Smailus, Duane; Barber, Sarah A; Balasundaram, Miruna; Marra, Marco A; Holt, Robert A
2005-01-04
Basic manufacturing principles are becoming increasingly important in high-throughput sequencing facilities where there is a constant drive to increase quality, increase efficiency, and decrease operating costs. While high-throughput centres report failure rates typically on the order of 10%, the causes of sporadic sequencing failures are seldom analyzed in detail and have not, in the past, been formally reported. Here we report the results of a failure mode analysis of our production sequencing facility based on detailed evaluation of 9,216 ESTs generated from two cDNA libraries. Two categories of failures are described; process-related failures (failures due to equipment or sample handling) and template-related failures (failures that are revealed by close inspection of electropherograms and are likely due to properties of the template DNA sequence itself). Preventative action based on a detailed understanding of failure modes is likely to improve the performance of other production sequencing pipelines.
A Method for the Analysis of Information Use in Source-Based Writing
ERIC Educational Resources Information Center
Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto
2012-01-01
Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…
Automating a Detailed Cognitive Task Analysis for Structuring Curriculum
1991-06-01
Cognitive Task Analysis For... cognitive task analysis techniques. A rather substantial literature has been amassed relative to automated knowledge acquisition, but only seven...references have been found in a database search of literature specifically addressing cognitive task analysis. ... A variety of forms of cognitive task analysis
Development and testing of a fast conceptual river water quality model.
Keupers, Ingrid; Willems, Patrick
2017-04-15
Modern, model-based river quality management relies strongly on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too demanding of computation time, especially when simulating the long time periods needed for statistical analysis of the results or when model sensitivity analysis, calibration and validation require a large number of model runs. To overcome this problem, a structure identification method to set up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as locations of interest and boundary inputs. These reservoirs are modelled as Plug Flow Reactors (PFR) and Continuously Stirred Tank Reactors (CSTR) to describe advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered, but with adjusted residence times based on the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows for a much faster calculation time (a factor of 10⁵) without significant loss of accuracy, making it feasible to perform time-demanding scenario runs.
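The CSTR part of such a conceptual model has a simple steady-state form for a first-order decaying pollutant, sketched below (illustrative only; the paper's method also includes PFR elements and calibration against detailed simulations, both omitted here, and all numbers are made up):

```python
def cstr_chain_steady_state(c_in, flow, volumes, k):
    """Steady-state concentration profile along a chain of CSTRs, each with
    first-order decay at rate k [1/s]: c_out = c_in / (1 + k * V / Q)."""
    c = c_in
    profile = []
    for V in volumes:
        tau = V / flow                  # residence time of this reservoir [s]
        c = c / (1.0 + k * tau)         # mass balance of a single CSTR
        profile.append(c)
    return profile

# toy reach: three conceptual reservoirs on a river with 2 m^3/s flow
# and a BOD-like first-order decay rate
prof = cstr_chain_steady_state(c_in=10.0, flow=2.0, volumes=[4e5, 2e5, 4e5], k=1e-5)
```

Replacing node-by-node transport with a handful of such reservoirs is what makes the conceptual model orders of magnitude faster than the detailed hydrodynamic one.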
Meta-Analysis of Multiple Simulation-Based Experiments
2013-06-01
Alberts et al., 2010), C2 Approaches differ on at least three major aspects: the allocation of decision rights (ADR), the pattern of interaction among...results obtained from the meta-analysis support the hypothesis that more network-enabled C2 Approaches are more agile (for details see Bernier et al...consult Bernier, Chan et al. (2013) for more details. [Figure 2: Mapping of all CiCs into each axis of the C2 Approach Space.] 18th
A method for identifying EMI critical circuits during development of a large C3
NASA Astrophysics Data System (ADS)
Barr, Douglas H.
The circuit analysis methods and process Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. This analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits (conservative safety margin analysis, and detailed safety margin analysis). These analyses used field-to-wire and wire-to-wire coupling models using both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits that could then be verified by test.
Detailed α -decay study of 180Tl
NASA Astrophysics Data System (ADS)
Andel, B.; Andreyev, A. N.; Antalic, S.; Barzakh, A.; Bree, N.; Cocolios, T. E.; Comas, V. F.; Diriken, J.; Elseviers, J.; Fedorov, D. V.; Fedosseev, V. N.; Franchoo, S.; Ghys, L.; Heredia, J. A.; Huyse, M.; Ivanov, O.; Köster, U.; Liberati, V.; Marsh, B. A.; Nishio, K.; Page, R. D.; Patronis, N.; Seliverstov, M. D.; Tsekhanovich, I.; Van den Bergh, P.; Van De Walle, J.; Van Duppen, P.; Venhart, M.; Vermote, S.; Veselský, M.; Wagemans, C.
2017-11-01
A detailed α -decay spectroscopy study of 180Tl has been performed at ISOLDE (CERN). Z -selective ionization by the Resonance Ionization Laser Ion Source (RILIS) coupled to mass separation provided a high-purity beam of 180Tl. Fine-structure α decays to excited levels in the daughter 176Au were identified and an α -decay scheme of 180Tl was constructed based on an analysis of α -γ and α -γ -γ coincidences. Multipolarities of several γ -ray transitions deexciting levels in 176Au were determined. Based on the analysis of reduced α -decay widths, it was found that all α decays are hindered, which signifies a change of configuration between the parent and all daughter states.
Image encryption based on a delayed fractional-order chaotic logistic system
NASA Astrophysics Data System (ADS)
Wang, Zhen; Huang, Xia; Li, Ning; Song, Xiao-Na
2012-05-01
A new image encryption scheme is proposed based on a delayed fractional-order chaotic logistic system. In the process of generating a key stream, the time-varying delay and fractional derivative are embedded in the proposed scheme to improve the security. Such a scheme is described in detail with security analyses including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. Experimental results show that the newly proposed image encryption scheme possesses high security.
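The key-stream/XOR structure of such schemes can be illustrated with a much simpler stand-in. The sketch below uses the ordinary (integer-order, non-delayed) logistic map in place of the paper's delayed fractional-order system, so it shows only the general mechanism, not the proposed cipher.

```python
# Illustrative sketch only: an ordinary logistic map stands in for the
# paper's delayed fractional-order logistic system. The initial value x0
# and parameter r act as the secret key.

def logistic_keystream(x0, r, n, burn_in=100):
    """Generate n key bytes from logistic-map iterates x -> r*x*(1-x)."""
    x = x0
    for _ in range(burn_in):          # discard transients
        x = r * x * (1.0 - x)
    stream = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) % 256)
    return stream

def xor_cipher(data, key_bytes):
    # XOR is an involution: applying it twice recovers the plaintext.
    return bytes(b ^ k for b, k in zip(data, key_bytes))

plain = b"pixel data of the image"
ks = logistic_keystream(x0=0.3571, r=3.9999, n=len(plain))
cipher = xor_cipher(plain, ks)
recovered = xor_cipher(cipher, ks)
```

Key sensitivity in the real scheme comes from the chaotic dynamics: a tiny change in x0 or r yields a completely different key stream.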
NASA Technical Reports Server (NTRS)
1972-01-01
The tug design and performance data base for the economic analysis of space tug operation are presented. A compendium of the detailed design and performance information from the data base is developed. The design data are parametric across a range of reusable space tug sizes. The performance curves are generated for selected point designs of expendable orbit injection stages and reusable tugs. Data are presented in the form of graphs for various modes of operation.
Issues, concerns, and initial implementation results for space based telerobotic control
NASA Technical Reports Server (NTRS)
Lawrence, D. A.; Chapel, J. D.; Depkovich, T. M.
1987-01-01
Telerobotic control for space based assembly and servicing tasks presents many problems in system design. Traditional force reflection teleoperation schemes are not well suited to this application, and the approaches to compliance control via computer algorithms have yet to see significant testing and comparison. These observations are discussed in detail, as well as the concerns they raise for imminent design and testing of space robotic systems. As an example of the detailed technical work yet to be done before such systems can be specified, a particular approach to providing manipulator compliance is examined experimentally and through modeling and analysis. This yields some initial insight into the limitations and design trade-offs for this class of manipulator control schemes. Implications of this investigation for space based telerobots are discussed in detail.
NASA Technical Reports Server (NTRS)
1977-01-01
An analysis of construction operations is presented, as well as power system sizing requirements. Mission hardware requirements are reviewed in detail. Space construction base design configurations are also examined.
Pasi, Marco; Maddocks, John H.; Lavery, Richard
2015-01-01
Microsecond molecular dynamics simulations of B-DNA oligomers carried out in an aqueous environment with a physiological salt concentration enable us to perform a detailed analysis of how potassium ions interact with the double helix. The oligomers studied contain all 136 distinct tetranucleotides and we are thus able to make a comprehensive analysis of base sequence effects. Using a recently developed curvilinear helicoidal coordinate method we are able to analyze the details of ion populations and densities within the major and minor grooves and in the space surrounding DNA. The results show higher ion populations than have typically been observed in earlier studies and sequence effects that go beyond the nature of individual base pairs or base pair steps. We also show that, in some special cases, ion distributions converge very slowly and, on a microsecond timescale, do not reflect the symmetry of the corresponding base sequence. PMID:25662221
Basic research for the geodynamics program
NASA Technical Reports Server (NTRS)
1991-01-01
The mathematical models of space very long base interferometry (VLBI) observables suitable for least squares covariance analysis were derived and estimatability problems inherent in the space VLBI system were explored, including a detailed rank defect analysis and sensitivity analysis. An important aim is to carry out a comparative analysis of the mathematical models of the ground-based VLBI and space VLBI observables in order to describe the background in detail. Computer programs were developed in order to check the relations, assess errors, and analyze sensitivity. In order to investigate the estimatability of different geodetic and geodynamic parameters from the space VLBI observables, the mathematical models for time delay and time delay rate observables of space VLBI were analytically derived along with the partial derivatives with respect to the parameters. Rank defect analysis was carried out both by analytical and numerical testing of linear dependencies between the columns of the normal matrix thus formed. Definite conclusions were formed about the rank defects in the system.
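The numerical side of a rank-defect analysis can be illustrated on a toy normal matrix. In the sketch below the design matrix A is synthetic (it stands in for the partial derivatives of the observables with respect to the parameters), with one column deliberately made a linear combination of two others; counting near-zero singular values of N = A^T A reveals the defect.

```python
# Toy sketch of numerical rank-defect detection. The 4-parameter design
# matrix A is synthetic: its last column is a linear combination of the
# first two, mimicking an estimatability problem in the observables.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))                    # independent partials
A = np.hstack([A, A[:, [0]] + 2.0 * A[:, [1]]])     # dependent 4th column

N = A.T @ A                                         # normal matrix
s = np.linalg.svd(N, compute_uv=False)              # singular values
tol = s.max() * 1e-10
rank = int((s > tol).sum())
defect = N.shape[0] - rank    # number of unestimable parameter combinations
```

In practice the same test is applied to the normal matrix built from the actual space-VLBI partial derivatives, alongside the analytical search for linear dependencies.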
West Virginia forest industry transportation network analysis using GIS
Steven E. Harouff; Shawn T. Grushecky; Ben D. Spong
2008-01-01
To better understand and increase efficiency in delivery of harvested roundwood on West Virginia's roadways, a detailed network analysis using Geographic Information Systems (GIS) was conducted. Typical proximity-based analysis, which looks at straight-line distances (buffers) around given features regardless of terrain and road characteristics, provides limited...
The national biennial RCRA hazardous waste report (based on 1997 data) : national analysis
DOT National Transportation Integrated Search
1999-09-01
National Analysis presents a detailed look at waste-handling practices in the EPA Regions, States, and largest facilities nationally, including (1) the quantity of waste generated, managed, shipped and received, and imported and exported between Stat...
ALTERNATIVE FUTURES FOR THE WILLAMETTE RIVER BASIN, OREGON
Alternative futures analysis is an assessment approach designed to inform community decisions regarding land and water use. We conducted an alternative futures analysis in the Willamette River Basin in western Oregon. Based on detailed input from local stakeholders, three alter...
An analysis of random projection for changeable and privacy-preserving biometric verification.
Wang, Yongjin; Plataniotis, Konstantinos N
2010-10-01
Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
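A minimal sketch of the RP mechanism described above (not the authors' full system, and with arbitrary dimensions): projecting with an i.i.d. Gaussian matrix roughly preserves distances between vectors transformed under the same key, while reissuing a template under a new key produces a distant, effectively uncorrelated template.

```python
# Sketch of random-projection (RP) template generation. Dimensions and
# the noise level are illustrative, not from the paper.
import numpy as np

def random_projection(x, k, seed):
    """Project d-dim vector x to k dims with i.i.d. N(0, 1/k) entries."""
    d = x.shape[0]
    rng = np.random.default_rng(seed)     # the seed plays the role of the key
    R = rng.standard_normal((k, d)) / np.sqrt(k)
    return R @ x

d, k = 1024, 128
rng = np.random.default_rng(42)
x1 = rng.standard_normal(d)
x2 = x1 + 0.05 * rng.standard_normal(d)   # same "user", small acquisition noise

t1 = random_projection(x1, k, seed=7)     # enrolled template under key 7
t2 = random_projection(x2, k, seed=7)     # genuine probe, same key
t_new = random_projection(x1, k, seed=8)  # reissued template, new key

genuine_dist = np.linalg.norm(t1 - t2)    # small: similarity preserved
reissue_dist = np.linalg.norm(t1 - t_new) # large: changeability
```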
NASA Technical Reports Server (NTRS)
Valdez, T. I.; Firdosy, S.; Koel, B. E.; Narayanan, S. R.
2005-01-01
This viewgraph presentation gives a detailed review of the Direct Methanol Based Fuel Cell (DMFC) stack and investigates the Ruthenium that was found at the exit of the stack. The topics include: 1) Motivation; 2) Pathways for Cell Degradation; 3) Cell Duration Testing; 4) Duration Testing, MEA Analysis; and 5) Stack Degradation Analysis.
Structural Design of Ares V Interstage Composite Structure
NASA Technical Reports Server (NTRS)
Sleigh, David W.; Sreekantamurthy, Thammaiah; Kosareo, Daniel N.; Martin, Robert A.; Johnson, Theodore F.
2011-01-01
Preliminary and detailed design studies were performed to mature composite structural design concepts for the Ares V Interstage structure as part of NASA's Advanced Composite Technologies Project. Aluminum honeycomb sandwich and hat-stiffened composite panel structural concepts were considered. The structural design and analysis studies were performed using HyperSizer design sizing software and MSC Nastran finite element analysis software. System-level design trade studies were carried out to predict weight and margins of safety for composite honeycomb-core sandwich and composite hat-stiffened skin design concepts. Details of both preliminary and detailed design studies are presented in the paper. For the range of loads and geometry considered in this work, the hat-stiffened designs were found to be approximately 11-16 percent lighter than the sandwich designs. A down-select process was used to choose the most favorable structural concept based on a set of figures of merit, and the honeycomb sandwich design was selected as the best concept based on advantages in manufacturing cost.
An Efficient Analysis Methodology for Fluted-Core Composite Structures
NASA Technical Reports Server (NTRS)
Oremont, Leonard; Schultz, Marc R.
2012-01-01
The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes, and uses shell elements to capture behavior that would normally require solid elements to capture the detailed mechanical response of the structure. The shell thicknesses and offsets using this analysis technique are parameterized, and the parameters are adjusted through a heuristic procedure until this model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though only discussed in the context of fluted-core composites, is widely applicable to other concepts.
Different perspectives on economic base.
Lisa K. Crone; Richard W. Haynes; Nicholas E. Reyna
1999-01-01
Two general approaches for measuring the economic base are discussed. Each method is used to define the economic base for each of the counties included in the Interior Columbia Basin Ecosystem Management Project area. A more detailed look at four selected counties results in similar findings from different approaches. Limitations of economic base analysis also are...
Knowledge-based requirements analysis for automating software development
NASA Technical Reports Server (NTRS)
Markosian, Lawrence Z.
1988-01-01
We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.
NASA Astrophysics Data System (ADS)
Hamedianfar, Alireza; Shafri, Helmi Zulhaidi Mohd
2016-04-01
This paper integrates decision tree-based data mining (DM) and object-based image analysis (OBIA) to provide a transferable model for the detailed characterization of urban land-cover classes using WorldView-2 (WV-2) satellite images. Many articles have been published on OBIA in recent years based on DM for different applications. However, less attention has been paid to the generation of a transferable model for characterizing detailed urban land-cover features. Three subsets of WV-2 images were used in this paper to generate transferable OBIA rule-sets. Many features were explored by using a DM algorithm, which created the classification rules as a decision tree (DT) structure from the first study area. The developed DT rules were applied to object-based classification in the first study area. After this process, we validated the capability and transferability of the classification rules on the second and third subsets. Detailed ground truth samples were collected to assess the classification results. The first, second, and third study areas achieved 88%, 85%, and 85% overall accuracies, respectively. Results from the investigation indicate that DM is an efficient method to provide optimal and transferable classification rules for OBIA, which accelerates the rule-set creation stage in the OBIA classification domain.
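A transferable rule-set of the kind the DT algorithm produces might look like the sketch below. Every feature name, threshold, and class label here is invented for illustration; in the paper the rules are learned from WorldView-2 image objects, not hand-written.

```python
# Hypothetical decision-tree rule-set for object-based classification.
# Features (ndvi, height, brightness, elongation), thresholds, and class
# names are all assumptions for illustration only.

def classify_object(obj):
    """Classify one image object from its (assumed) object features."""
    if obj["ndvi"] > 0.3:                       # vegetated objects
        return "tree" if obj["height"] > 2.0 else "grass"
    if obj["brightness"] > 180:                 # bright impervious surface
        return "concrete_roof"
    return "asphalt_road" if obj["elongation"] > 3.0 else "bare_soil"

samples = [
    {"ndvi": 0.6, "height": 5.0, "brightness": 90,  "elongation": 1.0},
    {"ndvi": 0.5, "height": 0.3, "brightness": 90,  "elongation": 1.0},
    {"ndvi": 0.1, "height": 0.0, "brightness": 200, "elongation": 1.2},
    {"ndvi": 0.1, "height": 0.0, "brightness": 120, "elongation": 5.0},
]
labels = [classify_object(s) for s in samples]
```

Transferability then means applying the same rule function unchanged to objects segmented from a different image subset.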
Performance analysis and dynamic modeling of a single-spool turbojet engine
NASA Astrophysics Data System (ADS)
Andrei, Irina-Carmen; Toader, Adrian; Stroe, Gabriela; Frunzulica, Florin
2017-01-01
The purposes of modeling and simulation of a turbojet engine are steady-state analysis and transient analysis. The steady-state analysis investigates the operating (equilibrium) regimes and is based on appropriate modeling of turbojet operation at design and off-design regimes; it yields the performance analysis, concluded by the engine's operational maps (i.e., the altitude map, velocity map, and speed map) and the engine's universal map. The mathematical model that allows calculation of the design and off-design performance of a single-spool turbojet is detailed. An in-house code was developed and calibrated using the J85 turbojet engine as the test case. The dynamic model of the turbojet engine is obtained from the energy balance equations for the compressor, combustor, and turbine, as the engine's main parts. The transient analysis, which is based on appropriate modeling of the engine and its main parts, expresses the dynamic behavior of the turbojet engine and, further, provides details regarding the engine's control. The aim of the dynamic analysis is to determine a control program for the turbojet, based on the results of the performance analysis. For a single-spool turbojet engine with fixed nozzle geometry, the thrust is controlled by one parameter, the fuel flow rate. The design and management of aircraft engine controls are based on the results of the transient analysis. Construction of the design model is complex, since it is based on both steady-state and transient analysis, further allowing flight-path cycle analysis and optimization. This paper presents numerical simulations for a single-spool turbojet engine (J85 as test case), with appropriate modeling for steady-state and dynamic analysis.
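The transient part can be caricatured with a single energy-balance ODE for the spool. The sketch below is illustrative only: the inertia, the cubic compressor-power law, and the linear fuel-to-turbine-power map are assumptions, not J85 data.

```python
# Toy spool dynamics: I * omega * d(omega)/dt = P_turbine - P_compressor.
# All numbers and the simple power laws are invented for illustration.

I = 0.05                  # spool polar moment of inertia [kg m^2] (assumed)

def p_comp(omega):        # compressor power absorbed (assumed cubic law)
    return 1e-6 * omega ** 3

def p_turb(wf):           # turbine power delivered (assumed linear in fuel flow)
    return 4.0e4 * wf

def spool_transient(omega0, wf, dt=0.01, t_end=100.0):
    """Integrate the rotor speed response to a constant fuel flow wf."""
    omega = omega0
    t = 0.0
    while t < t_end:
        domega = (p_turb(wf) - p_comp(omega)) / (I * omega)
        omega += dt * domega           # explicit Euler step
        t += dt
    return omega

# Step the fuel flow and let the spool settle at its new equilibrium speed.
omega_idle = spool_transient(omega0=800.0, wf=0.02)
omega_full = spool_transient(omega0=800.0, wf=0.05)
```

At equilibrium the power balance gives omega = (4e10 * wf)**(1/3), which is what the integration settles toward; the control program in the paper exploits exactly this fuel-flow-to-speed relationship.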
Plane-wave decomposition by spherical-convolution microphone array
NASA Astrophysics Data System (ADS)
Rafaely, Boaz; Park, Munhum
2004-05-01
Reverberant sound fields are widely studied, as they have a significant influence on the acoustic performance of enclosures in a variety of applications. For example, the intelligibility of speech in lecture rooms, the quality of music in auditoria, the noise level in offices, and the production of 3D sound in living rooms are all affected by the enclosed sound field. These sound fields are typically studied through frequency response measurements or statistical measures such as reverberation time, which do not provide detailed spatial information. The aim of the work presented in this seminar is the detailed analysis of reverberant sound fields. A measurement and analysis system based on acoustic theory and signal processing, designed around a spherical microphone array, is presented. Detailed analysis is achieved by decomposition of the sound field into waves, using spherical Fourier transform and spherical convolution. The presentation will include theoretical review, simulation studies, and initial experimental results.
The Design and the Formative Evaluation of a Web-Based Course for Simulation Analysis Experiences
ERIC Educational Resources Information Center
Tao, Yu-Hui; Guo, Shin-Ming; Lu, Ya-Hui
2006-01-01
Simulation output analysis has received little attention compared with modeling and programming in real-world simulation applications. This is further evidenced by our observation that students and beginners acquire neither adequately detailed knowledge nor relevant experience of simulation output analysis in traditional classroom learning. With…
Identification of large geomorphological anomalies based on 2D discrete wavelet transform
NASA Astrophysics Data System (ADS)
Doglioni, A.; Simeone, V.
2012-04-01
The identification and analysis, based on quantitative evidence, of large geomorphological anomalies is an important stage in the study of large landslides. Numerical geomorphic analyses represent an interesting approach to this kind of study, allowing a detailed and fairly accurate identification of hidden topographic anomalies that may be related to large landslides. Here a numerical geomorphic analysis of the Digital Terrain Model (DTM) is presented. The introduced approach is based on the 2D discrete wavelet transform (Antoine et al., 1993; Bruun and Nilsen, 2003; Booth et al., 2009). The 2D wavelet decomposition of the DTM, and in particular the analysis of the detail coefficients of the wavelet transform, can provide evidence of anomalies or singularities, i.e. discontinuities of the land surface. These discontinuities are not very evident from the DTM as it is, while the 2D wavelet transform allows a grid-based analysis of the DTM and a mapping of the decomposition. In fact, the grid-based DTM can be treated as a matrix, on which a discrete wavelet transform (Daubechies, 1992) is performed columnwise and linewise, corresponding to the horizontal and vertical directions. The outcomes of this analysis are low-frequency approximation coefficients and high-frequency detail coefficients. The detail coefficients are analyzed, since their variations are associated with discontinuities of the DTM. Detail coefficients are estimated by performing the 2D wavelet transform both in the horizontal direction (east-west) and in the vertical direction (north-south). Detail coefficients are then mapped for both cases, allowing potential anomalies of the land surface to be visualized and quantified. Moreover, the wavelet decomposition can be pushed to further levels, assuming a higher scale number of the transform. This may potentially return further interesting results in terms of identification of anomalies of the land surface.
In this kind of approach, the choice of a proper mother wavelet function is a tricky point, since it conditions the analysis and its outcomes. Therefore multiple decomposition levels as well as multiple wavelet analyses are considered. Here the introduced approach is applied to some interesting case studies in southern Italy, in particular for the identification of large anomalies associated with large landslides at the transition between the Apennine chain domain and the foredeep domain. In particular, the lower Biferno valley and the Fortore valley are analyzed here. Finally, the wavelet transforms are performed on multiple levels, thus addressing the question of which decomposition level yields an accurate analysis for a specific problem. Antoine J.P., Carrette P., Murenzi R., and Piette B. (1993), Image analysis with two-dimensional continuous wavelet transform, Signal Processing, 31(3), pp. 241-272, doi:10.1016/0165-1684(93)90085-O. Booth A.M., Roering J.J., and Taylor Perron J. (2009), Automated landslide mapping using spectral analysis and high-resolution topographic data: Puget Sound lowlands, Washington, and Portland Hills, Oregon, Geomorphology, 109(3-4), pp. 132-147, doi:10.1016/j.geomorph.2009.02.027. Bruun B.T., and Nilsen S. (2003), Wavelet representation of large digital terrain models, Computers and Geosciences, 29(6), pp. 695-703, doi:10.1016/S0098-3004(03)00015-3. Daubechies I. (1992), Ten lectures on wavelets, SIAM.
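The detail-coefficient idea can be demonstrated with a one-level 2D Haar transform, the simplest mother wavelet (the choice of wavelet, as noted above, conditions the outcome). A synthetic DTM with a single scarp-like step shows the horizontal-detail band concentrating energy exactly at the discontinuity.

```python
# One-level 2D Haar DWT on a synthetic "DTM" with a step discontinuity.
# The 10 m vertical scarp is an invented example, not data from the study.
import numpy as np

def haar_dwt2(z):
    """One-level 2D Haar DWT: returns (approx, horiz, vert, diag) bands."""
    a = (z[0::2, :] + z[1::2, :]) / 2.0    # row pairs: average
    d = (z[0::2, :] - z[1::2, :]) / 2.0    # row pairs: difference
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0   # column pairs of the averages
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0   # horizontal detail
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0   # vertical detail
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0   # diagonal detail
    return ll, lh, hl, hh

dtm = np.zeros((64, 64))
dtm[:, 33:] = 10.0                         # north-south scarp: 10 m step

ll, lh, hl, hh = haar_dwt2(dtm)
col_energy = np.abs(lh).sum(axis=0)        # detail energy per (downsampled) column
peak_col = int(col_energy.argmax())        # locates the discontinuity
```

The step between columns 32 and 33 falls into downsampled column pair 16, so all the horizontal-detail energy appears there; on a real DTM, mapping `lh`/`hl` highlights candidate landslide-related discontinuities the same way.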
Edge enhancement and noise suppression for infrared image based on feature analysis
NASA Astrophysics Data System (ADS)
Jiang, Meng
2018-06-01
Infrared images often suffer from background noise, blurred edges, few details, and low signal-to-noise ratios. To improve infrared image quality, it is essential to suppress noise and enhance edges simultaneously. To achieve this, in this paper we propose a novel algorithm based on feature analysis in the shearlet domain. Firstly, we introduce the theory and advantages of the shearlet transform as a form of multi-scale geometric analysis (MGA). Secondly, after analyzing the defects of the traditional thresholding technique for noise suppression, we propose a novel feature extraction that distinguishes image structures from noise well, and use it to improve the traditional thresholding technique. Thirdly, by computing the correlations between neighboring shearlet coefficients, feature attribute maps identifying weak details and strong edges are built to improve generalized unsharp masking (GUM). Finally, experimental results with infrared images captured in different scenes demonstrate that the proposed algorithm suppresses noise efficiently and enhances image edges adaptively.
Tomographic assessment of the spine in children with spondylocostal dysostosis syndrome.
Kaissi, Ali Al; Klaushofer, Klaus; Grill, Franz
2010-01-01
The aim of this study was to perform a detailed tomographic analysis of the skull base, craniocervical junction, and the entire spine in seven patients with spondylocostal dysostosis syndrome. Detailed scanning images have been organized in accordance with the most prominent clinical pathology. The reasons behind plagiocephaly, torticollis, short immobile neck, scoliosis, and rigid back have been detected. Radiographic documentation was an insufficient modality. Detailed computed tomography scans provided excellent delineation of the osseous abnormality pattern in our patients. This article throws light on the most serious osseous manifestations of spondylocostal dysostosis syndrome.
NASA Technical Reports Server (NTRS)
Sundstrom, J. L.
1980-01-01
The techniques required to produce and validate six detailed task timeline scenarios for crew workload studies are described. Specific emphasis is given to: general aviation single pilot instrument flight rules operations in a high density traffic area; fixed path metering and spacing operations; and comparative workload operation between the forward and aft-flight decks of the NASA terminal control vehicle. The validation efforts also provide a cursory examination of the resultant demand workload based on the operating procedures depicted in the detailed task scenarios.
DOT National Transportation Integrated Search
1988-08-01
This report details the results of an analysis performed to evaluate the representativeness of the Crash Avoidance Research accident data base (CARDfile). The accident records for 1983 and 1984 from six states (Indiana, Maryland, Michigan, Penn...
DOT National Transportation Integrated Search
1985-12-01
This report details the results of an analysis performed to evaluate the representativeness of the Crash Avoidance Research accident data base (CARDfile). The accident records for 1983 and 1984 from six states (Indiana, Maryland, Michigan, Pennsylvan...
Innovations in Site Characterization Case Study: Hanscom Air Force Base, Operable Unit 1
This document is a condensation of the information provided in the much more detailed Hanscom AFB Report entitled A Dynamic Site Investigation: Adaptive Sampling and Analysis Program for Operable Unit 1 at Hanscom Air Force Base, Bedford, Massachusetts.
Comparison of Document Data Bases
ERIC Educational Resources Information Center
Schipma, Peter B.; And Others
This paper presents a detailed analysis of the content and format of seven machine-readable bibliographic data bases: Chemical Abstracts Service Condensates, Chemical and Biological Activities, and Polymer Science and Technology, Biosciences Information Service's BA Previews including Biological Abstracts and BioResearch Index, Institute for…
Synthesis, Structure And Properties of Electrochemically Active Nanocomposites
2003-05-01
milling. Detailed systematic impedance analysis, electronic conductivity measurement and high-resolution electron microscopy studies have shown that...carbon particles determined by TEM analysis. Results of the studies so far have shown that Sn and Si-based nanocomposites appear to be quite promising... Analysis of the As-milled Powders 2. Electrochemical Characteristics of Si/SiC Nanocomposites 3. Microstructural/Morphological Analysis of
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis.
Bonham-Carter, Oliver; Steele, Joe; Bastola, Dhundy
2014-11-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base-base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel-Ziv techniques from data compression.
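A worked example of word-frequency comparison: k-mer count vectors compared by cosine similarity, with the inner product of the two count vectors playing the role of the D2 statistic mentioned above. The sequences here are short toy strings, not real genomic data.

```python
# Alignment-free comparison by k-mer (word) frequencies.
import math
from collections import Counter

def kmer_counts(seq, k=3):
    """Count all overlapping k-mers in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine_similarity(c1, c2):
    d2 = sum(c1[w] * c2[w] for w in c1)          # D2 statistic: inner product
    n1 = math.sqrt(sum(v * v for v in c1.values()))
    n2 = math.sqrt(sum(v * v for v in c2.values()))
    return d2 / (n1 * n2)

s1 = "ATGCGATACGCTTAGGCTAATGCGATAC"
s2 = "ATGCGATACGCTAAGGCTAATGCGATAC"   # close variant of s1 (one substitution)
s3 = "TTTTTTTTTTGGGGGGGGGGAAAAAAAA"   # unrelated composition

sim_close = cosine_similarity(kmer_counts(s1), kmer_counts(s2))
sim_far = cosine_similarity(kmer_counts(s1), kmer_counts(s3))
```

No alignment is ever computed: relatedness is inferred purely from the word-frequency vectors, which is what makes these methods cheap on large data sets.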
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
An airborne combined radiometric and magnetic survey was performed for the Department of Energy (DOE) over the Durango A, Durango B, Durango C, and Durango D Detail Areas of southwestern Colorado. The Durango A Detail Area is within the coverage of the Needle Mountains and Silverton 15' map sheets, and the Pole Creek Mountain, Rio Grande Pyramid, Emerald Lake, Granite Peak, Vallecito Reservoir, and Lemon Reservoir 7.5' map sheets of the National Topographic Map Series (NTMS). The Durango B Detail Area is within the coverage of the Silverton 15' map sheet and the Wetterhorn Peak, Uncompahgre Peak, Lake City, Redcloud Peak, Lake San Cristobal, Pole Creek Mountain, and Finger Mesa 7.5' map sheets of the NTMS. The Durango C Detail Area is within the coverage of the Platoro and Wolf Creek Pass 15' map sheets of the NTMS. The Durango D Detail Area is within the coverage of the Granite Lake, Cimarrona Peak, Bear Mountain, and Oakbrush Ridge 7.5' map sheets of the NTMS. Radiometric data were corrected for live time, aircraft and equipment background, cosmic background, atmospheric radon, Compton scatter, and altitude dependence. The corrected data were statistically evaluated, gridded, and contoured to produce maps of the radiometric variables uranium, potassium, and thorium; their ratios; and the residual magnetic field. These maps have been analyzed in order to produce a multi-variant analysis contour map based on the radiometric response of the individual geological units. A geochemical analysis has been performed, using the radiometric and magnetic contour maps, the multi-variant analysis map, and factor analysis techniques, to produce a geochemical analysis map for the area.
Substructure program for analysis of helicopter vibrations
NASA Technical Reports Server (NTRS)
Sopher, R.
1981-01-01
A substructure vibration analysis which was developed as a design tool for predicting helicopter vibrations is described. The substructure assembly method and the composition of the transformation matrix are analyzed. The procedure for obtaining solutions to the equations of motion is illustrated for the steady-state forced response solution mode, and rotor hub load excitation and impedance are analyzed. Calculation of the mass, damping, and stiffness matrices, as well as the forcing function vectors of physical components resident in the base program code, are discussed in detail. Refinement of the model is achieved by exercising modules which interface with the external program to represent rotor induced variable inflow and fuselage induced variable inflow at the rotor. The calculation of various flow fields is discussed, and base program applications are detailed.
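The steady-state forced response solution mode described above amounts to solving a dynamic-stiffness (impedance) equation assembled from mass, damping, and stiffness matrices. A minimal numerical sketch, with hypothetical 2-DOF matrices standing in for an assembled substructure model:

```python
import numpy as np

# Hypothetical 2-DOF component: mass, damping, stiffness matrices
M = np.array([[2.0, 0.0], [0.0, 1.0]])
C = np.array([[0.3, -0.1], [-0.1, 0.2]])
K = np.array([[400.0, -150.0], [-150.0, 300.0]])
F = np.array([10.0, 0.0])       # harmonic hub-load amplitude per DOF
omega = 12.0                    # excitation frequency, rad/s

# Steady-state forced response: solve (K - omega^2 M + i*omega*C) x = F
Z = K - omega**2 * M + 1j * omega * C   # dynamic stiffness (impedance)
x = np.linalg.solve(Z, F)               # complex response amplitudes
amplitude = np.abs(x)                   # physical vibration amplitudes
```

The complex phase of `x` carries the lag of each degree of freedom behind the rotor hub excitation.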
Example-based super-resolution for single-image analysis from the Chang'e-1 Mission
NASA Astrophysics Data System (ADS)
Wu, Fan-Lu; Wang, Xiang-Jun
2016-11-01
Due to the low spatial resolution of images taken by the CCD camera on the Chang'e-1 (CE-1) orbiter, details of the lunar surface are blurred or lost, so an example-based super-resolution (SR) algorithm is employed to obtain high-resolution (HR) images. SR reconstruction is important because it increases the effective resolution of the image data for later applications. In this article, a novel example-based algorithm is proposed to implement SR reconstruction by single-image analysis, and the computational cost is reduced compared to other example-based SR methods. The results show that this method can enhance the resolution of images using SR and recover detailed information about the lunar surface; thus it can be used for surveying HR terrain and geological features. Moreover, the algorithm is significant for the HR processing of remotely sensed images obtained by other imaging systems.
Improved medical image fusion based on cascaded PCA and shift invariant wavelet transforms.
Reena Benjamin, J; Jayasree, T
2018-02-01
In the medical field, radiologists need more informative and high-quality medical images to diagnose diseases. Image fusion plays a vital role in the field of biomedical image analysis. It aims to integrate the complementary information from multimodal images, producing a new composite image which is expected to be more informative for visual perception than any of the individual input images. The main objective of this paper is to improve the information, to preserve the edges and to enhance the quality of the fused image using cascaded principal component analysis (PCA) and shift invariant wavelet transforms. A novel image fusion technique based on cascaded PCA and shift invariant wavelet transforms is proposed in this paper. PCA in the spatial domain extracts relevant information from the large dataset based on eigenvalue decomposition, and the wavelet transform operating in the complex domain with shift invariant properties brings out more directional and phase details of the image. The maximum fusion rule applied in the dual-tree complex wavelet transform domain enhances the average information and morphological details. The input images of the human brain of two different modalities (MRI and CT) are collected from the whole brain atlas data distributed by Harvard University. Both MRI and CT images are fused using the cascaded PCA and shift invariant wavelet transform method. The proposed method is evaluated on three key factors, namely structure preservation, edge preservation and contrast preservation. The experimental results and comparison with other existing fusion methods show the superior performance of the proposed image fusion framework in terms of visual and quantitative evaluations. In this paper, a complex wavelet-based image fusion has been discussed. The experimental results demonstrate that the proposed method enhances the directional features as well as fine edge details. Also, it reduces redundant details, artifacts and distortions.
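The PCA stage described above derives fusion weights from an eigenvalue decomposition of the joint image statistics. A minimal sketch of that spatial-domain step only (the wavelet stage is omitted), assuming registered single-band inputs:

```python
import numpy as np

def pca_fuse(img_a, img_b):
    """Fuse two registered grayscale images by weighting each with the
    components of the principal eigenvector of their joint covariance."""
    data = np.stack([img_a.ravel(), img_b.ravel()])  # 2 x N observations
    cov = np.cov(data)
    vals, vecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    pc = np.abs(vecs[:, -1])              # principal eigenvector
    w = pc / pc.sum()                     # normalized fusion weights
    return w[0] * img_a + w[1] * img_b    # weighted combination

a = np.random.rand(8, 8)
b = np.random.rand(8, 8)
fused = pca_fuse(a, b)
```

Because the weights are non-negative and sum to one, the fused image is a convex combination of the inputs at every pixel.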
Cortijo, Sandra; Charoensawan, Varodom; Roudier, François; Wigge, Philip A
2018-01-01
Chromatin immunoprecipitation combined with next-generation sequencing (ChIP-seq) is a powerful technique to investigate in vivo transcription factor (TF) binding to DNA, as well as chromatin marks. Here we provide a detailed protocol for all the key steps to perform ChIP-seq in Arabidopsis thaliana roots, also working on other A. thaliana tissues and in most non-ligneous plants. We detail all steps from material collection, fixation, chromatin preparation, immunoprecipitation, library preparation, and finally computational analysis based on a combination of publicly available tools.
Base-By-Base: single nucleotide-level analysis of whole viral genome alignments.
Brodie, Ryan; Smith, Alex J; Roper, Rachel L; Tcherepanov, Vasily; Upton, Chris
2004-07-14
With ever increasing numbers of closely related virus genomes being sequenced, it has become desirable to be able to compare two genomes at a level more detailed than gene content because two strains of an organism may share the same set of predicted genes but still differ in their pathogenicity profiles. For example, detailed comparison of multiple isolates of the smallpox virus genome (each approximately 200 kb, with 200 genes) is not feasible without new bioinformatics tools. A software package, Base-By-Base, has been developed that provides visualization tools to enable researchers to 1) rapidly identify and correct alignment errors in large, multiple genome alignments; and 2) generate tabular and graphical output of differences between the genomes at the nucleotide level. Base-By-Base uses detailed annotation information about the aligned genomes and can list each predicted gene with nucleotide differences, display whether variations occur within promoter regions or coding regions and whether these changes result in amino acid substitutions. Base-By-Base can connect to our mySQL database (Virus Orthologous Clusters; VOCs) to retrieve detailed annotation information about the aligned genomes or use information from text files. Base-By-Base enables users to quickly and easily compare large viral genomes; it highlights small differences that may be responsible for important phenotypic differences such as virulence. It is available via the Internet using Java Web Start and runs on Macintosh, PC and Linux operating systems with the Java 1.4 virtual machine.
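The nucleotide-level comparison Base-By-Base performs on an alignment can be illustrated with a toy sketch (hypothetical sequences; Base-By-Base itself is a Java tool with far richer annotation handling):

```python
def diff_alignment(seq_a, seq_b):
    """List nucleotide differences between two aligned genome sequences.
    Returns (alignment column, base in A, base in B); '-' denotes a gap."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    return [(i, a, b) for i, (a, b) in enumerate(zip(seq_a, seq_b)) if a != b]

# toy aligned fragments: one insertion/deletion and one substitution
diffs = diff_alignment("ATG-CGTA", "ATGACGTT")
```

Mapping each reported column back onto gene coordinates (as Base-By-Base does with its annotation database) is what turns such raw differences into candidate promoter or coding-region changes.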
2004-01-01
Cognitive Task Analysis Abstract As Department of Defense (DoD) leaders rely more on modeling and simulation to provide information on which to base...capabilities and intent. Cognitive Task Analysis (CTA) is an extensive/detailed look at tasks and subtasks performed by a...Domain Analysis and Task Analysis: A Difference That Matters. In Cognitive Task Analysis, edited by J. M. Schraagen, S.
Final Environmental Assessment for Camp Rudder Master Plan at Eglin Air Force Base, FL
2005-06-07
warranting detailed analysis. Hazardous Materials The 6th RTB currently generates hazardous materials in the form of weapons cleaning products and...wastes. There would be no increase in the use of weapons cleaning products; and therefore, this area does not require analysis. Additionally
Stitching Footballs: Voices of Children in Sialkot, Pakistan.
ERIC Educational Resources Information Center
Marcus, Rachel; Husselbee, David; Shah, Faiz; Harper, Annie; Ali, Bahar
This report details a situation analysis of children working in football stitching around Sialkot, Pakistan. The analysis (1) examined the reasons that children work and the probable impact of eradicating children's involvement and phasing out home-based production and (2) determined a baseline for monitoring changes in children's and families'…
Performance Analysis of GAME: A Generic Automated Marking Environment
ERIC Educational Resources Information Center
Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram
2008-01-01
This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…
Space station systems analysis study. Part 3: Documentation. Volume 5: Cost and schedule data
NASA Technical Reports Server (NTRS)
1977-01-01
Cost estimates for the space station systems analysis were recorded. Space construction base costs and characteristics were cited as well as mission hardware costs and characteristics. Also delineated were cost ground rules, the program schedule, and a detailed cost estimate and funding distribution.
GIS-based Landing-Site Analysis and Passive Decision Support
NASA Astrophysics Data System (ADS)
van Gasselt, Stephan; Nass, Andrea
2016-04-01
The increase of surface coverage and the availability and accessibility of planetary data allow researchers and engineers to remotely perform detailed studies on surface processes and properties, in particular on objects such as Mars and the Moon for which Terabytes of multi-temporal data at multiple spatial resolution levels have become available during the last 15 years. Orbiters, rovers and landers have been returning information and insights into the surface evolution of the terrestrial planets in unprecedented detail. While rover- and lander-based analyses are one major research aim to obtain ground truth, resource exploration and even the potential establishment of bases using autonomous platforms are others; these require detailed investigation of settings in order to identify spots on the surface that are suitable for spacecraft to land and operate safely and over a long period of time. What was done using hardcopy material in the past is today carried out using either in-house developments or off-the-shelf spatial information system technology, which makes it possible to manage, integrate and analyse data as well as visualize results and create user-defined reports for performing assessments. Usually, such analyses can be broken down (manually) by considering scientific wishes, engineering boundary conditions, potential hazards and various tertiary constraints. We here (1) review standard tasks of landing site analyses, (2) discuss issues inherently related to the analysis using integrated spatial analysis systems and (3) demonstrate a modular analysis framework for integration of data and for the evaluation of results from individual tasks in order to support decisions for landing-site selection.
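A common way to break such an analysis down, as described, is a hard constraint mask (engineering limits, hazards) followed by a weighted overlay of graded criteria. A minimal sketch with hypothetical raster layers and illustrative weights:

```python
import numpy as np

# Hypothetical co-registered raster layers (one value per grid cell).
slope = np.array([[2.0, 10.0], [25.0, 4.0]])        # degrees
rock_abundance = np.array([[0.05, 0.2], [0.4, 0.1]])  # fraction of cell
illumination = np.array([[0.9, 0.6], [0.3, 0.8]])   # fraction of time lit

# Hard engineering constraints: mask out cells that are unsafe outright.
feasible = (slope < 15.0) & (rock_abundance < 0.3)

# Weighted overlay of normalized criteria (weights are illustrative).
score = (0.5 * (1 - slope / slope.max())
         + 0.2 * (1 - rock_abundance)
         + 0.3 * illumination)
suitability = np.where(feasible, score, 0.0)
best = np.unravel_index(np.argmax(suitability), suitability.shape)
```

Real workflows add many more layers (thermal environment, communication visibility, science targets), but the mask-then-rank structure stays the same.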
NASA Astrophysics Data System (ADS)
Didenko, A. N.; Nosyrev, M. Yu.; Shevchenko, B. F.; Gilmanova, G. Z.
2017-11-01
The depth of the base of the magnetoactive layer and the geothermal gradient in the Sikhote Alin crust are estimated based on a method determining the Curie depth point of magnetoactive masses by using spectral analysis of the anomalous magnetic field. A detailed map of the geothermal gradient is constructed for the first time for the Sikhote Alin and adjacent areas of the Central Asian belt. Analysis of this map shows that the zones with a higher geothermal gradient geographically fit the areas with a higher level of seismicity.
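The Curie-depth estimation described above is commonly done with the centroid method: straight-line fits to the radially averaged power spectrum of the magnetic anomaly in two wavenumber bands. A hedged sketch on a synthetic slab spectrum (band limits, depths, and the spectrum model are illustrative):

```python
import numpy as np

def fit_depth(k, y):
    """Negative slope of a straight-line fit, interpreted as a depth."""
    return -np.polyfit(k, y, 1)[0]

def curie_depth(k, power, centroid_band, top_band):
    """Curie-point depth from a radially averaged anomaly power spectrum
    (centroid method). k is radial wavenumber in rad/km."""
    ln_amp = 0.5 * np.log(power)                       # ln of amplitude
    lo = (k >= centroid_band[0]) & (k <= centroid_band[1])
    hi = (k >= top_band[0]) & (k <= top_band[1])
    z0 = fit_depth(k[lo], ln_amp[lo] - np.log(k[lo]))  # centroid depth
    zt = fit_depth(k[hi], ln_amp[hi])                  # depth to top
    return 2.0 * z0 - zt                               # depth to base

# synthetic amplitude spectrum for a magnetic slab: top 5 km, base 25 km
zt_true, zb_true = 5.0, 25.0
k = np.linspace(0.005, 0.8, 400)
amp = np.exp(-k * zt_true) * (1 - np.exp(-k * (zb_true - zt_true)))
zb = curie_depth(k, amp**2, (0.005, 0.04), (0.3, 0.8))
```

The geothermal gradient then follows by dividing an assumed Curie temperature (about 580 °C for magnetite) by the estimated depth.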
Mórocz, István Akos; Janoos, Firdaus; van Gelderen, Peter; Manor, David; Karni, Avi; Breznitz, Zvia; von Aster, Michael; Kushnir, Tammar; Shalev, Ruth
2012-01-01
The aim of this article is to report on the importance and challenges of a time-resolved and spatio-temporal analysis of fMRI data from complex cognitive processes and associated disorders using a study on developmental dyscalculia (DD). Participants underwent fMRI while judging the incorrectness of multiplication results, and the data were analyzed using a sequence of methods, each of which progressively provided a more detailed picture of the spatio-temporal aspect of this disorder. Healthy subjects and subjects with DD performed alike behaviorally though they exhibited parietal disparities using traditional voxel-based group analyses. Further and more detailed differences, however, surfaced with a time-resolved examination of the neural responses during the experiment. While performing inter-group comparisons, a third group of subjects with dyslexia (DL) but with no arithmetic difficulties was included to test the specificity of the analysis and strengthen the statistical base with fifty-eight subjects overall. Surprisingly, the analysis showed a functional dissimilarity during an initial reading phase for the group of dyslexic but otherwise normal subjects, with respect to controls, even though only numerical digits and no alphabetic characters were presented. Thus our results suggest that time-resolved multi-variate analysis of complex experimental paradigms has the ability to yield powerful new clinical insights about abnormal brain function. Similarly, a detailed compilation of aberrations in the functional cascade may have much greater potential to delineate the core processing problems in mental disorders. PMID:22368322
Economic effects of propulsion system technology on existing and future transport aircraft
NASA Technical Reports Server (NTRS)
Sallee, G. P.
1974-01-01
The results of an airline study of the economic effects of propulsion system technology on current and future transport aircraft are presented. This report represents the results of a detailed study of propulsion system operating economics. The study has four major parts: (1) a detailed analysis of current propulsion system maintenance with respect to the material and labor costs encountered versus years in service and the design characteristics of the major elements of the propulsion system of the B707, B727, and B747; (2) an analysis of the economic impact of a future representative 1979 propulsion system, with emphasis on depreciation of investment, fuel costs and maintenance costs developed on the basis of the analysis of the historical trends observed; (3) recommendations concerning improved methods of forecasting the maintenance cost of future propulsion systems, including a detailed method based on the summation of the projected labor and material repair costs for each major engine module and its installation, along with a shorter form suitable for quick, less detailed analysis; and (4) recommendations concerning areas where additional technology is needed to improve the economics of future commercial propulsion systems, along with the suggested economic benefits available from such advanced technology efforts.
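The recommended long-form forecasting method, summing projected labor and material repair costs over each major engine module plus its installation, can be sketched as follows (all module names, hours, and dollar figures are hypothetical):

```python
# Illustrative module-level cost projection; every number is hypothetical.
modules = {
    "fan":          {"labor_hours": 40,  "material": 12000},
    "compressor":   {"labor_hours": 90,  "material": 55000},
    "combustor":    {"labor_hours": 35,  "material": 21000},
    "turbine":      {"labor_hours": 120, "material": 88000},
    "installation": {"labor_hours": 25,  "material": 4000},
}
LABOR_RATE = 30.0  # assumed $/hour

def maintenance_cost_per_overhaul(modules, labor_rate):
    """Long-form forecast: sum projected labor and material repair costs
    over each major engine module plus its installation."""
    return sum(m["labor_hours"] * labor_rate + m["material"]
               for m in modules.values())

total = maintenance_cost_per_overhaul(modules, LABOR_RATE)
```

The "shorter form" the report mentions would replace the module breakdown with a single regression on engine-level parameters.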
Fusion and quality analysis for remote sensing images using contourlet transform
NASA Astrophysics Data System (ADS)
Choi, Yoonsuk; Sharifahmadian, Ershad; Latifi, Shahram
2013-05-01
Recent developments in remote sensing technologies have provided various images with high spatial and spectral resolutions. However, multispectral images have low spatial resolution and panchromatic images have low spectral resolution. Therefore, image fusion techniques are necessary to improve the spatial resolution of spectral images by injecting spatial details of high-resolution panchromatic images. The objective of image fusion is to provide useful information by improving the spatial resolution and the spectral information of the original images. The fusion results can be utilized in various applications, such as military, medical imaging, and remote sensing. This paper addresses two issues in image fusion: i) image fusion method and ii) quality analysis of fusion results. First, a new contourlet-based image fusion method is presented, which is an improvement over the wavelet-based fusion. This fusion method is then applied to a case study to demonstrate its fusion performance. Fusion framework and scheme used in the study are discussed in detail. Second, quality analysis for the fusion results is discussed. We employed various quality metrics in order to analyze the fusion results both spatially and spectrally. Our results indicate that the proposed contourlet-based fusion method performs better than the conventional wavelet-based fusion methods.
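Spectral quality analysis of fusion results often uses indices such as ERGAS (lower is better); a minimal sketch of one such metric, assuming a per-band reference image is available:

```python
import numpy as np

def ergas(fused, reference, ratio):
    """ERGAS spectral-quality index between a fused image and a reference
    (band-first arrays); ratio is the pan/MS pixel-size ratio. Lower is
    better; identical images score 0."""
    bands = reference.shape[0]
    acc = 0.0
    for b in range(bands):
        rmse = np.sqrt(np.mean((fused[b] - reference[b]) ** 2))
        acc += (rmse / reference[b].mean()) ** 2     # relative error per band
    return 100.0 * ratio * np.sqrt(acc / bands)

# hypothetical 3-band reference (offset keeps band means positive)
ref = np.random.rand(3, 16, 16) + 1.0
perfect = ergas(ref.copy(), ref, 0.25)
```

Spatial quality is assessed separately, e.g. by correlating high-pass-filtered fused and panchromatic images.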
Using the DOE Knowledge Base for Special Event Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armstrong, H.M.; Harris, J.M.; Young, C.J.
1998-10-20
The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations.
Relevant historic events can be identified either by spatial proximity searches or through waveform correlation processing. The locations and waveforms of these events can then be made available for side-by-side comparison and processing. If synthetic modeling is thought to be warranted, a wide variety of relevant contextual information (e.g. crustal thickness and layering, seismic velocities, attenuation factors) can be retrieved and sent to the appropriate applications. Once formed, the synthetics can then be brought in for side-by-side comparison and further processing. Based on our study, we make two general recommendations. First, proper inter-process communication between sensor data analysis software and contextual data analysis software should be developed. Second, some of the Knowledge Base data sets should be prioritized or winnowed to streamline comparison with observed quantities.
ERIC Educational Resources Information Center
Davids, Mogamat Razeen; Chikte, Usuf M. E.; Halperin, Mitchell L.
2011-01-01
This article reports on the development and evaluation of a Web-based application that provides instruction and hands-on practice in managing electrolyte and acid-base disorders. Our teaching approach, which focuses on concepts rather than details, encourages quantitative analysis and a logical problem-solving approach. Identifying any dangers to…
Tomographic assessment of the spine in children with spondylocostal dysostosis syndrome
Kaissi, Ali Al; Klaushofer, Klaus; Grill, Franz
2010-01-01
OBJECTIVE: The aim of this study was to perform a detailed tomographic analysis of the skull base, craniocervical junction, and the entire spine in seven patients with spondylocostal dysostosis syndrome. METHOD: Detailed scanning images have been organized in accordance with the most prominent clinical pathology. The reasons behind plagiocephaly, torticollis, short immobile neck, scoliosis and rigid back have been identified. Radiography alone was an insufficient imaging modality. RESULTS: Detailed computed tomography scans provided excellent delineation of the osseous abnormality pattern in our patients. CONCLUSION: This article throws light on the most serious osseous manifestations of spondylocostal dysostosis syndrome. PMID:21120293
NASA Astrophysics Data System (ADS)
Xiao, Heng; Gou, Xiaolong; Yang, Suwen
2011-05-01
Thermoelectric (TE) power generation technology, due to its several advantages, is becoming a noteworthy research direction. Many researchers conduct their performance analysis and optimization of TE devices and related applications based on the generalized thermoelectric energy balance equations. These generalized TE equations involve the internal irreversibility of Joule heating inside the thermoelectric device and heat leakage through the thermoelectric couple leg. However, it is assumed that the thermoelectric generator (TEG) is thermally isolated from the surroundings except for the heat flows at the cold and hot junctions. Since the thermoelectric generator is a multi-element device in practice, being composed of many fundamental TE couple legs, the effect of heat transfer between the TE couple leg and the ambient environment is not negligible. In this paper, based on basic theories of thermoelectric power generation and thermal science, detailed modeling of a thermoelectric generator taking account of the phenomenon of energy loss from the TE couple leg is reported. The revised generalized thermoelectric energy balance equations considering the effect of heat transfer between the TE couple leg and the ambient environment have been derived. Furthermore, characteristics of a multi-element thermoelectric generator with irreversibility have been investigated on the basis of the new derived TE equations. In the present investigation, second-law-based thermodynamic analysis (exergy analysis) has been applied to the irreversible heat transfer process in particular. It is found that the existence of the irreversible heat convection process causes a large loss of heat exergy in the TEG system, and using thermoelectric generators for low-grade waste heat recovery has promising potential. 
The results of irreversibility analysis, especially irreversible effects on generator system performance, based on the system model established in detail have guiding significance for the development and application of thermoelectric generators, particularly for the design and optimization of TE modules.
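The generalized thermoelectric energy balance equations the modeling starts from (before the ambient heat-loss correction the paper adds) can be sketched for a single couple: Peltier transport at each junction, conduction leakage, and half the Joule heat returned to each side. Parameter values below are illustrative only:

```python
def teg_performance(alpha, R, K, Th, Tc, I):
    """Energy balance for one thermoelectric couple.
    alpha: Seebeck coefficient [V/K], R: internal resistance [ohm],
    K: thermal conductance [W/K], Th/Tc: junction temperatures [K],
    I: load current [A]."""
    dT = Th - Tc
    Qh = alpha * I * Th + K * dT - 0.5 * I**2 * R  # heat absorbed, hot side
    Qc = alpha * I * Tc + K * dT + 0.5 * I**2 * R  # heat rejected, cold side
    P = Qh - Qc                                    # electrical output power
    eta = P / Qh                                   # conversion efficiency
    return Qh, Qc, P, eta

# illustrative operating point
Qh, Qc, P, eta = teg_performance(2.0e-4, 0.01, 0.05, 500.0, 300.0, 2.0)
```

Note that the output reduces to the familiar P = alpha*I*dT - I^2*R; the paper's extension adds a convective loss term along the couple leg to each balance.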
Analysis of off-axis tension test of wood specimens
Jen Y. Liu
2002-01-01
This paper presents a stress analysis of the off-axis tension test of clear wood specimens based on orthotropic elasticity theory. The effects of Poisson's ratio and shear coupling coefficient on stress distribution are analyzed in detail. The analysis also provides a theoretical foundation for the selection of a 10° grain angle in wood specimens for the...
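The material-axis stresses that drive such an analysis follow from the standard stress transformation for a uniaxially loaded off-axis specimen; a minimal sketch (the sign convention for shear varies between texts):

```python
import math

def material_axis_stresses(sigma_x, theta_deg):
    """Stresses in the material (grain) axes for an off-axis tension
    specimen loaded uniaxially at angle theta to the grain."""
    t = math.radians(theta_deg)
    c, s = math.cos(t), math.sin(t)
    sigma_1 = sigma_x * c * c   # normal stress parallel to grain
    sigma_2 = sigma_x * s * s   # normal stress perpendicular to grain
    tau_12 = -sigma_x * s * c   # shear stress in material axes
    return sigma_1, sigma_2, tau_12

s1, s2, t12 = material_axis_stresses(10.0, 10.0)  # 10 deg grain angle
```

At the 10° grain angle the specimen carries a controlled mix of all three stress components, which is what makes that angle useful for characterizing shear coupling.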
Graph-based normalization and whitening for non-linear data analysis.
Aaron, Catherine
2006-01-01
In this paper we construct a graph-based normalization algorithm for non-linear data analysis. The principle of this algorithm is to get a spherical average neighborhood with unit radius. First we present a class of global dispersion measures used for "global normalization"; we then adapt these measures using a weighted graph to build a local normalization called "graph-based" normalization. Then we give details of the graph-based normalization algorithm and illustrate some results. In the second part we present a graph-based whitening algorithm built by analogy between the "global" and the "local" problem.
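The "global normalization" step, rescaling so an average neighborhood has unit radius, can be sketched with a k-nearest-neighbour dispersion measure (k and the measure are illustrative; the paper's local variant adapts them per neighbourhood via a weighted graph):

```python
import numpy as np

def knn_normalize(X, k=3):
    """Rescale a point cloud so the average distance to the k nearest
    neighbours (a graph-based dispersion measure) becomes 1."""
    # pairwise Euclidean distances, diagonal excluded
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)
    knn_mean = np.sort(d, axis=1)[:, :k].mean()  # avg k-NN distance
    return X / knn_mean

X = np.random.rand(50, 2) * 10
Xn = knn_normalize(X)
```

Because distances scale linearly with the coordinates, dividing by the average k-NN distance makes that average exactly 1 in the normalized cloud.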
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis
Steele, Joe; Bastola, Dhundy
2014-01-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base–base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel–Ziv techniques from data compression. PMID:23904502
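The word-frequency approach can be illustrated with the simplest D2-type statistic: the inner product of two k-mer count vectors (toy sequences, k=3; published variants normalize this raw count product):

```python
from collections import Counter

def kmer_counts(seq, k):
    """Word-frequency profile: counts of all overlapping k-mers."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k=3):
    """Raw D2 statistic: inner product of the two k-mer count vectors,
    an alignment-free similarity measure (higher = more shared words)."""
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

score = d2("ATGCGATGCA", "TTATGCGATG", k=3)
```

No alignment is ever computed, which is why rearrangements such as genetic shuffling do not mislead the measure the way they can mislead positional alignment scores.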
DOT National Transportation Integrated Search
1987-07-01
This report details the results of an analysis that compared the Crash Avoidance : Research Data Base (CARDfile) with the National Accident Sampling System (NASS). : CARDfile combines, in one data base, the police accident records for three years : (...
NASA Astrophysics Data System (ADS)
Bell, Lisa Y.; Boles, Walter; Smith, Alvin
1991-08-01
In an environment of intense competition for Federal funding, the U.S. space research community is responsible for developing a feasible, cost-effective approach to establishing a surface base on the moon to fulfill long-term Government objectives. This report presents the results of a construction operations analysis of two lunar scenarios provided by the National Aeronautics and Space Administration (NASA). Activities necessary to install the lunar base surface elements are defined and scheduled, based on the productivities and availability of the base resources allocated to the projects depicted in each scenario. The only construction project in which the required project milestones were not completed within the nominal timeframe was the initial startup phase of NASA's FY89 Lunar Evolution Case Study (LECS), primarily because this scenario did not include any Earth-based telerobotic site preparation before the arrival of the first crew. The other scenario analyzed, Reference Mission A from NASA's 90-Day Study of the Human Exploration of the Moon and Mars, did use telerobotic site preparation before the manned phase of the base construction. Details of the analysis for LECS are provided, including spreadsheets indicating quantities of work and Gantt charts depicting the general schedule for the work. This level of detail is not presented for the scenario based on the 90-Day Study because many of the projects include the same (or similar) surface elements and facilities.
NeuroLines: A Subway Map Metaphor for Visualizing Nanoscale Neuronal Connectivity.
Al-Awami, Ali K; Beyer, Johanna; Strobelt, Hendrik; Kasthuri, Narayanan; Lichtman, Jeff W; Pfister, Hanspeter; Hadwiger, Markus
2014-12-01
We present NeuroLines, a novel visualization technique designed for scalable detailed analysis of neuronal connectivity at the nanoscale level. The topology of 3D brain tissue data is abstracted into a multi-scale, relative distance-preserving subway map visualization that allows domain scientists to conduct an interactive analysis of neurons and their connectivity. Nanoscale connectomics aims at reverse-engineering the wiring of the brain. Reconstructing and analyzing the detailed connectivity of neurons and neurites (axons, dendrites) will be crucial for understanding the brain and its development and diseases. However, the enormous scale and complexity of nanoscale neuronal connectivity pose big challenges to existing visualization techniques in terms of scalability. NeuroLines offers a scalable visualization framework that can interactively render thousands of neurites, and that supports the detailed analysis of neuronal structures and their connectivity. We describe and analyze the design of NeuroLines based on two real-world use-cases of our collaborators in developmental neuroscience, and investigate its scalability to large-scale neuronal connectivity data.
Nonexistence of global solutions of abstract wave equations with high energies.
Esquivel-Avila, Jorge A
2017-01-01
We consider an undamped second order in time evolution equation. For any positive value of the initial energy, we give sufficient conditions to conclude nonexistence of global solutions. The analysis is based on a differential inequality. The success of our result is based on a detailed analysis that differs from the ones commonly used to prove blow-up. Several examples are given improving known results in the literature.
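For orientation, the classical concavity argument that such differential-inequality analyses refine runs as follows (a standard sketch, not the paper's weakened hypotheses):

```latex
Let $F(t)=\|u(t)\|^{2}>0$ satisfy, for some $\gamma>0$,
\[
  F''(t)\,F(t)-(1+\gamma)\,F'(t)^{2}\ \ge\ 0,\qquad F(0)>0,\quad F'(0)>0 .
\]
Then
\[
  \bigl(F^{-\gamma}\bigr)''
  \;=\; -\,\gamma\,F^{-\gamma-2}\Bigl(F''F-(1+\gamma)F'^{2}\Bigr)\;\le\;0 ,
\]
so $F^{-\gamma}$ is positive, concave, and strictly decreasing at $t=0$;
it must reach zero no later than
$T^{*}=F(0)\big/\bigl(\gamma\,F'(0)\bigr)$, forcing $F(t)\to\infty$ in
finite time, i.e. no global solution exists.
```

The high-energy case treated in the paper requires replacing the condition $F'(0)>0$ by more delicate assumptions, which is where the different detailed analysis enters.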
Analysis of large space structures assembly: Man/machine assembly analysis
NASA Technical Reports Server (NTRS)
1983-01-01
Procedures for analyzing large space structures assembly via three primary modes: manual, remote and automated are outlined. Data bases on each of the assembly modes and a general data base on the shuttle capabilities to support structures assembly are presented. Task element times and structure assembly component costs are given to provide a basis for determining the comparative economics of assembly alternatives. The lessons learned from simulations of space structures assembly are detailed.
Three-Dimensional Numerical Analyses of Earth Penetration Dynamics
1979-01-31
Lagrangian formulation based on the HEMP method and has been adapted and validated for treatment of normal-incidence (axisymmetric) impact and...code, is a detailed analysis of the structural response of the EPW. This analysis is generated using a nonlinear dynamic, elastic-plastic finite element...based on the HEMP scheme. Thus, the code has the same material modeling capabilities and abilities to track large scale motion found in the WAVE-L code
Research on three-dimensional visualization based on virtual reality and Internet
NASA Astrophysics Data System (ADS)
Wang, Zongmin; Yang, Haibo; Zhao, Hongling; Li, Jiren; Zhu, Qiang; Zhang, Xiaohong; Sun, Kai
2007-06-01
To disclose and display water information, a three-dimensional visualization system based on Virtual Reality (VR) and the Internet is researched for demonstrating "digital water conservancy" applications and also for routine management of the reservoir. To explore and mine in-depth information, after completion of modeling a high-resolution DEM with reliable quality, topographical analysis, visibility analysis and reservoir volume computation are studied. Also, parameters including slope, water level and NDVI are selected to classify easy-landslide zones in the water-level-fluctuating zone of the reservoir area. To establish the virtual reservoir scene, two kinds of methods are used for experiencing immersion, interaction and imagination (3I). The first virtual scene contains more detailed textures to increase realism on a graphical workstation with the virtual reality engine Open Scene Graph (OSG). The second virtual scene is intended for Internet users, with fewer details to assure fluent rendering speed.
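The easy-landslide classification described, combining slope, water level, and NDVI, can be sketched as a simple raster rule (layers and thresholds are hypothetical):

```python
import numpy as np

# Hypothetical co-registered rasters over a 2x2 grid of cells.
nir = np.array([[0.6, 0.5], [0.3, 0.7]])             # near-infrared band
red = np.array([[0.1, 0.2], [0.25, 0.1]])            # red band
slope_deg = np.array([[30.0, 12.0], [35.0, 8.0]])    # terrain slope
in_fluctuation_zone = np.array([[True, True], [True, False]])

# NDVI as a vegetation-cover proxy: (NIR - Red) / (NIR + Red)
ndvi = (nir - red) / (nir + red)

# Illustrative rule: steep, sparsely vegetated cells inside the
# water-level-fluctuating zone are flagged as easy-landslide
easy_landslide = in_fluctuation_zone & (slope_deg > 25.0) & (ndvi < 0.5)
```

A production system would derive the slope raster from the high-resolution DEM mentioned above rather than take it as given.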
NASA Astrophysics Data System (ADS)
Sutton, M. A.; Gilat, A.; Seidt, J.; Rajan, S.; Kidane, A.
2018-01-01
The very early stages of high rate tensile loading are important when attempting to characterize the response of materials during the transient loading time. To improve understanding of the conditions imposed on the specimen during the transient stage, a series of high rate loading experiments is performed using a Kolsky tensile bar system. Specimen forces and velocities during the high rate loading experiment are obtained by performing a thorough method-of-characteristics analysis of the system employed in the experiments. The in-situ full-field specimen displacements, velocities and accelerations during the loading process are quantified using modern ultra-high-speed imaging systems to provide detailed measurements of specimen response, with emphasis on the earliest stages of loading. Detailed analysis of the image-based measurements confirms that conditions are nominally consistent with those necessary for use of the one-dimensional wave equation within the relatively thin, dog-bone shaped tensile specimen. Specifically, measurements and use of the one-dimensional wave equation show clearly that the specimen has low inertial stresses in comparison to the applied transmitted force. Though the accelerations of the specimen continue for up to 50 μs, measurements show that the specimen is essentially in force equilibrium beginning a few microseconds after initial loading. These local measurements contrast with predictions based on comparison of the wave-based incident force measurements, which suggest that equilibrium occurs much later, on the order of 40-50 μs.
Clinical and Molecular Consequences of NF1 Microdeletion
2006-05-01
service based on meta-PCR/sequencing, dosage analysis, and loss of heterozygosity analysis. Genet Test 2004;8(4):368-80. 51. Kluwe L, Mautner VF. Mosaicism...neurofibromin in normal centrosome function and in maintaining genome stability. Our detailed analysis of human and chimpanzee genome sequences was...chromosomes and DNA fibers (1). Tandem duplication of the region would have significant impact on many aspects of NF1 research, e.g., mutational analysis
A Method for Populating the Knowledge Base of AFIT’s Domain-Oriented Application Composition System
1993-12-01
Analysis (FODA). The approach identifies prominent features (similarities) and distinctive features (differences) of software systems within an...analysis approaches we have summarized, the researchers described FODA in sufficient detail to use on large domain analysis projects (ones with...Software Technology Center, July 1991. 18. Kang, Kyo C. and others. Feature-Oriented Domain Analysis (FODA) Feasibility Study. Technical Report, Software
Shuttle Electrical Power Analysis Program (SEPAP); single string circuit analysis report
NASA Technical Reports Server (NTRS)
Murdock, C. R.
1974-01-01
An evaluation is reported of the data obtained from an analysis of the distribution network characteristics of the shuttle during a spacelab mission. A description of the approach utilized in the development of the computer program and data base is provided and conclusions are drawn from the analysis of the data. Data sheets are provided for information to support the detailed discussion on each computer run.
Detailed analysis of an optimized FPP-based 3D imaging system
NASA Astrophysics Data System (ADS)
Tran, Dat; Thai, Anh; Duong, Kiet; Nguyen, Thanh; Nehmetallah, Georges
2016-05-01
In this paper, we present a detailed analysis and a step-by-step implementation of an optimized fringe projection profilometry (FPP) based 3D shape measurement system. First, we propose a multi-frequency and multi-phase-shifting sinusoidal fringe pattern reconstruction approach to increase the accuracy and sensitivity of the system. Second, phase error compensation caused by the nonlinear transfer function of the projector and camera is performed through polynomial approximation. Third, phase unwrapping is performed using spatial and temporal techniques, and the tradeoff between processing speed and high accuracy is discussed in detail. Fourth, generalized camera and system calibration are developed for phase-to-real-world coordinate transformation. The calibration coefficients are estimated accurately using a reference plane and several gauge blocks with precisely known heights, employing a nonlinear least squares fitting method. Fifth, a texture is attached to the height profile by registering a 2D photograph to the 3D height map. The last step is to perform 3D image fusion and registration using an iterative closest point (ICP) algorithm for a full field of view reconstruction. The system is experimentally constructed using compact, portable, and low-cost off-the-shelf components. A MATLAB® based GUI is developed to control and synchronize the whole system.
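The multi-phase-shifting reconstruction step named above can be sketched with the standard N-step phase-shifting formula. This is an illustrative sketch only: the function name and the synthetic fringe model I_n = A + B·cos(φ + 2πn/N) are assumptions for demonstration, not the paper's code, and the paper's full pipeline adds multi-frequency unwrapping and nonlinear phase error compensation on top.

```python
import numpy as np

def phase_from_shifts(images):
    """Recover the wrapped phase from N equally spaced phase-shifted fringes.

    Assumes the fringe model I_n = A + B*cos(phi + 2*pi*n/N) for n = 0..N-1.
    Returns the wrapped phase in (-pi, pi].
    """
    imgs = np.asarray(images, dtype=float)
    n = np.arange(len(imgs)).reshape(-1, *([1] * (imgs.ndim - 1)))
    shifts = 2 * np.pi * n / len(imgs)
    num = np.sum(imgs * np.sin(shifts), axis=0)
    den = np.sum(imgs * np.cos(shifts), axis=0)
    # for N >= 3 the sums reduce to -(N/2)*B*sin(phi) and (N/2)*B*cos(phi)
    return -np.arctan2(num, den)
```

With four synthetic fringes of a known phase map, the formula recovers the phase exactly up to floating-point error, as long as the true phase stays inside the wrapping interval.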
RootGraph: a graphic optimization tool for automated image analysis of plant roots
Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N.; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J.
2015-01-01
This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions. PMID:26224880
Analyzing Visibility Configurations.
Dachsbacher, C
2011-04-01
Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the thus extracted feature vectors. Our method allows perceptually motivated level of detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
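The paper adapts co-occurrence matrices from texture analysis to visibility data. As a hedged sketch of the underlying construct only, the toy function below computes a co-occurrence matrix over a 2-D grid of binary visibility labels (0 = occluded, 1 = visible) at a fixed offset; the paper itself operates on clusters of triangular surfaces rather than a pixel grid, and all names here are illustrative.

```python
import numpy as np

def cooccurrence(labels, offset=(0, 1), levels=2):
    """Co-occurrence matrix of label pairs at a fixed spatial offset.

    labels: 2-D integer array of visibility labels.
    Returns a (levels x levels) count matrix m, where m[p, q] counts
    positions whose label is p and whose offset neighbour's label is q.
    """
    dy, dx = offset
    h, w = labels.shape
    a = labels[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
    b = labels[max(0, dy):, max(0, dx):][:a.shape[0], :a.shape[1]]
    m = np.zeros((levels, levels), dtype=int)
    np.add.at(m, (a.ravel(), b.ravel()), 1)  # unbuffered indexed accumulation
    return m
```

Rows of the normalized matrix then serve as feature vectors describing the structure (scattered vs. contiguous occlusion) rather than just the fraction visible.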
Cnn Based Retinal Image Upscaling Using Zero Component Analysis
NASA Astrophysics Data System (ADS)
Nasonov, A.; Chesnakov, K.; Krylov, A.
2017-05-01
The aim of the paper is to obtain high-quality image upscaling for the noisy images that are typical in medical image processing. A new training scenario for a convolutional neural network based image upscaling method is proposed. Its main idea is a novel dataset preparation method for deep learning. The dataset contains pairs of noisy low-resolution images and corresponding noiseless high-resolution images. To achieve better results at edges and textured areas, Zero Component Analysis is applied to these images. The upscaling results are compared with other state-of-the-art methods like DCCI, SI-3 and SRCNN on noisy medical ophthalmological images. Objective evaluation of the results confirms the high quality of the proposed method. Visual analysis shows that fine details and structures like blood vessels are preserved, the noise level is reduced, and no artifacts or non-existing details are added. These properties are essential in establishing a retinal diagnosis, so the proposed algorithm is recommended for use in real medical applications.
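Zero Component Analysis (ZCA) whitening, the preprocessing step named above, can be sketched as follows. This is a generic ZCA implementation over row-vector samples, assumed here purely for illustration; the abstract does not specify the paper's exact patch extraction or regularization.

```python
import numpy as np

def zca_whiten(X, eps=1e-5):
    """ZCA-whiten the rows of X (samples x features).

    Rotates into the eigenbasis of the covariance, rescales each direction
    by 1/sqrt(eigenvalue + eps), and rotates back, so the output stays in
    the original coordinate system (unlike plain PCA whitening).
    """
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / Xc.shape[0]
    U, S, _ = np.linalg.svd(cov)  # cov is symmetric positive semi-definite
    W = U @ np.diag(1.0 / np.sqrt(S + eps)) @ U.T
    return Xc @ W
```

After whitening, the sample covariance is approximately the identity, which equalizes the contribution of edges and flat regions during training.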
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Trefny, C. J.; Steffen, C. J., Jr.
1999-01-01
Design and analysis of the inlet for a rocket based combined cycle engine are discussed. Computational fluid dynamics was used in both the design and the subsequent analysis. Reynolds-averaged Navier-Stokes simulations were performed using both perfect gas and real gas assumptions. An inlet design that operates over the required Mach number range from 0 to 12 was produced. Performance data for cycle analysis were post-processed using a stream thrust averaging technique. A detailed performance database for cycle analysis is presented. The effect of vehicle forebody compression on air capture is also examined.
ERIC Educational Resources Information Center
Pu, Rongsun
2010-01-01
This article describes how to use protein extraction, quantification, and analysis in the undergraduate teaching laboratory to engage students in inquiry-based, discovery-driven learning. Detailed instructions for obtaining proteins from animal tissues, using BCA assay to quantify the proteins, and data analysis are provided. The experimental…
Pros and cons of practice-owned and office-based ambulatory surgery centers.
Bert, J M
2000-01-01
A detailed feasibility analysis is imperative to ensure the success of a practice-owned ASC. Analysis of the payer mix and the market relating to surgical volume that can be performed at the ASC is imperative. If overbuilding, overequipping, and overstaffing are avoided and the group has adequate volume that can be managed at the ASC, the facility should be a success. Building a practice-owned ASC without an accurate and detailed financial feasibility and payer study can place the endeavor at risk. A well-planned, economically constructed and properly managed ASC will result in an efficient and successful ancillary service for the orthopedic group practice.
The NHS Redress Act 2006 (UK): background and analysis.
Munro, Howard
2009-08-01
The NHS Redress Act 2006 (UK) is an example of a legislated compensation scheme for adverse health care incidents that aims to supplement the tort-based system of compensation, without going all the way to adopting a no-fault compensation system. It proposes an administrative method of providing speedier and more efficient and responsive remedies to adverse health care incidents than traditional legal proceedings. This article examines the detail of the United Kingdom policy arguments both prior to and since the passage of the legislation, as well as providing a detailed analysis of the original Bill, the parliamentary debates and the subsequent Act.
Urban Planning and Management Information Systems Analysis and Design Based on GIS
NASA Astrophysics Data System (ADS)
Xin, Wang
Based on an analysis of the shortcomings of existing relevant systems, and after detailed investigation and research, the urban planning and management information system is designed as a three-tier structure using a C/S (client/server) architecture over a LAN. The system's functions are designed in accordance with the requirements of the architecture, along with the functional relationships between the modules. The relevant interfaces are analyzed and designed, and a data storage solution is proposed. The design provides a viable building program for planning information systems in small and medium-sized cities.
Tamer, Ömer; Avcı, Davut; Atalay, Yusuf
2014-01-03
The molecular modeling of N-(2-hydroxybenzylidene)acetohydrazide (HBAH) was carried out using the B3LYP, CAM-B3LYP and PBE1PBE levels of density functional theory (DFT). The molecular structure of HBAH was characterized by means of IR, NMR and UV-vis spectroscopies. In order to find the stable conformers, conformational analysis was performed at the B3LYP level. A detailed vibrational analysis was made on the basis of the potential energy distribution (PED). HOMO and LUMO energies were calculated, and the obtained energies showed that charge transfer occurs in HBAH. NLO analysis indicated that HBAH can be used as an effective NLO material. NBO analysis also showed that charge transfer, conjugative interactions and intramolecular hydrogen bonding interactions occur in HBAH. Additionally, the major contributions from molecular orbitals to the electronic transitions were investigated theoretically. Copyright © 2013 Elsevier B.V. All rights reserved.
Measurement and control of detailed electronic properties in a single molecule break junction.
Wang, Kun; Hamill, Joseph; Zhou, Jianfeng; Guo, Cunlan; Xu, Bingqian
2014-01-01
The lack of detailed experimental controls has been one of the major obstacles hindering progress in molecular electronics. While large fluctuations occur in the experimental data, specific details, related mechanisms, and data analysis techniques are in high demand to promote our physical understanding at the single-molecule level. A series of modulations we recently developed, based on traditional scanning probe microscopy break junctions (SPMBJs), has helped to discover significant properties hidden in the contact interfaces of a single-molecule break junction (SMBJ). For example, we have previously shown that the correlated force and conductance changes under the sawtooth modulation and stretch-hold mode of PZT movement revealed inherent differences in the contact geometries of a molecular junction. In this paper, using a bias-modulated SPMBJ and emerging data analysis techniques, we report on the measurement of the altered alignment of the HOMO of benzene molecules as the anchoring group coupling the molecule to the metal electrodes is changed. Further calculations based on Landauer fitting and transition voltage spectroscopy (TVS) demonstrated the effects of the modulated bias on the location of the frontier molecular orbitals. Understanding the alignment of the molecular orbitals with the Fermi level of the electrodes is essential for understanding the behaviour of SMBJs and for the future design of more complex devices. With these modulations and analysis techniques, fruitful information has been found about the nature of the metal-molecule junction, providing insightful clues for the next step of in-depth study.
78 FR 66929 - Intent To Conduct a Detailed Economic Impact Analysis
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-07
... EXPORT-IMPORT BANK Intent To Conduct a Detailed Economic Impact Analysis AGENCY: Policy and... Federal Register notice informing the public of its intent to conduct a detailed economic impact analysis... subject to a detailed economic impact analysis. DATES: The Federal Register notice published on August 5...
Preliminary Cost Model for Space Telescopes
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Prince, F. Andrew; Smart, Christian; Stephens, Kyle; Henrichs, Todd
2009-01-01
Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. However, great care is required. Some space telescope cost models, such as those based only on mass, lack sufficient detail to support such analysis and may lead to inaccurate conclusions. Similarly, using ground-based telescope models that include the dome cost will also lead to inaccurate conclusions. This paper reviews current and historical models. Then, based on data from 22 different NASA space telescopes, this paper tests those models and presents a preliminary analysis of single-variable and multi-variable space telescope cost models.
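A single-variable, mass-based model of the kind the paper tests is typically a power law fitted in log-log space. The sketch below is illustrative only: the power-law form and all numbers are assumptions for demonstration, not the paper's fitted NASA model.

```python
import numpy as np

def fit_power_law(mass, cost):
    """Fit cost = a * mass**b by linear least squares in log-log space.

    Taking logs turns the power law into log(cost) = b*log(mass) + log(a),
    a straight line that np.polyfit handles directly.
    """
    b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)
    return np.exp(log_a), b
```

On exact synthetic data the fit recovers the generating coefficients; on real telescope data the residual scatter is precisely the "insufficient detail" the paper warns about.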
NASA Technical Reports Server (NTRS)
Hu, Shaowen; Cucinotta, Francis A.
2009-01-01
The Ku70/80 heterodimer is the first repair protein in the initial binding of double-strand break (DSB) ends following DNA damage, and is a component of nonhomologous end joining repair, the primary pathway for DSB repair in mammalian cells. In this study we constructed a full-length human Ku70 structure based on its crystal structure, and performed 20 ns conventional molecular dynamics (CMD) simulations on this protein and several other complexes with short DNA duplexes of different sequences. The trajectories of these simulations indicated that, without the topological support of Ku80, the residues in the bridge and C-terminal arm of Ku70 are more flexible than the other experimentally identified domains. We studied the two loops missing from the crystal structure and predicted that they are also very flexible. Simulations revealed that they make an important contribution to the interaction of Ku70 with DNA. Dislocation of the previously studied SAP domain was observed in several systems, implying its role in DNA binding. A targeted molecular dynamics (TMD) simulation was also performed for one system with a distant 14-bp DNA duplex. The TMD trajectory and energetic analysis disclosed detailed interactions of the DNA-binding residues during the DNA dislocation, and revealed a possible conformational transition for a DSB end when encountering Ku70 in solution. Compared to experimentally based analysis, this study identified more detailed interactions between DNA and Ku70. Free energy analysis indicated that Ku70 alone is able to bind DNA with relatively high affinity, with consistent contributions from various domains of Ku70 in different systems. The functional implications of these domains in the processes of Ku heterodimerization and DNA damage recognition and repair can be characterized in detail based upon this analysis.
Re'class'ification of 'quant'ified classical simulated annealing
NASA Astrophysics Data System (ADS)
Tanaka, Toshiyuki
2009-12-01
We discuss a classical reinterpretation, based on the quantum-classical correspondence, of the quantum-mechanics-based analysis of classical Markov chains with detailed balance. The classical reinterpretation is then used to demonstrate that it successfully reproduces a sufficient condition on the cooling schedule in classical simulated annealing, namely the inverse-logarithmic scaling.
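The inverse-logarithmic cooling schedule that the analysis reproduces has the familiar form T_k = c / log(k + k0), where c must exceed a problem-dependent constant for the sufficient condition to hold. A minimal Metropolis-style sketch, with illustrative names and a toy acceptance rule (not the paper's formalism), is:

```python
import math
import random

def log_schedule(c, k):
    """Inverse-logarithmic cooling: T_k = c / log(k + 2)."""
    return c / math.log(k + 2)

def anneal(energy, neighbors, x0, c=1.0, steps=1000, seed=0):
    """Metropolis simulated annealing driven by the schedule above."""
    rng = random.Random(seed)
    x = x0
    for k in range(steps):
        t = log_schedule(c, k)
        y = rng.choice(neighbors(x))
        de = energy(y) - energy(x)
        # always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-de / T_k)
        if de <= 0 or rng.random() < math.exp(-de / t):
            x = y
    return x
```

The slow logarithmic decay is exactly what keeps the chain able to escape local minima at all times, which is the content of the sufficient condition.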
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravindra, M.K.; Banon, H.
1992-07-01
In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.
Mergers of Non-spinning Black-hole Binaries: Gravitational Radiation Characteristics
NASA Technical Reports Server (NTRS)
Baker, John G.; Boggs, William D.; Centrella, Joan; Kelly, Bernard J.; McWilliams, Sean T.; vanMeter, James R.
2008-01-01
We present a detailed descriptive analysis of the gravitational radiation from black-hole binary mergers of non-spinning black holes, based on numerical simulations of systems varying from equal-mass to a 6:1 mass ratio. Our primary goal is to present relatively complete information about the waveforms, including all the leading multipolar components, to interested researchers. In our analysis, we pursue the simplest physical description of the dominant features in the radiation, providing an interpretation of the waveforms in terms of an implicit rotating source. This interpretation applies uniformly to the full wavetrain, from inspiral through ringdown. We emphasize strong relationships among the l = m modes that persist through the full wavetrain. Exploring the structure of the waveforms in more detail, we conduct detailed analytic fitting of the late-time frequency evolution, identifying a key quantitative feature shared by the l = m modes among all mass-ratios. We identify relationships, with a simple interpretation in terms of the implicit rotating source, among the evolution of frequency and amplitude, which hold for the late-time radiation. These detailed relationships provide sufficient information about the late-time radiation to yield a predictive model for the late-time waveforms, an alternative to the common practice of modeling by a sum of quasinormal mode overtones. We demonstrate an application of this in a new effective-one-body-based analytic waveform model.
Hirshhorn, Marnie; Grady, Cheryl; Rosenbaum, R Shayna; Winocur, Gordon; Moscovitch, Morris
2012-11-01
Functional magnetic resonance imaging (fMRI) was used to compare brain activity during the retrieval of coarse- and fine-grained spatial details and episodic details associated with a familiar environment. Long-time Toronto residents compared pairs of landmarks based on their absolute geographic locations (requiring either coarse or fine discriminations) or based on previous visits to those landmarks (requiring episodic details). An ROI analysis of the hippocampus showed that all three conditions activated the hippocampus bilaterally. Fine-grained spatial judgments recruited an additional region of the right posterior hippocampus, while episodic judgments recruited an additional region of the right anterior hippocampus, and a more extensive region along the length of the left hippocampus. To examine whole-brain patterns of activity, Partial Least Squares (PLS) analysis was used to identify sets of brain regions whose activity covaried with the three conditions. All three comparison judgments recruited the default mode network including the posterior cingulate/retrosplenial cortex, middle frontal gyrus, hippocampus, and precuneus. Fine-grained spatial judgments also recruited additional regions of the precuneus, parahippocampal cortex and the supramarginal gyrus. Episodic judgments recruited the posterior cingulate and medial frontal lobes as well as the angular gyrus. These results are discussed in terms of their implications for theories of hippocampal function and spatial and episodic memory. Copyright © 2012 Elsevier Ltd. All rights reserved.
Mergers of nonspinning black-hole binaries: Gravitational radiation characteristics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, John G.; Centrella, Joan; Kelly, Bernard J.
2008-08-15
We present a detailed descriptive analysis of the gravitational radiation from black-hole binary mergers of nonspinning black holes, based on numerical simulations of systems varying from equal mass to a 6:1 mass ratio. Our primary goal is to present relatively complete information about the waveforms, including all the leading multipolar components, to interested researchers. In our analysis, we pursue the simplest physical description of the dominant features in the radiation, providing an interpretation of the waveforms in terms of an implicit rotating source. This interpretation applies uniformly to the full wave train, from inspiral through ringdown. We emphasize strong relationships among the l=m modes that persist through the full wave train. Exploring the structure of the waveforms in more detail, we conduct detailed analytic fitting of the late-time frequency evolution, identifying a key quantitative feature shared by the l=m modes among all mass ratios. We identify relationships, with a simple interpretation in terms of the implicit rotating source, among the evolution of frequency and amplitude, which hold for the late-time radiation. These detailed relationships provide sufficient information about the late-time radiation to yield a predictive model for the late-time waveforms, an alternative to the common practice of modeling by a sum of quasinormal mode overtones. We demonstrate an application of this in a new effective-one-body-based analytic waveform model.
NASA Technical Reports Server (NTRS)
Walker, Carrie K.
1991-01-01
A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.
Exploitation of ERTS-1 imagery utilizing snow enhancement techniques
NASA Technical Reports Server (NTRS)
Wobber, F. J.; Martin, K. R.
1973-01-01
Photogeological analysis of ERTS-simulation and ERTS-1 imagery of snow-covered terrain within the ERAP Feather River site and within the New England (ERTS) test area provided new fracture detail which does not appear on available geological maps. Comparative analysis of snow-free ERTS-1 images has demonstrated that MSS Bands 5 and 7 supply the greatest amount of geological fracture detail. Interpretation of the first snow-covered ERTS-1 images in correlation with ground snow depth data indicates that a heavy blanket of snow (more than 9 inches) accentuates major structural features while a light "dusting" (less than 1 inch) accentuates more subtle topographic expressions. An effective mail-based method for acquiring timely ground-truth (snow-depth) information was established and provides a ready correlation of fracture detail with snow depth so as to establish the working limits of the technique. The method is both efficient and inexpensive compared with the cost of similarly scaled direct field observations.
A detail enhancement and dynamic range adjustment algorithm for high dynamic range images
NASA Astrophysics Data System (ADS)
Xu, Bo; Wang, Huachuang; Liang, Mingtao; Yu, Cong; Hu, Jinlong; Cheng, Hua
2014-08-01
Although high dynamic range (HDR) images contain large amounts of information, they have weak texture and low contrast. Moreover, these images are difficult to reproduce on low dynamic range display media. If more information is to be extracted when these images are displayed on PCs, specific transforms are needed, such as compressing the dynamic range, enhancing portions with little difference in original contrast, and highlighting texture details while preserving the parts of large contrast. To this end, a multi-scale guided filter enhancement algorithm, derived from the single-scale guided filter through analysis of a non-physical model, is proposed in this paper. First, this algorithm decomposes the original HDR images into a base image and detail images of different scales, and then it adaptively selects a transform function that acts on the enhanced detail images and the original images. Comparing the results on HDR images and low dynamic range (LDR) images of different scene features shows that this algorithm, while maintaining the hierarchy and texture details of images, not only improves the contrast and enhances the details of images, but also adjusts the dynamic range well. Thus, it is well suited for human observation or analytical processing by machines.
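The decompose-boost-recombine structure described above can be sketched as follows. Note the hedges: a plain Gaussian blur stands in for the paper's guided filter, and fixed per-scale gains stand in for its adaptively selected transform function; only the multi-scale base/detail architecture is illustrated.

```python
import numpy as np

def gauss_blur(im, sigma):
    """Separable Gaussian blur with edge padding (no SciPy dependency)."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    k /= k.sum()
    pad = np.pad(im, ((r, r), (0, 0)), mode="edge")
    im = np.array([np.convolve(pad[:, j], k, mode="valid") for j in range(im.shape[1])]).T
    pad = np.pad(im, ((0, 0), (r, r)), mode="edge")
    return np.array([np.convolve(pad[i, :], k, mode="valid") for i in range(im.shape[0])])

def enhance_details(img, sigmas=(1, 4), gains=(1.5, 1.2)):
    """Decompose into a base layer plus per-scale detail layers, then
    recombine with per-scale gains: out = base + sum(gain_i * detail_i)."""
    base = np.asarray(img, dtype=float)
    details = []
    for sigma in sigmas:
        blurred = gauss_blur(base, sigma)
        details.append(base - blurred)
        base = blurred
    return base + sum(g * d for g, d in zip(gains, details))
```

With all gains set to 1 the decomposition is a perfect reconstruction of the input; gains above 1 boost fine and mid-scale texture while leaving the low-frequency base (and hence overall dynamic range handling) to a separate compression step.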
Seismic risk analysis for the Babcock and Wilcox facility, Leechburg, Pennsylvania
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1977-10-21
The results of a detailed seismic risk analysis of the Babcock and Wilcox Plutonium Fuel Fabrication facility at Leechburg, Pennsylvania are presented. This report focuses on earthquakes; the other natural hazards, being addressed in separate reports, are severe weather (strong winds and tornados) and floods. The calculational method used is based on Cornell's work (1968); it has been previously applied to safety evaluations of major projects. The historical seismic record was established after a review of available literature, consultation with operators of local seismic arrays and examination of appropriate seismic data bases. Because of the aseismicity of the region around the site, an analysis different from the conventional closest approach in a tectonic province was adopted. Earthquakes as far from the site as 1,000 km were included, as was the possibility of earthquakes at the site. In addition, various uncertainties in the input were explicitly considered in the analysis. The results of the risk analysis, which include a Bayesian estimate of the uncertainties, are presented, expressed as return period accelerations. The best estimate curve indicates that the Babcock and Wilcox facility will experience 0.05 g every 220 years and 0.10 g every 1400 years. The bounding curves roughly represent the one standard deviation confidence limits about the best estimate, reflecting the uncertainty in certain of the input. Detailed examination of the results shows that the accelerations are very insensitive to the details of the source region geometries or the historical earthquake statistics in each region and that each of the source regions contributes almost equally to the cumulative risk at the site. If required for structural analysis, acceleration response spectra for the site can be constructed by scaling the mean response spectrum for alluvium in WASH 1255 by these peak accelerations.
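Return-period results like those quoted (0.05 g every 220 years, 0.10 g every 1400 years) are often summarized by interpolating the hazard curve between anchor points. The sketch below assumes a log-linear form between the report's two best-estimate points purely for illustration; it is not the report's Cornell-method calculation, and the interpolated values in between are not the report's.

```python
import math

def return_period(a, pts=((0.05, 220.0), (0.10, 1400.0))):
    """Return period (years) at peak acceleration a (in g), interpolated
    log-linearly in acceleration between two (acceleration, period) anchors.

    Anchor values are the best-estimate points quoted in the report; the
    log-linear form between them is an assumption for illustration.
    """
    (a1, t1), (a2, t2) = pts
    slope = (math.log(t2) - math.log(t1)) / (a2 - a1)
    return math.exp(math.log(t1) + slope * (a - a1))
```

By construction the curve passes through both anchors, and any intermediate acceleration maps to a return period between them.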
Hydrodynamic design of generic pump components
NASA Technical Reports Server (NTRS)
Eastland, A. H. J.; Dodson, H. C.
1991-01-01
Inducer and impeller base geometries were defined for a fuel pump for a generic generator cycle. Blade surface data and inlet flowfield definition are available in sufficient detail to allow computational fluid dynamic analysis of the two components.
Guided SAR image despeckling with probabilistic non local weights
NASA Astrophysics Data System (ADS)
Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny
2017-12-01
SAR images are generally corrupted by granular disturbances called speckle, which make visual analysis and detail extraction difficult. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non-Local Weights) replaces the heuristic parametric constants in the GGF-BNLM method with values derived dynamically from the image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, achieve significant improvement in performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.
Self-organizing maps: a versatile tool for the automatic analysis of untargeted imaging datasets.
Franceschi, Pietro; Wehrens, Ron
2014-04-01
MS-based imaging approaches allow for location-specific identification of chemical components in biological samples, opening up possibilities of much more detailed understanding of biological processes and mechanisms. Data analysis, however, is challenging, mainly because of the sheer size of such datasets. This article presents a novel approach based on self-organizing maps, extending previous work in order to be able to handle the large number of variables present in high-resolution mass spectra. The key idea is to generate prototype images, representing spatial distributions of ions, rather than prototypical mass spectra. This allows for a two-stage approach, first generating typical spatial distributions and associated m/z bins, and later analyzing the interesting bins in more detail using accurate masses. The possibilities and advantages of the new approach are illustrated on an in-house dataset of apple slices. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
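A minimal self-organizing map in the spirit described above, with units whose prototypes are flattened spatial ion images rather than mass spectra, might look like this. It is an illustrative sketch only; the grid size, learning schedule, and data layout are assumptions:

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=20, seed=0):
    """Minimal SOM: each row of `data` is a flattened spatial ion image
    (one per m/z bin); each unit's prototype becomes a typical image."""
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    protos = rng.normal(size=(n_units, data.shape[1]))
    coords = np.array([(i, j) for i in range(grid[0])
                       for j in range(grid[1])], dtype=float)
    for t in range(epochs):
        lr = 0.5 * (1.0 - t / epochs)                     # decaying learning rate
        sigma = max(0.5, (grid[0] / 2.0) * (1.0 - t / epochs))  # shrinking radius
        for x in data:
            bmu = int(np.argmin(((protos - x) ** 2).sum(axis=1)))
            h = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1)
                       / (2.0 * sigma ** 2))              # neighbourhood function
            protos += lr * h[:, None] * (x - protos)
    return protos

def assign(data, protos):
    """Map each m/z-bin image to its best-matching unit."""
    return np.argmin(((data[:, None, :] - protos[None, :, :]) ** 2)
                     .sum(axis=2), axis=1)
```

Bins assigned to the same unit share a similar spatial distribution, and their accurate masses can then be inspected in a second stage, mirroring the two-stage approach in the abstract.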
NASA Technical Reports Server (NTRS)
Williams, D. L.; Walthall, C. L.; Goward, S. N.
1984-01-01
An important part of fundamental remote sensing research is based on the measurement and analysis of spectral reflectance from earth surface materials in situ. It has been found that for an effective analysis of the target of interest, different applications of remotely sensed data require spectral measurements from different portions of the electromagnetic spectrum. It is pointed out that the detailed spectral reflectance characteristics of forest vegetation are currently not well understood, particularly in the middle infrared wavelength region. Details regarding the need for in situ forest canopy measurements are examined, taking into account certain difficulties arising in the case of satellite observations. Because of these difficulties, the present paper provides a discussion of methodology and preliminary spectra based on an experiment to use a helicopter as an observing platform for in situ forest canopy spectra measurement.
ERIC Educational Resources Information Center
Finch, Holmes
2010-01-01
The accuracy of item parameter estimates in the multidimensional item response theory (MIRT) model context is one that has not been researched in great detail. This study examines the ability of two confirmatory factor analysis models specifically for dichotomous data to properly estimate item parameters using common formulae for converting factor…
ERIC Educational Resources Information Center
McCormick, Joe Lew
This study examined major stakeholders' perceptions of their involvement and role in the legislative process surrounding the introduction, deliberation, and ultimate passage of the Direct Loan Demonstration Program (DLDP), a federal pilot student loan program. Data analysis was based on a detailed description of the legislative process surrounding…
Using Toulmin Analysis to Analyse an Instructor's Proof Presentation in Abstract Algebra
ERIC Educational Resources Information Center
Fukawa-Connelly, Timothy
2014-01-01
This paper provides a method for analysing undergraduate teaching of proof-based courses using Toulmin's model (1969) of argumentation. It presents a case study of one instructor's presentation of proofs. The analysis shows that the instructor presents different levels of detail in different proofs; thus, the students have an inconsistent set of…
A Content Analysis Concerning the Studies on Challenges Faced by Novice Teachers
ERIC Educational Resources Information Center
Kozikoglu, Ishak
2017-01-01
The purpose of this research is to analyze the studies concerning challenges faced by novice teachers in terms of various aspects and compare challenges according to location of the studies conducted in Turkey and abroad. A total of 30 research studies were included in detailed analysis. This is a descriptive study based on qualitative research…
Teaching Prejudice: A Content Analysis of Social Studies Textbooks Authorized for Use in Ontario.
ERIC Educational Resources Information Center
McDiarmid, Garnet; Pratt, David
This report of a study, undertaken at the request of the Ontario Human Rights Commission, details: 1) precedents and historical backgrounds in textbook analysis; 2) the methodology of the present study; and, 3) recommendations based on the findings. Groups selected for study were: Jews, immigrants, Moslems, Negroes, and American Indians. The…
Yajima, Shuichi; Shimizu, Hisanori; Sakamaki, Hiroyuki; Ikeda, Shunya; Ikegami, Naoki; Murayama, Jun-Ichiro
2016-01-04
Various chemotherapy regimens for advanced colorectal cancer have been introduced to clinical practice in Japan over the past decade. The cost profiles of these regimens, however, remain unclear in Japan. To explore the detailed costs of different regimens used to treat advanced colorectal cancer during the entire course of chemotherapy in patients treated in a practical setting, we conducted a so-called "real-world" cost analysis. A detailed cost analysis was performed retrospectively. Patients with advanced colorectal cancer who had received chemotherapy in a practical healthcare setting from July 2004 through October 2010 were extracted from the ordering system database of Showa University Hospital. Direct medical costs of chemotherapy regimens were calculated from the hospital billing data of the patients. The analysis was conducted from a payer's perspective. A total of 30 patients with advanced colorectal cancer were identified. Twenty patients received up to second-line treatment, and 8 received up to third-line treatment. The regimens identified from among all courses of treatment in all patients were 13 oxaliplatin-based regimens, 31 irinotecan-based regimens, and 11 regimens including molecular targeted agents. The average (95% confidence interval [95% CI]) monthly cost during the overall period from the beginning of treatment to the end of treatment was 308,363 (258,792 to 357,933) Japanese yen (JPY). According to the type of regimen, the average monthly cost was 418,463 (357,413 to 479,513) JPY for oxaliplatin-based regimens, 215,499 (188,359 to 242,639) JPY for irinotecan-based regimens, and 705,460 (586,733 to 824,187) JPY for regimens including molecular targeted agents. Anticancer drug costs and hospital fees accounted for 50 to 77% and 11 to 25% of the overall costs of chemotherapy, respectively. The costs of irinotecan-based regimens were lower than those of oxaliplatin-based regimens and regimens including molecular targeted agents in Japan. 
Using a lower cost regimen for first-line treatment can potentially reduce the overall cost of chemotherapy. The main cost drivers were the anticancer drug costs and hospitalization costs.
DataHub: Science data management in support of interactive exploratory analysis
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Rubin, Mark R.
1993-01-01
The DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.
Binary partition tree analysis based on region evolution and its application to tree simplification.
Lu, Huihai; Woods, John C; Ghanbari, Mohammed
2007-04-01
Pyramid image representations via tree structures are recognized methods for region-based image analysis. Binary partition trees can be applied which document the merging process, with small details found at the bottom levels and larger ones close to the root. Hindsight of the merging process is stored within the tree structure and provides the change histories of an image property from the leaf to the root node. In this work, the change histories are modelled by evolvement functions and their second order statistics are analyzed using a knee function. Knee values show the reluctance of each merge. We have systematically formulated these findings to provide a novel framework for binary partition tree analysis, where tree simplification is demonstrated. Based on an evolvement function, for each upward path in a tree, the tree node associated with the first reluctant merge is considered as a pruning candidate. The result is a simplified version providing a reduced solution space and still complying with the definition of a binary tree. The experiments show that image details are preserved whilst the number of nodes is dramatically reduced. The method also yields an image filtering tool that preserves object boundaries and has applications in segmentation.
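The "first reluctant merge" idea, scanning the change history (evolvement function) along a leaf-to-root path and flagging the first merge whose second-order difference stands out, can be sketched as below. The z-score threshold is an assumed stand-in for the paper's knee function:

```python
import numpy as np

def first_reluctant_merge(history, z=1.0):
    """Given the change history of an image property along a leaf-to-root
    path, flag the first position where the second-order difference
    (knee value) stands out from the rest by more than z standard
    deviations. Returns None if no merge is reluctant."""
    h = np.asarray(history, dtype=float)
    knee = np.abs(np.diff(h, n=2))        # second-order statistics
    if knee.size == 0 or knee.std() == 0:
        return None
    scores = (knee - knee.mean()) / knee.std()
    idx = np.flatnonzero(scores > z)
    return int(idx[0]) + 1 if idx.size else None
```

The node at the returned position, where the property change suddenly accelerates, would be the pruning candidate for that upward path.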
Capturing Fine Details Involving Low-Cost Sensors -a Comparative Study
NASA Astrophysics Data System (ADS)
Rehany, N.; Barsi, A.; Lovas, T.
2017-11-01
Capturing the fine details on the surface of small objects is a real challenge for many conventional surveying methods. Our paper discusses the investigation of several data acquisition technologies, such as arm scanner, structured light scanner, terrestrial laser scanner, object line-scanner, DSLR camera, and mobile phone camera. A palm-sized embossed sculpture reproduction was used as a test object; it was surveyed with all the instruments. The resulting point clouds and meshes were then analyzed, using the arm scanner's dataset as reference. In addition to general statistics, the results have been evaluated based both on 3D deviation maps and 2D deviation graphs; the latter allow even more accurate analysis of the characteristics of the different data acquisition approaches. Additionally, custom local-minimum maps were created that clearly visualize the potential level of detail provided by the applied technologies. Besides the usual geometric assessment, the paper discusses the different resource needs (cost, time, expertise) of the discussed techniques. Our results proved that even amateur sensors operated by amateur users can provide high quality datasets that enable engineering analysis. Based on the results, the paper contains an outlook to potential future investigations in this field.
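The cloud-to-reference comparison behind such deviation maps reduces to nearest-neighbour distances between point clouds. The helper below is a hypothetical brute-force sketch (real pipelines use spatial indexing such as k-d trees for large clouds):

```python
import numpy as np

def cloud_deviation(test_pts, ref_pts):
    """For each test point, distance to its nearest reference point
    (brute force, O(n*m); fine for small clouds). Returns summary
    statistics of the kind plotted as a 3D deviation map."""
    d = np.linalg.norm(test_pts[:, None, :] - ref_pts[None, :, :], axis=2)
    nearest = d.min(axis=1)
    return {"mean": nearest.mean(),
            "max": nearest.max(),
            "rmse": np.sqrt((nearest ** 2).mean())}
```

Colouring each test point by its nearest-neighbour distance gives the deviation map; aggregating the distances gives the "general statistics" the abstract mentions.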
High Efficiency Large Area Polysilicon Solar Cells
NASA Technical Reports Server (NTRS)
Johnson, S. M.; Winter, C.
1985-01-01
Large area (100 sq cm) polysilicon solar cells having efficiencies of up to 14.1% (100 mW/sq cm, 25 C) were fabricated and a detailed analysis was performed to identify the efficiency loss mechanisms. The I-V characteristics of the best cell were dominated by recombination in the quasi-neutral base due to the combination of minority carrier diffusion length and base resistivity. An analysis of the microstructural defects present in the material and their effect on the electrical properties is presented.
ERIC Educational Resources Information Center
Stromsdorfer, Ernst W.; Moayed-Dadkhah, Kamran
Presenting a cost-benefit analysis of the Mountain-Plains Career Education Program (a family based program for the economically deprived in the mountain plains states operating out of Glasgow Air Force Base in Montana) and the methodological basis for a full and more detailed study, this evaluation includes: (1) discussion of theoretical issues…
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Terrain-analysis procedures for modeling radar backscatter
Schaber, Gerald G.; Pike, Richard J.; Berlin, Graydon Lennis
1978-01-01
The collection and analysis of detailed information on the surface of natural terrain are important aspects of radar-backscattering modeling. Radar is especially sensitive to surface-relief changes on the millimeter-to-decimeter scale for conventional K-band (~1-cm wavelength) to L-band (~25-cm wavelength) radar systems. Surface roughness statistics that characterize these changes in detail have been generated from sets of field measurements by a comprehensive set of seven programmed calculations for radar-backscatter modeling. The seven programs are: 1) formatting of data in readable form for the subsequent topographic analysis programs; 2) relief analysis; 3) power spectral analysis; 4) power spectrum plots; 5) slope angle between slope reversals; 6) slope angle against slope interval plots; and 7) base length slope angle and curvature. This complete Fortran IV software package, 'Terrain Analysis', is here presented for the first time. It was originally developed a decade ago for investigations of lunar morphology and surface trafficability for the Apollo Lunar Roving Vehicle.
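The kind of statistic produced by program 7 ("base length slope angle") can be illustrated for an evenly spaced terrain profile; the sketch below is a generic illustration, not the Fortran IV package itself:

```python
import numpy as np

def slope_angles(elev, spacing, base_length):
    """Slope angles (degrees) measured over a fixed base length along an
    evenly spaced elevation profile. `spacing` is the sample interval;
    `base_length` should be a multiple of it."""
    step = int(round(base_length / spacing))
    rise = elev[step:] - elev[:-step]          # elevation change per base length
    return np.degrees(np.arctan2(rise, float(base_length)))
```

Repeating the calculation over several base lengths characterises roughness at the different scales (millimeter to decimeter) that matter for K- and L-band backscatter.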
Expert systems in transmission planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galiana, F.D.; McGillis, D.T.; Marin, M.A.
1992-05-01
In this paper the state of the field of expert systems and knowledge engineering in transmission planning is reviewed. A detailed analysis of the goals, definition, requirements and methodology of transmission planning is presented. Potential benefits of knowledge-based applications in transmission planning are reviewed. This is followed by a thorough review of the area broken down into subareas or important related topics. The conclusions offer a number of suggestions for possible future research and development. Finally, a detailed bibliography divided into subareas is presented.
LED traffic signal replacement schedules : facilitating smooth freight flows.
DOT National Transportation Integrated Search
2011-11-01
This research details a field study of LED traffic signals in Missouri and develops a replacement schedule based on key findings. : Rates of degradation were statistically analyzed using Analysis of Variance (ANOVA). Results of this research will pro...
An analysis of the job of railroad train dispatcher.
DOT National Transportation Integrated Search
1974-04-01
This report constitutes a detailed study of the job of railroad train dispatcher, conducted to provide a data base for the derivation of criteria of job knowledge, skills and training consonant with safe operations. Documentation was reviewed; specia...
The need for conducting forensic analysis of decommissioned bridges.
DOT National Transportation Integrated Search
2014-01-01
A limiting factor in current bridge management programs is a lack of detailed knowledge of bridge deterioration : mechanisms and processes. The current state of the art is to predict future condition using statistical forecasting : models based upon ...
Geothermal industry assessment
NASA Astrophysics Data System (ADS)
1980-07-01
Focus is on industry structure, corporate activities and strategies, and detailed analysis of the technological, economic, financial, and institutional issues important to government policy formulation. The study is based principally on confidential interviews with executives of 75 companies active in the field.
Sizing and Lifecycle Cost Analysis of an Ares V Composite Interstage
NASA Technical Reports Server (NTRS)
Mann, Troy; Smeltzer, Stan; Grenoble, Ray; Mason, Brian; Rosario, Sev; Fairbairn, Bob
2012-01-01
The Interstage Element of the Ares V launch vehicle was sized using a commercially available structural sizing software tool. Two different concepts were considered, a metallic design and a composite design. Both concepts were sized using similar levels of analysis fidelity and included the influence of design details on each concept. Additionally, the impact of the different manufacturing techniques and failure mechanisms for composite and metallic construction were considered. Significant details were included in analysis models of each concept, including penetrations for human access, joint connections, as well as secondary loading effects. The designs and results of the analysis were used to determine lifecycle cost estimates for the two Interstage designs. Lifecycle cost estimates were based on industry provided cost data for similar launch vehicle components. The results indicated that significant mass as well as cost savings are attainable for the chosen composite concept as compared with a metallic option.
17 CFR 229.910 - (Item 910) Fairness of the transaction.
Code of Federal Regulations, 2010 CFR
2010-04-01
... reasonable detail the material factors upon which the belief stated in paragraph (a) of this Item (§ 229.910) is based and, to the extent practicable, the weight assigned to each such factor. Such discussion should include an analysis of the extent, if any, to which such belief is based on the factors set forth...
Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis
NASA Astrophysics Data System (ADS)
Fu, Pei-hua; Yin, Hong-bo
In this thesis, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First, we present the evaluation index system, which contains basic information, management level, technical strength, transport capacity, informatization level, market competition, and customer service. We determined the index weights according to the grades, and evaluated the integrated ability of the logistics enterprises using the fuzzy cluster analysis method. We describe the system evaluation module and the cluster analysis module in detail, including how the two modules were implemented. Finally, we give the results of the system.
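The fuzzy cluster analysis at the heart of such a model can be illustrated with plain fuzzy c-means. This is a generic sketch of the algorithm, not the thesis's implementation, and it does not reproduce the index weighting:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Standard fuzzy c-means. Returns cluster centres and the
    membership matrix U (one row per sample, one column per cluster;
    rows sum to 1). m > 1 controls fuzziness."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]       # weighted means
        d = np.linalg.norm(X[:, None, :] - centres[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))                      # membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centres, U
```

Each enterprise's row of U gives its graded membership in every cluster, which is what lets the model rank "integrated ability" rather than force hard assignments.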
Skull base bony lesions: Management nuances; a retrospective analysis from a Tertiary Care Centre
Singh, Amit Kumar; Srivastava, Arun Kumar; Sardhara, Jayesh; Bhaisora, Kamlesh Singh; Das, Kuntal Kanti; Mehrotra, Anant; Sahu, Rabi Narayan; Jaiswal, Awadhesh Kumar; Behari, Sanjay
2017-01-01
Background: Skull base lesions are not uncommon, but their management has been challenging for surgeons. There are a large number of bony tumors at the skull base that have not been studied in detail as a group. These tumors are difficult to manage not only because of their location but also because of their variable involvement of important local structures. Through this retrospective analysis from a Tertiary Care Centre, we summarize the details of skull base bony lesions and the nuances of their management. Materials and Methods: The histopathologically, radiologically, and surgically proven cases of skull base bony tumors, or lesions involving bone, were analyzed from the neurosurgery and neuropathology records of our Tertiary Care Institute from January 2009 to January 2014. All available preoperative and postoperative details were noted from the case files. The extent of excision was ascertained from operation records and postoperative magnetic resonance imaging, if available. Results: We surgically managed 41 cases of skull base bony tumors: 11 patients with anterior skull base, 13 with middle skull base, and 17 with posterior skull base bony tumors. The most common bony tumor was chordoma, with 15 cases (36.6%), followed by fibrous dysplasia, chondrosarcoma, and Ewing's sarcoma-peripheral primitive neuroectodermal tumor (EWS-pPNET), with five cases (12.2%) each. There were more malignant lesions (n = 29, 70.7%) at the skull base than benign ones (n = 12, 29.3%). The surgical approach employed depended on the location of the tumor and the pathology. Total mortality was 8 patients (20%), of whom 5 had histologically proven EWS-pPNET. Conclusions: Bony skull base lesions comprise a wide variety of pathologies and require multispecialty management. These complex lesions require tailored surgical approaches.
With the advent of microsurgical and endoscopic techniques and the use of navigation, better outcomes are being seen, but these lesions require further study to develop a proper management plan. PMID:28761532
Aerodynamic design and analysis of small horizontal axis wind turbine blades
NASA Astrophysics Data System (ADS)
Tang, Xinzi
This work investigates the aerodynamic design and analysis of small horizontal axis wind turbine blades via the blade element momentum (BEM) based approach and the computational fluid dynamics (CFD) based approach. From this research, it is possible to draw a series of detailed guidelines on small wind turbine blade design and analysis. The research also provides a platform for further comprehensive study using these two approaches. The wake induction corrections and stall corrections of the BEM method were examined through a case study of the NREL/NASA Phase VI wind turbine. A hybrid stall correction model was proposed to analyse wind turbine power performance. The proposed model shows improvement in power prediction for the validation case, compared with the existing stall correction models. The effects of the key rotor parameters of a small wind turbine, as well as the blade chord and twist angle distributions, on power performance were investigated through two typical wind turbines, i.e. a fixed-pitch variable-speed (FPVS) wind turbine and a fixed-pitch fixed-speed (FPFS) wind turbine. An engineering blade design and analysis code was developed in MATLAB to accommodate aerodynamic design and analysis of the blades. The linearisation of the radial profiles of blade chord and twist angle for the FPFS wind turbine blade design was discussed. Results show that the proposed linearisation approach leads to reduced manufacturing cost and higher annual energy production (AEP), with minimal effects on low wind speed performance. Comparative studies of mesh and turbulence models in 2D and 3D CFD modelling were conducted. The CFD-predicted lift and drag coefficients of the airfoil S809 were compared with wind tunnel test data, and the 3D CFD modelling method of the NREL/NASA Phase VI wind turbine was validated against measurements. Airfoil aerodynamic characterisation and wind turbine power performance, as well as 3D flow details, were studied.
The detailed flow characteristics from the CFD modelling are quantitatively comparable to the measurements, such as blade surface pressure distributions and integrated forces and moments. It is confirmed that the CFD approach is able to provide a more detailed qualitative and quantitative analysis for wind turbine airfoils and rotors.
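One sub-step of the BEM approach, iterating the axial induction factor at a blade section, can be sketched as follows. Tangential induction, drag, and tip loss are omitted, and the linear lift curve and all numeric values are illustrative assumptions, not values from the thesis:

```python
import math

def bem_axial_induction(lambda_r, sigma, cl_slope=6.28, pitch=0.0, iters=200):
    """Simplified BEM fixed-point loop for the axial induction factor a.
    lambda_r: local tip speed ratio; sigma: local solidity.
    Lift is modelled as cl = cl_slope * alpha (thin-airfoil slope)."""
    a = 0.0
    for _ in range(iters):
        phi = math.atan2(1.0 - a, lambda_r)            # inflow angle
        alpha = phi - pitch                            # angle of attack
        cn = cl_slope * alpha * math.cos(phi)          # normal force coefficient
        a_new = 1.0 / (4.0 * math.sin(phi) ** 2 / (sigma * cn) + 1.0)
        a = 0.5 * a + 0.5 * a_new                      # relaxation for stability
    return a
```

A full BEM code repeats this per blade section, adds the tangential induction factor and tip-loss correction, and integrates the resulting section forces into rotor thrust, torque, and power.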
Dynamic analysis of CO₂ labeling and cell respiration using membrane-inlet mass spectrometry.
Yang, Tae Hoon
2014-01-01
Here, we introduce a mass spectrometry-based analytical method and relevant technical details for dynamic cell respiration and CO2 labeling analysis. Such measurements can be utilized as additional information and constraints for model-based (13)C metabolic flux analysis. Dissolved dynamics of oxygen consumption and CO2 mass isotopomer evolution from (13)C-labeled tracer substrates through different cellular processes can be precisely measured on-line using a miniaturized reactor system equipped with a membrane-inlet mass spectrometer. The corresponding specific rates of physiologically relevant gases and CO2 mass isotopomers can be quantified within a short-term range based on the liquid-phase dynamics of dissolved fermentation gases.
NASA Astrophysics Data System (ADS)
Hsu, L.; Lehnert, K. A.; Walker, J. D.; Chan, C.; Ash, J.; Johansson, A. K.; Rivera, T. A.
2011-12-01
Sample-based measurements in geochemistry are highly diverse, due to the large variety of sample types, measured properties, and idiosyncratic analytical procedures. In order to ensure the utility of sample-based data for re-use in research or education they must be associated with a high quality and quantity of descriptive, discipline-specific metadata. Without an adequate level of documentation, it is not possible to reproduce scientific results or have confidence in using the data for new research inquiries. The required detail in data documentation makes it challenging to aggregate large sets of data from different investigators and disciplines. One solution to this challenge is to build data systems with several tiers of intricacy, where the less detailed tiers are geared toward discovery and interoperability, and the more detailed tiers have higher value for data analysis. The Geoinformatics for Geochemistry (GfG) group, which is part of the Integrated Earth Data Applications facility (http://www.iedadata.org), has taken this approach to provide services for the discovery, access, and analysis of sample-based geochemical data for a diverse user community, ranging from the highly informed geochemist to non-domain scientists and undergraduate students. GfG builds and maintains three tiers in the sample based data systems, from a simple data catalog (Geochemical Resource Library), to a substantially richer data model for the EarthChem Portal (EarthChem XML), and finally to detailed discipline-specific data models for petrologic (PetDB), sedimentary (SedDB), hydrothermal spring (VentDB), and geochronological (GeoChron) samples. The data catalog, the lowest level in the hierarchy, contains the sample data values plus metadata only about the dataset itself (Dublin Core metadata such as dataset title and author), and therefore can accommodate the widest diversity of data holdings. 
The second level includes measured data values from the sample, basic information about the analytical method, and metadata about the samples such as geospatial information and sample type. The third and highest level includes detailed data quality documentation and more specific information about the scientific context of the sample. The three tiers are linked to allow users to quickly navigate to their desired level of metadata detail. Links are based on the use of unique identifiers: (a) DOI at the granularity of datasets, and (b) the International Geo Sample Number IGSN at the granularity of samples. Current developments in the GfG sample-based systems include new registry architecture for the IGSN to advance international implementation, growth and modification of EarthChemXML to include geochemical data for new sample types such as soils and liquids, and the construction of a hydrothermal vent data system. This flexible, tiered, model provides a solution for offering varying levels of detail in order to aggregate a large quantity of data and serve the largest user group of both disciplinary novices and experts.
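The three tiers described above map naturally onto nested record types linked by identifiers. The sketch below is hypothetical: the class and field names are assumptions for illustration, and only the DOI/IGSN linking reflects the text:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """Tier 1: Dublin Core-style metadata about the dataset itself."""
    doi: str
    title: str
    author: str

@dataclass
class SampleRecord:
    """Tier 2: measured values plus basic sample/method metadata."""
    igsn: str            # International Geo Sample Number
    dataset_doi: str     # link back to the tier-1 catalog entry
    latitude: float
    longitude: float
    sample_type: str
    values: dict = field(default_factory=dict)

@dataclass
class DetailedRecord:
    """Tier 3: discipline-specific quality and context documentation."""
    igsn: str            # link back to the tier-2 sample
    method: str
    uncertainty: dict = field(default_factory=dict)
```

Because each level carries the identifier of the level above, a user can start from a discovery-oriented catalog hit and navigate down to full analytical detail, which is the tiered-navigation idea in the abstract.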
Li, Yanyun; Chen, Minjian; Liu, Cuiping; Xia, Yankai; Xu, Bo; Hu, Yanhui; Chen, Ting; Shen, Meiping; Tang, Wei
2018-05-01
Papillary thyroid carcinoma (PTC) is the most common thyroid cancer. The nuclear magnetic resonance (NMR)-based metabolomic technique is the gold standard in metabolite structural elucidation, and can provide different coverage of information compared with other metabolomic techniques. Here, we conducted the first NMR-based metabolomics study of the detailed metabolic changes, especially metabolic pathway changes, related to PTC pathogenesis. A 1H NMR-based metabolomic technique was adopted in conjunction with multivariate analysis to analyze matched tumor and normal thyroid tissues obtained from 16 patients. The results were further annotated with the Kyoto Encyclopedia of Genes and Genomes (KEGG) and the Human Metabolome Database, and then analyzed using the pathway analysis and enrichment analysis modules of MetaboAnalyst 3.0. Based on these analytical techniques, we established models of principal component analysis (PCA), partial least squares-discriminant analysis (PLS-DA), and orthogonal partial least-squares discriminant analysis (OPLS-DA) which could discriminate PTC from normal thyroid tissue, and found 15 robust differentiated metabolites from two OPLS-DA models. We identified 8 KEGG pathways and 3 pathways of the Small Molecule Pathway Database that were significantly related to PTC using pathway analysis and enrichment analysis, respectively, through which we identified metabolisms related to PTC, including branched chain amino acid metabolism (leucine and valine), other amino acid metabolism (glycine and taurine), glycolysis (lactate), the tricarboxylic acid cycle (citrate), choline metabolism (choline, ethanolamine and glycerophosphocholine) and lipid metabolism (very-low-density lipoprotein and low-density lipoprotein). In conclusion, PTC was characterized by increased glycolysis and an inhibited tricarboxylic acid cycle, increased oncogenic amino acids, and abnormal choline and lipid metabolism.
The findings in this study provide new insights into detailed metabolic changes of PTC, and hold great potential in the treatment of PTC.
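The first multivariate step above (PCA) amounts to an SVD of the mean-centred spectral matrix. The sketch below is a generic implementation, not the authors' software:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA by SVD on mean-centred data. Rows of X are spectra (samples),
    columns are spectral variables; returns the score matrix used to
    look for tumour-vs-normal separation."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T   # project onto leading components
```

PLS-DA and OPLS-DA then refine this unsupervised view by rotating components toward the class labels, which is why they yield the discriminating metabolites.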
Multivariate analysis: A statistical approach for computations
NASA Astrophysics Data System (ADS)
Michu, Sachin; Kaushik, Vandana
2014-10-01
Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, etc., and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.
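The correlation-matrix idea can be sketched as scoring each traffic window by how far its correlation coefficient matrix drifts from a baseline. Using the Frobenius norm as the deviation measure is an assumption for illustration:

```python
import numpy as np

def correlation_anomaly(window, baseline_corr):
    """Anomaly score for a traffic window (rows = observations,
    columns = features such as packet rate, byte rate): the
    Frobenius-norm deviation of its correlation coefficient matrix
    from a baseline matrix learned on normal traffic."""
    c = np.corrcoef(window, rowvar=False)
    return float(np.linalg.norm(c - baseline_corr))
```

A DDoS flood or a scan typically forces normally independent features to move together, inflating the off-diagonal correlations and hence the score.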
An Assessment of the State-of-the-Art in Multidisciplinary Aeromechanical Analyses
2008-01-01
monolithic formulations. In summary, for aerospace structures, partitioned formulations provide fundamental advantages over fully coupled ones, in addition...important frequencies of local analysis directly to global analysis using detailed modeling. Performed judiciously, based on a fundamental understanding of...in 2000 has comprehensively described the problem, and reviewed the status of fundamental understanding, experimental data, and analytical
Analysis of Private Returns to Vocational Education and Training: Support Document
ERIC Educational Resources Information Center
Lee, Wang-Sheng; Coelli, Michael
2010-01-01
This document is an appendix that is meant to accompany the main report, "Analysis of Private Returns to Vocational Education and Training". Included here are the detailed regression results that correspond to Tables 4 to 59 of the main report. This document was produced by the authors based on their research for the main report, and is…
Model-based time-series analysis of FIA panel data absent re-measurements
Raymond L. Czaplewski; Mike T. Thompson
2013-01-01
An epidemic of lodgepole pine (Pinus contorta) mortality from the mountain pine beetle (Dendroctonus ponderosae) has swept across the Interior West. Aerial surveys monitor the areal extent of the epidemic, but only Forest Inventory and Analysis (FIA) field data support a detailed assessment at the tree level. Dynamics of the lodgepole pine population occur at a more...
ERIC Educational Resources Information Center
Mitchell, Douglas E.; Mitchell, Ross E.
This report presents a comprehensive preliminary analysis of how California's Class Size Reduction (CSR) initiative has impacted student achievement during the first 2 years of implementation. The analysis is based on complete student, classroom, and teacher records from 26,126 students in 1,174 classrooms from 83 schools in 8 Southern California…
Oak Ridge Environmental Information System (OREIS) functional system design document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birchfield, T.E.; Brown, M.O.; Coleman, P.R.
1994-03-01
The OREIS Functional System Design document provides a detailed functional description of the Oak Ridge Environmental Information System (OREIS). It expands the system requirements defined in the OREIS Phase 1 System Definition Document (ES/ER/TM-34). Documentation of OREIS development is based on the Automated Data Processing System Development Methodology, a Martin Marietta Energy Systems, Inc., procedure written to assist in developing scientific and technical computer systems. This document focuses on the development of the functional design of the user interface, which includes the integration of commercial applications software. The data model and data dictionary are summarized briefly; however, the Data Management Plan for OREIS (ES/ER/TM-39), a companion document to the Functional System Design document, provides the complete data dictionary and detailed descriptions of the requirements for the database structure. The OREIS system will provide the following functions, which are executed from a Menu Manager: (1) preferences, (2) view manager, (3) macro manager, (4) data analysis (assisted analysis and unassisted analysis), and (5) spatial analysis/map generation (assisted ARC/INFO and unassisted ARC/INFO). Additional functionality includes interprocess communications, which handle background operations of OREIS.
Analysis of the Water Resources on Baseflow River Basin in Jeju Island, Korea
NASA Astrophysics Data System (ADS)
Yang, S.-K.; Jung, W.-Y.; Kang, M.-S.
2012-04-01
Jeju Island is a volcanic island located at the southernmost point of Korea and receives the heaviest rainfall in the country, but because its hydrological and geological characteristics differ from those of inland areas, most streams run dry and the island relies on groundwater for its water resources. In some streams, however, spring water discharged near the final downstream outlet maintains the streamflow; these streams have long been developed as water supply sources, but detailed observations and analyses of them are still inadequate. This study uses an ADCP (Acoustic Doppler Current Profiler) to regularly observe the discharge of baseflow streams, and the water resources of the baseflow basins of Jeju Island were analyzed using the SWAT (Soil & Water Assessment Tool) model. This detailed water resource analysis, combining modeling with high-precision site observation, is expected to become the foundation for efficient use and security of water resources against future climate change.
Education in the Post-Integration Era.
ERIC Educational Resources Information Center
Irvine, Russell William
1986-01-01
Jeff Howard and Raymond Hammond have based their theory, which holds that poor Black academic performance is caused by internalized feelings of inferiority and the resultant fear of intellectual competition, on a faulty analysis of data. Three aspects of the hypothesis are examined in detail. (PS)
Dynamics of land change in India: a fine-scale spatial analysis
NASA Astrophysics Data System (ADS)
Meiyappan, P.; Roy, P. S.; Sharma, Y.; Jain, A. K.; Ramachandran, R.; Joshi, P. K.
2015-12-01
Land is scarce in India: India occupies 2.4% of the world's land area but supports over one sixth of the world's human and livestock population. This high population-to-land ratio, combined with socioeconomic development and increasing consumption, has placed tremendous pressure on India's land resources for food, feed, and fuel. In this talk, we present contemporary (1985 to 2005) spatial estimates of land change in India based on a national-level analysis of Landsat imagery. Further, we investigate the causes of the spatial patterns of change using two complementary lines of evidence. First, we use statistical models estimated at the macro scale to understand the spatial relationships between land change patterns and their concomitant drivers. This analysis uses our newly compiled socioeconomic database at the village level (~630,000 units), which is 100× higher in spatial resolution than existing datasets and covers over 200 variables. The detailed socioeconomic data enabled fine-scale spatial analysis with the Landsat data. Second, we synthesized information from over 130 survey-based case studies on land use drivers in India to complement our macro-scale analysis. The case studies are especially useful for identifying unobserved variables (e.g., farmers' attitudes toward risk). Ours is the most detailed analysis of contemporary land change in India, both in its national extent and in its use of detailed spatial information on land change, socioeconomic factors, and the synthesis of case studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravindra, M.K.; Banon, H.
1992-07-01
In this report, the scoping quantification procedures for external events in probabilistic risk assessments (PRAs) of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical releases are described.
A robotically constructed production and supply base on Phobos
NASA Astrophysics Data System (ADS)
1989-05-01
PHOBIA Corporation is involved with the design of a man-tenable robotically constructed, bootstrap base on Mars' moon, Phobos. This base will be a pit-stop for future manned missions to Mars and beyond and will be a control facility during the robotic construction of a Martian base. An introduction is given to the concepts and the ground rules followed during the design process. Details of a base design and its location are given along with information about some of the subsystems. Since a major purpose of the base is to supply fuel to spacecraft so they can limit their fuel mass, mining and production systems are discussed. Surface support activities such as docks, anchors, and surface transportation systems are detailed. Several power supplies for the base are investigated and include fuel cells and a nuclear reactor. Tasks for the robots are defined along with descriptions of the robots capable of completing the tasks. Finally, failure modes for the entire PHOBIA Corporation design are presented along with an effects analysis and preventative recommendations.
A robotically constructed production and supply base on Phobos
NASA Technical Reports Server (NTRS)
1989-01-01
PHOBIA Corporation is involved with the design of a man-tenable robotically constructed, bootstrap base on Mars' moon, Phobos. This base will be a pit-stop for future manned missions to Mars and beyond and will be a control facility during the robotic construction of a Martian base. An introduction is given to the concepts and the ground rules followed during the design process. Details of a base design and its location are given along with information about some of the subsystems. Since a major purpose of the base is to supply fuel to spacecraft so they can limit their fuel mass, mining and production systems are discussed. Surface support activities such as docks, anchors, and surface transportation systems are detailed. Several power supplies for the base are investigated and include fuel cells and a nuclear reactor. Tasks for the robots are defined along with descriptions of the robots capable of completing the tasks. Finally, failure modes for the entire PHOBIA Corporation design are presented along with an effects analysis and preventative recommendations.
Safety Guided Design Based on Stamp/STPA for Manned Vehicle in Concept Design Phase
NASA Astrophysics Data System (ADS)
Ujiie, Ryo; Katahira, Masafumi; Miyamoto, Yuko; Umeda, Hiroki; Leveson, Nancy; Hoshino, Nobuyuki
2013-09-01
In manned vehicles, such as the Soyuz and the Space Shuttle, the crew and the computer system cooperate to return safely to Earth. While computers increase the functionality of the system, they also increase the complexity of the interaction between the controllers (human and computer) and the target dynamics. In some cases, this complexity can produce a serious accident. To prevent such losses, traditional hazard analyses such as FTA have been applied to system development; however, they can only be used after a detailed system has been created, because they focus on detailed component failures. As a result, it is more difficult to eliminate hazard causes early in the process, when doing so is most feasible. STAMP/STPA is a new hazard analysis that can be applied from the early development phase, with the analysis being refined as more detailed decisions are made. In essence, the analysis and design decisions are intertwined and go hand-in-hand. We have applied STAMP/STPA to a concept design of a new JAXA manned vehicle and tried safety guided design of the vehicle. As a result of this trial, it has been shown that STAMP/STPA can be accepted easily by system engineers and the design has been made more sophisticated from a safety viewpoint. The result also shows that the consequences of human errors on system safety can be analysed in the early development phase and the system designed to prevent them. Finally, the paper discusses an effective way to harmonize this safety guided design approach with the systems engineering process, based on the experience gained in this project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Espinosa-Paredes, Gilberto; Prieto-Guerrero, Alfonso; Nunez-Carrera, Alejandro
This paper introduces a wavelet-based method to analyze instability events in a boiling water reactor (BWR) during transient phenomena. The methodology to analyze BWR signals includes the following: (a) short-time Fourier transform (STFT) analysis, (b) decomposition using the continuous wavelet transform (CWT), and (c) application of multiresolution analysis (MRA) using the discrete wavelet transform (DWT). STFT analysis permits the study, in time, of the spectral content of the analyzed signals. The CWT provides information about ruptures, discontinuities, and fractal behavior. To detect these important features in the signal, a mother wavelet has to be chosen and applied at several scales to obtain optimum results. MRA allows fast implementation of the DWT. Features like important frequencies, discontinuities, and transients can be detected by analysis at different levels of detail coefficients. The STFT was used to provide a comparison between a classic method and the wavelet-based method. The damping ratio, which is an important stability parameter, was calculated as a function of time. The transient behavior can be detected by analyzing the maxima contained in detail coefficients at different levels of the signal decomposition. This method allows analysis of both stationary signals and highly nonstationary signals in the timescale plane. This methodology has been tested with the benchmark power instability event of Laguna Verde nuclear power plant (NPP) Unit 1, which is a BWR-5 NPP.
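A minimal sketch of the multiresolution idea described above, using a one-level Haar DWT in plain NumPy (the signal and the location of the transient are illustrative): a sharp discontinuity in an otherwise smooth signal shows up as a large detail coefficient at the corresponding position.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficient arrays."""
    x = np.asarray(signal, dtype=float)
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2)   # low-pass: smooth trend
    detail = (even - odd) / np.sqrt(2)   # high-pass: local changes
    return approx, detail

# Smooth oscillation with an abrupt step ("instability onset") at sample 65.
t = np.linspace(0, 1, 128, endpoint=False)
sig = np.sin(2 * np.pi * 4 * t)
sig[65:] += 2.0
approx, detail = haar_dwt(sig)
# The largest detail coefficient marks the discontinuity (pair index 32).
peak = int(np.argmax(np.abs(detail)))
```

Deeper decomposition levels (applying the same split to `approx` repeatedly) give the multiresolution analysis the abstract refers to.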
Logistics Process Analysis Tool (LPAT)
DOE Office of Scientific and Technical Information (OSTI.GOV)
2008-03-31
LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra-Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.
On Target Localization Using Combined RSS and AoA Measurements
Beko, Marko; Dinis, Rui
2018-01-01
This work reviews existing solutions to the problem of target localization in wireless sensor networks (WSNs) using integrated measurements, namely received signal strength (RSS) and angle of arrival (AoA). The RSS/AoA-based target localization problem has recently become very popular in the research community owing to its great applicability potential and relatively low implementation cost. Here, a comprehensive study of the state-of-the-art (SoA) solutions and a detailed analysis of them are presented. This work begins by considering the SoA approaches based on convex relaxation techniques (generally more computationally complex) and then covers less computationally complex approaches as well, such as those based on the generalized trust region sub-problem framework and linear least squares. A detailed analysis of the computational complexity of each solution is reviewed, and an extensive set of simulation results is presented. Finally, the main conclusions are summarized, and a set of aspects and trends that might be interesting for future research in this area is identified. PMID:29671832
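A minimal sketch of the linear least squares approach mentioned above, for the AoA part only, in 2-D with noise-free bearings (the anchor geometry and target position are illustrative): each measured bearing defines a line through its anchor, and the target is the least-squares intersection of those lines.

```python
import numpy as np

def aoa_localize(anchors, bearings):
    """Least-squares target estimate from angle-of-arrival measurements.
    Anchor i at (x_i, y_i) measures bearing theta_i (radians) toward the
    target; the line constraint is [-sin(theta), cos(theta)] . p = same . a_i."""
    anchors = np.asarray(anchors, dtype=float)
    A = np.column_stack([-np.sin(bearings), np.cos(bearings)])
    b = np.sum(A * anchors, axis=1)
    est, *_ = np.linalg.lstsq(A, b, rcond=None)
    return est

target = np.array([3.0, 4.0])
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
bearings = np.arctan2(target[1] - anchors[:, 1], target[0] - anchors[:, 0])
estimate = aoa_localize(anchors, bearings)
```

With noisy bearings the same `lstsq` call returns the best-fit intersection; weighting rows by bearing confidence gives the weighted variants the survey discusses.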
Chalcogenide-based van der Waals epitaxy: Interface conductivity of tellurium on Si(111)
NASA Astrophysics Data System (ADS)
Lüpke, Felix; Just, Sven; Bihlmayer, Gustav; Lanius, Martin; Luysberg, Martina; Doležal, Jiří; Neumann, Elmar; Cherepanov, Vasily; Ošt'ádal, Ivan; Mussler, Gregor; Grützmacher, Detlev; Voigtländer, Bert
2017-07-01
We present a combined experimental and theoretical analysis of a Te-rich interface layer which represents a template for chalcogenide-based van der Waals epitaxy on Si(111). On a clean Si(111)-(1×1) surface, we find Te to form a Te/Si(111)-(1×1) reconstruction that saturates the substrate bonds. A problem arising is that such an interface layer can potentially be highly conductive, undermining the applicability of the films grown on top in electric devices. We perform here a detailed structural analysis of the pristine Te termination and present direct measurements of its electrical conductivity by in situ distance-dependent four-probe measurements. The experimental results are analyzed with respect to density functional theory calculations, and the implications of the interface termination for the electrical conductivity of chalcogenide-based topological insulator thin films are discussed. In detail, we find a Te/Si(111)-(1×1) interface conductivity of σ_Te^2D = 2.6(5) × 10^-7 S/□, which is small compared to the typical conductivity of topological surface states.
Ruppin, Eytan; Papin, Jason A; de Figueiredo, Luis F; Schuster, Stefan
2010-08-01
With the advent of modern omics technologies, it has become feasible to reconstruct (quasi-) whole-cell metabolic networks and characterize them in more and more detail. Computer simulations of the dynamic behavior of such networks are difficult due to a lack of kinetic data and to computational limitations. In contrast, network analysis based on appropriate constraints such as the steady-state condition (constraint-based analysis) is feasible and allows one to derive conclusions about the system's metabolic capabilities. Here, we review methods for the reconstruction of metabolic networks, modeling techniques such as flux balance analysis and elementary flux modes and current progress in their development and applications. Game-theoretical methods for studying metabolic networks are discussed as well.
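A minimal sketch of flux balance analysis as reviewed above, posed as a linear program on a toy three-reaction network (the stoichiometry, bounds, and use of SciPy's `linprog` are illustrative choices, not from the review): maximize an objective flux subject to the steady-state constraint S·v = 0 and capacity bounds.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: v1 (uptake of A) -> v2 (A -> B) -> v3 (export of B).
# Steady state requires S @ v = 0 for the internal metabolites A and B.
S = np.array([[1, -1, 0],    # A: produced by v1, consumed by v2
              [0, 1, -1]])   # B: produced by v2, consumed by v3
bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 flux units

# Maximize v3; linprog minimizes, so negate the objective coefficient.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
```

The optimum saturates the uptake bound, so the whole pathway carries 10 flux units, which is the qualitative conclusion constraint-based analysis delivers without any kinetic parameters.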
ERIC Educational Resources Information Center
Judd, Terry; Kennedy, Gregor
2011-01-01
Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…
Environmental Assessment for Airborne Laser Debris Management Vandenberg Air Force Base, California
2008-07-01
hazardous waste management, water resources, air quality, and biological resources. Based on the analysis of the Proposed Action and No-Action...aesthetics, hazardous materials management, soils and geology, noise, cultural resources, and environmental justice. The resources analyzed in more detail...include: health and safety, hazardous waste management, water resources, air quality, and biological resources. Environmental Effects Under the
Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software
NASA Astrophysics Data System (ADS)
Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.
2017-12-01
Nowadays, ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detailed design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance their safety and capacity, as those ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold: one is rule-based and the other is a more conventional software-based analysis. The rule-based analysis is done with DNV GL's software POSEIDON and the conventional package-based analysis with the ANSYS structural module. Both methods have been applied to analyze mechanical properties of the model such as total deformation, stress-strain distribution, von Mises stress, and fatigue, following different design bases and approaches, to provide guidance for further improvements in ship structural design.
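As a small worked example of one quantity named above, a sketch of the von Mises equivalent stress computed from the six independent components of a 3-D stress tensor (the sample stress values are illustrative, not from either analysis):

```python
import numpy as np

def von_mises(s):
    """Von Mises equivalent stress from stress tensor components
    [sxx, syy, szz, sxy, syz, szx] (e.g. in MPa)."""
    sxx, syy, szz, sxy, syz, szx = s
    return np.sqrt(0.5 * ((sxx - syy) ** 2 + (syy - szz) ** 2 + (szz - sxx) ** 2)
                   + 3.0 * (sxy ** 2 + syz ** 2 + szx ** 2))

# Sanity checks: uniaxial tension returns the applied stress;
# pure shear returns sqrt(3) times the shear stress.
uniaxial = von_mises([100.0, 0.0, 0.0, 0.0, 0.0, 0.0])
shear = von_mises([0.0, 0.0, 0.0, 50.0, 0.0, 0.0])
```

An FE post-processor evaluates exactly this scalar at every integration point to compare against the material yield stress.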
Monopulse azimuth measurement in the ATC Radar Beacon System
DOT National Transportation Integrated Search
1971-12-01
A review is made of the application of sum-difference beam techniques to the ATC Radar Beacon System. A detailed error analysis is presented for the case of a monopulse azimuth measurement based on the existing beacon antenna with a modified fe...
Feasibility of a web-based system for police crash report review and information recording.
DOT National Transportation Integrated Search
2016-04-01
Police crash reports include useful additional information that is not available in crash summary records. This information may include police sketches and narratives and is often needed for detailed site-specific safety analysis. In addition, so...
Life expectancy evaluation and development of a replacement schedule for LED traffic signals.
DOT National Transportation Integrated Search
2011-03-01
This research details a field study of LED traffic signals in Missouri and develops a replacement schedule based on key findings. Rates of degradation were statistically analyzed using Analysis of Variance (ANOVA). Results of this research will p...
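A minimal sketch of the one-way ANOVA step such a degradation study might use, with the F-statistic computed by hand in NumPy (the light-output data and age groupings are illustrative, not the study's measurements):

```python
import numpy as np

def one_way_anova_F(groups):
    """F-statistic for one-way ANOVA: mean square between groups
    over mean square within groups (groups = list of 1-D sample arrays)."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    n_total = sum(g.size for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = n_total - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Illustrative light-output readings for three signal age classes.
new = [98.0, 99.0, 101.0, 102.0]
mid = [90.0, 92.0, 91.0, 93.0]
old = [80.0, 82.0, 81.0, 83.0]
F = one_way_anova_F([new, mid, old])
```

A large F (here well above the critical value for 2 and 9 degrees of freedom) indicates that mean output differs significantly by age, which is the statistical basis for an age-based replacement schedule.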
Handbook of Applied Behavior Analysis
ERIC Educational Resources Information Center
Fisher, Wayne W., Ed.; Piazza, Cathleen C., Ed.; Roane, Henry S., Ed.
2011-01-01
Describing the state of the science of ABA, this comprehensive handbook provides detailed information about theory, research, and intervention. The contributors are leading ABA authorities who present current best practices in behavioral assessment and demonstrate evidence-based strategies for supporting positive behaviors and reducing problem…
Digital microarray analysis for digital artifact genomics
NASA Astrophysics Data System (ADS)
Jaenisch, Holger; Handley, James; Williams, Deborah
2013-06-01
We implement a Spatial Voting (SV) based analogy of microarray analysis for digital gene marker identification in malware code sections. We examine a famous set of malware formally analyzed by Mandiant and code-named Advanced Persistent Threat 1 (APT1). APT1 is a Chinese organization formed with the specific intent to infiltrate and exploit US resources. Mandiant provided a detailed behavior and string analysis report for the 288 malware samples available. We performed an independent analysis using a new alternative to traditional dynamic analysis and static analysis that we call Spatial Analysis (SA). We perform unsupervised SA on the APT1 malware code sections and report our findings. We also show the results of SA performed on some members of the families associated by Mandiant. We conclude that SV-based SA is a practical, fast alternative to dynamic analysis and static analysis.
Objective analysis of observational data from the FGGE observing systems
NASA Technical Reports Server (NTRS)
Baker, W.; Edelmann, D.; Iredell, M.; Han, D.; Jakkempudi, S.
1981-01-01
An objective analysis procedure for updating the GLAS second- and fourth-order general atmospheric circulation models using observational data from the First GARP Global Experiment (FGGE) is described. The objective analysis procedure is based on a successive corrections method, and the model is updated in a data assimilation cycle. Preparation of the observational data for analysis and the objective analysis scheme are described. The organization of the program and a description of the required data sets are presented. The program logic and detailed descriptions of each subroutine are given.
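A minimal sketch of a successive-corrections update, shown as one Cressman-style pass on a 1-D grid (the weight function, influence radius, and data are illustrative, not the GLAS scheme's actual parameters): each grid value is nudged toward nearby observations with distance-dependent weights, and in a full scheme such passes are repeated with shrinking radii.

```python
import numpy as np

def correction_pass(grid_x, field, obs_x, obs_val, radius):
    """One successive-corrections pass: update a gridded field toward
    observations using Cressman weights w = (R^2 - d^2) / (R^2 + d^2)."""
    updated = field.copy()
    for i, x in enumerate(grid_x):
        d2 = (obs_x - x) ** 2
        mask = d2 < radius ** 2          # only observations inside the radius
        if not mask.any():
            continue
        w = (radius ** 2 - d2[mask]) / (radius ** 2 + d2[mask])
        # Weighted mean of observation increments (obs minus background).
        updated[i] += np.sum(w * (obs_val[mask] - field[i])) / np.sum(w)
    return updated

grid_x = np.linspace(0.0, 10.0, 11)
background = np.zeros(11)                # first-guess field from the model
obs_x = np.array([3.0, 5.0, 7.0])
obs_val = np.array([1.0, 2.0, 1.0])
analysis = correction_pass(grid_x, background, obs_x, obs_val, radius=2.5)
```

Grid points far from all observations keep the background value; points near observations are pulled toward them, which is the essence of the assimilation cycle described above.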
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jannik, Tim; Hartman, Larry
During the operational history of Savannah River Site, many different radionuclides have been released from site facilities. However, as shown in this analysis, only a relatively small number of the released radionuclides have been significant contributors to doses to the offsite public. This report is an update to the 2011 analysis, Critical Radionuclide and Pathway Analysis for the Savannah River Site. SRS-based Performance Assessments for E-Area, Saltstone, F-Tank Farm, H-Tank Farm, and a Comprehensive SRS Composite Analysis have been completed. The critical radionuclides and pathways identified in those extensive reports are also detailed and included in this analysis.
An infrastructure for accurate characterization of single-event transients in digital circuits.
Savulimedu Veeravalli, Varadan; Polzer, Thomas; Schmid, Ulrich; Steininger, Andreas; Hofbauer, Michael; Schweiger, Kurt; Dietrich, Horst; Schneider-Hornstein, Kerstin; Zimmermann, Horst; Voss, Kay-Obbe; Merk, Bruno; Hajek, Michael
2013-11-01
We present the architecture and a detailed pre-fabrication analysis of a digital measurement ASIC facilitating long-term irradiation experiments of basic asynchronous circuits, which also demonstrates the suitability of the general approach for obtaining accurate radiation failure models developed in our FATAL project. Our ASIC design combines radiation targets like Muller C-elements and elastic pipelines as well as standard combinational gates and flip-flops with an elaborate on-chip measurement infrastructure. Major architectural challenges result from the fact that the latter must operate reliably under the same radiation conditions the target circuits are exposed to, without wasting precious die area for a rad-hard design. A measurement architecture based on multiple non-rad-hard counters is used, which we show to be resilient against double faults, as well as many triple and even higher-multiplicity faults. The design evaluation is done by means of comprehensive fault injection experiments, which are based on detailed Spice models of the target circuits in conjunction with a standard double-exponential current injection model for single-event transients (SET). To be as accurate as possible, the parameters of this current model have been aligned with results obtained from 3D device simulation models, which have in turn been validated and calibrated using micro-beam radiation experiments at the GSI in Darmstadt, Germany. For the latter, target circuits instrumented with high-speed sense amplifiers have been used for analog SET recording. Together with a probabilistic analysis of the sustainable particle flow rates, based on a detailed area analysis and experimental cross-section data, we can conclude that the proposed architecture will indeed sustain significant target hit rates, without exceeding the resilience bound of the measurement infrastructure.
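A minimal sketch of the standard double-exponential SET current injection model mentioned above (the time constants and deposited charge are illustrative values, not the paper's calibrated parameters): the pulse is normalized so that it integrates to the deposited charge.

```python
import numpy as np

def set_current(t, q, tau_rise, tau_fall):
    """Double-exponential single-event-transient current pulse,
    normalized so the total injected charge integrates to q."""
    norm = q / (tau_fall - tau_rise)
    return norm * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise))

t = np.linspace(0.0, 2e-9, 20001)        # 2 ns window, 0.1 ps steps
i = set_current(t, q=100e-15, tau_rise=5e-12, tau_fall=200e-12)
# Trapezoidal integration should recover (nearly all of) the 100 fC charge.
charge = np.sum((i[1:] + i[:-1]) * np.diff(t)) / 2.0
```

In fault injection, this waveform is attached as a current source at a sensitive node of the Spice model; aligning `q` and the time constants with device-level simulation is the calibration step the paper describes.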
An infrastructure for accurate characterization of single-event transients in digital circuits☆
Savulimedu Veeravalli, Varadan; Polzer, Thomas; Schmid, Ulrich; Steininger, Andreas; Hofbauer, Michael; Schweiger, Kurt; Dietrich, Horst; Schneider-Hornstein, Kerstin; Zimmermann, Horst; Voss, Kay-Obbe; Merk, Bruno; Hajek, Michael
2013-01-01
We present the architecture and a detailed pre-fabrication analysis of a digital measurement ASIC facilitating long-term irradiation experiments of basic asynchronous circuits, which also demonstrates the suitability of the general approach for obtaining accurate radiation failure models developed in our FATAL project. Our ASIC design combines radiation targets like Muller C-elements and elastic pipelines as well as standard combinational gates and flip-flops with an elaborate on-chip measurement infrastructure. Major architectural challenges result from the fact that the latter must operate reliably under the same radiation conditions the target circuits are exposed to, without wasting precious die area for a rad-hard design. A measurement architecture based on multiple non-rad-hard counters is used, which we show to be resilient against double faults, as well as many triple and even higher-multiplicity faults. The design evaluation is done by means of comprehensive fault injection experiments, which are based on detailed Spice models of the target circuits in conjunction with a standard double-exponential current injection model for single-event transients (SET). To be as accurate as possible, the parameters of this current model have been aligned with results obtained from 3D device simulation models, which have in turn been validated and calibrated using micro-beam radiation experiments at the GSI in Darmstadt, Germany. For the latter, target circuits instrumented with high-speed sense amplifiers have been used for analog SET recording. Together with a probabilistic analysis of the sustainable particle flow rates, based on a detailed area analysis and experimental cross-section data, we can conclude that the proposed architecture will indeed sustain significant target hit rates, without exceeding the resilience bound of the measurement infrastructure. PMID:24748694
Design, development and testing twin pulse tube cryocooler
NASA Astrophysics Data System (ADS)
Gour, Abhay Singh; Sagar, Pankaj; Karunanithi, R.
2017-09-01
The design and development of a Twin Pulse Tube Cryocooler (TPTC) is presented. Both coolers are driven by a single Linear Moving Magnet Synchronous Motor (LMMSM) with piston heads at both ends of the mover shaft. Magnetostatic analysis of the flux line distribution was carried out during the design and development of the LMMSM-based pressure wave generator (PWG). Based on the performance of the PWG, the TPTC was designed using Sage and Computational Fluid Dynamics (CFD) analysis. The detailed design, fabrication, and testing of the LMMSM and TPTC, and their integration tests, are presented in this paper.
Algal Biomass Analysis by Laser-Based Analytical Techniques—A Review
Pořízka, Pavel; Prochazková, Petra; Prochazka, David; Sládková, Lucia; Novotný, Jan; Petrilak, Michal; Brada, Michal; Samek, Ota; Pilát, Zdeněk; Zemánek, Pavel; Adam, Vojtěch; Kizek, René; Novotný, Karel; Kaiser, Jozef
2014-01-01
Algal biomass that is represented mainly by commercially grown algal strains has recently found many potential applications in various fields of interest. Its utilization has been found advantageous in the fields of bioremediation, biofuel production and the food industry. This paper reviews recent developments in the analysis of algal biomass with the main focus on the Laser-Induced Breakdown Spectroscopy, Raman spectroscopy, and partly Laser-Ablation Inductively Coupled Plasma techniques. The advantages of the selected laser-based analytical techniques are revealed and their fields of use are discussed in detail. PMID:25251409
Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried
2017-01-01
Abstract Background: This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. Methods: The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. Results: The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. Conclusion: The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. PMID:27815414
NASA Astrophysics Data System (ADS)
Arnold, N. D.; Attig, J.; Banks, G.; Bechtold, R.; Beczek, K.; Benson, C.; Berg, S.; Berg, W.; Biedron, S. G.; Biggs, J. A.; Borland, M.; Boerste, K.; Bosek, M.; Brzowski, W. R.; Budz, J.; Carwardine, J. A.; Castro, P.; Chae, Y.-C.; Christensen, S.; Clark, C.; Conde, M.; Crosbie, E. A.; Decker, G. A.; Dejus, R. J.; DeLeon, H.; Den Hartog, P. K.; Deriy, B. N.; Dohan, D.; Dombrowski, P.; Donkers, D.; Doose, C. L.; Dortwegt, R. J.; Edwards, G. A.; Eidelman, Y.; Erdmann, M. J.; Error, J.; Ferry, R.; Flood, R.; Forrestal, J.; Freund, H.; Friedsam, H.; Gagliano, J.; Gai, W.; Galayda, J. N.; Gerig, R.; Gilmore, R. L.; Gluskin, E.; Goeppner, G. A.; Goetzen, J.; Gold, C.; Gorski, A. J.; Grelick, A. E.; Hahne, M. W.; Hanuska, S.; Harkay, K. C.; Harris, G.; Hillman, A. L.; Hogrefe, R.; Hoyt, J.; Huang, Z.; Jagger, J. M.; Jansma, W. G.; Jaski, M.; Jones, S. J.; Keane, R. T.; Kelly, A. L.; Keyser, C.; Kim, K.-J.; Kim, S. H.; Kirshenbaum, M.; Klick, J. H.; Knoerzer, K.; Koldenhoven, R. J.; Knott, M.; Labuda, S.; Laird, R.; Lang, J.; Lenkszus, F.; Lessner, E. S.; Lewellen, J. W.; Li, Y.; Lill, R. M.; Lumpkin, A. H.; Makarov, O. A.; Markovich, G. M.; McDowell, M.; McDowell, W. P.; McNamara, P. E.; Meier, T.; Meyer, D.; Michalek, W.; Milton, S. V.; Moe, H.; Moog, E. R.; Morrison, L.; Nassiri, A.; Noonan, J. R.; Otto, R.; Pace, J.; Pasky, S. J.; Penicka, J. M.; Pietryla, A. F.; Pile, G.; Pitts, C.; Power, J.; Powers, T.; Putnam, C. C.; Puttkammer, A. J.; Reigle, D.; Reigle, L.; Ronzhin, D.; Rotela, E. R.; Russell, E. F.; Sajaev, V.; Sarkar, S.; Scapino, J. C.; Schroeder, K.; Seglem, R. A.; Sereno, N. S.; Sharma, S. K.; Sidarous, J. F.; Singh, O.; Smith, T. L.; Soliday, R.; Sprau, G. A.; Stein, S. J.; Stejskal, B.; Svirtun, V.; Teng, L. C.; Theres, E.; Thompson, K.; Tieman, B. J.; Torres, J. A.; Trakhtenberg, E. M.; Travish, G.; Trento, G. F.; Vacca, J.; Vasserman, I. B.; Vinokurov, N. A.; Walters, D. R.; Wang, J.; Wang, X. J.; Warren, J.; Wesling, S.; Weyer, D. 
L.; Wiemerslage, G.; Wilhelmi, K.; Wright, R.; Wyncott, D.; Xu, S.; Yang, B.-X.; Yoder, W.; Zabel, R. B.
2001-12-01
Exponential growth of self-amplified spontaneous emission at 530 nm was first experimentally observed at the Advanced Photon Source low-energy undulator test line in December 1999. Since then, further detailed measurements and analysis of the results have been made. Here, we present the measurements and compare these with calculations based on measured electron beam properties and theoretical expectations.
ERIC Educational Resources Information Center
Guirguis, Ruth; Pankowski, Jennifer
2017-01-01
The purpose of this meta-analysis was to explore in detail the research by Hattie (2009) in order to re-examine it through a new lens and better understand the research and possible bias. For this study, the units of analysis were the studies used in Hattie (2009). The criteria for inclusion were based on strategies currently implemented in…
GOplot: an R package for visually combining expression data with functional analysis.
Walter, Wencke; Sánchez-Cabo, Fátima; Ricote, Mercedes
2015-09-01
Despite the plethora of methods available for the functional analysis of omics data, obtaining a comprehensive yet detailed understanding of the results remains challenging. This is mainly due to the lack of publicly available tools for the visualization of this type of information. Here we present an R package called GOplot, based on ggplot2, for enhanced graphical representation. Our package takes the output of any general enrichment analysis and generates plots at different levels of detail: from a general overview to identify the most enriched categories (bar plot, bubble plot) to a more detailed view displaying different types of information for molecules in a given set of categories (circle plot, chord plot, cluster plot). The package provides a deeper insight into omics data and allows scientists to generate insightful plots with only a few lines of code to easily communicate the findings. The R package GOplot is available via CRAN-The Comprehensive R Archive Network: http://cran.r-project.org/web/packages/GOplot. The shiny web application of the Venn diagram can be found at: https://wwalter.shinyapps.io/Venn/. A detailed manual of the package with sample figures can be found at https://wencke.github.io/. Contact: fscabo@cnic.es or mricote@cnic.es. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoak, T.E.; Decker, A.D.
Mesaverde Group reservoirs in the Piceance Basin, western Colorado, contain a large resource base. Attempts to exploit this resource base are stymied by low-permeability reservoir conditions. The presence of abundant natural fracture systems throughout this basin, however, does permit economic production. Substantial production is associated with fractured reservoirs in the Divide Creek, Piceance Creek, Wolf Creek, White River Dome, Plateau, Shire Gulch, Grand Valley, Parachute, and Rulison fields. Successful Piceance Basin gas production requires detailed information about fracture networks and subsurface gas and water distribution in an overall gas-centered basin geometry. Assessment of these three parameters requires an integrated basin analysis incorporating conventional subsurface geology, seismic data, remote sensing imagery analysis, and an analysis of regional tectonics. To delineate the gas-centered basin geometry in the Piceance Basin, a regional cross-section spanning the basin was constructed using hydrocarbon and gamma radiation logs. The resultant hybrid logs were used for stratigraphic correlations in addition to outlining the trans-basin gas-saturated conditions. The magnitude of both pressure gradients (paludal and marine intervals) is greater than can be generated by a hydrodynamic model. To investigate the relationships between structure and production, detailed mapping of the basin (top of the Iles Formation) was used to define subtle subsurface structures that control fractured reservoir development. The most productive fields in the basin possess fractured reservoirs. Detailed studies in the Grand Valley-Parachute-Rulison and Shire Gulch-Plateau fields indicate that zones of maximum structural flexure on kilometer-scale structural features are directly related to areas of enhanced production.
Computer-based analysis of microvascular alterations in a mouse model for Alzheimer's disease
NASA Astrophysics Data System (ADS)
Heinzer, Stefan; Müller, Ralph; Stampanoni, Marco; Abela, Rafael; Meyer, Eric P.; Ulmann-Schuler, Alexandra; Krucker, Thomas
2007-03-01
Vascular factors associated with Alzheimer's disease (AD) have recently gained increased attention. To investigate changes in vascular, particularly microvascular architecture, we developed a hierarchical imaging framework to obtain large-volume, high-resolution 3D images from brains of transgenic mice modeling AD. In this paper, we present imaging and data analysis methods which allow compiling unique characteristics from several hundred gigabytes of image data. Image acquisition is based on desktop micro-computed tomography (µCT) and local synchrotron-radiation µCT (SRµCT) scanning with a nominal voxel size of 16 µm and 1.4 µm, respectively. Two visualization approaches were implemented: stacks of Z-buffer projections for fast data browsing, and progressive-mesh based surface rendering for detailed 3D visualization of the large datasets. In a first step, image data was assessed visually via a Java client connected to a central database. Identified characteristics of interest were subsequently quantified using global morphometry software. To obtain even deeper insight into microvascular alterations, tree analysis software was developed providing local morphometric parameters such as number of vessel segments or vessel tortuosity. In the context of ever increasing image resolution and large datasets, computer-aided analysis has proven both powerful and indispensable. The hierarchical approach maintains the context of local phenomena, while proper visualization and morphometry provide the basis for detailed analysis of the pathology related to structure. Beyond analysis of microvascular changes in AD this framework will have significant impact considering that vascular changes are involved in other neurodegenerative diseases as well as in cancer, cardiovascular disease, asthma, and arthritis.
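Vessel tortuosity, one of the local morphometric parameters produced by the tree analysis software mentioned above, is commonly defined as the ratio of a segment's centerline path length to the straight-line distance between its endpoints. A minimal sketch of that definition (hypothetical coordinates for illustration; this is not the authors' software):

```python
import numpy as np

def tortuosity(points):
    """Ratio of the polyline path length to the end-to-end Euclidean distance.

    points: (N, 3) array of ordered centerline coordinates for one vessel
    segment. A perfectly straight segment gives 1.0; larger values mean a
    more winding vessel.
    """
    points = np.asarray(points, dtype=float)
    path_length = np.linalg.norm(np.diff(points, axis=0), axis=1).sum()
    chord = np.linalg.norm(points[-1] - points[0])
    return path_length / chord

# Straight segment: path = chord, so tortuosity is exactly 1.0
straight = [[0, 0, 0], [1, 0, 0], [2, 0, 0]]
# Right-angle bend: path = 2, chord = sqrt(2), tortuosity ~ 1.414
bent = [[0, 0, 0], [1, 0, 0], [1, 1, 0]]
print(tortuosity(straight))  # 1.0
print(tortuosity(bent))
```

Aggregated over all segments of a vascular tree, shifts in such per-segment measures are one way the framework can quantify microvascular alterations.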
How Do the Metabolic Effects of Chronic Stress Influence Breast Cancer Biology
2013-04-01
meta-analysis. International Journal of Cancer. 2003;107:1023-9. 3. Song M, Lee K-M, Kang D. Breast Cancer Prevention Based on Gene-Environment... PCR system. Details of the statistical analysis are provided in the supplemental methods section. Adipocyte glucose consumption and... life events and breast cancer risk: a meta-analysis. International Journal of Cancer. 2003;107:1023-9. 3. Song M, Lee K-M, Kang D. Breast
Hydrogen Production from Nuclear Energy via High Temperature Electrolysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
James E. O'Brien; Carl M. Stoots; J. Stephen Herring
2006-04-01
This paper presents the technical case for high-temperature nuclear hydrogen production. A general thermodynamic analysis of hydrogen production based on high-temperature thermal water splitting processes is presented. Specific details of hydrogen production based on high-temperature electrolysis are also provided, including results of recent experiments performed at the Idaho National Laboratory. Based on these results, high-temperature electrolysis appears to be a promising technology for efficient large-scale hydrogen production.
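The thermodynamic case rests on the fact that the electrical energy demand (Gibbs free energy ΔG) of water splitting falls with temperature while the total energy demand stays roughly constant, so process heat can substitute for electricity. A rough sketch converting ΔG into the minimum (reversible) cell voltage; the ΔG values are round illustrative numbers, not figures from the paper:

```python
# Reversible cell voltage for water splitting: V = dG / (n * F),
# with n = 2 electrons transferred per H2 molecule produced.
F = 96485.0  # Faraday constant, C/mol

def reversible_voltage(dG_kJ_per_mol):
    return dG_kJ_per_mol * 1000.0 / (2 * F)

# Approximate dG of water splitting at two temperatures (illustrative):
dG_25C = 237.0   # kJ/mol, liquid water at 25 C
dG_850C = 177.0  # kJ/mol, steam at roughly 850 C

print(f"25 C : {reversible_voltage(dG_25C):.3f} V")   # ~1.23 V
print(f"850 C: {reversible_voltage(dG_850C):.3f} V")  # ~0.92 V
```

The drop from about 1.23 V to about 0.92 V is why coupling electrolysis to a high-temperature nuclear heat source improves efficiency: part of the splitting energy is delivered as heat rather than as electricity.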
Parzefall, Thomas; Wolf, Axel; Frei, Klemens; Kaider, Alexandra; Riss, Dominik
2017-03-01
Use of reliable grading scores to measure epistaxis severity in hereditary hemorrhagic telangiectasia (HHT) is essential in clinical routine and for scientific purposes. For practical reasons, visual analog scale (VAS) scoring and the Epistaxis Severity Score (ESS) are widely used. VAS scores are purely subjective, and a potential shortcoming of the ESS is that it is based on self-reported anamnestic bleeding data. The aim of this study was to validate the level of correlation between VAS scores, the ESS, and actual bleeding events, based on detailed epistaxis diaries of patients. Records from daily epistaxis diaries maintained by 16 HHT patients over 112 consecutive days were compared with the monthly ESS and daily VAS scores in the corresponding time period. The Spearman rank correlation coefficient, analysis of variance models, and multiple R² measures were used for statistical analysis. Although the ESS and VAS scores generally showed a high degree of correlation with actual bleeding events, mild events were underrepresented in both scores. Our results highlight the usefulness of the ESS as a standard epistaxis score in cohorts with moderate to severe degrees of epistaxis. The use of detailed epistaxis diaries should be considered when monitoring patients and cohorts with mild forms of HHT. © 2016 ARS-AAOA, LLC.
Tools and techniques for developing policies for complex and uncertain systems.
Bankes, Steven C
2002-05-14
Agent-based models (ABM) are examples of complex adaptive systems, which can be characterized as those systems for which no model less complex than the system itself can accurately predict in detail how the system will behave at future times. Consequently, the standard tools of policy analysis, based as they are on devising policies that perform well on some best estimate model of the system, cannot be reliably used for ABM. This paper argues that policy analysis by using ABM requires an alternative approach to decision theory. The general characteristics of such an approach are described, and examples are provided of its application to policy analysis.
NASA Astrophysics Data System (ADS)
Fernandez Galarreta, J.; Kerle, N.; Gerke, M.
2015-06-01
Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. Also the UAV-derived and OBIA-processed oblique images are shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we also demonstrate the possibility of aggregating the multi-perspective damage information at building level.
Magnetic Stars After the Hayashi Phase. I
NASA Astrophysics Data System (ADS)
Glagolevskij, Yu. V.
2016-06-01
The problems of the origin and evolution of magnetic stars based on analysis of observational data are discussed. It is assumed that magnetic stars acquire their major properties during the protostellar collapse stage. The properties of magnetic stars after the Hayashi phase are examined in detail.
Detailed modeling of the train-to-train impact test : rail passenger equipment impact tests
DOT National Transportation Integrated Search
2007-07-01
This report describes the results of a finite element-based analysis of the train-to-train impact test conducted at the Federal Railroad Administration's Transportation Technology Center in Pueblo, CO, on January 31, 2002. The ABAQUS/Explicit dynam...
ERIC Educational Resources Information Center
Moore, Virginia; Sumrall, William; Mott, Michael; Mitchell, Elizabeth; Theobald, Becky
2015-01-01
Methods for facilitating students' standards-based consumer literacy are addressed via the use of problem solving with food and product labels. Fifth graders will be able to: (1) provide detailed analysis of food and product labels; (2) understand large themes, including production, distribution, and consumption; and (3) explore consumer…
Jia, Tao; Gao, Di
2018-04-03
Molecular dynamics simulation is employed to investigate the microscopic heat current inside an argon-copper nanofluid. Wavelet analysis of the microscopic heat current inside the nanofluid system is conducted. The signal of the microscopic heat current is decomposed into two parts: one is the approximation part; the other is the detail part. The approximation part is associated with the low-frequency part of the signal, and the detail part is associated with the high-frequency part of the signal. Both the probability distributions of the high-frequency and the low-frequency parts of the signals demonstrate Gaussian-like characteristics. The curves fit to data of the probability distribution of the microscopic heat current are established, and the parameters including the mean value and the standard deviation in the mathematical formulas of the curves show dramatic changes for the cases before and after adding copper nanoparticles into the argon base fluid.
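The decomposition described above, splitting a signal into a low-frequency approximation part and a high-frequency detail part, is a single-level discrete wavelet transform. A minimal Haar-wavelet sketch on synthetic data (a stand-in series, not the simulated heat-current signal; production work would typically use a wavelet library rather than this hand-rolled version):

```python
import numpy as np

def haar_dwt(x):
    """Single-level Haar wavelet transform of an even-length signal.
    Returns (approximation, detail): scaled pairwise sums (low-frequency
    part) and pairwise differences (high-frequency part), with orthonormal
    1/sqrt(2) scaling so the inverse is exact."""
    x = np.asarray(x, dtype=float)
    s = 1.0 / np.sqrt(2.0)
    return s * (x[0::2] + x[1::2]), s * (x[0::2] - x[1::2])

def haar_idwt(approx, detail):
    """Inverse of haar_dwt: reconstruct and interleave the sample pairs."""
    s = 1.0 / np.sqrt(2.0)
    x = np.empty(2 * len(approx))
    x[0::2] = s * (approx + detail)
    x[1::2] = s * (approx - detail)
    return x

# Synthetic stand-in for a heat-current time series: slow drift plus noise.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 6, 1024)) + 0.3 * rng.standard_normal(1024)

approx, detail = haar_dwt(signal)
# The detail part carries the high-frequency fluctuations; its sample mean
# and standard deviation are the kind of distribution parameters the paper
# fits with Gaussian-like curves.
print(detail.mean(), detail.std())
print(np.allclose(haar_idwt(approx, detail), signal))  # True
```

Deeper decompositions simply re-apply the transform to the approximation part, separating ever lower frequency bands.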
Super-Resolution Reconstruction of Remote Sensing Images Using Multifractal Analysis
Hu, Mao-Gui; Wang, Jin-Feng; Ge, Yong
2009-01-01
Satellite remote sensing (RS) is an important contributor to Earth observation, providing various kinds of imagery every day, but low spatial resolution remains a critical bottleneck in a lot of applications, restricting higher spatial resolution analysis (e.g., intra-urban). In this study, a multifractal-based super-resolution reconstruction method is proposed to alleviate this problem. The multifractal characteristic is common in Nature. The self-similarity or self-affinity presented in the image is useful to estimate details at larger and smaller scales than the original. We first look for the presence of multifractal characteristics in the images. Then we estimate parameters of the information transfer function and noise of the low resolution image. Finally, a noise-free, spatial resolution-enhanced image is generated by a fractal coding-based denoising and downscaling method. The empirical case shows that the reconstructed super-resolution image performs well in detail enhancement. This method is not only useful for remote sensing in investigating Earth, but also for other images with multifractal characteristics. PMID:22291530
The Numerical Propulsion System Simulation: An Overview
NASA Technical Reports Server (NTRS)
Lytle, John K.
2000-01-01
Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, databases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.
Cryogenic Tank Structure Sizing With Structural Optimization Method
NASA Technical Reports Server (NTRS)
Wang, J. T.; Johnson, T. F.; Sleight, D. W.; Saether, E.
2001-01-01
Structural optimization methods in MSC/NASTRAN are used to size substructures and to reduce the weight of a composite sandwich cryogenic tank for future launch vehicles. Because the feasible design space of this problem is non-convex, many local minima are found. This non-convex problem is investigated in detail by conducting a series of analyses along a design line connecting two feasible designs. Strain constraint violations occur for some design points along the design line. Since MSC/NASTRAN uses gradient-based optimization procedures, it does not guarantee that the lowest-weight design can be found. In this study, a simple procedure is introduced to create a new starting point based on design variable values from previous optimization analyses. Optimization analysis using this new starting point can produce a lower-weight design. Detailed inputs for setting up the MSC/NASTRAN optimization analysis and the final tank design results are presented in this paper. Approaches for obtaining further weight reductions are also discussed.
NASA Astrophysics Data System (ADS)
Gao, Jie; Jiang, Li-Li; Xu, Zhen-Yuan
2009-10-01
A new chaos game representation of protein sequences based on the detailed hydrophobic-hydrophilic (HP) model has been proposed by Yu et al (Physica A 337 (2004) 171). A CGR-walk model is proposed based on the new CGR coordinates for the protein sequences from complete genomes in the present paper. The new CGR coordinates based on the detailed HP model are converted into a time series, and a long-memory ARFIMA(p, d, q) model is introduced into the protein sequence analysis. This model is applied to simulating real CGR-walk sequence data of twelve protein sequences. Remarkably long-range correlations are uncovered in the data and the results obtained from these models are reasonably consistent with those available from the ARFIMA(p, d, q) model.
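The CGR-walk construction maps each residue class to a corner of the unit square and takes the midpoint between the previous point and that corner; the resulting coordinate series is then treated as a time series for ARFIMA fitting. A minimal sketch of the walk itself (the detailed HP model's amino-acid-to-class mapping is abstracted here to the integers 0-3, one per corner):

```python
def cgr_walk(classes):
    """Chaos game representation over four corner symbols 0..3.

    Each point is the midpoint between the previous point and the corner
    assigned to the current symbol. The x (or y) coordinates form the
    series that a long-memory ARFIMA(p, d, q) model can be fitted to.
    """
    corners = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.0, 1.0), 3: (1.0, 1.0)}
    x, y = 0.5, 0.5  # conventional starting point: center of the unit square
    points = []
    for c in classes:
        cx, cy = corners[c]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        points.append((x, y))
    return points

# The symbols stand for the four residue classes of the detailed HP model
# (non-polar, negative polar, uncharged polar, positive polar).
pts = cgr_walk([0, 3, 1])
print(pts)  # [(0.25, 0.25), (0.625, 0.625), (0.8125, 0.3125)]
```

Because each step halves the distance to a corner, subsequences of residues map into nested quadrants, which is what gives CGR plots their self-similar structure.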
Neutrino and axion bounds from the globular cluster M5 (NGC 5904).
Viaux, N; Catelan, M; Stetson, P B; Raffelt, G G; Redondo, J; Valcarce, A A R; Weiss, A
2013-12-06
The red-giant branch (RGB) in globular clusters is extended to larger brightness if the degenerate helium core loses too much energy in "dark channels." Based on a large set of archival observations, we provide high-precision photometry for the Galactic globular cluster M5 (NGC 5904), allowing for a detailed comparison between the observed tip of the RGB with predictions based on contemporary stellar evolution theory. In particular, we derive 95% confidence limits of g_ae < 4.3×10^-13 on the axion-electron coupling and μ_ν < 4.5×10^-12 μ_B (Bohr magneton μ_B = e/2m_e) on a neutrino dipole moment, based on a detailed analysis of statistical and systematic uncertainties. The cluster distance is the single largest source of uncertainty and can be improved in the future.
Packaging consideration of two-dimensional polymer-based photonic crystals for laser beam steering
NASA Astrophysics Data System (ADS)
Dou, Xinyuan; Chen, Xiaonan; Chen, Maggie Yihong; Wang, Alan Xiaolong; Jiang, Wei; Chen, Ray T.
2009-02-01
In this paper, we report a theoretical study of polymer-based photonic crystals for laser beam steering, based on the superprism effect, as well as the experimental fabrication of two-dimensional photonic crystals for laser beam steering. The superprism effect, the principle underlying beam steering, was studied separately in detail through equifrequency contour (EFC) analysis. Polymer-based photonic crystals were fabricated by a double-exposure holographic interference method using SU8-2007. The experimental results are also reported.
Simulation on a car interior aerodynamic noise control based on statistical energy analysis
NASA Astrophysics Data System (ADS)
Chen, Xin; Wang, Dengfeng; Ma, Zhengdong
2012-09-01
Accurately simulating interior aerodynamic noise is an important problem in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces proves to be the key factor for controlling high-frequency car interior aerodynamic noise at high speed. In this paper, a detailed statistical energy analysis (SEA) model is built, and the vibro-acoustic power inputs are loaded onto the model to obtain valid results for car interior noise analysis. The model is a solid foundation for further optimization of car interior noise control. After comprehensive SEA analysis identifies the subsystems whose power contributions to car interior noise are most sensitive, the sound pressure level of car interior aerodynamic noise can be reduced by improving their sound and damping characteristics. Further vehicle testing shows that the interior acoustic performance can be improved by using a detailed SEA model, comprising more than 80 subsystems, together with the calculation of unsteady aerodynamic pressure on body surfaces and improvements to the sound/damping properties of materials. A reduction of more than 2 dB is obtained at the central frequencies of the spectrum above 800 Hz. The proposed optimization method can serve as a reference for car interior aerodynamic noise control using a detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and sensitivity analysis of acoustic contributions.
NASA Astrophysics Data System (ADS)
Unni, Vineet; Sankara Narayanan, E. M.
2017-04-01
This is the first report on the numerical analysis of the performance of nanoscale vertical superjunction structures based on impurity doping and an innovative approach that utilizes the polarisation properties inherent in III-V nitride semiconductors. Such nanoscale vertical polarisation superjunction structures can be realized by employing a combination of epitaxial growth along the non-polar crystallographic axes of wurtzite GaN and nanolithography-based processing techniques. Detailed numerical simulations clearly highlight the limitations of a doping-based approach and the advantages of the proposed solution for breaking the unipolar one-dimensional material limits of GaN by orders of magnitude.
Detailed Aerodynamic Analysis of a Shrouded Tail Rotor Using an Unstructured Mesh Flow Solver
NASA Astrophysics Data System (ADS)
Lee, Hee Dong; Kwon, Oh Joon
The detailed aerodynamics of a shrouded tail rotor in hover has been numerically studied using a parallel inviscid flow solver on unstructured meshes. The numerical method is based on a cell-centered finite-volume discretization and an implicit Gauss-Seidel time integration. The calculation was made for a single blade by imposing a periodic boundary condition between adjacent rotor blades. The grid periodicity was also imposed at the periodic boundary planes to avoid numerical inaccuracy resulting from solution interpolation. The results were compared with available experimental data and those from a disk vortex theory for validation. It was found that realistic three-dimensional modeling is important for the prediction of detailed aerodynamics of shrouded rotors including the tip clearance gap flow.
High-throughput density-functional perturbation theory phonons for inorganic materials
NASA Astrophysics Data System (ADS)
Petretto, Guido; Dwaraknath, Shyam; P. C. Miranda, Henrique; Winston, Donald; Giantomassi, Matteo; van Setten, Michiel J.; Gonze, Xavier; Persson, Kristin A.; Hautier, Geoffroy; Rignanese, Gian-Marco
2018-05-01
The knowledge of the vibrational properties of a material is of key importance to understand physical phenomena such as thermal conductivity, superconductivity, and ferroelectricity among others. However, detailed experimental phonon spectra are available only for a limited number of materials, which hinders the large-scale analysis of vibrational properties and their derived quantities. In this work, we perform ab initio calculations of the full phonon dispersion and vibrational density of states for 1521 semiconductor compounds in the harmonic approximation based on density functional perturbation theory. The data is collected along with derived dielectric and thermodynamic properties. We present the procedure used to obtain the results, the details of the provided database and a validation based on the comparison with experimental data.
Global Existence Analysis of Cross-Diffusion Population Systems for Multiple Species
NASA Astrophysics Data System (ADS)
Chen, Xiuqing; Daus, Esther S.; Jüngel, Ansgar
2018-02-01
The existence of global-in-time weak solutions to reaction-cross-diffusion systems for an arbitrary number of competing population species is proved. The equations can be derived from an on-lattice random-walk model with general transition rates. In the case of linear transition rates, it extends the two-species population model of Shigesada, Kawasaki, and Teramoto. The equations are considered in a bounded domain with homogeneous Neumann boundary conditions. The existence proof is based on a refined entropy method and a new approximation scheme. Global existence follows under a detailed balance or weak cross-diffusion condition. The detailed balance condition is related to the symmetry of the mobility matrix, which mirrors Onsager's principle in thermodynamics. Under detailed balance (and without reaction) the entropy is nonincreasing in time, but counter-examples show that the entropy may increase initially if detailed balance does not hold.
AGILE: Autonomous Global Integrated Language Exploitation
2009-12-01
combination, including METEOR-based alignment (with stemming and WordNet synonym matching) and GIZA++-based alignment. So far, we have not seen any... parse trees and a detailed analysis of how function words operate in translation. This program lets us fix alignment errors that systems like GIZA... correlates better with Pyramid than with Responsiveness scoring (i.e., it is a more precise, careful measure) • BE generally outperforms ROUGE
X-Ray Phase Imaging for Breast Cancer Detection
2012-09-01
the Gerchberg-Saxton algorithm in the Fresnel diffraction regime, and is much more robust against image noise than the TIE-based method. For details... developed efficient coding with the software modules for the image registration, flat-field correction, and phase retrievals. In addition, we... X, Liu H. 2010. Performance analysis of the attenuation-partition based iterative phase retrieval algorithm for in-line phase-contrast imaging
ERIC Educational Resources Information Center
Mantri, Archana
2014-01-01
The intent of the study presented in this paper is to show that the model of problem-based learning (PBL) can be made scalable by designing curriculum around a set of open-ended problems (OEPs). The detailed statistical analysis of the data collected to measure the effects of traditional and PBL instructions for three courses in Electronics and…
NASA Astrophysics Data System (ADS)
Vieira, Rodrigo Drumond; Kelly, Gregory J.
2014-11-01
In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective, affords opportunities for analysts to perform a theoretically based detailed analysis of discourse events. Along with the presentation of analysis, we show and discuss how the articulation of different levels offers interpretative criteria for analyzing instructional conversations. We synthesize the results into a model for a teacher's practice and discuss the implications and possibilities of this approach for the field of discourse analysis in science classrooms. Finally, we reflect on how the development of teachers' understanding of their activity structures can contribute to forms of progressive discourse of science education.
Web-based Factors Affecting Online Purchasing Behaviour
NASA Astrophysics Data System (ADS)
Ariff, Mohd Shoki Md; Sze Yan, Ng; Zakuan, Norhayati; Zaidi Bahari, Ahamad; Jusoh, Ahmad
2013-06-01
The growing use of the internet and of online purchasing among young consumers in Malaysia provides a huge prospect in the e-commerce market, specifically for the B2C segment. In this market, if e-marketers know the web-based factors affecting online buyers' behaviour, and the effect of these factors on the behaviour of online consumers, they can develop marketing strategies to convert potential customers into active ones, while retaining existing online customers. A review of previous studies of online purchasing behaviour in the B2C market points out that the conceptualization and empirical validation of the online purchasing behaviour of Information and Communication Technology (ICT) literate users, or ICT professionals, in Malaysia have not been clearly addressed. This paper focuses on (i) the web-based factors which online buyers (ICT professionals) keep in mind while shopping online; and (ii) the effect of these web-based factors on online purchasing behaviour. Based on an extensive literature review, a conceptual framework of 24 items across five factors was constructed to determine the web-based factors affecting the online purchasing behaviour of ICT professionals. Data from 310 questionnaires, collected using a stratified random sampling method from ICT undergraduate students at a public university in Malaysia, were analysed. Exploratory factor analysis showed that the five factors affecting online purchase behaviour are Information Quality, Fulfilment/Reliability/Customer Service, Website Design, Quick and Details, and Privacy/Security. Multiple regression analysis indicated that Information Quality, Quick and Details, and Privacy/Security positively affect online purchase behaviour. The results provide a usable model for measuring web-based factors affecting buyers' online purchase behaviour in the B2C market, as well as for online shopping companies to focus on the factors that will increase customers' online purchases.
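The second analysis stage described above, regressing purchase behaviour on factor scores, is ordinary multiple regression. A minimal sketch of that step with synthetic stand-in data (the factor scores and weights below are fabricated for illustration, not the study's survey responses):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 310  # matches the study's sample size; the values themselves are synthetic

# Synthetic factor scores (in the study these would come out of the
# exploratory factor analysis of the 24 questionnaire items).
info_quality = rng.normal(size=n)
quick_details = rng.normal(size=n)
privacy_security = rng.normal(size=n)

# Behaviour generated with positive weights on all three factors, mimicking
# the sign of the paper's regression findings, plus noise.
behaviour = (0.5 * info_quality + 0.3 * quick_details
             + 0.4 * privacy_security + 0.2 * rng.standard_normal(n))

# Ordinary least squares: design matrix with an intercept column.
X = np.column_stack([np.ones(n), info_quality, quick_details, privacy_security])
coef, *_ = np.linalg.lstsq(X, behaviour, rcond=None)
print(coef)  # intercept near 0; slopes near 0.5, 0.3, 0.4
```

A positive fitted slope on a factor is what supports statements such as "Information Quality positively affects online purchase behaviour" (subject, of course, to the usual significance testing).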
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting the cultural change that integrates distributed science, technology, and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation that bridges the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
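The analytic hierarchy process mentioned in this abstract can be illustrated with a minimal sketch. The pairwise comparison matrix below is invented for illustration (criteria and judgments are assumptions, not values from the NASA GRC/Boeing model); it shows the standard principal-eigenvector computation of priority weights and the consistency-ratio check.

```python
import numpy as np

# Hypothetical pairwise comparisons of three criteria
# (cost-risk, performance, schedule) on Saaty's 1-9 scale.
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   3.0],
              [1/5.0, 1/3.0, 1.0]])

# Priority weights = normalized principal eigenvector of A.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); RI = 0.58 for n = 3.
ci = (vals.real[k] - 3.0) / 2.0
cr = ci / 0.58
print(w, cr)  # weights sum to 1; CR well below the usual 0.1 threshold
```

A consistency ratio under 0.1 indicates that the (hypothetical) judgments are coherent enough to use the weights for ranking design alternatives.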
The applicability of frame imaging from a spinning spacecraft. Volume 1: Summary report
NASA Technical Reports Server (NTRS)
Botticelli, R. A.; Johnson, R. O.; Wallmark, G. N.
1973-01-01
A detailed study was made of frame-type imaging systems for use on board a spin stabilized spacecraft for outer planets applications. All types of frame imagers capable of performing this mission were considered, regardless of the current state of the art. Detailed sensor models of these systems were developed at the component level and used in the subsequent analyses. An overall assessment was then made of the various systems based upon results of a worst-case performance analysis, foreseeable technology problems, and the relative reliability and radiation tolerance of the systems. Special attention was directed at constraints imposed by image motion and the limited data transmission and storage capability of the spacecraft. Based upon this overall assessment, the most promising systems were selected and then examined in detail for a specified Jupiter orbiter mission. The relative merits of each selected system were then analyzed, and the system design characteristics were demonstrated using preliminary configurations, block diagrams, and tables of estimated weights, volumes and power consumption.
Boyanova, Desislava; Nilla, Santosh; Klau, Gunnar W.; Dandekar, Thomas; Müller, Tobias; Dittrich, Marcus
2014-01-01
The continuously evolving field of proteomics produces increasing amounts of data while improving the quality of protein identifications. Although quantitative measurements are becoming more popular, many proteomic studies are still based on non-quantitative methods for protein identification. These studies result in potentially large sets of identified proteins whose biological interpretation can be challenging. Systems biology develops innovative network-based methods which allow an integrated analysis of these data. Here we present a novel approach which combines prior knowledge of protein-protein interactions (PPI) with proteomics data using functional similarity measurements of interacting proteins. This integrated network analysis exactly identifies network modules with a maximal consistent functional similarity reflecting biological processes of the investigated cells. We validated our approach on small (H9N2 virus-infected gastric cells) and large (blood constituents) proteomic data sets. Using this novel algorithm, we identified characteristic functional modules in virus-infected cells, comprising key signaling proteins (e.g. the stress-related kinase RAF1), and demonstrate that this method allows a module-based functional characterization of cell types. Analysis of a large proteome data set of blood constituents resulted in a clear separation of blood cells according to their developmental origin. A detailed investigation of the T-cell proteome further illustrates how the algorithm partitions large networks into functional subnetworks, each representing specific cellular functions. These results demonstrate that the integrated network approach not only allows a detailed analysis of proteome networks but also yields a functional decomposition of complex proteomic data sets, thereby providing deeper insights into the underlying cellular processes of the investigated system. PMID:24807868
NASA Astrophysics Data System (ADS)
Abidi, Oussama; Inoubli, Mohamed Hédi; Sebei, Kawthar; Amiri, Adnen; Boussiga, Haifa; Nasr, Imen Hamdi; Salem, Abdelhamid Ben; Elabed, Mahmoud
2017-05-01
The Maastrichtian-Paleocene El Haria formation was studied and defined in Tunisia on the basis of outcrop and borehole data; few studies have addressed its three-dimensional extent. In this paper, the El Haria formation is reviewed in the context of a tectono-stratigraphic interval using an integrated seismic stratigraphic analysis based on borehole lithology logs, electrical well logging, well shots, vertical seismic profiles, and post-stack surface data. The seismic analysis benefits from appropriate calibration with borehole data, conventional interpretation, velocity mapping, seismic attributes, and post-stack model-based inversion. The applied methodology proved powerful for characterizing the marly Maastrichtian-Paleocene interval of the El Haria formation. Migrated seismic sections together with borehole measurements are used to detail the three-dimensional changes in thickness, facies, and depositional environment in the Cap Bon and Gulf of Hammamet regions during Maastrichtian-Paleocene time. Furthermore, dating based on microfossil content reveals local and multiple internal hiatuses within the El Haria formation, which are related to the geodynamic evolution of the depositional floor since the Campanian stage. Interpreted seismic sections display concordance, unconformities, pinchouts, sedimentary gaps, incised valleys, and syn-sedimentary normal faulting. Based on the seismic reflection geometry and terminations, seven sequences are delineated. These sequences are related to base-level changes resulting from the combination of depositional floor paleo-topography, tectonic forces, subsidence, and the developed accommodation space. These factors controlled the occurrence of the various parts of the Maastrichtian-Paleocene interval. Detailed examination of these deposits, together with analysis of the structural deformation at different time periods, allowed us to obtain a better understanding of the sediment architecture at depth and to delineate the geodynamic evolution of the region.
SWMPrats.net: A Web-Based Resource for Exploring SWMP ...
SWMPrats.net is a web-based resource that provides accessible approaches to using SWMP data. The website includes a user forum with instructional 'Plots of the Month', links to workshop content, and a description of the SWMPr data analysis package for R. Interactive "widgets" allow users to skip the boring parts of data analysis and get right to the fun: visualization and exploration! There are three widgets, each performing a different analysis: system-wide overviews, detailed temporal summaries of a single variable at a single site, and inter-comparisons between sites or variables through time. Users can visually explore system-wide trends in data using the Trends Map widget. For a more detailed analysis, users can create monthly and annual graphs of single variables and locations in the Summary Plot widget. Lastly, users can compare two variables or NERRS locations through time using the Aggregation widget. For all widgets, users can adjust the time period of interest. Plots and tables can also be downloaded for use in outreach, education, or further analysis. The tools and forums are meant to build a community of practice to move SWMP data analysis forward. All widgets will be demonstrated live at the poster session. This abstract is for a poster presentation at the 2016 annual meeting for the National Estuarine Research Reserve System, Nov. 13-18. We will describe our online web resources for the analysis and interpretation of monitoring data.
DOT National Transportation Integrated Search
2016-02-01
In this study, a computational approach for conducting durability analysis of bridges using detailed finite element models is developed. The underlying approach adopted is based on the hypothesis that the two main factors affecting the life of a brid...
Education Cost Study, 2005-2006
ERIC Educational Resources Information Center
Washington Higher Education Coordinating Board, 2007
2007-01-01
Produced every four years by the Washington Higher Education Coordinating Board, the education cost study provides detailed instructional cost information for the state's public two-year and four-year institutions. The cost analysis is based on expenditures drawn from two sources: state Near-General Fund appropriations and tuition revenue. By…
NASA Astrophysics Data System (ADS)
Alsabry, A.; Truszkiewicz, P.; Szymański, K.; Łaskawiec, K.; Rojek, Ł.
2017-12-01
The article presents an analysis of buildings belonging to the Department of Public Utilities and Housing in Zielona Góra. The research was based on a set of questionnaires for building operators, each consisting of 30 questions concerning general and detailed information about the buildings. In order to present the results clearly, this article includes data only for residential and residential-commercial buildings. Forty buildings built in different periods were selected for analysis.
NASA Technical Reports Server (NTRS)
1974-01-01
The feasibility of an evolutionary development of a single-axis gimbal star tracker from prior two-axis gimbal star tracker based system applications is evaluated. A detailed evaluation of the star tracker gimbal encoder is included. A brief system description is given, covering the aspects of tracker evolution and encoder evaluation. The system analysis includes evaluation of star availability and mounting constraints for the geosynchronous orbit application, and a covariance simulation analysis to evaluate performance potential. Star availability and covariance analysis digital computer programs are included.
Calculation of three-dimensional, inviscid, supersonic, steady flows
NASA Technical Reports Server (NTRS)
Moretti, G.
1981-01-01
A detailed description of a computational program for the evaluation of three-dimensional, supersonic, inviscid, steady flow past airplanes is presented. Emphasis is placed on how a powerful, automatic mapping technique is coupled to the fluid mechanical analysis. Each of the three constituents of the analysis (body geometry, mapping technique, and gas dynamical effects) is carefully coded and described. Results of computations based on sample geometries, together with discussion, are also presented.
NASA Astrophysics Data System (ADS)
Sung, Hae-Jin; Go, Byeong-Soo; Jiang, Zhenan; Park, Minwon; Yu, In-Keun
2016-11-01
The development of an effective high-temperature superconducting (HTS) generator is currently a research focus; however, the reduction of heat loss in a large-scale HTS generator is a challenge. This study deals with a heat loss analysis-based design of a 12 MW wind power generator module having an HTS flux pump exciter. The generator module consists of an HTS rotor of the generator and an HTS flux pump exciter. The specifications of the module were described, and the detailed configuration of the module was illustrated. For the heat loss analysis of the module, the excitation loss of the flux pump exciter, the eddy current loss of all of the structures in the module, the radiation loss, and the conduction loss of an HTS coil supporter were assessed using a 3D finite element method program. In the case of the conduction loss, different types of supporters were compared to identify the one with the lowest conduction loss in the module. The heat loss analysis results were reflected in the design of the generator module and are discussed in detail. The results will be applied to the design of large-scale superconducting generators for wind turbines, including the cooling system.
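The conduction-loss comparison between supporter types can be illustrated with a back-of-the-envelope check: steady one-dimensional conduction Q = kAΔT/L through each candidate strut. The materials, conductivities, and geometry below are illustrative assumptions, not the paper's 3D FEM results.

```python
def conduction_loss(k, area, length, t_warm=300.0, t_cold=20.0):
    """Heat leak [W] through a strut: Q = k * A * (T_warm - T_cold) / L."""
    return k * area * (t_warm - t_cold) / length

# Two hypothetical supporter candidates with identical geometry
# (cross-section 1 cm^2, length 0.5 m) but different conductivity [W/m-K].
supporters = {
    "stainless steel": conduction_loss(k=15.0, area=1e-4, length=0.5),
    "G-10 composite":  conduction_loss(k=0.6,  area=1e-4, length=0.5),
}
best = min(supporters, key=supporters.get)
print(best, supporters)  # the low-conductivity supporter leaks far less heat
```

Even this crude estimate shows why supporter material and geometry dominate the conduction budget of a cryogenic module; the paper's 3D FEM analysis refines the same comparison.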
Advanced Thin Ionization Calorimeter (ATIC) Balloon Experiment
NASA Technical Reports Server (NTRS)
Wefel, John P.; Guzik, T. Gregory
2001-01-01
During grant NAG5-5064, Louisiana State University (LSU) led the ATIC team in the development, construction, testing, accelerator validation, pre-deployment integration and flight operations of the Advanced Thin Ionization Calorimeter (ATIC) Balloon Experiment. This involved interfacing among the ATIC collaborators (UMD, NRL/MSFC, SU, MSU, WI, SNU) to develop a new balloon payload based upon a fully active calorimeter, a carbon target, a scintillator strip hodoscope and a pixelated silicon solid state detector for a detailed investigation of the very high energy cosmic rays to energies beyond 10(exp 14) eV/nucleus. It is in this very high energy region that theory predicts changes in composition and energy spectra related to the Supernova Remnant Acceleration model for cosmic rays below the "knee" in the all-particle spectrum. This report provides a documentation list, details the anticipated ATIC science return, describes the particle detection principles on which the experiment is based, summarizes the simulation results for the system, describes the validation work at the CERN SPS accelerator and details the balloon flight configuration. The ATIC experiment had a very successful LDB flight from McMurdo, Antarctica in 12/00 - 1/01. The instrument performed well for the entire 15 days. Preliminary data analysis shows acceptable charge resolution and an all-particle power law energy deposition distribution not inconsistent with previous measurements. Detailed analysis is underway and will result in new data on the cosmic ray charge and energy spectra in the GeV - TeV energy range. ATIC is currently being refurbished in anticipation of another LDB flight in the 2002-03 period.
IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. W. Parry; J.A Forester; V.N. Dang
2013-09-01
This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.
FPCAS3D User's guide: A three dimensional full potential aeroelastic program, version 1
NASA Technical Reports Server (NTRS)
Bakhle, Milind A.
1995-01-01
The FPCAS3D computer code has been developed for aeroelastic stability analysis of bladed disks such as those in fans, compressors, turbines, propellers, or propfans. The aerodynamic analysis used in this code is based on the unsteady three-dimensional full potential equation which is solved for a blade row. The structural analysis is based on a finite-element model for each blade. Detailed explanations of the aerodynamic analysis, the numerical algorithms, and the aeroelastic analysis are not given in this report. This guide can be used to assist in the preparation of the input data required by the FPCAS3D code. A complete description of the input data is provided in this report. In addition, six examples, including inputs and outputs, are provided.
FPCAS2D user's guide, version 1.0
NASA Technical Reports Server (NTRS)
Bakhle, Milind A.
1994-01-01
The FPCAS2D computer code has been developed for aeroelastic stability analysis of bladed disks such as those in fans, compressors, turbines, propellers, or propfans. The aerodynamic analysis used in this code is based on the unsteady two-dimensional full potential equation which is solved for a cascade of blades. The structural analysis is based on a two degree-of-freedom rigid typical section model for each blade. Detailed explanations of the aerodynamic analysis, the numerical algorithms, and the aeroelastic analysis are not given in this report. This guide can be used to assist in the preparation of the input data required by the FPCAS2D code. A complete description of the input data is provided in this report. In addition, four test cases, including inputs and outputs, are provided.
Analysis of BSRT Profiles in the LHC at Injection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitterer, M.; Stancari, G.; Papadopoulou, S.
The beam synchrotron radiation telescope (BSRT) at the LHC makes it possible to record profiles of the transverse beam distribution, which can provide useful additional insight into the evolution of the transverse beam distribution. A python class has been developed [1] which allows one to read in the BSRT profiles (usually stored in binary format), run different analysis tools, and generate plots of the statistical parameters and profiles, as well as videos of the profiles. The detailed analysis will be described in this note. The analysis is based on the data obtained at injection energy (450 GeV) during MD1217 [2] and MD1415 [3], which will also be used as illustrative examples. A similar approach is also taken with a MATLAB-based analysis described in [4].
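The per-profile statistical parameters mentioned here (centroid and RMS width of the transverse distribution) can be sketched with weighted moments. The Gaussian profile below is synthetic, not BSRT data, and the moment-based estimator is a generic illustration rather than the note's actual analysis code.

```python
import numpy as np

# Synthetic transverse profile: intensity vs. position (arbitrary units).
x = np.linspace(-5.0, 5.0, 201)
mu_true, sigma_true = 0.3, 1.2
profile = np.exp(-0.5 * ((x - mu_true) / sigma_true) ** 2)

# Weighted first and second moments give centroid and RMS beam size.
w = profile / profile.sum()
centroid = np.sum(w * x)
rms = np.sqrt(np.sum(w * (x - centroid) ** 2))
print(centroid, rms)  # recovers ~0.3 and ~1.2
```

In practice one would fit a Gaussian (or a double-Gaussian, for tail studies) per acquisition and track these parameters over time to follow the evolution of the distribution.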
Ageing management of french NPP civil work structures
NASA Astrophysics Data System (ADS)
Gallitre, E.; Dauffer, D.
2011-04-01
This paper presents EDF practice for concrete structure ageing management, from the analysis of mechanisms to the formal procedure that allows the French company to extend the lifetime of its 900 MWe NPPs to 40 years; it also introduces the action plan for a 60-year lifetime extension. This practice is based on a methodology which identifies every ageing mechanism; both plant feedback and the state of the art are screened, and conclusions are drawn up into an "ageing analysis data sheet". This led initially to a collection of 57 data sheets, each giving the mechanism identification, the components concerned, and an analysis grid designed to assess the safety risk. This analysis screens the reference documents describing the mechanism, the design lifetime hypotheses, the associated regulation or codification, the feedback experience, the accessibility, the maintenance actions, the repair possibilities, and so on. It must lead to a conclusion about the risk, taking monitoring and maintenance into account. If the data sheet conclusion is not clear enough, a more detailed report is launched. The technical document needed is a formal detailed report which summarizes all theoretical knowledge and monitoring data; its objective is to propose a solution for ageing management, which can include more inspections, specific research and development, or additional maintenance. After a first stage on the 900 MWe units, only two generic ageing management detailed reports have been needed for the civil engineering part: one on the reactor building containment, and one on other structures, focusing on concrete swelling reactions. The second stage consists of applying this generic analysis (ageing mechanisms and detailed reports) to every plant where a complete ageing report is required (one report for all equipment and structures of the plant, but specific to each reactor).
This ageing management is a continuous process: the set of 57 generic data sheets is updated every year and the detailed generic reports every five years. Beyond this 40-year lifetime extension, EDF is preparing a 60-year lifetime action plan which includes R&D actions, specific industrial studies, and monitoring improvements.
Analysis of the implementation of ergonomic design at the new units of an oil refinery.
Passero, Carolina Reich Marcon; Ogasawara, Erika Lye; Baú, Lucy Mara Silva; Buso, Sandro Artur; Bianchi, Marcos Cesar
2012-01-01
Ergonomic design is the adaptation of working conditions to human limitations and skills in the physical design phase of a new installation, a new working system, or new products or tools. Based on this concept, the purpose of this work was to analyze the implementation of ergonomic design at the new industrial units of an oil refinery, using the method of Ergonomic Workplace Assessment. This study was conducted by a multidisciplinary team composed of operation, maintenance and industrial safety technicians, ergonomists, designers and engineers. The analysis involved 6 production units, 1 industrial wastewater treatment unit, and 3 utilities units, all in the design detailing phase, for which 455 ergonomic requirements were identified. An analysis and characterization of the requirements identified for 5 of the production units, involving a total of 246 items, indicated that 62% were related to difficult access and blockage operations, while 15% were related to difficulties in the circulation of employees inside the units. Based on these data, it was found that the ergonomic requirements identified in the design detailing phase of an industrial unit involve physical ergonomics, and that it is very difficult to identify requirements related to organizational or cognitive ergonomics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finsterle, Stefan A.
2010-11-01
iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. It performs sensitivity analysis, parameter estimation, and uncertainty propagation analysis in the geosciences, reservoir engineering, and other application areas. It supports a number of different combinations of fluids and components [equation-of-state (EOS) modules]. In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files. This link is achieved by means of the PEST application programming interface. iTOUGH2 solves the inverse problem by minimizing a non-linear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based, and second-order; local and global) are available. iTOUGH2 also performs Latin hypercube Monte Carlo simulations for uncertainty propagation analysis. A detailed residual and error analysis is provided. This upgrade includes new EOS modules (specifically EOS7c, ECO2N and TMVOC), hysteretic relative permeability and capillary pressure functions, and the PEST API. More details can be found at http://esd.lbl.gov/iTOUGH2 and in the publications cited there. Hardware requirements: multi-platform. Related/auxiliary software: PVM (if running in parallel).
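The inverse-problem formulation described here (minimize a non-linear objective function of weighted model-observation differences) can be illustrated with a toy example. The "model" below is a simple exponential decay standing in for a TOUGH2 forward run, the data are synthetic, and the derivative-free compass search is a generic stand-in for iTOUGH2's minimization options, not its actual algorithms.

```python
import numpy as np

def model(params, t):
    """Toy forward model: a * exp(-lam * t)."""
    a, lam = params
    return a * np.exp(-lam * t)

t_obs = np.linspace(0.0, 10.0, 20)
obs = model((2.0, 0.3), t_obs)   # noise-free synthetic observations
sigma = 0.1                       # assumed observation standard deviation

def objective(p):
    """Weighted sum of squared residuals."""
    r = (model(p, t_obs) - obs) / sigma
    return float(np.sum(r * r))

# Derivative-free compass (pattern) search from a deliberately wrong start.
p = np.array([1.0, 0.1])
step = np.array([0.5, 0.1])
for _ in range(300):
    improved = False
    for i in range(2):
        for s in (step[i], -step[i]):
            q = p.copy()
            q[i] += s
            if objective(q) < objective(p):
                p, improved = q, True
    if not improved:
        step *= 0.5               # refine the mesh once no move helps
print(p)                          # approaches the true parameters (2.0, 0.3)
```

A real inversion adds regularization, handles noisy data, and follows the minimization with the residual and error analysis the abstract mentions.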
Non-minimally coupled condensate cosmologies: a phase space analysis
NASA Astrophysics Data System (ADS)
Carloni, Sante; Vignolo, Stefano; Cianci, Roberto
2014-09-01
We present an analysis of the phase space of cosmological models based on a non-minimal coupling between the geometry and a fermionic condensate. We observe that the strong constraint coming from the Dirac equations allows a detailed design of the cosmology of these models, and at the same time guarantees an evolution towards a state indistinguishable from general relativistic cosmological models. In this light, we show in detail how the use of some specific potentials can naturally reproduce a phase of accelerated expansion. In particular, we find for the first time that an exponential potential is able to induce two de Sitter phases separated by a power law expansion, which could be an interesting model for the unification of an inflationary phase and a dark energy era.
Design optimization of a prescribed vibration system using conjoint value analysis
NASA Astrophysics Data System (ADS)
Malinga, Bongani; Buckner, Gregory D.
2016-12-01
This article details a novel design optimization strategy for a prescribed vibration system (PVS) used to mechanically filter solids from fluids in oil and gas drilling operations. A dynamic model of the PVS is developed, and the effects of disturbance torques are detailed. This model is used to predict the effects of design parameters on system performance and efficiency, as quantified by system attributes. Conjoint value analysis, a statistical technique commonly used in marketing science, is utilized to incorporate designer preferences. This approach effectively quantifies and optimizes preference-based trade-offs in the design process. The effects of designer preferences on system performance and efficiency are simulated. This novel optimization strategy yields improvements in all system attributes across all simulated vibration profiles, and is applicable to other industrial electromechanical systems.
Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora
2018-06-15
Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for the application of principal component analysis (PCA) to mass spectrometry data, focusing on two whole-spectrum-based normalization techniques and their use both in the analysis of registered peak data and, for comparison, in full-spectrum data analysis. We used this approach to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities were implemented: ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For the three tested cultivation media, only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied to data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full-spectrum analysis likewise identified only two different metabolic patterns: cultivation on Enterobacter sakazakii Isolation Agar showed significant differences from cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also showed a dependence on cultivation time. Both whole-spectrum-based normalization techniques, together with full-spectrum PCA, allow identification of important discriminative factors in experiments with several variable condition factors, avoiding problems with improper identification of peaks or undue emphasis on below-threshold peak data. The amount of processed data remains manageable.
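The whole-spectrum normalization plus full-spectrum PCA workflow can be sketched as follows. The spectra are synthetic (two groups with different marker peaks, mimicking two metabolic patterns), and total-ion-current scaling stands in for the paper's normalization techniques; none of this is the study's actual data or code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic spectra: group A has a marker peak at bin 30, group B at bin 70.
# A random overall intensity scale mimics shot-to-shot variation that the
# whole-spectrum normalization must remove.
spectra = []
for g in range(2):
    for _ in range(10):
        s = rng.random(100) * 0.1        # low-level baseline "noise"
        s[30 if g == 0 else 70] += 1.0   # group-specific marker peak
        spectra.append(s * rng.uniform(0.5, 2.0))
X = np.array(spectra)

# Whole-spectrum (total ion current) normalization.
X = X / X.sum(axis=1, keepdims=True)

# PCA via SVD of the mean-centred data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                           # sample scores on the PCs
print(scores[:10, 0])                    # group A scores on PC1
print(scores[10:, 0])                    # group B scores: opposite side
```

Because the between-group peak difference dominates the centred variance, the first principal component cleanly separates the two simulated "metabolic patterns".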
Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms. Copyright © 2018 John Wiley & Sons, Ltd.
Risk-based maintenance of ethylene oxide production facilities.
Khan, Faisal I; Haddara, Mahmoud R
2004-05-20
This paper discusses a methodology for the design of an optimum inspection and maintenance program. The methodology, called risk-based maintenance (RBM), is based on integrating a reliability approach and a risk assessment strategy to obtain an optimum maintenance schedule. First, the likely equipment failure scenarios are formulated. Of the many likely failure scenarios, the most probable ones are subjected to a detailed study. Detailed consequence analysis is done for the selected scenarios. Subsequently, these failure scenarios are subjected to a fault tree analysis to determine their probabilities. Finally, risk is computed by combining the results of the consequence and the probability analyses. The calculated risk is compared against known acceptance criteria. The frequencies of the maintenance tasks are obtained by minimizing the estimated risk. A case study involving an ethylene oxide production facility is presented. Of the five most hazardous units considered, the pipeline used for the transportation of ethylene is found to have the highest risk. Using available failure data and a lognormal reliability distribution function, human health risk factors are calculated. Both societal and individual risk factors exceeded the acceptable risk criteria. To determine an optimal maintenance interval, a reverse fault tree analysis was used. The maintenance interval was determined such that the original high risk is brought down to an acceptable level. A sensitivity analysis was also undertaken to study the impact on the maintenance interval of changing the distribution of the reliability model as well as of errors in the distribution parameters.
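The core RBM idea (risk = failure probability × consequence, with the maintenance interval chosen so risk stays acceptable) can be sketched with a lognormal reliability model. All parameter values below are invented for illustration; they are not the refinery case-study data.

```python
import math

# Hypothetical lognormal time-to-failure model: median life 8 years.
MU, SIGMA = math.log(8.0), 0.6
CONSEQUENCE = 5.0e6   # assumed failure consequence [$]
RISK_LIMIT = 1.0e5    # assumed acceptable risk [$] per inspection interval

def failure_prob(t):
    """P(T <= t) for the lognormal reliability model."""
    z = (math.log(t) - MU) / SIGMA
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def max_interval():
    """Longest interval t with failure_prob(t) * CONSEQUENCE <= RISK_LIMIT,
    found by bisection on the monotone risk function."""
    lo, hi = 1e-3, 50.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if failure_prob(mid) * CONSEQUENCE > RISK_LIMIT:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

t = max_interval()
print(round(t, 2), failure_prob(t) * CONSEQUENCE)
```

With these assumed numbers the acceptable failure probability per interval is 2%, which the lognormal model reaches after roughly two to three years, so inspections would be scheduled at least that often.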
A new image encryption algorithm based on the fractional-order hyperchaotic Lorenz system
NASA Astrophysics Data System (ADS)
Wang, Zhen; Huang, Xia; Li, Yu-Xia; Song, Xiao-Na
2013-01-01
We propose a new image encryption algorithm based on the fractional-order hyperchaotic Lorenz system. In the process of generating the key stream, the system parameters and the derivative order are embedded in the proposed algorithm to enhance its security. The algorithm is examined through detailed security analyses, including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. The experimental results demonstrate that the proposed image encryption scheme has the advantages of a large key space and high security for practical image encryption.
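The general chaos-based encryption idea, a chaotic keystream XORed with the pixel bytes, can be illustrated with a toy sketch. For simplicity this uses the logistic map as the chaotic source, not the paper's fractional-order hyperchaotic Lorenz system, and the key is simply the pair (x0, r); it is a demonstration of the principle, not a secure cipher.

```python
def keystream(x0, r, n):
    """Generate n keystream bytes by iterating the logistic map."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)           # chaotic iteration, x stays in (0, 1)
        out.append(int(x * 256) % 256)  # quantize the state to one byte
    return out

def crypt(data, key):
    """XOR cipher: the same call encrypts and decrypts."""
    ks = keystream(key[0], key[1], len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

img = bytes(range(64))                  # stand-in for image pixel data
enc = crypt(img, (0.123456, 3.99))
dec = crypt(enc, (0.123456, 3.99))
print(dec == img)                       # True: decryption restores the data
```

Key sensitivity, one of the analyses listed in the abstract, follows from the chaotic divergence: changing x0 in the sixth decimal place yields an entirely different keystream after a few iterations, so the ciphertext changes completely.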
Cognitive approaches for patterns analysis and security applications
NASA Astrophysics Data System (ADS)
Ogiela, Marek R.; Ogiela, Lidia
2017-08-01
This paper presents new opportunities for developing innovative solutions for semantic pattern classification and visual cryptography based on cognitive and bio-inspired approaches. Such techniques can be used to evaluate the meaning of analyzed patterns or encrypted information, and allow that meaning to be incorporated into the classification task or the encryption process. They also allow crypto-biometric solutions to extend personalized cryptography methodologies based on visual pattern analysis. In particular, the application of cognitive information systems to the semantic analysis of different patterns is presented, along with a novel application of such systems to visual secret sharing. Visual shares of the divided information can be created with a threshold procedure, which may depend on personal abilities to recognize image details visible in the divided images.
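The secret-sharing idea underlying such schemes can be shown with a minimal (2, 2) sketch: each share alone is indistinguishable from noise, and only combining both reconstructs the secret. This XOR-based construction is an illustrative stand-in for the cognitive/threshold procedures described, not the paper's actual algorithm.

```python
import secrets

def make_shares(image: bytes):
    """Split a secret into two shares: one uniformly random, one the XOR."""
    share1 = secrets.token_bytes(len(image))              # pure random noise
    share2 = bytes(a ^ b for a, b in zip(image, share1))  # also noise-like
    return share1, share2

def reconstruct(share1: bytes, share2: bytes) -> bytes:
    """Combine the shares: XOR cancels the randomness, revealing the secret."""
    return bytes(a ^ b for a, b in zip(share1, share2))

img = b"secret-pattern"        # stand-in for the shared image data
s1, s2 = make_shares(img)
print(reconstruct(s1, s2))     # recovers the original bytes
```

In proper visual cryptography the shares are printed pixel patterns overlaid optically rather than XORed digitally, but the information-theoretic structure is the same: any single share carries no information about the secret.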
Toward Theory-Based Instruction in Scientific Problem Solving.
ERIC Educational Resources Information Center
Heller, Joan I.; And Others
Several empirical and theoretical analyses related to scientific problem-solving are reviewed, including: detailed studies of individuals at different levels of expertise, and computer models simulating some aspects of human information processing during problem solving. Analysis of these studies has revealed many facets about the nature of the…
Publications - GMC 383 | Alaska Division of Geological & Geophysical
DGGS GMC 383 Publication Details. Title: Makushin Geothermal Project ST-1R, A-1, D-2 Core 2009 re-sampling and analysis: Analytical results for anomalous precious and base metals associated with geothermal systems
Publications - GMC 366 | Alaska Division of Geological & Geophysical
DGGS GMC 366 Publication Details. Title: Makushin Geothermal Project ST-1R Core 2009 re-sampling and analysis: Analytical results for anomalous precious and base metals associated with geothermal systems
Education Cost Study, 2001-02. Revised
ERIC Educational Resources Information Center
Washington Higher Education Coordinating Board, 2004
2004-01-01
Produced every four years by the Washington Higher Education Coordinating Board (HECB), the Education Cost Study provides detailed instructional cost information for the state's public two-year and four-year institutions. The cost analysis is based on expenditures drawn from two sources: (1) state appropriations; and (2) tuition revenue. By using…
Learning Molecular Behaviour May Improve Student Explanatory Models of the Greenhouse Effect
ERIC Educational Resources Information Center
Harris, Sara E.; Gold, Anne U.
2018-01-01
We assessed undergraduates' representations of the greenhouse effect, based on student-generated concept sketches, before and after a 30-min constructivist lesson. Principal component analysis of features in student sketches revealed seven distinct and coherent explanatory models including a new "Molecular Details" model. After the…
Molecular phylogeny and evolutionary timescale for the family of mammalian herpesviruses.
McGeoch, D J; Cook, S; Dolan, A; Jamieson, F E; Telford, E A
1995-03-31
A detailed phylogenetic analysis for mammalian members of the family Herpesviridae, based on molecular sequences, is reported. Sets of encoded amino acid sequences were collected for eight well conserved genes that are common to mammalian herpesviruses. Phylogenetic trees were inferred from alignments of these sequence sets using both maximum parsimony and distance methods, and evaluated by bootstrap analysis. In all cases the three recognised subfamilies (Alpha-, Beta- and Gammaherpesvirinae), and major sublineages in each subfamily, were clearly distinguished, but within sublineages some finer details of branching were incompletely resolved. Multiple-gene sets were assembled to give a broadly based tree. The root position of the tree was estimated by assuming a constant molecular clock and also by analysis of one herpesviral gene set (that encoding uracil-DNA glycosylase) using cellular homologues as outgroups. Both procedures placed the root between the Alphaherpesvirinae and the other two subfamilies. Substitution rates were calculated for the combined gene sets based on a previous estimate for alphaherpesviral UL27 genes, where the time base had been obtained according to the hypothesis of cospeciation of virus and host lineages. Assuming a constant molecular clock, it was then estimated that the three subfamilies arose approximately 180 to 220 million years ago, that major sublineages within subfamilies were probably generated before the mammalian radiation of 80 to 60 million years ago, and that speciations within sublineages took place in the last 80 million years, probably with a major component of cospeciation with host lineages.
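The molecular-clock dating used above rests on a one-line relation: under a constant clock, two lineages that diverged T years ago accumulate a pairwise distance d = 2rT, where r is the substitution rate per site per year. A worked example (with hypothetical numbers, not the paper's data):

```python
# Molecular-clock dating: pairwise distance d (substitutions/site) and
# rate r (substitutions/site/year) give divergence time T = d / (2r),
# since both lineages accumulate substitutions independently.
def divergence_time(distance, rate_per_site_per_year):
    """Years since the common ancestor, assuming a constant clock."""
    return distance / (2.0 * rate_per_site_per_year)

# e.g. d = 1.2 substitutions/site at r = 3e-9 substitutions/site/year
t = divergence_time(1.2, 3.0e-9)
print(t / 1e6)  # -> 200.0 (million years)
```

The factor of 2 is the step most easily forgotten: the observed distance is the sum of changes along both branches leading back to the common ancestor.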
NASA Technical Reports Server (NTRS)
Sease, Bradley; Myers, Jessica; Lorah, John; Webster, Cassandra
2017-01-01
The Wide Field Infrared Survey Telescope is a 2.4-meter telescope planned for launch to the Sun-Earth L2 point in 2026. This paper details a preliminary study of the achievable accuracy for WFIRST from ground-based orbit determination routines. The analysis here is divided into two segments. First, a linear covariance analysis of early mission and routine operations provides an estimate of the tracking schedule required to meet mission requirements. Second, a simulated operations scenario gives insight into the expected behavior of a daily Extended Kalman Filter orbit estimate over the first mission year given a variety of potential momentum unloading schemes.
Lagrangian analysis of multiscale particulate flows with the particle finite element method
NASA Astrophysics Data System (ADS)
Oñate, Eugenio; Celigueta, Miguel Angel; Latorre, Salvador; Casas, Guillermo; Rossi, Riccardo; Rojek, Jerzy
2014-05-01
We present a Lagrangian numerical technique for the analysis of flows incorporating physical particles of different sizes. The numerical approach is based on the particle finite element method (PFEM) which blends concepts from particle-based techniques and the FEM. The basis of the Lagrangian formulation for particulate flows and the procedure for modelling the motion of small and large particles that are submerged in the fluid are described in detail. The numerical technique for analysis of this type of multiscale particulate flows using a stabilized mixed velocity-pressure formulation and the PFEM is also presented. Examples of application of the PFEM to several particulate flows problems are given.
Trellis coding with Continuous Phase Modulation (CPM) for satellite-based land-mobile communications
NASA Technical Reports Server (NTRS)
1989-01-01
This volume of the final report summarizes the results of our studies on the satellite-based mobile communications project. It includes: a detailed analysis, design, and simulations of trellis coded, full/partial response CPM signals with/without interleaving over various Rician fading channels; analysis and simulation of computational cutoff rates for coherent, noncoherent, and differential detection of CPM signals; optimization of the complete transmission system; analysis and simulation of power spectrum of the CPM signals; design and development of a class of Doppler frequency shift estimators; design and development of a symbol timing recovery circuit; and breadboard implementation of the transmission system. Studies prove the suitability of the CPM system for mobile communications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reynolds, K.M.; Holsten, E.H.; Werner, R.A.
1995-03-01
SBexpert version 1.0 is a knowledge-based decision-support system for management of the spruce beetle, developed for use in Microsoft Windows. The user's guide provides detailed instructions on the use of all SBexpert features. SBexpert has four main subprograms: introduction, analysis, textbook, and literature. The introduction is the first of the five subtopics in the SBexpert help system. The analysis topic, the main analytical topic in SBexpert, is an advisory system for spruce beetle management that provides recommendations for reducing spruce beetle hazard and risk to spruce stands. The textbook and literature topics provide complementary decision support for the analysis.
1988-06-01
Based Software Engineering Project Course ... Software Engineering Concepts: The Importance of Object-Based ... quality assurance, and independent system testing. The Chief Programmer is responsible for all software development activities, including prototyping ... during the Requirements Analysis phase, the Preliminary Design, the Detailed Design, Coding and Unit Testing, CSC Integration and Testing, and informal
Operational Based Vision Assessment Cone Contrast Test: Description and Operation
2016-06-02
Jun 2016. Report contains color. The work detailed in this report was conducted by the Operational Based Vision Assessment (OBVA...currently used by the Air Force for aircrew color vision screening. The new OBVA CCT is differentiated from the Rabin device primarily by hardware...test procedures, and analysis techniques. Like the Rabin CCT, the OBVA CCT uses colors that selectively stimulate the cone photoreceptors of the
Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis
NASA Technical Reports Server (NTRS)
Sexstone, Matthew G.
1998-01-01
This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.
Satellite power systems (SPS) concept definition study. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Hanley, G. M.
1980-01-01
System definition studies resulted in a further definition of the reference system using gallium arsenide solar arrays, analysis of alternative subsystem options for the reference concept, preliminary solid state microwave concept studies, and an environmental analysis of laser transmission systems. The special emphasis studies concentrated on satellite construction, satellite construction base definition, satellite construction base construction, and rectenna construction. Major emphasis in the transportation studies was put on definition of a two stage parallel burn, vertical takeoff/horizontal landing concept. The electric orbit transfer vehicle was defined in greater detail. Program definition included cost analyses and schedule definition.
Analysis of Arterial Mechanics During Head-down Tilt Bed Rest
NASA Technical Reports Server (NTRS)
Elliot, Morgan; Martin, David S.; Westby, Christian M.; Stenger, Michael B.; Platts, Steve
2014-01-01
Arterial health may be affected by microgravity or ground-based analogs of spaceflight, as shown by an increase in thoracic aorta stiffness [1]. Head-down tilt bed rest (HDTBR) is often used as a ground-based simulation of spaceflight because it induces physiological changes similar to those that occur in space [2, 3]. This abstract details an analysis of arterial stiffness (a subclinical measure of atherosclerosis), the distensibility coefficient (DC), and the pressure-strain elastic modulus (PSE) of the arterial walls during HDTBR. This project may help determine how spaceflight differentially affects arterial function in the upper vs. lower body.
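The two stiffness indices named in the abstract have standard literature definitions, sketched below. Whether the study used exactly these formulations is an assumption; the input values are invented for illustration. With systolic/diastolic lumen areas As/Ad, diameters Ds/Dd, and pressures Ps/Pd: DC = (As - Ad) / (Ad (Ps - Pd)) and Peterson's pressure-strain elastic modulus Ep = (Ps - Pd) Dd / (Ds - Dd).

```python
# Common definitions of two arterial-stiffness indices (a sketch; the
# study's exact formulations may differ). Units: areas in mm^2, diameters
# in mm, pressures in mmHg.

def distensibility(area_sys, area_dia, p_sys, p_dia):
    """DC in 1/mmHg: relative area change per unit pulse pressure."""
    return (area_sys - area_dia) / (area_dia * (p_sys - p_dia))

def elastic_modulus(d_sys, d_dia, p_sys, p_dia):
    """Ep in mmHg: pulse pressure over relative diameter change."""
    return (p_sys - p_dia) * d_dia / (d_sys - d_dia)

# Hypothetical carotid values (not study data).
dc = distensibility(area_sys=38.5, area_dia=36.3, p_sys=120, p_dia=80)
ep = elastic_modulus(d_sys=7.0, d_dia=6.8, p_sys=120, p_dia=80)
```

A stiffer artery shows a lower DC (less area change for the same pulse pressure) and a higher Ep.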
Convergence acceleration of the Proteus computer code with multigrid methods
NASA Technical Reports Server (NTRS)
Demuren, A. O.; Ibraheem, S. O.
1992-01-01
Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
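The multigrid concept reviewed above is easiest to see on the simplest model problem. Below is a hedged sketch (illustrative only, not Proteus code, which targets the Euler and Navier-Stokes equations) of a recursive V-cycle for the 1-D Poisson equation -u'' = f with zero boundary values: damped-Jacobi smoothing, full-weighting restriction, linear-interpolation prolongation, and an exact solve on the coarsest grid. NumPy availability, grid sizes, and sweep counts are assumptions of the sketch.

```python
# Two-level idea of multigrid on -u'' = f, u(0) = u(1) = 0: smooth the
# high-frequency error on the fine grid, solve for the smooth remainder
# on coarser grids, interpolate the correction back, and smooth again.
import numpy as np

def smooth(u, f, h, sweeps=3, w=2.0/3.0):
    """Damped Jacobi sweeps (RHS is evaluated before the in-place add)."""
    for _ in range(sweeps):
        u[1:-1] += w * 0.5 * (u[:-2] + u[2:] - 2*u[1:-1] + h*h*f[1:-1])
    return u

def residual(u, f, h):
    """r = f - A u for the standard three-point Laplacian."""
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] + (u[:-2] - 2*u[1:-1] + u[2:]) / (h*h)
    return r

def v_cycle(u, f, h):
    n = len(u) - 1                        # intervals (a power of two)
    if n == 2:                            # coarsest grid: solve exactly
        u[1] = 0.5 * h * h * f[1] + 0.5 * (u[0] + u[2])
        return u
    u = smooth(u, f, h)                   # pre-smoothing
    r = residual(u, f, h)
    rc = np.zeros(n // 2 + 1)             # full-weighting restriction
    rc[1:-1] = 0.25 * (r[1:-2:2] + 2*r[2:-1:2] + r[3::2])
    ec = v_cycle(np.zeros_like(rc), rc, 2*h)
    e = np.zeros_like(u)                  # linear-interpolation prolongation
    e[::2] = ec
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])
    u += e                                # coarse-grid correction
    return smooth(u, f, h)                # post-smoothing

n = 64
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
f = np.pi**2 * np.sin(np.pi * x)          # exact solution: sin(pi x)
u = np.zeros(n + 1)
for _ in range(8):
    u = v_cycle(u, f, h)
print(np.max(np.abs(u - np.sin(np.pi * x))))  # ~ discretization error
```

The acceleration comes from the division of labor: Jacobi damps only the oscillatory error components, while the recursively coarsened grids eliminate the smooth components that Jacobi alone reduces very slowly.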
Design of impact-resistant boron/aluminum large fan blade
NASA Technical Reports Server (NTRS)
Salemme, C. T.; Yokel, S. A.
1978-01-01
The technical program comprised two technical tasks. Task 1 encompassed the preliminary boron/aluminum fan blade design effort, in which two preliminary designs were evolved. An initial design consisted of 32 blades per stage and was based on material properties extracted from manufactured blades; a final design of 36 blades per stage was based on rule-of-mixture material properties. In Task 2, the selected preliminary blade design was refined via more sophisticated analytical tools. Detailed finite element stress analysis and aero performance analysis were carried out to determine blade material frequencies and directional stresses.
Analysis of high voltage step-up nonisolated DC-DC boost converters
NASA Astrophysics Data System (ADS)
Alisson Alencar Freitas, Antônio; Lessa Tofoli, Fernando; Junior, Edilson Mineiro Sá; Daher, Sergio; Antunes, Fernando Luiz Marcelo
2016-05-01
A high voltage step-up nonisolated DC-DC converter based on coupled inductors suitable to photovoltaic (PV) systems applications is proposed in this paper. Considering that numerous approaches exist to extend the voltage conversion ratio of DC-DC converters that do not use transformers, a detailed comparison is also presented among the proposed converter and other popular topologies such as the conventional boost converter and the quadratic boost converter. The qualitative analysis of the coupled-inductor-based topology is developed so that a design procedure can be obtained, from which an experimental prototype is implemented to validate the theoretical assumptions.
Zinken, Katarzyna M; Cradock, Sue; Skinner, T Chas
2008-08-01
The paper presents the development of a coding tool for self-efficacy orientated interventions in diabetes self-management programmes (Analysis System for Self-Efficacy Training, ASSET) and explores its construct validity and clinical utility. Based on four sources of self-efficacy (i.e., mastery experience, role modelling, verbal persuasion and physiological and affective states), published self-efficacy based interventions for diabetes care were analysed in order to identify specific verbal behavioural techniques. Video-recorded facilitating behaviours were evaluated using ASSET. The reliability between four coders was high (K=0.71). ASSET enabled assessment of both self-efficacy based techniques and participants' response to those techniques. Individual patterns of delivery and shifts over time across facilitators were found. In the presented intervention we observed that self-efficacy utterances were followed by longer patient verbal responses than non-self-efficacy utterances. These detailed analyses with ASSET provide rich data and give the researcher an insight into the underlying mechanism of the intervention process. By providing a detailed description of self-efficacy strategies ASSET can be used by health care professionals to guide reflective practice and support training programmes.
A low-cost drone based application for identifying and mapping of coastal fish nursery grounds
NASA Astrophysics Data System (ADS)
Ventura, Daniele; Bruno, Michele; Jona Lasinio, Giovanna; Belluscio, Andrea; Ardizzone, Giandomenico
2016-03-01
Acquiring seabed, landform or other topographic data in the field of marine ecology has a pivotal role in defining and mapping key marine habitats. However, acquiring this kind of data at a high level of detail for very shallow and inaccessible marine habitats has often been challenging and time consuming, and spatial and temporal coverage often has to be compromised to make the monitoring routine more cost effective. Nowadays, emerging technologies can overcome many of these constraints. Here we describe a recent development in remote sensing based on a small unmanned aerial vehicle (UAV) that produces very fine-scale maps of fish nursery areas. This technology is simple to use, inexpensive, and timely in producing aerial photographs of marine areas. Technical details regarding aerial photo acquisition (drone and camera settings) and the post-processing workflow (3D model generation with the Structure from Motion algorithm and photo-stitching) are given. Finally, by applying modern algorithms of semi-automatic image analysis and classification (Maximum Likelihood, ECHO and Object-Based Image Analysis), we compare the resulting three thematic maps of a nursery area for juvenile sparid fishes, highlighting the potential of this method in mapping and monitoring coastal marine habitats.
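Of the classifiers named above, Maximum Likelihood is the simplest to sketch: each habitat class is modelled as a Gaussian over pixel values and each pixel is assigned to the class of highest likelihood. Real workflows are multispectral with per-class covariance matrices; this single-band version with invented class statistics only illustrates the decision rule.

```python
# Per-pixel Maximum Likelihood classification, 1-band sketch: assign each
# pixel to the class whose Gaussian model gives the highest (log) likelihood.
import math

CLASS_STATS = {          # hypothetical mean/std of pixel intensity per class
    "sand":     (200.0, 15.0),
    "seagrass": ( 80.0, 20.0),
    "rock":     (130.0, 25.0),
}

def log_likelihood(value, mean, std):
    """Log of a Gaussian density, dropping the constant term."""
    return -math.log(std) - 0.5 * ((value - mean) / std) ** 2

def classify(pixel_value):
    """Pick the class with the highest log-likelihood for this pixel."""
    return max(CLASS_STATS,
               key=lambda c: log_likelihood(pixel_value, *CLASS_STATS[c]))

print([classify(v) for v in (75.0, 140.0, 210.0)])
```

Note that the -log(std) term matters: it penalizes broad classes, so the rule is not simply "nearest class mean."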
Axisymmetric computational fluid dynamics analysis of a film/dump-cooled rocket nozzle plume
NASA Technical Reports Server (NTRS)
Tucker, P. K.; Warsi, S. A.
1993-01-01
Prediction of convective base heating rates for a new launch vehicle presents significant challenges to analysts concerned with base environments. The present effort seeks to augment classical base heating scaling techniques via a detailed investigation of the exhaust plume shear layer of a single H2/O2 Space Transportation Main Engine (STME). Use of fuel-rich turbine exhaust to cool the STME nozzle presented concerns regarding potential recirculation of these gases to the base region with attendant increase in the base heating rate. A pressure-based full Navier-Stokes computational fluid dynamics (CFD) code with finite rate chemistry is used to predict plumes for vehicle altitudes of 10 kft and 50 kft. Levels of combustible species within the plume shear layers are calculated in order to assess assumptions made in the base heating analysis.
SU-E-T-635: Process Mapping of Eye Plaque Brachytherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huynh, J; Kim, Y
Purpose: To apply a risk-based assessment and analysis technique (AAPM TG 100) to eye plaque brachytherapy treatment of ocular melanoma. Methods: The roles and responsibilities of the personnel involved in eye plaque brachytherapy are defined for the retinal specialist, radiation oncologist, nurse, and medical physicist. The entire procedure was examined carefully: major processes were identified first, and then the details of each major process were followed. Results: Seventy-one total potential failure modes were identified. The eight major processes (with the corresponding number of detailed modes) are patient consultation (2 modes), pretreatment tumor localization (11), treatment planning (13), seed ordering and calibration (10), eye plaque assembly (10), implantation (11), removal (11), and deconstruction (3). Half of the total modes (36) are related to the physicist, although the physicist is not involved in processes such as the actual procedures of suturing and removing the plaque. Conclusion: Failure modes arise not only from physicist-related procedures such as treatment planning and source activity calibration but also from more clinical procedures performed by other medical staff. Improving the accuracy of communication for non-physicist-related clinical procedures could be an approach to preventing human errors, and more rigorous physics double checks would reduce errors in physicist-related procedures. Eventually, based on this detailed process map, failure mode and effect analysis (FMEA) will identify the top tiers of modes by ranking all possible modes with a risk priority number (RPN). For those high-risk modes, fault tree analysis (FTA) will provide possible preventive action plans.
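The FMEA step the abstract points to next follows the TG-100 convention: each failure mode is scored for severity (S), occurrence (O), and lack of detectability (D), the risk priority number is RPN = S x O x D, and modes are ranked to find the top tier. The modes and scores below are invented for illustration, not results from this work.

```python
# FMEA ranking sketch: RPN = severity x occurrence x detectability,
# each scored on a 1-10 scale. Modes and scores are hypothetical.

def rpn(severity, occurrence, detectability):
    """Risk priority number for one failure mode."""
    return severity * occurrence * detectability

modes = [
    # (process step, failure mode, S, O, D) -- invented values
    ("treatment planning", "wrong plaque size selected",  9, 3, 4),
    ("seed calibration",   "source strength mis-assayed", 8, 2, 3),
    ("plaque assembly",    "seed placed in wrong slot",   7, 4, 5),
    ("implantation",       "plaque sutured off-target",  10, 2, 6),
]
ranked = sorted(modes, key=lambda m: rpn(*m[2:]), reverse=True)
for step, mode, s, o, d in ranked:
    print(f"RPN={rpn(s, o, d):3d}  {step}: {mode}")
```

Ranking by RPN rather than severity alone is the point of the exercise: a moderately severe but frequent, hard-to-detect mode can outrank a catastrophic but well-guarded one.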
2016 Cost of Wind Energy Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stehly, Tyler J.; Heimiller, Donna M.; Scott, George N.
This report uses representative utility-scale projects to estimate the levelized cost of energy (LCOE) for land-based and offshore wind power plants in the United States. Data and results detailed here are derived from 2016 commissioned plants. More specifically, analysis detailed here relies on recent market data and state-of-the-art modeling capabilities to maintain an up-to-date understanding of wind energy cost trends and drivers. This report is intended to provide insight into current component-level costs as well as a basis for understanding variability in LCOE across the country. This publication represents the sixth installment of this annual report.
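LCOE in reviews of this kind is commonly computed with a fixed-charge-rate formulation: LCOE = (FCR x CapEx + annual OpEx) / net annual energy production. The sketch below uses that common form with round illustrative inputs; matching the report's exact methodology and 2016 values is not claimed.

```python
# Fixed-charge-rate LCOE sketch, per kW of installed capacity:
# LCOE [$/MWh] = (FCR * CapEx + OpEx) / net AEP. Inputs are illustrative.

def lcoe_usd_per_mwh(capex_usd_per_kw, fcr, opex_usd_per_kw_yr,
                     net_capacity_factor):
    """Levelized cost of energy for 1 kW of installed capacity."""
    aep_mwh_per_kw = 8760.0 * net_capacity_factor / 1000.0  # net annual MWh
    annual_cost = fcr * capex_usd_per_kw + opex_usd_per_kw_yr
    return annual_cost / aep_mwh_per_kw

# e.g. $1,600/kW CapEx, 8% fixed charge rate, $50/kW-yr OpEx, 40% NCF
print(round(lcoe_usd_per_mwh(1600.0, 0.08, 50.0, 0.40), 1))  # -> 50.8
```

The formula makes the main cost drivers explicit: LCOE falls with capacity factor and rises linearly with CapEx (through the financing-driven FCR) and OpEx.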
PEMNetwork: Barriers and Enablers to Collaboration and Multimedia Education in the Digital Age.
Lumba-Brown, Angela; Tat, Sonny; Auerbach, Marc A; Kessler, David O; Alletag, Michelle; Grover, Purva; Schnadower, David; Macias, Charles G; Chang, Todd P
2016-08-01
In January 2005, PEMFellows.com was created to unify fellows in pediatric emergency medicine. Since then, the website has expanded, contracted, and focused to adapt to the interests of the pediatric emergency medicine practitioner during the internet boom. This review details the innovation of the PEMNetwork, from the inception of the initial website and its evolution into a needs-based, user-directed educational hub. Barriers and enablers to success are detailed with unique examples from descriptive analysis and metrics of PEMNetwork web traffic as well as examples from other online medical communities and digital education websites.
2015 Cost of Wind Energy Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moné, Christopher; Hand, Maureen; Bolinger, Mark
This report uses representative utility-scale projects to estimate the levelized cost of energy (LCOE) for land-based and offshore wind plants in the United States. Data and results detailed here are derived from 2015 commissioned plants. More specifically, analysis detailed here relies on recent market data and state-of-the-art modeling capabilities to maintain an up-to-date understanding of wind energy cost trends and drivers. It is intended to provide insight into current component-level costs as well as a basis for understanding variability in LCOE across the industry. This publication reflects the fifth installment of this annual report.
A Detailed Picture of the (93) Minerva Triple System
NASA Astrophysics Data System (ADS)
Marchis, F.; Descamps, P.; Dalba, P.; Enriquez, J. E.; Durech, J.; Emery, J. P.; Berthier, J.; Vachier, F.; Merlbourne, J.; Stockton, A. N.; Fassnacht, C. D.; Dupuy, T. J.
2011-10-01
We developed an orbital model of the satellites of (93) Minerva based on Keck II AO observations recorded in 2009 and a mutual event between one moon and the primary detected in March 2010. Using new lightcurves we found an approximated ellipsoid shape model for the primary. With a reanalysis of the IRAS data, we derived a preliminary bulk density of 1.5±0.2 g/cc. We will present a detailed analysis of the system, including a 3D shape model of the 93 Minerva primary derived by combining our AO observations, lightcurve, and stellar occultations.
Detailed investigation of causes of avionics field failures
NASA Astrophysics Data System (ADS)
Kallis, J. M.; Buechler, D. W.; Richardson, Z. C.; Backes, P. G.; Lopez, S. B.; Erickson, J. J.; van Westerhuyzen, D. H.
A detailed analysis of digital and analog modules from the F-15 AN/APG-63 Radar was performed to identify the kinds, types, and number of life models based on observed failure modes, mechanisms, locations, and characteristics needed to perform a Failure Free Operating Period prediction for these items. It is found that a significant fraction of the failures of the analog module and a small fraction of those of the digital module resulted from the exacerbation of latent defects by environmental stresses. It is also found that the fraction of failures resulting from thermal cycling and vibration is small.
Wheeze sound analysis using computer-based techniques: a systematic review.
Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian
2017-10-31
Wheezes are high-pitched continuous respiratory acoustic sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that (1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, (2) further research is required to achieve acceptable rates of identification of the degree of airway obstruction with normal breathing, and (3) analysis using combinations of features and subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.
A Technical Analysis Information Fusion Approach for Stock Price Analysis and Modeling
NASA Astrophysics Data System (ADS)
Lahmiri, Salim
In this paper, we address the problem of technical analysis information fusion in improving stock market index-level prediction. We present an approach for analyzing stock market price behavior based on different categories of technical analysis metrics and a multiple predictive system. Each category of technical analysis measures is used to characterize stock market price movements. The presented predictive system is based on an ensemble of neural networks (NN) coupled with particle swarm intelligence for parameter optimization, where each single neural network is trained with a specific category of technical analysis measures. The experimental evaluation on three international stock market indices and three individual stocks shows that the presented ensemble-based technical-indicator fusion system significantly improves forecasting accuracy in comparison with a single NN. It also outperforms the classical neural network trained with index-level lagged values and an NN trained with stationary wavelet transform detail and approximation coefficients. As a result, technical information fusion in an NN ensemble architecture helps improve prediction accuracy.
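The two ingredients combined above, technical-analysis indicators computed from a price series and an ensemble that fuses the outputs of per-category predictors, can be sketched minimally. The "members" below are trivial stand-ins (the paper trains PSO-tuned neural networks per indicator category); the indicators and fusion-by-averaging are the only points being illustrated.

```python
# Two basic technical indicators and a toy ensemble that fuses one
# trend-based and one momentum-based forecast by averaging. Illustrative
# stand-in for the paper's NN ensemble, not its method.

def sma(prices, window):
    """Simple moving average over the last `window` prices (trend)."""
    return sum(prices[-window:]) / window

def momentum(prices, lag):
    """Price change over `lag` periods (momentum)."""
    return prices[-1] - prices[-1 - lag]

def ensemble_forecast(prices):
    """Fuse one forecast per indicator category by simple averaging."""
    trend_member = sma(prices, 3)                       # trend-based guess
    momentum_member = prices[-1] + momentum(prices, 2)  # momentum-based guess
    return 0.5 * (trend_member + momentum_member)

prices = [100.0, 101.0, 103.0, 102.0, 104.0]
print(ensemble_forecast(prices))
```

The fusion step is where the reported accuracy gain comes from: members trained on different indicator categories make partly uncorrelated errors, which averaging cancels.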
Intelligent Operation and Maintenance of Micro-grid Technology and System Development
NASA Astrophysics Data System (ADS)
Fu, Ming; Song, Jinyan; Zhao, Jingtao; Du, Jian
2018-01-01
To achieve micro-grid operation and management, a micro-grid operation and maintenance knowledge base is studied. Based on advanced Petri net theory, a fault diagnosis model of the micro-grid is established, and an intelligent diagnosis and analysis method for micro-grid faults is put forward. On this basis, the functional system and architecture of the intelligent operation and maintenance system for the micro-grid are studied, and the micro-grid fault diagnosis function is introduced in detail. Finally, the system is deployed on the micro-grid of a park, and micro-grid fault diagnosis and analysis is carried out on its operating data. The operation and maintenance function interface of the system is displayed, which verifies the correctness and reliability of the system.
Scaltriti, Erika; Sassera, Davide; Comandatore, Francesco; Morganti, Marina; Mandalari, Carmen; Gaiarsa, Stefano; Bandi, Claudio; Zehender, Gianguglielmo; Bolzoni, Luca; Casadei, Gabriele; Pongolini, Stefano
2015-04-01
We retrospectively analyzed a rare Salmonella enterica serovar Manhattan outbreak that occurred in Italy in 2009 to evaluate the potential of new genomic tools based on differential single nucleotide polymorphism (SNP) analysis in comparison with the gold standard genotyping method, pulsed-field gel electrophoresis. A total of 39 isolates were analyzed from patients (n=15) and food, feed, animal, and environmental sources (n=24), resulting in five different pulsed-field gel electrophoresis (PFGE) profiles. Isolates epidemiologically related to the outbreak clustered within the same pulsotype, SXB_BS.0003, without any further differentiation. Thirty-three isolates were considered for genomic analysis based on different sets of SNPs, core, synonymous, nonsynonymous, as well as SNPs in different codon positions, by Bayesian and maximum likelihood algorithms. Trees generated from core and nonsynonymous SNPs, as well as SNPs at the second and first plus second codon positions detailed four distinct groups of isolates within the outbreak pulsotype, discriminating outbreak-related isolates of human and food origins. Conversely, the trees derived from synonymous and third-codon-position SNPs clustered food and human isolates together, indicating that all outbreak-related isolates constituted a single clone, which was in line with the epidemiological evidence. Further experiments are in place to extend this approach within our regional enteropathogen surveillance system. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
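The codon-position partitioning described above can be sketched in a few lines. The toy alignment and helper names below are illustrative assumptions, not part of the study's pipeline, which built Bayesian and maximum likelihood trees from whole-genome core SNPs:

```python
# Sketch: partition variable sites of an in-frame coding alignment by codon
# position, then compute SNP distances restricted to one partition.
def snp_sites_by_codon_position(alignment):
    """Return {1: [...], 2: [...], 3: [...]} of variable-site indices,
    keyed by codon position (sequences assumed in frame, equal length)."""
    sites = {1: [], 2: [], 3: []}
    for i in range(len(alignment[0])):
        column = {seq[i] for seq in alignment}
        if len(column) > 1:                 # polymorphic column -> SNP site
            sites[i % 3 + 1].append(i)
    return sites

def snp_distance(a, b, sites):
    """Number of differing positions between two sequences within `sites`."""
    return sum(1 for i in sites if a[i] != b[i])

aligned = ["ATGGCTAAA",
           "ATGGCGAAA",   # differs at a third codon position
           "ATGACTAAA"]   # differs at a first codon position
by_pos = snp_sites_by_codon_position(aligned)
d12_third = snp_distance(aligned[0], aligned[1], by_pos[3])
```

Distances computed from third-position (largely synonymous) sites can then be compared against those from first-plus-second-position sites, mirroring the tree comparisons in the abstract.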
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, M.K.
Technoeconomic analyses have been conducted on two processes to produce hydrogen from biomass: indirectly-heated gasification of biomass followed by steam reforming of the syngas, and biomass pyrolysis followed by steam reforming of the pyrolysis oil. The analysis of the gasification-based process was highly detailed, including a process flowsheet, material and energy balances calculated with a process simulation program, equipment cost estimation, and the determination of the necessary selling price of hydrogen. The pyrolysis-based process analysis was of a less detailed nature, as all necessary experimental data have not been obtained; this analysis is a follow-up to the preliminary economic analysis presented at the 1994 Hydrogen Program Review. A coproduct option in which pyrolysis oil is used to produce hydrogen and a commercial adhesive was also studied for economic viability. Based on feedstock availability estimates, three plant sizes were studied: 907 T/day, 272 T/day, and 27 T/day. The necessary selling price of hydrogen produced by steam reforming syngas from the Battelle Columbus Laboratories indirectly heated biomass gasifier falls within current market values for the large and medium size plants within a wide range of feedstock costs. Results show that the small scale plant does not produce hydrogen at economically competitive prices, indicating that if gasification is used as the upstream process to produce hydrogen, local refueling stations similar to current gasoline stations would probably not be feasible.
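The "necessary selling price" in such technoeconomic analyses is typically the price that recovers annualized capital plus feedstock and operating costs. A minimal sketch with a standard capital recovery factor and purely illustrative numbers (not the report's actual costs):

```python
# Levelized-price sketch: annualize capital with a capital recovery factor
# (CRF), add annual feedstock and O&M costs, divide by annual production.
def capital_recovery_factor(rate, years):
    """Standard CRF: converts an upfront cost into equal annual payments."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def required_selling_price(capital, crf, feed_cost, o_and_m, annual_h2_kg):
    """Price ($/kg H2) that recovers annualized capital, feedstock, and O&M."""
    annual_cost = capital * crf + feed_cost + o_and_m
    return annual_cost / annual_h2_kg

crf = capital_recovery_factor(0.10, 20)       # 10% discount rate, 20 years
price = required_selling_price(
    capital=150e6, crf=crf, feed_cost=12e6, o_and_m=8e6, annual_h2_kg=25e6)
```

The scale effect in the abstract falls out of this structure: capital per unit of output rises sharply at small plant sizes, pushing the required price above market values.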
Khandoker, Ahsan H; Karmakar, Chandan K; Begg, Rezaul K; Palaniswami, Marimuthu
2007-01-01
As humans age or are influenced by pathology of the neuromuscular system, gait patterns are known to adjust, accommodating for reduced function in the balance control system. The aim of this study was to investigate the effectiveness of a wavelet-based multiscale analysis of a gait variable [minimum toe clearance (MTC)] in deriving indexes for understanding age-related declines in gait performance and screening of balance impairments in the elderly. MTC during walking on a treadmill was analyzed for 30 healthy young, 27 healthy elderly, and 10 falls-risk elderly subjects with a history of tripping falls. The MTC signal from each subject was decomposed into eight detailed signals at different wavelet scales by using the discrete wavelet transform. The variances of the detailed signals at scales 8 to 1 were calculated. The multiscale exponent (beta) was then estimated from the slope of the variance progression at successive scales. The variance at scale 5 was significantly (p<0.01) different between the young and healthy elderly groups. Results also suggest that the beta between scales 1 and 2 is effective for recognizing falls-risk gait patterns. The results have implications for quantifying gait dynamics in normal, ageing, and pathological conditions. Early detection of gait pattern changes due to ageing and balance impairments using wavelet-based multiscale analysis might provide the opportunity to initiate preemptive measures to avoid injurious falls.
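The variance-versus-scale procedure can be sketched with a plain Haar decomposition. The white-noise input below is only a stand-in for an MTC series, and the function names are ours, not the paper's:

```python
import numpy as np

def haar_detail_variances(signal, levels):
    """Variance of Haar wavelet detail coefficients at scales 1..levels."""
    x = np.asarray(signal, dtype=float)
    variances = []
    for _ in range(levels):
        n = len(x) // 2 * 2                            # even length
        approx = (x[0:n:2] + x[1:n:2]) / np.sqrt(2.0)  # low-pass branch
        detail = (x[0:n:2] - x[1:n:2]) / np.sqrt(2.0)  # high-pass (detail)
        variances.append(detail.var())
        x = approx                                     # recurse on approximation
    return np.array(variances)

def multiscale_exponent(variances):
    """Slope of log2(variance) versus scale index -- the beta of the abstract."""
    scales = np.arange(1, len(variances) + 1)
    slope, _ = np.polyfit(scales, np.log2(variances), 1)
    return slope

rng = np.random.default_rng(0)
mtc = rng.standard_normal(512)            # stand-in for an MTC time series
var_by_scale = haar_detail_variances(mtc, 5)
beta = multiscale_exponent(var_by_scale)  # near zero for white noise
```

For uncorrelated noise the detail variance is roughly flat across scales (beta near zero); long-range-correlated gait series produce a nonzero slope, which is what makes beta a candidate screening index.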
Fountoulakis, Konstantinos N; Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried
2017-02-01
This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. © The Author 2016. Published by Oxford University Press on behalf of CINP.
Updated MDRIZTAB Parameters for ACS/WFC
NASA Astrophysics Data System (ADS)
Hoffman, S. L.; Avila, R. J.
2017-03-01
The Mikulski Archive for Space Telescopes (MAST) pipeline performs geometric distortion corrections, associated image combinations, and cosmic ray rejections with AstroDrizzle. The MDRIZTAB reference table contains a list of relevant parameters that controls this program. This document details our photometric analysis of Advanced Camera for Surveys Wide Field Channel (ACS/WFC) data processed by AstroDrizzle. Based on this analysis, we update the MDRIZTAB table to improve the quality of the drizzled products delivered by MAST.
Low-carbon building assessment and multi-scale input-output analysis
NASA Astrophysics Data System (ADS)
Chen, G. Q.; Chen, H.; Chen, Z. M.; Zhang, Bo; Shao, L.; Guo, S.; Zhou, S. Y.; Jiang, M. M.
2011-01-01
Presented in this paper as a low-carbon building evaluation framework are detailed carbon emission accounting procedures for the life cycle of buildings, covering nine stages: building construction, fitment, outdoor facility construction, transportation, operation, waste treatment, property management, demolition, and disposal. The procedures are supported by integrated carbon intensity databases based on multi-scale input-output analysis, essential for low-carbon planning, procurement and supply chain design, and logistics management.
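In input-output analysis, the integrated (embodied) carbon intensity of a sector is its direct emission intensity propagated through the Leontief inverse, which accumulates all upstream supply chains. A toy sketch with hypothetical coefficients, not the paper's database:

```python
import numpy as np

# Toy 3-sector economy (hypothetical technical coefficients).
A = np.array([[0.10, 0.20, 0.05],    # A[i, j] = input required from sector i
              [0.15, 0.05, 0.10],    #           per unit output of sector j
              [0.05, 0.10, 0.15]])
d = np.array([2.0, 0.5, 1.0])        # direct emissions per unit output

# Embodied intensity: direct emissions plus all upstream emissions,
# epsilon = d (I - A)^{-1}; the Leontief inverse sums I + A + A^2 + ...
L = np.linalg.inv(np.eye(3) - A)
epsilon = d @ L
```

Each element of `epsilon` exceeds the corresponding direct intensity in `d`, because the Leontief inverse adds every round of indirect supply-chain emissions on top of the direct term.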
[The reference pricing of pharmaceuticals in European countries].
Gildeyeva, G N; Starykh, D A
2013-01-01
The article presents an analysis of various approaches to the pricing of pharmaceuticals under existing systems of pharmaceutical provision. Pricing is considered in relation to existing systems of pharmaceutical provision based on the principles of insurance and co-financing. A detailed analysis is presented of the methodology for setting reference prices of pharmaceuticals in different European countries. The experience of European countries in evaluating the interchangeability of pharmaceuticals is discussed.
Measurements and Analysis of Reverberation, Target Echo, and Clutter
2007-09-30
for Problem 11 (isospeed water). Beyond a few seconds there is excellent agreement between the normal mode (NOGRP and CSNAP), ray (SAFFIRE), and energy...the reverberation, so energy flux or ray-based models are more realistic. More detailed analysis is being developed for a journal paper. 3...comparison of ray, normal-mode, and energy flux results for reverberation in a Pekeris waveguide," to be presented at special session on Underwater
ERIC Educational Resources Information Center
Berkner, Lutz; Wei, Christina Chang
2006-01-01
This report, based on data from the 2003-04 National Postsecondary Student Aid Study (NPSAS:04), provides detailed information about undergraduate tuition and total price of attendance at various types of institutions, the percentage of students receiving various types of financial aid, and the average amounts that they received. In 2003-04,…
USB environment measurements based on full-scale static engine ground tests
NASA Technical Reports Server (NTRS)
Sussman, M. B.; Harkonen, D. L.; Reed, J. B.
1976-01-01
Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle, and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data, and to establish a basis for future flight test comparisons.
[Raman, FTIR spectra and normal mode analysis of acetanilide].
Liang, Hui-Qin; Tao, Ya-Ping; Han, Li-Gang; Han, Yun-Xia; Mo, Yu-Jun
2012-10-01
The Raman and FTIR spectra of acetanilide (ACN) were measured experimentally in the regions of 3500-50 and 3500-600 cm(-1), respectively. The equilibrium geometry and vibration frequencies of ACN were calculated based on the density functional theory (DFT) method (B3LYP/6-311G(d, p)). The results showed that the calculated molecular structure parameters are in good agreement with the previous report and better than those calculated based on 6-31G(d), and the calculated frequencies agree well with the experimental ones. The potential energy distribution of each frequency was worked out by normal mode analysis and, based on this, a detailed and accurate vibrational frequency assignment of ACN was obtained.
Use of paired simple and complex models to reduce predictive bias and quantify uncertainty
NASA Astrophysics Data System (ADS)
Doherty, John; Christensen, Steen
2011-12-01
Modern environmental management and decision-making is based on the use of increasingly complex numerical models. Such models have the advantage of allowing representation of complex processes and heterogeneous system property distributions inasmuch as these are understood at any particular study site. The latter are often represented stochastically, this reflecting knowledge of the character of system heterogeneity at the same time as it reflects a lack of knowledge of its spatial details. Unfortunately, however, complex models are often difficult to calibrate because of their long run times and sometimes questionable numerical stability. Analysis of predictive uncertainty is also a difficult undertaking when using models such as these. Such analysis must reflect a lack of knowledge of spatial hydraulic property details. At the same time, it must be subject to constraints on the spatial variability of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promulgates good calibration and ready implementation of sophisticated methods of calibration-constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights into the costs of model simplification, and into how some of these costs may be reduced. 
It then describes a methodology for paired model usage through which predictive bias of a simplified model can be detected and corrected, and postcalibration predictive uncertainty can be quantified. The methodology is demonstrated using a synthetic example based on groundwater modeling environments commonly encountered in northern Europe and North America.
Methodology for determination and use of the no-escape envelope of an air-to-air-missile
NASA Technical Reports Server (NTRS)
Neuman, Frank
1988-01-01
A large gap exists between optimal control and differential-game theory and their applications. The purpose of this paper is to show how this gap may be bridged. Missile-avoidance of realistically simulated infrared heat-seeking, fire-and-forget missile is studied. In detailed simulations, sweeping out the discretized initial condition space, avoidance methods based on pilot experience are combined with those based on simplified optimal control analysis to derive an approximation to the no-escape missile envelopes. The detailed missile equations and no-escape envelopes were then incorporated into an existing piloted simulation of air-to-air combat to generate missile firing decisions as well as missile avoidance commands. The use of these envelopes was found to be effective in both functions.
NASA Astrophysics Data System (ADS)
Henderson, Charles; Yerushalmi, Edit; Kuo, Vince H.; Heller, Kenneth; Heller, Patricia
2007-12-01
To identify and describe the basis upon which instructors make curricular and pedagogical decisions, we have developed an artifact-based interview and an analysis technique based on multilayered concept maps. The policy capturing technique used in the interview asks instructors to make judgments about concrete instructional artifacts similar to those they likely encounter in their teaching environment. The analysis procedure alternatively employs both an a priori systems view analysis and an emergent categorization to construct a multilayered concept map, which is a hierarchically arranged set of concept maps where child maps include more details than parent maps. Although our goal was to develop a model of physics faculty beliefs about the teaching and learning of problem solving in the context of an introductory calculus-based physics course, the techniques described here are applicable to a variety of situations in which instructors make decisions that influence teaching and learning.
Nuclear reactor descriptions for space power systems analysis
NASA Technical Reports Server (NTRS)
Mccauley, E. W.; Brown, N. J.
1972-01-01
For the small, high performance reactors required for space electric applications, adequate neutronic analysis is of crucial importance, but in terms of computational time consumed, nuclear calculations probably yield the least amount of detail for mission analysis study. It has been found possible, after generation of only a few designs of a reactor family in elaborate thermomechanical and nuclear detail, to use simple curve-fitting techniques to assure desired neutronic performance while still performing the thermomechanical analysis in explicit detail. The resulting speed-up in computation time permits a broad, detailed examination of constraints by the mission analyst.
SEP thrust subsystem performance sensitivity analysis
NASA Technical Reports Server (NTRS)
Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.
1973-01-01
This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.
The spatial configuration of ordered polynucleotide chains. II. The poly(rA) helix.
Olson, W K
1975-01-01
Approximate details of the spatial configuration of the ordered single-stranded poly(rA) molecule in dilute solution have been obtained in a combined theoretical analysis of base stacking and chain flexibility. Only those regularly repeating structures which fulfill the criterion of conformational flexibility (based upon all available experimental and theoretical evidence of preferred bond rotations) and which also exhibit the right-handed base stacking pattern observed in nmr investigations of poly(rA) are deemed suitable single-stranded helices. In addition, the helical geometry of the stacked structures is required to be consistent with the experimentally observed dimensions of both completely ordered and partially ordered poly(rA) chains. Only a single category of poly(rA) helices (very similar in all conformational details to the individual chains of the poly(rA) double-stranded X-ray structure) is thus obtained. Other conformationally feasible polynucleotide helices characterized simply by a parallel and overlapping base stacking arrangement are also discussed. PMID:1052529
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.
1990-01-01
An avionics architecture for the advanced launch system (ALS) that uses validated hardware and software building blocks developed under the advanced information processing system program is presented. The AIPS for ALS architecture defined is preliminary, and reliability requirements can be met by the AIPS hardware and software building blocks that are built using the state-of-the-art technology available in the 1992-93 time frame. The level of detail in the architecture definition reflects the level of detail available in the ALS requirements. As the avionics requirements are refined, the architecture can also be refined and defined in greater detail with the help of analysis and simulation tools. A useful methodology is demonstrated for investigating the impact of the avionics suite to the recurring cost of the ALS. It is shown that allowing the vehicle to launch with selected detected failures can potentially reduce the recurring launch costs. A comparative analysis shows that validated fault-tolerant avionics built out of Class B parts can result in lower life-cycle-cost in comparison to simplex avionics built out of Class S parts or other redundant architectures.
Strengthening of competence in truss planning through the development of instructional media on truss details
NASA Astrophysics Data System (ADS)
Handayani, Sri; Nurcahyono, M. Hadi
2017-03-01
Competency-based learning is a model of learning in which planning, implementation, and assessment refer to the mastery of competencies. Learning in lectures is conducted within a framework for comprehensively realizing student competency. Competence orientation means that learning activities in the classroom must lead students to learn more actively: to search for and explore information themselves or with friends, in pairs or in groups, and to use a variety of learning resources, including printed materials, electronic media, and the environment. Analysis of learning in the wooden-structure course revealed a weakness in the understanding of truss details, hence the need to develop media that can provide a clear picture of the structure of wooden trusses and their connection details. Development of the instructional media consisted of three phases of activity, namely planning, production, and assessment. Planning of the learning media should be tailored to the needs and conditions necessary to reinforce mastery of competencies, through a table of material needs. Production of the learning media was done using hardware and software that support the creation of a learning medium. Assessment of the media product included feasibility studies by subject-matter experts and media experts, while testing was done according to students' perception of the product. Analysis of the materials for the instructional aspects gave a result of 100% (very good), and media analysis for the design aspects was rated very good, with a percentage of 88.93%. Analysis of student perceptions was likewise rated very good, with a percentage of 84.84%. The Truss Details learning media are feasible and can be used in teaching wooden structures to build capacity in truss planning.
Hahn, Tobias; Figge, Frank; Liesen, Andrea; Barkemeyer, Ralf
2010-10-01
In this paper, we propose the return-to-cost ratio (RCR) as an alternative approach to the analysis of the operational eco-efficiency of companies based on the notion of opportunity costs. RCR helps to overcome two fundamental deficits of existing approaches to eco-efficiency. (1) It translates eco-efficiency into managerial terms by applying the well-established notion of opportunity costs to eco-efficiency analysis. (2) RCR allows the identification and quantification of the drivers behind changes in corporate eco-efficiency. RCR is applied to the analysis of the CO(2)-efficiency of German companies in order to illustrate its usefulness for a detailed analysis of changes in corporate eco-efficiency as well as for the development of effective environmental strategies. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
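A minimal sketch of the opportunity-cost view of eco-efficiency, with hypothetical numbers and function names (the paper's RCR formulation differs in detail):

```python
# Eco-efficiency as economic return per unit of an environmental resource,
# and opportunity cost as the return forgone relative to a benchmark user
# of that resource. All figures below are illustrative, not from the paper.
def eco_efficiency(value_added, co2_emitted):
    """Return generated per tonne of CO2 emitted."""
    return value_added / co2_emitted

def opportunity_cost(value_added, co2_emitted, benchmark_efficiency):
    """Return forgone if the firm's CO2 budget had instead been used at the
    benchmark efficiency (positive means the firm underperforms it)."""
    return co2_emitted * benchmark_efficiency - value_added

firm_eff = eco_efficiency(value_added=50e6, co2_emitted=2e5)   # $/t CO2
cost = opportunity_cost(50e6, 2e5, benchmark_efficiency=300.0)
```

Tracking the two inputs separately over time is what lets such a ratio attribute eco-efficiency changes to their drivers: value-added growth versus emission reductions.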
Knowledge-based assistance for science visualization and analysis using large distributed databases
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.
1993-01-01
Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) within a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.
Knowledge-based assistance for science visualization and analysis using large distributed databases
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.
1992-01-01
Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on the concept called the Data Hub. With the Data Hub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) with a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.
Rapid Disaster Damage Estimation
NASA Astrophysics Data System (ADS)
Vu, T. T.
2012-07-01
Experiences from recent disaster events have shown that detailed information derived from high-resolution satellite images can meet the requirements of damage analysts and disaster management practitioners. The richer information contained in such high-resolution images, however, increases the complexity of image analysis. As a result, few image analysis solutions can be practically used under time pressure in the context of post-disaster and emergency response. To fill the gap in the employment of remote sensing in disaster response, this research develops a rapid high-resolution satellite mapping solution built upon a dual-scale contextual framework to support damage estimation after a catastrophe. The target objects are buildings (or building blocks) and their condition. On the coarse processing level, statistical region merging is deployed to group pixels into a number of coarse clusters. Based on a majority rule over vegetation, water, and shadow indices, it is possible to eliminate irrelevant clusters; the remaining clusters likely contain building structures. On the fine processing level, within each remaining cluster, smaller objects are formed using morphological analysis. Numerous indicators, including spectral, textural, and shape indices, are computed for use in rule-based object classification. Computation time of raster-based analysis depends strongly on the image size, in other words the number of processed pixels. Breaking the work into two processing levels reduces the number of processed pixels and the redundancy of processing irrelevant information. In addition, it allows a data- and task-based parallel implementation. The performance is demonstrated with QuickBird images of a disaster-affected area of Phanga, Thailand, captured after the 2004 Indian Ocean tsunami.
The developed solution will be implemented on different platforms, as well as a web processing service, for operational use.
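The coarse-level majority rule can be sketched for the vegetation index alone. The NDVI threshold, band layout, and function names here are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized difference vegetation index per pixel."""
    return (nir - red) / (nir + red + 1e-9)   # epsilon avoids divide-by-zero

def keep_candidate_clusters(labels, red, nir, ndvi_thresh=0.3):
    """Drop clusters whose pixels are majority-vegetated (the coarse-level
    majority rule); the remaining clusters may contain building structures."""
    veg = ndvi(red, nir) > ndvi_thresh
    kept = []
    for lab in np.unique(labels):
        mask = labels == lab
        if veg[mask].mean() <= 0.5:           # majority rule per cluster
            kept.append(lab)
    return kept

labels = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1]])             # two toy coarse clusters
red = np.array([[0.2, 0.2, 0.4, 0.4],
                [0.2, 0.2, 0.4, 0.4]])
nir = np.array([[0.8, 0.8, 0.4, 0.4],        # cluster 0 strongly vegetated
                [0.8, 0.8, 0.4, 0.4]])
kept = keep_candidate_clusters(labels, red, nir)
```

The same per-cluster majority pattern extends to water and shadow indices; only the clusters surviving all three tests proceed to the fine, morphology-based level.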
Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A
2014-07-01
A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed through the use of comparative statistics or multivariate statistical analysis-based approaches; however, each falls short in representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows for the unsupervised clustering, and subsequent prioritization of, correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate the application of this method to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat sera profiles.
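A minimal self-organizing map fits in a few lines of NumPy. This is a generic sketch of the algorithm, not the authors' implementation, and the synthetic "metabolite profiles" are hypothetical stand-ins:

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr=0.5, sigma=1.0, seed=0):
    """Minimal self-organizing map: each grid node holds a weight vector;
    the best-matching node and its neighbours move toward each sample."""
    rng = np.random.default_rng(seed)
    n_nodes = grid[0] * grid[1]
    weights = rng.standard_normal((n_nodes, data.shape[1]))
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])])
    for epoch in range(epochs):
        decay = np.exp(-epoch / epochs)       # shrink rate and neighbourhood
        for x in data:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best match
            dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)  # grid distance
            h = np.exp(-dist2 / (2 * (sigma * decay) ** 2))    # neighbourhood
            weights += (lr * decay) * h[:, None] * (x - weights)
    return weights, coords

# Two synthetic clusters standing in for correlated metabolite features.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.0, 0.1, (20, 3)),
                  rng.normal(5.0, 0.1, (20, 3))])
weights, coords = train_som(data)
bmus = np.array([np.argmin(((weights - x) ** 2).sum(axis=1)) for x in data])
```

Correlated features land on nearby nodes of the trained grid, and rendering the node weights as a heat map gives the Gestalt comparison the abstract describes.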
Research in interactive scene analysis
NASA Technical Reports Server (NTRS)
Tenenbaum, J. M.; Barrow, H. G.; Weyl, S. A.
1976-01-01
Cooperative (man-machine) scene analysis techniques were developed whereby humans can provide a computer with guidance when completely automated processing is infeasible. An interactive approach promises significant near-term payoffs in analyzing various types of high volume satellite imagery, as well as vehicle-based imagery used in robot planetary exploration. This report summarizes the work accomplished over the duration of the project and describes in detail three major accomplishments: (1) the interactive design of texture classifiers; (2) a new approach for integrating the segmentation and interpretation phases of scene analysis; and (3) the application of interactive scene analysis techniques to cartography.
F-4 Beryllium Rudders; A Precis of the Design, Fabrication, Ground and Flight Test Demonstrations
1975-05-01
Wright-Patterson Air Force Base, Ohio 45433. AIR FORCE FLIGHT DYNAMICS LABORATORY, AIR FORCE SYSTEMS COMMAND, WRIGHT-PATTERSON AIR FORCE BASE, OHIO 45433...rudder. These sequential ground tests include: a 50,000-cycle fatigue test of the upper balance weight support structure; a static test to...Design Details; 6. Design Analysis; 7. Rudder Mass Balance; 8. Rudder Moment of Inertia; 9. Rudder Weight; RUDDER FABRICATION AND ASSEMBLY 1. 2
AFB/open cycle gas turbine conceptual design study
NASA Technical Reports Server (NTRS)
Dickinson, T. W.; Tashjian, R.
1983-01-01
Applications of coal fired atmospheric fluidized bed gas turbine systems in industrial cogeneration are identified. Based on site-specific conceptual designs, the potential benefits of the AFB/gas turbine system were compared with an atmospheric fluidized design steam boiler/steam turbine system. The application of these cogeneration systems at four industrial plant sites is reviewed. A performance and benefit analysis was made along with a study of the representativeness of the sites both in regard to their own industry and compared to industry as a whole. A site was selected for the conceptual design, which included detailed site definition, AFB/gas turbine and AFB/steam turbine cogeneration system designs, detailed cost estimates, and comparative performance and benefit analysis. Market and benefit analyses identified the potential market penetration for the cogeneration technologies and quantified the potential benefits.
Soil Components in Heterogeneous Impact Glass in Martian Meteorite EETA79001
NASA Technical Reports Server (NTRS)
Schrader, C. M.; Cohen, B. A.; Donovan, J. J.; Vicenzi, E. P.
2010-01-01
Martian soil composition can illuminate past and ongoing near-surface processes such as impact gardening [2] and hydrothermal and volcanic activity [3,4]. Though the Mars Exploration Rovers (MER) have analyzed the major-element composition of Martian soils, no soil samples have been returned to Earth for detailed chemical analysis. Rao et al. [1] suggested that Martian meteorite EETA79001 contains melted Martian soil in its impact glass (Lithology C), based on sulfur enrichment of Lithology C relative to the meteorite's basaltic lithologies (A and B) [1,2]. If true, it may be possible to extract detailed soil chemical analyses using this meteoritic sample. We conducted high-resolution (0.3 μm/pixel) element mapping of Lithology C in thin section EETA79001,18 by energy dispersive spectrometry (EDS). We use these data for principal component analysis (PCA).
Complications in proximal humeral fractures.
Calori, Giorgio Maria; Colombo, Massimiliano; Bucci, Miguel Simon; Fadigati, Piero; Colombo, Alessandra Ines Maria; Mazzola, Simone; Cefalo, Vittorio; Mazza, Emilio
2016-10-01
Necrosis of the humeral head, infections and non-unions are among the most dangerous and difficult-to-treat complications of proximal humeral fractures. The aim of this work was to analyse in detail non-unions and post-traumatic bone defects and to suggest an algorithm of care. Treatment options are based not only on the radiological frame, but also according to a detailed analysis of the patient, who is classified using a risk factor analysis. This method enables the surgeon to choose the most suitable treatment for the patient, thereby facilitating return of function in the shortest possible time. The treatment of such serious complications requires the surgeon to be knowledgeable about the following possible solutions: increased mechanical stability; biological stimulation; and reconstructive techniques in two steps, with application of biotechnologies and prosthetic substitution. Copyright © 2016 Elsevier Ltd. All rights reserved.
Automating a Detailed Cognitive Task Analysis for Structuring Curriculum
1991-08-01
Title: Automating a Detailed Cognitive Task Analysis for Structuring Curriculum. Activities: to date we have completed task... The Institute for Management Sciences. The particular application of the modified GOMS cognitive task analysis technique under development is... [Research Plan excerpt: Year 1, Task 1.0 Design; Task 1.1 Conduct...]
RoleSim and RoleMatch: Role-Based Similarity and Graph Matching
ERIC Educational Resources Information Center
Lee, Victor Eugene
2012-01-01
With the rise of the internet, mobile communications, electronic transactions, and personal broadcasting, the scale of connectedness has grown immensely. Not only can an individual interact with thousands and millions of others, but details about those interactions are being stored in databases, for later retrieval and analysis. Two key concepts…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-11
...follow-on, more detailed, digital forensics analysis or damage assessments of individual incidents... information. In addition, during any follow-on forensics or damage assessment activities, the Government and...), (c) and (d) of this section are maintained by the digital and multimedia forensics laboratory at DC3...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-20
... produced water. These changes are discussed in more detail below, and in the fact sheet accompanying the... part, the proposed permit is very similar to the 2004 permit. The major changes from the 2004 permit... limits and monitoring requirements for produced water based on an updated reasonable potential analysis...
The Mastersingers: Language and Practice in an Operatic Masterclass
ERIC Educational Resources Information Center
Atkinson, Paul
2013-01-01
The paper presents a microethnographic examination of an operatic masterclass, based on a transcribed video recording of just one such class. It is a companion piece to a more generalised ethnographic account of such masterclasses as pedagogic events. The detailed analysis demonstrates the close relationship between spoken and unspoken actions in…
Environmental Conservation. The Oil and Gas Industries, Volume One.
ERIC Educational Resources Information Center
National Petroleum Council, Washington, DC.
Prepared in response to a Department of the Interior request, this report is a comprehensive study of environmental conservation problems as they relate to or have impact on the petroleum industry. It contains the general comments and conclusions of The National Petroleum Council based on an analysis of detailed data. For presentation of key…
Analysing the Preferences of Prospective Students for Higher Education Institution Attributes
ERIC Educational Resources Information Center
Walsh, Sharon; Flannery, Darragh; Cullinan, John
2018-01-01
We utilise a dataset of students in their final year of upper secondary education in Ireland to provide a detailed examination of the preferences of prospective students for higher education institutions (HEIs). Our analysis is based upon a discrete choice experiment methodology with willingness to pay estimates derived for specific HEI attributes…
Use of molecular genetic markers in forest management
Craig S. Echt
1997-01-01
When managing forests for biodiversity or sustainability, attention must be given to how silvicultural practices affect genetic diversity. A new generation of DNA-based markers affords a greater detail of genetic analysis than previously possible. These new markers, SSRs or microsatellites, have been used to demonstrate genetic diversity and infer evolutionary history...
Changes in Grouping Practices over Primary and Secondary School
ERIC Educational Resources Information Center
Baines, Ed; Blatchford, Peter; Kutnick, Peter
2003-01-01
The research detailed in this paper provides a systematic description and analysis of classroom grouping practices in primary and secondary schools in England. Practices are compared to main findings in developmental and educational literature with regard to effective contexts for learning and recent ideas about pedagogy. The research is based on…
NASA Technical Reports Server (NTRS)
Bains, R. W.; Herwig, H. A.; Luedeman, J. K.; Torina, E. M.
1974-01-01
The Shuttle Electric Power System Analysis (SEPS) computer program is described. SEPS performs detailed load analysis, including prediction of the energy demands and consumables requirements of the shuttle electric power system, along with parametric and special case studies of that system. The functional flow diagram of the SEPS program is presented along with data base requirements and formats, procedure and activity definitions, and mission timeline input formats. Distribution circuit input and fixed data requirements are included. Run procedures and deck setups are described.
Nonlinear analysis of NPP safety against the aircraft attack
DOE Office of Scientific and Technical Information (OSTI.GOV)
Králik, Juraj, E-mail: juraj.kralik@stuba.sk; Králik, Juraj, E-mail: kralik@fa.stuba.sk
The paper presents a nonlinear probabilistic analysis of the reinforced concrete buildings of a nuclear power plant under aircraft attack. The dynamic load is defined in time on the basis of airplane impact simulations considering the real stiffness, masses, direction and velocity of the flight. The dynamic response is calculated in ANSYS using the transient nonlinear analysis solution method. The damage of the concrete wall is evaluated in accordance with the NDRC standard, considering the spalling, scabbing and perforation effects. The simple and detailed calculations of the wall damage are compared.
Probabilistic atlas and geometric variability estimation to drive tissue segmentation.
Xu, Hao; Thirion, Bertrand; Allassonnière, Stéphanie
2014-09-10
Computerized anatomical atlases play an important role in medical image analysis. While an atlas usually refers to a standard or mean image, also called a template, which presumably represents a given population well, the template alone is not enough to characterize the observed population in detail. A template image should be learned jointly with the geometric variability of the shapes represented in the observations. These two quantities will, in what follows, form the atlas of the corresponding population. The geometric variability is modeled as deformations of the template image so that it fits the observations. In this paper, we provide a detailed analysis of a new generative statistical model based on dense deformable templates that represents several tissue types observed in medical images. Our atlas contains both an estimation of probability maps of each tissue (called class) and the deformation metric. We use a stochastic algorithm for the estimation of the probabilistic atlas given a dataset. This atlas is then used in an atlas-based segmentation method to segment new images. Experiments are shown on brain T1 MRI datasets. Copyright © 2014 John Wiley & Sons, Ltd.
Jiang, Wei; Yu, Weichuan
2017-02-15
In genome-wide association studies (GWASs) of common diseases/traits, we often analyze multiple GWASs with the same phenotype together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular to jointly analyze datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods. Also, our method discovers more associations than meta-analysis methods from empirical datasets of four phenotypes. The R package is available at http://bioinformatics.ust.hk/Jlfdr.html. Contact: eeyu@ust.hk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
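For context, the conventional baseline that such joint methods are compared against can be sketched in a few lines: an inverse-variance-weighted fixed-effect combination of per-study summary statistics for one variant. This is a generic illustration of standard meta-analysis, not the Jlfdr method or code from the authors' R package.

```python
import math

def fixed_effect_meta(betas, ses):
    """Inverse-variance-weighted fixed-effect meta-analysis of per-study
    effect sizes (betas) and standard errors (ses) for a single variant."""
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    z = beta / se
    p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided normal p-value
    return beta, se, z, p
```

Combining studies shrinks the standard error, which is the source of the power gain from joint analysis of multiple GWASs.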
Oracle Applications Patch Administration Tool (PAT) Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
2002-01-04
PAT is a Patch Administration Tool that provides analysis, tracking, and management of Oracle Application patches. Its capabilities include the following. Patch analysis and management: patch data maintenance (tracking which Oracle Application patches have been applied to which database instance and machine); patch analysis (capture of text files such as readme.txt and driver files; comparison detail for forms, reports, PL/SQL packages, SQL scripts, and JSP modules; parsing and loading of the current applptch.txt (10.7) or loading of patch data from the Oracle Application database patch tables (11i)); display analysis (comparison of the patch to be applied with the code versions currently installed in the Oracle Application appl_top); patch detail (module comparison detail; analysis and display of a single Oracle Application module patch); and patch management (automatic queuing and execution of patches). Administration: parameter maintenance (settings for the directory structure of the Oracle Application appl_top) and validation data maintenance (machine names and instances to patch). Operation: patch data maintenance (scheduling a patch for later execution, running a patch immediately, and reviewing the patch logs) and patch management reports.
NASA Technical Reports Server (NTRS)
Tomei, B. A.; Smith, L. G.
1986-01-01
Sounding rockets equipped to monitor electron density and its fine structure were launched into the auroral and equatorial ionosphere in 1980 and 1983, respectively. The measurement electronics are based on the Langmuir probe and are described in detail. An approach to the spectral analysis of the density irregularities is addressed and a software algorithm implementing the approach is given. Preliminary results of the analysis are presented.
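The spectral-analysis step for density irregularities can be sketched, under assumptions, as a windowed FFT power spectrum of a uniformly sampled signal; the paper's actual algorithm details are not reproduced here.

```python
import numpy as np

def power_spectrum(signal, dt):
    """One-sided power spectrum of a uniformly sampled signal."""
    n = len(signal)
    detrended = signal - np.mean(signal)
    window = np.hanning(n)                  # taper to reduce spectral leakage
    spec = np.fft.rfft(detrended * window)
    freqs = np.fft.rfftfreq(n, d=dt)
    power = (np.abs(spec) ** 2) / np.sum(window ** 2)
    return freqs, power
```

Peaks in the returned spectrum identify the dominant spatial or temporal scales of the irregularities.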
NASA Astrophysics Data System (ADS)
Sachdeva, Ritika; Soni, Abhinav; Singh, V. P.; Saini, G. S. S.
2018-05-01
Etoricoxib is a selective cyclooxygenase inhibitor drug that plays a significant role in the pharmacological management of arthritis and pain. The theoretical investigation of its reactivity is carried out using Density Functional Theory calculations. The Molecular Electrostatic Potential Surface of etoricoxib and its Mulliken atomic charge distribution are used to predict its electrophilic and nucleophilic sites. A detailed analysis of its frontier molecular orbitals is also presented.
Convergence acceleration of the Proteus computer code with multigrid methods
NASA Technical Reports Server (NTRS)
Demuren, A. O.; Ibraheem, S. O.
1995-01-01
This report presents the results of a study to implement convergence acceleration techniques based on the multigrid concept in the two-dimensional and three-dimensional versions of the Proteus computer code. The first section presents a review of the relevant literature on the implementation of the multigrid methods in computer codes for compressible flow analysis. The next two sections present detailed stability analysis of numerical schemes for solving the Euler and Navier-Stokes equations, based on conventional von Neumann analysis and the bi-grid analysis, respectively. The next section presents details of the computational method used in the Proteus computer code. Finally, the multigrid implementation and applications to several two-dimensional and three-dimensional test problems are presented. The results of the present study show that the multigrid method always leads to a reduction in the number of iterations (or time steps) required for convergence. However, there is an overhead associated with the use of multigrid acceleration. The overhead is higher in 2-D problems than in 3-D problems, thus overall multigrid savings in CPU time are in general better in the latter. Savings of about 40-50 percent are typical in 3-D problems, but they are about 20-30 percent in large 2-D problems. The present multigrid method is applicable to steady-state problems and is therefore ineffective in problems with inherently unstable solutions.
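The multigrid idea the report builds on can be illustrated with a minimal V-cycle for the 1-D Poisson problem -u'' = f with zero Dirichlet boundaries. This is a generic textbook sketch (weighted-Jacobi smoothing, full-weighting restriction, linear prolongation), not the Proteus implementation.

```python
import numpy as np

def jacobi(u, f, h, sweeps=3, omega=2.0 / 3.0):
    """Weighted-Jacobi smoothing for -u'' = f, zero Dirichlet boundaries."""
    for _ in range(sweeps):
        u[1:-1] = ((1 - omega) * u[1:-1]
                   + omega * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1]))
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    return r

def v_cycle(u, f, h):
    """One V-cycle; recurses down to the coarsest grid (one interior point)."""
    n = len(u) - 1
    if n <= 2:
        u[1:-1] = h * h * f[1:-1] / 2.0   # direct solve on the coarsest grid
        return u
    u = jacobi(u, f, h)                   # pre-smoothing
    r = residual(u, f, h)
    rc = np.zeros(n // 2 + 1)             # full-weighting restriction
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    ec = v_cycle(np.zeros_like(rc), rc, 2 * h)
    e = np.zeros_like(u)                  # linear prolongation of the correction
    e[::2] = ec
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])
    u += e
    return jacobi(u, f, h)                # post-smoothing
```

Each cycle damps the algebraic error by a roughly grid-independent factor, which is the source of the iteration-count reductions reported above.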
Dendrobium micropropagation: a review.
da Silva, Jaime A Teixeira; Cardoso, Jean Carlos; Dobránszki, Judit; Zeng, Songjun
2015-05-01
Dendrobium is one of the largest and most important (ornamentally and medicinally) orchid genera. Tissue culture is now an established method for the effective propagation of members of this genus. This review provides a detailed overview of the Dendrobium micropropagation literature. Through a chronological analysis, aspects such as explant, basal medium, plant growth regulators, culture conditions and final organogenic outcome are chronicled in detail. This review will allow Dendrobium specialists to use the information that has been documented to establish, more efficiently, protocols for their own germplasm and to improve in vitro culture conditions based on the optimized parameters detailed in this review. Not only will this expand the use for mass propagation, but will also allow for the conservation of important germplasm. Information on the in vitro responses of Dendrobium for developing efficient protocols for breeding techniques based on tissue culture, such as polyploidization, somatic hybridization, isolation of mutants and somaclonal variants and for synthetic seed and bioreactor technology, or for genetic transformation, is discussed in this review. This is the first such review on this genus and represents half a decade of literature dedicated to Dendrobium micropropagation.
NASA Astrophysics Data System (ADS)
Christian, Paul M.
2002-07-01
This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provided a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed included its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics that were covered in Part I included flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this series will cover a more in-depth look at the analysis and simulation capability and provide an update on the toolbox enhancements. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Optis, Michael; Scott, George N.; Draxl, Caroline
The goal of this analysis was to assess the wind power forecast accuracy of the Vermont Weather Analytics Center (VTWAC) forecast system and to identify potential improvements to the forecasts. Based on the analysis at Georgia Mountain, the following recommendations for improving forecast performance were made: 1. Resolve the significant negative forecast bias in February-March 2017 (50% underprediction on average). 2. Improve the ability of the forecast model to capture the strong diurnal cycle of wind power. 3. Add the ability for the forecast model to assess internal wake loss, particularly at sites where strong diurnal shifts in wind direction are present. Data availability and quality limited the robustness of this forecast assessment. A more thorough analysis would be possible given a longer period of record for the data (at least one full year), detailed supervisory control and data acquisition data for each wind plant, and more detailed information on the forecast system input data and methodologies.
Goel, Nidhi; Singh, Udai P
2013-10-10
Four new acid-base complexes using picric acid [(OH)(NO2)3C6H2] (PA) and N-heterocyclic bases (1,10-phenanthroline (phen)/2,2';6',2"-terpyridine (terpy)/hexamethylenetetramine (hmta)/2,4,6-tri(2-pyridyl)-1,3,5-triazine (tptz)) were prepared and characterized by elemental analysis, IR, NMR and X-ray crystallography. Crystal structures provide detailed information of the noncovalent interactions present in different complexes. The optimized structures of the complexes were calculated in terms of the density functional theory. The thermolysis of these complexes was investigated by TG-DSC and ignition delay measurements. The model-free isoconversional and model-fitting kinetic approaches have been applied to isothermal TG data for kinetics investigation of thermal decomposition of these complexes.
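The model-free isoconversional idea behind such kinetics investigations can be sketched for the isothermal case: at a fixed conversion, ln(t) varies linearly with 1/T with slope Ea/R. The function below is a generic illustration on synthetic data, not the authors' analysis code.

```python
import math

def activation_energy(temps_K, times_to_alpha, R=8.314):
    """Isothermal isoconversional estimate of the activation energy (J/mol):
    least-squares fit of ln(t_alpha) against 1/T; the slope equals Ea/R."""
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(t) for t in times_to_alpha]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return slope * R
```

Repeating the fit at several conversion levels reveals whether Ea drifts with conversion, the usual signature of a multi-step decomposition mechanism.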
Koshiyama, Kenichiro; Nishimoto, Keisuke; Ii, Satoshi; Sera, Toshihiro; Wada, Shigeo
2018-01-20
The pulmonary acinus is a dead-end microstructure that consists of ducts and alveoli. High-resolution micro-CT imaging has recently provided detailed anatomical information of a complete in vivo acinus, but relating its mechanical response with its detailed acinar structure remains challenging. This study aimed to investigate the mechanical response of acinar tissue in a whole acinus for static inflation using computational approaches. We performed finite element analysis of a whole acinus for static inflation. The acinar structure model was generated based on micro-CT images of an intact acinus. A continuum mechanics model of the lung parenchyma was used for acinar tissue material model, and surface tension effects were explicitly included. An anisotropic mechanical field analysis based on a stretch tensor was combined with a curvature-based local structure analysis. The airspace of the acinus exhibited nonspherical deformation as a result of the anisotropic deformation of acinar tissue. A strain hotspot occurred at the ridge-shaped region caused by a rod-like deformation of acinar tissue on the ridge. The local structure becomes bowl-shaped for inflation and, without surface tension effects, the surface of the bowl-shaped region primarily experiences isotropic deformation. Surface tension effects suppressed the increase in airspace volume and inner surface area, while facilitating anisotropic deformation on the alveolar surface. In the lungs, the heterogeneous acinar structure and surface tension induce anisotropic deformation at the acinar and alveolar scales. Further research is needed on structural variation of acini, inter-acini connectivity, or dynamic behavior to understand multiscale lung mechanics. Copyright © 2018 Elsevier Ltd. All rights reserved.
Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities
NASA Astrophysics Data System (ADS)
Esposito, Gaetano
Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have significant level of uncertainty due to limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach of reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37--38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. 
The problems of both understanding non-linear interactions between kinetic parameters and identifying sources of uncertainty affecting relevant reaction pathways are usually addressed by resorting to Global Sensitivity Analysis (GSA) techniques. In particular, the most sensitive reactions controlling combustion phenomena are first identified using the Morris Method and then analyzed under the Random Sampling -- High Dimensional Model Representation (RS-HDMR) framework. The HDMR decomposition shows that 10% of the variance seen in the extinction strain rate of non-premixed flames is due to second-order effects between parameters, whereas the maximum concentration of acetylene, a key soot precursor, is affected by mostly only first-order contributions. Moreover, the analysis of the global sensitivity indices demonstrates that improving the accuracy of the reaction rates including the vinyl radical, C2H3, can drastically reduce the uncertainty of predicting targeted flame properties. Finally, the back-propagation of the experimental uncertainty of the extinction strain rate to the parameter space is also performed. This exercise, achieved by recycling the numerical solutions of the RS-HDMR, shows that some regions of the parameter space have a high probability of reproducing the experimental value of the extinction strain rate between its own uncertainty bounds. Therefore this study demonstrates that the uncertainty analysis of bulk flame properties can effectively provide information on relevant chemical reactions.
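The Principal Component Analysis step used for skeletal reduction can be illustrated on a synthetic local-sensitivity matrix (targets × reactions): reactions with large variance-weighted loadings on the leading components are retained. The matrix, weighting scheme, and component count below are illustrative assumptions, not the paper's mechanism data.

```python
import numpy as np

def rank_reactions(S, n_components=2):
    """Rank reactions by their participation in the leading principal
    components of a (targets x reactions) local sensitivity matrix S."""
    # center columns, then SVD: rows of Vt are directions in reaction space
    Sc = S - S.mean(axis=0)
    U, sigma, Vt = np.linalg.svd(Sc, full_matrices=False)
    k = min(n_components, len(sigma))
    # importance: variance-weighted squared loadings over the top-k components
    importance = (sigma[:k, None] ** 2 * Vt[:k] ** 2).sum(axis=0)
    return np.argsort(importance)[::-1], importance
```

Reactions ranked near the bottom are candidates for removal from the skeletal mechanism, subject to verification against the full model.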
Laboratory cost control and financial management software.
Mayer, M
1998-02-09
Economic constraints within the health care system advocate the introduction of tighter cost control in clinical laboratories. Detailed cost information forms the basis for cost control and financial management. Based on cost information, proper decisions regarding priorities, procedure choices, personnel policies and investments can be made. This presentation outlines some principles of cost analysis, describes common limitations of cost analysis, and exemplifies the use of software to achieve optimized cost control. One commercially available cost analysis software package, LabCost, is described in some detail. In addition to providing cost information, LabCost also serves as a general management tool for resource handling, accounting, inventory management and billing. The application of LabCost in the selection process for a new high-throughput analyzer for a large clinical chemistry service is taken as an example of decisions that can be assisted by cost evaluation. It is concluded that laboratory management that wisely utilizes cost analysis to support the decision-making process will undoubtedly have a clear advantage over laboratories that fail to employ cost considerations to guide their actions.
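As a minimal illustration of the kind of cost information such software provides, a fully loaded cost per test can be computed from direct costs plus an overhead allocation. The formula, parameter names, and overhead treatment below are hypothetical, not LabCost's actual model.

```python
def cost_per_test(reagent_cost, labor_minutes, labor_rate_per_min,
                  instrument_cost_per_year, annual_test_volume,
                  overhead_rate=0.20):
    """Fully loaded cost per test: direct costs (reagents, labor, allocated
    annual instrument cost) plus a proportional overhead mark-up."""
    direct = (reagent_cost
              + labor_minutes * labor_rate_per_min
              + instrument_cost_per_year / annual_test_volume)
    return direct * (1.0 + overhead_rate)
```

Comparing this figure across candidate analyzers is one way cost evaluation can support an instrument selection decision like the one described above.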
Jarchi, Delaram; Lo, Benny; Wong, Charence; Ieong, Edmund; Nathwani, Dinesh; Yang, Guang-Zhong
2016-08-01
Objective assessment of detailed gait patterns after orthopaedic surgery is important for post-surgical follow-up and rehabilitation. The purpose of this paper is to assess the use of a single ear-worn sensor for clinical gait analysis. A reliability measure is devised for indicating the confidence level of the estimated gait events, allowing it to be used in free-walking environments and for facilitating clinical assessment of orthopaedic patients after surgery. Patient groups prior to or following anterior cruciate ligament (ACL) reconstruction and knee replacement were recruited to assess the proposed method. The ability of the sensor for detailed longitudinal analysis is demonstrated with a group of patients after lower limb reconstruction by considering parameters such as temporal and force-related gait asymmetry derived from gait events. The results suggest that the ear-worn sensor can be used for objective gait assessments of orthopaedic patients without the requirement and expense of an elaborate laboratory setup for gait analysis. It significantly simplifies the monitoring protocol and opens the possibilities for home-based remote patient assessment.
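A simple form of the asymmetry parameters mentioned above is a left/right symmetry index computed from per-side gait parameters (for example, step or stance times derived from gait events); the exact definition used in the paper may differ, so this is an illustrative sketch.

```python
def symmetry_index(left, right):
    """Percentage asymmetry between left and right gait parameters;
    0 means perfect symmetry."""
    return abs(left - right) / (0.5 * (left + right)) * 100.0
```

Tracking this index over weeks of free walking is one way to quantify recovery of symmetric gait after lower limb surgery.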
An innovative and shared methodology for event reconstruction using images in forensic science.
Milliet, Quentin; Jendly, Manon; Delémont, Olivier
2015-09-01
This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the community of practices at stake in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Functional Interaction Network Construction and Analysis for Disease Discovery.
Wu, Guanming; Haw, Robin
2017-01-01
Network-based approaches project seemingly unrelated genes or proteins onto a large-scale network context, providing a holistic visualization and analysis platform for genomic data generated from high-throughput experiments, reducing the dimensionality of the data by using network modules, and increasing statistical power. Based on the Reactome database, the most popular and comprehensive open-source biological pathway knowledgebase, we have developed a highly reliable protein functional interaction network covering around 60% of human genes, and an app called ReactomeFIViz for Cytoscape, the most popular biological network visualization and analysis platform. In this chapter, we describe the detailed procedures by which this functional interaction network is constructed: integrating multiple external data sources, extracting functional interactions from human-curated pathway databases, building and training a Naïve Bayesian Classifier, predicting interactions with the trained classifier, and finally constructing the functional interaction database. We also provide an example of how to use ReactomeFIViz to perform network-based data analysis for a list of genes.
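The Naïve Bayesian Classifier step can be sketched with a tiny Bernoulli model over binary evidence features for protein pairs. The feature set and training data below are toy assumptions for illustration, not the Reactome pipeline or its actual evidence sources.

```python
import math

def train_nb(X, y, alpha=1.0):
    """Fit a Bernoulli naive Bayes model: X holds binary feature vectors
    (one per protein pair), y the interaction labels (0/1)."""
    n_features = len(X[0])
    counts = {0: [0] * n_features, 1: [0] * n_features}
    totals = {0: 0, 1: 0}
    for xi, yi in zip(X, y):
        totals[yi] += 1
        for j, v in enumerate(xi):
            counts[yi][j] += v
    return {
        "prior": {c: totals[c] / len(y) for c in (0, 1)},
        # Laplace-smoothed P(feature_j = 1 | class)
        "theta": {c: [(counts[c][j] + alpha) / (totals[c] + 2 * alpha)
                      for j in range(n_features)] for c in (0, 1)},
    }

def predict_proba(model, x):
    """Posterior probability that the pair interacts (class 1)."""
    log_post = {}
    for c in (0, 1):
        lp = math.log(model["prior"][c])
        for j, v in enumerate(x):
            t = model["theta"][c][j]
            lp += math.log(t if v else 1.0 - t)
        log_post[c] = lp
    m = max(log_post.values())
    z = sum(math.exp(lp - m) for lp in log_post.values())
    return math.exp(log_post[1] - m) / z
```

Thresholding the posterior yields predicted functional interactions, which are then merged with the curated pathway interactions to build the network.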
Renton, Michael
2011-01-01
Background and aims: Simulations that integrate sub-models of important biological processes can be used to ask questions about optimal management strategies in agricultural and ecological systems. Building sub-models with more detail and aiming for greater accuracy and realism may seem attractive, but is likely to be more expensive and time-consuming and to result in more complicated models that lack transparency. This paper illustrates a general integrated approach for constructing models of agricultural and ecological systems that is based on the principle of starting simple and then directly testing for the need to add additional detail and complexity. Methodology: The approach is demonstrated using LUSO (Land Use Sequence Optimizer), an agricultural system analysis framework based on simulation and optimization. A simple sensitivity analysis and a functional perturbation analysis are used to test to what extent LUSO's crop–weed competition sub-model affects the answers to a number of questions at the scale of the whole farming system regarding optimal land-use sequencing strategies and resulting profitability. Principal results: The need for accuracy in the crop–weed competition sub-model within LUSO depended to a small extent on the parameter being varied, but more importantly and interestingly on the type of question being addressed with the model. Only a small part of the crop–weed competition model actually affects the answers to these questions. Conclusions: This study illustrates an example application of the proposed integrated approach for constructing models of agricultural and ecological systems based on testing whether complexity needs to be added to address particular questions of interest. We conclude that this example clearly demonstrates the potential value of the general approach.
Advantages of this approach include minimizing costs and resources required for model construction, keeping models transparent and easy to analyse, and ensuring the model is well suited to address the question of interest. PMID:22476477
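The "start simple, then test whether complexity matters" idea can be illustrated with a toy sketch in the spirit of LUSO: a minimal crop–weed competition sub-model (a Cousens-style rectangular hyperbola) embedded in a land-use-sequence profit calculation, with the competition parameter perturbed to see whether the system-level answer (the optimal sequence) changes. All parameter values and the weed-dynamics rules below are invented for illustration and are not taken from LUSO:

```python
def yield_loss(weed_density, i=0.5, a=0.8):
    """Cousens rectangular-hyperbola yield-loss fraction (toy parameters)."""
    return (i * weed_density) / (1 + i * weed_density / a)

def sequence_profit(sequence, d0=5.0, price=250.0, potential_yield=2.0,
                    crop_cost=150.0, i=0.5):
    """Gross margin of a land-use sequence under the toy weed sub-model."""
    profit, d = 0.0, d0
    for land_use in sequence:
        if land_use == "crop":
            profit += price * potential_yield * (1 - yield_loss(d, i=i)) - crop_cost
            d *= 1.4   # weed seed bank builds up under continuous cropping
        else:          # "break" year: no crop income, strong weed control
            d *= 0.1
    return profit

def best_sequence(sequences, **kwargs):
    """The system-level answer: which sequence maximizes profit?"""
    return max(sequences, key=lambda s: sequence_profit(s, **kwargs))

sequences = [("crop",) * 4,
             ("crop", "crop", "break", "crop"),
             ("break", "crop", "crop", "crop")]
# Perturb the competition parameter i by +/-20% and check whether the
# system-level answer (the optimal sequence) changes at all.
answers = {i_val: best_sequence(sequences, i=i_val) for i_val in (0.4, 0.5, 0.6)}
```

If the optimal sequence is unchanged across the perturbation, the sub-model detail did not matter for this question, which is exactly the kind of test the paper advocates.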
Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage analysis is a family-based method used to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis, the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.), chosen because, in addition to performing linkage analysis, it includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis, using output from SEGREG, which was used to determine the best-fitting statistical model for the trait.
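The core quantity in model-based linkage analysis is the LOD score: the log10 likelihood ratio of linkage at recombination fraction theta versus free recombination (theta = 0.5). A minimal sketch for a phase-known, fully informative cross, which is far simpler than the pedigree likelihoods LODLINK actually computes:

```python
import math

def lod_score(recombinants, total, theta):
    """Two-point LOD score for a phase-known, fully informative cross."""
    if not 0 < theta < 1:
        raise ValueError("theta must lie in (0, 1)")
    l_theta = (recombinants * math.log10(theta)
               + (total - recombinants) * math.log10(1 - theta))
    l_null = total * math.log10(0.5)   # likelihood under free recombination
    return l_theta - l_null

def max_lod(recombinants, total):
    """Maximize the LOD score over a grid of recombination fractions."""
    grid = [t / 1000 for t in range(1, 500)]
    return max((lod_score(recombinants, total, t), t) for t in grid)
```

For 1 recombinant in 10 informative meioses, the score peaks near theta = 0.1; a LOD of 3 or more is the conventional evidence threshold for linkage.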
Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.
2011-01-01
Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. 
Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms. Copyright 2011 by the American Geophysical Union.
A Combined Experimental/Computational Investigation of a Rocket Based Combined Cycle Inlet
NASA Technical Reports Server (NTRS)
Smart, Michael K.; Trexler, Carl A.; Goldman, Allen L.
2001-01-01
A rocket based combined cycle inlet geometry has undergone wind tunnel testing and computational analysis with Mach 4 flow at the inlet face. Performance parameters obtained from the wind tunnel tests were the mass capture, the maximum back-pressure, and the self-starting characteristics of the inlet. The CFD analysis supplied a confirmation of the mass capture, the inlet efficiency and the details of the flowfield structure. Physical parameters varied during the test program were cowl geometry, cowl position, body-side bleed magnitude and ingested boundary layer thickness. An optimum configuration was determined for the inlet as a result of this work.
Latent Class Analysis of Incomplete Data via an Entropy-Based Criterion
Larose, Chantal; Harel, Ofer; Kordas, Katarzyna; Dey, Dipak K.
2016-01-01
Latent class analysis is used to group categorical data into classes via a probability model. Model selection criteria then judge how well the model fits the data. When addressing incomplete data, the current methodology restricts the imputation to a single, pre-specified number of classes. We seek to develop an entropy-based model selection criterion that does not restrict the imputation to one number of clusters. Simulations show the new criterion performing well against the current standards of AIC and BIC, while a family studies application demonstrates how the criterion provides more detailed and useful results than AIC and BIC. PMID:27695391
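A minimal sketch of an entropy-based criterion: the entropy of the posterior class-membership probabilities measures how cleanly observations separate into latent classes, and an ICL-style criterion adds it to BIC. This is a generic illustration of the idea, not the authors' proposed criterion:

```python
import math

def classification_entropy(posteriors):
    """Total entropy of posterior class memberships; 0 = perfectly separated."""
    return -sum(p * math.log(p) for row in posteriors for p in row if p > 0)

def relative_entropy(posteriors):
    """Entropy scaled to [0, 1] by its maximum value n * log(K)."""
    n, k = len(posteriors), len(posteriors[0])
    return classification_entropy(posteriors) / (n * math.log(k))

def icl_like_criterion(log_lik, n_params, n_obs, posteriors):
    """BIC plus an entropy penalty (ICL-style); lower is better."""
    bic = -2.0 * log_lik + n_params * math.log(n_obs)
    return bic + 2.0 * classification_entropy(posteriors)
```

With equal fit, a model whose classes overlap heavily (fuzzy posteriors) is penalized relative to one with crisp class assignments.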
Comparison between two methodologies for urban drainage decision aid.
Moura, P M; Baptista, M B; Barraud, S
2006-01-01
The objective of the present work is to compare two methodologies based on multicriteria analysis for the evaluation of stormwater systems. The first methodology, developed in Brazil, is based on performance-cost analysis; the second is ELECTRE III. Both methodologies were applied to a case study. Sensitivity and robustness analyses were then carried out. These analyses demonstrate that both methodologies give equivalent results and present low sensitivity and high robustness. These results show that the Brazilian methodology is consistent and can be used safely to select a good solution, or a small set of good solutions that can then be compared with more detailed methods.
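The outranking idea behind ELECTRE can be sketched with a crisp concordance index: the fraction of criterion weight supporting "alternative a is at least as good as b". ELECTRE III proper uses per-criterion indifference and preference thresholds and a discordance test, all of which this toy version omits:

```python
def concordance(a, b, weights, maximize):
    """Crisp concordance index of 'a outranks b'.

    a, b: criterion scores of the two alternatives.
    weights: importance weight of each criterion.
    maximize: per criterion, True if larger is better (e.g. performance),
              False if smaller is better (e.g. cost).
    """
    total = sum(weights)
    won = sum(w for ai, bi, w, mx in zip(a, b, weights, maximize)
              if (ai >= bi if mx else ai <= bi))
    return won / total
```

For example, with criteria (cost, performance, robustness), weights (0.5, 0.3, 0.2) and directions (min, max, max), an alternative that wins on cost and performance but loses on robustness gets a concordance of 0.8 against its rival.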
Design and optimization of liquid core optical ring resonator for refractive index sensing.
Lin, Nai; Jiang, Lan; Wang, Sumei; Xiao, Hai; Lu, Yongfeng; Tsai, Hai-Lung
2011-07-10
This study performs a detailed theoretical analysis of refractive index (RI) sensors based on whispering gallery modes (WGMs) in liquid core optical ring resonators (LCORRs). Both TE- and TM-polarized WGMs of various orders are considered. The analysis shows that WGMs of higher orders need thicker walls to achieve a near-zero thermal drift, but WGMs of different orders exhibit similar RI sensing performance at the thermostable wall thicknesses. The RI detection limit is very low at the thermostable thickness. The theoretical predictions should provide general guidance in the development of LCORR-based thermostable RI sensors. © 2011 Optical Society of America
Yong, Alan; Hough, Susan E.; Cox, Brady R.; Rathje, Ellen M.; Bachhuber, Jeff; Dulberg, Ranon; Hulslander, David; Christiansen, Lisa; and Abrams, Michael J.
2011-01-01
We report on a preliminary study to evaluate the use of semi-automated imaging analysis of remotely sensed DEM and field geophysical measurements to develop a seismic-zonation map of Port-au-Prince, Haiti. For in situ data, VS30 values are derived from the MASW technique deployed in and around the city. For satellite imagery, we use an ASTER GDEM of Hispaniola. We apply both pixel- and object-based imaging methods to the ASTER GDEM to explore local topography (absolute elevation values) and classify terrain types such as mountains, alluvial fans and basins/near-shore regions. We assign NEHRP seismic site class ranges based on available VS30 values. A comparison of results from imagery-based methods with results from traditional geology-based approaches reveals good overall correspondence. We conclude that image analysis of RS data provides reliable first-order site characterization results in the absence of local data and can be useful for refining detailed site maps with sparse local data.
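Assigning NEHRP site classes from VS30 can be sketched as a simple threshold mapping. The boundaries below follow the standard NEHRP table; classes E/F also involve soft-soil and special-site criteria that this sketch ignores:

```python
def nehrp_site_class(vs30_m_per_s):
    """Map a VS30 value (m/s) to a NEHRP site class (simplified A-E rules)."""
    if vs30_m_per_s > 1500:
        return "A"   # hard rock
    if vs30_m_per_s > 760:
        return "B"   # rock
    if vs30_m_per_s > 360:
        return "C"   # very dense soil / soft rock
    if vs30_m_per_s > 180:
        return "D"   # stiff soil
    return "E"       # soft soil (F requires site-specific evaluation)
```

A terrain class derived from imagery (mountain, alluvial fan, basin) can then be assigned the site-class range spanned by the VS30 measurements within it.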
Faust, Oliver; Yu, Wenwei; Rajendra Acharya, U
2015-03-01
The concept of real-time is very important, as it deals with the realizability of computer-based health care systems. In this paper we review biomedical real-time systems with a meta-analysis on computational complexity (CC), delay (Δ) and speedup (Sp). During the review we found that, in the majority of papers, the term real-time is used merely to indicate that a proposed system or algorithm is practical; such papers were not considered for detailed scrutiny. Our detailed analysis focused on papers which support their claim of achieving real-time with a discussion of CC or Sp. These papers were analyzed in terms of processing system used, application area (AA), CC, Δ, Sp, implementation/algorithm (I/A) and competition. The results show that the ideas of parallel processing and algorithm delay were only recently introduced and that journal papers focus more on algorithm (A) development than on implementation (I). Most authors compete on big O notation (O) and processing time (PT). Based on these results, we adopt the position that the concept of real-time will continue to play an important role in biomedical systems design. We predict that parallel processing considerations, such as Sp and algorithm scaling, will become more important. Copyright © 2015 Elsevier Ltd. All rights reserved.
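Two of the reviewed quantities can be illustrated directly: speedup bounded by the serial fraction of an algorithm (Amdahl's law) and the delay-versus-deadline test that makes a system real-time in the hard sense. A generic sketch, not tied to any of the reviewed papers:

```python
def amdahl_speedup(serial_fraction, n_processors):
    """Ideal speedup Sp = T1/Tp, bounded by the serial fraction (Amdahl)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

def meets_deadline(processing_delay_ms, deadline_ms):
    """A pipeline is hard real-time only if its worst-case delay
    never exceeds the application deadline."""
    return processing_delay_ms <= deadline_ms
```

With a 10 % serial fraction, ten processors yield a speedup of only about 5.3x, which is why the review treats Sp and algorithm scaling as first-class design quantities rather than afterthoughts.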
Three-Way Analysis of Spectrospatial Electromyography Data: Classification and Interpretation
Kauppi, Jukka-Pekka; Hahne, Janne; Müller, Klaus-Robert; Hyvärinen, Aapo
2015-01-01
Classifying multivariate electromyography (EMG) data is an important problem in prosthesis control as well as in neurophysiological studies and diagnosis. With modern high-density EMG sensor technology, it is possible to capture the rich spectrospatial structure of the myoelectric activity. We hypothesize that multi-way machine learning methods can efficiently utilize this structure in classification as well as reveal interesting patterns in it. To this end, we investigate the suitability of existing three-way classification methods to EMG-based hand movement classification in the spectrospatial domain, as well as extend these methods by sparsification and regularization. We propose to use Fourier-domain independent component analysis as preprocessing to improve classification and interpretability of the results. In high-density EMG experiments on hand movements across 10 subjects, three-way classification yielded higher average performance compared with state-of-the-art classification based on temporal features, suggesting that the three-way analysis approach can efficiently utilize detailed spectrospatial information of high-density EMG. Phase and amplitude patterns of features selected by the classifier in finger-movement data were found to be consistent with known physiology. Thus, our approach can accurately resolve hand and finger movements on the basis of detailed spectrospatial information, and at the same time allows for physiological interpretation of the results. PMID:26039100
First results of the CINDI-2 semi-blind MAX-DOAS intercomparison
NASA Astrophysics Data System (ADS)
Kreher, Karin; van Roozendael, Michel; Hendrick, Francois; Apituley, Arnoud; Friess, Udo; Lampel, Johannes; Piters, Ankie; Richter, Andreas; Wagner, Thomas; Cindi-2 Participants, All
2017-04-01
The second Cabauw Intercomparison campaign for Nitrogen Dioxide measuring Instruments (CINDI-2) took place at the Cabauw Experimental Site for Atmospheric Research (CESAR; Utrecht area, The Netherlands) from 25 August until 7 October 2016. The goals of this intercomparison campaign are to support the creation of high-quality ground-based data sets (e.g. to provide reliable long-term time series for trend analysis and satellite data validation), to characterise and better understand the differences between a large number of MAX-DOAS and DOAS instruments and analysis methods, and to contribute to a harmonisation of the measurement settings and retrieval methods. During a period of 17 days, from 12 to 28 September 2016, a formal semi-blind intercomparison was held following a detailed measurement protocol. The development of this protocol was based on the experience gained during the first CINDI campaign held in 2009 as well as more recent projects and campaigns such as the MADCAT campaign in Mainz, Germany, in 2013. Strong emphasis was put on the careful synchronisation of the measurement sequence and on exact alignment of the elevation angles using horizon scans and lamp measurements. In this presentation, we provide an overview and some highlights of the MAX-DOAS semi-blind intercomparison campaign. We will introduce the participating groups, their instruments and the measurement protocol details, and then summarize the campaign outcomes to date. The CINDI-2 data sets have been investigated using a range of diagnostics including comparisons of daily time series and relative differences between the data sets, regression analysis and correlation plots. The data products so far investigated are NO2 (nitrogen dioxide) in the UV and visible wavelength regions, O4 (oxygen dimer) in the same two wavelength intervals, O3 (ozone) in the UV and visible wavelength regions, HCHO (formaldehyde) and NO2 in an additional (smaller) wavelength range in the visible.
The results based on the regression analysis are presented in summary plots and tables, addressing MAX-DOAS and twilight zenith-sky measurements separately. Further information on instrumental details, such as the alignment of the viewing direction and elevation and the field of view, is also summarized and included in the overall interpretation.
Development of an Aeroelastic Code Based on an Euler/Navier-Stokes Aerodynamic Solver
NASA Technical Reports Server (NTRS)
Bakhle, Milind A.; Srivastava, Rakesh; Keith, Theo G., Jr.; Stefko, George L.; Janus, Mark J.
1996-01-01
This paper describes the development of an aeroelastic code (TURBO-AE) based on an Euler/Navier-Stokes unsteady aerodynamic analysis. A brief review of the relevant research in the area of propulsion aeroelasticity is presented. The paper briefly describes the original Euler/Navier-Stokes code (TURBO) and then details the development of the aeroelastic extensions. The aeroelastic formulation is described. The modeling of the dynamics of the blade using a modal approach is detailed, along with the grid deformation approach used to model the elastic deformation of the blade. The work-per-cycle approach used to evaluate aeroelastic stability is described. Representative results used to verify the code are presented. The paper concludes with an evaluation of the development thus far, and some plans for further development and validation of the TURBO-AE code.
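The work-per-cycle stability test can be sketched as a numerical integral of aerodynamic force against blade velocity over one oscillation period: negative work means the flow damps the mode (stable), positive work feeds energy into it (flutter). A single-degree-of-freedom toy version, not the TURBO-AE implementation:

```python
import math

def work_per_cycle(force, amplitude=1.0, omega=1.0, steps=1000):
    """Integrate W = loop-integral of F dx over one cycle x(t) = A sin(wt).

    force: callable force(x, xdot) modeling the aerodynamic load.
    Returns the net work done on the structure per oscillation cycle.
    """
    dt = 2.0 * math.pi / omega / steps
    w = 0.0
    for k in range(steps):
        t = k * dt
        x = amplitude * math.sin(omega * t)
        xdot = amplitude * omega * math.cos(omega * t)
        w += force(x, xdot) * xdot * dt   # F * dx = F * xdot * dt
    return w
```

A purely dissipative load F = -c*xdot gives W = -c*pi*A^2*omega (aeroelastically stable); any load with a component in phase with velocity that makes W positive indicates flutter.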
Measuring the Carotid to Femoral Pulse Wave Velocity (Cf-PWV) to Evaluate Arterial Stiffness.
Ji, Hongwei; Xiong, Jing; Yu, Shikai; Chi, Chen; Bai, Bin; Teliewubai, Jiadela; Lu, Yuyan; Zhang, Yi; Xu, Yawei
2018-05-03
For the elderly, arterial stiffening is a good marker for aging evaluation, and it is recommended that arterial stiffness be determined noninvasively by measurement of carotid to femoral pulse wave velocity (cf-PWV) (Class I; Level of Evidence A). In the literature, numerous community-based or disease-specific studies have reported that higher cf-PWV is associated with increased cardiovascular risk. Here, we discuss strategies to evaluate arterial stiffness with cf-PWV. Following the well-defined steps detailed here (e.g., proper positioning, distance measurement, and tonometer placement), a standard cf-PWV value for evaluating arterial stiffness is obtained. In this paper, a detailed stepwise method to record good-quality PWV and pulse wave analysis (PWA) using a non-invasive tonometry-based device will be discussed.
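The cf-PWV computation itself is simple: a travel distance divided by the pulse transit time. The sketch below assumes the common convention of scaling the direct carotid-to-femoral surface distance by 0.8; conventions differ between devices, so this is illustrative only:

```python
def cf_pwv(direct_distance_m, transit_time_s, scale=0.8):
    """Carotid-femoral pulse wave velocity in m/s.

    direct_distance_m: tape-measured carotid-to-femoral surface distance.
    transit_time_s: pulse transit time between the two sites.
    scale: 0.8 scaling of the direct distance (common consensus convention;
           verify your device's convention before comparing across studies).
    """
    if transit_time_s <= 0:
        raise ValueError("transit time must be positive")
    return scale * direct_distance_m / transit_time_s
```

For example, a 62.5 cm direct distance and a 50 ms transit time give a cf-PWV of 10 m/s, a value typically read as elevated stiffness in older adults.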
Detection of multiple chemicals based on external cavity quantum cascade laser spectroscopy
NASA Astrophysics Data System (ADS)
Sun, Juan; Ding, Junya; Liu, Ningwu; Yang, Guangxiang; Li, Jingsong
2018-02-01
A laser spectroscopy system based on a broadband tunable external cavity quantum cascade laser (ECQCL) and a mini quartz crystal tuning fork (QCTF) detector was developed for standoff detection of volatile organic compounds (VOCs). The self-established spectral analysis model, based on multiple algorithms for quantitative and qualitative analysis of VOC components (i.e. ethanol and acetone), was investigated in detail in both closed-cell and open-path configurations. Good agreement was obtained between the experimentally observed spectra and the standard reference spectra. For open-path detection of VOCs, the sensor system was demonstrated at a distance of 30 m. The preliminary laboratory results show that standoff detection of VOCs at distances of over 100 m is very promising.
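Quantifying a VOC mixture from a measured spectrum is commonly done by classical least squares against reference spectra (measured approximately equals the sum of c_i times reference_i). The abstract does not state that the authors' multi-algorithm model works this way, so the sketch below is a generic illustration:

```python
def fit_concentrations(measured, references):
    """Classical least squares: solve for component concentrations c such
    that measured ~= sum_i c[i] * references[i], via the normal equations.
    """
    k, n = len(references), len(measured)
    # Build A^T A and A^T y for the linear system (A^T A) c = A^T y.
    ata = [[sum(references[i][p] * references[j][p] for p in range(n))
            for j in range(k)] for i in range(k)]
    aty = [sum(references[i][p] * measured[p] for p in range(n))
           for i in range(k)]
    # Gaussian elimination (no pivoting; fine for well-conditioned toys).
    for col in range(k):
        piv = ata[col][col]
        for j in range(col + 1, k):
            f = ata[j][col] / piv
            for m in range(col, k):
                ata[j][m] -= f * ata[col][m]
            aty[j] -= f * aty[col]
    conc = [0.0] * k
    for i in reversed(range(k)):
        s = aty[i] - sum(ata[i][j] * conc[j] for j in range(i + 1, k))
        conc[i] = s / ata[i][i]
    return conc
```

In practice one would add baseline terms and use a pivoted or regularized solver; the point here is only the linear-mixture model connecting observed and reference spectra.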
NASA Technical Reports Server (NTRS)
Sussman, M. B.; Harkonen, D. L.; Reed, J. B.
1976-01-01
Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive-lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data and to establish a basis for future flight test comparisons.
Web-based data collection: detailed methods of a questionnaire and data gathering tool
Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R
2006-01-01
There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach to facilitate this process through locally developed code and describes the results of using this process after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes data instrument design, data entry and management, and the data tables needed to store the results, and attempts to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556
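One of the reports such a system produces, a crosstab of two questionnaire fields, can be sketched in a few lines. The field names and records here are invented for illustration and are not the Starr County study's code:

```python
from collections import Counter

def crosstab(records, row_field, col_field):
    """Frequency cross-tabulation of two categorical questionnaire fields.

    records: list of dicts, one per respondent.
    Returns {row_value: {col_value: count}} with zero-filled cells.
    """
    counts = Counter((r[row_field], r[col_field]) for r in records)
    row_vals = sorted({rv for rv, _ in counts})
    col_vals = sorted({cv for _, cv in counts})
    return {rv: {cv: counts.get((rv, cv), 0) for cv in col_vals}
            for rv in row_vals}
```

A web front end would render the returned nested dict as an HTML table; the same structure also feeds summary statistics directly.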
Network-based machine learning and graph theory algorithms for precision oncology.
Zhang, Wei; Chien, Jeremy; Yong, Jeongsik; Kuang, Rui
2017-01-01
Network-based analytics plays an increasingly important role in precision oncology. Growing evidence in recent studies suggests that cancer can be better understood through mutated or dysregulated pathways or networks rather than individual mutations and that the efficacy of repositioned drugs can be inferred from disease modules in molecular networks. This article reviews network-based machine learning and graph theory algorithms for integrative analysis of personal genomic data and biomedical knowledge bases to identify tumor-specific molecular mechanisms, candidate targets and repositioned drugs for personalized treatment. The review focuses on the algorithmic design and mathematical formulation of these methods to facilitate applications and implementations of network-based analysis in the practice of precision oncology. We review the methods applied in three scenarios to integrate genomic data and network models in different analysis pipelines, and we examine three categories of network-based approaches for repositioning drugs in drug-disease-gene networks. In addition, we perform a comprehensive subnetwork/pathway analysis of mutations in 31 cancer genome projects in the Cancer Genome Atlas and present a detailed case study on ovarian cancer. Finally, we discuss interesting observations, potential pitfalls and future directions in network-based precision oncology.
NASA Technical Reports Server (NTRS)
Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.
2006-01-01
System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.
Generating mouse lines for lineage tracing and knockout studies.
Kraus, Petra; Sivakamasundari, V; Xing, Xing; Lufkin, Thomas
2014-01-01
In 2007 Capecchi, Evans, and Smithies received the Nobel Prize in recognition of their discovery of the principles for introducing specific gene modifications in mice via embryonic stem cells, a technology that has revolutionized the field of biomedical science by allowing for the generation of genetically engineered animals. Here we describe detailed protocols based on and developed from these ground-breaking discoveries, allowing genes to be modified not only to create mutations for studying gene function but also to carry fluorescent markers, thus permitting the isolation of specific rare wild-type and mutant cell types for further detailed analysis at the biochemical, pathological, and genomic levels.
The observed life cycle of a baroclinic instability
NASA Technical Reports Server (NTRS)
Randel, W. J.; Stanford, J. L.
1985-01-01
Medium-scale waves (zonal wavenumbers 4-7) frequently dominate Southern Hemisphere summer circulation patterns. Randel and Stanford have studied the dynamics of these features, demonstrating that the medium-scale waves result from baroclinic excitation and exhibit well-defined life cycles. This study details the evolution of the medium-scale waves during a particular life cycle. The specific case chosen exhibits a high degree of zonal symmetry, prompting study based upon zonally averaged diagnostics. An analysis of the medium-scale wave energetics reveals a well-defined life cycle of baroclinic growth, maturity, and barotropic decay. Eliassen-Palm flux diagrams detail the daily wave structure and its interaction with the zonally-averaged flow.
Analysis of hybrid mode-locking of two-section quantum dot lasers operating at 1.5 microm.
Heck, Martijn J R; Salumbides, Edcel J; Renault, Amandine; Bente, Erwin A J M; Oei, Yok-Siang; Smit, Meint K; van Veldhoven, René; Nötzel, Richard; Eikema, Kjeld S E; Ubachs, Wim
2009-09-28
For the first time, a detailed study of hybrid mode-locking in two-section InAs/InP quantum dot Fabry-Pérot-type lasers is presented. The output pulses have a typical upchirp of approximately 8 ps/nm, leading to very elongated pulses. The mechanism leading to this typical pulse shape and the phase noise is investigated by detailed radio-frequency and optical spectral studies as well as time-domain studies. The pulse shaping mechanism in these lasers is found to be fundamentally different from that observed in conventional mode-locked laser diodes based on quantum well or bulk gain material.
Integrated Glass Coating Manufacturing Line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brophy, Brenor
2015-09-30
This project aims to enable US module manufacturers to coat glass with Enki’s state of the art tunable functionalized AR coatings at the lowest possible cost and highest possible performance by encapsulating Enki’s coating process in an integrated tool that facilitates effective process improvement through metrology and data analysis for greater quality and performance while reducing footprint, operating and capital costs. The Phase 1 objective was a fully designed manufacturing line, including fully specified equipment ready for issue of purchase requisitions; a detailed economic justification based on market prices at the end of Phase 1 and projected manufacturing costs; and a detailed deployment plan for the equipment.
Small, high pressure ratio compressor: Aerodynamic and mechanical design
NASA Technical Reports Server (NTRS)
Bryce, C. A.; Erwin, J. R.; Perrone, G. L.; Nelson, E. L.; Tu, R. K.; Bosco, A.
1973-01-01
The Small, High-Pressure-Ratio Compressor Program was directed toward the analysis, design, and fabrication of a centrifugal compressor providing a 6:1 pressure ratio and an airflow rate of 2.0 pounds per second. The program consisted of preliminary design, detailed aerodynamic design, mechanical design, and mechanical acceptance tests. The preliminary design evaluated radial- and backward-curved blades, tandem-bladed impellers, impeller- and diffuser-passage boundary-layer control, and vane, pipe, and multiple-stage diffusers. Based on this evaluation, a configuration was selected for detailed aerodynamic and mechanical design. A mechanical acceptance test was performed to demonstrate that the mechanical design objectives of the research package were met.
Integration of remote sensing based surface information into a three-dimensional microclimate model
NASA Astrophysics Data System (ADS)
Heldens, Wieke; Heiden, Uta; Esch, Thomas; Mueller, Andreas; Dech, Stefan
2017-03-01
Climate change urges cities to consider the urban climate as part of sustainable planning. Urban microclimate models can provide knowledge of the climate at building-block level, but they require very detailed information on the area of interest; most microclimate studies therefore rely on assumptions and generalizations to describe the model area. Remote sensing data with area-wide coverage provide a means to derive many parameters at the detailed spatial and thematic scale required by urban climate models. This study shows how microclimate simulations for a series of real-world urban areas can be supported by remote sensing data. In an automated process, surface materials, albedo, LAI/LAD and object height have been derived and integrated into the urban microclimate model ENVI-met. Multiple microclimate simulations have been carried out, both with the dynamic remote-sensing-based input data and with manual, static input data, to analyze the impact of the RS-based surface information and the suitability of the applied data and techniques. The automated processing chain proved valuable for integrating the remote-sensing-based input data into ENVI-met: it avoids tedious manual editing and allows fast, area-wide generation of simulation areas. The analysis of the different simulation modes shows the importance of high-quality height data, detailed surface material information and albedo.
Koprivica, Mladen; Neskovic, Natasa; Neskovic, Aleksandar; Paunovic, George
2014-01-01
As a result of dense installations of public mobile base stations, additional electromagnetic radiation occurs in the living environment. In order to determine the level of radio-frequency radiation generated by base stations, extensive electromagnetic field strength measurements were carried out at 664 base station locations. Base station locations were classified into three categories: indoor, masts and locations with installations on buildings. Given the large percentage (47 %) of sites with antenna masts, a detailed analysis of this location category was performed, and the measurement results were presented. It was concluded that the total electric field strength in the vicinity of base station antenna masts in no case exceeded 10 V m(-1), which is well below the International Commission on Non-Ionizing Radiation Protection reference levels. At horizontal distances >50 m from the mast bottom, the median and maximum values were <1 and 2 V m(-1), respectively.
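Summarizing such a measurement campaign involves combining co-located emitters (root-sum-square for field strengths) and reporting median/maximum values against a reference level. A sketch with illustrative numbers only; the actual ICNIRP reference level is frequency dependent, so the 10 V/m threshold here is purely for illustration:

```python
import math
import statistics

def total_field(component_fields_v_per_m):
    """Root-sum-square combination of field strengths from co-located emitters."""
    return math.sqrt(sum(e * e for e in component_fields_v_per_m))

def summarize(measurements_v_per_m, reference_level=10.0):
    """Median/maximum summary of a set of site measurements, plus a simple
    check against an illustrative reference level in V/m."""
    peak = max(measurements_v_per_m)
    return {"median": statistics.median(measurements_v_per_m),
            "max": peak,
            "below_reference": peak <= reference_level}
```

Applied per distance band (e.g. <50 m vs >50 m from the mast bottom), this reproduces the kind of median/maximum-by-category reporting the study presents.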
Laser-Based Lighting: Experimental Analysis and Perspectives
Yushchenko, Maksym; Buffolo, Matteo; Meneghini, Matteo; Zanoni, Enrico
2017-01-01
This paper presents an extensive analysis of the operating principles, theoretical background, advantages and limitations of laser-based lighting systems. In the first part of the paper we discuss the main advantages and issues of laser-based lighting and present a comparison with conventional LED lighting technology. In the second part of the paper, we present original experimental data on the stability and reliability of phosphor layers for laser lighting, based on high-light-intensity and high-temperature degradation tests. In the third part of the paper, we present for the first time a detailed comparison between three different solutions for laser lighting, based on (i) transmissive phosphor layers; (ii) a reflective/angled phosphor layer; and (iii) a parabolic reflector, discussing the advantages and drawbacks of each approach. The results presented within this paper can be used as a guideline for the development of advanced lighting systems based on laser diodes. PMID:29019958
Missile Defense Information Technology Small Business Conference
2009-09-01
NetOps Survivability 4 • Supported User Base • Number of Workstations • Number of Servers • Number of Special Circuits • Number of Sites • Number...Contracts, MDIOC • Ground Test (DTC) • MDSEC (SS) • Infrastructure (IC) • BMDS Support (BCT) • JTAAS – SETA • Mod & Sim ( DES ) • Analysis (GML) • Tenants...AUG 09) 4 MDA DOCE Engineering Functions • Design Engineers – Develop detailed design artifacts based on architectural specifications – Coordinate
Multidisciplinary aeroelastic analysis of a generic hypersonic vehicle
NASA Technical Reports Server (NTRS)
Gupta, K. K.; Petersen, K. L.
1993-01-01
This paper presents details of a flutter and stability analysis of aerospace structures such as hypersonic vehicles. Both structural and aerodynamic domains are discretized by the common finite element technique. A vibration analysis is first performed by the STARS code employing a block Lanczos solution scheme. This is followed by the generation of a linear aerodynamic grid for subsequent linear flutter analysis within subsonic and supersonic regimes of the flight envelope; the doublet lattice and constant pressure techniques are employed to generate the unsteady aerodynamic forces. Flutter analysis is then performed for several representative flight points. The nonlinear flutter solution is effected by first implementing a CFD solution of the entire vehicle. Thus, a 3-D unstructured grid for the entire flow domain is generated by a moving front technique. A finite element Euler solution is then implemented employing a quasi-implicit as well as an explicit solution scheme. A novel multidisciplinary analysis is next effected that employs modal and aerodynamic data to yield aerodynamic damping characteristics. Such analyses are performed for a number of flight points to yield a large set of pertinent data that define flight flutter characteristics of the vehicle. This paper outlines the finite-element-based integrated analysis procedures in detail, which is followed by the results of numerical analyses of flight flutter simulation.
NASA Astrophysics Data System (ADS)
Kang, Daiwen
In this research, the sources, distributions, transport, ozone formation potential, and biogenic emissions of VOCs are investigated, focusing on three Southeast United States National Parks: Shenandoah National Park, Big Meadows site (SHEN); Great Smoky Mountains National Park at Cove Mountain (GRSM); and Mammoth Cave National Park (MACA). A detailed modeling analysis is conducted using the Multiscale Air Quality SImulation Platform (MAQSIP), focusing on nonmethane hydrocarbons and ozone characterized by high O3 surface concentrations. Nine emissions perturbation scenarios, in addition to the base scenario, are designed and utilized in the model simulations. In the observation-based analysis, source classification techniques based on correlation coefficients, chemical reactivity, and certain ratios were developed and applied to the data set. Anthropogenic VOCs from automobile exhaust dominate at Mammoth Cave National Park and at Cove Mountain, Great Smoky Mountains National Park, while at Big Meadows, Shenandoah National Park, the source composition is complex and changed from 1995 to 1996. The dependence of isoprene concentrations on ambient temperature is investigated, and similar regression relationships are obtained for all three monitoring locations. Propylene-equivalent concentrations are calculated to account for differences in reaction rates between OH and individual hydrocarbons, and thereby to estimate their relative contributions to ozone formation. Isoprene fluxes were also estimated for all these rural areas. Model predictions (base scenario) tend to give daily maximum O3 concentrations 10 to 30% lower than observations.
Model-predicted concentrations of lumped paraffin compounds are of the same order of magnitude as the observed values, while the observed concentrations of other species (isoprene, ethene, surrogate olefin, surrogate toluene, and surrogate xylene) are usually an order of magnitude higher than the predictions. Model predictions are compared with the observed values at the three locations for the same time period. Detailed sensitivity and process analyses in terms of ozone and VOC budgets, and the relative importance of various VOC species, are provided. (Abstract shortened by UMI.)
What Touched Your Heart? Collaborative Story Analysis Emerging From an Apsáalooke Cultural Context.
Hallett, John; Held, Suzanne; McCormick, Alma Knows His Gun; Simonds, Vanessa; Real Bird, Sloane; Martin, Christine; Simpson, Colleen; Schure, Mark; Turnsplenty, Nicole; Trottier, Coleen
2017-07-01
Community-based participatory research and decolonizing research share some recommendations for best practices for conducting research. One commonality is partnering on all stages of research; co-developing methods of data analysis is one stage with a deficit of partnering examples. We present a novel community-based and developed method for analyzing qualitative data within an Indigenous health study and explain incompatibilities of existing methods for our purposes and community needs. We describe how we explored available literature, received counsel from community Elders and experts in the field, and collaboratively developed a data analysis method consonant with community values. The method of analysis, in which interview/story remained intact, team members received story, made meaning through discussion, and generated a conceptual framework to inform intervention development, is detailed. We offer the development process and method as an example for researchers working with communities who want to keep stories intact during qualitative data analysis.
Meshfree truncated hierarchical refinement for isogeometric analysis
NASA Astrophysics Data System (ADS)
Atri, H. R.; Shojaee, S.
2018-05-01
In this paper, the truncated hierarchical B-spline (THB-spline) is coupled with the reproducing kernel particle method (RKPM) to blend the advantages of isogeometric analysis and meshfree methods. Since, under certain conditions, the isogeometric B-spline and NURBS basis functions are exactly represented by reproducing kernel meshfree shape functions, the recursive process of producing isogeometric bases can be omitted. More importantly, a seamless link between meshfree methods and isogeometric analysis can be easily defined, which provides an authentic meshfree approach to refining the model locally in isogeometric analysis. This can be accomplished using truncated hierarchical B-splines to construct new bases and adaptively refine them. It is also shown that the THB-RKPM method provides efficient approximation schemes for numerical simulation and promising performance in the adaptive refinement of partial differential equations via isogeometric analysis. The proposed approach for adaptive local refinement is presented in detail and its effectiveness is investigated through well-known benchmark examples.
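The B-spline bases underlying both isogeometric analysis and, via the equivalence noted above, the meshfree shape functions can be evaluated with the standard Cox-de Boor recursion. The sketch below is generic textbook code, not the paper's THB-RKPM implementation.

```python
# Minimal Cox-de Boor recursion for B-spline basis functions, the building
# block of isogeometric analysis. Illustrative only.
def bspline_basis(i, p, u, knots):
    """Value of the i-th degree-p B-spline basis function at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) \
               * bspline_basis(i, p - 1, u, knots)
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

# Quadratic bases on an open knot vector; their values at any interior point
# sum to 1 (partition of unity).
knots = [0, 0, 0, 0.5, 1, 1, 1]
vals = [bspline_basis(i, 2, 0.5, knots) for i in range(4)]
print(sum(vals))  # 1.0
```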
Dou, Xinyuan; Chen, Xiaonan; Chen, Maggie Yihong; Wang, Alan Xiaolong; Jiang, Wei; Chen, Ray T
2010-03-01
In this paper, we report a theoretical study of polymer-based photonic crystals for laser beam steering based on the superprism effect, as well as the experimental fabrication of two-dimensional photonic crystals for laser beam steering. The superprism effect, the principle behind the beam steering, was studied in detail through equifrequency contour (EFC) analysis. Polymer-based photonic crystals were fabricated by a double-exposure holographic interference method using SU8-2007. The experimental results showed a beam steering angle of 10 degrees for a 30 nm wavelength variation.
Design of an electro-optic-polymer-based Mach-Zehnder modulator
NASA Astrophysics Data System (ADS)
Haugen, Chris J.; DeCorby, Ray G.; McMullin, James N.; Pulikkaseril, C.
2000-12-01
A novel structure for an electro-optic (e-o) polymer-based Mach-Zehnder modulator is proposed and its anticipated device performance is detailed. The modulator is designed using commercially available materials and makes use of well-characterized electrical and optical structures. The modulator is designed to be competitive with the performance of LiNbO3-based modulators. The results of the analysis predict a bandwidth of 20 GHz, a half-wave voltage (Vπ) of 8-10 V, an optical insertion loss of 5 dB, and a contrast ratio of approximately 13 dB.
A review of biostratigraphic studies in the olistostrome deposits of Karangsambung Formation
NASA Astrophysics Data System (ADS)
Hendrizan, Marfasran
2018-02-01
Planktonic foraminifera are widely used for marine sediment biostratigraphy, yet the foraminiferal biostratigraphy of the Karangsambung Formation has rarely been investigated by previous researchers. This review of foraminiferal biostratigraphy is intended as early work toward dating the Tertiary rock formations in Karangsambung. The research area was formed by an olistostrome process: a sedimentary slide deposit characterized by bodies of harder rock mixed and dispersed in a matrix. Biostratigraphic studies based on foraminifera and nannoplankton in the Karangsambung Formation are still qualitative analyses using fossil biomarkers, and the age of the formation remains debatable on the basis of foraminifera and nannofossil analysis. Two explanations of the debatable ages have been proposed for the Karangsambung area: first, that the Karangsambung Formation records normal sedimentation in some places and is an olistostrome product in others, such as the Kali Welaran and Clebok sections; and second, that the Karangsambung Formation is entirely an olistostrome deposit. Micropaleontological sampling and analysis of the matrix clays of the olistostrome were, however, neglected, so biostratigraphic results from those matrix clays were treated as recording normal sedimentation and yielded an age of middle Eocene to Oligocene. We suppose that previous authors picked samples from the matrix of the Karangsambung Formation in several river sections, which would lead to misinterpretation of the formation's age. The middle to late Eocene age probably dates older sediment that was reworked by sliding and accumulated in the Karangsambung Formation. The Karangsambung Formation itself dates to the Oligocene, based on the finding of several calcareous nannofossils. Detailed micropaleontological analysis of the olistostrome deposits in the Karangsambung Formation should therefore be re-evaluated to establish an accurate date.
Re-evaluation should start from detailed sedimentological mapping of the Karangsambung Formation transects described by previous authors, especially the Kali Welaran, Jatibungkus, and Clebok sections, followed by systematic sampling of normal sedimentation deposits, olistostrome products, and the matrix clays of the Karangsambung Formation olistostrome. Finally, quantitative micropaleontological analysis can be applied to identify the age of the Karangsambung Formation.
Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.
1994-01-01
This research addresses issues in simulation-based system-level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system-level analysis are discussed, and a methodology that addresses these issues is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, routing algorithms, and other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined, as is normally done. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage, when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms, and hybrid simulation to reduce simulation time is introduced.
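As a toy illustration of the fault-injection idea described above (not DEPEND's actual interface; all names here are invented), one can flip a bit in a simulated register and observe whether an error-detection mechanism catches the resulting failure mode, rather than pre-defining the behavior under faults:

```python
# Toy sketch of simulation-based fault injection (illustrative, not DEPEND).
def inject_bit_flip(word, bit):
    """Return the word with one bit inverted (a transient hardware fault)."""
    return word ^ (1 << bit)

def parity_ok(word):
    """Simple error-detection mechanism: even-parity check over the word."""
    return bin(word).count("1") % 2 == 0

register = 0b10100101                    # fault-free state, even parity
faulty = inject_bit_flip(register, 3)    # inject a single-bit upset
print(parity_ok(register), parity_ok(faulty))  # True False
```

Observing which injected faults escape detection, instead of assuming a failure model up front, is the essence of the approach the abstract describes.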
Algorithms and architecture for multiprocessor based circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deutsch, J.T.
Accurate electrical simulation is critical to the design of high performance integrated circuits. Logic simulators can verify function and give first-order timing information. Switch level simulators are more effective at dealing with charge sharing than standard logic simulators, but cannot provide accurate timing information or discover DC problems. Delay estimation techniques and cell level simulation can be used in constrained design methods, but must be tuned for each application, and circuit simulation must still be used to generate the cell models. None of these methods has the guaranteed accuracy that many circuit designers desire, and none can provide detailed waveform information. Detailed electrical-level simulation can predict circuit performance if devices and parasitics are modeled accurately. However, the computational requirements of conventional circuit simulators make it impractical to simulate current large circuits. In this dissertation, the implementation of Iterated Timing Analysis (ITA), a relaxation-based technique for accurate circuit simulation, on a special-purpose multiprocessor is presented. The ITA method is an SOR-Newton, relaxation-based method which uses event-driven analysis and selective trace to exploit the temporal sparsity of the electrical network. Because event-driven selective trace techniques are employed, this algorithm lends itself to implementation on a data-driven computer.
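The event-driven, selective-trace idea can be sketched generically: elements are re-evaluated only when an input event arrives, and work propagates only when an output actually changes. The sketch below is an illustrative toy with a single logic gate, not the ITA algorithm itself.

```python
# Illustrative event-driven loop with selective trace: only elements whose
# inputs changed are re-evaluated, exploiting temporal sparsity.
import heapq

events = []  # priority queue ordered by simulated time
heapq.heappush(events, (0.0, "a", 1))
heapq.heappush(events, (1.0, "b", 1))

values = {"a": 0, "b": 0, "out": 0}
fanout = {"a": ["out"], "b": ["out"]}  # which elements each node drives
log = []                               # record of observable output changes

def evaluate(node, t):
    """Re-evaluate a node (here 'out' is AND of a and b); propagate only changes."""
    if node == "out":
        new = values["a"] & values["b"]
        if new != values["out"]:       # selective trace: skip unchanged outputs
            values["out"] = new
            log.append((t, new))

while events:
    t, node, val = heapq.heappop(events)
    if values[node] != val:            # process only genuine input events
        values[node] = val
        for driven in fanout.get(node, []):
            evaluate(driven, t)

print(log)  # [(1.0, 1)] -- the output switches only when both inputs are high
```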
Wu, M.; Xin, Houlin L.; Wang, J. O.; ...
2018-04-24
Synchrotron-based L2,3-edge absorption spectra show strong sensitivity to the local electronic structure and chemical environment. However, detailed physical information cannot be extracted easily without computational aids. Here, using the experimental Ti L2,3-edge absorption spectrum of SrTiO3 as a fingerprint and considering full multiplet effects, calculations yield the energy parameters characterizing local ground-state properties. The peak splittings and intensity ratios of the L3 and L2 sets of peaks are carefully analyzed quantitatively, giving a small hybridization energy of about 1.2 eV, and the different hybridization energy values reported in the literature are further addressed. Absorption spectra with different linearly polarized photons under various tetragonal crystal fields are then investigated, revealing a non-linear orbital-lattice interaction and offering theoretical guidance for the material engineering of SrTiO3-based thin films and heterostructures. Finally, detailed analysis of spectrum shifts under different tetragonal crystal fields suggests that the eg crystal-field splitting is a necessary parameter for a thorough analysis of the spectra, even though it is not relevant for the ground-state properties.
Toward a Coherent Detailed Evaluation of Aerosol Data Products from Multiple Satellite Sensors
NASA Technical Reports Server (NTRS)
Ichoku, Charles; Petrenko, Maksym; Leptoukh, Gregory
2011-01-01
Atmospheric aerosols represent one of the greatest uncertainties in climate research. Although satellite-based aerosol retrieval has practically become routine, especially during the last decade, there is often disagreement between similar aerosol parameters retrieved from different sensors, leaving users confused as to which sensors to trust for answering important science questions about the distribution, properties, and impacts of aerosols. As long as there is no consensus and the inconsistencies are not well characterized and understood, there will be no way of developing reliable climate data records from satellite aerosol measurements. Fortunately, the most globally representative, well-calibrated ground-based aerosol measurements corresponding to the satellite-retrieved products are available from the Aerosol Robotic Network (AERONET). To adequately utilize the advantages offered by this vital resource, an online Multi-sensor Aerosol Products Sampling System (MAPSS) was recently developed. The aim of MAPSS is to facilitate detailed comparative analysis of satellite aerosol measurements from different sensors (Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, Parasol-POLDER, and Calipso-CALIOP) based on the collocation of these data products over AERONET stations. In this presentation, we will describe the strategy of the MAPSS system, its potential advantages for the aerosol community, and the preliminary results of an integrated comparative uncertainty analysis of aerosol products from multiple satellite sensors.
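The collocation idea behind a system like MAPSS can be sketched as selecting satellite retrievals within a spatial radius and time window of a ground station and aggregating them. The function, thresholds, and data below are illustrative assumptions, not MAPSS's actual sampling rules.

```python
# Hedged sketch of satellite-to-ground collocation (illustrative, not MAPSS).
import math

def collocate(retrievals, station, radius_km=27.5, window_min=30.0):
    """Mean satellite AOD within a radius and time window of the station."""
    lat0, lon0, t0 = station
    selected = []
    for lat, lon, t, aod in retrievals:
        # small-angle flat-Earth distance approximation (illustrative)
        dkm = 111.0 * math.hypot(lat - lat0,
                                 (lon - lon0) * math.cos(math.radians(lat0)))
        if dkm <= radius_km and abs(t - t0) <= window_min:
            selected.append(aod)
    return sum(selected) / len(selected) if selected else None

station = (38.0, -78.0, 0.0)              # lat, lon, time in minutes
retrievals = [(38.1, -78.0, 5.0, 0.20),   # in range
              (38.0, -78.1, -10.0, 0.30), # in range
              (40.0, -78.0, 0.0, 0.90)]   # too far away
print(collocate(retrievals, station))
```

Only the two nearby retrievals contribute, so the collocated value is their mean, 0.25.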
Srivastava, Anubhav; Evans, Krystal J; Sexton, Anna E; Schofield, Louis; Creek, Darren J
2017-04-07
A detailed analysis of the metabolic state of human-stem-cell-derived erythrocytes allowed us to characterize the existence of active metabolic pathways in younger reticulocytes and compare them to mature erythrocytes. Using high-resolution LC-MS-based untargeted metabolomics, we found that reticulocytes had a comparatively much richer repertoire of metabolites, which spanned a range of metabolite classes. An untargeted metabolomics analysis using stable-isotope-labeled glucose showed that only glycolysis and the pentose phosphate pathway actively contributed to the biosynthesis of metabolites in erythrocytes, and these pathways were upregulated in reticulocytes. Most metabolite species found to be enriched in reticulocytes were residual pools of metabolites produced by earlier erythropoietic processes, and their systematic depletion in mature erythrocytes aligns with the simplification process, which is also seen at the cellular and the structural level. Our work shows that high-resolution LC-MS-based untargeted metabolomics provides a global coverage of the biochemical species that are present in erythrocytes. However, the incorporation of stable isotope labeling provides a more accurate description of the active metabolic processes that occur in each developmental stage. To our knowledge, this is the first detailed characterization of the active metabolic pathways of the erythroid lineage, and it provides a rich database for understanding the physiology of the maturation of reticulocytes into mature erythrocytes.
Wingard, G. Lynn
1993-01-01
Current theories on the causes of extinction at the Cretaceous-Tertiary boundary have been based on previously published data; however, few workers have stopped to ask the question, 'How good is the basic data set?' To test the accuracy of the published record, a quantitative and qualitative analysis of the Crassatellidae (Mollusca, Bivalvia) of the Gulf and Mid-Atlantic Coastal Plains of the United States for the Upper Cretaceous and lower Tertiary was conducted. Thirty-eight species names and four generic names are used in publications for the Crassatellidae within the geographic and stratigraphic constraints of this analysis. Fourteen of the 38 species names are represented by statistically valid numbers of specimens and were tested using canonical discriminant analysis. All 38 names, with the exception of 1 invalid name and 4 names for which no representative specimen could be located, were evaluated qualitatively. The results show that the published fossil record is highly inaccurate. Only 8 valid, recognizable species exist in the Crassatellidae within the limits of this study; 14 names are synonymized, and 11 names are represented by indeterminate molds or poorly preserved specimens. Three of the four genera are well founded; the fourth is based on the juvenile of another genus and is therefore synonymized. This detailed taxonomic analysis of the Crassatellidae illustrates that the published fossil record is not reliable. Calculations of evolutionary and paleobiologic significance based on poorly defined, overly split fossil groups, such as the Crassatellidae, are biased in the following ways: rates of evolution and extinction appear higher, faunal turnover at mass extinctions appears more catastrophic, species diversity appears higher, average species durations are shortened, and geographic ranges are restricted.
The data on the taxonomically standardized Crassatellidae show evolutionary rates one-quarter to one-half that of the published fossil record; faunal change at the Cretaceous-Tertiary boundary that was not catastrophic; a constant number of species on each side of the Cretaceous-Tertiary boundary; a decrease in abundance in the Tertiary; and lower species diversity, longer average species durations, and expanded geographic ranges. Similar detailed taxonomic studies need to be conducted on other groups of organisms to test the patterns illustrated for the Crassatellidae and to determine the extent and direction of the bias in the published fossil record. Answers to our questions about evolutionary change cannot be found in the literature but rather with the fossils themselves. Evolution and extinction occur within small populations of species groups, and it is only through detailed analysis of these groups that we can achieve an understanding of the causes and effects of evolution and extinction.
The STROBE statement and neuropsychology: lighting the way toward evidence-based practice.
Loring, David W; Bowden, Stephen C
2014-01-01
Reporting appropriate research detail across clinical disciplines is often inconsistent or incomplete. Insufficient report detail reduces confidence in findings, makes study replication more difficult, and decreases the precision of data available for critical review including meta-analysis. In response to these concerns, cooperative attempts across multiple specialties have developed explicit research reporting standards to guide publication detail. These recommendations have been widely adopted by high impact medical journals, but have not yet been widely embraced by neuropsychology. The STROBE Statement (STrengthening the Reporting of Observational studies in Epidemiology) is particularly relevant to neuropsychology since clinical research is often based on non-funded studies of patient samples. In this paper we describe the STROBE Statement and demonstrate how STROBE criteria, applied to reporting of neuropsychological findings, will maintain neuropsychology's position as a leader in quantifying brain-behavior relationships. We also provide specific recommendations for data reporting and disclosure of perceived conflicts of interest that will further enhance reporting transparency for possible perceived sources of bias. In an era in which evidence-based practice assumes an increasingly prominent role, improved reporting standards will promote better patient care, assist in developing quality practice guidelines, and ensure that neuropsychology remains a vigorous discipline in the clinical neurosciences that consciously aspires to high methodological rigor.
Point-based and model-based geolocation analysis of airborne laser scanning data
NASA Astrophysics Data System (ADS)
Sefercik, Umut Gunes; Buyuksalih, Gurcan; Jacobsen, Karsten; Alkan, Mehmet
2017-01-01
Airborne laser scanning (ALS) is one of the most effective remote sensing technologies providing precise three-dimensional (3-D) dense point clouds. A large-size ALS digital surface model (DSM) covering the whole Istanbul province was analyzed by point-based and model-based comprehensive statistical approaches. Point-based analysis was performed using checkpoints on flat areas. Model-based approaches were implemented in two steps: strip-to-strip comparison of overlapping ALS DSMs individually in three subareas, and comparison of the merged ALS DSMs with terrestrial laser scanning (TLS) DSMs in four other subareas. In the model-based approach, the standard deviation of height and the normalized median absolute deviation were used as accuracy indicators, combined with the dependency on terrain inclination. The results demonstrate that terrain roughness has a strong impact on the vertical accuracy of ALS DSMs. The relative horizontal shifts, determined and partially improved by merging the overlapping strips and by comparison of the ALS and TLS data, were found not to be negligible. The analysis of the ALS DSM in relation to the TLS DSM allowed us to determine the characteristics of the DSM in detail.
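The two model-based accuracy indicators named above have simple definitions: the standard deviation of the height differences, and the normalized median absolute deviation, NMAD = 1.4826 · median(|Δh_i − median(Δh)|), which is robust against outliers. A minimal sketch with illustrative values:

```python
# Sketch of the two DSM accuracy indicators; sample values are illustrative.
from statistics import median, pstdev

def nmad(dh):
    """NMAD = 1.4826 * median(|dh_i - median(dh)|), robust to outliers."""
    m = median(dh)
    return 1.4826 * median(abs(d - m) for d in dh)

# Height differences (m) between DSMs at checkpoints, with one gross outlier:
dh = [0.1, -0.2, 0.0, 0.2, -0.1, 5.0]
print(round(pstdev(dh), 3), round(nmad(dh), 3))  # 1.868 0.222
```

The single outlier inflates the standard deviation by nearly an order of magnitude while leaving the NMAD close to the spread of the good checkpoints, which is why both indicators are reported together.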
Graphene Based Electrochemical Sensors and Biosensors: A Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shao, Yuyan; Wang, Jun; Wu, Hong
2010-05-01
Graphene, emerging as a true 2-dimensional material, has received increasing attention due to its unique physicochemical properties (high surface area, excellent conductivity, high mechanical strength, and ease of functionalization and mass production). This article selectively reviews recent advances in graphene-based electrochemical sensors and biosensors. In particular, graphene for direct electrochemistry of enzymes, its electrocatalytic activity toward small biomolecules (hydrogen peroxide, NADH, dopamine, etc.), and graphene-based enzyme biosensors are summarized in more detail, and graphene-based DNA sensing and environmental analysis are discussed. Future perspectives in this rapidly developing field are also discussed.
Watershed-based Morphometric Analysis: A Review
NASA Astrophysics Data System (ADS)
Sukristiyanti, S.; Maria, R.; Lestiana, H.
2018-02-01
Drainage basin/watershed analysis based on morphometric parameters is very important for watershed planning. Morphometric analysis of a watershed is the best method to identify the relationships of various aspects in the area. Although many technical papers have dealt with this area of study, there is no standard classification or agreed implication for each parameter, and it can be confusing to evaluate the value of a given morphometric parameter. This paper deals with the meaning of the values of the various morphometric parameters, with adequate contextual information. A critical review is presented of each classification, the range of values, and their implications. Besides classification and its impact, the authors are also concerned with the quality of the input data, both in data preparation and in the scale/detail level of mapping. This review hopefully provides a comprehensive explanation to assist upcoming research dealing with morphometric analysis.
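Two of the classic morphometric parameters such a review covers have simple closed forms: Horton's drainage density (total stream length divided by basin area) and the bifurcation ratio between successive stream orders. A sketch with made-up watershed numbers:

```python
# Two standard morphometric parameters, computed for a made-up watershed.
def drainage_density(total_stream_length_km, basin_area_km2):
    """Dd = total stream length / basin area (km per km^2)."""
    return total_stream_length_km / basin_area_km2

def bifurcation_ratios(stream_counts_by_order):
    """Rb_u = N_u / N_{u+1} for successive Strahler stream orders."""
    return [stream_counts_by_order[i] / stream_counts_by_order[i + 1]
            for i in range(len(stream_counts_by_order) - 1)]

print(drainage_density(84.0, 42.0))        # 2.0 km per km^2
print(bifurcation_ratios([27, 9, 3, 1]))   # [3.0, 3.0, 3.0]
```

The point the review makes is that the hard part is not computing such values but interpreting them: the same Dd or Rb can imply different things depending on the classification scheme, data quality, and mapping scale.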
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
Stories and Gossip in English: The Macro-Structure of Casual Talk.
ERIC Educational Resources Information Center
Slade, Diana
1997-01-01
A discussion of two text-types commonly occurring in casual conversation, stories and gossip, (1) details four kinds of stories told in casual talk, (2) demonstrates that gossip is a culturally-determined process with a distinctive structure, and (3) considers implications for teaching English-as-a-Second-Language. Analysis is based on over three…
ERIC Educational Resources Information Center
Gulzar, Malik Ajmal; Farooq, Muhammad Umar; Umer, Muhammad
2013-01-01
This article has sought to contribute to discussions concerning the value of inter-sentential patterns of code-switching (henceforth ISPCS) particularly in the context of EFL classrooms. Through a detailed analysis of recorded data produced in that context, distinctive features in the discourse were discerned which were associated with males' and…
Hydrogen ion speciation in the acid precipitation of the northeastern United States
James N. Galloway; Gene E. Likens; Eric S. Edgerton
1976-01-01
The acidity of precipitation in rural, forested areas of the northeastern United States is dominated by the strong mineral acids, sulfuric and nitric. Weak acids have a negligible effect on the measured acidity (pH) of precipitation. These conclusions are based on total acidity titrations and detailed analysis of organic and inorganic components in precipitation.
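The speciation argument rests on the elementary relation between measured pH and free hydrogen-ion concentration, [H+] = 10^(−pH): when strong mineral acids dominate, this inferred [H+] accounts for essentially all of the titratable acidity. A minimal numerical illustration (values chosen for illustration only):

```python
# Relation between measured pH and free hydrogen-ion concentration.
import math

def h_plus(pH):
    """Free hydrogen-ion concentration (mol/L) from the measured pH."""
    return 10.0 ** (-pH)

def pH_of(h):
    """Measured pH implied by a hydrogen-ion concentration (mol/L)."""
    return -math.log10(h)

# Illustrative: acid precipitation at pH 4.0 dominated by strong acids
print(h_plus(4.0))         # ~1e-4 mol/L
print(round(pH_of(1e-4), 1))
```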
Porphyry copper assessment of western Central Asia: Chapter N in Global mineral resource assessment
Berger, Byron R.; Mars, John L.; Denning, Paul; Phillips, Jeffrey D.; Hammarstrom, Jane M.; Zientek, Michael L.; Dicken, Connie L.; Drew, Lawrence J.; with contributions from Alexeiev, Dmitriy; Seltmann, Reimar; Herrington, Richard J.
2014-01-01
Detailed descriptions of each permissive tract, including the rationales for delineation and assessment, are given in appendixes, along with a geographic information system (GIS) that includes permissive tract boundaries, point locations of known porphyry copper deposits and significant occurrences, and hydrothermal alteration data based on analysis of remote sensing data.
ERIC Educational Resources Information Center
Whitburn, Ben
2015-01-01
In this paper, a detailed analysis based on the lived experiences of the study participants and the researcher (each with vision impairment) in education, post school and in the pursuit for employment is developed. The policy discourses of disability legislation--both at national and international levels--are explored with particular reference to…
ERIC Educational Resources Information Center
Kanakis, Ioannis
1997-01-01
Examines the Socratic method through a comparative analysis of early Platonic dialogs with theories of critical rationalism and cognitive theories based on achievement motivation. Presents details of the Socratic strategy of teaching and learning, including critical reflection, conversation, and intellectual honesty; asserts that these methods are…
ERIC Educational Resources Information Center
Waring, Hansun Zhang
2013-01-01
Despite the push for fostering reflective practices in teacher education in the last 20 years, true reflection remains rare (Farr, 2011). Based on a detailed analysis of four mentor-teacher meetings in a graduate TESOL program, I show how specific mentor practices generate teacher reflection without explicit solicitations. Findings of this study…
The Influence of Video Technology in Adolescence. Media Panel Report No. 27.
ERIC Educational Resources Information Center
Roe, Keith
This report provides a detailed analysis of the video use and preferences of Swedish adolescents based on data drawn from the Media Panel project, a three-wave, longitudinal research program on video use conducted at the Department of Sociology, The University of Lund, and the Department for Information Techniques, the University College of Vaxjo,…
The Educational Meaning of Tiredness: Agamben and Buytendijk on the Experience of (Im)potentiality
ERIC Educational Resources Information Center
Vlieghe, Joris
2016-01-01
In this article, I go deeper into the educational meaning of tiredness. Over and against the mainstream view that tiredness is an impediment for education, I show that this phenomenon is intrinsically meaningful. My arguments are based, first, on a detailed phenomenological analysis of tiredness, as proposed by Buytendijk. Tiredness can be defined…
Geographic information system-based spatial analysis of sawmill wood procurement
Nathaniel M. Anderson; Rene H. Germain; Eddie Bevilacqua
2011-01-01
In the sawmill sector of the forest products industry, the clustering of mills and wide variation in forest stocking and ownership result in sawlog markets that are complex and spatially differentiated. Despite the inherent spatial attributes of markets for stumpage and logs, few studies have used geospatial methods to examine wood procurement in detail across...
ERIC Educational Resources Information Center
Wang, Xin
2017-01-01
Scholars debate whether corrective feedback contributes to improving L2 learners' grammatical accuracy in writing performance. Some researchers take a stance on the ineffectiveness of corrective feedback based on the impracticality of providing detailed corrective feedback for all L2 learners and detached grammar instruction in language…
ERIC Educational Resources Information Center
Cremin, Teresa; Glauert, Esme; Craft, Anna; Compton, Ashley; Stylianidou, Fani
2015-01-01
In the light of the European Union's interest in creativity and innovation, this paper, drawing on data from the EU project Creative Little Scientists (2011-2014), explores the teaching and learning of science and creativity in Early Years education. The project's conceptual framework, developed from detailed analysis of relevant literatures,…
Diplomas Count: An Essential Guide to Graduation Policy and Rates
ERIC Educational Resources Information Center
Edwards, Virginia B., Ed.
2006-01-01
"Education Week" provides a weekly review of state and federal K-12 education policy news. In this issue it offers detailed data on graduation rates across the 50 states and the District of Columbia, and in the nation's 50 largest school districts. The analysis is based on the Cumulative Promotion Index developed by Christopher B. Swanson, the…
Development of Flight-Test Performance Estimation Techniques for Small Unmanned Aerial Systems
NASA Astrophysics Data System (ADS)
McCrink, Matthew Henry
This dissertation provides a flight-testing framework for assessing the performance of fixed-wing, small-scale unmanned aerial systems (sUAS) by leveraging sub-system models of components unique to these vehicles. The development of the sub-system models, and their links to broader impacts on sUAS performance, is the key contribution of this work. The sub-system modeling and analysis focuses on the vehicle's propulsion, navigation and guidance, and airframe components. Quantification of the uncertainty in the vehicle's power available and control states is essential for assessing the validity of both the methods and results obtained from flight-tests. Therefore, detailed propulsion and navigation system analyses are presented to validate the flight testing methodology. Propulsion system analysis required the development of an analytic model of the propeller in order to predict the power available over a range of flight conditions. The model is based on the blade element momentum (BEM) method. Additional corrections are added to the basic model in order to capture the Reynolds-dependent scale effects unique to sUAS. The model was experimentally validated using a ground-based testing apparatus. The BEM predictions and experimental analysis allow for a parameterized model relating the electrical power, measurable during flight, to the power available required for vehicle performance analysis. Navigation system details are presented with a specific focus on the sensors used for state estimation, and the resulting uncertainty in vehicle state. Uncertainty quantification is provided by detailed calibration techniques validated using quasi-static and hardware-in-the-loop (HIL) ground-based testing. The HIL methods introduced use a soft real-time flight simulator to provide inertial-quality data for assessing overall system performance.
Using this tool, the uncertainty in vehicle state estimation across a range of sensors and vehicle operational environments is presented. The propulsion and navigation system models are used to evaluate flight-testing methods for assessing fixed-wing sUAS performance. A brief airframe analysis is presented to provide a foundation for assessing the efficacy of the flight-test methods. The flight-testing presented in this work is focused on validating the aircraft drag polar, zero-lift drag coefficient, and span efficiency factor. Three methods are detailed and evaluated for estimating these design parameters. Specific focus is placed on the influence of propulsion and navigation system uncertainty on the resulting performance data. Performance estimates are used in conjunction with the propulsion model to estimate the impact of sensor and measurement uncertainty on the endurance and range of a fixed-wing sUAS. Endurance and range results for a simplistic power-available model are compared to the Reynolds-dependent model presented in this work. Additional parameter sensitivity analyses related to state estimation uncertainties encountered in flight-testing are presented. Results from these analyses indicate that the sub-system models introduced in this work are of first-order importance, on the order of 5-10% change in range and endurance, in assessing the performance of a fixed-wing sUAS.
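The endurance and range sensitivity discussed above can be illustrated with a minimal sketch (not the dissertation's actual model): a simplistic constant-power endurance estimate with an assumed 5% electrical-power measurement uncertainty propagated through. All numbers below are illustrative, not values from the work.

```python
# Hedged sketch: propagate an assumed +/-5% electrical-power measurement
# uncertainty into endurance and range estimates for a fixed-wing sUAS,
# using the simplistic constant-power model the abstract contrasts with
# its Reynolds-dependent one. All numbers are illustrative assumptions.

def endurance_hours(battery_wh, power_w):
    """Endurance for a constant power draw: battery energy / power."""
    return battery_wh / power_w

def range_km(battery_wh, power_w, airspeed_ms):
    """Range = endurance * cruise speed."""
    return endurance_hours(battery_wh, power_w) * 3600.0 * airspeed_ms / 1000.0

battery_wh = 100.0   # assumed usable battery energy
power_w = 150.0      # assumed measured electrical power draw
airspeed_ms = 18.0   # assumed cruise airspeed
rel_unc = 0.05       # assumed 5% power-measurement uncertainty

nominal = endurance_hours(battery_wh, power_w)
lo = endurance_hours(battery_wh, power_w * (1 + rel_unc))  # power read high
hi = endurance_hours(battery_wh, power_w * (1 - rel_unc))  # power read low
print(f"endurance: {nominal:.3f} h  ({lo:.3f} .. {hi:.3f} h)")
print(f"range: {range_km(battery_wh, power_w, airspeed_ms):.1f} km")
```

Even this toy model shows how a symmetric power-measurement error maps to an asymmetric endurance interval, which is the kind of first-order sensitivity the dissertation quantifies with its full sub-system models.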
Coal-Based Fuel-Cell Powerplants
NASA Technical Reports Server (NTRS)
Ferral, J. F.; Pappano, A. W.; Jennings, C. N.
1986-01-01
Report assesses advanced technology design alternatives for integrated coal-gasifier/fuel-cell powerplants. Various gasifier, cleanup, and fuel-cell options evaluated. Evaluation includes adjustments to assumed performances and costs of proposed technologies where required. Analysis identifies uncertainties remaining in designs, most promising alternatives, and research and development required to develop these technologies. Bulk of report is summary and detailed analysis of six major conceptual designs and variations of each. All designs are for plant that uses Illinois No. 6 coal and produces 675 MW of net power.
Improving Image Drizzling in the HST Archive: Advanced Camera for Surveys
NASA Astrophysics Data System (ADS)
Hoffmann, Samantha L.; Avila, Roberto J.
2017-06-01
The Mikulski Archive for Space Telescopes (MAST) pipeline performs geometric distortion corrections, associated image combinations, and cosmic ray rejections with AstroDrizzle on Hubble Space Telescope (HST) data. The MDRIZTAB reference table contains a list of relevant parameters that controls this program. This document details our photometric analysis of Advanced Camera for Surveys Wide Field Channel (ACS/WFC) data processed by AstroDrizzle. Based on this analysis, we update the MDRIZTAB table to improve the quality of the drizzled products delivered by MAST.
2016-06-10
detailed description of issues surrounding a subject matter. The use of case studies provides the experimental foundation for qualitative analysis. As...The chapter provided a description of the case studies-based QCA methodology, highlighted how the Charles Ragin QCA will be used for data analysis...world. Against this backdrop, a study assessing the challenges and prospects of sub-regional post-conflict peacebuilding efforts will not only be
Quantitative analysis of single-molecule superresolution images
Coltharp, Carla; Yang, Xinxing; Xiao, Jie
2014-01-01
This review highlights the quantitative capabilities of single-molecule localization-based superresolution imaging methods. In addition to revealing fine structural details, the molecule coordinate lists generated by these methods provide the critical ability to quantify the number, clustering, and colocalization of molecules with 10 – 50 nm resolution. Here we describe typical workflows and precautions for quantitative analysis of single-molecule superresolution images. These guidelines include potential pitfalls and essential control experiments, allowing critical assessment and interpretation of superresolution images. PMID:25179006
Analysis of NASA JP-4 fire tests data and development of a simple fire model
NASA Technical Reports Server (NTRS)
Raj, P.
1980-01-01
The temperature, velocity and species concentration data obtained during the NASA fire tests (3m, 7.5m and 15m diameter JP-4 fires) were analyzed. Utilizing the data analysis, a simple theoretical model was formulated to predict the temperature and velocity profiles in JP-4 fires. The theoretical model, which does not take into account the detailed chemistry of combustion, is capable of predicting the extent of necking of the fire near its base.
Aeroelastic Stability and Response of Rotating Structures
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Reddy, Tondapu
2004-01-01
A summary of the work performed under NASA grant is presented. More details can be found in the cited references. This grant led to the development of relatively faster aeroelastic analysis methods for predicting flutter and forced response in fans, compressors, and turbines using computational fluid dynamic (CFD) methods. These methods are based on linearized two- and three-dimensional, unsteady, nonlinear aerodynamic equations. During the period of the grant, aeroelastic analysis that includes the effects of uncertainties in the design variables has also been developed.
Simplified diagnostic coding sheet for computerized data storage and analysis in ophthalmology.
Tauber, J; Lahav, M
1987-11-01
A review of currently available diagnostic coding systems revealed that most are either too abbreviated or too detailed. We have compiled a simplified diagnostic coding sheet based on the International Classification of Diseases (ICD-9), which is both complete and easy to use in a general practice. The information is transferred to a computer, which uses the relevant ICD-9 diagnoses as a database and can be retrieved later for display of patients' problems or for analysis of clinical data.
Heat and Mass Transfer Processes in Scrubber of Flue Gas Heat Recovery Device
NASA Astrophysics Data System (ADS)
Veidenbergs, Ivars; Blumberga, Dagnija; Vigants, Edgars; Kozuhars, Grigorijs
2010-01-01
The paper deals with the heat and mass transfer process research in a flue gas heat recovery device, where complicated cooling, evaporation and condensation processes are taking place simultaneously. The analogy between heat and mass transfer is used during the process of analysis. In order to prepare a detailed process analysis based on heat and mass process descriptive equations, as well as the correlation for wet gas parameter calculation, software in the
An efficient liner cooling scheme for advanced small gas turbine combustors
NASA Technical Reports Server (NTRS)
Paskin, Marc D.; Mongia, Hukam C.; Acosta, Waldo A.
1993-01-01
A joint Army/NASA program was conducted to design, fabricate, and test an advanced, small gas turbine, reverse-flow combustor utilizing a compliant metal/ceramic (CMC) wall cooling concept. The objectives of this effort were to develop a design method (basic design data base and analysis) for the CMC cooling technique and then demonstrate its application to an advanced cycle, small, reverse-flow combustor with 3000 F burner outlet temperature. The CMC concept offers significant improvements in wall cooling effectiveness resulting in a large reduction in cooling air requirements. Therefore, more air is available for control of burner outlet temperature pattern in addition to the benefits of improved efficiency, reduced emissions, and lower smoke levels. The program was divided into four tasks. Task 1 defined component materials and localized design of the composite wall structure in conjunction with development of basic design models for the analysis of flow and heat transfer through the wall. Task 2 included implementation of the selected materials and validated design models during combustor preliminary design. Detail design of the selected combustor concept and its refinement with 3D aerothermal analysis were completed in Task 3. Task 4 covered detail drawings, process development and fabrication, and a series of burner rig tests. The purpose of this paper is to provide details of the investigation into the fundamental flow and heat transfer characteristics of the CMC wall structure as well as implementation of the fundamental analysis method for full-scale combustor design.
NASA Astrophysics Data System (ADS)
Niederheiser, R.; Rutzinger, M.; Bremer, M.; Wichmann, V.
2018-04-01
The investigation of changes in spatial patterns of vegetation and identification of potential micro-refugia requires detailed topographic and terrain information. However, mapping alpine topography at very detailed scales is challenging due to limited accessibility of sites. Close-range sensing by photogrammetric dense matching approaches based on terrestrial images captured with hand-held cameras offers a light-weight and low-cost solution to retrieve high-resolution measurements even in steep terrain and at locations that are difficult to access. We propose a novel approach for rapid capturing of terrestrial images and a highly automated processing chain for retrieving detailed dense point clouds for topographic modelling. For this study, we modelled 249 plot locations. For the analysis of vegetation distribution and location properties, topographic parameters, such as slope, aspect, and potential solar irradiation, were derived by applying a multi-scale approach utilizing voxel grids and spherical neighbourhoods. The result is a micro-topography archive of 249 alpine locations that includes topographic parameters at multiple scales ready for biogeomorphological analysis. Compared with regional elevation models at larger scales and traditional 2D gridding approaches to creating elevation models, our analyses in a fully 3D environment yield much more detailed insights into interrelations between topographic parameters, such as potential solar irradiation, surface area, aspect and roughness.
Assessing population exposure for landslide risk analysis using dasymetric cartography
NASA Astrophysics Data System (ADS)
Garcia, Ricardo A. C.; Oliveira, Sérgio C.; Zêzere, José L.
2016-12-01
Assessing the number and locations of exposed people is a crucial step in landslide risk management and emergency planning. The available population statistical data frequently lack sufficient detail for an accurate assessment of the people potentially exposed to hazardous events, especially events that occur at the local scale, such as landslides. The present study aims to apply dasymetric cartography to improve the spatial resolution of population data and to assess the potentially exposed population. An additional objective is to compare the results with those obtained with a more common approach that uses basic census units as spatial units; these represent the finest spatial disaggregation and the most detailed information available for regional studies in Portugal. Considering the Portuguese census data and a layer of residential building footprints, which was used as ancillary information, the number of exposed inhabitants differs significantly according to the approach used. When the census unit approach is used, the number of exposed inhabitants in the three highest landslide susceptibility classes is generally overestimated. Despite the uncertainties associated with a general cost-benefit analysis, the presented methodology appears to be a reliable approach for obtaining a first approximation of a more detailed estimate of exposed people. The approach based on dasymetric cartography allows the spatial resolution of population data over large areas to be increased and enables the use of detailed landslide susceptibility maps, which are valuable for improving the assessment of the exposed population.
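The dasymetric step described above — redistributing each census unit's population over the residential building footprints used as ancillary information — can be sketched as follows. The areas, populations, and susceptibility assignments are invented for illustration, not data from the study.

```python
# Hedged sketch of dasymetric population disaggregation: redistribute a
# census unit's population over its residential building footprints,
# proportional to footprint area. All data below are illustrative.

def dasymetric_allocate(census_population, footprints):
    """footprints: list of (building_id, area_m2) within one census unit.
    Returns {building_id: allocated_population}."""
    total_area = sum(area for _, area in footprints)
    return {bid: census_population * area / total_area
            for bid, area in footprints}

# One census unit with 120 inhabitants and three residential buildings.
alloc = dasymetric_allocate(120, [("A", 300.0), ("B", 100.0), ("C", 200.0)])
print(alloc)  # population mass is preserved: allocations sum to 120

# Exposed population = allocations for buildings falling inside the
# highest landslide-susceptibility classes (assignment assumed here).
high_susceptibility = {"A", "C"}
exposed = sum(pop for bid, pop in alloc.items() if bid in high_susceptibility)
print(f"exposed inhabitants: {exposed:.0f}")
```

Because only buildings inside the susceptible zones receive population, this refinement is what lets the dasymetric approach avoid the overestimation seen when whole census units are intersected with susceptibility classes.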
Burns, K E; Haysom, H E; Higgins, A M; Waters, N; Tahiri, R; Rushford, K; Dunstan, T; Saxby, K; Kaplan, Z; Chunilal, S; McQuilten, Z K; Wood, E M
2018-04-10
To describe the methodology to estimate the total cost of administration of a single unit of red blood cells (RBC) in adults with beta thalassaemia major in an Australian specialist haemoglobinopathy centre. Beta thalassaemia major is a genetic disorder of haemoglobin associated with multiple end-organ complications and typically requiring lifelong RBC transfusion therapy. New therapeutic agents are becoming available based on advances in understanding of the disorder and its consequences. Assessment of the true total cost of transfusion, incorporating both product and activity costs, is required in order to evaluate the benefits and costs of these new therapies. We describe the bottom-up, time-driven, activity-based costing methodology used to develop process maps to provide a step-by-step outline of the entire transfusion pathway. Detailed flowcharts for each process are described. Direct observations and timing of the process maps document all activities, resources, staff, equipment and consumables in detail. The analysis will include costs associated with performing these processes, including resources and consumables. Sensitivity analyses will be performed to determine the impact of different staffing levels, timings and probabilities associated with performing different tasks. Thirty-one process maps have been developed, with over 600 individual activities requiring multiple timings. These will be used for future detailed cost analyses. Detailed process maps using bottom-up, time-driven, activity-based costing for determining the cost of RBC transfusion in thalassaemia major have been developed. These could be adapted for wider use to understand and compare the costs and complexities of transfusion in other settings. © 2018 British Blood Transfusion Society.
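The time-driven, activity-based costing described above can be sketched in miniature: each activity's cost is its observed time multiplied by the rate of the staff member performing it, plus consumables. The roles, rates, and timings below are assumptions for illustration, not figures from the study.

```python
# Hedged sketch of time-driven, activity-based costing for one process map
# in a transfusion pathway. Roles, hourly rates, timings and consumable
# costs are invented; the study derives these from direct observation.

def activity_cost(minutes, hourly_rate):
    """Cost of one timed activity."""
    return minutes / 60.0 * hourly_rate

def process_cost(activities, rates, consumables=0.0):
    """activities: list of (staff_role, observed_minutes).
    rates: {staff_role: cost_per_hour}. Returns total process cost."""
    return sum(activity_cost(m, rates[role]) for role, m in activities) + consumables

rates = {"nurse": 60.0, "scientist": 75.0}           # assumed hourly rates
pretransfusion = [("nurse", 15), ("scientist", 20)]  # assumed observed timings
print(f"process cost: ${process_cost(pretransfusion, rates, consumables=30.0):.2f}")
```

Summing such per-process costs over all thirty-one process maps, and varying the timings and staffing in a sensitivity analysis, is the shape of the full bottom-up costing the abstract describes.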
The Tibesti Volcanoes of Chad: an ASTER-based Remote Sensing Analysis
NASA Astrophysics Data System (ADS)
Permenter, J. L.; Oppenheimer, C.
2002-12-01
Situated in the central Sahara desert, the Tibesti volcanic province of northern Chad, Africa, is a superb example of large-scale continental hot spot volcanism. The massif comprises seven major volcanoes and an assembly of related volcanic and tectonic structures, with a total surface area of over 350 km2. Its highest peak (Emi Koussi) rises above the surrounding desert to ~3415 m above sea level. Due, in part, to its remoteness, the Tibesti has never been described in volcanological detail. This study aims to provide the first modern synthesis of the volcanology of this significant hot spot province. It is based primarily on a detailed analysis and interpretation of a comprehensive set of multi-band imagery from NASA's Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER). ASTER has 14 spectral bands, divided between 3 optical subsystems; 3 in the very-near infrared (VNIR), 6 in the short-wave infrared, and 5 in the thermal infrared regions. In addition, the VNIR subsystem has aft-viewing optics for stereoscopic observation in the along-track direction, which permits generation of digital elevation models. The preliminary results presented here focus on the discrimination of lava composition, identification of pyroclastic deposits, and characterisation of the dimensions of flows, craters, and other structural elements of the massif, using spectral and textural information gathered from the ASTER imagery. Furthermore, stratigraphic detail is obtained from the superposition of flow units and craters. The application of ASTER data to the Tibesti volcanic complex permits an initial, first-order description of the relative proportions and timing of different erupted materials, providing a framework for further interpretation of the volcanology and magmatic evolution of the Tibesti, based on modern geologic and tectonic concepts. It also allows intercomparisons to be made with other continental hot spot provinces.
Mechanism reduction for multicomponent surrogates: A case study using toluene reference fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niemeyer, Kyle E.; Sung, Chih-Jen
2014-11-01
Strategies and recommendations for performing skeletal reductions of multicomponent surrogate fuels are presented, through the generation and validation of skeletal mechanisms for a three-component toluene reference fuel. Using the directed relation graph with error propagation and sensitivity analysis method followed by a further unimportant reaction elimination stage, skeletal mechanisms valid over comprehensive and high-temperature ranges of conditions were developed at varying levels of detail. These skeletal mechanisms were generated based on autoignition simulations, and validation using ignition delay predictions showed good agreement with the detailed mechanism in the target range of conditions. When validated using phenomena other than autoignition, such as perfectly stirred reactor and laminar flame propagation, tight error control or more restrictions on the reduction during the sensitivity analysis stage were needed to ensure good agreement. In addition, tight error limits were needed for close prediction of ignition delay when varying the mixture composition away from that used for the reduction. In homogeneous compression-ignition engine simulations, the skeletal mechanisms closely matched the point of ignition and accurately predicted species profiles for lean to stoichiometric conditions. Furthermore, the efficacy of generating a multicomponent skeletal mechanism was compared to combining skeletal mechanisms produced separately for neat fuel components; using the same error limits, the latter resulted in a larger skeletal mechanism size that also lacked important cross reactions between fuel components. Based on the present results, general guidelines for reducing detailed mechanisms for multicomponent fuels are discussed.
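A minimal sketch of the directed-relation-graph idea underlying this reduction: species are kept only if they are reachable from the target species through interaction coefficients above a threshold. This is the plain DRG variant, not the DRG-with-error-propagation plus sensitivity analysis the study actually uses, and the coefficients below are invented for illustration.

```python
# Hedged sketch of directed relation graph (DRG) mechanism reduction:
# keep only species reachable from the targets via direct interaction
# coefficients above a user-chosen threshold. Coefficients are invented.
from collections import deque

def drg_skeletal(coeffs, targets, threshold):
    """coeffs: {(species_a, species_b): r_ab}, the direct interaction
    coefficient of species b on a. Returns the set of species kept."""
    graph = {}
    for (a, b), r in coeffs.items():
        if r >= threshold:                  # prune weak dependencies
            graph.setdefault(a, set()).add(b)
    kept, queue = set(targets), deque(targets)
    while queue:                            # breadth-first reachability
        for nxt in graph.get(queue.popleft(), ()):
            if nxt not in kept:
                kept.add(nxt)
                queue.append(nxt)
    return kept

coeffs = {("fuel", "O2"): 0.9, ("fuel", "OH"): 0.6,
          ("OH", "HO2"): 0.4, ("HO2", "H2O2"): 0.05}
print(sorted(drg_skeletal(coeffs, ["fuel"], 0.1)))  # loose threshold keeps more
print(sorted(drg_skeletal(coeffs, ["fuel"], 0.7)))  # strict threshold prunes more
```

The threshold plays the role of the error control discussed in the abstract: tightening it (lowering the cutoff) retains more species and reactions, which is why tighter limits were needed to reproduce non-autoignition phenomena.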
An economic analysis of five selected LANDSAT assisted information systems in Oregon
NASA Technical Reports Server (NTRS)
Solomon, S.; Maher, K. M.
1979-01-01
A comparative cost analysis was performed on five LANDSAT-based information systems. In all cases, the LANDSAT system was found to have cost advantages over its alternative. The information sets generated by LANDSAT and the alternative method are not identical but are comparable in terms of satisfying the needs of the sponsor. The information obtained from the LANDSAT system in some cases is said to lack precision and detail. On the other hand, it was found to be superior in terms of providing information on areas that are inaccessible and unobtainable through conventional means. There is therefore a trade-off between precision and detail on the one hand and cost on the other. The projects examined were concerned with locating irrigation circles in Morrow County; monitoring tansy ragwort infestation; inventorying old growth Douglas fir near Spotted Owl habitats; inventorying vegetation and resources in all state-owned lands; and determining land use for Columbia River water policies.
Geometric and dynamic perspectives on phase-coherent and noncoherent chaos.
Zou, Yong; Donner, Reik V; Kurths, Jürgen
2012-03-01
Statistically distinguishing between phase-coherent and noncoherent chaotic dynamics from time series is a contemporary problem in nonlinear sciences. In this work, we propose different measures based on recurrence properties of recorded trajectories, which characterize the underlying systems from both geometric and dynamic viewpoints. The potentials of the individual measures for discriminating phase-coherent and noncoherent chaotic oscillations are discussed. A detailed numerical analysis is performed for the chaotic Rössler system, which displays both types of chaos as one control parameter is varied, and the Mackey-Glass system as an example of a time-delay system with noncoherent chaos. Our results demonstrate that especially geometric measures from recurrence network analysis are well suited for tracing transitions between spiral- and screw-type chaos, a common route from phase-coherent to noncoherent chaos also found in other nonlinear oscillators. A detailed explanation of the observed behavior in terms of attractor geometry is given.
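The recurrence-based measures above all start from the recurrence matrix of a recorded trajectory; a minimal sketch with a toy one-dimensional series follows (the paper works with full trajectories of the Rössler and Mackey-Glass systems, and builds network measures on top of this matrix).

```python
# Hedged sketch of the recurrence matrix underlying recurrence (network)
# analysis: R[i][j] = 1 if trajectory points i and j lie within a threshold
# distance eps, else 0. The series below is a toy example.

def recurrence_matrix(series, eps):
    """Binary recurrence matrix for a 1-D series under distance |x_i - x_j|."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) <= eps else 0
             for j in range(n)] for i in range(n)]

def recurrence_rate(R):
    """Fraction of recurrent pairs: the simplest recurrence quantifier."""
    n = len(R)
    return sum(map(sum, R)) / float(n * n)

x = [0.0, 0.1, 0.9, 1.0, 0.05]   # toy trajectory samples
R = recurrence_matrix(x, 0.2)
print(f"recurrence rate: {recurrence_rate(R):.2f}")
```

Interpreting R as the adjacency matrix of a network (dropping the diagonal) is the step that yields the geometric recurrence-network measures the authors find best suited for tracing spiral- to screw-type chaos transitions.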
Direct optical detection of protein-ligand interactions.
Gesellchen, Frank; Zimmermann, Bastian; Herberg, Friedrich W
2005-01-01
Direct optical detection provides an excellent means to investigate interactions of molecules in biological systems. The dynamic equilibria inherent to these systems can be described in greater detail by recording the kinetics of a biomolecular interaction. Optical biosensors allow direct detection of interaction patterns without the need for labeling. An overview covering several commercially available biosensors is given, with a focus on instruments based on surface plasmon resonance (SPR) and reflectometric interference spectroscopy (RIFS). Potential assay formats and experimental design, appropriate controls, and calibration procedures, especially when handling low molecular weight substances, are discussed. The individual steps of an interaction analysis, combined with practical tips for evaluation, data processing, and interpretation of kinetic data, are described in detail. In a practical example, a step-by-step procedure for the analysis of a low molecular weight compound interaction with serum protein, determined on a commercial SPR sensor, is presented.
Wang, Yanqing; Chong, Heap-Yih; Liao, Pin-Chao; Ren, Hantao
2017-09-25
Unsafe behavior is a leading factor in accidents, and the working environment significantly affects behaviors. However, few studies have focused on detailed mechanisms for addressing unsafe behaviors resulting from environmental constraints. This study aims to delineate these mechanisms using cognitive work analysis (CWA) for an elevator installation case study. Elevator installation was selected for study because it involves operations at heights: falls from heights remain a major cause of construction worker mortality. This study adopts a mixed research approach based on three research methodology stages. This research deconstructs the details of the working environment, the workers' decision-making processes, the strategies chosen given environmental conditions and the conceptual model for workers' behaviors, which jointly depict environment-behavior mechanisms at length. By applying CWA to the construction industry, environmental constraints can easily be identified, and targeted engineering suggestions can be generated.
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
NASA Astrophysics Data System (ADS)
Lu, Siqi; Wang, Xiaorong; Wu, Junyong
2018-01-01
The paper presents a method, based on a data-driven K-means clustering analysis algorithm, to generate planning scenarios for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV, and the voltage offset as objectives, and the locations and sizes of distributed PV as decision variables, the Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected according to different planning emphases after detailed analysis. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
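The TOPSIS ranking step mentioned above can be sketched as follows: each candidate scheme is scored by its relative closeness to an ideal solution across the stated objectives. The objective values and weights below are invented, and the real study ranks Pareto-optimal GA solutions rather than this toy set.

```python
# Hedged sketch of TOPSIS ranking: vector-normalize and weight the decision
# matrix, then score each alternative by closeness to the ideal solution.
# Scheme values and weights are illustrative assumptions.
import math

def topsis(matrix, weights, benefit):
    """matrix[i][j]: objective j of scheme i; benefit[j]: True if larger is
    better. Returns closeness scores in (0, 1); higher is better."""
    ncols = len(weights)
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    V = [[w * row[j] / norms[j] for j, w in enumerate(weights)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*V))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*V))]
    scores = []
    for row in V:
        d_pos = math.dist(row, ideal)   # distance to ideal solution
        d_neg = math.dist(row, worst)   # distance to anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# columns: power losses (cost), PV profit (benefit), voltage offset (cost)
schemes = [[120.0, 50.0, 0.03], [100.0, 40.0, 0.02], [140.0, 60.0, 0.05]]
scores = topsis(schemes, [0.4, 0.4, 0.2], [False, True, False])
best = max(range(len(scores)), key=scores.__getitem__)
print(f"scores: {[round(s, 3) for s in scores]}, best scheme: {best}")
```

Changing the weight vector is how different "planning emphases" (loss reduction versus PV profit, say) reorder the top of the ranking list, which is the selection step the abstract describes.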
Pusic, Martin V.; LeBlanc, Vicki; Patel, Vimla L.
2001-01-01
Traditional task analysis for instructional design has emphasized the importance of precisely defining behavioral educational objectives and working back to select objective-appropriate instructional strategies. However, this approach may miss effective strategies. Cognitive task analysis, on the other hand, breaks a process down into its component knowledge representations. Selection of instructional strategies based on all such representations in a domain is likely to lead to optimal instructional design. In this demonstration, using the interpretation of cervical spine x-rays as an educational example, we show how a detailed cognitive task analysis can guide the development of computer-aided instruction.
Ganga, G M D; Esposto, K F; Braatz, D
2012-01-01
The occupational exposure limits of different risk factors for development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown phenomena. Industrial ergonomists' role becomes further complicated because the potential risk factors that may contribute towards the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to develop a comparative study between predictions based on the neural network-based model proposed by Zurada, Karwowski & Marras (1997) and a linear discriminant analysis model, for making predictions about industrial jobs according to their potential risk of low back disorders due to workplace design. The results obtained through applying the discriminant analysis-based model proved that it is as effective as the neural network-based model. Moreover, the discriminant analysis-based model proved to be more advantageous regarding cost and time savings for future data gathering.
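The discriminant-analysis side of such a comparison reduces, for two classes, to a Fisher linear discriminant; a minimal sketch with hypothetical job-risk features, not the dataset used by Zurada, Karwowski & Marras:

```python
import numpy as np

def fisher_lda(X0, X1):
    """Two-class linear discriminant: returns weight vector w and threshold c.
    A job with feature vector x is flagged high-risk when w @ x > c.
    X0, X1: arrays of feature rows for low-risk / high-risk jobs
    (features could be e.g. lift rate, load moment, twisting velocity)."""
    X0, X1 = np.asarray(X0, float), np.asarray(X1, float)
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter matrix.
    S = (np.cov(X0, rowvar=False) * (len(X0) - 1)
         + np.cov(X1, rowvar=False) * (len(X1) - 1))
    w = np.linalg.solve(S, m1 - m0)       # discriminant direction
    c = w @ (m0 + m1) / 2.0               # midpoint of projected class means
    return w, c
```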
Cost/Effort Drivers and Decision Analysis
NASA Technical Reports Server (NTRS)
Seidel, Jonathan
2010-01-01
Engineering trade study analyses demand consideration of performance, cost, and schedule impacts across the spectrum of alternative concepts and in direct reference to product requirements. Prior to detailed design, requirements are too often ill-defined (only goals) and prone to creep, extending well beyond the Systems Requirements Review. Though the lack of engineering design detail and definitive requirements inhibits the ability to perform detailed cost analyses, affordability trades still comprise the foundation of these future product decisions and must evolve in concert. This presentation excerpts results of the recent NASA subsonic Engine Concept Study for an Advanced Single Aisle Transport to demonstrate an affordability evaluation of performance characteristics and the subsequent impacts on engine architecture decisions. Applying the Process Based Economic Analysis Tool (PBEAT), development cost, production cost, and operation and support costs were considered in a traditional weighted ranking of the following system-level figures of merit: mission fuel burn, take-off noise, NOx emissions, and cruise speed. Weighting factors were varied to ascertain the architecture ranking sensitivities to these performance figures of merit with companion cost considerations. A more detailed examination of supersonic variable cycle engine cost is also briefly presented, with observations and recommendations for further refinements.
Two-dimensional analysis of coupled heat and moisture transport in masonry structures
NASA Astrophysics Data System (ADS)
Krejčí, Tomáš
2016-06-01
Reconstruction and maintenance of historical buildings and bridges require good knowledge of temperature and moisture distribution. Sharp changes in the temperature and moisture can lead to damage. This paper describes analysis of coupled heat and moisture transfer in masonry based on two-level approach. Macro-scale level describes the whole structure while meso-scale level takes into account detailed composition of the masonry. The two-level approach is very computationally demanding and it was implemented in parallel. The two-level approach was used in analysis of temperature and moisture distribution in Charles bridge in Prague, Czech Republic.
On the topological structure of multinationals network
NASA Astrophysics Data System (ADS)
Joyez, Charlie
2017-05-01
This paper uses a weighted network analysis to examine the structure of multinationals' implantation countries network. Based on French firm-level dataset of multinational enterprises (MNEs) the network analysis provides information on each country position in the network and in internationalization strategies of French MNEs through connectivity preferences among the nodes. The paper also details network-wide features and their recent evolution toward a more decentralized structure. While much has been said on international trade network, this paper shows that multinational firms' studies would also benefit from network analysis, notably by investigating the sensitivity of the network construction to firm heterogeneity.
Analysis of street sweepings, Portland, Oregon
Miller, Timothy L.; Rinella, Joseph F.; McKenzie, Stuart W.; Parmenter, Jerry
1977-01-01
A brief study involving collection and analysis of street sweepings was undertaken to provide the U.S. Army Corps of Engineers with data on physical, chemical, and biological characteristics of dust and dirt accumulating on Portland streets. Most of the analyses selected were based on the pollutant loads predicted by the Storage, Treatment, Overflow, and Runoff Model (STORM). Five different basins were selected for sampling, and samples were collected three times in each basin. Because the literature reports no methodology for analysis of dust and dirt, the analytical methodology is described in detail. Results of the analyses are summarized in table 1.
NASA Astrophysics Data System (ADS)
Siettos, C. I.; Gear, C. W.; Kevrekidis, I. G.
2012-08-01
We show how the equation-free approach can be exploited to enable agent-based simulators to perform system-level computations such as bifurcation, stability analysis and controller design. We illustrate these tasks through an event-driven agent-based model describing the dynamic behaviour of many interacting investors in the presence of mimesis. Using short bursts of appropriately initialized runs of the detailed, agent-based simulator, we construct the coarse-grained bifurcation diagram of the (expected) density of agents and investigate the stability of its multiple solution branches. When the mimetic coupling between agents becomes strong enough, the stable stationary state loses its stability at a coarse turning point bifurcation. We also demonstrate how the framework can be used to design a wash-out dynamic controller that stabilizes open-loop unstable stationary states even under model uncertainty.
Virtual prototyping of drop test using explicit analysis
NASA Astrophysics Data System (ADS)
Todorov, Georgi; Kamberov, Konstantin
2017-12-01
Increased requirements for reliability and safety, included in contemporary standards and norms, have a high impact on new product development. New numerical techniques based on virtual prototyping technology facilitate improving the product development cycle, resulting in reduced time and money spent on this stage as well as increased knowledge about certain failure mechanisms. The so-called "drop test" has become nearly a "must" step in the development of any human-operated product. This study aims to demonstrate dynamic behaviour assessment of a structure under impact loads, based on virtual prototyping using a typical nonlinear analysis: explicit dynamics. An example is presented, based on a plastic container used as a cartridge for a dispenser machine exposed to various work conditions. Different drop orientations were analyzed, and critical load cases and design weaknesses were found. Several design modifications have been proposed, based on a detailed review of the analysis results.
Analysis of a Preloaded Bolted Joint in a Ceramic Composite Combustor
NASA Technical Reports Server (NTRS)
Hissam, D. Andy; Bower, Mark V.
2003-01-01
This paper presents the detailed analysis of a preloaded bolted joint incorporating ceramic materials. The objective of this analysis is to determine the suitability of a joint design for a ceramic combustor. The analysis addresses critical factors in bolted joint design including preload, preload uncertainty, and load factor. The relationship between key joint variables is also investigated. The analysis is based on four key design criteria, each addressing an anticipated failure mode. The criteria are defined in terms of margin of safety, which must be greater than zero for the design criteria to be satisfied. Since the proposed joint has positive margins of safety, the design criteria are satisfied. Therefore, the joint design is acceptable.
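The margin-of-safety criterion used in such joint analyses is conventional; a minimal sketch, assuming the usual definition MS = allowable / (FS x applied) - 1 and the standard preloaded-joint load-sharing relation (the numbers in the usage below are illustrative, not values from the combustor joint):

```python
def margin_of_safety(allowable, applied, factor_of_safety=1.0):
    """Margin of safety as used in bolted-joint design criteria:
    MS = allowable / (FS * applied) - 1.
    The design criterion is satisfied when MS > 0."""
    return allowable / (factor_of_safety * applied) - 1.0

def bolt_load(preload, external_load, load_factor):
    """Bolt tension under an applied external load: the preload plus the
    share of the external load carried by the bolt (joint load factor)."""
    return preload + load_factor * external_load
```

For example, an allowable of 200 units against an applied load of 80 units with a safety factor of 2 gives MS = 0.25, a positive margin.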
Modelling, design and stability analysis of an improved SEPIC converter for renewable energy systems
NASA Astrophysics Data System (ADS)
G, Dileep; Singh, S. N.; Singh, G. K.
2017-09-01
In this paper, a detailed modelling and analysis of a switched inductor (SI)-based improved single-ended primary inductor converter (SEPIC) has been presented. To increase the gain of the conventional SEPIC converter, the input and output side inductors are replaced with SI structures. Design and stability analysis for continuous conduction mode operation of the proposed SI-SEPIC converter have also been presented in this paper. The state space averaging technique is used to model the converter and carry out the stability analysis. Performance and stability of the closed loop configuration are predicted by observing the open loop behaviour using the Nyquist diagram and Nichols chart. The system was found to be stable and critically damped.
NASA's online machine aided indexing system
NASA Technical Reports Server (NTRS)
Silvester, June P.; Genuardi, Michael T.; Klingbiel, Paul H.
1993-01-01
This report describes the NASA Lexical Dictionary (NLD), a machine aided indexing system used online at the National Aeronautics and Space Administration's Center for Aerospace Information (CASI). The system comprises a text processor that is based on the computational, non-syntactic analysis of input text, and an extensive 'knowledge base' that serves to recognize and translate text-extracted concepts. The structure and function of the various NLD system components are described in detail. Methods used for the development of the knowledge base are discussed. Particular attention is given to a statistically-based text analysis program that provides the knowledge base developer with a list of concept-specific phrases extracted from large textual corpora. Production and quality benefits resulting from the integration of machine aided indexing at CASI are discussed along with a number of secondary applications of NLD-derived systems, including on-line spell checking and machine aided lexicography.
NASA Technical Reports Server (NTRS)
Succi, G. P.
1983-01-01
The techniques of helicopter rotor noise prediction attempt to describe precisely the details of the noise field and remove the empiricisms and restrictions inherent in previous methods. These techniques require detailed inputs of the rotor geometry, operating conditions, and blade surface pressure distribution. The Farassat noise prediction technique was studied, and high speed helicopter noise prediction using more detailed representations of the thickness and loading noise sources was investigated. These predictions were based on the measured blade surface pressures on an AH-1G rotor and compared to the measured sound field. Although refinements in the representation of the thickness and loading noise sources improve the calculation, there are still discrepancies between the measured and predicted sound fields. Analysis of the blade surface pressure data indicates shocks on the blades, which are probably responsible for these discrepancies.
NASA Astrophysics Data System (ADS)
Ziegler, Hannes Moritz
Planners and managers often rely on coarse population distribution data from the census for addressing various social, economic, and environmental problems. In the analysis of physical vulnerabilities to sea-level rise, census units such as blocks or block groups are coarse relative to the required decision-making application. This study explores the benefits offered from integrating image classification and dasymetric mapping at the household level to provide detailed small area population estimates at the scale of residential buildings. In a case study of Boca Raton, FL, a sea-level rise inundation grid based on mapping methods by NOAA is overlaid on the highly detailed population distribution data to identify vulnerable residences and estimate population displacement. The enhanced spatial detail offered through this method has the potential to better guide targeted strategies for future development, mitigation, and adaptation efforts.
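The household-level dasymetric step can be sketched as simple areal weighting over residential buildings, followed by an inundation overlay; the building areas and flood flags below are hypothetical, not the Boca Raton data:

```python
def dasymetric_allocate(block_population, building_areas):
    """Distribute a census block's population over its residential buildings
    in proportion to building footprint area (areal-weighting dasymetric
    mapping). Returns per-building population estimates."""
    total_area = sum(building_areas)
    return [block_population * a / total_area for a in building_areas]

def displaced_population(block_population, building_areas, flooded):
    """Estimate displacement: sum the dasymetric population of buildings
    flagged as inundated by the sea-level rise grid overlay."""
    per_building = dasymetric_allocate(block_population, building_areas)
    return sum(p for p, f in zip(per_building, flooded) if f)
```

In practice the building footprints would come from image classification and the flood flags from overlaying the NOAA-style inundation grid.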
Enhanced LOD Concepts for Virtual 3d City Models
NASA Astrophysics Data System (ADS)
Benner, J.; Geiger, A.; Gröger, G.; Häfele, K.-H.; Löwner, M.-O.
2013-09-01
Virtual 3D city models contain digital three dimensional representations of city objects like buildings, streets or technical infrastructure. Because size and complexity of these models continuously grow, a Level of Detail (LoD) concept effectively supporting the partitioning of a complete model into alternative models of different complexity and providing metadata, addressing informational content, complexity and quality of each alternative model is indispensable. After a short overview on various LoD concepts, this paper discusses the existing LoD concept of the CityGML standard for 3D city models and identifies a number of deficits. Based on this analysis, an alternative concept is developed and illustrated with several examples. It differentiates between first, a Geometric Level of Detail (GLoD) and a Semantic Level of Detail (SLoD), and second between the interior building and its exterior shell. Finally, a possible implementation of the new concept is demonstrated by means of an UML model.
Aircraft Conceptual Design and Risk Analysis Using Physics-Based Noise Prediction
NASA Technical Reports Server (NTRS)
Olson, Erik D.; Mavris, Dimitri N.
2006-01-01
An approach was developed which allows for design studies of commercial aircraft using physics-based noise analysis methods while retaining the ability to perform the rapid trade-off and risk analysis studies needed at the conceptual design stage. A prototype integrated analysis process was created for computing the total aircraft EPNL at the Federal Aviation Regulations Part 36 certification measurement locations using physics-based methods for fan rotor-stator interaction tones and jet mixing noise. The methodology was then used in combination with design of experiments to create response surface equations (RSEs) for the engine and aircraft performance metrics, geometric constraints and take-off and landing noise levels. In addition, Monte Carlo analysis was used to assess the expected variability of the metrics under the influence of uncertainty, and to determine how the variability is affected by the choice of engine cycle. Finally, the RSEs were used to conduct a series of proof-of-concept conceptual-level design studies demonstrating the utility of the approach. The study found that a key advantage to using physics-based analysis during conceptual design lies in the ability to assess the benefits of new technologies as a function of the design to which they are applied. The greatest difficulty in implementing physics-based analysis proved to be the generation of design geometry at a sufficient level of detail for high-fidelity analysis.
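The RSE-plus-Monte-Carlo pattern described above can be sketched generically; the quadratic response surface and its coefficients below are placeholders, not the study's fitted equations:

```python
import random

def rse_noise(x1, x2):
    """Hypothetical second-order response surface for a noise metric as a
    function of two normalized design variables (placeholder coefficients,
    standing in for an RSE fitted via design of experiments)."""
    return 90.0 + 2.0 * x1 - 1.5 * x2 + 0.8 * x1 * x2 + 0.5 * x1 ** 2

def monte_carlo(rse, n=10000, seed=1):
    """Propagate uniform +/-10% uncertainty in both design variables
    through the response surface; report the mean and standard deviation
    of the resulting metric (the 'expected variability')."""
    rng = random.Random(seed)
    samples = [rse(rng.uniform(0.9, 1.1), rng.uniform(0.9, 1.1))
               for _ in range(n)]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, var ** 0.5
```

Because the RSE is cheap to evaluate, thousands of samples per candidate engine cycle are affordable, which is what makes the risk analysis tractable at the conceptual stage.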
Development of an "Alert Framework" Based on the Practices in the Medical Front.
Sakata, Takuya; Araki, Kenji; Yamazaki, Tomoyoshi; Kawano, Koichi; Maeda, Minoru; Kushima, Muneo; Araki, Sanae
2018-05-09
At the University of Miyazaki Hospital (UMH), we have accumulated and semantically structured a vast amount of medical information since the activation of the electronic health record system approximately 10 years ago. With this medical information, we have decided to develop an alert system for aiding in medical treatment. The purpose of this investigation is not only to integrate an alert framework into the electronic health record system, but also to formulate a modeling method for this knowledge. A trial alert framework was developed for the staff in various occupational categories at the UMH. Based on findings of subsequent interviews, a more detailed and upgraded alert framework was constructed, resulting in the final model. The resulting alert framework comprises four major items, and analysis of the medical practices from the trial model indicates four major risk patterns that trigger the alert. Furthermore, the current alert framework contains detailed definitions which are easily substituted into the database, leading to easy implementation within the electronic health record system.
Shrunken head (tsantsa): a complete forensic analysis procedure.
Charlier, P; Huynh-Charlier, I; Brun, L; Hervé, C; de la Grandmaison, G Lorin
2012-10-10
Based on the analysis of shrunken heads referred to our forensic laboratory for anthropological expertise, and on data from the anthropological and medical literature, we propose a complete forensic procedure for the analysis of such pieces. A list of 14 original morphological criteria has been developed, based on the global aspect, color, physical deformation, anatomical details, and any associated material (wood, vegetal fibers, sand, charcoal, etc.). These criteria have been tested on a control sample of 20 tsantsa (i.e. shrunken heads from the Jivaro or Shuar tribes of South America). Further complementary analyses are described, such as CT scanning and microscopic examination. Such expertise is increasingly requested of forensic anthropologists and practitioners in the context of the global repatriation of human artifacts to native communities. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Nasiri, Farshid; Aghbashlo, Mortaza; Rafiee, Shahin
2017-02-01
In this study, a detailed exergy analysis of an industrial-scale ultrafiltrated (UF) cheese production plant was conducted based on actual operational data in order to provide more comprehensive insights into the performance of the whole plant and its main subcomponents. The plant included four main subsystems, i.e., steam generator (I), above-zero refrigeration system (II), Bactocatch-assisted pasteurization line (III), and UF cheese production line (IV). In addition, this analysis was aimed at quantifying the exergy destroyed in processing a known quantity of the UF cheese using the mass allocation method. The specific exergy destruction of the UF cheese production was determined at 2330.42 kJ/kg. The contributions of the subsystems I, II, III, and IV to the specific exergy destruction of the UF cheese production were computed as 1337.67, 386.18, 283.05, and 323.51 kJ/kg, respectively. Additionally, it was observed through the analysis that the steam generation system had the largest contribution to the thermodynamic inefficiency of the UF cheese production, accounting for 57.40 % of the specific exergy destruction. Generally, the outcomes of this survey further demonstrated the benefits of applying exergy analysis for design, analysis, and optimization of industrial-scale dairy processing plants to achieve the most cost-effective and environmentally-benign production strategies.
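The subsystem contributions reported above can be checked directly: they sum (to rounding) to the quoted total specific exergy destruction, and the steam generator's share matches the quoted 57.40 %:

```python
# Specific exergy destruction per subsystem, kJ/kg of UF cheese (from the abstract).
subsystem_exergy = {
    "steam generator (I)": 1337.67,
    "refrigeration (II)": 386.18,
    "pasteurization (III)": 283.05,
    "UF cheese line (IV)": 323.51,
}

total = sum(subsystem_exergy.values())                        # ~2330.41 kJ/kg
share = 100.0 * subsystem_exergy["steam generator (I)"] / total  # ~57.40 %
```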
High Fidelity System Simulation of Multiple Components in Support of the UEET Program
NASA Technical Reports Server (NTRS)
Plybon, Ronald C.; VanDeWall, Allan; Sampath, Rajiv; Balasubramaniam, Mahadevan; Mallina, Ramakrishna; Irani, Rohinton
2006-01-01
The High Fidelity System Simulation effort has addressed various important objectives to enable additional capability within the NPSS framework. The scope emphasized the High Pressure Turbine and High Pressure Compressor components. Initial effort was directed at developing and validating an intermediate-fidelity NPSS model using PD geometry, and extended to a high-fidelity NPSS model by overlaying detailed geometry to validate CFD against rig data. Both "feed-forward" and "feedback" approaches to analysis zooming were employed to enable system simulation capability in NPSS. These approaches have different benefits and applicability. "Feedback" zooming allows information from the high-fidelity analysis to flow up and update the NPSS model results by forcing the NPSS solver to converge to the high-fidelity analysis predictions. This approach is effective in improving the accuracy of the NPSS model; however, it can only be used in circumstances where there is a clear physics-based strategy for flowing the high-fidelity analysis results up to update the NPSS system model. The "feed-forward" zooming approach is more broadly useful for enabling detailed analysis at early stages of design for a specified set of critical operating points and using these analysis results to drive design decisions early in the development process.
Analyses of conversion efficiency in high-speed clock recovery based on Mach-Zehnder modulator
NASA Astrophysics Data System (ADS)
Dong, H.; Sun, H.; Zhu, G.; Dutta, N. K.
2006-09-01
In this paper, detailed analyses of the conversion efficiency in high-speed clock recovery based on a Mach-Zehnder (MZ) modulator have been carried out. The theoretical results show that the conversion efficiency changes with RF driving power and the mixing order. For high-order clock recovery, the cascaded MZ modulator provides higher conversion efficiency. A study of clock recovery at 160 Gb/s using the cascaded MZ modulator has been carried out. The experimental results agree with the results of the analysis.
5-Hydroxymethylcytosine Profiling in Human DNA.
Thomson, John P; Nestor, Colm E; Meehan, Richard R
2017-01-01
Since its "re-discovery" in 2009, there has been significant interest in defining the genome-wide distribution of DNA marked by 5-hydroxymethylation at cytosine bases (5hmC). In recent years, technological advances have resulted in a multitude of unique strategies to map 5hmC across the human genome. Here we discuss the wide range of approaches available to map this modification and describe in detail the affinity based methods which result in the enrichment of 5hmC marked DNA for downstream analysis.
Method for VAWT Placement on a Complex Building Structure
2013-06-01
Report appendices include: ANSYS CFX specifications for wind flow analysis; single rotor analysis ANSYS CFX mesh details; single rotor analysis ANSYS CFX specifics; detailed results of the single rotor analysis; and dual rotor analysis ANSYS CFX specifications (6-bladed VAWTs).
Zhao, Jiaduo; Gong, Weiguo; Tang, Yuzhen; Li, Weihong
2016-01-20
In this paper, we propose an effective human and nonhuman pyroelectric infrared (PIR) signal recognition method to reduce PIR detector false alarms. First, using the mathematical model of the PIR detector, we analyze the physical characteristics of the human and nonhuman PIR signals; second, based on the analysis results, we propose an empirical mode decomposition (EMD)-based symbolic dynamic analysis method for the recognition of human and nonhuman PIR signals. In the proposed method, first, we extract the detailed features of a PIR signal into five symbol sequences using an EMD-based symbolization method, then, we generate five feature descriptors for each PIR signal through constructing five probabilistic finite state automata with the symbol sequences. Finally, we use a weighted voting classification strategy to classify the PIR signals with their feature descriptors. Comparative experiments show that the proposed method can effectively classify the human and nonhuman PIR signals and reduce PIR detector's false alarms.
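The symbolic-dynamics step (symbolization followed by a probabilistic-finite-state-automaton transition descriptor) can be sketched as follows; this toy version uses a single quantile symbolization and omits the EMD stage and the five-sequence structure of the actual method:

```python
import numpy as np

def symbolize(signal, n_symbols=4):
    """Map signal samples to symbols 0..n_symbols-1 by equal-probability
    (quantile) partitioning of the amplitude range."""
    edges = np.quantile(signal, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(signal, edges)

def transition_descriptor(symbols, n_symbols=4):
    """Row-normalized symbol transition counts: the state-transition matrix
    of a probabilistic finite state automaton, flattened into a fixed-length
    feature descriptor suitable for a downstream classifier."""
    counts = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    probs = np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)
    return probs.ravel()
```

In the paper's pipeline, five such descriptors (one per EMD-derived symbol sequence) feed a weighted voting classifier that separates human from nonhuman PIR signals.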
Járvás, Gábor; Varga, Tamás; Szigeti, Márton; Hajba, László; Fürjes, Péter; Rajta, István; Guttman, András
2018-02-01
As a continuation of our previously published work, this paper presents a detailed evaluation of a microfabricated cell capture device utilizing a doubly tilted micropillar array. The device was fabricated using a novel hybrid technology based on the combination of proton beam writing and conventional lithography techniques. Tilted pillars offer unique flow characteristics and support enhanced fluidic interaction for improved immunoaffinity-based cell capture. The performance of the microdevice was evaluated by an in-house developed single-cell tracking system based on image sequence analysis. Individual cell tracking allowed in-depth analysis of the cell-chip surface interaction mechanism from a hydrodynamic point of view. Simulation results were validated using the hybrid device and the optimized surface functionalization procedure. Finally, the cell capture capability of this new generation microdevice was demonstrated by efficiently arresting cells from a HT29 cell-line suspension. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Creep-Rupture Behavior of Ni-Based Alloy Tube Bends for A-USC Boilers
NASA Astrophysics Data System (ADS)
Shingledecker, John
Advanced ultrasupercritical (A-USC) boiler designs will require the use of nickel-based alloys for superheaters and reheaters and thus tube bending will be required. The American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code Section II PG-19 limits the amount of cold-strain for boiler tube bends for austenitic materials. In this summary and analysis of research conducted to date, a number of candidate nickel-based A-USC alloys were evaluated. These alloys include alloy 230, alloy 617, and Inconel 740/740H. Uniaxial creep and novel structural tests and corresponding post-test analysis, which included physical measurements, simplified analytical analysis, and detailed microscopy, showed that different damage mechanisms may operate based on test conditions, alloy, and cold-strain levels. Overall, creep strength and ductility were reduced in all the alloys, but the degree of degradation varied substantially. The results support the current cold-strain limits now incorporated in ASME for these alloys for long-term A-USC boiler service.
Man-made objects cuing in satellite imagery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skurikhin, Alexei N
2009-01-01
We present a multi-scale framework for man-made structure cuing in satellite image regions. The approach is based on a hierarchical image segmentation followed by structural analysis. A hierarchical segmentation produces an image pyramid that contains a stack of irregular image partitions, represented as polygonized pixel patches, of successively reduced levels of detail (LODs). We start from the over-segmented image represented by polygons attributed with spectral and texture information. The image is represented as a proximity graph with vertices corresponding to the polygons and edges reflecting polygon relations. This is followed by iterative graph contraction based on Boruvka's Minimum Spanning Tree (MST) construction algorithm. The graph contractions merge the patches based on their pairwise spectral and texture differences. Concurrently with the construction of the irregular image pyramid, structural analysis is done on the agglomerated patches. Man-made object cuing is based on the analysis of shape properties of the constructed patches and their spatial relations. The presented framework can be used as a pre-scanning tool for wide-area monitoring to quickly guide further analysis to regions of interest.
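One round of the Boruvka-style contraction described above (every patch merges toward its cheapest incident edge) can be sketched with a union-find; the edge weights stand in for pairwise spectral/texture differences between patches:

```python
def boruvka_round(edges, n):
    """One Boruvka contraction round over a proximity graph.
    edges: list of (weight, u, v) with vertices 0..n-1 (image patches).
    Each vertex picks its cheapest incident edge; chosen edges are
    contracted via union-find. Returns (component roots, merged edges)."""
    parent = list(range(n))

    def find(x):
        # Union-find root lookup with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    # Each vertex selects its minimum-weight incident edge.
    cheapest = [None] * n
    for w, u, v in edges:
        for a in (u, v):
            if cheapest[a] is None or w < cheapest[a][0]:
                cheapest[a] = (w, u, v)

    # Contract the selected edges (merge the corresponding patches).
    merged = []
    for e in cheapest:
        if e is None:
            continue
        w, u, v = e
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            merged.append((u, v))
    return [find(i) for i in range(n)], merged
```

Repeating such rounds, with edge weights recomputed for the agglomerated patches, yields the stack of successively coarser partitions that forms the irregular image pyramid.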
Talebi, Mohsen; Patil, Rahul A; Sidisky, Leonard M; Berthod, Alain; Armstrong, Daniel W
2017-12-06
Twelve bis- or dicationic ionic liquids (ILs) including eight based on imidazolium, a single one based on phosphonium, and three based on pyrrolidinium cationic units were prepared with the bis(trifluoromethyl sulfonyl) imide anion. The two identical cationic moieties were attached by different alkyl spacers having three or five carbons and differing alkyl substituents attached to the spacer. The SLB-IL111 column, as the most polar commercial stationary phase known, was included in the study for comparison. Isothermal separations of a rapeseed oil fatty acid methyl ester (FAME) sample were used to study and compare the 12 IL-based column performances and selectivities. The retention times of the most retained methyl esters of lignoceric (C24:0) and erucic (C22:1) acids were used to estimate the IL polarity. The phosphonium dicationic IL column was, by far, the least polar. Imidazolium-based dicationic IL columns were the most polar. Polarity and selectivity for the FAME separation were somewhat related. The separation of a 37-FAME standard mixture allowed the investigation of selectivity variations observed on the 12 IL-based columns under temperature gradients up to 230 °C. The remarkable selectivity of the IL-based columns is demonstrated by the detailed analysis of the cis/trans C18:1 isomers of a partially hydrogenated vegetable oil sample on 30-m columns, separations competing with that done following an "official method" performed on a 100-m column. Graphical abstract: Separation of fatty acid methyl esters on a 30-m 3m2C5(mpy)2·2NTf2 branched-chain dicationic IL-based column. Branched-chain dicationic ILs show great selectivity for the separation of cis/trans and ω-3/ω-6 isomers and the detailed analysis of cis/trans fats.
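Retention behaviour of the kind measured here is commonly summarized via retention indices; a minimal sketch of the isothermal Kovats formula (the same bracketing logic underlies equivalent-chain-length values for FAMEs; the retention times below are illustrative, not from the study):

```python
import math

def kovats_index(t_x, t_n, t_n1, n, dead_time=0.0):
    """Isothermal Kovats retention index:
    I = 100 * (n + (ln t'_x - ln t'_n) / (ln t'_{n+1} - ln t'_n)),
    where t' are adjusted retention times (retention time minus column
    dead time) and the analyte elutes between reference n-alkanes with
    n and n+1 carbon atoms."""
    tx, ta, tb = (t - dead_time for t in (t_x, t_n, t_n1))
    return 100.0 * (n + (math.log(tx) - math.log(ta))
                    / (math.log(tb) - math.log(ta)))
```

An analyte co-eluting with the C12 alkane gets index 1200 exactly; one co-eluting with C13 gets 1300, with logarithmic interpolation in between.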
Wu, Guosheng; Robertson, Daniel H; Brooks, Charles L; Vieth, Michal
2003-10-01
The influence of various factors on the accuracy of protein-ligand docking is examined. The factors investigated include the role of a grid representation of protein-ligand interactions, the initial ligand conformation and orientation, the sampling rate of the energy hyper-surface, and the final minimization. A representative docking method is used to study these factors, namely, CDOCKER, a molecular dynamics (MD) simulated-annealing-based algorithm. A major emphasis in these studies is to compare the relative performance and accuracy of various grid-based approximations to explicit all-atom force field calculations. In these docking studies, the protein is kept rigid while the ligands are treated as fully flexible and a final minimization step is used to refine the docked poses. A docking success rate of 74% is observed when an explicit all-atom representation of the protein (full force field) is used, while a lower accuracy of 66-76% is observed for grid-based methods. All docking experiments considered a 41-member protein-ligand validation set. A significant improvement in accuracy (76 vs. 66%) for the grid-based docking is achieved if the explicit all-atom force field is used in a final minimization step to refine the docking poses. Statistical analysis shows that even lower-accuracy grid-based energy representations can be effectively used when followed with full force field minimization. The results of these grid-based protocols are statistically indistinguishable from the detailed atomic dockings and provide up to a sixfold reduction in computation time. For the test case examined here, improving the docking accuracy did not necessarily enhance the ability to estimate binding affinities using the docked structures. Copyright 2003 Wiley Periodicals, Inc.
Development of surrogate models for the prediction of the flow around an aircraft propeller
NASA Astrophysics Data System (ADS)
Salpigidou, Christina; Misirlis, Dimitris; Vlahostergios, Zinon; Yakinthos, Kyros
2018-05-01
In the present work, the derivation of two surrogate models (SMs) for modelling the flow around a propeller for small aircraft is presented. Both methodologies use derived functions based on computations with the detailed propeller geometry. The computations were performed using the k-ω shear stress transport model for turbulence. In the SMs, the propeller was modelled in a computational domain of disk-like geometry, where source terms were introduced in the momentum equations. In the first SM, the source terms were polynomial functions of swirl and thrust, mainly related to the propeller radius. In the second SM, regression analysis was used to correlate the source terms with the velocity distribution through the propeller. The proposed SMs achieved faster convergence than the detailed model, while also providing results closer to the available operational data. The regression-based model was the most accurate and required less computational time for convergence.
The Microstructure of RR1000 Nickel-Base Superalloy: The FIB-SEM Dual-Beam Approach
NASA Astrophysics Data System (ADS)
Croxall, S. A.; Hardy, M. C.; Stone, H. J.; Midgley, P. A.
Nickel-base superalloys are aerospace materials that exhibit exceptional mechanical properties and corrosion resistance at very high temperatures. RR1000 is used in discs in gas turbine engines, where temperatures reach in excess of 650 °C under high mechanical stresses. Study of the microstructure at the micron and sub-micron level has conventionally been undertaken using scanning electron microscope images, often meaning the underlying 3D microstructure can be inferred only with additional knowledge. Using a dual-beam workstation, we are able to interrogate the 3D microstructure directly using a serial sectioning approach. The 3D data set, typically (10 µm)³ in volume, reveals microstructural detail with a lateral resolution of circa 8 nm and a depth resolution dictated by the slice thickness, typically 50 nm. Morphological and volumetric analysis of the 3D reconstruction of RR1000 superalloy reveals microstructural details hitherto unseen.
NASA Astrophysics Data System (ADS)
Baumgart, M.; Druml, N.; Consani, M.
2018-05-01
This paper presents a simulation approach for Time-of-Flight cameras to estimate sensor performance and accuracy, as well as to help in understanding experimentally discovered effects. The main scope is the detailed simulation of the optical signals. We use a raytracing-based approach with the optical path length as the master parameter for depth calculations. The procedure is described in detail with references to our implementation in Zemax OpticStudio and Python. Our simulation approach supports multiple and extended light sources and allows accounting for all effects within the geometrical optics model. In particular, multi-object reflection/scattering ray paths, translucent objects, and aberration effects (e.g. distortion caused by the ToF lens) are supported. The optical path length approach also enables the implementation of different ToF sensor types and transient imaging evaluations. The main features are demonstrated on a simple 3D test scene.
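The optical-path-length bookkeeping described above can be sketched as follows; the single-bounce geometry, the function names, and the continuous-wave phase relation are our assumptions, not code from the paper:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def depth_from_opl(opl_m):
    # for a co-located source and sensor, a single-bounce ray travels out
    # and back, so scene depth is half the optical path length
    return opl_m / 2.0

def phase_from_opl(opl_m, f_mod_hz):
    # a continuous-wave ToF sensor measures the modulation phase delay
    tau = opl_m / C                                  # time of flight, s
    return (2.0 * math.pi * f_mod_hz * tau) % (2.0 * math.pi)

def unambiguous_range(f_mod_hz):
    # the phase wraps at 2*pi, capping the depth that can be measured
    return C / (2.0 * f_mod_hz)
```

For a 30 MHz modulation, the unambiguous range works out to about 5 m, which is one reason multi-frequency schemes and different sensor types (as the paper supports) matter in practice.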
NASA Astrophysics Data System (ADS)
Liu, Yixiong; Yang, Ce; Yang, Dengfeng; Zhang, Rui
2016-04-01
The aerodynamic performance, detailed unsteady flow, and time-resolved excitations acting on the blade surfaces of a radial flow turbine have been investigated under pulsating flow conditions. The results show that the instantaneous turbine performance under pulsating flow deviates significantly from the quasi-steady value and forms obvious hysteretic loops around the quasi-steady conditions. Detailed analysis of the unsteady flow shows that the character of the pulsating flow field in the radial turbine is strongly influenced by the pulsating inlet condition. The blade torque, power, and loading fluctuate with the inlet pulsation wave over a pulse period. For the blade excitations, the maximum and minimum excitations coincide in time with the crest and trough of the inlet pulsation, respectively. Along the blade chord, the maximum loading is distributed from the leading edge to the 20% chord position and decreases from the leading to the trailing edge.
Fostering integrity in postgraduate research: an evidence-based policy and support framework.
Mahmud, Saadia; Bretag, Tracey
2014-01-01
Postgraduate research students have a unique position in the debate on integrity in research, as both students and novice researchers. To assess how far policies for integrity in postgraduate research meet the needs of students as "research trainees," we reviewed online policies for integrity in postgraduate research at nine Australian universities against the Australian Code for Responsible Conduct of Research (the Code) and the five core elements of exemplary academic integrity policy identified by Bretag et al. (2011), i.e., access, approach, responsibility, detail, and support. We found inconsistency with the Code in the definition of research misconduct and a lack of adequate detail and support. Based on our analysis, previous research, and the literature, we propose a framework for policy and support for postgraduate research that encompasses a consistent and educative approach to integrity, maintained across the university at all levels of scholarship and for all stakeholders.
Morton, Katherine; Band, Rebecca; van Woezik, Anne; Grist, Rebecca; McManus, Richard J.; Little, Paul; Yardley, Lucy
2018-01-01
Background For behaviour-change interventions to be successful they must be acceptable to users and overcome barriers to behaviour change. The Person-Based Approach can help to optimise interventions to maximise acceptability and engagement. This article presents a novel, efficient and systematic method that can be used as part of the Person-Based Approach to rapidly analyse data from development studies to inform intervention modifications. We describe how we used this approach to optimise a digital intervention for patients with hypertension (HOME BP), which aims to implement medication and lifestyle changes to optimise blood pressure control. Methods In study 1, hypertensive patients (N = 12) each participated in three think-aloud interviews, providing feedback on a prototype of HOME BP. In study 2 patients (N = 11) used HOME BP for three weeks and were then interviewed about their experiences. Studies 1 and 2 were used to identify detailed changes to the intervention content and potential barriers to engagement with HOME BP. In study 3 (N = 7) we interviewed hypertensive patients who were not interested in using an intervention like HOME BP to identify potential barriers to uptake, which informed modifications to our recruitment materials. Analysis in all three studies involved detailed tabulation of patient data and comparison to our modification criteria. Results Studies 1 and 2 indicated that the HOME BP procedures were generally viewed as acceptable and feasible, but also highlighted concerns about monitoring blood pressure correctly at home and making medication changes remotely. Patients in study 3 had additional concerns about the safety and security of the intervention. Modifications improved the acceptability of the intervention and recruitment materials. Conclusions This paper provides a detailed illustration of how to use the Person-Based Approach to refine a digital intervention for hypertension. 
The novel, efficient approach to analysis and criteria for deciding when to implement intervention modifications described here may be useful to others developing interventions. PMID:29723262
Research and Analysis of Image Processing Technologies Based on DotNet Framework
NASA Astrophysics Data System (ADS)
Ya-Lin, Song; Chen-Xi, Bai
Microsoft .NET is one of the most popular program development tools. This paper gives a detailed analysis of the advantages and disadvantages of several image processing technologies in .NET, with the same algorithm implemented in programming experiments for each. The results show that the two most efficient methods are unsafe pointers and Direct3D, with Direct3D suited to 3D simulation development; the other technologies are useful in some fields but are inefficient and not suited to real-time processing. The experimental results in this paper will help projects involving image processing and simulation based on the DotNet framework and have strong practicability.
Multiphysics Analysis of a Solid-Core Nuclear Thermal Engine Thrust Chamber
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Canabal, Francisco; Cheng, Gary; Chen, Yen-Sen
2006-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a hypothetical solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics methodology. Formulations for heat transfer in solids and porous media were implemented and anchored. A two-pronged approach was employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of hydrogen dissociation and recombination on heat transfer and thrust performance. The formulations and preliminary results on both aspects are presented.
NASA Astrophysics Data System (ADS)
Gnyp, Andriy
2009-06-01
Based on the results of applying correlation analysis to records of the 2005 Mukacheve group of recurrent events and their subsequent relocation relative to the reference event of 7 July 2005, a conclusion has been drawn that all the events most likely occurred on the same rupture plane. Station terms have been estimated for seismic stations of the Transcarpathians, accounting for variation of seismic velocities beneath their locations as compared to the travel time tables used in the study. From a methodological standpoint, the potential and usefulness of correlation analysis of seismic records for a more detailed study of seismic processes, tectonics, and geodynamics of the Carpathian region have been demonstrated.
NASA Astrophysics Data System (ADS)
Choi, Jongwan; Kim, Felix Sunjoo
2018-03-01
We studied the influence of photoanode thickness on the photovoltaic characteristics and impedance responses of dye-sensitized solar cells based on a ruthenium dye containing a hexyloxyl-substituted carbazole unit (Ru-HCz). As the thickness of the photoanode increases from 4.2 μm to 14.8 μm, the dye-loading amount and the efficiency increase. Devices with still thicker photoanodes show a decrease in efficiency due to the higher probability of recombination of electron-hole pairs before charge extraction. We also analyzed the electron-transfer and recombination characteristics as a function of photoanode thickness through detailed electrochemical impedance spectroscopy analysis.
Llinás, Rodolfo R.; Ustinin, Mikhail N.; Rykunov, Stanislav D.; Boyko, Anna I.; Sychev, Vyacheslav V.; Walton, Kerry D.; Rabello, Guilherme M.; Garcia, John
2015-01-01
A new method for the analysis and localization of brain activity has been developed, based on multichannel magnetic field recordings, over minutes, superimposed on the MRI of the individual. Here, a high resolution Fourier Transform is obtained over the entire recording period, leading to a detailed multi-frequency spectrum. Further analysis implements a total decomposition of the frequency components into functionally invariant entities, each having an invariant field pattern localizable in recording space. The method, addressed as functional tomography, makes it possible to find the distribution of magnetic field sources in space. Here, the method is applied to the analysis of simulated data, to oscillating signals activating a physical current-dipole phantom, and to recordings of spontaneous brain activity in 10 healthy adults. In the analysis of simulated data, 61 dipoles are localized with 0.7 mm precision. For the physical phantom, the method is able to localize three simultaneously activated current dipoles with 1 mm precision. A spatial resolution of 3 mm was attained when localizing spontaneous alpha rhythm activity in 10 healthy adults, where the alpha peak was specified for each subject individually. Co-registration of the functional tomograms with each subject's head MRI localized alpha range activity to the occipital and/or posterior parietal brain region. This is the first application of this new functional tomography to human brain activity. The method successfully provides an overall view of brain electrical activity, a detailed spectral description and, combined with MRI, the localization of sources in anatomical brain space. PMID:26528119
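The first step described above, a high-resolution Fourier transform over the entire recording, hinges on the fact that frequency resolution is the reciprocal of the recording length. A small self-contained sketch on synthetic data (the sampling rate and signals are our own construction, not the study's recordings):

```python
import numpy as np

fs = 250.0                       # sampling rate, Hz (assumed)
n = int(120.0 * fs)              # two minutes of recording
t = np.arange(n) / fs
# two synthetic "channels" sharing a 10 Hz alpha-like component
x = np.vstack([np.sin(2 * np.pi * 10.0 * t),
               0.5 * np.sin(2 * np.pi * 10.0 * t + 0.3)])

spec = np.fft.rfft(x, axis=1)                    # per-channel spectrum
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
df = freqs[1] - freqs[0]                         # resolution = 1/T
peak = freqs[np.argmax(np.abs(spec[0]))]         # dominant frequency, ch. 0
```

With two minutes of data the bins are 1/120 Hz apart, fine enough to specify each subject's individual alpha peak, as the study requires.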
Method for rapid estimation of scour at highway bridges based on limited site data
Holnbeck, S.R.; Parrett, Charles
1997-01-01
Limited site data were used to develop a method for rapid estimation of scour at highway bridges. The estimates can be obtained in a matter of hours rather than several days as required by more-detailed methods. Such a method is important because scour assessments are needed to identify scour-critical bridges throughout the United States. Using detailed scour-analysis methods and scour-prediction equations recommended by the Federal Highway Administration, the U.S. Geological Survey, in cooperation with the Montana Department of Transportation, obtained contraction, pier, and abutment scour-depth data for sites from 10 States. The data were used to develop relations between scour depth and hydraulic variables that can be rapidly measured in the field. Relations between scour depth and hydraulic variables, in the form of envelope curves, were based on simpler forms of detailed scour-prediction equations. To apply the rapid-estimation method, a 100-year recurrence interval peak discharge is determined, and bridge-length data are used in the field with graphs relating unit discharge to velocity and velocity to bridge backwater as a basis for estimating flow depths and other hydraulic variables that can then be applied using the envelope curves. The method was tested in the field. Results showed good agreement among individuals involved and with results from more-detailed methods. Although useful for identifying potentially scour-critical bridges, the method does not replace more-detailed methods used for design purposes. Use of the rapid-estimation method should be limited to individuals having experience in bridge scour, hydraulics, and flood hydrology, and some training in use of the method.
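The envelope-curve lookup at the heart of the rapid-estimation method can be sketched as a simple interpolation over tabulated points. The curve values below are illustrative placeholders, not the published Montana/FHWA envelope curves:

```python
import bisect

def envelope_scour(unit_discharge, curve):
    """Upper-bound scour depth for a unit discharge, by linear interpolation
    between tabulated envelope points (hypothetical helper)."""
    qs = [p[0] for p in curve]
    ds = [p[1] for p in curve]
    if unit_discharge <= qs[0]:
        return ds[0]
    if unit_discharge >= qs[-1]:
        return ds[-1]
    i = bisect.bisect_right(qs, unit_discharge) - 1
    frac = (unit_discharge - qs[i]) / (qs[i + 1] - qs[i])
    return ds[i] + frac * (ds[i + 1] - ds[i])

# illustrative envelope points: (unit discharge, m^2/s; scour depth, m)
curve = [(1.0, 0.5), (5.0, 2.0), (10.0, 3.0), (20.0, 3.8)]
```

Because an envelope curve bounds the observed data from above, the returned depth is a conservative screening estimate, consistent with the method's screening (not design) role.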
NASA Technical Reports Server (NTRS)
Beard, Daniel A.; Liang, Shou-Dan; Qian, Hong; Biegel, Bryan (Technical Monitor)
2001-01-01
Predicting behavior of large-scale biochemical metabolic networks represents one of the greatest challenges of bioinformatics and computational biology. Approaches, such as flux balance analysis (FBA), that account for the known stoichiometry of the reaction network while avoiding implementation of detailed reaction kinetics are perhaps the most promising tools for the analysis of large complex networks. As a step towards building a complete theory of biochemical circuit analysis, we introduce energy balance analysis (EBA), which complements the FBA approach by introducing fundamental constraints based on the first and second laws of thermodynamics. Fluxes obtained with EBA are thermodynamically feasible and provide valuable insight into the activation and suppression of biochemical pathways.
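The two constraints named above, stoichiometric balance and thermodynamic feasibility, can be illustrated on a toy three-reaction pathway; the matrix and helper names are our own, not from the paper:

```python
import numpy as np

# stoichiometric matrix for a toy linear pathway fed by r0: A -> B -> out
S = np.array([[1, -1,  0],     # species A: made by r0, consumed by r1
              [0,  1, -1]])    # species B: made by r1, consumed by r2

def steady_state(S, v, tol=1e-9):
    # FBA constraint: no net accumulation of any internal species, S v = 0
    return bool(np.all(np.abs(S @ np.asarray(v, float)) < tol))

def thermodynamically_feasible(v, dG, tol=1e-9):
    # EBA-style constraint: every active flux must run down its free-energy
    # gradient, i.e. sign(v_j) = -sign(dG_j) wherever |v_j| > 0
    v, dG = np.asarray(v, float), np.asarray(dG, float)
    active = np.abs(v) > tol
    return bool(np.all(np.sign(v[active]) == -np.sign(dG[active])))

v = np.array([2.0, 2.0, 2.0])        # uniform pathway flux: S v = 0
dG = np.array([-5.0, -1.0, -3.0])    # all reactions exergonic forward
```

A flux vector that satisfies S v = 0 but drives some reaction against its ΔG would pass FBA yet fail the added check; excluding exactly that kind of solution is the point of EBA.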
Large-Scale Femtoliter Droplet Array for Single Cell Efflux Assay of Bacteria.
Iino, Ryota; Sakakihara, Shouichi; Matsumoto, Yoshimi; Nishino, Kunihiko
2018-01-01
Large-scale femtoliter droplet array as a platform for single cell efflux assay of bacteria is described. Device microfabrication, femtoliter droplet array formation and concomitant enclosure of single bacterial cells, fluorescence-based detection of efflux activity at the single cell level, and collection of single cells from droplet and subsequent gene analysis are described in detail.
ERIC Educational Resources Information Center
Shinebourne, Pnina
2012-01-01
This paper evolved from previous research on women's experience of addiction and recovery. The original study was based on detailed semi-structured interviews analysed using interpretative phenomenological analysis (IPA). In this study a poetic representation of material from participants' accounts was created to explore how a focus on the poetic…
User Experience Design of History Game: An Analysis Review and Evaluation Study for Malaysia Context
ERIC Educational Resources Information Center
Wong, Seng Yue; Ghavifekr, Simin
2018-01-01
User experience (UX) and user interface design of an educational game are important in enhancing and sustaining the utilisation of Game Based Learning (GBL) in learning history. Thus, this article provides a detailed literature review on history learning problems, as well as previous studies on user experience in game design. Future studies on…
Remote sensing information sciences research group
NASA Technical Reports Server (NTRS)
Estes, John E.; Smith, Terence; Star, Jeffrey L.
1988-01-01
Research conducted under this grant was used to extend and expand existing remote sensing activities at the University of California, Santa Barbara in the areas of georeferenced information systems, machine-assisted information extraction from image data and large spatial data bases, artificial intelligence, and vegetation analysis and modeling. The research thrusts during the past year are summarized. The projects are discussed in some detail.
ERIC Educational Resources Information Center
Estache, Antonio; Foster, Vivien; Wodon, Quentin
This book explores the connections between infrastructure reform and poverty alleviation in Latin America based on a detailed analysis of the effects of a decade of reforms. The book demonstrates that because the access to, and affordability of, basic services is still a major problem, infrastructure investment will be a core component of poverty…
Communications network design and costing model users manual
NASA Technical Reports Server (NTRS)
Logan, K. P.; Somes, S. S.; Clark, C. A.
1983-01-01
The information and procedures needed to exercise the communications network design and costing model for performing network analysis are presented. Specific procedures are included for executing the model on the NASA Lewis Research Center IBM 3033 computer. The concepts, functions, and data bases relating to the model are described. Model parameters and their format specifications for running the model are detailed.
Designing Worked Examples in Statics to Promote an Expert Stance: Working THRU vs. Working OUT
ERIC Educational Resources Information Center
Calfee, Robert; Stahovich, Thomas
2011-01-01
The purpose of this study was to examine the performance patterns of freshman engineering students as they completed a tutorial on freebody problems that employed a computer-based pen (CBP) to provide feedback and direct learning. A secondary analysis was conducted on detailed performance data for 16 participants from a freshman Engineering course…
ERIC Educational Resources Information Center
Windham, Patricia W.; Hackett, E. Raymond
In response to the increasing use of state-based performance indicators for postsecondary education, a study was undertaken to review the reliability and validity of state-level indicators in the Florida Community College System (FCCS). Data were collected from literature reviews and the 1996 FCCS Accountability Report, detailing outcomes for 17…
Ground-based digital imagery for tree stem analysis
Neil Clark; Daniel L. Schmoldt; Randolph H. Wynne; Matthew F. Winn; Philip A. Araman
2000-01-01
In the USA, a subset of permanent forest sample plots within each geographic region are intensively measured to obtain estimates of tree volume and products. The detailed field measurements required for this type of sampling are both time consuming and error prone. We are attempting to reduce both of these factors with the aid of a commercially-available solid-state...
Commerce Lab: Mission analysis payload integration study. Appendix A: Data bases
NASA Technical Reports Server (NTRS)
1985-01-01
The development of Commerce Lab is detailed. Its objectives are to support the space program in these areas: (1) the expedition of space commercialization; (2) the advancement of microgravity science and applications; and (3) as a precursor to future missions in the space program. Ways and means of involving private industry and academia in this commercialization is outlined.
Cost and Usage Study of the Educational Resources Information Center (ERIC) System. Final Report.
ERIC Educational Resources Information Center
McDonald, Dennis D; And Others
Providing a detailed descriptive analysis of both the direct and indirect costs incurred by the Federal government in operating the ERIC system and of the user population and user demand for ERIC products and services, this study is based on data gathered from a number of complementary sources. These sources included a survey of ERIC's U.S. intermediate…
ERIC Educational Resources Information Center
Heinmiller, Joseph L.
Based on data gathered from a number of complementary sources, this study provides a detailed descriptive analysis of both the direct and indirect costs incurred by the Federal government in operating the ERIC system, and the user population and user demand for ERIC products and services. Data sources included a survey of ERIC's U.S. intermediate…
Parents' Experiences of Home-Based Applied Behavior Analysis Programs for Young Children with Autism
ERIC Educational Resources Information Center
Grindle, Corinna F.; Kovshoff, Hanna; Hastings, Richard P.; Remington, Bob
2009-01-01
Although much research has documented the benefits to children with autism of early intensive behavioral intervention (EIBI), little has focused on the impact of EIBI on families. Using a semi-structured format, we interviewed 53 parents whose children had received 2 years of EIBI to obtain detailed first person accounts of the perceived benefits…
ERIC Educational Resources Information Center
Dominguez-Whitehead, Yasmine
2015-01-01
This article situates food at the heart of the fundamentals of student development, based on qualitative case study research. Food acquisition and food-related struggles in the context of the South African university are examined. Three overarching themes emerged from the analysis of the data, and are discussed in detail: depletion of food funds,…
Canadian Families' Strategies for Employment and Care for Preschool Children
ERIC Educational Resources Information Center
Ornstein, Michael; Stalker, Glenn J.
2013-01-01
Based on the 2006 Canadian Census "long form" sample of one in every five households, the authors develop a detailed typology of family strategies for employment and the care of preschool children. The analysis is restricted to opposite-sex couples with at least one child under age 6 and no older child or other adult in the household.…
Calcium inputs and transport in a base-poor forest ecosystem as interpreted by Sr isotopes
Scott W. Bailey; James W. Hornbeck; Charles T. Driscoll; Henri E. Gaudette
1996-01-01
Depletion of Ca in forests and its effects on forest health are poorly quantified. Depletion has been difficult to document due to limitations in determining rates at which Ca becomes available for ecosystem processes through weathering, and difficulty in determining changes in ecosystem storage. We coupled a detailed analysis of Sr isotopic composition with a mass...
A new solar carbon abundance based on non-LTE CN molecular spectra
NASA Technical Reports Server (NTRS)
Mount, G. H.; Linsky, J. L.
1975-01-01
A detailed non-LTE analysis of solar CN spectra strongly suggests a revised carbon abundance for the sun. We recommend a value of log carbon abundance = 8.35 plus or minus 0.15, which is significantly lower than the presently accepted value of log carbon abundance = 8.55. This revision may have important consequences in astrophysics.
Timber products output and timber harvests in Alaska: projections for 1989-2010.
David J. Brooks; Richard W. Haynes
1990-01-01
Projections of Alaska timber products output and timber harvest by owner were developed by using a detailed, trend-based analysis. Historical data for 1965-88 were the basis for projections for 1989-2010. Projections of timber products output for each major product (export logs, sawn wood, and market pulp) were used to compute the derived demand for timber. The...
Improved separability criteria via some classes of measurements
NASA Astrophysics Data System (ADS)
Shen, Shu-Qian; Li, Ming; Li-Jost, Xianqing; Fei, Shao-Ming
2018-05-01
The entanglement detection via local measurements can be experimentally implemented. Based on mutually unbiased measurements and general symmetric informationally complete positive-operator-valued measures, we present separability criteria for bipartite quantum states, which, by theoretical analysis, are stronger than the related existing criteria via these measurements. Two detailed examples are supplemented to show the efficiency of the presented separability criteria.
Situation Comedy, Feminism and Freud: Discourses of Gracie and Lucy.
ERIC Educational Resources Information Center
Mellencamp, Patricia
This paper is based on a general analysis of 40 episodes of The George Burns and Gracie Allen Show and 170 (of 179) episodes of I Love Lucy, both of which were aired on television during the 1950s. Character portrayals of the stars and supporting actors/actresses are described in detail and analyzed from the perspectives of gender and sex…
ERIC Educational Resources Information Center
Taylor, Liz
2014-01-01
This study outlines some challenges of teaching about distant place and demonstrates how different strategies can influence school students' framings of diversity. The analysis is based on an interpretive case study of 13-14-year-old students learning about Japan in a UK school. Their changing representations of Japan were tracked in detail over a…
ERIC Educational Resources Information Center
d'Alessio, Matthew; Lundquist, Loraine
2013-01-01
Each year our physical science class for pre-service elementary teachers launches water-powered rockets based on the activity from NASA. We analyze the rocket flight using data from frame-by-frame video analysis of the launches. Before developing the methods presented in this paper, we noticed our students were mired in calculation details while…
Simulation of Electric Propulsion Thrusters (Preprint)
2011-02-07
activity concerns the plumes produced by electric thrusters. Detailed information on the plumes is required for safe integration of the thruster...ground-based laboratory facilities. Device modelling also plays an important role in plume simulations by providing accurate boundary conditions at...methods used to model the flow of gas and plasma through electric propulsion devices. Discussion of the numerical analysis of other aspects of
Simulation of Electric Propulsion Thrusters
2011-01-01
and operational lifetime. The second area of modelling activity concerns the plumes produced by electric thrusters. Detailed information on the plumes ...to reproduce the in-orbit space environment using ground-based laboratory facilities. Device modelling also plays an important role in plume ...of the numerical analysis of other aspects of thruster design, such as thermal and structural processes, is omitted here. There are two fundamental
Contrast-detail phantom scoring methodology.
Thomas, Jerry A; Chakrabarti, Kish; Kaczmarek, Richard; Romanyukha, Alexander
2005-03-01
Published results of medical imaging studies which make use of contrast detail mammography (CDMAM) phantom images for analysis are difficult to compare since data are often not analyzed in the same way. In order to address this situation, the concept of ideal contrast detail curves is suggested. The ideal contrast detail curves are constructed based on the requirement of having the same product of the diameter and contrast (disk thickness) of the minimal correctly determined object for every row of the CDMAM phantom image. A correlation and comparison of five different quality parameters of the CDMAM phantom image determined for obtained ideal contrast detail curves is performed. The image quality parameters compared include: (1) contrast detail curve--a graph correlation between "minimal correct reading" diameter and disk thickness; (2) correct observation ratio--the ratio of the number of correctly identified objects to the actual total number of objects multiplied by 100; (3) image quality figure--the sum of the product of the diameter of the smallest scored object and its relative contrast; (4) figure-of-merit--the zero disk diameter value obtained from extrapolation of the contrast detail curve to the origin (e.g., zero disk diameter); and (5) k-factor--the product of the thickness and the diameter of the smallest correctly identified disks. The analysis carried out showed the existence of a nonlinear relationship between the above parameters, which means that use of different parameters of CDMAM image quality potentially can cause different conclusions about changes in image quality. Construction of the ideal contrast detail curves for CDMAM phantom is an attempt to determine the quantitative limits of the CDMAM phantom as employed for image quality evaluation. These limits are determined by the relationship between certain parameters of a digital mammography system and the set of the gold disks sizes in the CDMAM phantom. 
Recommendations are made on selections of CDMAM phantom regions which should be used for scoring at different image quality and which scoring methodology may be most appropriate. Special attention is also paid to the use of the CDMAM phantom for image quality assessment of digital mammography systems particularly in the vicinity of the Nyquist frequency.
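Three of the five quality parameters enumerated above reduce to short arithmetic. A sketch with made-up phantom readings (the row values are illustrative, not CDMAM gold-disk sizes):

```python
def correct_observation_ratio(n_correct, n_total):
    # parameter (2): correctly identified objects over all objects, in percent
    return 100.0 * n_correct / n_total

def k_factor(smallest_correct):
    # parameter (5): thickness times diameter of the smallest correct disks
    diameter, thickness = smallest_correct
    return diameter * thickness

def image_quality_figure(rows):
    # parameter (3): sum over rows of smallest scored diameter x its contrast
    return sum(d * c for d, c in rows)

# illustrative per-row smallest-correct readings: (diameter, relative contrast)
rows = [(0.1, 1.0), (0.2, 0.5), (0.5, 0.2)]
```

Note that these made-up rows have a constant diameter-contrast product, which is exactly the defining property of the ideal contrast detail curves proposed above.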
NASA Technical Reports Server (NTRS)
Ruf, J. H.; Hagemann, G.; Immich, H.
2003-01-01
A three dimensional linear plug nozzle of area ratio 12.79 was designed by EADS Space Transportation (formerly Astrium Space Infrastructure). The nozzle was tested within the German National Technology Program 'LION' in a cold air wind tunnel by TU Dresden. The experimental hardware and test conditions are described. Experimental data was obtained for the nozzle without plug side wall fences at a nozzle pressure ratio (NPR) of 116 and then with plug side wall fences at NPR 110. Schlieren images were recorded, and axial profiles of plug wall static pressures were measured at several spanwise locations and on the plug base. A detailed CFD analysis was performed for these nozzle configurations at NPR 116 by NASA MSFC. The CFD exhibits good agreement with the experimental data. A detailed comparison of the CFD results and the experimental plug wall pressure data is given. Comparisons are made for configurations both with and without plug side wall fences. Numerical results for density gradient are compared to experimental Schlieren images. Experimental nozzle thrust efficiencies are calculated based on the CFD results. The CFD results are used to illustrate the plug nozzle fluid dynamics. The effect of the plug side wall is emphasized.
A thermoacoustic-Stirling heat engine: detailed study
Backhaus; Swift
2000-06-01
A new type of thermoacoustic engine based on traveling waves and ideally reversible heat transfer is described. Measurements and analysis of its performance are presented. This new engine outperforms previous thermoacoustic engines, which are based on standing waves and intrinsically irreversible heat transfer, by more than 50%. At its most efficient operating point, it delivers 710 W of acoustic power to its resonator with a thermal efficiency of 0.30, corresponding to 41% of the Carnot efficiency. At its most powerful operating point, it delivers 890 W to its resonator with a thermal efficiency of 0.22. The efficiency of this engine can be degraded by two types of acoustic streaming. These are suppressed by appropriate tapering of crucial surfaces in the engine and by using additional nonlinearity to induce an opposing time-averaged pressure difference. Data are presented which show the nearly complete elimination of the streaming convective heat loads. Analysis of these and other irreversibilities show which components of the engine require further research to achieve higher efficiency. Additionally, these data show that the dynamics and acoustic power flows are well understood, but the details of the streaming suppression and associated heat convection are only qualitatively understood.
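The quoted efficiencies can be cross-checked with elementary Carnot arithmetic (our own bookkeeping; the reservoir temperatures in the test are illustrative, not the engine's operating temperatures):

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    # ideal limit for a heat engine between two reservoir temperatures
    return 1.0 - t_cold_k / t_hot_k

def carnot_fraction(eta, t_hot_k, t_cold_k):
    # fraction of the Carnot limit actually achieved
    return eta / carnot_efficiency(t_hot_k, t_cold_k)

# the quoted 0.30 thermal efficiency at 41% of Carnot implies a Carnot
# limit near 0.73, i.e. a hot end roughly 3.7x the cold-end temperature
eta_carnot_implied = 0.30 / 0.41
```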
Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator
NASA Astrophysics Data System (ADS)
Rehman, Naveed Ur; Siddiqui, Mubashir Ali
2017-03-01
In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.
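The sampling-and-fitting procedure described above (Latin hypercube samples of the inputs, a least-squares response surface, and a coefficient of determination) can be sketched in miniature. The two-input "simulator" below is a hypothetical stand-in for the thermodynamic model, not the SCTEG model itself:

```python
import random

random.seed(0)

def latin_hypercube(n, dims):
    """One sample per equal-probability stratum in each dimension,
    with the strata randomly permuted per dimension."""
    cols = []
    for _ in range(dims):
        perm = list(range(n))
        random.shuffle(perm)
        cols.append([(p + random.random()) / n for p in perm])
    return list(zip(*cols))  # n points, each of dimension dims

def fit_least_squares(X, y):
    """Solve the normal equations (X^T X) b = X^T y by Gaussian elimination."""
    k = len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(len(X))) for c in range(k)]
         for r in range(k)]
    b = [sum(X[i][r] * y[i] for i in range(len(X))) for r in range(k)]
    for r in range(k):  # forward elimination with partial pivoting
        piv = max(range(r, k), key=lambda i: abs(A[i][r]))
        A[r], A[piv] = A[piv], A[r]
        b[r], b[piv] = b[piv], b[r]
        for i in range(r + 1, k):
            f = A[i][r] / A[r][r]
            for c in range(r, k):
                A[i][c] -= f * A[r][c]
            b[i] -= f * b[r]
    coef = [0.0] * k
    for r in reversed(range(k)):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, k))) / A[r][r]
    return coef

# Hypothetical "simulator": performance as a smooth function of two inputs.
def simulate(x1, x2):
    return 2.0 + 3.0 * x1 - 1.5 * x2

samples = latin_hypercube(50, 2)
X = [[1.0, x1, x2] for x1, x2 in samples]  # design matrix with intercept
y = [simulate(x1, x2) for x1, x2 in samples]
coef = fit_least_squares(X, y)

# Coefficient of determination R^2 of the fitted response surface
y_hat = [sum(c * xi for c, xi in zip(coef, row)) for row in X]
y_mean = sum(y) / len(y)
ss_res = sum((a - b) ** 2 for a, b in zip(y, y_hat))
ss_tot = sum((a - y_mean) ** 2 for a in y)
r2 = 1.0 - ss_res / ss_tot
```

Because the toy simulator is exactly linear, the fit recovers its coefficients and R² is essentially 1; the paper's 99.2% reflects a response surface fitted to a genuinely nonlinear thermodynamic model.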
De Crop, An; Bacher, Klaus; Van Hoof, Tom; Smeets, Peter V; Smet, Barbara S; Vergauwen, Merel; Kiendys, Urszula; Duyck, Philippe; Verstraete, Koenraad; D'Herde, Katharina; Thierens, Hubert
2012-01-01
To determine the correlation between the clinical and physical image quality of chest images by using cadavers embalmed with the Thiel technique and a contrast-detail phantom. The use of human cadavers fulfilled the requirements of the institutional ethics committee. Clinical image quality was assessed by using three human cadavers embalmed with the Thiel technique, which results in excellent preservation of the flexibility and plasticity of organs and tissues. As a result, lungs can be inflated during image acquisition to simulate the pulmonary anatomy seen on a chest radiograph. Both contrast-detail phantom images and chest images of the Thiel-embalmed bodies were acquired with an amorphous silicon flat-panel detector. Tube voltage (70, 81, 90, 100, 113, 125 kVp), copper filtration (0.1, 0.2, 0.3 mm Cu), and exposure settings (200, 280, 400, 560, 800 speed class) were altered to simulate different quality levels. Four experienced radiologists assessed the image quality by using a visual grading analysis (VGA) technique based on European Quality Criteria for Chest Radiology. The phantom images were scored manually and automatically with use of dedicated software, both resulting in an inverse image quality figure (IQF). Spearman rank correlations between inverse IQFs and VGA scores were calculated. A statistically significant correlation (r = 0.80, P < .01) was observed between the VGA scores and the manually obtained inverse IQFs. Comparison of the VGA scores and the automated evaluated phantom images showed an even better correlation (r = 0.92, P < .001). The results support the value of contrast-detail phantom analysis for evaluating clinical image quality in chest radiography. © RSNA, 2011.
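The statistic at the heart of the study above, the Spearman rank correlation between VGA scores and inverse IQFs, is computed from ranks alone. The sketch below uses invented scores purely for illustration; the actual study used four radiologists' gradings of Thiel-embalmed cadaver images:

```python
def ranks(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores: VGA (clinical) vs. inverse IQF (phantom), 8 settings.
vga = [1.2, 1.5, 1.9, 2.1, 2.4, 2.8, 3.0, 3.3]
inv_iqf = [0.9, 1.4, 1.6, 2.3, 2.2, 2.9, 3.1, 3.5]
print(round(spearman(vga, inv_iqf), 2))  # → 0.98
```

Because Spearman correlation depends only on rank order, it is well suited to comparing an ordinal visual grading score with a continuous phantom-based figure of merit.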
Artificial intelligence techniques used in respiratory sound analysis--a systematic review.
Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian
2014-02-01
Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.
Fatigue assessment of an existing steel bridge by finite element modelling and field measurements
NASA Astrophysics Data System (ADS)
Kwad, J.; Alencar, G.; Correia, J.; Jesus, A.; Calçada, R.; Kripakaran, P.
2017-05-01
The evaluation of fatigue life of structural details in metallic bridges is a major challenge for bridge engineers. A reliable and cost-effective approach is essential to ensure appropriate maintenance and management of these structures. Typically, local stresses predicted by a finite element model of the bridge are employed to assess the fatigue life of fatigue-prone details. This paper illustrates an approach for fatigue assessment based on measured data for a connection in an old bascule steel bridge located in Exeter (UK). A finite element model is first developed from the design information. The finite element model of the bridge is calibrated using measured responses from an ambient vibration test. The stress time histories are calculated through dynamic analysis of the updated finite element model. Stress cycles are computed through the rainflow counting algorithm, and the fatigue-prone details are evaluated using the standard S-N curve approach and Miner's rule. Results show that the proposed approach can estimate the fatigue damage of a fatigue-prone detail in a structure using measured strain data.
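The final steps of the approach, rainflow-counted stress cycles evaluated against an S-N curve with Miner's rule, reduce to a damage summation. The sketch below assumes a Basquin-form S-N curve and a hypothetical cycle histogram; the constants are illustrative, not those of the Exeter bridge detail:

```python
# S-N curve in Basquin form: N(S) = C / S^m. Both constants are assumed
# for illustration (m = 3 is a typical slope for welded steel details).
C = 2.0e12  # hypothetical fatigue capacity constant
m = 3.0     # hypothetical S-N curve slope

def cycles_to_failure(stress_range_mpa):
    """Endurance N for a constant-amplitude stress range S, from N = C / S^m."""
    return C / stress_range_mpa ** m

# Hypothetical cycle counts per stress-range bin, e.g. from rainflow
# counting of one year of measured stress history.
histogram = {40.0: 2.0e5, 60.0: 5.0e4, 80.0: 1.0e4}  # MPa -> cycles/year

# Miner's rule: damage D = sum(n_i / N_i); failure is expected at D = 1.
annual_damage = sum(n / cycles_to_failure(s) for s, n in histogram.items())
fatigue_life_years = 1.0 / annual_damage
```

With these assumed numbers the damage sums to about 0.014 per year, i.e. a predicted fatigue life of roughly 70 years; in practice the stress histogram comes from the calibrated model's dynamic analysis and measured strains.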
NASA Astrophysics Data System (ADS)
Vacanti, Giuseppe; Barrière, Nicolas; Bavdaz, Marcos; Chatbi, Abdelhakim; Collon, Maximilien; Dekker, Daniëlle; Girou, David; Günther, Ramses; van der Hoeven, Roy; Krumrey, Michael; Landgraf, Boris; Müller, Peter; Schreiber, Swenja; Vervest, Mark; Wille, Eric
2017-09-01
While predictions based on metrology (local slope errors and detailed geometry) play an essential role in controlling the development of the manufacturing processes, X-ray characterization remains the ultimate indication of the actual performance of Silicon Pore Optics (SPO). For this reason SPO stacks and mirror modules are routinely characterized at PTB's X-ray Pencil Beam Facility at BESSY II. Obtaining standard X-ray results quickly, right after the production of X-ray optics, is essential to ensuring that X-ray results can inform decisions taken in the lab. We describe the data analysis pipeline in operation at cosine, and how it allows us to go from stack production to full X-ray characterization in 24 hours.
NASA Technical Reports Server (NTRS)
Shelton, Duane; Gamota, George
1989-01-01
The Japanese regard success in R and D in high temperature superconductivity as an important national objective. The results of a detailed evaluation of the current state of Japanese high temperature superconductivity development are provided. The analysis was performed by a panel of technical experts drawn from U.S. industry and academia, and is based on reviews of the relevant literature and visits to Japanese government, academic and industrial laboratories. Detailed appraisals are presented on the following: Basic research; superconducting materials; large scale applications; processing of superconducting materials; superconducting electronics and thin films. In all cases, comparisons are made with the corresponding state-of-the-art in the United States.
Grazing-incidence small angle x-ray scattering studies of nanoscale polymer gratings
NASA Astrophysics Data System (ADS)
Doxastakis, Manolis; Suh, Hyo Seon; Chen, Xuanxuan; Rincon Delgadillo, Paulina A.; Wan, Lingshu; Williamson, Lance; Jiang, Zhang; Strzalka, Joseph; Wang, Jin; Chen, Wei; Ferrier, Nicola; Ramirez-Hernandez, Abelardo; de Pablo, Juan J.; Gronheid, Roel; Nealey, Paul
2015-03-01
Grazing-Incidence Small Angle X-ray Scattering (GISAXS) offers the ability to probe large sample areas, providing three-dimensional structural information at high detail in a thin film geometry. In this study we apply GISAXS to structures formed at one step of the LiNe (Liu-Nealey) flow, which uses chemical patterns for directed self-assembly of block copolymer films. Experiments conducted at Argonne National Laboratory provided scattering patterns probing film characteristics in directions both parallel and normal to the surface. We demonstrate the application of new computational methods to construct models based on the measured scattering. Such analysis allows for extraction of structural characteristics at unprecedented detail.
NASA Technical Reports Server (NTRS)
Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga
2005-01-01
Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
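The discrete-event process simulation RPST builds on can be illustrated with a deliberately minimal model: jobs contending for a single finite-capacity resource, where a job waits whenever the resource is still busy. This is a generic queueing sketch, not RPST code:

```python
import random

random.seed(1)

def simulate_queue(n_jobs, mean_interarrival, service_time):
    """Minimal discrete-event simulation of a single-capacity resource:
    jobs arrive at random exponential intervals, queue if the resource
    is busy, then are served in arrival order. Returns the mean wait."""
    arrivals = []
    t = 0.0
    for _ in range(n_jobs):
        t += random.expovariate(1.0 / mean_interarrival)  # next arrival time
        arrivals.append(t)
    free_at = 0.0  # time at which the resource next becomes free
    waits = []
    for a in arrivals:
        start = max(a, free_at)  # wait only if the resource is still busy
        waits.append(start - a)
        free_at = start + service_time
    return sum(waits) / n_jobs

# 60% utilization: jobs every ~10 time units, each holding the resource for 6.
avg_wait = simulate_queue(1000, mean_interarrival=10.0, service_time=6.0)
```

Even this toy model exhibits the behavior such tools quantify: at moderate utilization, queueing delay is nonzero and grows sharply as the resource approaches saturation, which is exactly the kind of operational effect a proposed facility upgrade would change.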
NASA Technical Reports Server (NTRS)
Biernacki, John; Juhasz, John; Sadler, Gerald
1991-01-01
A team of Space Station Freedom (SSF) system engineers is performing an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.
Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.
Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin
2017-08-16
The objective consensus methodology has recently been applied in consensus finding in several studies on medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of treatment algorithms of the participating centers which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.
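For illustration, the decision trees being collected and compared can be flattened to rule tables keyed by patient criteria, after which automated consensus analysis is a vote count per patient profile. Everything below (criteria, treatments, centers) is hypothetical, not data from the cited analyses:

```python
from itertools import product

# Hypothetical example: each center's treatment algorithm flattened to a
# rule table mapping (stage, fitness) -> recommended treatment.
criteria = {"stage": ["early", "late"], "fit": ["yes", "no"]}

center_a = {("early", "yes"): "surgery", ("early", "no"): "radiotherapy",
            ("late", "yes"): "chemo", ("late", "no"): "palliative"}
center_b = {("early", "yes"): "surgery", ("early", "no"): "radiotherapy",
            ("late", "yes"): "chemo+radio", ("late", "no"): "palliative"}
center_c = {("early", "yes"): "surgery", ("early", "no"): "surgery",
            ("late", "yes"): "chemo", ("late", "no"): "palliative"}

centers = [center_a, center_b, center_c]  # contributions stay anonymous

# For each patient profile, find the modal recommendation and the
# fraction of centers agreeing with it.
consensus = {}
for profile in product(*criteria.values()):
    votes = [c[profile] for c in centers]
    top = max(set(votes), key=votes.count)
    consensus[profile] = (top, votes.count(top) / len(centers))
```

Profiles with full agreement (e.g. early-stage fit patients above) represent consensus, while split profiles flag exactly the decision points worth re-evaluating with the participating centers.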
NASA Astrophysics Data System (ADS)
Palchykov, Vitalii A.; Zarovnaya, Iryna S.; Tretiakov, Serhii V.; Reshetnyak, Alyona V.; Omelchenko, Iryna V.; Shishkin, Oleg V.; Okovytyy, Sergiy I.
2018-04-01
Aminolysis of 3,4-epoxysulfolane in aqueous media leads to a very complex mixture of products with unresolved stereochemistry. Herein, we report a detailed theoretical and experimental mechanistic investigation of this reaction along with extensive spectroscopic characterization of the resulting amino alcohols, using 1D and 2D NMR techniques (1H, 13C, NOE, NOESY, COSY, HSQC, HMBC) as well as XRD analysis. In addition to simple amines such as ammonia and benzylamine, our study also employed the more sterically hindered endo-bicyclo[2.2.1]hept-5-en-2-ylmethanamine. The mechanism of the aminolysis of 3,4-epoxysulfolane by aqueous ammonia was studied in more detail using quantum chemical calculations at the M06-2X/6-31++G** level of theory. The computational results led us to conclude that the most probable route of initial epoxide transformation is base-catalyzed rearrangement to the corresponding allylic alcohol. Subsequent formation of vicinal amino alcohols and diols proceeds via addition of ammonia or hydroxide anions to the activated C=C double bond, with some preference for cis-attack. Detailed analytical data obtained in the course of our work will be useful for the stereochemical identification of new sulfolane derivatives.
Edge Preserved Speckle Noise Reduction Using Integrated Fuzzy Filters
Dewal, M. L.; Rohit, Manoj Kumar
2014-01-01
Echocardiographic images are inherently affected by speckle noise, which makes visual reading and analysis quite difficult. The multiplicative speckle noise masks finer details necessary for diagnosis of abnormalities. A novel speckle reduction technique based on integration of geometric, Wiener, and fuzzy filters is proposed and analyzed in this paper. The denoising applications of fuzzy filters are studied and analyzed along with 26 denoising techniques. It is observed that the geometric filter retains noise and, to address this issue, a Wiener filter is embedded into the geometric filter during the iteration process. The performance of the geometric-Wiener filter is further enhanced using fuzzy filters, and the proposed despeckling techniques are called integrated fuzzy filters. Fuzzy filters based on moving average and median value are employed in the integrated fuzzy filters. The performances of integrated fuzzy filters are tested on echocardiographic images and synthetic images in terms of image quality metrics. It is observed that the performance parameters are highest in case of integrated fuzzy filters in comparison to fuzzy and geometric-fuzzy filters. The clinical validation reveals that the output images obtained using geometric-Wiener, integrated fuzzy, nonlocal means, and detail-preserving anisotropic diffusion filters are acceptable. The necessary finer details are retained in the denoised echocardiographic images. PMID:27437499
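The idea behind a moving-average-based fuzzy filter can be sketched as a weighted mean whose weights are fuzzy memberships, so that speckle-like outliers contribute little. This 1-D sketch is a generic illustration of the principle, not the authors' exact filter (which operates on 2-D images):

```python
def fuzzy_ma_filter(signal, half_window=2):
    """Sketch of a moving-average-based fuzzy filter: each window sample
    is weighted by a triangular membership of its distance from the window
    median, so speckle-like spikes contribute little to the average."""
    out = []
    n = len(signal)
    for i in range(n):
        window = signal[max(0, i - half_window):min(n, i + half_window + 1)]
        med = sorted(window)[len(window) // 2]
        spread = max(abs(v - med) for v in window) or 1.0
        weights = [max(0.0, 1.0 - abs(v - med) / spread) for v in window]
        total = sum(weights)
        # Fall back to the median if every membership vanishes
        out.append(med if total == 0.0
                   else sum(w * v for w, v in zip(weights, window)) / total)
    return out

noisy = [10, 10, 50, 10, 10, 10, 10]  # a single speckle-like spike at index 2
smoothed = fuzzy_ma_filter(noisy)
```

The spike at index 2 receives zero membership within its own window and is replaced by the weighted mean of its neighbors, while flat regions pass through unchanged, which is the edge- and detail-preserving behavior the paper's quality metrics measure.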
NASA Technical Reports Server (NTRS)
Haber, Benjamin M.; Green, Joseph J.
2010-01-01
The GOATS Orbitology Component software was developed to specifically address the concerns presented by orbit analysis tools that are often written as stand-alone applications. These applications do not easily interface with standard JPL first-principles analysis tools, and have a steep learning curve due to their complicated nature. This toolset is written as a series of MATLAB functions, allowing seamless integration into existing JPL optical systems engineering modeling and analysis modules. The functions are completely open, and allow advanced users to delve into and modify the underlying physics being modeled. Additionally, this software module fills an analysis gap, allowing for quick, high-level mission analysis trades without the need for detailed and complicated orbit analysis using commercial stand-alone tools. This software consists of a series of MATLAB functions that provide geometric orbit-related analysis, including propagation of orbits at varying levels of generality. In the simplest case, geosynchronous orbits can be modeled by specifying a subset of three orbit elements. The next case is a circular orbit, which can be specified by a subset of four orbit elements. The most general case is an arbitrary elliptical orbit specified by all six orbit elements. These orbits are all solved geometrically, under the basic problem of an object in circular (or elliptical) orbit around a rotating spheroid. The orbit functions output time-series ground tracks, which serve as the basis for more detailed orbit analysis. This software module also includes functions to track the positions of the Sun, Moon, and arbitrary celestial bodies specified by right ascension and declination. Also included are functions to calculate line-of-sight geometries to ground-based targets, angular rotations and decompositions, and other line-of-sight calculations.
The toolset allows for the rapid execution of orbit trade studies at the level of detail required for the early stage of mission concept development.
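The simplest geometric case described above, the ground track of a circular orbit about a rotating body, can be sketched directly (here over a sphere, in Python rather than the toolset's MATLAB, with no perturbations):

```python
import math

MU = 398600.4418          # Earth's gravitational parameter, km^3/s^2
EARTH_ROT = 7.2921159e-5  # Earth's rotation rate, rad/s
R_EARTH = 6378.137        # Earth's equatorial radius, km

def ground_track(alt_km, inclination_deg, duration_s, step_s):
    """Geometric ground track of a circular orbit around a rotating sphere:
    (latitude, longitude) of the subsatellite point at each time step."""
    a = R_EARTH + alt_km
    n = math.sqrt(MU / a ** 3)  # mean motion, rad/s
    inc = math.radians(inclination_deg)
    points = []
    t = 0.0
    while t <= duration_s:
        u = n * t  # argument of latitude, measured from the ascending node
        lat = math.asin(math.sin(inc) * math.sin(u))
        lon_inertial = math.atan2(math.cos(inc) * math.sin(u), math.cos(u))
        lon = math.degrees(lon_inertial) - math.degrees(EARTH_ROT * t)
        lon = (lon + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
        points.append((math.degrees(lat), lon))
        t += step_s
    return points

# Hypothetical low Earth orbit: 500 km altitude, 51.6 deg inclination.
track = ground_track(alt_km=500.0, inclination_deg=51.6,
                     duration_s=5400.0, step_s=60.0)
```

The track starts at the ascending node, reaches a peak latitude equal to the inclination, and drifts westward in longitude as the body rotates beneath the orbit, which is exactly the time-series output that more detailed analyses build on.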
Thalanany, Mariamma M; Mugford, Miranda; Hibbert, Clare; Cooper, Nicola J; Truesdale, Ann; Robinson, Steven; Tiruvoipati, Ravindranath; Elbourne, Diana R; Peek, Giles J; Clemens, Felicity; Hardy, Polly; Wilson, Andrew
2008-01-01
Background Extracorporeal Membrane Oxygenation (ECMO) is a technology used in treatment of patients with severe but potentially reversible respiratory failure. A multi-centre randomised controlled trial (CESAR) was funded in the UK to compare care including ECMO with conventional intensive care management. The protocol and funding for the CESAR trial included plans for economic data collection and analysis. Given the high cost of treatment, ECMO is considered an expensive technology for many funding systems. However, conventional treatment for severe respiratory failure is also one of the more costly forms of care in any health system. Methods/Design The objectives of the economic evaluation are to compare the costs of a policy of referral for ECMO with those of conventional treatment; to assess cost-effectiveness and the cost-utility at 6 months follow-up; and to assess the cost-utility over a predicted lifetime. Resources used by patients in the trial are identified. Resource use data are collected from clinical report forms and through follow up interviews with patients. Unit costs of hospital intensive care resources are based on parallel research on cost functions in UK NHS intensive care units. Other unit costs are based on published NHS tariffs. Cost effectiveness analysis uses the outcome: survival without severe disability. Cost utility analysis is based on quality adjusted life years gained based on the Euroqol EQ-5D at 6 months. Sensitivity analysis is planned to vary assumptions about transport costs and method of costing intensive care. Uncertainty will also be expressed in analysis of individual patient data. Probabilities of cost effectiveness given different funding thresholds will be estimated. Discussion In our view it is important to record our methods in detail and present them before publication of the results of the trial so that a record of detail not normally found in the final trial reports can be made available in the public domain. 
Trial Registrations The CESAR trial registration number is ISRCTN47279827. PMID:18447931
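At its core, the planned cost-utility comparison reduces to an incremental cost-effectiveness ratio judged against a funding threshold. The numbers below are purely illustrative, not CESAR results:

```python
# Illustrative cost-utility calculation of the kind the protocol describes.
# All figures are hypothetical placeholders, not trial data.
cost_ecmo, cost_conventional = 75000.0, 35000.0  # mean cost per patient (GBP)
qaly_ecmo, qaly_conventional = 0.40, 0.25        # mean QALYs at 6 months

# Incremental cost-effectiveness ratio: extra cost per extra QALY gained.
icer = (cost_ecmo - cost_conventional) / (qaly_ecmo - qaly_conventional)

# Decision rule against a hypothetical willingness-to-pay threshold per QALY.
threshold = 300000.0
cost_effective = icer <= threshold
```

In the actual analysis, uncertainty in both cost and QALY differences would be propagated (e.g. by bootstrapping individual patient data) to give a probability of cost-effectiveness at each threshold, rather than the single point estimate shown here.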
Ramírez Hernández, Javier; Bonete Pérez, María José; Martínez Espinosa, Rosa María
2014-12-17
1) to propose a new classification of the trace elements based on a study of recently reported research; 2) to offer detailed and updated information about trace elements. The analysis of recently reported research results reveals that advances in molecular analysis techniques point out the importance of certain trace elements in human health. A detailed analysis of the catalytic function of several elements not considered essential or probably essential until now is also offered. Informatics tools were used to perform an integral analysis of the enzymes containing trace elements. Updated information about physiological role, kinetics, metabolism, dietary sources and factors promoting trace element scarcity or toxicity is also presented. Oligotherapy uses catalytically active trace elements for therapeutic purposes. The new trace element classification presented here will be of high interest for different professional sectors: doctors and other medical professions, nutritionists, pharmacists, etc. Using this new classification and these approaches, new therapeutic strategies could be designed to mitigate symptoms related to several pathologies, particularly deficiency-related and metabolic diseases. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
Instantiating the art of war for effects-based operations
NASA Astrophysics Data System (ADS)
Burns, Carla L.
2002-07-01
Effects-Based Operations (EBO) is a mindset, a philosophy and an approach for planning, executing and assessing military operations for the effects they produce rather than the targets or even objectives they deal with. An EBO approach strives to provide economy of force, dynamic tasking, and reduced collateral damage. The notion of EBO is not new: military commanders certainly have desired effects in mind when conducting military operations. However, to date EBO has been an art of war that lacks automated techniques and tools enabling effects-based analysis and assessment. Modeling and simulation is at the heart of this challenge. The Air Force Research Laboratory (AFRL) EBO Program is developing modeling techniques and corresponding tool capabilities that can be brought to bear against the challenges presented by effects-based analysis and assessment. Effects-based course-of-action development, center of gravity/target system analysis, and wargaming capabilities are being developed and integrated to help give commanders the information and decision support required to achieve desired national security objectives. This paper presents an introduction to effects-based operations, discusses the benefits of an EBO approach, and focuses on modeling and analysis for effects-based strategy development. An overview of modeling and simulation challenges for EBO is presented, setting the stage for the detailed technical papers in the subject session.
Guttman, Nurit
2015-11-01
Communication campaigns are employed as an important tool to promote road safety practices. Researchers maintain that road safety communication campaigns are more effective when their persuasive appeals, which are central to their communicative strategy, are based on explicit theoretical frameworks. This study's main objectives were to develop a detailed categorization of persuasive appeals used in road safety communication campaigns that differentiates between appeals that appear similar but differ conceptually, and to indicate the advantages, limitations and ethical issues associated with each type, drawing on behavior change theories. Materials from over 300 campaigns were obtained from 41 countries, mainly using road safety organizations' websites. Drawing on the literature, five main types of approach were identified, and the analysis yielded more detailed categorizations of appeals within these general categories. The analysis points to advantages, limitations, ethical issues and challenges in using different types of appeals. The discussion summarizes challenges in designing persuasive appeals for road safety communication campaigns. Copyright © 2015 Elsevier Ltd. All rights reserved.
Sensor image prediction techniques
NASA Astrophysics Data System (ADS)
Stenger, A. J.; Stone, W. R.; Berry, L.; Murray, T. J.
1981-02-01
The preparation of prediction imagery is a complex, costly, and time-consuming process. Image prediction systems that produce a detailed replica of the image area require the extensive Defense Mapping Agency data base. The purpose of this study was to analyze the use of image predictions in order to determine whether a reduced set of more compact image features contains enough information to produce acceptable navigator performance. A job analysis of the navigator's mission tasks was performed. It showed that the cognitive and perceptual tasks he performs during navigation are identical to those performed for the targeting mission function. In addition, the results of the analysis of his performance when using a particular sensor can be extended to the analysis of these mission tasks using any sensor. An experimental approach was used to determine the relationship between navigator performance and the type and amount of information in the prediction image. A number of subjects were given image predictions containing varying levels of scene detail and different image features, and then asked to identify the predicted targets in corresponding dynamic flight sequences over scenes of cultural, terrain, and mixed (both cultural and terrain) content.