Science.gov

Sample records for analysis tool incorporating

  1. Immediate tool incorporation processes determine human motor planning with tools.

    PubMed

    Ganesh, G; Yoshioka, T; Osu, R; Ikegami, T

    2014-01-01

    Human dexterity with tools is believed to stem from our ability to incorporate and use tools as parts of our body. However, tool incorporation, evident as extensions in our body representation and peri-personal space, has been observed predominantly after extended tool exposure and does not explain our immediate motor behaviours when we change tools. Here we utilize two novel experiments to elucidate the presence of additional, immediate tool incorporation effects that determine motor planning with tools. Interestingly, tools were observed to immediately induce a trial-by-trial, tool-length-dependent shortening of the perceived limb lengths, opposite to the elongations observed after extended tool use. Our results thus show that tools induce a dual effect on our body representation: an immediate shortening that critically affects motor planning with a new tool, and a slow elongation, probably a consequence of skill-related changes in sensory-motor mappings with the repeated use of the tool. PMID:25077612

  2. Incorporating LCA tools in integrated simulation environments

    SciTech Connect

    Pal, Vineeta; Papamichael, Konstantinos; Bourassa, Norman; Loffeld, John J.

    2001-02-01

    In this paper we address the issue of building data schema evolution in integrated simulation environments, as seen from the perspective of incorporating LCA tools within these environments. First we describe the key features of an integrated simulation environment designed for expandability, focusing on (a) the mechanism for the expansion of the integrated environment, and (b) its overall system architecture that allows processes and data to be added to the system without modifications or restructuring of existing code. We then focus on how the data schema allows the inclusion and maintenance of specialized construction objects bearing LCA data. Finally, we discuss various integration issues that arise from modeling capabilities and idiosyncrasies of individual simulation and analysis tools.

  3. Subgrouping For Patients With Low Back Pain: A Multidimensional Approach Incorporating Cluster Analysis & The STarT Back Screening Tool

    PubMed Central

    Beneciuk, Jason M.; Robinson, Michael E.; George, Steven Z.

    2014-01-01

    Early screening for psychological distress has been suggested to improve patient management for individuals experiencing low back pain. This study compared multidimensional and unidimensional approaches to psychological screening in order to provide preliminary recommendations on which approach may be appropriate for use in clinical settings other than primary care. Specifically, this study investigated the STarT Back Screening Tool's (SBT) 1) discriminant validity, by evaluating its relationship with unidimensional psychological measures, and 2) construct validity, by evaluating how SBT risk categories compared to empirically derived subgroups based on unidimensional psychological and disability measures. Patients (n = 146) receiving physical therapy for LBP were administered the SBT and a battery of unidimensional psychological measures at initial evaluation. Clinical measures consisted of pain intensity and self-reported disability. Several SBT risk-dependent relationships (i.e., SBT low < medium < high risk) were identified for unidimensional psychological measure scores, with depressive symptom scores having the strongest influence on SBT risk categorization. Empirically derived subgroups showed no evidence of distinctive patterns among psychological or disability measures other than high and low profiles; therefore, two groups may provide a clearer representation of the level of pain-associated psychological distress, maladaptive coping, and disability in this setting than the three groups suggested when using the SBT in primary care settings. PMID:25451622
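
    The paper's exact clustering procedure is not reproduced here, but a minimal sketch of the general approach (standardize the questionnaire scores, derive subgroups with k-means, and compare 2- versus 3-cluster solutions) might look like the following; the `scores` array is a hypothetical stand-in for the unidimensional psychological and disability measures:

    ```python
    # Minimal sketch: derive empirical LBP subgroups by cluster analysis and
    # compare 2- vs 3-cluster solutions. Illustrative only -- the paper's
    # exact algorithm and measures are not reproduced here; `scores` is a
    # hypothetical (n_patients x n_measures) array of questionnaire scores.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    rng = np.random.default_rng(0)
    scores = rng.normal(size=(146, 5))          # placeholder questionnaire data

    X = StandardScaler().fit_transform(scores)  # z-score each measure
    for k in (2, 3):                            # "high/low" vs SBT-like 3 groups
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        print(k, "clusters, silhouette =", round(silhouette_score(X, labels), 3))
    ```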

  4. Incorporating Online Tools in Tertiary Education

    ERIC Educational Resources Information Center

    Steenkamp, Leon P.; Rudman, Riaan J.

    2013-01-01

    Students currently studying at tertiary institutions have developed a set of attitudes and aptitudes as a result of growing up in an IT and media-rich environment. These attitudes and aptitudes influence how they learn and in order to be effective, lecturers must adapt to address their learning preferences and use the online teaching tools that…

  5. COASTAL INVERTEBRATES AND FISHES: HOW WILL THEY BE AFFECTED BY CHANGING ENVIRONMENTAL CONDITIONS? INCORPORATING CLIMATE SCENARIOS INTO THE COASTAL BIODIVERSITY RISK ANALYSIS TOOL (CBRAT)

    EPA Science Inventory

    The Coastal Biodiversity Risk Analysis Tool (CBRAT) is a public website that functions as an ecoinformatics platform to synthesize biogeographical distributions, abundances, life history attributes, and environmental tolerances for near-coastal invertebrates and fishes on a broad...

  6. Flight Operations Analysis Tool

    NASA Technical Reports Server (NTRS)

    Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca

    2006-01-01

    Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.
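
    As an illustration of the kind of sparse-tolerant parameter comparison described above (not FLOAT's actual schema or logic), a check might treat missing database values as "unknown" rather than as incompatible:

    ```python
    # Illustrative sketch of a sparse-tolerant compatibility check: parameters
    # missing from the database (None) yield no verdict instead of a failure.
    # All parameter names and values here are hypothetical, not FLOAT's schema.
    def compatible(payload, vehicle):
        checks = []
        for key, ok in [
            ("mass_kg", lambda p, v: p <= v),           # mass within capacity
            ("fairing_diameter_m", lambda p, v: p <= v),
        ]:
            p, v = payload.get(key), vehicle.get(key)
            if p is None or v is None:
                checks.append((key, "unknown"))         # sparse data: no verdict
            else:
                checks.append((key, "ok" if ok(p, v) else "incompatible"))
        return checks

    print(compatible({"mass_kg": 1200, "fairing_diameter_m": None},
                     {"mass_kg": 1500, "fairing_diameter_m": 3.7}))
    ```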

  7. OEXP Analysis Tools Workshop

    NASA Technical Reports Server (NTRS)

    Garrett, L. Bernard; Wright, Robert L.; Badi, Deborah; Findlay, John T.

    1988-01-01

    This publication summarizes the software needs and available analysis tools presented at the OEXP Analysis Tools Workshop held at the NASA Langley Research Center, Hampton, Virginia on June 21 to 22, 1988. The objective of the workshop was to identify available spacecraft system (and subsystem) analysis and engineering design tools, and mission planning and analysis software that could be used for various NASA Office of Exploration (code Z) studies, specifically lunar and Mars missions.

  8. Portfolio Analysis Tool

    NASA Technical Reports Server (NTRS)

    Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug

    2005-01-01

    Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but they do not have access to modify the criteria for analyzing projects: such access is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or rehire experts, further reducing the cost of maintaining and upgrading the software. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.

  9. Demand Response Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2012-03-01

    Demand Response Analysis Tool is software developed at Lawrence Berkeley National Laboratory, initially funded by Southern California Edison. Our goal in developing this tool is to provide an online, usable analysis tool, with standardized methods, to evaluate the demand and demand response performance of commercial and industrial facilities. The tool provides load variability and weather sensitivity analysis capabilities as well as development of various types of baselines. It can be used by researchers, real estate management firms, utilities, or any individuals who are interested in analyzing their demand and demand response capabilities.

  10. Demand Response Analysis Tool

    SciTech Connect

    2012-03-01

    Demand Response Analysis Tool is software developed at Lawrence Berkeley National Laboratory, initially funded by Southern California Edison. Our goal in developing this tool is to provide an online, usable analysis tool, with standardized methods, to evaluate the demand and demand response performance of commercial and industrial facilities. The tool provides load variability and weather sensitivity analysis capabilities as well as development of various types of baselines. It can be used by researchers, real estate management firms, utilities, or any individuals who are interested in analyzing their demand and demand response capabilities.

  11. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data-flow paradigm.

  12. Analysis/Design Tool

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Excelerator II, developed by INTERSOLV, Inc., provides a complete environment for rules-based expert systems. The software incorporates NASA's C Language Integrated Production System (CLIPS), a shell for constructing expert systems. Excelerator II provides complex verification and transformation routines based on matching that is simple and inexpensive. *Excelerator II was sold to SELECT Software Tools in June 1997 and is now called SELECT Excelerator. SELECT has assumed full support and maintenance for the product line.

  13. ATAMM analysis tool

    NASA Astrophysics Data System (ADS)

    Jones, Robert; Stoughton, John; Mielke, Roland

    1991-10-01

    Diagnostics software for analyzing Algorithm to Architecture Mapping Model (ATAMM) based concurrent processing systems is presented. ATAMM is capable of modeling the execution of large grain algorithms on distributed data flow architectures. The tool graphically displays algorithm activities and processor activities for evaluation of the behavior and performance of an ATAMM based system. The tool's measurement capabilities indicate computing speed, throughput, concurrency, resource utilization, and overhead. Evaluations are performed on a simulated system using the software tool. The tool is used to estimate theoretical lower bound performance. Analysis results are shown to be comparable to the predictions.

  14. ATAMM analysis tool

    NASA Technical Reports Server (NTRS)

    Jones, Robert; Stoughton, John; Mielke, Roland

    1991-01-01

    Diagnostics software for analyzing Algorithm to Architecture Mapping Model (ATAMM) based concurrent processing systems is presented. ATAMM is capable of modeling the execution of large grain algorithms on distributed data flow architectures. The tool graphically displays algorithm activities and processor activities for evaluation of the behavior and performance of an ATAMM based system. The tool's measurement capabilities indicate computing speed, throughput, concurrency, resource utilization, and overhead. Evaluations are performed on a simulated system using the software tool. The tool is used to estimate theoretical lower bound performance. Analysis results are shown to be comparable to the predictions.

  15. Incorporating psychological influences in probabilistic cost analysis

    SciTech Connect

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties, and dependencies among cost elements and risks, are other important considerations that are generally not addressed. It should then be no surprise that actual projects often exceed their initial cost estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the…
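
    A minimal sketch of the model's main ingredients (three-parameter Weibull cost elements, a two-parameter correlation structure, and the MAIMS floor at the allocated budget) is shown below. The numbers and the Gaussian-copula sampling shortcut are illustrative assumptions, not the authors' @Risk/Crystal Ball implementation:

    ```python
    # Monte Carlo sketch: correlated three-parameter Weibull cost elements
    # plus the MAIMS principle (no element underruns its allocated budget).
    import numpy as np
    from scipy import stats

    n = 100_000
    # (shape c, scale, location) for each cost element's Weibull distribution
    weibulls = [(2.0, 10.0, 5.0), (1.5, 8.0, 3.0), (2.5, 12.0, 6.0)]
    budgets = [14.0, 10.0, 17.0]              # allocated baseline budgets
    rho_same, rho_diff = 0.6, 0.3             # the two correlation parameters
    subsystem = [0, 0, 1]                     # elements 0 and 1 share a subsystem

    k = len(weibulls)
    corr = np.array([[1.0 if i == j else
                      (rho_same if subsystem[i] == subsystem[j] else rho_diff)
                      for j in range(k)] for i in range(k)])

    # Gaussian copula: correlated normals -> uniforms -> Weibull marginals
    z = np.random.default_rng(1).multivariate_normal(np.zeros(k), corr, size=n)
    u = stats.norm.cdf(z)
    costs = np.column_stack([stats.weibull_min.ppf(u[:, i], c, loc=loc, scale=sc)
                             for i, (c, sc, loc) in enumerate(weibulls)])

    # MAIMS: money allocated is money spent -- underruns are never recovered,
    # so expected total cost sits above the deterministic baseline
    total = np.maximum(costs, budgets).sum(axis=1)
    print("mean overrun above baseline:", (total - sum(budgets)).mean())
    ```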

  16. MIRO: A debugging tool for CLIPS incorporating historical Rete networks

    NASA Technical Reports Server (NTRS)

    Tuttle, Sharon M.; Eick, Christoph F.

    1994-01-01

    At the last CLIPS conference, we discussed our ideas for adding a temporal dimension to the Rete network used to implement CLIPS. The resulting historical Rete network could then be used to store 'historical' information about a run of a CLIPS program, to aid in debugging. MIRO, a debugging tool for CLIPS built on top of CLIPS, incorporates such a historical Rete network and uses it to support its prototype question-answering capability. By enabling CLIPS users to directly ask debugging-related questions about the history of a program run, we hope to reduce the amount of single-stepping and program tracing required to debug a CLIPS program. In this paper, we briefly describe MIRO's architecture and implementation, and the current question-types that MIRO supports. These question-types are further illustrated using an example, and the benefits of the debugging tool are discussed. We also present empirical results that measure the run-time and partial storage overhead of MIRO, and discuss how MIRO may also be used to study various efficiency aspects of CLIPS programs.

  17. State Analysis Database Tool

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Bennett, Matthew

    2006-01-01

    The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.

  18. Physics analysis tools

    SciTech Connect

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level tools, such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, the analysis is broken down into five main stages, classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.

  19. Graphical Contingency Analysis Tool

    SciTech Connect

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis that provides decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  20. Building energy analysis tool

    DOEpatents

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  1. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  2. PCard Data Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2005-04-01

    The Procurement Card data analysis and monitoring tool enables due-diligence review using predefined user-created queries and reports. The system tracks individual compliance emails. More specifically, the tool: - Helps identify exceptions or questionable and non-compliant purchases, - Creates audit random sample on request, - Allows users to create and run new or ad-hoc queries and reports, - Monitors disputed charges, - Creates predefined Emails to Cardholders requesting documentation and/or clarification, - Tracks audit status, notes, Email status (date sent, response), audit resolution.

  3. PCard Data Analysis Tool

    SciTech Connect

    Hilts, Jim

    2005-04-01

    The Procurement Card data analysis and monitoring tool enables due-diligence review using predefined user-created queries and reports. The system tracks individual compliance emails. More specifically, the tool: - Helps identify exceptions or questionable and non-compliant purchases, - Creates audit random sample on request, - Allows users to create and run new or ad-hoc queries and reports, - Monitors disputed charges, - Creates predefined Emails to Cardholders requesting documentation and/or clarification, - Tracks audit status, notes, Email status (date sent, response), audit resolution.

  4. Graphical Contingency Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis that provides decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  5. Transmission Planning Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2015-06-23

    Developed to solve a specific problem: assist transmission planning for regional transfers in interconnected power systems. This work originated in a study for the U.S. Department of State to recommend transmission reinforcements for the Central American regional system that interconnects six countries. Transmission planning analysis is currently performed by engineers with domain-specific and system-specific knowledge, without a unique methodology. The software codes of this disclosure assist engineers by defining systematic analysis procedures to help identify weak points and make decisions on transmission planning of regional interconnected power systems. Transmission Planning Analysis Tool groups PSS/E results of multiple AC contingency analyses, voltage stability analyses, and QV analyses of many study scenarios and arranges them in a systematic way to aid power system planning engineers or transmission operators in the decision-making process or in the off-line study environment.

  6. Transmission Planning Analysis Tool

    SciTech Connect

    2015-06-23

    Developed to solve a specific problem: assist transmission planning for regional transfers in interconnected power systems. This work originated in a study for the U.S. Department of State to recommend transmission reinforcements for the Central American regional system that interconnects six countries. Transmission planning analysis is currently performed by engineers with domain-specific and system-specific knowledge, without a unique methodology. The software codes of this disclosure assist engineers by defining systematic analysis procedures to help identify weak points and make decisions on transmission planning of regional interconnected power systems. Transmission Planning Analysis Tool groups PSS/E results of multiple AC contingency analyses, voltage stability analyses, and QV analyses of many study scenarios and arranges them in a systematic way to aid power system planning engineers or transmission operators in the decision-making process or in the off-line study environment.

  7. Swift Science Analysis Tools

    NASA Astrophysics Data System (ADS)

    Marshall, F. E.; Swift Team

    2003-05-01

    Swift is an autonomous, multiwavelength observatory selected by NASA to study gamma-ray bursts (GRBs) and their afterglows. Its Burst Alert Telescope (BAT) is a large coded mask instrument that will image GRBs in the 15 to 150 keV band. The X-ray Telescope (XRT) focuses X-rays in the 0.2 to 10 keV band onto CCDs, and the co-aligned Ultra-Violet/Optical Telescope (UVOT) has filters and grisms for low-resolution spectroscopy. The Swift team is developing mission-specific tools for processing the telemetry into FITS files and for calibrating and selecting the data for further analysis with such mission-independent tools as XIMAGE and XSPEC. The FTOOLS-based suite of tools will be released to the community before launch with additional updates after launch. Documentation for the tools and standard recipes for their use will be available on the Swift Science Center (SSC) Web site (http://swiftsc.gsfc.nasa.gov), and the SSC will provide user assistance with an e-mail help desk. After the verification phase of the mission, all data will be available to the community as soon as it is processed in the Swift Data Center (SDC). Once all the data for an observation is available, the data will be transferred to the HEASARC and data centers in England and Italy. The data can then be searched and accessed using standard tools such as Browse. Before this transfer the quick-look data will be available on an ftp site at the SDC. The SSC will also provide documentation and simulation tools in support of the Swift Guest Investigator program.

  8. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents three tools developed recently for contamination analysis: (1) HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files; (2) Java RGA extractor: can load in multiple SRS .ana files and extract pressure vs. time data; (3) C++ Contamination Simulation code: a 3D particle-tracing code for modeling transport of dust particulates and molecules. The simulation code uses residence time to determine if molecules stick; particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  9. A System Analysis Tool

    SciTech Connect

    CAMPBELL,PHILIP L.; ESPINOZA,JUAN

    2000-06-01

    In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question for the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was developed based on intra-procedure control and data flow. It has been expanded commercially to inter-procedure flow. However, we extend slicing to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.
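
    A toy forward slice over a hand-written variable dependence graph illustrates the question the abstract poses; real slicing tools derive such edges from control and data flow rather than from a hard-coded dictionary:

    ```python
    # Forward slice: starting from `a`, find every variable that changes when
    # `a` changes, and report one dependence path to each of them.
    from collections import deque

    deps = {            # edge x -> y means "y is computed from x"
        "a": ["b"],
        "b": ["c", "d"],
        "d": ["e"],
    }

    def forward_slice(graph, start):
        paths, queue = {start: [start]}, deque([start])
        while queue:
            x = queue.popleft()
            for y in graph.get(x, []):
                if y not in paths:              # record first path found
                    paths[y] = paths[x] + [y]
                    queue.append(y)
        return paths

    for var, path in forward_slice(deps, "a").items():
        print(var, "via", " -> ".join(path))
    ```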

  10. Stack Trace Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2008-01-16

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-feature debugger like TotalView for root cause analysis.
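
    The merging step can be sketched as follows: per-task stack traces that share a prefix collapse into shared nodes of one tree, and the task sets at diverging nodes delineate the equivalence classes. The data structures here are illustrative, not STAT's actual implementation:

    ```python
    # Merge per-task stack traces into a single call prefix tree; tasks with
    # identical traces share every node, so leaf task-sets give the classes.
    def merge_traces(traces):
        root = {"callees": {}, "tasks": set()}
        for task, frames in traces.items():
            node = root
            for fn in frames:                   # walk outermost -> innermost
                node = node["callees"].setdefault(
                    fn, {"callees": {}, "tasks": set()})
                node["tasks"].add(task)
        return root

    def show(node, name="<root>", depth=0):
        print("  " * depth + f"{name}  tasks={sorted(node['tasks']) or 'all'}")
        for fn, child in node["callees"].items():
            show(child, fn, depth + 1)

    traces = {0: ["main", "mpi_barrier"],
              1: ["main", "mpi_barrier"],
              2: ["main", "compute", "kernel"]}  # task 2 forms its own class
    show(merge_traces(traces))
    ```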

  11. Stack Trace Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2013-02-19

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-feature debugger like TotalView for root cause analysis.

  12. Stack Trace Analysis Tool

    SciTech Connect

    2013-02-19

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-feature debugger like TotalView for root cause analysis.

  13. Stack Trace Analysis Tool

    SciTech Connect

    2008-01-16

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-feature debugger like TotalView for root cause analysis.

  14. Frequency Response Analysis Tool

    SciTech Connect

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.
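
    A heavily simplified sketch of a BAL-003-1-style frequency response computation follows: average pre-event (A) and post-event settled (B) values of frequency and balancing-authority output, then report MW per 0.1 Hz. The averaging windows and synthetic event data are assumptions for illustration, not FRAT's exact implementation:

    ```python
    # Toy under-frequency event: a 50 mHz dip at t = 10 s met by a 30 MW
    # increase in balancing-authority output.
    import numpy as np

    t = np.arange(0.0, 60.0, 0.1)                 # seconds, 10 samples/s
    f = np.where(t < 10, 60.0, 59.95)             # frequency, Hz
    p = np.where(t < 10, 1000.0, 1030.0)          # BA net output, MW

    pre = (t > 2) & (t < 8)                       # assumed "value A" window
    post = (t > 30) & (t < 52)                    # assumed "value B" window
    dF = f[post].mean() - f[pre].mean()
    dP = p[post].mean() - p[pre].mean()
    beta = -dP / dF * 0.1                         # scale Hz -> 0.1 Hz
    print(f"frequency response ≈ {beta:.0f} MW/0.1 Hz")   # -> 60
    ```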

  15. Neutron multiplicity analysis tool

    SciTech Connect

    Stewart, Scott L

    2010-01-01

    I describe the capabilities of the EXCOM (EXcel-based COincidence and Multiplicity) calculation tool, which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample, and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha, and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This…

  16. Geodetic Strain Analysis Tool

    NASA Technical Reports Server (NTRS)

    Kedar, Sharon; Baxter, Sean C.; Parker, Jay W.; Webb, Frank H.; Owen, Susan E.; Sibthorpe, Anthony J.; Dong, Danan

    2011-01-01

    A geodetic software analysis tool enables the user to analyze 2D crustal strain from geodetic ground motion, and to create models of crustal deformation using a graphical interface. Users can use any geodetic measurements of ground motion and derive the 2D crustal strain interactively. This software also provides a forward-modeling tool that calculates a geodetic velocity and strain field for a given fault model, and lets the user compare the modeled strain field with the strain field obtained from the user's data. Users may change parameters on-the-fly and obtain a real-time recalculation of the resulting strain field. Four data products are computed: maximum shear, dilatation, shear angle, and principal components. The current view and data dependencies are processed first. The remaining data products and views are then computed in a round-robin fashion to anticipate view changes. When an analysis or display parameter is changed, the affected data products and views are invalidated and progressively re-displayed as available. This software is designed to facilitate the derivation of strain fields from the GPS and strain meter data that sample them, to aid understanding of the strengths and weaknesses of strain field derivation from continuous GPS (CGPS) and other geodetic data in a variety of tectonic settings, to converge on the "best practices" strain derivation strategy for the Solid Earth Science ESDR System (SESES) project given the CGPS station distribution in the western U.S., and to provide SESES users with a scientific and educational tool to explore the strain field on their own with user-defined parameters.
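
    For reference, the 2D products named above follow directly from the horizontal velocity gradients; a worked sketch on a synthetic simple-shear field (illustrative, not the tool's code):

    ```python
    # 2D infinitesimal strain rate from a gridded velocity field:
    # dilatation = exx + eyy, max shear = sqrt(((exx-eyy)/2)^2 + exy^2).
    import numpy as np

    x = y = np.linspace(0.0, 100.0, 101)          # km grid
    X, Y = np.meshgrid(x, y)
    vx, vy = 1e-3 * Y, np.zeros_like(X)           # toy simple-shear field

    dvx_dy, dvx_dx = np.gradient(vx, y, x)        # axis 0 is y, axis 1 is x
    dvy_dy, dvy_dx = np.gradient(vy, y, x)

    exx, eyy = dvx_dx, dvy_dy
    exy = 0.5 * (dvx_dy + dvy_dx)

    dilatation = exx + eyy                        # areal strain rate
    max_shear = np.sqrt(((exx - eyy) / 2) ** 2 + exy ** 2)
    print("mean dilatation:", dilatation.mean(),
          "mean max shear:", max_shear.mean())
    ```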

  17. Draper Station Analysis Tool

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  18. Climate Data Analysis Tools

    Energy Science and Technology Software Center (ESTSC)

    2009-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages, thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general-purpose and full-featured scripting language with a variety of user interfaces, including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management Systems or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS).

  19. Climate Data Analysis Tools

    SciTech Connect

    2009-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages, thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general-purpose and full-featured scripting language with a variety of user interfaces, including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management Systems or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS).

  20. Incorporating Stream Features into Groundwater Contouring Tools Within GIS.

    PubMed

    Bannister, Roger; Kennelly, Patrick

    2016-03-01

    Hydrogeologists often are called upon to estimate surfaces from discrete, sparse data points. This estimation is often accomplished by manually drawing contours on maps, using interpolation methods between points of known value while accounting for features known to influence the water table's surface. By contrast, geographic information systems (GIS) are good at creating smooth continuous surfaces from limited data points and representing the resulting surface with contours, but these automated methods often fail to meet the expectations of many hydrogeologists because they do not include knowledge of other influences on the water table. In this study, we seek to fill this gap in the GIS-based methodology for hydrogeologists through an interactive tool that shapes an interpolated surface based on additional knowledge of the water table inferred from gaining or losing streams. The modified surface is reflected in water table contours that, for example, "V" upstream for gaining streams, and can be interactively adjusted to fit the user's expectations. By modifying not only the contours but also the associated interpolated surface, additional contours will follow the same trend, and the modified surface can be used for other analyses like calculating average gradients and flow paths. The tool leverages Esri's ArcGIS Desktop software, building upon a robust suite of mapping tools. We see this as a prototype for other tools that could be developed for hydrogeologists to account for variations in the water table inferred from local topographic trends, pumping or injection wells, and other hydrogeologic features. PMID:25810357

  1. Incorporating Experience Curves in Appliance Standards Analysis

    SciTech Connect

    Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery; Kantner, Colleen; Lekov, Alex; Meyers, Stephen; Rosenquist, Gregory; Buskirk, Robert Van; Yang, Hung-Chia; Desroches, Louis-Benoit

    2011-10-31

    The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as experience and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
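
    The underlying experience-curve relation is the standard one: price falls by a fixed fraction with each doubling of cumulative production, P(Q) = P0 (Q/Q0)^(-b) with b = -log2(1 - LR). A small sketch follows; the 20% learning rate is an illustrative value, not a number from the appliance analyses:

    ```python
    # Standard experience curve: a learning rate LR means price drops by LR
    # with each doubling of cumulative production Q.
    from math import log2

    def experience_price(P0, Q0, Q, learning_rate=0.2):
        b = -log2(1.0 - learning_rate)      # LR = 0.2  ->  b ~ 0.322
        return P0 * (Q / Q0) ** (-b)

    # price after cumulative shipments quadruple (two doublings) at LR = 20%
    print(experience_price(P0=100.0, Q0=1e6, Q=4e6))   # ~ 100 * 0.8**2 = 64.0
    ```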

  2. Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animation without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. These are globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high-temporal-resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of…

  3. Java Radar Analysis Tool

    NASA Technical Reports Server (NTRS)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  4. A new weather warning tool that incorporates census data

    NASA Astrophysics Data System (ADS)

    Collins, J.; Paxton, C. H.; Hansen, T. L.; Simms, J. L.; Hirvela, K.; Saurabh Kotiyal, S.

    2011-12-01

    One goal of the NWS Integrated Hazard Information Services (IHIS) is to transition the current hazard program from a paradigm of issuing products to one of providing information for decision support. This IHIS project is intended to significantly improve the decision making process of emergency managers and other public officials who manage resources during periods of deadly weather, and also to better inform the media and public. The IHIS system has two components: internal and external. Forecasters use the internal graphical component to issue watches, warnings, advisories and statements for hazardous weather by zone, county, or polygon. The IHIS system converts all specified areas into polygons and parses the appropriate information from census block data, displaying the results in tabular and graphical formats for use in the warning decision process. As one example, infrastructure such as hospitals and fire stations are available as overlays. The external component of the IHIS system is a Web site accessible to emergency managers, the media and the public that shows the polygons with accompanying tables and graphs of census and infrastructure information for the specified areas. Interviews of emergency managers and the media provided positive feedback, illustrating the functionality of the data obtained through the graphical tool on the NWS Web site. This presentation will provide an overview of the interface, the method for census data extraction, user feedback and direction for the future.
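
    The census-extraction step amounts to intersecting the warning polygon with census block geometries and aggregating their attributes. A minimal sketch using the shapely package, with made-up block polygons and populations standing in for real census block data:

    ```python
    # Sum the population of census blocks touched by a warning polygon.
    # A real tool might apportion population by the area of overlap instead.
    from shapely.geometry import Polygon

    warning = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])    # warning polygon

    blocks = [  # (census block polygon, population) -- made-up stand-ins
        (Polygon([(1, 1), (2, 1), (2, 2), (1, 2)]), 350),
        (Polygon([(5, 5), (6, 5), (6, 6), (5, 6)]), 900),  # outside warning
    ]

    affected = sum(pop for geom, pop in blocks if warning.intersects(geom))
    print("population in warned area:", affected)          # -> 350
    ```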

  5. FSSC Science Tools: Pulsar Analysis

    NASA Technical Reports Server (NTRS)

    Thompson, Dave

    2010-01-01

    This slide presentation reviews typical pulsar analysis, giving tips on screening the data, the use of time-series analysis, and utility tools. Specific information about analyzing Vela data is reviewed.

  6. Failure Environment Analysis Tool (FEAT)

    NASA Technical Reports Server (NTRS)

    Lawler, D. G.

    1991-01-01

    Information is given in viewgraph form on the Failure Environment Analysis Tool (FEAT), a tool designed to demonstrate advanced modeling and analysis techniques to better understand and capture the flow of failures within and between elements of the Space Station Freedom (SSF) and other large complex systems. Topics covered include objectives, development background, the technical approach, SSF baseline integration, and FEAT growth and evolution.

  7. Incorporating scale into digital terrain analysis

    NASA Astrophysics Data System (ADS)

    Dragut, L. D.; Eisank, C.; Strasser, T.

    2009-04-01

    Digital Elevation Models (DEMs) and their derived terrain attributes are commonly used in soil-landscape modeling. Process-based terrain attributes meaningful to the soil properties of interest are sought to be produced through digital terrain analysis. Typically, the standard 3 X 3 window-based algorithms are used for this purpose, thus tying the scale of resulting layers to the spatial resolution of the available DEM. But this is likely to induce mismatches between scale domains of terrain information and soil properties of interest, which further propagate biases in soil-landscape modeling. We have started developing a procedure to incorporate scale into digital terrain analysis for terrain-based environmental modeling (Drăguţ et al., in press). The workflow was exemplified on crop yield data. Terrain information was generalized into successive scale levels with focal statistics on increasing neighborhood sizes. The degree of association between each terrain derivative and crop yield was established iteratively for all scale levels through correlation analysis. The first peak of correlation indicated the scale level to be retained. While in a standard 3 X 3 window-based analysis mean curvature was one of the poorest correlated terrain attributes, after generalization it turned into the best correlated variable. To illustrate the importance of scale, we compared the regression results of unfiltered and filtered mean curvature vs. crop yield. The comparison shows an improvement of R squared from a value of 0.01 when the curvature was not filtered, to 0.16 when the curvature was filtered within a 55 X 55 m neighborhood. This indicates the optimum size of curvature information (scale) that influences soil fertility. We further used these results in an object-based image analysis environment to create terrain objects containing aggregated values of both terrain derivatives and crop yield. Hence, we introduce terrain segmentation as an alternative…
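
    The scale-selection loop described above can be sketched as follows: smooth the terrain derivative with increasing focal-window sizes, correlate each level with the yield raster, and stop at the first correlation peak. The synthetic arrays are stand-ins for real DEM derivatives and crop yield data:

    ```python
    # Focal-statistics generalization with scale selection at the first
    # correlation peak between a terrain derivative and crop yield.
    import numpy as np
    from scipy.ndimage import uniform_filter

    rng = np.random.default_rng(2)
    curvature = rng.normal(size=(200, 200))       # stand-in mean curvature
    yield_map = (uniform_filter(curvature, size=19)
                 + 0.1 * rng.normal(size=(200, 200)))

    best = None
    for size in range(3, 41, 2):                  # 3x3, 5x5, ... windows
        smoothed = uniform_filter(curvature, size=size)
        r = np.corrcoef(smoothed.ravel(), yield_map.ravel())[0, 1]
        if best and r < best[1]:                  # past the first peak: stop
            break
        best = (size, r)
    print("optimal window:", best[0], "cells, r =", round(best[1], 2))
    ```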

  8. Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)

    SciTech Connect

    Melaina, M.; Penev, M.

    2012-09-01

    NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.

  9. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

  10. Logistics Process Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2008-03-31

    LPAT is an integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine, an application written in the ANL Repast Simphony framework, served as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  11. The rules of tool incorporation: Tool morpho-functional & sensori-motor constraints.

    PubMed

    Cardinali, L; Brozzoli, C; Finos, L; Roy, A C; Farnè, A

    2016-04-01

    Previous studies showed that using tools modifies the agent's body and space representation. However, it is still not clear which rules govern those remapping processes. Here, we studied the differential role played by the morpho-functional characteristics of a tool and the sensori-motor constraints that a tool imposes on the hand. To do so, we asked a group of participants to reach and grasp an object using, in different conditions, two different tools: Pliers, to be acted upon by the index and thumb fingertips, and Sticks, taped to the same two digits. The two tools were equivalent in terms of morpho-functional characteristics, providing index finger and thumb with the same amount of elongation. Crucially, however, they imposed different sensori-motor constraints on the acting fingers. We measured and compared the kinematic profile of free-hand movements performed before and after the use of both devices. As predicted on the basis of their equivalent morpho-functional characteristics, both tools induced similar changes in the fingers (but not the arm) kinematics compatible with the hand being represented as bigger. Furthermore, the different sensori-motor constraints imposed by Pliers and Sticks over the hand, induced differential updates of the hand representation. In particular, the Sticks selectively affected the kinematics of the two fingers they were taped on, whereas Pliers had a more global effect, affecting the kinematics of hand movements not performed during the use of the tool. These results suggest that tool-use induces a rapid update of the hand representation in the brain, not only on the basis of the morpho-functional characteristics of the tool, but also depending on the specific sensori-motor constraints imposed by the tool. PMID:26774102

  12. RSAT: regulatory sequence analysis tools.

    PubMed

    Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques

    2008-07-01

    The regulatory sequence analysis tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundreds of researchers from all over the world. Several predictions made with RSAT were validated experimentally and published. PMID:18495751

  13. Atlas Distributed Analysis Tools

    NASA Astrophysics Data System (ADS)

    de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich

    2008-06-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential for performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, only a few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's distributed analysis efforts into a single project, GANGA. The goal is to test the reconstruction and analysis software in large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

  14. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on ash behavior in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interactions of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the SEM image. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is
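
    The Newton-Raphson iteration mentioned above is simple to sketch. The residual below is a placeholder enthalpy balance, not the EERC spreadsheet's actual equations.

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Solve f(x) = 0 by Newton-Raphson given the derivative df."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("did not converge")

# example: solve for T in a flame-temperature-style residual (placeholder)
f = lambda T: 1.1e3 * (T - 298.0) - 2.0e6   # enthalpy balance stand-in
df = lambda T: 1.1e3
print(newton_raphson(f, df, x0=300.0))      # ~2116 K
```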

  15. Tiling Microarray Analysis Tools

    SciTech Connect

    Nix, Davis Austin

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: (1) rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment); (2) post-processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation); (3) significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and interval refinement (filtering based on multiple statistics, overlap comparisons); and (4) data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and data reports (spreadsheet summaries and detailed profiles).
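
    Of the post-processing steps listed above, quantile normalization is the most self-contained to illustrate. A minimal sketch in Python (TiMAT itself is Java; this shows only the technique, with crude tie handling):

```python
import numpy as np

def quantile_normalize(arrays):
    """Force each array to share the same intensity distribution by mapping
    each value to the mean of the values at its rank across arrays."""
    data = np.asarray(arrays, dtype=float).T      # rows: probes, cols: arrays
    order = np.argsort(data, axis=0)
    ranks = np.argsort(order, axis=0)             # rank of each value per array
    mean_sorted = np.sort(data, axis=0).mean(axis=1)
    return mean_sorted[ranks].T

raw = [[5.0, 2.0, 3.0, 4.0], [4.0, 1.0, 4.0, 2.0], [3.0, 4.0, 6.0, 8.0]]
print(quantile_normalize(raw))
```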

  16. Tiling Microarray Analysis Tools

    Energy Science and Technology Software Center (ESTSC)

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: (1) rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment); (2) post-processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation); (3) significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and interval refinement (filtering based on multiple statistics, overlap comparisons); and (4) data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and data reports (spreadsheet summaries and detailed profiles).

  17. Sandia PUF Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2014-06-11

    This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
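
    The noise and inter-chip metrics described above reduce to fractional Hamming distances between bit-vector signatures. A small illustrative sketch, using simulated signatures rather than the tool's measurement pipeline:

```python
import numpy as np

def hamming_frac(a, b):
    """Fraction of differing bits between two equal-length bit arrays."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return np.mean(a != b)

rng = np.random.default_rng(1)
chip_a = rng.integers(0, 2, 256)                        # signature of chip A
chip_b = rng.integers(0, 2, 256)                        # signature of chip B
noisy_a = np.where(rng.random(256) < 0.05, 1 - chip_a, chip_a)  # 5% bit noise

print("inter-chip:", hamming_frac(chip_a, chip_b))   # ~0.5 for distinct PUFs
print("noise:     ", hamming_frac(chip_a, noisy_a))  # ~0.05 across re-reads
```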

  18. Sandia PUF Analysis Tool

    SciTech Connect

    2014-06-11

    This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.

  19. VCAT: Visual Crosswalk Analysis Tool

    SciTech Connect

    Cleland, Timothy J.; Forslund, David W.; Cleland, Catherine A.

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  20. Heliostat cost-analysis tool

    NASA Astrophysics Data System (ADS)

    Brandt, L. D.; Chang, R. E.

    1981-10-01

    A heliostat cost analysis tool (HELCAT) that processes manufacturing, transportation, and installation cost data was developed to provide a consistent structure for cost analyses. HELCAT calculates a representative product price based on direct input data and various economic, financial, and accounting assumptions. The characteristics of this tool and its initial application in the evaluation of second generation heliostat cost estimates are discussed. A set of nominal economic and financial parameters is also suggested.
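
    A hedged sketch of the kind of price roll-up such a tool performs is shown below; the overhead and margin treatment and all numbers are illustrative assumptions, not HELCAT's actual algorithm.

```python
def representative_price(manufacturing, transportation, installation,
                         overhead_rate=0.15, profit_margin=0.10):
    """Roll direct cost inputs up to a representative unit price using
    simple overhead and margin assumptions (illustrative only)."""
    direct = manufacturing + transportation + installation
    with_overhead = direct * (1.0 + overhead_rate)
    return with_overhead * (1.0 + profit_margin)

# example in $/m^2 of reflective area (hypothetical figures)
print(representative_price(manufacturing=95.0, transportation=8.0,
                           installation=22.0))  # ~158.1
```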

  1. The Challenge of Multiple Perspectives: Multiple Solution Tasks for Students Incorporating Diverse Tools and Representation Systems

    ERIC Educational Resources Information Center

    Kordaki, Maria

    2015-01-01

    This study focuses on the role of multiple solution tasks (MST) incorporating multiple learning tools and representation systems (MTRS) in encouraging each student to develop multiple perspectives on the learning concepts under study and creativity of thought. Specifically, two types of MST were used, namely tasks that allowed and demanded…

  2. Getting Students Excited about Learning: Incorporating Digital Tools to Support the Writing Process

    ERIC Educational Resources Information Center

    Saulsburry, Rachel; Kilpatrick, Jennifer Renée; Wolbers, Kimberly A.; Dostal, Hannah

    2015-01-01

    Technology--in the form of digital tools incorporated into writing instruction--can help teachers motivate and engage young children, and it may be especially critical for students who do everything they can to avoid writing. Technology may bolster student involvement, foster the engagement of reluctant or struggling writers, and support writing…

  3. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. Five evaluations are included: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

  4. Enhancement of Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users, including the energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to connect human and computer perceptions on the application of data and scientific techniques, while multiprocessing simultaneous users' tasks. Future development includes expanding the types of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR) and NOAA model output data, including output from the National Multi-Model Ensemble Prediction System (NMME) and longer-term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).
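
    One of the LCAT analyses named above, a least-squares trend on a station temperature series, can be sketched in a few lines; the data here are synthetic and the interface is not LCAT's.

```python
import numpy as np

# synthetic 30-year station series with a 0.02 deg/yr warming trend
years = np.arange(1981, 2011)
temps = (12.0 + 0.02 * (years - 1981)
         + np.random.default_rng(2).normal(0, 0.3, years.size))

slope, intercept = np.polyfit(years, temps, 1)  # least-squares linear trend
print(f"trend: {slope * 10:.2f} deg per decade")
```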

  5. Incorporating Non-Coding Annotations into Rare Variant Analysis

    PubMed Central

    Richardson, Tom G.; Campbell, Colin; Timpson, Nicholas J; Gaunt, Tom R.

    2016-01-01

    Background The success of collapsing methods which investigate the combined effect of rare variants on complex traits has so far been limited. The manner in which variants within a gene are selected prior to analysis has a crucial impact on this success, which has resulted in analyses conventionally filtering variants according to their consequence. This study investigates whether an alternative approach to filtering, using annotations from recently developed bioinformatics tools, can aid these types of analyses in comparison to conventional approaches. Methods & Results We conducted a candidate gene analysis using the UK10K sequence and lipids data, filtering according to functional annotations using the resource CADD (Combined Annotation-Dependent Depletion) and contrasting results with 'nonsynonymous' and 'loss of function' consequence analyses. Using CADD allowed the inclusion of potentially deleterious intronic variants, which was not possible when filtering by consequence. Overall, different filtering approaches provided similar evidence of association, although filtering according to CADD identified evidence of association between ANGPTL4 and High Density Lipoproteins (P = 0.02, N = 3,210) which was not observed in the other analyses. We also undertook genome-wide analyses to determine how filtering in this manner compared to conventional approaches for gene regions. Results suggested that filtering by annotations according to CADD, as well as other tools known as FATHMM-MKL and DANN, identified association signals not detected when filtering by variant consequence and vice versa. Conclusion Incorporating variant annotations from non-coding bioinformatics tools should prove to be a valuable asset for rare variant analyses in the future. Filtering by variant consequence is only possible in coding regions of the genome, whereas utilising non-coding bioinformatics annotations provides an opportunity to discover unknown causal variants in non-coding regions of the genome.
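
    The annotation-based filtering contrasted with consequence filtering above amounts to a simple selection step before the collapsing test. A schematic sketch, with illustrative thresholds and field names:

```python
def select_variants(variants, cadd_threshold=15.0, maf_threshold=0.01):
    """Keep rare variants scored as potentially deleterious by CADD,
    including intronic ones that consequence filters would drop."""
    return [v for v in variants
            if v["maf"] < maf_threshold and v["cadd_phred"] >= cadd_threshold]

variants = [
    {"id": "v1", "maf": 0.002, "cadd_phred": 22.1, "consequence": "intronic"},
    {"id": "v2", "maf": 0.004, "cadd_phred": 3.5,  "consequence": "missense"},
    {"id": "v3", "maf": 0.150, "cadd_phred": 25.0, "consequence": "missense"},
]
print([v["id"] for v in select_variants(variants)])  # ['v1']
```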

  6. Deuterium incorporation in biomass cell wall components by NMR analysis

    SciTech Connect

    Foston, Marcus B; McGaughey, Joseph; O'Neill, Hugh Michael; Evans, Barbara R; Ragauskas, Arthur J

    2012-01-01

    A commercially available deuterated kale sample was analyzed for deuterium incorporation by ionic liquid solution 2H and 1H nuclear magnetic resonance (NMR). This protocol was found to effectively measure the percent deuterium incorporation at 33%, comparable to the 31% value determined by combustion. The solution NMR technique also suggested by a qualitative analysis that deuterium is preferentially incorporated into the carbohydrate components of the kale sample.

  7. Incorporating Quality Scores in Meta-Analysis

    ERIC Educational Resources Information Center

    Ahn, Soyeon; Becker, Betsy Jane

    2011-01-01

    This paper examines the impact of quality-score weights in meta-analysis. A simulation examines the roles of study characteristics such as population effect size (ES) and its variance on the bias and mean square errors (MSEs) of the estimators for several patterns of relationship between quality and ES, and for specific patterns of systematic…

  8. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract was performed by Boeing under NASA purchase order H30249D. The contract called for a "best effort" study comprising two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft Excel spreadsheet, that was originally developed at Stanford University and later further developed by the Air Force, and (2) write a program that functions as a NASTRAN pre-processor to generate an FEM model for grid-stiffened structures. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM and diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  9. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluation. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  10. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1994-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluation. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  11. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.

  12. Incorporating spatial dependence in regional frequency analysis

    PubMed Central

    Wang, Zhuo; Yan, Jun; Zhang, Xuebin

    2014-01-01

    The efficiency of regional frequency analysis (RFA) is undermined by intersite dependence, which is usually ignored in parameter estimation. We propose a spatial index flood model where marginal generalized extreme value distributions are joined by an extreme-value copula characterized by a max-stable process for the spatial dependence. The parameters are estimated with a pairwise likelihood constructed from bivariate marginal generalized extreme value distributions. The estimators of model parameters and return levels can be more efficient than those from the traditional index flood model when the max-stable process fits the intersite dependence well. Through simulation, we compared the pairwise likelihood method with an L-moment method and an independence likelihood method under various spatial dependence models and dependence levels. The pairwise likelihood method was found to be the most efficient in mean squared error if the dependence model was correctly specified. When the dependence model was misspecified within the max-stable models, the pairwise likelihood method was still competitive relative to the other two methods. When the dependence model was not a max-stable model, the pairwise likelihood method led to serious bias in estimating the shape parameter and return levels, especially when the dependence was strong. In an illustration with annual maximum precipitation data from Switzerland, the pairwise likelihood method yielded remarkable reduction in the standard errors of return level estimates in comparison to the L-moment method. PMID:25745273
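
    The pairwise-likelihood objective described above can be written compactly. A sketch under assumed notation (T annual maxima per site, f the bivariate density implied by the GEV margins and the max-stable dependence model; not copied from the paper):

```latex
% Pairwise log-likelihood over site pairs i < j and years t = 1..T:
\ell_{P}(\theta) \;=\; \sum_{i<j} \sum_{t=1}^{T} \log f\!\left(y_{i,t},\, y_{j,t};\, \theta\right)
```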

  13. Dynamic Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter, including the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models, and identify what types of measurements the next generation of instruments will need to collect.

  14. Shot Planning and Analysis Tools

    SciTech Connect

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near- and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project comprises two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  15. Flow Analysis Tool White Paper

    NASA Technical Reports Server (NTRS)

    Boscia, Nichole K.

    2012-01-01

    Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.

  16. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project.

  17. Incorporating parametric uncertainty into population viability analysis models

    USGS Publications Warehouse

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
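
    The two-loop structure described above, parametric uncertainty drawn once per replicate and temporal variance drawn each time step, is easy to sketch. The demographic details below are placeholders, not the piping plover model:

```python
import numpy as np

rng = np.random.default_rng(3)
n_reps, n_years, n0, quasi_ext = 1000, 50, 200, 20

extinct = 0
for _ in range(n_reps):
    growth_mean = rng.normal(0.98, 0.03)     # parametric uncertainty: one draw
    n = n0                                   # per replicate (outer loop)
    for _ in range(n_years):
        lam = rng.normal(growth_mean, 0.10)  # temporal variance: one draw per
        n = max(n * lam, 0.0)                # year (inner loop)
    if n < quasi_ext:
        extinct += 1

print("quasi-extinction probability:", extinct / n_reps)
```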

  18. Incorporating Basic Optical Microscopy in the Instrumental Analysis Laboratory

    ERIC Educational Resources Information Center

    Flowers, Paul A.

    2011-01-01

    A simple and versatile approach to incorporating basic optical microscopy in the undergraduate instrumental analysis laboratory is described. Attaching a miniature CCD spectrometer to the video port of a standard compound microscope yields a visible microspectrophotometer suitable for student investigations of fundamental spectrometry concepts,…

  19. Climate Data Analysis Tools - (CDAT)

    NASA Astrophysics Data System (ADS)

    Doutriaux, C.; Jennifer, A.; Drach, R.; Dubois, P.; Williams, D.

    2003-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS). One of the most difficult challenges facing climate researchers today is the cataloging and analysis of massive amounts of multi-dimensional global atmospheric and oceanic model data. To reduce the labor intensive and time-consuming process of data management, retrieval, and analysis, PCMDI and other DOE sites have come together to develop intelligent filing system and data management software for the linking of storage devices located throughout the United States and the international climate research community. This effort, headed by PCMDI, NCAR, and ANL will allow users anywhere to remotely access this distributed multi-petabyte archive and perform analysis. PCMDI's CDAT is an innovative system that supports exploration and visualization of climate scientific datasets. As an "open system", the software sub-systems (i.e., modules) are independent and freely available to the global climate community. CDAT is easily extended to include new modules and as a result of its flexibility, PCMDI has integrated other popular software components, such as: the popular Live Access Server (LAS) and the Distributed Oceanographic Data System (DODS). Together with ANL's Globus middleware

  20. System analysis: Developing tools for the future

    SciTech Connect

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing executed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continued development of the tools and process.

  1. The Watershed Deposition Tool: a tool for incorporating atmospheric deposition in water-quality analyses.

    SciTech Connect

    Schwede, D. B.; Dennis, R. L.; Bitz, M. A.; Decision and Information Sciences; NOAA; EPA

    2009-08-01

    A tool for providing the linkage between air and water-quality modeling needed for determining the Total Maximum Daily Load (TMDL) and for analyzing related nonpoint-source impacts on watersheds has been developed. Using gridded output of atmospheric deposition from the Community Multiscale Air Quality (CMAQ) model, the Watershed Deposition Tool (WDT) calculates average per unit area and total deposition to selected watersheds and subwatersheds. CMAQ estimates the wet and dry deposition for all of its gaseous and particulate chemical species, including ozone, sulfur species, nitrogen species, secondary organic aerosols, and hazardous air pollutants at grid scale sizes ranging from 4 to 36 km. An overview of the CMAQ model is provided. The somewhat specialized format of the CMAQ files is not easily imported into standard spatial analysis tools. The WDT provides a graphical user interface that allows users to visualize CMAQ gridded data and perform further analyses on selected watersheds or simply convert CMAQ gridded data to a shapefile for use in other programs. Shapefiles for the 8-digit (cataloging unit) hydrologic unit code polygons for the United States are provided with the WDT; however, other user-supplied closed polygons may be used. An example application of the WDT for assessing the contributions of different source categories to deposition estimates, the contributions of wet and dry deposition to total deposition, and the potential reductions in total nitrogen deposition to the Albemarle-Pamlico basin stemming from future air emissions reductions is used to illustrate the WDT capabilities.

  2. Analysis Tools for CFD Multigrid Solvers

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

    2004-01-01

    Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.

  3. ISHM Decision Analysis Tool: Operations Concept

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew with relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. I describe both of these aspects.

  4. General Mission Analysis Tool (GMAT) Mathematical Specifications

    NASA Technical Reports Server (NTRS)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  5. Multi-mission telecom analysis tool

    NASA Technical Reports Server (NTRS)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy-to-use, easy-to-integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies, not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  6. Incorporating spatial context into the analysis of salmonid habitat relations

    USGS Publications Warehouse

    Torgersen, Christian E.; Baxter, Colden V.; Ebersole, J.L.; Gresswell, Bob

    2012-01-01

    In this response to the chapter by Lapointe (this volume), we discuss the question of why it is so difficult to predict salmonid-habitat relations in gravel-bed rivers and streams. We acknowledge that this cannot be an exhaustive treatment of the subject and, thus, identify what we believe are several key issues that demonstrate the necessity of incorporating spatial context into the analysis of fish-habitat data. Our emphasis is on spatial context (i.e., scale and location), but it is important to note that the same principles may be applied with some modification to temporal context, which is beyond the scope of this chapter.

  7. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  8. Incorporating ICT Tools in an Active Engagement Strategy-Based Classroom to Promote Learning Awareness and Self-Monitoring

    ERIC Educational Resources Information Center

    Kean, Ang Chooi; Embi, Mohamed Amin; Yunus, Melor Md

    2012-01-01

    The paper examines the influence of incorporating information and communication technology (ICT) tools to help learners to promote learning awareness and self-monitoring skills. An open-ended online questionnaire survey was administered to 15 course participants at the conclusion of the course. The data were analysed on the basis of the percentage…

  9. Incorporating finite element analysis into component life and reliability

    NASA Technical Reports Server (NTRS)

    August, Richard; Zaretsky, Erwin V.

    1991-01-01

    A method for calculating a component's design survivability by incorporating finite element analysis and probabilistic material properties was developed. The method evaluates design parameters through direct comparisons of component survivability expressed in terms of Weibull parameters. The analysis was applied to a rotating disk with mounting bolt holes. The highest probability of failure occurred at, or near, the maximum shear stress region of the bolt holes. Distribution of failure as a function of Weibull slope affects the probability of survival. Where Weibull parameters are unknown for a rotating disk, it may be permissible to assume Weibull parameters, as well as the stress-life exponent, in order to determine the disk speed where the probability of survival is highest.
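
    The survivability calculus referenced above rests on the two-parameter Weibull form. A sketch under assumed notation (Weibull slope m, characteristic life N_beta for element i, and independence across finite elements):

```latex
% Weibull survival of element i after N load cycles, and system survival
% as the product over elements -- notation assumed, not from the paper:
S_i(N) = \exp\!\left[-\left(\frac{N}{N_{\beta,i}}\right)^{m}\right],
\qquad
S_{\mathrm{sys}}(N) = \prod_{i=1}^{n} S_i(N)
```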

  10. Model Analysis ToolKit

    Energy Science and Technology Software Center (ESTSC)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (Python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using internal Levenberg-Marquardt algorithm
    - model calibration using lmfit package
    - model calibration using levmar package
    - Markov Chain Monte Carlo using pymc package
    MATK facilitates model analysis using:
    - scipy: calibration (scipy.optimize)
    - rpy2: Python interface to R

  11. Model Analysis ToolKit

    SciTech Connect

    Harp, Dylan R.

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (Python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using internal Levenberg-Marquardt algorithm
    - model calibration using lmfit package
    - model calibration using levmar package
    - Markov Chain Monte Carlo using pymc package
    MATK facilitates model analysis using:
    - scipy: calibration (scipy.optimize)
    - rpy2: Python interface to R

  12. 2010 Solar Market Transformation Analysis and Tools

    SciTech Connect

    none,

    2010-04-01

    This document describes the DOE-funded solar market transformation analysis and tools under development in Fiscal Year 2010 so that stakeholders can access available resources and get engaged where interested.

  13. Linkage analysis of anorexia nervosa incorporating behavioral covariates.

    PubMed

    Devlin, Bernie; Bacanu, Silviu-Alin; Klump, Kelly L; Bulik, Cynthia M; Fichter, Manfred M; Halmi, Katherine A; Kaplan, Allan S; Strober, Michael; Treasure, Janet; Woodside, D Blake; Berrettini, Wade H; Kaye, Walter H

    2002-03-15

    Eating disorders, such as anorexia nervosa (AN) and bulimia nervosa (BN), have genetic and environmental underpinnings. To explore genetic contributions to AN, we measured psychiatric, personality and temperament phenotypes of individuals diagnosed with eating disorders from 196 multiplex families, all accessed through an AN proband, as well as genotyping a battery of 387 short tandem repeat (STR) markers distributed across the genome. On these data we performed a multipoint affected sibling pair (ASP) linkage analysis using a novel method that incorporates covariates. By exploring seven attributes thought to typify individuals with eating disorders, we identified two variables, drive-for-thinness and obsessionality, which delimit populations among the ASPs. For both of these traits, or covariates, there were a cluster of ASPs who have high and concordant values for these traits, in keeping with our expectations for individuals with AN, and other clusters of ASPs who did not meet those expectations. When we incorporated these covariates into the ASP linkage analysis, both jointly and separately, we found several regions of suggestive linkage: one close to genome-wide significance on chromosome 1 (at 210 cM, D1S1660; LOD = 3.46, P = 0.00003), another on chromosome 2 (at 114 cM, D2S1790; LOD = 2.22, P = 0.00070) and a third region on chromosome 13 (at 26 cM, D13S894; LOD = 2.50, P = 0.00035). By comparing our results to those implemented using more standard linkage methods, we find the covariates convey substantial information for the linkage analysis. PMID:11912184

  14. Budget Risk & Prioritization Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2010-12-31

    BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Provides real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense

  15. Statistical Tools for Forensic Analysis of Toolmarks

    SciTech Connect

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.
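
    One generic way to compute a "degree of association" between two toolmark profiles is the maximum of a normalized cross-correlation over lags; the sketch below is an illustration, not the project's validated statistic.

```python
import numpy as np

def max_normalized_xcorr(p, q):
    """Peak of the normalized cross-correlation between two 1-D profiles."""
    p = (p - p.mean()) / p.std()
    q = (q - q.mean()) / q.std()
    return (np.correlate(p, q, mode="full") / p.size).max()

rng = np.random.default_rng(4)
mark = rng.normal(size=500)                           # profile from tool A
same_tool = mark + rng.normal(scale=0.2, size=500)    # noisy replicate mark
other_tool = rng.normal(size=500)                     # mark from another tool

print("same tool: ", round(max_normalized_xcorr(mark, same_tool), 2))   # ~1
print("other tool:", round(max_normalized_xcorr(mark, other_tool), 2))  # ~0
```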

  16. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user; Normal Distribution Estimates calculates the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution; Normal Distribution from Two Data Points extends and generates a cumulative normal distribution for the user, given two data points and their associated probability values; two programs perform two-way analysis of variance (ANOVA) with no replication, or generalized ANOVA for two factors with four levels and three repetitions; and Linear Regression-ANOVA curve-fits data to the linear equation y = f(x) and performs an ANOVA to check its significance.
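
    The "Normal Distribution from Two Data Points" program above has a short closed-form core: recover the mean and standard deviation from two (value, cumulative probability) pairs via inverse-normal z-scores. A sketch, not the spreadsheet's code:

```python
from scipy.stats import norm

def fit_normal(x1, p1, x2, p2):
    """Solve for (mu, sigma) given x1 at cumulative probability p1
    and x2 at cumulative probability p2."""
    z1, z2 = norm.ppf(p1), norm.ppf(p2)
    sigma = (x2 - x1) / (z2 - z1)
    mu = x1 - z1 * sigma
    return mu, sigma

mu, sigma = fit_normal(10.0, 0.10, 20.0, 0.90)
print(mu, sigma)  # 15.0, ~3.90 (since z_0.90 = 1.2816)
```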

  17. Analysis of an advanced technology subsonic turbofan incorporating revolutionary materials

    NASA Technical Reports Server (NTRS)

    Knip, Gerald, Jr.

    1987-01-01

    Successful implementation of revolutionary composite materials in an advanced turbofan offers the possibility of further improvements in engine performance and thrust-to-weight ratio relative to current metallic materials. The present analysis determines the approximate engine cycle and configuration for an early 21st century subsonic turbofan incorporating all-composite materials. The advanced engine is evaluated relative to a current technology baseline engine in terms of its potential fuel savings for an intercontinental quadjet having a design range of 5500 nmi and a payload of 500 passengers. The resultant near-optimum, uncooled, two-spool, advanced engine has an overall pressure ratio of 87, a bypass ratio of 18, a geared fan, and a turbine rotor inlet temperature of 3085 R. Improvements result in a 33-percent fuel saving for the specified mission. Various advanced composite materials are used throughout the engine. For example, advanced polymer composite materials are used for the fan and the low pressure compressor (LPC).

  18. Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben

    2014-01-01

    This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend the analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed back handling qualities results to a conceptual design process are proposed for future work.

  19. Surface analysis of stone and bone tools

    NASA Astrophysics Data System (ADS)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  20. Parachute system design, analysis, and simulation tool

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-01-01

    For over twenty years designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash and line sail. In addition to these codes, material property data bases have been acquired. Recently we have initiated a project to integrate these codes and data bases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials data base including high-rate-of-strain data.

  1. PepTool and GeneTool: platform-independent tools for biological sequence analysis.

    PubMed

    Wishart, D S; Stothard, P; Van Domselaar, G H

    2000-01-01

    Although we are unable to discuss all of the functionality available in PepTool and GeneTool, it should be evident from this brief review that both packages offer a great deal in terms of functionality and ease-of-use. Furthermore, a number of useful innovations including platform-independent GUI design, networked parallelism, direct internet connectivity, database compression, and a variety of enhanced or improved algorithms should make these two programs particularly useful in the rapidly changing world of biological sequence analysis. More complete descriptions of the programs, algorithms and operation of PepTool and GeneTool are available on the BioTools web site (www.biotools.com), in the associated program user manuals and in the on-line Help pages. PMID:10547833

  2. Microscopy image segmentation tool: Robust image data analysis

    SciTech Connect

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  3. Photogrammetry Tool for Forensic Analysis

    NASA Technical Reports Server (NTRS)

    Lane, John

    2012-01-01

    A system gives crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
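
    Locating one cube in another cube's coordinate system, as in steps 1 and 2 above, amounts to estimating a rigid transform from corresponding marked points. A generic sketch using the Kabsch algorithm, with invented points (this is not the NASA software itself):

        import numpy as np

        def rigid_transform(src, dst):
            """Least-squares rotation R and translation t with dst ~ R @ src + t
            (Kabsch algorithm); src, dst are (N, 3) arrays of matched points."""
            src, dst = np.asarray(src, float), np.asarray(dst, float)
            cs, cd = src.mean(axis=0), dst.mean(axis=0)
            H = (src - cs).T @ (dst - cd)
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflection
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            return R, cd - R @ cs

        # Corner points of cube 2 as measured in its own frame (src) and as
        # marked in cube 1's frame (dst); the transform chains cube 2 into
        # cube 1's, and eventually the merged global, coordinate system.
        src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
        th = np.radians(30)
        R_true = np.array([[np.cos(th), -np.sin(th), 0],
                           [np.sin(th),  np.cos(th), 0],
                           [0, 0, 1]])
        dst = src @ R_true.T + np.array([5.0, 2.0, 0.5])
        R, t = rigid_transform(src, dst)
        print(np.allclose(src @ R.T + t, dst))   # True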

  4. Genomic sequence analysis tools: a user's guide.

    PubMed

    Fortna, A; Gardiner, K

    2001-03-01

    The wealth of information from various genome sequencing projects provides the biologist with a new perspective from which to analyze, and design experiments with, mammalian systems. The complexity of the information, however, requires new software tools, and numerous such tools are now available. Which type and which specific system is most effective depends, in part, upon how much sequence is to be analyzed and with what level of experimental support. Here we survey a number of mammalian genomic sequence analysis systems with respect to the data they provide and the ease of their use. The hope is to aid the experimental biologist in choosing the most appropriate tool for their analyses. PMID:11226611

  5. Built Environment Energy Analysis Tool Overview (Presentation)

    SciTech Connect

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  6. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees

    PubMed Central

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-01-01

    A novel Protocol Ethics Tool Kit (‘Ethics Tool Kit’) has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. PMID:26811365

  7. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees.

    PubMed

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-04-01

    A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. PMID:26811365

  8. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  9. Incorporating Concept Maps in a Slide Presentation Tool for the Classroom Environment.

    ERIC Educational Resources Information Center

    Gopal, Kreshna; Morapakkam, Karthik

    This paper presents slide presentation software that incorporates a concept map, which explicitly shows how the various slides (and other multimedia components) presented are related to each other. Furthermore, presentations are conceived as hypermedia systems, where the presenter can navigate among slides (and the concept map) instead of the…

  10. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  11. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.
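
    The merge-and-pad logic is simple to sketch outside the PHP implementation. The toy example below (invented events and padding, not the flight tool's code) assumes both lists have already been converted to a common reference time; it produces the single time-ordered listing and flags MER events that fall inside a padded keep-out window around any MRO event.

        from datetime import datetime, timedelta
        import heapq

        # Hypothetical event lists, already shifted to Earth Transmit Time.
        mro_events = [(datetime(2008, 1, 1, 10, 0), "MRO", "ORBIT_TRIM")]
        mer_events = [(datetime(2008, 1, 1, 9, 58), "MER", "UPLINK_START"),
                      (datetime(2008, 1, 1, 10, 30), "MER", "UPLINK_END")]

        # Single, time-ordered listing of events for both spacecraft.
        merged = list(heapq.merge(sorted(mro_events), sorted(mer_events)))

        def keep_out_conflicts(events, pad_minutes=5):
            """Flag MER events inside a padded keep-out window around any MRO
            event; the pad mirrors the tool's user-adjustable padding."""
            pad = timedelta(minutes=pad_minutes)
            mro_times = [t for t, craft, _ in events if craft == "MRO"]
            return [(t, name) for t, craft, name in events if craft == "MER"
                    and any(m - pad <= t <= m + pad for m in mro_times)]

        for t, name in keep_out_conflicts(merged):
            print(f"conflict at {t}: {name}")   # UPLINK_START is in the window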

  12. Quantifying element incorporation in multispecies biofilms using nanoscale secondary ion mass spectrometry image analysis.

    PubMed

    Renslow, Ryan S; Lindemann, Stephen R; Cole, Jessica K; Zhu, Zihua; Anderton, Christopher R

    2016-06-01

    Elucidating nutrient exchange in microbial communities is an important step in understanding the relationships between microbial systems and global biogeochemical cycles, but these communities are complex and the interspecies interactions that occur within them are not well understood. Phototrophic consortia are useful and relevant experimental systems to investigate such interactions as they are not only prevalent in the environment, but some are cultivable in vitro and amenable to controlled scientific experimentation. Nanoscale secondary ion mass spectrometry (NanoSIMS) is a powerful, high spatial resolution tool capable of visualizing the metabolic activities of single cells within a biofilm, but quantitative analysis of the resulting data has typically been a manual process, resulting in a task that is both laborious and susceptible to human error. Here, the authors describe the creation and application of a semiautomated image-processing pipeline that can analyze NanoSIMS-generated data, applied to phototrophic biofilms as an example. The tool employs an image analysis process, which includes both elemental and morphological segmentation, producing a final segmented image that allows for discrimination between autotrophic and heterotrophic biomass, the detection of individual cyanobacterial filaments and heterotrophic cells, the quantification of isotopic incorporation of individual heterotrophic cells, and calculation of relevant population statistics. The authors demonstrate the functionality of the tool by using it to analyze the uptake of ¹⁵N provided as either nitrate or ammonium through the unicyanobacterial consortium UCC-O and imaged via NanoSIMS. The authors found that the degree of ¹⁵N incorporation by individual cells was highly variable when labeled with ¹⁵NH₄⁺, but much more even when biofilms were labeled with ¹⁵NO₃⁻. In the ¹⁵NH₄⁺-amended biofilms, the heterotrophic distribution of ¹⁵N incorporation was highly skewed, with
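
    The per-cell quantification step described above reduces to labeling connected regions of the segmentation mask and pooling the ion counts within each label. A schematic sketch with synthetic data (not the authors' pipeline; channel names and counts are invented):

        import numpy as np
        from scipy import ndimage

        # Synthetic per-pixel ion counts for the 15N and 14N channels, plus a
        # binary mask standing in for the morphological segmentation output.
        rng = np.random.default_rng(1)
        counts_15n = rng.poisson(5, size=(64, 64)).astype(float)
        counts_14n = rng.poisson(100, size=(64, 64)).astype(float)
        cell_mask = np.zeros((64, 64), bool)
        cell_mask[10:20, 10:20] = cell_mask[40:55, 30:45] = True

        # Label connected regions (individual cells), then pool counts per cell.
        labels, n_cells = ndimage.label(cell_mask)
        for cell_id in range(1, n_cells + 1):
            in_cell = labels == cell_id
            n15 = counts_15n[in_cell].sum()
            n14 = counts_14n[in_cell].sum()
            frac = n15 / (n15 + n14)   # per-cell 15N atom fraction
            print(f"cell {cell_id}: {in_cell.sum()} px, 15N fraction {frac:.4f}")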

  13. Incorporating Endmember Variability into Spectral Mixture Analysis Through Endmember Bundles

    NASA Technical Reports Server (NTRS)

    Bateson, C. Ann; Asner, Gregory P.; Wessman, Carol A.

    1998-01-01

    Variation in canopy structure and biochemistry induces a concomitant variation in the top-of-canopy spectral reflectance of a vegetation type. Hence, the use of a single endmember spectrum to track the fractional abundance of a given vegetation cover in a hyperspectral image may result in fractions with considerable error. One solution to the problem of endmember variability is to increase the number of endmembers used in a spectral mixture analysis of the image. For example, there could be several tree endmembers in the analysis because of differences in leaf area index (LAI) and multiple scatterings between leaves and stems. However, it is often difficult in terms of computer or human interaction time to select more than six or seven endmembers and any non-removable noise, as well as the number of uncorrelated bands in the image, limits the number of endmembers that can be discriminated. Moreover, as endmembers proliferate, their interpretation becomes increasingly difficult and often applications simply need the areal fractions of a few land cover components which comprise most of the scene. In order to incorporate endmember variability into spectral mixture analysis, we propose representing a landscape component type not with one endmember spectrum but with a set or bundle of spectra, each of which is feasible as the spectrum of an instance of the component (e.g., in the case of a tree component, each spectrum could reasonably be the spectral reflectance of a tree canopy). These endmember bundles can be used with nonlinear optimization algorithms to find upper and lower bounds on endmember fractions. This approach to endmember variability naturally evolved from previous work in deriving endmembers from the data itself by fitting a triangle, tetrahedron or, more generally, a simplex to the data cloud reduced in dimension by a principal component analysis. Conceptually, endmember variability could make it difficult to find a simplex that both surrounds the data
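
    One simple way to realize the proposed bounds is brute force: unmix the pixel with every combination of bundle members under non-negativity and (softly enforced) sum-to-one constraints, then take the extremes of the resulting fractions. The sketch below uses invented two-member bundles and synthetic spectra as a stand-in for the authors' nonlinear optimization.

        import itertools
        import numpy as np
        from scipy.optimize import nnls

        def unmix(endmembers, pixel, w=100.0):
            """Non-negative unmixing of one pixel, with a heavily weighted
            extra row that softly enforces sum-to-one on the fractions."""
            A = np.vstack([endmembers, w * np.ones((1, endmembers.shape[1]))])
            b = np.append(pixel, w)
            fractions, _ = nnls(A, b)
            return fractions

        # Two candidate spectra per component, capturing endmember variability
        # (e.g., tree canopies with different LAI).
        rng = np.random.default_rng(2)
        tree_bundle = [rng.uniform(0.2, 0.4, 50), rng.uniform(0.3, 0.5, 50)]
        soil_bundle = [rng.uniform(0.5, 0.7, 50), rng.uniform(0.6, 0.8, 50)]
        pixel = 0.6 * tree_bundle[0] + 0.4 * soil_bundle[1]

        # Bracket the tree fraction over all bundle-member combinations.
        tree_fracs = [unmix(np.column_stack([t, s]), pixel)[0]
                      for t, s in itertools.product(tree_bundle, soil_bundle)]
        print(f"tree fraction in [{min(tree_fracs):.2f}, {max(tree_fracs):.2f}]")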

  14. Comparing Work Skills Analysis Tools. Project Report.

    ERIC Educational Resources Information Center

    Barker, Kathryn

    This document outlines the processes and outcomes of a research project conducted to review work skills analysis tools (products and/or services) that profile required job skills and/or assess individuals' acquired skills. The document begins with a brief literature review and discussion of pertinent terminology. Presented next is a list of…

  15. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  16. Integrated multidisciplinary analysis tool IMAT users' guide

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  17. From sensor networks to connected analysis tools

    NASA Astrophysics Data System (ADS)

    Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.

    2012-04-01

    Multi-disciplinary data systems provide excellent tools for locating data, but most eventually provide a series of local files for further processing, providing marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high density measurements, integrating them with lower density existing measurements and encouraging cross/inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals, also providing connected tools for direct data access from analysis applications. SwissEx (www.swiss-experiment.ch) provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and fieldsites as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and preprocessing library, MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application specific tasks. We will work alongside several projects from a wide range of disciplines to help them to develop tools which either require real-time data, or large data samples. As well as developing domain specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the

  18. Incorporating a Soil Science Artifact into a University ePortfolio Assessment Tool

    ERIC Educational Resources Information Center

    Mikhailova, Elena; Werts, Joshua; Post, Christopher; Ring, Gail

    2014-01-01

    The ePortfolio is a useful educational tool that is utilized in many educational institutions to showcase student accomplishments and provide students with an opportunity to reflect on their educational progress. The objective of this study was to develop and test an artifact from an introductory soil science course to be included in the…

  19. Decision Analysis Tools for Volcano Observatories

    NASA Astrophysics Data System (ADS)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
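
    The branch arithmetic of such a multiple-branch event tree is a running product of conditional probabilities down each branch. A toy illustration with invented numbers (not EXPLORIS values):

        # Each entry: (event conditional on everything above it, probability).
        branch = [
            ("unrest escalates", 0.30),
            ("magmatic eruption", 0.50),
            ("explosive eruption, VEI >= 3", 0.40),
            ("pyroclastic flow reaches the exclusion zone", 0.10),
        ]

        p = 1.0
        for event, p_cond in branch:
            p *= p_cond
            print(f"P(... and {event}) = {p:.4f}")
        # The branch-tip probability (here 0.0060) feeds risk-informed
        # planning, e.g. expected loss = probability x consequence.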

  20. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    NASA Technical Reports Server (NTRS)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  1. Fairing Separation Analysis Using SepTOOL

    NASA Technical Reports Server (NTRS)

    Zalewski, Bart F.; Dial, William B.; Kosareo, Daniel N.

    2015-01-01

    This document describes the relevant equations programmed in spreadsheet software, SepTOOL, developed by ZIN Technologies, Inc. (ZIN) to determine the separation clearance between a launch vehicle payload fairing and remaining stages. The software uses closed form rigid body dynamic solutions of the vehicle in combination with flexible body dynamics of the fairing, which is obtained from flexible body dynamic analysis or from test data, and superimposes the two results to obtain minimum separation clearance for any given set of flight trajectory conditions. Using closed form solutions allows SepTOOL to perform separation calculations several orders of magnitude faster compared to numerical methods, which allows users to perform real-time parameter studies. Moreover, SepTOOL can optimize vehicle performance to minimize separation clearance. This tool can evaluate various shapes and sizes of fairings along with different vehicle configurations and trajectories. These geometries and parameters are inputted in a user friendly interface. Although the software was specifically developed for evaluating the separation clearance of launch vehicle payload fairings, separation dynamics of other launch vehicle components can be evaluated provided that aerodynamic loads acting on the vehicle during the separation event are negligible. This document describes the development of SepTOOL, providing the analytical procedure and theoretical equations; the implementation of these equations is not disclosed. Realistic examples are presented, and the results are verified with ADAMS (MSC Software Corporation) simulations. It should be noted that SepTOOL is a preliminary separation clearance assessment software for payload fairings and should not be used for final clearance analysis.
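
    Since the implementation is not disclosed, the superposition idea can only be sketched generically: a closed-form rigid-body separation distance plus a damped modal (flexible-body) deflection of the fairing, with the minimum over time taken as the clearance. All parameters below are invented for illustration.

        import numpy as np

        t = np.linspace(0.0, 2.0, 2000)        # s after the separation event
        rel_accel = 3.0                        # m/s^2, rigid relative accel.
        rigid_gap = 0.5 * rel_accel * t**2     # rigid-body clearance growth

        f_n, zeta, amp = 4.0, 0.02, 0.08       # fairing mode: Hz, damping, m
        wn = 2 * np.pi * f_n
        flex = amp * np.exp(-zeta * wn * t) * np.sin(wn * np.sqrt(1 - zeta**2) * t)

        clearance = rigid_gap - flex           # superimpose the two solutions
        i = np.argmin(clearance)
        print(f"min clearance {clearance[i]:.3f} m at t = {t[i]:.3f} s")
        # A negative minimum would indicate recontact; the real tool searches
        # this minimum over sets of flight trajectory conditions.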

  2. Systematic Omics Analysis Review (SOAR) tool to support risk assessment.

    PubMed

    McConnell, Emma R; Bell, Shannon M; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884
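
    The scoring scheme amounts to weighted questions grouped into the four topics, each topic with its own suitability cutoff. The sketch below illustrates the mechanics only; the weights, cutoffs, and question wording are invented, not SOAR's.

        TOPICS = {
            "test system": {"cutoff": 3.0,
                            "questions": {"species reported": 2.0,
                                          "tissue reported": 1.5}},
            "test substance": {"cutoff": 2.0,
                               "questions": {"purity reported": 1.0,
                                             "dose reported": 2.0}},
            "experimental design": {"cutoff": 2.5,
                                    "questions": {"replicates >= 3": 2.0,
                                                  "controls present": 1.5}},
            "microarray data": {"cutoff": 2.0,
                                "questions": {"MIAME compliant": 2.0,
                                              "raw data available": 1.0}},
        }

        def screen(answers):
            """answers: {question: bool}. A study passes only if every topic
            meets its cutoff, mirroring the per-section suitability rule."""
            for topic, spec in TOPICS.items():
                score = sum(w for q, w in spec["questions"].items()
                            if answers.get(q))
                if score < spec["cutoff"]:
                    return False, topic
            return True, None

        print(screen({"species reported": True, "tissue reported": True,
                      "purity reported": True, "dose reported": True,
                      "replicates >= 3": True, "controls present": True,
                      "MIAME compliant": True}))   # (True, None)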

  3. Systematic Omics Analysis Review (SOAR) Tool to Support Risk Assessment

    PubMed Central

    McConnell, Emma R.; Bell, Shannon M.; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J.; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D.

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884

  4. Vibration analysis as a predictive maintenance tool

    SciTech Connect

    Dischner, J.M.

    1995-09-01

    Vibration analysis is a powerful and effective tool in both predicting and isolating incipient fault conditions. Vibration data can assist in identifying the root causes of failures and can be used to establish maintenance procedures on a condition-assessment basis rather than a scheduled or calendar basis. Recent advances in technology not only allow new types of testing to be performed but, when integrated with other types of machine information, can lead to even greater insight and accuracy across the entire predictive maintenance program. Case studies and recent findings will be presented along with a discussion of how vibration is used as an invaluable tool in the detection of defects in gearboxes, mill stands, and roll chatter detection and correction. Acceptable vibration criteria and cost benefit summaries will be included.
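
    At its core, spectrum-based defect detection looks for energy at known defect frequencies in an accelerometer trace. A minimal sketch with a synthetic signal; the 57.3 Hz "defect frequency" is invented, whereas real values follow from bearing geometry and shaft speed.

        import numpy as np

        fs = 5000.0                            # sampling rate, Hz
        t = np.arange(0, 2.0, 1 / fs)
        rng = np.random.default_rng(4)
        healthy = rng.normal(scale=0.1, size=t.size)
        faulty = healthy + 0.05 * np.sin(2 * np.pi * 57.3 * t)  # incipient fault

        def band_amplitude(x, f_lo, f_hi):
            """Peak FFT amplitude inside a frequency band."""
            spec = 2 * np.abs(np.fft.rfft(x)) / len(x)
            freqs = np.fft.rfftfreq(len(x), 1 / fs)
            band = (freqs >= f_lo) & (freqs <= f_hi)
            return spec[band].max()

        for label, x in [("healthy", healthy), ("faulty", faulty)]:
            print(label, f"{band_amplitude(x, 55, 60):.4f}")
        # The faulty trace shows ~0.05 in the band; the healthy one stays
        # near the noise floor, the basis for a condition-based alarm.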

  5. Challenges Facing Design and Analysis Tools

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

    2001-01-01

    The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools

  6. RSAT 2015: Regulatory Sequence Analysis Tools.

    PubMed

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632

  7. RSAT 2015: Regulatory Sequence Analysis Tools

    PubMed Central

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A.; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M.; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-01-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632

  8. A flexible and national scale approach to coastal decision tools incorporating sea level rise

    NASA Astrophysics Data System (ADS)

    Strauss, B.; Kulp, S. A.; Tebaldi, C.

    2014-12-01

    Climate science and sea level models constantly evolve. In this context, maps and analyses of exposure to sea level rise - or coastal flooding aggravated by rise - quickly fall out of date when based upon a specific model projection or projection set. At the same time, policy makers and planners prefer simple and stable risk assessments for their future planning. Here, using Climate Central's Surging Seas Risk Finder, we describe and illustrate a decision tool framework that separates the spatial and temporal dimensions of coastal exposure in order to help alleviate this tension. The Risk Finder presents local maps and exposure analyses simply as functions of a discrete set of local water levels. In turn, each water level may be achieved at different times, with different probabilities, according to different combinations of sea level change, storm surge and tide. This temporal dimension is expressed in a separate module of the Risk Finder, so that users may explore the probabilities and time frames of different water levels, as a function of different sea level models and emissions scenarios. With such an approach, decision-makers can quickly get a sense of the range of risks for each water level given current understanding. At the same time, the models and scenarios can easily be updated over time as the science evolves, while avoiding the labor of regenerating maps and exposure analyses. In this talk, we will also use the tool to highlight key findings from a new U.S. national assessment of sea level and coastal flood risk. For example, more than 2.5 million people and $500 billion of property value sit on land less than 2 meters above the high tide line in Florida alone.
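
    The decoupling can be made concrete in a few lines: exposure is a precomputed lookup per discrete water level (the stable, spatial module), while the probability that each level is exceeded by a given year comes from a separate, swappable temporal module. All numbers below are invented for illustration.

        # People living below each local water level, precomputed from maps.
        exposure_by_level = {1: 1.0e6, 2: 2.5e6, 3: 4.0e6}   # level in meters

        def p_exceed(level_m, year, scenario):
            """Stand-in for a sea level + surge + tide model; in the real
            tool this module is updated as science evolves, the maps are not."""
            rise = {"low": 0.005, "high": 0.012}[scenario] * (year - 2014)
            return max(0.0, min(1.0, 1.0 - (level_m - rise) / 3.0))

        for level, people in exposure_by_level.items():
            p = p_exceed(level, 2050, "high")
            print(f"{level} m: {people:,.0f} exposed, "
                  f"P(exceeded by 2050) = {p:.2f}")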

  9. A new tool for accelerator system modeling and analysis

    SciTech Connect

    Gillespie, G.H.; Hill, B.W.; Jameson, R.A.

    1994-09-01

    A novel computer code is being developed to generate system level designs of radiofrequency ion accelerators. The goal of the Accelerator System Model (ASM) code is to create a modeling and analysis tool that is easy to use, automates many of the initial design calculations, supports trade studies used in assessing alternate designs and yet is flexible enough to incorporate new technology concepts as they emerge. Hardware engineering parameters and beam dynamics are modeled at comparable levels of fidelity. Existing scaling models of accelerator subsystems were used to produce a prototype of ASM (version 1.0) working within the Shell for Particle Accelerator Related Codes (SPARC) graphical user interface. A small user group has been testing and evaluating the prototype for about a year. Several enhancements and improvements are now being developed. The current version (1.1) of ASM is briefly described and an example of the modeling and analysis capabilities is illustrated.

  10. Stochastic modelling of landfill leachate and biogas production incorporating waste heterogeneity. Model formulation and uncertainty analysis.

    PubMed

    Zacharof, A I; Butler, A P

    2004-01-01

    A mathematical model simulating the hydrological and biochemical processes occurring in landfilled waste is presented and demonstrated. The model combines biochemical and hydrological models into an integrated representation of the landfill environment. Waste decomposition is modelled using traditional biochemical waste decomposition pathways combined with a simplified methodology for representing the rate of decomposition. Water flow through the waste is represented using a statistical velocity model capable of representing the effects of waste heterogeneity on leachate flow through the waste. Given the limitations in data capture from landfill sites, significant emphasis is placed on improving parameter identification and reducing parameter requirements. A sensitivity analysis is performed, highlighting the model's response to changes in input variables. A model test run is also presented, demonstrating the model capabilities. A parameter perturbation model sensitivity analysis was also performed. This has been able to show that although the model is sensitive to certain key parameters, its overall intuitive response provides a good basis for making reasonable predictions of the future state of the landfill system. Finally, due to the high uncertainty associated with landfill data, a tool for handling input data uncertainty is incorporated in the model's structure. It is concluded that the model can be used as a reasonable tool for modelling landfill processes and that further work should be undertaken to assess the model's performance. PMID:15120429
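
    A parameter perturbation sensitivity analysis of the kind reported above can be sketched generically: nudge each parameter around a base case and record the relative change in a model output. The toy model below is invented for illustration, not the paper's landfill model.

        import numpy as np

        def leachate_model(k_decay, velocity):
            """Toy stand-in: peak leachate concentration as a function of a
            decomposition rate constant and a leachate flow velocity."""
            return 100.0 * k_decay / (k_decay + 0.1) * np.exp(-velocity)

        base = {"k_decay": 0.05, "velocity": 0.5}
        y0 = leachate_model(**base)
        for name, value in base.items():
            for factor in (0.9, 1.1):            # one-at-a-time +/-10% nudges
                perturbed = dict(base, **{name: value * factor})
                dy = (leachate_model(**perturbed) - y0) / y0
                print(f"{name} x{factor}: output change {dy:+.1%}")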

  11. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    PubMed

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper. PMID:26679001
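
    Two of the listed analyses, frequency analysis and an entropy measure, are easy to sketch on a synthetic EEG-like trace. The tool itself is MATLAB-based; this is an illustrative Python equivalent with invented parameters.

        import numpy as np
        from scipy import signal

        fs = 256.0                                  # sampling rate, Hz
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(3)
        eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)

        # Welch power spectral density: the 10 Hz "alpha" component dominates.
        f, pxx = signal.welch(eeg, fs=fs, nperseg=512)
        print(f"dominant frequency: {f[np.argmax(pxx)]:.1f} Hz")

        # Shannon entropy of the amplitude histogram.
        hist, _ = np.histogram(eeg, bins=64)
        p = hist[hist > 0] / hist.sum()
        print(f"amplitude entropy: {-(p * np.log2(p)).sum():.2f} bits")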

  12. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, called plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
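
    As a worked example of the exact Bayes factors mentioned above, consider k successes in n Bernoulli trials, comparing H0: p = 0.5 against H1 with a uniform Beta(1, 1) prior on p; both marginal likelihoods are available in closed form. The scenario and numbers are illustrative only.

        from math import comb
        import numpy as np
        from scipy.special import betaln

        def bayes_factor_01(k, n):
            """Exact Bayes factor of H0: p = 0.5 vs H1: p ~ Uniform(0, 1)."""
            log_m0 = np.log(comb(n, k)) + n * np.log(0.5)
            # Marginal under H1: C(n, k) * B(k + 1, n - k + 1), in log space.
            log_m1 = np.log(comb(n, k)) + betaln(k + 1, n - k + 1)
            return np.exp(log_m0 - log_m1)

        print(bayes_factor_01(60, 100))   # ~1.1: 60/100 favors neither model
        print(bayes_factor_01(90, 100))   # << 1: strong evidence against H0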

  13. Integrated FDIR Analysis Tool for Space Applications

    NASA Astrophysics Data System (ADS)

    Piras, Annamaria; Malucchi, Giovanni; Di Tommaso, Umberto

    2013-08-01

    The crucial role of health management in space applications has been the subject of many studies carried out by NASA and ESA and is held in high regard by Thales Alenia Space. The common objective is to improve reliability and availability of space systems. This paper will briefly illustrate the evolution of IDEHAS (IntegrateD Engineering Harness Avionics and Software), an advanced tool currently used in Thales Alenia Space - Italy in several space programs and recently enhanced to fully support FDIR (Fault Detection Isolation and Recovery) analysis. The FDIR analysis logic flow will be presented, emphasizing the improvements offered to Mission Support & Operations activities. Finally the benefits provided to the Company and a list of possible future enhancements will be given.

  14. A mobile data collection tool for workflow analysis.

    PubMed

    Moss, Jacqueline; Berner, Eta S; Savell, Kathy

    2007-01-01

    Faulty exchange of and impaired access to clinical information are major contributing factors to the incidence of medical error and the occurrence of adverse events. Traditional methods utilized for systems analysis and information technology design fail to capture the nature of information use in highly dynamic healthcare environments. This paper describes a study designed to identify information task components in a cardiovascular intensive care unit and the development of an observational data collection tool to characterize the use of information in this environment. Direct observation can be a time-consuming process and without easy to use, reliable and valid methods of documentation, may not be reproducible across observers or settings. The following attributes were found to be necessary components for the characterization of information tasks in this setting: purpose, action, role, target, mode, and duration. The identified information task components were incorporated into the design of an electronic data collection tool to allow coding of information tasks. The reliability and validity of this tool in practice are discussed and an illustration of observational data output is provided. PMID:17911676
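
    The six task components identified above map naturally onto a small record type for coding observations, a sketch of which follows (field vocabularies are illustrative, not the study's actual code lists):

        from dataclasses import dataclass
        from datetime import timedelta

        @dataclass
        class InformationTask:
            purpose: str         # e.g. "medication check"
            action: str          # e.g. "read", "write", "verbal report"
            role: str            # e.g. "nurse", "physician"
            target: str          # e.g. "flowsheet", "colleague"
            mode: str            # e.g. "paper", "electronic", "verbal"
            duration: timedelta

        observation = InformationTask("medication check", "read", "nurse",
                                      "flowsheet", "paper",
                                      timedelta(seconds=45))
        print(observation)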

  15. Incorporating Load Balancing Spatial Analysis Into Xml-Based Webgis

    NASA Astrophysics Data System (ADS)

    Huang, H.

    2012-07-01

    This article aims to introduce load balancing spatial analysis into XML-based WebGIS. In contrast to other approaches that implement spatial queries and analyses solely on server or browser sides, load balancing spatial analysis carries out spatial analysis on either the server or the browser sides depending on the execution costs (i.e., network transmission costs and computational costs). In this article, key elements of load balancing middlewares are investigated, and a relevant solution is proposed. The comparison with the server-side solution, the browser-side solution, and our former solution shows that the proposed solution can optimize the execution of spatial analysis, greatly ease the network transmission load between the server and the browser sides, and therefore lead to a better performance. The proposed solution enables users to access high-performance spatial analysis simply via a web browser.
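
    The placement decision reduces to comparing two latency estimates: compute remotely and ship the (small) result, or ship the data (free if already cached) and compute locally. A schematic sketch with invented costs:

        def choose_side(data_bytes, result_bytes, bandwidth_bps, rtt_s,
                        server_ops_per_s, browser_ops_per_s, op_count):
            """Estimate total latency of each placement; pick the cheaper."""
            t_server = (rtt_s + op_count / server_ops_per_s
                        + 8 * result_bytes / bandwidth_bps)
            t_browser = (8 * data_bytes / bandwidth_bps
                         + op_count / browser_ops_per_s)
            if t_server <= t_browser:
                return "server", round(t_server, 3)
            return "browser", round(t_browser, 3)

        # Heavy buffer analysis on a 50 MB layer: the server wins easily.
        print(choose_side(50e6, 20e3, 10e6, 0.2, 1e9, 1e8, 5e8))
        # Light query on data already cached in the browser: the browser wins.
        print(choose_side(0, 20e3, 10e6, 0.2, 1e9, 1e8, 1e7))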

  16. Zinc is incorporated into cuticular "tools" after ecdysis: the time course of the zinc distribution in "tools" and whole bodies of an ant and a scorpion.

    PubMed

    Schofield, R M S; Nesson, M H; Richardson, K A; Wyeth, P

    2003-01-01

    An understanding of the developmental course of specialized accumulations in the cuticular "tools" of arthropods will give clues to the chemical form, function and biology of these accumulations as well as to their evolutionary history. Specimens from individuals representing a range of developmental stages were examined using MeV ion microscopy. We found that zinc, manganese, calcium and chlorine began to accumulate in the mandibular teeth of the ant Tapinoma sessile after pre-ecdysial tanning, and the zinc mostly after eclosion; peak measured zinc concentrations reached 16% of dry mass. Accumulations in the pedipalp teeth, tarsal claws, cheliceral teeth and sting (aculeus) of the scorpion Vaejovis spinigeris also began after pre-ecdysial tanning and more than 48 h after ecdysis of the second instars. Zinc may be deposited in the fully formed cuticle through a network of nanometer scale canals that we observed only in the metal bearing cuticle of both the ants and scorpions. In addition to the elemental analyses of cuticular "tools", quantitative distribution maps for whole ants were obtained. The zinc content of the mandibular teeth was a small fraction of, and independent of, the total body content of zinc. We did not find specialized storage sites that were depleted when zinc was incorporated into the mandibular teeth. The similarities in the time course of zinc, manganese and calcium deposition in the cuticular "tools" of the ant (a hexapod arthropod) and those of the scorpion (a chelicerate arthropod) contribute to the evidence suggesting that heavy metal-halogen fortification evolved before these groups diverged. PMID:12770014

  17. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    NASA Astrophysics Data System (ADS)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore, complements other approaches that use large corpus (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  18. Automated Steel Cleanliness Analysis Tool (ASCAT)

    SciTech Connect

    Gary Casuccio; Michael Potter; Fred Schwerer; Dr. Richard J. Fruehan; Dr. Scott Story

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT(TM)) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer-controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium-treated steel; interstitial-free (IF) degasser-grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet
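
    The abstract does not name the algorithm that sorts inclusion data into clusters; as one plausible illustration, a k-means grouping of per-inclusion size and composition features (all values and feature choices below are hypothetical):

        import numpy as np
        from sklearn.cluster import KMeans

        # Hypothetical per-inclusion features: diameter (um) followed by
        # element fractions from SEM/EDS; standardized before clustering.
        features = np.array([
            [1.2, 0.60, 0.30, 0.10],
            [4.5, 0.10, 0.05, 0.85],
            [1.1, 0.58, 0.32, 0.10],
            [4.8, 0.12, 0.04, 0.84],
        ])
        z = (features - features.mean(axis=0)) / features.std(axis=0)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(z)
        print(labels)  # e.g., oxide-rich vs. sulfide-rich inclusion groups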

  19. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low-noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program 2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel is presented. The results are in the form of community noise metrics and

  20. CRITICA: coding region identification tool invoking comparative analysis

    NASA Technical Reports Server (NTRS)

    Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

    1999-01-01

    Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu, in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
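
    The dicodon-bias component lends itself to a short sketch: score a candidate reading frame by the log-odds of its in-frame hexamers against a background model. The tables below are toy placeholders; CRITICA derives its tables iteratively from the data itself.

        from math import log

        def hexamer_log_odds(coding_counts, background_counts):
            # Log-likelihood ratio per hexamer: in-frame coding frequency
            # versus frequency in other contexts.
            total_c = sum(coding_counts.values())
            total_b = sum(background_counts.values())
            return {h: log((coding_counts[h] / total_c) /
                           (background_counts[h] / total_b))
                    for h in coding_counts if h in background_counts}

        def score_frame(seq, table, frame=0):
            # Sum hexamer log-odds stepping codon by codon; positive
            # totals favor the coding hypothesis.
            return sum(table.get(seq[i:i + 6], 0.0)
                       for i in range(frame, len(seq) - 5, 3))

        table = hexamer_log_odds({"ATGGCT": 30, "GCTGAA": 25},
                                 {"ATGGCT": 10, "GCTGAA": 20})
        print(score_frame("ATGGCTGAA", table))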

  1. Tools4miRs – one place to gather all the tools for miRNA analysis

    PubMed Central

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-01-01

    Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that currently gathers more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626

  2. Simplified building energy analysis tool for architects

    NASA Astrophysics Data System (ADS)

    Chaisuparasmikul, Pongsak

    Energy Modeler is an energy software program designed to study the relative change of energy uses (heating, cooling, and lighting loads) in different architectural design schemes. This research focuses on developing a tool to improve energy efficiency of the built environment. The research studied the impact of different architectural design responses for two distinct global climates: temperate and tropical climatic zones. This energy-based interfacing program is intended to help architects, engineers, educators, students, building designers, major consumers of architectural services, and other professionals whose work interfaces with that of architects to perceive, quickly visualize, and compare the energy performance and savings of different design schemes. The buildings in which we live or work have a great impact on our natural environment. Energy savings and consumption reductions in our buildings are probably the best indications of solutions that support environmental sustainability, because they reduce the depletion of the world's fossil fuels (oil, natural gas, coal, etc.). Architects, when they set about designing an environmentally responsive building for an owner or the public, often lack the energy-based information and design tools to tell them whether the building loads and energy consumption are responsive to the modifications that they make. Buildings are dynamic in nature and change over time, with many design variables involved. Architects need energy-based rules or tools to assist them in the design process. Energy-efficient design for sustainable solutions requires attention throughout the design process and is closely tied to architectural solutions. Early involvement is the only guaranteed way of properly considering fundamental building design issues related to building site, form and exposure. The research presents the methodology and process, which leads to the discussion of the research findings. The innovative work is to make these tools

  3. A Distributed, Parallel Visualization and Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2007-12-01

    VisIt is an interactive parallel visualization and graphical analysis tool for viewing scientific data on UNIX and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range.

  4. Multi-Mission Power Analysis Tool

    NASA Technical Reports Server (NTRS)

    Broderick, Daniel

    2011-01-01

    Multi-Mission Power Analysis Tool (MMPAT) Version 2 simulates spacecraft power generation, use, and storage in order to support spacecraft design, mission planning, and spacecraft operations. It can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. A user-friendly GUI (graphical user interface) makes it easy to use. Multiple deployments allow use on the desktop, in batch mode, or as a callable library. It includes detailed models of solar arrays, radioisotope thermoelectric generators, nickel-hydrogen and lithium-ion batteries, and various load types. There is built-in flexibility through user-designed state models and table-driven parameters.
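
    In the spirit of that description, a toy power-balance loop: generation minus load integrates into battery state of charge. All parameters are invented for illustration and are not MMPAT's models.

        def simulate(gen_w, load_w, dt_s=60.0, capacity_wh=1000.0, soc_wh=800.0):
            # March through time, clamping state of charge to [0, capacity].
            history = []
            for g, l in zip(gen_w, load_w):
                soc_wh += (g - l) * dt_s / 3600.0
                soc_wh = min(capacity_wh, max(0.0, soc_wh))
                history.append(soc_wh)
            return history

        # Eclipse (no generation) followed by a sunlit arc:
        soc = simulate(gen_w=[0.0] * 30 + [400.0] * 60, load_w=[250.0] * 90)
        print(min(soc), soc[-1])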

  5. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
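
    Since edges carry cost-like weights (attacker effort, time to succeed), high-risk paths are naturally found with a shortest-path search over the attack graph. A self-contained sketch with an invented two-hop graph:

        import heapq

        def cheapest_paths(graph, start):
            # Dijkstra over an attack graph: nodes are attack states,
            # weights are attacker effort; low-cost reachable states
            # mark the high-risk paths.
            dist = {start: 0.0}
            heap = [(0.0, start)]
            while heap:
                d, u = heapq.heappop(heap)
                if d > dist.get(u, float("inf")):
                    continue
                for v, w in graph.get(u, []):
                    if d + w < dist.get(v, float("inf")):
                        dist[v] = d + w
                        heapq.heappush(heap, (d + w, v))
            return dist

        graph = {"outside": [("dmz", 2.0)], "dmz": [("db_server", 5.0)]}
        print(cheapest_paths(graph, "outside"))  # minimum effort to reach each state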

  6. Canonical Correlation Analysis that Incorporates Measurement and Sampling Error Considerations.

    ERIC Educational Resources Information Center

    Thompson, Bruce; Daniel, Larry

    Multivariate methods are being used with increasing frequency in educational research because these methods control "experimentwise" error rate inflation, and because the methods best honor the nature of the reality to which the researcher wishes to generalize. This paper: explains the basic logic of canonical analysis; illustrates that canonical…

  7. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  8. ANALYSIS OF SOLVENT RECOVERED FROM WRIGHT INDUSTRIES, INCORPORATED TESTING

    SciTech Connect

    Poirier, M; Thomas Peters, T; Fernando Fondeur, F; Samuel Fink, S

    2007-01-11

    Washington Savannah River Company (WSRC) began designing and building a Modular Caustic Side Solvent Extraction (CSSX) Unit (MCU) at the Savannah River Site (SRS) to process liquid waste for an interim period. The MCU Project Team conducted testing of the contactors, coalescers, and decanters at Wright Industries, Incorporated (WII) in Nashville, Tennessee. That testing used MCU solvent and simulated SRS dissolved salt. Because of the value of the solvent, the MCU Project wishes to recover it for use in the MCU process in the H-Tank Farm. Following testing, WII recovered approximately 62 gallons of solvent (with entrained aqueous) and shipped it to SRS. The solvent arrived in two stainless steel drums. The MCU Project requested SRNL to analyze the solvent to determine whether it is suitable for use in the MCU Process. SRNL analyzed the solvent for Isopar{reg_sign} L by Gas Chromatography--Mass Spectroscopy (GC-MS), for Modifier and BOBCalixC6 by High Pressure Liquid Chromatography (HPLC), and for Isopar{reg_sign} L-to-Modifier ratio by Fourier-Transform Infrared (FTIR) spectroscopy. They also measured the solvent density gravimetrically and used that measurement to calculate the Isopar{reg_sign} L and Modifier concentration. The conclusions from this work are: (1) The constituents of the used WII solvent are collectively low in Isopar{reg_sign} L, most likely due to evaporation. This can be easily corrected through the addition of Isopar{reg_sign} L. (2) Compared to a sample of the WII Partial Solvent (without BOBCalixC6) archived before transfer to WII, the Reworked WII Solvent showed a significant improvement (i.e., nearly doubling) in the dispersion numbers for tests with simulated salt solution and with strip acid. Hence, the presence of the plasticizer impurity has no detrimental impact on phase separation. While there are no previous dispersion tests using the exact same materials, the results seem to indicate that the washing of the solvent gives a

  9. Robust Flutter Margin Analysis that Incorporates Flight Data

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Martin J.

    1998-01-01

    An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, mu, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The mu margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 Systems Research Aircraft using uncertainty sets generated by flight data analysis. The robust margins demonstrate that flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.
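
    For reference, the structured singular value underlying these margins is conventionally defined (textbook definition, not quoted from this report) as

        \mu_{\Delta}(M) = \left[ \min \left\{ \bar{\sigma}(\Delta) : \Delta \in \boldsymbol{\Delta},\ \det(I - M\Delta) = 0 \right\} \right]^{-1},

    with \mu_{\Delta}(M) = 0 if no \Delta in the uncertainty set makes I - M\Delta singular; the robust flutter margin then corresponds to the smallest-norm uncertainty that destabilizes the nominal model M.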

  10. Built Environment Analysis Tool: April 2013

    SciTech Connect

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built-environment scenarios on transportation energy use and greenhouse gas (GHG) emissions, and provides guidance on how to apply the tool.

  11. A Topography Analysis Incorporated Optimization Method for the Selection and Placement of Best Management Practices

    PubMed Central

    Shen, Zhenyao; Chen, Lei; Xu, Liang

    2013-01-01

    Best Management Practices (BMPs) are one of the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, the use of a topography analysis incorporated optimization method (TAIOM) was proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to obtain the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917
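
    As framed above, the optimization engine is a genetic algorithm over candidate BMP placements. A minimal sketch with an invented fitness function (pollutant reduction subject to a cost budget); TAIOM's actual inputs come from topography analysis and SWAT:

        import random

        REDUCTION = [5.0, 3.0, 8.0, 2.0]  # hypothetical load reduction per site
        COST = [1.0, 0.5, 2.5, 0.2]       # hypothetical relative cost per site
        BUDGET = 3.0

        def fitness(bits):
            cost = sum(c for b, c in zip(bits, COST) if b)
            if cost > BUDGET:
                return -1.0               # infeasible placement
            return sum(r for b, r in zip(bits, REDUCTION) if b)

        def evolve(pop_size=20, generations=50):
            n = len(COST)
            pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents, children = pop[:pop_size // 2], []
                while len(parents) + len(children) < pop_size:
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, n)
                    child = a[:cut] + b[cut:]       # one-point crossover
                    if random.random() < 0.1:       # mutation
                        i = random.randrange(n)
                        child[i] = 1 - child[i]
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        best = evolve()
        print(best, fitness(best))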

  12. ISSARS Aerosol Database : an Incorporation of Atmospheric Particles into a Universal Tool to Simulate Remote Sensing Instruments

    NASA Technical Reports Server (NTRS)

    Goetz, Michael B.

    2011-01-01

    The Instrument Simulator Suite for Atmospheric Remote Sensing (ISSARS) entered its third and final year of development with an overall goal of providing a unified tool to simulate active and passive spaceborne atmospheric remote sensing instruments. These simulations focus on the atmosphere ranging from UV to microwaves. ISSARS handles all assumptions and uses various models of scattering and microphysics to fill the gaps left unspecified by the atmospheric models and create each instrument's measurements. This will help benefit mission design and reduce mission cost, enable efficient implementation of multi-instrument/platform Observing System Simulation Experiments (OSSE), and improve existing models as well as new advanced models in development. In this effort, various aerosol particles are incorporated into the system, and simulations based on the input wavelength and the spectral refractive indices of each spherical test particle generate its scattering properties and phase functions. The atmospheric particles being integrated into the system comprise those observed by the Multi-angle Imaging SpectroRadiometer (MISR) and by the Multiangle SpectroPolarimetric Imager (MSPI). In addition, a complex scattering database generated by Prof. Ping Yang (Texas A&M) is also incorporated into this aerosol database. Future development with a radiative transfer code will generate a series of results that can be validated with results obtained by the MISR and MSPI instruments; nevertheless, test cases are simulated to determine the validity of the various plugin libraries used to determine or gather the scattering properties of particles studied by MISR and MSPI, or within the database of single-scattering properties of tri-axial ellipsoidal mineral dust particles created by Prof. Ping Yang.

  13. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
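
    The interpolation question above is answered by exactly this kind of refinement loop: re-evaluate on successively finer thickness grids until the answer stops changing. A sketch with a made-up dose-depth curve standing in for transport-code output:

        import numpy as np

        def dose(depth):
            # Placeholder physics, not a transport-code result.
            return np.exp(-depth / 5.0) + 0.01 * depth

        def converged_estimate(target_depth, tol=1e-4, n0=8, n_max=4096):
            prev, n = None, n0
            while n <= n_max:
                grid = np.linspace(0.0, 20.0, n)
                est = np.interp(target_depth, grid, dose(grid))
                if prev is not None and abs(est - prev) < tol:
                    return est, n   # this grid resolution suffices
                prev, n = est, 2 * n
            return prev, n_max

        print(converged_estimate(7.5))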

  14. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool

    PubMed Central

    Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi

    2016-01-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed well in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method had not been assessed, nor had it been implemented as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs well. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
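
    Principal angles between two subspaces are a standard computation (QR orthonormalization followed by an SVD); the sketch below shows that core step only, with random stand-in data, not PAEA's full enrichment pipeline:

        import numpy as np

        def principal_angles(A, B):
            # Columns of A and B span the two subspaces.
            Qa, _ = np.linalg.qr(A)
            Qb, _ = np.linalg.qr(B)
            sv = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
            return np.arccos(np.clip(sv, -1.0, 1.0))  # radians, ascending

        rng = np.random.default_rng(0)
        A = rng.standard_normal((100, 5))  # e.g., 100 genes x 5 components
        B = rng.standard_normal((100, 5))
        print(principal_angles(A, B))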

  15. Solar Array Verification Analysis Tool (SAVANT) Developed

    NASA Technical Reports Server (NTRS)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

  16. PyRAT - python radiography analysis tool (u)

    SciTech Connect

    Temple, Brian A; Buescher, Kevin L; Armstrong, Jerawan C

    2011-01-14

    PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on LINUX and Windows platforms. The tool is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process. The tool utilizes the NOMAD mixed-variable optimization tool to perform the optimization.

  17. Pathway Analysis Incorporating Protein-Protein Interaction Networks Identified Candidate Pathways for the Seven Common Diseases.

    PubMed

    Lin, Peng-Lin; Yu, Ya-Wen; Chung, Ren-Hua

    2016-01-01

    Pathway analysis has become popular as a secondary analysis strategy for genome-wide association studies (GWAS). Most of the current pathway analysis methods aggregate signals from the main effects of single nucleotide polymorphisms (SNPs) in genes within a pathway without considering the effects of gene-gene interactions. However, gene-gene interactions can also have critical effects on complex diseases. Protein-protein interaction (PPI) networks have been used to define gene pairs for the gene-gene interaction tests. Incorporating the PPI information to define gene pairs for interaction tests within pathways can increase the power for pathway-based association tests. We propose a pathway association test, which aggregates the interaction signals in PPI networks within a pathway, for GWAS with case-control samples. Gene size is properly considered in the test so that genes do not contribute more to the test statistic simply due to their size. Simulation studies were performed to verify that the method is a valid test and can have more power than other pathway association tests in the presence of gene-gene interactions within a pathway under different scenarios. We applied the test to the Wellcome Trust Case Control Consortium GWAS datasets for seven common diseases. The most significant pathway is the "chaperones modulate interferon signaling" pathway for Crohn's disease (p-value = 0.0003). The pathway modulates interferon gamma, which induces the JAK/STAT pathway that is involved in Crohn's disease. Several other pathways that have functional implications for the seven diseases were also identified. The proposed test based on gene-gene interaction signals in PPI networks can be used as a complementary tool to the current existing pathway analysis methods focusing on main effects of genes. An efficient software implementing the method is freely available at http://puppi.sourceforge.net. PMID:27622767

  18. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    SciTech Connect

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  19. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    PubMed

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings. PMID:18406925

  20. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    SciTech Connect

    Rath, Frank

    2008-05-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.

  1. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  2. MathWeb: a concurrent image analysis tool suite for multispectral data fusion

    NASA Astrophysics Data System (ADS)

    Achalakul, Tiranee; Haaland, Peter D.; Taylor, Stephen

    1999-03-01

    This paper describes a preliminary approach to the fusion of multi-spectral image data for the analysis of cervical cancer. The long-term goal of this research is to define spectral signatures and automatically detect cancer cell structures. The approach combines a multi-spectral microscope with an image analysis tool suite, MathWeb. The tool suite incorporates a concurrent Principal Component Transform (PCT) that is used to fuse the multi-spectral data. This paper describes the general approach and the concurrent PCT algorithm. The algorithm is evaluated from both the perspective of image quality and performance scalability.
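
    The PCT at the heart of the fusion step is an eigen-decomposition of the band covariance; a compact sketch follows (input shape and data are illustrative, and the concurrent/parallel aspects described in the paper are omitted):

        import numpy as np

        def pct(cube):
            # cube: (bands, height, width) multispectral image.
            bands, h, w = cube.shape
            X = cube.reshape(bands, -1).astype(float)
            X -= X.mean(axis=1, keepdims=True)
            cov = X @ X.T / (h * w - 1)
            vals, vecs = np.linalg.eigh(cov)      # ascending eigenvalues
            order = np.argsort(vals)[::-1]        # sort descending
            return (vecs[:, order].T @ X).reshape(bands, h, w)

        cube = np.random.rand(4, 32, 32)
        components = pct(cube)  # components[0] carries the most variance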

  3. Knowledge base navigator facilitating regional analysis inter-tool communication.

    SciTech Connect

    Hampton, Jeffery Wade; Chael, Eric Paul; Hart, Darren M.; Merchant, Bion John; Chown, Matthew N.

    2004-08-01

    To make use of some portions of the National Nuclear Security Administration (NNSA) Knowledge Base (KB) for which no current operational monitoring applications were available, Sandia National Laboratories have developed a set of prototype regional analysis tools (MatSeis, EventID Tool, CodaMag Tool, PhaseMatch Tool, Dendro Tool, Infra Tool, etc.), and we continue to maintain and improve these. Individually, these tools have proven effective in addressing specific monitoring tasks, but collectively their number and variety tend to overwhelm KB users, so we developed another application - the KB Navigator - to launch the tools and facilitate their use for real monitoring tasks. The KB Navigator is a flexible, extensible java application that includes a browser for KB data content, as well as support to launch any of the regional analysis tools. In this paper, we will discuss the latest versions of KB Navigator and the regional analysis tools, with special emphasis on the new overarching inter-tool communication methodology that we have developed to make the KB Navigator and the tools function together seamlessly. We use a peer-to-peer communication model, which allows any tool to communicate with any other. The messages themselves are passed as serialized XML, and the conversion from Java to XML (and vice versa) is done using Java Architecture for XML Binding (JAXB).

  4. Spectral Analysis Tool 6.2 for Windows

    NASA Technical Reports Server (NTRS)

    Morgan, Feiming; Sue, Miles; Peng, Ted; Tan, Harry; Liang, Robert; Kinman, Peter

    2006-01-01

    Spectral Analysis Tool 6.2 is the latest version of a computer program that assists in analysis of interference between radio signals of the types most commonly used in Earth/spacecraft radio communications. [An earlier version was reported in Software for Analyzing Earth/Spacecraft Radio Interference (NPO-20422), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 52.] SAT 6.2 calculates signal spectra, bandwidths, and interference effects for several families of modulation schemes. Several types of filters can be modeled, and the program calculates and displays signal spectra after filtering by any of the modeled filters. The program accommodates two simultaneous signals: a desired signal and an interferer. The interference-to-signal power ratio can be calculated for the filtered desired and interfering signals. Bandwidth-occupancy and link-budget calculators are included for the user's convenience. SAT 6.2 has a new software structure and provides a new user interface that is both intuitive and convenient. SAT 6.2 incorporates multi-tasking, multi-threaded execution, virtual memory management, and a dynamic link library. SAT 6.2 is designed for use on 32- bit computers employing Microsoft Windows operating systems.

  5. ProMAT: protein microarray analysis tool

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Varnum, Susan M.; Anderson, Kevin K.; Bollinger, Nikki; Zangar, Richard C.

    2006-04-04

    Summary: ProMAT is a software tool for statistically analyzing data from ELISA microarray experiments. The software estimates standard curves, sample protein concentrations and their uncertainties for multiple assays. ProMAT generates a set of comprehensive figures for assessing results and diagnosing process quality. The tool is available for Windows or Mac, and is distributed as open-source Java and R code. Availability: ProMAT is available at http://www.pnl.gov/statistics/ProMAT. ProMAT requires Java version 1.5.0 and R version 1.9.1 (or more recent versions) which are distributed with the tool.
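
    A common choice for ELISA standard curves is the four-parameter logistic (4PL) model; the sketch below fits one and inverts it to recover a concentration. ProMAT itself is Java/R, so this Python version, with invented calibration data, is only an illustration of the idea:

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(x, a, b, c, d):
            return d + (a - d) / (1.0 + (x / c) ** b)

        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])     # standards
        signal = np.array([0.05, 0.12, 0.40, 1.1, 1.9, 2.3])  # measured
        (a, b, c, d), _ = curve_fit(four_pl, conc, signal,
                                    p0=[0.01, 1.0, 2.0, 2.5], maxfev=10000)

        def invert(y):
            # Solve the fitted 4PL for concentration given a signal.
            return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

        print(invert(1.0))  # estimated concentration at signal 1.0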

  6. Ball Bearing Analysis with the ORBIS Tool

    NASA Technical Reports Server (NTRS)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with reference predictions of bearing internal load distributions, stiffness, deflection and stresses.

  7. Usage-Based Evolution of Visual Analysis Tools

    SciTech Connect

    Hetzler, Elizabeth G.; Rose, Stuart J.; McQuerry, Dennis L.; Medvick, Patricia A.

    2005-06-12

    Visual analysis tools have been developed to help people in many different domains more effectively explore, understand, and make decisions from their information. Challenges in making a successful tool include suitability within a user's work processes, and tradeoffs between analytic power and tool complexity, both of which impact ease of learning. This paper describes experience working with users to help them apply visual analysis tools in several different domains, and examples of how the tools evolved significantly to better match users' goals and processes.

  8. From Numerical Problem Solving to Model-Based Experimentation Incorporating Computer-Based Tools of Various Scales into the ChE Curriculum

    ERIC Educational Resources Information Center

    Shacham, Mordechai; Cutlip, Michael B.; Brauner, Neima

    2009-01-01

    A continuing challenge to the undergraduate chemical engineering curriculum is the time-effective incorporation and use of computer-based tools throughout the educational program. Computing skills in academia and industry require some proficiency in programming and effective use of software packages for solving 1) single-model, single-algorithm…

  9. Tools for Knowledge Analysis, Synthesis, and Sharing

    NASA Astrophysics Data System (ADS)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  10. Friction analysis between tool and chip

    NASA Astrophysics Data System (ADS)

    Wang, Min; Xu, Binshi; Zhang, Jiaying; Dong, Shiyun

    2010-12-01

    Elastic-plasticity mechanics is applied to analyze the friction between tool and chip. According to slip-line field theory, a series of theoretical formulas and the friction coefficient between the tool and chip are derived, so that the cutting process can be investigated. Based on the orthogonal cutting model and Mohr's stress circle, the cutting mechanism of the cladding and the surface integrity of machining can be studied.

  11. Friction analysis between tool and chip

    NASA Astrophysics Data System (ADS)

    Wang, Min; Xu, Binshi; Zhang, Jiaying; Dong, Shiyun

    2011-05-01

    Elastic-plasticity mechanics is applied to analyze the friction between tool and chip. According to slip-line field theory, a series of theoretical formulas and the friction coefficient between the tool and chip are derived, so that the cutting process can be investigated. Based on the orthogonal cutting model and Mohr's stress circle, the cutting mechanism of the cladding and the surface integrity of machining can be studied.

  12. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    NASA Technical Reports Server (NTRS)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  13. Integrated Turbopump Thermo-Mechanical Design and Analysis Tools

    NASA Astrophysics Data System (ADS)

    Platt, Mike

    2002-07-01

    This viewgraph presentation provides information on the thermo-mechanical design and analysis tools used to control the steady and transient thermo-mechanical effects which drive life, reliability, and cost. The thermo-mechanical analysis tools provide upfront design capability by effectively leveraging existing component design tools to analyze and control: fits, clearance, preload; cooling requirements; stress levels, LCF (low cycle fatigue) limits, and HCF (high cycle fatigue) margin.

  14. Interactive Graphics Tools for Analysis of MOLA and Other Data

    NASA Technical Reports Server (NTRS)

    Frey, H.; Roark, J.; Sakimoto, S.

    2000-01-01

    We have developed several interactive analysis tools based on the IDL programming language for the analysis of Mars Orbiting Laser Altimeter (MOLA) profile and gridded data which are available to the general community.

  15. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  16. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
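
    The leapfrog E/H update at the core of any FDTD solver fits in a few lines; this bare-bones 1-D Cartesian sketch (normalized units, soft Gaussian source) only illustrates the scheme, whereas the report's solver uses spherical-coordinate equations:

        import numpy as np

        nz, nt = 400, 600
        c, dz = 3e8, 1e-3
        dt = dz / (2 * c)          # within the Courant stability limit
        coeff = c * dt / dz
        ez = np.zeros(nz)          # electric field samples
        hy = np.zeros(nz - 1)      # magnetic field, staggered half a cell

        for n in range(nt):
            hy += coeff * (ez[1:] - ez[:-1])        # H update
            ez[1:-1] += coeff * (hy[1:] - hy[:-1])  # E update
            ez[nz // 4] += np.exp(-((n - 60) / 20.0) ** 2)  # source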

  17. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  18. A Multidimensional Analysis Tool for Visualizing Online Interactions

    ERIC Educational Resources Information Center

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  19. Using Kepler for Tool Integration in Microarray Analysis Workflows

    PubMed Central

    Gan, Zhuohui; Stowe, Jennifer C.; Altintas, Ilkay; McCulloch, Andrew D.; Zambon, Alexander C.

    2015-01-01

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed in different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools including Bioconductor packages, AltAnalyze, a python-based open source tool, and R-based comparison tool to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines. PMID:26605000

  20. Statistical methods for the forensic analysis of striated tool marks

    SciTech Connect

    Hoeksema, Amy Beth

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
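
    As a heavily simplified stand-in for the likelihood-ratio machinery described above, one can compare the two samples of similarity scores directly; the scores below are simulated placeholders, not profilometry data:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        lab_vs_lab = rng.normal(0.80, 0.05, size=25)    # same-tool baseline comparisons
        lab_vs_field = rng.normal(0.74, 0.05, size=10)  # questioned comparisons
        t, p = stats.ttest_ind(lab_vs_lab, lab_vs_field, equal_var=True)
        print(f"t = {t:.2f}, p = {p:.3f}")  # small p argues against a match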

  1. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    NASA Technical Reports Server (NTRS)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

  2. Tools for Knowledge Analysis, Synthesis, and Sharing

    ERIC Educational Resources Information Center

    Medland, Michael B.

    2007-01-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own…

  3. An Integrated Tool for System Analysis of Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  4. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be

  5. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be

  6. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
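
    One of the studies described above, detection sensitivity to the loss of a sensor or test, reduces to a simple set computation over a test-to-failure-mode coverage table. The table below is a made-up stand-in, not TEAMS Designer or ETA Tool data, and the sketch only illustrates the idea.

        # Hypothetical coverage table: which failure modes each test can detect.
        coverage = {
            "test_pressure": {"valve_stuck", "seal_leak"},
            "test_vibration": {"bearing_wear"},
            "test_current": {"motor_short", "bearing_wear"},
        }
        all_modes = set().union(*coverage.values())

        def detection_coverage(tests):
            """Fraction of failure modes detected by the given set of tests."""
            detected = set().union(*(coverage[t] for t in tests)) if tests else set()
            return len(detected) / len(all_modes)

        baseline = detection_coverage(list(coverage))
        for lost in coverage:  # sensitivity to losing each test or its sensor
            remaining = [t for t in coverage if t != lost]
            print(lost, round(detection_coverage(remaining) - baseline, 3))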

  7. Development of wavelet analysis tools for turbulence

    NASA Technical Reports Server (NTRS)

    Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.

    1992-01-01

    Presented here are the general framework and initial results of a joint effort to derive novel research tools and easy-to-use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.
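
    For a flavor of the kind of decomposition such tools build on, here is a minimal pure-NumPy multi-level Haar wavelet transform applied to a noisy test signal. The effort described above used more sophisticated wavelet machinery; this is only the simplest possible illustration.

        import numpy as np

        def haar_decompose(signal, levels):
            """Multi-level Haar wavelet decomposition (length divisible by 2**levels)."""
            approx, details = np.asarray(signal, dtype=float), []
            for _ in range(levels):
                even, odd = approx[0::2], approx[1::2]
                details.append((even - odd) / np.sqrt(2))  # detail (high-pass) part
                approx = (even + odd) / np.sqrt(2)         # approximation (low-pass) part
            return approx, details

        t = np.linspace(0, 1, 512, endpoint=False)
        u = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
        approx, details = haar_decompose(u, levels=4)
        for k, d in enumerate(details, start=1):
            print(f"level {k}: detail energy = {np.sum(d**2):.3f}")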

  8. Graphical Acoustic Liner Design and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
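
    The one-dimensional transmission line model mentioned above can be illustrated with the standard lossless impedance-translation relation for a channel of given length and termination. The geometry and numbers below are illustrative assumptions, not the patented tool's implementation.

        import numpy as np

        def input_impedance(z_load, z0, k, length):
            """Impedance looking into a 1-D channel of given length terminated by
            z_load (standard lossless transmission-line relation)."""
            t = np.tan(k * length)
            return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

        c, rho = 343.0, 1.21           # speed of sound (m/s) and air density (kg/m^3)
        f = 1000.0                     # frequency of interest, Hz
        k = 2 * np.pi * f / c          # acoustic wavenumber
        z0 = rho * c                   # characteristic impedance of the channel
        z_rigid = 1e9                  # near-rigid backing, approximated by a large value
        print(input_impedance(z_rigid, z0, k, length=0.02))  # ~ -j*z0/tan(kL)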

  9. Incorporation of Data Analysis throughout the ChE Curriculum Made Easy with DataFit

    ERIC Educational Resources Information Center

    Brenner, James R.

    2007-01-01

    At Florida Tech, we have incorporated DataFit from Oakdale Engineering throughout the entire curriculum, beginning with ChE 1102, an eight-week, one-day-per-week, two-hour, one-credit-hour, second-semester Introduction to Chemical Engineering course in a hands-on computer classroom. Our experience is that students retain data analysis concepts…

  10. Incorporation of wind generation to the Mexican power grid: Steady state analysis

    SciTech Connect

    Tovar, J.H.; Guardado, J.L.; Cisneros, F.; Cadenas, R.; Lopez, S.

    1997-09-01

    This paper describes a steady-state analysis of the incorporation of large amounts of wind (eolic) generation into the Mexican power system. An equivalent node is used to represent the individual wind generators in a wind farm. Possible overloads, losses, voltage and reactive-power profiles, and estimated severe contingencies are analyzed. Finally, the conclusions of this study are presented.

  11. Design tools for daylighting illumination and energy analysis

    SciTech Connect

    Selkowitz, S.

    1982-07-01

    This paper reviews the problems and potentials of using daylighting to provide illumination in building interiors. It describes some of the design tools now or soon to be available for incorporating daylighting into the building design process. It also describes state-of-the-art methods for analyzing the impacts daylighting can have on the selection of lighting controls, lighting energy consumption, heating and cooling loads, and peak power demand.

  12. New Access and Analysis Tools for Voyager LECP Data

    NASA Astrophysics Data System (ADS)

    Brown, L. E.; Hill, M. E.; Decker, R. B.; Cooper, J. F.; Krimigis, S. M.; Vandegriff, J. D.

    2008-12-01

    The Low Energy Charged Particle (LECP) instruments on the Voyager 1 and 2 spacecraft have been returning unique scientific measurements since launching in 1977, most notably observations from the historic tour of the giant planets. As these spacecraft continue on their exit trajectories from the solar system, they have become an interstellar mission and have begun to probe the boundary between the heliosphere and the interstellar cloud, continuing to make exciting discoveries. As the mission changed from one focused on discrete encounters to an open-ended search for heliospheric boundaries and transitory disturbances, the positions and timing of which are not known, the data processing needs have changed. Open data policies and the push to draw data under the umbrella of emerging Virtual Observatories have added a data sharing component that was not a part of the original mission plans. We present our work in utilizing new, reusable software analysis tools to access legacy data in a way that leverages pre-existing data analysis techniques. We took an existing Applied Physics Laboratory application, the Mission Independent Data Layer (MIDL) -- developed originally under the NASA Applied Information Systems Research Program (AISRP) and subsequently used with data from Geotail, Cassini, IMP-8, ACE, Messenger, and New Horizons -- and applied it to Voyager data. We use the MIDL codebase to automatically generate standard data products such as daily summary plots and associated tabulated data that increase our ability to monitor the heliospheric environment on a regular basis. These data products will be publicly available and updated automatically and can be analyzed by the community using the ultra-portable MIDL software launched from the data distribution website. The currently available LECP data will also be described with SPASE metadata and incorporated into the emerging Virtual Energetic Particle Observatory (VEPO).

  13. SPLAT-VO: Spectral Analysis Tool for the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Castro-Neves, Margarida; Draper, Peter W.

    2014-02-01

    SPLAT-VO is an extension of the SPLAT (Spectral Analysis Tool, ascl:1402.007) graphical tool for displaying, comparing, modifying and analyzing astronomical spectra; it includes facilities that allow it to work as part of the Virtual Observatory (VO). SPLAT-VO comes in two different forms, one for querying and downloading spectra from SSAP servers and one for interoperating with VO tools, such as TOPCAT (ascl:1101.010).

  14. Klonos: A Similarity Analysis Based Tool for Software Porting

    Energy Science and Technology Software Center (ESTSC)

    2014-07-30

    Klonos is a compiler-based tool that helps users port scientific applications. The tool is based on similarity analysis performed with the help of the OpenUH compiler (a branch of the Open64 compiler). It combines syntactic and cost-model-provided metrics to cluster similar subroutines that can be ported similarly. The generated porting plan allows programmers and compilers to reuse porting experience as much as possible during the porting process.

  15. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2014-10-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iteration algorithm implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is made by repeating steps of the present MARS algorithm using parallel libraries and procedures. Initial results of the code parallelization will be reported. Work is supported by the U.S. DOE SBIR program.
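
    Inverse iteration, named above as the eigenvalue algorithm being parallelized, is a standard method: repeatedly solve with a shifted operator so that the eigencomponent nearest the shift dominates. A serial NumPy sketch follows; it bears no resemblance to MARS's large generalized problem, and the dense solve stands in for a factorization that would be reused across iterations.

        import numpy as np

        def inverse_iteration(A, sigma, tol=1e-10, max_iter=200):
            """Find the eigenpair of symmetric A whose eigenvalue is closest to sigma."""
            M = A - sigma * np.eye(A.shape[0])  # shifted operator (factor once in practice)
            x = np.random.default_rng(0).standard_normal(A.shape[0])
            lam = sigma
            for _ in range(max_iter):
                y = np.linalg.solve(M, x)       # one inverse-iteration step
                x = y / np.linalg.norm(y)
                lam_new = x @ A @ x             # Rayleigh-quotient eigenvalue estimate
                if abs(lam_new - lam) < tol:
                    break
                lam = lam_new
            return lam, x

        A = np.diag([1.0, 3.0, 7.0]) + 0.01 * np.ones((3, 3))
        print(inverse_iteration(A, sigma=2.9)[0])  # converges to the eigenvalue near 3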

  16. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2013-10-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iteration algorithm implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is made by repeating steps of the present MARS algorithm using parallel libraries and procedures. Preliminary results of the code parallelization will be reported. Work is supported by the U.S. DOE SBIR program.

  17. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2015-11-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iteration algorithm implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is made by repeating steps of the present MARS algorithm using parallel libraries and procedures. Results of MARS parallelization and of the development of a new fixed-boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.

  18. Incorporation of Uncertainty and Variability of Drip Shield and Waste Package Degradation in WAPDEG Analysis

    SciTech Connect

    J.C. Helton

    2000-04-19

    This presentation investigates the incorporation of uncertainty and variability of drip shield and waste package degradation in analyses with the Waste Package Degradation (WAPDEG) program (CRWMS M&O 1998). This plan was developed in accordance with Development Plan TDP-EBS-MD-000020 (CRWMS M&O 1999a). Topics considered include (1) the nature of uncertainty and variability (Section 6.1), (2) incorporation of variability and uncertainty into analyses involving individual patches, waste packages, groups of waste packages, and the entire repository (Section 6.2), (3) computational strategies (Section 6.3), (4) incorporation of multiple waste package layers (i.e., drip shield, Alloy 22, and stainless steel) into an analysis (Section 6.4), (5) uncertainty in the characterization of variability (Section 6.5), and (6) Gaussian variance partitioning (Section 6.6). The presentation ends with a brief concluding discussion (Section 7).
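
    The separation drawn above between uncertainty (epistemic, reducible) and variability (aleatory, irreducible) is commonly operationalized as a two-loop Monte Carlo, with the variance split by the law of total variance. The degradation-rate distributions below are invented purely for illustration and have no connection to the WAPDEG models.

        import numpy as np

        rng = np.random.default_rng(42)
        n_outer, n_inner = 200, 500  # epistemic (outer) and aleatory (inner) sample sizes

        cond_means = np.empty(n_outer)
        cond_vars = np.empty(n_outer)
        for i in range(n_outer):
            mu = rng.lognormal(0.0, 0.2)  # epistemic: uncertain mean degradation rate
            rates = rng.lognormal(np.log(mu), 0.3, n_inner)  # aleatory: item-to-item spread
            cond_means[i] = rates.mean()
            cond_vars[i] = rates.var()

        # Law of total variance: Var = E[Var | epistemic] + Var[E | epistemic]
        print("variability contribution:", cond_vars.mean())
        print("uncertainty contribution:", cond_means.var())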

  19. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  20. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    ERIC Educational Resources Information Center

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  1. JAVA based LCD Reconstruction and Analysis Tools

    SciTech Connect

    Bower, G.

    2004-10-11

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  2. The environment power system analysis tool development program

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.

  3. HISTORICAL ANALYSIS OF ECOLOGICAL EFFECTS: A USEFUL EDUCATIONAL TOOL

    EPA Science Inventory

    An historical analysis that presents the ecological consequences of development can be a valuable educational tool for citizens, students, and environmental managers. In highly impacted areas, the cumulative impacts of multiple stressors can result in complex environmental condit...

  4. Analysis tools for turbulence studies at Alcator C-Mod

    NASA Astrophysics Data System (ADS)

    Burns, C.; Shehata, S.; White, A. E.; Cziegler, I.; Dominguez, A.; Terry, J. L.; Pace, D. C.

    2010-11-01

    A new suite of analysis tools written in IDL is being developed to support experimental investigation of turbulence at Alcator C-Mod. The tools include GUIs for spectral analysis (coherence, cross-phase and bicoherence) and characteristic frequency calculations. A user-friendly interface for the GENRAY code, to facilitate between-shot ray-tracing analysis, is also being developed. The spectral analysis tool is being used to analyze data from existing edge turbulence diagnostics, such as the O-mode correlation reflectometer and Gas Puff Imaging, during I-mode, ITB and EDA H-mode plasmas. GENRAY and the characteristic frequency tool are being used to study diagnostic accessibility limits set by wave propagation and refraction for X-mode Doppler Backscattering and Correlation Electron Cyclotron Emission (CECE) systems that are being planned for core turbulence studies at Alcator C-Mod.
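
    Coherence and cross-phase between two fluctuating channels, of the kind the GUI described above computes, can be estimated with standard Welch-type spectral routines. The signals below are synthetic stand-ins for diagnostic data, not anything from Alcator C-Mod.

        import numpy as np
        from scipy import signal

        fs = 1e6                               # sample rate, Hz (illustrative)
        t = np.arange(200_000) / fs
        rng = np.random.default_rng(5)
        common = np.sin(2 * np.pi * 50e3 * t)  # shared 50 kHz fluctuation
        x = common + rng.standard_normal(t.size)               # channel 1
        y = np.roll(common, 3) + rng.standard_normal(t.size)   # delayed channel 2

        f, cxy = signal.coherence(x, y, fs=fs, nperseg=4096)   # magnitude-squared coherence
        _, pxy = signal.csd(x, y, fs=fs, nperseg=4096)         # cross-spectral density
        cross_phase = np.angle(pxy)                            # cross-phase spectrum
        print(f"peak coherence {cxy.max():.2f} at {f[np.argmax(cxy)] / 1e3:.0f} kHz")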

  5. Breads enriched with guava flour as a tool for studying the incorporation of phenolic compounds in bread melanoidins.

    PubMed

    Alves, Genilton; Perrone, Daniel

    2015-10-15

    In the present study we investigated, for the first time, the incorporation of phenolic compounds into bread melanoidins. Fermentation significantly affected the phenolics profile of bread doughs. Melanoidin contents continuously increased from 24.1 mg/g to 71.9 mg/g during baking, but their molecular weight decreased at the beginning of the process and increased thereafter. Enrichment of white wheat bread with guava flour increased the incorporation of phenolic compounds up to 2.4-fold. Most phenolic compounds showed higher incorporation than release rates during baking, leading to increases from 3.3- to 13.3-fold in total melanoidin-bound phenolics. Incorporation patterns suggested that the phenolic hydroxyls, but not the glycosidic bonds, of melanoidin-bound phenolics are cleaved during thermal processing. The antioxidant capacity of bread melanoidins increased with guava flour enrichment and longer baking periods and was partially attributed to bound phenolics. Moreover, the FRAP assay was more sensitive for measuring this parameter than the TEAC assay. PMID:25952842

  6. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to further both the theoretical understanding of, and the development of computer tools (algorithms) for, separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas -- the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  7. Incorporation of Multi-Member Substructure Capabilities in FAST for Analysis of Offshore Wind Turbines: Preprint

    SciTech Connect

    Song, H.; Robertson, A.; Jonkman, J.; Sewell, D.

    2012-05-01

    FAST, developed by the National Renewable Energy Laboratory (NREL), is an aero-hydro-servo-elastic tool widely used for analyzing onshore and offshore wind turbines. This paper discusses recent modifications made to FAST to enable the examination of offshore wind turbines with fixed-bottom, multi-member support structures (which are commonly used in transitional-depth waters). This paper addresses the methods used for incorporating the hydrostatic and hydrodynamic loading on multi-member structures in FAST through its hydrodynamic loading module, HydroDyn. Modeling of the hydrodynamic loads was accomplished through the incorporation of Morison and buoyancy loads on the support structures. Issues addressed include how to model loads at the joints of intersecting members and on tapered and tilted members of the support structure. Three example structures are modeled to test and verify the solutions generated by the modifications to HydroDyn, including a monopile, tripod, and jacket structure. Verification is achieved through comparison of the results to a computational fluid dynamics (CFD)-derived solution using the commercial software tool STAR-CCM+.
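
    The Morison loading mentioned above is the sum of a drag term and an inertia term per unit length of a submerged cylindrical member. The coefficients and wave numbers in this sketch are illustrative assumptions only, not HydroDyn's defaults or its full kinematics.

        import numpy as np

        def morison_force(u, u_dot, D, rho=1025.0, cd=1.0, cm=2.0):
            """Morison equation: inline force per unit length on a cylinder from
            water particle velocity u and acceleration u_dot (drag + inertia)."""
            area = np.pi * D**2 / 4.0
            drag = 0.5 * rho * cd * D * u * np.abs(u)
            inertia = rho * cm * area * u_dot
            return drag + inertia

        # Linear (Airy) wave kinematics at one elevation, illustrative numbers only.
        H, T, D = 2.0, 8.0, 1.5  # wave height (m), period (s), member diameter (m)
        omega = 2 * np.pi / T
        t = np.linspace(0, T, 200)
        u = omega * (H / 2) * np.cos(omega * t)          # horizontal velocity
        u_dot = -omega**2 * (H / 2) * np.sin(omega * t)  # horizontal acceleration
        print("peak force per unit length:", morison_force(u, u_dot, D).max(), "N/m")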

  8. Capabilities of the analysis tools of the IMPEx infrastructure

    NASA Astrophysics Data System (ADS)

    Génot, V.; Khodachenko, M. L.; Kallio, E. J.; Topf, F.; Al-Ubaidi, T.; Gangloff, M.; Budnik, E.; Bouchemit, M.; Renard, B.; Bourel, N.; Penou, E.; André, N.; Modolo, R.; Hess, S.; Schmidt, W.; Alexeev, I. I.; Belenkaya, E. S.

    2012-09-01

    The EU-FP7 Project "Integrated Medium for Planetary Exploration" was established as a result of scientific collaboration between institutions across Europe and is working on the integration of a set of interactive data analysis and modeling tools in the field of space plasma and planetary physics. According to [1], these tools comprise AMDA, Clweb and 3DView from the data analysis and visualisation sector, as well as Hybrid/MHD and Paraboloid magnetospheric models from the simulation sector. This presentation focuses on how these various tools will access observational and modeled data and display them in innovative and interactive ways.

  9. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
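
    The executive pattern described above, one coordinating module calling disciplinary analyses as black boxes under an optimizer, can be sketched compactly. The disciplinary functions and numbers here are stand-ins, and the real CEM drives external codes through system calls rather than Python callables.

        from scipy.optimize import minimize

        # Stand-ins for the disciplinary analysis codes an executive would invoke.
        def structural_weight(x):
            return x[0]**2 + x[1]**2      # objective: a weight-like performance index

        def stress_margin(x):
            return 1.0 - (x[0] - 2.0)**2  # constraint: must remain >= 0

        def flutter_margin(x):
            return x[1] - 0.5             # constraint: must remain >= 0

        result = minimize(
            structural_weight,
            x0=[2.0, 1.0],                # starting design variables
            constraints=[{"type": "ineq", "fun": stress_margin},
                         {"type": "ineq", "fun": flutter_margin}],
        )
        print(result.x, result.fun)       # optimum near x = [1.0, 0.5]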

  10. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    SciTech Connect

    Bush, B.; Penev, M.; Melaina, M.; Zuboy, J.

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
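
    The core of any such station-level financial analysis is discounted cash flow. A minimal sketch of a net-present-value calculation follows; the station numbers are invented and do not reflect H2FAST's inputs, defaults, or accounting detail.

        def npv(rate, cash_flows):
            """Net present value of yearly cash flows, first entry at year 0."""
            return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

        # Illustrative fueling-station economics (hypothetical numbers only):
        capital = -1_200_000                   # year-0 investment
        annual_net = 180_000                   # yearly revenue minus operating cost
        flows = [capital] + [annual_net] * 15  # 15-year analysis period
        print(f"NPV at an 8% discount rate: ${npv(0.08, flows):,.0f}")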

  11. Opportunities and strategies to incorporate ecosystem services knowledge and decision support tools into planning and decision making in Hawai'i.

    PubMed

    Bremer, Leah L; Delevaux, Jade M S; Leary, James J K; J Cox, Linda; Oleson, Kirsten L L

    2015-04-01

    Incorporating ecosystem services into management decisions is a promising means to link conservation and human well-being. Nonetheless, planning and management in Hawai'i, a state with highly valued natural capital, has yet to broadly utilize an ecosystem service approach. We conducted a stakeholder assessment, based on semi-structured interviews, with terrestrial (n = 26) and marine (n = 27) natural resource managers across the State of Hawai'i to understand the current use of ecosystem services (ES) knowledge and decision support tools and whether, how, and under what contexts, further development would potentially be useful. We found that ES knowledge and tools customized to Hawai'i could be useful for communication and outreach, justifying management decisions, and spatial planning. Greater incorporation of this approach is clearly desired and has a strong potential to contribute to more sustainable decision making and planning in Hawai'i and other oceanic island systems. However, the unique biophysical, socio-economic, and cultural context of Hawai'i, and other island systems, will require substantial adaptation of existing ES tools. Based on our findings, we identified four key opportunities for the use of ES knowledge and tools in Hawai'i: (1) linking native forest protection to watershed health; (2) supporting sustainable agriculture; (3) facilitating ridge-to-reef management; and (4) supporting statewide terrestrial and marine spatial planning. Given the interest expressed by natural resource managers, we envision broad adoption of ES knowledge and decision support tools if knowledge and tools are tailored to the Hawaiian context and coupled with adequate outreach and training. PMID:25651801

  12. Opportunities and Strategies to Incorporate Ecosystem Services Knowledge and Decision Support Tools into Planning and Decision Making in Hawai'i

    NASA Astrophysics Data System (ADS)

    Bremer, Leah L.; Delevaux, Jade M. S.; Leary, James J. K.; J. Cox, Linda; Oleson, Kirsten L. L.

    2015-04-01

    Incorporating ecosystem services into management decisions is a promising means to link conservation and human well-being. Nonetheless, planning and management in Hawai'i, a state with highly valued natural capital, has yet to broadly utilize an ecosystem service approach. We conducted a stakeholder assessment, based on semi-structured interviews, with terrestrial (n = 26) and marine (n = 27) natural resource managers across the State of Hawai'i to understand the current use of ecosystem services (ES) knowledge and decision support tools and whether, how, and under what contexts, further development would potentially be useful. We found that ES knowledge and tools customized to Hawai'i could be useful for communication and outreach, justifying management decisions, and spatial planning. Greater incorporation of this approach is clearly desired and has a strong potential to contribute to more sustainable decision making and planning in Hawai'i and other oceanic island systems. However, the unique biophysical, socio-economic, and cultural context of Hawai'i, and other island systems, will require substantial adaptation of existing ES tools. Based on our findings, we identified four key opportunities for the use of ES knowledge and tools in Hawai'i: (1) linking native forest protection to watershed health; (2) supporting sustainable agriculture; (3) facilitating ridge-to-reef management; and (4) supporting statewide terrestrial and marine spatial planning. Given the interest expressed by natural resource managers, we envision broad adoption of ES knowledge and decision support tools if knowledge and tools are tailored to the Hawaiian context and coupled with adequate outreach and training.

  13. A 3D image analysis tool for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.

    2005-04-01

    We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.
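
    The semi-automated mode described above rests on intensity thresholding followed by voxel counting, which reduces to a few NumPy operations. The image, threshold, and voxel size below are synthetic and arbitrary, chosen only to make the sketch self-contained.

        import numpy as np

        def segment_volume(image, threshold, voxel_mm3):
            """Binary intensity threshold of a 3-D image; return mask and volume in mL."""
            mask = image >= threshold
            return mask, mask.sum() * voxel_mm3 / 1000.0

        rng = np.random.default_rng(1)
        img = rng.random((64, 64, 64))     # stand-in for a SPECT volume
        img[20:40, 20:40, 20:40] += 2.0    # bright "organ" region
        mask, volume_ml = segment_volume(img, threshold=1.5, voxel_mm3=4.0**3)
        print(f"segmented volume: {volume_ml:.1f} mL")  # ~20^3 voxels at 64 mm^3 each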

  14. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
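
    A simple way to point an analyst toward the inputs behind a failure mode is to rank the dispersed variables by how far their distributions shift between failed and passed cases. This standardized-mean-shift screen is a generic stand-in for illustration, not TRAM's actual method.

        import numpy as np

        def rank_driving_variables(X, failed, names):
            """Rank input variables by the standardized shift of their means
            between failed and passed Monte Carlo cases."""
            shift = np.abs(X[failed].mean(axis=0) - X[~failed].mean(axis=0))
            score = shift / (X.std(axis=0) + 1e-12)
            order = np.argsort(score)[::-1]
            return [(names[i], score[i]) for i in order]

        rng = np.random.default_rng(7)
        X = rng.standard_normal((10_000, 3))  # toy dispersed inputs
        failed = X[:, 1] > 1.5                # failures driven by the second variable
        print(rank_driving_variables(X, failed, ["drag_coeff", "mass", "wind"]))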

  15. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  16. SOFAST: Sandia Optical Fringe Analysis Slope Tool

    Energy Science and Technology Software Center (ESTSC)

    2015-10-20

    SOFAST is used to characterize the surface slope of reflective mirrors for solar applications. SOFAST uses a large monitor or projections screen to display fringe patterns, and a machine vision camera to image the reflection of these patterns in the subject mirror. From these images, a detailed map of surface normals can be generated and compared to design or fitted mirror shapes. SOFAST uses standard Fringe Reflection (Deflectometry) approaches to measure the mirror surface normals. SOFAST uses an extrinsic analysis of key points on the facet to locate the camera and monitor relative to the facet coordinate system. It then refines this position based on the measured surface slope and integrated shape of the mirror facet. The facet is placed into a reference frame such that key points on the facet match the design facet in orientation and position. This is key to evaluating a facet as suitable for a specific solar application. SOFAST reports the measurements of the facet as detailed surface normal location in a format suitable for ray tracing optical analysis codes. SOFAST also reports summary information as to the facet fitted shape (monomial) and error parameters. Useful plots of the error distribution are also presented.
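
    The geometric heart of fringe reflection is the law-of-reflection constraint: the surface normal at a measured point bisects the direction to the camera and the direction to the screen pixel seen reflected there. A minimal sketch with made-up coordinates and none of the calibration refinement described above:

        import numpy as np

        def surface_normal(point, camera, screen_pixel):
            """Deflectometry geometry: the mirror normal at a surface point bisects
            the rays from the point to the camera and to the reflected screen pixel."""
            to_cam = (camera - point) / np.linalg.norm(camera - point)
            to_scr = (screen_pixel - point) / np.linalg.norm(screen_pixel - point)
            n = to_cam + to_scr
            return n / np.linalg.norm(n)

        p = np.array([0.0, 0.0, 0.0])    # point on the facet surface
        cam = np.array([0.0, 0.0, 2.0])  # camera position (m)
        pix = np.array([0.4, 0.0, 2.0])  # screen pixel observed reflected at p
        print(surface_normal(p, cam, pix))  # normal tilted slightly toward +x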

  17. SOFAST: Sandia Optical Fringe Analysis Slope Tool

    SciTech Connect

    Andraka, Charles E.

    2015-10-20

    SOFAST is used to characterize the surface slope of reflective mirrors for solar applications. SOFAST uses a large monitor or projections screen to display fringe patterns, and a machine vision camera to image the reflection of these patterns in the subject mirror. From these images, a detailed map of surface normals can be generated and compared to design or fitted mirror shapes. SOFAST uses standard Fringe Reflection (Deflectometry) approaches to measure the mirror surface normals. SOFAST uses an extrinsic analysis of key points on the facet to locate the camera and monitor relative to the facet coordinate system. It then refines this position based on the measured surface slope and integrated shape of the mirror facet. The facet is placed into a reference frame such that key points on the facet match the design facet in orientation and position. This is key to evaluating a facet as suitable for a specific solar application. SOFAST reports the measurements of the facet as detailed surface normal location in a format suitable for ray tracing optical analysis codes. SOFAST also reports summary information as to the facet fitted shape (monomial) and error parameters. Useful plots of the error distribution are also presented.

  18. RCytoscape: tools for exploratory network analysis

    PubMed Central

    2013-01-01

    Background: Biomolecular pathways and networks are dynamic and complex, and the perturbations to them which cause disease are often multiple, heterogeneous and contingent. Pathway and network visualizations, rendered on a computer or published on paper, however, tend to be static, lacking in detail, and ill-equipped to explore the variety and quantities of data available today, and the complex causes we seek to understand. Results: RCytoscape integrates R (an open-ended programming environment rich in statistical power and data-handling facilities) and Cytoscape (powerful network visualization and analysis software). RCytoscape extends Cytoscape's functionality beyond what is possible with the Cytoscape graphical user interface. To illustrate the power of RCytoscape, a portion of the Glioblastoma multiforme (GBM) data set from the Cancer Genome Atlas (TCGA) is examined. Network visualization reveals previously unreported patterns in the data suggesting heterogeneous signaling mechanisms active in GBM Proneural tumors, with possible clinical relevance. Conclusions: Progress in bioinformatics and computational biology depends upon exploratory and confirmatory data analysis, upon inference, and upon modeling. These activities will eventually permit the prediction and control of complex biological systems. Network visualizations -- molecular maps -- created from an open-ended programming environment rich in statistical power and data-handling facilities, such as RCytoscape, will play an essential role in this progression. PMID:23837656

  19. GATB: Genome Assembly & Analysis Tool Box

    PubMed Central

    Drezen, Erwan; Rizk, Guillaume; Chikhi, Rayan; Deltel, Charles; Lemaitre, Claire; Peterlongo, Pierre; Lavenier, Dominique

    2014-01-01

    Motivation: Efficient and fast next-generation sequencing (NGS) algorithms are essential to analyze the terabytes of data generated by the NGS machines. A serious bottleneck can be the design of such algorithms, as they require sophisticated data structures and advanced hardware implementation. Results: We propose an open-source library dedicated to genome assembly and analysis to speed up the process of developing efficient software. The library is based on a recent optimized de Bruijn graph implementation allowing complex genomes to be processed on desktop computers using fast algorithms with low memory footprints. Availability and implementation: The GATB library is written in C++ and is available at the following Web site http://gatb.inria.fr under the A-GPL license. Contact: lavenier@irisa.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24990603
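
    The de Bruijn graph at the core of the library is easy to illustrate, though GATB's actual implementation is a heavily optimized, low-memory C++ structure rather than anything like this toy dictionary sketch.

        from collections import defaultdict

        def de_bruijn_graph(reads, k):
            """Build a de Bruijn graph: nodes are (k-1)-mers, edges are observed k-mers."""
            graph = defaultdict(list)
            for read in reads:
                for i in range(len(read) - k + 1):
                    kmer = read[i:i + k]
                    graph[kmer[:-1]].append(kmer[1:])
            return graph

        reads = ["ACGTACGA", "GTACGAAC"]  # toy sequencing reads
        for node, successors in sorted(de_bruijn_graph(reads, k=4).items()):
            print(node, "->", successors)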

  20. Serial concept maps: tools for concept analysis.

    PubMed

    All, Anita C; Huycke, LaRae I

    2007-05-01

    Nursing theory challenges students to think abstractly and is often a difficult introduction to graduate study. Traditionally, concept analysis is useful in facilitating this abstract thinking. Concept maps are a way to visualize an individual's knowledge about a specific topic. Serial concept maps express the sequential evolution of a student's perceptions of a selected concept. Maps reveal individual differences in learning and perceptions, as well as progress in understanding the concept. Relationships are assessed and suggestions are made during serial mapping, which actively engages the students and faculty in dialogue that leads to increased understanding of the link between nursing theory and practice. Serial concept mapping lends itself well to both online and traditional classroom environments. PMID:17547345

  1. Nonlinear Robustness Analysis Tools for Flight Control Law Validation & Verification

    NASA Astrophysics Data System (ADS)

    Chakraborty, Abhijit

    Loss of control in flight is among the highest aviation accident categories for both the number of accidents and the number of fatalities. The flight controls community is seeking improved validation tools for safety critical flight control systems. Current validation tools rely heavily on linear analysis, which ignores the inherent nonlinear nature of the aircraft dynamics and flight control system. Specifically, current practices in validating the flight control system involve gridding the flight envelope and checking various criteria based on linear analysis to ensure safety of the flight control system. The analysis and certification methods currently applied assume the aircraft's dynamics are linear. In reality, the behavior of the aircraft is always nonlinear due to its aerodynamic characteristics and the physical limitations imposed by the actuators. This thesis develops nonlinear analysis tools capable of certifying flight control laws for nonlinear aircraft dynamics. The proposed analysis tools can handle both the aerodynamic nonlinearities and the physical limitations imposed by the actuators in the aircraft's dynamics. This proposed validation technique will extend and enrich the predictive capability of existing flight control law validation methods to analyze nonlinearities. The objective of this thesis is to provide the flight control community with an advanced set of analysis tools to reduce the aviation accident and fatality rates.

  2. Pre-Steady-State Kinetic Analysis of Single-Nucleotide Incorporation by DNA Polymerases.

    PubMed

    Su, Yan; Guengerich, F. Peter

    2016-01-01

    Pre-steady-state kinetic analysis is a powerful and widely used method to obtain multiple kinetic parameters. This protocol provides a step-by-step procedure for pre-steady-state kinetic analysis of single-nucleotide incorporation by a DNA polymerase. It describes the experimental details of DNA substrate annealing, reaction mixture preparation, handling of the RQF-3 rapid quench-flow instrument, denaturing polyacrylamide DNA gel preparation, electrophoresis, quantitation, and data analysis. The core and unique part of this protocol is the rationale for preparation of the reaction mixture (the ratio of the polymerase to the DNA substrate) and methods for conducting pre-steady-state assays on an RQF-3 rapid quench-flow instrument, as well as data interpretation after analysis. In addition, the methods for the DNA substrate annealing and DNA polyacrylamide gel preparation, electrophoresis, quantitation and analysis are suitable for use in other studies. © 2016 by John Wiley & Sons, Inc. PMID:27248785
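
    Data from such single-nucleotide incorporation experiments are typically fit to a burst equation: an exponential phase from the first enzyme turnover plus a linear steady-state phase. The sketch below fits synthetic data with SciPy and is not tied to the protocol's instrument or analysis software.

        import numpy as np
        from scipy.optimize import curve_fit

        def burst(t, amp, k_obs, k_ss):
            """Pre-steady-state burst equation: fast exponential phase followed
            by a linear steady-state phase."""
            return amp * (1.0 - np.exp(-k_obs * t)) + k_ss * t

        t = np.linspace(0.005, 1.0, 40)  # quench time points (s)
        rng = np.random.default_rng(3)
        product = burst(t, 50.0, 12.0, 8.0) + rng.normal(0.0, 1.0, t.size)  # synthetic data
        (amp, k_obs, k_ss), _ = curve_fit(burst, t, product, p0=[40.0, 5.0, 5.0])
        print(f"burst amplitude={amp:.1f}, k_obs={k_obs:.1f} 1/s, steady-state rate={k_ss:.1f}")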

  3. Parachute system design, analysis, and simulation tool. Status report

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-12-31

    For over twenty years, designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash and line sail. In addition to these codes, material property data bases have been acquired. Recently we initiated a project to integrate these codes and data bases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS, describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials data base including high-rate-of-strain data.

  4. [Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].

    PubMed

    Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina

    2012-09-01

    The validation of a measurement tool in mental health is a complex process that usually starts by estimating reliability and later approaches validity. Factor analysis is a way to know the number of dimensions, domains or factors of a measuring tool, generally related to the construct validity of the scale. The analysis can be exploratory or confirmatory, and helps in the selection of the items with better performance. For an acceptable factor analysis, it is necessary to follow some steps and recommendations, conduct some statistical tests, and rely on a proper sample of participants. PMID:26572119
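
    One of the standard steps alluded to, deciding how many factors to retain, is often first approached with the Kaiser criterion: keep factors whose eigenvalues of the item correlation matrix exceed 1. The sketch below applies it to synthetic two-factor data; it is a first-pass heuristic, not a full exploratory factor analysis.

        import numpy as np

        def kaiser_factor_count(data):
            """Suggest the number of factors: eigenvalues of the item correlation
            matrix greater than 1 (Kaiser criterion)."""
            corr = np.corrcoef(data, rowvar=False)
            eigvals = np.linalg.eigvalsh(corr)[::-1]  # descending order
            return int((eigvals > 1.0).sum()), eigvals

        rng = np.random.default_rng(0)
        latent = rng.standard_normal((300, 2))  # two underlying factors
        loadings = rng.standard_normal((2, 8))  # eight scale items
        items = latent @ loadings + 0.5 * rng.standard_normal((300, 8))
        n_factors, eigvals = kaiser_factor_count(items)
        print("factors suggested:", n_factors)  # expected: 2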

  5. Development of data analysis tool for combat system integration

    NASA Astrophysics Data System (ADS)

    Shin, Seung-Chun; Shin, Jong-Gye; Oh, Dae-Kyun

    2013-03-01

    System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT by presenting the test results.

  6. FORENSIC ANALYSIS OF WINDOWS® VIRTUAL MEMORY INCORPORATING THE SYSTEM’S PAGEFILE / COUNTERINTELLIGENCE THROUGH MALICIOUS CODE ANALYSIS

    SciTech Connect

    Stimson, Jared; Murphy, Edward

    2007-06-01

    FORENSIC ANALYSIS OF WINDOWS® VIRTUAL MEMORY INCORPORATING THE SYSTEM’S PAGEFILE: Computer forensics is concerned with the use of computer investigation and analysis techniques in order to collect evidence suitable for presentation in court. The examination of volatile memory is a relatively new but important area in computer forensics. More recently, criminals are becoming more forensically aware and are now able to compromise computers without accessing the hard disk of the target computer. This means that the traditional incident response practice of pulling the plug will destroy the only evidence of the crime. While some techniques are available for acquiring the contents of main memory, few exist which can analyze these data in a meaningful way. One reason for this is how memory is managed by the operating system. Data belonging to one process can be distributed arbitrarily across physical memory or the hard disk, making it very difficult to recover useful information. This report focuses on how these disparate sources of information can be combined to give a single, contiguous address space for each process. Using address translation, a tool is developed to reconstruct the virtual address space of a process by combining a physical memory dump with the pagefile on the hard disk. COUNTERINTELLIGENCE THROUGH MALICIOUS CODE ANALYSIS: As computer network technology continues to grow, so does the reliance on this technology for everyday business functionality. To appeal to customers and employees alike, businesses are seeking an increased online presence, and to increase productivity the same businesses are computerizing their day-to-day operations. The combination of a publicly accessible interface to the business's network and the increase in the amount of intellectual property present on these networks presents serious risks. All of this intellectual property now faces constant attacks from a wide variety of malicious software that is intended to uncover
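
    The address translation described can be sketched for the classic 32-bit non-PAE x86 scheme: a 10-bit page-directory index, a 10-bit page-table index, and a 12-bit offset, with non-present pages redirected to pagefile slots. The structures below are toy stand-ins, not the report's tool and not real Windows data structures.

        PAGE_SIZE = 4096

        def translate(vaddr, page_dir, page_tables, pagefile):
            """Toy x86 (non-PAE) virtual-to-physical translation with a
            pagefile fallback for non-present pages."""
            dir_idx = (vaddr >> 22) & 0x3FF
            tbl_idx = (vaddr >> 12) & 0x3FF
            offset = vaddr & 0xFFF
            table = page_tables[page_dir[dir_idx]]
            present, frame = table[tbl_idx]
            if present:
                return ("ram", frame * PAGE_SIZE + offset)
            return ("pagefile", pagefile[frame] * PAGE_SIZE + offset)

        page_dir = {1: "pt0"}                                     # directory slot -> table id
        page_tables = {"pt0": {2: (True, 0x123), 3: (False, 7)}}  # (present?, frame or slot)
        pagefile = {7: 0x40}                                      # pagefile slot -> file page
        print(translate((1 << 22) | (2 << 12) | 0x10, page_dir, page_tables, pagefile))
        print(translate((1 << 22) | (3 << 12) | 0x10, page_dir, page_tables, pagefile))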

  7. Vulnerability assessment using two complementary analysis tools

    SciTech Connect

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a fire fight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weakness of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weakness, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  8. Vulnerability assessment using two complementary analysis tools

    SciTech Connect

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a fire fight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weaknesses of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weaknesses, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  9. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    PubMed

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis. PMID:27391182

  10. Development of a climate data analysis tool (CDAT)

    SciTech Connect

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations and graphically displaying the results. This computer software will meet the demanding needs of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  11. Free software tools for atlas-based volumetric neuroimage analysis

    NASA Astrophysics Data System (ADS)

    Bazin, Pierre-Louis; Pham, Dzung L.; Gandler, William; McAuliffe, Matthew

    2005-04-01

    We describe new and freely available software tools for measuring volumes in subregions of the brain. The method is fast, flexible, and employs well-studied techniques based on the Talairach-Tournoux atlas. The software tools are released as plug-ins for MIPAV, a freely available and user-friendly image analysis software package developed by the National Institutes of Health. Our software tools include a digital Talairach atlas that consists of labels for 148 different substructures of the brain at various scales.

  12. A Semi-Automated Functional Test Data Analysis Tool

    SciTech Connect

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.

  13. The antibody mining toolbox: an open source tool for the rapid analysis of antibody repertoires.

    PubMed

    D'Angelo, Sara; Glanville, Jacob; Ferrara, Fortunato; Naranjo, Leslie; Gleasner, Cheryl D; Shen, Xiaohong; Bradbury, Andrew R M; Kiss, Csaba

    2014-01-01

    In vitro selection has been an essential tool in the development of recombinant antibodies against various antigen targets. Deep sequencing has recently been gaining ground as an alternative and valuable method to analyze such antibody selections. The analysis provides a novel and extremely detailed view of selected antibody populations, and allows the identification of specific antibodies using only sequencing data, potentially eliminating the need for expensive and laborious low-throughput screening methods such as enzyme-linked immunosorbent assay. The high cost and the need for bioinformatics experts and powerful computer clusters, however, have limited the general use of deep sequencing in antibody selections. Here, we describe the AbMining ToolBox, an open source software package for the straightforward analysis of antibody libraries sequenced by the three main next generation sequencing platforms (454, Ion Torrent, MiSeq). The ToolBox is able to identify heavy chain CDR3s as effectively as more computationally intense software, and can be easily adapted to analyze other portions of antibody variable genes, as well as the selection outputs of libraries based on different scaffolds. The software runs on all common operating systems (Microsoft Windows, Mac OS X, Linux), on standard personal computers, and sequence analysis of 1-2 million reads can be accomplished in 10-15 min, a fraction of the time of competing software. Use of the ToolBox will allow the average researcher to incorporate deep sequence analysis into routine selections from antibody display libraries. PMID:24423623
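
    Heavy-chain CDR3 identification can be approximated with a conserved-motif heuristic: capture the span between the framework-3 cysteine and the J-segment WG.G motif. This regex toy is far cruder than the ToolBox's actual pattern matching, and the sequence below is invented.

        import re

        # Simplified rule: CDR3 lies between the conserved cysteine and "WG.G".
        CDR3_PATTERN = re.compile(r"C([A-Z]{3,30})WG.G")

        def extract_cdr3(protein_seq):
            """Return the putative heavy-chain CDR3, or None if the motif is absent."""
            match = CDR3_PATTERN.search(protein_seq)
            return match.group(1) if match else None

        read = ("EVQLVESGGGLVQPGGSLRLSCAASGFTFSSYAMSWVRQAPGKGLEWVSAISGSGGSTYY"
                "ADSVKGRFTISRDNSKNTLYLQMNSLRAEDTAVYYCARDRGYSSGWYFDVWGQGTLVTVSS")
        print(extract_cdr3(read))  # -> ARDRGYSSGWYFDV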

  14. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between each of the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  15. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  16. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets

    PubMed Central

    Angulo, Diego A.; Schneider, Cyril; Oliver, James H.; Charpak, Nathalie; Hernandez, Jose T.

    2016-01-01

    Brain research typically requires large amounts of data from different sources, and often of a different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming. It follows that data is often not used to its fullest potential, thus limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process of individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data. PMID:27601990

  17. Incorporation of Photosynthetic Reaction Centers in the Membrane of Human Cells: Toward a New Tool for Optical Control of Cell Activity

    SciTech Connect

    Pennisi, Cristian P.; Jensen, Poul Erik; Zachar, Vladimir; Greenbaum, Elias; Yoshida, Ken

    2009-01-01

    The Photosystem I (PSI) reaction center is a photosynthetic membrane complex in which light-induced charge separation is accompanied by the generation of an electric potential. It has been recently proposed as a means to confer light sensitivity to cells possessing voltage-activated ion channels, but the feasibility of heterologous incorporation has not been demonstrated. In this work, methods of delivery and detection of PSI in the membrane of human cells are presented. Purified fractions of PSI were reconstituted in proteoliposomes that were used as vehicles for the membrane incorporation. A fluorescent impermeable dye was entrapped in the vesicles to qualitatively analyze the nature of the vesicle-cell interaction. After incorporation, the localization and orientation of the complexes in the membrane were studied using immunofluorescence microscopy. The results showed complexes oriented as in native membranes, which were randomly distributed in clusters over the entire surface of the cell. Additionally, analysis of cell viability showed that the incorporation process does not damage the cell membrane. Taken together, the results of this work suggest that the mammalian cellular membrane is a reasonable environment for the incorporation of PSI complexes, which opens the possibility of using these molecular photovoltaic structures for optical control of cell activity.

  18. Design and analysis tools for concurrent blackboard systems

    NASA Technical Reports Server (NTRS)

    Mcmanus, John W.

    1991-01-01

    A set of blackboard system design and analysis tools that consists of a knowledge source organizer, a knowledge source input/output connectivity analyzer, and a validated blackboard system simulation model is discussed. The author presents the structure and functionality of the knowledge source input/output connectivity analyzer. An example outlining the use of the analyzer to aid in the design of a concurrent tactical decision generator for air-to-air combat is presented. The blackboard system design and analysis tools were designed for generic blackboard systems and are application independent.
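
    The input/output connectivity analysis can be pictured as deriving a directed graph between knowledge sources from the blackboard data they read and write; the sketch below uses hypothetical knowledge sources rather than the paper's tactical decision generator.

        # knowledge source -> (blackboard data read, blackboard data written); hypothetical
        sources = {
            "tracker":    ({"radar_hits"}, {"tracks"}),
            "classifier": ({"tracks"}, {"threat_ids"}),
            "planner":    ({"threat_ids", "tracks"}, {"maneuvers"}),
        }

        def connectivity(sources):
            """Directed edges A -> B wherever an output of A is an input of B."""
            edges = set()
            for a, (_, outs) in sources.items():
                for b, (ins, _) in sources.items():
                    if a != b and outs & ins:
                        edges.add((a, b))
            return edges

        print(sorted(connectivity(sources)))
        # [('classifier', 'planner'), ('tracker', 'classifier'), ('tracker', 'planner')]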

  19. GASP: A Performance Analysis Tool Interface for Global Address Space Programming Models, Version 1.5

    SciTech Connect

    Leko, Adam; Bonachea, Dan; Su, Hung-Hsun; George, Alan D.; Sherburne, Hans; George, Alan D.

    2006-09-14

    Due to the wide range of compilers and the lack of a standardized performance tool interface, writers of performance tools face many challenges when incorporating support for global address space (GAS) programming models such as Unified Parallel C (UPC), Titanium, and Co-Array Fortran (CAF). This document presents a Global Address Space Performance tool interface (GASP) that is flexible enough to be adapted into current global address space compiler and runtime infrastructures with little effort, while allowing performance analysis tools to gather much information about the performance of global address space programs.

  1. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification, and 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  2. Interoperability of the analysis tools within the IMPEx project

    NASA Astrophysics Data System (ADS)

    Génot, Vincent; Khodachenko, Maxim; Kallio, Esa; Al-Ubaidi, Tarek; Gangloff, Michel; Budnik, Elena; Bouchemit, Myriam; Renard, Benjamin; Bourel, Natacha; Modolo, Ronan; Hess, Sébastien; André, Nicolas; Penou, Emmanuel; Topf, Florian; Alexeev, Igor; Belenkaya, Elena; Kalegaev, Vladimir; Schmidt, Walter

    2013-04-01

    The growing amount of data in planetary sciences requires adequate visualisation tools that enable in-depth analysis. Within the FP7 IMPEx infrastructure, data will originate from heterogeneous sources: large observational databases (CDAWeb, AMDA at CDPP, ...), simulation databases for hybrid and MHD codes (FMI, LATMOS), and a planetary magnetic field models database and online services (SINP). Together with the common "time series" visualisation functionality for both in-situ and modeled data (provided by the AMDA and CLWeb tools), IMPEx will also provide immersion capabilities into the complex 3D data originating from models (provided by 3DView). The functionalities of these tools will be described. The emphasis will be put on how these tools 1) can share information (for instance Time Tables or user-composed parameters) and 2) can be operated synchronously via dynamic connections based on Virtual Observatory standards.

  3. Parallel Analysis Tools for Ultra-Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Jacob, Robert; Krishna, Jayesh; Xu, Xiabing; Mickelson, Sheri; Wilde, Mike; Peterson, Kara; Bochev, Pavel; Latham, Robert; Tautges, Tim; Brown, David; Brownrigg, Richard; Haley, Mary; Shea, Dennis; Huang, Wei; Middleton, Don; Schuchardt, Karen; Yin, Jian

    2013-04-01

    While climate models have used parallelism for several years, the post-processing tools are still mostly single-threaded applications and many are closed source. These tools are becoming a bottleneck in the production of new climate knowledge when they confront terabyte-sized output from high-resolution climate models. The ParVis project is using and creating Free and Open Source tools that bring data and task parallelism to climate model analysis to enable analysis of large climate data sets. ParVis is using the Swift task-parallel language to implement a diagnostic suite that generates over 600 plots of atmospheric quantities. ParVis has also created a Parallel Gridded Analysis Library (ParGAL) which implements many common climate analysis operations in a data-parallel fashion using the Message Passing Interface. ParGAL has in turn been built on sophisticated packages for describing grids in parallel (the Mesh-Oriented datABase, MOAB), performing vector operations on arbitrary grids (Intrepid), and reading data in parallel (PnetCDF). ParGAL is being used to implement a parallel version of the NCAR Command Language (NCL) called ParNCL. ParNCL/ParGAL not only speeds up analysis of large datasets but also allows operations to be performed on native grids, eliminating the need to transform data to latitude-longitude grids. All of the tools ParVis is creating are available as free and open source software.
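
    The kind of data-parallel reduction ParGAL provides can be illustrated with mpi4py; this is a stand-in sketch (array contents and the latitude weighting are made up), whereas the real library operates on arbitrary grids through MOAB and PnetCDF.

        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # each rank owns one latitude band of a global field (stand-in for a parallel file read)
        nlat, nlon = 8, 360
        lats = np.linspace(-90.0, 90.0, size * nlat).reshape(size, nlat)[rank]
        field = np.random.default_rng(rank).random((nlat, nlon))
        weights = np.repeat(np.cos(np.deg2rad(lats))[:, None], nlon, axis=1)

        # area-weighted global mean without gathering the full field on one node
        num = comm.allreduce(float((field * weights).sum()), op=MPI.SUM)
        den = comm.allreduce(float(weights.sum()), op=MPI.SUM)
        if rank == 0:
            print("global weighted mean:", num / den)

    Run under MPI, e.g. mpiexec -n 4 python mean.py.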

  4. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    PubMed Central

    Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites, researchers can achieve a systems level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large number of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered, whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review the functionality and use of this software are discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther-reaching biological conclusions than ever before. PMID:24688685
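
    The enrichment-analysis side typically reduces to a hypergeometric (one-sided Fisher) test per pathway; below is a minimal SciPy sketch with hypothetical metabolite identifiers.

        from scipy.stats import hypergeom

        def pathway_enrichment(altered, pathway, background):
            """P(X >= k): over-representation of a pathway among altered metabolites."""
            altered, pathway, background = map(set, (altered, pathway, background))
            M = len(background)                   # all measured metabolites
            n = len(pathway & background)         # pathway members that were measured
            N = len(altered)                      # metabolites flagged as altered
            k = len(altered & pathway)            # altered metabolites in the pathway
            return hypergeom.sf(k - 1, M, n, N)

        background = [f"m{i}" for i in range(200)]
        tca = background[:10]                     # hypothetical pathway of 10 metabolites
        altered = background[:6] + background[50:60]
        print(round(pathway_enrichment(altered, tca, background), 6))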

  5. Phonological assessment and analysis tools for Tagalog: Preliminary development.

    PubMed

    Chen, Rachelle Kay; Bernhardt, B May; Stemberger, Joseph P

    2016-01-01

    Information and assessment tools concerning Tagalog phonological development are minimally available. The current study thus sets out to develop elicitation and analysis tools for Tagalog. A picture elicitation task was designed with a warm-up, screener and two extension lists, one with more complex and one with simpler words. A nonlinear phonological analysis form was adapted from English (Bernhardt & Stemberger, 2000) to capture key characteristics of Tagalog. The tools were piloted on a primarily Tagalog-speaking 4-year-old boy living in a Canadian-English-speaking environment. The data provided initial guidance for revision of the elicitation tool (available at phonodevelopment.sites.olt.ubc.ca). The analysis provides preliminary observations about possible expectations for primarily Tagalog-speaking 4-year-olds in English-speaking environments: Lack of mastery for tap/trill 'r', and minor mismatches for vowels, /l/, /h/ and word stress. Further research is required in order to develop the tool into a norm-referenced instrument for Tagalog in both monolingual and multilingual environments. PMID:27096390

  6. Dipole estimation errors due to not incorporating anisotropic conductivities in realistic head models for EEG source analysis

    NASA Astrophysics Data System (ADS)

    Hallez, Hans; Staelens, Steven; Lemahieu, Ignace

    2009-10-01

    EEG source analysis is a valuable tool for brain functionality research and for diagnosing neurological disorders, such as epilepsy. It requires a geometrical representation of the human head or a head model, which is often modeled as an isotropic conductor. However, it is known that some brain tissues, such as the skull or white matter, have an anisotropic conductivity. Many studies reported that the anisotropic conductivities have an influence on the calculated electrode potentials. However, few studies have assessed the influence of anisotropic conductivities on the dipole estimations. In this study, we want to determine the dipole estimation errors due to not taking into account the anisotropic conductivities of the skull and/or brain tissues. Therefore, head models are constructed with the same geometry, but with an anisotropically conducting skull and/or brain tissue compartment. These head models are used in simulation studies where the dipole location and orientation error is calculated due to neglecting anisotropic conductivities of the skull and brain tissue. Results show that not taking into account the anisotropic conductivities of the skull yields a dipole location error between 2 and 25 mm, with an average of 10 mm. When the anisotropic conductivities of the brain tissues are neglected, the dipole location error ranges between 0 and 5 mm. In this case, the average dipole location error was 2.3 mm. In all simulations, the dipole orientation error was smaller than 10°. We can conclude that the anisotropic conductivities of the skull have to be incorporated to improve the accuracy of EEG source analysis. The results of the simulation, as presented here, also suggest that incorporation of the anisotropic conductivities of brain tissues is not necessary. However, more studies are needed to confirm these suggestions.

  7. An Automated Data Analysis Tool for Livestock Market Data

    ERIC Educational Resources Information Center

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  8. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    SciTech Connect

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  9. The Adversarial Route Analysis Tool: A Web Application

    SciTech Connect

    Casson, William H. Jr.

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries: a web-based geospatial application that helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and simple to use without much training.

  10. An Online Image Analysis Tool for Science Education

    ERIC Educational Resources Information Center

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  11. Recursive Frame Analysis: A Practitioner's Tool for Mapping Therapeutic Conversation

    ERIC Educational Resources Information Center

    Keeney, Hillary; Keeney, Bradford; Chenail, Ronald J.

    2012-01-01

    Recursive frame analysis (RFA), both a practical therapeutic tool and an advanced qualitative research method that maps the structure of therapeutic conversation, is introduced with a clinical case vignette. We present and illustrate a means of mapping metaphorical themes that contextualize the performance taking place in the room, recursively…

  12. Discovery and New Frontiers Project Budget Analysis Tool

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn E.

    2011-01-01

    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses, compare mission scenarios to the current program budget, and rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), mission development and operations profile by phase (percent total mission cost and duration), launch vehicle, and launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect of changes in future mission or launch vehicle costs, the differing development profiles or operational durations of a future mission, or a replan of a current mission on the overall program budget. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted mission cost lines) could easily be adapted to other applications.
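
    The roll-up the tool performs can be approximated in a few lines; the sketch below assumes a single flat inflation rate and a simple (fraction, duration) phase profile, whereas the real tool's inflation tables and phasing are more detailed.

        def real_year_cost(total_cost_fy, base_year, phase_profile, start_year, inflation=0.03):
            """Spread a fixed-year mission cost over calendar years, inflated to real-year dollars.

            phase_profile: list of (fraction_of_total_cost, duration_years) per phase.
            """
            costs = {}
            year = start_year
            for fraction, duration in phase_profile:
                annual = total_cost_fy * fraction / duration
                for _ in range(duration):
                    costs[year] = costs.get(year, 0.0) + annual * (1 + inflation) ** (year - base_year)
                    year += 1
            return costs

        # e.g. 60% of a $450M (FY2011) mission over 4 development years, 40% over 6 years of operations
        profile = [(0.6, 4), (0.4, 6)]
        rollup = real_year_cost(450.0, 2011, profile, start_year=2012)
        print(sum(rollup.values()))               # total program cost in real-year dollars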

  13. V-Lab™: Virtual laboratories -- The analysis tool for structural analysis of composite components

    SciTech Connect

    1999-07-01

    V-Lab™, an acronym for Virtual Laboratories, is a design and analysis tool for fiber-reinforced composite components. This program allows the user to perform analysis, numerical experimentation, and design prototyping using advanced composite stress and failure analysis tools. The software was designed to be intuitive and easy to use, even by designers who are not experts in composite materials or structural analysis. V-Lab™ is the software tool every specialist in design engineering, structural analysis, research and development and repair needs to perform accurate, fast and economical analysis of composite components.

  14. LEGO: a novel method for gene set over-representation analysis by incorporating network-based gene weights.

    PubMed

    Dong, Xinran; Hao, Yun; Wang, Xiao; Tian, Weidong

    2016-01-01

    Pathway or gene set over-representation analysis (ORA) has become a routine task in functional genomics studies. However, currently widely used ORA tools employ statistical methods such as Fisher's exact test that reduce a pathway into a list of genes, ignoring the constitutive functional non-equivalent roles of genes and the complex gene-gene interactions. Here, we develop a novel method named LEGO (functional Link Enrichment of Gene Ontology or gene sets) that takes into consideration these two types of information by incorporating network-based gene weights in ORA analysis. In three benchmarks, LEGO achieves better performance than Fisher and three other network-based methods. To further evaluate LEGO's usefulness, we compare LEGO with five gene expression-based and three pathway topology-based methods using a benchmark of 34 disease gene expression datasets compiled by a recent publication, and show that LEGO is among the top-ranked methods in terms of both sensitivity and prioritization for detecting target KEGG pathways. In addition, we develop a cluster-and-filter approach to reduce the redundancy among the enriched gene sets, making the results more interpretable to biologists. Finally, we apply LEGO to two lists of autism genes, and identify relevant gene sets to autism that could not be found by Fisher. PMID:26750448
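
    To make the idea of network-based gene weights concrete, here is a generic permutation-based weighted ORA sketch; it is not LEGO's published algorithm, and the weights here are arbitrary stand-ins for quantities such as normalized network degree.

        import random

        def weighted_ora(query_genes, gene_set, weights, n_perm=10000, seed=0):
            """Permutation p-value for a weight-aware over-representation score."""
            rng = random.Random(seed)
            universe = list(weights)
            query = list(set(query_genes) & set(universe))

            def score(genes):
                return sum(weights[g] for g in genes if g in gene_set)

            observed = score(query)
            hits = sum(score(rng.sample(universe, len(query))) >= observed
                       for _ in range(n_perm))
            return (hits + 1) / (n_perm + 1)

        # toy data: weights could come from, e.g., a functional-linkage network
        weights = {f"g{i}": 1.0 + (i % 5) for i in range(100)}
        pathway = {f"g{i}" for i in range(10)}
        print(weighted_ora([f"g{i}" for i in range(8)], pathway, weights))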

  15. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Technical Reports Server (NTRS)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.
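
    The recommendation step amounts to a keyed lookup into a failure-mode library; below is a schematic sketch with hypothetical library entries.

        # hypothetical failure-mode library keyed by (component type, function)
        FAILURE_LIBRARY = {
            ("valve", "control fluid flow"): ["fails to open", "fails to close", "external leakage"],
            ("sensor", "provide signal output"): ["erratic output", "loss of output", "out-of-range output"],
        }

        def recommend_failure_modes(component_type, function):
            """Return candidate failure modes for the analyst to accept, edit, or reject."""
            return FAILURE_LIBRARY.get((component_type, function), [])

        print(recommend_failure_modes("valve", "control fluid flow"))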

  16. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  17. A dataflow analysis tool for parallel processing of algorithms

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
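
    One of the performance bounds such graph analysis yields is the critical path of the dataflow graph, which lower-bounds the schedule length on any number of processors; a small sketch with hypothetical task times follows.

        from functools import lru_cache

        # hypothetical dataflow graph: node -> compute time, node -> successors
        times = {"read": 2, "fft": 5, "filter": 3, "ifft": 5, "write": 1}
        succ = {"read": ["fft"], "fft": ["filter"], "filter": ["ifft"], "ifft": ["write"], "write": []}

        @lru_cache(maxsize=None)
        def span(node):
            """Critical-path time from node to any sink; a lower bound on schedule length."""
            return times[node] + max((span(s) for s in succ[node]), default=0)

        critical_path = max(span(n) for n in times)
        total_work = sum(times.values())
        min_processors = -(-total_work // critical_path)   # ceil(work / span)
        print(critical_path, min_processors)               # 16 1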

  18. The Cube Analysis and Rendering Tool for Astronomy

    NASA Astrophysics Data System (ADS)

    Rosolowsky, E.; Kern, J.; Federl, P.; Jacobs, J.; Loveland, S.; Taylor, J.; Sivakoff, G.; Taylor, R.

    2015-09-01

    We present the design principles and current status of the Cube Analysis and Rendering Tool for Astronomy (CARTA). The CARTA project is designing a cube visualization tool for the Atacama Large Millimeter/submillimeter Array. CARTA will join the domain-specific software already developed for millimetre-wave interferometry with a server-side visualization solution. This connection will enable archive-hosted exploration of three-dimensional data cubes. CARTA will also provide a desktop client that is indistinguishable from the web-based client. While such a goal is ambitious for a short project, the team is focusing on a well-developed framework which can readily accommodate community code development through plugins.

  19. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
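
    The n-factor combinatorial idea can be sketched as varying every subset of n parameters off nominal while holding the others fixed; parameter names and levels below are hypothetical, and the actual tool combines such variations with Monte Carlo generation.

        from itertools import combinations, product

        # hypothetical simulation parameters: nominal values plus off-nominal levels
        nominal = {"mass_kg": 500, "thrust_N": 10, "controller_gain": 1.0}
        levels = {"mass_kg": [400, 600], "thrust_N": [20], "controller_gain": [0.5, 2.0]}

        def n_factor_cases(n=2):
            """Vary every subset of n parameters off nominal, holding the rest at nominal."""
            for subset in combinations(levels, n):
                for values in product(*(levels[k] for k in subset)):
                    case = dict(nominal)
                    case.update(zip(subset, values))
                    yield case

        for case in n_factor_cases(2):
            print(case)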

  1. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  2. WholePathwayScope: a comprehensive pathway-based analysis tool for high-throughput data

    PubMed Central

    Yi, Ming; Horton, Jay D; Cohen, Jonathan C; Hobbs, Helen H; Stephens, Robert M

    2006-01-01

    Background Analysis of High Throughput (HTP) Data such as microarray and proteomics data has provided a powerful methodology to study patterns of gene regulation at genome scale. A major unresolved problem in the post-genomic era is to assemble the large amounts of data generated into a meaningful biological context. We have developed a comprehensive software tool, WholePathwayScope (WPS), for deriving biological insights from analysis of HTP data. Results WPS extracts gene lists with shared biological themes through color cue templates. WPS statistically evaluates global functional category enrichment of gene lists and pathway-level pattern enrichment of data. WPS incorporates well-known biological pathways from KEGG (Kyoto Encyclopedia of Genes and Genomes) and Biocarta, GO (Gene Ontology) terms as well as user-defined pathways or relevant gene clusters or groups, and explores gene-term relationships within the derived gene-term association networks (GTANs). WPS simultaneously compares multiple datasets within biological contexts either as pathways or as association networks. WPS also integrates Genetic Association Database and Partial MedGene Database for disease-association information. We have used this program to analyze and compare microarray and proteomics datasets derived from a variety of biological systems. Application examples demonstrated the capacity of WPS to significantly facilitate the analysis of HTP data for integrative discovery. Conclusion This tool represents a pathway-based platform for discovery integration to maximize analysis power. The tool is freely available at . PMID:16423281

  3. Suspension array technology: new tools for gene and protein analysis.

    PubMed

    Nolan, J P; Mandy, F F

    2001-11-01

    Flow cytometry has long been a key tool in the analysis of lymphocytes and other cells, owing to its ability to make quantitative, homogeneous, multiparameter measurements of particles. New developments in illumination sources, digital signal processing and microsphere chemistry are driving the development of flow cytometry in new areas of biomedical research. In particular, the maturation of approaches to perform highly parallel analyses using suspension arrays of microspheres with different morphospectral features is making flow cytometry an important tool in protein and genetic analysis. In this paper, we review the development of suspension array technology (SAT), current applications in protein and genomic analysis, and the prospects for this platform in a variety of large scale screening applications. PMID:11838973

  4. Use of New Communication Technologies to Change NASA Safety Culture: Incorporating the Use of Blogs as a Fundamental Communications Tool

    NASA Technical Reports Server (NTRS)

    Huls, Dale thomas

    2005-01-01

    can be restored. For NASA to harness the capabilities of blogs, the Agency must develop an Agency-wide policy on blogging to encourage use and provide guidance. This policy should describe basic rules of conduct and content as well as a policy of non-retribution and/or anonymity. The Agency must provide server space within its firewalls, provide appropriate software tools, and promote blogs in newsletters and official websites. By embracing the use of blogs, a potential pool of 19,000 experts could be available to address each posted safety issue, concern, problem, or question. Blogs could result in real NASA culture change.

  5. The Environment-Power System Analysis Tool development program. [for spacecraft power supplies

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.

    1989-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.

  6. Tools for Large-Scale Mobile Malware Analysis

    SciTech Connect

    Bierma, Michael

    2014-01-01

    Analyzing mobile applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the major operating systems. There are currently over 1.2 million apps available in both the Google Play and Apple App stores (the respective official marketplaces for the Android and iOS operating systems)[1, 2]. Our research provides two large-scale analysis tools to aid in the detection and analysis of mobile malware. The first tool we present, Andlantis, is a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. Traditionally, Android dynamic analysis techniques have been relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. Andlantis is the most scalable Android dynamic analysis framework to date, and is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples through the system, and provide examples of malware analysis performed with the resulting data. While techniques exist to perform static analysis on a large number of applications, large-scale analysis of iOS applications has been relatively small scale due to the closed nature of the iOS ecosystem, and the difficulty of acquiring applications for analysis. The second tool we present, iClone, addresses the challenges associated with iOS research in order to detect application clones within a dataset of over 20,000 iOS applications.

  7. SMART: Statistical Metabolomics Analysis-An R Tool.

    PubMed

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou

    2016-06-21

    Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10⁻⁴ in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10⁻⁴ in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/metabolomics/SMART.htm. PMID:27248514
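
    The covariate-adjusted association test is conceptually a linear model per metabolite. A sketch with statsmodels, in which the file and column names are hypothetical and the paper's latent-group adjustment is approximated by an observed batch covariate:

        import pandas as pd
        import statsmodels.formula.api as smf

        # hypothetical table: one row per subject; ace_inhibitor coded 0/1
        df = pd.read_csv("metabolome.csv")
        model = smf.ols("neuromedin_n ~ ace_inhibitor + age + sex + batch", data=df).fit()
        print(model.pvalues["ace_inhibitor"])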

  8. Virtual Tool Mark Generation for Efficient Striation Analysis

    SciTech Connect

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-02-16

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5–10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.
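
    The virtual-mark construction, projecting the tip geometry along the travel direction and keeping the extreme profile, can be sketched with NumPy; the axes and point cloud below are hypothetical, and the study additionally optimizes over marking angle and compares profiles with the Chumbley likelihood statistic.

        import numpy as np

        def virtual_mark(tip_points, n_bins=200):
            """Project a 3D tip point cloud into a 2D mark profile.

            Axes (an assumption of this sketch): x = across the mark, y = tool travel
            (collapsed by the projection), z = depth. For each x-bin the deepest
            point survives, giving the simulated striation profile.
            """
            x, z = tip_points[:, 0], tip_points[:, 2]
            edges = np.linspace(x.min(), x.max(), n_bins + 1)
            idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
            profile = np.full(n_bins, np.nan)
            for i in range(n_bins):
                in_bin = z[idx == i]
                if in_bin.size:
                    profile[i] = in_bin.min()     # deepest penetration wins
            return profile

        cloud = np.random.default_rng(0).random((5000, 3))  # stand-in for a microscope scan
        print(virtual_mark(cloud)[:5])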

  9. Field Quality Analysis as a Tool to Monitor Magnet Production

    SciTech Connect

    Gupta, R.; Anerella, M.; Cozzolino, J.; Fisher, D.; Ghosh, A.; Jain, A.; Sampson, W.; Schmalzle, J.; Thompson, P.; Wanderer, P.; Willen, E.

    1997-10-18

    Field harmonics offer a powerful tool to examine the mechanical structure of accelerator magnets. A large deviation from the nominal values suggests a mechanical defect. Magnets with such defects are likely to have a poor quench performance. Similarly, a trend suggests a wear in tooling or a gradual change in the magnet assembly or in the size of a component. This paper presents the use of the field quality as a tool to monitor the magnet production of the Relativistic Heavy Ion Collider (RHIC). Several examples are briefly described. Field quality analysis can also rule out a suspected geometric error if it cannot be supported by the symmetry and the magnitude of the measured harmonics.

  10. MISR Data Visualization and Analysis Using the hdfscan Tool

    NASA Astrophysics Data System (ADS)

    Crean, K. A.; Diner, D. J.; Banerjee, P. K.

    2001-05-01

    A new software tool called hdfscan is available to display and analyze data formatted using the HDF-EOS grid and swath interfaces, as well as the native HDF SDS, vdata, vgroup, and raster interfaces. The hdfscan tool can display data in both image and textual form, and can also display attributes, metadata, annotations, file structure, projection information, and simple data statistics. hdfscan also includes a data editing capability. In addition, the tool contains unique features to aid in the interpretation of data from the Multi-angle Imaging SpectroRadiometer (MISR) instrument, which currently flies aboard NASA's Terra spacecraft. These features include the ability to unscale and unpack MISR data fields; the ability to display MISR data flag values according to their interpreted values as well as their raw values; and knowledge of special MISR fill values. MISR measures upwelling radiance from Earth in 4 spectral bands corresponding to blue, green, red, and near-infrared wavelengths, at each of 9 view angles including the nadir (vertical) direction plus 26.1, 45.6, 60.0, and 70.5 degrees forward and aftward of nadir. Data products derived from MISR measurements aim at improving our understanding of the Earth's environment and climate. The hdfscan tool runs in one of two modes, as selected by the user: command-line mode, or via a graphical user interface (GUI). This provides a user with flexibility in using the tool for either batch mode processing or interactive analysis. This presentation will describe features and functionalities of the hdfscan tool. The user interface will be shown, and menu options will be explained. Information on how to obtain the tool will be provided.

  11. Industrial Geospatial Analysis Tool for Energy Evaluation (IGATE-E)

    SciTech Connect

    Alkadi, Nasr E; Starke, Michael R; Ma, Ookie; Nimbalkar, Sachin U; Cox, Daryl

    2013-01-01

    IGATE-E is an energy analysis tool for industrial energy evaluation. The tool applies statistical modeling to multiple publicly available datasets and provides information at the geospatial resolution of zip code using bottom-up approaches. Within each zip code, the current version of the tool estimates the electrical energy consumption of manufacturing industries for each industry type, using DOE's Industrial Assessment Center database (IAC-DB) and DOE's Energy Information Administration Manufacturing Energy Consumption Survey database (EIA-MECS DB), in addition to other commercially available databases such as the Manufacturing News database (MNI, Inc.). Ongoing and future work includes adding modules for the prediction of fuel energy consumption streams, manufacturing process-step energy consumption, and major energy-intensive processes (EIPs) within each industry type, among other metrics of interest. The tool provides validation against DOE's EIA-MECS state-level energy estimations and permits several statistical examinations. IGATE-E is intended to be a decision support and planning tool for a wide spectrum of energy analysts, researchers, government organizations, private consultants, industry partners, and the like.

  12. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Astrophysics Data System (ADS)

    Doyle, Monica M.; O'Neil, Daniel A.; Christensen, Carissa B.

    2005-02-01

    Forecasting technology capabilities requires a tool and a process for capturing state-of-the-art technology metrics and estimates for future metrics. A decision support tool, known as the Advanced Technology Lifecycle Analysis System (ATLAS), contains a Technology Tool Box (TTB) database designed to accomplish this goal. Sections of this database correspond to a Work Breakdown Structure (WBS) developed by NASA's Exploration Systems Research and Technology (ESRT) Program. These sections cover the waterfront of technologies required for human and robotic space exploration. Records in each section include technology performance, operations, and programmatic metrics. Timeframes in the database provide metric values for the state of the art (Timeframe 0) and forecasts for timeframes that correspond to spiral development milestones in NASA's Exploration Systems Mission Directorate (ESMD) development strategy. Collecting and vetting data for the TTB will involve technologists from across the agency, the aerospace industry and academia. Technologists will have opportunities to submit technology metrics and forecasts to the TTB development team. Semi-annual forums will facilitate discussions about the basis of forecast estimates. As the tool and process mature, the TTB will serve as a powerful communication and decision support tool for the ESRT program.

  13. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    NASA Astrophysics Data System (ADS)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.

  14. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), a region of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
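
    Hack's (1973) method, one of the two the tool builds on, locates knickpoint candidates through the stream-length gradient index SL = (ΔH/ΔL) · L; a compact sketch on a synthetic long profile follows.

        import numpy as np

        def sl_index(distance, elevation):
            """Hack's stream-length gradient index, SL = (ΔH/ΔL) · L.

            distance:  along-channel distance from the channel head (m)
            elevation: channel elevation at those distances (m)
            Local SL spikes flag candidate knickpoints.
            """
            dH = np.diff(elevation)
            dL = np.diff(distance)
            L_mid = (distance[:-1] + distance[1:]) / 2.0
            return -dH / dL * L_mid               # minus sign: elevation drops downstream

        dist = np.linspace(1, 5000, 50)
        elev = 500 * np.exp(-dist / 2000) - np.where(dist > 3000, 20, 0)  # toy profile with a step
        sl = sl_index(dist, elev)
        print(int(dist[np.argmax(sl)]))           # location of the strongest anomaly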

  15. A conceptual design tool for RBCC engine performance analysis

    NASA Astrophysics Data System (ADS)

    Olds, John R.; Saks, Greg

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework.

  16. Multi-Spacecraft Analysis with Generic Visualization Tools

    NASA Astrophysics Data System (ADS)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  17. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    PubMed

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares four different risk analysis tools developed for confined spaces by applying them to three hazardous scenarios. The tools were namely: 1) a checklist without risk estimation (Tool A); 2) a checklist with a risk scale (Tool B); 3) a risk calculation without a formal hazard identification stage (Tool C); and 4) a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of a different nature. Tools C and D utilized more systematic approaches than Tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of 1) its comprehensive structure with respect to the steps suggested in risk management, 2) its dynamic approach to hazard identification, and 3) its use of data resulting from the risk analysis. PMID:26864350

  19. A new algorithm for grid-based hydrologic analysis by incorporating stormwater infrastructure

    NASA Astrophysics Data System (ADS)

    Choi, Yosoon; Yi, Huiuk; Park, Hyeong-Dong

    2011-08-01

    We developed a new algorithm, the Adaptive Stormwater Infrastructure (ASI) algorithm, to incorporate ancillary data sets related to stormwater infrastructure into the grid-based hydrologic analysis. The algorithm simultaneously considers the effects of the surface stormwater collector network (e.g., diversions, roadside ditches, and canals) and underground stormwater conveyance systems (e.g., waterway tunnels, collector pipes, and culverts). The surface drainage flows controlled by the surface runoff collector network are superimposed onto the flow directions derived from a DEM. After examining the connections between inlets and outfalls in the underground stormwater conveyance system, the flow accumulation and delineation of watersheds are calculated based on recursive computations. Application of the algorithm to the Sangdong tailings dam in Korea revealed superior performance to that of a conventional D8 single-flow algorithm in terms of providing reasonable hydrologic information on watersheds with stormwater infrastructure.
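
    The core idea, a DEM-derived flow direction field with infrastructure-driven overrides, can be sketched in a few lines. The grid, pipe, and accumulation rule below are illustrative, not the authors' implementation:

```python
# D8-style flow directions per cell as (drow, dcol); None marks the outlet.
flow_dir = {
    (0, 0): (1, 0), (0, 1): (1, 0),
    (1, 0): (1, 1), (1, 1): (1, 0),
    (2, 0): (0, 1), (2, 1): None,
}

# An underground collector pipe: water entering the inlet cell re-emerges at
# the outfall cell regardless of the surface topography.
pipe_override = {(1, 0): (2, 1)}

def downstream(cell):
    if cell in pipe_override:            # infrastructure wins over the DEM
        return pipe_override[cell]
    d = flow_dir[cell]
    return None if d is None else (cell[0] + d[0], cell[1] + d[1])

# Flow accumulation: each cell contributes one unit to every cell downstream.
acc = {cell: 1 for cell in flow_dir}
for cell in flow_dir:
    cur = downstream(cell)
    while cur is not None:
        acc[cur] += 1
        cur = downstream(cur)

print(acc)  # the outfall cell (2, 1) collects the whole catchment
```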

  20. Incorporating Quality into Data Envelopment Analysis of Nursing Home Performance: A Case Study

    PubMed Central

    Lenard, Melanie L.; Klimberg, Ronald K.

    2009-01-01

    When using data envelopment analysis (DEA) as a benchmarking technique for nursing homes, it is essential to include measures of the quality of care. We survey applications where quality has been incorporated into DEA models and consider the concerns that arise when the results show that quality measures have been effectively ignored. Three modeling techniques are identified that address these concerns. Each of these techniques requires some input from management as to the proper emphasis to be placed on the quality aspect of performance. We report the results of a case study in which we apply these techniques to a DEA model of nursing home performance. We examine in depth not only the resulting efficiency scores, but also the benchmark sets and the weights given to the input and output measures. We find that two of the techniques are effective in ensuring that DEA results discriminate between high and low quality performance. PMID:20161166
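
    For readers unfamiliar with DEA mechanics, the basic CCR multiplier model solves one small linear program per facility: choose input weights v and output weights u that show the facility in the best possible light, subject to no facility scoring above 1. The sketch below shows the plain model with invented data; the paper's quality-aware variants add further constraints on the output weights:

```python
import numpy as np
from scipy.optimize import linprog

# rows = nursing homes (DMUs); inputs: staff hours, beds;
# outputs: resident days, quality score.
X = np.array([[120.0, 50.0], [100.0, 60.0], [150.0, 55.0]])
Y = np.array([[900.0, 0.8], [850.0, 0.9], [980.0, 0.6]])

def ccr_efficiency(k):
    n_in, n_out = X.shape[1], Y.shape[1]
    # decision variables: [v (input weights), u (output weights)]
    c = np.concatenate([np.zeros(n_in), -Y[k]])              # maximize u . y_k
    A_ub = np.hstack([-X, Y])                                # u.y_j - v.x_j <= 0
    b_ub = np.zeros(X.shape[0])
    A_eq = np.concatenate([X[k], np.zeros(n_out)])[None, :]  # v . x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(1e-6, None)] * (n_in + n_out))
    return -res.fun

for k in range(len(X)):
    print(f"home {k}: efficiency = {ccr_efficiency(k):.3f}")
```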

  1. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short-term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain-specific event analysis abstractions. The prototype system defines

  2. Space mission scenario development and performance analysis tool

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Baker, John; Gilbert, John; Hanks, David

    2004-01-01

    This paper discusses a new and innovative approach for a rapid spacecraft multi-disciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during the formulation phase of a mission. Generally speaking, spacecraft resources are highly constrained on deep space missions and this approach makes it possible to maximize the use of existing resources to attain the best possible science return. This approach also has the potential benefit of reducing the risk of costly design changes made later in the design cycle necessary to meet the mission requirements by understanding system design sensitivities early and adding appropriate margins. This paper will describe the approach used by the Mars Science Laboratory Project to accomplish this result.

  3. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    NASA Astrophysics Data System (ADS)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  4. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    DOE PAGES Beta

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; et al

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  5. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    SciTech Connect

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  6. SMART (Shop floor Modeling, Analysis and Reporting Tool Project

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

    1999-01-01

    This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description of it is found in the full documentation given to the NASA liaison. This documentation is also found on the A.R.I.S.E. Center web site, under a protected directory. Only authorized users can gain access to this site.

  7. A SOLIDS ANALYSIS APPROACH INCORPORATING ARGON-ION MILLING TO COPPER AND LEAD PIPE SCALE ANALYSIS

    EPA Science Inventory

    Corrosion of copper and lead plumbing materials in water is complex and has been the topic of a number of studies (Lucey 1967; Edwards et al. 1994a; Edwards et al. 1994b; Duthil et al. 1996; Harrison et al. 2004). Solids analysis is one of the most convenient and nfo...

  8. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
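
    The paper's target, automatic construction of a Markov reliability model from a design description, ends in a model like the toy below. The states, rates, and coverage values are invented for illustration; the point is only what solving the fault-occurrence model amounts to:

```python
# A duplex system with failure rate lam per unit, fault coverage c, no repair.
import numpy as np
from scipy.linalg import expm

lam, c = 1e-4, 0.99          # failures/hour, coverage probability (assumed)
# states: 0 = both units up, 1 = one unit up, 2 = system failed
Q = np.array([
    [-2 * lam, 2 * lam * c, 2 * lam * (1 - c)],
    [0.0,      -lam,        lam],
    [0.0,      0.0,         0.0],
])

t = 10.0  # mission hours
p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t)   # state probabilities at time t
print(f"reliability at {t:.0f} h: {1.0 - p[2]:.6f}")
```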

  9. Aeroelastic Analysis of Helicopter Rotor Blades Incorporating Anisotropic Piezoelectric Twist Actuation

    NASA Technical Reports Server (NTRS)

    Wilkie, W. Keats; Belvin, W. Keith; Park, K. C.

    1996-01-01

    A simple aeroelastic analysis of a helicopter rotor blade incorporating embedded piezoelectric fiber composite, interdigitated electrode blade twist actuators is described. The analysis consists of a linear torsion and flapwise bending model coupled with a nonlinear ONERA based unsteady aerodynamics model. A modified Galerkin procedure is performed upon the rotor blade partial differential equations of motion to develop a system of ordinary differential equations suitable for dynamics simulation using numerical integration. The twist actuation responses for three conceptual fullscale blade designs with realistic constraints on blade mass are numerically evaluated using the analysis. Numerical results indicate that useful amplitudes of nonresonant elastic twist, on the order of one to two degrees, are achievable under one-g hovering flight conditions for interdigitated electrode poling configurations. Twist actuation for the interdigitated electrode blades is also compared with the twist actuation of a conventionally poled piezoelectric fiber composite blade. Elastic twist produced using the interdigitated electrode actuators was found to be four to five times larger than that obtained with the conventionally poled actuators.
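
    The computational backbone described here, a Galerkin projection followed by numerical integration of ordinary differential equations, reduces each retained mode to a forced oscillator. The sketch below is not the authors' model; the frequency, damping, and forcing values are invented to show only that final integration step:

```python
# After Galerkin projection, a modal coordinate q obeys
#   q'' + 2*zeta*w*q' + w**2 * q = f(t)
import numpy as np
from scipy.integrate import solve_ivp

w = 2 * np.pi * 4.3    # modal frequency (rad/s), illustrative
zeta = 0.02            # modal damping ratio, illustrative
f_act = 1.0            # normalized piezoelectric twist forcing amplitude
w_f = 2 * np.pi * 4.0  # actuation frequency, deliberately nonresonant

def rhs(t, y):
    q, qdot = y
    return [qdot, f_act * np.sin(w_f * t) - 2 * zeta * w * qdot - w**2 * q]

sol = solve_ivp(rhs, (0.0, 5.0), [0.0, 0.0], max_step=1e-3)
print(f"peak modal twist response: {np.abs(sol.y[0]).max():.4f}")
```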

  10. An aeroelastic analysis of helicopter rotor blades incorporating piezoelectric fiber composite twist actuation

    NASA Technical Reports Server (NTRS)

    Wilkie, W. Keats; Park, K. C.

    1996-01-01

    A simple aeroelastic analysis of a helicopter rotor blade incorporating embedded piezoelectric fiber composite, interdigitated electrode blade twist actuators is described. The analysis consists of a linear torsion and flapwise bending model coupled with a nonlinear ONERA based unsteady aerodynamics model. A modified Galerkin procedure is performed upon the rotor blade partial differential equations of motion to develop a system of ordinary differential equations suitable for numerical integration. The twist actuation responses for three conceptual full-scale blade designs with realistic constraints on blade mass are numerically evaluated using the analysis. Numerical results indicate that useful amplitudes of nonresonant elastic twist, on the order of one to two degrees, are achievable under one-g hovering flight conditions for interdigitated electrode poling configurations. Twist actuation for the interdigitated electrode blades is also compared with the twist actuation of a conventionally poled piezoelectric fiber composite blade. Elastic twist produced using the interdigitated electrode actuators was found to be four to five times larger than that obtained with the conventionally poled actuators.

  11. A methodology to incorporate life cycle analysis and the triple bottom line mechanism for sustainable management of industrial enterprises

    NASA Astrophysics Data System (ADS)

    Wang, Ling; Lin, Li

    2004-02-01

    Since the 1970s, the environmental protection movement has challenged industries to increase their investment in Environmentally Conscious Manufacturing (ECM) techniques and management tools. Social considerations for global citizens and their descendants also motivated the examination of the complex issues of sustainable development beyond the immediate economic impact. Consequently, industrial enterprises have started to understand sustainable development in considering the Triple Bottom Line (TBL): economic prosperity, environmental quality and social justice. For the management, however, a lack of systematic ECM methodologies hinders their effort in planning, evaluating, reporting and auditing of sustainability. To address this critical need, this research develops a framework of a sustainable management system by incorporating a Life Cycle Analysis (LCA) of industrial operations with the TBL mechanism. A TBL metric system with seven sets of indices for the TBL elements and their complex relations is identified for the comprehensive evaluation of a company's sustainability performance. Utilities of the TBL indices are estimated to represent the views of various stakeholders, including the company, investors, employees and the society at large. Costs of these indices are also captured to reflect the company's effort in meeting the utilities. An optimization model is formulated to maximize the economic, environmental and social benefits by the company's effort in developing sustainable strategies. To promote environmental and social consciousness, the methodology can significantly facilitate management decisions by its capabilities of including "non-business" values and external costs that the company has not contemplated before.

  12. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT program uses MODIS metadata to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System could provide the foundation for a large-area crop-surveillance system that could identify
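
    One representative TSPT-style temporal operation is easy to sketch: fuse daily layers from the two MODIS platforms, drop QA-flagged pixels, and composite over the window. This illustrates the general technique only, not NASA's code, and the data are fabricated:

```python
import numpy as np

days, ny, nx = 8, 4, 4
rng = np.random.default_rng(0)
terra = rng.uniform(0.1, 0.9, (days, ny, nx))   # stand-in daily NDVI layers
aqua = rng.uniform(0.1, 0.9, (days, ny, nx))
bad = rng.random((2, days, ny, nx)) < 0.3       # QA flags: True = unusable

stack = np.where(bad, np.nan, np.stack([terra, aqua]))  # mask flagged pixels
composite = np.nanmax(stack, axis=(0, 1))   # per-pixel max-value composite
print(composite.round(2))
```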

  13. Aerospace Power Systems Design and Analysis (APSDA) Tool

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The tool has a simple user interface and operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  14. ISAC: A tool for aeroservoelastic modeling and analysis

    NASA Technical Reports Server (NTRS)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.
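
    The abstract mentions approximating the frequency-domain unsteady aerodynamic forces with rational functions of the Laplace variable. A widely used form of such an approximation is Roger's; whether ISAC's option matches this form exactly is an assumption here:

```latex
\[
  \mathbf{Q}(s) \approx \mathbf{A}_0 + \mathbf{A}_1 \frac{sb}{V}
  + \mathbf{A}_2 \left(\frac{sb}{V}\right)^{2}
  + \sum_{i=1}^{n_L} \mathbf{A}_{i+2}\,\frac{s}{s + (V/b)\,\beta_i},
\]
% b = reference semichord, V = airspeed, beta_i = aerodynamic lag roots;
% the states introduced by the lag terms are what yield the linear
% time-invariant state-space equations of motion the abstract mentions.
```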

  15. Federal metering data analysis needs and existing tools

    SciTech Connect

    Henderson, Jordan W.; Fowler, Kimberly M.

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses the Federal metering data analysis needs and some existing tools.

  16. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  17. Design and Application of the Exploration Maintainability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew
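
    The kind of question such a probabilistic simulator answers can be illustrated with a few lines of Monte Carlo. EMAT's internals are not described here; the failure model, MTBF, and mission length below are invented:

```python
# With Poisson-distributed failures, how many spares give a target
# probability of completing the mission without an unrepairable failure?
import numpy as np

rng = np.random.default_rng(42)
mtbf_h, mission_h, trials = 2000.0, 18000.0, 100_000
failures = rng.poisson(mission_h / mtbf_h, trials)   # ~9 expected failures

for spares in range(21):
    p_ok = np.mean(failures <= spares)
    print(f"{spares:2d} spares -> P(supportable) = {p_ok:.3f}")
    if p_ok >= 0.99:
        break
```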

  18. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    NASA Technical Reports Server (NTRS)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architectures and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, familiarity with the subsystems and with the analysis of results is assumed on the part of the intended user group. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  19. VMPLOT: A versatile analysis tool for mission operations

    NASA Technical Reports Server (NTRS)

    Bucher, Allen W.

    1993-01-01

    VMPLOT is a versatile analysis tool designed by the Magellan Spacecraft Team to graphically display engineering data used to support mission operations. While there is nothing revolutionary or innovative about graphical data analysis tools, VMPLOT has some distinguishing features that set it apart from other custom or commercially available software packages. These features include the ability to utilize time in a Universal Time Coordinated (UTC) or Spacecraft Clock (SCLK) format as an enumerated data type, the ability to automatically scale both axes based on the data to be displayed (including time), the ability to combine data from different files, and the ability to utilize the program either interactively or in batch mode, thereby enhancing automation. Another important feature of VMPLOT not visible to the user is the software engineering philosophies utilized. A layered approach was used to isolate program functionality to different layers. This was done to increase program portability to different platforms and to ease maintenance and enhancements due to changing requirements. The functionality of the unique features of VMPLOT is described, highlighting the algorithms that make these features possible. The software engineering philosophies used in the creation of the software tool are also summarized.

  20. SATRAT: Staphylococcus aureus transcript regulatory network analysis tool

    PubMed Central

    Nagarajan, Vijayaraj; Elasri, Mohamed O.

    2015-01-01

    Staphylococcus aureus is a commensal organism that primarily colonizes the nose of healthy individuals. S. aureus causes a spectrum of infections that range from skin and soft-tissue infections to fatal invasive diseases. S. aureus uses a large number of virulence factors that are regulated in a coordinated fashion. The complex regulatory mechanisms have been investigated in numerous high-throughput experiments. Access to this data is critical to studying this pathogen. Previously, we developed a compilation of microarray experimental data to enable researchers to search, browse, compare, and contrast transcript profiles. We have substantially updated this database and have built a novel exploratory tool—SATRAT—the S. aureus transcript regulatory network analysis tool, based on the updated database. This tool is capable of performing deep searches using a query and generating an interactive regulatory network based on associations among the regulators of any query gene. We believe this integrated regulatory network analysis tool would help researchers explore the missing links and identify novel pathways that regulate virulence in S. aureus. Also, the data model and the network generation code used to build this resource is open sourced, enabling researchers to build similar resources for other bacterial systems. PMID:25653902
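
    The central data structure, a directed graph of regulator-target associations that can be queried for any gene's neighborhood, can be sketched as follows. This is not SATRAT's code; sarA and agr are real S. aureus global regulators, but the graph-building and query steps are only illustrative:

```python
import networkx as nx

# (regulator, target, effect) association triples
associations = [
    ("sarA", "hla", "activates"),
    ("agr", "hla", "activates"),
    ("agr", "spa", "represses"),
    ("sarA", "agr", "activates"),
]

g = nx.DiGraph()
for reg, tgt, effect in associations:
    g.add_edge(reg, tgt, effect=effect)

query = "hla"  # alpha-hemolysin gene as an example query
for reg in g.predecessors(query):
    print(f"{reg} {g[reg][query]['effect']} {query}")
```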

  1. ICAT: a General Purpose Image Reduction and Analysis Tool for Robotic Observatories

    NASA Astrophysics Data System (ADS)

    Colome, J.; Ribas, I.

    2006-08-01

    New technologies applied to astronomical facilities have allowed the development of fully automatically controlled observatories. For efficient operation, reliable and fast data calibration and analysis software is crucial for the automatic treatment of the vast number of images to be used in scientific studies. The IEEC Calibration and Analysis Tool (ICAT) software presented here has been developed as a tool for robotic observatories with the objective of managing astronomical images to extract relevant scientific information in real time. It has been implemented in a modular way to allow easy use in combination with most CCD camera control software. ICAT is based on Perl and Unix shell scripts for file management, and it uses the NOAO PC-IRAF and Source Extractor packages as image handling tools. Because of its modularity and design, the software can be easily incorporated into the framework of a general observatory control system, under either batch or direct user control. All routines and products will conform to Virtual Observatory standards.

  2. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of a wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We described the process for enabling the fire location, smoke forecast, smoke observation, and
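
    As a concrete taste of the service chain, a client-side request for forecast output through a WCS interface looks roughly like this. The endpoint URL and coverage name are placeholders; the parameter names follow standard OGC WCS 1.0.0 key-value-pair conventions:

```python
import requests

params = {
    "service": "WCS",
    "version": "1.0.0",
    "request": "GetCoverage",
    "coverage": "calpuff_pm25_surface",   # hypothetical coverage name
    "bbox": "-125,32,-114,42",            # lon/lat box over California
    "time": "2009-08-30T18:00:00Z",
    "crs": "EPSG:4326",
    "width": "256", "height": "256",
    "format": "GeoTIFF",
}
resp = requests.get("https://example.org/wcs", params=params, timeout=60)
resp.raise_for_status()
with open("smoke_forecast.tif", "wb") as f:
    f.write(resp.content)   # gridded PM2.5 forecast, ready for analysis
```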

  3. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Analysis and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used, NASA-developed open-source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); a .NET/C# version is used for development. It leverages shared code samples from the World Wind community, and the COAST 2.0 enhancement direction is based on coastal science community feedback and a needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  4. Protocol analysis as a tool for behavior analysis

    PubMed Central

    Austin, John; Delaney, Peter F.

    1998-01-01

    The study of thinking is made difficult by the fact that many of the relevant stimuli and responses are not apparent. Although the use of verbal reports has a long history in psychology, it is only recently that Ericsson and Simon's (1993) book on verbal reports explicated the conditions under which such reports may be reliable and valid. We review some studies in behavior analysis and cognitive psychology that have used talk-aloud reporting. We review particular methods for collecting reliable and valid verbal reports using the “talk-aloud” method as well as discuss alternatives to the talk-aloud procedure that are effective under different task conditions, such as the use of reports after completion of very rapid task performances. We specifically caution against the practice of asking subjects to reflect on the causes of their own behavior and the less frequently discussed problems associated with providing inappropriate social stimulation to participants during experimental sessions. PMID:22477126

  5. Incorporating natural capital into economy-wide impact analysis: a case study from Alberta.

    PubMed

    Patriquin, Mike N; Alavalapati, Janaki R R; Adamowicz, Wiktor L; White, William A

    2003-01-01

    Traditionally, decision-makers have relied on economic impact estimates derived from conventional economy-wide models. Conventional models lack the environmental linkages necessary for examining environmental stewardship and economic sustainability, and in particular the ability to assess the impact of policies on natural capital. This study investigates environmentally extended economic impact estimation on a regional scale using a case study region in the province of Alberta known as the Foothills Model Forest (FMF). Conventional economic impact models are environmentally extended in pursuit of enhancing policy analysis and local decision-making. It is found that the flexibility of the computable general equilibrium (CGE) modeling approach offers potential for environmental extension, with a solid grounding in economic theory. The CGE approach may be the tool of the future for more complete integrated environment and economic impact assessment. PMID:12859004

  6. TA-DA: A TOOL FOR ASTROPHYSICAL DATA ANALYSIS

    SciTech Connect

    Da Rio, Nicola; Robberto, Massimo

    2012-12-01

    We present the Tool for Astrophysical Data Analysis (TA-DA), a new software tool aimed at greatly simplifying and improving the analysis of stellar photometric data in comparison with theoretical models, and at allowing the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled Interactive Data Language widget-based application; its graphical user interface makes it considerably user-friendly. In this paper, we describe the software and its functionalities.

  7. Tool for Statistical Analysis and Display of Landing Sites

    NASA Technical Reports Server (NTRS)

    Wawrzyniak, Geoffrey; Kennedy, Brian; Knocke, Philip; Michel, John

    2006-01-01

    MarsLS is a software tool for analyzing statistical dispersion of spacecraft-landing sites and displaying the results of its analyses. Originally intended for the Mars Explorer Rover (MER) mission, MarsLS is also applicable to landing sites on Earth and non-MER sites on Mars. MarsLS is a collection of interdependent MATLAB scripts that utilize the MATLAB graphical-user-interface software environment to display landing-site data (see figure) on calibrated image-maps of the Martian or other terrain. The landing-site data comprise latitude/longitude pairs generated by Monte Carlo runs of other computer programs that simulate entry, descent, and landing. Using these data, MarsLS can compute a landing-site ellipse, a standard means of depicting the area within which the spacecraft can be expected to land with a given probability. MarsLS incorporates several features for the user's convenience, including capabilities for drawing lines and ellipses, overlaying kilometer or latitude/longitude grids, drawing and/or specifying lines and/or points, entering notes, defining and/or displaying polygons to indicate hazards or areas of interest, and evaluating hazardous and/or scientifically interesting areas. As part of such an evaluation, MarsLS can compute the probability of landing in a specified polygonal area.
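
    One standard way to turn Monte Carlo landing points into a containment ellipse (MarsLS's exact algorithm is not given in this description) is to scale the eigen-axes of the sample covariance to the desired probability level:

```python
import numpy as np

rng = np.random.default_rng(1)
# stand-in for Monte Carlo entry/descent/landing output: lon/lat pairs
pts = rng.multivariate_normal([137.4, -4.6], [[0.04, 0.01], [0.01, 0.02]], 5000)

center = pts.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(pts.T))
k = np.sqrt(-2.0 * np.log(1.0 - 0.99))   # 99% containment for a 2-D Gaussian
semi_axes = k * np.sqrt(evals)
angle = np.degrees(np.arctan2(evecs[1, -1], evecs[0, -1]))  # major-axis heading

print(f"center lon/lat: {center.round(3)}")
print(f"semi-axes (deg): {semi_axes.round(3)}, orientation: {angle:.1f} deg")
```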

  8. Principles and tools for collaborative entity-based intelligence analysis.

    PubMed

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis. PMID:20075480

  9. Operations other than war: Requirements for analysis tools research report

    SciTech Connect

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  10. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, the Joule-Thomson valve, the turboexpander, and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives a clear idea for deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
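
    For orientation, the quantity such an analysis ultimately maximizes in a Linde cycle is the liquid yield. Under the usual textbook idealizations (perfect heat exchanger, isenthalpic expansion) it reduces to the standard result below; the HYSYS model in the paper resolves the same energy balance with real component models:

```latex
\[
  y \;=\; \frac{\dot m_f}{\dot m} \;=\; \frac{h_1 - h_2}{h_1 - h_f},
\]
% h_1: enthalpy of the low-pressure gas leaving the warm end of the heat
% exchanger; h_2: enthalpy of the high-pressure gas entering it;
% h_f: enthalpy of the saturated liquid drawn off.
```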

  11. The Smooth Decomposition as a nonlinear modal analysis tool

    NASA Astrophysics Data System (ADS)

    Bellizzi, Sergio; Sampaio, Rubens

    2015-12-01

    The Smooth Decomposition (SD) is a statistical analysis technique for finding structures in an ensemble of spatially distributed data such that the vector directions not only capture the maximum possible variance but also yield motions, along the vector directions, that are as smooth in time as possible. In this paper, the notion of the dual smooth modes is introduced and used in the framework of oblique projection to expand a random response of a system. The dual modes define a tool that transforms the SD into an efficient modal analysis tool. The main properties of the SD are discussed and some new optimality properties of the expansion are deduced. The parameters of the SD give access to modal parameters of a linear system (mode shapes, resonance frequencies and modal energy participations). In case of nonlinear systems, a richer picture of the evolution of the modes versus energy can be obtained by analyzing the responses under several excitation levels. This novel analysis of a nonlinear system is illustrated by an example.

  12. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  13. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  14. Image and Data-analysis Tools For Paleoclimatic Reconstructions

    NASA Astrophysics Data System (ADS)

    Pozzi, M.

    Here we propose a directory of software instruments and computing resources chosen to resolve problems that arise in paleoclimatic reconstruction. The following points are discussed in particular: 1) Numerical analysis of paleo-data (fossil abundances, species analyses, isotopic signals, chemical-physical parameters, biological data): a) statistical analyses (univariate, diversity, rarefaction, correlation, ANOVA, F and T tests, Chi^2); b) multidimensional analyses (principal components, correspondence, cluster analysis, seriation, discriminant, autocorrelation, spectral analysis); c) neural analyses (backpropagation net, Kohonen feature map, Hopfield net, genetic algorithms). 2) Graphical analysis (visualization tools) of paleo-data (quantitative and qualitative fossil abundances, species analyses, isotopic signals, chemical-physical parameters): a) 2-D data analyses (graph, histogram, ternary, survivorship); b) 3-D data analyses (direct volume rendering, isosurfaces, segmentation, surface reconstruction, surface simplification, generation of tetrahedral grids). 3) Quantitative and qualitative digital image analysis (macro- and microfossil image analysis, Scanning Electron Microscope and Optical Polarized Microscope image capture and analysis, morphometric data analysis, 3-D reconstructions): a) 2-D image analysis (correction of image defects, enhancement of image detail, converting texture and directionality to grey-scale or colour differences, visual enhancement using pseudo-colour, pseudo-3D, thresholding of image features, binary image processing, measurements, stereological measurements, measuring features on a white background); b) 3-D image analysis (basic stereological procedures; two-dimensional structures: area fraction from the point count, volume fraction from the point count; three-dimensional structures: surface area and the line intercept count; three-dimensional microstructures: line length and the

  15. A POST ANALYSIS OF A PREVENTIVE AND CHRONIC HEALTHCARE TOOL.

    PubMed

    Borrayo, Brooke D; O'Lawrence, Henry

    2016-01-01

    This study uses a data set from Kaiser Permanente to examine the post-implementation of a preventive and chronic care tool that utilizes a clinical information system, delivery system design, and clinical decision support to maximize the office visit. The analysis suggests a significant positive relationship between utilization rates and addressing preventive and chronic care gaps. The analysis does not indicate a significant positive relationship with the successfully captured rate, which reflects closing the care gap within 45 days. The use of the preventive care tool will assist members in satisfying the preventive care gap, cervical cancer screening, within 45 days of the encounter. PMID:27483973

  16. CFD Methods and Tools for Multi-Element Airfoil Analysis

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; George, Michael W. (Technical Monitor)

    1995-01-01

    This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined, one which uses multi-block patched grids, the other uses overset chimera grids. Turbulence and transition modeling will be discussed.

  17. EEG analysis using wavelet-based information tools.

    PubMed

    Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

    2006-06-15

    Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time-period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizures, triggers a self-organized brain state characterized by both order and maximal complexity. PMID:16675027
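
    The quantifiers named here have compact standard definitions: with E_j the wavelet energy at resolution level j, the relative wavelet energy and the total wavelet entropy are

```latex
\[
  p_j = \frac{E_j}{\sum_k E_k}, \qquad
  S_{\mathrm{WT}} = -\sum_j p_j \ln p_j ,
\]
% so the recruitment rhythm appears as a concentration of the p_j in a few
% bands, which is what drives the entropy down during seizure development.
```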

  18. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    NASA Technical Reports Server (NTRS)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  19. A Design and Performance Analysis Tool for Superconducting RF Systems

    NASA Astrophysics Data System (ADS)

    Schilcher, Th.; Simrock, S. N.; Merminga, L.; Wang, D. X.

    1997-05-01

    Superconducting rf systems are usually operated with continuous rf power or with rf pulse lengths exceeding 1 ms to maximize the overall wall-plug power efficiency. Typical examples are CEBAF at Jefferson Lab and the TESLA Test Facility at DESY. The long pulses allow for effective application of feedback to stabilize the accelerating field in the presence of microphonics, Lorentz force detuning, and fluctuations of the beam current. In this paper we describe a set of tools to be used with MATLAB and SIMULINK, which allow analysis of the quality of field regulation for a given design. The tools include models for the cavities, the rf power source, the beam, sources of field perturbations, and the rf feedback system. The rf control relevant electrical and mechanical characteristics of the cavity are described in the form of time-varying state space models. The power source is modeled as a current generator and includes saturation characteristics and noise. An arbitrary time structure can be imposed on the beam current to reflect a macro-pulse structure and bunch charge fluctuations. For rf feedback several schemes can be selected: traditional amplitude and phase control as well as I/Q control. The choices for the feedback controller include analog or digital approaches and various choices of frequency response. Feedforward can be added to further suppress repetitive errors. The results of a performance analysis of the CEBAF and the TESLA Linac rf systems using these tools are presented.

  20. A design and performance analysis tool for superconducting RF systems

    SciTech Connect

    T. Schilcher; S.N. Simrock; L. Merminga; D.X. Wang

    1997-05-01

    Superconducting rf systems are usually operated with continuous rf power or with rf pulse lengths exceeding 1 ms to maximize the overall wall-plug power efficiency. Typical examples are CEBAF at the Thomas Jefferson National Accelerator Facility (Jefferson Lab) and the TESLA Test Facility at DESY. The long pulses allow for effective application of feedback to stabilize the accelerating field in the presence of microphonics, Lorentz force detuning, and fluctuations of the beam current. In this paper the authors describe a set of tools to be used with MATLAB and SIMULINK, which allow analysis of the quality of field regulation for a given design. The tools include models for the cavities, the rf power source, the beam, sources of field perturbations, and the rf feedback system. The rf control relevant electrical and mechanical characteristics of the cavity are described in the form of time-varying state space models. The power source is modeled as a current generator and includes saturation characteristics and noise. An arbitrary time structure can be imposed on the beam current to reflect a macro-pulse structure and bunch charge fluctuations. For rf feedback several schemes can be selected: traditional amplitude and phase control as well as I/Q control. The choices for the feedback controller include analog or digital approaches and various choices of frequency response. Feedforward can be added to further suppress repetitive errors. The results of a performance analysis of the CEBAF and the TESLA Linac rf systems using these tools are presented.
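
    The cavity model described in these two records reduces, in the baseband (envelope) approximation, to a first-order complex state equation driven by the generator current. The following Python sketch is a simplified stand-in for the MATLAB/SIMULINK tools, not a reproduction of them; the half-bandwidth, detuning, and controller gain are assumed values.

        import numpy as np

        # Baseband cavity envelope: dV/dt = (-w_half + 1j*dw)*V + w_half*R_L*I_drive
        w_half = 2 * np.pi * 215.0   # cavity half-bandwidth [rad/s] (assumed)
        dw     = 2 * np.pi * 100.0   # microphonic detuning [rad/s] (assumed)
        R_L    = 1.0                 # normalized loaded impedance
        gain   = 50.0                # proportional I/Q feedback gain (assumed)
        V_set  = 1.0 + 0j            # field setpoint in the I/Q plane
        dt, T  = 1e-6, 20e-3

        V = 0j
        for _ in range(int(T / dt)):
            err = V_set - V                       # I/Q error vector
            I_drive = V_set / R_L + gain * err    # feedforward + proportional feedback
            V += ((-w_half + 1j * dw) * V + w_half * R_L * I_drive) * dt
        print(abs(V), np.angle(V))  # residual amplitude and phase error under detuning

    Replacing the proportional term with a PI controller, adding beam loading as a second current term, and making the detuning time-varying would recover, step by step, the ingredients the abstracts list.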

  1. Java Analysis Tools for Element Production Calculations in Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Lingerfelt, E.; Hix, W.; Guidry, M.; Smith, M.

    2002-12-01

    We are developing a set of extendable, cross-platform tools and interfaces using Java and vector graphic technologies such as SVG and SWF to facilitate element production calculations in computational astrophysics. The Java technologies are customizable and portable, and can be utilized as stand-alone applications or distributed across a network. These tools, which have broad applications in general scientific visualization, are currently being used to explore and analyze a large library of nuclear reaction rates and visualize results of explosive nucleosynthesis calculations with compact, high quality vector graphics. The facilities for reading and plotting nuclear reaction rates and their components from a network or library permit the user to easily include new rates and compare and adjust current ones. Sophisticated visualization and graphical analysis tools offer the ability to view results in an interactive, scalable vector graphics format, which leads to a dramatic (ten-fold) reduction in visualization file sizes while maintaining high visual quality and interactive control. ORNL Physics Division is managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

  2. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    SciTech Connect

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
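
    NREL's degradation model itself is not described in this abstract, but the structure of such a lifetime calculation can be illustrated with a deliberately simplified fade model. In the Python sketch below, the calendar-fade and cycle-fade coefficients, the Arrhenius temperature dependence, and the square-root-of-time law are generic textbook assumptions, not BLAST's actual parameters.

        import numpy as np

        def remaining_capacity(days, temps_c, cycles_per_day, depth_of_discharge):
            """Toy lithium-ion fade: calendar (sqrt-time, Arrhenius) + cycling terms."""
            k_cal, Ea, R = 0.0025, 50e3, 8.314   # assumed coefficients
            k_cyc = 2e-5                         # assumed fade per full-depth cycle
            T = np.asarray(temps_c) + 273.15
            arrhenius = np.exp(-Ea / R * (1 / T - 1 / 298.15)).mean()
            cal_fade = k_cal * np.sqrt(days) * arrhenius
            cyc_fade = k_cyc * days * cycles_per_day * depth_of_discharge ** 1.5
            return 1.0 - cal_fade - cyc_fade

        # Ten years at 30 C with one 80%-depth cycle per day:
        print(remaining_capacity(3650, [30.0], 1.0, 0.8))

    BLAST's value, per the abstract, lies in replacing the toy inputs above with application-specific temperature, state-of-charge, and cycling histories derived from vehicle or grid usage data and historic climate records.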

  3. Risk analysis systems for veterinary biologicals: a regulator's tool box.

    PubMed

    Osborne, C G; McElvaine, M D; Ahl, A S; Glosser, J W

    1995-12-01

    Recent advances in biology and technology have significantly improved our ability to produce veterinary biologicals of high purity, efficacy and safety, virtually anywhere in the world. At the same time, increasing trade and comprehensive trade agreements, such as the Uruguay Round of the General Agreement on Tariffs and Trade (GATT: now the World Trade Organisation [WTO]), have put pressure on governments to use scientific principles in the regulation of trade for a wide range of products, including veterinary biologicals. In many cases, however, nations have been reluctant to allow the movement of veterinary biologicals, due to the perceived threat of importing an exotic disease. This paper discusses the history of risk analysis as a decision support tool and provides examples of how this tool may be used in a science-based regulatory system for veterinary biologicals. A wide variety of tools are described, including qualitative, semi-quantitative and quantitative methods, most with a long history of use in engineering and the health and environmental sciences. PMID:8639961

  4. Sensitivity Analysis of Flutter Response of a Wing Incorporating Finite-Span Corrections

    NASA Technical Reports Server (NTRS)

    Issac, Jason Cherian; Kapania, Rakesh K.; Barthelemy, Jean-Francois M.

    1994-01-01

    Flutter analysis of a wing is performed in compressible flow using a state-space representation of the unsteady aerodynamic behavior. Three different expressions are used to incorporate corrections due to the finite-span effects of the wing in estimating the lift-curve slope. The structural formulation is based on a Rayleigh-Ritz technique with Chebyshev polynomials used for the wing deflections. The aeroelastic equations are solved as an eigenvalue problem to determine the flutter speed of the wing. The flutter speeds are found to be higher in these cases, when compared to those obtained without accounting for the finite-span effects. The derivatives of the flutter speed with respect to the shape parameters, namely aspect ratio, area, taper ratio, and sweep angle, are calculated analytically. The shape sensitivity derivatives give a linear approximation to the flutter speed curves over a range of values of the shape parameter which is perturbed. Flutter and sensitivity calculations are performed on a wing using a lifting-surface unsteady aerodynamic theory using modules from a system of programs called FAST.
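
    The flutter computation described above can be illustrated with a classical two-degree-of-freedom typical section: assemble mass, stiffness, and (here quasi-steady) aerodynamic matrices, sweep airspeed, and look for an aeroelastic eigenvalue whose real part crosses zero. All parameter values in this Python sketch are illustrative assumptions, and the quasi-steady aerodynamics stands in for the paper's state-space unsteady model.

        import numpy as np

        m, S, Ia = 10.0, 0.5, 1.2    # mass, static imbalance, pitch inertia (assumed)
        kh, ka   = 4e4, 2e4          # plunge and pitch stiffness (assumed)
        c, e     = 0.3, 0.1          # chord and aerodynamic-center offset (assumed)
        rho, CLa = 1.225, 2 * np.pi
        M = np.array([[m, S], [S, Ia]])
        K = np.diag([kh, ka])

        def eigvals(U):
            qc = 0.5 * rho * U ** 2 * c * CLa
            Ka = qc * np.array([[0.0, -1.0], [0.0, e]])       # lift/moment from alpha
            Ca = qc / U * np.array([[-1.0, 0.0], [e, 0.0]])   # lift/moment from h_dot
            A = np.block([[np.zeros((2, 2)), np.eye(2)],
                          [np.linalg.solve(M, Ka - K), np.linalg.solve(M, Ca)]])
            return np.linalg.eigvals(A)

        for U in np.arange(5.0, 200.0, 1.0):
            if np.max(np.real(eigvals(U))) > 0:
                print("flutter onset near U =", U, "m/s"); break
        else:
            print("no flutter below 200 m/s for these parameters")

    Differentiating the critical-speed condition with respect to aspect ratio, area, taper, or sweep, which enter through the lift-curve slope and the structural model, gives the analytic shape sensitivities the abstract describes.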

  5. Image analysis tools and emerging algorithms for expression proteomics

    PubMed Central

    English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.

    2012-01-01

    Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput ‘shotgun’ proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user’s and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS. PMID:21046614

  6. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis

    PubMed Central

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can strengthen the analysis of genomic variants. Existing tools such as HotNet2 and dmGWAS provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f). PMID:27158457

  7. Incorporating operational flexibility into electric generation planning Impacts and methods for system design and policy analysis

    NASA Astrophysics Data System (ADS)

    Palmintier, Bryan S.

    This dissertation demonstrates how flexibility in hourly electricity operations can impact long-term planning and analysis for future power systems, particularly those with substantial variable renewables (e.g., wind) or strict carbon policies. Operational flexibility describes a power system's ability to respond to predictable and unexpected changes in generation or demand. Planning and policy models have traditionally not directly captured the technical operating constraints that determine operational flexibility. However, as demonstrated in this dissertation, this capability becomes increasingly important with the greater flexibility required by significant renewables (>= 20%) and the decreased flexibility inherent in some low-carbon generation technologies. Incorporating flexibility can significantly change optimal generation and energy mixes, lower system costs, improve policy impact estimates, and enable system designs capable of meeting strict regulatory targets. Methodologically, this work presents a new clustered formulation that tractably combines a range of normally distinct power system models, from hourly unit-commitment operations to long-term generation planning. This formulation groups similar generators into clusters to reduce problem size, while still retaining the individual unit constraints required to accurately capture operating reserves and other flexibility drivers. In comparisons against traditional unit commitment formulations, errors were generally less than 1%, while run times decreased by several orders of magnitude (e.g., 5000x). Extensive numerical simulations using a realistic Texas-based power system show that ignoring flexibility can underestimate carbon emissions by 50% or result in significant load and wind shedding to meet environmental regulations. Contributions of this dissertation include: 1. Demonstrating that operational flexibility can have an important impact on power system planning, and describing when and how these

  8. DCE@urLAB: a dynamic contrast-enhanced MRI pharmacokinetic analysis tool for preclinical data

    PubMed Central

    2013-01-01

    Background DCE@urLAB is a software application for analysis of dynamic contrast-enhanced magnetic resonance imaging data (DCE-MRI). The tool incorporates a friendly graphical user interface (GUI) to interactively select and analyze a region of interest (ROI) within the image set, taking into account the tissue concentration of the contrast agent (CA) and its effect on pixel intensity. Results Pixel-wise model-based quantitative parameters are estimated by fitting DCE-MRI data to several pharmacokinetic models using the Levenberg-Marquardt algorithm (LMA). DCE@urLAB also includes the semi-quantitative parametric and heuristic analysis approaches commonly used in practice. This software application has been programmed in the Interactive Data Language (IDL) and tested both with publicly available simulated data and preclinical studies from tumor-bearing mouse brains. Conclusions A user-friendly solution for applying pharmacokinetic and non-quantitative analysis DCE-MRI in preclinical studies has been implemented and tested. The proposed tool has been specially designed for easy selection of multi-pixel ROIs. A public release of DCE@urLAB, together with the open source code and sample datasets, is available at http://www.die.upm.es/im/archives/DCEurLAB/. PMID:24180558
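
    The pixel-wise fitting step can be sketched with the standard Tofts model, in which tissue concentration is the convolution of the arterial input function (AIF) with an exponential kernel scaled by Ktrans. The Python code below is a generic illustration using SciPy's Levenberg-Marquardt implementation, not DCE@urLAB's IDL code; the AIF shape and parameter values are assumed.

        import numpy as np
        from scipy.optimize import least_squares

        def tofts(params, t, cp):
            """Standard Tofts model: Ct = Ktrans * (cp convolved with exp(-(Ktrans/ve) t))."""
            ktrans, ve = params
            dt = t[1] - t[0]
            kernel = np.exp(-(ktrans / ve) * t)
            return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

        t = np.arange(0, 300, 2.0)                  # time [s]
        cp = 5.0 * (t / 60.0) * np.exp(-t / 80.0)   # assumed AIF
        ct = tofts([0.12 / 60, 0.3], t, cp)         # synthetic tissue curve
        ct += 0.002 * np.random.randn(len(t))       # measurement noise

        fit = least_squares(lambda p: tofts(p, t, cp) - ct,
                            x0=[0.05 / 60, 0.2], method="lm")
        print(fit.x)  # recovered (Ktrans [1/s], ve)

    Running such a fit independently for every pixel of a user-selected ROI, after converting signal intensity to contrast-agent concentration, is the core loop the abstract describes.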

  9. Environmental Tools and Radiological Assessment

    EPA Science Inventory

    This presentation details two tools (SADA and FRAMES) available for use in environmental assessments of chemicals that can also be used for radiological assessments of the environment. Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporate...

  10. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross-platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is

  11. The Lagrangian analysis tool LAGRANTO - version 2.0

    NASA Astrophysics Data System (ADS)

    Sprenger, M.; Wernli, H.

    2015-02-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities: (i) trajectory starting positions can be described easily based on different geometrical and/or meteorological conditions; e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels; (ii) a versatile selection of trajectories is offered based on single or combined criteria; these criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity (PV) greater than 2 PVU); and (iii) full versions are available for global ECMWF and regional COSMO data; core functionality is also provided for the regional WRF and UM models, and for the global 20th Century Reanalysis data set. The intuitive application of LAGRANTO is first presented for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO is used to quasi-operationally diagnose stratosphere-troposphere exchange events over Europe. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution are needed to adequately resolve the rather complex flow structure associated with orographic blocking due to the Alps. Finally, an example of backward trajectories presents the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes simple tools, e.g., to visualize and merge trajectories. Furthermore, a detailed user guide exists, which describes all LAGRANTO capabilities.

  12. The LAGRANTO Lagrangian analysis tool - version 2.0

    NASA Astrophysics Data System (ADS)

    Sprenger, M.; Wernli, H.

    2015-08-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities. Trajectory starting positions can be defined easily and flexibly based on different geometrical and/or meteorological conditions, e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels. After the computation of the trajectories, a versatile selection of trajectories is offered based on single or combined criteria. These criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity, PV, greater than 2 PVU; 1 PVU = 10^-6 K m^2 kg^-1 s^-1). Full versions of this new version of LAGRANTO are available for global ECMWF and regional COSMO data, and core functionality is provided for the regional WRF and MetUM models and the global 20th Century Reanalysis data set. The paper first presents the intuitive application of LAGRANTO for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO can be used to quasi-operationally diagnose stratosphere-troposphere exchange events. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution serve to resolve the rather complex flow structure associated with orographic blocking due to the Alps, as shown in a third example. A final example illustrates the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes auxiliary tools, e.g., to visualize trajectories. A detailed user guide describes all LAGRANTO capabilities.
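
    The trajectory-selection step of LAGRANTO's command language can be illustrated with a small stand-alone filter. The Python sketch below is not LAGRANTO code; it simply parses criteria of the "GT:PV:2" form and applies them to a trajectory array whose layout (trajectories x times x variables) is assumed for illustration.

        import numpy as np

        OPS = {"GT": np.greater, "LT": np.less,
               "GE": np.greater_equal, "LE": np.less_equal}

        def select(traj, varnames, criterion, mode="any"):
            """Keep trajectories where, e.g., 'GT:PV:2' holds at any (or every) time."""
            op, var, thresh = criterion.split(":")
            values = traj[:, :, varnames.index(var)]        # (ntraj, ntime)
            hits = OPS[op](values, float(thresh))
            keep = hits.any(axis=1) if mode == "any" else hits.all(axis=1)
            return traj[keep]

        # 1000 synthetic trajectories, 25 time steps, variables (lon, lat, p, PV):
        traj = np.random.randn(1000, 25, 4) + 1.0
        print(len(select(traj, ["lon", "lat", "p", "PV"], "GT:PV:2")),
              "trajectories satisfy GT:PV:2")

    Combined criteria then amount to intersecting or unioning the boolean masks before indexing, which is essentially what the command language expresses.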

  13. Net energy analysis - powerful tool for selecting electric power options

    SciTech Connect

    Baron, S.

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners and the net energy analysis technique is an excellent accounting system on the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool to the energy planners considering their electric power options in the future.

  14. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    NASA Technical Reports Server (NTRS)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design phase of the product development process.

  15. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  16. Improved seismic data analysis tool in hydrogeophysical applications

    NASA Astrophysics Data System (ADS)

    Scholtz, P.

    2003-04-01

    To study the near-surface environment, several geophysical measurement techniques exist. Seismic methods are widely and successfully used to aid the solution of different geological tasks. Unfortunately, the financial resources for environment-related efforts are limited; hence it is vital to get the most information out of our geophysical field data. Hydrogeological investigations require special accuracy and resolution from the applied seismic methods. A dispersion analysis tool will be presented, which is insensitive to inaccuracies, works under noisy conditions, and can separate close arrivals. We show a wavelet-transform-based method in which the choice of the appropriate basic wavelet improves the quality of the results. Applying this analysis technique to the inversion of frequency-velocity functions of wave-guiding sequences, or to mapping inhomogeneities of the near-surface by group traveltime tomography, will yield more reliable physical parameters needed by hydrogeologists.
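
    The dispersion measurement sketched above amounts to locating, for each frequency, the arrival time of maximum envelope energy in a continuous wavelet transform, and converting traveltime to group velocity over a known offset. The Python code below is a generic illustration with PyWavelets' complex Morlet wavelet; the trace, offset, and frequency band are assumed, and the choice of basic wavelet is exactly the tuning knob the abstract emphasizes.

        import numpy as np
        import pywt

        def group_dispersion(trace, dt, offset_m, freqs_hz):
            """Group-velocity dispersion curve from one trace via complex-Morlet CWT."""
            wavelet = "cmor1.5-1.0"
            fc = pywt.central_frequency(wavelet)
            scales = fc / (np.asarray(freqs_hz) * dt)
            coeffs, _ = pywt.cwt(trace, scales, wavelet, sampling_period=dt)
            t_arrival = np.argmax(np.abs(coeffs), axis=1) * dt  # envelope peak per frequency
            return offset_m / np.maximum(t_arrival, dt)         # group velocity [m/s]

        # Synthetic dispersive arrival recorded at 100 m offset, 1 ms sampling:
        dt = 1e-3
        t = np.arange(0, 2.0, dt)
        trace = np.sin(2 * np.pi * (10 + 20 * t) * t) * np.exp(-((t - 0.8) / 0.3) ** 2)
        print(group_dispersion(trace, dt, 100.0, np.linspace(5, 40, 8)))

    Inverting such frequency-velocity curves for layered shear-velocity structure is then the hydrogeological payoff described in the abstract.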

  17. PyRAT (python radiography analysis tool): overview

    SciTech Connect

    Armstrong, Jerawan C; Temple, Brian A; Buescher, Kevin L

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library used is NOMAD (nonsmooth optimization by the MADS algorithm). Some of PyRAT's features are: (1) a hybrid nonlinear inverse problem with a calculated x-ray spectrum and detector response; (2) an optimization-based inversion approach with the goal of identifying unknown object configurations - the MVO problem; (3) use of Python libraries for radiographic image processing and analysis; (4) use of the Tikhonov regularization method for the linear inverse problem to recover partial information about object configurations; (5) use of a priori knowledge of problem solutions to define the feasible region and discrete neighbors for the MVO problem (initial data analysis + material library → a priori knowledge); and (6) use of the NOMAD (C++ version) software.
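
    Feature (4), Tikhonov regularization of a linear inverse problem, can be stated compactly: the regularized solution minimizes ||Ax - b||^2 + lam^2 ||x||^2, which is equivalent to an ordinary least-squares solve on an augmented system. The Python sketch below is generic NumPy, not PyRAT's implementation; the operator, data, and regularization weight are assumed.

        import numpy as np

        def tikhonov(A, b, lam):
            """Solve min_x ||Ax - b||^2 + lam^2 ||x||^2 via an augmented system."""
            n = A.shape[1]
            A_aug = np.vstack([A, lam * np.eye(n)])
            b_aug = np.concatenate([b, np.zeros(n)])
            return np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

        # Ill-conditioned toy problem: cumulative-average forward operator, noisy data.
        n = 50
        A = np.tril(np.ones((n, n))) / n
        x_true = np.sin(np.linspace(0, np.pi, n))
        b = A @ x_true + 1e-3 * np.random.randn(n)
        x_hat = tikhonov(A, b, lam=1e-2)
        print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

    In a radiographic setting, A would encode the projection physics, and the recovered x gives the "partial information" about object configurations that the feature list mentions.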

  18. Stacks: an analysis tool set for population genomics

    PubMed Central

    CATCHEN, JULIAN; HOHENLOHE, PAUL A.; BASSHAM, SUSAN; AMORES, ANGEL; CRESKO, WILLIAM A.

    2014-01-01

    Massively parallel short-read sequencing technologies, coupled with powerful software platforms, are enabling investigators to analyse tens of thousands of genetic markers. This wealth of data is rapidly expanding and allowing biological questions to be addressed with unprecedented scope and precision. The sizes of the data sets are now posing significant data processing and analysis challenges. Here we describe an extension of the Stacks software package to efficiently use genotype-by-sequencing data for studies of populations of organisms. Stacks now produces core population genomic summary statistics and SNP-by-SNP statistical tests. These statistics can be analysed across a reference genome using a smoothed sliding window. Stacks also now provides output formats for several commonly used downstream analysis packages. The expanded population genomics functions in Stacks will make it a useful tool to harness the newest generation of massively parallel genotyping data for ecological and evolutionary genetics. PMID:23701397
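
    The smoothed sliding window mentioned above can be illustrated with a short kernel-smoothing routine: a per-SNP statistic (e.g., FST) is averaged under a Gaussian kernel centered at each genome position. The bandwidth and synthetic data in this Python sketch are assumptions for illustration, not Stacks' internal defaults.

        import numpy as np

        def smooth_stat(positions, values, centers, sigma=150_000):
            """Gaussian-kernel weighted average of a per-SNP statistic along a chromosome."""
            out = np.empty(len(centers))
            for i, c in enumerate(centers):
                w = np.exp(-0.5 * ((positions - c) / sigma) ** 2)
                out[i] = np.sum(w * values) / np.sum(w)
            return out

        # 5000 SNPs on a 20-Mb chromosome with one localized FST peak:
        pos = np.sort(np.random.randint(0, 20_000_000, 5000))
        fst = 0.05 + 0.2 * np.exp(-0.5 * ((pos - 12_000_000) / 300_000) ** 2)
        centers = np.arange(0, 20_000_000, 100_000)
        print(smooth_stat(pos, fst, centers).max())

    Scanning the smoothed curve for regions that exceed a significance envelope is how such window-based statistics localize signatures of selection or differentiation.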

  19. Empirical calibrated radiocarbon sampler: a tool for incorporating radiocarbon-date and calibration error into Bayesian phylogenetic analyses of ancient DNA.

    PubMed

    Molak, Martyna; Suchard, Marc A; Ho, Simon Y W; Beilman, David W; Shapiro, Beth

    2015-01-01

    Studies of DNA from ancient samples provide a valuable opportunity to gain insight into past evolutionary and demographic processes. Bayesian phylogenetic methods can estimate evolutionary rates and timescales from ancient DNA sequences, with the ages of the samples acting as calibrations for the molecular clock. Sample ages are often estimated using radiocarbon dating, but the associated measurement error is rarely taken into account. In addition, the total uncertainty quantified by converting radiocarbon dates to calendar dates is typically ignored. Here, we present a tool for incorporating both of these sources of uncertainty into Bayesian phylogenetic analyses of ancient DNA. This empirical calibrated radiocarbon sampler (ECRS) integrates the age uncertainty for each ancient sequence over the calibrated probability density function estimated for its radiocarbon date and associated error. We use the ECRS to analyse three ancient DNA data sets. Accounting for radiocarbon-dating and calibration error appeared to have little impact on estimates of evolutionary rates and related parameters for these data sets. However, analyses of other data sets, particularly those with few or only very old radiocarbon dates, might be more sensitive to using artificially precise sample ages and should benefit from use of the ECRS. PMID:24964386
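
    The core of the ECRS approach, resampling each tip's age from its calibrated probability density rather than fixing it, can be sketched in a few lines of Python. The density used below is a placeholder (a bimodal mixture standing in for a calibrated radiocarbon date); in practice it would come from calibrating the lab measurement against a curve such as IntCal.

        import numpy as np

        def sample_tip_age(cal_years, cal_density, rng):
            """Draw one calendar age from a calibrated density (once per MCMC step)."""
            p = cal_density / cal_density.sum()
            return rng.choice(cal_years, p=p)

        rng = np.random.default_rng(1)
        years = np.arange(10_000, 12_000)                     # calendar years BP
        dens = (np.exp(-0.5 * ((years - 10_600) / 80) ** 2)   # placeholder bimodal density
                + 0.5 * np.exp(-0.5 * ((years - 11_100) / 60) ** 2))
        print([sample_tip_age(years, dens, rng) for _ in range(5)])

    Feeding a fresh draw to each iteration of the phylogenetic MCMC integrates the dating uncertainty into the posterior instead of treating sample ages as artificially precise.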

  20. The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy

    NASA Astrophysics Data System (ADS)

    Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.

    2008-12-01

    In recent years, the amount of data available to climate scientists has grown exponentially. Whether we're looking at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to an unprecedented growth of the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) is boosting the demand to unprecedented levels and keeping the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher definitions, and provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.

  1. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  2. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    SciTech Connect

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
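
    The first processing stages, separating galaxy from background pixels, finding the center, and building the radial intensity plot, can be sketched generically. This Python re-implementation of the idea is not Ganalyzer's code; the thresholding rule and sampling resolutions are assumptions.

        import numpy as np

        def radial_intensity(img, n_theta=360, n_r=50):
            """Threshold the background, estimate the galaxy center, and sample
            intensity on rings (radius x angle) to form a radial intensity plot."""
            mask = img > np.median(img) + 3 * img.std()       # crude foreground split
            ys, xs = np.nonzero(mask)
            cy, cx = ys.mean(), xs.mean()
            r_max = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2).max()
            thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
            plot = np.empty((n_r, n_theta))
            for i, r in enumerate(np.linspace(1, r_max, n_r)):
                yy = np.clip((cy + r * np.sin(thetas)).astype(int), 0, img.shape[0] - 1)
                xx = np.clip((cx + r * np.cos(thetas)).astype(int), 0, img.shape[1] - 1)
                plot[i] = img[yy, xx]
            return plot

        # Smoke test on a synthetic circular blob:
        yy, xx = np.mgrid[:128, :128]
        img = np.exp(-((yy - 64.0) ** 2 + (xx - 64.0) ** 2) / 400.0)
        print(radial_intensity(img).shape)

    The angular positions of intensity peaks in successive rows of this plot trace the arms; the slope of those peak positions with radius is the spirality measure from which Ganalyzer assigns a morphological class.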

  3. Charming the Snake: Student Experiences with Python Programming as a Data Analysis Tool

    NASA Astrophysics Data System (ADS)

    Booker, Melissa; Ivers, C. B.; Piper, M.; Powers, L.; Ali, B.

    2014-01-01

    During the past year, twelve high school students and one undergraduate student participated in the NASA/IPAC Teacher Archive Research Program (NITARP) alongside three high school educators and one informal educator, gaining experience in using Python as a tool for analyzing the vast amount of photometry data available from the Herschel and Spitzer telescopes in the NGC 281 region. Use of Python appeared to produce two main positive gains: (1) a gain in student ability to successfully write and execute Python programs for the bulk analysis of data, and (2) a change in their perceptions of the utility of computer programming and of the students’ abilities to use programming to solve problems. We outline the trials, tribulations, successes, and failures of the teachers and students through this learning exercise and provide some recommendations for incorporating programming in scientific learning.

  4. Advanced Launch Technology Life Cycle Analysis Using the Architectural Comparison Tool (ACT)

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.

    2015-01-01

    Life cycle technology impact comparisons for nanolauncher technology concepts were performed using an Affordability Comparison Tool (ACT) prototype. Examined are cost drivers and whether technology investments can dramatically affect the life cycle characteristics. Primary among the selected applications was the prospect of improving nanolauncher systems. As a result, findings and conclusions are documented for ways of creating more productive and affordable nanolauncher systems; e.g., an Express Lane-Flex Lane concept is put forward, and the beneficial effect of incorporating advanced integrated avionics is explored. Also, a Functional Systems Breakdown Structure (F-SBS) was developed to derive consistent definitions of the flight and ground systems for both system performance and life cycle analysis. Further, a comprehensive catalog of ground segment functions was created.

  5. Design and Analysis Tool for External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2012-01-01

    A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling, and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids usable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies, and CFD analyses were performed to verify some of the analysis results.

  6. Assessing Extremes Climatology Using NWS Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Hollingshead, A.; Hilderbrand, D.; Mayes, B.; Hartley, T.; Kempf McGavock, N. M.; Lau, E.; Olenic, E. A.; Motta, B.; Bunge, R.; Brown, L. E.; Fritsch, F.

    2010-12-01

    The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices' ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast Centers to conduct regional and local climate studies using station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. Field offices need a standardized, scientifically sound methodology for local climate analysis (such as trend, composite, and principal statistical and time-series analysis) that is comprehensive, accessible, and efficient, with the potential to expand with growing NOAA Climate Services needs. The methodology for climate analyses is practiced by the NWS Climate Prediction Center (CPC), NOAA National Climatic Data Center, and NOAA Earth System Research Laboratory, as well as NWS field office staff. LCAT will extend this practice to the local level, allowing it to become both widespread and standardized, and thus improve NWS climate services capabilities. The LCAT focus is on the local scale (as opposed to the national and global scales of CPC products). LCAT will: improve the professional competency of local office staff and their expertise in providing local information to their users, thereby improving the quality of local climate services; ensure adequate local input to CPC products that depend on local information, such as the U.S. Drought Monitor, thereby allowing improvement of CPC climate products; and allow testing of local climate variables beyond temperature averages and precipitation totals, such as climatology of

  7. Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool

    SciTech Connect

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-05-28

    This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health & safety risk analysis for decontamination and decommissioning projects.

  8. DevMouse, the mouse developmental methylome database and analysis tools

    PubMed Central

    Liu, Hongbo; Zhu, Rangfei; Lv, Jie; He, Hongjuan; Yang, Lin; Huang, Zhijun; Su, Jianzhong; Zhang, Yan; Yu, Shihuan; Wu, Qiong

    2014-01-01

    DNA methylation undergoes dynamic changes during mouse development and plays crucial roles in embryogenesis, cell-lineage determination and genomic imprinting. Bisulfite sequencing enables profiling of mouse developmental methylomes on an unprecedented scale; however, integrating and mining these data are challenges for experimental biologists. Therefore, we developed DevMouse, which focuses on the efficient storage of DNA methylomes in temporal order and quantitative analysis of methylation dynamics during mouse development. The latest release of DevMouse incorporates 32 normalized and temporally ordered methylomes across 15 developmental stages and related genome information. A flexible query engine is developed for acquisition of methylation profiles for genes, microRNAs, long non-coding RNAs and genomic intervals of interest across selected developmental stages. To facilitate in-depth mining of these profiles, DevMouse offers online analysis tools for the quantification of methylation variation, identification of differentially methylated genes, hierarchical clustering, gene function annotation and enrichment. Moreover, a configurable MethyBrowser is provided to view the base-resolution methylomes under a genomic context. In brief, DevMouse hosts comprehensive mouse developmental methylome data and provides online tools to explore the relationships of DNA methylation and development. Database URL: http://www.devmouse.org/ PMID:24408217

  9. DevMouse, the mouse developmental methylome database and analysis tools.

    PubMed

    Liu, Hongbo; Zhu, Rangfei; Lv, Jie; He, Hongjuan; Yang, Lin; Huang, Zhijun; Su, Jianzhong; Zhang, Yan; Yu, Shihuan; Wu, Qiong

    2014-01-01

    DNA methylation undergoes dynamic changes during mouse development and plays crucial roles in embryogenesis, cell-lineage determination and genomic imprinting. Bisulfite sequencing enables profiling of mouse developmental methylomes on an unprecedented scale; however, integrating and mining these data are challenges for experimental biologists. Therefore, we developed DevMouse, which focuses on the efficient storage of DNA methylomes in temporal order and quantitative analysis of methylation dynamics during mouse development. The latest release of DevMouse incorporates 32 normalized and temporally ordered methylomes across 15 developmental stages and related genome information. A flexible query engine is developed for acquisition of methylation profiles for genes, microRNAs, long non-coding RNAs and genomic intervals of interest across selected developmental stages. To facilitate in-depth mining of these profiles, DevMouse offers online analysis tools for the quantification of methylation variation, identification of differentially methylated genes, hierarchical clustering, gene function annotation and enrichment. Moreover, a configurable MethyBrowser is provided to view the base-resolution methylomes under a genomic context. In brief, DevMouse hosts comprehensive mouse developmental methylome data and provides online tools to explore the relationships of DNA methylation and development. Database URL: http://www.devmouse.org/ PMID:24408217

  10. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    SciTech Connect

    Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells and then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (spatial data, socio-economic and environmental data, and analytic data), a middle layer (data processing, model management, and GIS operations), and an application layer (climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.

  11. Verification and Validation of the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  12. Analysis and specification tools in relation to the APSE

    NASA Technical Reports Server (NTRS)

    Hendricks, John W.

    1986-01-01

    Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem has been translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada-based or Ada-related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few, chosen specification and design tools could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

  13. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    NASA Technical Reports Server (NTRS)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Being parametrically driven along with its user-programmable features can reduce or even eliminate any need for software modifications when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

  14. Validation of tool mark analysis of cut costal cartilage.

    PubMed

    Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

    2012-03-01

    This study was designed to establish the potential error rate associated with the generally accepted method of tool mark analysis of cut marks in costal cartilage. Three knives with different blade types were used to make experimental cut marks in costal cartilage of pigs. Each cut surface was cast, and each cast was examined by three analysts working independently. The presence of striations, regularity of striations, and presence of a primary and secondary striation pattern were recorded for each cast. The distance between each striation was measured. The results showed that striations were not consistently impressed on the cut surface by the blade's cutting edge. Also, blade type classification by the presence or absence of striations led to a 65% misclassification rate. Use of the classification tree and cross-validation methods and inclusion of the mean interstriation distance decreased the error rate to c. 50%. PMID:22081951

  15. Gnome--an Internet-based sequence analysis tool.

    PubMed

    Nakai, K; Tokimori, T; Ogiwara, A; Uchiyama, I; Niiyama, T

    1994-09-01

    Gnome (GenomeNet Open Mail-service Environment) is a sequence analysis tool that enables an end-user to make use of several Internet-based (mainly e-mail) services with an easy-to-use graphical user interface. Users can conduct homology and motif searches, and database-entry retrieval against the latest databases, by emitting search requests to and receiving their results from a search server by e-mail. The search results are viewed and managed efficiently with this system. The Macintosh and X (Motif) versions of the Gnome client and the UNIX version of the Gnome server are available to academic users free of charge. PMID:7828072

  16. Input Range Testing for the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second is the test input to be attempted for each field. The third is the allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid input. It is very important to note that the tests below must be performed for both the Graphical User Interface and the script. The examples are illustrated from a scripting perspective because it is simpler to write up; however, the tests must be performed for both interfaces to GMAT.

  17. System-of-Systems Technology-Portfolio-Analysis Tool

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  18. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K

  19. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior.

    PubMed

    Forsström, David; Hesser, Hugo; Carlbring, Per

    2016-09-01

    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history and supply personalized feedback, and might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are different subclasses of users by conducting a latent class analysis. The user behavior of 9528 online gamblers who voluntarily used an RG tool was analyzed. Number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that overall the functions of the tool had a high initial usage and a low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that the classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a higher extent and were found to have a greater risk of excessive gambling than the other classes. PMID:26753878

  20. Lagrangian analysis. Modern tool of the dynamics of solids

    NASA Astrophysics Data System (ADS)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge of the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors emphasize that either group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of spherical motion remains particularly interesting for the physicist because it gives access to the behavior of the material under deformation processes other than those imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress as functions of time in a material compressed by a plane or spherical dilatational wave. The
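
    For plane one-dimensional motion, the relations being integrated are the standard Lagrangian conservation equations; in a common textbook notation (not necessarily the authors'), with h the Lagrangian position, V the specific volume (V_0 initial), u the particle velocity, \sigma the stress and E the specific internal energy:

        \left.\frac{\partial V}{\partial t}\right|_{h} = V_{0}\left.\frac{\partial u}{\partial h}\right|_{t},
        \qquad
        \left.\frac{\partial u}{\partial t}\right|_{h} = -V_{0}\left.\frac{\partial \sigma}{\partial h}\right|_{t},
        \qquad
        \left.\frac{\partial E}{\partial t}\right|_{h} = -\,\sigma\left.\frac{\partial V}{\partial t}\right|_{h}.

    Given gauge records \sigma(h_i, t) at several Lagrangian positions, the spatial derivative is estimated across gauges and the equations are integrated in time to recover the remaining kinematical and mechanical quantities.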

  1. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    PubMed

    Caruana, Matthew V; Carvalho, Susana; Braun, David R; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W K

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  3. The effects of music therapy incorporated with applied behavior analysis verbal behavior approach for children with autism spectrum disorders.

    PubMed

    Lim, Hayoung A; Draper, Ellary

    2011-01-01

    This study compared a common form of the Applied Behavior Analysis Verbal Behavior (ABA VB) approach and music incorporated into the ABA VB method as part of developmental speech-language training in the speech production of children with Autism Spectrum Disorders (ASD). This study explored how the perception of musical patterns incorporated in ABA VB operants impacted the production of speech in children with ASD. Participants were 22 children with ASD, age range 3 to 5 years, who were verbal or preverbal with presence of immediate echolalia. They were randomly assigned a set of target words for each of the 3 training conditions: (a) music-incorporated ABA VB, (b) speech (ABA VB), and (c) no training. Results showed both music and speech training were effective for production of the four ABA verbal operants; however, the difference between music and speech training was not statistically significant. Results also indicated that music-incorporated ABA VB training was most effective in echoic production, and speech training was most effective in tact production. Music can be incorporated into the ABA VB training method, and musical stimuli can be used as successfully as ABA VB speech training to enhance functional verbal production in children with ASD. PMID:22506303

  4. Numeral-Incorporating Roots in Numeral Systems: A Comparative Analysis of Two Sign Languages

    ERIC Educational Resources Information Center

    Fuentes, Mariana; Massone, Maria Ignacia; Fernandez-Viader, Maria del Pilar; Makotrinsky, Alejandro; Pulgarin, Francisca

    2010-01-01

    Numeral-incorporating roots in the numeral systems of Argentine Sign Language (LSA) and Catalan Sign Language (LSC), as well as the main features of the number systems of both languages, are described and compared. Informants discussed the use of numerals and roots in both languages (in most cases in natural contexts). Ten informants took part in…

  5. Generalized Analysis Tools for Multi-Spacecraft Missions

    NASA Astrophysics Data System (ADS)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 1990s to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data", published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On one hand, the least-squares approach has the advantage of applying to any number of spacecraft [1], but is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but appears limited to clusters of four spacecraft. Moreover, the barycentric approach makes it possible to derive theoretical formulas for errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. Weights given to spacecraft make it possible to minimize the influence of a spacecraft if its location or the quality of its data is not appropriate, or simply to extract subsets of spacecraft from the cluster. Estimators presented in [2] are generalized within this new framework, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI
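
    For the reference four-spacecraft case, the reciprocal vectors of the tetrahedron have a standard closed form (barycentric formulas in a common notation; see [2]). With \mathbf{r}_{ab} = \mathbf{r}_{b} - \mathbf{r}_{a} and (i, j, l, m) a cyclic permutation of (1, 2, 3, 4):

        \mathbf{k}_{i} = \frac{\mathbf{r}_{jl}\times\mathbf{r}_{jm}}{\mathbf{r}_{ji}\cdot\left(\mathbf{r}_{jl}\times\mathbf{r}_{jm}\right)},
        \qquad
        \sum_{i=1}^{4}\mathbf{k}_{i} = \mathbf{0},
        \qquad
        \nabla f \simeq \sum_{i=1}^{4} f_{i}\,\mathbf{k}_{i},

    where f_i is a field sampled at spacecraft i; the weighted generalization described above preserves this linear-estimator form while extending it beyond four spacecraft.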

  6. Abstract Interfaces for Data Analysis - Component Architecture for Data Analysis Tools

    SciTech Connect

    Barrand, Guy

    2002-08-20

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualization), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organization, one goal of the group is to systematically design a set of abstract interfaces based on modern OO analysis and design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimizing re-use and maintainability of each component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

  7. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    NASA Astrophysics Data System (ADS)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
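
    As a flavor of what such an analysis computes, the sketch below drives a static nonlinearity (a tanh curve standing in for a tube stage; not the toolkit's code) with a sine and estimates total harmonic distortion from the FFT:

        # Toy distortion measurement: single-tone excitation, FFT, THD.
        import numpy as np

        fs, f0, n = 48000, 1000.0, 48000
        t = np.arange(n) / fs
        x = np.sin(2 * np.pi * f0 * t)
        y = np.tanh(3.0 * x)                      # nonlinear "device under test"

        spec = np.abs(np.fft.rfft(y * np.hanning(n)))
        bins = [int(round(k * f0 * n / fs)) for k in range(1, 6)]
        amps = [spec[b - 2:b + 3].max() for b in bins]   # peak near each harmonic
        thd = np.sqrt(sum(a * a for a in amps[1:])) / amps[0]
        print(f"THD: {100 * thd:.1f} %")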

  8. Micropollutants in urban watersheds : substance flow analysis as management tool

    NASA Astrophysics Data System (ADS)

    Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.

    2009-04-01

    Micropollutants released by cities into water are of increasing concern as they are suspected of inducing long-term effects on both aquatic organisms and humans (e.g., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, pollutants emitted by traffic, such as copper or PAHs, reach surface water during rain events, often without any treatment. Pharmaceuticals resulting from human medical treatments reach surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. Another source of contamination in urban areas for these compounds is combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to have flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as an instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA), originally developed in the economic sector and later adapted to regional investigations. In this study, we test the application of SFA to a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as a case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover, profound system knowledge and many data were available, both on the sewer system and on water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). Results for copper reveal that around 1500 kg of copper enter the aquatic compartment yearly. This amount contributes to sediment
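
    In essence, an SFA of this kind is systematic bookkeeping of substance loads across compartments. A toy balance in the spirit of the copper example (all numbers are invented placeholders, not the Lausanne data):

        # Toy substance-flow balance: source loads (kg/yr) routed either
        # directly to surface water or through the treatment plant.
        sources = {"roof runoff": 900.0, "traffic": 700.0, "wastewater": 400.0}
        sewered = {"roof runoff": 0.3, "traffic": 0.2, "wastewater": 1.0}  # fractions
        wwtp_removal = 0.85                      # fraction retained in sludge

        direct = sum(load * (1 - sewered[s]) for s, load in sources.items())
        treated = sum(load * sewered[s] for s, load in sources.items())
        to_water = direct + treated * (1 - wwtp_removal)
        print(f"Load reaching surface water: {to_water:.0f} kg/yr")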

  9. New tools for the analysis and design of building envelopes

    SciTech Connect

    Papamichael, K.; Winkelmann, F.C.; Buhl, W.F.; Chauvet, H.

    1994-08-01

    We describe the integrated development of PowerDOE, a new version of the DOE-2 building energy analysis program, and the Building Design Advisor (BDA), a multimedia-based design tool that assists building designers with the concurrent consideration of multiple design solutions with respect to multiple design criteria. PowerDOE has a windows-based Graphical User Interface (GUI) that makes it easier to use than DOE-2, while retaining DOE-2's calculation power and accuracy. BDA, with a similar GUI, is designed to link to multiple analytical models and databases. In its first release it is linked to PowerDOE and a Daylighting Analysis Module, as well as to a Case Studies Database and a Schematic Graphic Editor. These allow building designers to set performance goals and address key building envelope parameters from the initial, schematic phases of building design to the detailed specification of building components and systems required by PowerDOE. The consideration of the thermal performance of building envelopes through PowerDOE and BDA is integrated with non-thermal envelope performance aspects, such as daylighting, as well as with the performance of non-envelope building components and systems, such as electric lighting and HVAC. Future versions of BDA will support links to CAD and electronic product catalogs, as well as provide context-dependent design advice to improve performance.

  10. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes in order to reduce the need of resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that are operated either dynamically or steadily in nature. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
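
    The Equivalent System Mass figure of merit that ALSSAT reports folds each subsystem's volume, power, cooling and crew-time demands into a single mass-like number. A sketch of the usual form of the calculation (the equivalency factors below are illustrative placeholders, not ALSSAT's values):

        # Equivalent System Mass (ESM) sketch with placeholder factors.
        def esm(mass_kg, vol_m3, power_kw, cooling_kw, crewtime_hr_yr,
                duration_yr, v_eq=9.16, p_eq=237.0, c_eq=60.0, ct_eq=0.94):
            # v_eq [kg/m^3], p_eq and c_eq [kg/kW], ct_eq [kg/(crew-hr)]
            return (mass_kg + vol_m3 * v_eq + power_kw * p_eq
                    + cooling_kw * c_eq + crewtime_hr_yr * duration_yr * ct_eq)

        print(f"{esm(1200.0, 3.5, 2.0, 2.0, 150.0, 3.0):.0f} kg ESM")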

  11. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  12. Wavelet analysis as a nonstationary plasma fluctuation diagnostic tool

    SciTech Connect

    Santoso, S.; Powers, E.J.; Ouroua, A.; Heard, J.W.; Bengtson, R.D.

    1996-12-31

    Analysis of nonstationary plasma fluctuation data has been a long-time challenge for the plasma diagnostic community. For this reason, in this paper the authors present and apply wavelet transforms as a new diagnostic tool to analyze nonstationary plasma fluctuation data. Unlike the Fourier transform, which represents a given signal globally without temporal resolution, the wavelet transform provides a local representation of the given signal in the time-scale domain. The fundamental concepts and multiresolution properties of wavelet transforms, along with a brief comparison with the short-time Fourier transform, are presented in this paper. The selection of a prototype, or mother, wavelet is also discussed. Digital implementation of wavelet spectral analysis, including time-scale power spectra and scale power spectra, is described. The efficacy of the wavelet approach is demonstrated by analyzing transient broadband electrostatic potential fluctuations inside the inversion radius of sawtoothing TEXT-U plasmas during electron cyclotron resonance heating. The potential signals are collected using a 2 MeV heavy ion beam probe.
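
    The time-scale decomposition described here is straightforward to reproduce with modern wavelet libraries. A brief sketch (synthetic test signal and a Morlet mother wavelet; not the authors' implementation):

        # Time-scale power spectrum of a nonstationary signal via the CWT.
        import numpy as np
        import pywt

        fs = 1000.0
        t = np.arange(0, 2.0, 1 / fs)
        sig = (np.sin(2 * np.pi * 50 * t) * (t < 1.0)
               + np.sin(2 * np.pi * 120 * t) * (t >= 1.0))

        scales = np.arange(2, 128)
        coef, freqs = pywt.cwt(sig, scales, "morl", sampling_period=1 / fs)
        ts_power = np.abs(coef) ** 2            # time-scale power spectrum
        scale_power = ts_power.mean(axis=1)     # time-averaged scale power
        print(freqs[scale_power.argmax()])      # dominant frequency overall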

  13. jSIPRO - analysis tool for magnetic resonance spectroscopic imaging.

    PubMed

    Jiru, Filip; Skoch, Antonin; Wagnerova, Dita; Dezortova, Monika; Hajek, Milan

    2013-10-01

    Magnetic resonance spectroscopic imaging (MRSI) involves a huge number of spectra to be processed and analyzed. Several tools enabling MRSI data processing have been developed and widely used. However, the processing programs primarily focus on sophisticated spectrum processing and offer limited support for the analysis of the calculated spectroscopic maps. In this paper the jSIPRO (java Spectroscopic Imaging PROcessing) program is presented, which is a Java-based graphical interface enabling post-processing, viewing, analysis and result reporting of MRSI data. Interactive graphical processing as well as protocol-controlled batch processing are available in jSIPRO. jSIPRO does not contain a built-in fitting program. Instead, it makes use of fitting programs from third parties and manages the data flows. Currently, automatic spectrum processing using the LCModel, TARQUIN and jMRUI programs is supported. Concentration and error values, fitted spectra, metabolite images and various parametric maps can be viewed for each calculated dataset. Metabolite images can be exported in the DICOM format either for archiving purposes or for use in neurosurgery navigation systems. PMID:23870172

  14. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    NASA Astrophysics Data System (ADS)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, picked or theoretical arrival times and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution pdf files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.

  15. Study of academic achievements using spatial analysis tools

    NASA Astrophysics Data System (ADS)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study was carried out with these students about their academic achievement with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and the sitting (of the two held per year) at which the latter mark was obtained. Similarly, another group of 77 students were evaluated independently of the former group. These students had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal addresses. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated to their respective record. Following this procedure a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables, such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or

  16. Immunoglobulin analysis tool: a novel tool for the analysis of human and mouse heavy and light chain transcripts.

    PubMed

    Rogosch, Tobias; Kerzel, Sebastian; Hoi, Kam Hon; Zhang, Zhixin; Maier, Rolf F; Ippolito, Gregory C; Zemlin, Michael

    2012-01-01

    Sequence analysis of immunoglobulin (Ig) heavy and light chain transcripts can refine categorization of B cell subpopulations and can shed light on the selective forces that act during immune responses or immune dysregulation, such as autoimmunity, allergy, and B cell malignancy. High-throughput sequencing yields Ig transcript collections of unprecedented size. The authoritative web-based IMGT/HighV-QUEST program is capable of analyzing large collections of transcripts and provides annotated output files to describe many key properties of Ig transcripts. However, additional processing of these flat files is required to create figures, or to facilitate analysis of additional features and comparisons between sequence sets. We present an easy-to-use Microsoft® Excel®-based software, named Immunoglobulin Analysis Tool (IgAT), for the summary, interrogation, and further processing of IMGT/HighV-QUEST output files. IgAT generates descriptive statistics and high-quality figures for collections of murine or human Ig heavy or light chain transcripts ranging from 1 to 150,000 sequences. In addition to traditionally studied properties of Ig transcripts - such as the usage of germline gene segments, or the length and composition of the CDR-3 region - IgAT also uses published algorithms to calculate the probability of antigen selection based on somatic mutational patterns, the average hydrophobicity of the antigen-binding sites, and predictable structural properties of the CDR-H3 loop according to Shirai's H3-rules. These refined analyses provide in-depth information about the selective forces acting upon Ig repertoires and allow the statistical and graphical comparison of two or more sequence sets. IgAT is easy to use on any computer running Excel® 2003 or higher. Thus, IgAT is a useful tool to gain insights into the selective forces and functional properties of small to extremely large collections of Ig transcripts, thereby assisting a researcher to mine a data set

  17. Analysis of the influence of tool dynamics in diamond turning

    SciTech Connect

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  18. Exploring Scheduling Algorithms and Analysis Tools for the LSST Operations Simulations

    NASA Astrophysics Data System (ADS)

    Petry, Catherine E.; Miller, M.; Cook, K. H.; Ridgway, S.; Chandrasekharan, S.; Jones, R. L.; Krughoff, K. S.; Ivezic, Z.; Krabbendam, V.

    2012-01-01

    The LSST Operations Simulator models the telescope's design-specific opto-mechanical system performance and site-specific conditions to simulate how observations may be obtained during a 10-year survey. We have found that a remarkable range of science programs is compatible with a single feasible cadence. The current version, OpSim v2.5, incorporates detailed models of the telescope and dome, the camera, weather and a more realistic model for scheduled and unscheduled downtime, as well as a scheduling strategy based on ranking requests for observations from a small number of observing modes attempting to optimize the key science objectives. Each observing mode is driven by a specific algorithm which ranks field-filter combinations of target fields to observe next. The output of the simulator is a detailed record of the activity of the telescope - such as position on the sky, slew activities, weather and various types of downtime - stored in a MySQL database. Sophisticated tools are required to mine this database in order to assess the degree of success of any simulated survey in some detail. An analysis pipeline has been created (SSTAR) which generates a standard report describing the basic characteristics of a simulated survey; a new analysis framework is being designed to allow for the inter-comparison of one or more simulated surveys and to perform more complex analyses in a pipeline fashion. Proprietary software is being used to interactively explore the database and to prototype reports for the new analysis pipeline, and we are working with the ASCOT team (http://ascot.astro.washington.edu) to determine the feasibility of creating our own interactive tools. The next phase of simulator development is being planned to include look-ahead to continue investigating the trade-offs of addressing multiple science goals within a single LSST survey.

  19. HPLC analysis as a tool for assessing targeted liposome composition.

    PubMed

    Oswald, Mira; Platscher, Michael; Geissler, Simon; Goepferich, Achim

    2016-01-30

    Functionalized phospholipids are indispensable materials for the design of targeted liposomes. Control over the quality and quantity of phospholipids is thereby key to the successful development and manufacture of such formulations. This was also the case for a complex liposomal preparation composed of 1,2-dioleoyl-sn-glycero-3-phosphocholine (DOPC), cholesterol (CHO), and 1,2-distearoyl-sn-glycero-3-phosphoethanolamine-N-[amino(polyethylene glycol)-2000] (DSPE-PEG2000). To this end, an RP-HPLC method was developed. Detection of the liposomal components was done via evaporative light scattering (ELS). The method was validated for linearity, precision, accuracy, sensitivity and robustness. The liposomal compounds had a non-linear quadratic response in the concentration range of 0.012-0.42 mg/ml with a correlation coefficient greater than 0.99; the accuracy of the method was confirmed to be 95-105% of the theoretical concentration. Furthermore, degradation products from the liposomal formulation could be identified. The presented method was successfully implemented as a control tool during the preparation of functionalized liposomes. It underlined the benefit of HPLC analysis of phospholipids during liposome preparation as an easy and rapid control method for the functionalized lipid at each preparation step as well as for the quantification of all components. PMID:26570988
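
    The quadratic response reported for ELS detection means the calibration must be fitted and inverted nonlinearly rather than as a straight line. A sketch of that step (synthetic peak areas; only the concentration range mirrors the text):

        # Quadratic ELSD calibration: fit, invert, back-calculate recovery.
        import numpy as np

        conc = np.linspace(0.012, 0.42, 8)       # standards, mg/ml
        area = (5.0 * conc**2 + 40.0 * conc
                + np.random.default_rng(1).normal(0, 0.05, 8))

        c2, c1, c0 = np.polyfit(conc, area, 2)   # quadratic calibration curve

        def back_calc(a):
            # Invert the quadratic for concentration; keep the positive root.
            r = np.roots([c2, c1, c0 - a])
            return r[(r.real > 0) & (abs(r.imag) < 1e-9)].real[0]

        recovery = [100 * back_calc(a) / c for a, c in zip(area, conc)]
        print([f"{v:.1f}%" for v in recovery])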

  20. SEM analysis as a diagnostic tool for photovoltaic cell degradation

    NASA Astrophysics Data System (ADS)

    Osayemwenre, Gilbert; Meyer, E. L.

    2013-04-01

    The importance of scanning electron microscopy (SEM) analysis as a diagnostic tool for analyzing the degradation of a polycrystalline photovoltaic cell has been studied. The main aim of this study is to characterize the surface morphology of hot-spot (degraded) regions in photovoltaic solar cells. In recent years, production of hetero- and multi-junction solar cells has experienced tremendous growth as compared to conventional silicon (Si) solar cells. Thin-film photovoltaic solar cells generally are more prone to exhibiting defects and associated degradation modes. To improve the lifetime of these cells and modules, it is imperative to fully understand the cause and effect of defects and degradation modes. The objective of this paper is to diagnose the observed degradation in polycrystalline silicon cells using scanning electron microscopy (SEM). In this study, poly-Si cells were characterized before and after reverse biasing, which was done to evaluate the cells' susceptibility to leakage currents and hotspot formation. After reverse biasing, some cells were found to exhibit hotspots, as confirmed by infrared thermography. The surface morphology of these hotspots re

  1. Bioelectrical impedance analysis: A new tool for assessing fish condition

    USGS Publications Warehouse

    Hartman, Kyle J.; Margraf, F. Joseph; Hafs, Andrew W.; Cox, M. Keith

    2015-01-01

    Bioelectrical impedance analysis (BIA) is commonly used in human health and nutrition fields but has only recently been considered as a potential tool for assessing fish condition. Once BIA is calibrated, it estimates fat/moisture levels and energy content without the need to kill fish. Despite the promise held by BIA, published studies have been divided on whether BIA can provide accurate estimates of body composition in fish. In cases where BIA was not successful, the models lacked the range of fat levels or sample sizes we determined were needed for model success (range of dry fat levels of 29%, n = 60, yielding an R2 of 0.8). Reduced range of fat levels requires an increased sample size to achieve that benchmark; therefore, standardization of methods is needed. Here we discuss standardized methods based on a decade of research, identify sources of error, discuss where BIA is headed, and suggest areas for future research.
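
    The calibration step at the heart of BIA amounts to regressing proximate composition on an impedance-derived predictor and checking the fit against the benchmark above. A sketch with synthetic data at the recommended sample size (all values illustrative):

        # BIA calibration sketch: % dry fat vs. an impedance predictor, n = 60.
        import numpy as np

        rng = np.random.default_rng(7)
        resistance = rng.uniform(200, 800, 60)        # ohms (illustrative)
        dry_fat = 35 - 0.03 * resistance + rng.normal(0, 2, 60)  # % dry fat

        slope, intercept = np.polyfit(resistance, dry_fat, 1)
        pred = slope * resistance + intercept
        r2 = 1 - ((dry_fat - pred)**2).sum() / ((dry_fat - dry_fat.mean())**2).sum()
        print(f"R^2 = {r2:.2f}")    # the benchmark above calls for ~0.8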

  2. ADVISOR: a systems analysis tool for advanced vehicle modeling

    NASA Astrophysics Data System (ADS)

    Markel, T.; Brooker, A.; Hendricks, T.; Johnson, V.; Kelly, K.; Kramer, B.; O'Keefe, M.; Sprik, S.; Wipke, K.

    This paper provides an overview of the Advanced Vehicle Simulator (ADVISOR), the US Department of Energy's (DOE's) systems analysis tool for advanced vehicle modeling, written in the MATLAB/Simulink environment and developed by the National Renewable Energy Laboratory. ADVISOR provides the vehicle engineering community with an easy-to-use, flexible, yet robust and supported analysis package for advanced vehicle modeling. It is primarily used to quantify the fuel economy, the performance, and the emissions of vehicles that use alternative technologies including fuel cells, batteries, electric motors, and internal combustion engines in hybrid (i.e. multiple power sources) configurations. It excels at quantifying the relative change that can be expected due to the implementation of technology compared to a baseline scenario. ADVISOR's capabilities and limitations are presented and the power source models that are included in ADVISOR are discussed. Finally, several applications of the tool are presented to highlight ADVISOR's functionality. The content of this paper is based on a presentation made at the 'Development of Advanced Battery Engineering Models' workshop held in Crystal City, Virginia in August 2001.

  3. A measuring tool for tree-rings analysis

    NASA Astrophysics Data System (ADS)

    Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena

    2013-04-01

    A special tool has been created for the measurement and analysis of annual tree-ring widths. It consists of a professional scanner, a computer system and software. In many respects this complex is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison to manual measurement systems it offers a number of advantages: productivity gains, the possibility of archiving the results of the measurements at any stage of processing, and operator comfort. New software has been developed that allows processing of samples of different types (cores, saw cuts), including samples that are difficult to process because of complex wood structure (inhomogeneous growth in different directions; missing, light and false rings, etc.). This software can analyze pictures made with optical scanners or analog or digital cameras. The software was written in the C++ programming language and is compatible with modern operating systems such as Windows XP. Annual ring widths are measured along paths traced interactively. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring width as a function of year is displayed on screen during the analysis, and it can be used for visual and numerical cross-dating and comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The created complex is universal in application, which will allow its use for solving various problems in biology and ecology. With the help of this complex, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed from samples collected on the Kola Peninsula (northwestern Russia).
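
    The core measurement reduces to sampling image brightness along a traced path, detecting ring boundaries, and converting boundary spacing to widths. A sketch of that idea (a synthetic intensity profile stands in for pixels sampled from a scanned core; this is not the complex's own code):

        # Ring widths from an intensity profile along a measurement path.
        import numpy as np
        from scipy.signal import find_peaks

        px_per_mm = 40.0
        x = np.arange(3000)
        profile = (np.sin(2 * np.pi * x / 55.0)
                   + 0.2 * np.random.default_rng(3).normal(size=x.size))

        boundaries, _ = find_peaks(profile, distance=30, prominence=0.5)
        widths_mm = np.diff(boundaries) / px_per_mm
        print(f"{widths_mm.size} rings, mean width {widths_mm.mean():.2f} mm")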

  4. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    NASA Astrophysics Data System (ADS)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of a Marie Curie Research Project, a Community-Based Early Warning System (CBEWS) has been developed in the Mountain Community of Valtellina di Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods; in particular, a large event in 1987 affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is composed of a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for previously defined risk scenarios. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  5. Mission operations data analysis tools for Mars Observer guidance and control

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.

    1994-01-01

    Mission operations for the Mars Observer (MO) Project at the Jet Propulsion Laboratory were supported by a variety of ground data processing software and analysis tools. Some of these tools were generic to multimission spacecraft mission operations, some were specific to the MO spacecraft, and others were custom tailored to the operation and control of the Attitude and Articulation Control Subsystem (AACS). The focus of this paper is on the data analysis tools for the AACS. Four different categories of analysis tools are presented, with details offered for specific tools. Valuable experience was gained from the use of these tools and through their development. These tools formed the backbone and enhanced the efficiency of the AACS Unit in the Mission Operations Spacecraft Team. These same tools, and extensions thereof, have been adopted by the Galileo mission operations, and are being designed into Cassini and other future spacecraft mission operations.

  6. adwTools Developed: New Bulk Alloy and Surface Analysis Software for the Alloy Design Workbench

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo; Morse, Jeffrey A.; Noebe, Ronald D.; Abel, Phillip B.

    2004-01-01

    A suite of atomistic modeling software, called the Alloy Design Workbench, has been developed by the Computational Materials Group at the NASA Glenn Research Center and the Ohio Aerospace Institute (OAI). The main goal of this software is to guide and augment experimental materials research and development efforts by creating powerful, yet intuitive, software that combines a graphical user interface with an operating code suitable for real-time atomistic simulations of multicomponent alloy systems. Targeted for experimentalists, the interface is straightforward and requires minimum knowledge of the underlying theory, allowing researchers to focus on the scientific aspects of the work. The centerpiece of the Alloy Design Workbench suite is the adwTools module, which concentrates on the atomistic analysis of surfaces and bulk alloys containing an arbitrary number of elements. An additional module, adwParams, handles ab initio input for the parameterization used in adwTools. Future modules planned for the suite include adwSeg, which will provide numerical predictions for segregation profiles to alloy surfaces and interfaces, and adwReport, which will serve as a window into the database, providing public access to the parameterization data and a repository where users can submit their own findings from the rest of the suite. The entire suite is designed to run on desktop-scale computers. The adwTools module incorporates a custom OAI/Glenn-developed Fortran code based on the BFS (Bozzolo-Ferrante-Smith) method for alloys (ref. 1). The heart of the suite, this code is used to calculate the energetics of different compositions and configurations of atoms.

  7. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    NASA Astrophysics Data System (ADS)

    Adams, David; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Farrell, Steven; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-12-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  8. An Analysis of Teacher Selection Tools in Pennsylvania

    ERIC Educational Resources Information Center

    Vitale, Tracy L.

    2009-01-01

    The purpose of this study was to examine teacher screening and selection tools currently being utilized by public school districts in Pennsylvania and to compare these tools to the research on qualities of effective teachers. The researcher developed four research questions that guided her study. The Pennsylvania Association of School Personnel…

  9. Finite element analysis of structural engineering problems using a viscoplastic model incorporating two back stresses

    NASA Technical Reports Server (NTRS)

    Arya, Vinod K.; Halford, Gary R.

    1993-01-01

    The feasibility of a viscoplastic model incorporating two back stresses and a drag strength is investigated for performing nonlinear finite element analyses of structural engineering problems. To demonstrate suitability for nonlinear structural analyses, the model is implemented into a finite element program and analyses for several uniaxial and multiaxial problems are performed. Good agreement is shown between the results obtained using the finite element implementation and those obtained experimentally. The advantages of using advanced viscoplastic models for performing nonlinear finite element analyses of structural components are indicated.
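
    Unified viscoplastic models of this family are commonly written, in uniaxial form, as a flow law driven by the overstress relative to the summed back stresses and scaled by the drag strength. A generic sketch of such equations (illustrative of the class, not necessarily the report's exact model), with back stresses a_1, a_2 and drag strength D:

        \dot{\varepsilon}^{\,in} = A\left\langle \frac{\lvert\,\sigma - a_{1} - a_{2}\,\rvert}{D} \right\rangle^{n} \operatorname{sgn}\left(\sigma - a_{1} - a_{2}\right),
        \qquad
        \dot{a}_{i} = h_{i}\,\dot{\varepsilon}^{\,in} - r_{i}\,a_{i}\,\lvert \dot{\varepsilon}^{\,in} \rvert, \quad i = 1, 2,

    where one hardening/recovery pair (h_i, r_i) evolves quickly and the other slowly; this separation of time scales is what lets two back stresses reproduce both short- and long-range transients.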

  10. Incorporation of strain hardening in piezoresistance analysis: Application to ytterbium foils in a PMMA matrix

    SciTech Connect

    Gupta, Y.M.; Gupta, S.C.

    1987-01-15

    The results of an earlier paper are extended to include strain hardening in the analytic model for piezoresistance. This development is necessary to model the gauge hysteresis observed in the experimental data. Analytic developments to incorporate strain hardening in the theoretical formalism and procedures to calculate the gauge response are described. The analytic model is used to calculate the response of ytterbium (Yb) foils in a polymethylmethacrylate matrix in both shock wave and quasi-static experiments. Good agreement is observed between the model predictions and experimental results. Difficulties in calculating the lateral gauge response when the matrix material is inelastic are discussed.

  11. Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis

    NASA Astrophysics Data System (ADS)

    Klapper, Daniel; Kueppers, Ulrich; Castro, Jon M.; Pacheco, Jose M. R.; Dingwell, Donald B.

    2010-05-01

    grain-size distribution, componentry, and grain morphology in situ in a 2D-plane. In a first step, the sample surface was scanned and analysed by means of image analysis software (ImageJ). After that, selected areas were investigated through thin section analysis. We were able to define depositional units on the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

  13. Incorporating color into integrative taxonomy: analysis of the varied tit (Sittiparus varius) complex in East Asia.

    PubMed

    McKay, Bailey D; Mays, Herman L; Yao, Cheng-Te; Wan, Dongmei; Higuchi, Hiroyoshi; Nishiumi, Isao

    2014-07-01

    Species designations are critically important scientific hypotheses that serve as the foundational units in a wide range of biological subdisciplines. A growing realization that some classes of data fail to delimit species under certain conditions has led to increasingly more integrative taxonomies, whereby species discovery and hypothesis testing are based on multiple kinds of data (e.g., morphological, molecular, behavioral, ecological, etc.). However, although most taxonomic descriptions have been based on morphology, some key morphological features, such as color, are rarely quantified and incorporated into integrative taxonomic studies. In this article, we applied a new method of ultraviolet digital photography to measure plumage variation in a color-variable avian species complex, the varied tit (Sittiparus varius). Plumage measurements corroborated species limits defined by morphometric, mitochondrial DNA, and nuclear DNA disjunctions and provided the only evidence for distinguishing two recently evolved species. Importantly, color quantification also provided a justification for lumping putative taxa with no evidence of evolutionary independence. Our revised taxonomy thus refines conservation units for listing and management and clarifies the primary units for evolutionary studies. Species tree analyses, which applied the newly delimited species as operational taxonomic units, revealed a robust phylogenetic hypothesis for the group that establishes a foundation for future biogeographic analyses. This study demonstrates how digital photography can be used to incorporate color character variation into integrative taxonomies, which should lead to more informed, more rigorous, and more accurate assessments of biodiversity. [Color, digital photography, integrative taxonomy, Sittiparus varius, species delimitation, varied tit.]. PMID:24603127

  14. Developing shape analysis tools to assist complex spatial decision making

    SciTech Connect

    Mackey, H.E.; Ehler, G.B.; Cowen, D.

    1996-05-31

    The objective of this research was to develop and implement a shape identification measure within a geographic information system, specifically one that incorporates analytical modeling for site location planning. The application that was developed incorporated a location model within a raster-based GIS, which helped address critical performance issues for the decision support system. Binary matrices, which approximate the object's geometrical form, are passed over the gridded data structure and allow identification of irregularly and regularly shaped objects. Lastly, the issue of shape rotation is addressed and resolved by constructing unique matrices corresponding to the object's orientation.
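
    The core matching step can be pictured as sliding each binary matrix over the raster and flagging positions where every cell of the template lands on suitable cells. A compact sketch of that idea (synthetic grid; not the original implementation):

        # Binary template matching over a raster, with one rotated variant.
        import numpy as np
        from scipy.signal import correlate2d

        grid = np.zeros((60, 60), dtype=int)
        grid[10:14, 20:30] = 1                      # block of suitable cells

        template = np.ones((4, 10), dtype=int)      # binary shape matrix
        for rot in (template, template.T):          # orientation-specific matrices
            score = correlate2d(grid, rot, mode="same")
            hits = np.argwhere(score == rot.sum())  # template fully contained
            if hits.size:
                print(rot.shape, hits[0])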

  15. The NOAA Local Climate Analysis Tool - An Application in Support of a Weather Ready Nation

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Horsfall, F. M.

    2012-12-01

    Citizens across the U.S., including decision makers from the local to the national level, have a multitude of questions about climate, such as the current state and how that state fits into the historical context, and more importantly, how climate will impact them, especially with regard to linkages to extreme weather events. Developing answers to these types of questions for locations has typically required extensive work to gather data, conduct analyses, and generate relevant explanations and graphics. Too frequently, providers lack ready access to, or knowledge of, reliable, trusted data sets and sound, scientifically accepted analysis techniques that would allow a rapid response to the queries they receive. In order to support National Weather Service (NWS) local office forecasters with the information they need to deliver timely responses to climate-related questions from their customers, we have developed the Local Climate Analysis Tool (LCAT). LCAT uses the principles of artificial intelligence to respond to queries, in particular through use of machine technology that responds intelligently to input from users. A user translates customer questions into primary variables and issues, and LCAT pulls the most relevant data and analysis techniques to provide information back to the user, who in turn responds to their customer. Most responses take on the order of 10 seconds, which includes providing statistics, graphical displays of information, translations for users, metadata, and a summary of the user request to LCAT. Applications in Phase I of LCAT, which is targeted for the NWS field offices, include Climate Change Impacts, Climate Variability Impacts, Drought Analysis and Impacts, Water Resources Applications, Attribution of Extreme Events, and analysis techniques such as time series analysis, trend analysis, compositing, and correlation and regression techniques. Data accessed by LCAT are homogenized historical COOP and Climate Prediction Center

  16. Static Analysis Tools, a Practical Approach for Safety-Critical Software Verification

    NASA Astrophysics Data System (ADS)

    Lopes, R.; Vicente, D.; Silva, N.

    2009-05-01

    Static code analysis tools available today range from Lint-based syntax parsers to standards' compliance checkers to tools using more formal methods for verification. As safety-critical software complexity is increasing, these tools provide a means to ensure code quality, safety and dependability attributes. They also provide a means to introduce further automation in code analysis activities. The features presented by static code analysis tools are particularly interesting for V&V activities. In the scope of Independent Code Verification (IVE), two different static analysis tools have been used during Code Verification activities of the LISA Pathfinder onboard software in order to assess their contribution to the efficiency of the process and the quality of the results. The Polyspace (The MathWorks) and FlexeLint (Gimpel) tools have been used as examples of high-budget and low-budget tools respectively. Several aspects have been addressed: effort has been categorised for closer analysis (e.g. setup and configuration time, execution time, analysis of the results, etc.), reported issues have been categorised according to their type, and the coverage of traditional IVE tasks by the static code analysis tools has been evaluated. Final observations have been performed by analysing the previously referred subjects, namely regarding cost effectiveness, quality of results, complementarities between the results of different static code analysis tools, and the relation between automated code analysis and manual code inspection.

  17. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mismatch (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process that leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical toolbox allows one to conduct assessments of all habitat elements within an area of interest

  18. Bioimpedance Harmonic Analysis as a Diagnostic Tool to Assess Regional Circulation and Neural Activity

    NASA Astrophysics Data System (ADS)

    Mudraya, I. S.; Revenko, S. V.; Khodyreva, L. A.; Markosyan, T. G.; Dudareva, A. A.; Ibragimov, A. R.; Romich, V. V.; Kirpatovsky, V. I.

    2013-04-01

    The novel technique based on harmonic analysis of bioimpedance microvariations, with an original hardware and software complex incorporating a high-resolution impedance converter, was used to assess the neural activity and circulation in the human urinary bladder and penis in patients with pelvic pain, erectile dysfunction, and overactive bladder. The therapeutic effects of shock wave therapy and Botulinum toxin detrusor injections were evaluated quantitatively according to the spectral peaks at the low 0.1 Hz frequency (M for Mayer wave) and at the respiratory (R) and cardiac (C) rhythms with their harmonics. Enhanced baseline regional neural activity identified according to the M and R peaks was found to be presumably sympathetic in pelvic pain patients, and parasympathetic in patients with overactive bladder. Total pulsatile activity and pulsatile resonances found in the bladder as well as in the penile spectrum characterised regional circulation and vascular tone. The abnormal spectral parameters characteristic of the patients with genitourinary diseases shifted toward the norm in cases of efficient therapy. Bioimpedance harmonic analysis thus appears to be a potent tool for assessing regional peculiarities of circulatory and autonomic nervous activity in the course of patient treatment.
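
    As a rough illustration of the spectral bookkeeping described above, the sketch below computes band power around the Mayer (M), respiratory (R) and cardiac (C) rhythms from a synthetic impedance trace. The band edges, sampling rate and signal are assumptions for illustration, not values from the study.

```python
# Hypothetical sketch of M/R/C band-power extraction from a bioimpedance
# record; band limits and sampling rate are assumed, not from the paper.
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Integrated FFT power of `signal` between f_lo and f_hi (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return psd[(freqs >= f_lo) & (freqs <= f_hi)].sum()

fs = 100.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 300, 1.0 / fs)              # five-minute record
z = (0.05 * np.sin(2 * np.pi * 0.10 * t)     # Mayer wave (M), ~0.1 Hz
     + 0.10 * np.sin(2 * np.pi * 0.25 * t)   # respiratory rhythm (R)
     + 0.20 * np.sin(2 * np.pi * 1.20 * t))  # cardiac rhythm (C)

for label, (lo, hi) in {"M": (0.05, 0.15), "R": (0.15, 0.50),
                        "C": (0.80, 2.00)}.items():
    print(label, band_power(z, fs, lo, hi))
```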

  19. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    USGS Publications Warehouse

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
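
    A plausible form of the objective that PEST minimizes in this setting, with measured and simulated flows Q and drawdowns s, observation weights w and v, and a Tikhonov regularization term that limits hydraulic-conductivity variance within a lithology (the actual weighting and control settings are not given in the abstract), is:

$$
\Phi(\mathbf{K}) \;=\; \sum_i w_i\left(Q_i^{\mathrm{obs}}-Q_i^{\mathrm{sim}}(\mathbf{K})\right)^2
\;+\; \sum_j v_j\left(s_j^{\mathrm{obs}}-s_j^{\mathrm{sim}}(\mathbf{K})\right)^2
\;+\; \mu \sum_{\ell}\left(\log K_{\ell+1}-\log K_{\ell}\right)^2 .
$$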

  20. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    SciTech Connect

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  1. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft]

    NASA Technical Reports Server (NTRS)

    Dove, Edwin; Hughes, Steve

    2007-01-01

    The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user-definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  2. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    ERIC Educational Resources Information Center

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  3. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  4. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  5. Incorporation of individual-patient data in network meta-analysis for multiple continuous endpoints, with application to diabetes treatment.

    PubMed

    Hong, Hwanhee; Fu, Haoda; Price, Karen L; Carlin, Bradley P

    2015-09-10

    Availability of individual patient-level data (IPD) broadens the scope of network meta-analysis (NMA) and enables us to incorporate patient-level information. Although IPD is a potential gold mine in biomedical areas, methodological development has been slow owing to limited access to such data. In this paper, we propose a Bayesian IPD NMA modeling framework for multiple continuous outcomes under both contrast-based and arm-based parameterizations. We incorporate individual covariate-by-treatment interactions to facilitate personalized decision making. Furthermore, we can find subpopulations performing well with a certain drug in terms of predictive outcomes. We also impute missing individual covariates via an MCMC algorithm. We illustrate this approach using diabetes data that include continuous bivariate efficacy outcomes and three baseline covariates and show its practical implications. Finally, we close with a discussion of our results, a review of computational challenges, and a brief description of areas for future research. PMID:25924975
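
    A minimal sketch of one plausible contrast-based formulation (not necessarily the authors' exact parameterization): for patient $i$ in trial $k$ on treatment $t$, with covariate vector $x_{ik}$ and trial baseline treatment $b_k$,

$$
y_{ik} \sim \mathcal{N}\!\left(\alpha_k + \delta_{k,t} + x_{ik}^{\top}(\beta + \gamma_t),\; \sigma^2\right),
\qquad
\delta_{k,t} \sim \mathcal{N}\!\left(d_t - d_{b_k},\; \tau^2\right), \quad d_1 = 0,
$$

    where the interaction terms $\gamma_t$ are what allow covariate-specific (personalized) treatment comparisons and subgroup predictions.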

  6. Online Analysis of Wind and Solar Part II: Transmission Tool

    SciTech Connect

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

  7. Online Analysis of Wind and Solar Part I: Ramping Tool

    SciTech Connect

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

  8. Computational Tools for Parsimony Phylogenetic Analysis of Omics Data.

    PubMed

    Salazar, Jose; Amri, Hakima; Noursi, David; Abu-Asab, Mones

    2015-08-01

    High-throughput assays from genomics, proteomics, metabolomics, and next generation sequencing produce massive omics datasets that are challenging to analyze in biological or clinical contexts. Thus far, there is no publicly available program for converting quantitative omics data into input formats to be used in off-the-shelf robust phylogenetic programs. To the best of our knowledge, this is the first report on the creation of two Windows-based programs, OmicsTract and SynpExtractor, to address this gap. We note, as a way of introduction and development of these programs, that one particularly useful bioinformatics inferential model is the phylogenetic cladogram. Cladograms are multidimensional tools that show the relatedness between subgroups of healthy and diseased individuals and the latter's shared aberrations; they also reveal some characteristics of a disease that would not otherwise be apparent by other analytical methods. OmicsTract and SynpExtractor were written for the respective tasks of (1) accommodating advanced phylogenetic parsimony analysis (through the standard programs MIX [from PHYLIP] and TNT), and (2) extracting shared aberrations at the cladogram nodes. OmicsTract converts comma-delimited data tables by assigning each data point a binary value ("0" for normal states and "1" for abnormal states), then outputs the converted data tables into the proper input file formats for MIX or with embedded commands for TNT. SynpExtractor uses outfiles from MIX and TNT to extract the shared aberrations at each node of the cladogram, matching them with identifying labels from the dataset and exporting them into a comma-delimited file. Labels may be gene identifiers in gene-expression datasets or m/z values in mass spectrometry datasets. By automating these steps, OmicsTract and SynpExtractor offer a veritable opportunity for rapid and standardized phylogenetic analyses of omics data; their model can also be extended to next generation sequencing
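
    The binarization step lends itself to a compact sketch. The snippet below is an illustrative re-implementation, not OmicsTract itself: it thresholds a quantitative table into "0" (normal) and "1" (abnormal) states and writes a PHYLIP-style discrete-character input for MIX. The normal range is a hypothetical z-score window; the tool's actual rule may differ.

```python
# Illustrative binarization of an omics table into PHYLIP input for MIX.
# The [lo, hi] "normal" window is an assumed parameter, not from the paper.
import csv

def binarize(value, lo, hi):
    """'0' for values inside the normal range [lo, hi], '1' otherwise."""
    return "0" if lo <= value <= hi else "1"

def to_phylip(table_csv, out_file, lo=-1.5, hi=1.5):
    # table_csv: rows = samples; first column = sample name,
    # remaining columns = quantitative omics values (e.g. z-scores).
    with open(table_csv) as fh:
        rows = [(r[0], [float(v) for v in r[1:]]) for r in csv.reader(fh)]
    with open(out_file, "w") as out:
        out.write(f"{len(rows)} {len(rows[0][1])}\n")
        for name, values in rows:
            chars = "".join(binarize(v, lo, hi) for v in values)
            out.write(f"{name[:10]:<10}{chars}\n")  # 10-char PHYLIP names
```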

  9. Monitoring Activity of Taking Medicine by Incorporating RFID and Video Analysis

    PubMed Central

    Hasanuzzaman, Faiz M.; Yang, Xiaodong; Tian, YingLi; Liu, Qingshan; Capezuti, Elizabeth

    2013-01-01

    In this paper, we present a new framework to monitor medication intake for elderly individuals by incorporating a video camera and Radio Frequency Identification (RFID) sensors. The proposed framework can provide a key function for monitoring activities of daily living (ADLs) of elderly people at their own home. In an assistive environment, RFID tags are applied to medicine bottles located in a medicine cabinet so that each medicine bottle has a unique ID. The description of the medicine data for each tag is manually entered into a database. RFID readers detect if any of these bottles are taken away from the medicine cabinet and identify the tag attached to the medicine bottle. A video camera is installed to continuously monitor the activity of taking medicine by integrating face detection and tracking, mouth detection, background subtraction, and activity detection. The preliminary results demonstrate 100% detection accuracy for identifying medicine bottles and promising results for monitoring the activity of taking medicine. PMID:23914344
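
    A hedged sketch of the vision half of such a pipeline, using OpenCV's stock face cascade and MOG2 background subtractor; the event rule and thresholds are invented for illustration, and the RFID half is omitted.

```python
# Sketch only: face detection plus background subtraction as building blocks
# for an intake-event detector. Thresholds and event logic are assumptions.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
backsub = cv2.createBackgroundSubtractorMOG2()

cap = cv2.VideoCapture(0)            # camera watching the medicine cabinet
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    motion = backsub.apply(frame)
    # A "taking medicine" event could be flagged when an RFID bottle-removal
    # coincides with a detected face and sustained motion in the scene.
    if len(faces) > 0 and cv2.countNonZero(motion) > 5000:
        print("candidate medication-intake event")
cap.release()
```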

  10. Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools

    NASA Technical Reports Server (NTRS)

    Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.

    2006-01-01

    A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools to make a significant improvement in three major facets of low-thrust trajectory and mission analysis. These are: 1) ease of use, 2) ability to more robustly converge to solutions, and 3) higher fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferred over one integrated tool. This tool suite, the individual tools' characteristics, and their applicability will be described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.

  11. Technology Combination Analysis Tool (TCAT) for Active Debris Removal

    NASA Astrophysics Data System (ADS)

    Chamot, B.; Richard, M.; Salmon, T.; Pisseloup, A.; Cougnet, C.; Axthelm, R.; Saunder, C.; Dupont, C.; Lequette, L.

    2013-08-01

    This paper presents the work of the Swiss Space Center EPFL within the CNES-funded OTV-2 study. In order to find the highest-performing Active Debris Removal (ADR) mission architectures and technologies, a tool was developed to design and compare ADR spacecraft and to plan ADR campaigns to remove large debris. Two types of architectures are considered to be efficient: the Chaser (single-debris spacecraft) and the Mothership/Kits (multiple-debris spacecraft). Both are able to perform controlled re-entry. The tool includes modules to optimise the launch dates and the order of capture, to design missions and spacecraft, and to select launch vehicles. The propulsion, power and structure subsystems are sized by the tool using high-level parametric models, whilst the other subsystems are defined by their mass and power consumption. Final results are still under investigation by the consortium, but two concrete examples of the tool's outputs are presented in the paper.

  12. Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics

    SciTech Connect

    Jodoin, Vincent J; Lee, Ronald W; Peplow, Douglas E.; Lefebvre, Jordan P

    2011-01-01

    The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

  13. A Finite-Element Analysis of Transient Vibration of an Ultrasonic Welding Tool

    NASA Astrophysics Data System (ADS)

    Koike, Yoshikazu; Ueha, Sadayuki

    1993-05-01

    Transient vibration of an ultrasonic welding tool is computed in order to estimate the fracture of the tool. Equations for this analysis have been formulated by means of the finite-element method (FEM) and Newmark-β method. With the use of these equations, the transient stress distribution of the ultrasonic welding tool driven by a bolt-clamped Langevin-type transducer is calculated. Then, the effect of the transient state of the tool is investigated.
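
    For readers unfamiliar with the time-integration scheme named here, a minimal single-degree-of-freedom Newmark-beta stepper (average-acceleration variant) is sketched below. The welding-tool FEM model is of course a large multi-DOF system, so this is only the scalar skeleton of the method; the parameter values in the usage lines are invented.

```python
# Minimal SDOF Newmark-beta integrator (beta = 1/4, gamma = 1/2).
import numpy as np

def newmark(m, c, k, f, dt, beta=0.25, gamma=0.5):
    n = len(f)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = (f[0] - c * v[0] - k * u[0]) / m
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    for i in range(n - 1):
        # effective load: history terms from the Newmark displacement/velocity
        # update, moved to the right-hand side of the equation of motion
        rhs = (f[i + 1]
               + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                      + a[i] * (1 / (2 * beta) - 1))
               + c * (gamma * u[i] / (beta * dt)
                      + v[i] * (gamma / beta - 1)
                      + a[i] * dt * (gamma / (2 * beta) - 1)))
        u[i + 1] = rhs / keff
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                    - v[i] / (beta * dt) - a[i] * (1 / (2 * beta) - 1))
        v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])
    return u, v, a

t = np.linspace(0.0, 0.01, 1001)                     # 10 ms window (assumed)
u, v, a = newmark(m=0.5, c=20.0, k=2.0e6,
                  f=100.0 * np.sin(2 * np.pi * 2.0e4 * t), dt=t[1] - t[0])
```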

  14. CASSIS - Web based tools for the analysis of astrophysical spectra

    NASA Astrophysics Data System (ADS)

    Walters, Adam D.; Klotz, Alain; Caux, Emmanuel; Crovisier, Jacques

    Future instruments for submillimeter and FIR astronomy like the Herschel Space Observatory and ALMA will offer the possibility to make high-resolution wide-frequency spectral surveys of the interstellar and circumstellar media. The CASSIS project aims to facilitate the analysis of these spectra by a maximum number of interested researchers. We will give a demonstration of the present state of the CASSIS software, which is beginning to be made available for testing and validation through the web. By connecting to the CASSIS server, a user will be able to visualise a synthetic spectrum created from a choice of models and associated parameters and compare this simulation with the observations. By choosing a template closest to that of the object being observed, the user will obtain a set of starting conditions including the principal chemical species, their abundance and linked physical parameters. Various computer tools will then allow the inversion of the spectra to obtain the best set of parameters. In order to run the chosen model, the CASSIS server will have access to a variety of different databases and will automatically transform this data into a common format. As of May 2005, two spectroscopic databases (JPL and CDMS) are periodically updated and combined on the server, allowing rapid injection into the models. Access to bases giving collisional parameters is planned in the near future. At this time two models are available: a general Local Thermal Equilibrium (LTE) model with a correction for the optical depth, and a specialised model for comets written by J. Crovisier. Two working modes can be used: (1) a line-by-line mode that steps automatically from one transition of a given species to the next, allowing a comparison between the predictions and the observations to be made for each line; (2) a sum mode which shows the convolved spectra of a variety of species. For the LTE model it is possible to create interactively a rotational diagram to determine N and T for each
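
    The rotational-diagram step mentioned at the end reduces to a straight-line fit: under LTE, ln(N_u/g_u) plotted against upper-level energy has slope -1/T and intercept ln(N/Q(T)). The sketch below uses invented level populations and a hypothetical linear-rotor partition function, purely to show the mechanics.

```python
# Hedged rotational-diagram fit; input values and Q(T) are illustrative.
import numpy as np

E_u = np.array([10.0, 25.0, 45.0, 80.0])              # upper-level energies, K
N_u_over_g = np.array([2e13, 1.2e13, 5e12, 1.5e12])   # column per statistical weight

slope, intercept = np.polyfit(E_u, np.log(N_u_over_g), 1)
T_rot = -1.0 / slope                                  # rotational temperature, K
Q = 0.5 * T_rot                                       # hypothetical Q(T) ~ kT/(hB)
N_tot = np.exp(intercept) * Q
print(f"T_rot = {T_rot:.1f} K, N = {N_tot:.2e} cm^-2")
```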

  15. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; a multi-aperture analysis of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed

  16. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    NASA Astrophysics Data System (ADS)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
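
    A heavily hedged sketch of the standard multispecimen ratio as I read the Dekkers and Böhnel (2006) protocol: Q = (m1 - m0)/m0 is computed per specimen and regressed against the applied lab field, and the zero crossing of the fit estimates the paleofield. The example data are invented, and MSP-Tool's corrected ratios, reliability criteria and bootstrap statistics are omitted here.

```python
# Illustrative standard-ratio MSP estimate; data are invented.
import numpy as np

H_lab = np.array([10.0, 20.0, 30.0, 40.0])   # applied lab fields, microtesla
m0 = np.array([1.00, 0.98, 1.03, 0.99])      # NRM of each specimen (normalised)
m1 = np.array([0.85, 0.97, 1.12, 1.18])      # remanence after in-field heating

Q = (m1 - m0) / m0
slope, intercept = np.polyfit(H_lab, Q, 1)
H_anc = -intercept / slope                   # field where the fit crosses Q = 0
print(f"paleointensity estimate: {H_anc:.1f} uT")
```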

  17. A life cycle cost analysis framework for geologic storage of hydrogen : a user's tool.

    SciTech Connect

    Kobos, Peter Holmes; Lord, Anna Snider; Borns, David James; Klise, Geoffrey T.

    2011-09-01

    The U.S. Department of Energy (DOE) has an interest in large scale hydrogen geostorage, which could offer substantial buffer capacity to meet possible disruptions in supply or changing seasonal demands. The geostorage site options being considered are salt caverns, depleted oil/gas reservoirs, aquifers and hard rock caverns. The DOE has an interest in assessing the geological, geomechanical and economic viability for these types of geologic hydrogen storage options. This study has developed an economic analysis methodology and subsequent spreadsheet analysis to address costs entailed in developing and operating an underground geologic storage facility. This year the tool was updated specifically to (1) incorporate more site-specific model input assumptions for the wells and storage site modules, (2) develop a version that matches the general format of the HDSAM model developed and maintained by Argonne National Laboratory, and (3) incorporate specific demand scenarios illustrating the model's capability. Four general types of underground storage were analyzed: salt caverns, depleted oil/gas reservoirs, aquifers, and hard rock caverns/other custom sites. Due to the substantial lessons learned from the geological storage of natural gas already employed, these options present a potentially sizable storage option. Understanding and including these various geologic storage types in the physical and economic framework of the analysis will help identify which geologic option would be best suited for the storage of hydrogen. It is important to note, however, that existing natural gas options may not translate to a hydrogen system where substantial engineering obstacles may be encountered. There are only three locations worldwide that currently store hydrogen underground and they are all in salt caverns. Two locations are in the U.S. (Texas), and are managed by ConocoPhillips and Praxair (Leighty, 2007). The third is in Teesside, U.K., managed by Sabic Petrochemicals (Crotogino et

  18. The ISOPHOT Interactive Analysis PIA, a Calibration and Scientific Analysis Tool

    NASA Astrophysics Data System (ADS)

    Gabriel, C.; Acosta-Pulido, J.; Heinrichsen, I.; Morris, H.; Tai, W.-M.

    ISOPHOT is one of the instruments on board the Infrared Space Observatory (ISO), launched in November 1995 by the European Space Agency. ISOPHOT Interactive Analysis, PIA, is a scientific and calibration data analysis tool for ISOPHOT data reduction. The PIA software was designed both as a tool for use by the instrument team for calibration of the PHT instrument during and after the ISO mission, and as an interactive tool for ISOPHOT data analysis by general observers. It has been jointly developed by the ESA Astrophysics Division (responsible for planning, direction, and execution) and scientific institutes forming the ISOPHOT consortium. PIA is entirely based on the Interactive Data Language IDL (IDL92). All the capabilities of input/output, processing, and visualization can be reached from window menus for all the different ISOPHOT sub-systems, in all levels of data reduction from digital raw data, coming from the ISO telemetry, to the final level of calibrated images, spectra, and multi-filter, multi-aperture photometry. In this article, we describe the structure of the package, putting special emphasis on the internal data organization and management.

  19. Incorporating Human Movement Behavior into the Analysis of Spatially Distributed Infrastructure

    PubMed Central

    Wu, Lihua; Leung, Henry; Jiang, Hao; Zheng, Hong; Ma, Li

    2016-01-01

    For the first time in human history, the majority of the world's population resides in urban areas. Therefore, city managers are faced with new challenges related to the efficiency, equity and quality of the supply of resources, such as water, food and energy. Infrastructure in a city can be viewed as service points providing resources. These service points function together as a spatially collaborative system to serve an increasing population. To study the spatial collaboration among service points, we propose a shared network according to humans' collective movement and resource usage, based on data usage detail records (UDRs) from the cellular network in a city in western China. This network is shown not to be scale-free, but it exhibits an interesting triangular property governed by two types of nodes with very different link patterns. Surprisingly, this feature is consistent with the urban-rural dualistic context of the city. Another feature of the shared network is that it consists of several spatially separated communities that characterize local people's active zones but do not completely overlap with administrative areas. According to these features, we propose the incorporation of human movement into infrastructure classification. The presence of well-defined spatially separated clusters confirms the effectiveness of this approach. In this paper, our findings reveal the spatial structure inside a city, and the proposed approach provides a new perspective on integrating human movement into the study of a spatially distributed system. PMID:26784749
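
    An illustrative sketch of the shared-network idea, not the authors' pipeline: service points are linked when the same users visit both, and spatially separated communities are then sought by modularity maximisation. The visit data are invented.

```python
# Build a user-co-visitation network and detect communities.
from itertools import combinations
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# user -> set of service points visited, as might be derived from UDRs
visits = {
    "u1": {"A", "B"}, "u2": {"A", "B", "C"},
    "u3": {"C", "D"}, "u4": {"D", "E"}, "u5": {"E", "D"},
}

G = nx.Graph()
for points in visits.values():
    for p, q in combinations(sorted(points), 2):
        w = G[p][q]["weight"] + 1 if G.has_edge(p, q) else 1
        G.add_edge(p, q, weight=w)   # edge weight = number of shared users

for i, community in enumerate(greedy_modularity_communities(G, weight="weight")):
    print(f"community {i}: {sorted(community)}")
```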

  20. Incorporating Human Movement Behavior into the Analysis of Spatially Distributed Infrastructure.

    PubMed

    Wu, Lihua; Leung, Henry; Jiang, Hao; Zheng, Hong; Ma, Li

    2016-01-01

    For the first time in human history, the majority of the world's population resides in urban areas. Therefore, city managers are faced with new challenges related to the efficiency, equity and quality of the supply of resources, such as water, food and energy. Infrastructure in a city can be viewed as service points providing resources. These service points function together as a spatially collaborative system to serve an increasing population. To study the spatial collaboration among service points, we propose a shared network according to humans' collective movement and resource usage, based on data usage detail records (UDRs) from the cellular network in a city in western China. This network is shown not to be scale-free, but it exhibits an interesting triangular property governed by two types of nodes with very different link patterns. Surprisingly, this feature is consistent with the urban-rural dualistic context of the city. Another feature of the shared network is that it consists of several spatially separated communities that characterize local people's active zones but do not completely overlap with administrative areas. According to these features, we propose the incorporation of human movement into infrastructure classification. The presence of well-defined spatially separated clusters confirms the effectiveness of this approach. In this paper, our findings reveal the spatial structure inside a city, and the proposed approach provides a new perspective on integrating human movement into the study of a spatially distributed system. PMID:26784749

  1. Tools-4-Metatool (T4M): online suite of web-tools to process stoichiometric network analysis data from Metatool.

    PubMed

    Xavier, Daniela; Vázquez, Sara; Higuera, Clara; Morán, Federico; Montero, Francisco

    2011-08-01

    Tools-4-Metatool (T4M) is a suite of web-tools, implemented in PERL, which analyses, parses, and manipulates files related to Metatool. Its main goal is to assist the work with Metatool. T4M has two major sets of tools: Analysis and Compare. Analysis visualizes the results of Metatool (convex basis, elementary flux modes, and enzyme subsets) and facilitates the study of metabolic networks. It is composed of five tools: MDigraph, MetaMatrix, CBGraph, EMGraph, and SortEM. Compare was developed to compare Metatool results from different networks. This set consists of Compara and ComparaSub, which compare network subsets and provide outputs in different formats, and ComparaEM, which searches for identical elementary modes in two metabolic networks. The suite T4M also includes one script that generates Metatool input: CBasis2Metatool, based on a Metatool output file that is filtered by a list of convex-basis metabolites. Finally, the utility CheckMIn checks the consistency of the Metatool input file. T4M is available at http://solea.quim.ucm.es/t4m. PMID:21554926

  2. Lagrangian analysis. Modern tool of the dynamics of solids

    NASA Astrophysics Data System (ADS)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors emphasize that either group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than those imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The
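
    For plane one-dimensional motion, the conservation laws that Lagrangian analysis integrates can be written (one common convention; Lagrangian coordinate $h$, particle velocity $u$, stress $\sigma$ positive in compression, specific volume $V$, specific internal energy $E$, and initial state $\rho_0 = 1/V_0$):

$$
\frac{\partial V}{\partial t} = V_0\,\frac{\partial u}{\partial h},
\qquad
\rho_0\,\frac{\partial u}{\partial t} = -\frac{\partial \sigma}{\partial h},
\qquad
\frac{\partial E}{\partial t} = -\,\sigma\,\frac{\partial V}{\partial t},
$$

    so that time-resolved gauge records of either $\sigma(h,t)$ or $u(h,t)$ suffice to recover the remaining mechanical quantities by integration.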

  3. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
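
    A minimal sketch of the post-processing idea, assuming each spring's forces and deformations are converted to stresses and strains over a tributary area and bondline thickness, then ranked for reporting; the field names and values are illustrative, not the tools' actual data model.

```python
# Convert spring results to adhesive stresses/strains and sort descending.
def adhesive_results(springs, area, thickness):
    """springs: iterable of (id, shear_force, peel_force, shear_def, peel_def)."""
    rows = []
    for sid, fs, fp, ds, dp in springs:
        rows.append({
            "id": sid,
            "shear_stress": fs / area, "peel_stress": fp / area,
            "shear_strain": ds / thickness, "peel_strain": dp / thickness,
        })
    # rank by peel stress, typically the critical quantity in bonded joints
    return sorted(rows, key=lambda r: r["peel_stress"], reverse=True)

springs = [(1, 120.0, 80.0, 0.004, 0.002), (2, 90.0, 150.0, 0.003, 0.005)]
for row in adhesive_results(springs, area=25.0, thickness=0.2):
    print(row)
```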

  4. Incorporation of inexact dynamic optimization with fuzzy relation analysis for integrated climate change impact study

    SciTech Connect

    Huang, G.H.; Cohen, S.J.; Yin, Y.Y.; Bass, B.

    1996-09-01

    A climatic change impact assessment was performed for agricultural and timbering activities. An inexact dynamic optimization model that can reflect complex system features was utilized, together with a related fuzzy system relation analysis method for comprehensive assessment of impact patterns.

  5. Incorporating general race and housing flexibility and deadband in rolling element bearing analysis

    NASA Technical Reports Server (NTRS)

    Davis, R. R.; Vallance, C. S.

    1989-01-01

    Methods for including the effects of general race and housing compliance and outer race-to-housing deadband (clearance) in rolling element bearing mechanics analysis are presented. It is shown that these effects can cause significant changes in bearing stiffness characteristics, which are of major importance in the rotordynamic response of turbomachinery and other rotating systems. Preloading analysis is demonstrated with the finite element/contact mechanics hybrid method applied to a 45 mm angular contact ball bearing.

  6. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    SciTech Connect

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  7. Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches

    NASA Astrophysics Data System (ADS)

    Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

    2016-03-01

    We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines (NURBS). After a brief introduction outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches typical of masonry structures show the performance of this novel technique. These are discussed in detail to emphasize the advantages and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
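
    For orientation, the geometry and displacement model rests on the standard NURBS curve definition: given control points $\mathbf{P}_i$, weights $w_i$, and B-spline basis functions $N_{i,p}$ of degree $p$ built from a knot vector by the Cox-de Boor recursion,

$$
\mathbf{C}(u) \;=\; \frac{\sum_{i=0}^{n} N_{i,p}(u)\, w_i\, \mathbf{P}_i}{\sum_{i=0}^{n} N_{i,p}(u)\, w_i},
$$

    and isogeometric analysis uses the same rational basis for the unknown displacement field, which is what allows the arch geometry to be represented exactly.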

  8. TriageTools: tools for partitioning and prioritizing analysis of high-throughput sequencing data.

    PubMed

    Fimereli, Danai; Detours, Vincent; Konopka, Tomasz

    2013-04-01

    High-throughput sequencing is becoming a popular research tool but carries with it considerable costs in terms of computation time, data storage and bandwidth. Meanwhile, some research applications focusing on individual genes or pathways do not necessitate processing of a full sequencing dataset. Thus, it is desirable to partition a large dataset into smaller, manageable, but relevant pieces. We present a toolkit for partitioning raw sequencing data that includes a method for extracting reads that are likely to map onto pre-defined regions of interest. We show the method can be used to extract information about genes of interest from DNA or RNA sequencing samples in a fraction of the time and disk space required to process and store a full dataset. We report speedup factors between 2.6 and 96, depending on settings and samples used. The software is available at http://www.sourceforge.net/projects/triagetools/. PMID:23408855
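
    A hedged sketch of the core idea (not TriageTools' actual algorithm): keep a read if it shares at least one k-mer with the reference sequence of a region of interest. The sequences and k value below are invented.

```python
# Toy k-mer triage: retain reads likely to map onto a region of interest.
def kmers(seq, k):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def triage(reads, region_seq, k=16):
    """Yield reads sharing at least one k-mer with the region of interest."""
    index = kmers(region_seq, k)
    for read in reads:
        if any(read[i:i + k] in index for i in range(len(read) - k + 1)):
            yield read

region = "ACGTACGTTAGCCGATAGGCTTAACGGTACGATCGATCGGATCCTAGGCA"
reads = ["TAGCCGATAGGCTTAACGGTACGATCGATC", "TTTTTTTTTTTTTTTTTTTTTTTTTTTTTT"]
print(list(triage(reads, region)))   # only the first read survives triage
```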

  9. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating the uncertainty in, the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom "Big Data" geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap through the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for the implementation of data reduction and topology generation.

  10. Forward-looking activities: incorporating citizens' visions: A critical analysis of the CIVISTI method.

    PubMed

    Gudowsky, Niklas; Peissl, Walter; Sotoudeh, Mahshid; Bechtold, Ulrike

    2012-11-01

    Looking back on the many prophets who tried to predict the future as if it were predetermined, at first sight any forward-looking activity is reminiscent of making predictions with a crystal ball. In contrast to fortune tellers, today's exercises do not predict, but try to show different paths that an open future could take. A key motivation to undertake forward-looking activities is broadening the information basis for decision-makers to help them actively shape the future in a desired way. Experts, laypeople, or stakeholders may have different sets of values and priorities with regard to pending decisions on any issue related to the future. Therefore, considering and incorporating their views can, in the best case scenario, lead to more robust decisions and strategies. However, transferring this plurality into a form that decision-makers can consider is a challenge in terms of both design and facilitation of participatory processes. In this paper, we will introduce and critically assess a new qualitative method for forward-looking activities, namely CIVISTI (Citizen Visions on Science, Technology and Innovation; www.civisti.org), which was developed during an EU project of the same name. Focussing strongly on participation, with clear roles for citizens and experts, the method combines expert, stakeholder and lay knowledge to elaborate recommendations for decision-making in issues related to today's and tomorrow's science, technology and innovation. Consisting of three steps, the process starts with citizens' visions of a future 30-40 years from now. Experts then translate these visions into practical recommendations which the same citizens then validate and prioritise to produce a final product. The following paper will highlight the added value as well as limits of the CIVISTI method and will illustrate potential for the improvement of future processes. PMID:23204998

  11. Bayesian regression discontinuity designs: incorporating clinical knowledge in the causal analysis of primary care data

    PubMed Central

    Geneletti, Sara; O’Keeffe, Aidan G.; Sharples, Linda D.; Richardson, Sylvia; Baio, Gianluca

    2016-01-01

    The regression discontinuity (RD) design is a quasi-experimental design that estimates the causal effects of a treatment by exploiting naturally occurring treatment rules. It can be applied in any context where a particular treatment or intervention is administered according to a pre-specified rule linked to a continuous variable. Such thresholds are common in primary care drug prescription where the RD design can be used to estimate the causal effect of medication in the general population. Such results can then be contrasted to those obtained from randomised controlled trials (RCTs) and inform prescription policy and guidelines based on a more realistic and less expensive context. In this paper, we focus on statins, a class of cholesterol-lowering drugs; however, the methodology can be applied to many other drugs provided these are prescribed in accordance to pre-determined guidelines. Current guidelines in the UK state that statins should be prescribed to patients with 10-year cardiovascular disease risk scores in excess of 20%. If we consider patients whose risk scores are close to the 20% risk score threshold, we find that there is an element of random variation in both the risk score itself and its measurement. We can therefore consider the threshold as a randomising device that assigns statin prescription to individuals just above the threshold and withholds it from those just below. Thus, we are effectively replicating the conditions of an RCT in the area around the threshold, removing or at least mitigating confounding. We frame the RD design in the language of conditional independence, which clarifies the assumptions necessary to apply an RD design to data, and which makes the links with instrumental variables clear. We also have context-specific knowledge about the expected sizes of the effects of statin prescription and are thus able to incorporate this into Bayesian models by formulating informative priors on our causal parameters. PMID:25809691
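
    A numerical toy of the RD idea, not the authors' Bayesian model: simulate prescription assigned at the 20% risk threshold, fit local linear regressions on either side, and take the jump in intercepts at the cut-off as the effect estimate. The outcome, bandwidth and effect size are invented.

```python
# Toy sharp regression discontinuity estimate at a 20% risk threshold.
import numpy as np

rng = np.random.default_rng(1)
risk = rng.uniform(0.10, 0.30, 2000)             # 10-year CVD risk score
treated = risk >= 0.20                           # guideline: statins above 20%
ldl = 5.0 + 2.0 * risk - 0.8 * treated + rng.normal(0, 0.3, risk.size)

h = 0.05                                         # bandwidth around threshold
left = (risk >= 0.20 - h) & ~treated
right = (risk <= 0.20 + h) & treated
b_l = np.polyfit(risk[left] - 0.20, ldl[left], 1)    # [slope, intercept]
b_r = np.polyfit(risk[right] - 0.20, ldl[right], 1)
effect = b_r[1] - b_l[1]                         # jump in intercepts at cut-off
print(f"estimated effect of prescription at threshold: {effect:.2f}")
```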

  12. A Fully Automated and Robust Method to Incorporate Stamping Data in Crash, NVH and Durability Analysis

    NASA Astrophysics Data System (ADS)

    Palaniswamy, Hariharasudhan; Kanthadai, Narayan; Roy, Subir; Beauchesne, Erwan

    2011-08-01

    Crash, NVH (Noise, Vibration, Harshness), and durability analysis are commonly deployed in structural CAE analysis for mechanical design of components especially in the automotive industry. Components manufactured by stamping constitute a major portion of the automotive structure. In CAE analysis they are modeled at a nominal state with uniform thickness and no residual stresses and strains. However, in reality the stamped components have non-uniformly distributed thickness and residual stresses and strains resulting from stamping. It is essential to consider the stamping information in CAE analysis to accurately model the behavior of the sheet metal structures under different loading conditions. Especially with the current emphasis on weight reduction by replacing conventional steels with aluminum and advanced high strength steels it is imperative to avoid over design. Considering this growing need in industry, a highly automated and robust method has been integrated within Altair Hyperworks® to initialize sheet metal components in CAE models with stamping data. This paper demonstrates this new feature and the influence of stamping data for a full car frontal crash analysis.
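
    A hedged sketch of the mapping step such a feature must perform: transfer thickness (and, analogously, residual stress and strain) from stamping-result sample points to crash-mesh element centroids. A production tool would use shape-function interpolation on the stamping mesh; nearest-neighbour lookup is the simplest workable variant, and all arrays below are synthetic.

```python
# Map a non-uniform thickness field onto a crash mesh by nearest neighbour.
import numpy as np
from scipy.spatial import cKDTree

stamp_xyz = np.random.rand(5000, 3)        # stamping-result sample points
stamp_thk = 1.2 - 0.4 * stamp_xyz[:, 0]    # non-uniform thickness field, mm

crash_xyz = np.random.rand(20000, 3)       # crash-mesh element centroids
_, idx = cKDTree(stamp_xyz).query(crash_xyz, k=1)
crash_thk = stamp_thk[idx]                 # initialized thickness per element
print(crash_thk.min(), crash_thk.max())
```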

  13. Analysis of tablet compaction. I. Characterization of mechanical behavior of powder and powder/tooling friction.

    PubMed

    Cunningham, J C; Sinka, I C; Zavaliangos, A

    2004-08-01

    In this first of two articles on the modeling of tablet compaction, the experimental inputs related to the constitutive model of the powder and the powder/tooling friction are determined. The continuum-based analysis of tableting makes use of an elasto-plastic model, which incorporates the elements of yield, plastic flow potential, and hardening, to describe the mechanical behavior of microcrystalline cellulose over the range of densities experienced during tableting. Specifically, a modified Drucker-Prager/cap plasticity model, which includes material parameters such as cohesion, internal friction, and hydrostatic yield pressure that evolve with the internal state variable relative density, was applied. Linear elasticity is assumed with the elastic parameters, Young's modulus, and Poisson's ratio dependent on the relative density. The calibration techniques were developed based on a series of simple mechanical tests including diametrical compression, simple compression, and die compaction using an instrumented die. The friction behavior is measured using an instrumented die and the experimental data are analyzed using the method of differential slices. The constitutive model and frictional properties are essential experimental inputs to the finite element-based model described in the companion article. PMID:15236452
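
    For orientation, the shear-failure portion of a modified Drucker-Prager/cap model has the form below, where $p$ is the hydrostatic pressure, $q$ the Mises equivalent stress, and the cohesion $d$ and friction angle $\beta$ evolve with the relative density $\bar{\rho}$; the cap surface, which closes the yield envelope in hydrostatic compression and hardens with $\bar{\rho}$, is omitted here for brevity:

$$
F_s(p, q;\, \bar{\rho}) \;=\; q \,-\, p\,\tan\beta(\bar{\rho}) \,-\, d(\bar{\rho}) \;=\; 0 .
$$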

  14. Issues and approach to develop validated analysis tools for hypersonic flows: One perspective

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.

    1993-01-01

    Critical issues concerning the modeling of low density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools, and the activity in the NASA Ames Research Center's Aerothermodynamics Branch is described. Inherent in the process is a strong synergism between ground test and real gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flowfield simulation codes are discussed. These models were partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions is sparse and reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high enthalpy flow facilities, such as shock tubes and ballistic ranges.

  15. Issues and approach to develop validated analysis tools for hypersonic flows: One perspective

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.

    1992-01-01

    Critical issues concerning the modeling of low-density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools. A description of the activity in the Ames Research Center's Aerothermodynamics Branch is also given. Inherent in the process is a strong synergism between ground test and real-gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flow-field simulation codes are discussed. These models have been partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse; reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground-based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high-enthalpy flow facilities, such as shock tubes and ballistic ranges.

  16. Tools for Education Policy Analysis [with CD-ROM].

    ERIC Educational Resources Information Center

    Mingat, Alain; Tan, Jee-Peng

    This manual contains a set of tools to assist policymakers in analyzing and revamping educational policy. Its main focus is on some economic and financial aspects of education and selected features in the arrangements for service delivery. Originally offered as a series of training workshops for World Bank staff to work with clients in the…

  17. PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis

    ERIC Educational Resources Information Center

    Waycott, Jenny; Jones, Ann; Scanlon, Eileen

    2005-01-01

    This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

  18. INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...

  19. Mapping and spatiotemporal analysis tool for hydrological data: Spellmap

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Lack of data management and analysis tools is one of the major limitations to effectively evaluating and using large datasets of high-resolution atmospheric, surface, and subsurface observations. High spatial and temporal resolution datasets better represent the spatiotemporal variability of hydrologica...

  20. Failure Analysis of a Complex Learning Framework Incorporating Multi-Modal and Semi-Supervised Learning

    SciTech Connect

    Pullum, Laura L; Symons, Christopher T

    2011-01-01

    Machine learning is used in many applications, from machine vision to speech recognition to decision support systems, and is used to test applications. However, though much has been done to evaluate the performance of machine learning algorithms, little has been done to verify the algorithms or examine their failure modes. Moreover, complex learning frameworks often require stepping beyond black box evaluation to distinguish between errors based on natural limits on learning and errors that arise from mistakes in implementation. We present a conceptual architecture, failure model and taxonomy, and failure modes and effects analysis (FMEA) of a semi-supervised, multi-modal learning system, and provide specific examples from its use in a radiological analysis assistant system. The goal of the research described in this paper is to provide a foundation from which dependability analysis of systems using semi-supervised, multi-modal learning can be conducted. The methods presented provide a first step towards that overall goal.
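    To make the FMEA bookkeeping concrete, the sketch below ranks invented failure modes of a hypothetical learning pipeline by risk priority number (severity × occurrence × detection); the modes and scores are illustrative and do not come from the paper.

```python
# Toy FMEA sketch: rank hypothetical failure modes by RPN.
# All modes and 1-10 scores below are invented for illustration.
failure_modes = [
    # (mode, severity, occurrence, detection)
    ("label noise in semi-supervised training set", 7, 6, 5),
    ("modality dropout at inference time",          8, 3, 4),
    ("feature-scaling mismatch between modalities", 5, 5, 3),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for mode, s, o, d in ranked:
    print(f"RPN={s * o * d:4d}  {mode}")
```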

  1. Steady-State Kinetic Analysis of DNA Polymerase Single-Nucleotide Incorporation Products

    PubMed Central

    O'Flaherty, Derek K.

    2014-01-01

    This unit describes the experimental procedures for the steady-state kinetic analysis of DNA synthesis across DNA nucleotides (native or modified) by DNA polymerases. In vitro primer extension experiments with a single nucleoside triphosphate species, followed by denaturing polyacrylamide gel electrophoresis of the extended products, are described. Data analysis procedures and fitting to steady-state kinetic models are presented to highlight the kinetic differences involved in the bypass of damaged versus undamaged DNA. Moreover, common problems encountered in these experiments are addressed. This approach provides useful quantitative parameters for the processing of damaged DNA by DNA polymerases. PMID:25501593
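    The fitting step described above is typically a Michaelis-Menten fit of incorporation rate against nucleotide concentration, from which efficiency (Vmax/Km) comparisons between damaged and undamaged templates are made. A minimal Python sketch with synthetic data (all values illustrative):

```python
# Hedged sketch: Michaelis-Menten fit of single-nucleotide incorporation rates.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    """Steady-state rate v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

# [dNTP] concentrations (uM) and observed rates -- made-up data.
conc = np.array([0.5, 1, 2, 5, 10, 20, 50, 100.0])
rate = np.array([0.9, 1.7, 2.9, 5.0, 6.8, 8.1, 9.2, 9.6])

(vmax, km), cov = curve_fit(michaelis_menten, conc, rate, p0=[10.0, 5.0])
print(f"Vmax = {vmax:.2f}, Km = {km:.2f}, efficiency Vmax/Km = {vmax / km:.3f}")
```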

  2. Considerations for Incorporating Bioavailability in Effect-Directed Analysis and Toxicity Identification Evaluation.

    EPA Science Inventory

    In order to avoid a bias toward highly toxic but poorly bioavailable compounds in the effect-directed analysis (EDA) of soils and sediments, approaches are discussed to consider bioavailability in EDA procedures. In parallel, complimentary approaches for making toxicity identific...

  3. Incorporating Asymmetric Dependency Patterns in the Evaluation of IS/IT projects Using Real Option Analysis

    ERIC Educational Resources Information Center

    Burke, John C.

    2012-01-01

    The objective of my dissertation is to create a general approach to evaluating IS/IT projects using Real Option Analysis (ROA). This is an important problem because an IT Project Portfolio (ITPP) can represent hundreds of projects, millions of dollars of investment and hundreds of thousands of employee hours. Therefore, any advance in the…

  4. Functional Analysis of the Early Development of Self-Injurious Behavior: Incorporating Gene-Environment Interactions

    ERIC Educational Resources Information Center

    Langthorne, Paul; McGill, Peter

    2008-01-01

    The analysis of the early development of self-injurious behavior (SIB) has, to date, reflected the wider distinction between nature and nurture. Despite the status of genetic factors as risk markers for the later development of SIB, a model that accounts for their influence on early behavior-environment relations is lacking. In the current paper…

  5. Semantic Feature Analysis: Incorporating Typicality Treatment and Mediating Strategy Training to Promote Generalization

    ERIC Educational Resources Information Center

    Wambaugh, Julie L.; Mauszycki, Shannon; Cameron, Rosalea; Wright, Sandra; Nessler, Christina

    2013-01-01

    Purpose: This investigation was designed to examine the generalization effects of semantic treatment for word retrieval deficits in people with aphasia. Semantic feature analysis (SFA; Boyle & Coelho, 1995), typicality treatment (Kiran & Thompson, 2003), and mediating strategy training were combined to maximize potential generalization effects.…

  6. Teaching for Art Criticism: Incorporating Feldman's Critical Analysis Learning Model in Students' Studio Practice

    ERIC Educational Resources Information Center

    Subramaniam, Maithreyi; Hanafi, Jaffri; Putih, Abu Talib

    2016-01-01

    This study examined the artwork of 30 first-year graphic design students, applying critical analysis using Feldman's model of art criticism. Data were analyzed quantitatively; descriptive statistical techniques were employed. The scores were viewed in the form of mean scores and frequencies to determine students' performances in their critical ability.…

  7. Protein Analysis Using Real-Time PCR Instrumentation: Incorporation in an Integrated, Inquiry-Based Project

    ERIC Educational Resources Information Center

    Southard, Jonathan N.

    2014-01-01

    Instrumentation for real-time PCR is used primarily for amplification and quantitation of nucleic acids. The capability to measure fluorescence while controlling temperature in multiple samples can also be applied to the analysis of proteins. Conformational stability and changes in stability due to ligand binding are easily assessed. Protein…

  8. 77 FR 66867 - Corning Incorporated; Analysis of Proposed Agreement Containing Consent Order To Aid Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-07

    ... unfair methods of competition. The attached Analysis To Aid Public Comment describes both the allegations... Stephanie C. Bovee (202-326-2083), FTC, Bureau of Competition, 600 Pennsylvania Avenue NW., Washington, DC..., and Section 5 of the Federal Trade Commission Act, as amended, 15 U.S.C. 45, by lessening...

  9. Classification Based on Hierarchical Linear Models: The Need for Incorporation of Social Contexts in Classification Analysis

    ERIC Educational Resources Information Center

    Vaughn, Brandon K.; Wang, Qui

    2009-01-01

    Many areas in educational and psychological research involve the use of classification statistical analysis. For example, school districts might be interested in attaining variables that provide optimal prediction of school dropouts. In psychology, a researcher might be interested in the classification of a subject into a particular psychological…

  10. The incorporation of plotting capability into the Unified Subsonic Supersonic Aerodynamic Analysis program, version B

    NASA Technical Reports Server (NTRS)

    Winter, O. A.

    1980-01-01

    The B01 version of the Unified Subsonic Supersonic Aerodynamic Analysis program is the result of numerous modifications and additions made to the B00 version. These modifications and additions affect the program input, its computational options, the code readability, and the overlay structure. The following are described: (1) the revised input; (2) the plotting overlay programs, which were also modified, and their associated subroutines; (3) the auxiliary files used by the program and the revised output data; and (4) the program overlay structure.

  11. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    SciTech Connect

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  12. OutbreakTools: A new platform for disease outbreak analysis using the R software

    PubMed Central

    Jombart, Thibaut; Aanensen, David M.; Baguelin, Marc; Birrell, Paul; Cauchemez, Simon; Camacho, Anton; Colijn, Caroline; Collins, Caitlin; Cori, Anne; Didelot, Xavier; Fraser, Christophe; Frost, Simon; Hens, Niel; Hugues, Joseph; Höhle, Michael; Opatowski, Lulla; Rambaut, Andrew; Ratmann, Oliver; Soubeyrand, Samuel; Suchard, Marc A.; Wallinga, Jacco; Ypma, Rolf; Ferguson, Neil

    2014-01-01

    The investigation of infectious disease outbreaks relies on the analysis of increasingly complex and diverse data, which offer new prospects for gaining insights into disease transmission processes and informing public health policies. However, the potential of such data can only be harnessed using a number of different, complementary approaches and tools, and a unified platform for the analysis of disease outbreaks is still lacking. In this paper, we present the new R package OutbreakTools, which aims to provide a basis for outbreak data management and analysis in R. OutbreakTools is developed by a community of epidemiologists, statisticians, modellers and bioinformaticians, and implements classes and methods for storing, handling and visualizing outbreak data. It includes real and simulated outbreak datasets. Together with a number of tools for infectious disease epidemiology recently made available in R, OutbreakTools contributes to the emergence of a new, free and open-source platform for the analysis of disease outbreaks. PMID:24928667

  13. The Mission Planning Lab: A Visualization and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Daugherty, Sarah C.; Cervantes, Benjamin W.

    2009-01-01

    Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called the Mission Planning Lab (MPL).

  14. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
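    In its simplest empirical form, the conditional probability in question is P(ecological impairment | stressor exceeds a threshold), estimated by subsetting paired observations. A toy sketch with synthetic data (this illustrates the concept, not CPROB itself):

```python
# Minimal empirical conditional probability sketch with synthetic data.
import numpy as np

rng = np.random.default_rng(0)
stressor = rng.uniform(0, 10, 1000)                    # e.g., contaminant level
impaired = rng.uniform(0, 1, 1000) < stressor / 12.0   # synthetic response

def p_impaired_given_exceedance(threshold):
    """P(impaired | stressor > threshold), estimated from the sample."""
    exceed = stressor > threshold
    return impaired[exceed].mean() if exceed.any() else np.nan

for t in (2, 4, 6, 8):
    print(t, round(p_impaired_given_exceedance(t), 3))
```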

  15. Linkage Analysis Incorporating Gene-Age Interactions Identifies Seven Novel Lipid Loci: The Family Blood Pressure Program

    PubMed Central

    Kraja, Aldi T.; Turner, Stephen T.; Hanis, Craig L.; Sheu, Wayne; Chen, Ida; Jaquish, Cashell; Cooper, Richard S.; Chakravarti, Aravinda; Quertermous, Thomas; Boerwinkle, Eric; Hunt, Steven C.; Rao, DC

    2015-01-01

    OBJECTIVE To detect novel loci with age-dependent effects on fasting (≥8 hours) levels of total cholesterol, high-density lipoprotein, low-density lipoprotein, and triglycerides using 3,600 African Americans, 1,283 Asians, 3,218 European Americans, and 2,026 Mexican Americans from the Family Blood Pressure Program (FBPP). METHODS Within each subgroup (defined by network, race, and sex), we employed stepwise linear regression (retention p≤0.05) to adjust lipid levels for age, age-squared, age-cubed, body-mass-index, current smoking status, current drinking status, field center, estrogen therapy (females only), as well as antidiabetic, antihypertensive, and antilipidemic medication use. For each trait, we pooled the standardized male and female residuals within each network and race and fit a generalized variance components model that incorporated gene-age interactions. We conducted FBPP-wide and race-specific meta-analyses by combining the p-values of each linkage marker across subgroups using a modified Fisher's method. RESULTS We identified seven novel loci with age-dependent effects; four total cholesterol loci from the meta-analysis of Mexican Americans (on chromosomes 2q24.1, 4q21.21, 8q22.2, and 12p11.23) and three high-density lipoprotein loci from the meta-analysis of all FBPP subgroups (on chromosomes 1p12, 14q11.2, and 21q21.1). These loci lacked significant genome-wide linkage or association evidence in the literature and had logarithm of odds (LOD) score ≥ 3 in the meta-analysis with LOD ≥ 1 in at least two network and race subgroups (exclusively of non-European descent). CONCLUSION Incorporating gene-age interactions into the analysis of lipids using multi-ethnic cohorts can enhance gene discovery. These interaction loci can guide the selection of families for sequencing studies of lipid-associated variants. PMID:24819747

  16. Analysis tools for the calibration and commissioning of the AOF

    NASA Astrophysics Data System (ADS)

    Garcia-Rissmann, Aurea; Kolb, Johann; Le Louarn, Miska; Madec, Pierre-Yves; Muller, Nicolas

    2013-12-01

    The Adaptive Optics Facility (AOF) is an AO-oriented upgrade envisaged for implementation on UT4 at Paranal in 2013-2014, which could serve as a test case for the E-ELT. Relying on the largest Deformable Secondary Mirror ever built (1170 actuators) and on four off-axis Na laser launch telescopes, the AOF will operate in distinct modes (GLAO, LTAO, SCAO), in accordance with the instruments attached to the two telescope Nasmyth ports (GALACSI+MUSE, GRAAL+HAWK-I) and to the Cassegrain port (ERIS). Tools are under development to allow fast testing of important parameters for these systems during commissioning and for subsequent assessment of telemetry data. These concern the determination of turbulence parameters and Cn2 profiling, measurement of Strehl ratios and ensquared energies, misregistration calculation, bandwidth and overall performance, etc. Our tools are implemented as Graphical User Interfaces developed in the Matlab environment, and will be able to retrieve, through a dedicated server, data saved in SPARTA standards. We present here the tools developed to date and discuss details of what can be obtained from the AOF, based on simulations.

  17. Development of guidance for states transitioning to new safety analysis tools

    NASA Astrophysics Data System (ADS)

    Alluri, Priyanka

    With about 125 people dying on US roads each day, the US Department of Transportation heightened the awareness of critical safety issues with the passage of SAFETEA-LU (Safe Accountable Flexible Efficient Transportation Equity Act---a Legacy for Users) legislation in 2005. The legislation required each of the states to develop a Strategic Highway Safety Plan (SHSP) and incorporate data-driven approaches to prioritize and evaluate program outcomes; failure to do so resulted in funding sanctions. In conjunction with the legislation, research efforts have also been progressing toward the development of new safety analysis tools such as IHSDM (Interactive Highway Safety Design Model), SafetyAnalyst, and the HSM (Highway Safety Manual). These software and analysis tools are comparatively more advanced in statistical theory and level of accuracy, and tend to be more data intensive. A review of the 2009 five-percent reports and excerpts from the nationwide survey revealed astonishing facts about the continuing use of traditional methods, including crash frequencies and rates, for site selection and prioritization. The intense data requirements and statistical complexity of advanced safety tools are considered a hindrance to their adoption. In this context, this research aims at identifying the data requirements and data availability for SafetyAnalyst and the HSM by working with both tools. This research sets the stage for working with the Empirical Bayes approach by highlighting some of the biases and issues associated with traditional methods of selecting projects, such as greater emphasis on traffic volume and the regression-to-the-mean phenomenon. Further, the not-so-obvious issue with shorter segment lengths, which affect the results independently of the methods used, is also discussed. The more reliable and statistically acceptable Empirical Bayes methodology requires safety performance functions (SPFs), regression equations predicting the relation between crashes
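    For readers unfamiliar with the Empirical Bayes approach referenced here, the core computation blends the SPF prediction with the site's observed count. A hedged sketch using one common parameterization of the weighting (all values illustrative):

```python
# Hedged Empirical Bayes (EB) sketch in the spirit of SPF-based safety analysis.
def eb_expected_crashes(observed, spf_predicted, overdispersion_k):
    """Blend SPF prediction with site counts: w*mu + (1-w)*observed.

    One common parameterization: w = 1 / (1 + mu / k), where k is the
    negative-binomial overdispersion parameter of the SPF (assumed here).
    """
    w = 1.0 / (1.0 + spf_predicted / overdispersion_k)  # shrinkage weight
    return w * spf_predicted + (1.0 - w) * observed

# A site with 9 observed crashes where the SPF predicts 4.2 for similar sites:
print(eb_expected_crashes(observed=9, spf_predicted=4.2, overdispersion_k=2.0))
```

    The shrinkage toward the SPF prediction is what corrects the regression-to-the-mean bias of ranking sites by raw crash counts.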

  18. [Incorporation of the Hazard Analysis and Critical Control Point system (HACCP) in food legislation].

    PubMed

    Castellanos Rey, Liliana C; Villamil Jiménez, Luis C; Romero Prada, Jaime R

    2004-01-01

    The Hazard Analysis and Critical Control Point system (HACCP), recommended by international organizations such as the Codex Alimentarius Commission, the World Trade Organization (WTO), the International Office of Epizootics (OIE), and the International Plant Protection Convention (IPPC), among others, contributes to ensuring the safety of food along the agro-alimentary chain and requires Good Manufacturing Practices (GMP) for its implementation; GMPs are legislated in most countries. Since 1997, Colombia has set rules and legislation for the application of the HACCP system in agreement with international standards. This paper discusses the potential and difficulties of enforcing the legislation and suggests some policy implications for food safety. PMID:15656068

  19. Creation Of The Residual Stress By Influence Of Wear Of Cutting Tool And Their Analysis

    NASA Astrophysics Data System (ADS)

    Kordík, Marek; Čilliková, Mária; Mrazik, Jozef; Martinček, Juraj; Janota, Miroslav; Nicielnik, Henryk

    2015-12-01

    The aim of this paper is the analysis of a turned bearing ring made of material 14109 (DIN 100Cr6) without heat treatment. A mechanical destructive method was chosen for the analysis, which focused on the existence and character of residual stresses after the turning of the bearing ring with tools at different levels of wear. The experiment reveals the relationship between residual stress creation and cutting tool wear.

  20. Statistical analysis of surrogate signals to incorporate respiratory motion variability into radiotherapy treatment planning

    NASA Astrophysics Data System (ADS)

    Wilms, Matthias; Ehrhardt, Jan; Werner, René; Marx, Mirko; Handels, Heinz

    2014-03-01

    Respiratory motion and its variability lead to location uncertainties in radiation therapy (RT) of thoracic and abdominal tumors. Current approaches for motion compensation in RT are usually driven by respiratory surrogate signals, e.g., spirometry. In this contribution, we present an approach for statistical analysis, modeling and subsequent simulation of surrogate signals on a cycle-by-cycle basis. The simulated signals represent typical patient-specific variations of, e.g., breathing amplitude and cycle period. For the underlying statistical analysis, all breathing cycles of an observed signal are consistently parameterized using approximating B-spline curves. Statistics on breathing cycles are then performed by using the parameters of the B-spline approximations. Assuming that these parameters follow a multivariate Gaussian distribution, realistic time-continuous surrogate signals of arbitrary length can be generated and used to simulate the internal motion of tumors and organs based on a patient-specific diffeomorphic correspondence model. As an example, we show how this approach can be employed in RT treatment planning to calculate tumor appearance probabilities and to statistically assess the impact of respiratory motion and its variability on planned dose distributions.
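    A minimal sketch of the cycle-wise modeling idea, assuming a fixed knot vector so that every cycle yields a comparable coefficient vector; the signal, knots, and distribution are illustrative, not the authors' implementation:

```python
# Sketch: parameterize breathing cycles with fixed-knot B-splines, pool the
# coefficients, and sample new cycles from a multivariate Gaussian.
import numpy as np
from scipy.interpolate import make_lsq_spline, BSpline

rng = np.random.default_rng(1)
phase = np.linspace(0, 1, 50)                     # normalized cycle time
k = 3
t = np.concatenate(([0] * (k + 1), [0.25, 0.5, 0.75], [1] * (k + 1)))  # knots

# Synthetic observed cycles: noisy half-sine with amplitude variation.
cycles = [np.sin(np.pi * phase) * rng.normal(1.0, 0.1)
          + rng.normal(0, 0.02, phase.size) for _ in range(30)]

coeffs = np.array([make_lsq_spline(phase, c, t, k=k).c for c in cycles])
mean, cov = coeffs.mean(axis=0), np.cov(coeffs, rowvar=False)

# Simulate a new, statistically plausible cycle from the fitted Gaussian.
new_cycle = BSpline(t, rng.multivariate_normal(mean, cov), k)(phase)
print(new_cycle[:5])
```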

  1. Performance analysis of improved methodology for incorporation of spatial/spectral variability in synthetic hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Scanlan, Neil W.; Schott, John R.; Brown, Scott D.

    2003-12-01

    Synthetic imagery has traditionally been used to support sensor design by enabling design engineers to pre-evaluate image products during the design and development stages. Increasingly, exploitation analysts are looking to synthetic imagery as a way to develop and test exploitation algorithms before image data are available from new sensors. Even when sensors are available, synthetic imagery can significantly aid in algorithm development by providing a wide range of "ground truthed" images with varying illumination, atmospheric, viewing and scene conditions. One limitation of synthetic data is that the background variability is often too bland. It does not exhibit the spatial and spectral variability present in real data. In this work, four fundamentally different texture modeling algorithms will first be implemented as necessary into the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model environment. Two of the models to be tested are variants of a statistical Z-Score selection model, while the remaining two involve a texture synthesis and a spectral end-member fractional abundance map approach, respectively. A detailed comparative performance analysis of each model will then be carried out on several texturally significant regions of the resultant synthetic hyperspectral imagery. The quantitative assessment of each model will utilize a set of three performance metrics that have been derived from spatial Gray Level Co-Occurrence Matrix (GLCM) analysis, hyperspectral Signal-to-Clutter Ratio (SCR) measures, and a new concept termed the Spectral Co-Occurrence Matrix (SCM) metric which permits the simultaneous measurement of spatial and spectral texture. Previous research efforts on the validation and performance analysis of texture characterization models have been largely qualitative in nature based on conducting visual inspections of synthetic textures in order to judge the degree of similarity to the original sample texture imagery. The quantitative
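    One of the metric families cited, GLCM statistics, can be computed with scikit-image; a small sketch on a synthetic patch (function names per recent scikit-image releases):

```python
# Sketch: GLCM texture statistics on a synthetic quantized patch.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(2)
patch = rng.integers(0, 8, size=(64, 64), dtype=np.uint8)  # 8 gray levels

glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                    levels=8, symmetric=True, normed=True)
for prop in ("contrast", "homogeneity", "energy"):
    print(prop, graycoprops(glcm, prop).ravel())
```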

  2. Performance analysis of improved methodology for incorporation of spatial/spectral variability in synthetic hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Scanlan, Neil W.; Schott, John R.; Brown, Scott D.

    2004-01-01

    Synthetic imagery has traditionally been used to support sensor design by enabling design engineers to pre-evaluate image products during the design and development stages. Increasingly, exploitation analysts are looking to synthetic imagery as a way to develop and test exploitation algorithms before image data are available from new sensors. Even when sensors are available, synthetic imagery can significantly aid in algorithm development by providing a wide range of "ground truthed" images with varying illumination, atmospheric, viewing and scene conditions. One limitation of synthetic data is that the background variability is often too bland. It does not exhibit the spatial and spectral variability present in real data. In this work, four fundamentally different texture modeling algorithms will first be implemented as necessary into the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model environment. Two of the models to be tested are variants of a statistical Z-Score selection model, while the remaining two involve a texture synthesis and a spectral end-member fractional abundance map approach, respectively. A detailed comparative performance analysis of each model will then be carried out on several texturally significant regions of the resultant synthetic hyperspectral imagery. The quantitative assessment of each model will utilize a set of three performance metrics that have been derived from spatial Gray Level Co-Occurrence Matrix (GLCM) analysis, hyperspectral Signal-to-Clutter Ratio (SCR) measures, and a new concept termed the Spectral Co-Occurrence Matrix (SCM) metric which permits the simultaneous measurement of spatial and spectral texture. Previous research efforts on the validation and performance analysis of texture characterization models have been largely qualitative in nature based on conducting visual inspections of synthetic textures in order to judge the degree of similarity to the original sample texture imagery. The quantitative

  3. Multiscale Analysis of Surface Topography from Single Point Incremental Forming using an Acetal Tool

    NASA Astrophysics Data System (ADS)

    Ham, M.; Powers, B. M.; Loiselle, J.

    2014-03-01

    Single point incremental forming (SPIF) is a sheet metal manufacturing process that forms a part by incrementally applying point loads to the material to achieve the desired deformations and final part geometry. This paper investigates the differences in surface topography between a carbide tool and an acetal-tipped tool. Area-scale analysis is performed on the confocal areal surface measurements per ASME B46. The objective of this paper is to determine at which scales surfaces formed by two different tool materials can be differentiated. It is found that the surfaces in contact with the acetal forming tool have greater relative areas at all scales greater than 5 × 10^4 μm^2 than the surfaces in contact with the carbide tools. The surfaces not in contact with the tools during forming, also referred to as the free surface, are unaffected by the tool material.

  4. Enhanced Chemical Incident Response Plan (ECIRP). Appendix F, remediation analysis with Decision Support Tools (DSTs) for wide-area chemical hazards.

    SciTech Connect

    Hassig, Nancy L.; Pulsipher, Brent A.; Foltz, Greg W.; Hoette, Trisha Marie

    2011-07-01

    The Defense Threat Reduction Agency (DTRA) commissioned an assessment of the Consequence Management (CM) plans in place on military bases for response to a chemical attack. The effectiveness of the CM plans for recovering from chemical incidents was modeled using multiple Decision Support Tools (DSTs). First, a scenario was developed based on an aerial dispersion of a chemical agent over a wide area of land. The extent of contamination was modeled with the Hazard Prediction and Assessment Capability (HPAC) tool. Subsequently, the Analyzer for Wide Area Restoration Effectiveness (AWARE) tool was used to estimate the cost and time demands for remediation based on input of contamination maps, sampling and decontamination resources, strategies, rates, and costs. The sampling strategies incorporated in the calculation were designed using the Visual Sample Plan (VSP) tool. Based on a gaps assessment and the DST remediation analysis, an Enhanced Chemical Incident Response Plan (ECIRP) was developed.

  5. Parametric study on single shot peening by dimensional analysis method incorporated with finite element method

    NASA Astrophysics Data System (ADS)

    Wu, Xian-Qian; Wang, Xi; Wei, Yan-Peng; Song, Hong-Wei; Huang, Chen-Guang

    2012-06-01

    Shot peening is a widely used surface treatment method by generating compressive residual stress near the surface of metallic materials to increase fatigue life and resistance to corrosion fatigue, cracking, etc. Compressive residual stress and dent profile are important factors to evaluate the effectiveness of the shot peening process. In this paper, the influence of dimensionless parameters on maximum compressive residual stress and maximum depth of the dent was investigated. Firstly, dimensionless relations of processing parameters that affect the maximum compressive residual stress and the maximum depth of the dent were deduced by the dimensional analysis method. Secondly, the influence of each dimensionless parameter on dimensionless variables was investigated by the finite element method. Furthermore, related empirical formulas were given for each dimensionless parameter based on the simulation results. Finally, comparison was made and good agreement was found between the simulation results and the empirical formulas, which shows that a useful approach is provided in this paper for analyzing the influence of each individual parameter.

  6. Multi-scale finite element analysis of chloride diffusion in concrete incorporating paste/aggregate ITZs

    NASA Astrophysics Data System (ADS)

    Guo, Li; Guo, XiaoMing; Mi, ChangWen

    2012-09-01

    In this paper, we propose a concurrent multi-scale finite element (FE) model coupling the degrees of freedom of a meso-scale model of the interfacial transition zones (ITZs) and a macroscopic model of the bulk paste. The multi-scale model is subsequently implemented and integrated into ABAQUS, enabling easy application to complex concrete structures. A few benchmark numerical examples are performed to test both the accuracy and the efficiency of the developed model in analyzing chloride diffusion in concrete. These examples clearly demonstrate that the high diffusivity of ITZs, primarily because of their porous microstructure, tends to accelerate chloride penetration along the concentration gradient. The proposed model provides new guidelines for the durability analysis of concrete structures under adverse operating conditions.
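    The transport physics underlying the model is Fick's second law. As a point of reference, a one-dimensional explicit finite-difference sketch is given below (illustrative diffusivity and boundary conditions; the paper's coupled FE/ITZ treatment is not reproduced):

```python
# Minimal 1D chloride diffusion sketch (Fick's second law, explicit scheme).
import numpy as np

D = 5e-12              # apparent chloride diffusivity (m^2/s), illustrative
L, nx = 0.05, 101      # 50 mm cover depth, grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D   # stable step for the explicit scheme (<= 0.5*dx^2/D)

c = np.zeros(nx)       # normalized chloride concentration profile
c[0] = 1.0             # surface boundary condition

for _ in range(20000):                     # roughly a decade of exposure
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0], c[-1] = 1.0, c[-2]               # fixed surface, zero-flux far face

print(c[::20].round(3))                    # profile at a few depths
```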

  7. Incorporating biodiversity considerations into environmental impact analysis under the National Environmental Policy Act

    SciTech Connect

    Not Available

    1993-01-01

    The report presents the results of consultations by the Council on Environmental Quality (CEQ) concerning the consideration of biological diversity in analyses prepared under the National Environmental Policy Act (NEPA). The report is intended to provide background on the emerging, complex subject of biodiversity, outline some general concepts that underlie biological diversity analysis and management, describe how the issue is currently addressed in NEPA analyses, and provide options for agencies undertaking NEPA analyses that consider biodiversity. The report does not establish new requirements for such analyses. It is not, and should not be viewed as, formal CEQ guidance on the matter, nor are the recommendations in the report intended to be legally binding. The report does not mean to suggest the biodiversity analyses should be included in every NEPA document, without regard to the degree of potential impact on biodiversity of the action under review.

  8. SOLERAS - Solar Controlled Environment Agriculture Project. Final report, Volume 6. Science Applications, Incorporated system analysis

    SciTech Connect

    Not Available

    1985-01-01

    This report summarizes the results of the systems analysis task for the conceptual design of a commercial size, solar powered, controlled environment agriculture system. The baseline greenhouse system consists of a 5-hectare growing facility utilizing an innovative fluid roof filter concept to provide temperature and humidity control. Fresh water for the system is produced by means of a reverse osmosis desalination unit and energy is provided by means of a solar photovoltaic array in conjunction with storage batteries and a power conditioning unit. The greenhouse environment is controlled via circulation of brackish groundwater in a closed system, which permits water recovery during dehumidification as well as CO2 enrichment for increased crop productivity.

  9. Incorporating multiresolution analysis with multiclassifiers and decision fusion for hyperspectral remote sensing

    NASA Astrophysics Data System (ADS)

    West, Terrance R.

    The ongoing development and increased affordability of hyperspectral sensors are increasing their utilization in a variety of applications, such as agricultural monitoring and decision making. Hyperspectral Automated Target Recognition (ATR) systems typically rely heavily on dimensionality reduction methods, and particularly intelligent reduction methods referred to as feature extraction techniques. This dissertation reports on the development, implementation, and testing of new hyperspectral analysis techniques for ATR systems, including their use in agricultural applications where ground truthed observations available for training the ATR system are typically very limited. This dissertation reports the design of effective methods for grouping and down-selecting Discrete Wavelet Transform (DWT) coefficients and the design of automated Wavelet Packet Decomposition (WPD) filter tree pruning methods for use within the framework of a Multiclassifiers and Decision Fusion (MCDF) ATR system. The efficacy of the DWT MCDF and WPD MCDF systems is compared to existing ATR methods commonly used in hyperspectral remote sensing applications. The newly developed methods' sensitivity to operating conditions, such as mother wavelet selection, decomposition level, and quantity and quality of available training data, is also investigated. The newly developed ATR systems are applied to the problem of hyperspectral remote sensing of agricultural food crop contaminations either by airborne chemical application, specifically Glufosinate herbicide at varying concentrations applied to corn crops, or by biological infestation, specifically soybean rust disease in soybean crops. The DWT MCDF and WPD MCDF methods significantly outperform conventional hyperspectral ATR methods. For example, when detecting and classifying varying levels of soybean rust infestation, stepwise linear discriminant analysis results in accuracies of approximately 30%-40%, but WPD MCDF methods result in accuracies

  10. Powerful tool for design analysis of linear control systems

    SciTech Connect

    Maddux, Jr, A S

    1982-05-10

    The methods for designing linear controls for electronic or mechanical systems have been understood and put into practice. What has not been readily available to engineers, however, is a practical, quick and inexpensive method for analyzing these linear control (feedback) systems once they have been designed into the electronic or mechanical hardware. Now, the PET, manufactured by Commodore Business Machines (CBM), operating with several peripherals via the IEEE 488 Bus, brings to the engineer for about $4000 a complete set of office tools for analyzing these system designs.

  11. MultiAlign: a multiple LC-MS analysis tool for targeted omics analysis

    SciTech Connect

    Lamarche, Brian L.; Crowell, Kevin L.; Jaitly, Navdeep; Petyuk, Vladislav A.; Shah, Anuj R.; Polpitiya, Ashoka D.; Sandoval, John D.; Kiebel, Gary R.; Monroe, Matthew E.; Callister, Stephen J.; Metz, Thomas O.; Anderson, Gordon A.; Smith, Richard D.

    2013-02-12

    MultiAlign is a free software tool that aligns multiple liquid chromatography-mass spectrometry datasets to one another by clustering mass and LC elution features across datasets. Applicable to both label-free proteomics and metabolomics comparative analyses, the software can be operated in several modes. Clustered features can be matched to a reference database to identify analytes, used to generate abundance profiles, linked to tandem mass spectra based on parent precursor masses, and culled for targeted liquid chromatography-tandem mass spectrometric analysis. MultiAlign is also capable of tandem mass spectral clustering to describe proteome structure and find similarity in subsequent sample runs.
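    The cross-dataset feature clustering can be approximated, in spirit, by single-linkage clustering on tolerance-scaled (mass, elution time) coordinates; a toy sketch with synthetic features and assumed tolerances (not MultiAlign's actual algorithm):

```python
# Sketch: group LC-MS features across runs by mass/elution-time proximity.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Features from two runs: (monoisotopic mass in Da, normalized elution time).
feats = np.array([[1000.500, 0.30], [1000.502, 0.31],   # likely same analyte
                  [1200.700, 0.55], [1200.708, 0.70]])  # mass close, NET far

# Scale so ~1 unit equals the assumed tolerance (0.01 Da, 0.02 NET).
scaled = np.c_[feats[:, 0] / 0.01, feats[:, 1] / 0.02]
clusters = fcluster(linkage(scaled, method="single"), t=2.0,
                    criterion="distance")
print(clusters)   # features sharing a label are treated as the same analyte
```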

  12. An advanced image analysis tool for the quantification and characterization of breast cancer in microscopy images.

    PubMed

    Goudas, Theodosios; Maglogiannis, Ilias

    2015-03-01

    The paper presents an advanced image analysis tool for the accurate and fast characterization and quantification of cancerous and apoptotic cells in microscopy images. The proposed tool utilizes adaptive thresholding and a Support Vector Machines classifier. The segmentation results are enhanced through majority voting and a watershed technique, while an object labeling algorithm has been developed for the fast and accurate validation of the recognized cells. Expert pathologists evaluated the tool, and the reported results are satisfying and reproducible. PMID:25681102
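    A rough sketch of the two named stages, adaptive thresholding followed by an SVM over per-object features, using synthetic data; the feature set and training labels are placeholders, not the paper's:

```python
# Sketch: local thresholding to segment objects, then SVM classification.
import numpy as np
from skimage.filters import threshold_local
from skimage.measure import label, regionprops
from sklearn.svm import SVC

rng = np.random.default_rng(6)
img = rng.normal(0.2, 0.05, (128, 128))
img[40:60, 40:60] += 0.5                      # a bright synthetic "cell"

mask = img > threshold_local(img, block_size=35)
objs = regionprops(label(mask), intensity_image=img)
X = np.array([[o.area, o.mean_intensity] for o in objs])

# Train on invented labeled examples: [area, mean intensity] -> class.
X_train = np.array([[400, 0.7], [380, 0.72], [20, 0.3], [15, 0.28]])
y_train = np.array([1, 1, 0, 0])              # 1 = cell of interest, 0 = debris
clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)
print(clf.predict(X))
```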

  13. Incorporation of Socio-Economic Features' Ranking in Multicriteria Analysis Based on Ecosystem Services for Marine Protected Area Planning

    PubMed Central

    Portman, Michelle E.; Shabtay-Yanai, Ateret; Zanzuri, Asaf

    2016-01-01

    Developed decades ago for spatial choice problems related to zoning in the urban planning field, multicriteria analysis (MCA) has more recently been applied to environmental conflicts and presented in several documented cases for the creation of protected area management plans. Its application is considered here for the development of zoning as part of a proposed marine protected area management plan. The case study incorporates spatially explicit conservation features while considering stakeholder preferences, expert opinion and characteristics of data quality. It involves the weighting of criteria using a modified analytical hierarchy process. Experts ranked physical attributes which include socio-economically valued physical features. The parameters used for the ranking of (physical) attributes important for socio-economic reasons are derived from the field of ecosystem services assessment. Inclusion of these feature values results in protection that emphasizes those areas closest to shore, most likely because of accessibility and familiarity parameters and because of data biases. Therefore, other spatial conservation prioritization methods should be considered to supplement the MCA and efforts should be made to improve data about ecosystem service values farther from shore. Otherwise, the MCA method allows incorporation of expert and stakeholder preferences and ecosystem services values while maintaining the advantages of simplicity and clarity. PMID:27183224
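    The criteria-weighting step, an analytic hierarchy process, rests on the principal eigenvector of a pairwise comparison matrix; a standard textbook sketch with an invented 3-criteria judgment matrix (the paper's actual criteria and judgments are not reproduced):

```python
# Sketch: AHP priority weights from a reciprocal pairwise comparison matrix.
import numpy as np

# Example 3-criteria judgment matrix on the Saaty 1-9 scale (invented).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, i].real)
w /= w.sum()                                     # normalized priority weights

ci = (eigvals.real[i] - len(A)) / (len(A) - 1)   # consistency index
print("weights:", w.round(3), "CI:", round(ci, 3))
```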

  14. Construction and radiolabeling of adenovirus variants that incorporate human metallothionein into protein IX for analysis of biodistribution.

    PubMed

    Liu, Lei; Rogers, Buck E; Aladyshkina, Natalia; Cheng, Bing; Lokitz, Stephen J; Curiel, David T; Mathis, J Michael

    2014-01-01

    Using adenovirus (Ad)-based vectors is a promising strategy for novel cancer treatments; however, current tracking approaches in vivo are limited. The C-terminus of the Ad minor capsid protein IX (pIX) can incorporate heterologous reporters to monitor biodistribution. We incorporated metallothionein (MT), a low-molecular-weight metal-binding protein, as a fusion to pIX. We previously demonstrated 99mTc binding in vitro to a pIX-MT fusion on the Ad capsid. We investigated different fusions of MT within pIX to optimize functional display. We identified a dimeric MT construct fused to pIX that showed significantly increased radiolabeling capacity. After Ad radiolabeling, we characterized metal binding in vitro. We explored biodistribution in vivo in control mice, mice pretreated with warfarin, mice preimmunized with wild-type Ad, and mice that received both warfarin pretreatment and Ad preimmunization. Localization of activity to liver and bladder was seen, with activity detected in spleen, intestine, and kidneys. Afterwards, the mice were euthanized and selected organs were dissected for further analysis. Similar to the imaging results, most of the radioactivity was found in the liver, spleen, kidneys, and bladder, with significant differences between the groups observed in the liver. These results demonstrate this platform application for following Ad dissemination in vivo. PMID:25060486

  15. Incorporation of Socio-Economic Features' Ranking in Multicriteria Analysis Based on Ecosystem Services for Marine Protected Area Planning.

    PubMed

    Portman, Michelle E; Shabtay-Yanai, Ateret; Zanzuri, Asaf

    2016-01-01

    Developed decades ago for spatial choice problems related to zoning in the urban planning field, multicriteria analysis (MCA) has more recently been applied to environmental conflicts and presented in several documented cases for the creation of protected area management plans. Its application is considered here for the development of zoning as part of a proposed marine protected area management plan. The case study incorporates spatially explicit conservation features while considering stakeholder preferences, expert opinion and characteristics of data quality. It involves the weighting of criteria using a modified analytical hierarchy process. Experts ranked physical attributes which include socio-economically valued physical features. The parameters used for the ranking of (physical) attributes important for socio-economic reasons are derived from the field of ecosystem services assessment. Inclusion of these feature values results in protection that emphasizes those areas closest to shore, most likely because of accessibility and familiarity parameters and because of data biases. Therefore, other spatial conservation prioritization methods should be considered to supplement the MCA and efforts should be made to improve data about ecosystem service values farther from shore. Otherwise, the MCA method allows incorporation of expert and stakeholder preferences and ecosystem services values while maintaining the advantages of simplicity and clarity. PMID:27183224

  16. Incorporating Site Amplification into Seismic Hazard Analysis: A Fully Probabilistic Approach

    NASA Astrophysics Data System (ADS)

    Cramer, C. H.

    2001-12-01

    Developing site-specific amplification factors from geological, geophysical, and geotechnical information has been the state-of-practice for the last couple of decades. Now the state-of-the-art is to develop a distribution of possible site-specific amplification factors for a given input rock ground-motion. These state-of-the-art site-amplification distributions account for the uncertainty in soil properties and Vs structure at the site. Applying these site amplification distributions to a site-specific seismic hazard analysis requires a fully probabilistic approach. One such approach is to modify the generic ground-motion attenuation relations used in a probabilistic seismic hazard analysis to site-specific relations using a site amplification distribution developed for that site. The modification of the ground-motion attenuation relation is done prior to calculating probabilistic seismic hazard at the site. This approach has been implemented using the USGS National Seismic Hazard Mapping codes. Standard hazard models and hard-rock ground-motion attenuation relations are input into the modified codes along with a description of the site-specific amplification in the form of a lognormal probability-density-function (pdf). For each hard-rock ground-motion level, the pdf is specified by the median site-amplification factor and its natural-logarithmic standard deviation. The fully probabilistic ground-motion hazard curves are always above the hazard curve derived from multiplying the hard-rock hazard curve by the site's median site-amplification factors. At Paducah, Kentucky the difference is significant for 2%-in-50-year ground-motion estimates (0.7g vs. 0.5g for PGA and 1.3g vs. 0.9g for 1.0 s Sa). At Memphis, Tennessee the differences are less significant and may only be important at long periods (1.0 s and longer) on Mississippi flood-plain (lowlands) deposits (on the uplands deposits: 0.35g vs. 0.30g for PGA and 0.8g vs. 0.7g for 1.0 s Sa; on the lowlands
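    The fully probabilistic step amounts to convolving the rock hazard curve with the lognormal amplification pdf rather than scaling by the median factor; a simplified numerical sketch with invented hazard values and amplification parameters:

```python
# Sketch: combine a rock hazard curve with a lognormal site-amplification
# distribution (all numbers illustrative; a crude discretized hazard integral).
import numpy as np
from scipy.stats import lognorm

rock_pga = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.8])           # g
annual_exceed = np.array([2e-2, 8e-3, 2e-3, 8e-4, 2e-4, 5e-5])

# Occurrence rate of each rock-motion bin (differences of the curve).
occ = -np.diff(np.append(annual_exceed, 0.0))

med_amp, sigma_ln = 1.8, 0.3   # median amplification and ln-std (assumed)

def soil_exceedance(z):
    """Sum over rock bins of P(amp * x > z), weighted by bin occurrence."""
    return np.sum(occ * lognorm.sf(z / rock_pga, s=sigma_ln, scale=med_amp))

for z in (0.1, 0.3, 0.5, 0.7):
    print(z, f"{soil_exceedance(z):.2e}")
```

    Because the upper tail of the amplification distribution contributes to exceedance, the resulting soil hazard curve sits above the curve obtained by multiplying the rock hazard by the median amplification alone, consistent with the comparisons quoted above.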

  17. A comparative analysis of Patient-Reported Expanded Disability Status Scale tools

    PubMed Central

    Collins, Christian DE; Ivry, Ben; Bowen, James D; Cheng, Eric M; Dobson, Ruth; Goodin, Douglas S; Lechner-Scott, Jeannette; Kappos, Ludwig; Galea, Ian

    2015-01-01

    Background: Patient-Reported Expanded Disability Status Scale (PREDSS) tools are an attractive alternative to the Expanded Disability Status Scale (EDSS) during long term or geographically challenging studies, or in pressured clinical service environments. Objectives: Because the studies reporting these tools have used different metrics to compare the PREDSS and EDSS, we undertook an individual patient data level analysis of all available tools. Methods: Spearman’s rho and the Bland–Altman method were used to assess correlation and agreement respectively. Results: A systematic search for validated PREDSS tools covering the full EDSS range identified eight such tools. Individual patient data were available for five PREDSS tools. Excellent correlation was observed between EDSS and PREDSS with all tools. A higher level of agreement was observed with increasing levels of disability. In all tools, the 95% limits of agreement were greater than the minimum EDSS difference considered to be clinically significant. However, the intra-class coefficient was greater than that reported for EDSS raters of mixed seniority. The visual functional system was identified as the most significant predictor of the PREDSS–EDSS difference. Conclusion: This analysis will (1) enable researchers and service providers to make an informed choice of PREDSS tool, depending on their individual requirements, and (2) facilitate improvement of current PREDSS tools. PMID:26564998
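    The two comparison statistics used here are straightforward to reproduce; a sketch with synthetic EDSS/PREDSS pairs showing Spearman's rho and Bland-Altman limits of agreement:

```python
# Sketch: Spearman correlation and Bland-Altman agreement on synthetic scores.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
edss = rng.uniform(0, 9, 100).round(1)                    # clinician scores
predss = (edss + rng.normal(0, 0.7, 100)).clip(0, 10)     # self-reported

rho, p = spearmanr(edss, predss)

diff = predss - edss
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)   # 95% limits of agreement: bias +/- this

print(f"rho={rho:.2f} (p={p:.1e}), bias={bias:.2f}, "
      f"LoA=({bias - half_width:.2f}, {bias + half_width:.2f})")
```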

  18. Self-Gravity Analysis and Visualization Tool For LISA

    NASA Astrophysics Data System (ADS)

    Gopstein, Avi M.; Haile, William B.; Merkowitz, Stephen M.

    2006-11-01

    Self-gravity noise due to sciencecraft distortion and motion is expected to be a significant contributor to the LISA acceleration noise budget. To minimize these effects, the gravitational field at each proof mass must be kept as small, flat, and constant as possible. Most likely it will not be possible to directly verify that the LISA sciencecraft meets these requirements by measurements; they must be verified by models. The LISA Integrated Modeling team developed a new self-gravity tool that calculates the gravitational forces, moments, and gradients on the proof masses and creates a color coded map of the component contributions to the self-gravity field. The color mapping provides an easily recognized and intuitive interface for determining the self-gravity hot-spots of a spacecraft design. Self-gravity color maps can be generated as true representations of the steady-state, or as an approximation of the variability through computation of the difference values across multiple physical states. We present here an overview of the tool and the latest self-gravity results calculated using a recent design of LISA.

  19. Incorporating multiple mixed stocks in mixed stock analysis: 'many-to-many' analyses.

    PubMed

    Bolker, Benjamin M; Okuyama, Toshinori; Bjorndal, Karen A; Bolten, Alan B

    2007-02-01

    Traditional mixed stock analyses use morphological, chemical, or genetic markers measured in several source populations and in a single mixed population to estimate the proportional contribution of each source to the mixed population. In many systems, however, different individuals from a particular source population may go to a variety of mixed populations. Now that data are becoming available from (meta)populations with multiple mixed stocks, the need arises to estimate contributions in this 'many-to-many' scenario. We suggest a Bayesian hierarchical approach, an extension of previous Bayesian mixed stock analysis algorithms, that can estimate contributions in this case. Applying the method to mitochondrial DNA data from green turtles (Chelonia mydas) in the Atlantic gives results that are largely consistent with previous results but makes some novel points, e.g. that the Florida, Bahamas and Corisco Bay foraging grounds have greater contributions than previously thought from distant foraging grounds. More generally, the 'many-to-many' approach gives a more complete understanding of the spatial ecology of organisms, which is especially important in species such as the green turtle that exhibit weak migratory connectivity (several distinct subpopulations at one end of the migration that mix in unknown ways at the other end). PMID:17284204
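    The classical single-mixed-stock estimate that the 'many-to-many' model generalizes can be written as a small EM iteration over source contributions; a sketch with invented haplotype frequencies and counts (the hierarchical Bayesian machinery is not shown):

```python
# Sketch: EM estimate of source contributions to one mixed stock, given
# known source haplotype frequencies (all values invented).
import numpy as np

F = np.array([[0.70, 0.20, 0.10],     # source A haplotype frequencies
              [0.10, 0.60, 0.30]])    # source B haplotype frequencies
n = np.array([45, 35, 20])            # haplotype counts in the mixed stock

pi = np.full(F.shape[0], 1.0 / F.shape[0])    # initial contributions
for _ in range(200):
    resp = pi[:, None] * F                    # E-step: unnormalized posteriors
    resp /= resp.sum(axis=0, keepdims=True)
    pi = (resp * n).sum(axis=1) / n.sum()     # M-step: update contributions

print("estimated contributions:", pi.round(3))
```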

  20. Incorporating Fuzzy Systems Modeling and Possibility Theory in Hydrogeological Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Faybishenko, B.

    2008-12-01

    Hydrogeological predictions are subject to numerous uncertainties, including the development of conceptual, mathematical, and numerical models, as well as determination of their parameters. Stochastic simulations of hydrogeological systems and the associated uncertainty analysis are usually based on the assumption that the data characterizing spatial and temporal variations of hydrogeological processes are random, and the output uncertainty is quantified using a probability distribution. However, hydrogeological systems are often characterized by imprecise, vague, inconsistent, incomplete or subjective information. One of the modern approaches to modeling and uncertainty quantification of such systems is based on using a combination of statistical and fuzzy-logic uncertainty analyses. The aims of this presentation are to: (1) present evidence of fuzziness in developing conceptual hydrogeological models, and (2) give examples of the integration of the statistical and fuzzy-logic analyses in modeling and assessing both aleatoric uncertainties (e.g., caused by vagueness in assessing the subsurface system heterogeneities of fractured-porous media) and epistemic uncertainties (e.g., caused by the selection of different simulation models) involved in hydrogeological modeling. The author will discuss several case studies illustrating the application of fuzzy modeling for assessing the water balance and water travel time in unsaturated-saturated media. These examples will include the evaluation of associated uncertainties using the main concepts of possibility theory, a comparison between the uncertainty evaluation using probabilistic and possibility theories, and a transformation of the probabilities into possibilities distributions (and vice versa) for modeling hydrogeological processes.
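    One standard bridge between the two formalisms mentioned at the end, transforming a discrete probability distribution into a possibility distribution (in the Dubois-Prade style), is compact enough to sketch:

```python
# Sketch: probability-to-possibility transformation via tail sums over
# probabilities sorted in decreasing order (Dubois-Prade style).
import numpy as np

def prob_to_poss(p):
    """Return possibility degrees aligned with the input ordering."""
    p = np.asarray(p, dtype=float)
    order = np.argsort(p)[::-1]              # indices by decreasing probability
    tail = np.cumsum(p[order][::-1])[::-1]   # tail sums over the sorted probs
    poss = np.empty_like(p)
    poss[order] = tail                       # pi_i = sum of p_j with p_j <= p_i
    return poss

print(prob_to_poss([0.5, 0.3, 0.2]))   # -> [1.0, 0.5, 0.2]
```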

  1. Analysis of Factors for Incorporating User Preferences in Air Traffic Management: A system Perspective

    NASA Technical Reports Server (NTRS)

    Sheth, Kapil S.; Gutierrez-Nolasco, Sebastian

    2010-01-01

    This paper presents an analysis of factors that impact user flight schedules during air traffic congestion. In pre-departure flight planning, users file one route per flight, which often leads to increased delays, inefficient airspace utilization, and exclusion of user flight preferences. In this paper, first the idea of filing alternate routes and providing priorities on each of those routes is introduced. Then, the impact of varying planning interval and system imposed departure delay increment is discussed. The metrics of total delay and equity are used for analyzing the impact of these factors on increased traffic and on different users. The results are shown for four cases, with and without the optional routes and priority assignments. Results demonstrate that adding priorities to optional routes further improves system performance compared to filing one route per flight and using first-come first-served scheme. It was also observed that a two-hour planning interval with a five-minute system imposed departure delay increment results in highest delay reduction. The trend holds for a scenario with increased traffic.

  2. Core Curriculum Analysis: A Tool for Educational Design

    ERIC Educational Resources Information Center

    Levander, Lena M.; Mikkola, Minna

    2009-01-01

    This paper examines the outcome of a dimensional core curriculum analysis. The analysis process was an integral part of an educational development project, which aimed to compact and clarify the curricula of the degree programmes. The task was also in line with the harmonising of the degree structures as part of the Bologna process within higher…

  3. Teaching the Tools of Pharmaceutical Care Decision-Analysis.

    ERIC Educational Resources Information Center

    Rittenhouse, Brian E.

    1994-01-01

    A method of decision-analysis in pharmaceutical care that integrates epidemiology and economics is presented, including an example illustrating both the deceptive nature of medical decision making and the power of decision analysis. Principles in determining both general and specific probabilities of interest and use of decision trees for…

  4. Computationally Efficient Finite Element Analysis Method Incorporating Virtual Equivalent Projected Model For Metallic Sandwich Panels With Pyramidal Truss Cores

    SciTech Connect

    Seong, Dae-Yong; Jung, Chang Gyun; Yang, Dong-Yol

    2007-05-17

    Metallic sandwich panels composed of two face sheets and cores with low relative density have lightweight characteristics and various static and dynamic load bearing functions. To predict the forming characteristics, performance, and formability of these structured materials, full 3D modeling and analysis involving tremendous computational time and memory are required. Some constitutive continuum models including homogenization approaches to solve these problems have limitations with respect to the prediction of local buckling of face sheets and inner structures. In this work, a computationally efficient FE-analysis method incorporating a virtual equivalent projected model that enables the simulation of local buckling modes is newly introduced for analysis of metallic sandwich panels. Two-dimensional models using the projected shapes of 3D structures have the same equivalent elastic-plastic properties with original geometries that have anisotropic stiffness, yield strength, and hardening function. The sizes and isotropic properties of the virtual equivalent projected model have been estimated analytically with the same equivalent properties and face buckling strength of the full model. The 3-point bending processes with quasi-two-dimensional loads and boundary conditions are simulated to establish the validity of the proposed method. The deformed shapes and load-displacement curves of the virtual equivalent projected model are found to be almost the same as those of a full three-dimensional FE-analysis while reducing computational time drastically.

  5. Development of Advanced Continuum Models that Incorporate Nanomechanical Deformation into Engineering Analysis.

    SciTech Connect

    Zimmerman, Jonathan A.; Jones, Reese E.; Templeton, Jeremy Alan; McDowell, David L.; Mayeur, Jason R.; Tucker, Garritt J.; Bammann, Douglas J.; Gao, Huajian

    2008-09-01

    Materials with characteristic structures at nanoscale sizes exhibit significantly different mechanical responses from those predicted by conventional, macroscopic continuum theory. For example, nanocrystalline metals display an inverse Hall-Petch effect whereby the strength of the material decreases with decreasing grain size. The origin of this effect is believed to be a change in deformation mechanisms from dislocation motion across grains and pileup at grain boundaries at microscopic grain sizes to rotation of grains and deformation within grain boundary interface regions for nanostructured materials. These rotational defects are represented by the mathematical concept of disclinations. The ability to capture these effects within continuum theory, thereby connecting nanoscale materials phenomena and macroscale behavior, has eluded the research community. The goal of our project was to develop a consistent theory to model both the evolution of disclinations and their kinetics. Additionally, we sought to develop approaches to extract continuum mechanical information from nanoscale structure to verify any developed continuum theory that includes dislocation and disclination behavior. These approaches yield engineering-scale expressions to quantify elastic and inelastic deformation in all varieties of materials, even those that possess highly directional bonding within their molecular structures, such as liquid crystals, covalent ceramics, polymers and biological materials. This level of accuracy is critical for engineering design and thermo-mechanical analysis performed in micro- and nanosystems. The research proposed here innovates on how these nanoscale deformation mechanisms should be incorporated into a continuum mechanical formulation, and provides the foundation upon which to develop a means for predicting the performance of advanced engineering materials.

  6. Evaluation of Uncertainty in Runoff Analysis Incorporating Theory of Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshimi, Kazuhiro; Wang, Chao-Wen; Yamada, Tadashi

    2015-04-01

    The aim of this paper is to provide a theoretical framework for uncertainty estimation in rainfall-runoff analysis based on the theory of stochastic processes. SDEs (stochastic differential equations) based on this theory have been widely used in the field of mathematical finance to predict stock price movements. Meanwhile, some researchers in the field of civil engineering have applied this knowledge about SDEs (e.g. Kurino et al., 1999; Higashino and Kanda, 2001). However, there have been no studies evaluating uncertainty in the runoff phenomenon based on comparisons between SDEs and the Fokker-Planck equation. The Fokker-Planck equation is a partial differential equation that describes the temporal variation of a PDF (probability density function), and there is evidence to suggest that SDEs and Fokker-Planck equations are mathematically equivalent. In this paper, therefore, the effect of rainfall uncertainty on discharge uncertainty is explained theoretically and mathematically by introducing the theory of stochastic processes. The lumped rainfall-runoff model is represented by an SDE written in difference form, because the temporal variation of rainfall is expressed as its average plus a deviation, which is approximated by a Gaussian distribution; this is supported by rainfall observed by rain-gauge stations and radar rain-gauge systems. As a result, this paper has shown that it is possible to evaluate the uncertainty of discharge by using the relationship between SDEs and the Fokker-Planck equation. Moreover, the results of this study show that the uncertainty of discharge increases as rainfall intensity rises and the non-linearity of resistance grows strong. These results are clarified by PDFs (probability density functions) that satisfy the Fokker-Planck equation for discharge. This means the reasonable discharge can be
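    The SDE/Fokker-Planck correspondence invoked here can be illustrated with an Ornstein-Uhlenbeck process, whose Fokker-Planck equation has a Gaussian stationary solution with variance σ²/(2θ); a sketch comparing an Euler-Maruyama ensemble against that value (this is a generic illustration, not the authors' runoff model):

```python
# Sketch: simulate dX = -theta*X dt + sigma dW by Euler-Maruyama and compare
# the ensemble variance with the Fokker-Planck stationary value sigma^2/(2*theta).
import numpy as np

theta, sigma, dt, n_steps, n_paths = 1.0, 0.5, 1e-3, 20000, 2000
rng = np.random.default_rng(4)

x = np.zeros(n_paths)
for _ in range(n_steps):
    x += -theta * x * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)

print("empirical variance:", x.var().round(4))
print("Fokker-Planck stationary variance:", sigma**2 / (2 * theta))
```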

  7. The synthesis and enzymatic incorporation of sialic acid derivatives for use as tools to study the structure, activity, and inhibition of glycoproteins and other glycoconjugates.

    PubMed

    Martin, R; Witte, K L; Wong, C H

    1998-08-01

    Methods have been developed for the enzymatic synthesis of complex carbohydrates and glycoproteins containing, in the sialic acid moiety, the heavy metal mercury or a phosphonate transition-state analog of the reaction catalyzed by the influenza C 9-O-acetyl-neuraminic acid esterase. 5-Acetamido-3,5-dideoxy-9-methylphosphono-beta-D-glycero-D-galacto-nonulopyranosidonic acid (1), 5-acetamido-3,5-dideoxy-9-methylphosphono-2-propyl-alpha-D-glycero-D-galacto-nonulopyranosidonic acid triethylammonium salt (2), and 5-acetamido-9-thiomethylmercuric-3,5,9-trideoxy-beta-D-glycero-D-galacto-nonulopyranosidonic acid (3) were synthesized. Compounds 1 and 2 are proposed transition-state inhibitors of an esterase vital for the binding and infection of influenza C. Compound 3 was enzymatically incorporated into an oligosaccharide and a non-natural glycoprotein for use as an aid in the structure determination of these compounds by X-ray crystallography. PMID:9784869

  8. Meta-analysis of studies incorporating the interests of young children with autism spectrum disorders into early intervention practices.

    PubMed

    Dunst, Carl J; Trivette, Carol M; Hamby, Deborah W

    2012-01-01

    Incorporating the interests and preferences of young children with autism spectrum disorders into interventions to promote prosocial behavior and decrease behavior excesses has emerged as a promising practice for addressing the core features of autism. The efficacy of interest-based early intervention practices was examined in a meta-analysis of 24 studies including 78 children 2 to 6 years of age diagnosed with autism spectrum disorders. Effect size analyses of intervention versus nonintervention conditions and high-interest versus low-interest contrasts indicated that interest-based intervention practices were effective in increasing prosocial and decreasing aberrant child behavior. Additionally, interest-based interventions that focused on two of the three core features of autism spectrum disorders (poor communication, poor interpersonal relationships) were found most effective in influencing child outcomes. Implications for very early intervention are discussed in terms of addressing the behavior markers of autism spectrum disorders before they become firmly established. PMID:22934173
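
    For readers unfamiliar with the effect-size contrasts mentioned above, the following minimal sketch (Python; the numbers are invented for illustration and are unrelated to the studies in this meta-analysis) computes a standardized mean difference with the small-sample (Hedges) correction for a hypothetical high-interest versus low-interest contrast.

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference with the small-sample correction."""
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                    # Cohen's d with pooled SD
    j = 1 - 3 / (4 * (n1 + n2) - 9)       # Hedges' correction factor
    return j * d

# Hypothetical prosocial-behavior scores: high- vs low-interest condition.
print(round(hedges_g(m1=7.2, s1=1.5, n1=12, m2=5.1, s2=1.8, n2=12), 2))
```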

  9. Improve Radial Velocity Precision with Better Data Analysis Tools

    NASA Astrophysics Data System (ADS)

    Xuesong Wang, Sharon; Wright, Jason; Zhao, Ming

    2015-12-01

    The synergy between Kepler and the ground-based radial velocity (RV) surveys has enabled numerous discoveries of low-mass exoplanets, opening the age of Earth analogs. However, Earth analogs such as Kepler-452b require a much higher RV precision (~10 cm/s) than is achievable with current instruments (~1 m/s) and with our current understanding of the stellar photosphere. This presentation will cover some of the instrumental and data issues that currently hinder us from achieving sub-1 m/s precision, as well as remedies and ways forward with future RV instruments. Highlights of our work include: (1) how telluric contamination affects RV precision and how to "telluric-proof" a Doppler pipeline; (2) how errors in the deconvolved stellar reference spectrum can mimic the signal of a super-Earth on a ~1 year orbit; (3) the battle with imperfections in the iodine reference spectra and how an ultra-high resolution (R ~ 500,000) echelle spectrum can help; (4) and a new RV extraction code in Python which incorporates MCMC and Gaussian Processes. This research is based on radial velocity data taken with iodine cell calibrators using Keck/HIRES and HET/HRS.
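
    As a flavor of the Gaussian-process approach mentioned in item (4) (a sketch only: the kernel choice, data, and parameters below are synthetic assumptions, not the actual extraction code), the following computes a GP log-likelihood for RV residuals under a squared-exponential kernel, the kind of quantity an MCMC sampler such as emcee could explore.

```python
import numpy as np

def se_kernel(t1, t2, amp, tau):
    """Squared-exponential covariance between observation epochs [days]."""
    return amp**2 * np.exp(-0.5 * (t1[:, None] - t2[None, :])**2 / tau**2)

def gp_loglike(resid, t, err, amp, tau):
    """Log-likelihood of RV residuals under a GP noise model."""
    K = se_kernel(t, t, amp, tau) + np.diag(err**2)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, resid))
    return (-0.5 * resid @ alpha - np.log(np.diag(L)).sum()
            - 0.5 * t.size * np.log(2.0 * np.pi))

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 100.0, 40))  # synthetic observation epochs
err = np.full(t.size, 1.0)                # 1 m/s formal errors
resid = rng.normal(0.0, 1.0, t.size)      # residuals after a Keplerian fit
print(gp_loglike(resid, t, err, amp=2.0, tau=10.0))
```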

  10. Tool for Analysis and Reduction of Scientific Data

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    The Automated Scheduling and Planning Environment (ASPEN) computer program has been updated to version 3.0. ASPEN as a whole (up to version 2.0) has been summarized, and selected aspects of ASPEN have been discussed in several previous NASA Tech Briefs articles. Restated briefly, ASPEN is a modular, reconfigurable, application software framework for solving batch problems that involve reasoning about time, activities, states, and resources. Applications of ASPEN can include planning spacecraft missions, scheduling of personnel, and managing supply chains, inventories, and production lines. ASPEN 3.0 can be customized for a wide range of applications and for a variety of computing environments that include various central processing units and random-access memories. Domain-specific reasoning modules (e.g., modules for determining orbits for spacecraft) can easily be plugged into ASPEN 3.0. Improvements over other, similar software that have been incorporated into ASPEN 3.0 include a provision for more expressive time-line values, new parsing capabilities afforded by an ASPEN language based on Extensible Markup Language, improved search capabilities, and improved interfaces to other, utility-type software (notably including MATLAB).
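
    The time/activity/resource reasoning described above can be illustrated with a toy consistency check (a hypothetical data model, not ASPEN's actual API or modeling language): given a set of scheduled activities, verify that a shared resource is never over-subscribed.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    start: int      # start time step
    end: int        # end time step (exclusive)
    power: float    # resource draw while active [W]

def peak_usage(activities, horizon):
    """Maximum simultaneous resource draw over the planning horizon."""
    usage = [0.0] * horizon
    for a in activities:
        for t in range(a.start, a.end):
            usage[t] += a.power
    return max(usage)

plan = [Activity("downlink", 2, 6, 40.0), Activity("imaging", 4, 8, 35.0)]
print("conflict" if peak_usage(plan, 10) > 60.0 else "ok")  # 75 W > 60 W cap
```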

  11. Tools for the genetic analysis of germ cells.

    PubMed

    Hammond, Shirley S; Matin, Angabin

    2009-09-01

    Germ cells are essential for the propagation of individual species. Studies on germ cell development in mice highlight important biological paradigms. Beginning with their first appearance around embryonic day 7 (E7), germ cells undergo specific cellular changes at different stages of their embryonic and adult development. Germ cells migrate through the hind regions of the embryo to eventually home to the developing gonads. Further differentiation and development of germ cells differ in males and females. The processes involved in germ cell development and their eventual differentiation into sperm and oocytes have been under extensive investigation in recent years. Studies on germ cells have shed light on the cellular and molecular processes involved in their specification, migration, proliferation, death, and differentiation. These studies have also revealed much about the maintenance of stem cell populations and fertility. Here we review the genetic tools that are at present available to study germ cells in the mouse. PMID:19548313

  12. New tools for jet analysis in high energy collisions

    NASA Astrophysics Data System (ADS)

    Duffty, Daniel

    Our understanding of the fundamental interactions of particles has come far in the last century and is still pushing forward. As we build ever more powerful machines to probe higher and higher energies, we need to develop new tools not only to understand the new physics objects we are trying to detect, but also to understand the environment we are searching in. We examine methods of identifying both boosted objects and low-energy jets that are shrouded in a sea of noise from other parts of the detector. We demonstrate the power of boosted-b tagging in a simulated W search. We also examine the effect of pileup on low-energy jet reconstruction. For this purpose we develop a new priority-based jet algorithm, "p-jets", which clusters the energy that belongs together while ignoring the rest.
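
    To make the priority-based idea concrete, here is a toy sketch (explicitly not the author's p-jet algorithm; the seeding threshold, radius, and data are invented): deposits are visited in order of decreasing pT, the hardest unused deposit seeds a jet, unused deposits within a radius R are absorbed, and soft stragglers are left unclustered.

```python
import numpy as np

def priority_jets(pt, eta, phi, R=0.4, seed_min=5.0):
    """Cluster deposits greedily, highest-pT seeds first."""
    order = np.argsort(pt)[::-1]             # highest priority first
    used = np.zeros(pt.size, dtype=bool)
    jets = []
    for i in order:
        if used[i] or pt[i] < seed_min:
            continue                          # soft deposits never seed
        dphi = np.abs(phi - phi[i])
        dphi = np.minimum(dphi, 2 * np.pi - dphi)
        members = (np.hypot(eta - eta[i], dphi) < R) & ~used
        used |= members
        jets.append(pt[members].sum())
    return jets

rng = np.random.default_rng(2)
pt = rng.exponential(3.0, 200); pt[:2] += 50.0   # two hard deposits in noise
eta = rng.uniform(-2.5, 2.5, 200)
phi = rng.uniform(-np.pi, np.pi, 200)
print(priority_jets(pt, eta, phi))
```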

  13. Serial analysis of gene expression (SAGE): unraveling the bioinformatics tools.

    PubMed

    Tuteja, Renu; Tuteja, Narendra

    2004-08-01

    Serial analysis of gene expression (SAGE) is a powerful technique that can be used for global analysis of gene expression. Its chief advantage over other methods is that it does not require prior knowledge of the genes of interest and provides qualitative and quantitative data on potentially every transcribed sequence in a particular cell or tissue type. It is a technique of expression profiling that permits simultaneous, comparative and quantitative analysis of gene-specific, 9- to 13-basepair sequences. These short sequences, called SAGE tags, are linked together for efficient sequencing. The sequencing data are then analyzed to identify each gene expressed in the cell and the level at which each gene is expressed. The main benefits of SAGE include its digital output and the identification of novel genes. In this review, we present an outline of the method, various bioinformatics methods for data analysis, and general applications of this important technology. PMID:15273993
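
    A minimal sketch of the tag-counting idea (hypothetical reads; real SAGE pipelines also handle ditags, sequencing errors, and tag-to-gene mapping): extract the short tag downstream of the NlaIII anchoring-enzyme site (CATG) and tally tag frequencies as a proxy for transcript abundance.

```python
from collections import Counter

TAG_LEN = 10   # tag length downstream of CATG (within the 9-13 bp range)

def extract_tag(read):
    """Return the tag after the last CATG site, or None if absent/short."""
    pos = read.rfind("CATG")
    if pos == -1 or len(read) < pos + 4 + TAG_LEN:
        return None
    return read[pos + 4 : pos + 4 + TAG_LEN]

reads = [                       # made-up transcript fragments
    "TTACATGGCTAGCTAGGA",
    "GGGCATGGCTAGCTAGGATT",
    "AACCATGTTTTGGGGCCCAA",
]
counts = Counter(tag for tag in map(extract_tag, reads) if tag)
print(counts.most_common())     # tag abundance tracks expression level
```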

  14. Application of Surface Chemical Analysis Tools for Characterization of Nanoparticles

    SciTech Connect

    Baer, Donald R.; Gaspar, Daniel J.; Nachimuthu, Ponnusamy; Techane, Sirnegeda D.; Castner, David G.

    2010-02-01

    The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES); X-ray photoelectron spectroscopy (XPS); time of flight secondary ion mass spectrometry (TOF-SIMS); low energy ion scattering (LEIS); and scanning probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized. Examples describing the characterization of engineered nanoparticles are provided. Specific analysis considerations and issues associated with using surface analysis methods for the characterization of nanoparticles are discussed and summarized, along with the impact that shape instability, environmentally induced changes, deliberate and accidental coating, etc., have on nanoparticle properties.

  15. TOOLS FOR COMPARATIVE ANALYSIS OF ALTERNATIVES: COMPETING OR COMPLEMENTARY PERSPECTIVES?

    EPA Science Inventory

    A third generation of environmental policymaking and risk management will increasingly impose environmental measures, which may give rise to countervailing risks that must be analyzed. Therefore, a comprehensive analysis of these risks associated with the decision alternatives at hand will e...

  16. Application of Surface Chemical Analysis Tools for Characterization of Nanoparticles

    PubMed Central

    Baer, DR; Gaspar, DJ; Nachimuthu, P; Techane, SD; Castner, DG

    2010-01-01

    The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES); X-ray photoelectron spectroscopy (XPS); time of flight secondary ion mass spectrometry (TOF-SIMS); low energy ion scattering (LEIS); and scanning probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized. Examples describing the characterization of engineered nanoparticles are provided. Specific analysis considerations and issues associated with using surface analysis methods for the characterization of nanoparticles are discussed and summarized, along with the impact that shape instability, environmentally induced changes, deliberate and accidental coating, etc., have on nanoparticle properties. PMID:20052578

  17. Interfacing interactive data analysis tools with the grid: The PPDG CS-11 activity

    SciTech Connect

    Olson, Douglas L.; Perl, Joseph

    2002-10-09

    For today's physicists, who work in large geographically distributed collaborations, the data grid promises significantly greater capabilities for analysis of experimental data and production of physics results than is possible with today's ''remote access'' technologies. The goal of letting scientists at their home institutions interact with and analyze data as if they were physically present at the major laboratory that houses their detector and computer center has yet to be accomplished. The Particle Physics DataGrid project (www.ppdg.net) has recently embarked on an effort to ''Interface and Integrate Interactive Data Analysis Tools with the grid and identify Common Components and Services.'' The initial activities are to collect known and identify new requirements for grid services and analysis tools from a range of current and future experiments (ALICE, ATLAS, BaBar, D0, CMS, JLab, STAR, others welcome), and to determine whether existing plans for tools and services meet these requirements. Follow-on activities will foster the interaction between grid service developers, analysis tool developers, experiment analysis framework developers and end-user physicists, and will identify and carry out specific development/integration work so that interactive analysis tools utilizing grid services actually provide the capabilities that users need. This talk will summarize what we know of requirements for analysis tools and grid services, as well as describe the identified areas where more development work is needed.

  18. INTERFACING INTERACTIVE DATA ANALYSIS TOOLS WITH THE GRID: THE PPDG CS-11 ACTIVITY

    SciTech Connect

    Perl, Joseph

    2003-07-10

    For today's physicists, who work in large geographically distributed collaborations, the data grid promises significantly greater capabilities for analysis of experimental data and production of physics results than is possible with today's ''remote access'' technologies. The goal of letting scientists at their home institutions interact with and analyze data as if they were physically present at the major laboratory that houses their detector and computer center has yet to be accomplished. The Particle Physics Data Grid project (www.ppdg.net) has recently embarked on an effort to ''Interface and Integrate Interactive Data Analysis Tools with the grid and identify Common Components and Services''. The initial activities are to collect known and identify new requirements for grid services and analysis tools from a range of current and future experiments to determine if existing plans for tools and services meet these requirements. Follow-on activities will foster the interaction between grid service developers, analysis tool developers, experiment analysis framework developers and end user physicists, and will identify and carry out specific development/integration work so that interactive analysis tools utilizing grid services actually provide the capabilities that users need. This talk will summarize what we know of requirements for analysis tools and grid services, as well as describe the identified areas where more development work is needed.

  19. Analysis of the electromagnetic wave resistivity tool in deviated well drilling

    NASA Astrophysics Data System (ADS)

    Zhang, Yumei; Xu, Lijun; Cao, Zhang

    2014-04-01

    Electromagnetic wave resistivity (EWR) tools are used to provide real-time measurements of resistivity in the formation around the tool in Logging While Drilling (LWD). In this paper, the acquired resistivity information in the formation is analyzed to extract more information, including the dip angle and azimuth direction of the drill. A finite element (FE) model of an EWR tool working in layered earth formations is established. Numerical analysis and FE simulations are employed to analyze the amplitude ratio and phase difference between the voltages measured at the two receivers of the EWR tool in deviated well drilling.
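
    The two receiver-level quantities named above can be stated compactly: given complex voltage phasors at the near and far receivers, the tool forms an amplitude ratio and a phase difference, from which apparent resistivities are then inverted. The sketch below uses synthetic phasors (not a tool or formation model) to show the computation.

```python
import numpy as np

def ratio_and_phase(v_near: complex, v_far: complex):
    """Amplitude ratio and phase difference between receiver voltages."""
    amp_ratio = abs(v_far) / abs(v_near)
    phase_diff = np.angle(v_far) - np.angle(v_near)   # radians
    return amp_ratio, np.degrees(phase_diff)

v_near = 1.00 * np.exp(1j * np.deg2rad(30.0))  # synthetic phasors, arbitrary
v_far = 0.85 * np.exp(1j * np.deg2rad(42.5))
ar, dp = ratio_and_phase(v_near, v_far)
print(f"amplitude ratio {ar:.3f}, phase difference {dp:.1f} deg")
```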

  20. Upgrade of DRAMA-ESA's Space Debris Mitigation Analysis Tool Suite

    NASA Astrophysics Data System (ADS)

    Gelhaus, Johannes; Sanchez-Ortiz, Noelia; Braun, Vitali; Kebschull, Christopher; de Oliveira, Joaquim Correia; Dominguez-Gonzalez, Raul; Wiedemann, Carsten; Krag, Holger; Vorsmann, Peter

    2013-08-01

    One decade ago ESA started the development of the first version of the software tool called DRAMA (Debris Risk Assessment and Mitigation Analysis) to enable ESA space programs to assess their compliance with the recommendations in the European Code of Conduct for Space Debris Mitigation. This tool has been maintained, upgraded and extended in recent years and is now a combination of five individual tools, each addressing a different aspect of debris mitigation. This paper gives an overview of the new DRAMA software in general. The main tools ARES, OSCAR, MIDAS, CROC and SARA will be discussed, and the environment used by DRAMA will be briefly explained.