Sample records for comdes development toolset

  1. Reliability study on high power 638-nm triple emitter broad area laser diode

    NASA Astrophysics Data System (ADS)

    Yagi, T.; Kuramoto, K.; Kadoiwa, K.; Wakamatsu, R.; Miyashita, M.

    2016-03-01

    Reliabilities of the 638-nm triple emitter broad area laser diode (BA-LD) with the window-mirror structure were studied. A methodology to estimate mean time to failure (MTTF) due to catastrophic optical mirror degradation (COMD) within a reasonable aging duration was newly proposed. The power at which the LD failed due to COMD (PCOMD) was measured for LDs aged under several aging conditions. It was revealed that PCOMD was proportional to the logarithm of aging duration, and MTTF due to COMD (MTTF(COMD)) could be estimated by using this relation. MTTF(COMD) estimated by the methodology with an aging duration of approximately 2,000 hours was consistent with that estimated by long-term aging. By using this methodology, the MTTF of the BA-LD was estimated to exceed 100,000 hours at an output of 2.5 W and a duty cycle of 30%.
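
    A minimal sketch of the extrapolation implied by this abstract, assuming the COMD power degrades linearly in log-time and the device fails when it reaches the operating power (the functional form and the symbols a, b, and P_op are our reading of the abstract, not the paper's notation):

    $$P_{\mathrm{COMD}}(t) \approx a - b\,\ln t \quad\Rightarrow\quad \mathrm{MTTF}_{\mathrm{COMD}} \approx \exp\!\left(\frac{a - P_{\mathrm{op}}}{b}\right),$$

    where a and b are fitted from PCOMD measurements on devices aged for moderate durations (on the order of 2,000 hours) and P_op is the rated operating power.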

  2. Influence of O-methylated metabolite penetrating the blood-brain barrier to estimation of dopamine synthesis capacity in human L-[β-(11)C]DOPA PET.

    PubMed

    Matsubara, Keisuke; Ikoma, Yoko; Okada, Maki; Ibaraki, Masanobu; Suhara, Tetsuya; Kinoshita, Toshibumi; Ito, Hiroshi

    2014-02-01

    The O-methyl metabolite (L-[β-(11)C]OMD) of (11)C-labeled L-3,4-dihydroxyphenylalanine (L-[β-(11)C]DOPA) can penetrate brain tissue through the blood-brain barrier and can complicate the estimation of dopamine synthesis capacity in positron emission tomography (PET) studies with L-[β-(11)C]DOPA. We evaluated the impact of L-[β-(11)C]OMD on the estimation of dopamine synthesis capacity in a human L-[β-(11)C]DOPA PET study. Metabolite correction with mathematical modeling of L-[β-(11)C]OMD kinetics in a reference region without decarboxylation and further metabolism, proposed by a previous [(18)F]FDOPA PET study, was implemented to estimate the radioactivity of tissue L-[β-(11)C]OMD in 10 normal volunteers. The L-[β-(11)C]OMD component of the tissue time-activity curves (TACs) in 10 regions was subtracted using the estimated L-[β-(11)C]OMD radioactivity. To evaluate the influence of omitting blood sampling and metabolite correction, the relative dopamine synthesis rate (kref) was estimated by Gjedde-Patlak analysis with a reference tissue input function, and the net dopamine synthesis rate (Ki) was estimated by Gjedde-Patlak analysis with the arterial input function and TACs without and with metabolite correction. Overestimation of Ki was observed without metabolite correction. However, kref and Ki with metabolite correction were significantly correlated. These data suggest that the influence of L-[β-(11)C]OMD on the estimation of kref as dopamine synthesis capacity is minimal.
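
    For context, a hedged sketch of the standard Gjedde-Patlak graphical analysis referenced above (the generic form of the method, not the paper's specific implementation): for times after the reversible compartments have equilibrated,

    $$\frac{C_T(t)}{C_p(t)} \approx K_i\,\frac{\int_0^t C_p(\tau)\,d\tau}{C_p(t)} + V_0,$$

    where C_T is the tissue time-activity curve, C_p the input function (arterial plasma for Ki, reference tissue for kref), K_i the slope interpreted as the net uptake (synthesis) rate, and V_0 the intercept.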

  3. Responses to depressed mood and suicide attempt in young adults with a history of childhood-onset mood disorder.

    PubMed

    Liu, Xianchen; Gentzler, Amy L; George, Charles J; Kovacs, Maria

    2009-05-01

    Although individuals' responses to their depressed mood are hypothesized to play an important role in the development and maintenance of depression, how these responses might impact the likelihood of suicidal behavior in mood disorders remains largely unexplored. The goal of the current study was to examine whether maladaptive responses to depressed mood are associated with suicide attempts in adults with a history of childhood-onset mood disorder (COMD). Participants included 223 young adult probands with COMD meeting DSM-III or DSM-IV criteria for major depressive disorder or bipolar disorder and 112 controls without a history of psychiatric disorders. All participants were recruited between 1996 and 2004. Probands were followed for 6 to 99 months (median = 32 months). The Response Styles Questionnaire was used to assess 2 adaptive (distraction and problem solving) and 2 maladaptive (dangerous activity and rumination) ways of coping with depressed mood. Compared to controls, COMD probands scored significantly higher on maladaptive response styles and lower on adaptive styles. Compared to their COMD peers, probands with a history of suicide attempt were less likely to report using distracting activities to manage their depressed mood. However, COMD probands who engaged in dangerous activities in response to depressed mood were more likely to attempt suicide during the follow-up period (hazard ratio = 1.8, 95% CI = 1.2 to 2.8). One of the pathways to suicide attempt in mood disorders may involve maladaptive responses to depressed mood. The assessment of how depressed individuals manage their dysphoric moods, therefore, should be considered an important aspect of treatment and prevention of suicidal behavior. Copyright 2009 Physicians Postgraduate Press, Inc.

  4. Multi-spectral investigation of bulk and facet failures in high-power single emitters at 980 nm

    NASA Astrophysics Data System (ADS)

    Yanson, Dan; Levy, Moshe; Shamay, Moshe; Cohen, Shalom; Shkedy, Lior; Berk, Yuri; Tessler, Renana; Klumel, Genadi; Rappaport, Noam; Karni, Yoram

    2013-03-01

    Reliable single emitters delivering >10 W in the 9xx nm spectral range are common building blocks for fiber laser pumps. As facet passivation techniques can suppress or delay catastrophic optical mirror damage (COMD), extending emitter reliability into hundreds of thousands of hours, other, less dominant failure modes such as intra-chip catastrophic optical bulk damage (COBD) become apparent. Based on our failure statistics in high-current operation, only ~52% of all failures can be attributed to COMD. Imaging through a window opened in the metallization on the substrate (n) side of a p-side-down mounted emitter provides valuable insight into both COMD and COBD failure mechanisms. We developed a laser ablation process to define a window on the n-side of an InGaAs/AlGaAs 980 nm single emitter that is overlaid on the pumped 90 μm stripe on the p-side. The ablation process is compatible with chip wire-bonding, enabling the device to be operated at high currents with high injection uniformity. We analyzed both COMD- and COBD-failed emitters in the electroluminescence and mid-IR domains, supported by FIB/SEM observation. The ablated devices revealed branching dark line patterns, with the line origin either at the facet center (COMD case) or near the stripe edge away from the facet (COBD case). In both cases, the branching direction is always toward the rear facet (against the photon density gradient), with SEM images revealing a disordered active layer structure. Absorption levels between 0.22 eV and 0.55 eV were observed in disordered regions by FT-IR spectroscopy. Temperature mapping of a single emitter in the MWIR domain was performed using an InSb detector. We also report an electroluminescence study of a single emitter just before and after failure.

  5. Value of Responsive Launch Safety Toolsets

    NASA Astrophysics Data System (ADS)

    Devoid, Wayne E.

    2013-09-01

    This paper will discuss the advantages and disadvantages of all-in-one risk assessment toolsets as they are applied to a wide variety of orbital, suborbital, lander, and unmanned vehicles. Toolsets like APT's SafeLab and Horizon, which are designed from the ground up to address ever-changing vehicle and mission parameters, reduce additional software development costs for launch ranges and vehicle manufacturers.

  6. Virtual Beach Manager Toolset

    EPA Science Inventory

    The Virtual Beach Manager Toolset (VB) is a set of decision support software tools developed to help local beach managers decide when beaches should be closed due to predicted high levels of waterborne pathogens. The tools are being developed under the umbrella of...

  7. The Radiation, Interplanetary Shocks, and Coronal Sources (RISCS) Toolset

    NASA Technical Reports Server (NTRS)

    Zank, G. P.; Spann, James F.

    2014-01-01

    The goal of this project is to serve the needs of space system designers and operators by developing an interplanetary radiation environment model within 10 AU: the Radiation, Interplanetary Shocks, and Coronal Sources (RISCS) toolset. (1) The RISCS toolset will provide specific reference environments for space system designers and nowcasting and forecasting capabilities for space system operators; (2) we envision the RISCS toolset providing the spatial and temporal radiation environment external to the Earth's (and other planets') magnetospheres, as well as possessing the modularity to integrate separate applications (apps) that can map to specific magnetosphere locations and/or perform the subsequent radiation transport and dosimetry for a specific target.

  8. BPMN, Toolsets, and Methodology: A Case Study of Business Process Management in Higher Education

    NASA Astrophysics Data System (ADS)

    Barn, Balbir S.; Oussena, Samia

    This chapter describes ongoing action research that is exploring the use of BPMN and a specific toolset, Intalio Designer, to capture the “as is” essential process model of part of an overarching large business process within higher education. The chapter contends that understanding the efficacy of the BPMN notation and the notational elements to use is not enough. Instead, the effectiveness of a notation is determined by the notation itself, the toolset that is being used, and methodological considerations. The chapter presents some of the challenges faced in attempting to develop computation independent models in BPMN using toolsets such as Intalio Designer™.

  9. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset are together called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit-state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Throughout all these tasks we are testing Propel capabilities on customer applications.

  10. Regulation of Bacteriocin Production in Streptococcus mutans by the Quorum-Sensing System Required for Development of Genetic Competence

    PubMed Central

    van der Ploeg, Jan R.

    2005-01-01

    In Streptococcus mutans, competence for genetic transformation and biofilm formation are dependent on the two-component signal transduction system ComDE together with the inducer peptide pheromone competence-stimulating peptide (CSP) (encoded by comC). Here, it is shown that the same system is also required for expression of the nlmAB genes, which encode a two-peptide nonlantibiotic bacteriocin. Expression from a transcriptional nlmAB′-lacZ fusion was highest at high cell density and was increased up to 60-fold following addition of CSP, but it was abolished when the comDE genes were interrupted. Two more genes, encoding another putative bacteriocin and a putative bacteriocin immunity protein, were also regulated by this system. The regions upstream of these genes and of two further putative bacteriocin-encoding genes and a gene encoding a putative bacteriocin immunity protein contained a conserved 9-bp repeat element just upstream of the transcription start, which suggests that expression of these genes is also dependent on the ComCDE regulatory system. Mutations in the repeat element of the nlmAB promoter region led to a decrease in CSP-dependent expression of nlmAB′-lacZ. In agreement with these results, a comDE mutant and mutants unable to synthesize or export CSP did not produce bacteriocins. It is speculated that, at high cell density, bacteriocin production is induced to liberate DNA from competing streptococci. PMID:15937160

  11. Integrated Toolset for WSN Application Planning, Development, Commissioning and Maintenance: The WSN-DPCM ARTEMIS-JU Project.

    PubMed

    Antonopoulos, Christos; Asimogloy, Katerina; Chiti, Sarah; D'Onofrio, Luca; Gianfranceschi, Simone; He, Danping; Iodice, Antonio; Koubias, Stavros; Koulamas, Christos; Lavagno, Luciano; Lazarescu, Mihai T; Mujica, Gabriel; Papadopoulos, George; Portilla, Jorge; Redondo, Luis; Riccio, Daniele; Riesgo, Teresa; Rodriguez, Daniel; Ruello, Giuseppe; Samoladas, Vasilis; Stoyanova, Tsenka; Touliatos, Gerasimos; Valvo, Angela; Vlahoy, Georgia

    2016-06-02

    In this article we present the main results obtained in the ARTEMIS-JU WSN-DPCM project between October 2011 and September 2015. The first objective of the project was the development of an integrated toolset for wireless sensor network (WSN) application planning, development, commissioning and maintenance, which aims to support application domain experts with limited WSN expertise in efficiently developing WSN applications from planning to lifetime maintenance. The toolset is made of three main tools: one for planning, one for application development and simulation (which can include hardware nodes), and one for network commissioning and lifetime maintenance. The tools are integrated in a single platform which promotes software reuse by automatically selecting suitable library components for application synthesis and by abstracting the underlying architecture through the use of a middleware layer. The second objective of the project was to test the effectiveness of the toolset through the development of two case studies in different domains, one for detecting the occupancy state of parking lots and one for monitoring the air concentration of harmful gases near an industrial site.

  12. Integrated Toolset for WSN Application Planning, Development, Commissioning and Maintenance: The WSN-DPCM ARTEMIS-JU Project

    PubMed Central

    Antonopoulos, Christos; Asimogloy, Katerina; Chiti, Sarah; D’Onofrio, Luca; Gianfranceschi, Simone; He, Danping; Iodice, Antonio; Koubias, Stavros; Koulamas, Christos; Lavagno, Luciano; Lazarescu, Mihai T.; Mujica, Gabriel; Papadopoulos, George; Portilla, Jorge; Redondo, Luis; Riccio, Daniele; Riesgo, Teresa; Rodriguez, Daniel; Ruello, Giuseppe; Samoladas, Vasilis; Stoyanova, Tsenka; Touliatos, Gerasimos; Valvo, Angela; Vlahoy, Georgia

    2016-01-01

    In this article we present the main results obtained in the ARTEMIS-JU WSN-DPCM project between October 2011 and September 2015. The first objective of the project was the development of an integrated toolset for wireless sensor network (WSN) application planning, development, commissioning and maintenance, which aims to support application domain experts with limited WSN expertise in efficiently developing WSN applications from planning to lifetime maintenance. The toolset is made of three main tools: one for planning, one for application development and simulation (which can include hardware nodes), and one for network commissioning and lifetime maintenance. The tools are integrated in a single platform which promotes software reuse by automatically selecting suitable library components for application synthesis and by abstracting the underlying architecture through the use of a middleware layer. The second objective of the project was to test the effectiveness of the toolset through the development of two case studies in different domains, one for detecting the occupancy state of parking lots and one for monitoring the air concentration of harmful gases near an industrial site. PMID:27271622

  13. Helping Autism-Diagnosed Teenagers Navigate and Develop Socially Using E-Learning Based on Mobile Persuasion

    ERIC Educational Resources Information Center

    Ohrstrom, Peter

    2011-01-01

    The HANDS (Helping Autism-diagnosed teenagers Navigate and Develop Socially) research project involves the creation of an e-learning toolset that can be used to develop individualized tools to support the social development of teenagers with an autism diagnosis. The e-learning toolset is based on ideas from persuasive technology. This paper…

  14. Parallax visualization of full motion video using the Pursuer GUI

    NASA Astrophysics Data System (ADS)

    Mayhew, Christopher A.; Forgues, Mark B.

    2014-06-01

    In 2013, the authors reported to SPIE on the Phase 1 development of a Parallax Visualization (PV) plug-in toolset for Wide Area Motion Imaging (WAMI) data using the Pursuer Graphical User Interface (GUI) [1]. In addition to parallax visualization of WAMI data, the Phase 1 plug-in toolset also featured a limited ability to visualize Full Motion Video (FMV) data. The ability to visualize both WAMI and FMV data is a highly advantageous capability for an Electric Light Table (ELT) toolset. This paper reports on the Phase 2 development and addition of a full-featured FMV capability to the Pursuer WAMI PV plug-in.

  15. A CTE matched hard solder passively cooled laser diode package combined with nXLT facet passivation enables high power, high reliability operation

    NASA Astrophysics Data System (ADS)

    Hodges, Aaron; Wang, Jun; DeFranza, Mark; Liu, Xingsheng; Vivian, Bill; Johnson, Curt; Crump, Paul; Leisher, Paul; DeVito, Mark; Martinsen, Robert; Bell, Jacob

    2007-04-01

    A conductively cooled laser diode package design with hard AuSn solder and a CTE-matched submount is presented. We discuss how this platform eliminates the failure mechanisms associated with indium solder. We present the problem of catastrophic optical mirror damage (COMD) and show that nLight's nXLT™ facet passivation technology effectively eliminates facet-defect-initiated COMD as a failure mechanism for both single emitter and bar format laser diodes. By combining these technologies we have developed a product with high reliability at high powers, even at increased operating temperatures. We present early results from ongoing accelerated life testing of this configuration that suggest an 808 nm, 30% fill factor device will have an MTTF of more than 21 khrs at 60 W CW, 25 °C operating conditions and an MTTF of more than 6.4 khrs when operated under hard-pulsed (1 second on, 1 second off) conditions.

  16. Inheritance of astigmatism: evidence for a major autosomal dominant locus.

    PubMed Central

    Clementi, M; Angi, M; Forabosco, P; Di Gianantonio, E; Tenconi, R

    1998-01-01

    Although astigmatism is a frequent refractive error, its mode of inheritance remains uncertain. Complex segregation analysis was performed, by the POINTER and COMDS programs, with data from a geographically well-defined sample of 125 nuclear families of individuals affected by astigmatism. POINTER could not distinguish between alternative genetic models, and only the hypothesis of no familial transmission could be rejected. After inclusion of the severity parameter, COMDS results defined a genetic model for corneal astigmatism and provided evidence for single-major-locus inheritance. These results suggest that genetic linkage studies could be implemented and that they should be limited to multiplex families with severely affected individuals. PMID:9718344

  17. No association between oxytocin or prolactin gene variants and childhood-onset mood disorders

    PubMed Central

    Strauss, John S.; Freeman, Natalie L.; Shaikh, Sajid A.; Vetró, Ágnes; Kiss, Enikő; Kapornai, Krisztina; Daróczi, Gabriella; Rimay, Timea; Kothencné, Viola Osváth; Dombovári, Edit; Kaczvinszk, Emília; Tamás, Zsuzsa; Baji, Ildikó; Besny, Márta; Gádoros, Julia; DeLuca, Vincenzo; George, Charles J.; Dempster, Emma; Barr, Cathy L.; Kovacs, Maria; Kennedy, James L.

    2010-01-01

    Background Oxytocin (OXT) and prolactin (PRL) are neuropeptide hormones that interact with the serotonin system and are involved in the stress response and social affiliation. In human studies, serum OXT and PRL levels have been associated with depression and related phenotypes. Our purpose was to determine if single nucleotide polymorphisms (SNPs) at the loci for OXT, PRL and their receptors, OXTR and PRLR, were associated with childhood-onset mood disorders (COMD). Methods Using 678 families in a family-based association design, we genotyped sixteen SNPs at OXT, PRL, OXTR and PRLR to test for association with COMD. Results No significant associations were found for SNPs in the OXTR, PRL, or PRLR genes. Two of three SNPs 3' of the OXT gene were associated with COMD (p ≤ 0.02), significant after spectral decomposition, but were not significant after additionally correcting for the number of genes tested. Supplementary analyses of parent-of-origin and proband sex effects for OXT SNPs by Fisher’s Exact test were not significant after Bonferroni correction. Conclusions We have examined sixteen OXT and PRL system gene variants, with no evidence of statistically significant association after correction for multiple tests. PMID:20547007

  18. Utah State University: Cross-Discipline Training through the Graduate Studies Program in Auditory Learning & Spoken Language

    ERIC Educational Resources Information Center

    Houston, K. Todd

    2010-01-01

    Since 1946, Utah State University (USU) has offered specialized coursework in audiology and speech-language pathology, awarding the first graduate degrees in 1948. In 1965, the teacher training program in deaf education was launched. Over the years, the Department of Communicative Disorders and Deaf Education (COMD-DE) has developed a rich history…

  19. Microscopic calculations of heavy-residue formation in quasielastic and deep-inelastic collisions below the Fermi energy.

    NASA Astrophysics Data System (ADS)

    Souliotis, G. A.; Shetty, D. V.; Galanopoulos, S.; Yennello, S. J.

    2007-10-01

    During the last several years we have undertaken a systematic study of heavy residues formed in quasi-elastic and deep-inelastic collisions near and below the Fermi energy [1,2]. Presently, we are exploring the possibility of extracting information on the dynamics by comparing our heavy residue data to calculations using microscopic models based on the quantum molecular dynamics approach (QMD). We have performed detailed calculations of QMD type using the recent version of the constrained molecular dynamics code CoMD of M. Papa [3]. CoMD is especially designed for reactions near the Fermi energy. It implements an effective interaction with a nuclear-matter compressibility of K=200 (soft EOS) with several forms of the density dependence of the nucleon-nucleon symmetry potential. CoMD imposes a constraint on the phase-space occupation of each nucleon, thus restoring the Pauli principle at each time step of the collision. Results of the calculations and comparisons with our residue data will be presented and discussed in detail. [1] G.A. Souliotis et al., Phys. Rev. Lett. 91, 022701 (2003); Nucl. Instrum. Methods B 204, 166 (2003). [2] G.A. Souliotis et al., Phys. Lett. B 588, 35 (2004). [3] M. Papa et al., Phys. Rev. C 64, 024612 (2001).
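
    A hedged paraphrase of the phase-space constraint mentioned above, in the spirit of the CoMD model of Papa et al. [3] (our notation, not the code's): at every time step, the occupancy of a phase-space cell of volume h^3 around each nucleon i, summed over nucleons with the same spin and isospin,

    $$\bar{f}_i = \sum_{j:\; s_j = s_i,\ \tau_j = \tau_i} \int_{h^3} f_j(\mathbf{r},\mathbf{p})\, d^3r\, d^3p \;\le\; 1,$$

    is checked, and the momenta of nucleons violating the bound are redistributed through elastic two-body exchanges until the constraint is satisfied; this is how the Pauli principle is enforced during the collision.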

  20. Development and Transition of the Radiation, Interplanetary Shocks, and Coronal Sources (RISCS) Toolset

    NASA Technical Reports Server (NTRS)

    Spann, James F.; Zank, G.

    2014-01-01

    We outline a plan to develop and transition a physics-based predictive toolset called the Radiation, Interplanetary Shocks, and Coronal Sources (RISCS) toolset to describe the interplanetary energetic particle and radiation environment throughout the inner heliosphere, including at the Earth. To forecast and "nowcast" the radiation environment requires the fusing of three components: 1) the ability to provide probabilities for incipient solar activity; 2) the use of these probabilities and daily coronal and solar wind observations to model the 3D spatial and temporal heliosphere, including magnetic field structure and transients, within 10 Astronomical Units; and 3) the ability to model the acceleration and transport of energetic particles based on current and anticipated coronal and heliospheric conditions. We describe how to address 1) - 3) based on our existing, well developed, and validated codes and models. The goal of the RISCS toolset is to provide an operational forecast and "nowcast" capability that will a) predict solar energetic particle (SEP) intensities; b) spectra for protons and heavy ions; c) predict maximum energies and their duration; d) SEP composition; e) cosmic ray intensities; and f) plasma parameters, including shock arrival times, strength and obliquity at any given heliospheric location and time. The toolset would have a 72-hour predictive capability, with associated probabilistic bounds, that would be updated hourly thereafter to improve the predicted event(s) and reduce the associated probability bounds. The RISCS toolset would be highly adaptable and portable, capable of running on a variety of platforms to accommodate various operational needs and requirements. The described transition plan is based on a well-established approach developed in the Earth Science discipline that ensures that the customer has a tool that meets their needs.

  1. discovery toolset for Emulytics v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fritz, David; Crussell, Jonathan

    The discovery toolset for Emulytics enables the construction of high-fidelity emulation models of systems. The toolset consists of a set of tools and techniques to automatically go from network discovery of operational systems to emulating those complex systems. Our toolset combines data from host discovery and network mapping tools into an intermediate representation that can then be further refined. Once the intermediate representation reaches the desired state, our toolset supports emitting the Emulytics models with varying levels of specificity based on experiment needs.
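
    A hedged sketch of the kind of pipeline described above, from discovery records to an intermediate representation (IR) and then to an emitted model; the field names, IR structure, and emit format are illustrative assumptions, not the actual toolset's API.

    ```python
    # Sketch only: combine host-discovery and network-mapping data into an IR,
    # then emit an emulation model at a chosen level of specificity.
    from dataclasses import dataclass, field

    @dataclass
    class HostIR:
        address: str
        os_guess: str = "unknown"
        open_ports: list = field(default_factory=list)
        links: list = field(default_factory=list)  # neighboring addresses from the network map

    def build_ir(host_records, link_records):
        """Merge per-host discovery data with topology links into one IR dict."""
        ir = {h["addr"]: HostIR(h["addr"], h.get("os", "unknown"), h.get("ports", []))
              for h in host_records}
        for a, b in link_records:
            if a in ir:
                ir[a].links.append(b)
        return ir

    def emit_model(ir, detail="low"):
        """Emit a minimal emulation spec; higher detail keeps more per-host attributes."""
        nodes = []
        for host in ir.values():
            node = {"addr": host.address}
            if detail != "low":
                node.update({"os": host.os_guess, "ports": host.open_ports})
            nodes.append(node)
        edges = [(h.address, n) for h in ir.values() for n in h.links]
        return {"nodes": nodes, "edges": edges}
    ```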

  2. The Radiation, Interplanetary Shocks, and Coronal Sources (RISCS) Toolset

    NASA Technical Reports Server (NTRS)

    Zank, G. P.; Spann, J.

    2014-01-01

    We outline a plan to develop a physics-based predictive toolset, RISCS, to describe the interplanetary energetic particle and radiation environment throughout the inner heliosphere, including at the Earth. To forecast and "nowcast" the radiation environment requires the fusing of three components: 1) the ability to provide probabilities for incipient solar activity; 2) the use of these probabilities and daily coronal and solar wind observations to model the 3D spatial and temporal heliosphere, including magnetic field structure and transients, within 10 AU; and 3) the ability to model the acceleration and transport of energetic particles based on current and anticipated coronal and heliospheric conditions. We describe how to address 1) - 3) based on our existing, well developed, and validated codes and models. The goal of the RISCS toolset is to provide an operational forecast and "nowcast" capability that will a) predict solar energetic particle (SEP) intensities; b) spectra for protons and heavy ions; c) predict maximum energies and their duration; d) SEP composition; e) cosmic ray intensities; and f) plasma parameters, including shock arrival times, strength and obliquity at any given heliospheric location and time. The toolset would have a 72-hour predictive capability, with associated probabilistic bounds, that would be updated hourly thereafter to improve the predicted event(s) and reduce the associated probability bounds. The RISCS toolset would be highly adaptable and portable, capable of running on a variety of platforms to accommodate various operational needs and requirements.

  3. Netbook - A Toolset in Support of a Collaborative Learning.

    DTIC Science & Technology

    1997-01-31

    Netbook is a software development research project being conducted for the DARPA Computer Aided Training Initiative (CEATI). As a part of the Smart Navigators to Access and Integrated Resources (SNAIR) division of CEATI, Netbook concerns itself with the management of Internet resources. More specifically, Netbook is a toolset that enables students, teachers, and administrators to navigate the World Wide Web, collect resources found there, index

  4. Netbook - A Toolset in Support of a Collaborative Learning.

    DTIC Science & Technology

    1997-01-30

    As part of its collaborative efforts on the project Netbook - A Toolset in Support of a Collaborative and Cooperative Learning Environment, the Interactive Multimedia Group (IMG) at Cornell University conducted a usability test of the latest version of Netbook, developed by Odyssey Research Associates (ORA) in Ithaca, New York. Cornell's goal was to test the concepts and current functionality of the Netbook software, which is designed to help

  5. Netbook - A Toolset in Support of a Collaborative and Cooperative Learning Environment.

    DTIC Science & Technology

    1996-04-26

    Netbook is a software development/research project being conducted for the DARPA computer aided training initiative (CEATI). As a part of the SNAIR division of CEATI, Netbook concerns itself with the management of Internet resources. More specifically, Netbook is a toolset that allows students ... a meaningful way. In addition, Netbook provides the capacity for communication with peers and teachers, enabling students to collaborate while engaged

  6. Toolsets for Airborne Data (TAD): Customized Data Merging Function

    NASA Astrophysics Data System (ADS)

    Benson, A.; Peeters, M. C.; Perez, J.; Parker, L.; Chen, G.

    2013-12-01

    NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, including a wide range of trace gas and aerosol properties. The ASDC Toolset for Airborne Data (TAD) is being designed to meet the user community's needs for manipulating aircraft data for scientific research on climate change and air quality relevant issues. Prior to the actual toolset development, a comprehensive metadata database was created to compensate for the absence of standardization in the ICARTT data format in which the data are stored. This database tracks the Principal Investigator-provided metadata and links the measurement variables to a common naming system that was developed as part of this project. The database is used by the data merging module. Most aircraft data reported during a single flight are not on a consistent time base and are difficult to intercompare. The merging module provides the user with the ability to merge original data measurements from multiple data providers onto a specified time interval or common time base. The database development, the common naming scheme, and the data merge module development will be presented.
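
    A hedged sketch of the kind of time-base merge the TAD module performs (illustrative only; the variable names, the binning rule, and simple averaging are assumptions, not the ASDC implementation).

    ```python
    import numpy as np

    def merge_to_common_time(datasets, t_start, t_end, dt):
        """Average each provider's time series onto common dt-second bins.

        datasets: dict mapping variable name -> (time_seconds, values) NumPy arrays.
        Returns bin mid-times and a dict of bin-averaged values (NaN where empty).
        """
        edges = np.arange(t_start, t_end + dt, dt)
        mids = 0.5 * (edges[:-1] + edges[1:])
        merged = {}
        for name, (t, v) in datasets.items():
            idx = np.digitize(t, edges) - 1          # bin index for each sample
            out = np.full(len(mids), np.nan)
            for b in range(len(mids)):
                sel = v[idx == b]
                if sel.size:
                    out[b] = sel.mean()              # simple bin average
            merged[name] = out
        return mids, merged
    ```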

  7. CoMD Implementation Suite in Emerging Programming Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haque, Riyaz; Reeve, Sam; Juallmes, Luc

    CoMD-Em is a software implementation suite of the CoMD [4] proxy app using different emerging programming models. It is intended to analyze the features and capabilities of novel programming models that could help ensure code and performance portability and scalability across heterogeneous platforms while improving programmer productivity. Another goal is to provide the authors and vendors with meaningful feedback regarding the capabilities and limitations of their models. The actual application is a classical molecular dynamics (MD) simulation using either the Lennard-Jones (LJ) method or the embedded atom method (EAM) for the primary particle interaction. The code can be extended to support alternate interaction models. The code is expected to run on a wide class of heterogeneous hardware configurations, including shared/distributed/hybrid memory, GPUs, and any other platform supported by the underlying programming model.
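
    For reference, a minimal sketch of the Lennard-Jones pair interaction that CoMD-style proxies evaluate in their inner loop (plain NumPy; cutoff handling, periodic boundaries, and neighbor lists are omitted, and the parameters are generic, not CoMD's).

    ```python
    import numpy as np

    def lj_energy_forces(pos, epsilon=1.0, sigma=1.0):
        """Total Lennard-Jones energy and per-particle forces for an open (non-periodic) box."""
        n = len(pos)
        forces = np.zeros_like(pos)
        energy = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                rij = pos[i] - pos[j]
                r2 = np.dot(rij, rij)
                inv6 = (sigma * sigma / r2) ** 3          # (sigma/r)^6
                energy += 4.0 * epsilon * (inv6 * inv6 - inv6)
                # -dU/dr expressed as a scalar factor multiplying the vector rij
                f = 24.0 * epsilon * (2.0 * inv6 * inv6 - inv6) / r2
                forces[i] += f * rij
                forces[j] -= f * rij
        return energy, forces
    ```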

  8. Multiple two-component systems modulate alkali generation in Streptococcus gordonii in response to environmental stresses.

    PubMed

    Liu, Yaling; Burne, Robert A

    2009-12-01

    The oral commensal Streptococcus gordonii must adapt to constantly fluctuating and often hostile environmental conditions to persist in the oral cavity. The arginine deiminase system (ADS) of S. gordonii enables cells to produce ornithine, ammonia, CO(2), and ATP from arginine hydrolysis, augmenting the acid tolerance of the organism. The ADS genes are substrate inducible and sensitive to catabolite repression, mediated through ArcR and CcpA, respectively, but the system also requires low pH and anaerobic conditions for optimal activation. Here, we demonstrate that the CiaRH and ComDE two-component systems (TCS) are required for low-pH-dependent expression of ADS genes in S. gordonii. Further, the VicRK TCS is required for optimal ADS gene expression under anaerobic conditions and enhances the sensitivity of the operon to repression by oxygen. The known anaerobic activator of the ADS, the Fnr-like protein (Flp), appeared to act independently of the Vic TCS. Mutants of S. gordonii lacking components of CiaRH, ComDE, or VicRK grew more slowly in acidified media and were more sensitive to killing at lethal pH values and to agents that induce oxidative stress. This study provides the first evidence that TCS can regulate the ADS of bacteria in response to specific environmental signals and reveals some notable differences in the contribution of CiaRH, ComDE, and VicRK to viability and stress tolerance between the oral commensal S. gordonii and the oral pathogen Streptococcus mutans.

  9. Customer-experienced rapid prototyping

    NASA Astrophysics Data System (ADS)

    Zhang, Lijuan; Zhang, Fu; Li, Anbo

    2008-12-01

    In order to describe perfect GIS requirements accurately and comprehend them quickly, this article integrates the ideas of QFD (Quality Function Deployment) and UML (Unified Modeling Language), analyzes the deficiencies of the prototype development model, and proposes the idea of Customer-Experienced Rapid Prototyping (CE-RP), describing in detail the process and framework of CE-RP from the perspective of the characteristics of modern GIS. CE-RP is mainly composed of Customer Tool-Sets (CTS), Developer Tool-Sets (DTS) and a Barrier-Free Semantic Interpreter (BF-SI), and is performed by two roles, customer and developer. The main purpose of CE-RP is to produce unified and authorized requirements data models shared between the customer and the software developer.

  10. Multi Sector Planning Tools for Trajectory-Based Operations

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas; Mainini, Matthew; Brasil, Connie

    2010-01-01

    This paper discusses a suite of multi sector planning tools for trajectory-based operations that were developed and evaluated in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center. The toolset included tools for traffic load and complexity assessment as well as trajectory planning and coordination. The situation assessment tools included an integrated suite of interactive traffic displays, load tables, load graphs, and dynamic aircraft filters. The planning toolset allowed for single and multi aircraft trajectory planning and data communication-based coordination of trajectories between operators. Also newly introduced was a real-time computation of sector complexity into the toolset that operators could use in lieu of aircraft count to better estimate and manage sector workload, especially in situations with convective weather. The tools were used during a joint NASA/FAA multi sector planner simulation in the AOL in 2009 that had multiple objectives with the assessment of the effectiveness of the tools being one of them. Current air traffic control operators who were experienced as area supervisors and traffic management coordinators used the tools throughout the simulation and provided their usefulness and usability ratings in post simulation questionnaires. This paper presents these subjective assessments as well as the actual usage data that was collected during the simulation. The toolset was rated very useful and usable overall. Many elements received high scores by the operators and were used frequently and successfully. Other functions were not used at all, but various requests for new functions and capabilities were received that could be added to the toolset.

  11. The Electric Propulsion Interactions Code (EPIC): A Member of the NASA Space Environment and Effects Program (SEE) Toolset

    NASA Technical Reports Server (NTRS)

    Mikellides, Ioannis G.; Mandell, Myron J.; Kuharski, Robert A.; Davis, D. A.; Gardner, Barbara M.; Minor, Jody

    2003-01-01

    Science Applications International Corporation is currently developing the Electric Propulsion Interactions Code, EPIC, as part of a project sponsored by the Space Environments and Effects Program at NASA Marshall Space Flight Center. Now in its second year of development, EPIC is an interactive computer toolset that allows the construction of a 3-D spacecraft model and the assessment of a variety of interactions between its subsystems and the plume from an electric thruster. This paper reports on the progress of EPIC, including the recently added ability to exchange results with the NASA Charging Analyzer Program, Nascap-2k. This capability greatly enhances EPIC's range of applicability. Expansion of the toolset's various physics models proceeds in parallel with the overall development of the software. Also presented are recent upgrades of the elastic scattering algorithm in the electric propulsion Plume Tool. These upgrades are motivated by the need to assess the effects of elastically scattered ions on the spacecraft for ion beam energies that exceed 1000 eV. Such energy levels are expected in future high-power (>10 kW) ion propulsion systems powered by nuclear sources.

  12. Fast-Time Evaluations of Airborne Merging and Spacing in Terminal Arrival Operations

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, Karthik; Barmore, Bryan; Bussink, Frank; Weitz, Lesley; Dahlene, Laura

    2005-01-01

    NASA researchers are developing new airborne technologies and procedures to increase runway throughput at capacity-constrained airports by improving the precision of inter-arrival spacing at the runway threshold. In this new operational concept, pilots of equipped aircraft are cleared to adjust aircraft speed to achieve a designated spacing interval at the runway threshold, relative to a designated lead aircraft. A new airborne toolset, prototypes of which are being developed at the NASA Langley Research Center, assists pilots in achieving this objective. The current prototype allows precision spacing operations to commence even when the aircraft and its lead are not yet in-trail, but are on merging arrival routes to the runway. A series of fast-time evaluations of the new toolset were conducted at the Langley Research Center during the summer of 2004. The study assessed toolset performance in a mixed fleet of aircraft on three merging arrival streams under a range of operating conditions. The results of the study indicate that the prototype possesses a high degree of robustness to moderate variations in operating conditions.

  13. Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows

    NASA Astrophysics Data System (ADS)

    Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.

    2014-12-01

    The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.

  14. Statistical against dynamical PLF fission as seen by the IMF-IMF correlation functions and comparisons with CoMD model

    NASA Astrophysics Data System (ADS)

    Pagano, E. V.; Acosta, L.; Auditore, L.; Cap, T.; Cardella, G.; Colonna, M.; De Filippo, E.; Geraci, E.; Gnoffo, B.; Lanzalone, G.; Maiolino, C.; Martorana, N.; Pagano, A.; Papa, M.; Piasecki, E.; Pirrone, S.; Politi, G.; Porto, F.; Quattrocchi, L.; Rizzo, F.; Russotto, P.; Trifiro’, A.; Trimarchi, M.; Siwek-Wilczynska, K.

    2018-05-01

    In nuclear reactions at Fermi energies, two- and multi-particle intensity interferometry correlation methods are powerful tools for pinning down the characteristic time scale of emission processes. In this paper we summarize an improved application of the fragment-fragment correlation function to the specific physics case of a heavy projectile-like fragment (PLF) undergoing binary massive splitting into two intermediate mass fragments (IMFs). Results are shown for the reverse kinematics reaction 124Sn + 64Ni at 35 AMeV, which has been investigated using the forward part of the CHIMERA multi-detector. The analysis was performed as a function of the charge asymmetry of the observed IMF pairs. We show a coexistence of dynamical and statistical components as a function of the charge asymmetry. Transport CoMD simulations are compared with the data in order to pin down the timescale of fragment production and the relevant ingredients of the in-medium effective interaction used in the transport calculations.
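
    A hedged sketch of the standard two-fragment correlation function underlying such analyses (the generic definition, not necessarily the exact estimator used in this paper):

    $$1 + R(v_{\mathrm{red}}) = C\,\frac{Y_{\mathrm{coinc}}(v_{\mathrm{red}})}{Y_{\mathrm{mix}}(v_{\mathrm{red}})}, \qquad v_{\mathrm{red}} = \frac{|\vec{v}_1 - \vec{v}_2|}{\sqrt{Z_1 + Z_2}},$$

    where Y_coinc counts IMF pairs from the same event, Y_mix counts pairs built by mixing different events, and C normalizes R to zero at large reduced velocity; the depth and width of the suppression at small v_red carry the sensitivity to the emission time scale that the abstract refers to.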

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  16. ECLIPSE, an Emerging Standardized Modular, Secure and Affordable Software Toolset in Support of Product Assurance, Quality Assurance and Project Management for the Entire European Space Industry (from Innovative SMEs to Primes and Institutions)

    NASA Astrophysics Data System (ADS)

    Bennetti, Andrea; Ansari, Salim; Dewhirst, Tori; Catanese, Giuseppe

    2010-08-01

    The development of satellites and ground systems (and the technologies that support them) is complex and demands a great deal of rigor in the management of both the information it relies upon and the information it generates via the performance of well-established processes. To this end, for the past fifteen years Sapienza Consulting has been supporting the European Space Agency (ESA) in the management of this information and has provided ESA with ECSS (European Cooperation for Space Standardization) Standards-based Project Management (PM), Product Assurance (PA) and Quality Assurance (QA) software applications. In 2009 Sapienza recognised the need to modernize, standardize and integrate its core ECSS-based software tools into a single yet modularised suite of applications named ECLIPSE, aimed at: • Fulfilling a wider range of historical and emerging requirements, • Providing a better experience for users, • Increasing the value of the information it collects and manages, • Lowering the cost of ownership and operation, • Increasing collaboration within and between space sector organizations, • Aiding in the performance of several PM, PA, QA, and configuration management tasks in adherence to ECSS standards. In this paper, Sapienza will first present the toolset and a rationale for its development, describing and justifying its architecture and basic module composition. Having defined the toolset architecture, this paper will address the current status of the individual applications. A compliance assessment will be presented for each module in the toolset with respect to the ECSS standard it addresses. Lastly, experience from early industry and institutional users will be presented.

  17. Toolsets for Airborne Data (TAD): Enhanced Airborne Data Merging Functionality through Spatial and Temporal Subsetting

    NASA Astrophysics Data System (ADS)

    Early, A. B.; Chen, G.; Beach, A. L., III; Northup, E. A.

    2016-12-01

    NASA has conducted airborne tropospheric chemistry studies for over three decades. These field campaigns have generated a great wealth of observations, including a wide range of trace gas and aerosol properties. The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center in Hampton, Virginia originally developed the Toolsets for Airborne Data (TAD) web application in September 2013 to meet the user community's needs for manipulating aircraft data for scientific research on climate change and air quality relevant issues. The analysis of airborne data typically requires data subsetting, which can be challenging and resource intensive for end users. In an effort to streamline this process, the TAD toolset enhancements will include new data subsetting features and updates to the current database model. These will include two subsetters: a temporal and spatial subsetter, and a vertical profile subsetter. The temporal and spatial subsetter will allow users to focus on data from a specific location and/or time period. The vertical profile subsetter will retrieve data collected during an individual aircraft ascent or descent spiral. This effort will automate the typically labor-intensive manual data subsetting process and provide users with data tailored to their specific research interests. The development of these enhancements will be discussed in this presentation.
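
    A hedged illustration of the two subsetting operations described above (generic NumPy filtering; the column names and the ascent/descent heuristic are assumptions, not the TAD implementation).

    ```python
    import numpy as np

    def subset_time_space(t, lat, lon, t_range, lat_range, lon_range):
        """Boolean mask selecting samples inside a time window and a lat/lon box."""
        m = (t >= t_range[0]) & (t <= t_range[1])
        m &= (lat >= lat_range[0]) & (lat <= lat_range[1])
        m &= (lon >= lon_range[0]) & (lon <= lon_range[1])
        return m

    def vertical_profile_mask(altitude, min_rate=1.0):
        """Mark samples where the aircraft is steadily climbing or descending.

        min_rate is an assumed altitude-change threshold per sample; a real
        subsetter would also enforce a minimum profile depth and duration.
        """
        dalt = np.gradient(altitude)
        return np.abs(dalt) >= min_rate
    ```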

  18. Inhibiting effects of fructanase on competence-stimulating peptide-dependent quorum sensing system in Streptococcus mutans.

    PubMed

    Suzuki, Yusuke; Nagasawa, Ryo; Senpuku, Hidenobu

    2017-09-01

    Streptococcus mutans produces glucosyltransferases, encoded by the gtfB and gtfC genes, which synthesize insoluble glucan, and both insoluble and soluble glucans, by conversion of sucrose; these are known as principal agents providing strong biofilm formation and demineralization on tooth surfaces. S. mutans possesses a Com-dependent quorum sensing (QS) system, which is important for survival in severe conditions. The QS system is stimulated by the interaction between ComD (the receptor for the competence-stimulating peptide, encoded by comD) and CSP (encoded by comC), and is importantly associated with bacteriocin production and genetic competence. Previously, we identified the enzyme fructanase (FruA) as a new inhibitor of glucan-dependent biofilm formation. In the present study, inhibiting effects of FruA on glucan-independent biofilm formation of S. mutans UA159, UA159.gtfB-, UA159.gtfC-, and UA159.gtfBC- were observed in sucrose- and non-sucrose-supplemented conditions using the plate assay. A reduction of UA159.comC- and UA159.comD- biofilm formation was also observed compared with UA159 under the same conditions. These results suggested that inhibition of glucan-independent and Com-dependent biofilm formation is involved in the inhibiting mechanism of FruA. To investigate the effects of FruA on the QS system more thoroughly, we examined CSP-stimulated, Com-dependent bacteriocin production and genetic transformation. FruA inhibited bacteriocin production in collaboration with CSP, and genetic transformation, in bacterial cells treated with FruA. Our findings show that FruA has multiple inhibitory effects on survival functions of S. mutans, including biofilm formation and CSP-dependent QS responses, indicating its potential use as an agent for prevention of dental caries. Copyright © 2017 Japanese Society of Chemotherapy and The Japanese Association for Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, A. J.; Fanning, T. H.

    The United States has extensive experience with the design, construction, and operation of sodium cooled fast reactors (SFRs) over the last six decades. Despite the closure of various facilities, the U.S. continues to dedicate research and development (R&D) efforts to the design of innovative experimental, prototype, and commercial facilities. Accordingly, in support of the rich operating history and ongoing design efforts, the U.S. has been developing and maintaining a series of tools with capabilities that envelope all facets of SFR design and safety analyses. This paper provides an overview of the current U.S. SFR analysis toolset, including codes such as SAS4A/SASSYS-1, MC2-3, SE2-ANL, PERSENT, NUBOW-3D, and LIFE-METAL, as well as the higher-fidelity tools (e.g. PROTEUS) being integrated into the toolset. Current capabilities of the codes are described and key ongoing development efforts are highlighted for some codes.

  20. Toolsets Maintain Health of Complex Systems

    NASA Technical Reports Server (NTRS)

    2010-01-01

    First featured in Spinoff 2001, Qualtech Systems Inc. (QSI), of Wethersfield, Connecticut, adapted its Testability, Engineering, and Maintenance System (TEAMS) toolset under Small Business Innovation Research (SBIR) contracts from Ames Research Center to strengthen NASA's systems health management approach for its large, complex, and interconnected systems. Today, six NASA field centers utilize the TEAMS toolset, including TEAMS-Designer, TEAMS-RT, TEAMATE, and TEAMS-RDS. TEAMS is also being used on industrial systems that generate power, carry data, refine chemicals, perform medical functions, and produce semiconductor wafers. QSI finds TEAMS can lower costs by decreasing problems requiring service by 30 to 50 percent.

  1. Webinar: Airborne Data Discovery and Analysis with Toolsets for Airborne Data (TAD)

    Atmospheric Science Data Center

    2016-10-18

    Webinar: Airborne Data Discovery and Analysis with Toolsets for Airborne Data (TAD), Wednesday, October 26, 2016.

  2. Design of a Competency Administration Toolset (CAT)

    DTIC Science & Technology

    2017-03-01

    Distribution is unlimited. DESIGN OF A COMPETENCY ADMINISTRATION TOOLSET (CAT), by David Cudd, Justin Letwinsky, Allison Moon, David Rodriguez, Blake ... processing, which is perceived to be inefficient in both time and cost. The purpose of this systems engineering project was to design a web-based system

  3. UNC-Utah NA-MIC framework for DTI fiber tract analysis.

    PubMed

    Verde, Audrey R; Budin, Francois; Berger, Jean-Baptiste; Gupta, Aditya; Farzinfar, Mahshid; Kaiser, Adrien; Ahn, Mihye; Johnson, Hans; Matsui, Joy; Hazlett, Heather C; Sharma, Anuja; Goodlett, Casey; Shi, Yundi; Gouttard, Sylvain; Vachet, Clement; Piven, Joseph; Zhu, Hongtu; Gerig, Guido; Styner, Martin

    2014-01-01

    Diffusion tensor imaging has become an important modality in the field of neuroimaging to capture changes in micro-organization and to assess white matter integrity or development. While there exist a number of tractography toolsets, these usually lack tools for preprocessing or for analyzing diffusion properties along the fiber tracts. Currently, the field is in critical need of a coherent end-to-end toolset for performing an along-fiber-tract analysis, accessible to non-technical neuroimaging researchers. The UNC-Utah NA-MIC DTI framework represents a coherent, open source, end-to-end toolset for atlas fiber tract based DTI analysis encompassing DICOM data conversion, quality control, atlas building, fiber tractography, fiber parameterization, and statistical analysis of diffusion properties. Most steps utilize graphical user interfaces (GUI) to simplify interaction and provide an extensive DTI analysis framework for non-technical researchers/investigators. We illustrate the use of our framework on a small sample, cross-sectional neuroimaging study of eight healthy 1-year-old children from the Infant Brain Imaging Study (IBIS) Network. In this limited test study, we illustrate the power of our method by quantifying the diffusion properties at 1 year of age on the genu and splenium fiber tracts.

  4. UNC-Utah NA-MIC framework for DTI fiber tract analysis

    PubMed Central

    Verde, Audrey R.; Budin, Francois; Berger, Jean-Baptiste; Gupta, Aditya; Farzinfar, Mahshid; Kaiser, Adrien; Ahn, Mihye; Johnson, Hans; Matsui, Joy; Hazlett, Heather C.; Sharma, Anuja; Goodlett, Casey; Shi, Yundi; Gouttard, Sylvain; Vachet, Clement; Piven, Joseph; Zhu, Hongtu; Gerig, Guido; Styner, Martin

    2014-01-01

    Diffusion tensor imaging has become an important modality in the field of neuroimaging to capture changes in micro-organization and to assess white matter integrity or development. While there exist a number of tractography toolsets, these usually lack tools for preprocessing or for analyzing diffusion properties along the fiber tracts. Currently, the field is in critical need of a coherent end-to-end toolset for performing an along-fiber-tract analysis, accessible to non-technical neuroimaging researchers. The UNC-Utah NA-MIC DTI framework represents a coherent, open source, end-to-end toolset for atlas fiber tract based DTI analysis encompassing DICOM data conversion, quality control, atlas building, fiber tractography, fiber parameterization, and statistical analysis of diffusion properties. Most steps utilize graphical user interfaces (GUI) to simplify interaction and provide an extensive DTI analysis framework for non-technical researchers/investigators. We illustrate the use of our framework on a small sample, cross-sectional neuroimaging study of eight healthy 1-year-old children from the Infant Brain Imaging Study (IBIS) Network. In this limited test study, we illustrate the power of our method by quantifying the diffusion properties at 1 year of age on the genu and splenium fiber tracts. PMID:24409141

  5. The Antiaircraft Journal. Volume 92, Number 3, May-June 1949

    DTIC Science & Technology

    1949-06-01

    Ft. Sill, Okla. Daley, Edward J., to 4051st STU Det Arty Sch, Ft. Sill, Okla. Dalton, Joseph R., to Marianas- Bonins Comd, Guam. Darden, Harry 1., to...AAA and GM Br, Ft. Bliss, Tex. Edwards, Dave W., to 4052d ASU AAA and GM Cen, Ft. Bliss, Tex. Eiler, Donald 1., to US Army Caribbean, Quarry Heights

  6. Data-Mining Toolset Developed for Determining Turbine Engine Part Life Consumption

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.

    2003-01-01

    The current practice in aerospace turbine engine maintenance is to remove components defined as life-limited parts after a fixed time, on the basis of a predetermined number of flight cycles. Under this schedule-based maintenance practice, the worst-case usage scenario is used to determine the usable life of the component. As shown, this practice often requires removing a part before its useful life is fully consumed, thus leading to higher maintenance cost. To address this issue, the NASA Glenn Research Center, in a collaborative effort with Pratt & Whitney, has developed a generic modular toolset that uses data-mining technology to parameterize life usage models for maintenance purposes. The toolset enables a "condition-based" maintenance approach, where parts are removed on the basis of the cumulative history of the severity of operation they have experienced. The toolset uses data-mining technology to tune life-consumption models on the basis of operating and maintenance histories. The flight operating conditions, represented by measured variables within the engine, are correlated with repair records for the engines, generating a relationship between the operating condition of the part and its service life. As shown, with the condition-based maintenance approach, the life-limited part is in service until its usable life is fully consumed. This approach will lower maintenance costs while maintaining the safety of the propulsion system. The toolset is a modular program that is easily customizable by users. First, appropriate parametric damage accumulation models, which will be functions of engine variables, must be defined. The tool then optimizes the models to match the historical data by computing an effective-cycle metric that reduces the unexplained variability in component life due to each damage mode by accounting for the variability in operational severity. The damage increment due to operating conditions experienced during each flight is used to compute the effective cycles and ultimately the replacement time. Utilities to handle data problems, such as gaps in the flight data records, are included in the toolset. The tool was demonstrated using the first stage, high-pressure turbine blade of the PW4077 engine (Pratt & Whitney, East Hartford, CT). The damage modes considered were thermomechanical fatigue and oxidation/erosion. Each PW4077 engine contains 82 first-stage, high-pressure turbine blades, and data from a fleet of engines were used to tune the life-consumption models. The models took into account not only measured variables within the engine, but also unmeasured variables such as engine health parameters that are affected by degradation of the engine due to aging. The tool proved effective at predicting the average number of blades scrapped over time due to each damage mode, per engine, given the operating history of the engine. The customizable tools are available to interested parties within the aerospace community.
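
    As a rough illustration of the effective-cycle idea described above, the sketch below accumulates a per-flight damage increment computed from measured engine variables and expresses it as a fraction of a nominal worst-case cycle. The damage-model form, variable names, and numbers are hypothetical placeholders, not the Pratt & Whitney models used in the toolset.

    ```python
    import numpy as np

    def effective_cycles(flights, damage_model, reference_damage):
        """Convert per-flight operating conditions into cumulative effective cycles.

        flights          : iterable of dicts holding measured engine variables per flight
        damage_model     : callable mapping one flight's conditions to a damage increment
        reference_damage : damage accrued by one nominal (worst-case) flight cycle
        """
        increments = np.array([damage_model(f) for f in flights])
        # Each flight then counts as a fraction (or multiple) of a nominal cycle.
        return np.cumsum(increments) / reference_damage

    # Hypothetical thermomechanical-fatigue-like increment driven by peak exhaust gas
    # temperature and the number of large rotor-speed excursions during the flight.
    def tmf_damage(flight):
        return (flight["peak_egt"] / 1000.0) ** 3 * flight["n2_excursions"]

    flights = [{"peak_egt": 950, "n2_excursions": 4},
               {"peak_egt": 1010, "n2_excursions": 7}]
    print(effective_cycles(flights, tmf_damage, reference_damage=5.0))
    ```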

  7. Working Group Reports and Presentations: Virtual Worlds and Virtual Exploration

    NASA Technical Reports Server (NTRS)

    LAmoreaux, Claudia

    2006-01-01

    Scientists and engineers are continually developing innovative methods to capitalize on recent developments in computational power. Virtual worlds and virtual exploration present a new toolset for project design, implementation, and resolution. Replication of the physical world in the virtual domain provides stimulating displays to augment current data analysis techniques and to encourage public participation. In addition, the virtual domain provides stakeholders with a low cost, low risk design and test environment. The following document defines a virtual world and virtual exploration, categorizes the chief motivations for virtual exploration, elaborates upon specific objectives, identifies roadblocks and enablers for realizing the benefits, and highlights the more immediate areas of implementation (i.e. the action items). While the document attempts a comprehensive evaluation of virtual worlds and virtual exploration, the innovative nature of the opportunities presented precludes completeness. The authors strongly encourage readers to derive additional means of utilizing the virtual exploration toolset.

  8. An Ada programming support environment

    NASA Technical Reports Server (NTRS)

    Tyrrill, AL; Chan, A. David

    1986-01-01

    The toolset of an Ada Programming Support Environment (APSE) being developed at North American Aircraft Operations (NAAO) of Rockwell International is described. The APSE is resident on three different hosts and must support developments for the hosts and for embedded targets. Tools and developed software must be freely portable between the hosts. The toolset includes the usual editors, compilers, linkers, debuggers, configuration managers, and documentation tools. Generally, these are being supplied by the host computer vendors. Other tools, for example a pretty printer, cross referencer, compilation order tool, and management tools, were obtained from public-domain sources, are implemented in Ada, and are being ported to the hosts. Several tools being implemented in-house are of interest; these include an Ada Design Language processor based on compilable Ada. A Standalone Test Environment Generator facilitates test tool construction and partially automates unit level testing. A Code Auditor/Static Analyzer permits the Ada programs to be evaluated against measures of quality. An Ada Comment Box Generator partially automates generation of header comment boxes.

  9. Analyzing the impact of carbon regulatory mechanisms on supply chain management.

    DOT National Transportation Integrated Search

    2014-07-01

    The objective of this research is to develop a toolset for designing and managing cost-efficient and environmentally friendly supply chains for perishable products. The models we propose minimize transportation and inventory holding costs in the ...

  10. Model-Based Data Integration and Process Standardization Techniques for Fault Management: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Haste, Deepak; Ghoshal, Sudipto; Johnson, Stephen B.; Moore, Craig

    2018-01-01

    This paper describes the theory and considerations in the application of model-based techniques to assimilate information from disjoint knowledge sources for performing NASA's Fault Management (FM)-related activities using the TEAMS® toolset. FM consists of the operational mitigation of existing and impending spacecraft failures. NASA's FM directives have both design-phase and operational-phase goals. This paper highlights recent studies by QSI and DST of the capabilities required in the TEAMS® toolset for conducting FM activities with the aim of reducing operating costs, increasing autonomy, and conforming to time schedules. These studies use and extend the analytic capabilities of QSI's TEAMS® toolset to conduct a range of FM activities within a centralized platform.

  11. A Comprehensive review on the open source hackable text editor-ATOM

    NASA Astrophysics Data System (ADS)

    Sumangali, K.; Borra, Lokesh; Suraj Mishra, Amol

    2017-11-01

    This document represents a comprehensive study of “Atom”, one of the best open-source code editors available, with many features built in to support a multitude of programming environments and to provide a more productive toolset for developers.

  12. High-efficiency and high-reliability 9xx-nm bars and fiber-coupled devices at Coherent

    NASA Astrophysics Data System (ADS)

    Zhou, Hailong; Kennedy, Keith; Weiss, Eli; Li, Jun; Anikitchev, Serguei; Reichert, Patrick; Du, Jihua; Schleuning, David; Nabors, David; Reed, Murray; Toivonen, Mika; Lehkonen, Sami; Haapamaa, Jouko

    2006-02-01

    Ongoing optimization of epitaxial design within Coherent device engineering has led to a family of high power-conversion-efficiency (PCE) products on conductively cooled packages (CCP) and fiber array packages (FAP). At a 25°C heat sink temperature, the PCE was measured at 71.5% with 75W CW output power on 30% fill-factor (FF) bars with passive cooling. At heat sink temperatures as high as 60°C the PCE of these bars is still maintained above 60%. Powered by such high efficiency 9xx nm diodes, Coherent FAP products have consistently exceeded 55% PCE up to 50W power levels, with 62% PCE demonstrated out of the fiber. High linear-power-density (LPD) operation of 100μm x 7-emitter bars at LPD = 80 mW/μm was also demonstrated. Bars with 7-emitter were measured up to 140W QCW power before catastrophic optical mirror damage (COMD) occurred, which corresponds to a COMD value of 200mW/μm or 2D facet power density of 29.4 MW/cm2. Leveraging these improvements has enabled high power FAPs with >90W CW from an 800μm-diameter fiber bundle. Extensive reliability testing has already accumulated 400,000 total real-time device hours at a variety of accelerated and non-accelerated operating conditions. A random failure rate <0.5% per kilo-hours and gradual degradation rate <0.4% per kilo-hours have been observed. For a 30% FF 50W CW 9xx nm bar, this equates to >30,000 hours of median lifetime at a 90% confidence level. More optimized 30% FF 9xx nm bars are under development for power outputs up to 80W CW with extrapolated median lifetimes greater than 20,000 hours.

  13. Semiclassical Calculations of Peripheral Heavy-Ion Collisions at Fermi Energies and the Nuclear Equation of State

    NASA Astrophysics Data System (ADS)

    Souliotis, G. A.; Shetty, D. V.; Galanopoulos, S.; Yennello, S. J.

    2008-10-01

    A systematic study of quasi-elastic and deep-inelastic collisions at Fermi energies has been undertaken at Texas A&M aiming at obtaining information on the mechanism of nucleon exchange and the course towards N/Z equilibration [1,2]. We expect to get insight in the dynamics and the nuclear equation of state by comparing our experimental heavy residue data to detailed calculations using microscopic models of quantum molecular dynamics (QMD) type. At present, we have performed detailed calculations using the code CoMD (Constrained Molecular Dynamics) of A. Bonasera and M. Papa [3]. The code implements an effective interaction with a nuclear-matter compressibility of K=200 (soft EOS) with several forms of the density dependence of the nucleon-nucleon symmetry potential. CoMD imposes a constraint in the phase space occupation for each nucleon, effectively restoring the Pauli principle at each time step of the collision. Results of the calculations and comparisons with our data will be presented and implications concerning the isospin part of the nuclear equation of state will be discussed. [1] G.A. Souliotis et al., Phys. Rev. Lett. 91, 022701 (2003). [2] G.A. Souliotis et al., Phys. Lett. B 588, 35 (2004). [3] M. Papa et al., Phys. Rev. C 64, 024612 (2001).
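
    The constraint referred to above can be pictured as a check on how many identical nucleons share a small phase-space cell. The sketch below only flags violations; in CoMD the constraint is enforced by re-sampling nucleon momenta, and the cell sizes and occupation limit used here are illustrative values, not those of the code.

    ```python
    import numpy as np

    def occupation_violations(r, p, spin_isospin, dr=1.0, dp=100.0, f_max=1.0):
        """Flag nucleons whose local phase-space occupation exceeds f_max.

        r, p         : (N, 3) arrays of positions (fm) and momenta (MeV/c)
        spin_isospin : (N,) labels; only identical nucleons are Pauli-blocked
        dr, dp       : half-widths of the phase-space cell used for counting
        """
        n = len(r)
        violated = np.zeros(n, dtype=bool)
        for i in range(n):
            same = spin_isospin == spin_isospin[i]
            close_r = np.all(np.abs(r - r[i]) < dr, axis=1)
            close_p = np.all(np.abs(p - p[i]) < dp, axis=1)
            # Occupation ~ number of identical nucleons sharing the cell, excluding i.
            occupation = np.sum(same & close_r & close_p) - 1
            violated[i] = occupation > f_max
        return violated

    rng = np.random.default_rng(0)
    r = rng.uniform(-5, 5, (40, 3))
    p = rng.uniform(-250, 250, (40, 3))
    labels = rng.integers(0, 4, 40)          # four spin-isospin states
    print(occupation_violations(r, p, labels).sum(), "violations")
    ```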

  14. A Slag Management Toolset for Determining Optimal Coal Gasification Temperatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwong, Kyei-Sing; Bennett, James P.

    Gasifier operation is an intricate process because of the complex relationship between slag chemistry and temperature, limitations of feedstock materials, and operational preference. High gasification temperatures increase refractory degradation, while low gasification temperatures can lead to slag buildup on the gasifier sidewall or exit, either of which is problematic during operation. Maximizing refractory service life and gasifier performance requires finding an optimized operating temperature range, which is a function of the coal slag chemistry and viscosity. Gasifier operators typically use a slag’s viscosity-temperature relationship and/or ash-fusion fluid temperature to determine the gasification temperature range. NETL has built a slag management toolset to determine the optimal temperature range for gasification of a carbon feedstock. This toolset is based on a viscosity database containing experimental data, and a number of models used to predict slag viscosity as a function of composition and temperature. Gasifier users typically have no scientific basis for selecting an operational temperature range for gasification, instead using experience to select operational conditions. The use of the toolset presented in this paper provides a basis for estimating or modifying carbon feedstock slags generated from ash impurities in carbon feedstock.
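
    The kind of calculation the toolset automates can be sketched as follows: given a viscosity-temperature model fitted to a slag composition, find the temperature range over which the predicted viscosity stays inside an operational window. The Arrhenius-type model form, its coefficients, and the 5-25 Pa·s window below are illustrative assumptions only, not NETL's fitted models or recommended limits.

    ```python
    import numpy as np

    def viscosity(T, A, B):
        """Simple Arrhenius-type slag viscosity (Pa.s) as a function of temperature (K)."""
        return np.exp(A + B / T)

    def operating_window(A, B, eta_min=5.0, eta_max=25.0, T_grid=None):
        """Return the temperature range over which viscosity stays in [eta_min, eta_max]."""
        if T_grid is None:
            T_grid = np.linspace(1500.0, 1900.0, 401)
        eta = viscosity(T_grid, A, B)
        ok = (eta >= eta_min) & (eta <= eta_max)
        if not ok.any():
            return None
        return T_grid[ok].min(), T_grid[ok].max()

    # Hypothetical coefficients for one slag composition.
    print(operating_window(A=-10.0, B=22000.0))
    ```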

  15. ISS Solar Array Management

    NASA Technical Reports Server (NTRS)

    Williams, James P.; Martin, Keith D.; Thomas, Justin R.; Caro, Samuel

    2010-01-01

    The International Space Station (ISS) Solar Array Management (SAM) software toolset provides the capabilities necessary to operate a spacecraft with complex solar array constraints. It monitors spacecraft telemetry and provides interpretations of solar array constraint data in an intuitive manner. The toolset provides extensive situational awareness to ensure mission success by analyzing power generation needs, array motion constraints, and structural loading situations. The software suite consists of several components including samCS (constraint set selector), samShadyTimers (array shadowing timers), samWin (visualization GUI), samLock (array motion constraint computation), and samJet (attitude control system configuration selector). It provides high availability and uptime for extended and continuous mission support. It is able to support two-degrees-of-freedom (DOF) array positioning and supports up to ten simultaneous constraints with intuitive 1D and 2D decision support visualizations of constraint data. Display synchronization is enabled across a networked control center and multiple methods for constraint data interpolation are supported. Use of this software toolset increases flight safety, reduces mission support effort, optimizes solar array operation for achieving mission goals, and has run for weeks at a time without issues. The SAM toolset is currently used in ISS real-time mission operations.

  16. A Slag Management Toolset for Determining Optimal Coal Gasification Temperatures

    DOE PAGES

    Kwong, Kyei-Sing; Bennett, James P.

    2016-11-25

    Gasifier operation is an intricate process because of the complex relationship between slag chemistry and temperature, limitations of feedstock materials, and operational preference. High gasification temperatures increase refractory degradation, while low gasification temperatures can lead to slag buildup on the gasifier sidewall or exit, either of which is problematic during operation. Maximizing refractory service life and gasifier performance requires finding an optimized operating temperature range, which is a function of the coal slag chemistry and viscosity. Gasifier operators typically use a slag’s viscosity-temperature relationship and/or ash-fusion fluid temperature to determine the gasification temperature range. NETL has built a slag management toolset to determine the optimal temperature range for gasification of a carbon feedstock. This toolset is based on a viscosity database containing experimental data, and a number of models used to predict slag viscosity as a function of composition and temperature. Gasifier users typically have no scientific basis for selecting an operational temperature range for gasification, instead using experience to select operational conditions. The use of the toolset presented in this paper provides a basis for estimating or modifying carbon feedstock slags generated from ash impurities in carbon feedstock.

  17. Water Use Optimization Toolset Project: Development and Demonstration Phase Draft Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gasper, John R.; Veselka, Thomas D.; Mahalik, Matthew R.

    2014-05-19

    This report summarizes the results of the development and demonstration phase of the Water Use Optimization Toolset (WUOT) project. It identifies the objective and goals that guided the project, as well as demonstrating potential benefits that could be obtained by applying the WUOT in different geo-hydrologic systems across the United States. A major challenge facing conventional hydropower plants is to operate more efficiently while dealing with an increasingly uncertain water-constrained environment and complex electricity markets. The goal of this 3-year WUOT project, which is funded by the U.S. Department of Energy (DOE), is to improve water management, resulting in more energy, revenues, and grid services from available water, and to enhance environmental benefits from improved hydropower operations and planning while maintaining institutional water delivery requirements. The long-term goal is for the WUOT to be used by environmental analysts and deployed by hydropower schedulers and operators to assist in market, dispatch, and operational decisions.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kress, Joel David

    The development and scale-up of cost-effective carbon capture processes is of paramount importance to enable the widespread deployment of these technologies to significantly reduce greenhouse gas emissions. The U.S. Department of Energy initiated the Carbon Capture Simulation Initiative (CCSI) in 2011 with the goal of developing a computational toolset that would enable industry to more effectively identify, design, scale up, operate, and optimize promising concepts. The first half of the presentation will introduce the CCSI Toolset consisting of basic data submodels, steady-state and dynamic process models, process optimization and uncertainty quantification tools, an advanced dynamic process control framework, and high-resolution filtered computational fluid dynamics (CFD) submodels. The second half of the presentation will describe a high-fidelity model of a mesoporous silica supported, polyethylenimine (PEI)-impregnated solid sorbent for CO2 capture. The sorbent model includes a detailed treatment of transport and amine-CO2-H2O interactions based on quantum chemistry calculations. Using a Bayesian approach for uncertainty quantification, we calibrate the sorbent model to thermogravimetric analysis (TGA) data.

  19. The Bauschinger and Hardening Effects on Residual Stresses in an Autofrettaged Thick-Walled Cylinder

    DTIC Science & Technology

    1984-06-01

    8217Large Caliber Weapon3 Systems Labrra-ory 1ý’ NUBE OF PAGES Dover, 1.7)70 14 "NITOING AGENCY N XME: A ADDRF!SS(Ir -ditI.. fromI Controlling Office. IS...US ARMY AMCCOM COMMANDER ATTN: DRSMC-LC(D) 1 US ARMY TANK-AUTMV COMD DRSMC-LCE(D) I ATTN: DRSTA-RC DRSMC-LCM(D) ( BLDC 321) 1 WARREN, MI 48090 DRSMC-LCS

  20. The Politics of Defence Budgeting: A Study of Organisation and Resource Allocation in the United Kingdom and the United States,

    DTIC Science & Technology

    1983-01-01

    fourth responsible for guided weapons and electronics across-the-board. Each controller would be an accounting officer directly responsible to... electronics . Thereafter, as Hastie Smith says, "the identification of the systems controllers with their Service boards [the committees corporately...COMM. COMD. ASST SECDEF CNTR & INTEL) (HEALTH AFFAIRS) ASST TO THE SECDEF (ATOMIC ENERGY ) R DEFENSE LLIGENCE SE AGENCIES DEESEDFES)EFENSE DEFENSE

  1. Classic-Ada(TM)

    NASA Technical Reports Server (NTRS)

    Valley, Lois

    1989-01-01

    The SPS product, Classic-Ada, is a software tool that supports object-oriented Ada programming with powerful inheritance and dynamic binding. Object Oriented Design (OOD) is an easy, natural development paradigm, but it is not supported by Ada. Following the DOD Ada mandate, SPS developed Classic-Ada to provide a tool which supports OOD and implements code in Ada. It consists of a design language, a code generator and a toolset. As a design language, Classic-Ada supports the object-oriented principles of information hiding, data abstraction, dynamic binding, and inheritance. It also supports natural reuse and incremental development through inheritance and code factoring, and allows Ada and Classic-Ada code, as well as dynamic and static binding, to be mixed in the same program. Only nine new constructs were added to Ada to provide object-oriented design capabilities. The Classic-Ada code generator translates user application code into fully compliant, ready-to-run, standard Ada. The Classic-Ada toolset is fully supported by SPS and consists of an object generator, a builder, a dictionary manager, and a reporter. Demonstrations of Classic-Ada and the Classic-Ada Browser were given at the workshop.

  2. Hypertext Interchange Using ICA.

    ERIC Educational Resources Information Center

    Rada, Roy; And Others

    1995-01-01

    Discusses extended ICA (Integrated Chameleon Architecture), a public domain toolset for generating text-to-hypertext translators. A system called SGML-MUCH has been developed using E-ICA (Extended Integrated Chameleon Architecture) and is presented as a case study with converters for the hypertext systems MUCH, Guide, Hyperties, and Toolbook.…

  3. 24/7 Operational Effectiveness Toolset: Mishap Investigation Interface

    DTIC Science & Technology

    2008-10-01

    analysis software product. The toolset was based upon the Sleep, Activity, Fatigue, and Task Effectiveness (SAFTE™; Hursh et al., 2004). The SAFTE...meeting was for the purpose of requirements analysis , in which the designers elicited task information from the SMEs. The second meeting included a...investigators who also served as our SMEs. The requirements analysis revealed how fatigue related information is entered into the report of an Air Force

  4. Evaluation of the automated reference toolset for oil and gas reclamation on Colorado rangelands

    USDA-ARS?s Scientific Manuscript database

    Rangelands are characterized by low precipitation and low biomass, making them susceptible to disturbance and difficult to reclaim. Considering the widespread and significant impact of oil and gas development on rangelands, effective reclamation is vital. Thus, it is important that land managers und...

  5. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
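
    As a rough sketch of what "automatic creation of virtual clusters" can look like on EC2, the snippet below launches a group of instances from a machine image and collects their addresses for an MPI hostfile. It uses the standard boto3 API rather than the SCC toolset's own scripts, and the AMI ID, key name, and instance type are placeholders.

    ```python
    import boto3

    def launch_virtual_cluster(ami_id, count, key_name, instance_type="c5.xlarge"):
        """Launch EC2 instances to serve as a temporary compute cluster.

        ami_id holds the scientific virtual machine image (codes, compilers, MPI).
        """
        ec2 = boto3.resource("ec2")
        instances = ec2.create_instances(
            ImageId=ami_id,
            MinCount=count,
            MaxCount=count,
            InstanceType=instance_type,
            KeyName=key_name,
        )
        # Wait until every node is running, then gather private IPs for a hostfile.
        for inst in instances:
            inst.wait_until_running()
            inst.reload()
        return [inst.private_ip_address for inst in instances]

    # Hypothetical usage; the AMI ID and key pair name are placeholders.
    # hosts = launch_virtual_cluster("ami-0123456789abcdef0", count=4, key_name="scc-key")
    # open("hostfile", "w").write("\n".join(hosts))
    ```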

  6. Microscopic calculations of dynamics and N/Z equilibration in peripheral collisions below the Fermi energy.

    NASA Astrophysics Data System (ADS)

    Souliotis, G. A.; Shetty, D. V.; Galanopoulos, S.; Yennello, S. J.

    2008-04-01

    A systematic study of heavy residues formed in peripheral collisions below the Fermi energy has been undertaken at Texas A&M aiming at obtaining information on the mechanism of nucleon exchange and the course towards N/Z equilibration [1,2]. We expect to get insight on the dynamics and the nuclear equation of state by comparing our heavy residue data to detailed calculations using microscopic models of quantum molecular dynamics (QMD) type. We are performing calculations using two codes: the CoMD code of M. Papa, A. Bonasera [3] and the CHIMERA-QMD code of J. Lukasik [4]. Both codes implement an effective interaction with a nuclear-matter compressibility of K=200 (soft EOS) with several forms of the density dependence of the nucleon-nucleon symmetry potential. CoMD imposes a constraint in the phase space occupation for each nucleon restoring the Pauli principle at each time step of the collision. CHIMERA-QMD uses a Pauli potential term to mimic the Pauli principle. Results of the calculations and comparisons with our residue data will be presented. [1] G.A. Souliotis et al., Phys. Rev. Lett. 91, 022701 (2003). [2] G.A. Souliotis et al., Phys. Lett. B 588, 35 (2004). [3] M. Papa et al., Phys. Rev. C 64, 024612 (2001). [4] J. Lukasik, Z. Majka, Acta Phys. Pol. B 24, 1959 (1993).

  7. SeqHBase: a big data toolset for family based sequencing data analysis.

    PubMed

    He, Min; Person, Thomas N; Hebbring, Scott J; Heinzen, Ethan; Ye, Zhan; Schrodi, Steven J; McPherson, Elizabeth W; Lin, Simon M; Peissig, Peggy L; Brilliant, Murray H; O'Rawe, Jason; Robison, Reid J; Lyon, Gholson J; Wang, Kai

    2015-04-01

    Whole-genome sequencing (WGS) and whole-exome sequencing (WES) technologies are increasingly used to identify disease-contributing mutations in human genomic studies. It can be a significant challenge to process such data, especially when a large family or cohort is sequenced. Our objective was to develop a big data toolset to efficiently manipulate genome-wide variants, functional annotations and coverage, together with conducting family based sequencing data analysis. Hadoop is a framework for reliable, scalable, distributed processing of large data sets using MapReduce programming models. Based on Hadoop and HBase, we developed SeqHBase, a big data-based toolset for analysing family based sequencing data to detect de novo, inherited homozygous, or compound heterozygous mutations that may contribute to disease manifestations. SeqHBase takes as input BAM files (for coverage at every site), variant call format (VCF) files (for variant calls) and functional annotations (for variant prioritisation). We applied SeqHBase to a 5-member nuclear family and a 10-member 3-generation family with WGS data, as well as a 4-member nuclear family with WES data. Analysis times were almost linearly scalable with number of data nodes. With 20 data nodes, SeqHBase took about 5 secs to analyse WES familial data and approximately 1 min to analyse WGS familial data. These results demonstrate SeqHBase's high efficiency and scalability, which is necessary as WGS and WES are rapidly becoming standard methods to study the genetics of familial disorders. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
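
    The de novo detection step described above reduces, at its core, to comparing a child's genotype against both parents at each site. The toy sketch below works on already-parsed genotype strings and omits the coverage, quality, and annotation filters (and the HBase/MapReduce layer) that SeqHBase applies; the field names are illustrative.

    ```python
    def de_novo_candidates(variants):
        """Flag variants where the child carries an allele absent from both parents.

        variants: iterable of dicts with simple genotype strings such as "0/0", "0/1".
        Real pipelines also check per-site coverage and functional annotations;
        those filters are omitted here.
        """
        def alleles(gt):
            return set(gt.replace("|", "/").split("/"))

        hits = []
        for v in variants:
            child = alleles(v["child"])
            parents = alleles(v["mother"]) | alleles(v["father"])
            novel = child - parents
            if novel - {"0", "."}:
                hits.append(v)
        return hits

    trio = [
        {"pos": "chr1:12345", "child": "0/1", "mother": "0/0", "father": "0/0"},
        {"pos": "chr2:67890", "child": "0/1", "mother": "0/1", "father": "0/0"},
    ]
    print([v["pos"] for v in de_novo_candidates(trio)])  # -> ['chr1:12345']
    ```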

  8. Design Through Analysis (DTA) roadmap vision.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blacker, Teddy Dean; Adams, Charles R.; Hoffman, Edward L.

    2004-10-01

    The Design through Analysis Realization Team (DART) will provide analysts with a complete toolset that reduces the time to create, generate, analyze, and manage the data generated in a computational analysis. The toolset will be both easy to learn and easy to use. The DART Roadmap Vision provides for progressive improvements that will reduce the Design through Analysis (DTA) cycle time by 90 percent over a three-year period while improving both the quality and accountability of the analyses.

  9. Using the Personal Competence Manager as a Complementary Approach to IMS Learning Design Authoring

    ERIC Educational Resources Information Center

    Vogten, Hubert; Koper, Rob; Martens, Harrie; van Bruggen, Jan

    2008-01-01

    In this article TENCompetence will be presented as a framework for lifelong competence development. More specifically, the relationship between the TENCompetence framework and the IMS Learning Design (LD) specification is explored. LD authoring has proven to be challenging and the toolset currently available is targeting expert users mostly…

  10. A systematic approach for analysis and design of secure health information systems.

    PubMed

    Blobel, B; Roger-France, F

    2001-06-01

    A toolset using object-oriented techniques, including the now-widespread Unified Modelling Language (UML) approach, has been developed to facilitate the different users' views for security analysis and design of health care information systems. The paradigm and concepts used are based on the component architecture of information systems and on a general layered security model. The toolset was developed in 1996/1997 within the ISHTAR project funded by the European Commission as well as through international standardisation activities. Analysing and systematising real health care scenarios, only six and nine use case types could be found in the health and the security-related view, respectively. By combining these use case types, the analysis and design of any conceivable system architecture can be simplified significantly. Based on generic schemes, the environment needed for both communication and application security can be established by appropriate sets of security services and mechanisms. Because of the importance and the basic character of electronic health care record (EHCR) systems, the understanding of the approach is facilitated by (incomplete) examples for this application.

  11. Command History 1971. Volume 2. Sanitized

    DTIC Science & Technology

    1971-01-01

    2-71) (U) This inquiry was prompted by racial incidents occurring on 17-15 December 1970 at Camp Baxter, base camp of the 5th Trans Comd, US Army...SUPCOM, Da Nang. The incident began on 17 December with a fist fight in the 329th Trans Co but quickly deteriorated into a racial gang fight. This...Directive for Inquiry. A total of os witnesses were interviewed. Of these, 13 presented the view of the 5th Trans Co Commander and staff and the

  12. An end-to-end approach to developing biological and chemical detector requirements

    NASA Astrophysics Data System (ADS)

    Teclemariam, Nerayo P.; Purvis, Liston K.; Foltz, Greg W.; West, Todd; Edwards, Donna M.; Fruetel, Julia A.; Gleason, Nathaniel J.

    2009-05-01

    Effective defense against chemical and biological threats requires an "end-to-end" strategy that encompasses the entire problem space, from threat assessment and target hardening to response planning and recovery. A key element of the strategy is the definition of appropriate system requirements for surveillance and detection of threat agents. Our end-to-end approach to venue chem/bio defense is captured in the Facilities Weapons of Mass Destruction Decision Analysis Capability (FacDAC), an integrated system-of-systems toolset that can be used to generate requirements across all stages of detector development. For example, in the early stage of detector development the approach can be used to develop performance targets (e.g., sensitivity, selectivity, false positive rate) to provide guidance on what technologies to pursue. In the development phase, after a detector technology has been selected, the approach can aid in determining performance trade-offs and down-selection of competing technologies. During the application stage, the approach can be employed to design optimal defensive architectures that make the best use of available technology to maximize system performance. This presentation will discuss the end-to-end approach to defining detector requirements and demonstrate the capabilities of the FacDAC toolset using examples from a number of studies for the Department of Homeland Security.

  13. A complete toolset for the study of Ustilago bromivora and Brachypodium sp. as a fungal-temperate grass pathosystem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rabe, Franziska; Bosch, Jason; Stirnberg, Alexandra

    Due to their economic relevance, the study of plant pathogen interactions is of importance. However, elucidating these interactions and their underlying molecular mechanisms remains challenging since both host and pathogen need to be fully genetically accessible organisms. Here we present milestones in the establishment of a new biotrophic model pathosystem: Ustilago bromivora and Brachypodium sp. We provide a complete toolset, including an annotated fungal genome and methods for genetic manipulation of the fungus and its host plant. This toolset will enable researchers to easily study biotrophic interactions at the molecular level on both the pathogen and the host side. Moreover, our research on the fungal life cycle revealed a mating type bias phenomenon. U. bromivora harbors a haplo-lethal allele that is linked to one mating type region. As a result, the identified mating type bias strongly promotes inbreeding, which we consider to be a potential speciation driver.

  14. A complete toolset for the study of Ustilago bromivora and Brachypodium sp. as a fungal-temperate grass pathosystem

    DOE PAGES

    Rabe, Franziska; Bosch, Jason; Stirnberg, Alexandra; ...

    2016-11-11

    Due to their economic relevance, the study of plant pathogen interactions is of importance. However, elucidating these interactions and their underlying molecular mechanisms remains challenging since both host and pathogen need to be fully genetically accessible organisms. Here we present milestones in the establishment of a new biotrophic model pathosystem: Ustilago bromivora and Brachypodium sp. We provide a complete toolset, including an annotated fungal genome and methods for genetic manipulation of the fungus and its host plant. This toolset will enable researchers to easily study biotrophic interactions at the molecular level on both the pathogen and the host side. Moreover, our research on the fungal life cycle revealed a mating type bias phenomenon. U. bromivora harbors a haplo-lethal allele that is linked to one mating type region. As a result, the identified mating type bias strongly promotes inbreeding, which we consider to be a potential speciation driver.

  15. A complete toolset for the study of Ustilago bromivora and Brachypodium sp. as a fungal-temperate grass pathosystem

    PubMed Central

    Rabe, Franziska; Bosch, Jason; Stirnberg, Alexandra; Guse, Tilo; Bauer, Lisa; Seitner, Denise; Rabanal, Fernando A; Czedik-Eysenberg, Angelika; Uhse, Simon; Bindics, Janos; Genenncher, Bianca; Navarrete, Fernando; Kellner, Ronny; Ekker, Heinz; Kumlehn, Jochen; Vogel, John P; Gordon, Sean P; Marcel, Thierry C; Münsterkötter, Martin; Walter, Mathias C; Sieber, Christian MK; Mannhaupt, Gertrud; Güldener, Ulrich; Kahmann, Regine; Djamei, Armin

    2016-01-01

    Due to their economic relevance, the study of plant pathogen interactions is of importance. However, elucidating these interactions and their underlying molecular mechanisms remains challenging since both host and pathogen need to be fully genetically accessible organisms. Here we present milestones in the establishment of a new biotrophic model pathosystem: Ustilago bromivora and Brachypodium sp. We provide a complete toolset, including an annotated fungal genome and methods for genetic manipulation of the fungus and its host plant. This toolset will enable researchers to easily study biotrophic interactions at the molecular level on both the pathogen and the host side. Moreover, our research on the fungal life cycle revealed a mating type bias phenomenon. U. bromivora harbors a haplo-lethal allele that is linked to one mating type region. As a result, the identified mating type bias strongly promotes inbreeding, which we consider to be a potential speciation driver. DOI: http://dx.doi.org/10.7554/eLife.20522.001 PMID:27835569

  16. Image Processing for Bioluminescence Resonance Energy Transfer Measurement-BRET-Analyzer.

    PubMed

    Chastagnier, Yan; Moutin, Enora; Hemonnot, Anne-Laure; Perroy, Julie

    2017-01-01

    A growing number of tools now allow live recordings of various signaling pathways and protein-protein interaction dynamics in time and space by ratiometric measurements, such as Bioluminescence Resonance Energy Transfer (BRET) Imaging. Accurate and reproducible analysis of ratiometric measurements has thus become mandatory to interpret quantitative imaging. In order to fulfill this necessity, we have developed an open-source toolset for Fiji, BRET-Analyzer, allowing a systematic analysis, from image processing to ratio quantification. We share this open source solution and a step-by-step tutorial at https://github.com/ychastagnier/BRET-Analyzer. This toolset proposes (1) image background subtraction, (2) image alignment over time, (3) a composite thresholding method of the image used as the denominator of the ratio to refine the precise limits of the sample, (4) pixel-by-pixel division of the images and efficient distribution of the ratio intensity on a pseudocolor scale, and (5) quantification of the ratio mean intensity and standard deviation among pixels in chosen areas. In addition to systematizing the analysis process, we show that BRET-Analyzer allows proper reconstitution and quantification of the ratiometric image in time and space, even from heterogeneous subcellular volumes. Indeed, analyzing the same images twice, we demonstrate that, compared to standard analysis, BRET-Analyzer precisely defines the luminescent specimen limits, capturing signals from both small and large structures over time. For example, we followed and quantified, live, scaffold protein interaction dynamics in neuronal sub-cellular compartments, including dendritic spines, for half an hour. In conclusion, BRET-Analyzer provides a complete, versatile and efficient toolset for automated reproducible and meaningful image ratio analysis.
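
    Steps (1), (3), and (4) of the pipeline can be condensed into a few lines of array arithmetic. The sketch below is not the Fiji plugin itself: it assumes two already-aligned channel images, a scalar background estimate, and a simple fraction-of-maximum threshold in place of the composite thresholding method used by BRET-Analyzer.

    ```python
    import numpy as np

    def bret_ratio(acceptor, donor, background, threshold_fraction=0.1):
        """Pixel-by-pixel ratio image with background subtraction and masking.

        acceptor, donor : 2-D arrays (same shape) of the two luminescence channels
        background      : scalar or array estimate of camera/plate background
        Pixels whose donor signal falls below a fraction of its maximum are masked,
        mimicking the thresholding step used to delimit the specimen.
        """
        a = np.clip(acceptor - background, 0, None).astype(float)
        d = np.clip(donor - background, 0, None).astype(float)
        mask = d > threshold_fraction * d.max()
        ratio = np.full_like(d, np.nan)
        ratio[mask] = a[mask] / d[mask]
        return ratio, np.nanmean(ratio), np.nanstd(ratio)

    rng = np.random.default_rng(0)
    donor = rng.uniform(100, 1000, (64, 64))
    acceptor = 0.4 * donor + rng.normal(0, 10, (64, 64))
    ratio, mean_ratio, sd_ratio = bret_ratio(acceptor, donor, background=50)
    print(round(mean_ratio, 3))
    ```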

  17. Analysis of In-Space Assembly of Modular Systems

    NASA Technical Reports Server (NTRS)

    Moses, Robert W.; VanLaak, James; Johnson, Spencer L.; Chytka, Trina M.; Reeves, John D.; Todd, B. Keith; Moe, Rud V.; Stambolian, Damon B.

    2005-01-01

    Early system-level life cycle assessments facilitate cost effective optimization of system architectures to enable implementation of both modularity and in-space assembly, two key Exploration Systems Research & Technology (ESR&T) Strategic Challenges. Experiences with the International Space Station (ISS) demonstrate that the absence of this rigorous analysis can result in increased cost and operational risk. An effort is underway, called Analysis of In-Space Assembly of Modular Systems, to produce an innovative analytical methodology, including an evolved analysis toolset and proven processes in a collaborative engineering environment, to support the design and evaluation of proposed concepts. The unique aspect of this work is that it will produce the toolset, techniques and initial products to analyze and compare the detailed, life cycle costs and performance of different implementations of modularity for in-space assembly. A multi-Center team consisting of experienced personnel from the Langley Research Center, Johnson Space Center, Kennedy Space Center, and the Goddard Space Flight Center has been formed to bring their resources and experience to this development. At the end of this 30-month effort, the toolset will be ready to support the Exploration Program with an integrated assessment strategy that embodies all life-cycle aspects of the mission from design and manufacturing through operations to enable early and timely selection of an optimum solution among many competing alternatives. Already there are many different designs for crewed missions to the Moon that present competing views of modularity requiring some in-space assembly. The purpose of this paper is to highlight the approach for scoring competing designs.

  18. A New Tool for Environmental and Economic Optimization of Hydropower Operations

    NASA Astrophysics Data System (ADS)

    Saha, S.; Hayse, J. W.

    2012-12-01

    As part of a project funded by the U.S. Department of Energy, researchers from Argonne, Oak Ridge, Pacific Northwest, and Sandia National Laboratories collaborated on the development of an integrated toolset to enhance hydropower operational decisions related to economic value and environmental performance. As part of this effort, we developed an analytical approach (Index of River Functionality, IRF) and an associated software tool to evaluate how well discharge regimes achieve ecosystem management goals for hydropower facilities. This approach defines site-specific environmental objectives using relationships between environmental metrics and hydropower-influenced flow characteristics (e.g., discharge or temperature), with consideration given to seasonal timing, duration, and return frequency requirements for the environmental objectives. The IRF approach evaluates the degree to which an operational regime meets each objective and produces a score representing how well that regime meets the overall set of defined objectives. When integrated with other components in the toolset that are used to plan hydropower operations based upon hydrologic forecasts and various constraints on operations, the IRF approach allows an optimal release pattern to be developed based upon tradeoffs between environmental performance and economic value. We tested the toolset prototype to generate a virtual planning operation for a hydropower facility located in the Upper Colorado River basin as a demonstration exercise. We conducted planning as if looking five months into the future using data for the recently concluded 2012 water year. The environmental objectives for this demonstration were related to spawning and nursery habitat for endangered fishes using metrics associated with maintenance of instream habitat and reconnection of the main channel with floodplain wetlands in a representative reach of the river. We also applied existing mandatory operational constraints for the facility during the demonstration. We compared the optimized virtual operation identified by the toolset to actual operations at the facility for the same time period to evaluate implications of the optimized operational regime on power/revenue generation and environmental performance. Argonne National Laboratory's work was part of a larger "Water-Use-Optimization" project supported by the U.S. Department of Energy Office of Energy Efficiency and Renewable Energy, Water Power Program, under Announcement DE-FOA-0000070. The submitted manuscript has been created by UChicago Argonne, LLC, Operator of Argonne National Laboratory ("Argonne"). Argonne, a U.S. Department of Energy Office of Science laboratory, is operated under Contract No. DE-AC02-06CH11357. The U.S. Government retains for itself, and others acting on its behalf, a paid-up nonexclusive, irrevocable worldwide license in said article to reproduce, prepare derivative works, distribute copies to the public, and perform publicly and display publicly, by or on behalf of the Government.
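
    In spirit, the IRF reduces an operational plan to a single weighted score of objective attainment, which the planning tools can then trade off against economic value. The sketch below shows that scoring step only; the metric forms, thresholds, seasonal windows, and weights are invented for illustration and are not the ones used in the toolset.

    ```python
    def irf_score(objectives, flow_series):
        """Score how well a discharge regime meets a set of environmental objectives.

        objectives : list of dicts, each with a 'metric' callable that maps the flow
                     series to a value in [0, 1] (degree of attainment) and a 'weight'.
        flow_series: sequence of daily discharges for the planning period.
        """
        total_weight = sum(o["weight"] for o in objectives)
        return sum(o["weight"] * o["metric"](flow_series) for o in objectives) / total_weight

    # Example objectives: a spring peak high enough to reconnect floodplain wetlands,
    # and stable base flows during the nursery-habitat season (hypothetical thresholds).
    objectives = [
        {"weight": 2.0, "metric": lambda q: min(1.0, max(q[90:150]) / 450.0)},
        {"weight": 1.0, "metric": lambda q: sum(1 for x in q[150:270] if x >= 60) / 120.0},
    ]
    flows = [55] * 90 + [400] * 60 + [70] * 120 + [55] * 95
    print(round(irf_score(objectives, flows), 3))
    ```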

  19. Reading Guided by Automated Graphical Representations: How Model-Based Text Visualizations Facilitate Learning in Reading Comprehension Tasks

    ERIC Educational Resources Information Center

    Pirnay-Dummer, Pablo; Ifenthaler, Dirk

    2011-01-01

    Our study integrates automated natural language-oriented assessment and analysis methodologies into feasible reading comprehension tasks. With the newly developed T-MITOCAR toolset, prose text can be automatically converted into an association net which has similarities to a concept map. The "text to graph" feature of the software is based on…

  20. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
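
    The sample-run-fit-verify loop that such a ROM builder automates can be sketched in a few lines for a one-parameter "simulator". The random sampling, polynomial regression, and even/odd hold-out split below are deliberately simple stand-ins for the range of sampling and regression methods the toolset actually offers.

    ```python
    import numpy as np

    def build_rom(simulator, bounds, n_samples=50, degree=2, rng=None):
        """Fit a polynomial reduced-order model to an expensive simulator.

        simulator : callable f(x) -> scalar output (stands in for an HPC run)
        bounds    : (low, high) for a single input parameter
        Returns the fitted model and a hold-out error estimate.
        """
        rng = rng or np.random.default_rng(0)
        x = rng.uniform(bounds[0], bounds[1], n_samples)
        y = np.array([simulator(v) for v in x])
        train = np.arange(n_samples) % 2 == 0          # crude split: even/odd samples
        coeffs = np.polyfit(x[train], y[train], degree)
        rom = np.poly1d(coeffs)
        rmse = np.sqrt(np.mean((rom(x[~train]) - y[~train]) ** 2))
        return rom, rmse

    expensive_model = lambda x: np.exp(-x) * np.sin(3 * x)   # placeholder "simulation"
    rom, rmse = build_rom(expensive_model, bounds=(0.0, 2.0))
    print(f"held-out RMSE: {rmse:.4f}")
    ```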

  1. Water Energy Simulation Toolset

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Thuy; Jeffers, Robert

    The Water-Energy Simulation Toolset (WEST) is an interactive simulation model that helps visualize the impacts of different stakeholders on the water quantity and quality of a watershed. The case study is applied to the Snake River Basin under the fictional name Cutthroat River Basin. There are four groups of stakeholders of interest: hydropower, agriculture, flood control, and environmental protection. Currently, the water quality component represents the nitrate-nitrogen contaminant. Users can easily interact with the model by changing certain inputs (climate change, fertilizer inputs, etc.) to observe the resulting changes across the entire system. Users can also change certain parameters to test their management policies.

  2. Pilot Preference, Compliance, and Performance With an Airborne Conflict Management Toolset

    NASA Technical Reports Server (NTRS)

    Doble, Nathan A.; Barhydt, Richard; Krishnamurthy, Karthik

    2005-01-01

    A human-in-the-loop experiment was conducted at the NASA Ames and Langley Research Centers, investigating the En Route Free Maneuvering component of a future air traffic management concept termed Distributed Air/Ground Traffic Management (DAG-TM). NASA Langley test subject pilots used the Autonomous Operations Planner (AOP) airborne toolset to detect and resolve traffic conflicts, interacting with subject pilots and air traffic controllers at NASA Ames. Experimental results are presented, focusing on conflict resolution maneuver choices, AOP resolution guidance acceptability, and performance metrics. Based on these results, suggestions are made to further improve the AOP interface and functionality.

  3. ASC Tri-lab Co-design Level 2 Milestone Report 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornung, Rich; Jones, Holger; Keasler, Jeff

    2015-09-23

    In 2015, the three Department of Energy (DOE) National Laboratories that make up the Advanced Scientific Computing (ASC) Program (Sandia, Lawrence Livermore, and Los Alamos) collaboratively explored performance portability programming environments in the context of several ASC co-design proxy applications as part of a tri-lab L2 milestone executed by the co-design teams at each laboratory. The programming environments that were studied included Kokkos (developed at Sandia), RAJA (LLNL), and Legion (Stanford University). The proxy apps studied included: miniAero, LULESH, CoMD, Kripke, and SNAP. These programming models and proxy apps are described herein. Each lab focused on a particular combination of abstractions and proxy apps, with the goal of assessing performance portability using those. Performance portability was determined by: a) the ability to run a single application source code on multiple advanced architectures, b) comparing runtime performance between ...

  4. The SysMan monitoring service and its management environment

    NASA Astrophysics Data System (ADS)

    Debski, Andrzej; Janas, Ekkehard

    1996-06-01

    Management of modern information systems is becoming more and more complex. There is a growing need for powerful, flexible and affordable management tools to assist system managers in maintaining such systems. It is at the same time evident that effective management should integrate network management, system management and application management in a uniform way. Object oriented OSI management architecture with its four basic modelling concepts (information, organization, communication and functional models) together with widely accepted distribution platforms such as ANSA/CORBA, constitutes a reliable and modern framework for the implementation of a management toolset. This paper focuses on the presentation of concepts and implementation results of an object oriented management toolset developed and implemented within the framework of the ESPRIT project 7026 SysMan. An overview is given of the implemented SysMan management services including the System Management Service, Monitoring Service, Network Management Service, Knowledge Service, Domain and Policy Service, and the User Interface. Special attention is paid to the Monitoring Service which incorporates the architectural key entity responsible for event management. Its architecture and building components, especially filters, are emphasized and presented in detail.
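
    The filter-based event management mentioned above can be pictured with a minimal publish/subscribe sketch: handlers register a filter, and the monitor forwards only matching events. The attribute names and the matching rule are illustrative; the SysMan Monitoring Service filters operate on OSI-style managed objects rather than the plain dictionaries used here.

    ```python
    class EventFilter:
        """Minimal attribute-matching filter (attribute names are illustrative)."""

        def __init__(self, **criteria):
            self.criteria = criteria

        def matches(self, event):
            return all(event.get(k) == v for k, v in self.criteria.items())

    class Monitor:
        """Dispatches incoming events to handlers whose filter accepts them."""

        def __init__(self):
            self.subscriptions = []   # (filter, handler) pairs

        def subscribe(self, event_filter, handler):
            self.subscriptions.append((event_filter, handler))

        def emit(self, event):
            for f, handler in self.subscriptions:
                if f.matches(event):
                    handler(event)

    monitor = Monitor()
    monitor.subscribe(EventFilter(severity="critical", domain="network"),
                      lambda e: print("alert:", e["message"]))
    monitor.emit({"severity": "critical", "domain": "network", "message": "link down"})
    ```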

  5. Opportunities for CRISPR/Cas9 Gene Editing in Retinal Regeneration Research

    PubMed Central

    Campbell, Leah J.; Hyde, David R.

    2017-01-01

    While retinal degeneration and disease results in permanent damage and vision loss in humans, the severely damaged zebrafish retina has a high capacity to regenerate lost neurons and restore visual behaviors. Advancements in understanding the molecular and cellular basis of this regeneration response give hope that strategies and therapeutics may be developed to restore sight to blind and visually-impaired individuals. Our current understanding has been facilitated by the amenability of zebrafish to molecular tools, imaging techniques, and forward and reverse genetic approaches. Accordingly, the zebrafish research community has developed a diverse array of research tools for use in developing and adult animals, including toolkits for facilitating the generation of transgenic animals, systems for inducible, cell-specific transgene expression, and the creation of knockout alleles for nearly every protein coding gene. As CRISPR/Cas9 genome editing has begun to revolutionize molecular biology research, the zebrafish community has responded in stride by developing CRISPR/Cas9 techniques for the zebrafish as well as incorporating CRISPR/Cas9 into available toolsets. The application of CRISPR/Cas9 to retinal regeneration research will undoubtedly bring us closer to understanding the mechanisms underlying retinal repair and vision restoration in the zebrafish, as well as developing therapeutic approaches that will restore vision to blind and visually-impaired individuals. This review focuses on how CRISPR/Cas9 has been integrated into zebrafish research toolsets and how this new tool will revolutionize the field of retinal regeneration research. PMID:29218308

  6. Key Factors for Determining Risk of Groundwater Impacts Due to Leakage from Geologic Carbon Sequestration Reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carroll, Susan; Keating, Elizabeth; Mansoor, Kayyum

    2014-01-06

    The National Risk Assessment Partnership (NRAP) is developing a science-based toolset for the analysis of potential impacts to groundwater chemistry from CO2 injection (www.netldoe.gov/nrap). The toolset adopts a stochastic approach in which predictions address uncertainties in shallow groundwater and leakage scenarios. It is derived from detailed physics and chemistry simulation results that are used to train more computationally efficient models, referred to here as reduced-order models (ROMs), for each component system. In particular, these tools can be used to help regulators and operators understand the expected sizes and longevity of plumes in pH, TDS, and dissolved metals that could result from a leakage of brine and/or CO2 from a storage reservoir into aquifers. This information can inform, for example, decisions on monitoring strategies that are both effective and efficient. We have used this approach to develop predictive reduced-order models for two common types of reservoirs, but the approach could be used to develop a model for a specific aquifer or other common types of aquifers. In this paper we describe potential impacts to groundwater quality due to CO2 and brine leakage, discuss an approach to calculate thresholds under which “no impact” to groundwater occurs, describe the time scale for impact on groundwater, and discuss the probability of detecting a groundwater plume should leakage occur.
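
    The "no impact" threshold idea can be illustrated with a small stochastic sketch: derive a threshold from the natural variability of background samples, then count how many leakage realizations push the predicted concentration past it. The distributions, the 99.9th-percentile choice, and the peak-concentration summary below are assumptions for illustration, not NRAP's prescribed procedure.

    ```python
    import numpy as np

    def no_impact_threshold(baseline_samples, quantile=0.999):
        """Derive a 'no impact' concentration threshold from background variability.

        A plume only counts as an impact where the predicted concentration exceeds
        the chosen upper quantile of pre-injection background samples.
        """
        return np.quantile(baseline_samples, quantile)

    def impact_probability(leakage_realizations, threshold):
        """Fraction of stochastic leakage realizations that exceed the threshold."""
        peak = leakage_realizations.max(axis=1)      # peak concentration per realization
        return np.mean(peak > threshold)

    rng = np.random.default_rng(1)
    background = rng.lognormal(mean=0.0, sigma=0.3, size=5000)        # TDS-like tracer
    realizations = rng.lognormal(mean=0.2, sigma=0.5, size=(200, 50)) # leakage scenarios
    thr = no_impact_threshold(background)
    print(f"threshold={thr:.2f}, P(impact)={impact_probability(realizations, thr):.2f}")
    ```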

  7. Optogenetics and the future of neuroscience.

    PubMed

    Boyden, Edward S

    2015-09-01

    Over the last 10 years, optogenetics has become widespread in neuroscience for the study of how specific cell types contribute to brain functions and brain disorder states. The full impact of optogenetics will emerge only when other toolsets mature, including neural connectivity and cell phenotyping tools and neural recording and imaging tools. The latter tools are rapidly improving, in part because optogenetics has helped galvanize broad interest in neurotechnology development.

  8. Model-Based Trade Space Exploration for Near-Earth Space Missions

    NASA Technical Reports Server (NTRS)

    Cohen, Ronald H.; Boncyk, Wayne; Brutocao, James; Beveridge, Iain

    2005-01-01

    We developed a capability for model-based trade space exploration to be used in the conceptual design of Earth-orbiting space missions. We have created a set of reusable software components to model various subsystems and aspects of space missions. Several example mission models were created to test the tools and process. This technique and toolset have proven valuable for space mission architectural design.

  9. Real-time simulation of thermal shadows with EMIT

    NASA Astrophysics Data System (ADS)

    Klein, Andreas; Oberhofer, Stefan; Schätz, Peter; Nischwitz, Alfred; Obermeier, Paul

    2016-05-01

    Modern missile systems use infrared imaging for tracking or target detection algorithms. The development and validation processes of these missile systems need high fidelity simulations capable of stimulating the sensors in real-time with infrared image sequences from a synthetic 3D environment. The Extensible Multispectral Image Generation Toolset (EMIT) is a modular software library developed at MBDA Germany for the generation of physics-based infrared images in real-time. EMIT is able to render radiance images in full 32-bit floating point precision using state-of-the-art computer graphics cards and advanced shader programs. An important functionality of an infrared image generation toolset is the simulation of thermal shadows, as these may cause matching errors in tracking algorithms. However, for real-time simulations, such as hardware-in-the-loop (HWIL) simulations of infrared seekers, thermal shadows are often neglected or precomputed, as they require a four-dimensional thermal balance calculation (3D geometry evolving over time, extending up to several hours into the past). In this paper we will show the novel real-time thermal simulation of EMIT. Our thermal simulation is capable of simulating thermal effects in real-time environments, such as thermal shadows resulting from the occlusion of direct and indirect irradiance. We conclude our paper with the practical use of EMIT in a missile HWIL simulation.
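
    A thermal shadow arises because a surface element that has been occluded for the past hours has integrated less irradiance than its neighbours. The lumped energy-balance sketch below captures that history dependence for a single patch; the material constants, time step, and the purely radiative loss term are illustrative assumptions, not EMIT's thermal model.

    ```python
    import numpy as np

    def surface_temperature(irradiance, occluded, dt=60.0, T0=290.0,
                            absorptivity=0.7, emissivity=0.9, heat_capacity=4.0e5):
        """Integrate a lumped thermal balance for one surface patch over its history.

        irradiance : W/m^2 reaching the patch at each time step when unshadowed
        occluded   : boolean array, True while the patch lies in shadow
        heat_capacity is an areal value (J / (m^2 K)) for the surface layer.
        """
        sigma = 5.670e-8
        T = T0
        for q, shadowed in zip(irradiance, occluded):
            absorbed = 0.0 if shadowed else absorptivity * q
            emitted = emissivity * sigma * T ** 4
            T += dt * (absorbed - emitted) / heat_capacity
        return T

    steps = 240                                 # four hours of one-minute steps
    sun = np.full(steps, 800.0)                 # constant solar load
    shadow_history = np.zeros(steps, bool)
    shadow_history[:180] = True                 # shadowed for the first three hours
    # A recently uncovered patch stays measurably cooler than an always-lit one.
    print(round(surface_temperature(sun, shadow_history), 1),
          round(surface_temperature(sun, np.zeros(steps, bool)), 1))
    ```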

  10. Regulation of Bacteriocin Production and Cell Death by the VicRK Signaling System in Streptococcus mutans

    PubMed Central

    Senadheera, D. B.; Cordova, M.; Ayala, E. A.; Chávez de Paz, L. E.; Singh, K.; Downey, J. S.; Svensäter, G.; Goodman, S. D.

    2012-01-01

    The VicRK two-component signaling system modulates biofilm formation, genetic competence, and stress tolerance in Streptococcus mutans. We show here that the VicRK modulates bacteriocin production and cell viability, in part by direct modulation of competence-stimulating peptide (CSP) production in S. mutans. Global transcriptome and real-time transcriptional analysis of the VicK-deficient mutant (SmuvicK) revealed significant modulation of several bacteriocin-related loci, including nlmAB, nlmC, and nlmD (P < 0.001), suggesting a role for the VicRK in producing mutacins IV, V, and VI. Bacteriocin overlay assays revealed an altered ability of the vic mutants to kill related species. Since a well-conserved VicR binding site (TGTWAH-N5-TGTWAH) was identified within the comC coding region, we confirmed VicR binding to this sequence using DNA footprinting. Overexpression of the vic operon caused growth-phase-dependent repression of comC, comDE, and comX. In the vic mutants, transcription of nlmC/cipB encoding mutacin V, previously linked to CSP-dependent cell lysis, as well as expression of its putative immunity factor encoded by immB, were significantly affected relative to the wild type (P < 0.05). In contrast to previous reports that proposed a hyper-resistant phenotype for the VicK mutant in cell viability, the release of extracellular genomic DNA was significantly enhanced in SmuvicK (P < 0.05), likely as a result of increased autolysis compared with the parent. The drastic influence of VicRK on cell viability was also demonstrated using vic mutant biofilms. Taken together, we have identified a novel regulatory link between the VicRK and ComDE systems to modulate bacteriocin production and cell viability of S. mutans. PMID:22228735

  11. Visualizing relativity: The OpenRelativity project

    NASA Astrophysics Data System (ADS)

    Sherin, Zachary W.; Cheu, Ryan; Tan, Philip; Kortemeyer, Gerd

    2016-05-01

    We present OpenRelativity, an open-source toolkit to simulate effects of special relativity within the popular Unity game engine. Intended for game developers, educators, and anyone interested in physics, OpenRelativity can help people create, test, and share experiments to explore the effects of special relativity. We describe the underlying physics and some of the implementation details of this toolset with the hope that engaging games and interactive relativistic "laboratory" experiments might be implemented.
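    Two of the effects such a renderer must reproduce, relativistic aberration and the Doppler shift of incoming light, can be written down compactly. The sketch below uses the standard textbook formulas and is not taken from the OpenRelativity source.

```python
# Relativistic aberration and Doppler shift for an observer moving at beta*c.
# theta is the angle between the light's propagation direction and the
# observer's velocity, measured in the world frame (so theta = 180 deg means
# light arriving head-on, which is blueshifted).
import math

def aberration_and_doppler(theta, beta):
    """Return (propagation angle in observer frame, observed/emitted frequency ratio)."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    cos_obs = (math.cos(theta) - beta) / (1.0 - beta * math.cos(theta))
    doppler = gamma * (1.0 - beta * math.cos(theta))
    return math.acos(max(-1.0, min(1.0, cos_obs))), doppler

for theta_deg in (0, 45, 90, 135, 180):
    th, d = aberration_and_doppler(math.radians(theta_deg), beta=0.8)
    print(f"propagation angle {theta_deg:3d} deg -> observed {math.degrees(th):6.1f} deg, "
          f"Doppler factor {d:.2f}")
```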

  12. Enlisting Ecosystem Benefits: Quantification and Valuation of Ecosystem Services to Inform Installation Management

    DTIC Science & Technology

    2015-05-27

    human development and conservation of terrestrial, freshwater, and marine ecosystems. The InVEST toolset currently includes 17 distinct InVEST... Plateau to the north and the Coastal Plain to the south, which represent distinct features of topography, geology and soils, and vegetation communities...threatened by a complex of tree diseases and pine beetles that cause declines or mortality in loblolly pine, a dominant tree across the base. When loblolly

  13. Development of X-TOOLSS: Preliminary Design of Space Systems Using Evolutionary Computation

    NASA Technical Reports Server (NTRS)

    Schnell, Andrew R.; Hull, Patrick V.; Turner, Mike L.; Dozier, Gerry; Alverson, Lauren; Garrett, Aaron; Reneau, Jarred

    2008-01-01

    Evolutionary computational (EC) techniques such as genetic algorithms (GA) have been identified as promising methods to explore the design space of mechanical and electrical systems at the earliest stages of design. In this paper the authors summarize their research in the use of evolutionary computation to develop preliminary designs for various space systems. An evolutionary computational solver developed over the course of the research, X-TOOLSS (Exploration Toolset for the Optimization of Launch and Space Systems) is discussed. With the success of early, low-fidelity example problems, an outline of work involving more computationally complex models is discussed.

  14. A Hydropower Biological Evaluation Toolset (HBET) for Characterizing Hydraulic Conditions and Impacts of Hydro-Structures on Fish

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Hongfei; Deng, Zhiqun; Martinez, Jayson

    Currently, approximately 16% of the world’s electricity and over 80% of the world’s renewable electricity is generated from hydropower resources, and there is potential for development of a significant amount of new hydropower capacity. However, in practice, realizing all the potential hydropower resource is limited by various factors, including environmental effects and related mitigation requirements. That is why hydropower regulatory requirements frequently call for targets to be met regarding fish injury and mortality rates. Hydropower Biological Evaluation Toolset (HBET), an integrated suite of software tools, is designed to characterize hydraulic conditions of hydropower structures and provide quantitative estimates of fish injury and mortality rates due to various physical stressors including strike, pressure, and shear. HBET enables users to design new studies, analyze data, perform statistical analyses, and evaluate biological responses. In this paper, we discuss the features of the HBET software and describe a case study that illustrates its functionalities. HBET can be used by turbine manufacturers, hydropower operators, and regulators to design and operate hydropower systems that minimize ecological impacts in a cost-effective manner.
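    The following sketch illustrates the general pattern of such an evaluation: sampled stressor magnitudes (pressure nadir ratio, shear, strike speed) are mapped through dose-response curves to an overall injury probability. The logistic coefficients, the sampled distributions, and the independence assumption are placeholders, not HBET's calibrated relationships.

```python
# Hedged sketch of converting sampled hydraulic stressor magnitudes into an
# injury probability via dose-response curves. Coefficients are placeholders.
import numpy as np

rng = np.random.default_rng(1)

def logistic(x, x50, slope):
    """Generic dose-response: probability of injury versus stressor magnitude."""
    return 1.0 / (1.0 + np.exp(-slope * (x - x50)))

# Pretend stressor samples along simulated fish passage routes
pressure_ratio = rng.lognormal(mean=0.3, sigma=0.4, size=5000)   # nadir ratio
shear_strain = rng.lognormal(mean=3.0, sigma=0.5, size=5000)     # 1/s
strike_speed = rng.uniform(0.0, 12.0, size=5000)                 # m/s

# Assume independent stressors for this sketch
p_injury = 1.0 - (
    (1.0 - logistic(pressure_ratio, x50=2.0, slope=2.5)) *
    (1.0 - logistic(shear_strain, x50=40.0, slope=0.15)) *
    (1.0 - logistic(strike_speed, x50=8.0, slope=0.8))
)

print(f"mean injury probability: {p_injury.mean():.3f}")
print(f"share of passages above 10% risk: {(p_injury > 0.10).mean():.3f}")
```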

  15. Developing Toolsets for AirBorne Data (TAD): Overview of Design Concept

    NASA Astrophysics Data System (ADS)

    Parker, L.; Perez, J.; Chen, G.; Benson, A.; Peeters, M. C.

    2013-12-01

    NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, including a wide range of the trace gases and aerosol properties. Even though the spatial and temporal coverage is limited, the aircraft data offer high resolution and comprehensive simultaneous coverage of many variables, e.g. ozone precursors, intermediate photochemical species, and photochemical products. The recent NASA Earth Venture Program has generated an unprecedented amount of aircraft observations in terms of the sheer number of measurements and data volume. The ASDC Toolset for Airborne Data (TAD) is being designed to meet the user community needs for aircraft data for scientific research on climate change and air quality relevant issues, particularly: 1) Provide timely access to a broad user community, 2) Provide an intuitive user interface to facilitate quick discovery of the variables and data, 3) Provide data products and tools to facilitate model assessment activities, e.g., merge files and data subsetting capabilities, 4) Provide simple utility 'calculators', e.g., unit conversion and aerosol size distribution processing, and 5) Provide Web Coverage Service capable tools to enhance the data usability. The general strategy and design of TAD will be presented.

  16. SU-E-T-157: CARMEN: A MatLab-Based Research Platform for Monte Carlo Treatment Planning (MCTP) and Customized System for Planning Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baeza, J.A.; Ureba, A.; Jimenez-Ortega, E.

    Purpose: Although several radiotherapy research platforms exist, such as CERR, the most widely used and referenced; SlicerRT, which allows treatment plan comparison from various sources; and MMCTP, a full MCTP system; a full MCTP toolset is still needed that gives users complete control of calculation grids, interpolation methods and filters in order to "fairly" compare results from different TPSs, supporting verification with experimental measurements. Methods: This work presents CARMEN, a MatLab-based platform including multicore and GPGPU accelerated functions for loading RT data, designing treatment plans, and evaluating dose matrices and experimental data. CARMEN supports anatomic and functional imaging in DICOM format, as well as RTSTRUCT, RTPLAN and RTDOSE. Besides, it contains numerous tools to accomplish the MCTP process, managing egs4phant and phase space files. The CARMEN planning mode assists in designing IMRT, VMAT and MERT treatments via both inverse and direct optimization. The evaluation mode contains a comprehensive toolset (e.g. 2D/3D gamma evaluation, difference matrices, profiles, DVH, etc.) to compare datasets from commercial TPS, MC simulations (i.e. 3ddose) and radiochromic film in a user-controlled manner. Results: CARMEN has been validated against commercial RTPs and well-established evaluation tools, showing coherent behavior of its multiple algorithms. Furthermore, the CARMEN platform has been used to generate competitive complex treatments that have been published in comparative studies. Conclusion: A new research-oriented MCTP platform with a customized validation toolset has been presented. Despite being coded in a high-level programming language, CARMEN is agile due to the use of parallel algorithms. The widespread use of MatLab provides straightforward access to CARMEN's algorithms for most researchers. Similarly, our platform can benefit from MatLab community scientific developments such as filters, registration algorithms, etc. Finally, CARMEN highlights the importance of grid and filtering control in treatment plan comparison.
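    As an example of the kind of dose-comparison metric mentioned for the evaluation mode, the brute-force sketch below computes a global 2D gamma index (3%/3 mm) between a toy reference and test dose plane. It is written for clarity rather than speed and is not CARMEN code.

```python
# Compact, brute-force 2D gamma-index sketch (global normalization).
import numpy as np

def gamma_2d(ref, test, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Return the gamma map comparing `test` against `ref` (same shape)."""
    ny, nx = ref.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    norm = dose_tol * ref.max()            # global dose normalization
    gamma = np.empty_like(ref, dtype=float)
    for i in range(ny):
        for j in range(nx):
            dist2 = ((yy - i) ** 2 + (xx - j) ** 2) * spacing_mm ** 2
            dose2 = (test - ref[i, j]) ** 2
            gamma[i, j] = np.sqrt(dist2 / dist_tol_mm ** 2
                                  + dose2 / norm ** 2).min()
    return gamma

ref = np.outer(np.hanning(40), np.hanning(40))   # toy reference dose plane
test = ref * 1.02 + 0.005                        # slightly perturbed plan
g = gamma_2d(ref, test, spacing_mm=2.0)
print(f"gamma pass rate (gamma <= 1): {(g <= 1.0).mean():.3%}")
```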

  17. European Software Engineering Process Group Conference (2nd Annual), EUROPEAN SEPG'97. Delegate Material, Tutorials

    DTIC Science & Technology

    1997-06-17

    There is Good and Bad News With CMMs *bad news: process improvement takes time *good news: the first benefit is better schedule management with PSP s...e g similar supp v EURO not sudden death toolset for assessment and v EURO => Business benefits detailed analysis) . EURO could collapse (low risk...from SPI live on even after year 2000. Priority BENEFITS Actions * Improved management and application development processes * Strengthened Change

  18. Development and Application of an Analyst Process Model for a Search Task Scenario

    DTIC Science & Technology

    2013-12-01

    varied experience levels of the users we will be looking at not only testing the new tool, but also understanding the impact on user groups that the...each group using the toolsets to complete search tasks. 2.4 Hypotheses This research effort seeks to test the following hypotheses: H0... quantitative measures: report quality, errors, and cognitive workload. Due to the crossover design of the experiment, these were analyzed by group and within

  19. The automated reference toolset: A soil-geomorphic ecological potential matching algorithm

    USGS Publications Warehouse

    Nauman, Travis; Duniway, Michael C.

    2016-01-01

    Ecological inventory and monitoring data need referential context for interpretation. Identification of appropriate reference areas of similar ecological potential for site comparison is demonstrated using a newly developed automated reference toolset (ART). Foundational to identification of reference areas was a soil map of particle size in the control section (PSCS), a theme in US Soil Taxonomy. A 30-m resolution PSCS map of the Colorado Plateau (366,000 km2) was created by interpolating ∼5000 field soil observations using a random forest model and a suite of raster environmental spatial layers representing topography, climate, general ecological community, and satellite imagery ratios. The PSCS map had overall out of bag accuracy of 61.8% (Kappa of 0.54, p < 0.0001), and an independent validation accuracy of 93.2% at a set of 356 field plots along the southern edge of Canyonlands National Park, Utah. The ART process was also tested at these plots, and matched plots with the same ecological sites (ESs) 67% of the time where sites fell within 2-km buffers of each other. These results show that the PSCS and ART have strong application for ecological monitoring and sampling design, as well as assessing impacts of disturbance and land management action using an ecological potential framework. Results also demonstrate that PSCS could be a key mapping layer for the USDA-NRCS provisional ES development initiative.
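    The mapping step described above can be sketched as follows: a random forest classifier is trained on point observations of a categorical soil property against environmental covariates, and the out-of-bag score provides the internal accuracy estimate. Synthetic data stand in for the field observations and raster layers; this is not the published model.

```python
# Sketch of a random forest classifier with out-of-bag accuracy, in the style
# of the particle-size-class mapping described above (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

n_obs, n_covariates = 5000, 12                 # ~5000 field soil observations
X = rng.normal(size=(n_obs, n_covariates))     # topography, climate, imagery ratios...
# Synthetic particle-size classes loosely driven by two covariates
classes = np.digitize(X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.7, n_obs),
                      bins=[-1.0, 0.0, 1.0])

rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X, classes)
print(f"out-of-bag accuracy: {rf.oob_score_:.3f}")

# The fitted model would then be applied to the covariate rasters cell by cell
# to produce the 30-m class map used by the reference toolset.
```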

  20. Kennedy Space Center Orion Processing Team Planning for Ground Operations

    NASA Technical Reports Server (NTRS)

    Letchworth, Gary; Schlierf, Roland

    2011-01-01

    Topics in this presentation are: Constellation Ares I/Orion/Ground Ops Elements; Orion Ground Operations Flow; and Orion Operations Planning Process and Toolset Overview, including: (1) Orion Concept of Operations by Phase, (2) Ops Analysis Capabilities Overview, (3) Operations Planning Evolution, (4) Functional Flow Block Diagrams, (5) Operations Timeline Development, (6) Discrete Event Simulation (DES) Modeling, and (7) Ground Operations Planning Document Database (GOPDb). Using Operations Planning Tools for Operability Improvements includes: (1) Kaizen/Lean Events, (2) Mockups, and (3) Human Factors Analysis.

  1. ControlShell - A real-time software framework

    NASA Technical Reports Server (NTRS)

    Schneider, Stanley A.; Ullman, Marc A.; Chen, Vincent W.

    1991-01-01

    ControlShell is designed to enable modular design and implementation of real-time software. It is an object-oriented tool-set for real-time software system programming. It provides a series of execution and data interchange mechanisms that form a framework for building real-time applications. These mechanisms allow a component-based approach to real-time software generation and management. By defining a set of interface specifications for intermodule interaction, ControlShell provides a common platform that is the basis for real-time code development and exchange.

  2. Methodology for testing and validating knowledge bases

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, C.; Padalkar, S.; Sztipanovits, J.; Purves, B. R.

    1987-01-01

    A test and validation toolset developed for artificial intelligence programs is described. The basic premises of this method are: (1) knowledge bases have a strongly declarative character and represent mostly structural information about different domains, (2) the conditions for integrity, consistency, and correctness can be transformed into structural properties of knowledge bases, and (3) structural information and structural properties can be uniformly represented by graphs and checked by graph algorithms. The interactive test and validation environment has been implemented on a SUN workstation.
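    Premises (2) and (3) can be made concrete with a small example: if rules are represented as a directed graph from premises to conclusions, then integrity (no dangling references) and one aspect of correctness (no circular rule chains) become simple graph checks. The toy rule base below is invented for illustration.

```python
# Knowledge-base structural checks expressed as graph algorithms (illustrative).
import networkx as nx

# Rules expressed as "conclusion <- premises"
rules = {
    "pump_failed":  ["pressure_low", "flow_zero"],
    "pressure_low": ["sensor_p1_low"],
    "flow_zero":    ["sensor_f1_zero"],
    # A deliberately bad rule: depends on a fact no rule or sensor defines
    "alarm":        ["pump_failed", "undefined_fact"],
}
facts = {"sensor_p1_low", "sensor_f1_zero"}

g = nx.DiGraph()
for conclusion, premises in rules.items():
    for p in premises:
        g.add_edge(p, conclusion)

# Integrity: every premise must be a known fact or a derivable conclusion
defined = facts | set(rules)
dangling = [n for n in g.nodes if n not in defined]
print("dangling references:", dangling)

# Correctness (one structural aspect): the rule dependency graph must be acyclic
print("is acyclic:", nx.is_directed_acyclic_graph(g))
```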

  3. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David; Agarwal, Deborah A.; Sun, Xin

    2011-09-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  4. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.; Agarwal, D.; Sun, X.

    2011-01-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  5. Aspect Determination Using a Beacon with a Spiral Wave Front: Modeling and Performance Analysis in Operational Environments

    DTIC Science & Technology

    2014-12-19

    used to evaluate the beacon performance at the Navy’s Seneca Lake Sonar Test Facility operated by NUWC-Newport. These tests occurred in the summer...prototype has been designed. Efforts have been underway to implement the spiral beacon into the Navy’s Sonar Simulation Toolset developed by Dr. Robert...mil). Digital Object Identifier 10.1109/JOE.2013.2293962 acoustic depth finding or sonar imaging may be compared with maps to coordinate position and

  6. Framework Nucleic Acids-Enabled Biosensor Development.

    PubMed

    Yang, Fan; Li, Qian; Wang, Lihua; Zhang, Guo-Jun; Fan, Chunhai

    2018-05-03

    Nucleic acids have been actively exploited to develop various exquisite nanostructures due to their unparalleled programmability. Especially, framework nucleic acids (FNAs) with tailorable functionality and precise addressability hold great promise for biomedical applications. In this review, we summarize recent progress of FNA-enabled biosensing in homogeneous solutions, on heterogeneous surfaces and inside cells. We describe the strategies to translate the structural order and rigidity of FNAs to interfacial engineering with high controllability, and approaches to realize multiplexing for highly parallel in-vitro detection. We also envision the marriage of the currently available FNA toolsets with other emerging technologies to develop a new generation of biosensors for precision diagnosis and bioimaging.

  7. The generic task toolset: High level languages for the construction of planning and problem solving systems

    NASA Technical Reports Server (NTRS)

    Chandrasekaran, B.; Josephson, J.; Herman, D.

    1987-01-01

    The current generation of languages for the construction of knowledge-based systems is criticized as being at too low a level of abstraction, and the need for higher level languages for building problem solving systems is advanced. A notion of generic information processing tasks in knowledge-based problem solving is introduced. A toolset which can be used to build expert systems in a way that enhances intelligibility and productivity in knowledge acquisition and system construction is described. The power of these ideas is illustrated by paying special attention to a high level language called DSPL. A description is given of how it was used in the construction of a system called MPA, which assists with planning in the domain of offensive counter air missions.

  8. An Introduction to Transient Engine Applications Using the Numerical Propulsion System Simulation (NPSS) and MATLAB

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey C.; Csank, Jeffrey T.; Haller, William J.; Seidel, Jonathan A.

    2016-01-01

    This document outlines methodologies designed to improve the interface between the Numerical Propulsion System Simulation framework and various control and dynamic analyses developed in the Matlab and Simulink environment. Although NPSS is most commonly used for steady-state modeling, this paper is intended to supplement the relatively sparse documentation on its transient analysis functionality. Matlab has become an extremely popular engineering environment, and better methodologies are necessary to develop tools that leverage the benefits of these disparate frameworks. Transient analysis is not a new feature of the Numerical Propulsion System Simulation (NPSS), but transient considerations are becoming more pertinent as multidisciplinary trade-offs begin to play a larger role in advanced engine designs. This paper serves to supplement the relatively sparse documentation on transient modeling and cover the budding convergence between NPSS and Matlab based modeling toolsets. The following sections explore various design patterns to rapidly develop transient models. Each approach starts with a base model built with NPSS, and assumes the reader already has a basic understanding of how to construct a steady-state model. The second half of the paper focuses on further enhancements required to subsequently interface NPSS with Matlab codes. The first method is the simplest and most straightforward but performance constrained, while the last is the most abstract. These methods aren't mutually exclusive and the specific implementation details could vary greatly based on the designer's discretion. Basic recommendations are provided to organize model logic in a format most easily amenable to integration with existing Matlab control toolsets.

  9. NASA Langley Atmospheric Science Data Center Toolsets for Airborne Data (TAD): Common Variable Naming Schema

    NASA Astrophysics Data System (ADS)

    Chen, G.; Early, A. B.; Peeters, M. C.

    2014-12-01

    NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, which are characterized by a wide range of trace gases and aerosol properties. The airborne observational data have often been used in assessment and validation of models and satellite instruments. One particular issue is a lack of consistent variable naming across field campaigns, which makes cross-mission data discovery difficult. The ASDC Toolset for Airborne Data (TAD) is being designed to meet the user community needs for manipulating aircraft data for scientific research on climate change and air quality relevant issues. As part of this effort, a common naming system was developed to provide a link between variables from different aircraft field studies. This system covers all current and past airborne in-situ measurements housed at the ASDC, as well as select NOAA missions. The TAD common variable naming system consists of 6 categories and 3 sub-levels. The top-level category is primarily defined by the physical characteristics of the measurement: e.g., aerosol, cloud, trace gases. The sub-levels were designed to organize the variables according to nature of measurement (e.g., aerosol microphysical and optical properties) or chemical structures (e.g., carbon compound). The development of the TAD common variable naming system was in consultation with staff from the Global Change Master Directory (GCMD) and referenced/expanded the existing Climate and Forecast (CF) variable naming conventions. The detailed structure of the TAD common variable naming convention and its application in TAD development will be presented.

  10. Wound Ballistics Modeling for Blast Loading Blunt Force Impact and Projectile Penetration.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Paul A.

    Light body armor development for the warfighter is based on trial-and-error testing of prototype designs against ballistic projectiles. Torso armor testing against blast is nonexistent but necessary to protect the heart and lungs. In tests against ballistic projectiles, protective apparel is placed over ballistic clay and the projectiles are fired into the armor/clay target. The clay represents the human torso and its behind-armor, permanent deflection is the principal metric used to assess armor protection. Although this approach provides relative merit assessment of protection, it does not examine the behind-armor blunt trauma to crucial torso organs. We propose a modeling and simulation (M&S) capability for wound injury scenarios to the head, neck, and torso of the warfighter. We will use this toolset to investigate the consequences of, and mitigation against, blast exposure, blunt force impact, and ballistic projectile penetration leading to damage of critical organs comprising the central nervous, cardiovascular, and respiratory systems. We will leverage Sandia codes and our M&S expertise on traumatic brain injury to develop virtual anatomical models of the head, neck, and torso and the simulation methodology to capture the physics of wound mechanics. Specifically, we will investigate virtual wound injuries to the head, neck, and torso without and with protective armor to demonstrate the advantages of performing injury simulations for the development of body armor. The proposed toolset constitutes a significant advance over current methods by providing a virtual simulation capability to investigate wound injury and optimize armor design without the need for extensive field testing.

  11. SIPSim: A Modeling Toolkit to Predict Accuracy and Aid Design of DNA-SIP Experiments.

    PubMed

    Youngblut, Nicholas D; Barnett, Samuel E; Buckley, Daniel H

    2018-01-01

    DNA Stable isotope probing (DNA-SIP) is a powerful method that links identity to function within microbial communities. The combination of DNA-SIP with multiplexed high throughput DNA sequencing enables simultaneous mapping of in situ assimilation dynamics for thousands of microbial taxonomic units. Hence, high throughput sequencing enabled SIP has enormous potential to reveal patterns of carbon and nitrogen exchange within microbial food webs. There are several different methods for analyzing DNA-SIP data and despite the power of SIP experiments, it remains difficult to comprehensively evaluate method accuracy across a wide range of experimental parameters. We have developed a toolset (SIPSim) that simulates DNA-SIP data, and we use this toolset to systematically evaluate different methods for analyzing DNA-SIP data. Specifically, we employ SIPSim to evaluate the effects that key experimental parameters (e.g., level of isotopic enrichment, number of labeled taxa, relative abundance of labeled taxa, community richness, community evenness, and beta-diversity) have on the specificity, sensitivity, and balanced accuracy (defined as the product of specificity and sensitivity) of DNA-SIP analyses. Furthermore, SIPSim can predict analytical accuracy and power as a function of experimental design and community characteristics, and thus should be of great use in the design and interpretation of DNA-SIP experiments.
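    The metrics named above can be stated in a few lines. The sketch below computes sensitivity, specificity, and balanced accuracy (here the product of the two, as defined above) for a simulated community in which the true isotope incorporators are known by construction; the simulated detection rates are arbitrary placeholders.

```python
# Accuracy metrics for a simulated DNA-SIP analysis (illustrative labels only).
import numpy as np

rng = np.random.default_rng(7)

n_taxa = 2000
truth = rng.random(n_taxa) < 0.10                 # 10% of taxa truly labeled
# Pretend an analysis that detects 80% of true incorporators and falsely
# flags 5% of unlabeled taxa
predicted = np.where(truth, rng.random(n_taxa) < 0.80, rng.random(n_taxa) < 0.05)

tp = np.sum(predicted & truth)
tn = np.sum(~predicted & ~truth)
fp = np.sum(predicted & ~truth)
fn = np.sum(~predicted & truth)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
balanced_accuracy = sensitivity * specificity     # product, per the definition above

print(f"sensitivity={sensitivity:.3f}  specificity={specificity:.3f}  "
      f"balanced accuracy={balanced_accuracy:.3f}")
```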

  12. SIPSim: A Modeling Toolkit to Predict Accuracy and Aid Design of DNA-SIP Experiments

    PubMed Central

    Youngblut, Nicholas D.; Barnett, Samuel E.; Buckley, Daniel H.

    2018-01-01

    DNA Stable isotope probing (DNA-SIP) is a powerful method that links identity to function within microbial communities. The combination of DNA-SIP with multiplexed high throughput DNA sequencing enables simultaneous mapping of in situ assimilation dynamics for thousands of microbial taxonomic units. Hence, high throughput sequencing enabled SIP has enormous potential to reveal patterns of carbon and nitrogen exchange within microbial food webs. There are several different methods for analyzing DNA-SIP data and despite the power of SIP experiments, it remains difficult to comprehensively evaluate method accuracy across a wide range of experimental parameters. We have developed a toolset (SIPSim) that simulates DNA-SIP data, and we use this toolset to systematically evaluate different methods for analyzing DNA-SIP data. Specifically, we employ SIPSim to evaluate the effects that key experimental parameters (e.g., level of isotopic enrichment, number of labeled taxa, relative abundance of labeled taxa, community richness, community evenness, and beta-diversity) have on the specificity, sensitivity, and balanced accuracy (defined as the product of specificity and sensitivity) of DNA-SIP analyses. Furthermore, SIPSim can predict analytical accuracy and power as a function of experimental design and community characteristics, and thus should be of great use in the design and interpretation of DNA-SIP experiments. PMID:29643843

  13. Disturbance automated reference toolset (DART): Assessing patterns in ecological recovery from energy development on the Colorado Plateau

    USGS Publications Warehouse

    Nauman, Travis; Duniway, Michael C.; Villarreal, Miguel; Poitras, Travis

    2017-01-01

    A new disturbance automated reference toolset (DART) was developed to monitor human land surface impacts using soil-type and ecological context. DART identifies reference areas with similar soils, topography, and geology; and compares the disturbance condition to the reference area condition using a quantile-based approach based on a satellite vegetation index. DART was able to represent 26–55% of variation of relative differences in bare ground and 26–41% of variation in total foliar cover when comparing sites with nearby ecological reference areas using the Soil Adjusted Total Vegetation Index (SATVI). Assessment of ecological recovery at oil and gas pads on the Colorado Plateau with DART revealed that more than half of well-pads were below the 25th percentile of reference areas. Machine learning trend analysis of poorly recovering well-pads (quantile < 0.23) had out-of-bag error rates between 37 and 40% indicating moderate association with environmental and management variables hypothesized to influence recovery. Well-pads in grasslands (median quantile [MQ] = 13%), blackbrush (Coleogyne ramosissima) shrublands (MQ = 18%), arid canyon complexes (MQ = 18%), warmer areas with more summer-dominated precipitation, and state administered areas (MQ = 12%) had low recovery rates. Results showcase the usefulness of DART for assessing discrete surface land disturbances, and highlight the need for more targeted rehabilitation efforts at oil and gas well-pads in the arid southwest US.
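    The core quantile comparison is simple to sketch: the disturbed site's vegetation-index value is located within the distribution of values from its matched reference pixels, and sites falling below the 0.25 quantile are flagged. The synthetic SATVI-like numbers below are illustrative only.

```python
# Quantile-based comparison of a disturbed site against matched reference pixels.
import numpy as np

rng = np.random.default_rng(3)

def recovery_quantile(disturbed_value, reference_values):
    """Fraction of reference pixels with a vegetation index below the site's."""
    reference_values = np.asarray(reference_values)
    return np.mean(reference_values < disturbed_value)

reference_satvi = rng.normal(loc=0.25, scale=0.05, size=400)  # matched reference pixels
wellpad_satvi = 0.17                                          # disturbed well-pad value

q = recovery_quantile(wellpad_satvi, reference_satvi)
print(f"well-pad quantile within reference distribution: {q:.2f}")
print("below 25th percentile of reference:", q < 0.25)
```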

  14. Disturbance automated reference toolset (DART): Assessing patterns in ecological recovery from energy development on the Colorado Plateau.

    PubMed

    Nauman, Travis W; Duniway, Michael C; Villarreal, Miguel L; Poitras, Travis B

    2017-04-15

    A new disturbance automated reference toolset (DART) was developed to monitor human land surface impacts using soil-type and ecological context. DART identifies reference areas with similar soils, topography, and geology; and compares the disturbance condition to the reference area condition using a quantile-based approach based on a satellite vegetation index. DART was able to represent 26-55% of variation of relative differences in bare ground and 26-41% of variation in total foliar cover when comparing sites with nearby ecological reference areas using the Soil Adjusted Total Vegetation Index (SATVI). Assessment of ecological recovery at oil and gas pads on the Colorado Plateau with DART revealed that more than half of well-pads were below the 25th percentile of reference areas. Machine learning trend analysis of poorly recovering well-pads (quantile<0.23) had out-of-bag error rates between 37 and 40% indicating moderate association with environmental and management variables hypothesized to influence recovery. Well-pads in grasslands (median quantile [MQ]=13%), blackbrush (Coleogyne ramosissima) shrublands (MQ=18%), arid canyon complexes (MQ=18%), warmer areas with more summer-dominated precipitation, and state administered areas (MQ=12%) had low recovery rates. Results showcase the usefulness of DART for assessing discrete surface land disturbances, and highlight the need for more targeted rehabilitation efforts at oil and gas well-pads in the arid southwest US. Published by Elsevier B.V.

  15. Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2012)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Syamlal, Madhava; Cottrell, Roger

    2012-09-30

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is organized into 8 technical elements that fall under two focus areas. The first focus area (Physicochemical Models and Data) addresses the steps necessary to model and simulate the various technologies and processes needed to bring a new Carbon Capture and Storage (CCS) technology into production. The second focus area (Analysis & Software) is developing the software infrastructure to integrate the various components and implement the tools that are needed to make quantifiable decisions regarding the viability of new CCS technologies. CCSI also has an Industry Advisory Board (IAB). By working closely with industry from the inception of the project to identify industrial challenge problems, CCSI ensures that the simulation tools are developed for the carbon capture technologies of most relevance to industry. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, and Boston University) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 12, CCSI released its first set of computational tools and models. This pre-release, a year ahead of the originally planned first release, is the result of intense industry interest in getting early access to the tools and the phenomenal progress of the CCSI technical team.
These initial components of the CCSI Toolset provide new models and computational capabilities that will accelerate the commercial development of carbon capture technologies as well as related technologies, such as those found in the power, refining, chemicals, and gas production industries. The release consists of new tools for process synthesis and optimization to help identify promising concepts more quickly, new physics-based models of potential capture equipment and processes that will reduce the time to design and troubleshoot new systems, a framework to quantify the uncertainty of model predictions, and various enabling tools that provide new capabilities such as creating reduced order models (ROMs) from reacting multiphase flow simulations and running thousands of process simulations concurrently for optimization and UQ.

  16. Inhibition of the Quorum Sensing System (ComDE Pathway) by Aromatic 1,3-di-m-tolylurea (DMTU): Cariostatic Effect with Fluoride in Wistar Rats

    PubMed Central

    Kaur, Gurmeet; Balamurugan, P.; Princy, S. Adline

    2017-01-01

    Dental caries occurs as a result of dysbiosis among commensal and pathogenic bacteria leading to demineralization of enamel within a dental biofilm (plaque) as a consequence of lower pH in the oral cavity. In our previous study, we have reported 1,3-disubstituted ureas particularly, 1,3-di-m-tolylurea (DMTU) could inhibit the biofilm formation along with lower concentrations of fluoride (31.25 ppm) without affecting bacterial growth. In the present study, RT-qPCR analysis showed the target specific molecular mechanism of DMTU. In vivo treatment with DMTU, alone or in combination with fluoride, resulted in inhibition of caries (biofilm development of Streptococcus mutans) using a Wistar rat model for dental caries. The histopathological analysis reported the development of lesions on dentine in infected subjects whereas the dentines of treated rodents were found to be intact and healthy. Reduction in inflammatory markers in rodents' blood and liver samples was observed when treated with DMTU. Collectively, data speculate that DMTU is an effective anti-biofilm and anti-inflammatory agent, which may improve the cariostatic properties of fluoride. PMID:28748175

  17. Inhibition of the Quorum Sensing System (ComDE Pathway) by Aromatic 1,3-di-m-tolylurea (DMTU): Cariostatic Effect with Fluoride in Wistar Rats.

    PubMed

    Kaur, Gurmeet; Balamurugan, P; Princy, S Adline

    2017-01-01

    Dental caries occurs as a result of dysbiosis among commensal and pathogenic bacteria leading to demineralization of enamel within a dental biofilm (plaque) as a consequence of lower pH in the oral cavity. In our previous study, we have reported 1,3-disubstituted ureas particularly, 1,3-di-m-tolylurea (DMTU) could inhibit the biofilm formation along with lower concentrations of fluoride (31.25 ppm) without affecting bacterial growth. In the present study, RT-qPCR analysis showed the target specific molecular mechanism of DMTU. In vivo treatment with DMTU, alone or in combination with fluoride, resulted in inhibition of caries (biofilm development of Streptococcus mutans ) using a Wistar rat model for dental caries. The histopathological analysis reported the development of lesions on dentine in infected subjects whereas the dentines of treated rodents were found to be intact and healthy. Reduction in inflammatory markers in rodents' blood and liver samples was observed when treated with DMTU. Collectively, data speculate that DMTU is an effective anti-biofilm and anti-inflammatory agent, which may improve the cariostatic properties of fluoride.

  18. Toolsets for Airborne Data

    Atmospheric Science Data Center

    2018-05-23

    ... from COlumn and VERtically Resolved Observations Relevant to Air Quality (DISCOVER-AQ) Campaign including Maryland, Texas, California, and ... observations to diagnose near-surface conditions relating to air quality. To diagnose air quality conditions from space, reliable satellite ...

  19. Strategic planning toolset for reproduction of machinebuilding engines and equipment

    NASA Astrophysics Data System (ADS)

    Boyko, A. A.; Kukartsev, V. V.; Lobkov, K. Y.; Stupina, A. A.

    2018-05-01

    This article presents a dynamic model of machine-building equipment. The model was designed on the basis of the system dynamics method using the Powersim Studio toolset. The model delineates the reproduction process of equipment in its natural as well as appraisal forms. It was employed as a tool to explore the reproduction of a wide range of engines and equipment in the machine-building industry. As a result of these experiments, a variety of reproduction options were revealed, including productive capacity and the distribution of equipment among technology groups. The authors conclude that the dynamic model they designed has proved to be universal, which opens the way for further research exploring a wide range of industrial equipment reproduction.

  20. GLINT: a user-friendly toolset for the analysis of high-throughput DNA-methylation array data.

    PubMed

    Rahmani, Elior; Yedidim, Reut; Shenhav, Liat; Schweiger, Regev; Weissbrod, Omer; Zaitlen, Noah; Halperin, Eran

    2017-06-15

    GLINT is a user-friendly command-line toolset for fast analysis of genome-wide DNA methylation data generated using the Illumina human methylation arrays. GLINT, which does not require any programming proficiency, allows an easy execution of Epigenome-Wide Association Study analysis pipeline under different models while accounting for known confounders in methylation data. GLINT is a command-line software, freely available at https://github.com/cozygene/glint/releases . It requires Python 2.7 and several freely available Python packages. Further information and documentation as well as a quick start tutorial are available at http://glint-epigenetics.readthedocs.io . elior.rahmani@gmail.com or ehalperin@cs.ucla.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  1. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, Dennis L.

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility. qsUtility is a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon the measurements. qsUtility was developed as a toolset to reduce machine configuration time and improve reproducibility by way of an accurate accelerator model, and to provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.

  2. Reaping the benefits of an open systems approach: getting the commercial approach right

    NASA Astrophysics Data System (ADS)

    Pearson, Gavin; Dawe, Tony; Stubbs, Peter; Worthington, Olwen

    2016-05-01

    Critical to reaping the benefits of an Open System Approach within Defence, or any other sector, is the ability to design the appropriate commercial model (or framework). This paper reports on the development and testing of a commercial strategy decision support tool. The tool set comprises a number of elements, including a process model, and provides business intelligence insights into likely supplier behaviour. The tool has been developed by subject matter experts and has been tested with a number of UK Defence procurement teams. The paper will present the commercial model framework, the elements of the toolset and the results of testing.

  3. Comparison of Performance Predictions for New Low-Thrust Trajectory Tools

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Kos, Larry; Hopkins, Randall; Crane, Tracie

    2006-01-01

    Several low thrust trajectory optimization tools have been developed over the last 3½ years by the Low Thrust Trajectory Tools development team. This toolset includes both low-medium fidelity and high fidelity tools which allow the analyst to quickly research a wide mission trade space and perform advanced mission design. These tools were tested using a set of reference trajectories that exercised each tool's unique capabilities. This paper compares the performance predictions of the various tools against several of the reference trajectories. The intent is to verify agreement between the high fidelity tools and to quantify the performance prediction differences between tools of different fidelity levels.

  4. Development of the V4.2m5 and V5.0m0 Multigroup Cross Section Libraries for MPACT for PWR and BWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kang Seog; Clarno, Kevin T.; Gentry, Cole

    2017-03-01

    The MPACT neutronics module of the Consortium for Advanced Simulation of Light Water Reactors (CASL) core simulator is a 3-D whole core transport code being developed for the CASL toolset, Virtual Environment for Reactor Analysis (VERA). Key characteristics of the MPACT code include (1) a subgroup method for resonance self-shielding and (2) a whole-core transport solver with a 2-D/1-D synthesis method. The MPACT code requires a cross section library to support all the MPACT core simulation capabilities; this library is the component with the greatest influence on simulation accuracy.

  5. Nuclear Electric Vehicle Optimization Toolset (NEVOT)

    NASA Technical Reports Server (NTRS)

    Tinker, Michael L.; Steincamp, James W.; Stewart, Eric T.; Patton, Bruce W.; Pannell, William P.; Newby, Ronald L.; Coffman, Mark E.; Kos, Larry D.; Qualls, A. Lou; Greene, Sherrell

    2004-01-01

    The Nuclear Electric Vehicle Optimization Toolset (NEVOT) optimizes the design of all major nuclear electric propulsion (NEP) vehicle subsystems for a defined mission within constraints and optimization parameters chosen by a user. The tool uses a genetic algorithm (GA) search technique to combine subsystem designs and evaluate the fitness of the integrated design to fulfill a mission. The fitness of an individual is used within the GA to determine its probability of survival through successive generations in which the designs with low fitness are eliminated and replaced with combinations or mutations of designs with higher fitness. The program can find optimal solutions for different sets of fitness metrics without modification and can create and evaluate vehicle designs that might never be considered through traditional design techniques. It is anticipated that the flexible optimization methodology will expand present knowledge of the design trade-offs inherent in designing nuclear powered space vehicles and lead to improved NEP designs.
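    A bare-bones version of the search loop described above is sketched below: low-fitness designs are eliminated each generation and replaced by combinations (crossover) and mutations of fitter designs. The toy "vehicle" objective, design variables, and GA settings are assumptions, not the NEVOT models.

```python
# Minimal genetic-algorithm loop in the spirit of the description above.
import numpy as np

rng = np.random.default_rng(0)

def fitness(design):
    """Toy objective: prefer low 'mass' while meeting a 'power' constraint."""
    power, radiator_area, tank_frac = design
    mass = 2.0 * power + 15.0 * radiator_area + 800.0 * tank_frac
    penalty = 1e4 * max(0.0, 100.0 - power)      # require >= 100 units of power
    return -(mass + penalty)                     # higher is better

# Population of candidate designs: [power, radiator_area, tank_fraction]
pop = rng.uniform([50, 1, 0.05], [300, 20, 0.5], size=(60, 3))
for generation in range(200):
    scores = np.array([fitness(d) for d in pop])
    order = np.argsort(scores)[::-1]
    survivors = pop[order[:30]]                  # eliminate the low-fitness half
    parents_a = survivors[rng.integers(0, 30, 30)]
    parents_b = survivors[rng.integers(0, 30, 30)]
    mix = rng.random((30, 1))
    children = mix * parents_a + (1 - mix) * parents_b          # crossover
    children += rng.normal(0, 0.02, children.shape) * children  # mutation
    pop = np.vstack([survivors, children])

best = pop[np.argmax([fitness(d) for d in pop])]
print("best design [power, radiator_area, tank_fraction]:", np.round(best, 3))
```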

  6. Toolsets for Airborne Data (TAD): Improving Machine Readability for ICARTT Data Files

    NASA Astrophysics Data System (ADS)

    Northup, E. A.; Early, A. B.; Beach, A. L., III; Kusterer, J.; Quam, B.; Wang, D.; Chen, G.

    2015-12-01

    NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, including a wide range of the trace gases and aerosol properties. The ASDC Toolsets for Airborne Data (TAD) is designed to meet the user community needs for manipulating aircraft data for scientific research on climate change and air quality relevant issues. TAD makes use of aircraft data stored in the International Consortium for Atmospheric Research on Transport and Transformation (ICARTT) file format. ICARTT has been the NASA standard since 2010, and is widely used by NOAA, NSF, and international partners (DLR, FAAM). Its level of acceptance is due in part to it being generally self-describing for researchers, i.e., it provides necessary data descriptions for proper research use. Despite this, there are a number of issues with the current ICARTT format, especially concerning the machine readability. In order to overcome these issues, the TAD team has developed an "idealized" file format. This format is ASCII and is sufficiently machine readable to sustain the TAD system, however, it is not fully compatible with the current ICARTT format. The process of mapping ICARTT metadata to the idealized format, the format specifics, and the actual conversion process will be discussed. The goal of this presentation is to demonstrate an example of how to improve the machine readability of ASCII data format protocols.

  7. MATCH: An Atom-Typing Toolset for Molecular Mechanics Force Fields

    PubMed Central

    Yesselman, Joseph D.; Price, Daniel J.; Knight, Jennifer L.; Brooks, Charles L.

    2011-01-01

    We introduce a toolset of program libraries collectively titled MATCH (Multipurpose Atom-Typer for CHARMM) for the automated assignment of atom types and force field parameters for molecular mechanics simulation of organic molecules. The toolset includes utilities for the conversion from multiple chemical structure file formats into a molecular graph. A general chemical pattern-matching engine using this graph has been implemented whereby assignment of molecular mechanics atom types, charges and force field parameters is achieved by comparison against a customizable list of chemical fragments. While initially designed to complement the CHARMM simulation package and force fields by generating the necessary input topology and atom-type data files, MATCH can be expanded to any force field and program, and has core functionality that makes it extendable to other applications such as fragment-based property prediction. In the present work, we demonstrate the accurate construction of atomic parameters of molecules within each force field included in CHARMM36 through exhaustive cross validation studies illustrating that bond increment rules derived from one force field can be transferred to another. In addition, using leave-one-out substitution it is shown that it is also possible to substitute missing intra and intermolecular parameters with ones included in a force field to complete the parameterization of novel molecules. Finally, to demonstrate the robustness of MATCH and the coverage of chemical space offered by the recent CHARMM CGENFF force field (Vanommeslaeghe, et al., JCC., 2010, 31, 671–690), one million molecules from the PubChem database of small molecules are typed, parameterized and minimized. PMID:22042689
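    The essence of fragment-based atom typing can be shown with a toy molecular graph: each atom's element and bonded neighborhood is matched against an ordered list of chemical fragment rules, and the first match assigns the type. The molecule, the rules, and the type names below are invented for illustration and are not the MATCH/CHARMM fragment library.

```python
# Toy fragment matching on a molecular graph for atom-type assignment.
# Molecule: ethanol, CH3-CH2-OH, heavy atoms indexed 0..2 (hydrogens implicit)
atoms = {0: "C", 1: "C", 2: "O"}
bonds = [(0, 1), (1, 2)]

neighbors = {i: set() for i in atoms}
for a, b in bonds:
    neighbors[a].add(b)
    neighbors[b].add(a)

def heavy_env(i):
    """Element of atom i plus sorted elements of its bonded heavy neighbors."""
    return atoms[i], tuple(sorted(atoms[j] for j in neighbors[i]))

# Ordered list of (element, neighbor-elements) fragment rules -> atom type
rules = [
    (("O", ("C",)),     "OH1"),   # hydroxyl oxygen bonded to one carbon
    (("C", ("C", "O")), "CT2"),   # carbon bonded to carbon and oxygen
    (("C", ("C",)),     "CT3"),   # terminal methyl carbon
]

def assign_type(i):
    env = heavy_env(i)
    for pattern, atom_type in rules:
        if env == pattern:
            return atom_type
    return "UNTYPED"

for i in atoms:
    print(f"atom {i} ({atoms[i]}): {assign_type(i)}")
```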

  8. A simulator tool set for evaluating HEVC/SHVC streaming

    NASA Astrophysics Data System (ADS)

    Al Hadhrami, Tawfik; Nightingale, James; Wang, Qi; Grecos, Christos; Kehtarnavaz, Nasser

    2015-02-01

    Video streaming and other multimedia applications account for an ever increasing proportion of all network traffic. The recent adoption of High Efficiency Video Coding (HEVC) as the H.265 standard provides many opportunities for new and improved multimedia services and applications in the consumer domain. Since the delivery of version one of H.265, the Joint Collaborative Team on Video Coding has been working towards standardisation of a scalable extension (SHVC) to the H.265 standard and a series of range extensions and new profiles. As these enhancements are added to the standard, the range of potential applications and research opportunities will expand. For example, the use of video is also growing rapidly in other sectors such as safety, security, defence and health, with real-time high quality video transmission playing an important role in areas like critical infrastructure monitoring and disaster management, each of which may benefit from the application of enhanced HEVC/H.265 and SHVC capabilities. The majority of existing research into HEVC/H.265 transmission has focussed on the consumer domain, addressing issues such as broadcast transmission and delivery to mobile devices, with the lack of freely available tools widely cited as an obstacle to conducting this type of research. In this paper we present a toolset which facilitates the transmission and evaluation of HEVC/H.265 and SHVC encoded video on the popular open source NCTUns simulator. Our toolset provides researchers with a modular, easy to use platform for evaluating video transmission and adaptation proposals on large scale wired, wireless and hybrid architectures. The toolset consists of pre-processing, transmission, SHVC adaptation and post-processing tools to gather and analyse statistics. It has been implemented using HM15 and SHM5, the latest versions of the HEVC and SHVC reference software implementations, to ensure that currently adopted proposals for scalable and range extensions to the standard can be investigated. We demonstrate the effectiveness and usability of our toolset by evaluating SHVC streaming and adaptation to meet terminal constraints and network conditions in a range of wired, wireless, and large scale wireless mesh network scenarios, each of which is designed to simulate a realistic environment. Our results are compared to those for H.264/SVC, the scalable extension to the existing H.264/AVC advanced video coding standard.

  9. 2010 ESMD Faculty Fellowship Project

    NASA Technical Reports Server (NTRS)

    Carmen, Christina L.; Morris, Tommy; Schmidt, Peter; van Susante, Paul; Zalewski, Janusz; Murphy, Gloria

    2010-01-01

    This slide presentation is composed of 6 individual sections. The first is an introductory section that explains the Exploration Systems Mission Directorate (ESMD) Faculty Fellowship Project, the purpose of which is to prepare selected university faculty to work with senior design students to complete projects that have potential to contribute to NASA objectives. The following university presentations represent the chosen projects: (1) the use of Exploration Toolset for the Optimization of Launch and Space Systems (X-TOOLSS) to optimize the Lunar Wormbot design; (2) development of Hardware Definition Language (HDL) realization of ITU G.729 for FPGA; (3) cryogenic fluid and electrical quick connect system and a lunar regolith design; (4) Lunar Landing Pad development; and (5) Prognostics for complex systems.

  10. Using CASE tools to write engineering specifications

    NASA Astrophysics Data System (ADS)

    Henry, James E.; Howard, Robert W.; Iveland, Scott T.

    1993-08-01

    There are always a wide variety of obstacles to writing and maintaining engineering documentation. To combat these problems, documentation generation can be linked to the process of engineering development. The same graphics and communication tools used for structured system analysis and design (SSA/SSD) also form the basis for the documentation. The goal is to build a living document, such that as an engineering design changes, the documentation will `automatically' revise. `Automatic' is qualified by the need to maintain textual descriptions associated with the SSA/SSD graphics, and the need to generate new documents. This paper describes a methodology and a computer aided system engineering toolset that enables a relatively seamless transition into document generation for the development engineering team.

  11. Assessing the Impact of Development Disruptions and Dependencies in System of Systems

    DTIC Science & Technology

    2017-02-24

    time situational awareness. The Analytical Workbench’s combination of dependency metrics (strength, criticality, and impact) will be leveraged...IAC 2016 "Understanding Human Space Exploration" 67th IAF International Astronautical Congress Guadalajara, Mexico, 26‐30 Sept 2016...demonstrated use of internal simulation based data being used in tandem with SODA toolset for interdependency analysis. NASA Marshall Space

  12. Incubator Display Software Cost Reduction Toolset Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Moran, Susanne; Jeffords, Ralph

    2005-01-01

    The Incubator Display Software Requirements Specification was initially developed by Intrinsyx Technologies Corporation (Intrinsyx) under subcontract to Lockheed Martin, Contract Number NAS2-02090, for the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC) Space Station Biological Research Project (SSBRP). The Incubator Display is a User Payload Application (UPA) used to control an Incubator subrack payload for the SSBRP. The Incubator Display functions on-orbit as part of the subrack payload laptop, on the ground as part of the Communication and Data System (CDS) ground control system, and also as part of the crew training environment.

  13. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
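    The execution semantics underlying such process models can be illustrated with a minimal place/transition Petri net: a transition is enabled when its input places hold enough tokens, and firing it moves tokens to its output places (XML nets extend this core with typed, XML-structured tokens). The order-handling net below is invented for illustration.

```python
# Minimal place/transition Petri net: enabling check and firing rule.
# Each transition maps to (input places with token counts, output places).
transitions = {
    "receive_order": ({"order_in": 1},            {"order_open": 1}),
    "check_stock":   ({"order_open": 1},          {"checked": 1}),
    "ship":          ({"checked": 1, "stock": 1}, {"shipped": 1}),
}
marking = {"order_in": 1, "order_open": 0, "checked": 0, "stock": 2, "shipped": 0}

def enabled(name):
    inputs, _ = transitions[name]
    return all(marking.get(p, 0) >= n for p, n in inputs.items())

def fire(name):
    inputs, outputs = transitions[name]
    assert enabled(name), f"{name} is not enabled"
    for p, n in inputs.items():
        marking[p] -= n
    for p, n in outputs.items():
        marking[p] = marking.get(p, 0) + n

for t in ("receive_order", "check_stock", "ship"):
    fire(t)
    print(f"after {t}: {marking}")
```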

  14. Rapid Benefit Indicators (RBI) Spatial Analysis Toolset - Manual

    EPA Science Inventory

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  15. Tradespace Exploration for the Engineering of Resilient Systems

    DTIC Science & Technology

    2015-05-01

    world scenarios. The types of tools within the SAE set include visualization, decision analysis, and M&S, so it is difficult to categorize this toolset... overpopulated, or questionable. ERS Tradespace Workshop Create predictive models using multiple techniques (e.g., regression, Kriging, neural nets

  16. Final Technical Report for Contract No. DE-EE0006332, "Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cormier, Dallas; Edra, Sherwin; Espinoza, Michael

    This project will enable utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real time simulation of distributed power generation within utility grids with the emphasis on potential applications in day ahead (market) and real time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing Load and Generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high penetration solar PV on utility operations is not limited to control centers but extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate, direct operational cost savings for Energy Marketing for Day Ahead generation commitments, Real Time Operations, Load Forecasting (at an aggregate system level for Day Ahead), Demand Response, Long term Planning (asset management), Distribution Operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  17. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel are presented. The results are in the form of community noise metrics and auralizations.

  18. Capturing and Understanding Experiment Provenance using NiNaC

    NASA Astrophysics Data System (ADS)

    Rosati, C.

    2017-12-01

    A problem the model development team faces at the GFDL is determining climate model experiment provenance. Each experiment is configured with at least one configuration file which may reference other files. The experiment then passes through three phases before completion. Configuration files or other input files may be modified between phases. Finding the modifications later is tedious due to the expanse of the experiment input and duplication across phases. Determining provenance may be impossible if any file has been changed or deleted. To reduce these efforts and address these problems, we propose a new toolset, NiNaC, for archiving experiment provenance from the beginning of the experiment to the end and every phase in-between. Each of the three phases, check-out, build, and run, of the experiment depends on the previous phase. We use a graph to model the phase dependencies. Let each phase be represented by a node. Let each edge correspond to a dependency between phases where the node incident with the tail depends on the node incident with the head. It follows that the dependency graph is a tree. We reduce the problem to finding the lowest common ancestor and diffing the successor nodes. All files related to input for a phase are assigned a checksum. A new file is created to aggregate the checksums. Then each phase is assigned a checksum of aforementioned file as an identifier. Any change to part of a phase configuration will create unique checksums in all subsequent phases. Finding differences between experiments with this toolset is as simple as diffing two files containing checksums found by traversing the tree. One new benefit is that this toolset now allows differences in source code to be found after experiments are run, which was previously impossible for executables that cannot be linked to a known version controlled source code. Knowing that these changes exist allows us to give priority to help desk tickets concerning unmodified supported experiment releases, and minimize effort spent on unsupported experiments. It is also possible that a change is made, either by mistake or by system error. NiNaC would find the exact file in the precise phase with the change. In this way, NiNaC makes provenance tracking less tedious and solves problems where tracking provenance may previously have been impossible to do.
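
    The checksum-per-phase scheme described above can be illustrated with a short sketch. NiNaC's actual interface is not documented here, so the following is only a minimal Python illustration of the general idea, with hypothetical file paths and names: hash every input file of a phase, roll the hashes up into a manifest whose own hash identifies the phase, and diff the manifests of two experiments to locate the exact files that changed.

      import hashlib
      from pathlib import Path

      def file_checksum(path):
          """Return the SHA-256 digest of a single input file."""
          return hashlib.sha256(Path(path).read_bytes()).hexdigest()

      def phase_manifest(file_paths):
          """Aggregate per-file checksums into a sorted manifest, then hash the
          manifest itself so the whole phase gets a single identifier."""
          entries = {str(p): file_checksum(p) for p in file_paths}
          manifest = "\n".join(f"{digest}  {path}" for path, digest in sorted(entries.items()))
          phase_id = hashlib.sha256(manifest.encode()).hexdigest()
          return phase_id, entries

      def diff_phases(entries_a, entries_b):
          """Report files whose checksums differ between two phase manifests."""
          changed = sorted(p for p in entries_a.keys() & entries_b.keys()
                           if entries_a[p] != entries_b[p])
          only_a = sorted(entries_a.keys() - entries_b.keys())
          only_b = sorted(entries_b.keys() - entries_a.keys())
          return changed, only_a, only_b

      # Hypothetical usage: compare the 'run' phase inputs of two experiments.
      # id_a, files_a = phase_manifest(Path("exp_a/run").rglob("*.nml"))
      # id_b, files_b = phase_manifest(Path("exp_b/run").rglob("*.nml"))
      # if id_a != id_b:
      #     print(diff_phases(files_a, files_b))

    Any change to a file changes its checksum, the manifest, and therefore every downstream phase identifier, which is the property the abstract relies on for provenance tracking.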

  19. June 2017 Atmospheric Science Forum Newsletter

    Atmospheric Science Data Center

    2017-07-05

    June 2017 Atmospheric Science Forum Newsletter Friday, June 30, 2017 ... DISCOVER-AQ campaign available on Toolsets for Airborne Data (TAD), release of the CERES EBAF TOA and SURFACE Edition 4.0 data products, and the MOPITT V7 product upgrade. Access the full article at: ...

  20. Nuclear Electric Vehicle Optimization Toolset (NEVOT): Integrated System Design Using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Tinker, Michael L.; Steincamp, James W.; Stewart, Eric T.; Patton, Bruce W.; Pannell, William P.; Newby, Ronald L.; Coffman, Mark E.; Qualls, A. L.; Bancroft, S.; Molvik, Greg

    2003-01-01

    The Nuclear Electric Vehicle Optimization Toolset (NEVOT) optimizes the design of all major Nuclear Electric Propulsion (NEP) vehicle subsystems for a defined mission within constraints and optimization parameters chosen by a user. The tool uses a Genetic Algorithm (GA) search technique to combine subsystem designs and evaluate the fitness of the integrated design to fulfill a mission. The fitness of an individual is used within the GA to determine its probability of survival through successive generations in which the designs with low fitness are eliminated and replaced with combinations or mutations of designs with higher fitness. The program can find optimal solutions for different sets of fitness metrics without modification and can create and evaluate vehicle designs that might never be conceived of through traditional design techniques. It is anticipated that the flexible optimization methodology will expand present knowledge of the design trade-offs inherent in designing nuclear powered space vehicles and lead to improved NEP designs.
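
    The selection-and-replacement loop the abstract describes is the standard genetic-algorithm pattern. The sketch below is not NEVOT code; it is a generic Python GA in which the fitness function, design encoding, crossover, and mutation are toy stand-ins for the subsystem models the tool would actually evaluate.

      import random

      def evolve(fitness, random_design, crossover, mutate,
                 pop_size=50, generations=100, elite_frac=0.2):
          """Keep the fittest designs each generation and refill the population
          with combinations or mutations of them, as described above."""
          population = [random_design() for _ in range(pop_size)]
          for _ in range(generations):
              ranked = sorted(population, key=fitness, reverse=True)
              survivors = ranked[:max(2, int(elite_frac * pop_size))]
              children = []
              while len(survivors) + len(children) < pop_size:
                  parent_a, parent_b = random.sample(survivors, 2)
                  children.append(mutate(crossover(parent_a, parent_b)))
              population = survivors + children
          return max(population, key=fitness)

      # Toy usage: a "design" is three subsystem parameters; fitness peaks at (1, -2, 0).
      best = evolve(
          fitness=lambda d: -(d[0] - 1.0) ** 2 - (d[1] + 2.0) ** 2 - d[2] ** 2,
          random_design=lambda: [random.uniform(-5, 5) for _ in range(3)],
          crossover=lambda a, b: [random.choice(pair) for pair in zip(a, b)],
          mutate=lambda d: [x + random.gauss(0, 0.1) for x in d],
      )
      print(best)

    Because only the fitness function encodes the mission, the same loop can optimize for different sets of fitness metrics without modification, which is the flexibility the abstract highlights.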

  1. XML Translator for Interface Descriptions

    NASA Technical Reports Server (NTRS)

    Boroson, Elizabeth R.

    2009-01-01

    A computer program defines an XML schema for specifying the interface to a generic FPGA from the perspective of software that will interact with the device. This XML interface description is then translated into header files for C, Verilog, and VHDL. User interface definition input is checked via both the provided XML schema and the translator module to ensure consistency and accuracy. Currently, programming used on both sides of an interface is inconsistent. This makes it hard to find and fix errors. By using a common schema, both sides are forced to use the same structure by using the same framework and toolset. This makes for easy identification of problems, which leads to the ability to formulate a solution. The toolset contains constants that allow a programmer to use each register, and to access each field in the register. Once programming is complete, the translator is run as part of the make process, which ensures that whenever an interface is changed, all of the code that uses the header files describing it is recompiled.
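
    The translation pattern is straightforward to sketch: parse an XML description of registers and fields, then emit language-specific constants. The element and attribute names below are hypothetical and are not the schema the program actually defines; the sketch shows only the C-header half of the idea, written in Python.

      import xml.etree.ElementTree as ET

      EXAMPLE = """
      <interface name="FPGA_CTRL">
        <register name="STATUS" offset="0x00">
          <field name="READY" bit="0"/>
          <field name="ERROR" bit="1"/>
        </register>
      </interface>
      """

      def emit_c_header(xml_text):
          """Translate a register/field description into #define constants."""
          root = ET.fromstring(xml_text)
          prefix = root.get("name")
          lines = [f"/* Auto-generated from interface '{prefix}' */"]
          for reg in root.findall("register"):
              reg_name = f"{prefix}_{reg.get('name')}"
              lines.append(f"#define {reg_name}_OFFSET {reg.get('offset')}")
              for field in reg.findall("field"):
                  lines.append(f"#define {reg_name}_{field.get('name')}_BIT {field.get('bit')}")
          return "\n".join(lines)

      print(emit_c_header(EXAMPLE))

    Verilog and VHDL emitters would simply be additional formatters over the same parsed tree, which is what keeps both sides of the interface locked to one description.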

  2. The C. elegans rab family: identification, classification and toolkit construction.

    PubMed

    Gallegos, Maria E; Balakrishnan, Sanjeev; Chandramouli, Priya; Arora, Shaily; Azameera, Aruna; Babushekar, Anitha; Bargoma, Emilee; Bokhari, Abdulmalik; Chava, Siva Kumari; Das, Pranti; Desai, Meetali; Decena, Darlene; Saramma, Sonia Dev Devadas; Dey, Bodhidipra; Doss, Anna-Louise; Gor, Nilang; Gudiputi, Lakshmi; Guo, Chunyuan; Hande, Sonali; Jensen, Megan; Jones, Samantha; Jones, Norman; Jorgens, Danielle; Karamchedu, Padma; Kamrani, Kambiz; Kolora, Lakshmi Divya; Kristensen, Line; Kwan, Kelly; Lau, Henry; Maharaj, Pranesh; Mander, Navneet; Mangipudi, Kalyani; Menakuru, Himabindu; Mody, Vaishali; Mohanty, Sandeepa; Mukkamala, Sridevi; Mundra, Sheena A; Nagaraju, Sudharani; Narayanaswamy, Rajhalutshimi; Ndungu-Case, Catherine; Noorbakhsh, Mersedeh; Patel, Jigna; Patel, Puja; Pendem, Swetha Vandana; Ponakala, Anusha; Rath, Madhusikta; Robles, Michael C; Rokkam, Deepti; Roth, Caroline; Sasidharan, Preeti; Shah, Sapana; Tandon, Shweta; Suprai, Jagdip; Truong, Tina Quynh Nhu; Uthayaruban, Rubatharshini; Varma, Ajitha; Ved, Urvi; Wang, Zeran; Yu, Zhe

    2012-01-01

    Rab monomeric GTPases regulate specific aspects of vesicle transport in eukaryotes including coat recruitment, uncoating, fission, motility, target selection and fusion. Moreover, individual Rab proteins function at specific sites within the cell, for example the ER, golgi and early endosome. Importantly, the localization and function of individual Rab subfamily members are often conserved underscoring the significant contributions that model organisms such as Caenorhabditis elegans can make towards a better understanding of human disease caused by Rab and vesicle trafficking malfunction. With this in mind, a bioinformatics approach was first taken to identify and classify the complete C. elegans Rab family placing individual Rabs into specific subfamilies based on molecular phylogenetics. For genes that were difficult to classify by sequence similarity alone, we did a comparative analysis of intron position among specific subfamilies from yeast to humans. This two-pronged approach allowed the classification of 30 out of 31 C. elegans Rab proteins identified here including Rab31/Rab50, a likely member of the last eukaryotic common ancestor (LECA). Second, a molecular toolset was created to facilitate research on biological processes that involve Rab proteins. Specifically, we used Gateway-compatible C. elegans ORFeome clones as starting material to create 44 full-length, sequence-verified, dominant-negative (DN) and constitutive active (CA) rab open reading frames (ORFs). Development of this toolset provided independent research projects for students enrolled in a research-based molecular techniques course at California State University, East Bay (CSUEB).

  3. The C. elegans Rab Family: Identification, Classification and Toolkit Construction

    PubMed Central

    Gallegos, Maria E.; Balakrishnan, Sanjeev; Chandramouli, Priya

    2012-01-01

    Rab monomeric GTPases regulate specific aspects of vesicle transport in eukaryotes including coat recruitment, uncoating, fission, motility, target selection and fusion. Moreover, individual Rab proteins function at specific sites within the cell, for example the ER, golgi and early endosome. Importantly, the localization and function of individual Rab subfamily members are often conserved underscoring the significant contributions that model organisms such as Caenorhabditis elegans can make towards a better understanding of human disease caused by Rab and vesicle trafficking malfunction. With this in mind, a bioinformatics approach was first taken to identify and classify the complete C. elegans Rab family placing individual Rabs into specific subfamilies based on molecular phylogenetics. For genes that were difficult to classify by sequence similarity alone, we did a comparative analysis of intron position among specific subfamilies from yeast to humans. This two-pronged approach allowed the classification of 30 out of 31 C. elegans Rab proteins identified here including Rab31/Rab50, a likely member of the last eukaryotic common ancestor (LECA). Second, a molecular toolset was created to facilitate research on biological processes that involve Rab proteins. Specifically, we used Gateway-compatible C. elegans ORFeome clones as starting material to create 44 full-length, sequence-verified, dominant-negative (DN) and constitutive active (CA) rab open reading frames (ORFs). Development of this toolset provided independent research projects for students enrolled in a research-based molecular techniques course at California State University, East Bay (CSUEB). PMID:23185324

  4. Using GRACE-Derived Water and Moisture Products as a Predictive Tool for Fire Response in the Contiguous United States

    NASA Astrophysics Data System (ADS)

    Rousseau, N. J.; Jensen, D.; Zajic, B.; Rodell, M.; Reager, J. T., II

    2015-12-01

    The relationship between wildfire activity and soil moisture in the United States has been difficult to assess, with limited ability to determine areas that are at high risk. This limitation is largely due to complex environmental factors at play, especially as they relate to alternating periods of wet and dry conditions, and the lack of remotely-sensed products. Recent drought conditions and accompanying low Fuel Moisture Content (FMC) have led to disastrous wildfire outbreaks causing economic loss, property damage, and environmental degradation. Thus, developing a programmed toolset to assess the relationship between soil moisture, which contributes greatly to FMC and fire severity, and wildfire activity can establish the framework for determining overall wildfire risk. To properly evaluate these parameters, we used data assimilated from the Gravity Recovery and Climate Experiment (GRACE) and data from the Fire Program Analysis fire-occurrence database (FPA FOD) to determine the extent to which soil moisture affects fire activity. Through these datasets, we produced correlation and regression maps at a coarse resolution of 0.25 degrees for the contiguous United States. These fire-risk products and toolsets proved the viability of this methodology, allowing for the future incorporation of more GRACE-derived water parameters, MODIS vegetation indices, and other environmental datasets to refine the model for fire risk. Additionally, they will support national-scale early fire management and provide responders with a predictive tool to better apply early decision support to high-risk areas during regions' respective fire season(s).
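
    The per-cell correlation maps mentioned above can be sketched directly with NumPy. The arrays below are random placeholders with plausible shapes, not GRACE or FPA FOD data; the only point is the per-grid-cell Pearson correlation over the time axis on a 0.25-degree grid.

      import numpy as np

      def gridded_correlation(soil_moisture, fire_counts):
          """Pearson correlation per grid cell between two (time, lat, lon) stacks,
          e.g. monthly soil-moisture anomalies and fire-occurrence counts."""
          sm = soil_moisture - soil_moisture.mean(axis=0)
          fc = fire_counts - fire_counts.mean(axis=0)
          cov = (sm * fc).mean(axis=0)
          denom = sm.std(axis=0) * fc.std(axis=0)
          with np.errstate(invalid="ignore", divide="ignore"):
              return np.where(denom > 0, cov / denom, np.nan)

      # Placeholder data: 120 months on a CONUS-sized 0.25-degree grid (100 x 232 cells).
      rng = np.random.default_rng(0)
      soil_moisture = rng.normal(size=(120, 100, 232))
      fire_counts = rng.poisson(2.0, size=(120, 100, 232)).astype(float)
      correlation_map = gridded_correlation(soil_moisture, fire_counts)
      print(correlation_map.shape)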

  5. CRI planning and scheduling for space

    NASA Technical Reports Server (NTRS)

    Aarup, Mads

    1994-01-01

    Computer Resources International (CRI) has many years of experience in developing space planning and scheduling systems for the European Space Agency. Activities range from AIT/AIV planning over mission planning to research in on-board autonomy using advanced planning and scheduling technologies in conjunction with model based diagnostics. This article presents four projects carried out for ESA by CRI with various subcontractors: (1) DI, Distributed Intelligence for Ground/Space Systems is an on-going research project; (2) GMPT, Generic Mission Planning Toolset, a feasibility study concluded in 1993; (3) OPTIMUM-AIV, Open Planning Tool for AIV, development of a knowledge based AIV planning and scheduling tool ended in 1992; and (4) PlanERS-1, development of an AI and knowledge-based mission planning prototype for the ERS-1 earth observation spacecraft ended in 1991.

  6. A Toolset for Supporting Iterative Human-Automation Interaction in Design

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.

    2010-01-01

    The addition of automation has greatly extended humans' capability to accomplish tasks, including those that are difficult, complex, and safety critical. The majority of Human-Automation Interaction (HAI) results in more efficient and safe operations; however, certain unexpected automation behaviors or "automation surprises" can be frustrating and, in certain safety-critical operations (e.g., transportation, manufacturing control, medicine), may result in injuries or the loss of life (Mellor, 1994; Leveson, 1995; FAA, 1995; BASI, 1998; Sheridan, 2002). This paper describes the development of a design tool that enables the rapid development and evaluation of automation prototypes. The ultimate goal of the work is to provide a design platform upon which automation surprise vulnerability analyses can be integrated.

  7. Toolsets for Airborne Data - URS and New Documentation

    Atmospheric Science Data Center

    2015-03-23

    ... geolocated) files based on a user’s choice of time base. In addition, the TAD merge feature allows users to generate standard deviations ... NASA airborne missions. We are currently focused on in situ measurements and we would like to hear from you about the need for other ...

  8. Automated Design Space Exploration with Aspen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spafford, Kyle L.; Vetter, Jeffrey S.

    Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.

  9. Automated Design Space Exploration with Aspen

    DOE PAGES

    Spafford, Kyle L.; Vetter, Jeffrey S.

    2015-01-01

    Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
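
    The reformulation the abstract describes, in which a performance model with parameter ranges and resource limits becomes a nonlinear program, can be sketched with an off-the-shelf solver. The cost model below is a toy stand-in, not an Aspen-derived model, and the variable names are hypothetical; only the bounds-plus-constraint optimization structure is the point.

      import numpy as np
      from scipy.optimize import minimize

      def modeled_runtime(x):
          """Toy performance model: compute time falls with tile size and node
          count while communication time grows with node count."""
          tile, nodes = x
          compute = 1e12 / (tile * nodes)
          communication = 1e6 * nodes / tile
          return compute + communication

      def memory_slack(x):
          """User-defined resource: gigabytes of memory left per node (>= 0)."""
          tile, _nodes = x
          return 64.0 - 0.5 * tile

      result = minimize(
          modeled_runtime,
          x0=np.array([16.0, 8.0]),
          bounds=[(4, 128), (1, 256)],                          # parameter ranges
          constraints=[{"type": "ineq", "fun": memory_slack}],  # resource limit
          method="SLSQP",
      )
      print(result.x, result.fun)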

  10. High power diode lasers emitting from 639 nm to 690 nm

    NASA Astrophysics Data System (ADS)

    Bao, L.; Grimshaw, M.; DeVito, M.; Kanskar, M.; Dong, W.; Guan, X.; Zhang, S.; Patterson, J.; Dickerson, P.; Kennedy, K.; Li, S.; Haden, J.; Martinsen, R.

    2014-03-01

    There is increasing market demand for high power reliable red lasers for display and cinema applications. Due to the fundamental material system limit at this wavelength range, red diode lasers have lower efficiency and are more temperature sensitive, compared to 790-980 nm diode lasers. In terms of reliability, red lasers are also more sensitive to catastrophic optical mirror damage (COMD) due to the higher photon energy. Thus, developing higher-power reliable red lasers is very challenging. This paper will present nLIGHT's released red products from 639 nm to 690 nm, with established high performance and long-term reliability. These single emitter diode lasers can work as stand-alone single-emitter units or efficiently integrate into our compact, passively-cooled Pearl™ fiber-coupled module architectures for higher output power and improved reliability. In order to further improve power and reliability, new chip optimizations have been focused on improving epitaxial design/growth, chip configuration/processing and optical facet passivation. Initial optimization has demonstrated promising results for 639 nm diode lasers to be reliably rated at 1.5 W and 690 nm diode lasers to be reliably rated at 4.0 W. Accelerated life-testing has started and further design optimizations are underway.

  11. The Argument for Open

    ERIC Educational Resources Information Center

    Byrd, Rob

    2008-01-01

    Is open source business intelligence (OS BI) software ready for prime time? The author thoroughly investigated each of three OS BI toolsets--Pentaho BI Suite, Jaspersoft BI Suite, and Talend Open Studio--by installing the OS BI tools himself, by interviewing technologists at academic institutions who had implemented these OS BI solutions, and by…

  12. Expanding the Reach of Extension through Social Media

    ERIC Educational Resources Information Center

    Gharis, Lauri W.; Bardon, Robert E.; Evans, Jennifer L.; Hubbard, William G.; Taylor, Eric

    2014-01-01

    With increasing numbers of the public using social media applications, Extension professionals have the ability to apply these same tools to connect with their clients. This article demonstrates how a social media toolset can be employed by Extension professionals by identifying how Extension professionals are currently using social media,…

  13. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel

    USDA-ARS?s Scientific Manuscript database

    Interactive modules for data exploration and visualization (imDEV) is a Microsoft Excel spreadsheet embedded application providing an integrated environment for the analysis of omics data sets with a user-friendly interface. Individual modules were designed to provide toolsets to enable interactive ...

  14. Employing the Management Internal Control Toolset (MICT) Across the Enterprise

    DTIC Science & Technology

    2012-06-16

    Change In the 14th edition of Organizations: Behavior, Structure, Processes, authors Gibson, Ivancevich, Donnelly, and Konopaske explain how... internet on 4 Apr 2012 at: http://www.defense.gov/Speeches/Speech.aspx?SpeechID=1527 Gibson, J.L., Ivancevich, J.M., Donnelly, J.J., Jr., & Konopaske

  15. Shakespeare, Our Digital Native

    ERIC Educational Resources Information Center

    Shamburg, Christopher; Craighead, Cari

    2009-01-01

    Performance-based activities and creative projects with technology that focus on Shakespeare's language are powerful developmental tools for students to express and extend thoughts and feelings from their lives. Shakespeare becomes a toy chest and a toolset that allows students to live in situations they never could and to express language they…

  16. Toolsets for Airborne Data Beta Release

    Atmospheric Science Data Center

    2014-09-17

    ... create merge files based on a user’s choice of time base. In addition, the TAD merge feature allows users to generate standard deviation ... to the TAD database. We are currently focused on the in situ measurements and we want to hear from you about the need for other data ...

  17. Spectacle and SpecViz: New Spectral Analysis and Visualization Tools

    NASA Astrophysics Data System (ADS)

    Earl, Nicholas; Peeples, Molly; JDADF Developers

    2018-01-01

    A new era of spectroscopic exploration of our universe is being ushered in with advances in instrumentation and next-generation space telescopes. The advent of new spectroscopic instruments has highlighted a pressing need for tools scientists can use to analyze and explore these new data. We have developed Spectacle, a software package for analyzing both synthetic spectra from hydrodynamic simulations and real COS data with an aim of characterizing the behavior of the circumgalactic medium. It allows easy reduction of spectral data and provides analytic line generation capabilities. Currently, the package is focused on automatic determination of absorption regions and line identification with custom line list support, simultaneous line fitting using Voigt profiles via least-squares or MCMC methods, and multi-component modeling of blended features. Non-parametric measurements, such as equivalent widths, delta v90, and full-width half-max, are available. Spectacle also provides the ability to compose compound models used to generate synthetic spectra, allowing the user to define various LSF kernels, uncertainties, and to specify sampling. We also present updates to the visualization tool SpecViz, developed in conjunction with the JWST data analysis tools development team, to aid in the exploration of spectral data. SpecViz is an open source, Python-based spectral 1-D interactive visualization and analysis application built around high-performance interactive plotting. It supports handling general and instrument-specific data and includes advanced tool-sets for filtering and detrending one-dimensional data, along with the ability to isolate absorption regions using slicing and manipulate spectral features via spectral arithmetic. Multi-component modeling is also possible using a flexible model fitting tool-set that supports custom models to be used with various fitting routines. It also features robust user extensions such as custom data loaders and support for user-created plugins that add new functionality. This work was supported in part by HST AR #13919, HST GO #14268, and HST AR #14560.
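
    The Voigt-profile fitting and equivalent-width measurement mentioned above can be illustrated without Spectacle itself. The sketch below uses SciPy's voigt_profile and curve_fit on a synthetic normalized absorption line; it is not Spectacle's API, and all line parameters are arbitrary.

      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.special import voigt_profile

      def absorption_line(wave, center, sigma, gamma, depth, continuum):
          """Continuum minus a Voigt-shaped absorption feature."""
          return continuum - depth * voigt_profile(wave - center, sigma, gamma)

      # Synthetic data standing in for a normalized spectrum around 1260 Angstroms.
      wave = np.linspace(1259.0, 1261.0, 400)
      rng = np.random.default_rng(1)
      flux = absorption_line(wave, 1260.0, 0.05, 0.03, 0.4, 1.0)
      flux += rng.normal(0, 0.01, wave.size)

      popt, _ = curve_fit(absorption_line, wave, flux,
                          p0=[1260.0, 0.05, 0.05, 0.3, 1.0])
      model = absorption_line(wave, *popt)

      # Equivalent width: integrate the fractional depth below the fitted continuum.
      continuum = popt[4]
      step = wave[1] - wave[0]
      equivalent_width = np.sum((continuum - model) / continuum) * step
      print(f"center = {popt[0]:.3f}, EW = {equivalent_width:.4f}")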

  18. SpecViz: Interactive Spectral Data Analysis

    NASA Astrophysics Data System (ADS)

    Earl, Nicholas Michael; STScI

    2016-06-01

    The astronomical community is about to enter a new generation of scientific enterprise. With next-generation instrumentation and advanced capabilities, the need has arisen to equip astronomers with the necessary tools to deal with large, multi-faceted data. The Space Telescope Science Institute has initiated a data analysis forum for the creation, development, and maintenance of software tools for the interpretation of these new data sets. SpecViz is a spectral 1-D interactive visualization and analysis application built with Python in an open source development environment. A user-friendly GUI allows for a fast, interactive approach to spectral analysis. SpecViz supports handling of unique and instrument-specific data, incorporation of advanced spectral unit handling and conversions in a flexible, high-performance interactive plotting environment. Active spectral feature analysis is possible through interactive measurement and statistical tools. It can be used to build wide-band SEDs, with the capability of combining or overplotting data products from various instruments. SpecViz sports advanced toolsets for filtering and detrending spectral lines; identifying, isolating, and manipulating spectral features; as well as utilizing spectral templates for renormalizing data in an interactive way. SpecViz also includes a flexible model fitting toolset that allows for multi-component models, as well as custom models, to be used with various fitting and decomposition routines. SpecViz also features robust extension via custom data loaders and connection to the central communication system underneath the interface for more advanced control. Incorporation with Jupyter notebooks via connection with the active iPython kernel allows for SpecViz to be used in addition to a user’s normal workflow without demanding the user drastically alter their method of data analysis. In addition, SpecViz allows the interactive analysis of multi-object spectroscopy in the same straight-forward, consistent way. Through the development of such tools, STScI hopes to unify astronomical data analysis software for JWST and other instruments, allowing for efficient, reliable, and consistent scientific results.

  19. Micromachined pressure sensors: Review and recent developments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eaton, W.P.; Smith, J.H.

    1997-03-01

    Since the discovery of piezoresistivity in silicon in the mid 1950s, silicon-based pressure sensors have been widely produced. Micromachining technology has greatly benefited from the success of the integrated circuits industry, borrowing materials, processes, and toolsets. Because of this, microelectromechanical systems (MEMS) are now poised to capture large segments of existing sensor markets and to catalyze the development of new markets. Given the emerging importance of MEMS, it is instructive to review the history of micromachined pressure sensors, and to examine new developments in the field. Pressure sensors will be the focus of this paper, starting from metal diaphragm sensors with bonded silicon strain gauges, and moving to present developments of surface-micromachined, optical, resonant, and smart pressure sensors. Considerations for diaphragm design will be discussed in detail, as well as additional considerations for capacitive and piezoresistive devices.

  20. Development of Near-Isogenic Lines in a Parthenogenetically Reproduced Thrips Species, Frankliniella occidentalis

    PubMed Central

    Yuan, Guangdi; Wan, Yanran; Li, Xiaoyu; He, Bingqing; Zhang, Youjun; Xu, Baoyun; Wang, Shaoli; Xie, Wen; Zhou, Xuguo; Wu, Qingjun

    2017-01-01

    Although near-isogenic lines (NILs) can standardize genetic backgrounds among individuals, the approach has never been applied in parthenogenetically reproduced animals. Here, through multiple rounds of backcrossing and spinosad screening, we generated spinosad-resistant NILs in the western flower thrips, Frankliniella occidentalis (Pergande) (Thysanoptera: Thripidae), with a haplo-diploid reproduction system. The resultant F. occidentalis NIL-R strain maintained a resistance ratio over 30,000-fold, which was comparable to its parental resistant strain, Spin-R. More importantly, F. occidentalis NIL-R shared 98.90% genetic similarity with its susceptible parental strain Ivf03. By developing this toolset, we are able to segregate individual resistance and facilitate the mechanistic study of insecticide resistances in phloem-feeding arthropods, a group of devastating pest species reproducing sexually as well as asexually. PMID:28348528

  1. Development of Near-Isogenic Lines in a Parthenogenetically Reproduced Thrips Species, Frankliniella occidentalis.

    PubMed

    Yuan, Guangdi; Wan, Yanran; Li, Xiaoyu; He, Bingqing; Zhang, Youjun; Xu, Baoyun; Wang, Shaoli; Xie, Wen; Zhou, Xuguo; Wu, Qingjun

    2017-01-01

    Although near-isogenic lines (NILs) can standardize genetic backgrounds among individuals, the approach has never been applied in parthenogenetically reproduced animals. Here, through multiple rounds of backcrossing and spinosad screening, we generated spinosad-resistant NILs in the western flower thrips, Frankliniella occidentalis (Pergande) (Thysanoptera: Thripidae), with a haplo-diploid reproduction system. The resultant F. occidentalis NIL-R strain maintained a resistance ratio over 30,000-fold, which was comparable to its parental resistant strain, Spin-R. More importantly, F. occidentalis NIL-R shared 98.90% genetic similarity with its susceptible parental strain Ivf03. By developing this toolset, we are able to segregate individual resistance and facilitate the mechanistic study of insecticide resistances in phloem-feeding arthropods, a group of devastating pest species reproducing sexually as well as asexually.

  2. The Automated Instrumentation and Monitoring System (AIMS): Design and Architecture. 3.2

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Schmidt, Melisa; Schulbach, Cathy; Bailey, David (Technical Monitor)

    1997-01-01

    Whether a researcher is designing the 'next parallel programming paradigm', another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of such information can help computer and software architects to capture, and therefore exploit, behavioral variations among/within various parallel programs to take advantage of specific hardware characteristics. A software tool-set that facilitates performance evaluation of parallel applications on multiprocessors has been put together at NASA Ames Research Center under the sponsorship of NASA's High Performance Computing and Communications Program over the past five years. The Automated Instrumentation and Monitoring System (AIMS) has three major software components: a source code instrumentor which automatically inserts active event recorders into program source code before compilation; a run-time performance monitoring library which collects performance data; and a visualization tool-set which reconstructs program execution based on the data collected. Besides being used as a prototype for developing new techniques for instrumenting, monitoring and presenting parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Currently, the execution of FORTRAN and C programs on the Intel Paragon and PALM workstations can be automatically instrumented and monitored. Performance data thus collected can be displayed graphically on various workstations. The process of performance tuning with AIMS will be illustrated using various NAS Parallel Benchmarks. This report includes a description of the internal architecture of AIMS and a listing of the source code.
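
    The instrument-then-monitor idea can be shown at toy scale: wrap routines so that entry and exit events are recorded with timestamps, and reconstruct timings afterwards. This is only a conceptual Python sketch of the pattern, not the AIMS instrumentor, which rewrites Fortran and C source before compilation.

      import time
      from functools import wraps

      TRACE = []  # (event, routine name, timestamp) records collected at run time

      def instrument(fn):
          """Wrap a routine so entry/exit events are appended to the trace."""
          @wraps(fn)
          def wrapper(*args, **kwargs):
              TRACE.append(("enter", fn.__name__, time.perf_counter()))
              try:
                  return fn(*args, **kwargs)
              finally:
                  TRACE.append(("exit", fn.__name__, time.perf_counter()))
          return wrapper

      @instrument
      def solve_block(n):
          return sum(i * i for i in range(n))

      solve_block(100_000)

      # Post-mortem reconstruction: print per-call durations from the trace.
      starts = {}
      for event, name, stamp in TRACE:
          if event == "enter":
              starts[name] = stamp
          else:
              print(f"{name}: {stamp - starts[name]:.6f} s")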

  3. Semantic Overlays in Educational Content Networks--The hylOs Approach

    ERIC Educational Resources Information Center

    Engelhardt, Michael; Hildebrand, Arne; Lange, Dagmar; Schmidt, Thomas C.

    2006-01-01

    Purpose: The paper aims to introduce an educational content management system, Hypermedia Learning Objects System (hylOs), which is fully compliant to the IEEE LOM eLearning object metadata standard. Enabled through an advanced authoring toolset, hylOs allows the definition of instructional overlays of a given eLearning object mesh.…

  4. Solving Common Mathematical Problems

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Mathematical Solutions Toolset is a collection of five software programs that rapidly solve some common mathematical problems. The programs consist of a set of Microsoft Excel worksheets. The programs provide for entry of input data and display of output data in a user-friendly, menu-driven format, and for automatic execution once the input data has been entered.

  5. A method for generating new datasets based on copy number for cancer analysis.

    PubMed

    Kim, Shinuk; Kon, Mark; Kang, Hyunsik

    2015-01-01

    New data sources for the analysis of cancer data are rapidly supplementing the large number of gene-expression markers used for current methods of analysis. Significant among these new sources are copy number variation (CNV) datasets, which typically enumerate several hundred thousand CNVs distributed throughout the genome. Several useful algorithms allow systems-level analyses of such datasets. However, these rich data sources have not yet been analyzed as deeply as gene-expression data. To address this issue, the extensive toolsets used for analyzing expression data in cancerous and noncancerous tissue (e.g., gene set enrichment analysis and phenotype prediction) could be redirected to extract a great deal of predictive information from CNV data, in particular those derived from cancers. Here we present a software package capable of preprocessing standard Agilent copy number datasets into a form to which essentially all expression analysis tools can be applied. We illustrate the use of this toolset in predicting the survival time of patients with ovarian cancer or glioblastoma multiforme and also provide an analysis of gene- and pathway-level deletions in these two types of cancer.
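
    One concrete form of the preprocessing the abstract describes is collapsing segment-level copy-number calls into a gene-by-sample matrix, which is the shape expression-analysis tools expect. The sketch below uses pandas with entirely hypothetical column names and toy coordinates; it is not the published package.

      import pandas as pd

      # Hypothetical segment-level copy-number calls (one row per segment per sample).
      segments = pd.DataFrame({
          "sample": ["S1", "S1", "S2", "S2"],
          "chrom":  ["chr7", "chr7", "chr7", "chr7"],
          "start":  [100, 5000, 100, 5000],
          "end":    [4000, 9000, 4000, 9000],
          "log2":   [0.8, -1.2, 0.1, -0.9],
      })

      # Hypothetical gene annotation.
      genes = pd.DataFrame({
          "gene":  ["EGFR", "MET"],
          "chrom": ["chr7", "chr7"],
          "start": [500, 6000],
          "end":   [1500, 7000],
      })

      def gene_level_matrix(segments, genes):
          """Assign each gene the mean log2 copy-number ratio of the segments that
          overlap it, producing a gene-by-sample matrix like an expression table."""
          rows = {}
          for _, gene in genes.iterrows():
              hits = segments[(segments.chrom == gene.chrom) &
                              (segments.start <= gene.end) &
                              (segments.end >= gene.start)]
              rows[gene.gene] = hits.groupby("sample")["log2"].mean()
          return pd.DataFrame(rows).T  # rows: genes, columns: samples

      print(gene_level_matrix(segments, genes))

    Once the data are in this matrix form, enrichment, clustering, or survival analyses built for expression tables can be applied to the copy-number data directly, which is the redirection the abstract proposes.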

  6. Image formation simulation for computer-aided inspection planning of machine vision systems

    NASA Astrophysics Data System (ADS)

    Irgenfried, Stephan; Bergmann, Stephan; Mohammadikaji, Mahsa; Beyerer, Jürgen; Dachsbacher, Carsten; Wörn, Heinz

    2017-06-01

    In this work, a simulation toolset for Computer Aided Inspection Planning (CAIP) of systems for automated optical inspection (AOI) is presented along with a versatile two-robot-setup for verification of simulation and system planning results. The toolset helps to narrow down the large design space of optical inspection systems in interaction with a system expert. The image formation taking place in optical inspection systems is simulated using GPU-based real time graphics and high quality off-line-rendering. The simulation pipeline allows a stepwise optimization of the system, from fast evaluation of surface patch visibility based on real time graphics up to evaluation of image processing results based on off-line global illumination calculation. A focus of this work is on the dependency of simulation quality on measuring, modeling and parameterizing the optical surface properties of the object to be inspected. The applicability to real world problems is demonstrated by taking the example of planning a 3D laser scanner application. Qualitative and quantitative comparison results of synthetic and real images are presented.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Aiman; Laguna, Ignacio; Sato, Kento

    Future high-performance computing systems may face frequent failures with their rapid increase in scale and complexity. Resilience to faults has become a major challenge for large-scale applications running on supercomputers, which demands fault tolerance support for prevalent MPI applications. Among failure scenarios, process failures are one of the most severe issues as they usually lead to termination of applications. However, the widely used MPI implementations do not provide mechanisms for fault tolerance. We propose FTA-MPI (Fault Tolerance Assistant MPI), a programming model that provides support for failure detection, failure notification and recovery. Specifically, FTA-MPI exploits a try/catch model that enables failure localization and transparent recovery of process failures in MPI applications. We demonstrate FTA-MPI with synthetic applications and a molecular dynamics code CoMD, and show that FTA-MPI provides high programmability for users and enables convenient and flexible recovery of process failures.

  8. Cancer CRISPR Screens In Vivo.

    PubMed

    Chow, Ryan D; Chen, Sidi

    2018-05-01

    Clustered regularly interspaced short palindromic repeats (CRISPR) screening is a powerful toolset for investigating diverse biological processes. Most CRISPR screens to date have been performed with in vitro cultures or cellular transplant models. To interrogate cancer in animal models that more closely recapitulate the human disease, autochthonous direct in vivo CRISPR screens have recently been developed that can identify causative drivers in the native tissue microenvironment. By empowering multiplexed mutagenesis in fully immunocompetent animals, direct in vivo CRISPR screens enable the rapid generation of patient-specific avatars that can guide precision medicine. This Opinion article discusses the current status of in vivo CRISPR screens in cancer and offers perspectives on future applications. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Juggling Act: Re-Planning and Building on Observatory...Simultaneously!

    NASA Technical Reports Server (NTRS)

    Zavala, Eddie; Daws, Patricia

    2011-01-01

    SOFIA (Stratospheric Observatory for Infrared Astronomy) is a major SMD program that has been required to meet several requirements and implement major planning and business initiatives over the past 1 1/2 years, in the midst of system development and flight test phases. The program was required to implement JCL and EVM simultaneously, as well as undergo a major replan and Standing Review Board - and all without impacting technical schedule progress. The team developed innovative processes that met all the requirements, and improved Program Management process toolsets. The SOFIA team, being subject to all the typical budget constraints, found ways to leverage existing roles in new ways to meet the requirements without creating unmanageable overhead. The team developed strategies and value-added processes - such as improved risk identification, structured reserves management, cost/risk integration - so that the effort expended resulted in a positive return to the program.

  10. Satellite Systems Design/Simulation Environment: A Systems Approach to Pre-Phase A Design

    NASA Technical Reports Server (NTRS)

    Ferebee, Melvin J., Jr.; Troutman, Patrick A.; Monell, Donald W.

    1997-01-01

    A toolset for the rapid development of small satellite systems has been created. The objective of this tool is to support the definition of spacecraft mission concepts to satisfy a given set of mission and instrument requirements. The objective of this report is to provide an introduction to understanding and using the SMALLSAT Model. SMALLSAT is a computer-aided Phase A design and technology evaluation tool for small satellites. SMALLSAT enables satellite designers, mission planners, and technology program managers to observe the likely consequences of their decisions in terms of satellite configuration, non-recurring and recurring cost, and mission life cycle costs and availability statistics. It was developed by Princeton Synergetic, Inc. and User Systems, Inc. as a revision of the previous TECHSAT Phase A design tool, which modeled medium-sized Earth observation satellites. Both TECHSAT and SMALLSAT were developed for NASA.

  11. The Moment of Learning: Quantitative Analysis of Exemplar Gameplay Supports CyGaMEs Approach to Embedded Assessment

    ERIC Educational Resources Information Center

    Reese, Debbie Denise; Tabachnick, Barbara G.

    2010-01-01

    In this paper, the authors summarize a quantitative analysis demonstrating that the CyGaMEs toolset for embedded assessment of learning within instructional games measures growth in conceptual knowledge by quantifying player behavior. CyGaMEs stands for Cyberlearning through GaME-based, Metaphor Enhanced Learning Objects. Some scientists of…

  12. Tool-Use in a Blended Undergraduate Course: In Search of User Profiles

    ERIC Educational Resources Information Center

    Lust, Griet; Vandewaetere, Mieke; Ceulemans, Eva; Elen, Jan; Clarebout, Geraldine

    2011-01-01

    The popularity of today's blended courses in higher education is driven by the assumption that students are provided with a rich toolset that supports them in their learning process. However, little is known on how students actually use these tools and how this affects their performance for the course. The current study investigates how students…

  13. GOATS - Orbitology Component

    NASA Technical Reports Server (NTRS)

    Haber, Benjamin M.; Green, Joseph J.

    2010-01-01

    The GOATS Orbitology Component software was developed to specifically address the concerns presented by orbit analysis tools that are often written as stand-alone applications. These applications do not easily interface with standard JPL first-principles analysis tools, and have a steep learning curve due to their complicated nature. This toolset is written as a series of MATLAB functions, allowing seamless integration into existing JPL optical systems engineering modeling and analysis modules. The functions are completely open, and allow for advanced users to delve into and modify the underlying physics being modeled. Additionally, this software module fills an analysis gap, allowing for quick, high-level mission analysis trades without the need for detailed and complicated orbit analysis using commercial stand-alone tools. This software consists of a series of MATLAB functions to provide for geometric orbit-related analysis. This includes propagation of orbits to varying levels of generalization. In the simplest case, geosynchronous orbits can be modeled by specifying a subset of three orbit elements. The next case is a circular orbit, which can be specified by a subset of four orbit elements. The most general case is an arbitrary elliptical orbit specified by all six orbit elements. These orbits are all solved geometrically, under the basic problem of an object in circular (or elliptical) orbit around a rotating spheroid. The orbit functions output time series ground tracks, which serve as the basis for more detailed orbit analysis. This software module also includes functions to track the positions of the Sun, Moon, and arbitrary celestial bodies specified by right ascension and declination. Also included are functions to calculate line-of-sight geometries to ground-based targets, angular rotations and decompositions, and other line-of-sight calculations. The toolset allows for the rapid execution of orbit trade studies at the level of detail required for the early stage of mission concept development.
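
    The "circular orbit around a rotating spheroid" case can be sketched in a few lines. The function below is a spherical-Earth simplification written in Python rather than MATLAB, and it ignores the epoch sidereal angle and any perturbations, so it only illustrates the geometric ground-track idea, not the GOATS implementation.

      import numpy as np

      MU_EARTH = 398600.4418      # km^3/s^2, gravitational parameter
      OMEGA_EARTH = 7.2921159e-5  # rad/s, rotation rate
      R_EARTH = 6378.137          # km, equatorial radius

      def circular_ground_track(alt_km, inc_deg, raan_deg, t_seconds):
          """Sub-satellite latitude/longitude for a circular orbit over a
          rotating spherical Earth."""
          r = R_EARTH + alt_km
          mean_motion = np.sqrt(MU_EARTH / r**3)              # rad/s
          u = mean_motion * t_seconds                          # angle past ascending node
          inc = np.radians(inc_deg)
          lat = np.degrees(np.arcsin(np.sin(inc) * np.sin(u)))
          lon_inertial = np.radians(raan_deg) + np.arctan2(np.cos(inc) * np.sin(u), np.cos(u))
          lon = np.degrees(lon_inertial - OMEGA_EARTH * t_seconds)
          return lat, (lon + 180.0) % 360.0 - 180.0            # wrap to [-180, 180)

      # Two ~95-minute orbits of a 500 km, 51.6-degree inclination satellite.
      t = np.arange(0.0, 2 * 5700.0, 60.0)
      lat, lon = circular_ground_track(alt_km=500.0, inc_deg=51.6, raan_deg=0.0, t_seconds=t)
      print(lat[:3], lon[:3])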

  14. Development of a customizable software application for medical imaging analysis and visualization.

    PubMed

    Martinez-Escobar, Marisol; Peloquin, Catherine; Juhnke, Bethany; Peddicord, Joanna; Jose, Sonia; Noon, Christian; Foo, Jung Leng; Winer, Eliot

    2011-01-01

    Graphics technology has extended medical imaging tools to the hands of surgeons and doctors, beyond the radiology suite. However, a common issue in most medical imaging software is the added complexity for non-radiologists. This paper presents the development of a unique software toolset that is highly customizable and targeted at general physicians as well as medical specialists. The core functionality includes features such as viewing medical images in two- and three-dimensional representations, clipping, tissue windowing, and coloring. Additional features can be loaded in the form of 'plug-ins' such as tumor segmentation, tissue deformation, and surgical planning. This allows the software to be lightweight and easy to use while still giving the user the flexibility of adding the necessary features, thus catering to a wide range of users.
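
    The plug-in arrangement described above can be reduced to a small registry pattern: the core viewer stays lightweight, while optional features register themselves and are loaded on demand. This Python sketch is a generic illustration of that design choice, not the paper's implementation, and every name in it is hypothetical.

      # Minimal plug-in registry: optional features attach themselves to the core viewer.
      PLUGINS = {}

      def register(name):
          """Decorator that records a feature class in the registry under `name`."""
          def wrap(cls):
              PLUGINS[name] = cls
              return cls
          return wrap

      @register("tumor_segmentation")
      class TumorSegmentation:
          def run(self, volume_shape):
              print(f"segmenting volume of shape {volume_shape}")

      class Viewer:
          """Core application: instantiates only the plug-ins a user enables."""
          def __init__(self, enabled):
              self.features = {name: PLUGINS[name]() for name in enabled if name in PLUGINS}

          def run_feature(self, name, volume_shape):
              self.features[name].run(volume_shape)

      viewer = Viewer(enabled=["tumor_segmentation"])
      viewer.run_feature("tumor_segmentation", (512, 512, 128))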

  15. Integrated Multidisciplinary Optimization Objects

    NASA Technical Reports Server (NTRS)

    Alston, Katherine

    2014-01-01

    OpenMDAO is an open-source MDAO framework. It is used to develop an integrated analysis and design environment for engineering challenges. This Phase II project integrated additional modules and design tools into OpenMDAO to perform discipline-specific analysis across multiple flight regimes at varying levels of fidelity. It also showcased a refined system architecture that allows the system to be less customized to a specific configuration (i.e., system and configuration separation). By delivering a capable and validated MDAO system along with a set of example applications to be used as a template for future users, this work greatly expands NASA's high-fidelity, physics-based MDAO capabilities and enables the design of revolutionary vehicles in a cost-effective manner. This proposed work complements M4 Engineering's expertise in developing modeling and simulation toolsets that solve relevant subsonic, supersonic, and hypersonic demonstration applications.
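
    OpenMDAO is a real, openly available framework, so a small concrete example helps fix ideas. Note that the snippet below uses the modern OpenMDAO 3.x Python API, which postdates the project described here, and is meant only to show the general wiring of components, design variables, and a driver, not this project's models.

      import openmdao.api as om

      # Minimal optimization: minimize a paraboloid in two design variables.
      prob = om.Problem()
      prob.model.add_subsystem(
          "parab",
          om.ExecComp("f = (x - 3.0)**2 + x*y + (y + 4.0)**2 - 3.0"),
          promotes=["*"],
      )
      prob.model.add_design_var("x", lower=-50.0, upper=50.0)
      prob.model.add_design_var("y", lower=-50.0, upper=50.0)
      prob.model.add_objective("f")

      prob.driver = om.ScipyOptimizeDriver()
      prob.driver.options["optimizer"] = "SLSQP"

      prob.setup()
      prob.set_val("x", 0.0)
      prob.set_val("y", 0.0)
      prob.run_driver()
      print(prob.get_val("x"), prob.get_val("y"), prob.get_val("f"))

    Discipline-specific analyses plug in as additional components in place of the ExecComp, which is how a framework like this separates the system architecture from any one vehicle configuration.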

  16. Sighten Final Technical Report DEEE0006690 Deploying an integrated and comprehensive solar financing software platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Conlan

    Over the project, Sighten built a comprehensive software-as-a-service (SaaS) platform to automate and streamline the residential solar financing workflow. Before the project period, significant time and money was spent by companies on front-end tools related to system design and proposal creation, but comparatively few resources were available to support the many back-end calculations and data management processes that underpin third party financing. Without a tool like Sighten, the solar financing processes involved passing information from the homeowner prospect into separate tools for system design, financing, and then later to reporting tools including Microsoft Excel, CRM software, in-house software, outside software, and offline, manual processes. Passing data between tools and attempting to connect disparate systems results in inefficiency and inaccuracy for the industry. Sighten was built to consolidate all financial and solar-related calculations in a single software platform. It significantly improves upon the accuracy of these calculations and exposes sophisticated new analysis tools resulting in a rigorous, efficient and cost-effective toolset for scaling residential solar. Widely deploying a platform like Sighten’s significantly and immediately impacts the residential solar space in several important ways: 1) standardizing and improving the quality of all quantitative calculations involved in the residential financing process, most notably project finance, system production and reporting calculations; 2) representing a true step change in terms of reporting and analysis capabilities by maintaining more accurate data and exposing sophisticated tools around simulation, tranching, and financial reporting, among others, to all stakeholders in the space; 3) allowing a broader group of developers/installers/finance companies to access the capital markets by providing an out-of-the-box toolset that handles the execution of running investor capital through a rooftop solar financing program. Standardizing and improving all calculations, improving data quality, and exposing new analysis tools previously unavailable affects investment in the residential space in several important ways: 1) lowering the cost of capital for existing capital providers by mitigating uncertainty and de-risking the solar asset class; 2) attracting new, lower cost investors to the solar asset class as reporting and data quality resemble standards of more mature asset classes; 3) increasing the prevalence of liquidity options for investors through back leverage, securitization, or secondary sale by providing the tools necessary for lenders, ratings agencies, etc. to properly understand a portfolio of residential solar assets. During the project period, Sighten successfully built and scaled a commercially ready tool for the residential solar market. The software solution built by Sighten has been deployed with key target customer segments identified in the award deliverables: solar installers, solar developers/channel managers, and solar financiers, including lenders. Each of these segments greatly benefits from the availability of the Sighten toolset.

  17. Development of a General Form CO2 and Brine Flux Input Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mansoor, K.; Sun, Y.; Carroll, S.

    2014-08-01

    The National Risk Assessment Partnership (NRAP) project is developing a science-based toolset for the quantitative analysis of the potential risks associated with changes in groundwater chemistry from CO2 injection. In order to address uncertainty probabilistically, NRAP is developing efficient, reduced-order models (ROMs) as part of its approach. These ROMs are built from detailed, physics-based process models to provide confidence in the predictions over a range of conditions. The ROMs are designed to reproduce accurately the predictions from the computationally intensive process models at a fraction of the computational time, thereby allowing the utilization of Monte Carlo methods to probe variability in key parameters. This report presents the procedures used to develop a generalized model for CO2 and brine leakage fluxes based on the output of a numerical wellbore simulation. The resulting generalized parameters and ranges reported here will be used for the development of third-generation groundwater ROMs.
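
    The reduced-order-model idea can be sketched generically: sample a detailed simulator, fit a cheap response surface to the (input, output) pairs, and evaluate that surrogate inside a Monte Carlo loop. The inputs, ranges, and response below are synthetic placeholders, not NRAP wellbore results.

      import numpy as np

      rng = np.random.default_rng(0)

      # Placeholder "simulator" samples: log10 permeability and overpressure in,
      # a leakage-flux-like response out (a synthetic linear model plus noise).
      log_perm = rng.uniform(-14.0, -11.0, 200)
      overpressure = rng.uniform(5.0, 30.0, 200)
      flux = 2.0 + 1.5 * log_perm + 0.08 * overpressure + rng.normal(0.0, 0.05, 200)

      # Fit a first-order-plus-interaction response surface by least squares.
      design = np.column_stack([np.ones_like(log_perm), log_perm, overpressure,
                                log_perm * overpressure])
      coef, *_ = np.linalg.lstsq(design, flux, rcond=None)

      def rom(log_perm, overpressure):
          """Cheap surrogate standing in for the detailed wellbore simulation."""
          return coef @ np.array([1.0, log_perm, overpressure, log_perm * overpressure])

      # Monte Carlo over the uncertain inputs using the surrogate only.
      samples = [rom(rng.uniform(-14.0, -11.0), rng.uniform(5.0, 30.0)) for _ in range(10_000)]
      print(np.percentile(samples, [5, 50, 95]))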

  18. Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2013)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Syamlal, Madhava; Cottrell, Roger

    2013-09-30

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers, and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, Boston University and the University of Texas at Austin) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 13, CCSI announced the initial release of its first set of computational tools and models during the October 2012 meeting of its Industry Advisory Board. This initial release led to five companies licensing the CCSI Toolset under a Test and Evaluation Agreement this year. By the end of FY13, the CCSI Technical Team had completed development of an updated suite of computational tools and models. The list below summarizes the new and enhanced toolset components that were released following comprehensive testing during October 2013.
    1. FOQUS, the Framework for Optimization and Quantification of Uncertainty and Sensitivity. The package includes the FOQUS graphical user interface (GUI), a simulation-based optimization engine, the Turbine Client, and heat integration capabilities. There is also an updated simulation interface and a new configuration GUI for connecting Aspen Plus or Aspen Custom Modeler (ACM) simulations to FOQUS and the Turbine Science Gateway.
    2. A new MFIX-based Computational Fluid Dynamics (CFD) model to predict particle attrition.
    3. A new dynamic reduced model (RM) builder, which generates computationally efficient RMs of the behavior of a dynamic system.
    4. A completely re-written version of the algebraic surrogate model builder for optimization (ALAMO); a rough illustration of the surrogate-modeling idea follows this list. The new version is several orders of magnitude faster than the initial release and eliminates the MATLAB dependency.
    5. A new suite of high-resolution filtered models for the hydrodynamics associated with horizontal cylindrical objects in a flow path.
    6. The new Turbine Science Gateway (Cluster), which supports FOQUS for running multiple simulations for optimization or UQ using a local computer or cluster.
    7. A new statistical tool (BSS-ANOVA-UQ) for calibration and validation of CFD models.
    8. A new basic data submodel in Aspen Plus format for a representative high-viscosity capture solvent, the 2-MPZ system.
    9. An updated RM tool for CFD (REVEAL) that can create an RM from MFIX. A new lightweight, stand-alone version will be available in late 2013.
    10. An updated RM integration tool to convert the RM from REVEAL into a CAPE-OPEN or ACM model for use in a process simulator.
    11. An updated suite of unified steady-state and dynamic process models for solid sorbent carbon capture, including bubbling fluidized bed and moving bed reactors.
    12. An updated and unified set of compressor models, including a steady-state design point model and a dynamic model with surge detection.
    13. A new framework for the synthesis and optimization of coal oxycombustion power plants using advanced optimization algorithms. This release focuses on modeling and optimization of a cryogenic air separation unit (ASU).
    14. A new technical risk model in spreadsheet format.
    15. An updated version of the sorbent kinetic/equilibrium model for parameter estimation for the 1st-generation sorbent model.
    16. An updated process synthesis superstructure model to determine optimal process configurations utilizing surrogate models from ALAMO for adsorption and regeneration in a solid sorbent process.
    17. Validation models for the NETL Carbon Capture Unit utilizing sorbent AX. Additional validation models will be available for sorbent 32D in 2014.
    18. An updated hollow fiber membrane model and system example for carbon capture.
    19. An updated reference power plant model in Thermoflex that includes additional steam extraction and reinjection points to enable the heat integration module.
    20. An updated financial risk model in spreadsheet format.
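
    The algebraic surrogate model builder mentioned in item 4 fits compact algebraic expressions to samples from an expensive simulation. As a rough illustration of that general idea only, not the ALAMO algorithm or its interface, a minimal Python sketch might fit a small fixed basis by least squares; the basis terms and sample data below are assumptions chosen purely for the example.

        # Minimal sketch of the surrogate-modeling idea: fit a small algebraic
        # basis to samples of an "expensive" model. Not the ALAMO algorithm.
        import numpy as np

        def fit_surrogate(x, y):
            """Least-squares fit of y ~ c0 + c1*x + c2*x^2 + c3*log(x)."""
            basis = np.column_stack([np.ones_like(x), x, x**2, np.log(x)])
            coeffs, *_ = np.linalg.lstsq(basis, y, rcond=None)
            return coeffs

        # Hypothetical simulation output sampled at a few points.
        x = np.linspace(1.0, 10.0, 25)
        y = 3.0 + 0.5 * x + 2.0 * np.log(x) + np.random.normal(0, 0.01, x.size)

        print(fit_surrogate(x, y))  # recovered coefficients for the assumed basis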

  19. APT: what it has enabled us to do

    NASA Astrophysics Data System (ADS)

    Blacker, Brett S.; Golombek, Daniel

    2004-09-01

    With the development and operations deployment of the Astronomer's Proposal Tool (APT), Hubble Space Telescope (HST) proposers have been provided with an integrated toolset for Phase I and Phase II. This toolset consists of editors for filling out proposal information, an Orbit Planner for determining observation feasibility, a Visit Planner for determining schedulability, diagnostic and reporting tools and an integrated Visual Target Tuner (VTT) for viewing exposure specifications. The VTT can also overlay HST's field of view on user-selected Flexible Image Transport System (FITS) images, perform bright object checks and query the HST archive. In addition to these direct benefits for the HST user, STScI's internal Phase I process has been able to take advantage of the APT products. APT has enabled a substantial streamlining of the process and software processing tools, which compressed the Phase I to Phase II schedule by three months, allowing observations to be scheduled earlier and thus further benefiting HST observers. Some of the improvements to our process include: creating a compact disk (CD) of Phase I products; being able to print all proposals on the day of the deadline; linking the proposal in Portable Document Format (PDF) with a database; and being able to run all Phase I software on a single platform. In this paper we will discuss the operational results of using APT for HST's Cycles 12 and 13 Phase I process and will show the improvements for the users and the overall process that allows STScI to obtain scientific results with HST three months earlier than in previous years. We will also show how APT can be and is being used for multiple missions.

  20. Subsetting Tools for Enabling Easy Access to International Airborne Chemistry Data

    NASA Astrophysics Data System (ADS)

    Northup, E. A.; Chen, G.; Quam, B. M.; Beach, A. L., III; Silverman, M. L.; Early, A. B.

    2017-12-01

    In response to the Research Opportunities in Earth and Space Science (ROSES) 2015 release announcement for Advancing Collaborative Connections for Earth System Science (ACCESS), researchers at NASA Langley Research Center (LaRC) proposed to extend the capabilities of the existing Toolsets for Airborne Data (TAD) to include subsetting functionality to allow for easier access to international airborne field campaign data. Airborne field studies are commonly used to gain a detailed understanding of atmospheric processes for scientific research on international climate change and air quality issues. To accommodate the rigorous process for manipulating airborne field study chemistry data, and to lessen barriers for researchers, TAD was created with the ability to geolocate data from various sources measured on different time scales from a single flight. The analysis of airborne chemistry data typically requires data subsetting, which can be challenging and resource-intensive for end users. In an effort to streamline this process, new data subsetting features and updates to the current database model will be added to the TAD toolset. These will include two subsetters: a temporal and spatial subsetter, and a vertical profile subsetter. The temporal and spatial subsetter will allow users to focus on data from a specific location and/or time period. The vertical profile subsetter will retrieve data collected during an individual aircraft ascent or descent spiral. These new web-based tools will allow for automation of the typically labor-intensive manual data subsetting process, which will provide users with data tailored to their specific research interests. The system has been designed to allow for new in-situ airborne missions to be added as they become available, with only minor pre-processing required. The development of these enhancements will be discussed in this presentation.
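
    The two subsetting operations described above are easy to picture in code. The following is a hedged sketch using pandas with made-up column names (time, lat, lon, altitude); it is not the TAD schema or implementation, and the monotonic-altitude test is only a crude stand-in for real spiral detection.

        # Hedged sketch of temporal/spatial and vertical-profile subsetting of
        # airborne data; column names and the spiral heuristic are assumptions.
        import pandas as pd

        def subset_temporal_spatial(df, t0, t1, lat_box, lon_box):
            """Keep records inside a time window and a lat/lon bounding box."""
            mask = (
                (df["time"] >= t0) & (df["time"] <= t1)
                & df["lat"].between(*lat_box)
                & df["lon"].between(*lon_box)
            )
            return df[mask]

        def subset_vertical_profile(df, climb=True):
            """Very rough proxy for an ascent/descent: monotonic altitude change."""
            dalt = df["altitude"].diff().fillna(0)
            return df[dalt > 0] if climb else df[dalt < 0]

        flight = pd.DataFrame({
            "time": pd.date_range("2016-08-01 10:00", periods=6, freq="min"),
            "lat": [36.0, 36.2, 36.4, 36.6, 36.8, 37.0],
            "lon": [-76.0, -76.1, -76.2, -76.3, -76.4, -76.5],
            "altitude": [500, 1500, 2500, 2000, 1200, 600],
        })
        print(subset_temporal_spatial(flight, "2016-08-01 10:01", "2016-08-01 10:04",
                                      (36.1, 36.9), (-76.6, -76.0)))
        print(subset_vertical_profile(flight, climb=False))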

  1. Acceleration of atmospheric Cherenkov telescope signal processing to real-time speed with the Auto-Pipe design system

    NASA Astrophysics Data System (ADS)

    Tyson, Eric J.; Buckley, James; Franklin, Mark A.; Chamberlain, Roger D.

    2008-10-01

    The imaging atmospheric Cherenkov technique for high-energy gamma-ray astronomy is emerging as an important new approach for studying the high-energy universe. Current experiments have data rates of ≈20 TB/year and duty cycles of about 10%. In the future, more sensitive experiments may produce up to 1000 TB/year. The data analysis task for these experiments requires keeping up with this data rate in close to real-time. Such data analysis is a classic example of a streaming application with very high performance requirements. This class of application often benefits greatly from the use of non-traditional approaches for computation, including special purpose hardware (FPGAs and ASICs) or sophisticated parallel processing techniques. However, designing, debugging, and deploying to these architectures is difficult and thus they are not widely used by the astrophysics community. This paper presents the Auto-Pipe design toolset that has been developed to address many of the difficulties in taking advantage of complex streaming computer architectures for such applications. Auto-Pipe incorporates a high-level coordination language, functional and performance simulation tools, and the ability to deploy applications to sophisticated architectures. Using the Auto-Pipe toolset, we have implemented the front-end portion of an imaging Cherenkov data analysis application, suitable for real-time or offline analysis. The application operates on data from the VERITAS experiment, and shows how Auto-Pipe can greatly ease performance optimization and application deployment on a wide variety of platforms. We demonstrate a performance improvement over a traditional software approach of 32x using an FPGA solution and 3.6x using a multiprocessor based solution.

  2. A System for Integrated Reliability and Safety Analyses

    NASA Technical Reports Server (NTRS)

    Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Coumeri, Marc; Scheidler, Peter, Jr.; Bonesteel, Charles

    1999-01-01

    We present an integrated reliability and aviation safety analysis tool. The reliability models for selected infrastructure components of the air traffic control system are described. The results of this model are used to evaluate the likelihood of seeing outcomes predicted by simulations with failures injected. We discuss the design of the simulation model, and the user interface to the integrated toolset.

  3. Design Principles and First Educational Experiments of pR, a Platform to Infer Geo-Referenced Itineraries from Travel Stories

    ERIC Educational Resources Information Center

    Loustau, Pierre; Nodenot, Thierry; Gaio, Mauro

    2009-01-01

    Purpose: The purpose of this paper is to present a computational approach and a toolset to infer spatial displacements as they occur in route narrative documents and report on first experiments done to produce computer-aided learning (CAL) applications and instructional design editors that exploit the inferred georeferenced itineraries.…

  4. Constructing Virtual Training Demonstrations

    DTIC Science & Technology

    2008-12-01

    virtual environments have been shown to be effective for training, and distributed game-based architectures contribute an added benefit of wide...investigation of how a demonstration authoring toolset can be constructed from existing virtual training environments using 3-D multiplayer gaming...intelligent agents project to create AI middleware for simulations and videogames. The result was SimBionic®, which enables users to graphically author

  5. VARiD: a variation detection framework for color-space and letter-space platforms.

    PubMed

    Dalca, Adrian V; Rumble, Stephen M; Levy, Samuel; Brudno, Michael

    2010-06-15

    High-throughput sequencing (HTS) technologies are transforming the study of genomic variation. The various HTS technologies have different sequencing biases and error rates, and while most HTS technologies sequence the residues of the genome directly, generating base calls for each position, the Applied Biosystems' SOLiD platform generates dibase-coded (color space) sequences. While combining data from the various platforms should increase the accuracy of variation detection, to date there are only a few tools that can identify variants from color space data, and none that can analyze color space and regular (letter space) data together. We present VARiD, a probabilistic method for variation detection from both letter- and color-space reads simultaneously. VARiD is based on a hidden Markov model and uses the forward-backward algorithm to accurately identify heterozygous, homozygous and tri-allelic SNPs, as well as micro-indels. Our analysis shows that VARiD performs better than the AB SOLiD toolset at detecting variants from color-space data alone, and improves the calls dramatically when letter- and color-space reads are combined. The toolset is freely available at http://compbio.cs.utoronto.ca/varid.
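
    VARiD's core is a hidden Markov model evaluated with the forward-backward algorithm. The sketch below shows a generic discrete-HMM forward-backward pass that returns posterior state probabilities; it illustrates the algorithm only, and the two-state model and parameters are placeholders rather than VARiD's actual emission or transition structure.

        # Generic forward-backward pass for a small discrete HMM (illustrative only).
        import numpy as np

        def forward_backward(pi, A, B, obs):
            """pi: initial probs (S,), A: transitions (S,S), B: emissions (S,O),
            obs: list of observed symbol indices. Returns posterior state probs."""
            S, T = len(pi), len(obs)
            alpha = np.zeros((T, S))
            beta = np.zeros((T, S))
            alpha[0] = pi * B[:, obs[0]]
            for t in range(1, T):                     # forward recursion
                alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            beta[-1] = 1.0
            for t in range(T - 2, -1, -1):            # backward recursion
                beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
            gamma = alpha * beta
            return gamma / gamma.sum(axis=1, keepdims=True)

        pi = np.array([0.5, 0.5])
        A = np.array([[0.9, 0.1], [0.2, 0.8]])
        B = np.array([[0.8, 0.2], [0.3, 0.7]])
        print(forward_backward(pi, A, B, obs=[0, 1, 0, 0]))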

  6. Integrating Flight Dynamics & Control Analysis and Simulation in Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben; Berger, Tom; Tischler, Mark B.; Theodore, Colin R; Elmore, Josh; Gallaher, Andrew; Tobias, Eric L.

    2016-01-01

    The development of a toolset, SIMPLI-FLYD ('SIMPLIfied FLight dynamics for conceptual Design') is described. SIMPLI-FLYD is a collection of tools that perform flight dynamics and control modeling and analysis of rotorcraft conceptual designs including a capability to evaluate the designs in an X-Plane-based real-time simulation. The establishment of this framework is now facilitating the exploration of this new capability, in terms of modeling fidelity and data requirements, and the investigation of which stability and control and handling qualities requirements are appropriate for conceptual design. Illustrative design variation studies for single main rotor and tiltrotor vehicle configurations show sensitivity of the stability and control characteristics and an approach to highlight potential weight savings by identifying over-design.

  7. Verification of a Multiphysics Toolkit against the Magnetized Target Fusion Concept

    NASA Technical Reports Server (NTRS)

    Thomas, Scott; Perrell, Eric; Liron, Caroline; Chiroux, Robert; Cassibry, Jason; Adams, Robert B.

    2005-01-01

    In the spring of 2004, the Advanced Concepts team at MSFC embarked on an ambitious project to develop a suite of modeling routines that would interact with one another. The tools would each numerically model a portion of any advanced propulsion system. The tools were divided by physics categories, hence the name multiphysics toolset. Currently most of the anticipated modeling tools have been created and integrated. Results are given in this paper for both a quarter nozzle with chemically reacting flow and the interaction of two plasma jets representative of a Magnetized Target Fusion device. The results have not yet been calibrated against real data, but this paper demonstrates the current capability of the multiphysics tool and planned future enhancements.

  8. Safety Case Patterns: Theory and Applications

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Pai, Ganesh J.

    2015-01-01

    We develop the foundations for a theory of patterns of safety case argument structures, clarifying the concepts involved in pattern specification, including choices, labeling, and well-founded recursion. We specify six new patterns in addition to those existing in the literature. We give a generic way to specify the data required to instantiate patterns and a generic algorithm for their instantiation. This generalizes earlier work on generating argument fragments from requirements tables. We describe an implementation of these concepts in AdvoCATE, the Assurance Case Automation Toolset, showing how patterns are defined and can be instantiated. In particular, we describe how our extended notion of patterns can be specified, how they can be instantiated in an interactive manner, and, finally, how they can be automatically instantiated using our algorithm.

  9. Building Safer Systems With SpecTRM

    NASA Technical Reports Server (NTRS)

    2003-01-01

    System safety, an integral component in software development, often poses a challenge to engineers designing computer-based systems. While the relaxed constraints on software design allow for increased power and flexibility, this flexibility introduces more possibilities for error. As a result, system engineers must identify the design constraints necessary to maintain safety and ensure that the system and software design enforces them. Safeware Engineering Corporation, of Seattle, Washington, provides the information, tools, and techniques to accomplish this task with its Specification Tools and Requirements Methodology (SpecTRM). NASA assisted in developing this engineering toolset by awarding the company several Small Business Innovation Research (SBIR) contracts with Ames Research Center and Langley Research Center. The technology benefits NASA through its applications for Space Station rendezvous and docking. SpecTRM aids system and software engineers in developing specifications for large, complex safety critical systems. The product enables engineers to find errors early in development so that they can be fixed with the lowest cost and impact on the system design. SpecTRM traces both the requirements and design rationale (including safety constraints) throughout the system design and documentation, allowing engineers to build required system properties into the design from the beginning, rather than emphasizing assessment at the end of the development process when changes are limited and costly.

  10. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods

    PubMed Central

    Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.

    2012-01-01

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394

  11. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods.

    PubMed

    Wells, Darren M; French, Andrew P; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J; Pridmore, Tony P

    2012-06-05

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana.

  12. Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment

    NASA Astrophysics Data System (ADS)

    Zeigler, Bernard P.; Lee, J. S.

    1998-08-01

    In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
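
    Quantization, as described above, transmits a state update only when the state crosses a quantum boundary rather than at every step. A minimal sketch of that filtering idea follows; it is an illustration of the concept, not the DEVS/HLA implementation, and the trajectory and quantum size are made up.

        # Sketch of quantized state updates: emit an update only when the state
        # crosses a quantum level, reducing message traffic (concept illustration).
        def quantized_updates(states, quantum=0.5):
            """Yield (index, value) only when the state crosses a quantum level."""
            last_level = None
            for i, x in enumerate(states):
                level = int(x // quantum)
                if last_level is None or level != last_level:
                    last_level = level
                    yield i, x

        trajectory = [0.0, 0.1, 0.3, 0.6, 0.7, 1.2, 1.3, 0.9, 0.4]
        print(list(quantized_updates(trajectory)))  # only the crossings are emitted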

  13. Development of Next Generation Synthetic Biology Tools for Use in Streptomyces venezuelae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phelan, Ryan M.; Sachs, Daniel; Petkiewicz, Shayne J.

    Streptomyces have a rich history as producers of important natural products and this genus of bacteria has recently garnered attention for its potential applications in the broader context of synthetic biology. However, the dearth of genetic tools available to control and monitor protein production precludes rapid and predictable metabolic engineering that is possible in hosts such as Escherichia coli or Saccharomyces cerevisiae. In an effort to improve genetic tools for Streptomyces venezuelae, we developed a suite of standardized, orthogonal integration vectors and an improved method to monitor protein production in this host. These tools were applied to characterize heterologous promoters and various attB chromosomal integration sites. A final study leveraged the characterized toolset to demonstrate its use in producing the biofuel precursor bisabolene using a chromosomally integrated expression system. In conclusion, these tools advance S. venezuelae to be a practical host for future metabolic engineering efforts.

  14. Towards a Formal Basis for Modular Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh

    2015-01-01

    Safety assurance using argument-based safety cases is an accepted best practice in many safety-critical sectors. Goal Structuring Notation (GSN), which is widely used for presenting safety arguments graphically, provides a notion of modular arguments to support the goal of incremental certification. Despite the efforts at standardization, GSN remains an informal notation, and the GSN standard contains appreciable ambiguity, especially concerning modular extensions. This, in turn, presents challenges when developing tools and methods to intelligently manipulate modular GSN arguments. This paper develops the elements of a theory of modular safety cases, leveraging our previous work on formalizing GSN arguments. Using example argument structures we highlight some ambiguities arising through the existing guidance, present the intuition underlying the theory, clarify syntax, and address modular arguments, contracts, well-formedness and well-scopedness of modules. Based on this theory, we have a preliminary implementation of modular arguments in our toolset, AdvoCATE.

  15. Designing an architectural style for Pervasive Healthcare systems.

    PubMed

    Rafe, Vahid; Hajvali, Masoumeh

    2013-04-01

    Nowadays, Pervasive Healthcare (PH) systems are considered an important research area. These systems have a dynamic structure and configuration; therefore, an appropriate method for designing such systems is necessary. The Publish/Subscribe architecture (pub/sub) is one of the convenient architectures to support such systems. PH systems are safety critical; hence, errors can bring disastrous results. To prevent such problems, a powerful analytical tool is required, so using a proper formal language such as graph transformation systems to develop these systems seems necessary. But even when software engineers use such high-level methodologies, errors may occur in the system under design. Hence, whether the system model satisfies all requirements should be investigated automatically and formally. In this paper, a dynamic architectural style for developing PH systems is presented. Then, the behavior of these systems is modeled and evaluated using the GROOVE toolset. The results of the analysis show its high reliability.

  16. A Comprehensive Toolset for General-Purpose Private Computing and Outsourcing

    DTIC Science & Technology

    2016-12-08

    project and scientific advances made towards each of the research thrusts throughout the project duration. 1 Project Objectives Cloud computing enables...possibilities that the cloud enables is computation outsourcing, when the client can utilize any necessary computing resources for its computational task...Security considerations, however, stand in the way of harnessing the full benefits of cloud computing to the fullest extent and prevent clients from

  17. The Assimilation of Technology in a Sixth-Grade Classroom: Teacher Learning from the Use of an Open Toolset.

    ERIC Educational Resources Information Center

    Hartmann, Christopher E.; McFarlane, Doug

    This paper discusses a case study of the implementation of the computer software, "Boxer," in a single sixth-grade classroom. The paper reports on the process of teacher learning accompanying the assimilation of a new technology into this classroom. A case study approach was used because the teacher taught a 4-week class using Boxer to…

  18. Framework for End-User Programming of Cross-Smart Space Applications

    PubMed Central

    Palviainen, Marko; Kuusijärvi, Jarkko; Ovaska, Eila

    2012-01-01

    Cross-smart space applications are specific types of software services that enable users to share information, monitor the physical and logical surroundings and control it in a way that is meaningful for the user's situation. For developing cross-smart space applications, this paper makes two main contributions: it introduces (i) a component design and scripting method for end-user programming of cross-smart space applications and (ii) a backend framework of components that interwork to support the brunt of the RDFScript translation, and the use and execution of ontology models. Before end-user programming activities, the software professionals must develop easy-to-apply Driver components for the APIs of existing software systems. Thereafter, end-users are able to create applications from the commands of the Driver components with the help of the provided toolset. The paper also introduces the reference implementation of the framework, tools for the Driver component development and end-user programming of cross-smart space applications and the first evaluation results on their application. PMID:23202169

  19. Integrated multiscale biomaterials experiment and modelling: a perspective

    PubMed Central

    Buehler, Markus J.; Genin, Guy M.

    2016-01-01

    Advances in multiscale models and computational power have enabled a broad toolset to predict how molecules, cells, tissues and organs behave and develop. A key theme in biological systems is the emergence of macroscale behaviour from collective behaviours across a range of length and timescales, and a key element of these models is therefore hierarchical simulation. However, this predictive capacity has far outstripped our ability to validate predictions experimentally, particularly when multiple hierarchical levels are involved. The state of the art represents careful integration of multiscale experiment and modelling, and yields not only validation, but also insights into deformation and relaxation mechanisms across scales. We present here a sampling of key results that highlight both challenges and opportunities for integrated multiscale experiment and modelling in biological systems. PMID:28981126

  20. Virophages to viromes: a report from the frontier of viral oceanography.

    PubMed

    Culley, Alexander I

    2011-07-01

    The investigation of marine viruses has advanced our understanding of ecology, evolution, microbiology, oceanography and virology. Significant findings discussed in this review include the discovery of giant viruses that have genome sizes and metabolic capabilities that distort the line between virus and cell, viruses that participate in photosynthesis and apoptosis, the detection of communities of viruses of all genomic compositions and the preeminence of viruses in the evolution of marine microbes. Although we have made great progress, we have yet to synthesize the rich archive of viral genomic data with oceanographic processes. The development of cutting edge methods such as single virus genomics now provide a toolset to better integrate viruses into the ecology of the ocean. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. The AstroHDF Effort

    NASA Astrophysics Data System (ADS)

    Masters, J.; Alexov, A.; Folk, M.; Hanisch, R.; Heber, G.; Wise, M.

    2012-09-01

    Here we update the astronomy community on our effort to deal with the demands of ever-increasing astronomical data size and complexity, using the Hierarchical Data Format, version 5 (HDF5) (Wise et al. 2011). NRAO, LOFAR and VAO have joined forces with The HDF Group to write an NSF grant, requesting funding to assist in the effort. This paper briefly summarizes our motivation for the proposed project, an outline of the project itself, and some of the material discussed at the ADASS Birds of a Feather (BoF) discussion. Topics of discussion included: community experiences with HDF5 and other file formats; toolsets which exist and/or can be adapted for HDF5; a call for development towards visualizing large (> 1 TB) image cubes; and general lessons learned from working with large and complex data.
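
    For large (> 1 TB) image cubes, HDF5's chunked, compressed datasets are the feature that makes partial reads practical. The example below is a generic h5py sketch with illustrative dataset names and sizes; it is not a format adopted by the AstroHDF effort.

        # Hedged h5py example: store an image cube with chunking and compression so
        # that individual spectra or tiles can be read without loading the cube.
        import numpy as np
        import h5py

        with h5py.File("cube.h5", "w") as f:
            cube = f.create_dataset(
                "image_cube",
                shape=(1024, 1024, 512),   # (x, y, channel); far smaller than 1 TB here
                chunks=(128, 128, 64),     # chunking enables partial reads
                compression="gzip",
                dtype="float32",
            )
            cube[0:128, 0:128, 0] = np.random.rand(128, 128)  # write one spatial tile

        with h5py.File("cube.h5", "r") as f:
            spectrum = f["image_cube"][512, 512, :]  # read a single spectrum only
            print(spectrum.shape)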

  2. Virtual Battlespace Behavior Generation Through Class Imitation

    DTIC Science & Technology

    2011-03-01

    honed to the pinnacle of human capacity. His mind was similarly trained and equipped with an extensive array of leadership and reactionary toolsets...in this crucial moment, Davis had one clear thought resonating within his mind; that the outcome of this instant would forever change him. In this same...behavior adaption [17]. Their work includes a proof-of-concept for adaptive human-robot interaction scenarios, in particular focusing on autism

  3. On-Demand Targeting: Investigating Biology with Proximity-Directed Chemistry

    PubMed Central

    2016-01-01

    Proximity enhancement is a central chemical tenet underpinning an exciting suite of small-molecule toolsets that have allowed us to unravel many biological complexities. The leitmotif of this opus is “tethering”—a strategy in which a multifunctional small molecule serves as a template to bring proteins/biomolecules together. Scaffolding approaches have been powerfully applied to control diverse biological outcomes such as protein–protein association, protein stability, and activity, and to improve imaging capabilities. A new twist on this strategy has recently appeared, in which the small-molecule probe is engineered to unleash controlled amounts of reactive chemical signals within the microenvironment of a target protein. Modification of a specific target elicits a precisely timed and spatially controlled gain-of-function (or dominant loss-of-function) signaling response. Presented herein is a unique personal outlook conceptualizing the powerful proximity-enhanced chemical biology toolsets into two paradigms: “multifunctional scaffolding” versus “on-demand targeting”. By addressing the latest advances and challenges in the established yet constantly evolving multifunctional scaffolding strategies as well as in the emerging on-demand precision targeting (and related) systems, this Perspective is aimed at choosing when it is best to employ each of the two strategies, with an emphasis toward further promoting novel applications and discoveries stemming from these innovative chemical biology platforms. PMID:26907082

  4. On-Demand Targeting: Investigating Biology with Proximity-Directed Chemistry.

    PubMed

    Long, Marcus J C; Poganik, Jesse R; Aye, Yimon

    2016-03-23

    Proximity enhancement is a central chemical tenet underpinning an exciting suite of small-molecule toolsets that have allowed us to unravel many biological complexities. The leitmotif of this opus is "tethering"-a strategy in which a multifunctional small molecule serves as a template to bring proteins/biomolecules together. Scaffolding approaches have been powerfully applied to control diverse biological outcomes such as protein-protein association, protein stability, and activity, and to improve imaging capabilities. A new twist on this strategy has recently appeared, in which the small-molecule probe is engineered to unleash controlled amounts of reactive chemical signals within the microenvironment of a target protein. Modification of a specific target elicits a precisely timed and spatially controlled gain-of-function (or dominant loss-of-function) signaling response. Presented herein is a unique personal outlook conceptualizing the powerful proximity-enhanced chemical biology toolsets into two paradigms: "multifunctional scaffolding" versus "on-demand targeting". By addressing the latest advances and challenges in the established yet constantly evolving multifunctional scaffolding strategies as well as in the emerging on-demand precision targeting (and related) systems, this Perspective is aimed at choosing when it is best to employ each of the two strategies, with an emphasis toward further promoting novel applications and discoveries stemming from these innovative chemical biology platforms.

  5. A Collaborative Extensible User Environment for Simulation and Knowledge Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Lansing, Carina S.; Porter, Ellen A.

    2015-06-01

    In scientific simulation, scientists use measured data to create numerical models, execute simulations and analyze results from advanced simulators executing on high performance computing platforms. This process usually requires a team of scientists collaborating on data collection, model creation and analysis, and on authorship of publications and data. This paper shows that scientific teams can benefit from a user environment called Akuna that permits subsurface scientists in disparate locations to collaborate on numerical modeling and analysis projects. The Akuna user environment is built on the Velo framework that provides both a rich client environment for conducting and analyzing simulations and a Web environment for data sharing and annotation. Akuna is an extensible toolset that integrates with Velo, and is designed to support any type of simulator. This is achieved through data-driven user interface generation, use of a customizable knowledge management platform, and an extensible framework for simulation execution, monitoring and analysis. This paper describes how the customized Velo content management system and the Akuna toolset are used to integrate and enhance an effective collaborative research and application environment. The extensible architecture of Akuna is also described and demonstrates its usage for creation and execution of a 3D subsurface simulation.

  6. Multidiscipline Optimization of Aircraft Structures for FAA Certification.

    DTIC Science & Technology

    1998-09-01

    [Figure residue from the source: an architecture diagram (Figure 5) showing remote clients, MDO IPT clients, and Lockheed-Martin's REALITY toolset within a systems engineering/MDO context.] Once the technology and general shape of the structure is selected, trade-off studies are used to select the best affordable architecture.

  7. SaRAD: a Simple and Robust Abbreviation Dictionary.

    PubMed

    Adar, Eytan

    2004-03-01

    Due to recent interest in the use of textual material to augment traditional experiments, it has become necessary to automatically cluster, classify and filter natural language information. The Simple and Robust Abbreviation Dictionary (SaRAD) provides an easy-to-implement, high-performance tool for the construction of a biomedical symbol dictionary. The algorithms, applied to the MEDLINE document set, result in a high-quality dictionary and toolset to disambiguate abbreviation symbols automatically.
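
    Abbreviation dictionaries of this kind are typically seeded by mining "long form (ABBR)" patterns from text. The toy sketch below illustrates that general idea only; SaRAD's actual algorithms, filtering and scoring are more involved, and the regular expression and initials check here are simplistic assumptions.

        # Toy "long form (ABBR)" extraction; illustrative only, not SaRAD's method.
        import re
        from collections import defaultdict

        def extract_abbreviations(text):
            """Map an abbreviation to candidate long forms found before '(ABBR)'."""
            candidates = defaultdict(set)
            for m in re.finditer(r"((?:\w+[ -]){1,6})\(([A-Z][A-Za-z0-9-]{1,9})\)", text):
                long_form, abbr = m.group(1).strip(), m.group(2)
                # crude check: initials of the long form should contain the abbreviation
                initials = "".join(w[0].lower() for w in long_form.split())
                if abbr.lower() in initials:
                    candidates[abbr].add(long_form)
            return dict(candidates)

        sample = "The Simple and Robust Abbreviation Dictionary (SaRAD) provides a tool."
        print(extract_abbreviations(sample))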

  8. Biological imaging with coherent Raman scattering microscopy: a tutorial

    PubMed Central

    Alfonso-García, Alba; Mittal, Richa; Lee, Eun Seong; Potma, Eric O.

    2014-01-01

    Coherent Raman scattering (CRS) microscopy is gaining acceptance as a valuable addition to the imaging toolset of biological researchers. Optimal use of this label-free imaging technique benefits from a basic understanding of the physical principles and technical merits of the CRS microscope. This tutorial offers qualitative explanations of the principles behind CRS microscopy and provides information about the applicability of this nonlinear optical imaging approach for biological research. PMID:24615671

  9. Sharing Research Models: Using Software Engineering Practices for Facilitation

    PubMed Central

    Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.

    2011-01-01

    Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices—the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780

  10. An open data repository and a data processing software toolset of an equivalent Nordic grid model matched to historical electricity market data.

    PubMed

    Vanfretti, Luigi; Olsen, Svein H; Arava, V S Narasimham; Laera, Giuseppe; Bidadfar, Ali; Rabuzin, Tin; Jakobsen, Sigurd H; Lavenius, Jan; Baudette, Maxime; Gómez-López, Francisco J

    2017-04-01

    This article presents an open data repository, the methodology used to generate it, and the associated data processing software developed to consolidate an hourly snapshot historical data set for the year 2015 into an equivalent Nordic power grid model (aka Nordic 44). The consolidation was achieved by matching the model's physical response against historical power flow records in the bidding regions of the Nordic grid that are available from the Nordic electricity market agent, Nord Pool. The model is made available in the form of CIM v14, Modelica and PSS/E (Siemens PTI) files. The Nordic 44 model in Modelica and PSS/E was first presented, for a single snapshot, in the paper titled "iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations" (Vanfretti et al., 2016) [1]. The digital repository made available with the submission of this paper (SmarTSLab_Nordic44 Repository at Github, 2016) [2] provides a total of 8760 snapshots (for the year 2015) that can be used to initialize and execute dynamic simulations using tools compatible with CIM v14, the Modelica language and the proprietary PSS/E tool. The Python scripts used to generate the snapshots (processed data) are also available with all the data in the GitHub repository [2]. This Nordic 44 equivalent model was also used in the iTesla project (iTesla) [3] to carry out simulations within a dynamic security assessment toolset (iTesla, 2016) [4], and has been further enhanced during the ITEA3 OpenCPS project (iTEA3) [5]. The raw data, processed data and output models utilized within the iTesla platform (iTesla, 2016) [4] are also available in the repository. The CIM and Modelica snapshots of the "Nordic 44" model for the year 2015 are available in a Zenodo repository.

  11. NASA Langley Atmospheric Science Data Center Toolsets for Airborne Data (TAD): User Interface Design and Development

    NASA Astrophysics Data System (ADS)

    Beach, A. L., III; Early, A. B.; Chen, G.; Parker, L.

    2014-12-01

    NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, which are characterized by a wide range of trace gases and aerosol properties. The airborne observational data have often been used in assessment and validation of models and satellite instruments. The ASDC Toolset for Airborne Data (TAD) is being designed to meet the user community's needs for manipulating aircraft data for scientific research on climate change and air quality relevant issues. Given the sheer volume of data variables across field campaigns and instruments reporting data on different time scales, this data is often difficult and time-intensive for researchers to analyze. The TAD web application is designed to provide an intuitive user interface (UI) to facilitate quick and efficient discovery from a vast number of airborne variables and data. Users are given the option to search based on high-level parameter groups, individual common names, mission and platform, as well as date ranges. Experienced users can immediately filter by keyword using the global search option. Once the user has chosen their required variables, they are given the option to either request PI data files based on their search criteria or create merged data, i.e. geo-located data from one or more measurement PIs. The purpose of the merged data feature is to allow users to compare data from one flight, as not all data from each flight is taken on the same time scale. Time bases can be continuous or based on the time base of one of the measurements. After an order is submitted and processed, an ASDC email is sent to the user with a link for data download. The TAD user interface design, application architecture, and proposed future enhancements will be presented.
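
    The merged data product places measurements reported on different time scales onto a single time base. The following is a hedged pandas sketch of that operation with made-up variable and column names (not actual TAD fields): the faster instrument is averaged onto the slower instrument's 60-second time base before joining.

        # Hedged sketch: put two instruments onto one common time base, then merge.
        import pandas as pd

        fast = pd.DataFrame({"time": pd.date_range("2016-06-01 12:00", periods=600, freq="s"),
                             "o3_ppbv": 40.0})     # 1 Hz instrument (placeholder values)
        slow = pd.DataFrame({"time": pd.date_range("2016-06-01 12:00", periods=10, freq="60s"),
                             "co_ppbv": 120.0})    # 60 s instrument (placeholder values)

        merged = pd.merge_asof(
            fast.resample("60s", on="time").mean().reset_index(),  # average to 60 s bins
            slow,
            on="time",
        )
        print(merged.head())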

  12. Therapeutic genome engineering via CRISPR-Cas systems.

    PubMed

    Moreno, Ana M; Mali, Prashant

    2017-07-01

    Differences in genomes underlie most organismal diversity, and aberrations in genomes underlie many disease states. With the growing knowledge of the genetic and pathogenic basis of human disease, development of safe and efficient platforms for genome and epigenome engineering will transform our ability to therapeutically target human diseases and also potentially engineer disease resistance. In this regard, the recent advent of clustered regularly interspaced short palindromic repeats (CRISPR)-CRISPR-associated (Cas) RNA-guided nuclease systems has transformed our ability to target nucleic acids. Here we review therapeutic genome engineering applications with a specific focus on the CRISPR-Cas toolsets. We summarize past and current work, and also outline key challenges and future directions. WIREs Syst Biol Med 2017, 9:e1380. doi: 10.1002/wsbm.1380. © 2017 Wiley Periodicals, Inc.

  13. Advanced Satellite Workstation - An integrated workstation environment for operational support of satellite system planning and analysis

    NASA Astrophysics Data System (ADS)

    Hamilton, Marvin J.; Sutton, Stewart A.

    A prototype integrated environment, the Advanced Satellite Workstation (ASW), which was developed and delivered for evaluation and operator feedback in an operational satellite control center, is described. The current ASW hardware consists of a Sun workstation and a Macintosh II workstation connected via Ethernet, network hardware and software, a laser disk system, an optical storage system, and a telemetry data file interface. The central objective of ASW is to provide an intelligent decision support and training environment for operators and analysts of complex systems such as satellites. Compared to the many recent workstation implementations that incorporate graphical telemetry displays and expert systems, ASW provides a considerably broader look at intelligent, integrated environments for decision support, based on the premise that the central features of such an environment are intelligent data access and integrated toolsets.

  14. AzoCholine Enables Optical Control of Alpha 7 Nicotinic Acetylcholine Receptors in Neural Networks.

    PubMed

    Damijonaitis, Arunas; Broichhagen, Johannes; Urushima, Tatsuya; Hüll, Katharina; Nagpal, Jatin; Laprell, Laura; Schönberger, Matthias; Woodmansee, David H; Rafiq, Amir; Sumser, Martin P; Kummer, Wolfgang; Gottschalk, Alexander; Trauner, Dirk

    2015-05-20

    Nicotinic acetylcholine receptors (nAChRs) are essential for cellular communication in higher organisms. Even though a vast pharmacological toolset to study cholinergic systems has been developed, control of endogenous neuronal nAChRs with high spatiotemporal precision has been lacking. To address this issue, we have generated photoswitchable nAChR agonists and re-evaluated the known photochromic ligand, BisQ. Using electrophysiology, we found that one of our new compounds, AzoCholine, is an excellent photoswitchable agonist for neuronal α7 nAChRs, whereas BisQ was confirmed to be an agonist for the muscle-type nAChR. AzoCholine could be used to modulate cholinergic activity in a brain slice and in dorsal root ganglion neurons. In addition, we demonstrate light-dependent perturbation of behavior in the nematode, Caenorhabditis elegans.

  15. Establishing Networks to Lever Investments in Developing Capacity for Agricultural Monitoring: A GEOGLAM Perspective

    NASA Astrophysics Data System (ADS)

    Whitcraft, A. K.; Becker-Reshef, I.

    2016-12-01

    Since 2011, the Group on Earth Observations Global Agricultural Monitoring (GEOGLAM) Initiative has been working to strengthen the international community's capacity to use Earth observation (EO) data to derive timely, accurate, and transparent information on agriculture. A key component of GEOGLAM is the development of individual and institutional capacity for EO-based agricultural monitoring at multiple scales, from national to regional to global, in low-, middle-, and high-income countries. Despite the fact that the need for enhancing capacity is frequently acknowledged, there is little formal or informal literature documenting best practices for developing and implementing comprehensive capacity development strategies around Earth observations knowledge sharing. As a result, many projects and activities develop knowledge-sharing strategies on an ad hoc basis, and may be missing out on levering lessons, techniques, and toolsets already developed. In the past year, GEOGLAM has aimed to spur relationships and collaborations with capacity development initiatives and networks, toward sharing and documenting strategies and tactical experiences in this domain. This presentation will provide some perspective on challenges and opportunities encountered so far, from the GEOGLAM perspective, with the goal of continued dialogue and coordination with other session participants.

  16. GLOBE (Global Oceanographic Bathymetry Explorer) : an innovative and generic software combining all necessary functionalities for cruise preparation, for collection, linking, processing and display of scientific data acquired during sea cruises, and for exporting data and information to the main marine data centers and networks.

    NASA Astrophysics Data System (ADS)

    Sinquin, J. M.; Sorribas, J.

    2014-12-01

    Within the EUROFLEETS project, and linked to the EMODNet and Geo-Seas European projects, GLOBE (Global Oceanographic Bathymetry Explorer) is an innovative and generic software package. I. INTRODUCTION: The first version can be used onboard during the survey to get a quick overview of acquired data, or later, to re-process data with accurate environmental data. II. MAIN FUNCTIONALITIES: The version shown at AGU-2014 presents several key items:
    - 3D visualization: multi-layer DTMs from EMODNet,
    - water column echograms, seismic lines, ...
    - bathymetry plug-in: manual and automatic data cleaning, integration of the EMODNet methodology to introduce the CDI concept, filtering, spline, data gridding, ...
    - backscatter with compensation,
    - tectonic toolset,
    - photo/video plug-in,
    - 3D navigation including tide correction, MRU corrections, GPS offset corrections,
    - WMS/WFS interfaces.
    III. FOCUS ON EMODNET: One of the main objectives of the EMODNet European project is to elaborate a common processing flow for gridding the bathymetry data and for generating a harmonized digital terrain model (DTM). This flow includes the definition of the DTM characteristics (geodetic parameters, grid spacing, interpolation and smoothing parameters, ...) as well as the specification of a set of layers that enrich the basic depth layer: statistical layers (sounding density, standard deviation, ...) and an innovative data source layer that indicates the source of the soundings and is linked to the associated metadata. GLOBE provides the required tools for applying this methodology and is offered to the project partners. V. FOCUS ON THE TECTONIC TOOLSET: The tectonic toolset allows the user to associate any DTM with 3D rotation movements. These rotations represent the movement of tectonic plates along discrete time lines (from 200 million years ago to now). One rotation is described by its axis, its angle and its date. GLOBE can display the movement of tectonic plates, represented by a DTM, at different geological times. The same movements can be applied to GeoTIFF images or GMT files representing grids of any kind of data. The free software GLOBE3D is a product of Ifremer and is funded by Carnot-Edrome.
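
    The finite rotations described for the tectonic toolset are parameterized by a pole, an angle, and a date. As a generic illustration of applying such a rotation to a point on the sphere (not GLOBE's implementation), the Rodrigues formula can be used; the coordinates in the example are arbitrary.

        # Rotate a (lat, lon) point about a rotation pole by a given angle (degrees).
        import numpy as np

        def to_xyz(lat, lon):
            lat, lon = np.radians(lat), np.radians(lon)
            return np.array([np.cos(lat) * np.cos(lon),
                             np.cos(lat) * np.sin(lon),
                             np.sin(lat)])

        def rotate(point_latlon, pole_latlon, angle_deg):
            """Rodrigues rotation of a unit vector about the pole axis."""
            p = to_xyz(*point_latlon)
            k = to_xyz(*pole_latlon)
            a = np.radians(angle_deg)
            r = p * np.cos(a) + np.cross(k, p) * np.sin(a) + k * np.dot(k, p) * (1 - np.cos(a))
            return np.degrees(np.arcsin(r[2])), np.degrees(np.arctan2(r[1], r[0]))

        # Rotating about the north pole by 30 degrees simply shifts longitude by 30.
        print(rotate((10.0, 20.0), pole_latlon=(90.0, 0.0), angle_deg=30.0))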

  17. Usability Evaluation of Air Warfare Assessment & Review Toolset in Exercise Black Skies 2012

    DTIC Science & Technology

    2013-12-01

    is, it allows the user to do what they want to do with it (Pressman, 2005). This concept is sometimes called fitness for purpose (Nielsen, 1993...Other characteristics of good software defined by Pressman (2005) are: reliability – the proportion of time the software is available for its intended...Diego, CA: Academic Press. Pressman, R. S. (2005). Software Engineering: A Practitioner's Approach. New York: McGraw-Hill. Symons, S., France, M

  18. WISARD: workbench for integrated superfast association studies for related datasets.

    PubMed

    Lee, Sungyoung; Choi, Sungkyoung; Qiao, Dandi; Cho, Michael; Silverman, Edwin K; Park, Taesung; Won, Sungho

    2018-04-20

    Mendelian transmission produces phenotypic and genetic relatedness between family members, giving family-based analytical methods an important role in genetic epidemiological studies, from heritability estimation to genetic association analysis. With advances in genotyping technologies, whole-genome sequence data can be utilized for genetic epidemiological studies, and family-based samples may become more useful for detecting de novo mutations. However, genetic analyses employing family-based samples usually suffer from the complexity of the computational/statistical algorithms, and certain types of family designs, such as incorporating data from extended families, have rarely been used. We present a Workbench for Integrated Superfast Association studies for Related Data (WISARD) programmed in C/C++. WISARD enables fast and comprehensive analysis of SNP-chip and next-generation sequencing data on extended families, with applications from designing genetic studies to summarizing analysis results. In addition, WISARD can automatically be run in a fully multithreaded manner, and the integration of R software for visualization makes it more accessible to non-experts. Comparison with existing toolsets showed that WISARD is computationally suitable for integrated analysis of related subjects, and demonstrated that WISARD outperforms existing toolsets. WISARD has also been successfully utilized to analyze a large-scale massive sequencing dataset for chronic obstructive pulmonary disease (COPD), and we identified multiple genes associated with COPD, which demonstrates its practical value.

  19. GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis

    NASA Astrophysics Data System (ADS)

    Nass, Andrea; van Gasselt, Stephan

    2015-04-01

    Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications, dedicated and ready-to-use GIS tools are available in standard software systems, while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology, where many such (basic) tools can be used to build complex analysis tools, e.g. in image and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g. ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or store rover data, and geologic mapping data can also be efficiently stored and accessed using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface roughness estimates are two common concepts of particular interest; for both, a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on extraction of landing site characteristics using established criteria. We provide working examples and particularly focus on the concepts of terrain roughness as it is interpreted in geomorphology and engineering studies.
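
    One common (though by no means the only) definition of surface roughness is the local standard deviation of elevation within a moving window. The sketch below computes that simple variant on a synthetic terrain model; it is an illustration of the concept, not the specific roughness metric used in the toolset described above.

        # Local roughness as the moving-window standard deviation of elevation.
        import numpy as np
        from scipy.ndimage import uniform_filter

        def local_roughness(dem, window=5):
            """Standard deviation of elevation in a window x window neighbourhood."""
            mean = uniform_filter(dem, size=window)
            mean_sq = uniform_filter(dem * dem, size=window)
            return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

        dem = np.random.rand(100, 100) * 10.0   # stand-in for a real terrain model
        rough = local_roughness(dem, window=7)
        print(rough.mean(), rough.max())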

  20. Correlation of cholinergic drug induced quenching of acetylcholinesterase bound thioflavin-T fluorescence with their inhibition activity

    NASA Astrophysics Data System (ADS)

    Islam, Mullah Muhaiminul; Rohman, Mostofa Ataur; Gurung, Arun Bahadur; Bhattacharjee, Atanu; Aguan, Kripamoy; Mitra, Sivaprasad

    2018-01-01

    The development of new acetylcholinesterase inhibitors (AChEIs) and the subsequent assay of their inhibition efficiency are considered key steps in AD treatment. The fluorescence intensity of thioflavin-T (ThT) bound in the active site of acetylcholinesterase (AChE) quenches substantially in the presence of standard AChEI drugs due to the dynamic replacement of the fluorophore from the AChE active site, as confirmed by steady-state emission as well as time-resolved fluorescence anisotropy measurements and molecular dynamics simulation in conjunction with docking calculations. The parametrized % quenching data for each system show excellent correlation with enzyme inhibition activity measured independently by the standard Ellman AChE assay in a high-throughput plate reader system. The results are encouraging for the design of a fluorescence-intensity-based AChE inhibition assay and may provide a better toolset to rapidly evaluate as well as develop newer AChE inhibitors for AD treatment.
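
    The kind of correlation described above, between parametrized % quenching and independently measured inhibition activity, can be checked with a standard linear fit. The sketch below uses placeholder numbers purely to illustrate the computation; the values are not data from the study.

        # Correlate % quenching with % inhibition (placeholder numbers only).
        import numpy as np
        from scipy.stats import pearsonr

        percent_quenching = np.array([12.0, 25.0, 38.0, 51.0, 63.0, 74.0])   # ThT quenching per drug
        percent_inhibition = np.array([10.0, 27.0, 35.0, 55.0, 60.0, 78.0])  # Ellman assay per drug

        r, p = pearsonr(percent_quenching, percent_inhibition)
        slope, intercept = np.polyfit(percent_quenching, percent_inhibition, 1)
        print(f"Pearson r = {r:.2f} (p = {p:.3f}); "
              f"fit: inhibition = {slope:.2f} * quenching + {intercept:.2f}")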

  1. Virtual Plant Tissue: Building Blocks for Next-Generation Plant Growth Simulation

    PubMed Central

    De Vos, Dirk; Dzhurakhalov, Abdiravuf; Stijven, Sean; Klosiewicz, Przemyslaw; Beemster, Gerrit T. S.; Broeckhove, Jan

    2017-01-01

    Motivation: Computational modeling of plant developmental processes is becoming increasingly important. Cellular resolution plant tissue simulators have been developed, yet they typically describe physiological processes in isolation, strongly delimited in space and time. Results: With plant systems biology moving toward an integrative perspective on development, we have built the Virtual Plant Tissue (VPTissue) package to couple functional modules or models in the same framework and across different frameworks. Multiple levels of model integration and coordination enable combining existing and new models from different sources, with diverse options in terms of input/output. Besides the core simulator, the toolset also comprises a tissue editor for manipulating tissue geometry and cell, wall, and node attributes in an interactive manner. A parameter exploration tool is available to study the parameter dependence of simulation results by distributing calculations over multiple systems. Availability: Virtual Plant Tissue is available as open source (EUPL license) on Bitbucket (https://bitbucket.org/vptissue/vptissue). The project has a website at https://vptissue.bitbucket.io. PMID:28523006

  2. Generation of Rab-based transgenic lines for in vivo studies of endosome biology in zebrafish

    PubMed Central

    Clark, Brian S.; Winter, Mark; Cohen, Andrew R.; Link, Brian A.

    2011-01-01

    The Rab family of small GTPases function as molecular switches regulating membrane and protein trafficking. Individual Rab isoforms define and are required for specific endosomal compartments. To facilitate in vivo investigation of specific Rab proteins, and endosome biology in general, we have generated transgenic zebrafish lines to mark and manipulate Rab proteins. We also developed software to track and quantify endosome dynamics within time-lapse movies. The established transgenic lines ubiquitously express EGFP fusions of Rab5c (early endosomes), Rab11a (recycling endosomes), and Rab7 (late endosomes) to study localization and dynamics during development. Additionally, we generated UAS-based transgenic lines expressing constitutive active (CA) and dominant negative (DN) versions for each of these Rab proteins. Predicted localization and functional consequences for each line were verified through a variety of assays, including lipophilic dye uptake and Crumbs2a localization. In summary, we have established a toolset for in vivo analyses of endosome dynamics and functions. PMID:21976318

  3. Passive monitoring and localization of marine mammals in open ocean environments using widely spaced bottom mounted hydrophones

    NASA Astrophysics Data System (ADS)

    Jarvis, Susan; Moretti, David; Morrissey, Ronald; Dimarzio, Nancy

    2003-10-01

    The Marine Mammal Monitoring on Navy Ranges (M3R) project has developed a toolset for passive detection and localization of marine mammals using the existing infrastructure of Navy's undersea ranges. The Office of Naval Research funded the M3R project as part of the Navy's effort to determine the effects of acoustic and other emissions on marine mammals and threatened/endangered species. A necessary first step in this effort is the creation of a baseline of behavior, which requires long-term monitoring of marine mammals. Such monitoring, in turn, requires the ability to detect and localize the animals. This paper will present the passive acoustic monitoring and localization tools developed under M3R. It will also present results of the deployment of the M3R tools at the Atlantic Undersea Test and Evaluation Center (AUTEC), Andros Island, Bahamas from June through November 2003. Finally, it will discuss current work to improve automated species classification.
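
    The M3R algorithms themselves are not described in the abstract; as a hedged stand-in for the localization step, the sketch below solves a 2-D time-difference-of-arrival (TDOA) problem for widely spaced sensors with a nonlinear least-squares fit. The geometry, sound speed, and noise level are illustrative assumptions.

```python
# Hedged sketch (not the M3R code): locate a vocalizing animal in 2-D from
# time-difference-of-arrival measurements on widely spaced bottom-mounted hydrophones.
import numpy as np
from scipy.optimize import least_squares

C = 1500.0                                    # assumed sound speed in seawater, m/s
hydrophones = np.array([[0.0, 0.0], [4000.0, 0.0], [0.0, 4000.0], [4000.0, 4000.0]])
true_source = np.array([1200.0, 2600.0])      # position to recover (for the synthetic test)

ranges = np.linalg.norm(hydrophones - true_source, axis=1)
tdoa = (ranges - ranges[0]) / C               # arrival times relative to hydrophone 0
tdoa += np.random.default_rng(1).normal(0.0, 1e-4, tdoa.shape)   # timing noise

def residuals(xy):
    r = np.linalg.norm(hydrophones - xy, axis=1)
    return (r - r[0]) / C - tdoa

fit = least_squares(residuals, x0=np.array([2000.0, 2000.0]))
print("estimated position:", np.round(fit.x, 1))   # close to (1200, 2600)
```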

  4. A Decision Support Framework for Feasibility Analysis of International Space Station (ISS) Research Capability Enhancing Options

    NASA Technical Reports Server (NTRS)

    Ortiz, James N.; Scott,Kelly; Smith, Harold

    2004-01-01

    The assembly and operation of the ISS have generated significant challenges that have ultimately impacted the resources available to the program's primary mission: research. To address this, program personnel routinely perform trade-off studies on alternative options to enhance research. The approach, content, level of analysis, and resulting outputs of these studies vary widely, however, complicating the Program Manager's job of selecting the best option. To address this, the program requested that a framework be developed to evaluate multiple research-enhancing options in a thorough, disciplined and repeatable manner, and to identify the best option on the basis of cost, benefit and risk. The resulting framework consists of a systematic methodology and a decision-support toolset. The framework provides a quantifiable and repeatable means of ranking research-enhancing options for the complex and multiple-constraint domain of the space research laboratory. This paper describes the development, verification and validation of this framework and provides observations on its operational use.
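
    The actual ISS framework is not public in this record; as a toy illustration of ranking options on cost, benefit and risk, the sketch below normalizes each criterion and applies stakeholder weights. All option names, scores, and weights are invented.

```python
# Toy sketch (hypothetical options and weights, not the ISS framework): rank
# research-enhancing options by a weighted, normalized cost/benefit/risk score.
options = {                      # name: (cost $M, benefit score, risk score); all invented
    "extra crew time":   (40.0, 8.0, 3.0),
    "new rack facility": (90.0, 9.0, 6.0),
    "ground automation": (25.0, 6.0, 2.0),
}
weights = {"cost": 0.4, "benefit": 0.4, "risk": 0.2}   # assumed stakeholder weights

def normalize(values, higher_is_better):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if higher_is_better else (hi - v) / (hi - lo) for v in values]

names = list(options)
cost_n = normalize([options[n][0] for n in names], higher_is_better=False)
ben_n = normalize([options[n][1] for n in names], higher_is_better=True)
risk_n = normalize([options[n][2] for n in names], higher_is_better=False)

scores = {n: weights["cost"] * c + weights["benefit"] * b + weights["risk"] * r
          for n, c, b, r in zip(names, cost_n, ben_n, risk_n)}
for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:20s} {score:.3f}")
```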

  5. Mathematical modeling of coupled drug and drug-encapsulated nanoparticle transport in patient-specific coronary artery walls

    NASA Astrophysics Data System (ADS)

    Hossain, Shaolie S.; Hossainy, Syed F. A.; Bazilevs, Yuri; Calo, Victor M.; Hughes, Thomas J. R.

    2012-02-01

    The majority of heart attacks occur when there is a sudden rupture of atherosclerotic plaque, exposing prothrombotic emboli to coronary blood flow, forming clots that can cause blockages of the arterial lumen. Diseased arteries can be treated with drugs delivered locally to vulnerable plaques. The objective of this work was to develop a computational tool-set to support the design and analysis of a catheter-based nanoparticulate drug delivery system to treat vulnerable plaques and diffuse atherosclerosis. A three-dimensional mathematical model of coupled mass transport of drug and drug-encapsulated nanoparticles was developed and solved numerically utilizing isogeometric finite element analysis. Simulations were run on a patient-specific multilayered coronary artery wall segment with a vulnerable plaque and the effect of artery and plaque inhomogeneity was analyzed. The method captured trends observed in local drug delivery and demonstrated potential for optimizing drug design parameters, including delivery location, nanoparticle surface properties, and drug release rate.
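
    The paper solves coupled 3-D transport with isogeometric finite elements; as a far simpler, hedged illustration of the underlying transport physics, the sketch below integrates 1-D diffusion of a drug across an artery-wall thickness with an explicit finite-difference scheme. All parameter values are assumed.

```python
# Greatly simplified sketch (not the paper's isogeometric solver): explicit
# finite-difference solution of 1-D drug diffusion across an arterial wall, with a
# fixed lumen-side concentration and a zero-flux outer (adventitial) boundary.
import numpy as np

D = 1e-11                        # assumed effective diffusivity in tissue, m^2/s
L = 5e-4                         # assumed wall thickness, 0.5 mm
nx = 51
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D             # satisfies the explicit stability limit dt <= dx^2 / (2*D)

c = np.zeros(nx)
c[0] = 1.0                       # normalized drug concentration at the lumen surface

t_end = 3600.0                   # simulate one hour of delivery
for _ in range(int(t_end / dt)):
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0] = 1.0                   # Dirichlet boundary at the lumen
    c[-1] = c[-2]                # zero-flux boundary at the adventitial side

print("mid-wall concentration after 1 h:", round(c[nx // 2], 3))
```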

  6. Preparing Graduate Students for Non-Academic Careers

    NASA Astrophysics Data System (ADS)

    Woolf, Lawrence

    2014-03-01

    One of the primary topics discussed at the conference concerned career development, since most graduate students will not have the academic careers of their advisors. Goals included reviewing the primary functions of physicists in industry, evaluating how students are currently prepared for these careers, and identifying how to fill gaps in preparation. A number of non-academic physicists provided insight into meeting these goals. Most physics graduate programs in general do not purposely prepare students for a non-academic career. Strategies for overcoming this shortcoming include advising students about these careers and providing training on broadly valued professional skills such as written and verbal communication, time and project management, leadership, working in teams, innovation, product development, and proposal writing. Alumni and others from industry could provide guidance on careers and skills and should be invited to talk to students. Academic training could also better prepare students for non-academic careers by including engineering and cross disciplinary problem solving as well as incorporating software and toolsets common in industry.

  7. Human genetic susceptibility and infection with Leishmania peruviana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, M.A.; Davis, C.R.; Collins, A.

    1995-11-01

    Racial differences, familial clustering, and murine studies are suggestive of host genetic control of Leishmania infections. Complex segregation analysis has been carried out by use of the programs POINTER and COMDS and data from a total population survey, comprising 636 nuclear families, from an L. peruviana-endemic area. The data support genetic components controlling susceptibility to clinical leishmaniasis, influencing severity of disease and resistance to disease among healthy individuals. A multifactorial model is favored over a sporadic model. Two-locus models provided the best fit to the data, the optimal model being a recessive gene (frequency .57) plus a modifier locus. Individuals infected at an early age and with recurrent lesions are genetically more susceptible than those infected with a single episode of disease at a later age. Among people with no lesions, those with a positive skin-test response are genetically less susceptible than those with a negative response. The possibility of the involvement of more than one gene together with environmental effects has implications for the design of future linkage studies. 31 refs., 7 tabs.

  8. Neutron-rich rare-isotope production from projectile fission of heavy nuclei near 20 MeV/nucleon beam energy

    NASA Astrophysics Data System (ADS)

    Vonta, N.; Souliotis, G. A.; Loveland, W.; Kwon, Y. K.; Tshoo, K.; Jeong, S. C.; Veselsky, M.; Bonasera, A.; Botvina, A.

    2016-12-01

    We investigate the possibilities of producing neutron-rich nuclides in projectile fission of heavy beams in the energy range of 20 MeV/nucleon expected from low-energy facilities. We report our efforts to theoretically describe the reaction mechanism of projectile fission following a multinucleon transfer collision at this energy range. Our calculations are mainly based on a two-step approach: The dynamical stage of the collision is described with either the phenomenological deep-inelastic transfer model (DIT) or with the microscopic constrained molecular dynamics model (CoMD). The de-excitation or fission of the hot heavy projectile fragments is performed with the statistical multifragmentation model (SMM). We compared our model calculations with our previous experimental projectile-fission data of 238U (20 MeV/nucleon) + 208Pb and 197Au (20 MeV/nucleon) + 197Au and found an overall reasonable agreement. Our study suggests that projectile fission following peripheral heavy-ion collisions at this energy range offers an effective route to access very neutron-rich rare isotopes toward and beyond the astrophysical r-process path.

  9. Designing Specification Languages for Process Control Systems: Lessons Learned and Steps to the Future

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.; Heimdahl, Mats P. E.; Reese, Jon Damon

    1999-01-01

    Previously, we defined a blackbox formal system modeling language called RSML (Requirements State Machine Language). The language was developed over several years while specifying the system requirements for a collision avoidance system for commercial passenger aircraft. During the language development, we received continual feedback and evaluation by FAA employees and industry representatives, which helped us to produce a specification language that is easily learned and used by application experts. Since the completion of that project, we have continued our research on specification languages. This research is part of a larger effort to investigate the more general problem of providing tools to assist in developing embedded systems. Our latest experimental toolset is called SpecTRM (Specification Tools and Requirements Methodology), and the formal specification language is SpecTRM-RL (SpecTRM Requirements Language). This paper describes what we have learned from our use of RSML and how those lessons were applied to the design of SpecTRM-RL. We discuss our goals for SpecTRM-RL and the design features that support each of these goals.

  10. Health region development from the perspective of system theory - an empirical cross-regional case study.

    PubMed

    Volgger, Michael; Mainil, Tomas; Pechlaner, Harald; Mitas, Ondrej

    2015-01-01

    Governments are increasingly establishing health regions to deal with current challenges of public health service. These regions are seen as instruments to balance public and private stakeholders, and to offer health care to regional citizens as well as to medical/health tourists. However, it is still unclear how the development of such health regions, and their governance, may be conceptualized. We apply Luhmann's system theory approach in the context of a cross-regional case study that compares health region developments in the Autonomous Province of Bolzano-South Tyrol (Italy), with particular regard to the Eastern Dolomites, and in the province of Zeeland (the Netherlands). We suggest that Luhmann's system theory provides a useful set of criteria to evaluate and judge health region development. Fully developed health regions can be understood as autopoietic systems. By emphasizing programs, personnel, and communication channels, these case studies illustrate the suitability of the system theory toolset to analyze the governance and spatial embeddedness of health regions. Additionally, the study contributes to the literature by indicating that health regions are closely related to identity issues and to decision making in regions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Verification Tools Secure Online Shopping, Banking

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Just like rover or rocket technology sent into space, the software that controls these technologies must be extensively tested to ensure reliability and effectiveness. Ames Research Center invented the open-source Java Pathfinder (JPF) toolset for the deep testing of Java-based programs. Fujitsu Labs of America Inc., based in Sunnyvale, California, improved the capabilities of the JPF Symbolic Pathfinder tool, establishing the tool as a means of thoroughly testing the functionality and security of Web-based Java applications such as those used for Internet shopping and banking.

  12. Acoustic Metadata Management and Transparent Access to Networked Oceanographic Data Sets

    DTIC Science & Technology

    2011-09-30

    [Fragment of a DTIC record; the abstract was not cleanly extracted.] The recoverable text describes work with Roberts in Pat Halpin's lab to integrate the Marine Geospatial Ecology (GeoEco) toolset into the project's database services, plus figure-caption residue describing ambient-noise band plots (a 1-6 kHz and a 6-96 kHz band per site, with deployments at two sites at Lad seamount) and report-documentation boilerplate. Award number N00014-11-1-0697; http://cetus.ucsd.edu

  13. Distributed data mining on grids: services, tools, and applications.

    PubMed

    Cannataro, Mario; Congiusta, Antonio; Pugliese, Andrea; Talia, Domenico; Trunfio, Paolo

    2004-12-01

    Data mining algorithms are widely used today for the analysis of large corporate and scientific datasets stored in databases and data archives. Industry, science, and commerce fields often need to analyze very large datasets maintained over geographically distributed sites by using the computational power of distributed and parallel systems. The grid can play a significant role in providing an effective computational support for distributed knowledge discovery applications. For the development of data mining applications on grids we designed a system called Knowledge Grid. This paper describes the Knowledge Grid framework and presents the toolset provided by the Knowledge Grid for implementing distributed knowledge discovery. The paper discusses how to design and implement data mining applications by using the Knowledge Grid tools starting from searching grid resources, composing software and data components, and executing the resulting data mining process on a grid. Some performance results are also discussed.

  14. Big Data Analytics for Genomic Medicine

    PubMed Central

    He, Karen Y.; Ge, Dongliang; He, Max M.

    2017-01-01

    Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients’ genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining various large-scale data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure exhibit challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs. PMID:28212287

  15. FootFall: A Ground Based Operations Toolset Enabling Walking for the ATHLETE Rover

    NASA Technical Reports Server (NTRS)

    SunSpiral, Vytas; Chavez-Clemente, Daniel; Broxton, Michael; Keely, Leslie; Mihelich, Patrick; Mittman, David; Collins, Curtis

    2008-01-01

    The ATHLETE (All-Terrain Hex-Limbed Extra-Terrestrial Explorer) vehicle consists of six identical, six degree of freedom limbs. FootFall is a ground tool for ATHLETE intended to provide an operator with integrated situational awareness, terrain reconstruction, stability and safety analysis, motion planning, and decision support capabilities to enable the efficient generation of flight software command sequences for walking. FootFall has been under development at NASA Ames for the last year, and having accomplished the initial integration, it is being used to generate command sequences for single footfalls. In this paper, the architecture of FootFall in its current state will be presented, results from the recent Human Robotic Systems Project's Integrated Field Test (Moses Lake, Washington, June, 2008) will be discussed, and future plans for extending the capabilities of FootFall to enable ATHLETE to walk across a boulder field in real time will be described.

  16. Observation Planning Made Simple with Science Opportunity Analyzer (SOA)

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Polanskey, Carol A.

    2004-01-01

    As NASA undertakes the exploration of the Moon and Mars as well as the rest of the Solar System while continuing to investigate Earth's oceans, winds, atmosphere, weather, etc., the ever-present need to allow operations users to easily define their observations increases. Operations teams need to be able to determine the best time to perform an observation, as well as its duration and other parameters such as the observation target. In addition, operations teams need to be able to check the observation for validity against objectives and intent as well as spacecraft constraints such as turn rates and acceleration or pointing exclusion zones. Science Opportunity Analyzer (SOA), in development for the last six years, is a multi-mission toolset that has been built to meet those needs. Operations team members can follow six simple steps to define an observation without having to know the complexities of orbital mechanics, coordinate transformations, or the spacecraft itself.

  17. Big Data Analytics for Genomic Medicine.

    PubMed

    He, Karen Y; Ge, Dongliang; He, Max M

    2017-02-15

    Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients' genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining various large-scale data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure exhibit challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs.

  18. Conceptual Design of Environmentally Friendly Rotorcraft - A Comparison of NASA and ONERA Approaches

    NASA Technical Reports Server (NTRS)

    Russell, Carl; Basset, Pierre-Marie

    2015-01-01

    In 2011, a task was initiated under the US-French Project Agreement on rotorcraft studies to collaborate on design methodologies for environmentally friendly rotorcraft. This paper summarizes the efforts of that collaboration. The French and US aerospace agencies, ONERA and NASA, have their own software toolsets and approaches to rotorcraft design. The first step of this research effort was to understand how rotorcraft impact the environment, with the initial focus on air pollution. Second, similar baseline helicopters were developed for a passenger transport mission, using NASA and ONERA rotorcraft design software tools. Comparisons were made between the designs generated by the two tools. Finally, rotorcraft designs were generated targeting reduced environmental impact. The results show that a rotorcraft design that targets reduced environmental impact can be significantly different than one that targets traditional cost drivers, such as fuel burn and empty weight.

  19. First records of tool-set use for ant-dipping by Eastern chimpanzees (Pan troglodytes schweinfurthii) in the Kalinzu Forest Reserve, Uganda.

    PubMed

    Hashimoto, Chie; Isaji, Mina; Koops, Kathelijne; Furuichi, Takeshi

    2015-10-01

    Chimpanzees at numerous study sites are known to prey on army ants by using a single wand to dip into the ant nest or column. However, in Goualougo (Republic of Congo) in Central Africa, chimpanzees use a different technique: a woody sapling to perforate the ant nest, followed by a herb stem used as a dipping tool to harvest the army ants. Use of a tool set has also been found in Guinea, West Africa: at Seringbara in the Nimba Mountains and at nearby Bossou. There are, however, no reports for chimpanzees in East Africa. We observed use of such a tool set in Kalinzu, Uganda, for the first time by Eastern chimpanzees. This behavior was observed among one group of chimpanzees at Kalinzu (S-group) but not among the adjacent group (M-group) with partly overlapping ranging areas, despite the fact that the latter group has been under intensive observation since 1997. In Uganda, ant-dipping has not been observed in the northern three sites (Budongo, Semliki, and Kibale) but has been observed or seems to occur in the southern sites (Kalinzu and Bwindi), which suggests that ant-dipping was invented by and spread from the southern region after the northern and southern forest blocks became separated. Use of a tool-set by only one group at Kalinzu further suggests that this behavior was recently invented and has not yet spread to the other group via migrating females.

  20. Spatio-Temporal Process Variability in Watershed Scale Wetland Restoration Planning

    NASA Astrophysics Data System (ADS)

    Evenson, G. R.

    2012-12-01

    Watershed scale restoration decision making processes are increasingly informed by quantitative methodologies providing site-specific restoration recommendations - sometimes referred to as "systematic planning." The more advanced of these methodologies are characterized by a coupling of search algorithms and ecological models to discover restoration plans that optimize environmental outcomes. Yet while these methods have exhibited clear utility as decision support toolsets, they may be critiqued for flawed evaluations of spatio-temporally variable processes fundamental to watershed scale restoration. Hydrologic and non-hydrologic mediated process connectivity along with post-restoration habitat dynamics, for example, are commonly ignored yet known to appreciably affect restoration outcomes. This talk will present a methodology to evaluate such spatio-temporally complex processes in the production of watershed scale wetland restoration plans. Using the Tuscarawas Watershed in Eastern Ohio as a case study, a genetic algorithm will be coupled with the Soil and Water Assessment Tool (SWAT) to reveal optimal wetland restoration plans as measured by their capacity to maximize nutrient reductions. Then, a so-called "graphical" representation of the optimization problem will be implemented in-parallel to promote hydrologic and non-hydrologic mediated connectivity amongst existing wetlands and sites selected for restoration. Further, various search algorithm mechanisms will be discussed as a means of accounting for temporal complexities such as post-restoration habitat dynamics. Finally, generalized patterns of restoration plan optimality will be discussed as an alternative and possibly superior decision support toolset given the complexity and stochastic nature of spatio-temporal process variability.
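
    The study couples a genetic algorithm with SWAT; as a self-contained, hedged stand-in for that coupling, the sketch below runs a tiny GA that selects restoration sites to maximize a synthetic nutrient-reduction score under a budget constraint, with the SWAT evaluation replaced by an invented fitness function.

```python
# Toy sketch of the GA side of such a coupling (SWAT replaced by a synthetic fitness):
# choose wetland restoration sites maximizing nutrient reduction under a budget, using
# bit-string plans, tournament selection, uniform crossover, and bit-flip mutation.
import random

random.seed(3)
N_SITES, BUDGET = 30, 10.0
cost = [random.uniform(0.5, 2.0) for _ in range(N_SITES)]      # hypothetical $M per site
removal = [random.uniform(0.1, 1.0) for _ in range(N_SITES)]   # hypothetical nutrient removal

def fitness(plan):
    total_cost = sum(c for c, bit in zip(cost, plan) if bit)
    if total_cost > BUDGET:
        return 0.0                                   # infeasible plans score nothing
    return sum(r for r, bit in zip(removal, plan) if bit)

def tournament(pop):
    return max(random.sample(pop, 3), key=fitness)

# Seed with sparse plans so most of the initial population stays within budget.
pop = [[1 if random.random() < 0.2 else 0 for _ in range(N_SITES)] for _ in range(60)]
for _ in range(150):
    nxt = []
    while len(nxt) < len(pop):
        a, b = tournament(pop), tournament(pop)
        child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]       # crossover
        child = [1 - bit if random.random() < 0.02 else bit for bit in child]   # mutation
        nxt.append(child)
    pop = nxt

best = max(pop, key=fitness)
print("best nutrient reduction:", round(fitness(best), 2), "sites selected:", sum(best))
```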

  1. An Integrated Toolset for Agile Systems Engineering Requirements Analysis

    DTIC Science & Technology

    2011-05-19

    [Fragment of a DTIC briefing; the body was not cleanly extracted.] The recoverable content is from a Boeing Defense, Space & Security presentation on Lean-Agile software and an integrated toolset for Agile systems engineering requirements analysis, referencing use cases, a collaboration tool, data management, and a workflow of running tests in the test lab, redlining the STD, and updating the collaboration tool. The remainder is slide residue and ITAR/EAR export-control and copyright boilerplate.

  2. Acoustic Response of Underwater Munitions near a Sediment Interface: Measurement Model Comparisons and Classification Schemes

    DTIC Science & Technology

    2015-04-23

    [Fragment of a DTIC report; only front-matter was extracted.] The recoverable content is a list-of-figures excerpt (pulse-compressed baseband signals and a SAS image for sequence 40 from TREX13; finite-element meshes and simulations for aluminum and steel replicas of a 100-mm UXO) and acronym definitions, including PCB (pulse-compressed and baseband), PC SWAT (Personal Computer Shallow Water Acoustic Toolset), and PondEx09/PondEx10 (Pond Experiments 2009 and 2010).

  3. Multidisciplinary Analysis and Optimization Generation 1 and Next Steps

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia Gutierrez

    2008-01-01

    The Multidisciplinary Analysis & Optimization Working Group (MDAO WG) of the Systems Analysis Design & Optimization (SAD&O) discipline in the Fundamental Aeronautics Program's Subsonic Fixed Wing (SFW) project completed three major milestones during Fiscal Year (FY)08: "Requirements Definition" Milestone (1/31/08); "GEN 1 Integrated Multi-disciplinary Toolset" (Annual Performance Goal) (6/30/08); and "Define Architecture & Interfaces for Next Generation Open Source MDAO Framework" Milestone (9/30/08). Details of all three milestones are explained including documentation available, potential partner collaborations, and next steps in FY09.

  4. Virtual Diagnostic Interface: Aerospace Experimentation in the Synthetic Environment

    NASA Technical Reports Server (NTRS)

    Schwartz, Richard J.; McCrea, Andrew C.

    2009-01-01

    The Virtual Diagnostics Interface (ViDI) methodology combines two-dimensional image processing and three-dimensional computer modeling to provide comprehensive in-situ visualizations commonly utilized for in-depth planning of wind tunnel and flight testing, real time data visualization of experimental data, and unique merging of experimental and computational data sets in both real-time and post-test analysis. The preparation of such visualizations encompasses the realm of interactive three-dimensional environments, traditional and state of the art image processing techniques, database management and development of toolsets with user friendly graphical user interfaces. ViDI has been under development at the NASA Langley Research Center for over 15 years, and has a long track record of providing unique and insightful solutions to a wide variety of experimental testing techniques and validation of computational simulations. This report will address the various aspects of ViDI and how it has been applied to test programs as varied as NASCAR race car testing in NASA wind tunnels to real-time operations concerning Space Shuttle aerodynamic flight testing. In addition, future trends and applications will be outlined in the paper.

  5. Virtual Diagnostic Interface: Aerospace Experimentation in the Synthetic Environment

    NASA Technical Reports Server (NTRS)

    Schwartz, Richard J.; McCrea, Andrew C.

    2010-01-01

    The Virtual Diagnostics Interface (ViDI) methodology combines two-dimensional image processing and three-dimensional computer modeling to provide comprehensive in-situ visualizations commonly utilized for in-depth planning of wind tunnel and flight testing, real time data visualization of experimental data, and unique merging of experimental and computational data sets in both real-time and post-test analysis. The preparation of such visualizations encompasses the realm of interactive three-dimensional environments, traditional and state of the art image processing techniques, database management and development of toolsets with user friendly graphical user interfaces. ViDI has been under development at the NASA Langley Research Center for over 15 years, and has a long track record of providing unique and insightful solutions to a wide variety of experimental testing techniques and validation of computational simulations. This report will address the various aspects of ViDI and how it has been applied to test programs as varied as NASCAR race car testing in NASA wind tunnels to real-time operations concerning Space Shuttle aerodynamic flight testing. In addition, future trends and applications will be outlined in the paper.

  6. Integration of tools for the Design and Assessment of High-Performance, Highly Reliable Computing Systems (DAHPHRS), phase 1

    NASA Technical Reports Server (NTRS)

    Scheper, C.; Baker, R.; Frank, G.; Yalamanchili, S.; Gray, G.

    1992-01-01

    Systems for Space Defense Initiative (SDI) space applications typically require both high performance and very high reliability. These requirements present the systems engineer evaluating such systems with the extremely difficult problem of conducting performance and reliability trade-offs over large design spaces. A controlled development process supported by appropriate automated tools must be used to assure that the system will meet design objectives. This report describes an investigation of methods, tools, and techniques necessary to support performance and reliability modeling for SDI systems development. Models of the JPL Hypercubes, the Encore Multimax, and the C.S. Draper Lab Fault-Tolerant Parallel Processor (FTPP) parallel-computing architectures using candidate SDI weapons-to-target assignment algorithms as workloads were built and analyzed as a means of identifying the necessary system models, how the models interact, and what experiments and analyses should be performed. As a result of this effort, weaknesses in the existing methods and tools were revealed and capabilities that will be required for both individual tools and an integrated toolset were identified.

  7. A modular toolset for recombination transgenesis and neurogenetic analysis of Drosophila.

    PubMed

    Wang, Ji-Wu; Beck, Erin S; McCabe, Brian D

    2012-01-01

    Transgenic Drosophila have contributed extensively to our understanding of nervous system development, physiology and behavior in addition to being valuable models of human neurological disease. Here, we have generated a novel series of modular transgenic vectors designed to optimize and accelerate the production and analysis of transgenes in Drosophila. We constructed a novel vector backbone, pBID, that allows both phiC31 targeted transgene integration and incorporates insulator sequences to ensure specific and uniform transgene expression. Upon this framework, we have built a series of constructs that are either backwards compatible with existing restriction enzyme based vectors or utilize Gateway recombination technology for high-throughput cloning. These vectors allow for endogenous promoter or Gal4 targeted expression of transgenic proteins with or without fluorescent protein or epitope tags. In addition, we have generated constructs that facilitate transgenic splice isoform specific RNA inhibition of gene expression. We demonstrate the utility of these constructs to analyze proteins involved in nervous system development, physiology and neurodegenerative disease. We expect that these reagents will facilitate the proficiency and sophistication of Drosophila genetic analysis in both the nervous system and other tissues.

  8. Employing WebGL to develop interactive stereoscopic 3D content for use in biomedical visualization

    NASA Astrophysics Data System (ADS)

    Johnston, Semay; Renambot, Luc; Sauter, Daniel

    2013-03-01

    Web Graphics Library (WebGL), the forthcoming web standard for rendering native 3D graphics in a browser, represents an important addition to the biomedical visualization toolset. It is projected to become a mainstream method of delivering 3D online content due to shrinking support for third-party plug-ins. Additionally, it provides a virtual reality (VR) experience to web users accommodated by the growing availability of stereoscopic displays (3D TV, desktop, and mobile). WebGL's value in biomedical visualization has been demonstrated by applications for interactive anatomical models, chemical and molecular visualization, and web-based volume rendering. However, a lack of instructional literature specific to the field prevents many from utilizing this technology. This project defines a WebGL design methodology for a target audience of biomedical artists with a basic understanding of web languages and 3D graphics. The methodology was informed by the development of an interactive web application depicting the anatomy and various pathologies of the human eye. The application supports several modes of stereoscopic displays for a better understanding of 3D anatomical structures.

  9. Integrating hydrology into catchment scale studies - need for new paradigms?

    NASA Astrophysics Data System (ADS)

    Teutsch, G.

    2009-04-01

    Until the seventies, scientific development in the field of groundwater hydrology concentrated mainly on a better understanding of the physics of subsurface flow in homogeneous or simply stratified porous or fractured media. Then, from the mid-seventies onward, a much more complex vision of groundwater hydrology gradually developed. A more realistic description of the subsurface, including its heterogeneity and the predominant physico-chemical-biological reactions, as well as technologies for the efficient clean-up of contaminants, developed over the past 30 years, much facilitated by advances in numerical modelling techniques and the boost in computer power. Even though the advancements in this field have been very significant, a new grand challenge has evolved during the past 10 years: bringing together the fields needed to build Integrated Watershed Management Systems (IWMS). The fundamental conceptual question is: Do we need new approaches to groundwater hydrology, maybe even new paradigms, in order to successfully build IWMS - or can we simply extrapolate our existing concepts and tool-sets to the scale of catchments and watersheds and simply add some interfaces to adjacent disciplines like economics, ecology and others? This lecture tries to provide some of the answers by describing some successful examples.

  10. Metabolomics for Assessment of Nutritional Status

    PubMed Central

    Zivkovic, Angela M.; German, J. Bruce

    2010-01-01

    Purpose of review The current rise in diet-related diseases continues to be one of the most significant health problems facing both the developed and the developing world. The use of metabolomics – the accurate and comprehensive measurement of a significant fraction of important metabolites in accessible biological fluids – for the assessment of nutritional status, is a promising way forward. The basic toolset, targets, and knowledge are all being developed in the emerging field of metabolomics, yet important knowledge and technology gaps will need to be addressed in order to bring such assessment to practice. Recent findings Dysregulation within the principal metabolic organs (e.g. intestine, adipose, skeletal muscle, liver) are at the center of a diet-disease paradigm that includes metabolic syndrome, type 2 diabetes, and obesity. The assessment of both essential nutrient status, and the more comprehensive systemic metabolic response to dietary, lifestyle, and environmental influences (e.g. metabolic phenotype) are necessary for the evaluation of status in individuals that can identify the multiple targets of intervention needed to address metabolic disease. Summary The first proofs of principle building the knowledge to bring actionable metabolic diagnostics to practice through metabolomics are now appearing. PMID:19584717

  11. The Streptococcus mutans Serine/Threonine Kinase, PknB, Regulates Competence Development, Bacteriocin Production, and Cell Wall Metabolism ▿

    PubMed Central

    Banu, Liliana Danusia; Conrads, Georg; Rehrauer, Hubert; Hussain, Haitham; Allan, Elaine; van der Ploeg, Jan R.

    2010-01-01

    Bacteria can detect, transmit, and react to signals from the outside world by using two-component systems (TCS) and serine-threonine kinases and phosphatases. Streptococcus mutans contains one serine-threonine kinase, encoded by pknB. A gene encoding a serine-threonine phosphatase, pppL, is located upstream of pknB. In this study, the phenotypes of pknB and pppL single mutants and a pknB pppL double mutant were characterized. All mutants exhibited a reduction in genetic transformability and biofilm formation, showed abnormal cell shapes, grew slower than the wild-type strain in several complex media, and exhibited reduced acid tolerance. The mutants had reduced cariogenic capacity but no significant defects in colonization in a rat caries model. Whole-genome transcriptome analysis revealed that a pknB mutant showed reduced expression of genes involved in bacteriocin production and genetic competence. Among the genes that were differentially regulated in the pknB mutant, several were likely to be involved in cell wall metabolism. One such gene, SMU.2146c, and two genes encoding bacteriocins were shown to also be downregulated in a vicK mutant, which encodes a sensor kinase involved in the response to oxidative stress. Collectively, the results lead us to speculate that PknB may modulate the activity of the two-component signal transduction systems VicKR and ComDE. Real-time reverse transcriptase PCR (RT-PCR) showed that genes downregulated in the pknB mutant were upregulated in the pppL mutant, indicating that PppL serves to counteract PknB. PMID:20231406

  12. The Streptococcus mutans serine/threonine kinase, PknB, regulates competence development, bacteriocin production, and cell wall metabolism.

    PubMed

    Banu, Liliana Danusia; Conrads, Georg; Rehrauer, Hubert; Hussain, Haitham; Allan, Elaine; van der Ploeg, Jan R

    2010-05-01

    Bacteria can detect, transmit, and react to signals from the outside world by using two-component systems (TCS) and serine-threonine kinases and phosphatases. Streptococcus mutans contains one serine-threonine kinase, encoded by pknB. A gene encoding a serine-threonine phosphatase, pppL, is located upstream of pknB. In this study, the phenotypes of pknB and pppL single mutants and a pknB pppL double mutant were characterized. All mutants exhibited a reduction in genetic transformability and biofilm formation, showed abnormal cell shapes, grew slower than the wild-type strain in several complex media, and exhibited reduced acid tolerance. The mutants had reduced cariogenic capacity but no significant defects in colonization in a rat caries model. Whole-genome transcriptome analysis revealed that a pknB mutant showed reduced expression of genes involved in bacteriocin production and genetic competence. Among the genes that were differentially regulated in the pknB mutant, several were likely to be involved in cell wall metabolism. One such gene, SMU.2146c, and two genes encoding bacteriocins were shown to also be downregulated in a vicK mutant, which encodes a sensor kinase involved in the response to oxidative stress. Collectively, the results lead us to speculate that PknB may modulate the activity of the two-component signal transduction systems VicKR and ComDE. Real-time reverse transcriptase PCR (RT-PCR) showed that genes downregulated in the pknB mutant were upregulated in the pppL mutant, indicating that PppL serves to counteract PknB.

  13. 760nm: a new laser diode wavelength for hair removal modules

    NASA Astrophysics Data System (ADS)

    Wölz, Martin; Zorn, Martin; Pietrzak, Agnieszka; Kindsvater, Alex; Meusel, Jens; Hülsewede, Ralf; Sebastian, Jürgen

    2015-02-01

    A new high-power semiconductor laser diode module, emitting at 760 nm, is introduced. This wavelength permits optimum treatment results for fair-skinned individuals, as demonstrated by the use of Alexandrite lasers in dermatology. Hair removal applications benefit from the industry-standard diode laser design utilizing a highly efficient, portable and light-weight construction. We show the performance of a tap-water-cooled encapsulated laser diode stack with a window for use in dermatological hand-pieces. The stack design takes into account the pulse lengths required for selectivity in heating the hair follicle vs. the skin. Super-long pulse durations place the hair removal laser between industry-standard CW and QCW applications. The new 760 nm laser diode bars are 30% fill factor devices with 1.5 mm long resonator cavities. At CW operation, these units provide 40 W of optical power at 43 A with wall-plug efficiency greater than 50%. The maximum output power before COMD is 90 W. Lifetime measurements starting at 40 W show an optical power loss of 20% after about 3000 h. The hair removal modules are available in 1x3, 1x8 and 2x8 bar configurations.
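
    A quick back-of-the-envelope check of the quoted operating point is shown below, taking wall-plug efficiency as optical output divided by electrical input; the derived voltage and heat load are implications of the abstract's figures, not datasheet values.

```python
# Back-of-the-envelope check of the quoted operating point (WPE taken as optical power
# out / electrical power in; derived values follow from the abstract's figures only).
P_opt = 40.0     # W optical at CW
I_op = 43.0      # A drive current
wpe = 0.50       # ">50%" -> use 0.50 as the conservative bound

P_elec_max = P_opt / wpe          # electrical input implied by the WPE bound
V_drop_max = P_elec_max / I_op    # implied voltage across the emitting element
P_heat_max = P_elec_max - P_opt   # waste heat the tap-water cooling must remove

print(f"electrical input <= {P_elec_max:.0f} W")
print(f"implied voltage  <= {V_drop_max:.2f} V")
print(f"waste heat       <= {P_heat_max:.0f} W")
```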

  14. Global Data Toolset (GDT)

    USGS Publications Warehouse

    Cress, Jill J.; Riegle, Jodi L.

    2007-01-01

    According to the United Nations Environment Programme World Conservation Monitoring Centre (UNEP-WCMC) approximately 60 percent of the data contained in the World Database on Protected Areas (WDPA) has missing or incomplete boundary information. As a result, global analyses based on the WDPA can be inaccurate, and professionals responsible for natural resource planning and priority setting must rely on incomplete geospatial data sets. To begin to address this problem the World Data Center for Biodiversity and Ecology, in cooperation with the U. S. Geological Survey (USGS) Rocky Mountain Geographic Science Center (RMGSC), the National Biological Information Infrastructure (NBII), the Global Earth Observation System, and the Inter-American Biodiversity Information Network (IABIN) sponsored a Protected Area (PA) workshop in Asuncion, Paraguay, in November 2007. The primary goal of this workshop was to train representatives from eight South American countries on the use of the Global Data Toolset (GDT) for reviewing and editing PA data. Use of the GDT will allow PA experts to compare their national data to other data sets, including non-governmental organization (NGO) and WCMC data, in order to highlight inaccuracies or gaps in the data, and then to apply any needed edits, especially in the delineation of the PA boundaries. In addition, familiarizing the participants with the web-enabled GDT will allow them to maintain and improve their data after the workshop. Once data edits have been completed the GDT will also allow the country authorities to perform any required review and validation processing. Once validated, the data can be used to update the global WDPA and IABIN databases, which will enhance analysis on global and regional levels.

  15. Mercury Toolset for Spatiotemporal Metadata

    NASA Technical Reports Server (NTRS)

    Wilson, Bruce E.; Palanisamy, Giri; Devarakonda, Ranjeet; Rhyne, B. Timothy; Lindsley, Chris; Green, James

    2010-01-01

    Mercury (http://mercury.ornl.gov) is a set of tools for federated harvesting, searching, and retrieving metadata, particularly spatiotemporal metadata. Version 3.0 of the Mercury toolset provides orders of magnitude improvements in search speed, support for additional metadata formats, integration with Google Maps for spatial queries, facetted type search, support for RSS (Really Simple Syndication) delivery of search results, and enhanced customization to meet the needs of the multiple projects that use Mercury. It provides a single portal to very quickly search for data and information contained in disparate data management systems, each of which may use different metadata formats. Mercury harvests metadata and key data from contributing project servers distributed around the world and builds a centralized index. The search interfaces then allow the users to perform a variety of fielded, spatial, and temporal searches across these metadata sources. This centralized repository of metadata with distributed data sources provides extremely fast search results to the user, while allowing data providers to advertise the availability of their data and maintain complete control and ownership of that data. Mercury periodically (typically daily) harvests metadata sources through a collection of interfaces and re-indexes these metadata to provide extremely rapid search capabilities, even over collections with tens of millions of metadata records. A number of both graphical and application interfaces have been constructed within Mercury, to enable both human users and other computer programs to perform queries. Mercury was also designed to support multiple different projects, so that the particular fields that can be queried and used with search filters are easy to configure for each different project.
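
    Mercury itself operates at the scale of tens of millions of records; as a tiny, hedged stand-in for its harvest-then-index workflow, the sketch below builds an in-memory collection of hypothetical metadata records and runs a fielded plus temporal query. The record structure and field names are invented for illustration.

```python
# Tiny stand-in for a harvest-then-index metadata workflow (record structure and field
# names invented; not Mercury's actual index or query syntax).
from datetime import date

harvested = [   # records as they might arrive from contributing data providers
    {"title": "Arctic soil respiration", "keywords": {"carbon", "soil"},
     "start": date(2004, 6, 1), "end": date(2006, 9, 30), "source": "provider-A"},
    {"title": "Tropical forest flux tower", "keywords": {"carbon", "flux"},
     "start": date(2010, 1, 1), "end": date(2012, 12, 31), "source": "provider-B"},
    {"title": "Prairie phenology imagery", "keywords": {"phenology"},
     "start": date(2011, 3, 1), "end": date(2011, 11, 1), "source": "provider-A"},
]

def search(records, keyword=None, after=None, before=None):
    """Fielded plus temporal query: keyword match and overlap with [after, before]."""
    hits = []
    for rec in records:
        if keyword and keyword not in rec["keywords"]:
            continue
        if after and rec["end"] < after:
            continue
        if before and rec["start"] > before:
            continue
        hits.append(rec["title"])
    return hits

print(search(harvested, keyword="carbon", after=date(2009, 1, 1)))
# ['Tropical forest flux tower']
```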

  16. Mercury Toolset for Spatiotemporal Metadata

    NASA Astrophysics Data System (ADS)

    Devarakonda, Ranjeet; Palanisamy, Giri; Green, James; Wilson, Bruce; Rhyne, B. Timothy; Lindsley, Chris

    2010-06-01

    Mercury (http://mercury.ornl.gov) is a set of tools for federated harvesting, searching, and retrieving metadata, particularly spatiotemporal metadata. Version 3.0 of the Mercury toolset provides orders of magnitude improvements in search speed, support for additional metadata formats, integration with Google Maps for spatial queries, facetted type search, support for RSS (Really Simple Syndication) delivery of search results, and enhanced customization to meet the needs of the multiple projects that use Mercury. It provides a single portal to very quickly search for data and information contained in disparate data management systems, each of which may use different metadata formats. Mercury harvests metadata and key data from contributing project servers distributed around the world and builds a centralized index. The search interfaces then allow the users to perform a variety of fielded, spatial, and temporal searches across these metadata sources. This centralized repository of metadata with distributed data sources provides extremely fast search results to the user, while allowing data providers to advertise the availability of their data and maintain complete control and ownership of that data. Mercury periodically (typically daily) harvests metadata sources through a collection of interfaces and re-indexes these metadata to provide extremely rapid search capabilities, even over collections with tens of millions of metadata records. A number of both graphical and application interfaces have been constructed within Mercury, to enable both human users and other computer programs to perform queries. Mercury was also designed to support multiple different projects, so that the particular fields that can be queried and used with search filters are easy to configure for each different project.

  17. Low-dimensional dynamical characterization of human performance of cancer patients using motion data.

    PubMed

    Hasnain, Zaki; Li, Ming; Dorff, Tanya; Quinn, David; Ueno, Naoto T; Yennu, Sriram; Kolatkar, Anand; Shahabi, Cyrus; Nocera, Luciano; Nieva, Jorge; Kuhn, Peter; Newton, Paul K

    2018-05-18

    Biomechanical characterization of human performance with respect to fatigue and fitness is relevant in many settings; however, it is usually limited to either fully qualitative assessments or invasive methods which require a significant experimental setup consisting of numerous sensors, force plates, and motion detectors. Qualitative assessments are difficult to standardize due to their intrinsically subjective nature; invasive methods, on the other hand, provide reliable metrics but are not feasible for large-scale applications. Presented here is a dynamical toolset for detecting performance groups using a non-invasive system based on the Microsoft Kinect motion capture sensor, and a case study of 37 cancer patients performing two clinically monitored tasks before and after therapy regimens. Dynamical features are extracted from the motion time series data and evaluated based on their ability to i) cluster patients into coherent fitness groups using unsupervised learning algorithms and ii) predict Eastern Cooperative Oncology Group performance status via supervised learning. The unsupervised patient clustering is comparable to clustering based on physician-assigned Eastern Cooperative Oncology Group status in that both have similar concordance with change in weight before and after therapy as well as with unexpected hospitalizations throughout the study. The extracted dynamical features can predict physician, coordinator, and patient Eastern Cooperative Oncology Group status with an accuracy of approximately 80%. The non-invasive Microsoft Kinect sensor and the proposed dynamical toolset, comprising data preprocessing, feature extraction, dimensionality reduction, and machine learning, offer a low-cost and general method for performance segregation and can complement existing qualitative clinical assessments. Copyright © 2018 Elsevier Ltd. All rights reserved.
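
    The paper's feature set and models are not reproduced here; as a schematic, hedged stand-in for the pipeline it describes (preprocessing, dynamical feature extraction, dimensionality reduction, unsupervised learning), the sketch below builds simple features from synthetic motion traces and clusters them with scikit-learn.

```python
# Schematic stand-in for the pipeline in the abstract (synthetic motion traces, simple
# dynamical features, PCA, k-means); not the authors' feature set or models.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
t = np.linspace(0.0, 10.0, 300)

def trace(freq, jitter):
    """Synthetic joint-position time series: an oscillation plus measurement noise."""
    return np.sin(2 * np.pi * freq * t) + rng.normal(0.0, jitter, t.size)

# Two hypothetical performance groups: steady movers vs. slower, more irregular movers.
signals = [trace(1.0, 0.1) for _ in range(15)] + [trace(0.4, 0.4) for _ in range(15)]

def features(x):
    v = np.gradient(x, t)                          # velocity
    return [x.std(), np.abs(v).mean(), v.std()]    # simple dynamical descriptors

X = np.array([features(s) for s in signals])
X_red = PCA(n_components=2).fit_transform(X)       # dimensionality reduction
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_red)
print(labels)   # the two synthetic groups should largely separate into the two clusters
```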

  18. Querying Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Naylor, Dwight; Pai, Ganesh

    2014-01-01

    Querying a safety case to show how the various stakeholders' concerns about system safety are addressed has been put forth as one of the benefits of argument-based assurance (in a recent study by the Health Foundation, UK, which reviewed the use of safety cases in safety-critical industries). However, neither the literature nor current practice offer much guidance on querying mechanisms appropriate for, or available within, a safety case paradigm. This paper presents a preliminary approach that uses a formal basis for querying safety cases, specifically Goal Structuring Notation (GSN) argument structures. Our approach semantically enriches GSN arguments with domain-specific metadata that the query language leverages, along with its inherent structure, to produce views. We have implemented the approach in our toolset AdvoCATE, and illustrate it by application to a fragment of the safety argument for an Unmanned Aircraft System (UAS) being developed at NASA Ames. We also discuss the potential practical utility of our query mechanism within the context of the existing framework for UAS safety assurance.
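
    AdvoCATE's query language is not reproduced here; as a minimal, hedged sketch of the idea, the snippet below represents a GSN-like argument as nodes annotated with domain metadata and extracts the "view" of the argument that addresses a given hazard. Node names and metadata fields are invented.

```python
# Minimal sketch (not AdvoCATE's query language): a GSN-like argument as annotated
# nodes, plus a query returning the sub-view whose subtree addresses a given hazard.
safety_case = {
    "G1": {"type": "goal", "text": "UAS operations are acceptably safe",
           "hazards": [], "children": ["S1"]},
    "S1": {"type": "strategy", "text": "Argue over identified hazards",
           "hazards": [], "children": ["G2", "G3"]},
    "G2": {"type": "goal", "text": "Loss-of-link hazard is mitigated",
           "hazards": ["loss_of_link"], "children": ["E1"]},
    "G3": {"type": "goal", "text": "Mid-air collision hazard is mitigated",
           "hazards": ["midair_collision"], "children": ["E2"]},
    "E1": {"type": "evidence", "text": "Link-loss flight test report",
           "hazards": ["loss_of_link"], "children": []},
    "E2": {"type": "evidence", "text": "Detect-and-avoid analysis",
           "hazards": ["midair_collision"], "children": []},
}

def view(case, hazard, node="G1"):
    """Return node ids of the argument fragment whose subtree mentions the hazard."""
    rec = case[node]
    kept = [nid for child in rec["children"] for nid in view(case, hazard, child)]
    if hazard in rec["hazards"] or kept:
        return [node] + kept
    return []

print(view(safety_case, "loss_of_link"))   # ['G1', 'S1', 'G2', 'E1']
```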

  19. AstroBlend: An astrophysical visualization package for Blender

    NASA Astrophysics Data System (ADS)

    Naiman, J. P.

    2016-04-01

    The rapid growth in scale and complexity of both computational and observational astrophysics over the past decade necessitates efficient and intuitive methods for examining and visualizing large datasets. Here, I present AstroBlend, an open-source Python library for use within the three dimensional modeling software, Blender. While Blender has been a popular open-source software among animators and visual effects artists, in recent years it has also become a tool for visualizing astrophysical datasets. AstroBlend combines the three dimensional capabilities of Blender with the analysis tools of the widely used astrophysical toolset, yt, to afford both computational and observational astrophysicists the ability to simultaneously analyze their data and create informative and appealing visualizations. The introduction of this package includes a description of features, work flow, and various example visualizations. A website - www.astroblend.com - has been developed which includes tutorials, and a gallery of example images and movies, along with links to downloadable data, three dimensional artistic models, and various other resources.

  20. A state indicator on regional scale for high-voltage power lines: defining a priority for in situ inspections.

    PubMed

    Comelli, M; Colonna, N; Martini, L; Licitra, G

    2009-12-01

    An integrated system to evaluate exposure to the magnetic field generated by power lines has been developed using a specific simulation model (PLEIA-EMF). This is part of a software toolset that has undergone internal suitability verification and in-field validation. A state indicator related to each span has been determined using the data extracted from digital cartography, the magnetic field calculated by PLEIA, and the number of people living in the nearest buildings. In this way, it is possible to identify potential critical situations in the area considered, focusing attention on those cases with the highest exposure levels and involving the largest number of people. A campaign of inspections has been planned using PLEIA simulations. The reliability of the stored technical data and the real population exposure levels have been evaluated in critical cases identified through the methodology described here. The procedures leading to the indicator and the modalities of the in situ inspections are presented.
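
    PLEIA-EMF's field model and indicator definition are not reproduced here; as a crude, hedged stand-in, the sketch below uses the infinite-straight-conductor approximation B = mu0*I / (2*pi*r) and weights each span's field estimate by the nearby population. All span data and the weighting scheme are invented.

```python
# Crude stand-in for a per-span exposure indicator (not PLEIA-EMF): magnetic flux
# density from the infinite-straight-conductor approximation, weighted by the number
# of residents in the nearest buildings. All span data are invented for illustration.
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A

spans = [   # (span id, line current in A, distance to nearest building in m, residents)
    ("span-01", 600.0, 35.0, 12),
    ("span-02", 600.0, 18.0, 40),
    ("span-03", 300.0, 60.0, 5),
]

def b_field_uT(current_a, distance_m):
    return MU0 * current_a / (2 * math.pi * distance_m) * 1e6   # in microtesla

indicator = {sid: b_field_uT(i, r) * pop for sid, i, r, pop in spans}
for sid, value in sorted(indicator.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{sid}: {value:8.1f}  (field in uT x residents)")
# span-02 ranks first: a moderate field, but the largest exposed population nearby
```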

  1. KeyWare: an open wireless distributed computing environment

    NASA Astrophysics Data System (ADS)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple-client multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  2. Optodynamic simulation of β-adrenergic receptor signalling

    PubMed Central

    Siuda, Edward R.; McCall, Jordan G.; Al-Hasani, Ream; Shin, Gunchul; Il Park, Sung; Schmidt, Martin J.; Anderson, Sonya L.; Planer, William J.; Rogers, John A.; Bruchas, Michael R.

    2015-01-01

    Optogenetics has provided a revolutionary approach to dissecting biological phenomena. However, the generation and use of optically active GPCRs in these contexts is limited and it is unclear how well an opsin-chimera GPCR might mimic endogenous receptor activity. Here we show that a chimeric rhodopsin/β2 adrenergic receptor (opto-β2AR) is similar in dynamics to endogenous β2AR in terms of: cAMP generation, MAP kinase activation and receptor internalization. In addition, we develop and characterize a novel toolset of optically active, functionally selective GPCRs that can bias intracellular signalling cascades towards either G-protein or arrestin-mediated cAMP and MAP kinase pathways. Finally, we show how photoactivation of opto-β2AR in vivo modulates neuronal activity and induces anxiety-like behavioural states in both fiber-tethered and wireless, freely moving animals when expressed in brain regions known to contain β2ARs. These new GPCR approaches enhance the utility of optogenetics and allow for discrete spatiotemporal control of GPCR signalling in vitro and in vivo. PMID:26412387

  3. Optodynamic simulation of β-adrenergic receptor signalling.

    PubMed

    Siuda, Edward R; McCall, Jordan G; Al-Hasani, Ream; Shin, Gunchul; Il Park, Sung; Schmidt, Martin J; Anderson, Sonya L; Planer, William J; Rogers, John A; Bruchas, Michael R

    2015-09-28

    Optogenetics has provided a revolutionary approach to dissecting biological phenomena. However, the generation and use of optically active GPCRs in these contexts is limited and it is unclear how well an opsin-chimera GPCR might mimic endogenous receptor activity. Here we show that a chimeric rhodopsin/β2 adrenergic receptor (opto-β2AR) is similar in dynamics to endogenous β2AR in terms of: cAMP generation, MAP kinase activation and receptor internalization. In addition, we develop and characterize a novel toolset of optically active, functionally selective GPCRs that can bias intracellular signalling cascades towards either G-protein or arrestin-mediated cAMP and MAP kinase pathways. Finally, we show how photoactivation of opto-β2AR in vivo modulates neuronal activity and induces anxiety-like behavioural states in both fiber-tethered and wireless, freely moving animals when expressed in brain regions known to contain β2ARs. These new GPCR approaches enhance the utility of optogenetics and allow for discrete spatiotemporal control of GPCR signalling in vitro and in vivo.

  4. Spatiotemporal control of opioid signaling and behavior

    PubMed Central

    Siuda, Edward R.; Copits, Bryan A.; Schmidt, Martin J.; Baird, Madison A.; Al-Hasani, Ream; Planer, William J.; Funderburk, Samuel C.; McCall, Jordan G.; Gereau, Robert W.; Bruchas, Michael R.

    2015-01-01

    Optogenetics is now a widely accepted tool for spatiotemporal manipulation of neuronal activity. However, a majority of optogenetic approaches use binary on/off control schemes. Here we extend the optogenetic toolset by developing a neuromodulatory approach using a rationale-based design to generate a Gi-coupled, optically sensitive, mu-opioid-like receptor, which we term opto-MOR. We demonstrate that opto-MOR engages canonical mu-opioid signaling through inhibition of adenylyl cyclase, activation of MAPK and G protein-gated inward rectifying potassium (GIRK) channels, and internalizes with kinetics similar to those of the mu-opioid receptor. To assess in vivo utility we expressed a Cre-dependent viral opto-MOR in RMTg/VTA GABAergic neurons, which led to a real-time place preference. In contrast, expression of opto-MOR in GABAergic neurons of the ventral pallidum hedonic cold spot led to real-time place aversion. This tool has generalizable application for spatiotemporal control of opioid signaling and, furthermore, can be used broadly for mimicking endogenous neuronal inhibition pathways. PMID:25937173

  5. Generating Neuron Geometries for Detailed Three-Dimensional Simulations Using AnaMorph.

    PubMed

    Mörschel, Konstantin; Breit, Markus; Queisser, Gillian

    2017-07-01

    Generating realistic and complex computational domains for numerical simulations is often a challenging task. In neuroscientific research, more and more one-dimensional morphology data is becoming publicly available through databases. This data, however, only contains point and diameter information not suitable for detailed three-dimensional simulations. In this paper, we present a novel framework, AnaMorph, that automatically generates water-tight surface meshes from one-dimensional point-diameter files. These surface triangulations can be used to simulate the electrical and biochemical behavior of the underlying cell. In addition to morphology generation, AnaMorph also performs quality control of the semi-automatically reconstructed cells coming from anatomical reconstructions. This toolset allows an extension from the classical dimension-reduced modeling and simulation of cellular processes to a full three-dimensional and morphology-including method, leading to novel structure-function interplay studies in the medical field. The developed numerical methods can further be employed in other areas where complex geometries are an essential component of numerical simulations.
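
    One-dimensional point-diameter morphologies of the kind mentioned above are commonly distributed as SWC-style text files (id, type, x, y, z, radius, parent per line). The reader below is a hedged sketch of that input stage, not AnaMorph's implementation.

    ```python
    def read_swc(path):
        """Parse an SWC-style point-diameter file into a list of node dicts."""
        nodes = []
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                idx, kind, x, y, z, radius, parent = line.split()[:7]
                nodes.append({
                    "id": int(idx), "type": int(kind),
                    "xyz": (float(x), float(y), float(z)),
                    "radius": float(radius), "parent": int(parent),
                })
        return nodes

    # Each parent-child pair defines a tapered cylinder (frustum); a surface
    # mesher such as AnaMorph joins these into a water-tight triangulation.
    ```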

  6. Building Capacity in Community-Based Participatory Research Partnerships Through a Focus on Process and Multiculturalism.

    PubMed

    Corbie-Smith, Giselle; Bryant, Angela R; Walker, Deborah J; Blumenthal, Connie; Council, Barbara; Courtney, Dana; Adimora, Ada

    2015-01-01

    In health research, investigators and funders are emphasizing the importance of collaboration between communities and academic institutions to achieve health equity. Although the principles underlying community-academic partnered research have been well-articulated, the processes by which partnerships integrate these principles when working across cultural differences are not as well described. We present how Project GRACE (Growing, Reaching, Advocating for Change and Empowerment) integrated participatory research principles with the process of building individual and partnership capacity. We worked with Vigorous Interventions In Ongoing Natural Settings (VISIONS) Inc., a process consultant and training organization, to develop a capacity building model. We present the conceptual framework and multicultural process of change (MPOC) that was used to build individual and partnership capacity to address health disparities. The process and capacity building model provides a common language, approach, and toolset to understand differences and the dynamics of inequity. These tools can be used by other partnerships in the conduct of research to achieve health equity.

  7. Operating in "Strange New Worlds" and Measuring Success - Test and Evaluation in Complex Environments

    NASA Technical Reports Server (NTRS)

    Qualls, Garry; Cross, Charles; Mahlin, Matthew; Montague, Gilbert; Motter, Mark; Neilan, James; Rothhaar, Paul; Tran, Loc; Trujillo, Anna; Allen, B. Danette

    2015-01-01

    Software tools are being developed by the Autonomy Incubator at NASA's Langley Research Center that will provide an integrated and scalable capability to support research and non-research flight operations across several flight domains, including urban and mixed indoor-outdoor operations. These tools incorporate a full range of data products to support mission planning, approval, flight operations, and post-flight review. The system can support a number of different operational scenarios that can incorporate live and archived data streams for UAS operators, airspace regulators, and other important stakeholders. Example use cases are described that illustrate how the tools will benefit a variety of users in nominal and off-nominal operational scenarios. An overview is presented for the current state of the toolset, including a summary of current demonstrations that have been completed. Details of the final, fully operational capability are also presented, including the interfaces that will be supported to ensure compliance with existing and future airspace operations environments.

  8. Revamping Spacecraft Operational Intelligence

    NASA Technical Reports Server (NTRS)

    Hwang, Victor

    2012-01-01

    The EPOXI flight mission has been testing a new commercial system, Splunk, which employs data mining techniques to organize and present spacecraft telemetry data in a high-level manner. By abstracting away data-source specific details, Splunk unifies arbitrary data formats into one uniform system. This not only reduces the time and effort for retrieving relevant data, but it also increases operational visibility by allowing a spacecraft team to correlate data across many different sources. Splunk's scalable architecture coupled with its graphing modules also provide a solid toolset for generating data visualizations and building real-time applications such as browser-based telemetry displays.

  9. Future Approach to tier-0 extension

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Cordeiro, C.; Giordano, D.; Traylen, S.; Moreno García, D.

    2017-10-01

    The current tier-0 processing at CERN is done on two managed sites, the CERN computer centre and the Wigner computer centre. With the proliferation of public cloud resources at increasingly competitive prices, we have been investigating how to transparently increase our compute capacity to include these providers. The approach taken has been to integrate these resources using our existing deployment and computer management tools and to provide them in a way that exposes them to users as part of the same site. The paper will describe the architecture, the toolset and the current production experiences of this model.

  10. Today's CIO: catalyst for managed care change.

    PubMed

    Sanchez, P

    1997-05-01

    As the impact of managed care increases and capitation becomes all pervasive, healthcare providers' attention to cost control will intensify. For integrated delivery networks (IDNs) to be competitive, today's CIO must leverage managed care as a catalyst for change, and use a sophisticated information system toolset as the means to an integrated end. One area many CIOs target for fast results and maximum cost savings is resource management. This article reviews how Dick Escue, chief information officer at Baptist Memorial Health Care Corporation (Memphis, TN), uses electronic information management systems to integrate and conserve the resources of Baptist's widespread healthcare organization.

  11. The Sonar Simulation Toolset, Release 4.6: Science, Mathematics, and Algorithms

    DTIC Science & Technology

    2008-10-01

    x_p(t) = \sum_s \int \Big[ \int e^{2\pi i f_S \tau'_S} \, B_s(f_S, M_S S_{Sp}) \, df_S \Big] \, x_{sp}(t - \tau'_S) \, d\tau'_S \quad (48)
    y_p(t) = \int \Big[ \int e^{2\pi i f \tau} \, L_p(f) \, df \Big] \, x_p(t - \tau) \, d\tau \quad (49)
    x_{r_p}(t) = \int \Big[ \int e^{2\pi i f_R \tau'_R} \, B_r(f_R, M_R S_{Rp}) \, df_R \Big] \, y_p(t - \tau'_R) \, d\tau'_R \quad (50)
    y_{r_p}(t) = x_{r_p}(t - \tau_p(M_R S_{Rp}, r'_r)) \quad (51)
    y_r(t) = \sum_p y_{r_p}(t) \quad (52)
    Thus, the transformation of a source channel signal x_s

  12. The COMPASS Project

    NASA Astrophysics Data System (ADS)

    Duley, A. R.; Sullivan, D.; Fladeland, M. M.; Myers, J.; Craig, M.; Enomoto, F.; Van Gilst, D. P.; Johan, S.

    2011-12-01

    The Common Operations and Management Portal for Airborne Science Systems (COMPASS) project is a multi-center collaborative effort to advance and extend the research capabilities of the National Aeronautics and Space Administration's (NASA) Airborne Science Program (ASP). At its most basic, COMPASS provides tools for visualizing the position of aircraft and instrument observations during the course of a mission, and facilitates dissemination, discussion, and analysis of multiple disparate data sources in order to more efficiently plan and execute airborne science missions. COMPASS targets a number of key objectives. First, deliver a common operating picture for improved shared situational awareness to all participants in NASA's Airborne Science missions. These participants include scientists, engineers, managers, and the general public. Second, encourage more responsive and collaborative measurements between instruments on multiple aircraft, satellites, and on the surface in order to increase the scientific value of these measurements. Third, provide flexible entry points for data providers to supply model and advanced analysis products to mission team members. Fourth, provide data consumers with a mechanism to ingest, search, and display data products. Finally, embrace an open and transparent platform where common data products, services, and end user components can be shared with the broader scientific community. In pursuit of these objectives, and in concert with requirements solicited by the airborne science research community, the COMPASS project team has delivered a suite of core tools intended to represent the next generation toolset for airborne research. This toolset includes a collection of loosely coupled RESTful web services, a system to curate, register, and search commonly used data sources, end-user tools that leverage WebSocket and other next-generation HTML5 technologies to aid real-time aircraft position and data visualization, and an extensible framework to rapidly accommodate mission-specific requirements and mission tools.

  13. A Modular Toolset for Recombination Transgenesis and Neurogenetic Analysis of Drosophila

    PubMed Central

    Wang, Ji-Wu; Beck, Erin S.; McCabe, Brian D.

    2012-01-01

    Transgenic Drosophila have contributed extensively to our understanding of nervous system development, physiology and behavior in addition to being valuable models of human neurological disease. Here, we have generated a novel series of modular transgenic vectors designed to optimize and accelerate the production and analysis of transgenes in Drosophila. We constructed a novel vector backbone, pBID, that allows both phiC31 targeted transgene integration and incorporates insulator sequences to ensure specific and uniform transgene expression. Upon this framework, we have built a series of constructs that are either backwards compatible with existing restriction enzyme based vectors or utilize Gateway recombination technology for high-throughput cloning. These vectors allow for endogenous promoter or Gal4 targeted expression of transgenic proteins with or without fluorescent protein or epitope tags. In addition, we have generated constructs that facilitate transgenic splice isoform specific RNA inhibition of gene expression. We demonstrate the utility of these constructs to analyze proteins involved in nervous system development, physiology and neurodegenerative disease. We expect that these reagents will facilitate the proficiency and sophistication of Drosophila genetic analysis in both the nervous system and other tissues. PMID:22848718

  14. Proactive malware detection

    NASA Astrophysics Data System (ADS)

    Gloster, Jonathan; Diep, Michael; Dredden, David; Mix, Matthew; Olsen, Mark; Price, Brian; Steil, Betty

    2014-06-01

    Small-to-medium sized businesses lack the resources to deploy and manage high-end advanced solutions to deter sophisticated threats from well-funded adversaries, but evidence shows that these types of businesses are becoming key targets. As malicious code and network attacks become more sophisticated, classic signature-based virus and malware detection methods are less effective. To augment current malware detection methods, we developed a proactive approach to detect emerging malware threats using open source tools and intelligence to discover patterns and behaviors of malicious attacks and adversaries. Technical and analytical skills are combined to track adversarial behavior, methods, and techniques. We established a controlled (separated domain) network to identify, monitor, and track malware behavior to increase understanding of the methods and techniques used by cyber adversaries. We created a suite of tools that observe network and system performance, looking for anomalies that may be caused by malware. The toolset collects information from open-source tools and provides meaningful indicators that the system is under attack or has been attacked. When malware was discovered, we analyzed and reverse engineered it to determine how it could be detected and prevented. Results have shown that, with minimum resources, cost-effective capabilities can be developed to detect abnormal behavior that may indicate malicious software.
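
    A baseline-deviation check of the kind described, flagging metrics that depart sharply from normal behavior, can be sketched as follows; the metric, threshold, and data are illustrative assumptions, not the project's toolset.

    ```python
    import statistics

    def zscore_alerts(samples, threshold=3.0):
        """Flag observations that deviate strongly from the overall baseline."""
        mean = statistics.mean(samples)
        stdev = statistics.pstdev(samples) or 1.0
        return [(i, x) for i, x in enumerate(samples)
                if abs(x - mean) / stdev > threshold]

    # Example metric: outbound connections per minute; the spike suggests
    # beaconing or exfiltration worth a closer look.
    conn_per_min = [12, 15, 11, 14, 13, 12, 95, 14, 13]
    print(zscore_alerts(conn_per_min, threshold=2.0))
    ```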

  15. Formal Foundations for Hierarchical Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2015-01-01

    Safety cases are increasingly being required in many safety-critical domains to assure, using structured argumentation and evidence, that a system is acceptably safe. However, comprehensive system-wide safety arguments present appreciable challenges to develop, understand, evaluate, and manage, partly due to the volume of information that they aggregate, such as the results of hazard analysis, requirements analysis, testing, formal verification, and other engineering activities. Previously, we have proposed hierarchical safety cases, hicases, to aid the comprehension of safety case argument structures. In this paper, we build on a formal notion of safety case to formalise the use of hierarchy as a structuring technique, and show that hicases satisfy several desirable properties. Our aim is to provide a formal, theoretical foundation for safety cases. In particular, we believe that tools for high assurance systems should be granted similar assurance to the systems to which they are applied. To this end, we formally specify and prove the correctness of key operations for constructing and managing hicases, which gives the specification for implementing hicases in AdvoCATE, our toolset for safety case automation. We motivate and explain the theory with the help of a simple running example, extracted from a real safety case and developed using AdvoCATE.

  16. Automated Test Case Generation for an Autopilot Requirement Prototype

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Rungta, Neha; Feary, Michael

    2011-01-01

    Designing safety-critical automation with robust human interaction is a difficult task that is susceptible to a number of known Human-Automation Interaction (HAI) vulnerabilities. It is therefore essential to develop automated tools that provide support both in the design and rapid evaluation of such automation. The Automation Design and Evaluation Prototyping Toolset (ADEPT) enables the rapid development of an executable specification for automation behavior and user interaction. ADEPT supports a number of analysis capabilities, thus enabling the detection of HAI vulnerabilities early in the design process, when modifications are less costly. In this paper, we advocate the introduction of a new capability to model-based prototyping tools such as ADEPT. The new capability is based on symbolic execution, which allows us to automatically generate quality test suites based on the system design. Symbolic execution is used to generate both user input and test oracles: user input drives the testing of the system implementation, and test oracles ensure that the system behaves as designed. We present early results in the context of a component of the Autopilot system modeled in ADEPT, and discuss the challenges of test case generation in the HAI domain.
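
    The general idea of deriving test inputs and oracles from path constraints can be sketched with an off-the-shelf SMT solver. The toy mode rule, variable names, and the use of z3 below are assumptions for illustration; they do not represent ADEPT's internal machinery.

    ```python
    from z3 import Int, Solver, Not, sat

    # Toy stand-in for an autopilot mode rule in the design: capture mode
    # should engage only above 500 ft.
    def capture_mode(altitude):
        return altitude > 500

    alt = Int("altitude")
    guard = alt > 500  # symbolic form of the design's branch condition

    tests = []
    for path_constraint, expected in [(guard, True), (Not(guard), False)]:
        s = Solver()
        s.add(path_constraint)
        if s.check() == sat:
            value = s.model()[alt].as_long()
            tests.append((value, expected))  # input plus oracle for this path

    # Each generated (input, oracle) pair exercises one path of the design.
    for value, expected in tests:
        assert capture_mode(value) == expected
    ```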

  17. Making the most of MBSE: pragmatic model-based engineering for the SKA Telescope Manager

    NASA Astrophysics Data System (ADS)

    Le Roux, Gerhard; Bridger, Alan; MacIntosh, Mike; Nicol, Mark; Schnetler, Hermine; Williams, Stewart

    2016-08-01

    Many large projects including major astronomy projects are adopting a Model Based Systems Engineering approach. How far is it possible to get value for the effort involved in developing a model that accurately represents a significant project such as SKA? Is it possible for such a large project to ensure that high-level requirements are traceable through the various system-engineering artifacts? Is it possible to utilize the tools available to produce meaningful measures for the impact of change? This paper shares one aspect of the experience gained on the SKA project. It explores some of the recommended and pragmatic approaches developed, to get the maximum value from the modeling activity while designing the Telescope Manager for the SKA. While it is too early to provide specific measures of success, certain areas are proving to be the most helpful and offering significant potential over the lifetime of the project. The experience described here has been on the 'Cameo Systems Modeler' tool-set, supporting a SysML based System Engineering approach; however the concepts and ideas covered would potentially be of value to any large project considering a Model based approach to their Systems Engineering.

  18. Multiplexing T- and B-Cell FLUOROSPOT Assays: Experimental Validation of the Multi-Color ImmunoSpot® Software Based on Center of Mass Distance Algorithm.

    PubMed

    Karulin, Alexey Y; Megyesi, Zoltán; Caspell, Richard; Hanson, Jodi; Lehmann, Paul V

    2018-01-01

    Over the past decade, ELISPOT has become a highly implemented mainstream assay in immunological research, immune monitoring, and vaccine development. Unique single-cell resolution along with high-throughput potential sets ELISPOT apart from flow cytometry, ELISA, microarray- and bead-based multiplex assays. The necessity to unambiguously identify individual T and B cells that do, or do not, co-express certain analytes, including polyfunctional cytokine-producing T cells, has stimulated the development of multi-color ELISPOT assays. The success of these assays has also been driven by limited sample/cell availability and resource constraints with reagents and labor. There are few commercially available test kits and instruments at present for multi-color FLUOROSPOT. Beyond commercial descriptions of competing systems, little is known about their accuracy in experimental settings detecting individual cells that secrete multiple analytes vs. random overlays of spots. Here, we present a theoretical and experimental validation study for three- and four-color T- and B-cell FLUOROSPOT data analysis. The ImmunoSpot® Fluoro-X™ analysis system we used includes an automatic image acquisition unit that generates individual color images free of spectral overlaps, and multi-color spot counting software based on the maximal allowed distance between centers of spots of different colors, or Center of Mass Distance (COMD). Using four-color B-cell FLUOROSPOT for IgM, IgA, IgG1, IgG3, and three/four-color T-cell FLUOROSPOT for IL-2, IFN-γ, TNF-α, and GzB in serial dilution experiments, we demonstrate the validity and accuracy of Fluoro-X™ multi-color spot counting algorithms. Statistical predictions based on the Poisson spatial distribution, coupled with scrambled image counting, permit objective correction of multi-color spot counts to exclude randomly overlaid spots.
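
    The center-of-mass distance idea can be sketched as a simple pairing rule: spots in two color channels are attributed to the same cell only if their centroids lie within a maximum allowed distance. The greedy matching and threshold below are illustrative assumptions, not the Fluoro-X™ algorithm.

    ```python
    import math

    def com_distance(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def pair_spots(centroids_color1, centroids_color2, max_comd_px=10.0):
        """Greedy pairing of spot centroids across two color channels."""
        pairs, used = [], set()
        for c1 in centroids_color1:
            best, best_d = None, max_comd_px
            for j, c2 in enumerate(centroids_color2):
                d = com_distance(c1, c2)
                if j not in used and d <= best_d:
                    best, best_d = j, d
            if best is not None:
                used.add(best)
                pairs.append((c1, centroids_color2[best]))
        return pairs  # candidate double-positive (co-secreting) cells

    # Spots closer than the COMD threshold count as one double-positive cell;
    # random overlays would then be estimated from scrambled images or
    # Poisson statistics and subtracted.
    green = [(12.0, 40.5), (100.2, 63.0)]
    red = [(13.1, 41.0), (200.0, 10.0)]
    print(pair_spots(green, red))
    ```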

  19. Toolsets for Airborne Data (TAD): Improving Machine Readability for ICARTT Data Files

    NASA Technical Reports Server (NTRS)

    Early, Amanda Benson; Beach, Aubrey; Northup, Emily; Wang, Dali; Kusterer, John; Quam, Brandi; Chen, Gao

    2015-01-01

    The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center is responsible for the ingest, archive, and distribution of NASA Earth Science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The ASDC specializes in atmospheric data that is important to understanding the causes and processes of global climate change and the consequences of human activities on the climate. The ASDC currently supports more than 44 projects and has over 1,700 archived data sets, which increase daily. ASDC customers include scientists, researchers, federal, state, and local governments, academia, industry, and application users, the remote sensing community, and the general public.

  20. Using the Drosophila Nephrocyte to Model Podocyte Function and Disease

    PubMed Central

    Helmstädter, Martin; Huber, Tobias B.; Hermle, Tobias

    2017-01-01

    Glomerular disorders are a major cause of end-stage renal disease and effective therapies are often lacking. Nephrocytes are considered to be part of the Drosophila excretory system and form slit diaphragms across cellular membrane invaginations. Nephrocytes have been shown to share functional, morphological, and molecular features with podocytes, which form the glomerular filter in vertebrates. Here, we report the progress and the evolving tool-set of this model system. Combining a functional, accessible slit diaphragm with the power of the genetic tool-kit in Drosophila, the nephrocyte has the potential to greatly advance our understanding of the glomerular filtration barrier in health and disease. PMID:29270398

  1. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  2. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions at the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted, and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH VErification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that relevant complementary approaches exist for the security analysis of network and device S&D Patterns, and that these approaches can be used.

  3. Gender, religion, and sociopolitical issues in cross-cultural online education.

    PubMed

    Zaidi, Zareen; Verstegen, Daniëlle; Naqvi, Rahat; Morahan, Page; Dornan, Tim

    2016-05-01

    Cross-cultural education is thought to develop critical consciousness of how unequal distributions of power and privilege affect people's health. Learners in different sociopolitical settings can join together in developing critical consciousness (awareness of power and privilege dynamics in society) by means of communication technology. The aim of this research was to define strengths and limitations of existing cross-cultural discussions in generating critical consciousness. The setting was the FAIMER international fellowship program for mid-career interdisciplinary health faculty, whose goal is to foster global advancement of health professions education. Fellows take part in participant-led, online, written, task-focused discussions on topics like professionalism, community health, and leadership. We reflexively identified text that brought sociopolitical topics into the online environment during the years 2011 and 2012 and used a discourse analysis toolset to make our content analysis relevant to critical consciousness. While references to participants' cultures and backgrounds were infrequent, narratives of political-, gender-, religion-, and other culture-related topics did emerge. When participants gave accounts of their experiences and exchanged cross-cultural stories, they were more likely to develop ad hoc networks to support one another in facing those issues than to explore issues relating to the development of critical consciousness. We suggest that cross-cultural discussions need to be facilitated actively to transform learners' frames of reference, create critical consciousness, and develop cultural competence. Further research is needed into how to provide a safe environment for such learning and provide faculty development for the skills needed to facilitate these exchanges.

  4. Model-Driven Development of Safety Architectures

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2017-01-01

    We describe the use of model-driven development for safety assurance of a pioneering NASA flight operation involving a fleet of small unmanned aircraft systems (sUAS) flying beyond visual line of sight. The central idea is to develop a safety architecture that provides the basis for risk assessment and visualization within a safety case, the formal justification of acceptable safety required by the aviation regulatory authority. A safety architecture is composed from a collection of bow tie diagrams (BTDs), a practical approach to manage safety risk by linking the identified hazards to the appropriate mitigation measures. The safety justification for a given unmanned aircraft system (UAS) operation can have many related BTDs. In practice, however, each BTD is independently developed, which poses challenges with respect to incremental development, maintaining consistency across different safety artifacts when changes occur, and in extracting and presenting stakeholder specific information relevant for decision making. We show how a safety architecture reconciles the various BTDs of a system, and, collectively, provide an overarching picture of system safety, by considering them as views of a unified model. We also show how it enables model-driven development of BTDs, replete with validations, transformations, and a range of views. Our approach, which we have implemented in our toolset, AdvoCATE, is illustrated with a running example drawn from a real UAS safety case. The models and some of the innovations described here were instrumental in successfully obtaining regulatory flight approval.

  5. Monitoring Evolution at CERN

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Fiorini, B.; Murphy, S.; Pigueiras, L.; Santos, M.

    2015-12-01

    Over the past two years, the operation of the CERN Data Centres went through significant changes with the introduction of new mechanisms for hardware procurement, new services for cloud provisioning and configuration management, among other improvements. These changes resulted in an increase of resources being operated in a more dynamic environment. Today, the CERN Data Centres provide over 11000 multi-core processor servers, 130 PB disk servers, 100 PB tape robots, and 150 high performance tape drives. To cope with these developments, an evolution of the data centre monitoring tools was also required. This modernisation was based on a number of guiding rules: sustain the increase of resources, adapt to the new dynamic nature of the data centres, make monitoring data easier to share, give more flexibility to Service Managers on how they publish and consume monitoring metrics and logs, establish a common repository of monitoring data, optimise the handling of monitoring notifications, and replace the previous toolset with new open source technologies with large adoption and community support. This contribution describes how these improvements were delivered, presents the architecture and technologies of the new monitoring tools, and reviews the experience of their production deployment.

  6. A model-based design and validation approach with OMEGA-UML and the IF toolset

    NASA Astrophysics Data System (ADS)

    Ben-hafaiedh, Imene; Constant, Olivier; Graf, Susanne; Robbana, Riadh

    2009-03-01

    Intelligent, embedded systems such as autonomous robots and other industrial systems are becoming increasingly heterogeneous with respect to the platforms on which they are implemented, and their software architectures are thus more complex to design and analyse. In this context, it is important to have well-defined design methodologies supported by (1) high-level design concepts that allow designers to master the design complexity, (2) concepts for expressing non-functional requirements, and (3) analysis tools that allow one to verify, or refute, that the system under development will conform to its requirements. We illustrate such an approach for the design of complex embedded systems using a small case study as a running example. We briefly present the important concepts of the OMEGA-RT UML profile, show how we use this profile in a modelling approach, and explain how these concepts are used in the IFx verification toolbox to integrate validation into the design flow and make scalable verification possible.

  7. Montana StreamStats

    USGS Publications Warehouse

    2016-04-05

    About this volumeMontana StreamStats is a Web-based geographic information system (http://water.usgs.gov/osw/streamstats/) application that provides users with access to basin and streamflow characteristics for gaged and ungaged streams in Montana. Montana StreamStats was developed by the U.S. Geological Survey (USGS) in cooperation with the Montana Departments of Transportation, Environmental Quality, and Natural Resources and Conservation. The USGS Scientific Investigations Report consists of seven independent but complementary chapters dealing with various aspects of this effort.Chapter A describes the Montana StreamStats application, the basin and streamflow datasets, and provides a brief overview of the streamflow characteristics and regression equations used in the study. Chapters B through E document the datasets, methods, and results of analyses to determine streamflow characteristics, such as peak-flow frequencies, low-flow frequencies, and monthly and annual characteristics, for USGS streamflow-gaging stations in and near Montana. The StreamStats analytical toolsets that allow users to delineate drainage basins and solve regression equations to estimate streamflow characteristics at ungaged sites in Montana are described in Chapters F and G.

  8. Space Weather Status for Exploration Radiation Protection

    NASA Technical Reports Server (NTRS)

    Fry, Dan J.; Lee, Kerry; Zapp, Neal; Barzilla, Janet; Dunegan, Audrey; Johnson, Steve; Stoffle, Nicholas

    2011-01-01

    Management of crew exposure to radiation is a major concern for manned spaceflight and will be even more important for the modern concept of longer-duration exploration. The inherent protection afforded to astronauts by the magnetic field of the Earth in Low Earth Orbit (LEO) makes operations on the space shuttle or space station very different from operations during an exploration mission. In order to experience significant radiation-derived Loss of Mission (LOM) or Loss of Crew (LOC) risk for LEO operations, one is almost driven to dictate extreme duration or to dictate an extreme sequence of solar activity. Outside of the geo-magnetosphere, however, this scenario changes dramatically. Exposures to the same event on the ISS and in free space, for example, may differ by orders of magnitude. This change in magnitude, coupled with the logistical constraints present in implementing any practical operational mitigation make situational awareness with regard to space weather a limiting factor for the ability to conduct exploration operations. We present a current status of developing operational concepts for manned exploration and expectations for asset viability and available predictive and characterization toolsets.

  9. Investigation of REST-Class Hypersonic Inlet Designs

    NASA Technical Reports Server (NTRS)

    Gollan, Rowan; Ferlemann, Paul G.

    2011-01-01

    Rectangular-to-elliptical shape-transition (REST) inlets are of interest for use on scramjet engines because they are efficient and integrate well with the forebody of a planar vehicle. The classic design technique by Smart for these inlets produces an efficient inlet but the complex three-dimensional viscous effects are only approximately included. Certain undesirable viscous features often occur in these inlets. In the present work, a design toolset has been developed which allows for rapid design of REST-class inlet geometries and the subsequent Navier-Stokes analysis of the inlet performance. This gives the designer feedback on the complex viscous effects at each design iteration. This new tool is applied to design an inlet for on-design operation at Mach 8. The tool allows for rapid investigation of design features that was previously not possible. The outcome is that the inlet shape can be modified to affect aspects of the flow field in a positive way. In one particular example, the boundary layer build-up on the bodyside of the inlet was reduced by 20% of the thickness associated with the classically designed inlet shape.

  10. Estimating the HVAC energy consumption of plug-in electric vehicles

    NASA Astrophysics Data System (ADS)

    Kambly, Kiran R.; Bradley, Thomas H.

    2014-08-01

    Plug-in electric vehicles are vehicles that use energy from the electric grid to provide tractive and accessory power to the vehicle. Due to the limited specific energy of energy storage systems, the energy requirements of heating, ventilation, and air conditioning (HVAC) systems for cabin conditioning can significantly reduce their range between charges. Factors such as local ambient temperature, local solar radiation, local humidity, length of the trip, and thermal soak have been identified as primary drivers of cabin conditioning loads and therefore of vehicle range. The objective of this paper is to develop a detailed systems-level approach to connect HVAC technologies and usage conditions to consumer-centric metrics of vehicle performance, including energy consumption and range. This includes consideration of stochastic and transient inputs to the HVAC energy consumption model, including local weather, solar loads, driving behavior, charging behavior, and regional passenger fleet population. The resulting engineering toolset is used to determine the total and geographical distribution of energy consumption by HVAC systems in electric vehicles, and to identify regions of the US where the distributions of electric vehicle range are particularly sensitive to climate.
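
    A toy version of the range calculation described above, in which HVAC power translates into extra energy per kilometre at a given average speed, is sketched below; the numbers and model structure are illustrative assumptions rather than the paper's toolset.

    ```python
    def range_km(battery_kwh, drive_kwh_per_km, hvac_kw, avg_speed_kmh):
        """Toy estimate: HVAC power becomes extra energy per km at a given speed."""
        hvac_kwh_per_km = hvac_kw / avg_speed_kmh
        return battery_kwh / (drive_kwh_per_km + hvac_kwh_per_km)

    # Same vehicle, mild vs. hot day: cabin cooling load cuts the usable range.
    print(round(range_km(24.0, 0.15, 0.3, 50.0), 1))  # light HVAC use
    print(round(range_km(24.0, 0.15, 2.5, 50.0), 1))  # heavy A/C in a hot, humid climate
    ```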

  11. Spatiotemporal control of opioid signaling and behavior.

    PubMed

    Siuda, Edward R; Copits, Bryan A; Schmidt, Martin J; Baird, Madison A; Al-Hasani, Ream; Planer, William J; Funderburk, Samuel C; McCall, Jordan G; Gereau, Robert W; Bruchas, Michael R

    2015-05-20

    Optogenetics is now a widely accepted tool for spatiotemporal manipulation of neuronal activity. However, a majority of optogenetic approaches use binary on/off control schemes. Here, we extend the optogenetic toolset by developing a neuromodulatory approach using a rationale-based design to generate a Gi-coupled, optically sensitive, mu-opioid-like receptor, which we term opto-MOR. We demonstrate that opto-MOR engages canonical mu-opioid signaling through inhibition of adenylyl cyclase, activation of MAPK and G protein-gated inward rectifying potassium (GIRK) channels and internalizes with kinetics similar to that of the mu-opioid receptor. To assess in vivo utility, we expressed a Cre-dependent viral opto-MOR in RMTg/VTA GABAergic neurons, which led to a real-time place preference. In contrast, expression of opto-MOR in GABAergic neurons of the ventral pallidum hedonic cold spot led to real-time place aversion. This tool has generalizable application for spatiotemporal control of opioid signaling and, furthermore, can be used broadly for mimicking endogenous neuronal inhibition pathways. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Finding Feasible Abstract Counter-Examples

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Dwyer, Matthew B.; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    A strength of model checking is its ability to automate the detection of subtle system errors and produce traces that exhibit those errors. Given the high computational cost of model checking, most researchers advocate the use of aggressive property-preserving abstractions. Unfortunately, the more aggressively a system is abstracted, the more infeasible behavior it will have. Thus, while abstraction enables efficient model checking, it also threatens the usefulness of model checking as a defect detection tool, since it may be difficult to determine whether a counter-example is feasible and hence worth developer time to analyze. We have explored several strategies for addressing this problem by extending an explicit-state model checker, Java PathFinder (JPF), to search for and analyze counter-examples in the presence of abstractions. We demonstrate that these techniques effectively preserve the defect detection ability of model checking in the presence of aggressive abstraction by applying them to check properties of several abstracted multi-threaded Java programs. These new capabilities are not specific to JPF and can be easily adapted to other model checking frameworks; we describe how this was done for the Bandera toolset.

  13. Evolutionary Optimization of a Geometrically Refined Truss

    NASA Technical Reports Server (NTRS)

    Hull, P. V.; Tinker, M. L.; Dozier, G. V.

    2007-01-01

    Structural optimization is a field of research that has experienced noteworthy growth for many years. Researchers in this area have developed optimization tools to successfully design and model structures, typically minimizing mass while maintaining certain deflection and stress constraints. Numerous optimization studies have been performed to minimize mass, deflection, and stress on a benchmark cantilever truss problem. Predominantly traditional optimization theory is applied to this problem. The cross-sectional area of each member is optimized to minimize the aforementioned objectives. This Technical Publication (TP) presents a structural optimization technique that has been previously applied to compliant mechanism design. This technique demonstrates a method that combines topology optimization, geometric refinement, finite element analysis, and two forms of evolutionary computation: genetic algorithms and differential evolution to successfully optimize a benchmark structural optimization problem. A nontraditional solution to the benchmark problem is presented in this TP, specifically a geometrically refined topological solution. The design process begins with an alternate control mesh formulation, multilevel geometric smoothing operation, and an elastostatic structural analysis. The design process is wrapped in an evolutionary computing optimization toolset.
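
    As a small illustration of the evolutionary component, the sketch below sizes the members of a toy three-bar truss with a stress-penalized mass objective using an off-the-shelf differential evolution routine. The geometry, loads, material values, and penalty weight are assumptions for the example, not the benchmark problem of the Technical Publication.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    lengths = np.array([1.0, 1.0, 1.4142])   # member lengths, m (toy 3-bar layout)
    forces = np.array([20e3, 15e3, 10e3])    # member axial forces, N (assumed)
    density, stress_limit = 2700.0, 150e6    # aluminium-like properties

    def penalized_mass(areas):
        """Mass objective with a penalty on stress-limit violations."""
        mass = density * np.sum(areas * lengths)
        stress = forces / areas
        violation = np.maximum(stress - stress_limit, 0.0).sum()
        return mass + 1e-2 * violation       # penalty drives the design feasible

    bounds = [(1e-5, 5e-3)] * 3              # cross-sectional areas, m^2
    result = differential_evolution(penalized_mass, bounds, seed=1, tol=1e-8)
    print(result.x, result.fun)
    ```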

  14. Modeling Reservoir-River Networks in Support of Optimizing Seasonal-Scale Reservoir Operations

    NASA Astrophysics Data System (ADS)

    Villa, D. L.; Lowry, T. S.; Bier, A.; Barco, J.; Sun, A.

    2011-12-01

    HydroSCOPE (Hydropower Seasonal Concurrent Optimization of Power and the Environment) is a seasonal time-scale tool for scenario analysis and optimization of reservoir-river networks. Developed in MATLAB, HydroSCOPE is an object-oriented model that simulates basin-scale dynamics with an objective of optimizing reservoir operations to maximize revenue from power generation, reliability in the water supply, environmental performance, and flood control. HydroSCOPE is part of a larger toolset that is being developed through a Department of Energy multi-laboratory project. This project's goal is to provide conventional hydropower decision makers with better information to execute their day-ahead and seasonal operations and planning activities by integrating water balance and operational dynamics across a wide range of spatial and temporal scales. This presentation details the modeling approach and functionality of HydroSCOPE. HydroSCOPE consists of a river-reservoir network model and an optimization routine. The river-reservoir network model simulates the heat and water balance of river-reservoir networks for time-scales up to one year. The optimization routine software, DAKOTA (Design Analysis Kit for Optimization and Terascale Applications - dakota.sandia.gov), is seamlessly linked to the network model and is used to optimize daily volumetric releases from the reservoirs to best meet a set of user-defined constraints, such as maximizing revenue while minimizing environmental violations. The network model uses 1-D approximations for both the reservoirs and river reaches and is able to account for surface and sediment heat exchange as well as ice dynamics for both models. The reservoir model also accounts for inflow, density, and withdrawal zone mixing, and diffusive heat exchange. Routing for the river reaches is accomplished using a modified Muskingum-Cunge approach that automatically calculates the internal timestep and sub-reach lengths to match the conditions of each timestep and minimize computational overhead. Power generation for each reservoir is estimated using a 2-dimensional regression that accounts for both the available head and turbine efficiency. The object-oriented architecture makes run configuration easy to update. The dynamic model inputs include inflow and meteorological forecasts while static inputs include bathymetry data, reservoir and power generation characteristics, and topological descriptors. Ensemble forecasts of hydrological and meteorological conditions are supplied in real-time by Pacific Northwest National Laboratory and are used as a proxy for uncertainty, which is carried through the simulation and optimization process to produce output that describes the probability that different operational scenarios will be optimal. The full toolset, which includes HydroSCOPE, is currently being tested on the Feather River system in Northern California and the Upper Colorado Storage Project.
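
    For reference, the classical Muskingum recurrence on which Muskingum-Cunge routing builds can be sketched as follows; the reach parameters and hydrograph are illustrative assumptions, not HydroSCOPE data.

    ```python
    def muskingum_route(inflow, K_hr, X, dt_hr, outflow0):
        """Classical Muskingum routing: O[t+1] = C0*I[t+1] + C1*I[t] + C2*O[t]."""
        D = 2.0 * K_hr * (1.0 - X) + dt_hr
        C0 = (dt_hr - 2.0 * K_hr * X) / D
        C1 = (dt_hr + 2.0 * K_hr * X) / D
        C2 = (2.0 * K_hr * (1.0 - X) - dt_hr) / D
        out = [outflow0]
        for t in range(1, len(inflow)):
            out.append(C0 * inflow[t] + C1 * inflow[t - 1] + C2 * out[-1])
        return out

    # Toy flood wave through one reach (m^3/s); K and X would come from reach data.
    hydrograph = [10, 30, 80, 60, 40, 25, 15, 10]
    print([round(q, 1) for q in muskingum_route(hydrograph, K_hr=2.0, X=0.2, dt_hr=1.0, outflow0=10)])
    ```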

  15. Verification of Java Programs using Symbolic Execution and Invariant Generation

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.

  16. Developing an Automated Method for Detection of Operationally Relevant Ocean Fronts and Eddies

    NASA Astrophysics Data System (ADS)

    Rogers-Cotrone, J. D.; Cadden, D. D. H.; Rivera, P.; Wynn, L. L.

    2016-02-01

    Since the early 1990s, the U.S. Navy has utilized an observation-based process for identification of frontal systems and eddies. These Ocean Feature Assessments (OFA) rely on trained analysts to identify and position ocean features using satellite-observed sea surface temperatures. Meanwhile, as enhancements and expansion of the Navy's HYbrid Coordinate Ocean Model (HYCOM) and Regional Navy Coastal Ocean Model (RNCOM) domains have proceeded, the Naval Oceanographic Office (NAVO) has provided Tactical Oceanographic Feature Assessments (TOFA) that are based on data-validated model output but also rely on analyst identification of significant features. A recently completed project has migrated OFA production to the ArcGIS-based Acoustic Reach-back Cell Ocean Analysis Suite (ARCOAS), enabling use of additional observational datasets and significantly decreasing production time; however, it has highlighted inconsistencies inherent to this analyst-based identification process. Current efforts are focused on development of an automated method for detecting operationally significant fronts and eddies that integrates model output and observational data on a global scale. Previous attempts to employ techniques from the scientific community have been unable to meet the production tempo at NAVO. Thus, a system that incorporates existing techniques (Marr-Hildreth, Okubo-Weiss, etc.) with internally developed feature identification methods (from model-derived physical and acoustic properties) is required. Ongoing expansions to the ARCOAS toolset have shown promising early results.
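
    Of the named techniques, the Okubo-Weiss criterion has a compact standard form: the parameter W combines strain and vorticity computed from a gridded velocity field, and strongly negative W marks rotation-dominated eddy cores. The grid, synthetic vortex, and threshold below are illustrative assumptions, not the ARCOAS implementation.

    ```python
    import numpy as np

    def okubo_weiss(u, v, dx, dy):
        """W = normal strain^2 + shear strain^2 - vorticity^2 on a regular grid."""
        dudx = np.gradient(u, dx, axis=1)
        dudy = np.gradient(u, dy, axis=0)
        dvdx = np.gradient(v, dx, axis=1)
        dvdy = np.gradient(v, dy, axis=0)
        s_n = dudx - dvdy          # normal strain
        s_s = dvdx + dudy          # shear strain
        omega = dvdx - dudy        # relative vorticity
        return s_n**2 + s_s**2 - omega**2

    # Synthetic Gaussian vortex; strongly negative W flags the eddy interior.
    ny, nx, L = 64, 64, 2.0e5
    y, x = np.meshgrid(np.linspace(-L, L, ny), np.linspace(-L, L, nx), indexing="ij")
    u = -y * np.exp(-(x**2 + y**2) / L**2)
    v = x * np.exp(-(x**2 + y**2) / L**2)
    W = okubo_weiss(u, v, dx=2 * L / (nx - 1), dy=2 * L / (ny - 1))
    eddy_mask = W < -0.2 * np.std(W)
    print(int(eddy_mask.sum()), "grid cells flagged as eddy interior")
    ```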

  17. Parallel computing on Unix workstation arrays

    NASA Astrophysics Data System (ADS)

    Reale, F.; Bocchino, F.; Sciortino, S.

    1994-12-01

    We have tested arrays of general-purpose Unix workstations used as MIMD systems for massive parallel computations. In particular, we have solved numerically a demanding test problem with a 2D hydrodynamic code, developed to study astrophysical flows, by executing it on arrays either of DECstations 5000/200 on an Ethernet LAN, or of DECstations 3000/400, equipped with powerful Alpha processors, on an FDDI LAN. The code is appropriate for data-domain decomposition, and we have used a library for parallelization previously developed in our Institute and easily extended to work on Unix workstation arrays by using the PVM software toolset. We have compared the parallel efficiencies obtained on arrays of several processors to those obtained on a dedicated MIMD parallel system, namely a Meiko Computing Surface (CS-1) equipped with Intel i860 processors. We discuss the feasibility of using non-dedicated parallel systems and conclude that the convenience depends essentially on the size of the computational domain as compared to the relative processor power and network bandwidth. We point out that for future perspectives a parallel development of processor and network technology is important, and that the software still offers great opportunities for improvement, especially in terms of latency times in the message-passing protocols. In conditions of significant gain in terms of speedup, such workstation arrays represent a cost-effective approach to massive parallel computations.

  18. Progress in development of HEDP capabilities in FLASH's Unsplit Staggered Mesh MHD solver

    NASA Astrophysics Data System (ADS)

    Lee, D.; Xia, G.; Daley, C.; Dubey, A.; Gopal, S.; Graziani, C.; Lamb, D.; Weide, K.

    2011-11-01

    FLASH is a publicly available astrophysical community code designed to solve highly compressible multi-physics reactive flows. We are adding capabilities to FLASH that will make it an open science code for the academic HEDP community. Among many important numerical requirements, we consider the following features to be important components necessary to meet our goals for FLASH as an HEDP open toolset. First, we are developing computationally efficient time-stepping integration methods that overcome the stiffness that arises in the equations describing a physical problem when there are disparate time scales. To this end, we are adding two different time-stepping schemes to FLASH that relax the time step limit when diffusive effects are present: an explicit super-time-stepping algorithm (Alexiades et al. in Com. Num. Mech. Eng. 12:31-42, 1996) and a Jacobian-Free Newton-Krylov implicit formulation. These two methods will be integrated into a robust, efficient, and high-order accurate Unsplit Staggered Mesh MHD (USM) solver (Lee and Deane in J. Comput. Phys. 227, 2009). Second, we have implemented an anisotropic Spitzer-Braginskii conductivity model to treat thermal heat conduction along magnetic field lines. Finally, we are implementing the Biermann Battery term to account for spontaneous generation of magnetic fields in the presence of non-parallel temperature and density gradients.
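
    For orientation, a commonly quoted form of the Alexiades et al. (1996) super-time-stepping substeps is sketched below; this reflects the general method as published rather than FLASH's implementation, and the damping parameter and step counts are illustrative assumptions.

    ```python
    import math

    def sts_substeps(dt_explicit, n_stages, nu=0.05):
        """Super-time-stepping substeps (Alexiades et al. 1996), commonly quoted as
        tau_j = dt_expl / ((nu - 1)*cos((2j - 1)*pi/(2N)) + 1 + nu)."""
        return [
            dt_explicit
            / ((nu - 1.0) * math.cos((2 * j - 1) * math.pi / (2 * n_stages)) + 1.0 + nu)
            for j in range(1, n_stages + 1)
        ]

    # One superstep of N substages covers more time than N plain explicit
    # diffusion steps, which is the source of the speed-up.
    dt_expl, N = 1.0e-3, 10
    taus = sts_substeps(dt_expl, N)
    print(sum(taus), "vs", N * dt_expl)
    ```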

  19. Music to knowledge: A visual programming environment for the development and evaluation of music information retrieval techniques

    NASA Astrophysics Data System (ADS)

    Ehmann, Andreas F.; Downie, J. Stephen

    2005-09-01

    The objective of the International Music Information Retrieval Systems Evaluation Laboratory (IMIRSEL) project is the creation of a large, secure corpus of audio and symbolic music data accessible to the music information retrieval (MIR) community for the testing and evaluation of various MIR techniques. As part of the IMIRSEL project, a cross-platform JAVA based visual programming environment called Music to Knowledge (M2K) is being developed for a variety of music information retrieval related tasks. The primary objective of M2K is to supply the MIR community with a toolset that provides the ability to rapidly prototype algorithms, as well as foster the sharing of techniques within the MIR community through the use of a standardized set of tools. Due to the relatively large size of audio data and the computational costs associated with some digital signal processing and machine learning techniques, M2K is also designed to support distributed computing across computing clusters. In addition, facilities to allow the integration of non-JAVA based (e.g., C/C++, MATLAB, etc.) algorithms and programs are provided within M2K. [Work supported by the Andrew W. Mellon Foundation and NSF Grants No. IIS-0340597 and No. IIS-0327371.]

  20. Key factors for determining groundwater impacts due to leakage from geologic carbon sequestration reservoirs

    DOE PAGES

    Carroll, Susan A.; Keating, Elizabeth; Mansoor, Kayyum; ...

    2014-09-07

    The National Risk Assessment Partnership (NRAP) is developing a science-based toolset for the analysis of potential impacts to groundwater chemistry from CO2 injection (www.netldoe.gov/nrap). The toolset adopts a stochastic approach in which predictions address uncertainties in shallow groundwater and leakage scenarios. It is derived from detailed physics and chemistry simulation results that are used to train more computationally efficient models, referred to here as reduced-order models (ROMs), for each component system. In particular, these tools can be used to help regulators and operators understand the expected sizes and longevity of plumes in pH, TDS, and dissolved metals that could result from a leakage of brine and/or CO2 from a storage reservoir into aquifers. This information can inform, for example, decisions on monitoring strategies that are both effective and efficient. We have used this approach to develop predictive reduced-order models for two common types of reservoirs, but the approach could be used to develop a model for a specific aquifer or other common types of aquifers. In this paper we describe potential impacts to groundwater quality due to CO2 and brine leakage, discuss an approach to calculate thresholds under which no impact to groundwater occurs, describe the time scale for impact on groundwater, and discuss the probability of detecting a groundwater plume should leakage occur. To facilitate this, multi-phase flow and reactive transport simulations and emulations were developed for two classes of aquifers, considering uncertainty in leakage source terms and aquifer hydrogeology. We targeted an unconfined fractured carbonate aquifer based on the Edwards aquifer in Texas and a confined alluvium aquifer based on the High Plains Aquifer in Kansas, which share characteristics typical of many drinking water aquifers in the United States. The hypothetical leakage scenarios centered on the notion that wellbores are the most likely conduits for brine and CO2 leaks. Leakage uncertainty was based on hypothetical injection of CO2 for 50 years at a rate of 5 million tons per year into a depleted oil/gas reservoir with high permeability, and one or more wells provided leakage pathways from the storage reservoir to the overlying aquifer. This scenario corresponds to a storage site with historical oil/gas production and some poorly completed legacy wells that went undetected through site evaluation, operations, and post-closure. For the aquifer systems and leakage scenarios studied here, CO2 and brine leakage are likely to drive pH below, and increase total dissolved solids (TDS) above, the “no-impact thresholds”; the subsequent plumes, although small, are likely to persist for long periods of time in the absence of remediation. In these scenarios, however, risk to human health may not be significant for two reasons. First, our simulated plume volumes are much smaller than the average inter-well spacing for these representative aquifers, so the impacted groundwater would be unlikely to be pumped for drinking water. Second, even within the impacted plume volumes little water exceeds the primary maximum contaminant levels.

  1. A Research Roadmap for Computation-Based Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full-scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates, in a hybrid fashion, elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing the modeling uncertainty found in current plant risk models.

  2. Antimicrobial peptide GH12 suppresses cariogenic virulence factors of Streptococcus mutans

    PubMed Central

    Wang, Yufei; Wang, Xiuqing; Jiang, Wentao; Wang, Kun; Luo, Junyuan; Li, Wei; Zhou, Xuedong; Zhang, Linglin

    2018-01-01

    Cariogenic virulence factors of Streptococcus mutans include acidogenicity, aciduricity, and extracellular polysaccharide (EPS) synthesis. The de novo designed antimicrobial peptide GH12 has shown bactericidal effects on S. mutans, but its interaction with the virulence and regulatory systems of S. mutans remains to be elucidated. The objectives were to investigate the effects of GH12 on virulence factors of S. mutans, and to further explore the functional mechanisms at the enzymatic and transcriptional levels. To avoid a decrease in bacterial viability, we limited GH12 to subinhibitory levels. We evaluated the effects of GH12 on the acidogenicity of S. mutans by pH drop, lactic acid measurement and lactate dehydrogenase (LDH) assay, on aciduricity through survival rate at pH 5.0 and F1F0-ATPase assay, and on EPS synthesis using quantitative measurement, morphology observation, vertical distribution analyses and biomass calculation. Afterwards, we conducted quantitative real-time PCR to acquire the expression profile of related genes. GH12 at 1/2 MIC (4 mg/L) inhibited acid production, survival rate, EPS synthesis, and biofilm formation. The enzymatic activity of LDH and F1F0-ATPase was inhibited, and the ldh, gtfBCD, vicR, liaR, and comDE genes were significantly downregulated. In conclusion, GH12 inhibited virulence factors of S. mutans by reducing the activity of related enzymes, downregulating virulence genes, and inactivating specific regulatory systems. PMID:29503706

  3. High-power operation of AlGaInP red laser diode for display applications

    NASA Astrophysics Data System (ADS)

    Kuramoto, K.; Nishida, T.; Abe, S.; Miyashita, M.; Mori, K.; Yagi, T.

    2015-03-01

    The output power of AlGaInP-based red broad area (BA) laser diodes (LDs) is substantially limited by electron thermal overflow from the active layer to the p-cladding layer and by fatal failure due to catastrophic optical mirror degradation (COMD) during LD operation. A new red BA-LD was designed and fabricated. The LD chip had triple emitters, each with a stripe width of 60 μm, and was assembled in a Φ9.0 mm TO package. The LD emitted more than 5.5 W at a heat sink temperature of 25 °C and 3.8 W at 45 °C under pulsed operation with a frequency of 120 Hz and a duty cycle of 30%, whereas the current product, a 40 μm single-emitter chip assembled in a Φ5.6 mm TO package, emits 2.0 W at 25 °C. The lasing wavelength at 25 °C and 2.5 W output was 638.6 nm. A preliminary aging test at an operation current of 3.56 A, CW, in auto-current-control (ACC) mode, at a heat sink temperature of 20 °C (corresponding to an output of roughly 3.5 W) indicated that the MTTF due to COMD was longer than 6,600 hours under CW operation and 22,000 hours under pulsed operation with a 30% duty cycle.

  4. Axial and Centrifugal Compressor Mean Line Flow Analysis Method

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2009-01-01

    This paper describes a method to estimate key aerodynamic parameters of single and multistage axial and centrifugal compressors. This mean-line compressor code, COMDES, provides the capability of sizing single and multistage compressors quickly during the conceptual design process. Based on the compressible fluid flow equations and the Euler equation, the code can estimate rotor inlet and exit blade angles when run in the design mode. The design-point rotor efficiency and stator losses are inputs to the code, and are modeled at off-design conditions. When run in the off-design analysis mode, it can be used to generate performance maps based on simple models for losses due to rotor incidence and inlet guide vane reset angle. The code can provide an improved understanding of basic aerodynamic parameters such as diffusion factor, loading levels, and incidence when matching multistage compressor blade rows at design and at part-speed operation. Rotor loading levels and relative velocity ratio are correlated to the onset of compressor surge. NASA Stage 37 and the three-stage NASA 74-A axial compressors were analyzed and the results compared to test data. The code has been used to generate the performance map for the NASA 76-B three-stage axial compressor featuring variable geometry. The compressor stages were aerodynamically matched at off-design speeds by adjusting the variable inlet guide vane and variable stator geometry angles to control the rotor diffusion factor and incidence angles.
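
    As a worked illustration of the kind of mean-line quantities mentioned above, the sketch below evaluates the Euler turbomachinery work and the Lieblein diffusion factor for a single hypothetical axial rotor. The velocity-triangle numbers are invented and this is not the COMDES code itself.

```python
# Minimal mean-line sketch (not COMDES): Euler work and Lieblein diffusion factor
# for one axial rotor, using hypothetical velocity-triangle values.
cp = 1004.5          # specific heat of air, J/(kg*K)

U = 350.0            # blade speed, m/s (assumed constant across the rotor)
C_theta1, C_theta2 = 30.0, 180.0    # absolute tangential velocities in/out, m/s
W1, W2 = 400.0, 280.0               # relative velocity magnitudes in/out, m/s
sigma = 1.2                         # blade solidity (chord / pitch)

# Euler turbomachinery equation: specific work = U2*Ctheta2 - U1*Ctheta1
work = U * C_theta2 - U * C_theta1          # J/kg
dT0 = work / cp                             # total-temperature rise, K

# Lieblein diffusion factor, a common blade-loading limit in mean-line design
D = 1.0 - W2 / W1 + (C_theta2 - C_theta1) / (2.0 * sigma * W1)

print(f"Euler work: {work/1000:.1f} kJ/kg, stage dT0: {dT0:.1f} K, D-factor: {D:.2f}")
```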

  5. Lensfree On-Chip Microscopy and Tomography for Bio-Medical Applications

    PubMed Central

    Isikman, Serhan O.; Bishara, Waheb; Mudanyali, Onur; Sencan, Ikbal; Su, Ting-Wei; Tseng, Derek; Yaglidere, Oguzhan; Sikora, Uzair; Ozcan, Aydogan

    2012-01-01

    Lensfree on-chip holographic microscopy is an emerging technique that offers imaging of biological specimens over a large field-of-view without using any lenses or bulky optical components. Lending itself to a compact, cost-effective and mechanically robust architecture, lensfree on-chip holographic microscopy can offer an alternative toolset addressing some of the emerging needs of microscopic analysis and diagnostics in low-resource settings, especially for telemedicine applications. In this review, we summarize the latest achievements in lensfree optical microscopy based on partially coherent on-chip holography, including portable telemedicine microscopy, cell-phone based microscopy and field-portable optical tomographic microscopy. We also discuss some of the future directions for telemedicine microscopy and its prospects to help combat various global health challenges. PMID:24478572

  6. Rapid Analysis of Mass Distribution of Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Zapp, Edward

    2007-01-01

    Radiation Shielding Evaluation Toolset (RADSET) is a computer program that rapidly calculates the spatial distribution of mass of an arbitrary structure for use in ray-tracing analysis of the radiation-shielding properties of the structure. RADSET was written to be used in conjunction with unmodified commercial computer-aided design (CAD) software that provides access to data on the structure and generates selected three-dimensional-appearing views of the structure. RADSET obtains raw geometric, material, and mass data on the structure from the CAD software. From these data, RADSET calculates the distribution(s) of the masses of specific materials about any user-specified point(s). The results of these mass-distribution calculations are imported back into the CAD computing environment, wherein the radiation-shielding calculations are performed.
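
    The following sketch illustrates, in a highly simplified way, the kind of quantity such a tool feeds into ray-tracing shielding analysis: the areal density accumulated along rays leaving a dose point inside a voxelized structure. The voxel model, dimensions, and densities are invented; this is not RADSET's actual algorithm.

```python
# Illustrative sketch (not RADSET): areal density along rays from a dose point
# through a voxelized structure, as used by ray-tracing shielding analyses.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical voxel model: densities in g/cm^3 on a 1 cm grid (hollow aluminum box)
grid = np.zeros((40, 40, 40))
grid[5:35, 5:35, 5:35] = 2.7
grid[8:32, 8:32, 8:32] = 0.0          # hollow interior
point = np.array([20.0, 20.0, 20.0])  # dose point at the center, in cm

def areal_density(direction, step=0.25, n_steps=400):
    """Integrate density (g/cm^2) along one ray leaving the dose point."""
    d = direction / np.linalg.norm(direction)
    total = 0.0
    for k in range(n_steps):
        p = point + (k + 0.5) * step * d
        ix, iy, iz = np.floor(p).astype(int)
        if not (0 <= ix < 40 and 0 <= iy < 40 and 0 <= iz < 40):
            break
        total += grid[ix, iy, iz] * step
    return total

# Sample random directions to build the shielding mass distribution about the point
dirs = rng.normal(size=(1000, 3))
thicknesses = np.array([areal_density(d) for d in dirs])
print("mean areal density: %.2f g/cm^2" % thicknesses.mean())
```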

  7. FAST - A multiprocessed environment for visualization of computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin

    1991-01-01

    The paper presents the Flow Analysis Software Toolset (FAST) to be used for fluid-mechanics analysis. The design criteria for FAST, including the minimization of the data path in the computational fluid-dynamics (CFD) process, a consistent user interface, an extensible software architecture, modularization, and the isolation of three-dimensional tasks from the application programmer, are outlined. Each separate process communicates through the FAST Hub, while other modules such as FAST Central, NAS file input, CFD calculator, surface extractor and renderer, titler, tracer, and isolev might work together to generate the scene. An interprocess communication package, which makes it possible for FAST to operate as a modular environment in which resources can be shared among different machines as well as on a single host, is discussed.

  8. Multiscale approach for the construction of equilibrated all-atom models of a poly(ethylene glycol)-based hydrogel

    PubMed Central

    Li, Xianfeng; Murthy, N. Sanjeeva; Becker, Matthew L.; Latour, Robert A.

    2016-01-01

    A multiscale modeling approach is presented for the efficient construction of an equilibrated all-atom model of a cross-linked poly(ethylene glycol) (PEG)-based hydrogel using the all-atom polymer consistent force field (PCFF). The final equilibrated all-atom model was built with a systematic simulation toolset consisting of three consecutive parts: (1) building a global cross-linked PEG-chain network at experimentally determined cross-link density using an on-lattice Monte Carlo method based on the bond fluctuation model, (2) recovering the local molecular structure of the network by transitioning from the lattice model to an off-lattice coarse-grained (CG) model parameterized from PCFF, followed by equilibration using high performance molecular dynamics methods, and (3) recovering the atomistic structure of the network by reverse mapping from the equilibrated CG structure, hydrating the structure with explicitly represented water, followed by final equilibration using PCFF parameterization. The developed three-stage modeling approach has application to a wide range of other complex macromolecular hydrogel systems, including the integration of peptide, protein, and/or drug molecules as side-chains within the hydrogel network for the incorporation of bioactivity for tissue engineering, regenerative medicine, and drug delivery applications. PMID:27013229
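
    The sketch below is a heavily simplified stand-in for stage (1) of the workflow described above: it grows polymer-like chains as self-avoiding walks on a cubic lattice. The real pipeline uses the bond fluctuation model and experimentally constrained cross-link densities; the lattice size, chain count, and growth rule here are arbitrary choices for illustration only.

```python
# Toy stand-in for stage (1): grow PEG-like chains as self-avoiding random walks
# on a cubic lattice. The actual workflow uses the bond fluctuation model.
import random

random.seed(2)
L, n_chains, chain_len = 30, 20, 40     # lattice size, chain count, beads per chain
occupied = set()
moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def grow_chain():
    while True:
        start = tuple(random.randrange(L) for _ in range(3))
        if start in occupied:
            continue
        chain = [start]
        for _ in range(chain_len - 1):
            random.shuffle(moves)
            for step in moves:
                nxt = tuple((chain[-1][i] + step[i]) % L for i in range(3))
                if nxt not in occupied and nxt not in chain:
                    chain.append(nxt)
                    break
            else:
                break                    # dead end: retry this chain from scratch
        if len(chain) == chain_len:
            occupied.update(chain)
            return chain

network = [grow_chain() for _ in range(n_chains)]
print("placed", sum(len(c) for c in network), "beads on the lattice")
```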

  9. Genexpi: a toolset for identifying regulons and validating gene regulatory networks using time-course expression data.

    PubMed

    Modrák, Martin; Vohradský, Jiří

    2018-04-13

    Identifying regulons of sigma factors is a vital subtask of gene network inference. Integrating multiple sources of data is essential for correct identification of regulons and complete gene regulatory networks. Time series of expression data measured with microarrays or RNA-seq, combined with static binding experiments (e.g., ChIP-seq) or literature mining, may be used for inference of sigma factor regulatory networks. We introduce Genexpi: a tool to identify sigma factors by combining candidates obtained from ChIP experiments or literature mining with time-course gene expression data. While Genexpi can be used to infer other types of regulatory interactions, it was designed and validated on real biological data from bacterial regulons. In this paper, we put primary focus on CyGenexpi: a plugin integrating Genexpi with the Cytoscape software for ease of use. As part of this effort, a plugin for handling time series data in Cytoscape, called CyDataseries, has been developed and made available. Genexpi is also available as a standalone command line tool and an R package. Genexpi is a useful part of the gene network inference toolbox. It provides meaningful information about the composition of regulons and delivers biologically interpretable results.

  10. Looking Down Through the Clouds – Optical Attenuation through Real-Time Clouds

    NASA Astrophysics Data System (ADS)

    Burley, J.; Lazarewicz, A.; Dean, D.; Heath, N.

    Detecting and identifying nuclear explosions in the atmosphere and on the surface of the Earth is critical for the Air Force Technical Applications Center (AFTAC) treaty monitoring mission. Optical signals, from surface or atmospheric nuclear explosions detected by satellite sensors, are attenuated by the atmosphere and clouds. Clouds present a particularly complex challenge as they cover up to seventy percent of the earth's surface. Moreover, their highly variable and diverse nature requires physics-based modeling. Determining the attenuation for each optical ray-path is uniquely dependent on the source geolocation, the specific optical transmission characteristics along that ray path, and sensor detection capabilities. This research details a collaborative AFTAC and AFIT effort to fuse worldwide weather data, from a variety of sources, to provide near-real-time profiles of atmospheric and cloud conditions and the resulting radiative transfer analysis for virtually any wavelength(s) of interest from source to satellite. AFIT has developed a means to model global clouds using the U.S. Air Force’s World Wide Merged Cloud Analysis (WWMCA) cloud data in a new toolset that enables radiance calculations through clouds from UV to RF wavelengths.
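
    As a back-of-the-envelope illustration of the attenuation problem described above, the sketch below applies Beer-Lambert extinction along a slant path through a few cloud layers. The layer optical depths and viewing geometry are invented; the actual WWMCA-driven toolset performs full radiative transfer rather than this shortcut.

```python
# Toy sketch of the core radiative idea: Beer-Lambert attenuation of an optical
# signal along a slant path through layered clouds. Optical depths are invented.
import math

# Hypothetical cloud layers on the source-to-satellite path (optical depth at nadir)
layer_tau = [0.4, 2.5, 0.1]          # e.g. low stratus, thick cumulus, thin cirrus
zenith_angle_deg = 40.0              # satellite viewing geometry

# Slant-path optical depth scales roughly with 1/cos(zenith) for plane-parallel layers
airmass = 1.0 / math.cos(math.radians(zenith_angle_deg))
tau_total = sum(layer_tau) * airmass

transmittance = math.exp(-tau_total)
print(f"slant optical depth = {tau_total:.2f}, direct transmittance = {transmittance:.4f}")
```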

  11. The Use of Genomics in Conservation Management of the Endangered Visayan Warty Pig (Sus cebifrons).

    PubMed

    Nuijten, Rascha J M; Bosse, Mirte; Crooijmans, Richard P M A; Madsen, Ole; Schaftenaar, Willem; Ryder, Oliver A; Groenen, Martien A M; Megens, Hendrik-Jan

    2016-01-01

    The list of threatened and endangered species is growing rapidly, due to various anthropogenic causes. Many endangered species are present in captivity and actively managed in breeding programs in which often little is known about the founder individuals. Recent developments in genetic research techniques have made it possible to sequence and study whole genomes. In this study we used the critically endangered Visayan warty pig (Sus cebifrons) as a case study to test the use of genomic information as a tool in conservation management. Two captive populations of S. cebifrons exist, which originated from two different Philippine islands. We found some evidence for a recent split between the two island populations; however all individuals that were sequenced show a similar demographic history. Evidence for both past and recent inbreeding indicated that the founders were at least to some extent related. Together with this, the low level of nucleotide diversity compared to other Sus species potentially poses a threat to the viability of the captive populations. In conclusion, genomic techniques answered some important questions about this critically endangered mammal and can be a valuable toolset to inform future conservation management in other species as well.

  12. Rapid-X - An FPGA Development Toolset Using a Custom Simulink Library for MTCA.4 Modules

    NASA Astrophysics Data System (ADS)

    Prędki, Paweł; Heuer, Michael; Butkowski, Łukasz; Przygoda, Konrad; Schlarb, Holger; Napieralski, Andrzej

    2015-06-01

    The recent introduction of advanced hardware architectures such as the Micro Telecommunications Computing Architecture (MTCA) caused a change in the approach to implementation of control schemes in many fields. The development has been moving away from traditional programming languages (C/C++) to hardware description languages (VHDL, Verilog), which are used in FPGA development. With MATLAB/Simulink it is possible to describe complex systems with block diagrams and simulate their behavior. Those diagrams are then used by the HDL experts to implement exactly the required functionality in hardware. Both the porting of existing applications and the adaptation of new ones require a lot of development time from them. To solve this, Xilinx System Generator, a toolbox for MATLAB/Simulink, allows rapid prototyping of those block diagrams using hardware modelling. It is still up to the firmware developer to merge this structure with the hardware-dependent HDL project. This prevents the application engineer from quickly verifying the proposed schemes in real hardware. The framework described in this article overcomes these challenges, offering a hardware-independent library of components that can be used in Simulink/System Generator models. The components are subsequently translated into VHDL entities and integrated with a pre-prepared VHDL project template. Furthermore, the entire implementation process is run in the background, giving the user an almost one-click path from control scheme modelling and simulation to bit-file generation. This approach allows application engineers to quickly develop new schemes and test them in a real hardware environment. The applications may range from simple data logging or signal generation to very advanced controllers. Taking advantage of the Simulink simulation capabilities and user-friendly hardware implementation routines, the framework significantly decreases the development time of FPGA-based applications.

  13. Multidisciplinary model-based-engineering for laser weapon systems: recent progress

    NASA Astrophysics Data System (ADS)

    Coy, Steve; Panthaki, Malcolm

    2013-09-01

    We are working to develop a comprehensive, integrated software framework and toolset to support model-based engineering (MBE) of laser weapons systems. MBE has been identified by the Office of the Director, Defense Science and Engineering as one of four potentially "game-changing" technologies that could bring about revolutionary advances across the entire DoD research and development and procurement cycle. To be effective, however, MBE requires robust underlying modeling and simulation technologies capable of modeling all the pertinent systems, subsystems, components, effects, and interactions at any level of fidelity that may be required in order to support crucial design decisions at any point in the system development lifecycle. Very often the greatest technical challenges are posed by systems involving interactions that cut across two or more distinct scientific or engineering domains; even in cases where there are excellent tools available for modeling each individual domain, generally none of these domain-specific tools can be used to model the cross-domain interactions. In the case of laser weapons systems R&D, these tools need to be able to support modeling of systems involving combined interactions among structures, thermal, and optical effects, including both ray optics and wave optics, controls, atmospheric effects, target interaction, computational fluid dynamics, and spatiotemporal interactions between lasing light and the laser gain medium. To address this problem we are working to extend Comet™ to add the additional modeling and simulation capabilities required for this particular application area. In this paper we will describe our progress to date.

  14. Decision support and disease management: a logic engineering approach.

    PubMed

    Fox, J; Thomson, R

    1998-12-01

    This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.
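
    To make the "four standard objects" idea concrete, the sketch below mimics the flavor of that task model in plain Python. It is not PROforma syntax and does not reflect its logical semantics; all task names, rules, and the stub patient record are invented for illustration.

```python
# Not PROforma itself: a toy Python object model loosely mirroring its four
# standard task types (plan, decision, action, enquiry). All names are invented.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

PATIENT_RECORD = {"age": 67, "psa_ng_ml": 6.2}     # hypothetical data source

@dataclass
class Enquiry:
    item: str                                      # data item to fetch
    def run(self, ctx: Dict) -> None:
        ctx[self.item] = PATIENT_RECORD[self.item]

@dataclass
class Decision:
    name: str
    candidates: Dict[str, Callable[[Dict], float]] # candidate -> argument weight
    def run(self, ctx: Dict) -> None:
        ctx[self.name] = max(self.candidates, key=lambda c: self.candidates[c](ctx))

@dataclass
class Action:
    description: str                               # something to do in the world
    def run(self, ctx: Dict) -> None:
        ctx.setdefault("actions", []).append(self.description)

@dataclass
class Plan:
    tasks: List = field(default_factory=list)      # sequenced sub-tasks
    def run(self, ctx: Dict) -> None:
        for task in self.tasks:
            task.run(ctx)

# A tiny, unconditional "guideline" composed from the four task types
guideline = Plan([
    Enquiry("age"),
    Enquiry("psa_ng_ml"),
    Decision("referral", {
        "refer":   lambda c: c["psa_ng_ml"] - 4.0 + (c["age"] > 60),
        "monitor": lambda c: 1.0,
    }),
    Action("book specialist appointment"),
])
context: Dict = {}
guideline.run(context)
print(context)
```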

  15. The application of multiple biophysical cues to engineer functional neocartilage for treatment of osteoarthritis. Part II: signal transduction.

    PubMed

    Brady, Mariea A; Waldman, Stephen D; Ethier, C Ross

    2015-02-01

    The unique mechanoelectrochemical environment of cartilage has motivated researchers to investigate the effect of multiple biophysical cues, including mechanical, magnetic, and electrical stimulation, on chondrocyte biology. It is well established that biophysical stimuli promote chondrocyte proliferation, differentiation, and maturation within "biological windows" of defined dose parameters, including mode, frequency, magnitude, and duration of stimuli (see companion review Part I: Cellular Response). However, the underlying molecular mechanisms and signal transduction pathways activated in response to multiple biophysical stimuli remain to be elucidated. Understanding the mechanisms of biophysical signal transduction will deepen knowledge of tissue organogenesis, remodeling, and regeneration and aid in the treatment of pathologies such as osteoarthritis. Further, this knowledge will provide the tissue engineer with a potent toolset to manipulate and control cell fate and subsequently develop functional replacement cartilage. The aim of this article is to review chondrocyte signal transduction pathways in response to mechanical, magnetic, and electrical cues. Signal transduction does not occur along a single pathway; rather, a number of parallel pathways appear to be activated, with calcium signaling apparently common to all three types of stimuli, though there are different modes of activation. Current tissue engineering strategies, such as the development of "smart" functionalized biomaterials that enable the delivery of growth factors or integration of conjugated nanoparticles, may further benefit from targeting known signal transduction pathways in combination with external biophysical cues.

  16. Improving 130nm node patterning using inverse lithography techniques for an analog process

    NASA Astrophysics Data System (ADS)

    Duan, Can; Jessen, Scott; Ziger, David; Watanabe, Mizuki; Prins, Steve; Ho, Chi-Chien; Shu, Jing

    2018-03-01

    Developing a new lithographic process routinely involves usage of lithographic toolsets and much engineering time to perform data analysis. Process transfers between fabs occur quite often. One of the key assumptions made is that lithographic settings are equivalent from one fab to another and that the transfer is fluid. In some cases, that is far from the truth. Differences in tools can change the proximity effect seen in low-k1 imaging processes. With model-based optical proximity correction (MBOPC), a model built in one fab will not work under the same conditions at another fab. This results in many wafers being patterned to try to match a baseline response. Even if matching is achieved, there is no guarantee that optimal lithographic responses are met. In this paper, we discuss the approach used to transfer and develop new lithographic processes and to define MBOPC builds for the new lithographic process in Fab B, which was transferred from a similar lithographic process in Fab A. By using PROLITH™ simulations to match OPC models for each level, minimal downtime in wafer processing was observed. Source Mask Optimization (SMO) was also used to optimize lithographic processes using novel inverse lithography techniques (ILT) to simultaneously optimize mask bias, depth of focus (DOF), exposure latitude (EL), and mask error enhancement factor (MEEF) for critical designs at each level.

  17. Ergonomic assessment of airport shuttle driver tasks using an ergonomic analysis toolset.

    PubMed

    Çakıt, Erman

    2018-06-01

    This study aimed to (a) evaluate strength requirements and lower back stresses during lifting and baggage handling tasks with the 3D Static Strength Prediction Program (3DSSPP) and (b) provide additional analyses using rapid entire body assessment (REBA) and the NASA task load index (TLX) to assess the risks associated with the tasks. Four healthy female shuttle drivers aged between 55 and 60 years were observed and interviewed in an effort to determine the tasks required of their occupations. The results indicated that lifting bags and placing them in a shuttle posed a high risk of injury, and possible changes should be further investigated. The study concluded that there was a potential for injury associated with the baggage storing and retrieval tasks of a shuttle driver.

  18. Distributing Variable Star Data to the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Kinne, Richard C.; Templeton, M. R.; Henden, A. A.; Zografou, P.; Harbo, P.; Evans, J.; Rots, A. H.; LAZIO, J.

    2013-01-01

    Effective distribution of data is a core element of effective astronomy today. The AAVSO is the home of several different unique databases. The AAVSO International Database (AID) contains over a century of photometric and time-series data on thousands of individual variable stars comprising over 22 million observations. The AAVSO Photometric All-Sky Survey (APASS) is a new photometric catalog containing calibrated photometry in Johnson B, V and Sloan g', r' and i' filters for stars with magnitudes of 10 < V < 17. The AAVSO is partnering with researchers and technologists at the Virtual Astronomical Observatory (VAO) to solve the data distribution problem for these datasets by making them available via various VO tools. We give specific examples of how these data can be accessed through Virtual Observatory (VO) toolsets and utilized for astronomical research.

  19. The zebrafish as a model for complex tissue regeneration

    PubMed Central

    Gemberling, Matthew; Bailey, Travis J.; Hyde, David R.; Poss, Kenneth D.

    2013-01-01

    For centuries, philosophers and scientists have been fascinated by the principles and implications of regeneration in lower vertebrate species. Two features have made zebrafish an informative model system for determining mechanisms of regenerative events. First, they are highly regenerative, able to regrow amputated fins, as well as a lesioned brain, retina, spinal cord, heart, and other tissues. Second, they are amenable to both forward and reverse genetic approaches, with a research toolset regularly updated by an expanding community of zebrafish researchers. Zebrafish studies have helped identify new mechanistic underpinnings of regeneration in multiple tissues, and in some cases have served as a guide for contemplating regenerative strategies in mammals. Here, we review the recent history of zebrafish as a genetic model system for understanding how and why tissue regeneration occurs. PMID:23927865

  20. Unified Approach to Modeling and Simulation of Space Communication Networks and Systems

    NASA Technical Reports Server (NTRS)

    Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth

    2010-01-01

    Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow for rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution - the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords: space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks

  1. Amanzi: An Open-Source Multi-process Simulator for Environmental Applications

    NASA Astrophysics Data System (ADS)

    Moulton, J. D.; Molins, S.; Johnson, J. N.; Coon, E.; Lipnikov, K.; Day, M.; Barker, E.

    2014-12-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments begin with simplified models, and add geometric and geologic complexity as understanding is gained. The Platform toolset (Akuna) generates these conceptual models, and Amanzi provides the computational engine to perform the simulations, returning the results for analysis and visualization. In this presentation we highlight key elements of the design, algorithms, and implementations used in Amanzi. In particular, the hierarchical and modular design is aligned with the coupled processes being simulated, and naturally supports a wide range of model complexity. This design leverages a dynamic data manager and the synergy of two graphs (one from the high-level perspective of the models, the other from the dependencies of the variables in the model) to enable this flexible model configuration at run time. Moreover, to model sites with complex hydrostratigraphy, as well as engineered systems, we are developing a dual unstructured/structured capability. Recently, these capabilities have been collected in a framework named Arcos, and efforts have begun to improve interoperability between the unstructured and structured AMR approaches in Amanzi. To leverage a range of biogeochemistry capability from the community (e.g., CrunchFlow, PFLOTRAN, etc.), a biogeochemistry interface library called Alquimia was developed. To ensure that Amanzi is truly an open-source community code, we require a completely open-source tool chain for our development. We will comment on elements of this tool chain, including the testing and documentation development tools such as docutils and Sphinx. Finally, we will show simulation results from our phased demonstrations, including the geochemically complex Savannah River F-Area seepage basins.
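
    As a minimal sketch of the dependency-graph idea mentioned above, the example below derives an evaluation order for a set of model variables from their declared dependencies at run time. The variable names and evaluator formulas are hypothetical and do not represent Amanzi's actual data manager.

```python
# Minimal sketch of a variable dependency graph evaluated in topological order.
# Variables and evaluators are invented; this is not Amanzi's data manager.
from graphlib import TopologicalSorter   # Python 3.9+

# variable -> set of variables it depends on
deps = {
    "porosity": set(),
    "pressure": set(),
    "saturation": {"pressure"},
    "rel_permeability": {"saturation"},
    "darcy_flux": {"pressure", "rel_permeability", "porosity"},
}

# one evaluator per variable, each reading previously computed values
evaluators = {
    "porosity": lambda v: 0.3,
    "pressure": lambda v: 101325.0,
    "saturation": lambda v: min(1.0, v["pressure"] / 2e5),
    "rel_permeability": lambda v: v["saturation"] ** 3,
    "darcy_flux": lambda v: -1e-12 * v["rel_permeability"] * v["pressure"] / 0.001,
}

values = {}
for name in TopologicalSorter(deps).static_order():   # dependencies come first
    values[name] = evaluators[name](values)
print(values)
```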

  2. Coastal On-line Assessment and Synthesis Tool 2.0

    NASA Technical Reports Server (NTRS)

    Brown, Richard; Navard, Andrew; Nguyen, Beth

    2011-01-01

    COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.

  3. OBO to UML: Support for the development of conceptual models in the biomedical domain.

    PubMed

    Waldemarin, Ricardo C; de Farias, Cléver R G

    2018-04-01

    A conceptual model abstractly defines a number of concepts and their relationships for the purposes of understanding and communication. Once a conceptual model is available, it can also be used as a starting point for the development of a software system. The development of conceptual models using the Unified Modeling Language (UML) facilitates the representation of modeled concepts and allows software developers to directly reuse these concepts in the design of a software system. The OBO Foundry represents the most relevant collaborative effort towards the development of ontologies in the biomedical domain. The development of UML conceptual models in the biomedical domain may benefit from the use of domain-specific semantics and notation. Further, the development of these models may also benefit from the reuse of knowledge contained in OBO ontologies. This paper investigates the support for the development of conceptual models in the biomedical domain using UML as a conceptual modeling language and using the support provided by the OBO Foundry for the development of biomedical ontologies, namely the entity kind and relationship type definitions provided by the Basic Formal Ontology (BFO) and the OBO Core Relations Ontology (OBO Core), respectively. Further, the paper investigates the support for the reuse of biomedical knowledge currently available in OBO flat file format (OBOFFF) ontologies in the development of these conceptual models. The paper describes a UML profile for the OBO Core Relations Ontology, which basically defines a number of stereotypes to represent BFO entity kind and OBO Core relationship type definitions. The paper also presents a support toolset consisting of a graphical editor named OBO-RO Editor, which directly supports the development of UML models using the extensions defined by our profile, and a command-line tool named OBO2UML, which directly converts an OBOFFF ontology into a UML model. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Addressing Challenges to the Design & Test of Operational Lighting Environments for the International Space Station

    NASA Technical Reports Server (NTRS)

    Clark, Toni A.

    2014-01-01

    In our day-to-day lives, the availability of light with which to see our environment is often taken for granted. The designers of land-based lighting systems use sunlight and artificial light as their toolset. The availability of power, quantity of light sources, and variety of design options are often unlimited. The accessibility of most land-based lighting systems makes it easy for the architect and engineer to verify and validate their design ideas. Failures with an implementation, while sometimes costly, can easily be addressed by renovation. Consider now an architectural facility orbiting in space, 260 miles above the surface of the earth. This human-rated architectural facility, the International Space Station (ISS), must maintain operations every day, including life support and appropriate human comforts, without fail. The facility must also handle the logistics of regular shipments of cargo, including new passengers. The ISS requires accommodations necessary for human control of machine systems. Additionally, the ISS is a research facility and supports investigations performed inside and outside its livable volume. Finally, the facility must support remote operations and observations by ground controllers. All of these architectural needs require a functional, safe, and even aesthetic lighting environment. At Johnson Space Center, our Habitability and Human Factors team assists our diverse customers with their lighting environment challenges via physical test and computer-based analysis. Because of the complexity of the ISS operational environment, our team has learned and developed processes that help the ISS operate safely. Because of the dynamic exterior lighting environment, the team uses computational modeling to predict lighting conditions. The ISS orbit exposes it to a sunrise every 90 minutes, causing work surfaces to quickly change from direct sunlight to earthshine to total darkness. Proper planning of vehicle approaches, robotics operations, and crewed Extra Vehicular Activities is mandatory to ensure safety for the crew and all others involved. Innovation in testing techniques is important as well. The advent of solid state lighting technology and the lack of stable national and international standards for its implementation pose new challenges in how to design, test, and verify individual light fixtures and the environment that uses them. The ISS will soon be replacing its internal fluorescent lighting system with a solid state LED system. The Solid State Lighting Assembly will be used not only for general lighting, but also as a medical countermeasure to control the circadian rhythm of the crew. The new light source has performance criteria very specific to its spectral fingerprint, creating new challenges that were not as significant during the original design of the ISS. This presentation will showcase findings and toolsets our team is using to assist in the planning of tasks and the design of operational lighting environments on the International Space Station.

  5. Hierarchical Testing with Automated Document Generation for Amanzi, ASCEM's Subsurface Flow and Reactive Transport Simulator

    NASA Astrophysics Data System (ADS)

    Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.

    2013-12-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models, and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator, Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system called Sphinx with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for tests, benchmark and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We will highlight these features, and discuss important benefits of this approach for Amanzi. In addition, we'll show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
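
    For readers unfamiliar with the documentation stack named above, the snippet below is a minimal Sphinx conf.py of the kind such a workflow relies on, enabling rendered equations and build-time matplotlib plots. The project name and theme are placeholders, not the actual Amanzi configuration.

```python
# conf.py -- minimal Sphinx configuration sketch for documentation that mixes
# narrative text, equations, and matplotlib plots (placeholder settings only).
project = "amanzi-benchmarks"               # hypothetical project name
extensions = [
    "sphinx.ext.mathjax",                   # rendered equations in HTML output
    "matplotlib.sphinxext.plot_directive",  # execute plotting scripts at build time
]
source_suffix = ".rst"                      # reStructuredText sources under version control
html_theme = "alabaster"
latex_engine = "pdflatex"                   # the same sources can also build a PDF
```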

  6. Common modeling system for digital simulation

    NASA Technical Reports Server (NTRS)

    Painter, Rick

    1994-01-01

    The Joint Modeling and Simulation System is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-window-based, open-systems-architecture, object-based/oriented methodology and a standard-interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis, Cost Effectiveness Analysis (COEA) tradeoffs, etc. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts towards commonality in DOD M&S. The typical scenario is that an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, with the result being unique maintenance requirements. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g. user interface, database/database management system, data journalizing/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include: duplication of effort; varying assumptions; lack of credibility/validation; and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their areas of interest.

  7. Design of experiments (DoE) in pharmaceutical development.

    PubMed

    N Politis, Stavros; Colombo, Paolo; Colombo, Gaia; M Rekkas, Dimitrios

    2017-06-01

    At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it enables quality to be built into the product by adopting Deming's profound knowledge approach, comprising systems thinking, understanding of variation, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms compared to other sectors. It heavily focused on blockbuster drugs, while formulation development was mainly performed by One Factor At a Time (OFAT) studies rather than by implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach, and the statistical toolset for its implementation. As such, DoE is presented in detail since it represents the first choice for rational pharmaceutical development.
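
    As a small worked illustration of the DoE idea discussed above, the sketch below runs a two-level full factorial design in three hypothetical process parameters and estimates main effects by least squares. The factor names, true effect sizes, and response are invented for illustration; they are not from the review.

```python
# Toy DoE example: 2^3 full factorial design with main effects fit by least squares.
# Factors, effect sizes, and the simulated response are invented for illustration.
import numpy as np
from itertools import product

# Coded levels (-1, +1) for three hypothetical CPPs: mixing time, force, binder %
design = np.array(list(product([-1, 1], repeat=3)), dtype=float)   # 8 runs

rng = np.random.default_rng(3)
# Hypothetical measured CQA (e.g. dissolution at 30 min, %): true effects 5, 8, -3
response = (70 + 5 * design[:, 0] + 8 * design[:, 1] - 3 * design[:, 2]
            + rng.normal(0, 1.0, 8))

X = np.column_stack([np.ones(8), design])            # intercept + main effects
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, b in zip(["intercept", "mix time", "force", "binder"], coef):
    print(f"{name:>10s}: {b:+.2f}")
```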

  8. Two's company, three (or more) is a simplex : Algebraic-topological tools for understanding higher-order structure in neural data.

    PubMed

    Giusti, Chad; Ghrist, Robert; Bassett, Danielle S

    2016-08-01

    The language of graph theory, or network science, has proven to be an exceptional tool for addressing myriad problems in neuroscience. Yet, the use of networks is predicated on a critical simplifying assumption: that the quintessential unit of interest in a brain is a dyad - two nodes (neurons or brain regions) connected by an edge. While rarely mentioned, this fundamental assumption inherently limits the types of neural structure and function that graphs can be used to model. Here, we describe a generalization of graphs that overcomes these limitations, thereby offering a broad range of new possibilities in terms of modeling and measuring neural phenomena. Specifically, we explore the use of simplicial complexes: a structure developed in the field of mathematics known as algebraic topology, of increasing applicability to real data due to a rapidly growing computational toolset. We review the underlying mathematical formalism as well as the budding literature applying simplicial complexes to neural data, from electrophysiological recordings in animal models to hemodynamic fluctuations in humans. Based on the exceptional flexibility of the tools and recent ground-breaking insights into neural function, we posit that this framework has the potential to eclipse graph theory in unraveling the fundamental mysteries of cognition.
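
    The sketch below makes the simplicial idea concrete with synthetic data: a correlation matrix is thresholded into a graph and every (k+1)-clique is treated as a k-simplex (the "clique complex"). The threshold and data are arbitrary; real analyses typically continue to persistent homology rather than stopping at simplex counts.

```python
# Toy clique-complex construction: threshold a correlation matrix into a graph,
# then treat every (k+1)-clique as a k-simplex. Data and threshold are synthetic.
import numpy as np
import networkx as nx
from itertools import combinations

rng = np.random.default_rng(4)
activity = rng.normal(size=(10, 500))          # 10 "neurons", 500 time bins
corr = np.corrcoef(activity)

G = nx.Graph()
G.add_nodes_from(range(10))
for i, j in combinations(range(10), 2):
    if corr[i, j] > 0.05:                      # arbitrary threshold
        G.add_edge(i, j)

# Enumerate simplices up to dimension 2 (nodes, edges, filled triangles)
simplices = {0: list(G.nodes), 1: list(G.edges), 2: []}
for clique in nx.find_cliques(G):              # maximal cliques
    for tri in combinations(sorted(clique), 3):
        if tri not in simplices[2]:
            simplices[2].append(tri)

for dim, cells in simplices.items():
    print(f"{len(cells)} simplices of dimension {dim}")
```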

  9. Airways, vasculature, and interstitial tissue: anatomically informed computational modeling of human lungs for virtual clinical trials

    NASA Astrophysics Data System (ADS)

    Abadi, Ehsan; Sturgeon, Gregory M.; Agasthya, Greeshma; Harrawood, Brian; Hoeschen, Christoph; Kapadia, Anuj; Segars, W. P.; Samei, Ehsan

    2017-03-01

    This study aimed to model virtual human lung phantoms including both non-parenchymal and parenchymal structures. Initial branches of the non-parenchymal structures (airways, arteries, and veins) were segmented from anatomical data in each lobe separately. A volume-filling branching algorithm was utilized to grow the higher generations of the airways and vessels to the level of terminal branches. The diameters of the airways and vessels were estimated using established relationships between flow rates and diameters. The parenchyma was modeled based on secondary pulmonary lobule units. Polyhedral shapes with variable sizes were modeled, and the borders were assigned to interlobular septa. A heterogeneous background was added inside these units using a non-parametric texture synthesis algorithm which was informed by a high-resolution CT lung specimen dataset. A voxel-based CT simulator was developed to create synthetic helical CT images of the phantom with different pitch values. Results showed the progressive degradation in depiction of lung details with increased pitch. Overall, the enhanced lung models combined with the XCAT phantoms prove to provide a powerful toolset to perform virtual clinical trials in the context of thoracic imaging. Such trials, not practical using clinical datasets or simplistic phantoms, can quantitatively evaluate and optimize advanced imaging techniques towards patient-based care.
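
    The sketch below gives a very reduced flavor of a space-filling branching scheme: each branch extends toward the centroid of the sample points it must supply, and the points are split for the two daughter branches. The splitting rule, growth fraction, and termination criteria are simplifications invented for illustration, not the algorithm used in the study.

```python
# Very reduced sketch of a space-filling branching idea: each branch grows toward
# the centroid of its assigned points, then the points are split between daughters.
import numpy as np

rng = np.random.default_rng(5)
seeds = rng.uniform(0, 1, size=(400, 3))        # stand-in for lung-volume sample points

def grow(start, points, depth=0, max_depth=6, branches=None):
    if branches is None:
        branches = []
    if len(points) < 4 or depth >= max_depth:    # stop at sparse regions or max depth
        return branches
    centroid = points.mean(axis=0)
    end = start + 0.6 * (centroid - start)       # branch extends 60% toward centroid
    branches.append((start, end))
    # split remaining points about the axis of greatest spread
    axis = np.argmax(points.std(axis=0))
    left = points[points[:, axis] <= end[axis]]
    right = points[points[:, axis] > end[axis]]
    grow(end, left, depth + 1, max_depth, branches)
    grow(end, right, depth + 1, max_depth, branches)
    return branches

tree = grow(np.array([0.5, 0.5, 1.0]), seeds)    # "trachea" entry point
print(len(tree), "branches generated")
```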

  10. Automatic Integration Testbeds validation on Open Science Grid

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble, to every extent possible, actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.

  11. Second-generation PLINK: rising to the challenge of larger and richer datasets.

    PubMed

    Chang, Christopher C; Chow, Carson C; Tellier, Laurent Cam; Vattikuti, Shashaank; Purcell, Shaun M; Lee, James J

    2015-01-01

    PLINK 1 is a widely used open-source C/C++ toolset for genome-wide association studies (GWAS) and research in population genetics. However, the steady accumulation of data from imputation and whole-genome sequencing studies has exposed a strong need for faster and scalable implementations of key functions, such as logistic regression, linkage disequilibrium estimation, and genomic distance evaluation. In addition, GWAS and population-genetic data now frequently contain genotype likelihoods, phase information, and/or multiallelic variants, none of which can be represented by PLINK 1's primary data format. To address these issues, we are developing a second-generation codebase for PLINK. The first major release from this codebase, PLINK 1.9, introduces extensive use of bit-level parallelism, O(√n)-time/constant-space Hardy-Weinberg equilibrium and Fisher's exact tests, and many other algorithmic improvements. In combination, these changes accelerate most operations by 1-4 orders of magnitude, and allow the program to handle datasets too large to fit in RAM. We have also developed an extension to the data format which adds low-overhead support for genotype likelihoods, phase, multiallelic variants, and reference vs. alternate alleles, which is the basis of our planned second release (PLINK 2.0). The second-generation versions of PLINK will offer dramatic improvements in performance and compatibility. For the first time, users without access to high-end computing resources can perform several essential analyses of the feature-rich and very large genetic datasets coming into use.
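
    To make one of the computations mentioned above tangible, the sketch below runs a Hardy-Weinberg equilibrium test for a single biallelic variant. This is the simple chi-square version with invented genotype counts, not the exact test that PLINK accelerates.

```python
# Back-of-the-envelope Hardy-Weinberg test for one biallelic SNP (chi-square
# version, not PLINK's exact test). Genotype counts are hypothetical.
from scipy.stats import chi2

# Hypothetical genotype counts: AA, Aa, aa
n_AA, n_Aa, n_aa = 1203, 542, 55
n = n_AA + n_Aa + n_aa
p = (2 * n_AA + n_Aa) / (2 * n)          # allele frequency of A

expected = [n * p * p, 2 * n * p * (1 - p), n * (1 - p) * (1 - p)]
observed = [n_AA, n_Aa, n_aa]
stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
p_value = chi2.sf(stat, df=1)            # 1 degree of freedom for biallelic HWE
print(f"chi-square = {stat:.3f}, p = {p_value:.3g}")
```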

  12. The GMT-Consortium Large Earth Finder (G-CLEF): an optical Echelle spectrograph for the Giant Magellan Telescope (GMT)

    NASA Astrophysics Data System (ADS)

    Szentgyorgyi, Andrew; Baldwin, Daniel; Barnes, Stuart; Bean, Jacob; Ben-Ami, Sagi; Brennan, Patricia; Budynkiewicz, Jamie; Chun, Moo-Young; Conroy, Charlie; Crane, Jeffrey D.; Epps, Harland; Evans, Ian; Evans, Janet; Foster, Jeff; Frebel, Anna; Gauron, Thomas; Guzmán, Dani; Hare, Tyson; Jang, Bi-Ho; Jang, Jeong-Gyun; Jordan, Andres; Kim, Jihun; Kim, Kang-Miin; Mendes de Oliveira, Claudia Mendes; Lopez-Morales, Mercedes; McCracken, Kenneth; McMuldroch, Stuart; Miller, Joseph; Mueller, Mark; Oh, Jae Sok; Onyuksel, Cem; Ordway, Mark; Park, Byeong-Gon; Park, Chan; Park, Sung-Joon; Paxson, Charles; Phillips, David; Plummer, David; Podgorski, William; Seifahrt, Andreas; Stark, Daniel; Steiner, Joao; Uomoto, Alan; Walsworth, Ronald; Yu, Young-Sam

    2016-08-01

    The GMT-Consortium Large Earth Finder (G-CLEF) will be a cross-dispersed, optical band echelle spectrograph to be delivered as the first light scientific instrument for the Giant Magellan Telescope (GMT) in 2022. G-CLEF is vacuum enclosed and fiber-fed to enable precision radial velocity (PRV) measurements, especially for the detection and characterization of low-mass exoplanets orbiting solar-type stars. The passband of G-CLEF is broad, extending from 3500 Å to 9500 Å. This passband provides good sensitivity at blue wavelengths for stellar abundance studies and deep red response for observations of high-redshift phenomena. The design of G-CLEF incorporates several novel technical innovations. We give an overview of the innovative features of the current design. G-CLEF will be the first PRV spectrograph to have a composite optical bench so as to exploit that material's extremely low coefficient of thermal expansion, high in-plane thermal conductivity and high stiffness-to-mass ratio. The spectrograph camera subsystem is divided into a red and a blue channel, split by a dichroic, so there are two independent refractive spectrograph cameras. The control system software is being developed in a model-driven software context that has been adopted globally by the GMT. G-CLEF has been conceived and designed within a strict systems engineering framework. As a part of this process, we have developed an analytical toolset to assess the predicted performance of G-CLEF as it has evolved through design phases.

  13. Development and implementation of an Integrated Water Resources Management System (IWRMS)

    NASA Astrophysics Data System (ADS)

    Flügel, W.-A.; Busch, C.

    2011-04-01

    One of the innovative objectives in the EC project BRAHMATWINN was the development of a stakeholder-oriented Integrated Water Resources Management System (IWRMS). The toolset integrates the findings of the project and presents them in a user-friendly way for decision support in sustainable integrated water resources management (IWRM) in river basins. IWRMS is a framework which integrates different types of basin information and which supports the development of IWRM options for climate change mitigation. It is based on the River Basin Information System (RBIS) data models and delivers a graphical user interface for stakeholders. A special interface was developed for the integration of the enhanced DANUBIA model input and the NetSyMod model with its Mulino decision support system (mulino mDss) component. The web-based IWRMS contains and combines different types of data and methods to provide river basin data and information for decision support. IWRMS is based on a three-tier software framework which uses (i) html/javascript at the client tier, (ii) the PHP programming language to realize the application tier, and (iii) a postgresql/postgis database tier to manage and store all data, except the DANUBIA modelling raw data, which are file based and registered in the database tier. All three tiers can reside on one or different computers and are adapted to the local hardware infrastructure. IWRMS as well as RBIS are based on Open Source Software (OSS) components, and flexible and time-saving access to the database is guaranteed by web-based interfaces for data visualization and retrieval. The IWRMS is accessible via the BRAHMATWINN homepage: http://www.brahmatwinn.uni-jena.de and a user manual for the RBIS is available for download as well.

  14. Enriching the Web Processing Service

    NASA Astrophysics Data System (ADS)

    Wosniok, Christoph; Bensmann, Felix; Wössner, Roman; Kohlus, Jörn; Roosmann, Rainer; Heidmann, Carsten; Lehfeldt, Rainer

    2014-05-01

    The OGC Web Processing Service (WPS) provides a standard for implementing geospatial processes in service-oriented networks. In its current version 1.0.0 it provides the operations GetCapabilities, DescribeProcess and Execute, which can be used to offer custom processes based on single or multiple sub-processes. A large range of ready-to-use, fine-granular, fundamental geospatial processes has been developed by the GIS community in the past. However, modern use cases or whole workflow processes demand specifications for lifecycle management and service orchestration. Orchestrating smaller sub-processes is a step towards interoperability; comprehensive documentation using appropriate metadata is also required. Though different approaches were tested in the past, developing complex WPS applications still requires programming skills, knowledge about the software libraries in use, and a lot of effort for integration. Our toolset RichWPS aims at providing a better overall experience by setting up two major components. The RichWPS ModelBuilder enables the graphics-aided design of workflow processes based on existing local and distributed processes and geospatial services. Once tested by the RichWPS Server, a composition can be deployed for production use on the RichWPS Server. The ModelBuilder obtains the necessary processes and services from a directory service, the RichWPS semantic proxy. It manages the lifecycle and is able to visualize results and debugging information. One aim is to generate reproducible results; the workflow should be documented by metadata that can be integrated into Spatial Data Infrastructures. The RichWPS Server provides a set of interfaces to the ModelBuilder for, among other things, testing composed workflow sequences, estimating their performance, and publishing them as common processes. The server is therefore oriented towards the upcoming WPS 2.0 standard and its ability to transactionally deploy and undeploy processes by making use of a WPS-T interface. In order to deal with the results of these processing workflows, a server-side extension enables the RichWPS Server and its clients to use WPS presentation directives (WPS-PD), a content-related enhancement of the standardized WPS schema. We identified the essential requirements of the components of our toolset by applying two use cases. The first enables the simplified comparison of modeled and measured data, a common task in hydro-engineering to validate the accuracy of a model. An implementation of the workflow includes reading, harmonizing and comparing two datasets in NetCDF format. 2D water-level data from the German Bight can be chosen, presented and evaluated in a web client with interactive plots. The second use case is motivated by the Marine Strategy Directive (MSD) of the EU, which demands monitoring, action plans and, at a minimum, an evaluation of the ecological situation in the marine environment. Information technologies adapted to those of INSPIRE should be used. One of the parameters monitored and evaluated for the MSD is the extent and quality of seagrass fields. With a view towards other evaluation parameters, we decompose the complex process of seagrass evaluation into reusable process steps and implement those packages as configurable WPS processes.
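
    The three WPS 1.0.0 operations named above can be exercised with plain HTTP key-value-pair requests; the sketch below shows the pattern. The endpoint URL, process identifier, and input names are placeholders, not those of a deployed RichWPS workflow.

    import requests

    WPS_URL = "https://example.org/wps"  # hypothetical endpoint

    def get_capabilities() -> str:
        params = {"service": "WPS", "version": "1.0.0", "request": "GetCapabilities"}
        return requests.get(WPS_URL, params=params, timeout=30).text

    def describe_process(identifier: str) -> str:
        params = {"service": "WPS", "version": "1.0.0",
                  "request": "DescribeProcess", "identifier": identifier}
        return requests.get(WPS_URL, params=params, timeout=30).text

    def execute(identifier: str, data_inputs: str) -> str:
        # DataInputs uses the KVP encoding "input1=value1;input2=value2".
        params = {"service": "WPS", "version": "1.0.0", "request": "Execute",
                  "identifier": identifier, "DataInputs": data_inputs}
        return requests.get(WPS_URL, params=params, timeout=30).text

    # Example: run a hypothetical water-level comparison workflow.
    # xml = execute("compare_waterlevels", "modelled=model.nc;measured=gauge.nc")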

  15. Genome-Wide Screens Reveal New Gene Products That Influence Genetic Competence in Streptococcus mutans

    PubMed Central

    O'Brien, Greg; Maricic, Natalie; Kesterson, Alexandria; Grace, Megan

    2017-01-01

    ABSTRACT A network of genes and at least two peptide signaling molecules tightly control when Streptococcus mutans becomes competent to take up DNA from its environment. Widespread changes in the expression of genes occur when S. mutans is presented with competence signal peptides in vitro, including the increased production of the alternative sigma factor, ComX, which activates late competence genes. Still, the ways in which gene products that are regulated by competence peptides influence DNA uptake and cellular physiology are not well understood. Here, we developed and employed comprehensive transposon mutagenesis of the S. mutans genome, with a screen to identify mutants that aberrantly expressed comX, coupled with transposon sequencing (Tn-seq) to gain a more thorough understanding of the factors modulating comX expression and progression to the competent state. The screens effectively identified genes known to affect competence, e.g., comR, comS, comD, comE, cipB, clpX, rcrR, and ciaH, but disclosed an additional 20 genes that were not previously competence associated. The competence phenotypes of mutants were characterized, including by fluorescence microscopy to determine at which stage the mutants were impaired for comX activation. Among the novel genes studied were those implicated in cell division, the sensing of cell envelope stress, cell envelope biogenesis, and RNA stability. Our results provide a platform for determining the specific chemical and physical cues that are required for genetic competence in S. mutans, while highlighting the effectiveness of using Tn-seq in S. mutans to discover and study novel biological processes. IMPORTANCE Streptococcus mutans acquires DNA from its environment by becoming genetically competent, a physiologic state triggered by cell-cell communication using secreted peptides. Competence is important for acquiring novel genetic traits and has a strong influence on the expression of virulence-associated traits of S. mutans. Here, we used transposon mutagenesis and genomic technologies to identify novel genes involved in competence development. In addition to identifying genes previously known to be required for comX expression, 20 additional genes were identified and characterized. The findings create opportunities to diminish the pathogenic potential of S. mutans, while validating technologies that can rapidly advance our understanding of the physiology, biology, and genetics of S. mutans and related pathogens. PMID:29109185

  16. Genome-wide screens reveal new gene products that influence genetic competence in Streptococcus mutans.

    PubMed

    Shields, Robert C; O'Brien, Greg; Maricic, Natalie; Kesterson, Alexandria; Grace, Megan; Hagen, Stephen J; Burne, Robert A

    2017-11-06

    A network of genes and at least two peptide signaling molecules tightly control when Streptococcus mutans becomes competent to take up DNA from its environment. Widespread changes in the expression of genes occur when S. mutans is presented with competence signal peptides in vitro, including increased production of the alternative sigma factor, ComX, which activates late competence genes. Still, the ways in which gene products that are regulated by competence peptides influence DNA uptake and cellular physiology are not well understood. Here, we developed and employed comprehensive transposon mutagenesis of the S. mutans genome with a screen to identify mutants that aberrantly expressed comX, coupled with transposon sequencing (Tn-seq) to gain a more thorough understanding of the factors modulating comX expression and progression to the competent state. The screens effectively identified genes known to affect competence, e.g., comR, comS, comD, comE, cipB, clpX, rcrR, and ciaH, but disclosed an additional 20 genes that were not previously competence-associated. The competence phenotypes of mutants were characterized, including using fluorescence microscopy to determine at which stage the mutants were impaired for comX activation. Among the novel genes studied were those implicated in cell division, sensing of cell envelope stress, cell envelope biogenesis, and RNA stability. Our results provide a platform for determining the specific chemical and physical cues that are required for genetic competence in S. mutans, while highlighting the effectiveness of using Tn-seq in S. mutans to discover and study novel biological processes. IMPORTANCE Streptococcus mutans acquires DNA from its environment by becoming genetically competent, a physiologic state triggered by cell-cell communication using secreted peptides. Competence is important for acquiring novel genetic traits and has a strong influence on the expression of virulence-associated traits of S. mutans. Here, we used transposon mutagenesis and genomic technologies to identify novel genes involved in competence development. In addition to identifying genes previously known to be required for comX expression, 20 additional genes were identified and characterized. The findings create opportunities to diminish the pathogenic potential of S. mutans, while validating technologies that can rapidly advance our understanding of the physiology, biology and genetics of S. mutans and related pathogens. Copyright © 2017 American Society for Microbiology.

  17. Evaluating healthcare information technology outside of academia: observations from the national resource center for healthcare information technology at the Agency for Healthcare Research and Quality.

    PubMed

    Poon, Eric G; Cusack, Caitlin M; McGowan, Julie J

    2009-01-01

    The National Resource Center for Health Information Technology (NRC) was formed in the fall of 2004 as part of the Agency for Healthcare Research and Quality (AHRQ) health IT portfolio to support its grantees. One of the core functions of the NRC was to assist grantees with their evaluations of health IT. This manuscript highlights some common challenges experienced by health IT project teams at nonacademic institutions, including inappropriately scoped and resourced evaluation efforts, inappropriate choice of metrics, inadequate planning for data collection and analysis, and lack of consideration of qualitative methodologies. Many of these challenges can be avoided or overcome. The strategies adopted by various AHRQ grantees and the lessons learned from their projects should become part of the toolset for current and future implementers of health IT as the nation moves rapidly towards its widespread adoption.

  18. pySeismicDQA: open source post experiment data quality assessment and processing

    NASA Astrophysics Data System (ADS)

    Polkowski, Marcin

    2017-04-01

    pySeismicDQA (Seismic Data Quality Assessment) is a Python-based, open-source set of tools dedicated to data processing after passive seismic experiments. The primary goal of this toolset is the unification of data types and formats from different dataloggers, which is necessary for further processing. This process requires additional checks for data errors, equipment malfunction, data format errors, abnormal noise levels, etc. In all such cases the user needs to decide (manually or by an automatic threshold) whether the data are removed from the output dataset. Additionally, the output dataset can be visualized in the form of a website with data-availability charts and waveform visualization linked to an external earthquake catalog. Data processing can be extended with simple STA/LTA event detection. pySeismicDQA is designed and tested for two passive seismic experiments in central Europe: PASSEQ 2006-2008 and "13 BB Star" (2013-2016). National Science Centre Poland provided financial support for this work via NCN grant DEC-2011/02/A/ST10/00284.
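
    The STA/LTA detector mentioned above compares a short-term average of signal energy to a long-term average; an event onset pushes the ratio above a threshold. The sketch below is a simplified centered-window variant with illustrative window lengths and threshold, not the defaults of pySeismicDQA.

    import numpy as np

    def sta_lta(trace: np.ndarray, sta_len: int, lta_len: int) -> np.ndarray:
        """Return the STA/LTA ratio of signal energy for a 1-D seismic trace."""
        energy = np.asarray(trace, dtype=float) ** 2
        sta = np.convolve(energy, np.ones(sta_len) / sta_len, mode="same")
        lta = np.convolve(energy, np.ones(lta_len) / lta_len, mode="same")
        return sta / np.maximum(lta, 1e-12)  # guard against division by zero

    # Example: flag samples where the ratio exceeds 4 (hypothetical threshold).
    # triggers = sta_lta(waveform, sta_len=50, lta_len=1000) > 4.0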

  19. The Cosmic Zoo: The (Near) Inevitability of the Evolution of Complex, Macroscopic Life

    PubMed Central

    Bains, William; Schulze-Makuch, Dirk

    2016-01-01

    Life on Earth provides a unique biological record from single-cell microbes to technologically intelligent life forms. Our evolution is marked by several major steps or innovations along a path of increasing complexity from microbes to space-faring humans. Here we identify various major key innovations, and use an analytical toolset consisting of a set of models to analyse how likely each key innovation is to occur. Our conclusion is that once the origin of life is accomplished, most of the key innovations can occur rather readily. The conclusion for other worlds is that if the origin of life can occur rather easily, we should live in a cosmic zoo, as the innovations necessary to lead to complex life will occur with high probability given sufficient time and habitat. On the other hand, if the origin of life is rare, then we might live in a rather empty universe. PMID:27376334

  20. Framework for Building Collaborative Research Environment

    DOE PAGES

    Devarakonda, Ranjeet; Palanisamy, Giriprakash; San Gil, Inigo

    2014-10-25

    A wide range of expertise and technologies is key to solving some global problems. Semantic web technology can revolutionize the nature of how scientific knowledge is produced and shared. The semantic web is all about enabling machine-to-machine readability instead of routine human-to-human interaction. Carefully structured, machine-readable data is the key to enabling these interactions. Drupal is an example of one such toolset that can render all the functionalities of Semantic Web technology right out of the box. Drupal’s content management system automatically stores the data in a structured format, enabling it to be machine readable. Within this paper, we will discuss how Drupal promotes collaboration in a research setting such as Oak Ridge National Laboratory (ORNL) and the Long Term Ecological Research Center (LTER) and how it is effectively using the Semantic Web in achieving this.

  1. Tunable Room-Temperature Synthesis of Coinage Metal Chalcogenide Nanocrystals from N -Heterocyclic Carbene Synthons

    DOE PAGES

    Lu, Haipeng; Brutchey, Richard L.

    2017-01-23

    Here we present a new toolset of precursors for semiconductor nanocrystal synthesis, N-heterocyclic carbene (NHC)-metal halide complexes, which enables a tunable molecular platform for the preparation of coinage metal chalcogenide quantum dots (QDs). Phase-pure and highly monodisperse coinage metal chalcogenide (Ag2E, Cu2-xE; E = S, Se) QDs are readily synthesized from the direct reaction of an NHC-MBr synthon (where M = Ag, Cu) with alkylsilyl chalcogenide reagents at room temperature. We demonstrate that the size of the resulting QDs is well tailored by the electron-donating ability of the L-type NHC ligands, which are further confirmed to be the only organic capping ligands on the QD surface, imparting excellent colloidal stability. Local superstructures of the NHC-capped Ag2S QDs are observed by TEM, further demonstrating their potential for synthesizing monodisperse ensembles and mediating self-assembly.

  2. Tunable Room-Temperature Synthesis of Coinage Metal Chalcogenide Nanocrystals from N -Heterocyclic Carbene Synthons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Haipeng; Brutchey, Richard L.

    Here we present a new toolset of precursors for semiconductor nanocrystal synthesis, N-heterocyclic carbene (NHC)-metal halide complexes, which enables a tunable molecular platform for the preparation of coinage metal chalcogenide quantum dots (QDs). Phase-pure and highly monodisperse coinage metal chalcogenide (Ag2E, Cu2-xE; E = S, Se) QDs are readily synthesized from the direct reaction of an NHC-MBr synthon (where M = Ag, Cu) with alkylsilyl chalcogenide reagents at room temperature. We demonstrate that the size of the resulting QDs is well tailored by the electron-donating ability of the L-type NHC ligands, which are further confirmed to be the only organic capping ligands on the QD surface, imparting excellent colloidal stability. Local superstructures of the NHC-capped Ag2S QDs are observed by TEM, further demonstrating their potential for synthesizing monodisperse ensembles and mediating self-assembly.

  3. The success of the X-33 depends on its technology—an overview

    NASA Astrophysics Data System (ADS)

    Bunting, Jackie O.; Sasso, Steven E.

    1996-03-01

    The success of the X-33, and therefore the Reusable Launch Vehicle (RLV) program, is highly dependent on the maturity of the components and subsystems selected and the ability to verify their performance, cost, and operability goals. The success of the technology that will be developed to support these components and subsystems will be critical to developing an operationally efficient X-33 that is traceable to a full-scale RLV system. This paper will delineate the key objectives of each technology demonstration area and provide an assessment of its ability to meet the X-33/RLV requirements. It is our intent to focus on these key technology areas to achieve the ambitious but achievable goals of the RLV and X-33 programs. Based on our assessment of the X-33 and RLV systems, we have focused on the performance verification and validation of the linear aerospike engine. This engine, first developed in the mid-1960s, shows promise in achieving the RLV objectives. Equally critical to the engine selection is the development of cryogenic composite tanks and the associated health management system required to meet the operability goals. We are also developing a highly reusable form of thermal protection system based on years of hypersonic research and Space Shuttle experience. To meet the mass fraction goals, reductions in engine component weights will also be pursued. Due to the high degree of operability required, we will investigate the use of real-time integrated system health management and propulsion systems diagnostics, and mature the use of electromechanical actuators for highly reusable systems. The rapid turn-around requirements will require an adaptive guidance, navigation, and control algorithm toolset, which is well underway. We envision our X-33 and RLV to use mature, low-risk technologies that will allow truly low-cost access to space (Lockheed Martin Internal Document, 1995).

  4. An Assessment of Integrated Health Management (IHM) Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N. Lybeck; M. Tawfik; L. Bond

    In order to meet the ever-increasing demand for energy, the United States nuclear industry is turning to life extension of existing nuclear power plants (NPPs). Economically ensuring the safe, secure, and reliable operation of aging nuclear power plants presents many challenges. The 2009 Light Water Reactor Sustainability Workshop identified online monitoring of active and structural components as essential to the better understanding and management of the challenges posed by aging nuclear power plants. Additionally, there is increasing adoption of condition-based maintenance (CBM) for active components in NPPs. These techniques provide a foundation upon which a variety of advanced online surveillance, diagnostic, and prognostic techniques can be deployed to continuously monitor and assess the health of NPP systems and components. The next step in the development of advanced online monitoring is to move beyond CBM to estimating the remaining useful life of active components using prognostic tools. Deployment of prognostic health management (PHM) on the scale of an NPP requires the use of an integrated health management (IHM) framework - a software product (or suite of products) used to manage the elements needed for a complete implementation of online monitoring and prognostics. This paper provides a thoughtful look at the desirable functions and features of IHM architectures. A full PHM system involves several modules, including data acquisition, system modeling, fault detection, fault diagnostics, system prognostics, and advisory generation (operations and maintenance planning). The standards applicable to PHM applications are identified and summarized. A list of evaluation criteria for PHM software products, developed to ensure scalability of the toolset to an environment with the complexity of an NPP, is presented. Fourteen commercially available PHM software products are identified and classified into four groups: research tools, PHM system development tools, deployable architectures, and peripheral tools.

  5. Data hosting infrastructure for primary biodiversity data

    PubMed Central

    2011-01-01

    Background Today, an unprecedented volume of primary biodiversity data are being generated worldwide, yet significant amounts of these data have been and will continue to be lost after the conclusion of the projects tasked with collecting them. To get the most value out of these data it is imperative to seek a solution whereby these data are rescued, archived and made available to the biodiversity community. To this end, the biodiversity informatics community requires investment in processes and infrastructure to mitigate data loss and provide solutions for long-term hosting and sharing of biodiversity data. Discussion We review the current state of biodiversity data hosting and investigate the technological and sociological barriers to proper data management. We further explore the rescue and re-hosting of legacy data and the state of existing toolsets, and propose a future direction for the development of new discovery tools. We also explore the role of data standards and licensing in the context of data hosting and preservation. We provide five recommendations for the biodiversity community that will foster better data preservation and access: (1) encourage the community's use of data standards, (2) promote the public domain licensing of data, (3) establish a community of those involved in data hosting and archival, (4) establish hosting centers for biodiversity data, and (5) develop tools for data discovery. Conclusion The community's adoption of standards and development of tools to enable data discovery is essential to sustainable data preservation. Furthermore, the increased adoption of open content licensing, the establishment of data hosting infrastructure and the creation of a data hosting and archiving community are all necessary steps towards the community ensuring that data archival policies become standardized. PMID:22373257

  6. A game-based crowdsourcing platform for rapidly training middle and high school students to perform biomedical image analysis

    NASA Astrophysics Data System (ADS)

    Feng, Steve; Woo, Min-jae; Kim, Hannah; Kim, Eunso; Ki, Sojung; Shao, Lei; Ozcan, Aydogan

    2016-03-01

    We developed an easy-to-use and widely accessible crowd-sourcing tool for rapidly training humans to perform biomedical image diagnostic tasks, and demonstrated this platform's ability to train middle and high school students in South Korea to diagnose malaria-infected red blood cells (RBCs) using Giemsa-stained thin blood smears imaged under light microscopes. We previously used the same platform (i.e., BioGames) to crowd-source diagnostics of individual RBC images, marking them as malaria positive (infected), negative (uninfected), or questionable (insufficient information for a reliable diagnosis). Using a custom-developed statistical framework, we combined the diagnoses from both expert diagnosticians and the minimally trained human crowd to generate a gold standard library of malaria-infection labels for RBCs. Using this library of labels, we developed a web-based training and educational toolset that provides a quantified score for diagnosticians/users to compare their performance against their peers and view misdiagnosed cells. We have since demonstrated the ability of this platform to quickly train humans without prior training to reach high diagnostic accuracy as compared to expert diagnosticians. Our initial trial group of 55 middle and high school students has collectively played more than 170 hours, each demonstrating significant improvements after only 3 hours of training games, with diagnostic scores that match expert diagnosticians'. Next, through a national-scale educational outreach program in South Korea we recruited >1660 students who demonstrated a similar performance level after 5 hours of training. We plan to further demonstrate this tool's effectiveness for other diagnostic tasks involving image labeling and aim to provide an easily accessible and quickly adaptable framework for online training of new diagnosticians.
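
    The label-aggregation step described above (combining expert and crowd diagnoses into a gold standard) can be illustrated with a simple weighted vote. The paper's statistical framework is more elaborate; the weighting used here, with expert votes counting three times as much as trainee votes, is an assumption for illustration only.

    from collections import Counter

    def aggregate_label(expert_votes: list, crowd_votes: list,
                        expert_weight: float = 3.0) -> str:
        """Weighted-majority label ('positive', 'negative', 'questionable') for one RBC image."""
        tally = Counter()
        for vote in expert_votes:
            tally[vote] += expert_weight
        for vote in crowd_votes:
            tally[vote] += 1.0
        return tally.most_common(1)[0][0]

    # Example: two experts call the cell infected, five trainees disagree -> experts win.
    # aggregate_label(["positive", "positive"], ["negative"] * 5)  # -> "positive"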

  7. Technology and Tool Development to Support Safety and Mission Assurance

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh

    2017-01-01

    The Assurance Case approach is being adopted in a number of safety- and mission-critical application domains in the U.S., e.g., medical devices, defense aviation, automotive systems, and, lately, civil aviation. This paradigm refocuses traditional, process-based approaches to assurance on demonstrating explicitly stated assurance goals, emphasizing the use of structured rationale, and concrete product-based evidence as the means for providing justified confidence that systems and software are fit for purpose in safely achieving mission objectives. NASA has also been embracing assurance cases through the concepts of Risk Informed Safety Cases (RISCs), as documented in the NASA System Safety Handbook, and Objective Hierarchies (OHs) as put forth by the Agency's Office of Safety and Mission Assurance (OSMA). This talk will give an overview of the work being performed by the SGT team located at NASA Ames Research Center, in developing technologies and tools to engineer and apply assurance cases in customer projects pertaining to aviation safety. We elaborate on how our Assurance Case Automation Toolset (AdvoCATE) has not only extended the state of the art in assurance case research, but also demonstrated its practical utility. We have successfully developed safety assurance cases for a number of Unmanned Aircraft Systems (UAS) operations, which underwent, and passed, scrutiny both by the aviation regulator, i.e., the FAA, as well as the applicable NASA boards for airworthiness and flight safety, flight readiness, and mission readiness. We discuss our efforts in expanding AdvoCATE capabilities to support RISCs and OHs under a project recently funded by OSMA under its Software Assurance Research Program. Finally, we speculate on the applicability of our innovations beyond aviation safety to such endeavors as robotic and human spaceflight.

  8. iReport: a generalised Galaxy solution for integrated experimental reporting.

    PubMed

    Hiltemann, Saskia; Hoogstrate, Youri; der Spek, Peter van; Jenster, Guido; Stubbs, Andrew

    2014-01-01

    Galaxy offers a number of visualisation options with components, such as Trackster, Circster and Galaxy Charts, but currently lacks the ability to easily combine outputs from different tools into a single view or report. A number of tools produce HTML reports as output in order to combine the various output files from a single tool; however, this requires programming and knowledge of HTML, and the reports must be custom-made for each new tool. We have developed a generic and flexible reporting tool for Galaxy, iReport, that allows users to create interactive HTML reports directly from the Galaxy UI, with the ability to combine an arbitrary number of outputs from any number of different tools. Content can be organised into different tabs, and interactivity can be added to components. To demonstrate the capability of iReport we provide two publicly available examples, the first of which is an iReport explaining iReports, created for, and using content from, the recent Galaxy Community Conference 2014. The second is a genetic report based on a trio analysis to determine candidate pathogenic variants, which uses our previously developed Galaxy toolset for whole-genome NGS analysis, CGtag. These reports may be adapted for outputs from any sequencing platform and any results, such as omics data, non-high-throughput results and clinical variables. iReport provides a secure, collaborative, and flexible web-based reporting system that is compatible with Galaxy (and non-Galaxy) generated content. We demonstrate its value with a real-life example of reporting genetic trio-analysis.

  9. NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware

    NASA Astrophysics Data System (ADS)

    Johnson, V. L.; Teuben, P. J.; Penprase, B. E.

    An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods which can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antenna Galaxies. The testbed framework is available as open-source to assist other researchers and educators. Recommendations are made for testbed enhancements.
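
    The core of such a testbed is the pairing of a direct particle-particle force calculation with a time integrator. The sketch below shows a kick-drift-kick leapfrog scheme in normalized units (G = 1); the softening length and example orbit are illustrative choices, not NBodyLab defaults.

    import numpy as np

    def accelerations(pos: np.ndarray, mass: np.ndarray, soft: float = 1e-2) -> np.ndarray:
        """Direct-summation gravitational acceleration on every particle."""
        acc = np.zeros_like(pos)
        for i in range(len(mass)):
            dr = pos - pos[i]                      # vectors from particle i to all others
            r2 = np.sum(dr**2, axis=1) + soft**2   # softened squared distances
            r2[i] = np.inf                         # exclude self-interaction
            acc[i] = np.sum((mass / r2**1.5)[:, None] * dr, axis=0)
        return acc

    def leapfrog(pos, vel, mass, dt: float, n_steps: int):
        """Kick-drift-kick leapfrog integration of the equations of motion."""
        acc = accelerations(pos, mass)
        for _ in range(n_steps):
            vel += 0.5 * dt * acc
            pos += dt * vel
            acc = accelerations(pos, mass)
            vel += 0.5 * dt * acc
        return pos, vel

    # Example: two equal-mass bodies on a rough mutual orbit.
    # pos = np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]])
    # vel = np.array([[0.0, 0.35, 0.0], [0.0, -0.35, 0.0]])
    # pos, vel = leapfrog(pos, vel, np.array([1.0, 1.0]), dt=0.01, n_steps=1000)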

  10. Digital hologram transformations for RGB color holographic display with independent image magnification and translation in 3D.

    PubMed

    Makowski, Piotr L; Zaperty, Weronika; Kozacki, Tomasz

    2018-01-01

    A new framework for in-plane transformations of digital holograms (DHs) is proposed, which provides improved control over basic geometrical features of holographic images reconstructed optically in full color. The method is based on a Fourier hologram equivalent of the adaptive affine transformation technique [Opt. Express 18, 8806 (2010)]. The solution includes four elementary geometrical transformations that can be performed independently on a full-color 3D image reconstructed from an RGB hologram: (i) transverse magnification; (ii) axial translation with minimized distortion; (iii) transverse translation; and (iv) viewing angle rotation. The independent character of transformations (i) and (ii) constitutes the main result of the work and plays a double role: (1) it simplifies synchronization of color components of the RGB image in the presence of mismatch between capture and display parameters; (2) provides improved control over position and size of the projected image, particularly the axial position, which opens new possibilities for efficient animation of holographic content. The approximate character of the operations (i) and (ii) is examined both analytically and experimentally using an RGB circular holographic display system. Additionally, a complex animation built from a single wide-aperture RGB Fourier hologram is presented to demonstrate full capabilities of the developed toolset.

  11. Bio-TDS: bioscience query tool discovery system.

    PubMed

    Gnimpieba, Etienne Z; VanDiermen, Menno S; Gustafson, Shayla M; Conn, Bill; Lushbough, Carol M

    2017-01-04

    Bioinformatics and computational biology play a critical role in bioscience and biomedical research. As researchers design their experimental projects, one major challenge is to find the most relevant bioinformatics toolkits that will lead to new knowledge discovery from their data. The Bio-TDS (Bioscience Query Tool Discovery Systems, http://biotds.org/) has been developed to assist researchers in retrieving the most applicable analytic tools by allowing them to formulate their questions as free text. The Bio-TDS is a flexible retrieval system that affords users from multiple bioscience domains (e.g. genomic, proteomic, bio-imaging) the ability to query over 12,000 analytic tool descriptions integrated from well-established, community repositories. One of the primary components of the Bio-TDS is the ontology and natural language processing workflow for annotation, curation, query processing, and evaluation. The Bio-TDS's scientific impact was evaluated using sample questions posed by researchers retrieved from Biostars, a site focusing on biological data analysis. The Bio-TDS was compared to five similar bioscience analytic tool retrieval systems, with the Bio-TDS outperforming the others in terms of relevance and completeness. The Bio-TDS offers researchers the capacity to associate their bioscience question with the most relevant computational toolsets required for the data analysis in their knowledge discovery process. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  12. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools, including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
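
    The over-plot and difference-plot comparison described above amounts to interpolating each tool's output onto a common time grid and inspecting the residual. The sketch below does this for a single state variable (altitude); the variable choice and labels are placeholders.

    import numpy as np
    import matplotlib.pyplot as plt

    def compare_trajectories(t_a, alt_a, t_b, alt_b):
        """Overlay two altitude histories and plot their difference; return the worst-case gap."""
        t = np.linspace(max(t_a[0], t_b[0]), min(t_a[-1], t_b[-1]), 500)
        a = np.interp(t, t_a, alt_a)
        b = np.interp(t, t_b, alt_b)

        fig, (ax0, ax1) = plt.subplots(2, 1, sharex=True)
        ax0.plot(t, a, label="tool A")
        ax0.plot(t, b, "--", label="tool B")
        ax0.set_ylabel("altitude [m]")
        ax0.legend()
        ax1.plot(t, a - b)
        ax1.set_ylabel("difference [m]")
        ax1.set_xlabel("time [s]")
        plt.show()

        return np.max(np.abs(a - b))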

  13. Against the integrative turn in bioethics: burdens of understanding.

    PubMed

    Savić, Lovro; Ivanković, Viktor

    2018-06-01

    The advocates of Integrative Bioethics have insisted that this recently emerging project aspires to become a new stage of bioethical development, surpassing both biomedically oriented bioethics and global bioethics. We claim in this paper that if the project wants to successfully replace the two existing paradigms, it at least needs to properly address and surmount the lack of common moral vocabulary problem. This problem points to a semantic incommensurability due to cross-language communication in moral terms. This paper proceeds as follows. In the first part, we provide an overview of Integrative Bioethics and its conceptual building blocks: multidisciplinarity, interdisciplinarity, and transdisciplinarity. In the second part, we disclose the problem of semantic incommensurability. The third part gives an overview of various positions on the understanding of interdisciplinarity and integration in interdisciplinary communication, and corresponding attempts at solving the lack of common moral vocabulary problem. Here we lean mostly on Holbrook's three theses regarding the character of interdisciplinary communication. Finally, in the fourth part, we discuss a particular bioethical case, that of euthanasia, to demonstrate the challenge semantic incommensurability poses to dialogues in Integrative Bioethics. We conclude that Integrative Bioethics does not offer a methodological toolset that would warrant optimism in its advocates' predictions of surpassing current modes of doing bioethics. Since Integrative Bioethics leaves controversial methodological questions unresolved on almost all counts and shows no attempts at overcoming the critical stumbling points, we argue for its rejection.

  14. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document comprise a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended, but the analysis can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.
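
    The grain-statistics workflow can be sketched with open-source tools in place of the FoveaPro/Photoshop toolset named above: threshold the grayscale image, label connected components, and measure per-grain properties. The threshold method, size filter, and file name below are illustrative assumptions.

    import numpy as np
    from skimage import io, filters, measure, morphology

    def grain_statistics(path: str):
        """Segment grains by global thresholding and return per-grain area and eccentricity."""
        image = io.imread(path, as_gray=True)
        thresh = filters.threshold_otsu(image)                  # global Otsu threshold
        grains = morphology.remove_small_objects(image > thresh, min_size=64)
        labels = measure.label(grains)                          # connected-component labelling
        props = measure.regionprops(labels)
        areas = np.array([p.area for p in props])
        eccentricities = np.array([p.eccentricity for p in props])
        return areas, eccentricities

    # Example (hypothetical file):
    # areas, ecc = grain_statistics("thin_section.png")
    # print(len(areas), "grains, mean area", areas.mean())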

  15. LHCbDIRAC as Apache Mesos microservices

    NASA Astrophysics Data System (ADS)

    Haen, Christophe; Couturier, Benjamin

    2017-10-01

    The LHCb experiment relies on LHCbDIRAC, an extension of DIRAC, to drive its offline computing. This middleware provides a development framework and a complete set of components for building distributed computing systems. These components are currently installed and run on virtual machines (VM) or bare metal hardware. Due to the increased workload, high availability is becoming more and more important for the LHCbDIRAC services, and the current installation model is showing its limitations. Apache Mesos is a cluster manager which aims at abstracting heterogeneous physical resources on which various tasks can be distributed thanks to so-called “frameworks”. The Marathon framework is suitable for long running tasks such as the DIRAC services, while the Chronos framework meets the needs of cron-like tasks like the DIRAC agents. A combination of the service discovery tool Consul together with HAProxy makes it possible to expose the running containers to the outside world while hiding their dynamic placement. Such an architecture brings greater flexibility in the deployment of LHCbDIRAC services, allowing for easier deployment, maintenance, and on-demand scaling of services (e.g., LHCbDIRAC relies on 138 services and 116 agents). Higher reliability is also easier to achieve, as clustering is part of the toolset, which allows constraints on the location of the services. This paper describes the investigations carried out to package the LHCbDIRAC and DIRAC components into Docker containers and orchestrate them using the previously described set of tools.

  16. A genome-wide study of two-component signal transduction systems in eight newly sequenced mutans streptococci strains

    PubMed Central

    2012-01-01

    Background Mutans streptococci are a group of gram-positive bacteria including the primary cariogenic dental pathogen Streptococcus mutans and closely related species. Two-component systems (TCSs) composed of a signal-sensing histidine kinase (HK) and a response regulator (RR) play key roles in pathogenicity, but have not been comparatively studied for these oral bacterial pathogens. Results HKs and RRs of 8 newly sequenced mutans streptococci strains, including S. sobrinus DSM20742, S. ratti DSM20564 and six S. mutans strains, were identified and compared to the TCSs of S. mutans UA159 and NN2025, two previously genome-sequenced S. mutans strains. Ortholog analysis revealed 18 TCS clusters (HK-RR pairs), 2 orphan HKs and 2 orphan RRs, of which 8 TCS clusters were common to all 10 strains, 6 were absent in one or more strains, and the other 4 were exclusive to individual strains. Further classification of the predicted HKs and RRs revealed interesting aspects of their putative functions. While TCS complements were comparable within the six S. mutans strains, S. sobrinus DSM20742 lacked TCSs possibly involved in acid tolerance and fructan catabolism, and S. ratti DSM20564 possessed 3 unique TCSs but lacked the quorum-sensing-related TCS (ComDE). Selected computational predictions were verified by PCR experiments. Conclusions Differences in the TCS repertoires of mutans streptococci strains, especially those of S. sobrinus and S. ratti in comparison to S. mutans, imply differences in their response mechanisms for survival in the dynamic oral environment. This genomic-level study of TCSs should help in understanding the pathogenicity of these mutans streptococci strains. PMID:22475007

  17. Inactivation of glutamate racemase (MurI) eliminates virulence in Streptococcus mutans.

    PubMed

    Zhang, Jianying; Liu, Jia; Ling, Junqi; Tong, Zhongchun; Fu, Yun; Liang, Min

    2016-01-01

    Inhibition of enzymes required for bacterial cell wall synthesis is often lethal or leads to virulence defects. Glutamate racemase (MurI), an essential enzyme in peptidoglycan biosynthesis, has been an attractive target for therapeutic interventions. Streptococcus mutans, one of the many etiological factors of dental caries, possesses a series of virulence factors associated with cariogenicity. However, little is known regarding the mechanism by which MurI influences pathogenesis of S. mutans. In this work, a stable mutant of S. mutans deficient in glutamate racemase (S. mutans FW1718) was constructed to investigate the impact of murI inactivation on cariogenic virulence in S. mutans UA159. Microscopy revealed that the murI mutant exhibited an enlarged cell size, longer cell chains, diminished cell-cell aggregation, and altered cell surface ultrastructure compared with the wild-type. Characterization of this mutant revealed that murI deficiency weakened the acidogenicity, aciduricity, and biofilm formation ability of S. mutans (P<0.05). Real-time quantitative polymerase chain reaction (qRT-PCR) analysis demonstrated that the deletion of murI reduced the expression of the acidogenesis-related gene ldh by 44-fold (P<0.0001). The expression levels of the gene coding for surface protein antigen P (spaP) and the acid-tolerance-related gene (atpD) were down-regulated by 99% (P<0.0001). Expression of comE, comD, gtfB and gtfC, genes related to biofilm formation, was down-regulated 8-, 43-, 85- and 298-fold in the murI mutant compared with the wild-type (P<0.0001), respectively. Taken together, the current study provides the first evidence that MurI deficiency adversely affects S. mutans virulence properties, making MurI a potential target for controlling dental caries. Copyright © 2016 Elsevier GmbH. All rights reserved.

  18. Stress-triggered signaling affecting survival or suicide of Streptococcus pneumoniae.

    PubMed

    Cortes, Paulo R; Piñas, Germán E; Cian, Melina B; Yandar, Nubia; Echenique, Jose

    2015-01-01

    Streptococcus pneumoniae is a major human pathogen that can survive stress conditions, such as the acidic environment of inflammatory foci, and tolerates lethal pH through a mechanism known as the acid tolerance response (ATR). We previously described that S. pneumoniae activates acidic-stress-induced lysis (ASIL) in response to acidified environments, favoring the release of cell wall compounds, DNA and virulence factors. Here, we demonstrate that F0F1-ATPase is involved in the response to acidic stress. Chemical inhibitors (DCCD, optochin) of this proton pump repressed ATR induction, but caused an increased ASIL. Confirming these findings, mutants of subunit c of this enzyme showed the same phenotypes as the inhibitors. Importantly, we demonstrated that F0F1-ATPase and the ATR are necessary for the intracellular survival of the pneumococcus in macrophages. Alternatively, a screening of two-component system (TCS) mutants showed that the ATR and survival in pneumocytes were controlled in contrasting ways by ComDE and CiaRH, which had been involved in the ASIL mechanism. Briefly, CiaRH was essential for the ATR (ComE represses activation) whereas ComE was necessary for ASIL (CiaRH protects against induction). They did not regulate F0F1-ATPase expression, but controlled LytA expression on the pneumococcal surface. These results suggest that both TCSs and F0F1-ATPase control a stress response and decide between a survival or a suicide mechanism by independent pathways, either in vitro or in pneumocyte cultures. This biological model contributes to the current knowledge about bacterial responses under stress conditions in host tissues, where pathogens need to survive in order to establish infections. Copyright © 2014 Elsevier GmbH. All rights reserved.

  19. Genetic and Physiological Effects of Noncoherent Visible Light Combined with Hydrogen Peroxide on Streptococcus mutans in Biofilm

    PubMed Central

    Steinberg, Doron; Moreinos, Daniel; Featherstone, John; Shemesh, Moshe; Feuerstein, Osnat

    2008-01-01

    Oral biofilms are associated with the most common infections of the oral cavity. Bacteria embedded in the biofilms are less sensitive to antibacterial agents than planktonic bacteria are. Recently, an antibacterial synergic effect of noncoherent blue light and hydrogen peroxide (H2O2) on planktonic Streptococcus mutans was demonstrated. In this study, we tested the effect of a combination of light and H2O2 on the vitality and gene expression of S. mutans embedded in biofilm. Biofilms of S. mutans were exposed to visible light (wavelengths, 400 to 500 nm) for 30 or 60 s (equivalent to 34 or 68 J/cm2) in the presence of 3 to 300 mM H2O2. The antibacterial effect was assessed by microbial counts of each treated sample compared with that of the control. The effect of light combined with H2O2 on the different layers of the biofilm was evaluated by confocal laser scanning microscopy. Gene expression was determined by real-time reverse transcription-PCR. Our results show that noncoherent light, in combination with H2O2, has a synergistic antibacterial effect through all of the layers of the biofilm. Furthermore, this treatment was more effective against bacteria in biofilm than against planktonic bacteria. The combined light and H2O2 treatment up-regulated the expression of several genes such as gtfB, brp, smu630, and comDE but did not affect relA and ftf. The ability of noncoherent visible light in combination with H2O2 to affect bacteria in deep layers of the biofilm suggests that this treatment may be applied in biofilm-related diseases as a minimally invasive antibacterial procedure. PMID:18316516

  20. Genetic and physiological effects of noncoherent visible light combined with hydrogen peroxide on Streptococcus mutans in biofilm.

    PubMed

    Steinberg, Doron; Moreinos, Daniel; Featherstone, John; Shemesh, Moshe; Feuerstein, Osnat

    2008-07-01

    Oral biofilms are associated with the most common infections of the oral cavity. Bacteria embedded in the biofilms are less sensitive to antibacterial agents than planktonic bacteria are. Recently, an antibacterial synergic effect of noncoherent blue light and hydrogen peroxide (H(2)O(2)) on planktonic Streptococcus mutans was demonstrated. In this study, we tested the effect of a combination of light and H(2)O(2) on the vitality and gene expression of S. mutans embedded in biofilm. Biofilms of S. mutans were exposed to visible light (wavelengths, 400 to 500 nm) for 30 or 60 s (equivalent to 34 or 68 J/cm(2)) in the presence of 3 to 300 mM H(2)O(2). The antibacterial effect was assessed by microbial counts of each treated sample compared with that of the control. The effect of light combined with H(2)O(2) on the different layers of the biofilm was evaluated by confocal laser scanning microscopy. Gene expression was determined by real-time reverse transcription-PCR. Our results show that noncoherent light, in combination with H(2)O(2), has a synergistic antibacterial effect through all of the layers of the biofilm. Furthermore, this treatment was more effective against bacteria in biofilm than against planktonic bacteria. The combined light and H(2)O(2) treatment up-regulated the expression of several genes such as gtfB, brp, smu630, and comDE but did not affect relA and ftf. The ability of noncoherent visible light in combination with H(2)O(2) to affect bacteria in deep layers of the biofilm suggests that this treatment may be applied in biofilm-related diseases as a minimally invasive antibacterial procedure.

  1. A Support Database System for Integrated System Health Management (ISHM)

    NASA Technical Reports Server (NTRS)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between system elements to provide the logical context for the database. The historical data archive provides a common repository for sensor data that can be shared between developers and applications. The firmware codebase is used by the developer to organize the intelligent element firmware into atomic units which can be assembled into complete firmware for specific elements.
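
    The three database components named above (system hierarchy model, historical data archive, firmware codebase) map naturally onto three related tables. The sketch below is an illustration only: the table and column names are assumptions, and SQLite stands in for whatever database engine the ISHM toolset actually uses.

    import sqlite3

    SCHEMA = """
    CREATE TABLE element (               -- system hierarchy model
        id        INTEGER PRIMARY KEY,
        name      TEXT NOT NULL,
        parent_id INTEGER REFERENCES element(id)
    );
    CREATE TABLE sensor_reading (        -- historical data archive
        element_id INTEGER REFERENCES element(id),
        timestamp  TEXT NOT NULL,
        channel    TEXT NOT NULL,
        value      REAL NOT NULL
    );
    CREATE TABLE firmware_unit (         -- firmware codebase
        id         INTEGER PRIMARY KEY,
        element_id INTEGER REFERENCES element(id),
        version    TEXT NOT NULL,
        image      BLOB NOT NULL
    );
    """

    def init_db(path: str = ":memory:") -> sqlite3.Connection:
        conn = sqlite3.connect(path)
        conn.executescript(SCHEMA)
        return conn

    # Example: register a test stand at the root of the hierarchy.
    # conn = init_db()
    # conn.execute("INSERT INTO element (name) VALUES ('methane_thruster_testbed')")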

  2. Handheld computers for survey and trial data collection in resource-poor settings: development and evaluation of PDACT, a Palm Pilot interviewing system.

    PubMed

    Seebregts, Christopher J; Zwarenstein, Merrick; Mathews, Catherine; Fairall, Lara; Flisher, Alan J; Seebregts, Clive; Mukoma, Wanjiru; Klepp, Knut-Inge

    2009-11-01

    Handheld computers (personal digital assistants, PDAs) have the potential to reduce the logistic burden, cost, and error rate of paper-based health research data collection, but there is a lack of appropriate software. The present work describes the development and evaluation of PDACT, a Personal Data Collection Toolset (www.healthware.org/pdact/index.htm) for the Palm Pilot handheld computer for interviewer-administered and respondent-administered data collection. We developed the Personal Data Collection Toolset (PDACT) software to enable questionnaires developed in QDS Design Studio, a Windows application, to be compiled and completed on Palm Pilot devices, and evaluated it in several representative field survey settings. The software has been used in seven separate studies and in over 90,000 interviews. Five interviewer-administered studies were completed in rural settings with poor communications infrastructure, following one day of interviewer training. Two respondent-administered questionnaire studies were completed by learners, in urban secondary schools, after 15 min of training. Questionnaires were available on each handheld in up to 11 languages, ranged from 20 to 580 questions, and took between 15 and 90 min to complete. Up to 200 Palm Pilot devices were in use on a single day and, in about 50 device-years of use, very few technical problems were found. Compared with paper-based collection, data validation and cleaning times were reduced, and fewer errors were found. PDA data collection is easy to use and preferred by interviewers and respondents (both respondent-administered and interviewer-administered) over paper. Data are compiled and available within hours of collection, facilitating data quality assurance. Although hardware increases the setup cost of the first study, the cumulative cost falls thereafter and converges on the cumulative cost of paper-based studies (after four studies, in the case of our interviewer-administered work). Handheld data collection is an appropriate, affordable and convenient technology for health data collection in diverse settings.

  3. Vehicle trajectory linearisation to enable efficient optimisation of the constant speed racing line

    NASA Astrophysics Data System (ADS)

    Timings, Julian P.; Cole, David J.

    2012-06-01

    A driver model is presented that is capable of optimising the trajectory of a simple dynamic nonlinear vehicle, at constant forward speed, so that progression along a predefined track is maximised as a function of time. In doing so, the model is able to continually operate a vehicle at its lateral-handling limit, maximising vehicle performance. The technique used forms part of the solution to the motor racing objective of minimising lap time. A new approach to formulating the minimum lap time problem is motivated by the need for a more computationally efficient and robust tool-set for understanding on-the-limit driving behaviour. This has been achieved through set-point-dependent linearisation of the vehicle model and coupling of the vehicle-track system using an intrinsic coordinate description. Through this, the geometric vehicle trajectory has been linearised relative to the track reference, leading to a new path optimisation algorithm which can be formulated as a computationally efficient convex quadratic programming problem.
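
    A minimal sketch of the kind of convex quadratic program described above is given below: after linearisation, the lateral offset from a reference line at discrete stations along the track is the decision variable, the objective is quadratic, and the track edges give linear bounds. The curvature model, weights, and track dimensions are illustrative assumptions, not the paper's formulation.

    import numpy as np
    import cvxpy as cp

    N, ds = 200, 5.0                           # stations along the track, spacing [m]
    track_half_width = 4.0                     # [m], hypothetical
    kappa_ref = 0.02 * np.sin(np.linspace(0, 4 * np.pi, N))   # reference-line curvature [1/m]

    offset = cp.Variable(N)                    # lateral offset from the reference line [m]

    # The second difference of the offset approximates the curvature change it induces.
    d2 = (offset[2:] - 2 * offset[1:-1] + offset[:-2]) / ds**2

    # Trade off path curvature (higher cornering speed) against deviation effort;
    # both terms are quadratic, so the problem is a convex QP.
    objective = cp.Minimize(cp.sum_squares(d2 + kappa_ref[1:-1])
                            + 1e-4 * cp.sum_squares(offset))
    constraints = [cp.abs(offset) <= track_half_width]

    cp.Problem(objective, constraints).solve()
    print("max offset used [m]:", float(np.max(np.abs(offset.value))))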

  4. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to a linear equation y = f(x) and will perform an ANOVA to check its significance.
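
    Two of the spreadsheet calculations above translate directly into a few lines of code: the value corresponding to a cumulative probability for a given normal distribution, and the mean and standard deviation of a normal passing through two (value, cumulative probability) points. The sketch below uses SciPy; the numbers in the example are made up.

    from scipy.stats import norm

    def value_at_probability(mean: float, sd: float, p: float) -> float:
        """Normal Distribution Estimates: the x such that P(X <= x) = p."""
        return norm.ppf(p, loc=mean, scale=sd)

    def normal_from_two_points(x1: float, p1: float, x2: float, p2: float):
        """Normal Distribution from two Data Points: solve x = mean + sd * z(p) at both points."""
        z1, z2 = norm.ppf(p1), norm.ppf(p2)
        sd = (x2 - x1) / (z2 - z1)
        mean = x1 - sd * z1
        return mean, sd

    # Example: 10% cumulative probability at 8.0 and 90% at 12.0
    # imply roughly mean 10.0 and standard deviation 1.56.
    # print(normal_from_two_points(8.0, 0.10, 12.0, 0.90))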

  5. A mass spectrometry primer for mass spectrometry imaging

    PubMed Central

    Rubakhin, Stanislav S.; Sweedler, Jonathan V.

    2011-01-01

    Mass spectrometry imaging (MSI), a rapidly growing subfield of chemical imaging, employs mass spectrometry (MS) technologies to create single- and multi-dimensional localization maps for a variety of atoms and molecules. Complementary to other imaging approaches, MSI provides high chemical specificity and broad analyte coverage. This powerful analytical toolset is capable of measuring the distribution of many classes of inorganics, metabolites, proteins and pharmaceuticals in chemically and structurally complex biological specimens in vivo, in vitro, and in situ. The MSI approaches highlighted in this Methods in Molecular Biology volume provide flexibility of detection, characterization, and identification of multiple known and unknown analytes. The goal of this chapter is to introduce investigators who may be unfamiliar with MS to the basic principles of the mass spectrometric approaches as used in MSI. In addition to guidelines for choosing the most suitable MSI method for specific investigations, cross-references are provided to the chapters in this volume that describe the appropriate experimental protocols. PMID:20680583

  6. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time‐to‐Event Analysis

    PubMed Central

    Gong, Xiajing; Hu, Meng

    2018-01-01

    Abstract Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time‐to‐event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high‐dimensional data featured by a large number of predictor variables. Our results showed that ML‐based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high‐dimensional data. The prediction performances of ML‐based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML‐based methods provide a powerful tool for time‐to‐event analysis, with a built‐in capacity for high‐dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. PMID:29536640
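
    A minimal sketch of the kind of baseline comparison described: simulate time-to-event data with linear predictors in the hazard, fit a Cox proportional hazards model, and score it with the concordance index. It assumes the lifelines, numpy, and pandas packages; an ML alternative (for example, a random survival forest) would be fitted to the same data and judged with the same metric.

```python
# Simulate censored survival data and fit a Cox baseline model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)
hazard = np.exp(0.8 * x1 - 0.5 * x2)        # linear predictors in the hazard function
event_time = rng.exponential(1.0 / hazard)  # simulated event times
censor_time = rng.exponential(2.0, size=n)  # independent censoring times
df = pd.DataFrame({
    "x1": x1,
    "x2": x2,
    "T": np.minimum(event_time, censor_time),      # observed time
    "E": (event_time <= censor_time).astype(int),  # event indicator
})

cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print("Cox concordance index:", round(cph.concordance_index_, 3))
# An ML-based survival model would be trained on the same simulated data and
# compared using the same concordance index.
```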

  7. Remote visual analysis of large turbulence databases at multiple scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pulido, Jesus; Livescu, Daniel; Kanov, Kalin

    The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.
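
    To illustrate the wavelet-decomposition idea in miniature (this is a toy sketch, not the database's actual API), the snippet below decomposes a 1-D signal with PyWavelets and reconstructs a coarse approximation from the approximation coefficients alone, which is the essence of multi-resolution compression. The signal is synthetic.

```python
# Multi-level wavelet decomposition and coarse reconstruction with PyWavelets.
import numpy as np
import pywt

x = np.linspace(0, 4 * np.pi, 1024)
signal = np.sin(x) + 0.3 * np.random.default_rng(1).normal(size=x.size)  # stand-in velocity trace

coeffs = pywt.wavedec(signal, "db4", level=4)                   # multi-level decomposition
coarse = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]   # keep approximation only
approx = pywt.waverec(coarse, "db4")[: signal.size]             # low-resolution reconstruction

kept = coeffs[0].size
print(f"kept {kept} of {signal.size} coefficients "
      f"({100 * kept / signal.size:.1f}% of the original size)")
```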

  8. Semantic retrieval and navigation in clinical document collections.

    PubMed

    Kreuzthaler, Markus; Daumke, Philipp; Schulz, Stefan

    2015-01-01

    Patients with chronic diseases undergo numerous in- and outpatient treatment periods, and therefore many documents accumulate in their electronic records. We report on an on-going project focussing on the semantic enrichment of medical texts, in order to support recall-oriented navigation across a patient's complete documentation. A document pool of 1,696 de-identified discharge summaries was used for prototyping. A natural language processing toolset for document annotation (based on the text-mining framework UIMA) and indexing (Solr) was used to support a browser-based platform for document import, search and navigation. The integrated search engine combines free text and concept-based querying, supported by dynamically generated facets (diagnoses, procedures, medications, lab values, and body parts). The prototype demonstrates the feasibility of semantic document enrichment within document collections of a single patient. Originally conceived as an add-on for the clinical workplace, this technology could also be adapted to support personalised health record platforms, as well as cross-patient search for cohort building and other secondary use scenarios.
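
    As a hedged illustration of the combined free-text and faceted querying described above (the host, core, and field names below are placeholders, not the project's actual configuration), a faceted Solr select request might look like the following:

```python
# Faceted full-text query against a Solr core via its HTTP select handler.
import requests

SOLR_SELECT = "http://localhost:8983/solr/discharge_summaries/select"  # placeholder core

params = {
    "q": "text:anticoagulation",      # free-text part of the query
    "rows": 10,
    "wt": "json",
    "facet": "true",
    "facet.field": ["diagnosis", "medication", "procedure"],  # dynamically generated facets
}

resp = requests.get(SOLR_SELECT, params=params, timeout=10)
resp.raise_for_status()
result = resp.json()
print("hits:", result["response"]["numFound"])
print("diagnosis facets:", result["facet_counts"]["facet_fields"]["diagnosis"][:10])
```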

  9. A case study of the Secure Anonymous Information Linkage (SAIL) Gateway: a privacy-protecting remote access system for health-related research and evaluation.

    PubMed

    Jones, Kerina H; Ford, David V; Jones, Chris; Dsilva, Rohan; Thompson, Simon; Brooks, Caroline J; Heaven, Martin L; Thayer, Daniel S; McNerney, Cynthia L; Lyons, Ronan A

    2014-08-01

    With the current expansion of data linkage research, the challenge is to find the balance between preserving the privacy of person-level data whilst making these data accessible for use to their full potential. We describe a privacy-protecting safe haven and secure remote access system, referred to as the Secure Anonymised Information Linkage (SAIL) Gateway. The Gateway provides data users with a familiar Windows interface and their usual toolsets to access approved anonymously-linked datasets for research and evaluation. We outline the principles and operating model of the Gateway, the features provided to users within the secure environment, and how we are approaching the challenges of making data safely accessible to increasing numbers of research users. The Gateway represents a powerful analytical environment and has been designed to be scalable and adaptable to meet the needs of the rapidly growing data linkage community. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Google Earth and Geo Applications: A Toolset for Viewing Earth's Geospatial Information

    NASA Astrophysics Data System (ADS)

    Tuxen-Bettman, K.

    2016-12-01

    Earth scientists measure and derive fundamental data that can be of broad general interest to the public and policy makers. Yet, one of the challenges that has always faced the Earth science community is how to present their data and findings in an easy-to-use and compelling manner. Google's Geo Tools offer an efficient and dynamic way for scientists, educators, journalists and others to both access data and view or tell stories in a dynamic three-dimensional geospatial context. Google Earth in particular provides a dense canvas of satellite imagery on which can be viewed rich vector and raster datasets using the medium of Keyhole Markup Language (KML). Through KML, Google Earth can combine the analytical capabilities of Earth Engine, collaborative mapping of My Maps, and storytelling of Tour Builder and more to make Google's Geo Applications a coherent suite of tools for exploring our planet. See: https://earth.google.com/, https://earthengine.google.com/, https://mymaps.google.com/, https://tourbuilder.withgoogle.com/, https://www.google.com/streetview/
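
    Since KML is the interchange format mentioned above, here is a minimal, illustrative example of writing a single-placemark KML document from Python; the coordinates, names, and output path are arbitrary placeholders.

```python
# Write a minimal KML placemark that Google Earth can open directly.
placemark_kml = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Example station</name>
    <description>A single observation site rendered on the Google Earth globe.</description>
    <Point>
      <coordinates>-122.084,37.422,0</coordinates>
    </Point>
  </Placemark>
</kml>
"""

with open("example_station.kml", "w", encoding="utf-8") as f:
    f.write(placemark_kml)
```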

  11. A case study of the Secure Anonymous Information Linkage (SAIL) Gateway: A privacy-protecting remote access system for health-related research and evaluation☆

    PubMed Central

    Jones, Kerina H.; Ford, David V.; Jones, Chris; Dsilva, Rohan; Thompson, Simon; Brooks, Caroline J.; Heaven, Martin L.; Thayer, Daniel S.; McNerney, Cynthia L.; Lyons, Ronan A.

    2014-01-01

    With the current expansion of data linkage research, the challenge is to find the balance between preserving the privacy of person-level data whilst making these data accessible for use to their full potential. We describe a privacy-protecting safe haven and secure remote access system, referred to as the Secure Anonymised Information Linkage (SAIL) Gateway. The Gateway provides data users with a familiar Windows interface and their usual toolsets to access approved anonymously-linked datasets for research and evaluation. We outline the principles and operating model of the Gateway, the features provided to users within the secure environment, and how we are approaching the challenges of making data safely accessible to increasing numbers of research users. The Gateway represents a powerful analytical environment and has been designed to be scalable and adaptable to meet the needs of the rapidly growing data linkage community. PMID:24440148

  12. Statistical Hypothesis Testing using CNN Features for Synthesis of Adversarial Counterexamples to Human and Object Detection Vision Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raj, Sunny; Jha, Sumit Kumar; Pullum, Laura L.

    Validating the correctness of human detection vision systems is crucial for safety applications such as pedestrian collision avoidance in autonomous vehicles. The enormous space of possible inputs to such an intelligent system makes it difficult to design test cases for such systems. In this report, we present our tool MAYA that uses an error model derived from a convolutional neural network (CNN) to explore the space of images similar to a given input image, and then tests the correctness of a given human or object detection system on such perturbed images. We demonstrate the capability of our tool on the pre-trained Histogram-of-Oriented-Gradients (HOG) human detection algorithm implemented in the popular OpenCV toolset and the Caffe object detection system pre-trained on the ImageNet benchmark. Our tool may serve as a testing resource for the designers of intelligent human and object detection systems.

  13. Closed Loop Interactions between Spiking Neural Network and Robotic Simulators Based on MUSIC and ROS.

    PubMed

    Weidel, Philipp; Djurfeldt, Mikael; Duarte, Renato C; Morrison, Abigail

    2016-01-01

    In order to properly assess the function and computational properties of simulated neural systems, it is necessary to account for the nature of the stimuli that drive the system. However, providing stimuli that are rich and yet both reproducible and amenable to experimental manipulations is technically challenging, and even more so if a closed-loop scenario is required. In this work, we present a novel approach to solve this problem, connecting robotics and neural network simulators. We implement a middleware solution that bridges the Robotic Operating System (ROS) to the Multi-Simulator Coordinator (MUSIC). This enables any robotic and neural simulators that implement the corresponding interfaces to be efficiently coupled, allowing real-time performance for a wide range of configurations. This work extends the toolset available for researchers in both neurorobotics and computational neuroscience, and creates the opportunity to perform closed-loop experiments of arbitrary complexity to address questions in multiple areas, including embodiment, agency, and reinforcement learning.

  14. Closed Loop Interactions between Spiking Neural Network and Robotic Simulators Based on MUSIC and ROS

    PubMed Central

    Weidel, Philipp; Djurfeldt, Mikael; Duarte, Renato C.; Morrison, Abigail

    2016-01-01

    In order to properly assess the function and computational properties of simulated neural systems, it is necessary to account for the nature of the stimuli that drive the system. However, providing stimuli that are rich and yet both reproducible and amenable to experimental manipulations is technically challenging, and even more so if a closed-loop scenario is required. In this work, we present a novel approach to solve this problem, connecting robotics and neural network simulators. We implement a middleware solution that bridges the Robotic Operating System (ROS) to the Multi-Simulator Coordinator (MUSIC). This enables any robotic and neural simulators that implement the corresponding interfaces to be efficiently coupled, allowing real-time performance for a wide range of configurations. This work extends the toolset available for researchers in both neurorobotics and computational neuroscience, and creates the opportunity to perform closed-loop experiments of arbitrary complexity to address questions in multiple areas, including embodiment, agency, and reinforcement learning. PMID:27536234
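
    As a ROS-side-only sketch (this is not the ROS-MUSIC bridge itself, and the topic names and message types are illustrative), a minimal rospy node in such a closed loop might publish actuator commands while subscribing to a sensor stream:

```python
# Minimal rospy node: publish a command topic, echo a sensor topic.
import rospy
from std_msgs.msg import Float64


def on_sensor(msg):
    # Log incoming sensor readings from the (simulated) robot.
    rospy.loginfo("sensor reading: %f", msg.data)


def main():
    rospy.init_node("toy_closed_loop_node")
    cmd_pub = rospy.Publisher("/robot/wheel_cmd", Float64, queue_size=10)
    rospy.Subscriber("/robot/distance_sensor", Float64, on_sensor)

    rate = rospy.Rate(10)  # 10 Hz control loop
    while not rospy.is_shutdown():
        cmd_pub.publish(Float64(data=0.5))  # constant forward command as a stand-in
        rate.sleep()


if __name__ == "__main__":
    main()
```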

  15. The COMET Sleep Research Platform.

    PubMed

    Nichols, Deborah A; DeSalvo, Steven; Miller, Richard A; Jónsson, Darrell; Griffin, Kara S; Hyde, Pamela R; Walsh, James K; Kushida, Clete A

    2014-01-01

    The Comparative Outcomes Management with Electronic Data Technology (COMET) platform is extensible and designed for facilitating multicenter electronic clinical research. Our research goals were the following: (1) to conduct a comparative effectiveness trial (CET) for two obstructive sleep apnea treatments-positive airway pressure versus oral appliance therapy; and (2) to establish a new electronic network infrastructure that would support this study and other clinical research studies. The COMET platform was created to satisfy the needs of CET with a focus on creating a platform that provides comprehensive toolsets, multisite collaboration, and end-to-end data management. The platform also provides medical researchers the ability to visualize and interpret data using business intelligence (BI) tools. COMET is a research platform that is scalable and extensible, and which, in a future version, can accommodate big data sets and enable efficient and effective research across multiple studies and medical specialties. The COMET platform components were designed for an eventual move to a cloud computing infrastructure that enhances sustainability, overall cost effectiveness, and return on investment.

  16. High Precision Thermal, Structural and Optical Analysis of an External Occulter Using a Common Model and the General Purpose Multi-Physics Analysis Tool Cielo

    NASA Technical Reports Server (NTRS)

    Hoff, Claus; Cady, Eric; Chainyk, Mike; Kissil, Andrew; Levine, Marie; Moore, Greg

    2011-01-01

    The efficient simulation of multidisciplinary thermo-opto-mechanical effects in precision deployable systems has for years been limited by numerical toolsets that do not necessarily share the same finite element basis, level of mesh discretization, data formats, or compute platforms. Cielo, a general purpose integrated modeling tool funded by the Jet Propulsion Laboratory and the Exoplanet Exploration Program, addresses shortcomings in the current state of the art via features that enable the use of a single, common model for thermal, structural and optical aberration analysis, producing results of greater accuracy, without the need for results interpolation or mapping. This paper will highlight some of these advances, and will demonstrate them within the context of detailed external occulter analyses, focusing on in-plane deformations of the petal edges for both steady-state and transient conditions, with subsequent optical performance metrics including intensity distributions at the pupil and image plane.

  17. Remote visual analysis of large turbulence databases at multiple scales

    DOE PAGES

    Pulido, Jesus; Livescu, Daniel; Kanov, Kalin; ...

    2018-06-15

    The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.

  18. The COMET Sleep Research Platform

    PubMed Central

    Nichols, Deborah A.; DeSalvo, Steven; Miller, Richard A.; Jónsson, Darrell; Griffin, Kara S.; Hyde, Pamela R.; Walsh, James K.; Kushida, Clete A.

    2014-01-01

    Introduction: The Comparative Outcomes Management with Electronic Data Technology (COMET) platform is extensible and designed for facilitating multicenter electronic clinical research. Background: Our research goals were the following: (1) to conduct a comparative effectiveness trial (CET) for two obstructive sleep apnea treatments—positive airway pressure versus oral appliance therapy; and (2) to establish a new electronic network infrastructure that would support this study and other clinical research studies. Discussion: The COMET platform was created to satisfy the needs of CET with a focus on creating a platform that provides comprehensive toolsets, multisite collaboration, and end-to-end data management. The platform also provides medical researchers the ability to visualize and interpret data using business intelligence (BI) tools. Conclusion: COMET is a research platform that is scalable and extensible, and which, in a future version, can accommodate big data sets and enable efficient and effective research across multiple studies and medical specialties. The COMET platform components were designed for an eventual move to a cloud computing infrastructure that enhances sustainability, overall cost effectiveness, and return on investment. PMID:25848590

  19. AstroNet: A Tool Set for Simultaneous, Multi-Site Observations of Astronomical Objects

    NASA Technical Reports Server (NTRS)

    Chakrabarti, Supriya

    1995-01-01

    Earth-based, fully automatic "robotic" telescopes have been in routine operation for a number of years. As their number grows and their distribution becomes global, increasing attention is being given to forming networks of various sorts that will allow them, as a group, to make observations 24 hours a day in both hemispheres. We have suggested that telescopes based in space be part of this network. We further suggested that any telescope on this network be capable of asking, almost in real time, that other robotic telescopes perform support observations for them. When a target of opportunity required support observations, the system would determine which telescope(s) in the network would be most appropriate to make the observations and formulate a request to do so. Because the network would be composed of telescopes located in widely distributed regions, this system would guarantee continuity of observations. This report summarizes our efforts under this contract. We proposed to develop a set of data collection and display tools to aid simultaneous observation of astronomical targets from a number of observing sites. We planned to demonstrate the usefulness of this toolset for simultaneous multi-site observation of astronomical targets. Possible candidates for the proposed demonstration included the Extreme Ultraviolet Explorer (EUVE), the International Ultraviolet Explorer (IUE), ALEXIS, and sounding rocket experiments. Ground-based observatories operated by the University of California, Berkeley, the Jet Propulsion Laboratory, and Fairborn Observatory in Mesa, Arizona were to be used to demonstrate the proposed concept. Although the demonstration was to have involved astronomical investigations, the tools were to have been applicable to a large number of scientific disciplines. The software tools and systems developed as a result of the work were to have been made available to the scientific community.

  20. System analysis of graphics processor architecture using virtual prototyping

    NASA Astrophysics Data System (ADS)

    Hancock, William R.; Groat, Jeff; Steeves, Todd; Spaanenburg, Henk; Shackleton, John

    1995-06-01

    Honeywell has been actively involved in the definition of the next generation of display processors for military and commercial cockpits. A major concern is how to achieve super graphics workstation performance in avionics applications. Most notable are requirements for low volume, low power, harsh environmental conditions, real-time performance and low cost. This paper describes the application of VHDL to the system analysis tasks associated with achieving these goals in a cost-effective manner. The paper will describe the top-level architecture identified to provide the graphical and video processing power needed to drive future high resolution display devices and to generate more natural panoramic 3D formats. The major discussion, however, will be on the use of VHDL to model the processing elements and customized pipelines needed to realize the architecture, and on the complex system trade-off studies necessary to achieve a cost-effective implementation. New software tools have been developed to allow 'virtual' prototyping in the VHDL environment. This results in hardware/software codesign using VHDL performance and functional models. This unique architectural tool allows simulation and trade-offs within a standard and tightly integrated toolset, which eventually will be used to specify and design the entire system from the top-level requirements and system performance down to the individual ASICs. New processing elements, algorithms, and standard graphical inputs can be designed, tested and evaluated without costly hardware prototyping, using the innovative 'virtual' prototyping techniques evolving on this project. In addition, virtual prototyping of the display processor does not bind the preliminary design to point solutions as a physical prototype would. When the development schedule is known, one can extrapolate processing-element performance and design the system around the most current technology.

  1. Enhancing army analysis capability for warfighter protection: TRADOC-RDECOM M&S decision support environment collaboration

    NASA Astrophysics Data System (ADS)

    Athmer, Keith; Gaughan, Chris; McDonnell, Joseph S.; Leach, Robert; Davis, Bert; Truong, Kiet; Borum, Howard; Leslie, Richard; Ma, Lein

    2012-05-01

    The development of an Integrated Base Defense (IBD) is a significant challenge for the Army with many analytical gaps. The IBD problem space is complex, with evolving requirements and a large stakeholder base. In order to evaluate and analyze IBD decisions, the Training & Doctrine Command (TRADOC) Maneuver Support Center of Excellence (MSCoE) led and continues to lead a series of IBD focused experiments and wargames. Modeling and Simulation (M&S) significantly contributes to this effort. To improve IBD M&S capabilities, a collaborative demonstration with the Research, Development and Engineering Command's (RDECOM's) M&S Decision Support Environment (MSDSE) was held in September 2011. The results of this demonstration provided key input to MSCoE IBD related concepts and technologies. Moreover, it established an initial M&S toolset that will significantly improve force protection in combat zones and Army installations worldwide by providing leaders a capability to conduct analysis of defense and mission rehearsals. The demonstration was executed with a "human in the loop" Battle Captain, who was aided by mission command assets such as Base Expeditionary Targeting and Surveillance Sensors-Combined (BETSS-C). The Common Operating Picture was populated and stimulated using Science & Technology (S&T) M&S, allowing for a realistic representation of physical phenomena without the need for real systems. Novel methods were used for simulation orchestration, and for initializing the simulations and Opposing Force (OPFOR) activities. Ultimately, this demonstration showed that the MSDSE is suitable to support TRADOC IBD analyses and that S&T M&S is ready to be used in a demanding simulation environment. This paper will highlight the event's outcomes and lessons identified.

  2. "On the job" learning: A bioinformatics course incorporating undergraduates in actual research projects and manuscript submissions.

    PubMed

    Smith, Jason T; Harris, Justine C; Lopez, Oscar J; Valverde, Laura; Borchert, Glen M

    2015-01-01

    The sequencing of whole genomes and the analysis of genetic information continues to fundamentally change biological and medical research. Unfortunately, the people best suited to interpret this data (biologically trained researchers) are commonly discouraged by their own perceived computational limitations. To address this, we developed a course to help alleviate this constraint. Remarkably, in addition to equipping our undergraduates with an informatic toolset, we found our course design helped prepare our students for collaborative research careers in unexpected ways. Instead of simply offering a traditional lecture- or laboratory-based course, we chose a guided inquiry method, where an instructor-selected research question is examined by students in a collaborative analysis with students contributing to experimental design, data collection, and manuscript reporting. While students learn the skills needed to conduct bioinformatic research throughout all sections of the course, importantly, students also gain experience in working as a team and develop important communication skills through working with their partner and the class as a whole, and by contributing to an original research article. Remarkably, in its first three semesters, this novel computational genetics course has generated 45 undergraduate authorships across three peer-reviewed articles. More importantly, the students that took this course acquired a positive research experience, newfound informatics technical proficiency, unprecedented familiarity with manuscript preparation, and an earned sense of achievement. Although this course deals with analyses of genetic systems, we suggest the basic concept of integrating actual research projects into a 16-week undergraduate course could be applied to numerous other research-active academic fields. © 2015 The International Union of Biochemistry and Molecular Biology.

  3. A role for the endocannabinoid 2-arachidonoyl-sn-glycerol for social and high-fat food reward in male mice.

    PubMed

    Wei, Don; Lee, DaYeon; Li, Dandan; Daglian, Jennifer; Jung, Kwang-Mook; Piomelli, Daniele

    2016-05-01

    The endocannabinoid system is an important modulator of brain reward signaling. Investigations have focused on cannabinoid (CB1) receptors, because dissection of specific contributions of individual endocannabinoids has been limited by the available toolset. While we recently described an important role for the endocannabinoid anandamide in the regulation of social reward, it remains to be determined whether the other major endocannabinoid, 2-arachidonoyl-sn-glycerol (2-AG), serves a similar or different function. To study the role of 2-AG in natural reward, we used a transgenic mouse model (MGL-Tg mice) in which forebrain 2-AG levels are selectively reduced. We complemented behavioral analysis with measurements of brain 2-AG levels. We tested male MGL-Tg mice in conditioned place preference (CPP) tasks for high-fat food, social contact, and cocaine. We measured 2-AG content in the brain regions of interest by liquid chromatography/mass spectrometry. Male MGL-Tg mice are impaired in developing CPP for high-fat food and social interaction, but do develop CPP for cocaine. Furthermore, compared to isolated mice, levels of 2-AG in socially stimulated wild-type mice are higher in the nucleus accumbens and ventral hippocampus (183 and 140 % of controls, respectively), but unchanged in the medial prefrontal cortex. The results suggest that reducing 2-AG-mediated endocannabinoid signaling impairs social and high-fat food reward in male mice, and that social stimulation mobilizes 2-AG in key brain regions implicated in the control of motivated behavior. The time course of this response differentiates 2-AG from anandamide, whose role in mediating social reward was previously documented.

  4. Dynamic Blue Light-Inducible T7 RNA Polymerases (Opto-T7RNAPs) for Precise Spatiotemporal Gene Expression Control.

    PubMed

    Baumschlager, Armin; Aoki, Stephanie K; Khammash, Mustafa

    2017-11-17

    Light has emerged as a control input for biological systems due to its precise spatiotemporal resolution. The limited toolset for light control in bacteria motivated us to develop a light-inducible transcription system that is independent from cellular regulation through the use of an orthogonal RNA polymerase. Here, we present our engineered blue light-responsive T7 RNA polymerases (Opto-T7RNAPs) that show properties such as low leakiness of gene expression in the dark state, high expression strength when induced with blue light, and an inducible range of more than 300-fold. Following optimization of the system to reduce expression variability, we created a variant that returns to the inactive dark state within minutes once the blue light is turned off. This allows for precise dynamic control of gene expression, which is a key aspect for most applications using optogenetic regulation. The regulators, which only require blue light from ordinary light-emitting diodes for induction, were developed and tested in the bacterium Escherichia coli, which is a crucial cell factory for biotechnology due to its fast and inexpensive cultivation and well understood physiology and genetics. Opto-T7RNAP, with minor alterations, should be extendable to other bacterial species as well as eukaryotes such as mammalian cells and yeast in which the T7 RNA polymerase and the light-inducible Vivid regulator have been shown to be functional. We anticipate that our approach will expand the applicability of using light as an inducer for gene expression independent from cellular regulation and allow for a more reliable dynamic control of synthetic and natural gene networks.

  5. REDLetr: Workflow and tools to support the migration of legacy clinical data capture systems to REDCap.

    PubMed

    Dunn, William D; Cobb, Jake; Levey, Allan I; Gutman, David A

    2016-09-01

    A memory clinic at an academic medical center has relied on several ad hoc data capture systems including Microsoft Access and Excel for cognitive assessments over the last several years. However these solutions are challenging to maintain and limit the potential of hypothesis-driven or longitudinal research. REDCap, a secure web application based on PHP and MySQL, is a practical solution for improving data capture and organization. Here, we present a workflow and toolset to facilitate legacy data migration and real-time clinical research data collection into REDCap as well as challenges encountered. Legacy data consisted of neuropsychological tests stored in over 4000 Excel workbooks. Functions for data extraction, norm scoring, converting to REDCap-compatible formats, accessing the REDCap API, and clinical report generation were developed and executed in Python. Over 400 unique data points for each workbook were migrated and integrated into our REDCap database. Moving forward, our REDCap-based system replaces the Excel-based data collection method as well as eases the integration into the standard clinical research workflow and Electronic Health Record. In the age of growing data, efficient organization and storage of clinical and research data is critical for advancing research and providing efficient patient care. We believe that the workflow and tools described in this work to promote legacy data integration as well as real time data collection into REDCap ultimately facilitate these goals. Published by Elsevier Ireland Ltd.
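
    For illustration, a single migrated record might be pushed to a REDCap project from Python along the following lines. This is a hedged sketch: the endpoint URL, API token, and field names are placeholders rather than the clinic's actual configuration, although the request parameters (token, content, format, type, data) follow the standard REDCap record-import API.

```python
# Push one record to a REDCap project through its HTTP API.
import json
import requests

REDCAP_URL = "https://redcap.example.edu/api/"   # placeholder endpoint
API_TOKEN = "REPLACE_WITH_PROJECT_TOKEN"         # placeholder project token

record = [{
    "record_id": "1001",
    "moca_total": "26",          # example norm-scored neuropsychological value
    "visit_date": "2016-03-14",
}]

payload = {
    "token": API_TOKEN,
    "content": "record",
    "format": "json",
    "type": "flat",
    "data": json.dumps(record),
}

resp = requests.post(REDCAP_URL, data=payload, timeout=30)
resp.raise_for_status()
print("REDCap import response:", resp.json())   # typically {'count': 1}
```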

  6. REDLetr: Workflow and tools to support the migration of legacy clinical data capture systems to REDCap

    PubMed Central

    Dunn, William D; Cobb, Jake; Levey, Allan I; Gutman, David A

    2017-01-01

    Objective A memory clinic at an academic medical center has relied on several ad hoc data capture systems including Microsoft Access and Excel for cognitive assessments over the last several years. However these solutions are challenging to maintain and limit the potential of hypothesis-driven or longitudinal research. REDCap, a secure web application based on PHP and MySQL, is a practical solution for improving data capture and organization. Here, we present a workflow and toolset to facilitate legacy data migration and real-time clinical research data collection into REDCap as well as challenges encountered. Materials and Methods Legacy data consisted of neuropsychological tests stored in over 4,000 Excel workbooks. Functions for data extraction, norm scoring, converting to REDCap-compatible formats, accessing the REDCap API, and clinical report generation were developed and executed in Python. Results Over 400 unique data points for each workbook were migrated and integrated into our REDCap database. Moving forward, our REDCap-based system replaces the Excel-based data collection method as well as eases the integration into the Electronic Health Record. Conclusion In the age of growing data, efficient organization and storage of clinical and research data is critical for advancing research and providing efficient patient care. We believe that the tools and workflow described in this work to promote legacy data integration as well as real time data collection into REDCap ultimately facilitate these goals. PMID:27396629

  7. Dynamic PROOF clusters with PoD: architecture and user experience

    NASA Astrophysics Data System (ADS)

    Manafov, Anar

    2011-12-01

    PROOF on Demand (PoD) is a tool-set that sets up a PROOF cluster on any resource management system. PoD is a user-oriented product with an easy-to-use GUI and a command-line interface. It is fully automated. No administrative privileges or special knowledge is required to use it. PoD utilizes a plug-in system to use different job submission front-ends. The current PoD distribution is shipped with LSF, Torque (PBS), Grid Engine, Condor, gLite, and SSH plug-ins. The product continues to be extended; we therefore plan to implement a plug-in for the AliEn Grid as well. Recently developed algorithms made it possible to efficiently maintain two types of connections: packet-forwarding and native PROOF connections. This helps to properly handle most kinds of workers, with and without firewalls. PoD maintains the PROOF environment automatically and, for example, prevents resource misuse when workers idle for too long. As PoD matures as a product and provides more plug-ins, it is increasingly used as a standard for setting up dynamic PROOF clusters in many different institutions. The GSI Analysis Facility (GSIAF) has been in production since 2007. The static PROOF cluster was phased out at the end of 2009. GSIAF is now completely based on PoD. Users create private dynamic PROOF clusters on the general-purpose batch farm. This allows easier resource sharing between interactive, local batch, and Grid usage. The main user communities are FAIR and ALICE.

  8. Information technology as a key enabler in preparing for competition: ComEd's Kincaid Generating Station, a work in progress

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borth, F.C. III; Thompson, J.W.; Mishaga, J.M.

    1996-11-01

    Through ComEd Fossil (Generating) Division's Competitive Action Plan (CAP) evaluation, changes have been identified which are necessary to improve generating station performance. These changes are intended to improve both station reliability and financial margins, and are essential for stations to be successful in a competitive marketplace. Plant upgrades, advanced equipment stewardship, and personnel reductions have been identified as necessary steps in achieving industry leadership and competitive advantage. To deal effectively with plant systems and contend in the competitive marketplace, Information Technology (IT) solutions to business problems are being developed. Data acquisition, storage, and retrieval are being automated through use of state-of-the-art Data Historians. Total plant, high resolution, long term process information will be accessed through Local/Wide Area Network (LAN/WAN) connections from desktop PCs. Generating unit Thermal Performance Monitors accessing the Data Historian will analyze plant and system performance, enabling reductions in operating costs and improvements in process control. As inputs to proactive maintenance toolsets, this data allows anticipation of equipment service needs, advanced service scheduling, and cost/benefit analysis. The ultimate goal is to optimize repair needs with revenue generation. Advanced applications building upon these foundations will bring knowledge of the costs associated with all the products a generating station offers its customer(s). An overall design philosophy along with preliminary results is presented; these results include shortfalls, lessons learned, and future options.

  9. Integrated Management and Visualization of Electronic Tag Data with Tagbase

    PubMed Central

    Lam, Chi Hin; Tsontos, Vardis M.

    2011-01-01

    Electronic tags have been used widely for more than a decade in studies of diverse marine species. However, despite significant investment in tagging programs and hardware, data management aspects have received insufficient attention, leaving researchers without a comprehensive toolset to manage their data easily. The growing volume of these data holdings, the large diversity of tag types and data formats, and the general lack of data management resources are not only complicating integration and synthesis of electronic tagging data in support of resource management applications but potentially threatening the integrity and longer-term access to these valuable datasets. To address this critical gap, Tagbase has been developed as a well-rounded, yet accessible data management solution for electronic tagging applications. It is based on a unified relational model that accommodates a suite of manufacturer tag data formats in addition to deployment metadata and reprocessed geopositions. Tagbase includes an integrated set of tools for importing tag datasets into the system effortlessly, and provides reporting utilities to interactively view standard outputs in graphical and tabular form. Data from the system can also be easily exported or dynamically coupled to GIS and other analysis packages. Tagbase is scalable and has been ported to a range of database management systems to support the needs of the tagging community, from individual investigators to large scale tagging programs. Tagbase represents a mature initiative with users at several institutions involved in marine electronic tagging research. PMID:21750734

  10. An Adaptive Web-Based Learning Environment for the Application of Remote Sensing in Schools

    NASA Astrophysics Data System (ADS)

    Wolf, N.; Fuchsgruber, V.; Riembauer, G.; Siegmund, A.

    2016-06-01

    Satellite images have great educational potential for teaching on environmental issues and can promote the motivation of young people to enter careers in natural science and technology. Due to the importance and ubiquity of remote sensing in science, industry and the public, the use of satellite imagery has been included in many school curricula in Germany. However, its implementation into school practice is still hesitant, mainly due to a lack of teachers' know-how and of educational materials that align with the curricula. In the project "Space4Geography" a web-based learning platform is being developed with the aim to facilitate the application of satellite imagery in secondary school teaching and to foster effective student learning experiences in geography and other related subjects in an interdisciplinary way. The platform features ten learning modules demonstrating the exemplary application of original high spatial resolution remote sensing data (RapidEye and TerraSAR-X) to examine current environmental issues such as droughts, deforestation and urban sprawl. In this way, students will be introduced to the versatile applications of spaceborne earth observation and geospatial technologies. The integrated web-based remote sensing software "BLIF" equips the students with a toolset to explore, process and analyze the satellite images, thereby fostering the competence of students to work on geographical and environmental questions without requiring prior knowledge of remote sensing. This contribution presents the educational concept of the learning environment and its realization, using the example of the learning module "Deforestation of the rainforest in Brazil".

  11. A supportive architecture for CFD-based design optimisation

    NASA Astrophysics Data System (ADS)

    Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong

    2014-03-01

    Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of the computational fluid dynamics (CFD) technique has caused a rise in its applications in various fields. Especially for the exterior designs of vehicles, CFD has become one of the three main design tools, comparable to analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually involves high dimensionality and multiple objectives and constraints. It is desirable to have an integrated architecture for CFD-based design optimisation. However, our review of existing works has found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing techniques and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or the data level to fully utilise the capabilities of different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided, and the results show that the proposed architecture and developed algorithms performed successfully and efficiently in dealing with a design optimisation involving over 200 design variables.
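
    As a conceptual sketch of such an architecture's outer loop (not the paper's implementation), the snippet below drives a population-based search with parallel objective evaluations; `run_cfd_case` is a hypothetical placeholder where a real CFD toolset would be invoked, and the population size, dimensionality, and mutation scale are arbitrary.

```python
# Outer optimisation loop with parallel evaluation of an expensive objective.
import numpy as np
from multiprocessing import Pool

def run_cfd_case(design):
    """Placeholder objective: a real system would mesh, solve, and post-process here."""
    design = np.asarray(design)
    return float(np.sum((design - 0.3) ** 2))   # pretend "drag" to be minimised

def evolve(pop, scores, rng, sigma=0.05):
    """Keep the better half of the population and mutate it to refill."""
    order = np.argsort(scores)
    elite = pop[order[: len(pop) // 2]]
    children = elite + rng.normal(scale=sigma, size=elite.shape)
    return np.vstack([elite, children])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pop = rng.uniform(0.0, 1.0, size=(16, 8))    # 16 candidate designs, 8 variables
    with Pool(processes=4) as pool:              # parallel "CFD" evaluations
        for generation in range(10):
            scores = pool.map(run_cfd_case, list(pop))
            pop = evolve(pop, np.array(scores), rng)
    print("best objective in last generation:", min(scores))
```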

  12. Crosstalk between the serine/threonine kinase StkP and the response regulator ComE controls the stress response and intracellular survival of Streptococcus pneumoniae.

    PubMed

    Piñas, Germán E; Reinoso-Vizcaino, Nicolás M; Yandar Barahona, Nubia Y; Cortes, Paulo R; Duran, Rosario; Badapanda, Chandan; Rathore, Ankita; Bichara, Dario R; Cian, Melina B; Olivero, Nadia B; Perez, Daniel R; Echenique, José

    2018-06-08

    Streptococcus pneumoniae is an opportunistic human bacterial pathogen that usually colonizes the upper respiratory tract, but the invasion and survival mechanism in respiratory epithelial cells remains elusive. Previously, we described that acidic stress-induced lysis (ASIL) and intracellular survival are controlled by ComE through a yet unknown activation mechanism under acidic conditions, which is independent of the ComD histidine kinase that activates this response regulator for competence development at pH 7.8. Here, we demonstrate that the serine/threonine kinase StkP is essential for ASIL, and show that StkP phosphorylates ComE at Thr128. Molecular dynamic simulations predicted that Thr128-phosphorylation induces conformational changes on ComE's DNA-binding domain. Using nonphosphorylatable (ComET128A) and phosphomimetic (ComET128E) proteins, we confirmed that Thr128-phosphorylation increased the DNA-binding affinity of ComE. The non-phosphorylated form of ComE interacted more strongly with StkP than the phosphomimetic form at acidic pH, suggesting that pH facilitated crosstalk. To identify the ComE-regulated genes under acidic conditions, a comparative transcriptomic analysis was performed between the comET128A and wt strains, and differential expression of 104 genes involved in different cellular processes was detected, suggesting that the StkP/ComE pathway induced global changes in response to acidic stress. In the comET128A mutant, the repression of spxB and sodA correlated with decreased H2O2 production, whereas the reduced expression of murN correlated with an increased resistance to cell wall antibiotic-induced lysis, compatible with cell wall alterations. In the comET128A mutant, ASIL was blocked and acid tolerance response was higher compared to the wt strain. These phenotypes, accompanied with low H2O2 production, are likely responsible for the increased survival in pneumocytes of the comET128A mutant. We propose that the StkP/ComE pathway controls the stress response, thus affecting the intracellular survival of S. pneumoniae in pneumocytes, one of the first barriers that this pathogen must cross to establish an infection.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heroux, Michael; Lethin, Richard

    Programming models and environments play the essential roles in high performance computing of enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability since our codes have lifespans measured in decades, so the advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

  14. Applied Space Systems Engineering. Chapter 17; Manage Technical Data

    NASA Technical Reports Server (NTRS)

    Kent, Peter

    2008-01-01

    Effective space systems engineering (SSE) is conducted in a fully electronic manner. Competitive hardware, software, and system designs are created in a totally digital environment that enables rapid product design and manufacturing cycles, as well as a multitude of techniques such as modeling, simulation, and lean manufacturing that significantly reduce the lifecycle cost of systems. Because the SSE lifecycle depends on the digital environment, managing the enormous volumes of technical data needed to describe, build, deploy, and operate systems is a critical factor in the success of a project. This chapter presents the key aspects of Technical Data Management (TDM) within the SSE process. It is written from the perspective of the System Engineer tasked with establishing the TDM process and infrastructure for a major project. Additional perspectives are reflected from the point of view of the engineers on the project who work within the digital engineering environment established by the TDM toolset and infrastructure, and from the point of view of the contractors who interface via the TDM infrastructure. Table 17.1 lists the TDM process as it relates to SSE.

  15. Tools for Designing and Analyzing Structures

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Structural Design and Analysis Toolset is a collection of approximately 26 Microsoft Excel spreadsheet programs, each of which performs calculations within a different subdiscipline of structural design and analysis. These programs present input and output data in user-friendly, menu-driven formats. Although these programs cannot solve complex cases like those treated by larger finite element codes, they do yield quick solutions to numerous common problems more rapidly than the finite element codes, thereby making it possible to quickly perform multiple preliminary analyses - e.g., to establish approximate limits prior to detailed analyses by the larger finite element codes. These programs perform different types of calculations, as follows: 1. determination of geometric properties for a variety of standard structural components; 2. analysis of static, vibrational, and thermal-gradient loads and deflections in certain structures (mostly beams and, in the case of thermal gradients, mirrors); 3. kinetic energies of fans; 4. detailed analysis of stress and buckling in beams, plates, columns, and a variety of shell structures; and 5. temperature-dependent properties of materials, including figures of merit that characterize strength, stiffness, and deformation response to thermal gradients.
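
    As a small sketch of the kind of closed-form beam check such spreadsheets automate (standard textbook formulas for a simply supported beam under a uniform load; the numerical inputs are made up):

```python
# Midspan deflection and maximum bending stress of a simply supported beam.
def simply_supported_uniform(w, L, E, I, c):
    """w: load per unit length [N/m], L: span [m], E: modulus [Pa],
    I: second moment of area [m^4], c: distance to extreme fibre [m]."""
    delta_max = 5.0 * w * L**4 / (384.0 * E * I)   # midspan deflection
    M_max = w * L**2 / 8.0                          # midspan bending moment
    sigma_max = M_max * c / I                       # extreme-fibre bending stress
    return delta_max, sigma_max

delta, sigma = simply_supported_uniform(w=2000.0, L=3.0, E=200e9, I=8.0e-6, c=0.05)
print(f"deflection = {delta * 1000:.2f} mm, bending stress = {sigma / 1e6:.1f} MPa")
```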

  16. Additions and improvements to the high energy density physics capabilities in the FLASH code

    NASA Astrophysics Data System (ADS)

    Lamb, D. Q.; Flocke, N.; Graziani, C.; Tzeferacos, P.; Weide, K.

    2016-10-01

    FLASH is an open source, finite-volume Eulerian, spatially adaptive radiation magnetohydrodynamics code that has the capabilities to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities have been added to FLASH to make it an open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. In particular, we showcase the ability of FLASH to simulate the Faraday Rotation Measure produced by the presence of magnetic fields; and proton radiography, proton self-emission, and Thomson scattering diagnostics with and without the presence of magnetic fields. We also describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at the University of Chicago by the DOE NNSA ASC through the Argonne Institute for Computing in Science under field work proposal 57789; and the NSF under Grant PHY-0903997.
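
    As a hedged illustration of one of the diagnostics mentioned (the Faraday rotation measure), the snippet below evaluates RM = 0.812 * integral(n_e B_parallel dl) in rad m^-2, with n_e in cm^-3, B_parallel in microgauss, and path length in parsecs; the density and field profiles are synthetic stand-ins, not FLASH output.

```python
# Faraday rotation measure along a synthetic line of sight.
import numpy as np

l_pc = np.linspace(0.0, 10.0, 500)                 # path length [pc]
n_e = 0.1 * np.exp(-l_pc / 5.0)                    # electron density [cm^-3]
B_par = 3.0 * np.ones_like(l_pc)                   # line-of-sight field [microgauss]

rm = 0.812 * np.trapz(n_e * B_par, l_pc)           # rotation measure [rad/m^2]
print(f"RM = {rm:.3f} rad/m^2")
```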

  17. Coarse-grained simulations of protein-protein association: an energy landscape perspective.

    PubMed

    Ravikumar, Krishnakumar M; Huang, Wei; Yang, Sichun

    2012-08-22

    Understanding protein-protein association is crucial in revealing the molecular basis of many biological processes. Here, we describe a theoretical simulation pipeline to study protein-protein association from an energy landscape perspective. First, a coarse-grained model is implemented and its applications are demonstrated via molecular dynamics simulations for several protein complexes. Second, an enhanced search method is used to efficiently sample a broad range of protein conformations. Third, multiple conformations are identified and clustered from simulation data and further projected on a three-dimensional globe specifying protein orientations and interacting energies. Results from several complexes indicate that the crystal-like conformation is favorable on the energy landscape even if the landscape is relatively rugged with metastable conformations. A closer examination on molecular forces shows that the formation of associated protein complexes can be primarily electrostatics-driven, hydrophobics-driven, or a combination of both in stabilizing specific binding interfaces. Taken together, these results suggest that the coarse-grained simulations and analyses provide an alternative toolset to study protein-protein association occurring in functional biomolecular complexes. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  18. Application of CART3D to Complex Propulsion-Airframe Integration with Vehicle Sketch Pad

    NASA Technical Reports Server (NTRS)

    Hahn, Andrew S.

    2012-01-01

    Vehicle Sketch Pad (VSP) is an easy-to-use modeler used to generate aircraft geometries for use in conceptual design and analysis. It has been used in the past to generate metageometries for aerodynamic analyses ranging from handbook methods to Navier-Stokes computational fluid dynamics (CFD). As desirable as it is to bring high order analyses, such as CFD, into the conceptual design process, this has been difficult and time consuming in practice due to the manual nature of both surface and volume grid generation. Over the last couple of years, VSP has had a major upgrade of its surface triangulation and export capability. This has enhanced its ability to work with Cart3D, an inviscid, three dimensional fluid flow toolset. The combination of VSP and Cart3D allows performing inviscid CFD on complex geometries with relatively high productivity. This paper will illustrate the use of VSP with Cart3D through an example case of a complex propulsion-airframe integration (PAI) of an over-wing nacelle (OWN) airliner configuration.

  19. Coarse-Grained Simulations of Protein-Protein Association: An Energy Landscape Perspective

    PubMed Central

    Ravikumar, Krishnakumar M.; Huang, Wei; Yang, Sichun

    2012-01-01

    Understanding protein-protein association is crucial in revealing the molecular basis of many biological processes. Here, we describe a theoretical simulation pipeline to study protein-protein association from an energy landscape perspective. First, a coarse-grained model is implemented and its applications are demonstrated via molecular dynamics simulations for several protein complexes. Second, an enhanced search method is used to efficiently sample a broad range of protein conformations. Third, multiple conformations are identified and clustered from simulation data and further projected on a three-dimensional globe specifying protein orientations and interacting energies. Results from several complexes indicate that the crystal-like conformation is favorable on the energy landscape even if the landscape is relatively rugged with metastable conformations. A closer examination on molecular forces shows that the formation of associated protein complexes can be primarily electrostatics-driven, hydrophobics-driven, or a combination of both in stabilizing specific binding interfaces. Taken together, these results suggest that the coarse-grained simulations and analyses provide an alternative toolset to study protein-protein association occurring in functional biomolecular complexes. PMID:22947945

  20. Effort to Accelerate MBSE Adoption and Usage at JSC

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Izygon, Michel; Okron, Shira; Garner, Larry; Wagner, Howard

    2016-01-01

    This paper describes the authors' experience in adopting Model Based System Engineering (MBSE) at the NASA/Johnson Space Center (JSC). Since 2009, NASA/JSC has been applying MBSE using the Systems Modeling Language (SysML) to a number of advanced projects. Models integrate views of the system from multiple perspectives, capturing the system design information for multiple stakeholders. This method has allowed engineers to better control changes, improve traceability from requirements to design, and manage the numerous interactions between components. As the project progresses, the models become the official source of information and are used by multiple stakeholders. Three major types of challenges that hamper the adoption of the MBSE technology are described. These challenges are addressed by a multipronged approach that includes educating the main stakeholders, implementing an organizational infrastructure that supports the adoption effort, defining a set of modeling guidelines to help engineers in their modeling effort, providing a toolset that supports the generation of valuable products, and providing a library of reusable models. JSC project case studies are presented to illustrate how the proposed approach has been successfully applied.

  1. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.

    PubMed

    Gong, Xiajing; Hu, Meng; Zhao, Liang

    2018-05-01

    Additional value can potentially be created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by the concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
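
    The evaluation loop described above (simulate survival data under a preset hazard, fit a model, score it with the concordance index) can be sketched in a few lines. The snippet below is an illustrative harness only, not the authors' code; it assumes the lifelines package and a Cox fit, with an ML survival model (e.g., a random survival forest) substitutable at the same scoring step.

```python
# Hedged sketch: simulate time-to-event data with a nonlinear predictor effect,
# fit a Cox proportional hazards model, and score it with the concordance
# index. Assumes the `lifelines` package; not the study's actual code or data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(0)
n = 2000
x1, x2 = rng.normal(size=n), rng.normal(size=n)

# Nonlinear, interacting effects in the log-hazard (the scenario in which the
# abstract reports ML methods outperforming the linear Cox model).
log_hazard = 0.8 * x1**2 + 0.5 * x1 * x2
t_event = rng.exponential(scale=np.exp(-log_hazard))   # mean = 1 / hazard
t_censor = rng.exponential(scale=1.0, size=n)
df = pd.DataFrame({
    "x1": x1,
    "x2": x2,
    "time": np.minimum(t_event, t_censor),
    "event": (t_event <= t_censor).astype(int),
})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
risk = np.asarray(cph.predict_partial_hazard(df)).ravel()  # higher = shorter survival
cindex = concordance_index(df["time"], -risk, df["event"])
print(f"Cox concordance index: {cindex:.3f}")
# A tree-based survival model would be scored with the same concordance_index
# call for a like-for-like comparison.
```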

  2. Investigation of Stability of Precise Geodetic Instruments Used in Deformation Monitoring

    NASA Astrophysics Data System (ADS)

    Woźniak, Marek; Odziemczyk, Waldemar

    2017-12-01

    Monitoring systems using automated electronic total stations are an important element of safety control of many engineering objects. In order to ensure the appropriate credibility of acquired data, it is necessary that instruments (total stations in most cases) used for measurements meet requirements of measurement accuracy, as well as the stability of instrument axis system geometry. With regard to the above, it is expedient to conduct quality control of data acquired using electronic total stations in the context of performed measurement procedures. This paper presents results of research conducted at the Faculty of Geodesy and Cartography at Warsaw University of Technology investigating the stability of "basic" error values (collimation, zero location for V circle, inclination) for two types of automatic total stations: TDA 5005 and TCRP 1201+. The research also provided information concerning the influence of temperature changes upon the stability of the investigated instruments' optical parameters. Results are presented using graphical and analytical techniques. The final conclusions propose methods that allow the negative effects of measuring toolset geometry changes to be avoided during precise deformation monitoring measurements.

  3. An approach for access differentiation design in medical distributed applications built on databases.

    PubMed

    Shoukourian, S K; Vasilyan, A M; Avagyan, A A; Shukurian, A K

    1999-01-01

    A formalized "top to bottom" design approach was described in [1] for distributed applications built on databases, which were considered as a medium between virtual and real user environments for a specific medical application. Merging different components within a unified distributed application poses new and essential problems for software. In particular, protection tools that are sufficient separately become deficient during integration due to specific additional links and relationships not considered formerly. For example, it is impossible to protect a shared object in the virtual operating room using only DBMS protection tools if the object is stored as a record in DB tables. The solution to the problem should be sought within the more general application framework. Appropriate tools are absent or unavailable. The present paper suggests a detailed outline of a design and testing toolset for access differentiation systems (ADS) in distributed medical applications which use databases. The appropriate formal model as well as tools for its mapping to a DBMS are suggested. Remote users connected via global networks are considered too.

  4. Assessment of SPOT-6 optical remote sensing data against GF-1 using NNDiffuse image fusion algorithm

    NASA Astrophysics Data System (ADS)

    Zhao, Jinling; Guo, Junjie; Cheng, Wenjie; Xu, Chao; Huang, Linsheng

    2017-07-01

    A cross-comparison method was used to assess SPOT-6 optical satellite imagery against Chinese GF-1 imagery using three types of indicators: spectral and color quality, fusion effect, and identification potential. More specifically, spectral response function (SRF) curves were used to compare the two sets of imagery, showing that the SRF curve shape of SPOT-6 is closer to a rectangle than that of GF-1 in the blue, green, red, and near-infrared bands. The NNDiffuse image fusion algorithm was used to evaluate the capability of information conservation in comparison with wavelet transform (WT) and principal component (PC) algorithms. The results show that the NNDiffuse fused image has an entropy value extremely similar to that of the original image (1.849 versus 1.852) and better color quality. In addition, the object-oriented classification toolset (ENVI EX) was used to identify greenlands for comparing the self-fused SPOT-6 image with the inter-fused SPOT-6/GF-1 image, both based on the NNDiffuse algorithm. The overall accuracy is 97.27% and 76.88%, respectively, showing that the self-fused SPOT-6 image has better identification capability.
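
    The entropy figure cited above is a simple histogram statistic. The sketch below is an illustration only (not the ENVI or NNDiffuse implementation) of how a band's Shannon entropy might be computed for such a before/after fusion comparison.

```python
# Illustrative sketch: Shannon entropy of an 8-bit image band, the kind of
# statistic compared between the original and fused images in the abstract.
import numpy as np

def band_entropy(band: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy (bits) of a single image band."""
    hist, _ = np.histogram(band.ravel(), bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]                       # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

# Toy usage with a synthetic band; real use would read the SPOT-6/GF-1 rasters.
rng = np.random.default_rng(1)
original = rng.integers(0, 256, size=(512, 512))
print(round(band_entropy(original), 3))
```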

  5. PuLSE: Quality control and quantification of peptide sequences explored by phage display libraries.

    PubMed

    Shave, Steven; Mann, Stefan; Koszela, Joanna; Kerr, Alastair; Auer, Manfred

    2018-01-01

    The design of highly diverse phage display libraries is based on the assumption that DNA bases are incorporated at similar rates within the randomized sequence. As library complexity increases and expected copy numbers of unique sequences decrease, the exploration of library space becomes sparser and the presence of truly random sequences becomes critical. We present the program PuLSE (Phage Library Sequence Evaluation) as a tool for assessing randomness and therefore diversity of phage display libraries. PuLSE runs on a collection of sequence reads in the fastq file format and generates tables profiling the library in terms of unique DNA sequence counts and positions, translated peptide sequences, and normalized 'expected' occurrences from base to residue codon frequencies. The output allows at-a-glance quantitative quality control of a phage library in terms of sequence coverage at both the DNA base and translated protein residue level, which has been missing from existing toolsets and the literature. The open source program PuLSE is available in two formats: a C++ source code package for compilation and integration into existing bioinformatics pipelines, and precompiled binaries for ease of use.
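
    As a rough illustration of the bookkeeping PuLSE performs, the sketch below counts unique DNA reads in a FASTQ file and tallies the peptides they encode. It is not the published C++ implementation; it assumes Biopython is available, and it omits trimming to the randomized window and the expected-occurrence normalization.

```python
# Hypothetical outline of PuLSE-style counting (not the published tool):
# count unique DNA sequences in a FASTQ file and tally encoded peptides.
from collections import Counter
from Bio import SeqIO
from Bio.Seq import Seq

def profile_library(fastq_path: str):
    dna_counts: Counter = Counter()
    peptide_counts: Counter = Counter()
    for record in SeqIO.parse(fastq_path, "fastq"):
        dna = str(record.seq).upper()
        dna_counts[dna] += 1
        # Translate the read; a real library analysis would first trim to the
        # degenerate window defined by the library design.
        peptide_counts[str(Seq(dna).translate(to_stop=True))] += 1
    return dna_counts, peptide_counts

# Example usage with a hypothetical file name:
# dna, pep = profile_library("library_reads.fastq")
# print(dna.most_common(5), pep.most_common(5))
```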

  6. Interfaces and Integration of Medical Image Analysis Frameworks: Challenges and Opportunities.

    PubMed

    Covington, Kelsie; McCreedy, Evan S; Chen, Min; Carass, Aaron; Aucoin, Nicole; Landman, Bennett A

    2010-05-25

    Clinical research with medical imaging typically involves large-scale data analysis with interdependent software toolsets tied together in a processing workflow. Numerous, complementary platforms are available, but these are not readily compatible in terms of workflows or data formats. Both image scientists and clinical investigators could benefit from using the framework that is the most natural fit for the specific problem at hand, but pragmatic choices often dictate that a compromise platform is used for collaboration. Manual merging of platforms through carefully tuned scripts has been effective, but exceptionally time-consuming, and is not feasible for large-scale integration efforts. Hence, the benefits of innovation are constrained by platform dependence. Removing this constraint via integration of algorithms from one framework into another is the focus of this work. We propose and demonstrate a light-weight interface system to expose parameters across platforms and provide seamless integration. In this initial effort, we focus on four platforms: Medical Image Analysis and Visualization (MIPAV), the Java Image Science Toolkit (JIST), command line tools, and 3D Slicer. We explore three case studies: (1) providing a system for MIPAV to expose internal algorithms and utilize these algorithms within JIST, (2) exposing JIST modules through a self-documenting command line interface for inclusion in scripting environments, and (3) detecting and using JIST modules in 3D Slicer. We review the challenges and opportunities for light-weight software integration both within a development language (e.g., Java in MIPAV and JIST) and across languages (e.g., C/C++ in 3D Slicer and shell in command line tools).
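
    Case study (2), exposing modules through a self-documenting command line interface, can be pictured with a generic wrapper like the one below. This is a hypothetical Python sketch, not the actual JIST interface; the module name and parameters are invented for illustration.

```python
# Generic sketch (not the JIST implementation) of a self-documenting CLI
# wrapper around an image-processing module, so a host platform or script can
# discover its parameters and invoke it.
import argparse
import json

def run_module(input_path: str, output_path: str, sigma: float) -> None:
    """Placeholder for the wrapped algorithm (hypothetical parameters)."""
    print(f"smoothing {input_path} with sigma={sigma} -> {output_path}")

def build_parser() -> argparse.ArgumentParser:
    p = argparse.ArgumentParser(
        prog="smooth-module",
        description="Gaussian smoothing module exposed for cross-platform use.")
    p.add_argument("--input", required=True, help="input image path")
    p.add_argument("--output", required=True, help="output image path")
    p.add_argument("--sigma", type=float, default=1.0, help="kernel width")
    p.add_argument("--describe", action="store_true",
                   help="emit a machine-readable parameter description and exit")
    return p

if __name__ == "__main__":
    args = build_parser().parse_args()
    if args.describe:
        # Machine-readable self-description that a host platform could parse.
        print(json.dumps({"name": "smooth-module",
                          "params": ["input", "output", "sigma"]}))
    else:
        run_module(args.input, args.output, args.sigma)
```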

  7. Robust Informatics Infrastructure Required For ICME: Combining Virtual and Experimental Data

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Holland, Frederic A. Jr.; Bednarcyk, Brett A.

    2014-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for robust automated materials information management systems enabling sophisticated data mining tools is increasing, as evidenced by the emphasis on Integrated Computational Materials Engineering (ICME) and the recent establishment of the Materials Genome Initiative (MGI). This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic and/or multi-scale models requires both the processing of large volumes of test data and the complex materials data necessary to establish processing-microstructure-property-performance relationships. Fortunately, material information management systems have kept pace with the growing user demands and evolved to enable: (i) the capture of both pointwise data and full spectra of raw data curves; (ii) data management functions such as access, version, and quality controls; (iii) a wide range of data import, export and analysis capabilities; (iv) data pedigree traceability mechanisms; (v) data searching, reporting and viewing tools; and (vi) access to the information via a wide range of interfaces. This paper discusses key principles for the development of a robust materials information management system to enable the connections at various length scales to be made between experimental data and corresponding multiscale modeling toolsets to enable ICME. In particular, NASA Glenn's efforts towards establishing such a database for capturing constitutive modeling behavior for both monolithic and composite materials are discussed.

  8. The Community WRF-Hydro Modeling System Version 4 Updates: Merging Toward Capabilities of the National Water Model

    NASA Astrophysics Data System (ADS)

    McAllister, M.; Gochis, D.; Dugger, A. L.; Karsten, L. R.; McCreight, J. L.; Pan, L.; Rafieeinasab, A.; Read, L. K.; Sampson, K. M.; Yu, W.

    2017-12-01

    The community WRF-Hydro modeling system is publicly available and provides researchers and operational forecasters with a flexible and extensible capability for multi-scale, multi-physics hydrologic modeling that can be run independently of, or fully interactively with, the WRF atmospheric model. The core WRF-Hydro physics model contains very high-resolution descriptions of terrestrial hydrologic process representations such as land-atmosphere exchanges of energy and moisture, snowpack evolution, infiltration, terrain routing, channel routing, basic reservoir representation and hydrologic data assimilation. Complementing the core physics components of WRF-Hydro are an ecosystem of pre- and post-processing tools that facilitate the preparation of terrain and meteorological input data, an open-source hydrologic model evaluation toolset (Rwrfhydro), hydrologic data assimilation capabilities with DART and advanced model visualization capabilities. The National Center for Atmospheric Research (NCAR), through collaborative support from the National Science Foundation and other funding partners, provides community support for the entire WRF-Hydro system through a variety of mechanisms. This presentation summarizes the enhanced user support capabilities that are being developed for the community WRF-Hydro modeling system. These products and services include a new website, open-source code repositories, documentation and user guides, test cases, online training materials, live hands-on training sessions, an email listserv, and individual user support via email through a new help desk ticketing system. The WRF-Hydro modeling system and supporting tools, which now include re-gridding scripts and model calibration, have recently been updated to Version 4 and are merging toward capabilities of the National Water Model.

  9. Extracting atmospheric turbulence and aerosol characteristics from passive imagery

    NASA Astrophysics Data System (ADS)

    Reinhardt, Colin N.; Wayne, D.; McBryde, K.; Cauble, G.

    2013-09-01

    Obtaining accurate, precise and timely information about the local atmospheric turbulence and extinction conditions and aerosol/particulate content remains a difficult problem with incomplete solutions. It has important applications in areas such as optical and IR free-space communications, imaging system performance, and the propagation of directed energy. The capability to utilize passive imaging data to extract parameters characterizing atmospheric turbulence and aerosol/particulate conditions would represent a valuable addition to the current piecemeal toolset for atmospheric sensing. Our research investigates an application of fundamental results from optical turbulence theory and aerosol extinction theory combined with recent advances in image-quality-metric (IQM) and image-quality-assessment (IQA) methods. We have developed an algorithm which extracts important parameters used for characterizing atmospheric turbulence and extinction along the propagation channel, such as the refractive-index structure parameter Cn2, the Fried atmospheric coherence width r0, and the atmospheric extinction coefficient βext, from passive image data. We will analyze the algorithm performance using simulations based on modeling with turbulence modulation transfer functions. An experimental field campaign was organized and data were collected from passive imaging through turbulence of Siemens star resolution targets over several short littoral paths in Point Loma, San Diego, under conditions of various turbulence intensities. We present initial results of the algorithm's effectiveness using this field data and compare against measurements taken concurrently with other standard atmospheric characterization equipment. We also discuss some of the challenges encountered with the algorithm, tasks currently in progress, and approaches planned for improving the performance in the near future.
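
    For orientation, the sketch below shows the textbook relations that connect the quantities named above: the plane-wave Fried coherence width r0 computed from a path-averaged Cn2, and Beer-Lambert transmission from the extinction coefficient. It is a back-of-the-envelope illustration with assumed values, not the authors' retrieval algorithm.

```python
# Back-of-the-envelope sketch using standard relations, not the authors' code.
import numpy as np

def fried_r0(cn2: float, path_m: float, wavelength_m: float = 0.55e-6) -> float:
    """Plane-wave Fried parameter r0 = [0.423 k^2 Cn^2 L]^(-3/5), uniform path."""
    k = 2.0 * np.pi / wavelength_m
    return (0.423 * k**2 * cn2 * path_m) ** (-3.0 / 5.0)

def transmission(beta_ext_per_m: float, path_m: float) -> float:
    """Beer-Lambert atmospheric transmission exp(-beta_ext * L)."""
    return float(np.exp(-beta_ext_per_m * path_m))

# Illustrative values for a moderately turbulent 1 km littoral path.
print(f"r0  = {fried_r0(cn2=1e-14, path_m=1000.0) * 100:.1f} cm")
print(f"tau = {transmission(beta_ext_per_m=2e-4, path_m=1000.0):.2f}")
```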

  10. An algorithm for designing minimal microbial communities with desired metabolic capacities

    PubMed Central

    Eng, Alexander; Borenstein, Elhanan

    2016-01-01

    Motivation: Recent efforts to manipulate various microbial communities, such as fecal microbiota transplant and bioreactor systems’ optimization, suggest a promising route for microbial community engineering with numerous medical, environmental and industrial applications. However, such applications are currently restricted in scale and often rely on mimicking or enhancing natural communities, calling for the development of tools for designing synthetic communities with specific, tailored, desired metabolic capacities. Results: Here, we present a first step toward this goal, introducing a novel algorithm for identifying minimal sets of microbial species that collectively provide the enzymatic capacity required to synthesize a set of desired target product metabolites from a predefined set of available substrates. Our method integrates a graph theoretic representation of network flow with the set cover problem in an integer linear programming (ILP) framework to simultaneously identify possible metabolic paths from substrates to products while minimizing the number of species required to catalyze these metabolic reactions. We apply our algorithm to successfully identify minimal communities both in a set of simple toy problems and in more complex, realistic settings, and to investigate metabolic capacities in the gut microbiome. Our framework adds to the growing toolset for supporting informed microbial community engineering and for ultimately realizing the full potential of such engineering efforts. Availability and implementation: The algorithm source code, compilation, usage instructions and examples are available under a non-commercial research use only license at https://github.com/borenstein-lab/CoMiDA. Contact: elbo@uw.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153571
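
    The core optimization can be illustrated with a toy set-cover ILP: choose the fewest species whose enzyme repertoires jointly cover the reactions needed on the substrate-to-product path. The sketch below uses the PuLP package and invented species/reaction names; it omits the graph-flow constraints of the published CoMiDA formulation.

```python
# Minimal set-cover ILP in the spirit of the algorithm described above (not the
# CoMiDA implementation itself). Species and reaction names are made up.
import pulp

species_reactions = {
    "sp_A": {"r1", "r2"},
    "sp_B": {"r2", "r3", "r4"},
    "sp_C": {"r4", "r5"},
    "sp_D": {"r1", "r5"},
}
required = {"r1", "r2", "r4", "r5"}   # reactions on the substrate->product path

prob = pulp.LpProblem("minimal_community", pulp.LpMinimize)
use = {s: pulp.LpVariable(f"use_{s}", cat="Binary") for s in species_reactions}

# Objective: select as few species as possible.
prob += pulp.lpSum(use.values())

# Every required reaction must be catalyzed by at least one selected species.
for r in required:
    prob += pulp.lpSum(use[s] for s, rx in species_reactions.items() if r in rx) >= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([s for s, v in use.items() if v.value() > 0.5])   # e.g. ['sp_A', 'sp_C']
```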

  11. Managing Large Datasets for Atmospheric Research

    NASA Technical Reports Server (NTRS)

    Chen, Gao

    2015-01-01

    Since the mid-1980s, airborne and ground measurements have been widely used to provide comprehensive characterization of atmospheric composition and processes. Field campaigns have generated a wealth of in situ data and have grown considerably over the years in terms of both the number of measured parameters and the data volume. This can largely be attributed to the rapid advances in instrument development and computing power. The users of field data may face a number of challenges spanning data access, understanding, and proper use in scientific analysis. This tutorial is designed to provide an introduction to using data sets, with a focus on airborne measurements, for atmospheric research. The first part of the tutorial provides an overview of airborne measurements and data discovery. This will be followed by a discussion on the understanding of airborne data files. An actual data file will be used to illustrate how data are reported, including the use of data flags to indicate missing data and limits of detection. Retrieving information from the file header will be discussed, which is essential to properly interpreting the data. Field measurements are typically reported as a function of sampling time, but different instruments often have different sampling intervals. To create a combined data set, the data merge process (interpolation of all data to a common time base) will be discussed in terms of the algorithm, data merge products available from airborne studies, and their application in research. Statistical treatment of missing data and data flagged for limit of detection will also be covered in this section. These basic data processing techniques are applicable to both airborne and ground-based observational data sets. Finally, the recently developed Toolsets for Airborne Data (TAD) will be introduced. TAD (tad.larc.nasa.gov) is an airborne data portal offering tools to create user-defined merged data products, with the capability to provide descriptive statistics and the option to treat measurement uncertainty.
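
    The data merge step described above amounts to flagging missing values and interpolating each instrument's series onto a common time base. The sketch below illustrates that step with assumed variable names and a -9999 missing-data flag; it is not the TAD implementation.

```python
# Illustrative merge step: convert missing-data flags to NaN and interpolate
# one instrument's series onto a common time base (assumptions, not TAD code).
import numpy as np

MISSING_FLAG = -9999.0

def merge_to_common_time(common_t, instrument_t, values):
    """Interpolate one instrument's time series onto the common time base."""
    v = np.asarray(values, dtype=float)
    v[v == MISSING_FLAG] = np.nan
    good = ~np.isnan(v)
    if good.sum() < 2:
        return np.full_like(np.asarray(common_t, dtype=float), np.nan)
    return np.interp(common_t, np.asarray(instrument_t)[good], v[good])

# 1-second common time base; a 3-second instrument with one flagged point.
common = np.arange(0, 10, 1.0)
inst_t = np.array([0.0, 3.0, 6.0, 9.0])
inst_v = np.array([1.0, MISSING_FLAG, 3.0, 4.0])
print(merge_to_common_time(common, inst_t, inst_v))
```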

  12. Open-Source Conceptual Sizing Models for the Hyperloop Passenger Pod

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey C.; Gray, Justin S.; Jones, Scott M.; Berton, Jeffrey J.

    2015-01-01

    Hyperloop is a new mode of transportation proposed as an alternative to California's high speed rail project, with the intended benefits of higher performance at lower overall costs. It consists of a passenger pod traveling through a tube under a light vacuum and suspended on air bearings. The pod travels at up to transonic speeds, resulting in a 35-minute travel time along the intended route between Los Angeles and San Francisco. Of the two variants outlined, the smaller system includes a 1.1 meter tall passenger capsule traveling through a 2.2 meter tube at 700 miles per hour. The passenger pod features water-based heat exchangers as well as an on-board compression system that reduces the aerodynamic drag as it moves through the tube. Although the original proposal looks very promising, it assumes that tube and pod dimensions are independently sizable without fully acknowledging the constraints of the compressor system on the pod geometry. This work focuses on the aerodynamic and thermodynamic interactions between the two largest systems: the tube and the pod. Using open-source toolsets, a new sizing method is developed based on one-dimensional thermodynamic relationships that accounts for the strong interactions between these sub-systems. These additional considerations require a tube nearly twice the size originally considered and limit the maximum pod travel speed to about 620 miles per hour. Although the results indicate that Hyperloop will need to be larger and slightly slower than originally intended, the estimated travel time only increases by approximately five minutes, so the overall performance is not dramatically affected. In addition, the proposed on-board heat exchanger is not an ideal solution to achieve reasonable equilibrium air temperatures within the tube. Removal of this subsystem represents a potential reduction in weight, energy requirements and complexity of the pod. In light of these findings, the core concept still remains a compelling possibility, although additional engineering and economic analyses are clearly necessary before a more complete design can be developed.
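
    The coupling between tube size, pod size, and speed comes from one-dimensional compressible flow: the bypass area around the pod must stay above the isentropic choking limit at the tube Mach number. The sketch below is a simplified, assumption-laden illustration of that limit, not the published open-source sizing models.

```python
# Simplified 1-D compressible-flow sketch (not the published sizing code):
# the isentropic area-Mach relation gives the minimum open area around the pod
# before the flow chokes, which couples tube size, pod size, and travel speed.
import numpy as np

GAMMA = 1.4  # ratio of specific heats for air

def area_ratio(mach: float) -> float:
    """Isentropic A/A* as a function of Mach number."""
    term = (2.0 / (GAMMA + 1.0)) * (1.0 + 0.5 * (GAMMA - 1.0) * mach**2)
    return (1.0 / mach) * term ** ((GAMMA + 1.0) / (2.0 * (GAMMA - 1.0)))

def min_bypass_fraction(tube_mach: float) -> float:
    """Fraction of the tube area that must remain open around the pod (A*/A)."""
    return 1.0 / area_ratio(tube_mach)

for m in (0.5, 0.7, 0.9):
    frac = min_bypass_fraction(m)
    print(f"M = {m:.1f}: at least {frac:.0%} of the tube area must bypass the pod")
```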

  13. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    PubMed

    Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable to many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision the broad utility of the framework for diverse problems across different length scales and imaging methods.
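
    The basic building block of such a multi-tiered pipeline, a classifier trained on features extracted from labeled image patches, can be sketched with scikit-learn as below. The features, patch sizes, and labels are synthetic placeholders, not those of the published C. elegans classifiers.

```python
# Generic sketch of the building block the framework relies on: an SVM trained
# on simple intensity features from labeled image patches (scikit-learn).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def patch_features(patch: np.ndarray) -> np.ndarray:
    """Toy feature vector: mean, standard deviation, and gradient energy."""
    gy, gx = np.gradient(patch.astype(float))
    return np.array([patch.mean(), patch.std(), (gx**2 + gy**2).mean()])

rng = np.random.default_rng(0)
# Synthetic "target structure" patches (brighter) vs. background patches.
targets = [rng.normal(0.7, 0.1, (16, 16)) for _ in range(200)]
backgrounds = [rng.normal(0.3, 0.1, (16, 16)) for _ in range(200)]
X = np.array([patch_features(p) for p in targets + backgrounds])
y = np.array([1] * 200 + [0] * 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```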

  14. NASA's Lunar and Planetary Mapping and Modeling Program

    NASA Astrophysics Data System (ADS)

    Law, E.; Day, B. H.; Kim, R. M.; Bui, B.; Malhotra, S.; Chang, G.; Sadaqathullah, S.; Arevalo, E.; Vu, Q. A.

    2016-12-01

    NASA's Lunar and Planetary Mapping and Modeling Program produces a suite of online visualization and analysis tools. Originally designed for mission planning and science, these portals offer great benefits for education and public outreach (EPO), providing access to data from a wide range of instruments aboard a variety of past and current missions. As a component of NASA's Science EPO Infrastructure, they are available as resources for NASA STEM EPO programs, and to the greater EPO community. As new missions are planned to a variety of planetary bodies, these tools are facilitating the public's understanding of the missions and engaging the public in the process of identifying and selecting where these missions will land. There are currently three web portals in the program: the Lunar Mapping and Modeling Portal or LMMP (http://lmmp.nasa.gov), Vesta Trek (http://vestatrek.jpl.nasa.gov), and Mars Trek (http://marstrek.jpl.nasa.gov). Portals for additional planetary bodies are planned. As web-based toolsets, the portals do not require users to purchase or install any software beyond current web browsers. The portals provide analysis tools for measurement and study of planetary terrain. They allow data to be layered and adjusted to optimize visualization. Visualizations are easily stored and shared. The portals provide 3D visualization and give users the ability to mark terrain for generation of STL files that can be directed to 3D printers. Such 3D prints are valuable tools in museums, public exhibits, and classrooms - especially for the visually impaired. Along with the web portals, the program supports additional clients, web services, and APIs that facilitate dissemination of planetary data to a range of external applications and venues. NASA challenges and hackathons are also providing members of the software development community opportunities to participate in tool development and leverage data from the portals.

  15. Metadata Creation, Management and Search System for your Scientific Data

    NASA Astrophysics Data System (ADS)

    Devarakonda, R.; Palanisamy, G.

    2012-12-01

    Mercury Search Systems is a set of tools for creating, searching, and retrieving biogeochemical metadata. The Mercury toolset provides orders-of-magnitude improvements in search speed, support for any metadata format, integration with Google Maps for spatial queries, multi-faceted search, search suggestions, support for RSS (Really Simple Syndication) delivery of search results, and enhanced customization to meet the needs of the multiple projects that use Mercury. Mercury's metadata editor provides an easy way to create metadata, and Mercury's search interface provides a single portal to search for data and information contained in disparate data management systems, each of which may use any metadata format including FGDC, ISO-19115, Dublin-Core, Darwin-Core, DIF, ECHO, and EML. Mercury harvests metadata and key data from contributing project servers distributed around the world and builds a centralized index. The search interfaces then allow users to perform a variety of fielded, spatial, and temporal searches across these metadata sources. This centralized repository of metadata with distributed data sources provides extremely fast search results to the user, while allowing data providers to advertise the availability of their data and maintain complete control and ownership of that data. Mercury is being used by more than 14 different projects across 4 federal agencies. It was originally developed for NASA, with continuing development funded by NASA, USGS, and DOE for a consortium of projects. Mercury search won NASA's Earth Science Data Systems Software Reuse Award in 2008. References: R. Devarakonda, G. Palanisamy, B.E. Wilson, and J.M. Green, "Mercury: reusable metadata management data discovery and access system", Earth Science Informatics, vol. 3, no. 1, pp. 87-94, May 2010. R. Devarakonda, G. Palanisamy, J.M. Green, B.E. Wilson, "Data sharing and retrieval using OAI-PMH", Earth Science Informatics, DOI: 10.1007/s12145-010-0073-0, 2010.

  16. The Advanced Rapid Imaging and Analysis (ARIA) Project: Status of SAR products for Earthquakes, Floods, Volcanoes and Groundwater-related Subsidence

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Sacco, G. F.; Manipon, G.; Linick, J. P.; Fielding, E. J.; Lundgren, P.; Farr, T. G.; Webb, F.; Rosen, P. A.; Simons, M.

    2017-12-01

    The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating high-level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques including Interferometric Synthetic Aperture Radar (InSAR), differential Global Positioning System, and SAR-based change detection have become critical additions to our toolset for understanding and mapping the damage and deformation caused by earthquakes, volcanic eruptions, floods, landslides, and groundwater extraction. Up until recently, processing of these data sets has been handcrafted for each study or event and has not generated products rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by the California Institute of Technology and by NASA through the Jet Propulsion Laboratory, has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real-time that can improve situational awareness for disaster response. In addition to supporting the growing science and hazard response communities, the ARIA project has developed the capabilities to provide automated imaging and analysis capabilities necessary to keep up with the influx of raw SAR data from geodetic imaging missions such as ESA's Sentinel-1A/B, now operating with repeat intervals as short as 6 days, and the upcoming NASA NISAR mission. We will present the progress and results we have made on automating the analysis of Sentinel-1A/B SAR data for hazard monitoring and response, with emphasis on recent developments and end user engagement in flood extent mapping and deformation time series for both volcano monitoring and mapping of groundwater-related subsidence.

  17. An Overview of the Challenges with and Proposed Solutions for the Ingest and Distribution Processes For Airborne Data Management

    NASA Astrophysics Data System (ADS)

    Northup, E. A.; Beach, A. L., III; Early, A. B.; Kusterer, J.; Quam, B.; Wang, D.; Chen, G.

    2015-12-01

    The current data management practices for NASA airborne field projects have successfully served science team data needs over the past 30 years to achieve project science objectives; however, users have discovered a number of issues in terms of data reporting and format. The ICARTT format, a NASA standard since 2010, is currently the most popular among the airborne measurement community. Although easy for humans to use, the format standard is not sufficiently rigorous to be machine-readable, and there is no standard variable naming convention among the many airborne measurement variables. This makes data use and management tedious and resource intensive, and also creates problems in Distributed Active Archive Center (DAAC) data ingest procedures and distribution. Further, most DAACs use metadata models that concentrate on satellite data observations, making them less prepared to deal with airborne data. There also exists a substantial amount of airborne data distributed by websites designed for science team use that are less friendly to users unfamiliar with operations of airborne field studies. A number of efforts are underway to help overcome the issues with airborne data discovery and distribution. The ICARTT Refresh Earth Science Data Systems Working Group (ESDSWG) was established to provide a platform for atmospheric science data providers, users, and data managers to collaborate on developing new criteria for the file format in an effort to enhance airborne data usability. In addition, the NASA Langley Research Center Atmospheric Science Data Center (ASDC) has developed the Toolsets for Airborne Data (TAD) to provide web-based tools and centralized access to airborne in situ measurements of atmospheric composition. This presentation will discuss the aforementioned challenges and attempted solutions in an effort to demonstrate how airborne data management can be improved to streamline data ingest and discoverability to a broader user community.

  18. Characterization of Joint Sets Through UAV Photogrammetry on Sedimentary Rock Sea Cliffs and Abrasion Platforms in Northern Taiwan

    NASA Astrophysics Data System (ADS)

    Hsieh, P. C.; LU, A.; Yeh, C. H.; Huang, W. K.; Lin, H. H.; Lin, M. L.

    2017-12-01

    Rockfall hazards are very common on obsequent and oblique slopes. In the coastal area of northern Taiwan, many sea cliffs are formed by obsequent and oblique slopes. A well-known rockfall failure happened on Aug. 31, 2013, when a 150-ton rock block fell onto the highway in Badouzi, Keelung, during a high-intensity rainfall event caused by Typhoon No. 15 (Kong-rey). To reduce this kind of rockfall hazard, it is important to characterize discontinuous planes in the bedrock, because rock blocks are mainly divided from the bedrock by two or more sets of discontinuous planes, including joint planes and the bedding plane. Characterizing the fracture patterns of these joint sets requires detailed field investigation. However, the survey of discontinuous planes, especially joint sets, is usually difficult and often yields insufficient characterization data. The first reason is that doing field investigations on the surface of sea cliffs is very dangerous, and it is difficult for engineers or geologists to approach the upper part of an outcrop. The second reason is the complexity of the joint sets. In the Badouzi area, each cliff is composed of many different layers, such as sandstone, shale, or alternations of sandstone and shale, and each layer has a different fracture pattern of joint sets. In this study, we use UAV photogrammetry as a solution to these difficulties. UAV photogrammetry can produce a high-resolution digital surface model (DSM), orthophoto, and anaglyph of sea cliffs and abrasion platforms. Then we use self-developed geoprocessing toolsets to auto-trace joint planes from the DSM data and produce fracture patterns of joint sets semi-automatically and systematically. Our method can provide basic information for rock mass rating, rock slope stability analysis, and rockfall hazard evaluation.
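
    One step such a toolset might automate is converting a patch of DSM points on a candidate joint surface into a discontinuity orientation. The sketch below fits a plane by SVD and reports dip direction and dip angle; it is a generic illustration (assuming x = east, y = north, z = up), not the authors' geoprocessing code.

```python
# Generic sketch: least-squares plane fit to 3D points and conversion of the
# plane normal to dip direction / dip angle (x = east, y = north, z = up).
import numpy as np

def plane_orientation(points: np.ndarray):
    """Return (dip_direction_deg, dip_deg) of the best-fit plane through points."""
    centered = points - points.mean(axis=0)
    # The singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    n = vt[-1]
    if n[2] < 0:                      # make the normal point upward
        n = -n
    dip = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))
    dip_dir = np.degrees(np.arctan2(n[0], n[1])) % 360.0
    return dip_dir, dip

# Synthetic points on a plane dipping ~30 degrees toward the east (azimuth 090).
rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(100, 2))
z = -np.tan(np.radians(30)) * xy[:, 0] + rng.normal(0, 0.01, 100)
pts = np.column_stack([xy, z])
print(plane_orientation(pts))   # roughly (90.0, 30.0)
```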

  19. A Functional Approach to Hyperspectral Image Analysis in the Cloud

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Lindholm, D. M.; Coddington, O.; Pilewskie, P.

    2017-12-01

    Hyperspectral image volumes are very large. A hyperspectral image analysis (HIA) may use 100 TB of data, a huge barrier to its use. Hylatis is a new NASA project to create a toolset for HIA. Through web notebook and cloud technology, Hylatis will provide a more interactive experience for HIA by defining and implementing concepts and operations for HIA, identified and vetted by subject matter experts, and callable within a general purpose language, particularly Python. Hylatis leverages LaTiS, a data access framework developed at LASP. With an OPeNDAP compliant interface plus additional server-side capabilities, the LaTiS API provides a uniform interface to virtually any data source, and has been applied to various storage systems (including file systems, databases, and remote servers) and in various domains, including space science, systems administration, and stock quotes. In the LaTiS architecture, data 'adapters' read data into a data model, where server-side computations occur. Data 'writers' write data from the data model into the desired format. The Hylatis difference is the data model. In LaTiS, data are represented as mathematical functions of independent and dependent variables. Domain semantics are not present at this level, but are instead present in higher software layers. The benefit of a domain-agnostic, mathematical representation is having the power of math, particularly functional algebra, unconstrained by domain semantics. This agnosticism supports reusable server-side functionality applicable in any domain, such as statistical, filtering, or projection operations. Algorithms to aggregate or fuse data can be simpler because domain semantics are separated from the math. Hylatis will map the functional model onto the Spark relational interface, thereby adding a functional interface to that big data engine. This presentation will discuss Hylatis goals, strategies, and current state.

  20. Case study: The Avengers 3D: cinematic techniques and digitally created 3D

    NASA Astrophysics Data System (ADS)

    Clark, Graham D.

    2013-03-01

    Marvel's THE AVENGERS was the third film Stereo D collaborated on with Marvel; it was a summation of our artistic development of what Digitally Created 3D and Stereo D's artists and toolsets afford Marvel's filmmakers: the ability to shape stereographic space to support the film and story in a way that balances human perception and live photography. We took our artistic lead from the cinematic intentions of Marvel, the Director Joss Whedon, and Director of Photography Seamus McGarvey. In the digital creation of a 3D film from a 2D image capture, recommendations on cinematic techniques are offered to the filmmakers by Stereo D at each step from pre-production onwards, through set, into post. As the footage arrives at our facility we respond in depth to the cinematic qualities of the imagery in the context of the edit and story, with the guidance of the Directors and Studio, creating stereoscopic imagery. Our involvement in The Avengers began early in production; after reading the script we had the opportunity and honor to meet and work with the Director Joss Whedon and DP Seamus McGarvey on set, and into post. We presented what is obvious to such great filmmakers in the way of cinematic techniques as they relate to the standard depth cues and story points we would use to evaluate depth for their film. Our hope was that any cinematic habits that supported better 3D would be emphasized. In searching for a 3D statement for the studio and filmmakers, we arrived at a stereographic style that allowed for comfort and maximum visual engagement for the viewer.

  1. NASA's Solar System Treks: Online Portals for Planetary Mapping and Modeling

    NASA Technical Reports Server (NTRS)

    Day, Brian

    2017-01-01

    NASA's Solar System Treks are a suite of web-based lunar and planetary mapping and modeling portals providing interactive visualization and analysis tools that enable mission planners, planetary scientists, students, and the general public to access mapped data products from past and current missions for the Moon, Mars, Vesta, and more. New portals for additional planetary bodies are being planned. This presentation will recap significant enhancements to these toolsets during the past year and look ahead to future features and releases. Moon Trek is a new portal replacing its predecessor, the Lunar Mapping and Modeling Portal (LMMP), that significantly upgrades and builds upon the capabilities of LMMP. It features greatly improved navigation, 3D visualization, fly-overs, performance, and reliability. Additional data products and tools continue to be added. These include both generalized products as well as polar data products specifically targeting potential sites for NASA's Resource Prospector mission as well as for missions being planned by NASA's international partners. The latest release of Mars Trek includes new tools and data products requested by NASA's Planetary Science Division to support site selection and analysis for Mars Human Landing Exploration Zone Sites. Also being given very high priority by NASA Headquarters is Mars Trek's use as a means to directly involve the public in upcoming missions, letting them explore the areas the agency is focusing upon, understand what makes these sites so fascinating, follow the selection process, and get caught up in the excitement of exploring Mars. Phobos Trek, the latest effort in the Solar System Treks suite, is being developed in coordination with the International Phobos/Deimos Landing Site Working Group, with landing site selection and analysis for JAXA's MMX (Martian Moons eXploration) mission as a primary driver.

  2. NASA's Solar System Treks: Online Portals for Planetary Mapping and Modeling

    NASA Astrophysics Data System (ADS)

    Day, B. H.; Law, E.

    2017-12-01

    NASA's Solar System Treks are a suite of web-based lunar and planetary mapping and modeling portals providing interactive visualization and analysis tools that enable mission planners, planetary scientists, students, and the general public to access mapped data products from past and current missions for the Moon, Mars, Vesta, and more. New portals for additional planetary bodies are being planned. This presentation will recap significant enhancements to these toolsets during the past year and look ahead to future features and releases. Moon Trek is a new portal replacing its predecessor, the Lunar Mapping and Modeling Portal (LMMP), that significantly upgrades and builds upon the capabilities of LMMP. It features greatly improved navigation, 3D visualization, fly-overs, performance, and reliability. Additional data products and tools continue to be added. These include both generalized products as well as polar data products specifically targeting potential sites for NASA's Resource Prospector mission as well as for missions being planned by NASA's international partners. The latest release of Mars Trek includes new tools and data products requested by NASA's Planetary Science Division to support site selection and analysis for Mars Human Landing Exploration Zone Sites. Also being given very high priority by NASA Headquarters is Mars Trek's use as a means to directly involve the public in upcoming missions, letting them explore the areas the agency is focusing upon, understand what makes these sites so fascinating, follow the selection process, and get caught up in the excitement of exploring Mars. Phobos Trek, the latest effort in the Solar System Treks suite, is being developed in coordination with the International Phobos/Deimos Landing Site Working Group, with landing site selection and analysis for JAXA's MMX mission as a primary driver.

  3. iTAG: Integrating a Cloud Based, Collaborative Animal Tracking Network into the GCOOS data portal in the Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, B. A.; Currier, R. D.; Simoniello, C.

    2016-02-01

    The tagging and tracking of aquatic animals using acoustic telemetry hardware has traditionally been the purview of individual researchers that specialize in single species. Concerns over data privacy and unauthorized use of receiver arrays have prevented the construction of large-scale, multi-species, multi-institution, multi-researcher collaborative acoustic arrays. We have developed a toolset to build the new portal using the Flask microframework, the Python language, and Twitter Bootstrap. Initial feedback has been overwhelmingly positive. The privacy policy has been praised for its granularity: principal investigators can choose among three levels of privacy for all data and hardware: (1) completely private, viewable only by the PI; (2) visible to iTAG members; or (3) visible to the general public. At the time of this writing iTAG is still in the beta stage, but the feedback received to date indicates that, with the proper design and security features and an iterative cycle of feedback from potential members, constructing a collaborative acoustic tracking network system is possible. Initial usage will be limited to the entry of and searching for 'orphan/mystery' tags, with the integration of historical array deployments and data following shortly thereafter. We have also been working with staff from the Ocean Tracking Network to allow for integration of the two systems. The database schema of iTAG is based on the marine metadata convention for acoustic telemetry. This should permit machine-to-machine data exchange between iTAG and OTN. The integration of animal telemetry data into the GCOOS portal will allow researchers to easily access the physicochemical oceanography data, thus allowing for a more in-depth understanding of animal response and usage patterns.

  4. Scalable Performance Measurement and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, Todd

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
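
    The first idea, wavelet compression of system-wide load-balance data, can be illustrated with PyWavelets: decompose a per-process load vector, keep only the largest coefficients, and reconstruct. This is a toy sketch with synthetic data, not Libra's implementation.

```python
# Toy illustration of lossy wavelet compression of a load-balance profile,
# using PyWavelets (not Libra's own implementation).
import numpy as np
import pywt

rng = np.random.default_rng(0)
n_procs = 1024
# Synthetic load-balance profile: smooth imbalance plus per-process noise.
load = (1.0 + 0.3 * np.sin(np.linspace(0, 4 * np.pi, n_procs))
        + 0.02 * rng.normal(size=n_procs))

coeffs = pywt.wavedec(load, "db4", level=5)
flat, slices = pywt.coeffs_to_array(coeffs)

# Keep the largest 5% of coefficients; zero the rest (lossy compression).
k = int(0.05 * flat.size)
threshold = np.sort(np.abs(flat))[-k]
flat_compressed = np.where(np.abs(flat) >= threshold, flat, 0.0)

recon = pywt.waverec(
    pywt.array_to_coeffs(flat_compressed, slices, output_format="wavedec"), "db4")
err = np.max(np.abs(recon[:n_procs] - load))
print(f"kept {k}/{flat.size} coefficients, max reconstruction error {err:.3f}")
```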

  5. Analysis of DIRAC's behavior using model checking with process algebra

    NASA Astrophysics Data System (ADS)

    Remenska, Daniela; Templon, Jeff; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Graciani Diaz, Ricardo; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof

    2012-12-01

    DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple; the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally, or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike conventional testing, it allows full control over the execution of the parallel processes, and supports exhaustive state-space exploration. We used the mCRL2 language and toolset to model the behavior of two related DIRAC subsystems: the workload and storage management systems. Based on process algebra, mCRL2 allows defining custom data types as well as functions over them. This makes it suitable for modeling the data manipulations made by DIRAC's agents. By visualizing the state space and replaying scenarios with the toolkit's simulator, we have detected race conditions and deadlocks in these systems, which, in several cases, were confirmed to occur in reality. Several properties of interest were formulated and verified with the tool. Our future direction is automating the translation from DIRAC to a formal model.

  6. Experiencing Soil Science from your office through virtual experiences

    NASA Astrophysics Data System (ADS)

    Beato, M. Carmen; González-Merino, Ramón; Campillo, M. Carmen; Fernández-Ahumada, Elvira; Ortiz, Leovigilda; Taguas, Encarnación V.; Guerrero, José Emilio

    2017-04-01

    Currently, numerous tools based on new information and communication technologies offer a wide range of possibilities for the implementation of interactive methodologies in education and science. In particular, virtual reality and immersive worlds - artificially generated computer environments where users interact through a figurative individual that represents them in that environment (their "avatar") - have been identified as a technology that will change the way we live, particularly in educational terms, product development and entertainment (Schmorrow, 2009). Gisbert-Cervera et al. (2011) consider that 3D worlds in education, among others, provide a unique training and knowledge-exchange environment which allows reflection on goals to support activities and achieve learning outcomes. In Soil Sciences, the experimental component is essential to acquire the knowledge necessary to understand the biogeochemical processes taking place and their interactions with time, climate, topography and the living organisms present. In this work, an immersive virtual environment which reproduces a series of soil pits has been developed to evaluate and differentiate soil characteristics such as texture, structure, consistency, color and other physical-chemical and biological properties for educational purposes. Bibliographical material such as pictures, books and papers was collected in order to classify the information needed and to build the soil profiles in the virtual environment. The virtual recreation was developed with Unreal Engine 4 (UE4; https://www.unrealengine.com/unreal-engine-4). This engine was chosen because it provides two toolsets for programmers that can be used in tandem to accelerate development workflows. In addition, Unreal Engine 4 technology powers hundreds of games as well as real-time 3D films, training simulations and visualizations, and it creates very realistic graphics. For the evaluation of its impact and usefulness in teaching, a series of surveys will be presented to undergraduate students and teachers. REFERENCES: Gisbert-Cervera M., Esteve-Gonzalez V., Camacho-Marti M.M. (2011). Delve into the Deep: Learning Potential in Metaverses and 3D Worlds. eLearning Papers (25). ISSN: 1887-1542. Schmorrow D.D. (2009). Why virtual? Theoretical Issues in Ergonomics Science 10(3): 279-282.

  7. Data Collection, Collaboration, Analysis, and Publication Using the Open Data Repository's (ODR) Data Publisher

    NASA Astrophysics Data System (ADS)

    Lafuente, B.; Stone, N.; Bristow, T.; Keller, R. M.; Blake, D. F.; Downs, R. T.; Pires, A.; Dateo, C. E.; Fonda, M.

    2017-12-01

    In development for nearly four years, the Open Data Repository's (ODR) Data Publisher software has become a useful tool for researchers' data needs. Data Publisher facilitates the creation of customized databases with flexible permission sets that allow researchers to share data collaboratively while improving data discovery and maintaining ownership rights. The open source software provides an end-to-end solution from collection to final repository publication. A web-based interface allows researchers to enter data, view data, and conduct analysis using any programming language supported by JupyterHub (http://www.jupyterhub.org). This toolset makes it possible for a researcher to store and manipulate their data in the cloud from any internet-capable device. Data can be embargoed in the system until a date selected by the researcher. For instance, open publication can be set to a date that coincides with publication of data analysis in a third-party journal. In conjunction with teams at NASA Ames and the University of Arizona, a number of pilot studies are being conducted to guide the software development so that it allows them to publish and share their data. These pilots include (1) the Astrobiology Habitable Environments Database (AHED), a central searchable repository designed to promote and facilitate the integration and sharing of all the data generated by the diverse disciplines in astrobiology; (2) a database containing the raw and derived data products from the CheMin instrument on the MSL rover Curiosity (http://odr.io/CheMin), featuring a versatile graphing system, instructions and analytical tools to process the data, and a capability to download data in different formats; and (3) the Mineral Evolution project, which, by correlating the diversity of mineral species with their ages, localities, and other measurable properties, aims to understand how episodes of planetary accretion and differentiation, plate tectonics, and the origin of life led to the selective evolution of mineral species through changes in temperature, pressure, and composition. Ongoing development will complete integration of third-party metadata standards and publishing data to the semantic web. This project is supported by the Science-Enabling Research Activity (SERA) and NASA NNX11AP82A, MSL.

  8. Strategy and gaps for modeling, simulation, and control of hybrid systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rabiti, Cristian; Garcia, Humberto E.; Hovsapian, Rob

    2015-04-01

    The purpose of this report is to establish a strategy for modeling and simulation of candidate hybrid energy systems. Modeling and simulation is necessary to design, evaluate, and optimize the system technical and economic performance. Accordingly, this report first establishes the simulation requirements to analyze candidate hybrid systems. Simulation fidelity levels are established based on the temporal scale, real and synthetic data availability or needs, solution accuracy, and output parameters needed to evaluate case-specific figures of merit. Accordingly, the associated computational and co-simulation resources needed are established, including physical models when needed, code assembly and integrated solution platforms, mathematical solvers, and data processing. This report first attempts to describe the figures of merit, systems requirements, and constraints that are necessary and sufficient to characterize the grid and hybrid systems behavior and market interactions. Loss of Load Probability (LOLP) and the Effective Cost of Energy (ECE), as opposed to the standard Levelized Cost of Electricity (LCOE), are introduced as technical and economic indices for integrated energy system evaluations. Financial assessment methods are subsequently introduced for evaluation of non-traditional, hybrid energy systems. Algorithms for coupled and iterative evaluation of the technical and economic performance are subsequently discussed. This report further defines modeling objectives, computational tools, solution approaches, and real-time data collection and processing (in some cases using real test units) that will be required to model, co-simulate, and optimize: (a) energy system components (e.g., power generation unit, chemical process, electricity management unit), (b) system domains (e.g., thermal, electrical or chemical energy generation, conversion, and transport), and (c) systems control modules. Co-simulation of complex, tightly coupled, dynamic energy systems requires multiple simulation tools, potentially developed in several programming languages and resolved on separate time scales. Whereas further investigation and development of hybrid concepts will provide a more complete understanding of the joint computational and physical modeling needs, this report highlights areas in which co-simulation capabilities are warranted. The current development status, quality assurance, availability and maintainability of simulation tools that are currently available for hybrid systems modeling are presented. Existing gaps in the modeling and simulation toolsets and development needs are subsequently discussed. This effort will feed into a broader Roadmap activity for designing, developing, and demonstrating hybrid energy systems.

  9. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported, centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools for space physics post-processing. Building on the work of the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The tool-kit plugin currently provides readers for LFM and Enlil data sets, along with automated tools for data comparison against NASA's CDAweb database. As work progresses, many additional tools will be added; through open-source collaboration, we hope to add readers for additional model types, as well as any other tools deemed necessary by the scientific community. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  10. DMRT3 is associated with gait type in Mangalarga Marchador horses, but does not control gait ability.

    PubMed

    Patterson, L; Staiger, E A; Brooks, S A

    2015-04-01

    The Mangalarga Marchador (MM) is a Brazilian horse breed known for a uniquely smooth gait. A recent publication described a mutation in the DMRT3 gene that the authors claim controls the ability to perform lateral patterned gaits (Andersson et al. 2012). We tested 81 MM samples for the DMRT3 mutation using DNA extracted from hair bulbs and a novel RFLP assay. Horses were phenotypically categorized by their gait type (batida or picada), as recorded by the Brazilian Mangalarga Marchador Breeders Association (ABCCMM). Statistical analysis using the plink toolset (Purcell, 2007) revealed a significant association between gait type and the DMRT3 mutation (P = 2.3e-22). Deviation from Hardy-Weinberg equilibrium suggests that selective pressure for gait type is altering allele frequencies in this breed (P = 1.00e-5). These results indicate that this polymorphism may be useful for genotype-assisted selection for gait type within this breed. As both batida and picada MM horses can perform lateral gaits, the DMRT3 mutation is not the only locus responsible for the lateral gait pattern. © 2015 Stichting International Foundation for Animal Genetics.
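
    The Hardy-Weinberg test reported above can be illustrated with a small sketch: for a biallelic marker such as the DMRT3 variant, expected genotype counts under equilibrium follow from the allele frequencies and can be compared with a chi-square statistic. The counts below are made-up illustrative numbers, not the study's data, and this is not the plink implementation.

        # Chi-square test for deviation from Hardy-Weinberg equilibrium at a
        # biallelic locus. Genotype counts are illustrative placeholders only.
        from scipy.stats import chi2

        n_AA, n_Aa, n_aa = 30, 20, 31          # hypothetical genotype counts
        n = n_AA + n_Aa + n_aa
        p = (2 * n_AA + n_Aa) / (2 * n)        # frequency of allele A
        q = 1.0 - p

        expected = [p * p * n, 2 * p * q * n, q * q * n]
        observed = [n_AA, n_Aa, n_aa]

        chi_sq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
        p_value = chi2.sf(chi_sq, df=1)        # 1 degree of freedom for a biallelic HWE test
        print(f"chi-square = {chi_sq:.3f}, P = {p_value:.3g}")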

  11. The XSD-Builder Specification Language—Toward a Semantic View of XML Schema Definition

    NASA Astrophysics Data System (ADS)

    Fong, Joseph; Cheung, San Kuen

    In the present database market, the XML database model is a main structure for forthcoming database systems in the Internet environment. As a conceptual schema of an XML database, the XML model has limitations in presenting its data semantics, and system analysts have had no toolset for modeling and analyzing XML systems. We apply the XML Tree Model (shown in Figure 2) as a conceptual schema of an XML database to model and analyze the structure of an XML database. It is important not only for visualizing, specifying, and documenting structural models, but also for constructing executable systems. The tree model represents the inter-relationships among elements inside different logical schemas such as XML Schema Definition (XSD), DTD, Schematron, XDR, SOX, and DSD (shown in Figure 1; an explanation of the terms in the figure is given in Table 1). The XSD-Builder consists of the XML Tree Model, a source language, a translator, and XSD. The source language, called XSD-Source, is mainly intended to provide a user-friendly environment for writing an XSD. The source language is then translated by XSD-Translator, whose output is an XSD; this is our target and is called the object language.

  12. California mussels (Mytilus californianus) as sentinels for marine contamination with Sarcocystis neurona.

    PubMed

    Michaels, Lauren; Rejmanek, Daniel; Aguilar, Beatriz; Conrad, Patricia; Shapiro, Karen

    2016-05-01

    Sarcocystis neurona is a terrestrial parasite that can cause fatal encephalitis in the endangered Southern sea otter (Enhydra lutris nereis). To date, neither risk factors associated with marine contamination nor the route of S. neurona infection to marine mammals has been described. This study evaluated coastal S. neurona contamination using California mussels (Mytilus californianus) as sentinels for pathogen pollution. A field investigation was designed to test the hypotheses that (1) mussels can serve as sentinels for S. neurona contamination, and (2) S. neurona contamination in mussels would be highest during the rainy season and in mussels collected near freshwater. Initial validation of molecular assays through sporocyst spiking experiments revealed the ITS-1500 assay to be most sensitive for detection of S. neurona, consistently yielding parasite amplification at concentrations ⩾5 sporocysts per 1 mL of mussel haemolymph. Assays were then applied to 959 wild-caught mussels, with detection of S. neurona confirmed using sequence analysis in three mussels. Validated molecular assays for S. neurona detection in mussels provide a novel toolset for investigating marine contamination with this parasite, while confirmation of S. neurona in wild mussels suggests that uptake by invertebrates may serve as a route of transmission to susceptible marine animals.

  13. Fiberless multicolor neural optoelectrode for in vivo circuit analysis.

    PubMed

    Kampasi, Komal; Stark, Eran; Seymour, John; Na, Kyounghwan; Winful, Herbert G; Buzsáki, György; Wise, Kensall D; Yoon, Euisik

    2016-08-03

    Maximizing the potential of optogenetic approaches in deep brain structures of intact animals requires optical manipulation of neurons at high spatial and temporal resolutions, while simultaneously recording electrical data from those neurons. Here, we present the first fiber-less optoelectrode with a monolithically integrated optical waveguide mixer that can deliver multicolor light at a common waveguide port to achieve multicolor modulation of the same neuronal population in vivo. We demonstrate successful device implementation by achieving efficient coupling between a side-emitting injection laser diode (ILD) and a dielectric optical waveguide mixer via a gradient-index (GRIN) lens. The use of GRIN lenses attains several design features, including high optical coupling and thermal isolation between ILDs and waveguides. We validated the packaged devices in the intact brain of anesthetized mice co-expressing Channelrhodopsin-2 and Archaerhodopsin in pyramidal cells in the hippocampal CA1 region, achieving high quality recording, activation and silencing of the exact same neurons in a given local region. This fully-integrated approach demonstrates the spatial precision and scalability needed to enable independent activation and silencing of the same or different groups of neurons in dense brain regions while simultaneously recording from them, thus considerably advancing the capabilities of currently available optogenetic toolsets.

  14. Setaria Comes of Age: Meeting Report on the Second International Setaria Genetics Conference

    DOE PAGES

    Zhu, Chuanmei; Yang, Jiani; Shyu, Christine

    2017-09-28

    Setaria viridis is an emerging model for cereal and bioenergy grasses because of its short stature, rapid life cycle and expanding genetic and genomic toolkits. Its close phylogenetic relationship with economically important crops such as maize and sorghum positions Setaria as an ideal model system for accelerating discovery and characterization of crop genes that control agronomically important traits. The Second International Setaria Genetics Conference was held on March 6–8, 2017 at the Donald Danforth Plant Science Center, St. Louis, MO, United States to discuss recent technological breakthroughs and research directions in Setaria (presentation abstracts can be downloaded at https://www.brutnelllab.org/setaria). Here, we highlight topics presented in the conference including inflorescence architecture, C4 photosynthesis and abiotic stress. Genetic and genomic toolsets including germplasm, mutant populations, transformation and gene editing technologies are also discussed. Since the last meeting in 2014, the Setaria community has matured greatly in the quality of research being conducted. Outreach and increased communication with maize and other plant communities will allow broader adoption of Setaria as a model system to translate fundamental discovery research to crop improvement.

  15. Personalizing Protein Nourishment

    PubMed Central

    DALLAS, DAVID C.; SANCTUARY, MEGAN R.; QU, YUNYAO; KHAJAVI, SHABNAM HAGHIGHAT; VAN ZANDT, ALEXANDRIA E.; DYANDRA, MELISSA; FRESE, STEVEN A.; BARILE, DANIELA; GERMAN, J. BRUCE

    2016-01-01

    Proteins are not equally digestible—their proteolytic susceptibility varies by their source and processing method. Incomplete digestion increases colonic microbial protein fermentation (putrefaction), which produces toxic metabolites that can induce inflammation in vitro and have been associated with inflammation in vivo. Individual humans differ in protein digestive capacity based on phenotypes, particularly disease states. To avoid putrefaction-induced intestinal inflammation, protein sources and processing methods must be tailored to the consumer's digestive capacity. This review explores how food processing techniques alter protein digestibility and examines how physiological conditions alter digestive capacity. Possible solutions to improving digestive function or matching low digestive capacity with more digestible protein sources are explored. Beyond ileal measurements of protein digestibility, less invasive, quicker, and cheaper techniques for monitoring the extent of protein digestion and fermentation are needed to personalize protein nourishment. Biomarkers of protein digestive capacity and efficiency can be identified with the toolsets of peptidomics, metabolomics, microbial sequencing and multiplexed protein analysis of fecal and urine samples. By monitoring individual protein digestive function, the protein component of diets can be tailored via protein source and processing selection to match individual needs, minimizing colonic putrefaction and, thus, optimizing gut health. PMID:26713355

  16. Spin dynamics and Kondo physics in optical tweezers

    NASA Astrophysics Data System (ADS)

    Lin, Yiheng; Lester, Brian J.; Brown, Mark O.; Kaufman, Adam M.; Long, Junling; Ball, Randall J.; Isaev, Leonid; Wall, Michael L.; Rey, Ana Maria; Regal, Cindy A.

    2016-05-01

    We propose to use optical tweezers as a toolset for direct observation of the interplay between quantum statistics, kinetic energy and interactions, and thus implement minimum instances of the Kondo lattice model in systems with few bosonic rubidium atoms. By taking advantage of strong local exchange interactions, our ability to tune the spin-dependent potential shifts between the two wells, and complete control over spin and motional degrees of freedom, we design an adiabatic tunneling scheme that efficiently creates a spin-singlet state in one well starting from two initially separated atoms (one atom per tweezer) in opposite spin states. For three atoms in a double-well, two localized in the lowest vibrational mode of each tweezer and one atom in an excited delocalized state, we plan to use similar techniques and observe resonant transfer of two-atom singlet-triplet states between the wells in the regime where the exchange coupling exceeds the mobile atom hopping. Moreover, we argue that such three-atom double-tweezers could potentially be used for quantum computation by encoding logical qubits in collective spin and motional degrees of freedom.
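
    For context, the Kondo lattice model that the tweezer arrays are meant to emulate in minimal form couples itinerant particles to localized spins through a local exchange term; the form below is the textbook Hamiltonian, not a parameterization specific to this proposal.

        H = -t \sum_{\langle i,j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
            + J \sum_{i} \mathbf{S}_i \cdot \mathbf{s}_i ,

    where $c^{\dagger}_{i\sigma}$ creates a mobile particle with spin $\sigma$ on site $i$, $\mathbf{s}_i$ is its local spin density, $\mathbf{S}_i$ is the localized spin on that site, $t$ is the hopping amplitude, and $J$ is the local exchange coupling.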

  17. OASIS: a data and software distribution service for Open Science Grid

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Caballero Bejar, J.; De Stefano, J.; Hover, J.; Quick, R.; Teige, S.

    2014-06-01

    The Open Science Grid encourages the concept of software portability: a user's scientific application should be able to run at as many sites as possible. It is necessary to provide a mechanism for OSG Virtual Organizations to install software at sites. Since its initial release, the OSG Compute Element has provided an application software installation directory to Virtual Organizations, where they can create their own sub-directory, install software into that sub-directory, and have the directory shared on the worker nodes at that site. The current model has shortcomings with regard to permissions, policies, versioning, and the lack of a unified, collective procedure or toolset for deploying software across all sites. Therefore, a new mechanism for data and software distribution is desirable. The architecture for the OSG Application Software Installation Service (OASIS) is a server-client model: the software and data are installed only once in a single place, and are automatically distributed to all client sites simultaneously. Central file distribution offers other advantages, including server-side authentication and authorization, activity records, quota management, data validation and inspection, and well-defined versioning and deletion policies. The architecture, as well as a complete analysis of the current implementation, is described in this paper.

  18. Validating the simulation of large-scale parallel applications using statistical characteristics

    DOE PAGES

    Zhang, Deli; Wilke, Jeremiah; Hendry, Gilbert; ...

    2016-03-01

    Simulation is a widely adopted method to analyze and predict the performance of large-scale parallel applications. Validating the hardware model is highly important for complex simulations with a large number of parameters. Common practice involves calculating the percent error between the projected and the real execution time of a benchmark program. However, in a high-dimensional parameter space, this coarse-grained approach often suffers from parameter insensitivity, which may not be known a priori. Moreover, the traditional approach cannot be applied to the validation of software models, such as application skeletons used in online simulations. In this work, we present a methodology and a toolset for validating both hardware and software models by quantitatively comparing fine-grained statistical characteristics obtained from execution traces. Although statistical information has been used in tasks like performance optimization, this is the first attempt to apply it to simulation validation. Lastly, our experimental results show that the proposed evaluation approach offers significant improvement in fidelity when compared to evaluation using total execution time, and the proposed metrics serve as reliable criteria that progress toward automating the simulation tuning process.
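
    A minimal sketch of the idea of comparing fine-grained statistical characteristics rather than a single total-runtime error is given below. The trace quantity (per-message latency), the synthetic distributions, and the use of a Kolmogorov-Smirnov distance are illustrative assumptions, not the specific metrics defined in the paper.

        # Contrast a coarse validation metric (percent error of total execution time)
        # with a fine-grained one (distance between distributions of a per-event
        # quantity extracted from execution traces). All data here are synthetic.
        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(0)
        real_latencies = rng.lognormal(mean=1.0, sigma=0.4, size=5000)   # "measured" trace
        sim_latencies = rng.lognormal(mean=1.0, sigma=0.6, size=5000)    # "simulated" trace

        # Coarse metric: totals can agree even when the underlying behavior differs.
        total_error = abs(sim_latencies.sum() - real_latencies.sum()) / real_latencies.sum()

        # Fine-grained metric: compare the two latency distributions directly.
        ks_stat, p_value = ks_2samp(real_latencies, sim_latencies)

        print(f"percent error of total time: {100 * total_error:.2f}%")
        print(f"KS distance between latency distributions: {ks_stat:.3f} (p = {p_value:.2g})")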

  19. Setaria Comes of Age: Meeting Report on the Second International Setaria Genetics Conference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Chuanmei; Yang, Jiani; Shyu, Christine

    Setaria viridis is an emerging model for cereal and bioenergy grasses because of its short stature, rapid life cycle and expanding genetic and genomic toolkits. Its close phylogenetic relationship with economically important crops such as maize and sorghum positions Setaria as an ideal model system for accelerating discovery and characterization of crop genes that control agronomically important traits. The Second International Setaria Genetics Conference was held on March 6–8, 2017 at the Donald Danforth Plant Science Center, St. Louis, MO, United States to discuss recent technological breakthroughs and research directions in Setaria (presentation abstracts can be downloaded at https://www.brutnelllab.org/setaria). Here, we highlight topics presented in the conference including inflorescence architecture, C4 photosynthesis and abiotic stress. Genetic and genomic toolsets including germplasm, mutant populations, transformation and gene editing technologies are also discussed. Since the last meeting in 2014, the Setaria community has matured greatly in the quality of research being conducted. Outreach and increased communication with maize and other plant communities will allow broader adoption of Setaria as a model system to translate fundamental discovery research to crop improvement.

  20. Targeted Proteomics-Driven Computational Modeling of Macrophage S1P Chemosensing*

    PubMed Central

    Manes, Nathan P.; Angermann, Bastian R.; Koppenol-Raab, Marijke; An, Eunkyung; Sjoelund, Virginie H.; Sun, Jing; Ishii, Masaru; Germain, Ronald N.; Meier-Schellersheim, Martin; Nita-Lazar, Aleksandra

    2015-01-01

    Osteoclasts are monocyte-derived multinuclear cells that directly attach to and resorb bone. Sphingosine-1-phosphate (S1P) regulates bone resorption by functioning as both a chemoattractant and chemorepellent of osteoclast precursors through two G-protein coupled receptors that antagonize each other in an S1P-concentration-dependent manner. To quantitatively explore the behavior of this chemosensing pathway, we applied targeted proteomics, transcriptomics, and rule-based pathway modeling using the Simmune toolset. RAW264.7 cells (a mouse monocyte/macrophage cell line) were used as model osteoclast precursors, RNA-seq was used to identify expressed target proteins, and selected reaction monitoring (SRM) mass spectrometry using internal peptide standards was used to perform absolute abundance measurements of pathway proteins. The resulting transcript and protein abundance values were strongly correlated. Measured protein abundance values, used as simulation input parameters, led to in silico pathway behavior matching in vitro measurements. Moreover, once model parameters were established, even simulated responses toward stimuli that were not used for parameterization were consistent with experimental findings. These findings demonstrate the feasibility and value of combining targeted mass spectrometry with pathway modeling for advancing biological insight. PMID:26199343

  1. Accumulation of High-Value Lipids in Single-Cell Microorganisms: A Mechanistic Approach and Future Perspectives

    PubMed Central

    2015-01-01

    In recent years attention has been focused on the utilization of microorganisms as alternatives for industrial and nutritional applications. Considerable research has been devoted to techniques for growth, extraction, and purification of high-value lipids for their use as biofuels and biosurfactants as well as high-value metabolites for nutrition and health. These successes argue that the elucidation of the mechanisms underlying the microbial biosynthesis of such molecules, which are far from being completely understood, now will yield spectacular opportunities for industrial scale biomolecular production. There are important additional questions to be solved to optimize the processing strategies to take advantage of the assets of microbial lipids. The present review describes the current state of knowledge regarding lipid biosynthesis, accumulation, and transport mechanisms present in single-cell organisms, specifically yeasts, microalgae, bacteria, and archaea. Similarities and differences in biochemical pathways and strategies of different microorganisms provide a diverse toolset for the expansion of biotechnologies for lipid production. This paper is intended to inspire a generation of lipid scientists to insights that will drive the biotechnologies of microbial production as uniquely enabling players of lipid biotherapeutics, biofuels, biomaterials, and other opportunity areas into the 21st century. PMID:24628496

  2. Formulating a subgrid-scale breakup model for microbubble generation from interfacial collisions

    NASA Astrophysics Data System (ADS)

    Chan, Wai Hong Ronald; Mirjalili, Shahab; Urzay, Javier; Mani, Ali; Moin, Parviz

    2017-11-01

    Multiphase flows often involve impact events that engender important effects like the generation of a myriad of tiny bubbles that are subsequently transported in large liquid bodies. These impact events are created by large-scale phenomena like breaking waves on ocean surfaces, and often involve the relative approach of liquid surfaces. This relative motion generates continuously shrinking length scales as the entrapped gas layer thins and eventually breaks up into microbubbles. The treatment of this disparity in length scales is computationally challenging. In this presentation, a framework is presented that addresses a subgrid-scale (SGS) model aimed at capturing the process of microbubble generation. This work sets up the components in an overarching volume-of-fluid (VoF) toolset and investigates the analytical foundations of an SGS model for describing the breakup of a thin air film trapped between two approaching water bodies in a physical regime corresponding to Mesler entrainment. Constituents of the SGS model, such as the identification of impact events and the accurate computation of the local characteristic curvature in a VoF-based architecture, and the treatment of the air layer breakup, are discussed and illustrated in simplified scenarios. Supported by Office of Naval Research (ONR)/A*STAR (Singapore).

  3. Accumulation of high-value lipids in single-cell microorganisms: a mechanistic approach and future perspectives.

    PubMed

    Garay, Luis A; Boundy-Mills, Kyria L; German, J Bruce

    2014-04-02

    In recent years attention has been focused on the utilization of microorganisms as alternatives for industrial and nutritional applications. Considerable research has been devoted to techniques for growth, extraction, and purification of high-value lipids for their use as biofuels and biosurfactants as well as high-value metabolites for nutrition and health. These successes argue that the elucidation of the mechanisms underlying the microbial biosynthesis of such molecules, which are far from being completely understood, now will yield spectacular opportunities for industrial scale biomolecular production. There are important additional questions to be solved to optimize the processing strategies to take advantage of the assets of microbial lipids. The present review describes the current state of knowledge regarding lipid biosynthesis, accumulation, and transport mechanisms present in single-cell organisms, specifically yeasts, microalgae, bacteria, and archaea. Similarities and differences in biochemical pathways and strategies of different microorganisms provide a diverse toolset for the expansion of biotechnologies for lipid production. This paper is intended to inspire a generation of lipid scientists to insights that will drive the biotechnologies of microbial production as uniquely enabling players of lipid biotherapeutics, biofuels, biomaterials, and other opportunity areas into the 21st century.

  4. Nutritional Lipidomics: Molecular Metabolism, Analytics, and Diagnostics

    PubMed Central

    Smilowitz, Jennifer T.; Zivkovic, Angela M.; Wan, Yu-Jui Yvonne; Watkins, Steve M.; Nording, Malin L.; Hammock, Bruce D.; German, J. Bruce

    2013-01-01

    The field of lipidomics is providing nutritional science with a more comprehensive view of lipid intermediates. Lipidomics research takes advantage of the increased accuracy and sensitivity of mass detection by mass spectrometry, combined with new bioinformatics toolsets, to characterize the structures and abundances of complex lipids. Yet, translating lipidomics to practice via nutritional interventions is still in its infancy. No single instrumentation platform is able to solve the varying analytical challenges of the different molecular lipid species. Biochemical pathways of lipid metabolism remain incomplete and the tools to map lipid compositional data to pathways are still being assembled. Biology itself is dauntingly complex and simply separating biological structures remains a key challenge to lipidomics. Nonetheless, the strategy of combining tandem analytical methods to perform the sensitive, high-throughput, quantitative and comprehensive analysis of lipid metabolites of very large numbers of molecules is poised to drive the field forward rapidly. Among the next steps for nutrition to understand the changes in structures, compositions and function of lipid biomolecules in response to diet is to describe their distribution within discrete functional compartments: lipoproteins. Additionally, lipidomics must tackle the task of assigning the functions of lipids as signaling molecules, nutrient sensors, and intermediates of metabolic pathways. PMID:23818328

  5. Setaria Comes of Age: Meeting Report on the Second International Setaria Genetics Conference

    PubMed Central

    Zhu, Chuanmei; Yang, Jiani; Shyu, Christine

    2017-01-01

    Setaria viridis is an emerging model for cereal and bioenergy grasses because of its short stature, rapid life cycle and expanding genetic and genomic toolkits. Its close phylogenetic relationship with economically important crops such as maize and sorghum positions Setaria as an ideal model system for accelerating discovery and characterization of crop genes that control agronomically important traits. The Second International Setaria Genetics Conference was held on March 6–8, 2017 at the Donald Danforth Plant Science Center, St. Louis, MO, United States to discuss recent technological breakthroughs and research directions in Setaria (presentation abstracts can be downloaded at https://www.brutnelllab.org/setaria). Here, we highlight topics presented in the conference including inflorescence architecture, C4 photosynthesis and abiotic stress. Genetic and genomic toolsets including germplasm, mutant populations, transformation and gene editing technologies are also discussed. Since the last meeting in 2014, the Setaria community has matured greatly in the quality of research being conducted. Outreach and increased communication with maize and other plant communities will allow broader adoption of Setaria as a model system to translate fundamental discovery research to crop improvement. PMID:29033954

  6. Final Report: Phase II Nevada Water Resources Data, Modeling, and Visualization (DMV) Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackman, Thomas; Minor, Timothy; Pohll, Gregory

    2013-07-22

    Water is unquestionably a critical resource throughout the United States. In the semi-arid west -- an area stressed by increase in human population and sprawl of the built environment -- water is the most important limiting resource. Crucially, science must understand factors that affect availability and distribution of water. To sustain growing consumptive demand, science needs to translate understanding into reliable and robust predictions of availability under weather conditions that could be average but might be extreme. These predictions are needed to support current and long-term planning. Similar to the role of weather forecasts and climate prediction, water prediction over short and long temporal scales can contribute to resource strategy, governmental policy, and municipal infrastructure decisions, which are arguably tied to the natural variability and unnatural change of climate. Changes in seasonal and annual temperature, precipitation, snowmelt, and runoff affect the distribution of water over large temporal and spatial scales, which impacts the risk of flooding and groundwater recharge. Anthropogenic influences and impacts increase the complexity and urgency of the challenge. The goal of this project has been to develop a decision support framework of data acquisition, digital modeling, and 3D visualization. This integrated framework consists of tools for compiling, discovering, and projecting our understanding of processes that control the availability and distribution of water. The framework is intended to support the analysis of the complex interactions between processes that affect water supply, from controlled availability to either scarcity or deluge. The developed framework enables DRI to promote excellence in water resource management, particularly within the Lake Tahoe basin. In principle, this framework could be replicated for other watersheds throughout the United States. Phase II of this project builds upon the research conducted during Phase I, in which the hydrologic framework was investigated and the development initiated. Phase II concentrates on practical implementation of the earlier work but emphasizes applications to the hydrology of the Lake Tahoe basin. Phase I efforts have been refined and extended by creating a toolset for geographic information systems (GIS) that is usable for disparate types of geospatial and geo-referenced data. The toolset is intended to serve multiple users for a variety of applications. The web portal for internet access to hydrologic and remotely sensed product data, prototyped in Phase I, has been significantly enhanced. The portal provides high performance access to LANDSAT-derived data using techniques developed during the course of the project. The portal is interactive, and supports the geo-referenced display of hydrologic information derived from remotely sensed data, such as various vegetative indices used to calculate water consumption. The platform can serve both internal and external constituencies using inter-operating infrastructure that spans both sides of the DRI firewall. The platform is intended to grow its supported data assets and to serve as a template for replication to other geographic areas. An unanticipated development during the project was the use of ArcGIS software on a new computer system, IBM PureSystems, and the parallel use of the systems for faster, more efficient image processing.
    Additional data, independent of the portal, were collected within the Sagehen basin and provide detailed information regarding the processes that control hydrologic responses within mountain watersheds. The newly collected data include elevation, evapotranspiration, energy balance, and remotely sensed snow-pack data. A Lake Tahoe basin hydrologic model has been developed, in part to help predict the hydrologic impacts of climate change. The model couples both the surface and subsurface hydrology, with the two components having been independently calibrated. Results from the coupled simulations involving both surface water and groundwater processes show that it is possible to simulate fairly accurately lake effects and water budget variables over a wide range of dry and wet cycles in the historical record. The Lake Tahoe basin is representative of the hydrology, topography, and climate throughout the Sierra Nevada Range, and the entire model development is prototypical of the efforts required to replicate the decision support framework in other locales. The Lake Tahoe model in particular could allow water managers to evaluate more accurately components of the water budget (ET, runoff, groundwater, etc.) and to answer important questions regarding water resources in northern Nevada. This report discusses the geographic scale and the hydrologic complexity of the calibrated model developed as part of this project, as well as simulation results for historical and future climate projections. To enable human-driven data exploration and discovery, de novo software for a globalized rendering module that extends the capability of our evolving custom visualization engine from Phase I (called SMEngine) has been developed. The new rendering component, called Horizon, supports terrain rendering capable of displaying and interrogating both remotely sensed and modeled data. The development of Horizon necessitated adaptation of the visualization engine to allow extensible integration of components such as the global rendering module and support for associated features. The resulting software is general in its GIS capability, but a specific Lake Tahoe visualization application suitable for immersive decision support in the DRIVE6 virtual reality facility has been developed. During the development, various features to enhance the value of the visualization experience were explored, including the use of hyperspectral image overlays. An over-arching goal of the visualization aspect of the project has been to develop and demonstrate the CAVE (CAVE Automatic Virtual Environment) as a practical tool for hydrologic research.

  7. Triethylene Glycol Up-Regulates Virulence-Associated Genes and Proteins in Streptococcus mutans.

    PubMed

    Sadeghinejad, Lida; Cvitkovitch, Dennis G; Siqueira, Walter L; Santerre, J Paul; Finer, Yoav

    2016-01-01

    Triethylene glycol dimethacrylate (TEGDMA) is a diluent monomer used pervasively in dental composite resins. Through hydrolytic degradation of the composites in the oral cavity it yields a hydrophilic biodegradation product, triethylene glycol (TEG), which has been shown to promote the growth of Streptococcus mutans, a dominant cariogenic bacterium. Previously it was shown that TEG up-regulated gtfB, an important gene contributing to polysaccharide synthesis function in biofilms. However, the molecular mechanisms underlying TEG's effect on bacterial function remained poorly understood. In the present study, S. mutans UA159 was incubated with clinically relevant concentrations of TEG at pH 5.5 and 7.0. Quantitative real-time PCR, proteomics analysis, and glucosyltransferase enzyme (GTF) activity measurements were employed to identify the bacterial phenotypic response to TEG. An S. mutans isogenic mutant of vicK (SMΔvicK1), an important regulatory gene for biofilm-associated genes, and its complemented strain (SMΔvicK1C) were used to determine whether this signaling pathway was involved in modulation of the S. mutans virulence-associated genes. Extracted proteins from S. mutans biofilms grown in the presence and absence of TEG were subjected to mass spectrometry for protein identification, characterization, and quantification. TEG up-regulated gtfB/C, gbpB, comC, comD, and comE, most markedly in biofilms at cariogenic pH (5.5) and at defined concentrations. The differential responses of the vicK knock-out (SMΔvicK1) and complemented (SMΔvicK1C) strains implicated this signalling pathway in TEG-modulated cellular responses. TEG increased GTF enzyme activity, which is responsible for synthesizing the insoluble glucans involved in the formation of cariogenic biofilms. TEG also increased the abundance of proteins related to biofilm formation, carbohydrate transport, acid tolerance, and stress response. The proteomics data were consistent with the gene expression findings for the selected genes. These findings demonstrate a mechanistic pathway by which TEG, derived from commercial resin materials in the oral cavity, promotes S. mutans pathogenicity, which is typically associated with secondary caries.

  8. Triethylene Glycol Up-Regulates Virulence-Associated Genes and Proteins in Streptococcus mutans

    PubMed Central

    Sadeghinejad, Lida; Cvitkovitch, Dennis G.; Siqueira, Walter L.; Santerre, J. Paul; Finer, Yoav

    2016-01-01

    Triethylene glycol dimethacrylate (TEGDMA) is a diluent monomer used pervasively in dental composite resins. Through hydrolytic degradation of the composites in the oral cavity it yields a hydrophilic biodegradation product, triethylene glycol (TEG), which has been shown to promote the growth of Streptococcus mutans, a dominant cariogenic bacterium. Previously it was shown that TEG up-regulated gtfB, an important gene contributing to polysaccharide synthesis function in biofilms. However, the molecular mechanisms underlying TEG's effect on bacterial function remained poorly understood. In the present study, S. mutans UA159 was incubated with clinically relevant concentrations of TEG at pH 5.5 and 7.0. Quantitative real-time PCR, proteomics analysis, and glucosyltransferase enzyme (GTF) activity measurements were employed to identify the bacterial phenotypic response to TEG. An S. mutans isogenic mutant of vicK (SMΔvicK1), an important regulatory gene for biofilm-associated genes, and its complemented strain (SMΔvicK1C) were used to determine whether this signaling pathway was involved in modulation of the S. mutans virulence-associated genes. Extracted proteins from S. mutans biofilms grown in the presence and absence of TEG were subjected to mass spectrometry for protein identification, characterization, and quantification. TEG up-regulated gtfB/C, gbpB, comC, comD, and comE, most markedly in biofilms at cariogenic pH (5.5) and at defined concentrations. The differential responses of the vicK knock-out (SMΔvicK1) and complemented (SMΔvicK1C) strains implicated this signalling pathway in TEG-modulated cellular responses. TEG increased GTF enzyme activity, which is responsible for synthesizing the insoluble glucans involved in the formation of cariogenic biofilms. TEG also increased the abundance of proteins related to biofilm formation, carbohydrate transport, acid tolerance, and stress response. The proteomics data were consistent with the gene expression findings for the selected genes. These findings demonstrate a mechanistic pathway by which TEG, derived from commercial resin materials in the oral cavity, promotes S. mutans pathogenicity, which is typically associated with secondary caries. PMID:27820867

  9. WEclMon - A simple and robust camera-based system to monitor Drosophila eclosion under optogenetic manipulation and natural conditions.

    PubMed

    Ruf, Franziska; Fraunholz, Martin; Öchsner, Konrad; Kaderschabek, Johann; Wegener, Christian

    2017-01-01

    Eclosion in flies and other insects is a circadian-gated behaviour under control of a central and a peripheral clock. It is not influenced by the motivational state of an animal, and thus presents an ideal paradigm to study the relation and signalling pathways between central and peripheral clocks, and downstream peptidergic regulatory systems. Little is known, however, about eclosion rhythmicity under natural conditions, and research into this direction is hampered by the physically closed design of current eclosion monitoring systems. We describe a novel open eclosion monitoring system (WEclMon) that allows the puparia to come into direct contact with light, temperature and humidity. We demonstrate that the system can be used both in the laboratory and outdoors, and shows a performance similar to commercial closed funnel-type monitors. Data analysis is semi-automated based on a macro toolset for the open imaging software Fiji. Due to its open design, the WEclMon is also well suited for optogenetic experiments. A small screen to identify putative neuroendocrine signals mediating time from the central clock to initiate eclosion showed that optogenetic activation of ETH-, EH and myosuppressin neurons can induce precocious eclosion. Genetic ablation of myosuppressin-expressing neurons did, however, not affect eclosion rhythmicity.

  10. Application of phyto-indication and radiocesium indicative methods for microrelief mapping

    NASA Astrophysics Data System (ADS)

    Panidi, E.; Trofimetz, L.; Sokolova, J.

    2016-04-01

    Remote sensing technologies are widely used for production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis. One of the broadly used applications of these technologies and techniques is relief mapping. In the simplest case, we can identify relief structures using DEM analysis, and produce a map or map series to show the relief condition. However, traditional techniques might fail when used for mapping microrelief structures (structures below ten meters in size). In this case high microrelief dynamics lead to technological and conceptual difficulties. Moreover, erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis only. In our study, we investigate the possibilities and specific techniques for allocation of erosion microrelief structures, and mapping techniques for the microrelief derivatives (e.g. quantitative parameters of microrelief). Our toolset includes the analysis of spatial redistribution of the soil pollutants and phyto-indication analysis, which complement the common DEM modelling and geomorphometric analysis. We use field surveys produced at the test area, which is arable territory with high erosion risks. Our main conclusion at the current stage is that the indicative methods (i.e. radiocesium and phyto-indication methods) are effective for allocation of the erosion microrelief structures. Also, these methods need to be formalized for convenient use.
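
    As a sketch of the geomorphometric side of the workflow described above, the snippet below derives slope and a simple curvature proxy from a gridded DEM with finite differences. The synthetic surface, the grid spacing, and the Laplacian-as-curvature shortcut are illustrative assumptions, not the specific microrelief indicators used in the study.

        # Derive slope and a curvature proxy from a gridded DEM using finite differences.
        # The DEM here is a synthetic surface; replace it with a real elevation grid.
        import numpy as np

        cell_size = 1.0                                   # assumed grid spacing in metres
        x, y = np.meshgrid(np.arange(200), np.arange(200))
        dem = 0.05 * x + 2.0 * np.sin(x / 15.0) * np.cos(y / 20.0)   # synthetic elevations

        # First derivatives give the slope (in degrees).
        dz_dy, dz_dx = np.gradient(dem, cell_size)
        slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

        # The Laplacian of the surface is a common, simple proxy for curvature:
        # negative values suggest ridges, positive values suggest hollows/channels.
        d2z_dy2 = np.gradient(dz_dy, cell_size, axis=0)
        d2z_dx2 = np.gradient(dz_dx, cell_size, axis=1)
        curvature = d2z_dx2 + d2z_dy2

        print("mean slope (deg):", slope_deg.mean())
        print("curvature range:", curvature.min(), curvature.max())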

  11. WEclMon – A simple and robust camera-based system to monitor Drosophila eclosion under optogenetic manipulation and natural conditions

    PubMed Central

    Ruf, Franziska; Fraunholz, Martin; Öchsner, Konrad; Kaderschabek, Johann

    2017-01-01

    Eclosion in flies and other insects is a circadian-gated behaviour under control of a central and a peripheral clock. It is not influenced by the motivational state of an animal, and thus presents an ideal paradigm to study the relation and signalling pathways between central and peripheral clocks, and downstream peptidergic regulatory systems. Little is known, however, about eclosion rhythmicity under natural conditions, and research into this direction is hampered by the physically closed design of current eclosion monitoring systems. We describe a novel open eclosion monitoring system (WEclMon) that allows the puparia to come into direct contact with light, temperature and humidity. We demonstrate that the system can be used both in the laboratory and outdoors, and shows a performance similar to commercial closed funnel-type monitors. Data analysis is semi-automated based on a macro toolset for the open imaging software Fiji. Due to its open design, the WEclMon is also well suited for optogenetic experiments. A small screen to identify putative neuroendocrine signals mediating time from the central clock to initiate eclosion showed that optogenetic activation of ETH-, EH and myosuppressin neurons can induce precocious eclosion. Genetic ablation of myosuppressin-expressing neurons did, however, not affect eclosion rhythmicity. PMID:28658318

  12. Systems and synthetic biology approaches to alter plant cell walls and reduce biomass recalcitrance

    DOE PAGES

    Kalluri, Udaya C.; Yin, Hengfu; Yang, Xiaohan; ...

    2014-11-03

    Fine-tuning plant cell wall properties to render plant biomass more amenable to biofuel conversion is a colossal challenge. A deep knowledge of the biosynthesis and regulation of plant cell wall and a high-precision genome engineering toolset are the two essential pillars of efforts to alter plant cell walls and reduce biomass recalcitrance. The past decade has seen a meteoric rise in use of transcriptomics and high-resolution imaging methods resulting in fresh insights into composition, structure, formation and deconstruction of plant cell walls. Subsequent gene manipulation approaches, however, commonly include ubiquitous mis-expression of a single candidate gene in a host that carries an intact copy of the native gene. The challenges posed by pleiotropic and unintended changes resulting from such an approach are moving the field towards synthetic biology approaches. Finally, synthetic biology builds on a systems biology knowledge base and leverages high-precision tools for high-throughput assembly of multigene constructs and pathways, precision genome editing and site-specific gene stacking, silencing and/or removal. Here, we summarize the recent breakthroughs in biosynthesis and remodelling of major secondary cell wall components, assess the impediments in obtaining a systems-level understanding and explore the potential opportunities in leveraging synthetic biology approaches to reduce biomass recalcitrance.

  13. Systems and synthetic biology approaches to alter plant cell walls and reduce biomass recalcitrance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalluri, Udaya C.; Yin, Hengfu; Yang, Xiaohan

    Fine-tuning plant cell wall properties to render plant biomass more amenable to biofuel conversion is a colossal challenge. A deep knowledge of the biosynthesis and regulation of plant cell wall and a high-precision genome engineering toolset are the two essential pillars of efforts to alter plant cell walls and reduce biomass recalcitrance. The past decade has seen a meteoric rise in use of transcriptomics and high-resolution imaging methods resulting in fresh insights into composition, structure, formation and deconstruction of plant cell walls. Subsequent gene manipulation approaches, however, commonly include ubiquitous mis-expression of a single candidate gene in a host that carries an intact copy of the native gene. The challenges posed by pleiotropic and unintended changes resulting from such an approach are moving the field towards synthetic biology approaches. Finally, synthetic biology builds on a systems biology knowledge base and leverages high-precision tools for high-throughput assembly of multigene constructs and pathways, precision genome editing and site-specific gene stacking, silencing and/or removal. Here, we summarize the recent breakthroughs in biosynthesis and remodelling of major secondary cell wall components, assess the impediments in obtaining a systems-level understanding and explore the potential opportunities in leveraging synthetic biology approaches to reduce biomass recalcitrance.

  14. Cloning and characterization of a pyrethroid pesticide decomposing esterase gene, Est3385, from Rhodopseudomonas palustris PSB-S.

    PubMed

    Luo, Xiangwen; Zhang, Deyong; Zhou, Xuguo; Du, Jiao; Zhang, Songbai; Liu, Yong

    2018-05-09

    The full-length open reading frame of the pyrethroid detoxification gene Est3385 contains 963 nucleotides. This gene was identified and cloned based on the genome sequence of Rhodopseudomonas palustris PSB-S available in GenBank. The predicted amino acid sequence of Est3385 shared moderate identities (30-46%) with known homologous esterases. Phylogenetic analysis revealed that Est3385 is a member of esterase family I. Recombinant Est3385 was heterologously expressed in E. coli, purified, and characterized for its substrate specificity, kinetics, and stability under various conditions. The optimal temperature and pH for Est3385 were 35 °C and 6.0, respectively. The enzyme could detoxify various pyrethroid pesticides and degraded the optimal substrate fenpropathrin with Km and Vmax values of 0.734 ± 0.013 mmol·l⁻¹ and 0.918 ± 0.025 U·µg⁻¹, respectively. No cofactor was found to affect Est3385 activity, but a substantial reduction of enzymatic activity was observed when metal ions were applied. Taken together, a new pyrethroid-degrading esterase was identified and characterized. Modification of Est3385 with protein engineering toolsets should enhance its potential for field application to reduce pesticide residues in agroecosystems.
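
    The Km and Vmax values quoted above are the parameters of the standard Michaelis-Menten rate law, reproduced here for reference:

        v = \frac{V_{\max}\,[S]}{K_m + [S]}, \qquad
        K_m = 0.734 \pm 0.013\ \mathrm{mmol\,l^{-1}}, \quad
        V_{\max} = 0.918 \pm 0.025\ \mathrm{U\,\mu g^{-1}},

    where $v$ is the reaction rate and $[S]$ is the fenpropathrin concentration.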

  15. Open source Matrix Product States: Opening ways to simulate entangled many-body quantum systems in one dimension

    NASA Astrophysics Data System (ADS)

    Jaschke, Daniel; Wall, Michael L.; Carr, Lincoln D.

    2018-04-01

    Numerical simulations are a powerful tool to study quantum systems that lack an analytic solution and lie beyond exactly solvable models. For one-dimensional entangled quantum systems, tensor network methods, among them Matrix Product States (MPSs), have attracted interest from different fields of quantum physics, ranging from solid state systems to quantum simulators and quantum computing. Our open source MPS code provides the community with a toolset to analyze the statics and dynamics of one-dimensional quantum systems. Here, we present our open source library, Open Source Matrix Product States (OSMPS), of MPS methods implemented in Python and Fortran2003. The library includes tools for ground state calculation and excited states via the variational ansatz. We also support ground states for infinite systems with translational invariance. Dynamics are simulated with different algorithms, including three algorithms with support for long-range interactions. Convenient features include built-in support for fermionic systems and number conservation with rotational U(1) and discrete Z2 symmetries for finite systems, as well as data parallelism with MPI. We explain the principles and techniques used in this library along with examples of how to efficiently use the general interfaces to analyze the Ising and Bose-Hubbard models. This description includes the preparation of simulations as well as their dispatching and post-processing.
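
    For orientation, the two example models named above are conventionally defined as follows (the one-dimensional transverse-field Ising and Bose-Hubbard Hamiltonians in textbook notation; this is not the library's input syntax):

        H_{\mathrm{Ising}} = -J \sum_{i} \sigma^{z}_{i} \sigma^{z}_{i+1} - g \sum_{i} \sigma^{x}_{i},
        \qquad
        H_{\mathrm{BH}} = -t \sum_{i} \left( b^{\dagger}_{i} b_{i+1} + \mathrm{h.c.} \right)
        + \frac{U}{2} \sum_{i} n_i (n_i - 1) - \mu \sum_{i} n_i .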

  16. Fiberless multicolor neural optoelectrode for in vivo circuit analysis

    PubMed Central

    Kampasi, Komal; Stark, Eran; Seymour, John; Na, Kyounghwan; Winful, Herbert G.; Buzsáki, György; Wise, Kensall D.; Yoon, Euisik

    2016-01-01

    Maximizing the potential of optogenetic approaches in deep brain structures of intact animals requires optical manipulation of neurons at high spatial and temporal resolutions, while simultaneously recording electrical data from those neurons. Here, we present the first fiber-less optoelectrode with a monolithically integrated optical waveguide mixer that can deliver multicolor light at a common waveguide port to achieve multicolor modulation of the same neuronal population in vivo. We demonstrate successful device implementation by achieving efficient coupling between a side-emitting injection laser diode (ILD) and a dielectric optical waveguide mixer via a gradient-index (GRIN) lens. The use of GRIN lenses attains several design features, including high optical coupling and thermal isolation between ILDs and waveguides. We validated the packaged devices in the intact brain of anesthetized mice co-expressing Channelrhodopsin-2 and Archaerhodopsin in pyramidal cells in the hippocampal CA1 region, achieving high quality recording, activation and silencing of the exact same neurons in a given local region. This fully-integrated approach demonstrates the spatial precision and scalability needed to enable independent activation and silencing of the same or different groups of neurons in dense brain regions while simultaneously recording from them, thus considerably advancing the capabilities of currently available optogenetic toolsets. PMID:27485264

  17. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  18. Prediction of A2 to B2 Phase Transition in the High Entropy Alloy Mo-Nb-Ta-W

    NASA Astrophysics Data System (ADS)

    Huhn, William; Widom, Michael

    2014-03-01

    In this talk we show that an effective Hamiltonian fit with first-principles calculations predicts that an order/disorder transition occurs in the high entropy alloy Mo-Nb-Ta-W. Using the Alloy Theoretic Automated Toolset, we find T = 0 K enthalpies of formation for all binaries containing Mo, Nb, Ta, and W, and in particular we find that the stable structures for binaries at equiatomic concentrations are close in energy to the associated B2 structure, suggesting that at intermediate temperatures a B2 phase is stabilized in Mo-Nb-Ta-W. Our "hybrid Monte Carlo/molecular dynamics" results for the Mo-Nb-Ta-W system are analyzed to identify certain preferred chemical bonding types. A mean-field free energy model incorporating nearest-neighbor bonds will be presented, allowing us to predict the mechanism of the order/disorder transition. We find the temperature evolution of the system is driven by strong Mo-Ta bonding. Comparison of the free energy model and our MC/MD results suggests the existence of additional low-temperature phase transitions in the system, likely ending with phase segregation into binary phases. We would like to thank DOD-DTRA for funding this research under contract number DTRA-11-1-0064.
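
    A common mean-field free energy of the kind referred to above combines nearest-neighbor pair energies with an ideal configurational entropy of mixing; the expression below is a generic sketch of such a model and is not necessarily the exact parameterization used by the authors:

        F(\{x_i\}, T) \approx \frac{z}{2} \sum_{i,j} x_i x_j \, \varepsilon_{ij}
        + k_B T \sum_{i} x_i \ln x_i ,

    where $x_i$ are the concentrations of Mo, Nb, Ta, and W, $\varepsilon_{ij}$ are effective nearest-neighbor bond energies, $z$ is the coordination number (8 for the bcc-based A2/B2 lattices), and the sums are generalized to distinguish the two B2 sublattices when ordering is allowed.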

  19. A Modified Shuttle Plasmid Facilitates Expression of a Flavin Mononucleotide-Based Fluorescent Protein in Treponema denticola ATCC 35405

    PubMed Central

    Godovikova, Valentina; Goetting-Minesky, M. Paula; Shin, Jae M.; Kapila, Yvonne L.; Rickard, Alexander H.

    2015-01-01

    Oral pathogens, including Treponema denticola, initiate the dysregulation of tissue homeostasis that characterizes periodontitis. However, progress of research on the roles of T. denticola in microbe-host interactions and signaling, microbial communities, microbial physiology, and molecular evolution has been hampered by limitations in genetic methodologies. This is typified by an extremely low transformation efficiency and inability to transform the most widely studied T. denticola strain with shuttle plasmids. Previous studies have suggested that robust restriction-modification (R-M) systems in T. denticola contributed to these problems. To facilitate further molecular genetic analysis of T. denticola behavior, we optimized existing protocols such that shuttle plasmid transformation efficiency was increased by >100-fold over prior reports. Here, we report routine transformation of T. denticola ATCC 35405 with shuttle plasmids, independently of both plasmid methylation status and activity of the type II restriction endonuclease encoded by TDE0911. To validate the utility of this methodological advance, we demonstrated expression and activity in T. denticola of a flavin mononucleotide-based fluorescent protein (FbFP) that is active under anoxic conditions. Addition of routine plasmid-based fluorescence labeling to the Treponema toolset will enable more-rigorous and -detailed studies of the behavior of this organism. PMID:26162875

  20. The Awesome Power of Yeast Evolutionary Genetics: New Genome Sequences and Strain Resources for the Saccharomyces sensu stricto Genus

    PubMed Central

    Scannell, Devin R.; Zill, Oliver A.; Rokas, Antonis; Payen, Celia; Dunham, Maitreya J.; Eisen, Michael B.; Rine, Jasper; Johnston, Mark; Hittinger, Chris Todd

    2011-01-01

    High-quality, well-annotated genome sequences and standardized laboratory strains fuel experimental and evolutionary research. We present improved genome sequences of three species of Saccharomyces sensu stricto yeasts: S. bayanus var. uvarum (CBS 7001), S. kudriavzevii (IFO 1802T and ZP 591), and S. mikatae (IFO 1815T), and describe their comparison to the genomes of S. cerevisiae and S. paradoxus. The new sequences, derived by assembling millions of short DNA sequence reads together with previously published Sanger shotgun reads, have vastly greater long-range continuity and far fewer gaps than the previously available genome sequences. New gene predictions defined a set of 5261 protein-coding orthologs across the five most commonly studied Saccharomyces yeasts, enabling a re-examination of the tempo and mode of yeast gene evolution and improved inferences of species-specific gains and losses. To facilitate experimental investigations, we generated genetically marked, stable haploid strains for all three of these Saccharomyces species. These nearly complete genome sequences and the collection of genetically marked strains provide a valuable toolset for comparative studies of gene function, metabolism, and evolution, and render Saccharomyces sensu stricto the most experimentally tractable model genus. These resources are freely available and accessible through www.SaccharomycesSensuStricto.org. PMID:22384314

  1. Lens-free computational imaging of capillary morphogenesis within three-dimensional substrates

    NASA Astrophysics Data System (ADS)

    Weidling, John; Isikman, Serhan O.; Greenbaum, Alon; Ozcan, Aydogan; Botvinick, Elliot

    2012-12-01

    Endothelial cells cultured in three-dimensional (3-D) extracellular matrices spontaneously form microvessels in response to soluble and matrix-bound factors. Such cultures are common for the study of angiogenesis and may find widespread use in drug discovery. Vascular networks are imaged over weeks to measure the distribution of vessel morphogenic parameters. Measurements require micron-scale spatial resolution, which for light microscopy comes at the cost of limited field-of-view (FOV) and shallow depth-of-focus (DOF). Small FOVs and DOFs necessitate lateral and axial mechanical scanning, thus limiting imaging throughput. We present a lens-free holographic on-chip microscopy technique to rapidly image microvessels within a Petri dish over a large volume without any mechanical scanning. This on-chip method uses partially coherent illumination and a CMOS sensor to record in-line holographic images of the sample. For digital reconstruction of the measured holograms, we implement a multiheight phase recovery method to obtain phase images of capillary morphogenesis over a large FOV (24 mm2) with ˜1.5 μm spatial resolution. On average, measured capillary length in our method was within approximately 2% of lengths measured using a 10× microscope objective. These results suggest lens-free on-chip imaging is a useful toolset for high-throughput monitoring and quantitative analysis of microvascular 3-D networks.
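
    As an illustration of the digital refocusing step central to such lens-free reconstructions, the sketch below implements standard angular-spectrum free-space propagation. It is not the authors' multi-height phase-recovery pipeline, and the wavelength, pixel pitch, and propagation distance are illustrative values only.

```python
# Minimal sketch, assuming illustrative optical parameters: angular-spectrum
# propagation used to numerically refocus an in-line hologram recorded on a
# CMOS sensor. Not the paper's full multi-height phase-recovery algorithm.
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex field by distance z (same units as dx and wavelength)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; evanescent components are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.exp(1j * 2 * np.pi / wavelength * z * np.sqrt(np.maximum(arg, 0.0)))
    H[arg < 0] = 0.0
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Illustration: back-propagate a stand-in hologram to an assumed object plane.
hologram = np.random.rand(512, 512)              # placeholder for a recorded hologram
field = np.sqrt(hologram).astype(complex)        # amplitude with zero initial phase
refocused = angular_spectrum_propagate(field, wavelength=0.5e-6, dx=1.12e-6, z=-800e-6)
print(np.abs(refocused).mean())
```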

  2. A preliminary study of molecular dynamics on reconfigurable computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolinski, C.; Trouw, F. R.; Gokhale, M.

    2003-01-01

    In this paper we investigate the performance of platform FPGAs on a compute-intensive, floating-point-intensive supercomputing application, Molecular Dynamics (MD). MD is a popular simulation technique to track interacting particles through time by integrating their equations of motion. One part of the MD algorithm was implemented using the Fabric Generator (FG) [11] and mapped onto several reconfigurable logic arrays. FG is a Java-based toolset that greatly accelerates construction of the fabrics from an abstract, technology-independent representation. Our experiments used technology-independent IEEE 32-bit floating point operators so that the design could be easily re-targeted. Experiments were performed using both non-pipelined and pipelined floating point modules. We present results for the Altera Excalibur ARM System on a Programmable Chip (SoPC), the Altera Stratix EP1S80, and the Xilinx Virtex-II Pro 2VP50. The best results obtained were 5.69 GFlops at 80 MHz (Altera Stratix EP1S80) and 4.47 GFlops at 82 MHz (Xilinx Virtex-II Pro 2VP50). Assuming a 10 W power budget, these results compare very favorably to a 4 GFlops/40 W processing/power rate for a modern Pentium, suggesting that reconfigurable logic can achieve high performance at low power on floating-point-intensive applications.
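
    For context, the floating-point kernel that dominates such MD workloads is the pairwise force evaluation. The sketch below shows that inner loop for a Lennard-Jones potential in plain Python/NumPy; it is illustrative only (no periodic boundaries or cutoff) and is not the paper's FPGA design.

```python
# Minimal sketch, illustrative only: the pairwise Lennard-Jones force loop,
# i.e. the floating-point kernel that designs like the one above map onto
# reconfigurable logic. No cutoff or periodic boundary conditions.
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Return per-particle Lennard-Jones forces for positions pos of shape (N, 3)."""
    n = pos.shape[0]
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            rij = pos[i] - pos[j]
            r2 = np.dot(rij, rij)
            sr6 = (sigma ** 2 / r2) ** 3
            # Force magnitude from dU/dr, directed along rij.
            f = 24.0 * eps * (2.0 * sr6 ** 2 - sr6) / r2 * rij
            forces[i] += f
            forces[j] -= f
    return forces

pos = np.random.rand(64, 3) * 10.0   # 64 particles in a 10x10x10 box
f = lj_forces(pos)
print(f.shape, np.abs(f).max())
```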

  3. Additions and improvements to the high energy density physics capabilities in the FLASH code

    NASA Astrophysics Data System (ADS)

    Lamb, D.; Bogale, A.; Feister, S.; Flocke, N.; Graziani, C.; Khiar, B.; Laune, J.; Tzeferacos, P.; Walker, C.; Weide, K.

    2017-10-01

    FLASH is an open-source, finite-volume Eulerian, spatially-adaptive radiation magnetohydrodynamics code that has the capabilities to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities exist in FLASH, which make it a powerful open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. We describe several non-ideal MHD capabilities that are being added to FLASH, including the Hall and Nernst effects, implicit resistivity, and a circuit model, which will allow modeling of Z-pinch experiments. We showcase the ability of FLASH to simulate Thomson scattering polarimetry, which measures Faraday rotation due to the presence of magnetic fields, as well as proton radiography, proton self-emission, and Thomson scattering diagnostics. Finally, we describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at U. Chicago by DOE NNSA ASC through the Argonne Institute for Computing in Science under FWP 57789; DOE NNSA under NLUF Grant DE-NA0002724; DOE SC OFES Grant DE-SC0016566; and NSF Grant PHY-1619573.

  4. Our experiences with the use of atopy patch test in the diagnosis of cow's milk hypersensitivity.

    PubMed

    Pustisek, Nives; Jaklin-Kekez, Alemka; Frkanec, Ruza; Sikanić-Dugić, Nives; Misak, Zrinjka; Jadresin, Oleg; Kolacek, Sanja

    2010-01-01

    Atopy patch test has been recognized as a diagnostic tool for the verification of food allergies in infants and small children suffering from atopic dermatitis. The test also has a role in the diagnosis of food allergies characterized by clinical signs associated with the digestive system. Yet, in spite of numerous studies, the test itself has hitherto not been standardized. Our study enlisted 151 children less than two years of age, who exhibited suspected skin and/or gastrointestinal manifestations of food allergy to cow's milk, and in whom tests failed to prove an early-type allergic reaction. Atopy patch test was positive in 28% of the children with atopic dermatitis, 43% of the children with suspected gastrointestinal manifestations, and 32% of the children with skin and gastrointestinal manifestations of food allergy. In our experience, atopy patch test is an excellent addition to the hitherto used tests for the diagnosis of food allergies. It specifically targets delayed-type hypersensitivity reactions, which are difficult to confirm with other diagnostic tools. It is furthermore simple to perform, noninvasive, and produces a minimum of undesired side effects. For these reasons, it should become part of the routine diagnostic toolset for food allergies to cow's milk in infants and children, and be applied before a food challenge test.

  5. Dispersions of Aramid Nanofibers: A New Nanoscale Building Block

    PubMed Central

    Yang, Ming; Cao, Keqin; Sui, Lang; Qi, Ying; Zhu, Jian; Waas, Anthony; Arruda, Ellen M.; Kieffer, John; Thouless, M. D.; Kotov, Nicholas A.

    2011-01-01

    Stable dispersions of nanofibers are virtually unknown for synthetic polymers. They can complement analogous dispersions of inorganic components, such as nanoparticles, nanowires, nanosheets, etc., as a fundamental component of a toolset for design of nanostructures and metamaterials via numerous solvent-based processing methods. As such, strong flexible polymeric nanofibers are very desirable for the effective utilization within composites of nanoscale inorganic components such as nanowires, carbon nanotubes, graphene, and others. Here stable dispersions of uniform high-aspect-ratio aramid nanofibers (ANFs) with diameters between 3 and 30 nm and up to 10 μm in length were successfully obtained. Unlike the traditional approaches based on polymerization of monomers, they are made by controlled dissolution of the standard macroscale form of the aramid polymer, i.e., well-known Kevlar threads, and revealed distinct morphological features similar to carbon nanotubes. ANFs are successfully processed into films using layer-by-layer (LBL) assembly as one of the potential methods of preparation of composites from ANFs. The resultant films are transparent and highly temperature resilient. They also display enhanced mechanical characteristics, making ANF films highly desirable as protective coatings, ultrastrong membranes, as well as building blocks of other high performance materials in place of or in combination with carbon nanotubes. PMID:21800822

  6. Modeling the Hydrologic Effects of Large-Scale Green Infrastructure Projects with GIS

    NASA Astrophysics Data System (ADS)

    Bado, R. A.; Fekete, B. M.; Khanbilvardi, R.

    2015-12-01

    Impervious surfaces in urban areas generate excess runoff, which in turn causes flooding, combined sewer overflows, and degradation of adjacent surface waters. Municipal environmental protection agencies have shown a growing interest in mitigating these effects with 'green' infrastructure practices that partially restore the perviousness and water holding capacity of urban centers. Assessment of the performance of current and future green infrastructure projects is hindered by the lack of adequate hydrological modeling tools; conventional techniques fail to account for the complex flow pathways of urban environments, and detailed analyses are difficult to prepare for the very large domains in which green infrastructure projects are implemented. Currently, no standard toolset exists that can rapidly and conveniently predict runoff, consequent inundations, and sewer overflows at a city-wide scale. We demonstrate how streamlined modeling techniques can be used with open-source GIS software to efficiently model runoff in large urban catchments. Hydraulic parameters and flow paths through city blocks, roadways, and sewer drains are automatically generated from GIS layers, and ultimately urban flow simulations can be executed for a variety of rainfall conditions. With this methodology, users can understand the implications of large-scale land use changes and green/gray storm water retention systems on hydraulic loading, peak flow rates, and runoff volumes.
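
    To illustrate the kind of per-cell runoff estimate such a raster-based workflow produces, the sketch below applies the standard SCS curve-number method over a toy land-cover grid. This is an assumption about one plausible runoff formulation, not the authors' GIS toolchain, and the curve numbers are illustrative.

```python
# Minimal sketch, assuming SCS curve-number runoff as the per-cell model:
# runoff depth computed over a toy land-cover raster. Not the authors' workflow.
import numpy as np

def scs_runoff_mm(rain_mm, cn):
    """Runoff depth (mm) from a rainfall depth (mm) for an array of curve numbers."""
    s = 25400.0 / cn - 254.0          # potential retention (mm)
    ia = 0.2 * s                      # initial abstraction
    return np.where(rain_mm > ia, (rain_mm - ia) ** 2 / (rain_mm - ia + s), 0.0)

# Toy raster: 98 = impervious, 61 = lawn, 30 = green roof / bioretention cell.
cn = np.array([[98, 98, 61],
               [98, 61, 30],
               [61, 30, 30]], dtype=float)
print(scs_runoff_mm(rain_mm=50.0, cn=cn))
```

    Swapping impervious cells for green-infrastructure cells lowers the curve number and therefore the simulated runoff depth, which is the effect the modeling framework above is designed to quantify at city scale.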

  7. Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report

    NASA Technical Reports Server (NTRS)

    Ossenfort, John

    2008-01-01

    As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, but he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system: essentially, an evaluation of how observable the system behavior is using available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system is available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). The TEAMS toolset is intended to be a solution to span all phases of the system, from design and development through health management and maintenance. TEAMS-Designer is the model-building and testability analysis software in that suite.
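
    As a simple illustration of the metrics such an analysis produces, the sketch below computes fault detection and fault isolation coverage from a test-to-fault dependency matrix. This is a generic textbook formulation, not TEAMS-Designer output, and the matrix entries are invented.

```python
# Minimal sketch, with an invented dependency matrix: detection and isolation
# coverage computed from D, where D[i, j] = 1 means test j observes fault i.
import numpy as np

D = np.array([[1, 0, 0],    # fault 0 seen by test 0
              [1, 1, 0],    # fault 1 seen by tests 0 and 1
              [0, 1, 0],    # fault 2 seen by test 1
              [0, 0, 0]])   # fault 3 is undetectable with the current sensor suite

detected = D.any(axis=1)
detection_coverage = detected.mean()

# Two faults are isolable from each other only if their test signatures differ.
signatures = [tuple(row) for row in D]
unique_detected = len({s for s, d in zip(signatures, detected) if d})
isolation_coverage = unique_detected / max(int(detected.sum()), 1)

print(f"detection coverage: {detection_coverage:.0%}")
print(f"isolation among detected faults: {isolation_coverage:.0%}")
```

    Adding a sensor corresponds to adding a column to D; rerunning the two metrics shows immediately whether the new test improves detection, isolation, or neither, which is the trade the designer weighs against the sensor's mass and bandwidth cost.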

  8. Solar System Treks: Interactive Web Portals for STEM, Exploration and Beyond

    NASA Astrophysics Data System (ADS)

    Law, E.; Day, B. H.; Viotti, M.

    2017-12-01

    NASA's Solar System Treks project produces a suite of online visualization and analysis tools for lunar and planetary mapping and modeling. These portals offer great benefits for education and public outreach, providing access to data from a wide range of instruments aboard a variety of past and current missions. As a component of NASA's STEM Activation Infrastructure, they are available as resources for NASA STEM programs, and to the greater STEM community. As new missions are planned to a variety of planetary bodies, these tools facilitate public understanding of the missions and engage the public in the process of identifying and selecting where these missions will land. There are currently three web portals in the program: Moon Trek (https://moontrek.jpl.nasa.gov), Mars Trek (https://marstrek.jpl.nasa.gov), and Vesta Trek (https://vestatrek.jpl.nasa.gov). A new release of Mars Trek includes new tools and data products focusing on human landing site selection. Backed by evidence-based cognitive and computer science findings, an additional version is available for educational and public audiences in support of learning along novice-to-expert pathways, enabling authentic, real-world interaction with planetary data. Portals for additional planetary bodies are planned. As web-based toolsets, the portals do not require users to purchase or install any software beyond current web browsers. The portals provide analysis tools for measurement and study of planetary terrain. They allow data to be layered and adjusted to optimize visualization. Visualizations are easily stored and shared. The portals provide 3D visualization and give users the ability to mark terrain for generation of STL/OBJ files that can be directed to 3D printers. Such 3D prints are valuable tools in museums, public exhibits, and classrooms, especially for the visually impaired. The program supports additional clients, web services, and APIs facilitating dissemination of planetary data to external applications and venues. NASA challenges and hackathons also provide members of the software development community opportunities to participate in tool development and leverage data from the portals.

  9. NOAA Observing System Integrated Analysis (NOSIA): development and support to the NOAA Satellite Observing System Architecture

    NASA Astrophysics Data System (ADS)

    Reining, R. C.; Cantrell, L. E., Jr.; Helms, D.; LaJoie, M.; Pratt, A. S.; Ries, V.; Taylor, J.; Yuen-Murphy, M. A.

    2016-12-01

    There is a deep relationship between NOSIA-II and the Federal Earth Observation Assessment (EOA) efforts (EOA 2012 and 2016) chartered under the National Science and Technology Council, Committee on Environment, Natural Resources, and Sustainability, co-chaired by the White House Office of Science and Technology Policy, NASA, NOAA, and USGS. NOSIA-1, which was conducted with a limited scope internal to NOAA in 2010, developed the methodology and toolset that was adopted for EOA 2012, and NOAA staffed the team that conducted the data collection, modeling, and analysis effort for EOA 2012. EOA 2012 was the first-ever integrated analysis of the relative impact of 379 observing systems and data sources contributing to the key objectives identified for 13 Societal Benefit Areas (SBA) including Weather, Climate, Disasters, Oceans and Coastal Resources, and Water Resources. This effort culminated in the first National Plan for Civil Earth Observations. NOAA conducted NOSIA-II starting in 2012 to extend the NOSIA methodology across all of NOAA's Mission Service Areas, covering a representative sample (over 1000) of NOAA's products and services. The detailed information from NOSIA-II is being integrated into EOA 2016 to underpin a broad array of Key Products, Services, and (science) Objectives (KPSO) identified by the inter-agency SBA teams. EOA 2016 is expected to provide substantially greater insight into the cross-agency impacts of observing systems contributing to a wide array of KPSOs, and by extension, to societal benefits flowing from these public-facing products. NOSIA-II is being adopted by NOAA as a corporate decision-analysis and support capability to inform leadership decisions on its integrated observing systems portfolio. Application examples include assessing the agency-wide impacts of planned decommissioning of ships and aircraft in NOAA's fleet, and the relative cost-effectiveness of alternative space-based architectures in the post-GOES-R and JPSS era. Like EOA, NOSIA is not limited to NOAA observing systems, and takes the contribution of observing systems from other agencies, the public sector, and international partnerships into account.

  10. Missile signal processing common computer architecture for rapid technology upgrade

    NASA Astrophysics Data System (ADS)

    Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul

    2004-10-01

    Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as the sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain comprises two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific, and required custom software development. They were developed using non-integrated toolsets and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's Law-driven improvement in computational throughput; a newly introduced vector computing capability in general-purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. This application may be programmed under existing real-time operating systems using parallel processing software libraries, resulting in highly portable code that can be rapidly migrated to new platforms as processor technology evolves. Use of standardized development tools and third-party software upgrades is enabled, as is rapid upgrade of processing components as improved algorithms are developed. The resulting weapon system will have a superior processing capability over a custom approach at the time of deployment as a result of shorter development cycles and use of newer technology. The signal processing computer may be upgraded over the lifecycle of the weapon system, and can migrate between weapon system variants enabled by modification simplicity. This paper presents a reference design using the new approach that utilizes an Altivec PowerPC parallel COTS platform. It uses a VxWorks-based real-time operating system (RTOS), and application code developed using an efficient parallel vector library (PVL). A quantification of computing requirements and demonstration of an interceptor algorithm operating on this real-time platform are provided.
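
    To make one of the front-end operations concrete, the sketch below shows a classic two-point non-uniformity correction, where per-pixel gain and offset are calibrated from two flat-field reference frames. The noise model and radiance levels are invented for illustration; this is not the paper's implementation.

```python
# Minimal sketch, with an assumed fixed-pattern noise model: two-point
# non-uniformity correction (NUC) for an IR focal-plane array.
import numpy as np

rng = np.random.default_rng(0)
gain_true = 1.0 + 0.1 * rng.standard_normal((128, 128))
offset_true = 5.0 * rng.standard_normal((128, 128))

def sensor(scene):
    """Apply the (hypothetical) per-pixel gain/offset fixed-pattern noise."""
    return gain_true * scene + offset_true

# Calibrate with two uniform (flat-field) scenes at assumed radiance levels.
low, high = sensor(np.full((128, 128), 20.0)), sensor(np.full((128, 128), 80.0))
gain = (high - low) / (80.0 - 20.0)
offset = low - gain * 20.0

raw = sensor(np.full((128, 128), 50.0))   # a new frame of a uniform 50-level scene
corrected = (raw - offset) / gain
print(corrected.std())                    # ~0: fixed-pattern noise removed
```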

  11. Integration of a NASA faculty fellowship project within an undergraduate engineering capstone design class

    NASA Astrophysics Data System (ADS)

    Carmen, C.

    2012-11-01

    The United States (US) National Aeronautics and Space Administration (NASA) Exploration Systems Mission Directorate (ESMD) provides university faculty fellowships that prepare the faculty to implement engineering design class projects that possess the potential to contribute to NASA ESMD objectives. The goal of the ESMD is to develop new capabilities, support technologies and research that will enable sustained and affordable human and robotic space exploration. In order to create a workforce that will have the desire and skills necessary to achieve these goals, the NASA ESMD faculty fellowship program enables university faculty to work on specific projects at a NASA field center and then implement the project within their capstone engineering design class. This allows the senior (final-year) undergraduate engineering design students the opportunity to develop critical design experience using methods and design tools specified within NASA's Systems Engineering (SE) Handbook. The faculty fellowship projects focus upon four specific areas critical to the future of space exploration: spacecraft, propulsion, lunar and planetary surface systems, and ground operations. As the result of a 2010 fellowship, whereby faculty research was conducted at Marshall Space Flight Center (MSFC) in Huntsville, Alabama (AL), senior design students in the Mechanical and Aerospace Engineering (MAE) department at the University of Alabama in Huntsville (UAH) had the opportunity to complete senior design projects that pertained to current work conducted to support ESMD objectives. Specifically, the UAH MAE students utilized X-TOOLSS (eXploration Toolset for the Optimization Of Launch and Space Systems), an Evolutionary Computing (EC) design optimization software, and also designed, analyzed, fabricated and tested a lunar regolith burrowing device, referred to as the Lunar Wormbot (LW), that is aimed at exploring and retrieving samples of lunar regolith. These two projects were implemented during the 2010-2011 academic year at UAH and have proven to significantly motivate and enhance the students' understanding of the design, development and optimization of space systems. The current paper provides an overview of the NASA ESMD faculty fellowship program, the 2010 fellowship projects, a detailed description of the means of integrating the X-TOOLSS and LW projects within the UAH MAE senior design class, and the MAE student design project results, as well as the learning outcomes and the impact the ESMD project had upon the engineering students.

  12. IceTrendr: a linear time-series approach to monitoring glacier environments using Landsat

    NASA Astrophysics Data System (ADS)

    Nelson, P.; Kennedy, R. E.; Nolin, A. W.; Hughes, J. M.; Braaten, J.

    2017-12-01

    Arctic glaciers in Alaska and Canada have experienced some of the greatest ice mass loss of any region in recent decades. A challenge to understanding these changing ecosystems, however, is developing globally-consistent, multi-decadal monitoring of glacier ice. We present a toolset and approach that captures, labels, and maps glacier change for use in climate science, hydrology, and Earth science education using Landsat Time Series (LTS). The core step is "temporal segmentation," wherein a yearly LTS is cleaned using pre-processing steps, converted to a snow/ice index, and then simplified into the salient shape of the change trajectory ("temporal signature") using linear segmentation. Such signatures range from simple ones, such as 'stable' or 'transition of glacier ice to rock', to more complex multi-year changes like 'transition of glacier ice to debris-covered glacier ice to open water to bare rock to vegetation'. This pilot study demonstrates the potential for interactively mapping, visualizing, and labeling glacier changes. What is truly innovative is that IceTrendr not only maps the changes but also uses expert knowledge to label the changes, and such labels can be applied to other glaciers exhibiting statistically similar temporal signatures. Our key findings are that the IceTrendr concept and software can provide important functionality for glaciologists and educators interested in studying glacier changes during the Landsat TM timeframe (1984-present). Issues of concern with using dense Landsat time-series approaches for glacier monitoring include the many missing images during the period 1984-1995 and the fact that automated cloud masks struggle, requiring the user to manually identify cloud-free images. IceTrendr is much more than just a simple "then and now" approach to glacier mapping. This process is a means of integrating the power of computing, remote sensing, and expert knowledge to "tell the story" of glacier changes.
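
    The sketch below shows the simplest form of the temporal-segmentation idea: finding a single breakpoint in a yearly snow/ice index series by minimizing the combined least-squares error of two linear fits. It is not the IceTrendr algorithm (which handles multiple segments and noise screening), and the index series is synthetic.

```python
# Minimal sketch, with synthetic data: single-breakpoint linear segmentation of
# a yearly snow/ice index series. Not the IceTrendr segmentation algorithm.
import numpy as np

years = np.arange(1984, 2018)
index = np.where(years < 2002, 0.8, 0.8 - 0.03 * (years - 2002))   # stable ice, then retreat
index = index + 0.02 * np.random.default_rng(1).standard_normal(years.size)

def sse_linear(x, y):
    """Sum of squared residuals of a straight-line fit to (x, y)."""
    coef = np.polyfit(x, y, 1)
    return float(np.sum((np.polyval(coef, x) - y) ** 2))

best = min(range(2, years.size - 2),
           key=lambda k: sse_linear(years[:k], index[:k]) + sse_linear(years[k:], index[k:]))
print("estimated breakpoint year:", years[best])
```

    The recovered breakpoint marks where the trajectory changes shape; in the full toolset an expert would then attach a label such as 'transition of glacier ice to rock' to that segment.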

  13. The EGS Collab Project: Stimulation Investigations for Geothermal Modeling Analysis and Validation

    NASA Astrophysics Data System (ADS)

    Blankenship, D.; Kneafsey, T. J.

    2017-12-01

    The US DOE's EGS Collab project team is establishing a suite of intermediate-scale (10-20 m) field test beds for coupled stimulation and interwell flow tests. The multiple national laboratory and university team is designing the tests to compare measured data to models to improve measurement and modeling toolsets available for use in field sites and investigations such as DOE's Frontier Observatory for Research in Geothermal Energy (FORGE) Project. Our tests will be well-controlled, in situ experiments focused on rock fracture behavior, seismicity, and permeability enhancement. Pre- and post-test modeling will allow for model prediction and validation. High-quality, high-resolution geophysical and other fracture characterization data will be collected, analyzed, and compared with models and field observations to further elucidate the basic relationships between stress, induced seismicity, and permeability enhancement. Coring through the stimulated zone after tests will provide fracture characteristics that can be compared to monitoring data and model predictions. We will also observe and quantify other key governing parameters that impact permeability, and attempt to understand how these parameters might change throughout the development and operation of an Enhanced Geothermal System (EGS) project with the goal of enabling commercial viability of EGS. The Collab team will perform three major experiments over the three-year project duration. Experiment 1, intended to investigate hydraulic fracturing, will be performed in the Sanford Underground Research Facility (SURF) at 4,850 feet depth and will build on kISMET Project findings. Experiment 2 will be designed to investigate hydroshearing. Experiment 3 will investigate changes in fracturing strategies and will be further specified as the project proceeds. The tests will provide quantitative insights into the nature of stimulation (e.g., hydraulic fracturing, hydroshearing, mixed-mode fracturing, thermal fracturing) in crystalline rock under reservoir-like stress conditions and generate high-quality, high-resolution, diverse data sets to be simulated allowing model validation. Monitoring techniques will also be evaluated under controlled conditions identifying technologies appropriate for deeper full-scale EGS sites.

  14. Induction of influenza-specific local CD8 T-cells in the respiratory tract after aerosol delivery of vaccine antigen or virus in the Babraham inbred pig.

    PubMed

    Tungatt, Katie; Dolton, Garry; Morgan, Sophie B; Attaf, Meriem; Fuller, Anna; Whalley, Thomas; Hemmink, Johanneke D; Porter, Emily; Szomolay, Barbara; Montoya, Maria; Hammond, John A; Miles, John J; Cole, David K; Townsend, Alain; Bailey, Mick; Rizkallah, Pierre J; Charleston, Bryan; Tchilian, Elma; Sewell, Andrew K

    2018-05-01

    There is increasing evidence that induction of local immune responses is a key component of effective vaccines. For respiratory pathogens, for example tuberculosis and influenza, aerosol delivery is being actively explored as a method to administer vaccine antigens. Current animal models used to study respiratory pathogens suffer from anatomical disparity with humans. The pig is a natural and important host of influenza viruses and is physiologically more comparable to humans than other animal models in terms of size, respiratory tract biology and volume. It may also be an important vector in the birds to human infection cycle. A major drawback of the current pig model is the inability to analyze antigen-specific CD8+ T-cell responses, which are critical to respiratory immunity. Here we address this knowledge gap using an established in-bred pig model with a high degree of genetic identity between individuals, including the MHC (Swine Leukocyte Antigen (SLA)) locus. We developed a toolset that included long-term in vitro pig T-cell culture and cloning and identification of novel immunodominant influenza-derived T-cell epitopes. We also generated structures of the two SLA class I molecules found in these animals presenting the immunodominant epitopes. These structures allowed definition of the primary anchor points for epitopes in the SLA binding groove and established SLA binding motifs that were used to successfully predict other influenza-derived peptide sequences capable of stimulating T-cells. Peptide-SLA tetramers were constructed and used to track influenza-specific T-cells ex vivo in blood, the lungs and draining lymph nodes. Aerosol immunization with attenuated single cycle influenza viruses (S-FLU) induced large numbers of CD8+ T-cells specific for conserved NP peptides in the respiratory tract. Collectively, these data substantially increase the utility of pigs as an effective model for studying protective local cellular immunity against respiratory pathogens.

  15. Minerva: An Integrated Geospatial/Temporal Toolset for Real-time Science Decision Making and Data Collection

    NASA Astrophysics Data System (ADS)

    Lees, D. S.; Cohen, T.; Deans, M. C.; Lim, D. S. S.; Marquez, J.; Heldmann, J. L.; Hoffman, J.; Norheim, J.; Vadhavk, N.

    2016-12-01

    Minerva integrates three capabilities that are critical to the success of NASA analogs. It combines NASA's Exploration Ground Data Systems (xGDS) and Playbook software, and MIT's Surface Exploration Traverse Analysis and Navigation Tool (SEXTANT). Together, they help to plan, optimize, and monitor traverses; schedule and track activity; assist with science decision-making; and document sample and data collection. Pre-mission, Minerva supports planning with a priori map data (e.g., UAV and satellite imagery) and activity scheduling. During missions, xGDS records and broadcasts live data to a distributed team who take geolocated notes and catalogue samples. Playbook provides live schedule updates and multi-media chat. Post-mission, xGDS supports data search and visualization for replanning and analysis. NASA's BASALT (Biologic Analog Science Associated with Lava Terrains) and FINESSE (Field Investigations to Enable Solar System Science and Exploration) projects use Minerva to conduct field science under simulated Mars mission conditions including 5- and 15-minute one-way communication delays. During the recent BASALT-FINESSE mission, two field scientists (EVA team) executed traverses across volcanic terrain to characterize and sample basalts. They wore backpacks with communications and imaging capabilities, and carried field portable spectrometers. The Science Team was 40 km away in a simulated mission control center. The Science Team monitored imaging (video and still), spectral, voice, location and physiological data from the EVA team via the network from the field, under communication delays. Minerva provided the Science Team with a unified context of operations at the field site, so they could make meaningful remote contributions to the collection of tens of geotagged samples. Minerva's mission architecture will be presented with technical details and capabilities. Through the development, testing and application of Minerva, we are defining requirements for the design of future capabilities to support human and human-robotic missions to deep space and Mars.

  16. Induction of influenza-specific local CD8 T-cells in the respiratory tract after aerosol delivery of vaccine antigen or virus in the Babraham inbred pig

    PubMed Central

    Morgan, Sophie B.; Attaf, Meriem; Szomolay, Barbara; Miles, John J.; Townsend, Alain; Bailey, Mick; Charleston, Bryan; Tchilian, Elma

    2018-01-01

    There is increasing evidence that induction of local immune responses is a key component of effective vaccines. For respiratory pathogens, for example tuberculosis and influenza, aerosol delivery is being actively explored as a method to administer vaccine antigens. Current animal models used to study respiratory pathogens suffer from anatomical disparity with humans. The pig is a natural and important host of influenza viruses and is physiologically more comparable to humans than other animal models in terms of size, respiratory tract biology and volume. It may also be an important vector in the birds to human infection cycle. A major drawback of the current pig model is the inability to analyze antigen-specific CD8+ T-cell responses, which are critical to respiratory immunity. Here we address this knowledge gap using an established in-bred pig model with a high degree of genetic identity between individuals, including the MHC (Swine Leukocyte Antigen (SLA)) locus. We developed a toolset that included long-term in vitro pig T-cell culture and cloning and identification of novel immunodominant influenza-derived T-cell epitopes. We also generated structures of the two SLA class I molecules found in these animals presenting the immunodominant epitopes. These structures allowed definition of the primary anchor points for epitopes in the SLA binding groove and established SLA binding motifs that were used to successfully predict other influenza-derived peptide sequences capable of stimulating T-cells. Peptide-SLA tetramers were constructed and used to track influenza-specific T-cells ex vivo in blood, the lungs and draining lymph nodes. Aerosol immunization with attenuated single cycle influenza viruses (S-FLU) induced large numbers of CD8+ T-cells specific for conserved NP peptides in the respiratory tract. Collectively, these data substantially increase the utility of pigs as an effective model for studying protective local cellular immunity against respiratory pathogens. PMID:29772011

  17. NASA Lunar and Planetary Mapping and Modeling

    NASA Astrophysics Data System (ADS)

    Day, B. H.; Law, E.

    2016-12-01

    NASA's Lunar and Planetary Mapping and Modeling Portals provide web-based suites of interactive visualization and analysis tools to enable mission planners, planetary scientists, students, and the general public to access mapped data products from past and current missions to the Moon, Mars, and Vesta. New portals for additional planetary bodies are being planned. This presentation will recap significant enhancements to these toolsets during the past year and look forward to the results of the exciting work currently being undertaken. Additional data products and tools continue to be added to the Lunar Mapping and Modeling Portal (LMMP). These include both generalized products as well as polar data products specifically targeting potential sites for the Resource Prospector mission. Current development work on LMMP also includes facilitating mission planning and data management for lunar CubeSat missions, and working with the NASA Astromaterials Acquisition and Curation Office's Lunar Apollo Sample database in order to help better visualize the geographic contexts from which samples were retrieved. A new user interface provides, among other improvements, significantly enhanced 3D visualizations and navigation. Mars Trek, the project's Mars portal, has now been assigned by NASA's Planetary Science Division to support site selection and analysis for the Mars 2020 Rover mission as well as for the Mars Human Landing Exploration Zone Sites. This effort is concentrating on enhancing Mars Trek with data products and analysis tools specifically requested by the proposing teams for the various sites. Also being given very high priority by NASA Headquarters is Mars Trek's use as a means to directly involve the public in these upcoming missions, letting them explore the areas the agency is focusing upon, understand what makes these sites so fascinating, follow the selection process, and get caught up in the excitement of exploring Mars. The portals also serve as outstanding resources for education and outreach. As such, they have been designated by NASA's Science Mission Directorate as key supporting infrastructure for the new education programs selected through the division's recent CAN.

  18. MEMS/MOEMS foundry services at INO

    NASA Astrophysics Data System (ADS)

    García-Blanco, Sonia; Ilias, Samir; Williamson, Fraser; Généreux, Francis; Le Noc, Loïc; Poirier, Michel; Proulx, Christian; Tremblay, Bruno; Provençal, Francis; Desroches, Yan; Caron, Jean-Sol; Larouche, Carl; Beaupré, Patrick; Fortin, Benoit; Topart, Patrice; Picard, Francis; Alain, Christine; Pope, Timothy; Jerominek, Hubert

    2010-06-01

    In the MEMS manufacturing world, the "fabless" model has gained increasing importance in recent years as a way for MEMS manufacturers and startups to minimize equipment costs and initial capital investment. In order for this model to be successful, the fabless company needs to work closely with a MEMS foundry service provider. Due to the lack of standardization in MEMS processes, as opposed to CMOS microfabrication, the experience in MEMS development processes and the flexibility of the MEMS foundry are of vital importance. A multidisciplinary team together with a complete microfabrication toolset allows INO to offer unique MEMS foundry services to fabless companies looking for low to mid-volume production. Companies that benefit from their own microfabrication facilities may also be interested in INO's assistance in conducting their research and development work during periods when production runs keep their whole staff busy. Services include design, prototyping, fabrication, packaging, and testing of various MEMS and MOEMS devices on wafers fully compatible with CMOS integration. Wafer diameters ranging typically from 1 inch to 6 inches can be accepted while 8-inch wafers can be processed in some instances. Standard microfabrication techniques such as metal, dielectric, and semiconductor film deposition and etching as well as photolithographic pattern transfer are available. A stepper permits reduction of the critical dimension to around 0.4 μm. Metals deposited by vacuum deposition methods include Au, Ag, Al, Al alloys, Ti, Cr, Cu, Mo, MoCr, Ni, Pt, and V with thickness varying from 5 nm to 2 μm. Electroplating of several materials including Ni, Au and In is also available. In addition, INO has developed and built a gold black deposition facility to answer customers' needs for broadband microbolometric detectors. The deposited gold black presents a specular reflectance of less than 10% in the wavelength range from 0.2 μm to 100 μm with thickness ranging from 20 to 35 μm and a density of 0.3% of the bulk density of gold. Two Balzers thin-film deposition instruments (BAP-800 and BAK-760) permit INO to offer optical thin film manufacturing. Recent work in this field includes the design and development of a custom filter for the James Webb Space Telescope (JWST) in collaboration with the Canadian company ComDEV. An overview of the different microfabrication foundry services offered by INO will be presented together with the most recent achievements in the field of MEMS/MOEMS.

  19. Coherent broadband sonar signal processing with the environmentally corrected matched filter

    NASA Astrophysics Data System (ADS)

    Camin, Henry John, III

    The matched filter is the standard approach for coherently processing active sonar signals, where knowledge of the transmitted waveform is used in the detection and parameter estimation of received echoes. Matched filtering broadband signals provides higher levels of range resolution and reverberation noise suppression than can be realized through narrowband processing. Since theoretical processing gains are proportional to the signal bandwidth, it is typically desirable to utilize the widest band signals possible. However, as signal bandwidth increases, so do environmental effects that tend to decrease correlation between the received echo and the transmitted waveform. This is especially true for ultra wideband signals, where the bandwidth exceeds an octave or approximately 70% fractional bandwidth. This loss of coherence often results in processing gains and range resolution much lower than theoretically predicted. Wiener filtering, commonly used in image processing to improve distorted and noisy photos, is investigated in this dissertation as an approach to correct for these environmental effects. This improved signal processing, Environmentally Corrected Matched Filter (ECMF), first uses a Wiener filter to estimate the environmental transfer function and then again to correct the received signal using this estimate. This process can be viewed as a smarter inverse or whitening filter that adjusts behavior according to the signal to noise ratio across the spectrum. Though the ECMF is independent of bandwidth, it is expected that ultra wideband signals will see the largest improvement, since they tend to be more impacted by environmental effects. The development of the ECMF and demonstration of improved parameter estimation with its use are the primary emphases in this dissertation. Additionally, several new contributions to the field of sonar signal processing made in conjunction with the development of the ECMF are described. A new, nondimensional wideband ambiguity function is presented as a way to view the behavior of the matched filter with and without the decorrelating environmental effects; a new, integrated phase broadband angle estimation method is developed and compared to existing methods; and a new, asymptotic offset phase angle variance model is presented. Several data sets are used to demonstrate these new contributions. High fidelity Sonar Simulation Toolset (SST) synthetic data is used to characterize the theoretical performance. Two in-water data sets were used to verify assumptions that were made during the development of the ECMF. Finally, a newly collected in-air data set containing ultra wideband signals was used in lieu of a cost prohibitive underwater experiment to demonstrate the effectiveness of the ECMF at improving parameter estimates.
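
    The sketch below illustrates the core idea described above: a frequency-domain Wiener correction of a distorted echo followed by a matched filter against the transmitted waveform. It is only an illustration of the concept, not the ECMF as implemented in the dissertation; in particular the channel is assumed known here, standing in for the first Wiener estimation step, and all waveform parameters are synthetic.

```python
# Minimal sketch, with a synthetic chirp and an assumed known channel:
# Wiener correction of the received echo, then matched filtering.
import numpy as np

fs, T = 100e3, 0.02
t = np.arange(0, T, 1 / fs)
tx = np.sin(2 * np.pi * (5e3 * t + 0.5 * (20e3 - 5e3) / T * t ** 2))   # LFM chirp

h = np.array([1.0, 0.0, 0.6, 0.0, -0.3])          # assumed environmental impulse response
rx = np.convolve(tx, h)[: t.size] + 0.05 * np.random.default_rng(2).standard_normal(t.size)

# Wiener correction: invert the channel estimate, rolled off where SNR is poor.
H = np.fft.rfft(h, t.size)
snr = 100.0                                        # assumed per-bin SNR
G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
rx_corrected = np.fft.irfft(np.fft.rfft(rx) * G, t.size)

# Matched filter: correlate the corrected echo with the transmitted waveform.
mf = np.correlate(rx_corrected, tx, mode="full")
print("peak at lag", int(np.argmax(np.abs(mf))) - (t.size - 1))
```

    The Wiener gain behaves like a signal-to-noise-aware inverse filter, which is exactly the "smarter whitening filter" framing used in the abstract above; without the correction step the matched-filter peak broadens and drops as the echo decorrelates from the replica.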

  20. Airborne Management of Traffic Conflicts in Descent With Arrival Constraints

    NASA Technical Reports Server (NTRS)

    Doble, Nathan A.; Barhydt, Richard; Krishnamurthy, Karthik

    2005-01-01

    NASA is studying far-term air traffic management concepts that may increase operational efficiency through a redistribution of decisionmaking authority among airborne and ground-based elements of the air transportation system. One component of this research, En Route Free Maneuvering, allows trained pilots of equipped autonomous aircraft to assume responsibility for traffic separation. Ground-based air traffic controllers would continue to separate traffic unequipped for autonomous operations and would issue flow management constraints to all aircraft. To evaluate En Route Free Maneuvering operations, a human-in-the-loop experiment was jointly conducted by the NASA Ames and Langley Research Centers. In this experiment, test subject pilots used desktop flight simulators to resolve conflicts in cruise and descent, and to adhere to air traffic flow constraints issued by test subject controllers. Simulators at NASA Langley were equipped with a prototype Autonomous Operations Planner (AOP) flight deck toolset to assist pilots with conflict management and constraint compliance tasks. Results from the experiment are presented, focusing specifically on operations during the initial descent into the terminal area. Airborne conflict resolution performance in descent, conformance to traffic flow management constraints, and the effects of conflicting traffic on constraint conformance are all presented. Subjective data from subject pilots are also presented, showing perceived levels of workload, safety, and acceptability of autonomous arrival operations. Finally, potential AOP functionality enhancements are discussed along with suggestions to improve arrival procedures.

  1. Bringing the patient back in: behavioral decision-making and choice in medical economics.

    PubMed

    Mendoza, Roger Lee

    2018-04-01

    We explore the behavioral methodology and "revolution" in economics through the lens of medical economics. We address two questions: (1) Are mainstream economic assumptions of utility-maximization realistic approximations of people's actual behavior? (2) Do people maximize subjective expected utility, particularly in choosing from among the available options? In doing so, we illustrate-in terms of a hypothetical experimental sample of patients with dry eye diagnosis-why and how utility in pharmacoeconomic assessments might be valued differently by patients when subjective psychological, social, cognitive, and emotional factors are considered. While experimentally-observed or surveyed behavior yields stated (rather than revealed) preferences, behaviorism offers a robust toolset in understanding drug, medical device, and treatment-related decisions compared to the optimizing calculus assumed by mainstream economists. It might also do so more perilously than economists have previously understood, in light of the intractable uncertainties, information asymmetries, insulated third-party agents, entry barriers, and externalities that characterize healthcare. Behavioral work has been carried out in many sub-fields of economics. Only recently has it been extended to healthcare. This offers medical economists both the challenge and opportunity of balancing efficiency presumptions with relatively autonomous patient choices, notwithstanding their predictable, yet seemingly consistent, irrationality. Despite its comparative youth and limitations, the scientific contributions of behaviorism are secure and its future in medical economics appears to be promising.

  2. A toolset of aequorin expression vectors for in planta studies of subcellular calcium concentrations in Arabidopsis thaliana

    PubMed Central

    Mehlmer, Norbert; Parvin, Nargis; Hurst, Charlotte H.; Knight, Marc R.; Teige, Markus; Vothknecht, Ute C.

    2014-01-01

    Calcium has long been acknowledged as one of the most important signalling components in plants. Many abiotic and biotic stimuli are transduced into a cellular response by temporal and spatial changes in cellular calcium concentration and the calcium-sensitive protein aequorin has been exploited as a genetically encoded calcium indicator for the measurement of calcium in planta. The objective of this work was to generate a compatible set of aequorin expression plasmids for the generation of transgenic plant lines to measure changes in calcium levels in different cellular subcompartments. Aequorin was fused to different targeting peptides or organellar proteins as a means to localize it to the cytosol, the nucleus, the plasma membrane, and the mitochondria. Furthermore, constructs were designed to localize aequorin in the stroma as well as the inner and outer surface of the chloroplast envelope membranes. The modular set-up of the plasmids also allows the easy replacement of targeting sequences to include other compartments. An additional YFP-fusion was included to verify the correct subcellular localization of all constructs by laser scanning confocal microscopy. For each construct, pBin19-based binary expression vectors driven by the 35S or UBI10 promoter were made for Agrobacterium-mediated transformation. Stable Arabidopsis lines were generated and initial tests of several lines confirmed their feasibility to measure calcium signals in vivo. PMID:22213817

  3. Tool-use for drinking water by immature chimpanzees of Mahale: prevalence of an unessential behavior.

    PubMed

    Matsusaka, Takahisa; Nishie, Hitonaru; Shimada, Masaki; Kutsukake, Nobuyuki; Zamma, Koichiro; Nakamura, Michio; Nishida, Toshisada

    2006-04-01

    Use of leaves or sticks for drinking water has only rarely been observed during long-term study of wild chimpanzees (Pan troglodytes schweinfurthii) at Mahale. Recently, however, we observed 42 episodes of tool-use for drinking water (73 tools and two cases of using "tool-sets") between 1999 and 2004. Interestingly, all of the performers were immature chimpanzees aged from 2 to 10 years. Immature chimpanzees sometimes observed the tool-using performance of others and subsequently reproduced the behavior, while adults usually paid no attention to the performance. This tool-use did not seem to occur out of necessity: (1) chimpanzees often used tools along streams where they could drink water without tools, (2) they used tools for drinking water from tree holes during the wet season when they could easily obtain water from many streams, and (3) the tool-using performance sometimes contained playful aspects. Between-site comparisons revealed that chimpanzees at drier habitats used tools for drinking water more frequently and in a more "conventional" manner. However, some variations could not be explained by ecological conditions. Such variations and the increase in this tool-use in recent years at Mahale strongly suggest that social learning plays an important role in the process of acquiring the behavior. We should note here that such behaviors that lack obvious benefits or necessity can be prevalent in a group.

  4. Realistic computer network simulation for network intrusion detection dataset generation

    NASA Astrophysics Data System (ADS)

    Payer, Garrett

    2015-05-01

    The KDD-99 Cup dataset is dead. While it can continue to be used as a toy example, the age of this dataset makes it all but useless for intrusion detection research and data mining. Many of the attacks used within the dataset are obsolete and do not reflect the features important for intrusion detection in today's networks. Creating a new dataset encompassing a large cross section of the attacks found on the Internet today could be useful, but would eventually fall to the same problem as the KDD-99 Cup; its usefulness would diminish after a period of time. To continue research into intrusion detection, the generation of new datasets needs to be as dynamic and as quick as the attacker. Simply examining existing network traffic and using domain experts such as intrusion analysts to label traffic is inefficient, expensive, and not scalable. The only viable methodology is simulation using technologies including virtualization, attack-toolsets such as Metasploit and Armitage, and sophisticated emulation of threat and user behavior. Simulating actual user behavior and network intrusion events dynamically not only allows researchers to vary scenarios quickly, but enables online testing of intrusion detection mechanisms by interacting with data as it is generated. As new threat behaviors are identified, they can be added to the simulation to make quicker determinations as to the effectiveness of existing and ongoing network intrusion technology, methodology and models.

  5. Maximizing the use of EO products: how to leverage the potential of open geospatial service architectures

    NASA Astrophysics Data System (ADS)

    Usländer, Thomas

    2012-10-01

    The demand for the rapid provision of EO products with well-defined characteristics in terms of temporal, spatial, image-specific and thematic criteria is increasing. Examples are products to support near real-time damage assessment after a natural disaster event, e.g. an earthquake. However, beyond the organizational and economic questions, there are technological and systemic barriers that impede a comfortable search, order, delivery or even combination of EO products. Most portals of space agencies and EO product providers require sophisticated satellite and product knowledge and, even worse, are all different and not interoperable. This paper gives an overview of the use cases and the architectural solutions that aim at an open and flexible EO mission infrastructure with application-oriented user interfaces and well-defined service interfaces based upon open standards. It presents corresponding international initiatives such as INSPIRE (Infrastructure for Spatial Information in the European Community), GMES (Global Monitoring for Environment and Security), GEOSS (Global Earth Observation System of Systems) and HMA (Heterogeneous Missions Accessibility) and their associated infrastructure approaches. The paper presents a corresponding analysis and design methodology and two examples of how such architectures are already successfully used in early warning systems for geo-hazards and toolsets for environmentally induced health risks. Finally, the paper concludes with an outlook on how these ideas relate to the vision of the Future Internet.

  6. NASA Tech Briefs, June 2010

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Topics covered include: Situational Awareness from a Low-Cost Camera System; Data Acquisition System for Multi-Frequency Radar Flight Operations Preparation; Mercury Toolset for Spatiotemporal Metadata; Social Tagging of Mission Data; Integrating Radar Image Data with Google Maps; Demonstration of a Submillimeter-Wave HEMT Oscillator Module at 330 GHz; Flexible Peripheral Component Interconnect Input/Output Card; Interface Supports Lightweight Subsystem Routing for Flight Applications; MMIC Amplifiers and Wafer Probes for 350 to 500 GHz; Public Risk Assessment Program; Particle Swarm Optimization Toolbox; Telescience Support Center Data System Software; Update on PISCES; Ground and Space Radar Volume Matching and Comparison Software; Web-Based Interface for Command and Control of Network Sensors; Orbit Determination Toolbox; Distributed Observer Network; Computer-Automated Evolution of Spacecraft X-Band Antennas; Practical Loop-Shaping Design of Feedback Control Systems; Fully Printed High-Frequency Phased-Array Antenna on Flexible Substrate; Formula for the Removal and Remediation of Polychlorinated Biphenyls in Painted Structures; Integrated Solar Concentrator and Shielded Radiator; Water Membrane Evaporator; Modeling of Failure for Analysis of Triaxial Braided Carbon Fiber Composites; Catalyst for Carbon Monoxide Oxidation; Titanium Hydroxide - a Volatile Species at High Temperature; Selective Functionalization of Carbon Nanotubes: Part II; Steerable Hopping Six-Legged Robot; Launchable and Retrievable Tetherobot; Hybrid Heat Exchangers; Orbital Winch for High-Strength, Space-Survivable Tethers; Parameterized Linear Longitudinal Airship Model; and Physics of Life: A Model for Non-Newtonian Properties of Living Systems.

  7. Field-portable lensfree tomographic microscope†

    PubMed Central

    Isikman, Serhan O.; Bishara, Waheb; Sikora, Uzair; Yaglidere, Oguzhan; Yeah, John; Ozcan, Aydogan

    2011-01-01

    We present a field-portable lensfree tomographic microscope, which can achieve sectional imaging of a large volume (~20 mm3) on a chip with an axial resolution of <7 μm. In this compact tomographic imaging platform (weighing only ~110 grams), 24 light-emitting diodes (LEDs) that are each butt-coupled to a fibre-optic waveguide are controlled through a cost-effective micro-processor to sequentially illuminate the sample from different angles to record lensfree holograms of the sample that is placed on the top of a digital sensor array. In order to generate pixel super-resolved (SR) lensfree holograms and hence digitally improve the achievable lateral resolution, multiple sub-pixel shifted holograms are recorded at each illumination angle by electromagnetically actuating the fibre-optic waveguides using compact coils and magnets. These SR projection holograms obtained over an angular range of ~50° are rapidly reconstructed to yield projection images of the sample, which can then be back-projected to compute tomograms of the objects on the sensor-chip. The performance of this compact and light-weight lensfree tomographic microscope is validated by imaging micro-beads of different dimensions as well as a Hymenolepis nana egg, which is an infectious parasitic flatworm. Achieving a decent three-dimensional spatial resolution, this field-portable on-chip optical tomographic microscope might provide a useful toolset for telemedicine and high-throughput imaging applications in resource-poor settings. PMID:21573311
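
    To illustrate how per-angle projection images can be combined into a tomogram, the sketch below performs a simple unfiltered back-projection over a limited angular range. It is purely synthetic and is not the device's reconstruction chain, which additionally involves pixel super-resolution and holographic reconstruction at each angle.

```python
# Minimal sketch, synthetic data only: unfiltered back-projection from a few
# angular projections over a limited range, as in on-chip tomography.
import numpy as np
from scipy.ndimage import rotate

phantom = np.zeros((64, 64))
phantom[24:40, 28:36] = 1.0                       # a simple block "object"

angles = np.linspace(-25, 25, 11)                 # limited angular range, as on-chip
projections = [rotate(phantom, a, reshape=False, order=1).sum(axis=0) for a in angles]

recon = np.zeros_like(phantom)
for a, p in zip(angles, projections):
    smear = np.tile(p, (phantom.shape[0], 1))     # smear each projection across the image
    recon += rotate(smear, -a, reshape=False, order=1)
recon /= len(angles)
print(float(recon[30, 31]), float(recon[5, 5]))   # object region vs background
```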

  8. System Level Uncertainty Assessment for Collaborative RLV Design

    NASA Technical Reports Server (NTRS)

    Charania, A. C.; Bradford, John E.; Olds, John R.; Graham, Matthew

    2002-01-01

    A collaborative design process utilizing Probabilistic Data Assessment (PDA) is showcased. Given the limited financial resources of both government and industry, strategic decision makers need more than just traditional point designs; they need to be aware of the likelihood that these future designs will meet their objectives. This uncertainty, an ever-present element of the design process, can be embraced through a probabilistic design environment. A conceptual design process is presented that encapsulates the major engineering disciplines for a Third Generation Reusable Launch Vehicle (RLV). Toolsets consist of aerospace industry standard tools in disciplines such as trajectory, propulsion, mass properties, cost, operations, safety, and economics. Variations of the design process are presented that use different fidelities of tools. The disciplinary engineering models are used in a collaborative engineering framework utilizing Phoenix Integration's ModelCenter and AnalysisServer environment. These tools allow the designer to join disparate models and simulations together in a unified environment wherein each discipline can interact with any other discipline. The design process also uses probabilistic methods to generate the system-level output metrics of interest for an RLV conceptual design. The specific system being examined is the Advanced Concept Rocket Engine 92 (ACRE-92) RLV. Previous experience and knowledge (in terms of input uncertainty distributions from experts and modeling and simulation codes) can be coupled with Monte Carlo processes to best predict the chances of program success.
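
    As a rough illustration of the probabilistic step described above, the sketch below propagates hypothetical input uncertainty distributions through a toy cost model with Monte Carlo sampling and reports the probability of meeting a target. All distributions, the cost relation, and the target value are placeholders chosen for illustration; they are not drawn from the ACRE-92 study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000  # number of Monte Carlo samples

    # Hypothetical input uncertainty distributions (placeholders, not ACRE-92 data)
    isp = rng.triangular(430.0, 445.0, 455.0, N)          # specific impulse, s
    dry_mass = rng.normal(95_000.0, 6_000.0, N)            # vehicle dry mass, kg
    cost_per_kg = rng.lognormal(mean=np.log(9_000.0), sigma=0.25, size=N)  # $/kg

    # Toy system-level metric: a cost proxy that penalizes low specific impulse
    total_cost = dry_mass * cost_per_kg * (460.0 / isp)

    target = 1.0e9  # hypothetical program cost target, $
    p_success = np.mean(total_cost <= target)
    print(f"Estimated probability of meeting the cost target: {p_success:.2%}")
    ```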

  9. Discrete mixture modeling to address genetic heterogeneity in time-to-event regression

    PubMed Central

    Eng, Kevin H.; Hanlon, Bret M.

    2014-01-01

    Motivation: Time-to-event regression models are a critical tool for associating survival time outcomes with molecular data. Despite mounting evidence that genetic subgroups of the same clinical disease exist, little attention has been given to exploring how this heterogeneity affects time-to-event model building and how to accommodate it. Methods able to diagnose and model heterogeneity should be valuable additions to the biomarker discovery toolset. Results: We propose a mixture of survival functions that classifies subjects with similar relationships to a time-to-event response. This model incorporates multivariate regression and model selection and can be fit with an expectation-maximization algorithm that we call Cox-assisted clustering (CAC). We illustrate a likely manifestation of genetic heterogeneity and demonstrate how it may affect survival models with little warning. An application to gene expression in ovarian cancer DNA repair pathways illustrates how the model may be used to learn new genetic subsets for risk stratification. We explore the implications of this model for censored observations and the effect on genomic predictors and diagnostic analysis. Availability and implementation: R implementation of CAC using standard packages is available at https://gist.github.com/programeng/8620b85146b14b6edf8f Data used in the analysis are publicly available. Contact: kevin.eng@roswellpark.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24532723
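
    The abstract describes an expectation-maximization (EM) fit of a mixture of survival functions. The sketch below is a deliberately simplified stand-in: it fits a two-component exponential survival mixture with right censoring on synthetic data, illustrating the E-step/M-step alternation, but it omits the Cox regression and model-selection machinery of the actual CAC method.

    ```python
    import numpy as np

    def em_exp_mixture(t, event, K=2, n_iter=200, seed=0):
        """EM for a K-component exponential survival mixture with right censoring.
        t: observed times; event: 1 = event observed, 0 = censored."""
        rng = np.random.default_rng(seed)
        lam = rng.uniform(0.5, 1.5, K) / np.mean(t)   # initial hazard rates
        pi = np.full(K, 1.0 / K)                      # initial mixing weights
        for _ in range(n_iter):
            # E-step: responsibilities from density (events) or survival (censored)
            surv = np.exp(-np.outer(t, lam))          # S_k(t_i), shape (n, K)
            dens = lam * surv                          # f_k(t_i) = lam_k * S_k(t_i)
            lik = np.where(event[:, None] == 1, dens, surv) * pi
            r = lik / lik.sum(axis=1, keepdims=True)
            # M-step: mixing weights and censoring-aware hazard MLEs
            pi = r.mean(axis=0)
            lam = (r * event[:, None]).sum(axis=0) / (r * t[:, None]).sum(axis=0)
        return pi, lam, r

    # Toy data: two latent subgroups with different hazards, partially censored
    rng = np.random.default_rng(1)
    z = rng.integers(0, 2, 500)
    times = rng.exponential(np.where(z == 0, 2.0, 10.0))
    cens = rng.exponential(15.0, 500)
    t_obs = np.minimum(times, cens)
    event = (times <= cens).astype(int)
    pi, lam, resp = em_exp_mixture(t_obs, event)
    print("mixing weights:", pi.round(2), "hazard rates:", lam.round(3))
    ```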

  10. Modeling and Simulation of Phased Array Antennas to Support Next-Generation Satellite Design

    NASA Technical Reports Server (NTRS)

    Tchorowski, Nicole; Murawski, Robert; Manning, Robert; Fuentes, Michael

    2016-01-01

    Developing enhanced simulation capabilities has become a significant priority for the Space Communications and Navigation (SCaN) project at NASA as new space communications technologies are proposed to replace aging NASA communications assets, such as the Tracking and Data Relay Satellite System (TDRSS). When developing the architecture for these new space communications assets, it is important to develop updated modeling and simulation methodologies, such that competing architectures can be weighed against one another and the optimal path forward can be determined. There have been many simulation tools developed here at NASA for the simulation of single RF link budgets, or for the modeling and simulation of an entire network of spacecraft and their supporting SCaN network elements. However, the modeling capabilities are never fully complete and as new technologies are proposed, gaps are identified. One such gap is the ability to rapidly develop high fidelity simulation models of electronically steerable phased array systems. As future relay satellite architectures are proposed that include optical communications links, electronically steerable antennas will become more desirable due to the reduction in platform vibration introduced by mechanically steerable devices. In this research, we investigate how modeling of these antennas can be introduced into our overall simulation and modeling structure. The ultimate goal of this research is twofold: first, to enable NASA engineers to model various proposed simulation architectures and determine which proposed architecture meets the given architectural requirements; second, given a set of communications link requirements for a proposed satellite architecture, to determine the optimal configuration for a phased array antenna. There is a variety of tools available that can be used to model phased array antennas. To meet our stated goals, the first objective of this research is to compare the subset of tools available to us, trading off each tool's modeling fidelity against its simulation performance. When comparing several proposed architectures, higher-fidelity modeling may be desirable; however, when iterating a proposed set of communication link requirements across ranges of phased array configuration parameters, computational performance becomes a significant practical requirement. In either case, a minimum simulation fidelity must be met, regardless of performance considerations, which will be discussed in this research. Given a suitable set of phased array modeling tools, this research then focuses on integration with current SCaN modeling and simulation tools. While properly modeling the antenna elements of a system is vital, this is only a small part of the end-to-end communication path between a satellite and the supporting ground station and/or relay satellite assets. To properly model a proposed simulation architecture, this toolset must be integrated with other commercial and government development tools, such that the overall architecture can be examined in terms of communications, reliability, and cost. In this research, integration with previously developed communication tools is investigated.
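
    A first-order model that any phased-array toolset wraps, regardless of fidelity, is the far-field array factor of the element lattice. The sketch below computes and steers the array factor of a uniform linear array; the element count, spacing, and steering angle are hypothetical parameters chosen for illustration, not values from the SCaN study.

    ```python
    import numpy as np

    def array_factor_db(n_elem, spacing_wl, steer_deg, theta_deg):
        """Normalized array factor (dB) of a uniform linear array.
        spacing_wl: element spacing in wavelengths; steer_deg: steering angle."""
        k = 2 * np.pi                        # wavenumber in units of 1/wavelength
        theta = np.radians(theta_deg)
        steer = np.radians(steer_deg)
        n = np.arange(n_elem)[:, None]
        # Progressive phase shift across elements steers the main beam to steer_deg
        psi = k * spacing_wl * n * (np.sin(theta) - np.sin(steer))
        af = np.abs(np.exp(1j * psi).sum(axis=0)) / n_elem
        return 20 * np.log10(np.maximum(af, 1e-6))

    # Hypothetical 16-element, half-wavelength-spaced array steered to 20 degrees
    theta = np.linspace(-90, 90, 721)
    pattern = array_factor_db(16, 0.5, 20.0, theta)
    print("main beam peaks at", theta[np.argmax(pattern)], "degrees")
    ```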

  11. Solving for Efficiency or Decision Criteria: When the Non-unique Nature of Solutions Becomes a Benefit

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.; Ciarleglio, M.; Dulay, M.; Lowry, T. S.; Sharp, J. M.; Barnes, J. W.; Eaton, D. J.; Tidwell, V. C.

    2006-12-01

    Work in the literature for groundwater allocation emphasizes finding a truly optimal solution, often with the drawback of limiting the reported results to either maximizing net benefit in regional-scale models or minimizing pumping costs for localized cases. From a policy perspective, limited insight can be gained from these studies because the results are restricted to a single, efficient solution and they neglect non-market values that may influence a management decision. Conversely, economically derived objective functions tend to exhibit a plateau upon nearing the optimal value. This plateau effect, or non-uniqueness, is actually a positive feature in the behavior of groundwater systems because it demonstrates that multiple management strategies, serving numerous community preferences, may be considered while still achieving similar quantitative results. An optimization problem takes the same set of initial conditions and looks for the most efficient solution, while a decision problem looks at a situation and asks for a solution that meets certain user-defined criteria. In other words, the selection of an alternative course of action using a decision support system will not always result in selection of the most 'optimized' alternative. To broaden the analytical toolset available for science and policy interaction, we have developed a groundwater decision support system (GWDSS) that generates a suite of management alternatives by pairing a combinatorial search algorithm with a numerical groundwater model for consideration by decision makers and stakeholders. Subject to constraints as defined by community concerns, the tabu optimization engine systematically creates hypothetical management scenarios, running hundreds and even thousands of simulations and saving the best-performing realizations. Results of the search are then evaluated against stakeholder preference sets using ranking methods to aid in identifying a subset of alternatives for final consideration. Here we present the development of the GWDSS and its use in the decision-making process for the Barton Springs segment of the Edwards Aquifer located in Austin, Texas. Using hydrogeologic metrics, together with economic estimates and impervious cover valuations, representative rankings are determined. Post-search multi-objective analysis reveals that some highly ranked alternatives meet the preference sets of more than one stakeholder and achieve similar quantitative aquifer performance. These results are important to modelers and policy makers alike.
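
    To make the search strategy concrete, the sketch below implements a toy tabu search over discrete per-well pumping levels against a stand-in objective function. In the actual GWDSS the objective evaluation is a numerical groundwater-model run subject to community-defined constraints; the neighborhood structure, tabu tenure, and objective used here are illustrative assumptions only.

    ```python
    import numpy as np

    def tabu_search(evaluate, n_wells, levels, n_iter=500, tabu_len=25, seed=0):
        """Toy tabu search over discrete per-well pumping levels.
        evaluate(x) returns a score to maximize; here it stands in for a
        groundwater-model run plus policy constraints."""
        rng = np.random.default_rng(seed)
        x = rng.integers(0, levels, n_wells)
        best_x, best_val = x.copy(), evaluate(x)
        tabu = []
        for _ in range(n_iter):
            # Neighborhood: change one well's pumping level by +/- 1 step
            moves = [(w, d) for w in range(n_wells) for d in (-1, 1)
                     if 0 <= x[w] + d < levels and (w, d) not in tabu]
            if not moves:
                tabu.clear()
                continue
            scored = []
            for w, d in moves:
                cand = x.copy()
                cand[w] += d
                scored.append((evaluate(cand), (w, d), cand))
            val, (w, d), cand = max(scored, key=lambda s: s[0])
            x = cand
            tabu.append((w, -d))              # forbid immediately undoing the move
            tabu[:] = tabu[-tabu_len:]
            if val > best_val:
                best_x, best_val = cand.copy(), val
        return best_x, best_val

    # Stand-in objective: pumping benefit minus a drawdown-style penalty
    def toy_objective(x):
        return x.sum() - 0.05 * (x.sum() ** 2) / len(x)

    print(tabu_search(toy_objective, n_wells=8, levels=10))
    ```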

  12. Spatiotemporal assessment of spontaneous metastasis formation using multimodal in vivo imaging in HER2+ and triple negative metastatic breast cancer xenograft models in mice.

    PubMed

    Fricke, Inga B; De Souza, Raquel; Costa Ayub, Lais; Francia, Giulio; Kerbel, Robert; Jaffray, David A; Zheng, Jinzi

    2018-01-01

    Preclinical breast cancer models recapitulating the clinical course of metastatic disease are crucial for drug development. Highly metastatic cell lines forming spontaneous metastasis following orthotopic implantation were previously developed and characterized regarding their biological and histological characteristics. This study aimed to non-invasively and longitudinally characterize the spatiotemporal pattern of metastasis formation and progression in the MDA-MB-231-derived triple negative LM2-4 and HER2+ LM2-4H2N cell lines, using bioluminescence imaging (BLI), contrast enhanced computed tomography (CT), fluorescence imaging, and 2-deoxy-2-[fluorine-18]fluoro-D-glucose positron emission tomography ([18F]FDG-PET). LM2-4, LM2-4H2N, and MDA-MB-231 tumors were established in the right inguinal mammary fat pad (MFP) of female SCID mice and resected 14-16 days later. Metastasis formation was monitored using BLI. Metabolic activity of primary and metastatic lesions in mice bearing LM2-4 or LM2-4H2N was assessed by [18F]FDG-PET. Metastatic burden at study endpoint was assessed by CT and fluorescence imaging following intravenous dual-modality liposome agent administration. Comparable temporal metastasis patterns were observed using BLI for the highly metastatic cell lines LM2-4 and LM2-4H2N, while metastasis formed about 10 days later for MDA-MB-231. 21 days post primary tumor resection, metastases were detected in 86% of LM2-4, 69% of LM2-4H2N, and 60% of MDA-MB-231 inoculated mice, predominantly in the axillary region, contralateral MFP, and liver/lung. LM2-4 and LM2-4H2N tumors displayed high metabolism based on [18F]FDG-PET uptake. Lung metastases were detected as the [18F]FDG-PET uptake increased significantly between pre- and post-metastasis scan. Using a liposomal dual-modality agent, CT and fluorescence confirmed BLI detected lesions and identified additional metastatic nodules in the intraperitoneal cavity and lung. The combination of complementary anatomical and functional imaging techniques can provide high sensitivity characterization of metastatic disease spread, progression and overall disease burden. The described models and imaging toolset can be implemented as an effective means for quantitative treatment response evaluation in metastatic breast cancer.

  13. Modern Grid Initiative Distribution Taxonomy Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, Kevin P.; Chen, Yousu; Chassin, David P.

    2008-11-01

    This is the final report for the development of a taxonomy of prototypical electrical distribution feeders. Two of the primary goals of the Department of Energy's (DOE) Modern Grid Initiative (MGI) are 'to accelerate the modernization of our nation's electricity grid' and to 'support demonstrations of systems of key technologies that can serve as the foundation for an integrated, modern power grid'. A key component to the realization of these goals is the effective implementation of new, as well as existing, 'smart grid technologies'. Possibly the largest barrier that has been identified in the deployment of smart grid technologies is the inability to evaluate how their deployment will affect the electricity infrastructure, both locally and on a regional scale. The inability to evaluate the impacts of these technologies is primarily due to the lack of detailed electrical distribution feeder information. While detailed distribution feeder information does reside with the various distribution utilities, there is no central repository of information that can be openly accessed. The role of Pacific Northwest National Laboratory (PNNL) in the MGI for FY08 was to collect distribution feeder models, in the SynerGEE® format, from electric utilities around the nation so that they could be analyzed to identify regional differences in feeder design and operation. Based on this analysis, PNNL developed a taxonomy of 24 prototypical feeder models in the GridLAB-D simulation environment that contain the fundamental characteristics of non-urban core, radial distribution feeders from the various regions of the U.S. Weighting factors for these feeders are also presented so that they can be used to generate a representative sample for various regions within the United States. The final product presented in this report is a toolset that enables the evaluation of new smart grid technologies, with the ability to aggregate their effects to regional and national levels. The distribution feeder models presented in this report are based on actual utility models but do not contain any proprietary or system-specific information. As a result, the models discussed in this report can be openly distributed to industry, academia, or any interested entity, in order to facilitate the ability to evaluate smart grid technologies.
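
    The aggregation idea behind the weighting factors can be shown in a few lines: per-feeder results from the prototypical models are scaled by the estimated number of real feeders each prototype represents. The feeder impacts and weights below are hypothetical placeholders, not values from the PNNL taxonomy.

    ```python
    import numpy as np

    # Hypothetical per-feeder impact of a smart grid technology (e.g. peak kW saved)
    # for a handful of prototypical feeders; the real taxonomy contains 24.
    feeder_impact_kw = np.array([120.0, 85.0, 210.0, 40.0, 150.0])

    # Hypothetical weighting factors: how many real feeders of each prototype
    # a region is estimated to contain.
    region_weights = np.array([1200, 800, 300, 2500, 600])

    regional_total_kw = np.dot(feeder_impact_kw, region_weights)
    print(f"Estimated regional impact: {regional_total_kw / 1e3:.1f} MW")
    ```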

  14. Developing a Computational Environment for Coupling MOR Data, Maps, and Models: The Virtual Research Vessel (VRV) Prototype

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; O'Dea, E.; Cushing, J. B.; Cuny, J. E.; Toomey, D. R.; Hackett, K.; Tikekar, R.

    2001-12-01

    The East Pacific Rise (EPR) from 9-10° N is currently our best-studied section of fast-spreading mid-ocean ridge. During several decades of investigation it has been explored by the full spectrum of ridge investigators, including chemists, biologists, geologists and geophysicists. These studies, and those that are ongoing, provide a wealth of observational data, results and data-driven theoretical (often numerical) studies that have not yet been fully utilized either by research scientists or by professional educators. While the situation is improving, a large amount of data, results, and related theoretical models still exist either in an inert, non-interactive form (e.g., journal publications) or as unlinked and currently incompatible computer data or algorithms. Infrastructure is needed not just for ready access to data, but also for linkage of disparate data sets (data to data) as well as data to models, in order to quantitatively evaluate hypotheses, refine numerical simulations, and explore new relations between observables. The prototype of a computational environment and toolset, called the Virtual Research Vessel (VRV), is being developed to provide scientists and educators with ready access to data, results and numerical models. While this effort is focused on the EPR 9N region, the resulting software tools and infrastructure should be helpful in establishing similar systems for other sections of the global mid-ocean ridge. Work in progress includes efforts to develop: (1) a virtual database to incorporate diverse data types with domain-specific metadata into a global schema that allows web-query across different marine geology data sets, and an analogous declarative (database available) description of tools and models; (2) the ability to move data between GIS and the above DBMS, and tools to encourage data submission to archives; (3) tools for finding and viewing archives, and translating between formats; (4) support for "computational steering" (tool composition) and model coupling (e.g., ability to run tool composition locally but access input data from the web, APIs to support coupling such as invoking programs that are running remotely, and help in writing data wrappers to publish programs); (5) support of migration paths for prototyped model coupling; and (6) export of marine geological data and data analysis to the undergraduate classroom (VRV-ET, "Educational Tool"). See the main VRV web site at http://oregonstate.edu/dept/vrv and the VRV-ET web site at: http://www.cs.uoregon.edu/research/vrv-et.

  15. NSTX-U Control System Upgrades

    DOE PAGES

    Erickson, K. G.; Gates, D. A.; Gerhardt, S. P.; ...

    2014-06-01

    The National Spherical Tokamak Experiment (NSTX) is undergoing a wealth of upgrades (NSTX-U). These upgrades, especially including an elongated pulse length, require broad changes to the control system that has served NSTX well. A new fiber serial Front Panel Data Port input and output (I/O) stream will supersede the aging copper parallel version. Driver support for the new I/O and cyber security concerns require updating the operating system from Redhat Enterprise Linux (RHEL) v4 to RedHawk (based on RHEL) v6. While the basic control system continues to use the General Atomics Plasma Control System (GA PCS), the effort to forward-port the entire software package to run under 64-bit Linux instead of 32-bit Linux included PCS modifications subsequently shared with GA and other PCS users. Software updates focused on three key areas: (1) code modernization through coding standards (C99/C11), (2) code portability and maintainability through use of the GA PCS code generator, and (3) support of 64-bit platforms. Central to the control system upgrade is the use of a complete real time (RT) Linux platform provided by Concurrent Computer Corporation, consisting of a computer (iHawk), an operating system and drivers (RedHawk), and RT tools (NightStar). Strong vendor support coupled with an extensive RT toolset influenced this decision. The new real-time Linux platform, I/O, and software engineering will foster enhanced capability and performance for NSTX-U plasma control.

  16. Three-dimensional fit-to-flow microfluidic assembly.

    PubMed

    Chen, Arnold; Pan, Tingrui

    2011-12-01

    Three-dimensional microfluidics holds great promise for large-scale integration of versatile, digitalized, and multitasking fluidic manipulations for biological and clinical applications. Successful translation of microfluidic toolsets to these purposes faces persistent technical challenges, such as reliable system-level packaging, device assembly and alignment, and world-to-chip interface. In this paper, we extended our previously established fit-to-flow (F2F) world-to-chip interconnection scheme to a complete system-level assembly strategy that addresses the three-dimensional microfluidic integration on demand. The modular F2F assembly consists of an interfacial chip, pluggable alignment modules, and multiple monolithic layers of microfluidic channels, through which convoluted three-dimensional microfluidic networks can be easily assembled and readily sealed with the capability of reconfigurable fluid flow. The monolithic laser-micromachining process simplifies and standardizes the fabrication of single-layer pluggable polymeric modules, which can be mass-produced as the renowned Lego(®) building blocks. In addition, interlocking features are implemented between the plug-and-play microfluidic chips and the complementary alignment modules through the F2F assembly, resulting in facile and secure alignment with average misalignment of 45 μm. Importantly, the 3D multilayer microfluidic assembly has a comparable sealing performance as the conventional single-layer devices, providing an average leakage pressure of 38.47 kPa. The modular reconfigurability of the system-level reversible packaging concept has been demonstrated by re-routing microfluidic flows through interchangeable modular microchannel layers.

  17. Increasing life expectancy of water resources literature

    NASA Astrophysics Data System (ADS)

    Heistermann, M.; Francke, T.; Georgi, C.; Bronstert, A.

    2014-06-01

    In a study from 2008, Larivière and colleagues showed, for the field of natural sciences and engineering, that the median age of cited references is increasing over time. This result was considered counterintuitive: with the advent of electronic search engines, online journal issues and open access publications, one could have expected that cited literature is becoming younger. That study has motivated us to take a closer look at the changes in the age distribution of references that have been cited in water resources journals since 1965. Not only could we confirm the findings of Larivière and colleagues; we were also able to show that the aging is mainly happening in the oldest 10-25% of an average reference list. This is consistent with our analysis of top-cited papers in the field of water resources. Rankings based on total citations since 1965 consistently show the dominance of old literature, including textbooks and research papers in equal shares. For most top-cited old-timers, citations are still growing exponentially. There is strong evidence that most citations are attracted by publications that introduced methods which have since become part of the standard toolset of researchers and practitioners in the field of water resources. Although we think that this trend should not be overinterpreted as a sign of stagnancy, there might be cause for concern regarding how authors select their references. We question the increasing citation of textbook knowledge as it holds the risk that reference lists become overcrowded, and that the readability of papers deteriorates.
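
    The diagnostics used in this kind of analysis are straightforward to compute once each paper's reference list is expressed as a set of citation ages. The sketch below, run on a synthetic reference list, shows the two quantities discussed above: the median cited-reference age and the mean age of the oldest quartile of the list. The synthetic age distribution is an assumption for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def reference_ages(n_refs):
        """Synthetic reference list: mostly recent citations plus an older tail.
        The 80/20 split and exponential scales are illustrative assumptions."""
        recent = rng.exponential(6.0, int(n_refs * 0.8))
        old = rng.exponential(30.0, n_refs - int(n_refs * 0.8))
        return np.sort(np.concatenate([recent, old]))  # citation ages in years

    ages = reference_ages(40)
    median_age = np.median(ages)
    oldest_quartile_mean = ages[int(0.75 * len(ages)):].mean()
    print(f"median reference age: {median_age:.1f} y")
    print(f"mean age of oldest 25% of the list: {oldest_quartile_mean:.1f} y")
    ```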

  18. T-REX on-demand redox targeting in live cells.

    PubMed

    Parvez, Saba; Long, Marcus J C; Lin, Hong-Yu; Zhao, Yi; Haegele, Joseph A; Pham, Vanha N; Lee, Dustin K; Aye, Yimon

    2016-12-01

    This protocol describes targetable reactive electrophiles and oxidants (T-REX)-a live-cell-based tool designed to (i) interrogate the consequences of specific and time-resolved redox events, and (ii) screen for bona fide redox-sensor targets. A small-molecule toolset comprising photocaged precursors to specific reactive redox signals is constructed such that these inert precursors specifically and irreversibly tag any HaloTag-fused protein of interest (POI) in mammalian and Escherichia coli cells. Syntheses of the alkyne-functionalized endogenous reactive signal 4-hydroxynonenal (HNE(alkyne)) and the HaloTag-targetable photocaged precursor to HNE(alkyne) (also known as Ht-PreHNE or HtPHA) are described. Low-energy light prompts photo-uncaging (t 1/2 <1-2 min) and target-specific modification. The targeted modification of the POI enables precisely timed and spatially controlled redox events with no off-target modification. Two independent pathways are described, along with a simple setup to functionally validate known targets or discover novel sensors. T-REX sidesteps mixed responses caused by uncontrolled whole-cell swamping with reactive signals. Modification and downstream response can be analyzed by in-gel fluorescence, proteomics, qRT-PCR, immunofluorescence, fluorescence resonance energy transfer (FRET)-based and dual-luciferase reporters, or flow cytometry assays. T-REX targeting takes 4 h from initial probe treatment. Analysis of targeted redox responses takes an additional 4-24 h, depending on the nature of the pathway and the type of readouts used.

  19. T-REX on-demand redox targeting in live cells

    PubMed Central

    Parvez, Saba; Long, Marcus J C; Lin, Hong-Yu; Zhao, Yi; Haegele, Joseph A; Pham, Vanha N; Lee, Dustin K; Aye, Yimon

    2017-01-01

    This protocol describes targetable reactive electrophiles and oxidants (T-REX)—a live-cell-based tool designed to (i) interrogate the consequences of specific and time-resolved redox events, and (ii) screen for bona fide redox-sensor targets. A small-molecule toolset comprising photocaged precursors to specific reactive redox signals is constructed such that these inert precursors specifically and irreversibly tag any HaloTag-fused protein of interest (POI) in mammalian and Escherichia coli cells. Syntheses of the alkyne-functionalized endogenous reactive signal 4-hydroxynonenal (HNE (alkyne)) and the HaloTag-targetable photocaged precursor to HNE (alkyne) (also known as Ht-PreHNE or HtPHA) are described. Low-energy light prompts photo-uncaging (t1/2 <1–2 min) and target-specific modification. The targeted modification of the POI enables precisely timed and spatially controlled redox events with no off-target modification. Two independent pathways are described, along with a simple setup to functionally validate known targets or discover novel sensors. T-REX sidesteps mixed responses caused by uncontrolled whole-cell swamping with reactive signals. Modification and downstream response can be analyzed by in-gel fluorescence, proteomics, qRT-PCR, immunofluorescence, fluorescence resonance energy transfer (FRET)-based and dual-luciferase reporters, or flow cytometry assays. T-REX targeting takes 4 h from initial probe treatment. Analysis of targeted redox responses takes an additional 4–24 h, depending on the nature of the pathway and the type of readouts used. PMID:27809314

  20. Field-portable lensfree tomographic microscope.

    PubMed

    Isikman, Serhan O; Bishara, Waheb; Sikora, Uzair; Yaglidere, Oguzhan; Yeah, John; Ozcan, Aydogan

    2011-07-07

    We present a field-portable lensfree tomographic microscope, which can achieve sectional imaging of a large volume (∼20 mm(3)) on a chip with an axial resolution of <7 μm. In this compact tomographic imaging platform (weighing only ∼110 grams), 24 light-emitting diodes (LEDs) that are each butt-coupled to a fibre-optic waveguide are controlled through a cost-effective micro-processor to sequentially illuminate the sample from different angles to record lensfree holograms of the sample that is placed on the top of a digital sensor array. In order to generate pixel super-resolved (SR) lensfree holograms and hence digitally improve the achievable lateral resolution, multiple sub-pixel shifted holograms are recorded at each illumination angle by electromagnetically actuating the fibre-optic waveguides using compact coils and magnets. These SR projection holograms obtained over an angular range of ±50° are rapidly reconstructed to yield projection images of the sample, which can then be back-projected to compute tomograms of the objects on the sensor-chip. The performance of this compact and light-weight lensfree tomographic microscope is validated by imaging micro-beads of different dimensions as well as a Hymenolepis nana egg, which is an infectious parasitic flatworm. Achieving a decent three-dimensional spatial resolution, this field-portable on-chip optical tomographic microscope might provide a useful toolset for telemedicine and high-throughput imaging applications in resource-poor settings. This journal is © The Royal Society of Chemistry 2011
